Tech Stack
Airflow, Apache, AWS, Cloud, ETL, Kafka, Selenium, Spark
About the role
- LegitScript's mission: making the internet and payment ecosystems safer by helping companies stay legal and safe for consumers
- Combines big data with experts in regulated sectors (transaction laundering detection, pharmaceuticals, online gambling, and more) for risk analysis
- Trusted by major search engines, internet platforms, payment companies, and regulatory agencies
- Overview: lead two scrum teams and align technical efforts with business goals; deep AWS and Databricks expertise desired
- What You'll Do: manage multiple scrum teams of software and quality engineers; coach and mentor; ensure quality, security, and timely, customer-oriented software delivery; collaborate with the Product Manager; conduct performance assessments and support compensation and career growth; uphold best practices; exercise strong judgment and make independent decisions
- What You'll Bring: extensive experience with Jira, Scrum/Agile, microservices, cloud deployments (AWS), Databricks, DMS, MLOps, ML tools, and data storage and processing with Delta Lake, Spark, and Kafka; experience with web crawlers; strong interpersonal skills
- Benefits: competitive salaries, comprehensive medical plans, dental & vision, 401k with match, generous PTO and holidays
Requirements
- Experience coaching, mentoring, and managing the overall performance of both developers and quality engineers
- Expert-level knowledge of Jira and project management in a Scrum/Agile environment
- Experience delivering systems in a microservice or service-oriented architecture
- Experience deploying systems to cloud platforms (AWS preferred)
- Extensive hands-on experience with Databricks for building and maintaining ETL pipelines, managing Delta Lake storage, running large-scale Apache Spark jobs, and integrating machine learning workflows (see the batch ETL sketch after this list)
- Proven ability to work closely with application development teams to design, optimize, and integrate Databricks solutions into production systems
- Experience using AWS Database Migration Service (DMS) to move and synchronize data from on-premises or cloud-based sources into AWS for downstream processing in Databricks (see the DMS sketch after this list)
- Experience with MLOps deployments, including model training, versioning, monitoring, and automated CI/CD pipelines for ML models
- Familiarity with common MLOps tools such as MLflow, Kubeflow, Apache Airflow, AWS SageMaker, and AWS Step Functions (see the MLflow sketch after this list)
- Proficiency with data storage and processing tools that integrate with Databricks, such as Delta Lake, Apache Spark, and Kafka (see the streaming sketch after this list)
- Experience designing and implementing web crawlers for data collection and processing at scale, e.g., using Scrapy, BeautifulSoup, Selenium, or Playwright (see the crawler sketch after this list)
- Must possess and consistently exhibit the competencies relevant to the position
- Strong interpersonal and communication skills, including the ability to lead discussions in diverse groups of varying sizes
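
For illustration, a minimal sketch of the kind of batch ETL work the Databricks bullet describes, written in PySpark. The `raw_events` source table, `silver.events` target, and column names are hypothetical; Delta Lake support is assumed to be available (it is bundled on Databricks):

```python
# Minimal batch ETL sketch in the Databricks style.
# "raw_events" and "silver.events" are illustrative table names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.table("raw_events")  # hypothetical bronze-layer table

cleaned = (
    raw.dropDuplicates(["event_id"])                  # assumed unique key
       .filter(F.col("event_type").isNotNull())       # drop malformed rows
       .withColumn("ingested_at", F.current_timestamp())
)

# Append the cleaned rows to a Delta Lake table.
(cleaned.write.format("delta")
        .mode("append")
        .saveAsTable("silver.events"))
```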
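A hedged sketch of driving AWS DMS from boto3, as in the DMS bullet above. Every ARN, identifier, and the region below is a placeholder; `full-load-and-cdc` is one of the standard migration types (initial load plus ongoing change capture):

```python
# Sketch: create and start a DMS replication task with boto3.
# All ARNs, names, and the region are placeholders, not real resources.
import json
import boto3

dms = boto3.client("dms", region_name="us-west-2")  # assumed region

table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-public-schema",
        "object-locator": {"schema-name": "public", "table-name": "%"},
        "rule-action": "include",
    }]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="orders-to-s3",           # placeholder
    SourceEndpointArn="arn:aws:dms:...:endpoint/src",   # placeholder
    TargetEndpointArn="arn:aws:dms:...:endpoint/tgt",   # placeholder
    ReplicationInstanceArn="arn:aws:dms:...:rep/inst",  # placeholder
    MigrationType="full-load-and-cdc",  # initial load + ongoing changes
    TableMappings=json.dumps(table_mappings),
)

# In practice you would wait for the task to reach the "ready" state
# before starting it.
dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)
```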
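A small MLflow tracking example for the model-versioning side of MLOps. The experiment name, model choice, and parameters are purely illustrative, not anything specific to this role:

```python
# Sketch: log parameters, a metric, and a model artifact with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("risk-model-demo")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 8}  # illustrative config
    model = RandomForestClassifier(**params).fit(X_train, y_train)

    mlflow.log_params(params)  # version the training configuration
    mlflow.log_metric("accuracy",
                      accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # store the model artifact
```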
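A sketch of how Spark, Kafka, and Delta Lake typically fit together: a Structured Streaming job that reads a Kafka topic and appends to a Delta table. The broker address, topic name, and paths are placeholders:

```python
# Sketch: Kafka -> Delta Lake via Spark Structured Streaming.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
         .option("subscribe", "events")                     # placeholder topic
         .load()
         # Kafka values arrive as bytes; decode to a string payload.
         .select(F.col("value").cast("string").alias("payload"),
                 F.col("timestamp"))
)

(events.writeStream.format("delta")
       .option("checkpointLocation", "/tmp/chk/events")  # placeholder path
       .outputMode("append")
       .start("/tmp/delta/events"))                      # placeholder path
```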
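Finally, a toy breadth-first crawler using requests and BeautifulSoup, one of the lighter-weight options next to Scrapy or Playwright. The seed URL, page cap, and User-Agent are assumptions, and a production crawler would also honor robots.txt and rate limits:

```python
# Sketch: breadth-first crawl within a single domain.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SEED = "https://example.com"                 # placeholder seed URL
HEADERS = {"User-Agent": "demo-crawler/0.1"}  # illustrative UA string

def crawl(seed: str, max_pages: int = 20) -> dict[str, str]:
    """Breadth-first crawl within one domain; returns {url: page title}."""
    seen, titles = {seed}, {}
    queue = deque([seed])
    while queue and len(titles) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, headers=HEADERS, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages
        soup = BeautifulSoup(resp.text, "html.parser")
        titles[url] = (soup.title.string.strip()
                       if soup.title and soup.title.string else "")
        for link in soup.find_all("a", href=True):
            nxt = urljoin(url, link["href"])
            # Stay on the seed's domain and avoid revisits.
            if urlparse(nxt).netloc == urlparse(seed).netloc and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return titles

if __name__ == "__main__":
    print(crawl(SEED))
```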