Salary
💰 $137,439 - $167,981 per year
Tech Stack
Amazon Redshift, Apache, AWS, Cloud, Docker, DynamoDB, GraphQL, Kafka, Kubernetes, NoSQL, Postgres, PySpark, Python, Scala, Spark, SQL
About the role
- Design and build scalable, efficient, and fault-tolerant data operations across structured and unstructured data systems
- Build and maintain AWS data platform services such as S3, RDS, Aurora, Redshift, PostgreSQL, and DynamoDB using cloud deployment pipelines
- Develop and maintain robust data pipelines for ingesting, transforming, and distributing data streams
- Partner with senior technical leads, analysts, engineers, and scientists to implement scalable data initiatives using AWS cloud services
- Implement advanced data architectures for accelerated solution design, including data integration, modeling, governance, and applications
- Champion best practices in data engineering standards
- Mentor and guide internal teams on database standards, pipeline development, and efficient data consumption
- Implement and maintain database security measures, including user access controls and data encryption
- Manage healthcare data and maintain HIPAA, SOC 2, FedRAMP, and StateRAMP data security controls
- Collaborate with SaaS application teams to build robust and scalable data solutions used by healthcare providers
- Travel to a designated company location for on-site onboarding during initial days of employment (company coordinated and paid)
Requirements
- Bachelor’s degree (or higher) in Computer Science or a related field from an accredited institution
- Minimum of 8 years' experience deploying and managing cloud data platforms on AWS
- Advanced proficiency in SQL
- Deep experience with AWS S3, AWS RDS, Aurora PostgreSQL, and MSSQL
- Proficiency with both SQL and NoSQL databases (e.g., PostgreSQL, Redshift, DynamoDB)
- Experience in cloud-native architectures and distributed computing with AWS tools such as Glue and EMR
- Expertise in Python, PySpark, Scala, and advanced SQL techniques
- Familiarity with GraphQL
- Experience with Docker, Kubernetes, and microservice-based data API development
- Experience with stream processing tools like Amazon Kinesis, Apache Spark, Storm, or Kafka
- Agile development experience; proficiency with tools like JIRA and Confluence
- Strong collaboration and communication skills
- AWS Professional-level certification(s)
- Must be legally authorized to work in the country of employment without sponsorship for employment visa status (e.g., H-1B status)
- May include up to 15% domestic/international travel