Salary
💰 $85,000 - $115,000 per year
Tech Stack
Airflow, Amazon Redshift, AWS, Azure, BigQuery, Cloud, Distributed Systems, ETL, Google Cloud Platform, Java, Kafka, Microservices, NoSQL, PySpark, Python, RDBMS, Tableau, Terraform
About the role
- Lead the design, development, and support of scalable data solutions that power analytics, reporting, and operational systems across the organization.
- Translate business requirements into robust data pipelines and services using modern cloud technologies.
- Collaborate with Product and Engineering teams to develop APIs, integrations, and backend services aligned with business requirements.
- Work within microservices and event-driven architectures to deliver modular and responsive data solutions.
- Ensure code quality and deployment safety through CI/CD pipelines and version control practices.
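The pipeline work described above can be pictured, in very simplified form, as a small extract-transform-load step. This is an illustrative stdlib-only sketch, not the company's actual code; all field names are hypothetical:

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and drop malformed records."""
    out = []
    for r in rows:
        try:
            out.append({"user_id": int(r["user_id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(rows: list[dict]) -> str:
    """Load: serialize to newline-delimited JSON, a common warehouse ingest format."""
    return "\n".join(json.dumps(r) for r in rows)

raw = "user_id,amount\n1,9.50\n2,bad\n3,12.00"
print(load(transform(extract(raw))))
```

In a production setting, each of these steps would typically run as a task in an orchestrator such as Airflow rather than as plain functions.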
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum 6 years of experience in data engineering, with hands-on work on platforms such as Snowflake, Google BigQuery, Databricks, or Redshift.
- Minimum 5 years of programming experience in Python, Java, or a similar language.
- Strong experience with AWS Glue, PySpark, Airflow, or equivalent data processing and orchestration tools.
- Solid experience with relational (RDBMS) and NoSQL databases.
- Strong understanding of data modeling, especially for columnar databases, and experience with tools such as SQLDBM or similar.
- Experience with ETL/ELT processes, including transformation tools such as dbt Core.
- Experience with streaming platforms such as Kafka for real-time data processing and event ingestion.
- Strong understanding of distributed systems architecture, including microservices and event-driven design.
- Proficiency with cloud platforms (AWS, Azure, or GCP).
- Experience with GitHub, Terraform, CircleCI, CI/CD pipelines, and modern software development practices.
- Excellent analytical and problem-solving skills with a data-driven mindset.
- Exposure to data visualization and BI tools such as Looker, Power BI, or Tableau.
- Strong communication and collaboration skills in a remote or hybrid team environment.
- Self-motivated, proactive, and accountable for delivering high-quality results.
- Experience with monitoring and observability tools such as Datadog or Honeycomb.
- Certifications such as AWS Certified Data Engineer – Associate or Professional, or equivalent in cloud and data engineering.
Benefits
- Medical, dental, vision, and basic life insurance
- 401(k) plan with 100% matching on the first 4% contributed
- 15 days of PTO each year
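The ETL/ELT pattern named in the requirements (land raw data first, then transform inside the warehouse with SQL, as tools like dbt Core do) can be sketched with the standard library's SQLite module; the table and column names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "E" + "L": land raw events as-is in a staging table, untyped.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "950", "paid"), (2, "oops", "paid"), (3, "1200", "refunded")],
)

# "T": transform in-warehouse with SQL, the way a dbt model would:
# cast types, keep only rows whose amount starts with a digit,
# filter to paid orders, and derive a dollar amount.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id,
           CAST(amount_cents AS INTEGER) / 100.0 AS amount_usd
    FROM raw_orders
    WHERE status = 'paid'
      AND amount_cents GLOB '[0-9]*'
""")

print(conn.execute("SELECT id, amount_usd FROM orders ORDER BY id").fetchall())
```

The design point is that the transformation lives in versioned SQL against the warehouse, not in application code, which is what ELT tooling such as dbt operationalizes at scale.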
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, Python, Java, AWS Glue, PySpark, Airflow, ETL, ELT, data modeling, streaming platforms
Soft skills
analytical skills, problem-solving skills, communication skills, collaboration skills, self-motivated, proactive, accountable
Certifications
AWS Certified Data Engineer – Associate, AWS Certified Data Engineer – Professional