Salary
💰 $100,000 - $145,000 per year
Tech Stack
Airflow, Amazon Redshift, Apache, AWS, Azure, BigQuery, Cloud, ETL, Google Cloud Platform, Informatica, Oracle, Postgres, Python, Spark, SQL
About the role
- Design, develop, and maintain ETL/ELT pipelines to move and transform data from multiple sources into enterprise data platforms
- Build and optimize data models, schemas, and storage solutions to support analytics and reporting
- Ensure compliance with federal security and data governance frameworks (HIPAA, NIST, CMMC, RMF)
- Collaborate with data stewards, analysts, and business teams to define requirements and deliver trusted data products
- Implement monitoring and logging for data pipelines to ensure reliability, scalability, and performance
- Support data integration and migration projects across enterprise systems
- Document data flows, transformations, and system integrations for audit and compliance needs
- Troubleshoot and resolve pipeline, ingestion, and data quality issues
Requirements
- 5+ years of experience in data engineering or data platform development
- Strong proficiency with ETL/ELT tools and frameworks (e.g., Apache Spark, Apache Airflow, Talend, Informatica)
- Solid experience with SQL and relational databases (PostgreSQL, Oracle, SQL Server, etc.)
- Familiarity with data warehousing concepts and modern architectures (Snowflake, Redshift, BigQuery, or equivalent)
- Proficiency with Python or another scripting language for building and automating data pipelines
- Experience designing and supporting cloud-based data environments (AWS preferred; Azure or GCP a plus)
- Knowledge of compliance and security requirements in healthcare or federal environments (HIPAA, NIST, RMF)
- Strong problem-solving skills and attention to detail when handling large, complex data sets
- Applicants must be authorized to work in the United States; certain roles may require U.S. citizenship and the ability to obtain and maintain a federal background investigation and/or security clearance
- Bachelor's degree