Salary
💰 $65 - $70 per hour
Tech Stack
Apache, AWS, Azure, Cloud, Cognos, Kafka, Python, Spark, Tableau, Terraform
About the role
- Design, build, and maintain large-scale data systems, including data warehouses, data lakes, and data pipelines
- Develop and implement data architectures that meet the needs of our business stakeholders
- Implement infrastructure-as-code and automation solutions to streamline and build resiliency into data processing workflows
- Collaborate with data scientists and analysts to develop and deploy machine learning models and data products
- Ensure data quality, security, and compliance with standards
- Develop and maintain technical documentation of data systems and architectures
- Troubleshoot and resolve data-related issues and optimize system performance
- Develop and implement automated testing and deployment scripts
- Collaborate with cross-functional teams to identify and prioritize data requirements and develop solutions to meet business needs
- Work on a 6-month contract with potential to convert to full-time
Requirements
- U.S. Citizenship is required for this role (candidates with dual citizenship are not eligible)
- Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent combination of education and experience)
- 7+ years of experience in data engineering, software development, or a related field
- Strong programming skills in Python and experience with infrastructure-as-code tools such as Terraform
- Strong understanding of data architecture, data modeling, and data warehousing concepts
- Experience with data pipeline tools such as Apache Spark, Apache Kafka, or AWS Kinesis
- Experience with Artificial Intelligence and Machine Learning technologies
- Strong understanding of data security and compliance principles and practices
- Excellent problem-solving skills
- Strong communication and collaboration skills
- Must be located within a 30-50 minute commute of one of the listed office locations (Dallas TX, Central Florida, King of Prussia PA, Stratford CT, Denver CO, Marietta GA)