Trace3

Data Engineer

Contract

Location: 🇺🇸 United States • Colorado

Salary

💰 $65 - $70 per hour

Job Level

Senior / Lead

Tech Stack

Apache, AWS, Azure, Cloud, Cognos, Kafka, Python, Spark, Tableau, Terraform

About the role

  • Responsible for the development, optimization, and management of data ingestion, transformation, and storage processes using modern frameworks
  • Implement infrastructure-as-code and automation solutions to streamline and build resiliency into data processing workflows
  • Design, build, and maintain large-scale data systems, including data warehouses, data lakes, and data pipelines
  • Develop and implement data architectures that meet business stakeholder needs
  • Collaborate with data scientists and analysts to develop and deploy machine learning models and data products
  • Ensure data quality, security, and compliance with standards
  • Develop and maintain technical documentation of data systems and architectures
  • Troubleshoot and resolve data-related issues and optimize system performance
  • Develop and implement automated testing and deployment scripts to ensure smooth delivery of data solutions
  • Collaborate with cross-functional teams to identify and prioritize data requirements and develop solutions
  • Position is a 6-month contract through Trace3 with potential to convert to full-time
  • Candidate must be located within 30-50 minutes of listed cities for a potential hybrid schedule

Requirements

  • U.S. Citizenship is required (candidates with dual citizenship are not eligible)
  • Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent combination of education and experience)
  • 7+ years of experience in data engineering, software development, or a related field
  • Strong programming skills in Python and experience with infrastructure-as-code tools such as Terraform
  • Strong understanding of data architecture, data modeling, and data warehousing concepts
  • Experience with data pipeline tools such as Apache Spark, Apache Kafka, and AWS Kinesis
  • Experience with Artificial Intelligence and Machine Learning technologies
  • Strong understanding of data security and compliance principles and practices
  • Excellent problem-solving skills
  • Strong communication and collaboration skills
  • Experience with cloud-based data platforms such as AWS and Azure (desired)
  • Experience with agile development methodologies and version control systems such as Git (desired)
  • Experience with data visualization tools such as Tableau and Cognos Analytics (desired)
  • Familiarity with complex data regulatory requirements such as NIST 800-171 and SOX (desired)
  • Familiarity with OpenTelemetry standards (desired)
  • Experience optimizing observability data and real-time anomaly detection (desired)