
Solutions Architect, Data Engineer
General Dynamics Information Technology
Full-time
Location Type: Remote
Location: Remote • 🇺🇸 United States
Salary
💰 $136,000 - $184,000 per year
Job Level
Senior, Lead
Tech Stack
Airflow, Ansible, AWS, Azure, Cloud, Cyber Security, ETL, Kubernetes, Python, SQL, Terraform
About the role
- Implement complex systems to meet the current and future needs of a federal agency in Washington, DC
- Work closely with stakeholders to ensure IT systems are efficient, secure, and compliant with industry standards
- Perform engineering work associated with the design, development, maintenance, and testing of infrastructures for data generation
- Optimize data flow and collection for cross-functional teams
- Provide enterprise-level technical design and support to leadership to align IT systems and data solutions with organizational goals
- Develop and maintain scalable, secure, and integrated system architectures across cloud and on-premises platforms
- Bridge business needs and technology capabilities to guide complex solution development and implementation
- Leverage expertise in SQL, Python, ETL automation, data pipelines, and cloud platforms (AWS, Azure)
- Apply best practices in system integration, data governance, cybersecurity, and enterprise frameworks (e.g., TOGAF, Zachman)
- Design, develop, and implement methods, processes, and systems to consolidate and analyze diverse data sets, both structured and unstructured
- Build the infrastructure pipelines required for optimal extraction, transformation, and loading of data from a wide variety of data sources
Requirements
- Requires a BA/BS degree in a related discipline
- At least 8 years of experience in solution architecture/data engineering
- Strong experience in Kubernetes orchestration for scalable deployment environments
- Expertise in software development, preferably using Python
- Knowledge of machine learning model deployment practices
- Familiarity with ML orchestration tools (e.g., Kubeflow, MLflow, Airflow, SageMaker, or similar)
- Experience with infrastructure-as-code using Terraform and OpenTofu
- Proficiency with GitLab for source control, CI/CD, and DevOps workflows
- Hands-on experience with Ansible for configuration management and automated provisioning
- Experience with Databricks for large-scale data engineering, ML workflows, and collaborative analytics
- Kubernetes administration and/or machine learning certifications a plus
- Requires an existing Public Trust clearance or the ability to obtain one
Benefits
- Comprehensive benefits and wellness packages
- 401K with company match
- Paid time off
- Full-flex work week to own your priorities at work and at home
- Paid Family Leave program providing up to 160 hours of paid leave in a rolling 12-month period for eligible employees
- Short and long-term disability benefits
- Life, accidental death and dismemberment, critical illness, and business travel and accident insurance
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to improve ATS matches.
Hard skills
SQL, Python, ETL automation, data pipelines, Kubernetes, machine learning, infrastructure-as-code, GitLab, Ansible, Databricks
Soft skills
stakeholder engagement, technical design, solution development, data governance, cybersecurity, organizational alignment, cross-functional collaboration, problem-solving, communication, leadership
Certifications
Kubernetes administration, machine learning certification, Public Trust