
Senior Data Programmer, Analyst
Boeing
Full-time
Location Type: Hybrid
Location: Seattle • Washington • United States
Salary
💰 $141,950 - $192,050 per year
About the role
- Serve as the Lead Data Programmer Analyst and technical owner for data platform initiatives
- Build and maintain strong partnerships with business stakeholders to deepen domain knowledge and align data solutions with strategic needs
- Design and implement data ingestion patterns and pipelines to migrate and integrate on-premises sources (Oracle, Teradata) to cloud-based platforms (AWS); a minimal illustrative sketch of this pattern follows this list
- Build and operate modern cloud-based ingestion tools and frameworks (Databricks, Snowflake, or equivalents)
- Define and maintain data architecture, lineage, and platform documentation
- Curate and structure data for business usage and self-service analytics
- Leverage cloud Artificial Intelligence/Machine Learning (AI/ML) marketplaces and services to enable advanced analytics and model deployment
- Map, document, and analyze current application architectures, ingestion patterns, data flows, and platform constraints to drive a clear modernization roadmap
- Design conceptual, logical, and physical data models tailored to manufacturing, engineering, and PLM (product lifecycle management) systems
- Lead integration of diverse data sources including IoT/factory equipment telemetry, Enterprise Resource Planning (ERP) systems (e.g., SAP), and Computer-Aided Design/Product Lifecycle Management (CAD/PLM) tools (e.g., Siemens Teamcenter, CATIA)
- Oversee end-to-end data integration and Extract, Transform, Load / Extract, Load, Transform (ETL/ELT) processes across heterogeneous sources
- Promote and operationalize modern platform practices including infrastructure-as-code, pipeline observability, metadata/cataloging, data contracts and versioning, and policy-as-code
- Automate access provisioning, ingestion pipelines, testing, Continuous Integration/Continuous Delivery (CI/CD), and deployment to reduce manual work and accelerate safe delivery
- Collaborate closely with business stakeholders to understand data needs and deliver actionable insights
- Mentor engineers and guide best practices for secure, scalable, maintainable data engineering
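To illustrate the ingestion pattern referenced above, here is a minimal, hypothetical PySpark sketch: it reads a table from an on-premises Oracle source over JDBC, applies light curation, and lands the result as partitioned Parquet in an S3 landing zone. The hostnames, credentials, schema, table, and bucket names are placeholders for illustration only, not actual Boeing systems or a prescribed implementation.

```python
# Illustrative only: a minimal batch ingestion pattern from an on-premises
# Oracle source to S3-backed storage. All connection details, table names,
# and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("oracle_to_s3_ingest").getOrCreate()

# Read the source table over JDBC (Oracle JDBC driver assumed on the classpath).
source_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host.example.com:1521/ORCLPDB1")
    .option("dbtable", "MFG.WORK_ORDERS")
    .option("user", "etl_user")
    .option("password", "********")  # in practice, pull from a secrets manager
    .option("fetchsize", 10000)
    .load()
)

# Light curation: standardize column names and stamp the load date for lineage.
curated_df = (
    source_df.toDF(*[c.lower() for c in source_df.columns])
    .withColumn("load_date", F.current_date())
)

# Land the data as date-partitioned Parquet in the cloud landing zone.
(
    curated_df.write.mode("append")
    .partitionBy("load_date")
    .parquet("s3://example-landing-zone/mfg/work_orders/")
)
```

In practice, a pipeline like this would be parameterized, scheduled, and instrumented with the observability, cataloging, and data-contract practices listed above.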
Requirements
- 5+ years of experience as a Developer
- 3+ years of experience with cloud (AWS) and modern ingestion tools (Databricks, Snowflake, or similar)
- Experience with data warehousing and cloud/on-prem platforms such as Oracle, Teradata, Redshift, and AWS services
- Experience in data processing with Python and PySpark
- Experience with ETL/ELT patterns, data integration techniques, and data modeling
- Experience with Linux operating systems and shell scripting
- Experience with infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and automated testing for data pipelines (see the testing sketch after this list)
- Experience communicating with stakeholders and cross-functional teams
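As a concrete, hypothetical example of the automated-testing requirement above, the sketch below unit-tests a small PySpark transformation with pytest; the transformation, column names, and test data are invented for illustration only.

```python
# Illustrative only: unit-testing a small PySpark transformation with pytest.
# The transformation and column names are hypothetical.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_load_date(df):
    """Example transformation under test: stamp each row with a load date."""
    return df.withColumn("load_date", F.current_date())


@pytest.fixture(scope="session")
def spark():
    # Local Spark session for fast, isolated test runs.
    return SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()


def test_add_load_date_adds_column(spark):
    input_df = spark.createDataFrame([(1, "widget"), (2, "gadget")], ["id", "part_name"])
    result = add_load_date(input_df)

    # The new column exists and no input rows were dropped or duplicated.
    assert "load_date" in result.columns
    assert result.count() == input_df.count()
```

Tests of this kind typically run in the CI/CD pipeline alongside infrastructure-as-code validation before a pipeline change is deployed.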
Benefits
- Competitive base pay and variable compensation opportunities
- Health insurance
- Flexible spending accounts
- Health savings accounts
- Retirement savings plans
- Life and disability insurance programs
- Paid and unpaid time away from work
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data programming, data ingestion, data integration, data modeling, ETL, ELT, Python, PySpark, infrastructure-as-code, automated testing
Soft Skills
stakeholder communication, mentoring, collaboration, leadership, organizational skills