
Data Integration Engineer
Leidos
Full-time
Location Type: Remote
Location: United States
Salary
💰 $107,900 - $195,050 per year
About the role
- Design and implement data integration solutions across enterprise systems and cloud data platforms (e.g., Oracle, Snowflake, AWS, Azure)
- Extend existing physical data models into logical and semantic data models that support analytics and AI use cases
- Partner with data owners and CDAO technical staff to define, design, and refine enterprise data products, including domains, schemas, interfaces, SLAs, and consumption patterns
- Collaborate with the team to translate enterprise architecture standards and data governance guidelines into implementable models (logical, physical, domain) and integration patterns
- Work closely with Data Engineers to ensure pipelines are aligned to target logical, physical, domain, and semantic models
- Develop, implement, and maintain dimensional, relational, and domain-driven data product models and databases using the IDERA/Embarcadero suite of products
- Ensure assets are optimized for performance, scalability, and AI-readiness within scalable cloud-native data platforms
- Collaborate with data owners and CDAO technical staff to develop and maintain Leidos data protection and data privacy policies governing data use
- Collaborate with CDAO technical staff to develop and maintain the IDERA/Embarcadero repository and portal data objects
- Support metadata registration and governance alignment within Collibra
- Implement data integration patterns including batch, streaming, API-based, and event-driven architectures
- Participate in data quality and validation processes to ensure trusted, production-ready data products
- Contribute to documentation, standards, and modeling best practices
Requirements
- Bachelor’s degree in Computer Science, Information Systems, or related field
- 8+ years of relevant experience
- Strong experience with data modeling (conceptual, logical, semantic, and physical)
- Hands-on experience assembling and implementing DDL for tables, views, SQL frameworks, and security policies within relational database systems
- Understanding of data replication products, preferably Oracle GoldenGate
- Hands-on experience with cloud data platforms such as Snowflake, AWS, Azure, or GCP
- Hands-on experience working with ETL/ELT/API developers on the design and implementation of data integration pipelines; preferably experience with Informatica
- Proficiency in SQL and understanding of performance optimization techniques
- Experience working with Data Architects and cross-functional technical teams
- Strong analytical and problem-solving skills
- US citizenship is required
Benefits
- Competitive compensation
- Health and Wellness programs
- Income Protection
- Paid Leave
- Retirement
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data integration solutions, data modeling, DDL, SQL, ETL, ELT, data replication, performance optimization, data quality, data validation
Soft Skills
analytical skills, problem-solving skills, collaboration, communication
Certifications
Bachelor’s degree in Computer Science, Bachelor’s degree in Information Systems