
AWS Data Engineer II
LPL Financial
Full-time
Location Type: Hybrid
Location: San Diego • California • South Carolina • United States
Salary
💰 $44 - $74 per hour
About the role
- Develop, maintain, and enhance data ingestion pipelines within the Enterprise Data Integration Framework (EDIF).
- Build and update AWS Glue ETL jobs in Python (PySpark) for validation, transformation, enrichment, and microbatch processing.
- Collaborate across engineering, QA, Cloud Operations, and vendor partners to implement new ingestion workflows.
- Contribute to architectural design, documentation, and best practices that improve EDIF scalability and resilience.
- Monitor, troubleshoot, and optimize ingestion workflow performance across AWS services (S3, Lambda, Glue, DynamoDB, PostgreSQL, Step Functions, CloudWatch, EventBridge, Athena).
- Assist with the onboarding of new vendor feeds, schemas, and operational schedules into EDIF.
- Participate in platform release management, change control, and nightly batch support activities.
- Maintain ingestion observability through CloudWatch dashboards and EDIF event-monitoring tools.
- Provide technical support to ensure successful nightly data ingestion and data quality compliance.
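To give a feel for the validation, enrichment, and microbatch work described above, here is a minimal, illustrative Python sketch. It is not LPL's actual EDIF code; the field names and schema are hypothetical, and a real Glue job would express the same logic with PySpark DataFrames rather than plain dicts.

```python
# Illustrative microbatch validation/enrichment step (hypothetical vendor schema).
# A production AWS Glue job would apply equivalent logic via PySpark DataFrames.
from datetime import datetime, timezone

REQUIRED_FIELDS = {"account_id", "trade_date", "amount"}  # hypothetical schema


def validate(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if "amount" in record:
        try:
            float(record["amount"])
        except (TypeError, ValueError):
            errors.append("amount is not numeric")
    return errors


def enrich(record: dict) -> dict:
    """Add ingestion metadata while leaving the source fields untouched."""
    return {**record, "ingested_at": datetime.now(timezone.utc).isoformat()}


def process_microbatch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into enriched good records and rejected records with reasons."""
    good, bad = [], []
    for r in records:
        errs = validate(r)
        if errs:
            bad.append({**r, "errors": errs})
        else:
            good.append(enrich(r))
    return good, bad
```

In an EDIF-style pipeline, the "bad" records would typically land in a quarantine location (e.g., an S3 error prefix) for the nightly-batch support workflow described above, while the enriched records continue downstream.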
Requirements
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- 3+ years of hands-on experience in data engineering or ETL development.
- Experience with core AWS data and compute services, including S3, Lambda, Glue, DynamoDB, PostgreSQL, Step Functions, CloudWatch, IAM, Athena.
- Proficiency with Python and/or PySpark for data transformation.
- Experience working with relational databases (PostgreSQL, SQL Server, Oracle, etc.).
- Understanding of data modeling, schema evolution, and data validation principles.
- Experience with Git-based version control and CI/CD workflows.
Benefits
- 401K matching
- health benefits
- employee stock options
- paid time off
- volunteer time off
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, PySpark, ETL development, data engineering, data transformation, data modeling, schema evolution, data validation, version control, CI/CD workflows