
Data Engineering Analyst
World Business Lenders, LLC
Employment Type: Contract
Location Type: Remote
Location: Colombia
About the role
- Improve and automate data pipelines, adding quality checks that ensure data accuracy, consistency, and reliability throughout the data processing workflow, reducing manual intervention and minimizing errors.
- Migrate legacy systems to new platforms, including thorough analysis, seamless data transfer, and integration that enhances system performance while maintaining continuity of existing operations.
Requirements
- A Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field is preferred; equivalent hands-on experience is also highly valued.
- 4 to 7 years of experience as a Data Engineer.
- Experience working with cloud-based data platforms and environments
- Strong SQL skills with data modeling experience for data warehouses
- Strong Python skills (especially notebooks) for building and maintaining data workflows
- Experience developing and maintaining ETL/ELT processes
- Experience working with data warehouses used by reporting/business teams
- Experience using Git/GitHub for version control in collaborative environments
- Understanding of data engineering best practices:
  - Pipeline orchestration and dependency management
  - Data quality, validation, and monitoring fundamentals
- Hands-on data migration experience:
  - Schema mapping and transformation
  - Data reconciliation and validation (strong proficiency required)
  - Data quality checks, integrity validation, and issue resolution
  - Backfills and historical data handling
  - Cutover planning and execution
- Experience building and maintaining automated data pipelines:
  - Scheduling, orchestration, and failure handling
  - Workflow reliability, monitoring, and repeatability
- Experience with Microsoft Fabric or the broader Microsoft Azure data ecosystem
- Experience with orchestration tools such as Apache Airflow, Azure Data Factory, or Fabric pipelines
- Experience with data quality and observability practices:
  - Validation frameworks, alerting, SLAs, monitoring
- Experience optimizing performance in data environments:
  - Query tuning, partitioning, indexing, cost optimization
- Familiarity with modern data formats and large-scale processing:
  - Parquet, Delta, incremental processing patterns
- Experience integrating with external systems:
  - REST APIs, authentication, retries, error handling
- Exposure to CI/CD practices for data workflows:
  - Git-based deployments, PR workflows, environment promotion
- Clear communication: can explain data issues, tradeoffs, and results to both technical and business stakeholders.
- Ownership & accountability: takes end-to-end responsibility for pipelines, data quality, and outcomes.
- Problem-solving mindset: able to debug complex data issues and work through ambiguity independently.
- Collaboration: works effectively across engineering, analytics, and business teams.
- Adaptability: comfortable operating in evolving environments (migration, changing requirements, new tools).
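
To give candidates a concrete sense of the data reconciliation and validation skills listed above, here is a minimal sketch in Python. All names (`reconcile`, the sample columns) are illustrative and not part of the role description; real migrations would typically run such checks in SQL or a notebook against the warehouse.

```python
# Minimal post-migration reconciliation sketch: compare a legacy dataset and
# its migrated copy by row count, key membership, and field-level equality.
# Rows are modeled as dicts keyed by column name; "key" is the primary key.

def reconcile(source_rows, target_rows, key):
    """Return a list of human-readable reconciliation issues (empty = clean)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    if src_keys - tgt_keys:
        issues.append(f"missing in target: {sorted(src_keys - tgt_keys)}")
    if tgt_keys - src_keys:
        issues.append(f"unexpected in target: {sorted(tgt_keys - src_keys)}")
    # Field-level comparison for keys present on both sides
    tgt_by_key = {r[key]: r for r in target_rows}
    for row in source_rows:
        other = tgt_by_key.get(row[key])
        if other is not None and other != row:
            issues.append(f"value mismatch for key {row[key]}")
    return issues
```

For example, migrating two rows but landing only one would report a row-count mismatch plus the missing key, which is exactly the kind of issue resolution this role covers.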
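
Similarly, the "REST APIs, authentication, retries, error handling" bullet can be sketched as a small retry-with-backoff helper. The function name and parameters are illustrative only; production pipelines often reach for an orchestrator's built-in retries or a library such as tenacity instead.

```python
# Minimal retry-with-exponential-backoff sketch for calls to external systems.
# fn is any zero-argument callable (e.g. a wrapped REST request); transient
# failures are retried, and the final failure is re-raised for the orchestrator.
import time

def call_with_retries(fn, attempts=3, base_delay=1.0, retry_on=(ConnectionError,)):
    """Call fn(), retrying listed exception types with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except retry_on:
            if attempt == attempts:
                raise  # out of retries: surface the error instead of hiding it
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...
```

Capping attempts and re-raising the last error keeps failure handling visible to the scheduler, which ties back to the workflow reliability and monitoring requirements above.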
Benefits
- Base salary paid in USD.
- Enjoy Paid Time Off (PTO) after just 6 months of service.
- Full-time opportunity.
- Enjoy the freedom of a completely remote work environment!
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL, Python, ETL, ELT, data modeling, data migration, data quality checks, pipeline orchestration, query tuning, data processing
Soft Skills
clear communication, ownership, problem-solving, collaboration, adaptability