
eSimplicity

Data Engineer I


Posted 4/21/2026 · Full-time · Remote · Maryland, 🇺🇸 United States · Mid-Level / Senior · 💰 $76,400–$98,100 per year · Website

Tech Stack

Tools & technologies
Azure · Cloud · ETL · Microservices · NumPy · Pandas · Python · PyTorch · scikit-learn · SQL · TensorFlow

About the role

Key responsibilities & impact
  • Develop production-grade ETL workflows using Python and Microsoft-based frameworks to ingest, transform, and validate large-scale structured and unstructured data.
  • Implement schema enforcement, data validation, and quality checks to maintain integrity across diverse sources.
  • Optimize pipelines for performance, scalability, and fault tolerance using open-source and cloud-native patterns.
  • Manage Azure-based data solutions, including Data Lake Storage, Azure SQL, and cloud storage access from Python services.
  • Deploy workflow orchestration using Azure Data Factory or Foundry for scheduling, monitoring, and automation.
  • Ensure secure integration of APIs and services within the Microsoft ecosystem for seamless data exchange.
  • Build Python-based data services leveraging libraries such as Pandas, PyTorch, and other open-source frameworks for high-performance processing.
  • Implement logging, monitoring, and performance tuning for robust operational reliability.
  • Develop API endpoints and microservices to enable interoperability with analytics and ML platforms.
  • Work closely with data scientists, analysts, and cloud architects to deliver clean, reliable data for predictive modeling and real-time dashboards.
  • Apply data governance best practices, ensuring compliance, reproducibility, and auditability across workflows.
  • Contribute to Agile team processes, driving iterative delivery, joint problem-solving, and continuous improvement.
  • Engage closely with project managers, technical leads, client representatives, and cross-functional teams to provide timely updates, resolve issues, and ensure alignment with business goals.
  • Translate technical specifications into code and design documents.
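
The schema-enforcement and data-validation responsibilities above can be sketched in plain Python. Everything here is hypothetical (`EXPECTED_SCHEMA`, `validate_rows`, and the sample rows are invented for illustration) and uses only the standard library; the actual pipeline would more likely build on Pandas or Azure-native tooling.

```python
from datetime import date

# Hypothetical schema: column name -> coercion callable.
# (Invented for illustration; not from the job posting.)
EXPECTED_SCHEMA = {
    "record_id": int,
    "amount": float,
    "created": date.fromisoformat,
}

def validate_rows(rows):
    """Enforce the schema on each row; quarantine bad rows instead of failing."""
    clean, rejects = [], []
    for row in rows:
        try:
            # Reject rows whose column set does not match the schema exactly,
            # then coerce each value to its expected type.
            if set(row) != set(EXPECTED_SCHEMA):
                raise ValueError(f"unexpected columns: {sorted(row)}")
            clean.append({col: cast(row[col]) for col, cast in EXPECTED_SCHEMA.items()})
        except (ValueError, TypeError) as exc:
            rejects.append((row, str(exc)))
    return clean, rejects

raw = [
    {"record_id": "1", "amount": "9.50", "created": "2026-04-21"},
    {"record_id": "2", "amount": "oops", "created": "2026-04-22"},
]
clean, rejects = validate_rows(raw)
print(len(clean), len(rejects))  # 1 valid row, 1 quarantined row
```

Quarantining rejects instead of failing the whole batch is one common way to keep a pipeline fault-tolerant while still surfacing bad records for auditing.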

Requirements

What you’ll need
  • Bachelor’s degree or equivalent professional experience in Data Science, Computer Science, Engineering, or related field.
  • All candidates must pass public trust clearance through the U.S. Federal Government.
  • 3+ years developing and deploying advanced statistical and machine learning models or supporting data pipelines for such models.
  • Proficiency in Python (Pandas required; scikit-learn, NumPy, and related libraries preferred).
  • Strong SQL skills and experience integrating data from relational databases.
  • Familiarity with open-source data processing libraries (Pandas, PyTorch, TensorFlow, etc.).
  • Experience building production-grade data pipelines with open-source frameworks.
  • ETL development using Python and Microsoft technologies.
  • Data validation, schema enforcement, and quality assurance.
  • API development within Microsoft ecosystem.
  • Performance optimization, logging, and monitoring for large-scale systems.
  • Azure Data Lake Storage integration and Azure SQL connectivity.
  • Workflow orchestration with Azure Data Factory.
  • Deployment and operation of Python-based data services in Azure.
  • Strong attention to detail with a commitment to delivering high-quality and accurate work.
  • Excellent communication skills, both written and verbal, with the ability to collaborate effectively across teams.
  • Proven ability to manage time and prioritize tasks in a fast-paced environment.
  • Demonstrated problem-solving skills with a proactive and solution-oriented mindset.
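
The SQL-integration requirement can be illustrated with a minimal, hedged sketch. Standard-library sqlite3 stands in for Azure SQL here (real connectivity would go through a driver such as pyodbc), and the `orders` table and its columns are invented:

```python
import sqlite3

def load_orders(conn):
    """Pull relational rows into plain dicts for downstream pipeline steps."""
    conn.row_factory = sqlite3.Row  # lets us address columns by name
    cur = conn.execute("SELECT id, total FROM orders WHERE total > ?", (0,))
    return [dict(r) for r in cur]

# In-memory database with invented sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, -1.0)])
rows = load_orders(conn)
print(rows)  # only the positive-total row survives the filter
```

Parameterized queries (the `?` placeholder) are the usual way to keep this kind of integration safe regardless of which relational backend sits underneath.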

Benefits

Comp & perks
  • Full healthcare benefits

ATS Keywords

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Hard Skills & Tools
Python, ETL development, data validation, schema enforcement, API development, performance optimization, logging, monitoring, machine learning, statistical modeling
Soft Skills
attention to detail, communication, time management, prioritization, problem-solving, collaboration, proactive mindset, solution-oriented, adaptability, teamwork
Certifications
Bachelor’s degree in Data Science, Bachelor’s degree in Computer Science, Bachelor’s degree in Engineering, public trust clearance