
Data Engineer
Sagent
Full-time
Location: Remote • 🇺🇸 United States
Job Level
Mid-Level, Senior
Tech Stack
Airflow, Azure, Cloud, ETL, Google Cloud Platform, Postgres, PySpark, Python, SDLC, Spark, SQL
About the role
- Sagent is seeking an experienced Data Engineer to join our Dara product(s) team.
- Responsible for leading the design and implementation of comprehensive data solutions.
- Leverage extensive experience and expertise in data management to drive the development of data collection frameworks, infrastructure systems, and data standards to meet business needs.
- Apply data extraction, transformation, and loading techniques to connect large datasets from diverse sources, ensuring data consistency and integrity.
- Develop and maintain infrastructure systems, including data warehouses and data lakes, and data access APIs.
- Define data standards, governance routines, and data quality monitoring controls to ensure compliance with business needs and regulatory requirements.
- Implement processes and procedures to maintain data quality and consistency across the organization.
- Research current industry, market, and technology trends to identify opportunities for innovation and business enablement.
- Apply knowledge and experience to drive the adoption of new technologies and methodologies to improve data architecture and management practices.
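The responsibilities above center on extract-transform-load (ETL) work: connecting datasets from diverse sources while enforcing consistency and quality controls. As a rough illustration of that pattern (a minimal sketch using Python's standard-library sqlite3; the table names, columns, and sample data are invented for the example and are not from the posting):

```python
import sqlite3

# Extract: read raw records from a hypothetical source table.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_loans (loan_id TEXT, amount TEXT, state TEXT)")
src.executemany(
    "INSERT INTO raw_loans VALUES (?, ?, ?)",
    [("L-1", "250000", "tx"), ("L-2", "bad", "CA"), ("L-1", "250000", "tx")],
)
rows = src.execute("SELECT loan_id, amount, state FROM raw_loans").fetchall()

# Transform: enforce consistency and integrity (numeric amounts,
# uppercase state codes, duplicate removal).
clean, seen = [], set()
for loan_id, amount, state in rows:
    if loan_id in seen:
        continue  # drop duplicate source records
    try:
        amt = float(amount)
    except ValueError:
        continue  # reject rows that fail the data-quality check
    seen.add(loan_id)
    clean.append((loan_id, amt, state.upper()))

# Load: write the conformed rows to a hypothetical target table.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE loans (loan_id TEXT PRIMARY KEY, amount REAL, state TEXT)")
tgt.executemany("INSERT INTO loans VALUES (?, ?, ?)", clean)
print(tgt.execute("SELECT COUNT(*) FROM loans").fetchone()[0])
```

In production this logic would typically run in PySpark against a warehouse or lake and be scheduled by an orchestrator such as Airflow, but the extract/transform/load split is the same.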
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a relevant discipline, with 3-5 years of experience as a data engineer, developer, data warehouse consumer, and/or in data-related operations.
- Proficiency in applying a theoretical knowledge base to achieve goals through independent work.
- Experience in data extraction, transformation, and loading (ETL) techniques to connect large datasets from various sources.
- Demonstrated ability to create data collection frameworks for structured and unstructured data.
- Proficiency in developing and maintaining infrastructure systems, such as data warehouses and data lakes, including data access APIs.
- Strong understanding of data standards, governance routines, and data quality monitoring controls to meet business needs.
- Experience applying a structured methodology for delivering new or enhanced products to the marketplace.
- Proficient in refining project goals and tactics to achieve those goals; understands the value of development management and project management frameworks and best practices.
- 3+ years of working experience with, or exposure to, Azure, GCP, or other cloud development and deployment practices.
- 3+ years of experience with deployments via DevOps CI/CD pipelines.
- Proficient in database development across PostgreSQL, SQL Server, and Snowflake environments.
- Working knowledge of Python and PySpark, and strong PL/SQL skills.
- Working knowledge of orchestration/scheduling tools such as Airflow is an added advantage.
- Hands-on experience with Spark/Python job management through services such as Dataproc is an added advantage.
- Knowledge of data integration concepts, source-to-target mapping techniques, and requirements documentation.
- Highly skilled in managing multiple projects of varying size and complexity simultaneously across all software development life cycle (SDLC) phases.
- Understanding of Agile development methodologies and the software project life cycle.
- Experience with standard systems analysis tools and procedures.
- Ability to prioritize and handle multiple projects.
- Excellent problem solving, analytical, and customer service skills.
- Excellent verbal and written communication skills with the ability to establish deep understanding of client’s business issues.
Benefits
- Remote/Hybrid workplace options
- Health Benefits
- Unlimited Flexible Time Off
- Family Planning Services
- Tuition Reimbursement
- Paid Family Leave
- 401(k) Matching
- Pet Insurance
- In-person and Virtual Social Experiences
- Career Pathing
- Focus Time Fridays
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data extraction, data transformation, data loading (ETL), data collection frameworks, data warehouses, data lakes, data access APIs, database development, PostgreSQL, SQL Server
Soft skills
problem solving, analytical skills, customer service skills, verbal communication, written communication, project management, ability to prioritize, independent work, innovation, collaboration