
Data Engineer III
LexisNexis
Full-time
Location Type: Hybrid
Location: Horsham • Pennsylvania • United States
Salary
💰 $71,600 - $119,400 per year
About the role
- Interface with other technical personnel or team members to document, interpret, and finalize requirements.
- Produce code that is efficient, repeatable, and defect-free, and that adheres to best practices such as naming conventions and encapsulation.
- Write and review portions of detailed specifications for the development of data components.
- Complete data engineering bug fixes and issues, researching and identifying root causes as appropriate.
- Identify opportunities to apply automation or other tools to improve effectiveness or efficiency.
- Work closely with other development team members to understand product requirements and translate them into data engineering and/or data management designs.
- Innovate process improvements that enable efficient delivery and maintenance.
- Participate in the development processes, coding best practices, and code reviews.
- Oversee management of specific databases, ensuring structure and data flow adhere to department standards.
- Utilize various data workflow management and analysis tools.
- Maintain deep familiarity with a specific data content area.
- Participate in process improvement and compliance efforts to consistently deliver high-quality services on time and to specification, reacting quickly to changes in priorities or circumstances to meet business needs.
- Complete simple data engineering bug fixes and resolve technical issues as necessary.
- Work closely with other engineering team members to understand data and translate requirements.
- Operate in various development environments (Agile, etc.) while collaborating with key stakeholders.
- Keep abreast of new technology developments.
- All other duties as assigned.
Requirements
- Bachelor’s Degree (Engineering/Computer Science preferred but not required) or equivalent experience required.
- Hands-on experience in Azure Synapse Analytics, Azure Databricks, or Hadoop environments.
- Minimum 3 years of strong experience developing Python-based Spark code for large-scale data processing.
- Solid understanding of Big Data concepts, distributed computing, and data lake-based medallion architecture.
- Strong communication and collaboration skills to work with cross-functional teams.
- Strong proficiency in Python for data engineering, ETL, and automation tasks.
- Good working knowledge of Azure services, including Azure Data Factory (ADF) or Synapse Pipelines, Azure Data Lake Storage (ADLS), and Azure Key Vault/Secrets Management.
- Strong experience writing SQL.
- Experience working in both Linux and Windows environments for development and deployment.
- Solid working knowledge of Git with Azure DevOps (Repos, Pipelines).
- Experience deploying code to higher environments using CI/CD pipelines & DevOps practices in Azure.
- Ability to troubleshoot, optimize, and monitor data pipelines in production.
- Exposure to Data Warehouses like AWS Redshift, Azure Dedicated SQL Pool, Netezza, etc.
- Familiarity with data security, governance, and compliance in cloud environments.
- Exposure to monitoring and logging frameworks for data workloads.
- Exposure to Microsoft Fabric.
- Knowledge of industry best practices (e.g., code coverage).
- Knowledge of software development methodologies (e.g., Agile).
- Good documentation skills.
- Attention to detail.
- Strong oral and written communication skills.
Benefits
- Comprehensive, multi-carrier program for medical, dental and vision benefits
- 401(k) with match and an Employee Share Purchase Plan
- Wellness platform with incentives, Headspace app subscription, Employee Assistance and Time-off Programs
- Short- and Long-Term Disability, Life and Accidental Death Insurance, Critical Illness, and Hospital Indemnity
- Family Benefits, including bonding and family care leaves, adoption and surrogacy benefits
- Health Savings, Health Care, Dependent Care and Commuter Spending Accounts
- Up to two days of paid leave each to participate in Employee Resource Groups and to volunteer with your charity of choice
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, Spark, SQL, ETL, Azure Synapse Analytics, Azure Databricks, Hadoop, Git, CI/CD, data pipeline optimization
Soft Skills
communication, collaboration, attention to detail, documentation, process improvement, flexibility, problem-solving, innovation, teamwork, adaptability
Certifications
Bachelor’s Degree in Engineering, Bachelor’s Degree in Computer Science