Effectual

Data Engineer

Full-time

Origin: 🇺🇸 United States


Salary

💰 $132,000 - $166,000 per year

Job Level

Mid-Level, Senior

Tech Stack

Airflow, Amazon Redshift, Apache, AWS, Azure, BigQuery, Cloud, ETL, Hadoop, Informatica, Java, NoSQL, Python, Scala, SDLC, Spark, SQL

About the role

  • As a Data Engineer, you play a key role in building and maintaining the data architecture that supports our business needs. Your work includes developing scalable data pipelines to transform and move data from diverse sources, enabling analysts and data scientists to work with clean, accessible datasets.
  • You'll focus on the secure and efficient management of data systems, ensuring information is stored effectively and retrieved quickly when needed. This includes designing, testing, and maintaining data architectures such as databases, data warehouses, data lakes, and large-scale processing systems.
  • In this role, you'll assemble large, complex datasets to meet both functional and non-functional requirements. Expect to build high-performance algorithms, predictive models, and proofs of concept while using modern programming languages and tools to integrate systems and manage complex data workflows.
  • A day in the life of a Data Engineer starts with building and delivering high-quality data architectures and pipelines that support clients, business analysts, and data scientists. A Data Engineer also interfaces with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources. Effectual Data Engineers continually improve ongoing reporting and processes, and automate or simplify self-service for our clients. They develop, code, and deploy scripts written in Python, as Python is the language of data. All Data Engineers are first and foremost Software Engineers with an understanding of the SDLC process.
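In its simplest form, the extract-transform-load work described above can be sketched in standard-library Python. This is an illustrative toy pipeline, not Effectual's actual tooling; the file names, table schema, and cleaning rules are hypothetical:

```python
import csv
import sqlite3


def extract(csv_path):
    # Extract: read raw rows from a source CSV file (hypothetical source)
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows):
    # Transform: normalize client names and drop rows missing a revenue figure
    cleaned = []
    for row in rows:
        if not row.get("revenue"):
            continue
        cleaned.append({
            "client": row["client"].strip().title(),
            "revenue": float(row["revenue"]),
        })
    return cleaned


def load(rows, db_path):
    # Load: write cleaned rows into a SQLite table analysts can query
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS revenue (client TEXT, revenue REAL)")
    con.executemany("INSERT INTO revenue VALUES (:client, :revenue)", rows)
    con.commit()
    con.close()
```

In production such steps would typically be wrapped as tasks in an orchestrator like Apache Airflow, with the file and database paths replaced by real data-warehouse connections.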

Requirements

  • Bachelor's or master's degree in Computer Science, Engineering, or a related field
  • 4-7 years of experience working as a Data Engineer, preferably in a professional services or consulting environment
  • Strong proficiency in programming languages such as Python, Java, or Scala, with expertise in data processing frameworks and libraries (e.g., Spark, Hadoop, SQL, etc.)
  • Proficiency in designing and implementing ETL processes and data integration workflows using tools like Apache Airflow, Informatica, or Talend
  • In-depth knowledge of database systems (relational and NoSQL), data modeling, and data warehousing concepts
  • Experience with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud) including familiarity with relevant tools and technologies (e.g., S3, Redshift, BigQuery)
  • Familiarity with data governance practices, data quality frameworks, and data security principles
  • Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions
  • An understanding of object-oriented programming
  • AWS background, including AWS CloudFormation and AWS Database Migration Service (DMS)
  • Snowflake or Databricks certifications and/or hands-on-keyboard experience