DKSH Portugal, Unipessoal, Lda.

Senior Data Engineer

Full-time

Location Type: Hybrid

Location: Lisbon, Portugal

About the role

  • Design, develop, and maintain scalable data processing pipelines and workflows using frameworks such as Apache Spark, PySpark, and Apache Beam.
  • Build and maintain microservices in Python that serve data-driven features in production.
  • Develop internal tools to support CI/CD pipelines, experiment tracking, and data versioning.
  • Collect, process, and integrate large datasets from multiple sources, including databases, file systems, and APIs.
  • Ensure data integrity, consistency, and quality through robust validation and monitoring processes.
  • Optimize data systems for performance, scalability, and high availability.
  • Implement best practices for data security, access control, and privacy.
  • Collaborate with data scientists, analysts, and engineers to support analytics and ML workflows.
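For context only (this sketch is not part of the posting), the data integrity and validation responsibility above could look something like the following minimal Python example. All names here (`validate_records`, `REQUIRED_FIELDS`) are hypothetical and purely illustrative.

```python
# Hypothetical sketch of a record-validation step for a data pipeline.
# Records missing required fields, or with a non-numeric amount, are
# routed to a rejected list along with a reason for monitoring.

REQUIRED_FIELDS = {"id", "timestamp", "amount"}

def validate_records(records):
    """Split records into (valid, rejected) lists based on simple checks."""
    valid, rejected = [], []
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            rejected.append((rec, f"missing fields: {sorted(missing)}"))
        elif not isinstance(rec["amount"], (int, float)):
            rejected.append((rec, "amount is not numeric"))
        else:
            valid.append(rec)
    return valid, rejected

valid, rejected = validate_records([
    {"id": 1, "timestamp": "2024-01-01T00:00:00Z", "amount": 9.5},
    {"id": 2, "timestamp": "2024-01-01T00:01:00Z"},  # missing "amount"
])
```

In a production pipeline of the kind described here, checks like these would typically run inside a Spark or Beam transform rather than plain Python, with rejected records sent to a dead-letter sink for monitoring.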

Requirements

  • You have a solid academic background in Computer Science, Engineering, or related fields.
  • You are passionate about data, enjoy working in fast-paced, collaborative environments, and thrive on solving complex problems.
  • You are an innovator who thinks about how data technology can unlock new product opportunities.
  • You understand the ecosystem of data technologies, such as data governance and data integration software.
  • You can manage interactions with potential customers.
  • 5+ years of professional experience in software engineering or data engineering.
  • Strong software engineering skills with Python in large-scale, high-performance production environments.
  • Hands-on experience with Spark/PySpark and other big data frameworks.
  • Expertise in data modeling and working with both structured and unstructured data.
  • Hands-on experience with streaming data platforms, particularly Apache Kafka.
  • Strong understanding of distributed systems and modern data architectures.
  • Experience working with cloud platforms, preferably GCP (BigQuery, Dataflow, Pub/Sub, Dataproc).
  • Excellent problem-solving and communication skills.
Nice to have

  • Experience with Databricks and real-time data processing frameworks.
  • Experience with NoSQL databases (e.g., Redis, Neo4j) and data lakes.
  • Knowledge of ML workflows and algorithms.
  • Exposure to other cloud platforms (AWS, Azure) and relevant certifications.
  • Familiarity with ETL/integration tools (Talend, Airflow, dbt, etc.).
  • Hands-on experience with version control (Git) and CI/CD pipelines.

Benefits

  • Office Snacks and Activities: Fuel your work with various snacks and enjoy fun activities that keep our team spirit high. Whether it's a darts match, board games, or yoga, we believe a happy team is productive.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python, Apache Spark, PySpark, Apache Beam, data modeling, streaming data platforms, Apache Kafka, NoSQL databases, data lakes, CI/CD
Soft skills
problem-solving, communication, collaboration, innovation, adaptability