Leadtech Group

Data Engineer – Fintech

Full-time

Location Type: Remote

Location: Spain

About the role

  • Design, build, and maintain scalable data pipelines in AWS to support operational and analytical use cases.
  • Define and enforce best practices for data ingestion, cataloging, and lineage across our cloud infrastructure (AWS S3, Glue, EMR, Lambda, etc.).
  • Develop and maintain real-time processing applications using Kafka (Producers, Consumers, Streams API) or similar technologies to aggregate, filter, and enrich streaming data from multiple sources (see the illustrative sketch after this list).
  • Define data schemas, partitioning strategies, and access patterns optimized for performance and cost.
  • Collaborate with development and analytics teams to understand and fulfill the company's data requirements.
  • Implement monitoring and alerting mechanisms to ensure the integrity and availability of data streams.
  • Work with the operations team to optimize the performance and efficiency of the data infrastructure.
  • Automate management and maintenance tasks of the infrastructure using tools such as Terraform, Ansible, etc.
  • Stay updated on best practices and trends in data architectures, especially in the realm of real-time data ingestion and processing.
  • Monitor and troubleshoot data workflows using tools such as CloudWatch, Prometheus, or Datadog—proactively identifying bottlenecks, ensuring pipeline reliability, and handling incident response when necessary.
  • Ensure data quality and performance.
  • Define and test disaster recovery plans (multi-region backups, Kafka replication, Snowflake Time Travel) and collaborate with security/infra teams on encryption, permissions, and compliance.
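
To make the streaming responsibility above concrete, here is a minimal sketch of the kind of filter-and-enrich topology described, written against the Kafka Streams API in Java. The topic names ("payments.raw", "payments.enriched"), the plain-string serdes, and the timestamp enrichment step are illustrative assumptions, not details taken from this posting.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Illustrative sketch only: hypothetical topic names and a toy enrichment step.
public class PaymentEnrichmentSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-enrichment-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw payment events, drop empty records, tag each surviving
        // record with a processing timestamp, and write to an output topic.
        KStream<String, String> rawPayments = builder.stream("payments.raw");
        rawPayments
            .filter((key, value) -> value != null && !value.isBlank())
            .mapValues(value -> value + ",processed_at=" + System.currentTimeMillis())
            .to("payments.enriched");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In practice the plain-string values would typically be replaced with a typed serde (for example Avro with a schema registry) and explicit error handling, but the shape of the topology stays the same.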

Requirements

  • Bachelor's degree in Computer Science, Software Engineering, or a related field (equivalent experience is valued).
  • At least 3 years of programming experience with Java and/or Python.
  • Experience in data engineering design and delivery with cloud-based data warehouse technologies, in particular Snowflake, Redshift, or BigQuery.
  • Experience with a wide range of database technologies such as DynamoDB, Postgres, and MongoDB.
  • Development experience with cloud services, especially Amazon Web Services (AWS).
  • Demonstrable experience in designing and implementing data pipeline architectures based on Kafka in cloud environments, preferably AWS.
  • Deep understanding of distributed systems and high availability design principles.
  • Experience in building and optimizing data pipelines with technologies such as Apache Kafka, Apache Flink, and Apache Spark, including real-time processing frameworks like Flink or Spark Streaming.
  • Excellent communication and teamwork skills.
  • Ability to independently and proactively solve problems.
  • Extra bonus if you have:
    • Experience with other streaming platforms such as Apache Pulsar or RabbitMQ.
    • Experience with database administration and performance tuning standards.
    • Familiarity with data lake architectures and technologies such as Amazon S3, Apache Hadoop, or Apache Druid.
    • Relevant certifications in cloud platforms such as AWS.
    • Understanding of serverless architecture and event-driven systems.
    • Previous professional experience in FinTech / online payment flows.
    • Experience with data visualization tools like Tableau, Power BI, or Apache Superset.
    • Understanding of machine learning concepts and frameworks for real-time data analytics.
    • Previous experience in designing and implementing data governance and compliance solutions.

Benefits

  • Competitive compensation package, including health insurance and performance bonuses.
  • Opportunities for professional growth and development in a high-growth fintech environment.
  • Collaborative and innovative culture focused on making an impact in the global payments industry.
  • Flexible working environment with support for work-life balance.
  • Full remote work.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data pipeline design, AWS, Java, Python, Kafka, Snowflake, Redshift, BigQuery, Apache Flink, Apache Spark
Soft Skills
communication, teamwork, problem-solving, collaboration, proactive
Certifications
AWS certification