Salary
💰 $170,000 - $180,000 per year
Tech Stack
Apache, AWS, Azure, Cloud, Docker, ETL, Google Cloud Platform, Hadoop, Java, Kafka, Kubernetes, MySQL, Postgres, Scala, Spark, SQL
About the role
- Manage, mentor, and grow a team of data integration engineers; foster a culture of technical excellence, continuous improvement, and ownership.
- Set the strategic direction for data integration efforts and develop/manage the team's roadmap.
- Provide technical guidance and oversight for the design, development, and maintenance of data integration solutions; ensure robustness and scalability.
- Establish and enforce data quality control measures and develop proactive monitoring and alerting systems.
- Serve as the primary point of contact for integrations; collaborate with product management, engineering, customer success, and senior management.
- Implement and refine processes for CI/CD, change management, and incident response for data pipelines; improve support protocols with partners.
- Work with data architects to define data models, schemas, and integration patterns for long-term scalability of the CDP and other data platforms.
- Evaluate and recommend emerging data integration technologies and tools, including AI capabilities to streamline onboarding.
- Act as a hands-on leader: conduct technical research, POCs, and code merges to guide and manage the team.
Requirements
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
- 7+ years of professional experience in software engineering, with at least 3 years in a leadership or management role in the data integration space.
- Hands-on leadership experience managing a team of fewer than 10 engineers, plus experience working with business stakeholders such as Customer Success and Partner Implementation teams.
- Proficiency with integration technologies such as APIs, SFTP, cloud-to-cloud data integration, file transfers, streaming transfers, Kafka, etc.
- Solid understanding of data formats (JSON, XML) and data transformation best practices.
- Deep programming skills in Java/Scala.
- Experience writing SQL queries and extensive experience with relational databases (e.g., MySQL, Postgres).
- Exposure to ETL/ELT pipelines using frameworks like Apache Spark or other enterprise-level tools.
- Proven experience working with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
- Excellent problem-solving, communication, and interpersonal skills.
- Bonus: Experience designing and building large-scale systems; big data technologies (Hadoop, Kafka); Snowflake/Databricks; Docker/Kubernetes; hospitality or travel technology experience.