
Big Data Tech Lead – Vice President
Citi
Full-time
Location Type: Hybrid
Location: Tampa • Florida • 🇺🇸 United States
Salary
💰 $113,840 - $170,760 per year
Job Level
Senior
Tech Stack
Airflow, Apache, Docker, ETL, Hadoop, Kafka, Kubernetes, Oracle, Postgres, Python, Redis, Scala, Spark, SQL
About the role
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and to identify and define necessary system enhancements to deploy new products and process improvements.
- Resolve a variety of high-impact problems and projects through in-depth evaluation of complex business processes, system processes, and industry standards.
- Provide subject matter expertise and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint.
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation.
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals.
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions.
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary.
- Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients and assets.
Requirements
- Design, build, and maintain scalable ETL/ELT pipelines to ingest, transform, and load data from multiple sources (a minimal PySpark sketch follows this list).
- Develop and manage large-scale data processing systems using frameworks like Apache Spark, Hadoop, and Kafka.
- Proficiency in programming languages such as Python or Scala.
- Strong expertise in data processing frameworks such as Apache Spark and Hadoop.
- Expertise in Data Lakehouse technologies (Apache Iceberg, Apache Hudi, Trino).
- Expertise in SQL and database technologies (e.g., Oracle, PostgreSQL, etc.).
- Expertise with the data orchestration tool Apache Airflow is mandatory (a minimal Airflow DAG sketch follows this list).
- Familiarity with containerization (Docker, Kubernetes) is a plus.
- Experience with distributed caching solutions (Hazelcast or Redis).
- Prior experience with building distributed, multi-tier applications is highly desirable.
- Experience building highly performant and scalable applications is a plus.
- Bachelor’s degree/University degree or equivalent experience.
- Master’s degree preferred.
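To illustrate the kind of ETL/ELT work described above, here is a minimal PySpark sketch. It is a hypothetical example, not a Citi pipeline: the application name, paths, and column names (event_id, event_ts) are placeholders.

```python
# Hypothetical PySpark ETL sketch: read raw events, clean them, and write a
# partitioned table. Paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_etl").getOrCreate()

# Extract: read raw JSON events from a landing zone (placeholder path).
raw = spark.read.json("s3://landing-zone/events/2024-01-01/")

# Transform: drop malformed rows, normalize timestamps, derive a partition column.
clean = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream consumers (placeholder path).
clean.write.mode("overwrite").partitionBy("event_date").parquet("s3://warehouse/events_clean/")

spark.stop()
```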
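Since Apache Airflow expertise is called out as mandatory, here is an equally minimal DAG sketch that schedules the job above once a day. It assumes Airflow 2.4+ with spark-submit available on the worker; the DAG id, schedule, and script path are hypothetical.

```python
# Hypothetical Airflow 2.x DAG sketch: run the Spark ETL job above daily.
# DAG id, schedule, and script path are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older 2.x releases use schedule_interval
    catchup=False,
) as dag:
    # Submit the PySpark job via spark-submit (placeholder script path).
    run_spark_etl = BashOperator(
        task_id="run_spark_etl",
        bash_command="spark-submit /opt/jobs/daily_events_etl.py",
    )
```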
Benefits
- medical, dental & vision coverage
- 401(k)
- life, accident, and disability insurance
- wellness programs
- paid time off packages, including planned time off (vacation), unplanned time off (sick leave), and paid holidays.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL, ELT, Apache Spark, Hadoop, Kafka, Python, Scala, SQL, Apache Airflow, Data Lakehouse
Soft skills
problem solving, analytical thinking, coaching, risk assessment, communication