
Data Engineer – Tech Lead
Jones Lang LaSalle Americas, Inc.
Full-time
Location Type: Hybrid
Location: Tel Aviv • Israel
About the role
- Design and implement robust, scalable data pipelines using Databricks, Apache Spark, Delta Lake, and BigQuery (an illustrative sketch follows this list)
- Use SQL and Python to develop, scale, and optimize advanced data pipelines
- Build and optimize ETL/ELT processes for capital markets data
- Develop real-time and batch processing solutions to support trading and risk management operations
- Implement data quality monitoring, validation, and alerting systems
- Configure and optimize Databricks workspaces, clusters, and job scheduling
- Work in a multi-cloud environment including Azure, GCP, and AWS
- Implement security best practices including access controls, encryption, and audit logging
- Build integrations with market data vendors, trading systems, and risk management platforms
- Establish monitoring and performance tuning for data pipeline health and efficiency
- Collaborate with various stakeholders across the company and support business insight requests
- Work closely with quantitative researchers, risk analysts, and product teams to understand data requirements
- Collaborate with other data engineering teams and infrastructure groups
- Provide technical guidance to junior engineers and contribute to code reviews
- Participate in architecture discussions and technology selection decisions
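
For illustration only, here is a minimal sketch of the kind of batch ETL work described above, written in PySpark against Delta Lake. The landing path, target path, column names, and quality rule are hypothetical and not part of the role description; it assumes a Databricks-style environment where the Delta Lake libraries are already available.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: paths, columns, and the quality rule are hypothetical.
spark = SparkSession.builder.appName("trades-etl-sketch").getOrCreate()

# Extract: read raw capital-markets trade data from a hypothetical landing zone.
raw = spark.read.json("/mnt/landing/trades/")

# Transform: basic typing plus a derived notional column.
trades = (
    raw.withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .withColumn("notional", F.col("price") * F.col("quantity"))
)

# Data-quality check: count rows that fail a simple validity rule and flag them.
bad_rows = trades.filter(F.col("notional").isNull() | (F.col("notional") <= 0)).count()
if bad_rows > 0:
    print(f"data-quality warning: {bad_rows} rows failed the notional check")

# Load: append valid rows to a Delta table at a hypothetical curated path.
(trades.filter(F.col("notional") > 0)
       .write.format("delta")
       .mode("append")
       .save("/mnt/curated/trades_delta"))
```

In practice a job like this would be scheduled and monitored through Databricks Workflows, with the quality check feeding an alerting system rather than a print statement.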
Requirements
- Data engineering experience with strong expertise in Apache Spark and distributed computing
- Strong programming skills in Python, including data-handling libraries (pandas, NumPy, etc.), and experience with data modeling
- Proficient in Databricks platform and Delta Lake for data lake architecture
- Advanced SQL skills, including writing and optimizing complex queries
- Experience with both relational and NoSQL databases
- Experience with cloud-based data warehouses (Google BigQuery, Snowflake)
- Bachelor's degree in Computer Science, Engineering or related field
- Experience integrating multiple data sources and working with a variety of database technologies
- Experience with Azure cloud platform and associated data services (Data Factory, Event Hubs, Storage) (preferred)
- Experience with EKS / AKS (preferred)
- Knowledge of data streaming platforms (Kafka, Azure Event Hubs) for real-time processing (preferred); a streaming sketch follows this list
- Experience working in a multi-cloud environment (preferred)
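
As a hedged illustration of the real-time side referenced above, the sketch below reads a market-data topic with Spark Structured Streaming from Kafka and writes it to a Delta table. The broker address, topic name, schema, and paths are assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Illustrative only: broker address, topic, schema, and paths are hypothetical.
spark = SparkSession.builder.appName("quotes-stream-sketch").getOrCreate()

schema = StructType([
    StructField("symbol", StringType()),
    StructField("bid", DoubleType()),
    StructField("ask", DoubleType()),
])

# Read a market-data topic from Kafka and parse the JSON payload.
quotes = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "quotes")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("q"))
         .select("q.*")
)

# Write the parsed stream to a Delta table; the checkpoint gives exactly-once recovery.
query = (
    quotes.writeStream.format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/quotes")
          .outputMode("append")
          .start("/mnt/curated/quotes_delta")
)
```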
Benefits
- Health insurance
- Retirement plans
- Paid time off
- Flexible work arrangements
- Professional development
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Apache Spark, Python, SQL, ETL, ELT, Databricks, Delta Lake, data modeling, data handling libraries, data quality monitoring
Soft skills
collaboration, technical guidance, communication, problem-solving, stakeholder engagement
Certifications
Bachelor's degree in Computer Science, Bachelor's degree in Engineering