Salary
💰 $70 - $75 per hour
Tech Stack
Airflow, Amazon Redshift, AWS, Cloud, Django, Docker, ETL, Flask, Google Cloud Platform, GraphQL, Kubernetes, Microservices, MongoDB, Postgres, Python, PyTorch, React, Redis, Redux, SCSS, SQL, SSIS, TensorFlow
About the role
- Design, write, and optimize complex SQL queries for data extraction, transformation, and loading (ETL).
- Develop Python scripts and applications for data processing, automation, and integration.
- Collaborate with data scientists and analysts to build data pipelines and analytical tools.
- Work with Salesforce APIs to extract, transform, and load data into internal systems.
- Support Salesforce data modeling and reporting requirements.
- Ensure data consistency and integrity across Salesforce and other platforms.
- Build and maintain scalable ETL/ELT pipelines using tools such as Airflow.
- Manage and optimize relational databases and data warehouses (e.g., Snowflake, Amazon Redshift).
- Implement data governance, profiling, and validation processes.
- Partner with cross-functional teams including product managers, researchers, and engineers.
- Document technical specifications, data flows, and system architecture.
- Participate in code reviews and contribute to best practices in data engineering.
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in SQL and Python development.
- Hands-on experience with Salesforce data structures and APIs.
- Familiarity with cloud platforms (AWS, GCP) and containerization tools (Docker, Kubernetes).
- Experience with version control systems (Git) and CI/CD pipelines.
- Experience with deep learning frameworks (PyTorch, TensorFlow) is a plus.
- Knowledge of data modeling, big data analytics, and business intelligence tools.
- Strong problem-solving and communication skills.
- Ability to work independently in a fast-paced, dynamic environment.