
Analytics Engineer
Mars United Commerce
full-time
Location Type: Hybrid
Location: Atlanta • 🇺🇸 United States
Salary
💰 $72,390 - $90,440 per year
Job Level
Mid-Level, Senior
Tech Stack
Apache, AWS, Azure, Cloud, ETL, Hadoop, Python, PyTorch, Scikit-Learn, Spark, SQL, TensorFlow
About the role
- Develop and maintain data pipelines and ETL processes (a minimal ETL sketch follows this list)
- Optimize data infrastructure for efficient data processing
- Ensure data quality and accessibility for data scientists and analysts
- Collaborate with cross-functional teams to address data needs and challenges
- Implement data governance and security best practices
- Support annual planning initiatives with clients
- Work closely with cross-functional teams, including analysts, product managers, and domain experts, to understand business requirements, formulate problem statements, and deliver relevant data science solutions
- Develop and optimize machine learning models by extracting, processing, and analyzing data from various internal and external data sources
- Develop supervised, unsupervised, and semi-supervised machine learning models using state-of-the-art techniques to solve client problems
- Show up - be accountable, take responsibility, and get back up when you are down
- Make stuff
- Share so others can see what’s happening
- Establish scalable, intuitive reporting methodologies in Power BI, recommending the most effective representations and visualizations
- Identify business intelligence needs, recommending the best KPIs, customer valuation models, and dashboards
- Interpret data, analyze results, and identify trends or patterns in complex data sets
- Filter and “clean” data and review computer reports, printouts, and performance indicators to locate and correct data corruption problems
- Automate data pipelines and develop automation workflows
- Develop a Single Customer View by stitching first-party (1P) data from various data sources
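As a rough illustration of the pipeline and ETL work described above, here is a minimal sketch in Python using pandas with SQLite as a stand-in warehouse. The file name, column names, and table name are hypothetical assumptions, not details of Mars United Commerce's actual stack.

```python
# Minimal ETL sketch: extract a CSV of hypothetical order data, clean it,
# and load it into a local SQLite table standing in for a data warehouse.
# The file, columns, and table below are illustrative assumptions.
import sqlite3

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    """Read raw order data from a CSV export."""
    return pd.read_csv(path, parse_dates=["order_date"])


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleaning: drop duplicates, standardize fields, filter bad rows."""
    df = df.drop_duplicates(subset="order_id")
    df["customer_email"] = df["customer_email"].str.strip().str.lower()
    df = df[df["order_total"] >= 0]  # drop corrupt negative totals
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    return df


def load(df: pd.DataFrame, db_path: str = "warehouse.db") -> None:
    """Append the cleaned rows to a warehouse-style fact table."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql("fact_orders", conn, if_exists="append", index=False)


if __name__ == "__main__":
    load(transform(extract("orders_export.csv")))
```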
Requirements
- A bachelor’s/master’s degree in data analytics, computer science, or a directly related field
- 3-5 years of industry experience in a data analytics or related role
- Proficiency in SQL for data querying and manipulation
- Experience with data warehousing solutions
- Ability to design, implement, and manage ETL workflows that ensure data is accurately and efficiently collected, transformed, and loaded into the data warehouse
- Proficiency in programming languages such as Python and R
- Experience with cloud platforms such as AWS, Azure, and Google Cloud
- Experience in developing and deploying machine learning models (a minimal modeling sketch follows this list)
- Knowledge of machine learning engineering practices, including model versioning, deployment, and monitoring
- Familiarity with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn)
- Ability to design and develop scalable data pipelines for batch and real-time data processing
- Experience with big data technologies such as Apache Spark, Hadoop, or similar
- Proficiency in working with structured and unstructured data sources
- Knowledge of data governance and security best practices
- Strong understanding of data modeling techniques and best practices
- Experience with DevOps or MLOps practices for continuous integration and deployment
- Data collection, setting up and leveraging DMP- and CDP-based infrastructures, attribution modeling, A/B and multivariate testing, and dynamic creative
- Develop, evaluate, test, and maintain architectures and data solutions such as ETL pipelines, data warehouses, data marts, etc.
- Develop scalable and intuitive ETL and ELT pipelines from a variety of marketing sources such as Salesforce, Adobe Analytics, etc.
- Identify data sources and create data pipelines using shell scripts or Python scripts
- Create technical documentation
- Plan data analysis work and develop execution estimates, continuously improving the accuracy of the estimates
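To illustrate the supervised modeling called out in these requirements, below is a minimal scikit-learn sketch. The synthetic data, feature count, and gradient-boosting model are illustrative assumptions; an actual client engagement would use warehouse-derived features and whatever model family fits the problem.

```python
# Minimal supervised-model sketch with scikit-learn. Synthetic data stands in
# for customer-level features (e.g., recency, frequency, spend) that would
# normally come from the warehouse tables built by the ETL pipelines above.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Generate a synthetic binary-classification problem (hypothetical stand-in).
X, y = make_classification(
    n_samples=5_000, n_features=12, n_informative=6, random_state=42
)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Fit a gradient-boosting classifier, a common baseline for propensity problems.
model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate on the hold-out set with AUC, a typical metric for conversion models.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```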
Benefits
- Comprehensive group health plans
- A parental leave program that includes paid maternity and paternity benefits for pregnancy, adoption, and surrogacy
- Flexible paid time off
- A broad and confidential employee assistance program
- Ongoing wellness support initiatives
- Trusted financial health advice and guidance
- Promotion of education through tuition support and assistance
- A flexible and supportive work environment and culture
- Temporary roles may be eligible to participate in our freelancer/temporary employee medical plan through a third-party benefits administration system once certain criteria have been met
- Temporary roles may also qualify for participation in our 401(k) plan after eligibility criteria have been met
- For regular roles, the Company will offer medical coverage, dental, vision, disability, 401(k), and paid time off.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
SQL, Python, R, ETL, data warehousing, machine learning, Apache Spark, Hadoop, Power BI, data modeling
Soft skills
collaboration, accountability, responsibility, communication, problem-solving, analytical thinking, attention to detail, adaptability, creativity, time management
Certifications
bachelor's degree in data analytics, master's degree in computer science