Tech Stack
Airflow, Cloud, ETL, JavaScript, Kafka, Python, Spark, SQL, Tableau, Terraform
About the role
- Serve as an expert in gathering actionable requirements and executing against them in an agile environment
- Collaborate closely with business stakeholders to gather requirements, understand reporting needs, and translate them into effective data solutions
- Become deeply familiar with all relevant data sources used in projects—including where data is stored, associated data dictionaries, and dependencies
- Investigate and assess existing reports, converting them into Power BI dashboards or enhancing them as needed
- Design and develop new reports and dashboards that provide meaningful insights and support decision-making
- Define and develop metrics to measure operational health and excellence across organizations and initiatives
- Conduct deep-dive analyses using SQL and BI tools to identify trends, patterns, and insights
- Build interactive dashboards and reports using Power BI, Tableau, and/or JavaScript-based tools
- Maintain and improve the integrity and reliability of internal and external data sources; collaborate with stakeholders to capture additional data and ensure accuracy and consistency
- Regularly prepare reports and presentations that summarize findings, provide actionable insights, and track the impact of implemented strategies
- Effectively communicate insights and recommendations to influence decision-making and drive growth
Requirements
- Bachelor’s degree
- 3+ years of experience with SQL
- 3+ years of experience with BI tools such as Tableau, Power BI, or Looker
- Intermediate coding skills in Python and JavaScript
- Advanced understanding of database structures, query optimization, and ETL development
- Experience working with cloud infrastructure
- Experience with data orchestration tools like Airflow, transformation frameworks like dbt, and infrastructure-as-code tools like Terraform
- Demonstrated exceptional oral and written communication skills
- Ability to work independently and in a team environment
- Ability to work effectively across functions, levels, and disciplines
- Strong problem-solving and critical-thinking skills
- Superior teamwork skills and a desire to learn, contribute, and explore
- Experience with Snowflake, Databricks, Kafka, Flume, Spark, or Flink is a plus
- Familiarity with Agile methodologies and working within Agile Teams
- Experience with Jira for tracking work