Tech Stack
Airflow, Amazon Redshift, AWS, Cloud, ETL, Python, Scala, scikit-learn, SQL, TensorFlow
About the role
- Architect next-generation business intelligence solutions combining traditional analytics with AI-powered insights.
- Design and maintain intelligent dashboards, automated reports, and self-service analytics platforms using Looker (LookML), Power BI, and AI-native BI tools.
- Build and optimize real-time data models, ETL/ELT pipelines, and streaming analytics architectures for scalable reporting and analytics.
- Implement automated data quality monitoring, anomaly detection, and intelligent alerting systems.
- Integrate large language models (LLMs) and generative AI tools for natural language querying and automated insight generation.
- Develop predictive models, recommendation engines, and forecasting systems embedded within BI workflows.
- Use AI coding assistants (e.g., GitHub Copilot, Cursor) for development, automated testing, and code optimization.
- Apply modular code design and version-controlled development workflows for scalable, maintainable analytics solutions.
- Collaborate with cross-functional teams to identify AI-enhanced decision-making opportunities and autonomous business processes.
- Lead the transition from static reporting to dynamic, AI-driven analytics, and mentor junior engineers.
- Adapt to changing business needs, priorities, and technologies.
Requirements
- 3+ years of experience in BI development, including integration of AI/ML tools into analytics workflows.
- Advanced SQL proficiency and hands-on experience with modern data platforms (e.g., Databricks, Amazon Redshift, Amazon S3, cloud-based data lakes).
- Proficiency with BI tools such as Looker and Power BI, including AI-enhanced features like Power BI smart visuals and machine learning integration.
- Hands-on experience with the modern data stack and orchestration tools (e.g., dbt, Airflow, Prefect), including reverse ETL, data mesh architectures, and real-time personalization.
- Experience building and optimizing scalable analytics solutions for high-volume data processing in integrated cloud environments.
- Hands-on experience with machine learning frameworks (e.g., scikit-learn, TensorFlow) and deployment tools (e.g., MLflow).
- Hands-on experience with generative AI tools and prompt engineering, applied to code generation, data analysis, and automated insight discovery within AI-assisted development workflows.
- Hands-on experience with cloud-native data and AI services (e.g., Amazon Bedrock, Databricks AI/Genie), with the ability to communicate insights effectively to non-technical stakeholders.
- Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, or a related quantitative field (preferred).