Quantiphi

Associate Data Architect – Snowflake

Full-time

Location Type: Office

Location: Bengaluru, India

About the role

  • Work with business users and other stakeholders to understand business processes.
  • Design and implement dimension and fact tables.
  • Identify and implement data transformation/cleansing requirements
  • Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
  • Develop conceptual, logical and physical data models with associated metadata including data lineage and technical data definitions
  • Design, develop and maintain ETL workflows and mappings using the appropriate data load technique.
  • Provide research, high-level design and estimates for data transformation and data integration from source applications to end-user BI solutions.
  • Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
  • Analyze and resolve problems and provide technical assistance as necessary.
  • Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
  • Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
  • Leverage transactional data from ERP, CRM, and HRIS applications to model, extract, and transform it for reporting and analytics.
  • Build transformation jobs on Snowflake that transform source data into the target schemas used by BI tools (see the sketch after this list).
  • Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; and continuously validate reports and dashboards, suggesting improvements.
  • Train business end-users, IT analysts, and developers.
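
Much of the pipeline and transformation work listed above boils down to scheduled SQL executed against Snowflake. The sketch below shows one such job using the snowflake-connector-python package; the connection settings and object names (STG_ORDERS, DIM_CUSTOMER, FCT_ORDERS, TRANSFORM_WH) are hypothetical placeholders, not details from this posting.

```python
import os
import snowflake.connector

# Upsert staged orders into the fact table, resolving the customer surrogate
# key from the dimension. All object names here are hypothetical examples.
MERGE_FACT_SQL = """
MERGE INTO FCT_ORDERS tgt
USING (
    SELECT o.order_id,
           d.customer_key,
           o.order_date,
           o.order_amount
    FROM STG_ORDERS o
    JOIN DIM_CUSTOMER d ON d.customer_id = o.customer_id
) src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.customer_key = src.customer_key,
    tgt.order_date   = src.order_date,
    tgt.order_amount = src.order_amount
WHEN NOT MATCHED THEN INSERT (order_id, customer_key, order_date, order_amount)
    VALUES (src.order_id, src.customer_key, src.order_date, src.order_amount)
"""

def run_transformation() -> None:
    # Credentials are read from the environment rather than hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",  # hypothetical virtual warehouse
        database="EDW",
        schema="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        cur.execute(MERGE_FACT_SQL)
        print(f"Rows affected: {cur.rowcount}")
    finally:
        conn.close()

if __name__ == "__main__":
    run_transformation()
```

Using MERGE keeps the load idempotent: re-running the job updates existing fact rows instead of inserting duplicates.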

Requirements

  • Bachelor’s degree in Computer Science or similar field or equivalent work experience.
  • Experience with the Snowflake warehouse and with developing applications using SnowSQL, Snowpipe, JavaScript UDFs, and stored procedures is a must-have.
  • Strong SQL skills are a must-have.
  • Hands-on experience with Big Data, Python, and Redshift/BigQuery warehouses.
  • Knowledge of data ingestion and engineering platforms is a must-have.
  • Good understanding of and experience with Data Warehousing or Data Integration projects is a must-have.
  • Good knowledge of at least one ETL tool and experience creating data pipelines is a must-have.
  • IICS/Informatica knowledge is good to have.
  • Knowledge of the Insurance domain is good to have.
  • Expert in data warehousing standards, strategies, and tools.
  • Expert in SDLC processes.
  • Strong knowledge of relational databases, preferably Oracle and SQL Server.
  • Knowledge of Python and UNIX/Linux shell scripting is a must-have.
  • Strong problem-solving, multitasking, and organizational skills.
  • Good written and verbal communication skills.
  • Demonstrated experience leading a team spread across multiple locations.
  • Migration experience from on-premises sources to Snowflake.
  • Experience with data ingestion into Snowflake, such as Snowpipe and bulk COPY commands (see the sketch after this list).
  • Experience with data modeling techniques such as denormalized, star, and snowflake schemas.
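
On the ingestion side, a bulk load into Snowflake is typically a COPY INTO statement run against a stage, while Snowpipe wraps the same COPY statement in a pipe triggered by cloud-storage events. Below is a minimal bulk-load sketch, again via the Python connector; the stage @RAW_STAGE, the file layout, and the target table are assumed purely for illustration.

```python
import os
import snowflake.connector

# Bulk-load CSV files from an external stage into a staging table.
# @RAW_STAGE, the path, and STG_ORDERS are illustrative names only.
COPY_SQL = """
COPY INTO STG_ORDERS
FROM @RAW_STAGE/orders/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
ON_ERROR = 'ABORT_STATEMENT'
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",  # hypothetical loading warehouse
    database="EDW",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute(COPY_SQL)
    # COPY INTO returns one result row per file with its load status.
    for file_name, status, *rest in cur.fetchall():
        print(file_name, status)
finally:
    conn.close()
```

The continuous alternative would define the same COPY inside a CREATE PIPE ... AUTO_INGEST = TRUE AS COPY INTO ... statement, so new files are loaded as they land in storage.
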
Benefits

  • Flexible working hours
  • Professional development opportunities

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
SQL, Python, Snowflake, ETL, Data Warehousing, Data Integration, Data Modeling, Big Data, UNIX, LINUX
Soft Skills
problem-solving, multitasking, organizational skills, written communication, verbal communication, leadership