Talan

Dataiku Architect

Full-time

Location: New York, United States

Salary

$90,000 - $140,000 per year

Job Level

Mid-Level, Senior

Tech Stack

AWS, Azure, Cloud, ETL, Google Cloud Platform, Hadoop, IoT, NoSQL, Python, Spark, SQL, Web3

About the role

We are seeking a highly skilled Data Architect with strong expertise in designing and implementing data pipelines using Dataiku. The ideal candidate will play a key role in defining the data architecture, ensuring the scalability, reliability, and performance of our client's data infrastructure, while collaborating with cross-functional teams to deliver business value from data.

Key Responsibilities

  • Design, develop, and implement data pipelines and workflows in Dataiku to support data ingestion, transformation, and processing.
  • Define and maintain data architecture standards, best practices, and governance policies.
  • Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical solutions.
  • Ensure data quality, integrity, and security throughout the data lifecycle.
  • Optimize data pipelines for scalability, performance, and cost-effectiveness.
  • Support the migration and integration of data from multiple sources into a unified data platform.
  • Provide technical leadership and mentoring to junior data team members.

Requirements

  • 5+ years of experience in data architecture, data engineering, or related roles.
  • Solid understanding of ETL/ELT, data modeling, and SQL/NoSQL systems.
  • Familiarity with cloud platforms (AWS, Azure, GCP) and big data tools (e.g., Spark, Hadoop).
  • Experience with Python, SQL scripting, and API integrations.
  • Ability to evaluate multiple implementation paths and recommend optimal solutions.
  • Strong communication skills and the capacity to mentor and challenge peers constructively.

Preferred Qualifications

  • Previous experience in a financial institution.
  • Dataiku certification or advanced expertise in automation, APIs, and deployment.
  • Experience with data migration or modernization projects.
  • Knowledge of platforms such as Snowflake, Azure, or Hadoop.