Quantiphi

Associate Technical Architect – DE

Full-time

Location Type: Remote

Location: United States

About the role

  • Design and implement data warehouses and data lakes/lakehouses on AWS.
  • Collaborate with globally distributed teams for project delivery.
  • Engage in the development of ETL and metadata frameworks.
  • Optimize data pipelines for performance and scalability.
  • Integrate and migrate data using AWS services.

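The role centers on building modular ETL pipelines. As a minimal, stack-agnostic sketch of that extract-transform-load pattern (plain Python stands in for an AWS Glue/PySpark job here, and the field names are hypothetical):

```python
import csv
import io

# Hypothetical raw export; in a Glue job this would be read from S3.
RAW = """order_id,region,amount
1,US,120.50
2,EU,80.00
3,US,45.25
"""

def extract(text: str) -> list[dict]:
    """Parse CSV rows into dictionaries (stand-in for a source read)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Aggregate order amounts per region (stand-in for a Spark transform)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Emit sorted results (stand-in for a warehouse write)."""
    return sorted(totals.items())

result = load(transform(extract(RAW)))
print(result)  # [('EU', 80.0), ('US', 165.75)]
```

Keeping each stage a separate function is what makes a pipeline "modular and resilient": stages can be unit-tested, retried, and swapped independently.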
Requirements

  • 5+ years of experience in designing and implementing data warehouses and data lakes/lakehouses on AWS.
  • Hands-on experience with AtScale or similar semantic layer tools to enable governed, business-friendly data access across BI platforms.
  • Proven success working with globally distributed teams in collaborative delivery environments.
  • Deep working knowledge across key AWS Data & Analytics services, including:
      • Building large-scale data lake architectures on Amazon S3 and open table formats.
      • Implementing governance and cataloging through AWS Lake Formation.
      • Developing ETL and metadata frameworks using AWS Glue.
      • Leveraging AWS Lambda for serverless data processing.
      • Running distributed data workloads on Amazon EMR.
      • Enabling real-time data pipelines with Amazon Kinesis (Data Streams and Firehose).
      • Orchestrating pipelines using AWS Step Functions, Amazon MWAA, or similar services.
      • Designing and optimizing schemas and query performance on Amazon Redshift, including Spectrum and Serverless features.
      • Querying large datasets interactively using Amazon Athena.
      • Managing operational databases using Amazon RDS across engines such as PostgreSQL, MySQL, and Aurora.
      • Integrating and migrating data using AWS DMS, Glue connectors, EventBridge, SNS, and SQS.
  • Strong understanding of semantic modeling, including logical data models, virtual cubes, and centralized metric definitions (single source of truth).
  • Experience optimizing performance using query pushdown, caching, and aggregate awareness over platforms like Redshift and Athena.
  • Ability to integrate semantic layers with BI tools (QuickSight, Tableau, Power BI) and enforce row/column-level security aligned with governance frameworks.
  • Strong programming capability in Python and PySpark for large-scale data processing.
  • Proficiency in writing complex SQL queries, analytical functions, and performance tuning for large datasets.
  • Familiarity with NoSQL databases such as Amazon DynamoDB, MongoDB, or DocumentDB.
  • Strong understanding of partitioning, indexing, scaling approaches, and query optimization techniques.
  • Proven experience in architecting and implementing data pipelines using native AWS services in a modular and resilient manner.
  • Solid understanding of data modeling concepts, including dimensional, normalized, and lakehouse patterns.
  • Good-to-have skills: experience working with customers/workloads in at least one of the FSI, Retail, or CPG domains.
  • Exposure to IaC tools like Terraform and to CI/CD tools.
  • Familiarity with data visualization tools (e.g., Amazon QuickSight, Power BI, Tableau) and data governance tools (e.g., Collibra).
  • Managing security, monitoring, and compliance with AWS IAM, Secrets Manager, CloudWatch, CloudTrail, and KMS.
  • Experience with AI-assisted development tools such as GitHub Copilot, Amazon Kiro (or similar GenAI IDEs) for improving developer productivity, code generation, and pipeline acceleration.
  • Exposure to Data Mesh architecture and Data Governance frameworks.
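Several of the requirements above hinge on analytical SQL (window functions, performance tuning for large datasets). As a small, runnable illustration of that pattern over a hypothetical sales table, using Python's built-in sqlite3 (SQLite 3.25+ supports window functions; the same SQL carries over to Redshift and Athena):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("US", "A", 100.0), ("US", "B", 250.0), ("EU", "A", 180.0), ("EU", "B", 90.0)],
)

# Rank products within each region by revenue: a typical analytical-function
# query, using RANK() over a per-region window.
rows = conn.execute(
    """
    SELECT region, product, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()
print(rows)
```

On columnar engines such as Redshift, the same window query benefits from sort/distribution keys aligned with the `PARTITION BY` column, which is where the posting's schema-design and tuning requirements come in.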

Benefits
  • Make an impact at one of the world’s fastest-growing AI-first digital engineering companies.
  • Upskill and discover your potential as you solve complex challenges in cutting-edge areas of technology alongside passionate, talented colleagues.
  • Work where innovation happens - work with disruptive innovators in a research-focused organization with 60+ patents filed across various disciplines.
  • Stay ahead of the curve—immerse yourself in breakthrough AI, ML, data, and cloud technologies and gain exposure working with Fortune 500 companies.
  • If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!