Parser

Senior Data Architect

full-time

Location Type: Hybrid

Location: Charlotte • North Carolina • 🇺🇸 United States


Job Level

Senior

Tech Stack

Airflow • Apache • JavaScript • Microservices • NoSQL • Python • Spark • SQL

About the role

  • Evaluate and design scalable data architectures to support Global Risk Management (GRM) data and reporting needs
  • Lead the strategy and architecture for data management products such as Data Catalog, Data Lineage, Data Feeds Registry, and Data Quality services
  • Design high-quality data products that support quantitative modeling and risk reporting solutions
  • Perform regular assessments of the health and maturity of data and information capabilities within the GRM domain
  • Evangelize and design new data solutions to support risk lines of business and evolving regulatory requirements
  • Contribute to defining data strategy, including mission, goals, principles, and operating procedures
  • Support the development of business cases to secure approval and funding for data and analytics initiatives
  • Understand and manage end-to-end change impacts, linking information capabilities with operational and analytical technical assets
  • Maintain integrated logical data models and data flows to ensure data consistency and traceability across platforms

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Data Science, Analytics, Risk Management, or a related field; advanced degree preferred
  • 8+ years of experience in data platforms, data solutions, and data management architecture
  • Strong knowledge of data management, data governance practices, and industry standards
  • Deep understanding of BCBS 239 principles and their implementation in large organizations
  • Proven experience with risk reporting systems, data warehouses, reporting tools, and governance frameworks
  • Expertise in designing complex architectures, including microservices, APIs, data warehouses, data lakes, and data pipelines
  • Hands-on experience with data technologies such as Python, Spark, Airflow, JavaScript, and SQL
  • Experience with relational and NoSQL databases, as well as big data environments
  • Strong experience with data modeling for complex data pipelines and data platforms
  • Familiarity with data quality frameworks, metadata management, data lineage tools, and control monitoring
  • Experience with data management platforms such as Collibra (preferred) or open-source alternatives like Apache Atlas, Amundsen, DataHub, or Marquez
  • Proficiency in Agile delivery methodologies (e.g., Scrum, SAFe)
  • Excellent communication skills with the ability to translate regulatory requirements into actionable technical solutions
  • Strong stakeholder management and cross-functional leadership capabilities
  • Knowledge of modern data storage layers and formats such as Apache Iceberg, Hudi, Delta Lake, Parquet, JSON, and Avro (nice to have)
  • Experience with graph processing and storage technologies, including knowledge graphs or property graphs (TigerGraph preferred) (nice to have)
  • Exposure to Linked Data, Open Data initiatives, and GenAI-based data solutions (nice to have)
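To illustrate the kind of work the "Data Quality services" and "hands-on Python" bullets above describe, here is a minimal sketch of two common data-quality checks (completeness and uniqueness) over a hypothetical feed registry. All names and the sample records are invented for illustration; the posting does not specify any implementation.

```python
# Illustrative data-quality checks of the kind a Data Quality service might run.
# Field names ("feed_id", "owner", "schema") and sample data are hypothetical.

def check_completeness(records, required_fields):
    """Return (row_index, missing_fields) for rows missing any required field."""
    failures = []
    for i, row in enumerate(records):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            failures.append((i, missing))
    return failures

def check_uniqueness(records, key_fields):
    """Return key tuples that appear more than once (uniqueness check)."""
    seen, dupes = set(), set()
    for row in records:
        key = tuple(row.get(f) for f in key_fields)
        if key in seen:
            dupes.add(key)
        seen.add(key)
    return sorted(dupes)

feeds = [
    {"feed_id": "RISK-001", "owner": "GRM", "schema": "v2"},
    {"feed_id": "RISK-002", "owner": None, "schema": "v1"},
    {"feed_id": "RISK-001", "owner": "GRM", "schema": "v2"},
]

print(check_completeness(feeds, ["feed_id", "owner"]))  # row 1 is missing "owner"
print(check_uniqueness(feeds, ["feed_id"]))             # "RISK-001" is duplicated
```

In a production setting checks like these would typically run as tasks inside a pipeline orchestrator such as Airflow, with failures feeding control monitoring and lineage metadata.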

Benefits
  • Medical insurance
  • Competitive compensation and growth opportunities
  • Work in a rapidly growing, innovative tech company
  • Collaborate with top-tier multinational clients
  • A multicultural community of experts

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data architecture, data management, data governance, data modeling, data pipelines, data warehouses, data lakes, Python, SQL, Spark
Soft skills
communication skills, stakeholder management, cross-functional leadership