Senior Data Architect

Informa

Full-time

Location Type: Remote

Location: Remote • New York • 🇺🇸 United States

Salary

💰 $150,000 - $160,000 per year

Job Level

Senior

Tech Stack

Azure • ETL • PySpark • Python • Scala • SQL • Unity Catalog

About the role

  • Serve as a technical data architect supporting the next-generation reporting platform, enabling insights for Finance, Commercial, Product, and Client Success teams
  • Support the development and implementation of a data strategy that enables current and future data processing, analytics, and reporting requirements in alignment with the overall data strategy and priorities
  • Maintain and enhance the scalable data warehouse architecture
  • Ensure robust version control, deployment pipelines, and observability across environments, supporting continuous improvement and governance
  • Maintain and build new ETL pipelines bringing data from Salesforce, NetSuite, HubSpot, Pendo, ADP, and Adaptive Planning, ensuring clean, enriched, and joined datasets for analytics (a PySpark sketch of this pattern follows this list)
  • Manage BI and reporting functions with internal and external reporting outputs, delivering company-wide dashboards using Power BI (including Embedded)
  • Prioritize high-volume reporting backlogs across multiple business functions
  • Collaborate with Finance, Operations, HR, and Sales to deliver reporting such as automated revenue and ARR bridges, pricing analysis, P&L reporting, client product usage reporting, and headcount reporting in Power BI
  • Support key data initiatives including ARR bridge, Client Intimacy metrics, Pricing Governance, Customer Health Scoring, Funnel Velocity, and Marketing Attribution
  • Design KPI frameworks and reporting structures in collaboration with stakeholders
  • Lead QA reviews of reporting assets to ensure accuracy, accessibility, and usability
  • Champion and drive data quality and enrichment across Salesforce and other systems, implementing dashboards and tagging logic to support commercial actions
  • Educate and guide Power BI dashboard creators across the business to adopt self-service dashboard creation, promoting best practices and scalable design standards
  • Train users on reporting tools and implement self-service enablement programs
  • Collaborate with business stakeholders to define reporting requirements, validate outputs, and embed dashboards into operational cycles
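
As a purely illustrative sketch of the ETL pattern referenced above, the PySpark snippet below reads a raw Salesforce extract, cleans and enriches it with NetSuite account attributes, and publishes a Delta table for reporting. It assumes a Databricks environment where a SparkSession named spark and Delta Lake are already available; the table and column names are hypothetical and not taken from this posting.

```python
from pyspark.sql import functions as F

# Read a raw (bronze) Salesforce extract, e.g. as landed by an ingestion tool.
# Table and column names are illustrative only.
raw = spark.table("bronze.salesforce_opportunity")

cleaned = (
    raw
    .filter(~F.col("is_deleted"))                              # drop soft-deleted rows
    .withColumn("amount_usd", F.col("amount").cast("decimal(18,2)"))
    .withColumn("close_date", F.to_date("close_date"))
    .dropDuplicates(["opportunity_id"])                        # one row per opportunity
)

# Enrich with billing attributes from a NetSuite reference table.
accounts = spark.table("bronze.netsuite_account").select("account_id", "billing_country")
enriched = cleaned.join(accounts, "account_id", "left")

# Publish a managed silver Delta table for downstream Power BI models.
(enriched.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("silver.opportunity_enriched"))
```

In practice each source system (NetSuite, HubSpot, Pendo, ADP, Adaptive Planning) would typically have its own pipeline, with the resulting tables joined into the curated models that feed Power BI.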

Requirements

  • Have 5+ years of experience designing and managing scalable data architectures
  • Demonstrate hands-on experience with Databricks and Delta Lake, including ACID transactions, schema evolution, time travel, and audit logging
  • Show strong proficiency in SQL (T-SQL, ANSI SQL), Python, PySpark, and Scala for data engineering within Databricks notebooks
  • Be familiar with Unity Catalog for metadata management, data cataloguing, role-based access control, data lineage, and quality monitoring
  • Have experience using Fivetran or similar tools for building reliable, low-maintenance data ingestion pipelines
  • Have experience integrating and transforming data from Salesforce, NetSuite, HubSpot, Pendo, ADP, Adaptive Planning, and other SaaS platforms
  • Demonstrate a strong understanding of data governance, including version control (e.g., Git/Azure Repos), CI/CD pipelines, and observability practices
  • Manage the full report development lifecycle from brief through to release and adoption
  • Demonstrate hands-on experience with Power BI, including dashboard development, parameterization, performance optimization, and administration (permissions, server maintenance, upgrades, patching, monitoring, etc.)
  • Be comfortable building and optimizing fact and dimension models, including star schemas and SCD-2 logic, to support consistent and high-performance reporting (a hedged SCD-2 sketch follows this list)
  • Be proficient with the Power BI platform, including DAX, Dataflows, and Embedded publishing
  • Ability to apply visualization best practices and data storytelling techniques
  • Ability to apply business-aligned KPI design and definition governance
  • Ability to apply transformation logic, data model validation, and documentation
  • Plan and prioritize work to meet commitments aligned with organizational goals
  • Develop and deliver multi-mode communications that convey a clear understanding of the unique needs of different audiences
  • Collaborate with cross-functional teams to define and implement data models that support business use cases and enable self-service analytics
  • Build formal and informal relationship networks inside and outside the organization and work collaboratively with others to meet shared objectives
  • Demonstrate excellent communication and stakeholder engagement skills, with a consulting mindset and the ability to translate business needs into scalable data solutions
  • Learn actively through experimentation when tackling new problems, using both successes and failures as learning fodder
  • Develop people to meet both their career goals and the organization’s goals
  • Consistently achieve results, even under tough circumstances
  • Hold a Bachelor’s degree in Computer Science, Engineering, or a related discipline; an advanced degree or relevant certifications are a plus
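
The dimensional-modelling requirements above mention star schemas and SCD-2 logic on Delta Lake. Purely as a hedged sketch, assuming a Databricks environment where a SparkSession named spark and the delta package are available, the two-step pattern below first closes out current dimension rows whose tracked attributes changed, then appends the new versions; the dim_customer table and its columns are hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

updates = spark.table("silver.customer_updates")        # latest source snapshot, one row per customer
dim = DeltaTable.forName(spark, "gold.dim_customer")    # SCD-2 dimension table

# 1) Close out current rows whose tracked attributes have changed.
(dim.alias("d")
    .merge(updates.alias("u"),
           "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.segment <> u.segment OR d.billing_country <> u.billing_country",
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# 2) Append new versions for changed or brand-new customers
#    (anything without an open current row after step 1).
current = spark.table("gold.dim_customer").filter("is_current = true")
new_rows = (
    updates
    .join(current.select("customer_id"), "customer_id", "left_anti")
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
)
new_rows.write.format("delta").mode("append").saveAsTable("gold.dim_customer")
```

Keeping closed-out rows with explicit start and end dates is what lets fact tables join to the dimension as it looked at any point in time, which supports the consistent, high-performance reporting called out above.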

Benefits

  • Competitive benefits, including a range of Financial, Health and Lifestyle benefits to choose from
  • Flexible working options, including home working, flexible hours and part time options, depending on the role requirements – please ask!
  • Competitive annual leave, floating holidays, volunteering days and a day off for your birthday!
  • Learning and development tools to assist with your career development
  • Work with industry leading Subject Matter Experts and specialist products
  • Regular social events and networking opportunities
  • Collaborative, supportive culture, including an active DE&I program
  • Employee Assistance Program which provides expert third-party advice on wellbeing, relationships, legal and financial matters, as well as access to counselling services

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data architecture, ETL pipelines, SQL, Python, PySpark, Scala, Power BI, Databricks, Delta Lake, data governance
Soft skills
communication, collaboration, stakeholder engagement, problem-solving, training, prioritization, relationship building, consulting mindset, adaptability, results-oriented
Certifications
Bachelor's degree in Computer Science, Bachelor's degree in Engineering, advanced degree, relevant certifications