FreedomPay

Senior Data Engineer

Full-time

Location Type: Hybrid

Location: Philadelphia / New Jersey / New York, United States

About the role

  • Design and evolve data models and database objects for operational and analytical workloads in Microsoft SQL Server and Snowflake (schemas, roles, warehouses, performance and cost optimization).
  • Build and maintain ELT/ETL pipelines (batch and near-real-time), leveraging Snowflake capabilities (Snowpipe, Streams/Tasks) and orchestration tools (e.g., Airflow or Azure Data Factory) as appropriate.
  • Implement and support data streaming and event-driven ingestion patterns using technologies such as Kafka or Azure Event Hubs (topics/streams, schemas, consumers, and replay strategies).
  • Leverage Redis and other low-latency data stores for caching and real-time access patterns; partner with application teams to define fit-for-purpose SLAs and data freshness targets.
  • Develop and maintain curated datasets and self-service analytics in Sigma Computing (workbooks, datasets, governance and performance), and support legacy reporting where needed (e.g., SSRS).
  • Collaborate with engineering, analytics, and product teams to deliver data solutions that meet business requirements.
  • Automate deployments using Git-based workflows and CI/CD (e.g., Azure DevOps), including database migration/versioning (Flyway).
  • Use Claude Code (AI-assisted development) to accelerate data pipeline delivery (design, implementation, refactoring, documentation, and troubleshooting) while adhering to security, quality, and SDLC standards.
  • Participate in Agile ceremonies and contribute to continuous improvement of data engineering processes and standards.
  • Establish data quality, testing, and observability (e.g., unit/integration tests for pipelines, data validation, lineage, alerting, SLAs) to ensure reliable delivery.
  • Partner with engineering, analytics, and product teams to define and deliver data products (source-to-target mappings, contracts, SLAs), enabling trustworthy analytics and operational use cases.
  • Ensure data security, governance, and compliance across platforms (PII handling, encryption, auditing, retention), including Snowflake RBAC, secure data sharing, and access controls.
  • Troubleshoot and resolve performance, reliability, and scalability issues across data platforms; instrument pipelines with logging/metrics and on-call friendly runbooks.
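
To make the data quality and observability responsibilities above concrete, here is a minimal Python sketch of a batch-validation step. The record shape, required fields (`record_id`, `amount`), and 5% alert threshold are hypothetical illustrations, not FreedomPay's actual standards.

```python
from dataclasses import dataclass, field

# Hypothetical required fields for an incoming payment record.
REQUIRED_FIELDS = {"record_id", "amount"}

@dataclass
class ValidationReport:
    passed: list = field(default_factory=list)
    failed: list = field(default_factory=list)
    alert: bool = False

    @property
    def failure_rate(self) -> float:
        total = len(self.passed) + len(self.failed)
        return len(self.failed) / total if total else 0.0

def validate_batch(records, alert_threshold=0.05):
    """Split a batch into passing/failing records and flag batches whose
    failure rate exceeds the threshold. In a real pipeline this flag would
    emit a metric or trigger alerting rather than just set an attribute."""
    report = ValidationReport()
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        # Reject records with missing fields or negative amounts.
        if missing or rec.get("amount", -1) < 0:
            report.failed.append(rec)
        else:
            report.passed.append(rec)
    report.alert = report.failure_rate > alert_threshold
    return report
```

A step like this typically runs between ingestion and publication, so that bad batches are quarantined before they reach curated datasets.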

Requirements

  • 7+ years of experience in data engineering and/or database engineering, including building and operating production data pipelines.
  • Strong understanding of modern data engineering practices and tools (cloud data platforms, orchestration, testing/observability, DataOps, and AI-assisted development with Claude Code).
  • Strong English reading and writing skills, with the ability to express and understand complex technical concepts. If other languages are required, that will be explicitly noted during the recruitment process.
  • Strong analytical, problem-solving, and conceptual skills.
  • Hands-on experience with Snowflake and integrating it into production data pipelines.
  • Experience enabling governed self-service analytics with Sigma Computing (datasets, workbooks, access controls, and performance best practices).
  • Experience with streaming/event platforms such as Kafka or Azure Event Hubs, including schema/versioning considerations and operational support.
  • Proficiency with Python for data engineering automation and/or building pipeline components; experience with orchestration (Airflow and/or Azure Data Factory) is strongly preferred.
  • Experience using Claude Code to develop, test, and iterate on data pipeline solutions (e.g., generating boilerplate, improving SQL/Python, and speeding up root-cause analysis) with appropriate human review.
  • Ability to work in teams and strong interpersonal skills.
  • Ability to work under pressure and meet tight deadlines.
  • Ability to anticipate potential problems and determine and implement solutions.
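
The replay-strategy and streaming requirements above can be sketched in miniature: a consumer that tolerates replayed duplicates by applying each event at most once. The event shape and the in-memory ID set are hypothetical; a production consumer would track offsets via Kafka consumer groups or Event Hubs checkpoints and persist deduplication state durably.

```python
def process_events(events, sink, seen_ids):
    """Apply each event at most once, tolerating replayed duplicates.

    events:   iterable of dicts with hypothetical keys "id" and "payload"
    sink:     list standing in for a downstream table or topic
    seen_ids: set of already-applied event IDs (durable store in production)
    Returns the number of newly applied events.
    """
    applied = 0
    for event in events:
        if event["id"] in seen_ids:
            continue  # duplicate delivered by a replay; skip it
        sink.append(event["payload"])
        seen_ids.add(event["id"])
        applied += 1
    return applied
```

Idempotent processing like this is what makes replaying a stream from an earlier offset safe after a failure.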

Benefits

  • Medical, prescription, dental, and vision coverage
  • Life insurance
  • Retirement plans with company match
  • Commission sharing plan
  • Flexible hybrid working environment
  • Great parental and other leave programs

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools

data engineering, database engineering, data pipelines, data modeling, ELT, ETL, data streaming, data validation, Python, SQL

Soft Skills

communication skills, analytical skills, problem-solving skills, interpersonal skills, teamwork, ability to work under pressure, time management, conceptual skills, adaptability, attention to detail