Data Platform Operations Lead – Contract

Location Type: Remote

Location: United States

About the role

  • Serve as an embedded SME and lead advisor for the data platform operations team supporting a production GCP environment
  • Drive operational readiness, including development and refinement of runbooks, SOPs, escalation procedures, and day-two support processes
  • Provide hands-on support for production incidents, including troubleshooting, root cause analysis (RCA), and long-term remediation
  • Monitor system health, logs, and performance metrics to identify risks and optimization opportunities
  • Lead PostgreSQL / AlloyDB performance tuning, including indexing strategies, query optimization, and maintenance practices
  • Provide guidance across GCP services including Dataflow, BigQuery, Pub/Sub, Cloud Functions, and Dataplex
  • Support release management, including database deployments and rollback strategies
  • Contribute to resiliency planning, including backup/recovery, disaster recovery (DR), and high availability (HA)
  • Develop and maintain operational documentation and technical artifacts aligned to production needs
  • Serve as the primary technical point of contact for client stakeholders, presenting strategic database roadmaps and defending architectural decisions during program reviews

Requirements

  • 8+ years of experience in data engineering, database engineering, or platform operations
  • Strong experience supporting production data platforms in cloud environments
  • Deep, hands-on expertise in PostgreSQL (performance tuning, indexing, query optimization, vacuuming, maintenance)
  • Experience supporting PostgreSQL in production at scale (large datasets, high availability environments)
  • Experience with AlloyDB or similar PostgreSQL-based managed database services
  • Hands-on experience with Google Cloud Platform (GCP), including data and streaming services
  • Experience with Dataflow, BigQuery, Pub/Sub, Cloud Functions, and/or Dataplex
  • Experience with data lifecycle management, retention policies, data governance, and cataloging
  • Proven experience in production operations, including incident response and root cause analysis
  • Hands-on experience with SLIs/SLOs/SLAs, including defining reliability targets, implementing monitoring, and using error budgets to guide operational decisions and platform improvements
  • Proven track record of institutionalizing operational knowledge through technical writing
  • Experience supporting release management and database deployment processes
  • Experience working in FedRAMP or regulated cloud environments
  • Strong communication skills and ability to work directly with engineering teams and stakeholders
  • Demonstrated ability to take ownership, lead operational efforts, and deliver in a fast-paced environment

Contract Details

  • Citizenship & Clearance Requirement: per client requirements, must be a U.S. Citizen with the ability to obtain a Public Trust clearance
  • Initial 6-month contract (40 hours/week), with strong likelihood of extension and potential for long-term or full-time conversion based on performance and program needs

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
PostgreSQL, AlloyDB, performance tuning, indexing, query optimization, Google Cloud Platform, Dataflow, BigQuery, Pub/Sub, Cloud Functions
Soft Skills
strong communication skills, ownership, leadership, operational readiness, strategic guidance, technical writing, incident response, root cause analysis, collaboration, problem-solving