Object Computing, Inc.

Senior Geospatial Data Engineer

Full-time

Location Type: Remote

Location: Montana, United States

About the role

  • Architect, design, and maintain robust, scalable data pipelines and infrastructure for geospatial and big data applications, with a focus on performance and the end-user product experience.
  • Lead the development and optimization of ETL processes for ingesting, cleaning, transforming, and storing large volumes of geospatial and tabular data.
  • Design, build, and integrate with API-driven, service-to-service web services (using FastAPI, Litestar, Flask, etc.) to enable integration across a suite of products.
  • Collaborate with backend and platform engineers to ensure secure, reliable, and scalable service-to-service communication.
  • Translate complex analytics and business questions into actionable, production-grade data solutions.
  • Collaborate closely with data scientists, analysts, and business stakeholders to deliver high-impact data products.
  • Drive the adoption and optimization of cloud-based data solutions (e.g., GCP, AWS, Azure).
  • Ensure data quality, integrity, and security across all stages of the data lifecycle.
  • Mentor and provide technical guidance to junior data engineers and team members.
  • Communicate technical details and insights clearly to both technical and non-technical audiences, including leadership.
  • Proactively recommend and implement improvements to existing data infrastructure and software programs.
  • Stay current with industry trends and emerging technologies in geospatial data engineering.

Requirements

  • Experience in software development, data engineering, or big data roles, preferably with a focus on geospatial data.
  • Experience building solutions with Python.
  • Experience with relational databases and SQL, including advanced query building, data extraction, and manipulation.
  • Experience architecting and optimizing cloud-based data solutions (preferably GCP, AWS, or Azure).
  • Deep experience with big data technologies such as Hadoop, Spark, MapReduce, or Kafka.
  • Experience integrating with API-driven, service-to-service web services.
  • Demonstrated ability to lead projects, mentor team members, and drive technical decisions.
  • Strong problem-solving skills, resourcefulness, and ability to work independently or collaboratively.
  • Excellent organizational, interpersonal, and communication skills.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data pipelines, ETL processes, API-driven web services, Python, relational databases, SQL, Hadoop, Spark, MapReduce, Kafka
Soft Skills
leadership, mentoring, problem-solving, resourcefulness, independent work, collaboration, organizational skills, interpersonal skills, communication skills