Senior Backend Developer – Web Scraping, DevOps – Python, Java, GCP

Anaqua

Full-time

Location Type: Hybrid

Location: Aarhus • 🇩🇰 Denmark

Job Level

Senior

Tech Stack

Cloud, Docker, Google Cloud Platform, Java, Kubernetes, Python, Selenium

About the role

  • Develop and maintain our Python-based web scraping pipeline
  • Own and improve systems built on Google Cloud Platform, using Pub/Sub, Kubernetes, Docker, Compute Engine instances, and Bash scripting (see the sketch after this list)
  • Collaborate closely with internal stakeholders, the product owner, and backend engineers
  • Set the standard for code quality, reviews, and testing within the automation/web scraping team
  • Integrate with internal APIs (primarily Java-based), accessed through Python services
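
To make the stack above concrete, here is a minimal sketch of the kind of Python worker the role describes: it pulls scrape jobs from a Pub/Sub subscription and fetches the requested page. The project ID, subscription name, and message schema are hypothetical placeholders, not details taken from the posting.

import json

import requests
from google.cloud import pubsub_v1

# Hypothetical identifiers -- not taken from the posting.
PROJECT_ID = "example-project"
SUBSCRIPTION_ID = "scrape-jobs-sub"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def handle_job(message: pubsub_v1.subscriber.message.Message) -> None:
    """Fetch the URL named in a scrape-job message (assumed schema: {"url": "..."})."""
    job = json.loads(message.data.decode("utf-8"))
    try:
        response = requests.get(job["url"], timeout=30)
        response.raise_for_status()
        # A real pipeline would parse and persist the page here.
        print(f"Fetched {job['url']} ({len(response.content)} bytes)")
        message.ack()
    except Exception:
        # Leave the message unacknowledged so Pub/Sub redelivers it later.
        message.nack()


if __name__ == "__main__":
    future = subscriber.subscribe(subscription_path, callback=handle_job)
    print(f"Listening on {subscription_path} ...")
    with subscriber:
        try:
            future.result()  # block until cancelled or an error occurs
        except KeyboardInterrupt:
            future.cancel()
            future.result()  # wait for the shutdown to finish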

Requirements

  • Familiarity with GCP, Kubernetes, Bash, and message-based architectures like Pub/Sub
  • Experience with Python, particularly for web scraping and automation pipelines
  • Solid habits around testing, code review, and code maintainability
  • Confidence and experience to set a technical example for other developers

Nice to Have

  • Experience with browser automation tools (e.g., Playwright, Selenium)
  • Familiarity with CI/CD pipelines, logging, or monitoring

Benefits

  • Hybrid-friendly work environment
  • Opportunity to work with a global brand protection team

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python, web scraping, Bash scripting, Kubernetes, Google Cloud Platform, Pub/Sub, Docker, Compute Engine, CI/CD, browser automation
Soft skills
collaboration, code quality, code review, testing, technical leadership