SAPSOL Technologies Inc. : Systems and Process Solutions for your Enterprise

AWS DataHub Developer

Contract

Origin: 🇨🇦 Canada


Job Level

Mid-Level / Senior

Tech Stack

Apache, AWS, Cloud, DynamoDB, Jenkins, Kafka, Microservices, Python, Terraform, TypeScript

About the role

  • We are seeking a Senior AWS DataHub Developer to design and build real-time, event-driven data services on AWS. The role is developer-first (application-side) rather than infrastructure-led: you will architect and deliver Kafka-based streaming pipelines and serverless data applications that ingest, transform, and serve data at scale.
  • You’ll collaborate with architects, data engineers, and product teams to deliver secure, resilient, observable, and highly scalable solutions that power enterprise-grade analytics and event-driven applications.

What You’ll Do (Key Responsibilities)

  • Design & Deliver Event-Driven Pipelines: Build serverless data flows using AWS Lambda, Step Functions, EventBridge, SNS, SQS, API Gateway.
  • Real-Time Streaming: Develop Kafka (Apache Kafka/Amazon MSK) consumers/producers for high-throughput, low-latency streaming and decoupled microservices.
  • Microservices & APIs: Build and optimize TypeScript (preferred) or Python services/APIs for data ingestion, transformation, and delivery.
  • AWS Data Services Integration: Work with S3, DynamoDB, Glue, Athena, CloudWatch for storage, metadata, querying, and observability.
  • Quality & Reliability: Implement idempotency, retries, dead-letter queues, exactly-once/at-least-once semantics where appropriate, and schema evolution strategies.
  • CI/CD & Testing: Use Git-based workflows and CI/CD (e.g., GitHub Actions, Jenkins) with automated tests (unit/integration/load) and infrastructure deployments.
  • IaC (Developer View): Define application-layer infrastructure using AWS CDK, Terraform, or CloudFormation—with strong emphasis on developer productivity and repeatability.
  • Agile Collaboration: Contribute to technical design, story sizing, peer reviews, and continuous improvement.

Ideal Candidate Profile

You are a cloud-native, application-side developer who thinks in events, streams, and services—not servers. You design for resiliency, observability, and scale, and you’re comfortable pairing Kafka with AWS serverless to deliver business outcomes.
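To illustrate the quality-and-reliability expectations above (idempotency, retries, dead-letter queues under at-least-once delivery), here is a minimal, self-contained TypeScript sketch. The event shape, retry count, and handler are hypothetical, not drawn from any SAPSOL system; in production the dead-letter array would typically be an SQS dead-letter queue and retries would use backoff with jitter.

```typescript
// Hypothetical event shape for illustration only.
type Event = { id: string; payload: string };
type Result = { processed: string[]; deadLettered: string[] };

async function processWithRetries(
  events: Event[],
  handler: (e: Event) => Promise<void>,
  maxAttempts = 3,
): Promise<Result> {
  const seen = new Set<string>();     // idempotency: track event ids already handled
  const processed: string[] = [];
  const deadLettered: string[] = [];  // stand-in for an SQS dead-letter queue

  for (const event of events) {
    if (seen.has(event.id)) continue; // at-least-once delivery: drop replayed events
    seen.add(event.id);

    let ok = false;
    for (let attempt = 1; attempt <= maxAttempts && !ok; attempt++) {
      try {
        await handler(event);
        ok = true;
      } catch {
        // swallow and retry; a real pipeline would add backoff/jitter here
      }
    }
    (ok ? processed : deadLettered).push(event.id);
  }
  return { processed, deadLettered };
}
```

The same pattern applies whether events arrive from a Kafka consumer loop or an SQS-triggered Lambda: deduplicate by a stable event id, bound retries, and route exhausted events to a dead-letter destination instead of blocking the stream.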
