Design, build, and operate components of our Streaming Platform, including Kafka, the streaming runtime, high-level APIs, and developer-facing abstractions.
Implement resilient, high-throughput stream processing systems that handle unbounded datasets with strong correctness guarantees (delivery, checkpointing, watermarking, and more).
Build scalable automation and a control plane for Kafka fleet management, and improve fleet efficiency.
Partner with product engineers to ensure our abstractions enable fast, reliable, and consistent ingestion pipelines.
Improve observability, monitoring, and failover for mission-critical real-time systems.
Requirements
5+ years of software engineering experience, with a background in distributed systems, data infrastructure, or real-time streaming.
Proficiency in a programming language such as Python, Rust, Go, or Java (we primarily use Python and Rust, but experience in similar languages is valuable).
Experience with streaming technologies such as Kafka, Flink, Spark Streaming, or similar tools.
Strong understanding of partitioning, watermarks, windowing, stateful/stateless processing, and delivery guarantees.
Experience building and operating systems in cloud environments such as Kubernetes, AWS, or GCP.
Nice to have: experience with ClickHouse, Arrow, or other columnar data processing technologies, or with modern streaming SQL engines such as Materialize or RisingWave.
Benefits
Employee benefit plans/programs applicable to the candidate’s position
Incentive compensation
Equity grants
Paid time off
Group health insurance coverage