Tech Stack
Distributed Systems, Go, Heroku, Kafka, Open Source, Python, Rust
About the role
- Design and build Keycard’s operational and analytics data backbone, powering near-real-time enforcement and feedback loops
- Build resilient, observable, low-latency streaming and batch data pipelines
- Integrate external data sources and enable policy evaluation, dashboards, and APIs
- Balance streaming and batch paradigms to meet transactional and OLAP needs
- Deliver APIs, data models, and pipelines that empower developers and security teams
- Work closely with founders, customers, and cross-functional team members to shape data architecture
- Own end-to-end delivery with high autonomy and accountability
Requirements
- Strong experience designing resilient operational and analytics data pipelines that balance streaming and batch paradigms
- Proficiency with Python, Go, and/or Rust
- Experience with Kafka, Parquet, and Iceberg
- Experience building streaming and batch systems that meet both transactional-integrity and OLAP requirements
- Product-minded, pragmatic, hands-on systems builder with API, pipeline, and storage design experience
- Experience iterating in 0→1/ambiguous environments
- Leadership: an empathetic leader and builder who raises the bar for technical excellence
- Experience collaborating in distributed/remote teams and strong written communication for async work
- Experience designing and shipping developer-facing APIs, SDKs, or tools (preferred)
- Open source contributions to data infrastructure (Kafka, Flink, Iceberg, etc.) (preferred)
- Experience experimenting with agents and agent-friendly data systems (preferred)
- Authorization to work in the United States or Canada (must be confirmed on the application)