AUTHEO

Senior Data Engineer, Blockchain

Part-time

Location: Remote • 🌎 Anywhere in the World

Job Level

Senior

Tech Stack

Airflow • Apache • ETL • IPFS • Kafka • MongoDB • Postgres • Spark • Web3

About the role

  • Design end-to-end data pipelines using Apache Spark, Kafka, Airflow, and Trino (see the orchestration sketch after this list)
  • Build pipelines handling 200 GB/s data flows across IPFS, Ceph, PostgreSQL, and MongoDB
  • Implement zero-ETL lakehouse analytics with 25μs query latencies
  • Design streaming systems for 50B+ daily DePIN events via Kafka
  • Integrate homomorphic encryption for exabyte-scale data security
  • Ensure differential privacy (ε=0.5) for DePIN datasets (see the privacy sketch after this list)
  • Collaborate with AI/ML, blockchain, and security teams for integration
  • Lead data architecture reviews for scalability and compliance
  • Mentor engineers and contribute to open-source data components
  • Publish at Spark+AI/Web3 Summit on data innovations
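
For orientation, a minimal sketch of the orchestration work described in the first bullet is shown below. It assumes Airflow 2.x with the Apache Spark provider package installed; the DAG id, job path, and connection id are hypothetical placeholders, not Autheo's actual pipeline configuration.

```python
# Minimal Airflow DAG sketch: schedule a daily Spark compaction job over
# Kafka-landed event data. All names here are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="depin_events_daily",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    # Compact the previous day's Kafka-landed events into lakehouse tables.
    compact = SparkSubmitOperator(
        task_id="compact_events",
        application="/opt/jobs/compact_events.py",  # hypothetical Spark job
        conn_id="spark_default",
    )
```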

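Likewise, the ε=0.5 differential-privacy target above is conventionally met for count-style queries with the Laplace mechanism. The sketch below illustrates the idea with a hypothetical dataset and event counts; it is not Autheo's implementation.

```python
# Minimal sketch of the Laplace mechanism for ε-differential privacy,
# applied to a count query. Dataset and counts are hypothetical.
import numpy as np

EPSILON = 0.5  # privacy budget stated in the role description

def dp_count(true_count: int, epsilon: float = EPSILON) -> float:
    """Release a count with ε-differential privacy.

    A count query has L1 sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale 1/ε
    satisfies ε-DP.
    """
    sensitivity = 1.0
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

if __name__ == "__main__":
    # For scale context: 50B events/day ≈ 50e9 / 86,400 ≈ 579k events/second.
    hourly_events = 2_400_000_000  # hypothetical hourly event count
    print(f"DP-noised count: {dp_count(hourly_events):,.1f}")
```
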
Requirements

  • Bachelor’s/Master’s in Computer Science, Data Engineering, or equivalent
  • 5+ years building petabyte-scale data pipelines for high-throughput systems
  • Expertise in Spark, Kafka, Airflow, Trino, and IPFS/Ceph
  • Proficiency in zk-proofs and compliance auditing
  • Background in DeFi analytics or healthcare FHIR processing (preferred)
  • Experience with open-source data tooling or multi-language SDKs (preferred)
  • Contributions to blockchain data standards or patents (preferred)

Benefits

  • Equity in Launch Legends, Autheo, and the WFO Creator Network
  • Token allocations in the Autheo blockchain
  • Salaried compensation expected to begin within 4 to 5 months

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Apache Spark • Kafka • Airflow • Trino • IPFS • Ceph • PostgreSQL • MongoDB • zk-proofs • differential privacy
Soft skills
collaboration • mentoring • leadership
Certifications
Bachelor’s in Computer Science • Master’s in Data Engineering