About the role
- We are seeking a Senior Data Developer with a strong background in event streaming to join our growing team
- You will help build and maintain the data streaming platform that directly powers the MaintainX product and enables internal analytics, while developing platform capabilities and tooling that engineering teams use daily
- You will own the data platform’s CDC-streaming service, including runtime, reliability, capabilities, deployment, governance, and developer tooling
- Build and operate the end-to-end CDC streaming platform (Debezium, Kafka, Flink) that produces near-real-time data products
- Own the streaming infrastructure (Kafka, Flink) using Terraform and Atmos IaC, including multi-region deployments
- Build and maintain CI/CD pipelines for the CDC-streaming platform
- Define and enforce pipeline reliability standards
- Instrument and maintain end-to-end observability for the streaming pipeline
- Build self-service tooling and runbooks for onboarding new CDC sources, including automation scripts, snapshot reconciliation checks, and operational documentation
- Collaborate with engineering teams to expand the CDC footprint, support new streaming data use cases, and evolve the streaming architecture
Benefits
- Flexible time off
- Training and development investments
- Competitive salary and equity opportunities
- Comprehensive healthcare coverage
Qualifications
- Experience with CDC tooling (ex: Debezium, DMS) for real-time database change capture
- Strong reliability engineering instincts: alerting design, runbook authorship, load testing, and failure recovery planning for distributed systems
- Strong infrastructure-as-code skills with Terraform or Atmos; comfortable managing cloud infrastructure across multiple AWS accounts and regions
- 4+ years of experience building and operating production-grade event streaming pipelines in a modern cloud data environment
- Experience building and evolving CI/CD pipelines
- Experience working collaboratively in a fast-paced, cross-functional environment
- Strong familiarity with Kafka: topic design, consumer groups, retention policies, event replayability, schema management, partitioning, and indexing
- Proficiency in Python or Java for Flink application development and streaming tooling
- Hands-on experience with Apache Flink
- Knowledge of compliance and regulatory frameworks (ex: FedRAMP, SOC2, GDPR)
- Familiarity with OpenSearch
- Familiarity with schema management in a CDC context