We're building the data infrastructure that makes AI agents trustworthy instead of error-prone.
We provide continuously refreshed, verified B2B data for autonomous AI agents and GTM workflows.
We've tripled growth while maintaining 100% gross dollar retention and staying cashflow positive.
We power AI agents for Clay, ZoomInfo, Dun & Bradstreet, and the next generation of AI GTM tools.
Our data platform is scaling rapidly, and we need an engineer who can own pipelines end to end, keep data quality high, and ensure reliability as we grow.
This role exists to strengthen our data infrastructure, accelerate delivery through automation, and ensure our B2B customers receive accurate, timely data they can trust.
You'll work on data systems that directly power customer workflows - where pipeline reliability and data quality shape retention.
Build and maintain production-ready data pipelines using DBT, Snowflake, and modern orchestration tools (see the orchestration sketch after this list).
Own data engineering features end-to-end, from implementation through optimization and deployment.
Fix and improve existing pipelines - identify bottlenecks, resolve issues, and enhance performance.
Drive automation initiatives across the data stack to accelerate delivery and reduce manual interventions.
Provide second-line support for B2B customers - investigate data issues, clarify edge cases, and ensure customers can trust their data.
Design and implement new data import pipelines as we expand our data source coverage.
Implement data quality improvements - validation, monitoring, and testing to ensure reliable, accurate data delivery.
Contribute to code reviews, architectural discussions, and data engineering best practices.
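To ground the responsibilities above, here is a minimal sketch of the kind of orchestrated pipeline this role would own: a Prefect flow that runs a dbt build and blocks delivery unless dbt tests pass. The project path, flow name, and alert hook are hypothetical placeholders, not our actual setup.

```python
# Minimal sketch of an orchestrated dbt run with a data quality gate.
# Assumes Prefect 2.x and the dbt CLI are installed; the project path,
# flow name, and alert hook below are hypothetical placeholders.
import subprocess

from prefect import flow, task


@task(retries=2, retry_delay_seconds=60)
def run_dbt(command: str, project_dir: str = "dbt/revenue_warehouse") -> None:
    """Invoke the dbt CLI and fail the task if dbt reports errors."""
    subprocess.run(
        ["dbt", command, "--project-dir", project_dir],
        check=True,  # non-zero exit (failed models or tests) raises CalledProcessError
    )


@task
def notify_on_failure(step: str) -> None:
    """Placeholder alert hook -- wire this to Slack/PagerDuty in practice."""
    print(f"Data pipeline step failed: {step}")


@flow(name="daily-company-data-refresh")
def daily_refresh() -> None:
    """Transform raw B2B data, then block downstream delivery on quality tests."""
    try:
        run_dbt("run")   # build/refresh transformation models
        run_dbt("test")  # schema and data tests act as the quality gate
    except Exception:
        notify_on_failure("dbt run/test")
        raise


if __name__ == "__main__":
    daily_refresh()
```

Treating the test step as a hard gate is one way pipeline reliability and data quality translate directly into customer trust.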
Who You Are:
You have 3+ years of professional data engineering experience.
Strong fundamentals in SQL, data modeling, Python, and ETL/ELT principles.
Must have:
DBT - hands-on experience building and maintaining transformation pipelines
Nice to have:
Snowflake
Databricks
AWS (S3, Lambda, Glue, etc.)
Prefect or similar orchestration tools (Airflow, Dagster)
Solid understanding of data quality principles, testing strategies, and monitoring practices (see the validation sketch after this list).
Comfortable working in a fast-moving, remote-first environment.
Strong communicator - able to explain technical issues clearly to both technical and non-technical stakeholders.
Async-first mindset - can work independently, document decisions, and keep stakeholders informed without constant synchronous communication.
End-to-end ownership mentality - you see tasks through from planning to production, handling blockers and follow-through.
You care about data quality, pipeline reliability, and long-term maintainability.
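As a rough illustration of the data quality and monitoring mindset we look for, the sketch below runs basic completeness checks against a warehouse table using the snowflake-connector-python package. The table, columns, thresholds, and connection settings are hypothetical.

```python
# Rough sketch of programmatic data quality checks against Snowflake.
# Assumes snowflake-connector-python; connection parameters, the table name,
# and the thresholds are hypothetical placeholders.
import os

import snowflake.connector


def check_companies_table(min_rows: int = 1_000_000,
                          max_null_domain_pct: float = 0.5) -> bool:
    """Return True if the (hypothetical) COMPANIES table passes basic checks."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="REVENUE_DB",
        schema="CORE",
    )
    try:
        cur = conn.cursor()
        # Completeness: total rows and share of rows missing a company domain.
        cur.execute(
            "SELECT COUNT(*), "
            "       100 * SUM(IFF(domain IS NULL, 1, 0)) / NULLIF(COUNT(*), 0) "
            "FROM companies"
        )
        row_count, null_domain_pct = cur.fetchone()
        null_domain_pct = float(null_domain_pct or 0)
        passed = row_count >= min_rows and null_domain_pct <= max_null_domain_pct
        print(f"rows={row_count}, null_domain_pct={null_domain_pct:.2f}, passed={passed}")
        return passed
    finally:
        conn.close()


if __name__ == "__main__":
    raise SystemExit(0 if check_companies_table() else 1)
```

In practice, checks like these feed monitoring and alerting so data issues are caught before customers see them.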
Why RevenueBase:
Product with real traction: Customers rely on our platform in production.
High ownership: Small team where your work directly shapes the product.
Engineering-driven culture: Quality and correctness matter.
Growth stage company: Clear product-market fit and momentum.
Impact over process: Less bureaucracy, more building.
Competitive compensation based on experience.
Meaningful ownership and long-term growth opportunities.
Flexible working hours.
Fully remote-friendly team.
Direct collaboration with founders and core engineering leadership.


