Design, develop, and maintain data warehouse infrastructure while ensuring secure and efficient data pipelines for advanced analytics. Requires collaboration across teams and robust data governance practices.
Bridgeway is seeking a Senior Data Engineer to design, develop, and maintain our data warehouse infrastructure. This role involves working closely with analysts, engineers, and other stakeholders to shape our data architecture, ensuring secure and efficient data pipelines, and enabling advanced analytics across the organization. The ideal candidate will have a strong background in data engineering, data warehousing, and ELT processes, along with a passion for optimizing data systems.
This is a remote position, with preference given to East Coast candidates.
Key Responsibilities:
- Design, develop, and maintain a scalable data warehouse/lakehouse environment.
- Design and implement ELT pipelines to ingest, transform, and deliver high-quality data for analytics and reporting, incorporating current best practices, such as “pipelines as code”.
- Ensure data security and compliance, including role-based access controls, encryption, masking, and governance best practices for handling sensitive information.
- Optimize performance of data workflows and storage for cost efficiency and speed.
- Partner with engineers, analysts, and stakeholders to meet data needs; balance cost, performance, simplicity, and time-to-value while mentoring teams and documenting standards.
- Define and implement robust testing frameworks, enforce data contracts, and establish observability practices including lineage tracking, SLAs/SLOs, and incident response runbooks to maintain data integrity and trustworthiness.
- Monitor, troubleshoot, and resolve data and automation issues.
- Collaborate within an Agile-Scrum framework and develop comprehensive technical design documentation to ensure efficient and successful delivery.
- Serve as a trusted expert on organizational data domains, processes, and best practices.
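The data-contract and testing responsibilities above can be illustrated with a minimal sketch in plain Python. The schema, column names, and sample rows are hypothetical; in practice a team would enforce contracts with pipeline-native tooling such as Delta Live Tables expectations or a dedicated data-quality framework rather than hand-rolled checks.

```python
# Minimal illustrative data-contract check. REQUIRED_COLUMNS and the sample
# rows are hypothetical; real pipelines would enforce this with DLT
# expectations or a data-quality framework.

REQUIRED_COLUMNS = {"order_id": int, "amount": float, "region": str}

def validate_row(row: dict) -> list[str]:
    """Return a list of contract violations for one record."""
    errors = []
    for column, expected_type in REQUIRED_COLUMNS.items():
        if column not in row:
            errors.append(f"missing column: {column}")
        elif not isinstance(row[column], expected_type):
            errors.append(f"{column}: expected {expected_type.__name__}, "
                          f"got {type(row[column]).__name__}")
    return errors

good = {"order_id": 1, "amount": 19.99, "region": "east"}
bad = {"order_id": "1", "amount": 19.99}  # wrong type, missing column

assert validate_row(good) == []
assert validate_row(bad) == ["order_id: expected int, got str",
                             "missing column: region"]
```

Records that fail such checks are typically quarantined or dropped, with violation counts surfaced through the observability stack so SLAs/SLOs can be tracked.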
Requirements:
- 5+ years of experience in data engineering and ELT with a focus on large-scale data platforms
- 3+ years of experience with Databricks
- Advanced proficiency in analytical SQL, including ANSI SQL, T-SQL, and Spark SQL
- Strong Python skills for data engineering
- Expertise in data modeling
- Hands-on experience with data quality and observability practices (tests, contracts, lineage tracking, alerts)
- Practical knowledge of orchestration tools and CI/CD concepts for data workflows
- Excellent communication and a track record of technical leadership and mentoring
- Strong understanding of integrating data solutions with AI and machine learning models
- Strong problem-solving skills and attention to detail
- Experience with version control systems like Git preferred
- Strong understanding of data governance and best practices in data management, with hands-on experience using Unity Catalog
- Hands-on experience in designing and managing data pipelines using Delta Live Tables (DLT) on Databricks
- Experience with streaming and ingestion tools such as Kafka, Kinesis, Event Hubs, Debezium, or Fivetran
- Knowledge of DAX, LookML, or dbt; Airflow, Dagster, or Prefect; Terraform; Azure DevOps; Power BI, Looker, or Tableau; and GitHub Copilot is a plus
- Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred
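The orchestration and CI/CD concepts listed above all rest on the same core idea: pipeline tasks form a dependency graph and must run in topological order. A minimal sketch using only the Python standard library (the task names are hypothetical, not part of any real pipeline):

```python
from graphlib import TopologicalSorter

# Hypothetical ELT dependency graph: each task maps to the set of
# tasks it depends on, the same shape Airflow/Dagster/Prefect DAGs encode.
pipeline = {
    "ingest_orders": set(),
    "ingest_customers": set(),
    "transform_sales": {"ingest_orders", "ingest_customers"},
    "publish_dashboard": {"transform_sales"},
}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(pipeline).static_order())

assert order.index("transform_sales") > order.index("ingest_orders")
assert order[-1] == "publish_dashboard"
```

Production orchestrators add scheduling, retries, and alerting on top of this ordering, and "pipelines as code" means the graph itself lives in version control and deploys through CI/CD.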
Top Skills
Airflow
Azure DevOps
Dagster
Databricks
DAX
dbt
Debezium
Delta Live Tables
Event Hubs
Fivetran
Git
Kafka
Kinesis
Looker
LookML
Power BI
Prefect
Python
Spark SQL
SQL
T-SQL
Tableau
Terraform

