
Solace (solace.health)

Data Engineer

Reposted 24 Days Ago
Remote
Hiring Remotely in United States
Mid level
About Solace

Healthcare in the U.S. is fundamentally broken. The system is so complex that 88% of U.S. adults do not have the health literacy necessary to navigate it without help. Solace cuts through the red tape of healthcare by pairing patients with expert advocates and giving them the tools to make better decisions—and get better outcomes.

We're a Series C startup, founded in 2022 and backed by Inspired Capital, Craft Ventures, Torch Capital, Menlo Ventures, SignalFire, and IVP. Our U.S.-based team is lean, mission-driven, and growing quickly.

Solace isn't a place to coast. We're here to redefine healthcare—and that demands urgency, precision, and heart. If you're looking to stretch yourself, sharpen your edge, and do the best work of your life alongside a team that cares deeply, you're in the right place. We’re intense, and we like it that way.

Read more in our Bloomberg funding announcement here.

About the Role

At Solace, we are building the data foundation that will power patient outcomes. We are a small but growing team where everyone wears many hats and moves fast. We are looking for a Data Engineer who loves solving hard problems with clean, maintainable code and thrives in this high-growth startup environment.

In this role, you will architect the infrastructure that allows us to scale. You will be a core builder of our data platform, establishing the frameworks, standards, and best practices that will define our engineering culture for years to come. From raw data ingestion to complex modeling, your work will ensure that our data is not just available, but trusted, reliable, and ready for action.


What You’ll Do
  • Architect Robust Pipelines: Design, build, and optimize scalable data pipelines using Airflow, Python, dbt, and Snowflake. You will replace brittle manual processes with resilient, automated workflows.

  • Build Infrastructure as Code: Manage and evolve our cloud infrastructure (AWS/GCP) using Terraform, ensuring our platform is reproducible, secure, and scalable.

  • Elevate Code Quality: Write clean, production-grade code for complex data processing. You will champion engineering best practices, including code reviews, testing, and CI/CD.

  • Optimize Data Models: Collaborate with analysts to design performant SQL transformations and data models in Snowflake (experience with dbt is a huge plus).

  • Ensure Data Reliability: Implement observability and monitoring to catch issues before they impact stakeholders. You are the first line of defense for data quality.

  • Partner Cross-Functionally: Work closely with Data Analysts and Product Managers to understand their data needs and deliver high-quality data products that empower decision-making.
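To make the "Data Reliability" responsibility above concrete, here is a minimal sketch of the kind of data-quality checks such a pipeline might run before publishing a load. All function and field names are illustrative, not drawn from Solace's actual codebase; in a real stack these checks would typically live as dbt tests or Airflow task callbacks.

```python
from dataclasses import dataclass


@dataclass
class CheckResult:
    """Outcome of one data-quality check, suitable for alerting."""
    name: str
    passed: bool
    detail: str


def check_row_counts(source_count: int, target_count: int,
                     tolerance: float = 0.01) -> CheckResult:
    """Flag a load whose target row count drifts from the source
    by more than `tolerance` (as a fraction of the source count)."""
    if source_count == 0:
        return CheckResult("row_count", target_count == 0,
                           f"source empty, target has {target_count} rows")
    drift = abs(source_count - target_count) / source_count
    return CheckResult(
        "row_count", drift <= tolerance,
        f"drift {drift:.2%} (source={source_count}, target={target_count})")


def check_no_nulls(rows: list[dict], column: str) -> CheckResult:
    """Fail if any row is missing a value in a required column."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return CheckResult(f"no_nulls:{column}", nulls == 0,
                       f"{nulls} null values in '{column}'")
```

Checks like these catch silent failures (a half-finished load, an upstream schema change) before downstream dashboards do.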

What You Bring to the Table
  • Strong Python Proficiency: You are comfortable writing modular, testable, and efficient Python code for data processing and automation.

  • Advanced SQL & Snowflake: You have deep expertise in SQL and cloud data warehousing (Snowflake preferred), understanding how to optimize queries for performance and cost.

  • Orchestration Mastery: Proven experience building and maintaining complex workflows using Airflow (or similar tools).

  • Infrastructure Mindset: Familiarity with Terraform and cloud services (AWS or GCP). You understand how to provision and manage the resources your pipelines run on.

  • Security & Stewardship: You understand the gravity of handling sensitive medical data. You are experienced in properly handling PHI and PII, implementing secure access controls (RBAC), and adhering to strict governance standards.

  • Startup DNA: You are a self-starter who is comfortable with ambiguity. You take ownership of problems and are willing to wear many hats to get the job done.

  • Communication Skills: You can translate complex technical challenges into clear options for non-technical stakeholders.
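As a concrete illustration of the "modular, testable Python" asked for above, here is a small sketch of a pure transformation function and its batch wrapper. The field names (`patient_id`, `visit_date`, `source`) are hypothetical, not from any real schema; the point is that pure functions like these can be unit-tested with no warehouse or orchestrator running.

```python
from datetime import date, datetime


def normalize_record(raw: dict) -> dict:
    """Normalize one raw event record into a warehouse-ready row.

    Illustrative only: field names are hypothetical.
    """
    visit = raw["visit_date"]
    return {
        # Trim stray whitespace from identifiers coming off manual entry.
        "patient_id": str(raw["patient_id"]).strip(),
        # Accept either ISO-format strings or date objects from upstream.
        "visit_date": (visit if isinstance(visit, date)
                       else datetime.fromisoformat(visit).date()),
        # Default missing optional fields explicitly instead of passing None through.
        "source": raw.get("source", "unknown"),
    }


def normalize_batch(records: list[dict]) -> list[dict]:
    """Pure function over a batch: trivially unit-testable."""
    return [normalize_record(r) for r in records]
```

Keeping transformations pure like this is what makes code reviews, CI, and dbt-style testing cheap rather than aspirational.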

Bonus Points
  • dbt Expertise: Experience using dbt to manage transformations and implement testing/documentation standards.

  • Healthcare Background: Experience working with healthcare data standards or strictly regulated environments (HIPAA) is a plus.

  • Containerization: Experience with Docker and Kubernetes for deploying data applications.

Our Tech Stack
  • Compute & Storage: Snowflake, Postgres

  • Orchestration: Airflow

  • Infrastructure: Terraform, AWS/GCP

  • Transformation: dbt, SQL, Python

Applicants must be based in the United States.

Up for the Challenge?

We look forward to meeting you.

Fraudulent Recruitment Advisory: Solace Health will NEVER request bank details or offer employment without an interview. All legitimate communications come only from official solace.health or ashbyhq.com email addresses. Report suspicious activity to [email protected] or [email protected].

Top Skills

Airflow
AWS
dbt
GCP
Python
Snowflake
SQL
Terraform


