
v4c.ai

Data Engineer

Posted 5 Hours Ago
Remote
Hiring Remotely in United States
Junior
Build and maintain Databricks-based ETL/ELT pipelines, transform and model data, optimize workflows, monitor and troubleshoot pipelines, and collaborate with cross-functional teams.
Position Overview

V4C.ai is seeking a motivated Data Engineer to join our remote team in the United States. In this role, you will support the design, development, and maintenance of data solutions using Databricks, helping clients and internal teams process, transform, and analyze data effectively. You'll work on building reliable data pipelines and workflows in a collaborative environment, gaining hands-on experience with modern data engineering tools and cloud technologies.

Key Responsibilities
  • Collaborate with team members and stakeholders to understand data requirements and contribute to building scalable data pipelines and workflows in Databricks.
  • Develop and implement ETL/ELT processes using Databricks, Python, SQL, and related tools to ingest, transform, and prepare data.
  • Assist in optimizing data workflows for better performance, reliability, and cost-efficiency within Databricks environments.
  • Support the creation and maintenance of data models, tables, and integrations in cloud platforms (Azure, AWS, or similar).
  • Work closely with cross-functional teams (data analysts, scientists, and engineers) to deliver clean, accessible data for analytics and reporting.
  • Monitor data pipelines, troubleshoot basic issues, and contribute to documentation and best practices.
  • Stay curious about new Databricks features and data engineering trends to support ongoing improvements.
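The ETL/ELT work described above can be sketched in miniature. This is an illustrative example only, not part of the posting: the order records, table name, and field layout are hypothetical, and a real Databricks pipeline would use Spark and cloud storage rather than the standard library shown here.

```python
import sqlite3

# Hypothetical raw records, standing in for data ingested from cloud storage.
raw_orders = [
    {"order_id": "1", "amount": "19.99", "region": "us-east"},
    {"order_id": "2", "amount": "", "region": "us-west"},  # missing amount
    {"order_id": "3", "amount": "5.00", "region": "us-east"},
]

def transform(rows):
    """Drop records with a missing amount and cast fields to proper types."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue
        clean.append((int(row["order_id"]), float(row["amount"]), row["region"]))
    return clean

def load(rows, conn):
    """Load transformed rows into a (hypothetical) analytics table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 24.99 — the record with no amount was filtered out
```

The same extract-transform-load shape carries over to Databricks: ingestion from object storage replaces the in-memory list, Spark DataFrames replace the Python loop, and a Delta table replaces the SQLite table.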
Requirements
  • Bachelor's degree in Computer Science, Data Science, Engineering, Information Systems, or a related field (or equivalent practical experience).
  • 1-2 years of professional experience in data engineering, data processing, analytics engineering, or a closely related role (internships, co-ops, or academic projects with relevant tools count toward this).
  • Hands-on experience building basic data pipelines or transformations.
  • Proficiency in Python and SQL; experience with Scala is a plus but not required.
  • Basic understanding of cloud platforms such as Azure, AWS, or GCP (e.g., working with storage, compute, or data services).
  • Solid analytical and problem-solving skills with attention to detail and a focus on writing clean, maintainable code.
  • Strong communication skills and ability to work collaboratively in a remote team environment.
  • Eagerness to learn, take ownership of tasks, and grow within data engineering.

Top Skills

AWS
Azure
Databricks
ELT
ETL
GCP
Python
Scala
SQL


