Cybermedia Technologies (CTEC)

Data Engineer

Reposted 11 Days Ago
Remote (Hiring Remotely in United States)
Senior level
The Data Engineer will design and maintain ETL pipelines in Azure Databricks, manage data migrations, and ensure data quality and governance.

CTEC is a leading technology firm that provides modernization, digital transformation, and application development services to the U.S. Federal Government. Headquartered in McLean, VA, CTEC has over 300 team members working on mission-critical systems and projects for agencies such as the Department of Homeland Security, Internal Revenue Service, and the Office of Personnel Management. The work we do affects millions of U.S. citizens daily as they interact with the systems we build. Our best-in-class commercial solutions, modified for our customers’ bespoke mission requirements, are enabling this future every day.

The Company has experienced rapid growth over the past 3 years and recently received a strategic investment from Main Street Capital Corporation (NYSE: MAIN). In addition to our recent growth in Federal Civilian agencies, we are seeking to expand our cloud development capabilities and our footprint in national-security-focused agencies within the Department of Defense and U.S. Intelligence Community.


We are seeking to hire a Data Engineer to join our team!

Client:
CTEC develops and delivers innovative customer-centric technologies and solutions that support the Office of Personnel Management’s (OPM) Health and Insurance business unit and Office of the Chief Information Officer (OCIO).

Duties and Responsibilities:

  • Data Pipeline & ETL Development: Design, develop, and maintain scalable ETL pipelines and data workflows to ingest, transform, and integrate data from legacy systems and external sources into modern cloud-based data platforms.
  • Cloud Data Platform & Databricks Implementation: Build, optimize, and maintain data processing solutions using Azure Databricks and lakehouse architectures to support analytical, operational, and reporting use cases.
  • Data Migration Support: Support phased data migration from legacy databases and ETL tools to Azure Databricks environments, including transformation documentation and data mapping.
  • Lakehouse & Data Layer Design: Implement layered lakehouse data architectures (e.g., bronze, silver, gold layers) in Databricks to support data quality, performance, and downstream reporting needs.
  • Python & PySpark Development: Develop data processing notebooks, workflows, and distributed data transformations using Python and PySpark within Databricks environments.
  • Data Quality & Validation: Develop data validation, reconciliation, and testing processes to ensure data accuracy, completeness, and consistency across data domains.
  • Data Integration & Interoperability: Integrate Databricks data platforms with analytics and reporting tools to enable business intelligence and operational dashboards.
  • Data Governance & Security Implementation: Support data governance initiatives including metadata management, data catalog integration, encryption, access controls, and compliance with federal data protection requirements.
  • DevOps & CI/CD Support: Maintain source control and CI/CD pipelines for Databricks and data engineering workflows, supporting automated promotion across environments.
  • Technical Collaboration: Work closely with data architects, solution architects, business analysts, and reporting teams to implement approved data solutions.
  • Operational Support & Troubleshooting: Provide ongoing support for Databricks workflows, resolve pipeline failures, and troubleshoot complex data processing issues.
  • Mentorship & Knowledge Sharing: Provide guidance to junior data engineers and contribute to documentation and team enablement.
  • Work under minimal supervision with minor guidance from senior personnel.

Skills & Work Experience:

  • Professional Experience: At least seven (7+) years of experience in data engineering, ETL development, or large-scale data integration environments.
  • Strong experience designing and developing ETL pipelines and data transformations in Azure Databricks environments.
  • Strong proficiency in SQL and Python, with hands-on experience using PySpark for distributed data processing.
  • Experience working with cloud-based data platforms, data lakes, and lakehouse environments, preferably on Microsoft Azure.
  • Experience implementing layered lakehouse data architectures (bronze, silver, gold) for enterprise analytics.
  • Familiarity with Spark-based big data processing frameworks.
  • Experience supporting data migration from legacy databases and ETL tools to Databricks-based platforms.
  • Experience integrating Databricks platforms with business intelligence and reporting tools such as Power BI.
  • Familiarity with data governance, metadata management, and data security best practices.
  • Experience with source control and CI/CD pipelines for data engineering and Databricks workflows.
  • Working knowledge of SDLC and Agile delivery methodologies.
  • Excellent organizational, communication, and collaboration skills.

Preferred:

  • Hands-on experience with Azure Databricks workflows, Delta tables, and Databricks job orchestration.
  • Experience with Azure Data Lake, Azure Data Factory, or similar Azure data services.
  • Experience supporting federal IT modernization or large-scale enterprise data transformation initiatives.
  • Familiarity with healthcare, insurance, or benefits administration data environments.
  • Experience implementing data governance or data catalog platforms.

Education:

Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related technical discipline. Equivalent education or professional experience will be considered in lieu of a degree.

Clearance:
Must be a U.S. citizen and be able to obtain a Public Trust clearance.

If you are looking for a fun and challenging environment with talented, motivated people to work with, CTEC is the right place for you. In addition to employee salary, we offer an array of employee benefits including:

  • Paid vacation and sick leave
  • Health insurance coverage
  • Career training
  • Performance bonus programs
  • 401(k) contribution and employer match
  • 11 Federal Holidays


Top Skills

Azure Data Factory
Azure Data Lake
Azure Databricks
CI/CD
Data Governance
ETL
Power BI
PySpark
Python
SQL


