The Junior Data Engineer will develop and support ETL processes in the Enterprise Data Warehouse, enhance data transformations, and collaborate with teams on technical solutions. Responsibilities include production support and documentation.
Job Summary & Responsibilities
POSITION OVERVIEW:
We are seeking a junior Data Engineer to support and enhance our Enterprise Data Warehouse. The role focuses on supporting and modernizing ETL processes within an on-premises Cloudera Data Platform (CDP) environment, adopting technologies like Apache Spark, Apache Iceberg, and Apache Airflow for scalable, efficient, and reliable data transformation and management. The ideal candidate will have knowledge of ETL development, along with experience in production support environments.
Essential job functions:
- Development
- Contribute development efforts for ETL pipelines in the Enterprise Data Warehouse (EDW)
- Rebuild legacy ETL jobs that currently lack ACID transaction support as modern Apache Spark and Apache Iceberg pipelines with ACID guarantees.
- Transform and integrate EBCDIC Mainframe data into Hive and Impala tables using Precisely Connect for Big Data.
- Optimize data transformation processes for performance, scalability, and reliability.
- Ensure data consistency, accuracy, and quality across the ETL pipelines.
- Utilize best practices for ETL code development, version control, and deployment using Azure DevOps.
- Production Support
- Share weekly 24/7 production support duties with a managed service vendor on a four-week rotation.
- Monitor ETL workflows and troubleshoot issues to ensure smooth production operations.
- Research and resolve user requests and issues
- Collaboration and Stakeholder Engagement
- Collaborate with cross-functional teams, including data engineers, business analysts, administrators, and quality analyst engineers to ensure alignment on requirements and deliverables.
- Engage with business stakeholders to understand data requirements and translate them into scalable technical solutions.
- Technical Governance
- Contribute to process documentation and follow best practices within the Enterprise Data Warehouse
- Follow proper SDLC protocols within Azure DevOps code repository
- Stay updated on emerging technologies and trends to continuously improve data platform capabilities.
- Other tasks as assigned by management
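The ACID rebuild described above centers on upsert semantics (what Iceberg exposes as `MERGE INTO`): applying a change-data-capture batch so readers either see the old snapshot or the new one, never a half-applied state. Below is a minimal, illustrative pure-Python sketch of those semantics; the dict-based table, the `apply_changes` function, and the batch format are all hypothetical stand-ins, not real Spark or Iceberg APIs.

```python
def apply_changes(table, changes, key="id"):
    """Apply a change-data-capture batch to a keyed table.

    table:   dict mapping key -> row (current state)
    changes: list of {"op": "upsert" | "delete", "row": {...}} records
    Returns a new dict and leaves the input untouched, mirroring the
    snapshot-isolation behavior an ACID table format provides.
    """
    result = dict(table)  # work on a copy: readers keep the old snapshot
    for change in changes:
        k = change["row"][key]
        if change["op"] == "delete":
            result.pop(k, None)
        else:  # upsert: insert unmatched rows, overwrite matched ones
            result[k] = change["row"]
    return result

# Example batch: one update, one insert, one delete.
current = {1: {"id": 1, "amt": 10}, 2: {"id": 2, "amt": 20}}
batch = [
    {"op": "upsert", "row": {"id": 2, "amt": 25}},
    {"op": "upsert", "row": {"id": 3, "amt": 30}},
    {"op": "delete", "row": {"id": 1}},
]
new_state = apply_changes(current, batch)
```

In a real Spark-on-Iceberg job, the same outcome would come from a single `MERGE INTO` statement, with Iceberg's snapshot commit supplying the atomicity that this sketch only imitates by copying the dict.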
MINIMUM REQUIREMENTS:
- Bachelor’s degree in IT or similar field. (Additional equivalent experience above the required minimum may be substituted for the degree requirement.)
- Basic SQL experience
- Basic Python and general SDLC best practices.
- Basic Linux operations skills.
- Familiarity with version control systems
- Familiarity with data modeling, schema design, and building data models for reporting needs.
- Understanding of ETL frameworks, ACID transactions, change data capture, and distributed computing.
- Effective communication skills to collaborate with diverse teams and stakeholders.
- Timeline-centric mindset
- This position requires (6C) personnel security screening in accordance with the U.S. Department of Education’s (ED) policy regarding the personnel security screening requirements for all contractor and subcontractor employees. A qualified applicant must successfully submit for personnel security screening within 14 calendar days from employment offer.
PREFERRED QUALIFICATIONS:
- Experience with Cloudera Data Platform (CDP), including Hive and Impala
- Knowledge of Precisely Connect for Big Data or similar tools for mainframe data transformation
Top Skills
Apache Airflow
Apache Iceberg
Apache Spark
Azure DevOps
Cloudera Data Platform
Hive
Impala
Linux
Precisely Connect for Big Data
Python
SQL