
J5 Consulting

Data Architect

Posted Yesterday
Remote
Hiring Remotely in United States
Expert/Leader

J5 Consulting is a Maryland-based company established in 2006 to provide computing and consulting services for government and commercial entities. Our services improve information system networking performance and compliance and protect electronic assets from loss and compromise. We welcome your application for consideration for the following position.

Introduction
The Sponsor requires Data Engineering support to evaluate, optimize, and implement robust data infrastructure that enables reliable, accessible, and scalable data delivery across the organization. The Contractor will work collaboratively with data consumers, technical teams, leadership, and stakeholders to assess current data pipelines, identify gaps in data accessibility and reliability, and architect solutions that establish trusted data foundations. Work involves applying engineering best practices to implement proper data modeling and integration patterns, ensuring data quality and observability throughout pipelines, and creating maintainable infrastructure that supports analytics, reporting, and operational use cases.
The Sponsor's data landscape includes enterprise operational systems such as ServiceNow, network management platforms (NetIM), and network modeling tools (Forward Networks). The Data Engineering support must be adept at extracting data from these systems via APIs (Application Programming Interfaces), exports, and vendor-specific interfaces, often with limited documentation or non-standard data structures, and at transforming this operational data into accessible, integrated datasets.
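As an illustration of the transformation work described above, the sketch below flattens a nested JSON API response into flat records using only the Python standard library. The field names (`number`, `state`, `assigned_to`, `cmdb_ci`) are hypothetical, ServiceNow-flavored examples, not the actual vendor schema.

```python
import json

# Hypothetical API response resembling an incident export; the nested
# "display_value" reference objects are illustrative, not a real schema.
raw = json.loads("""
{
  "result": [
    {"number": "INC0001", "state": "open",
     "assigned_to": {"display_value": "A. Analyst"},
     "cmdb_ci": {"display_value": "core-router-01"}},
    {"number": "INC0002", "state": "closed",
     "assigned_to": null,
     "cmdb_ci": {"display_value": "edge-switch-07"}}
  ]
}
""")

def flatten(record):
    """Collapse nested reference objects into flat scalar columns."""
    return {
        "number": record["number"],
        "state": record["state"],
        "assigned_to": (record.get("assigned_to") or {}).get("display_value"),
        "ci": (record.get("cmdb_ci") or {}).get("display_value"),
    }

rows = [flatten(r) for r in raw["result"]]
# rows is now a list of flat dicts ready to load into a structured table.
```

The `or {}` guards handle the nulls and missing keys that are common in lightly documented vendor exports.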


Required Skills and Demonstrated Experience:  

  • Demonstrated experience designing, building, and maintaining production data pipelines using orchestration tools such as Apache Airflow or similar.
  • Demonstrated experience with SQL skills including complex queries, optimization, and performance tuning across multiple database platforms.
  • Demonstrated experience integrating data from Sponsor SaaS platforms and operational systems via APIs, including handling authentication, pagination, and rate limiting.
  • Demonstrated experience working with semi-structured data (JSON and XML) from API responses and transforming into structured datasets.
  • Demonstrated experience with developing robust API integrations with proper error handling and retry logic.
  • Demonstrated experience working with systems that have limited documentation or vendor-specific data models.
  • Demonstrated experience with dimensional modeling and data warehouse design patterns.
  • Demonstrated proficiency in Python for data engineering including working with data processing libraries.
  • Demonstrated experience with cloud data platforms such as AWS, Azure, or GCP, including data services and infrastructure.
  • Demonstrated experience implementing ETL/ELT processes from diverse data sources.
  • Demonstrated experience with version control (Git) and software engineering best practices.
  • Demonstrated strong problem-solving and troubleshooting skills for complex data pipeline issues.
  • Demonstrated experience implementing data quality checks and validation frameworks.
  • Demonstrated experience translating business requirements into technical data solutions.
  • Demonstrated track record of delivering reliable, scalable data infrastructure.

Highly Desired Skills and Demonstrated Experience: 

  • Demonstrated experience with ServiceNow APIs, data models, and integration patterns.
  • Demonstrated experience with network management or IT operations systems data extraction.
  • Demonstrated experience with Forward Networks, NetIM, SolarWinds, or similar network management platforms.
  • Demonstrated experience and knowledge of ITSM (Information Technology Service Management), ITOM (Information Technology Operations Management), and CMDB (Configuration Management Database) data structures and relationships.
  • Demonstrated experience with API gateway platforms and API management tools.
  • Demonstrated experience with Apache Spark, particularly PySpark, for distributed data processing.
  • Demonstrated experience with DBT (data build tool) for transformation workflows.
  • Demonstrated experience with infrastructure-as-code tools such as Terraform or CloudFormation.
  • Demonstrated experience implementing CI/CD (Continuous Integration/Continuous Delivery) pipelines for data engineering code.
  • Demonstrated experience and knowledge of streaming data technologies such as Kafka, Kinesis, or similar platforms.
  • Demonstrated experience with data quality platforms such as Great Expectations, Soda, or Monte Carlo.
  • Demonstrated experience implementing data observability and monitoring solutions.
  • Demonstrated experience and knowledge of Data Vault or other advanced modeling methodologies.
  • Demonstrated experience with containerization (Docker) and orchestration (Kubernetes) for data workloads.
  • Demonstrated experience with reverse ETL and operational analytics patterns.
  • Demonstrated experience with data governance platforms and metadata management tools.
  • Demonstrated experience with multiple cloud platforms and multi-cloud architectures.
  • Demonstrated experience mentoring or leading data engineering initiatives.
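As a minimal sketch of the data-quality checks mentioned above (not tied to Great Expectations, Soda, or any specific platform), a validation pass over extracted rows might look like the following; the check names and the incident-style rows are illustrative.

```python
def validate(rows, checks):
    """Run named predicate checks over every row.

    Returns failures as (row_index, check_name) pairs, so a pipeline can
    quarantine bad records or alert instead of loading them silently.
    """
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures.append((i, name))
    return failures

# Illustrative checks for a hypothetical incident dataset.
checks = {
    "number_present": lambda r: bool(r.get("number")),
    "state_known": lambda r: r.get("state") in {"open", "closed"},
}

rows = [
    {"number": "INC0001", "state": "open"},
    {"number": "", "state": "archived"},
]
failures = validate(rows, checks)
# → [(1, "number_present"), (1, "state_known")]
```

Dedicated platforms add scheduling, profiling, and alerting on top of this core pattern of declarative, named expectations.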

__________________________________________________________________________________

US Citizenship:

  • This position requires US Citizenship. Verification of US Citizenship to meet federal government security requirements will be confirmed.

Security Clearance:

  • The successful candidate must have an active U.S. Government Top Secret Security Clearance with a Full Scope Polygraph.
  • Clearance Verification: This position requires successful verification of the stated security clearance to meet federal government customer requirements. You will be asked to provide clearance verification information prior to an offer of employment.

Travel: 

  • This position is expected to be onsite. The position will be located within the Washington Metropolitan Area (WMA). Local travel/POV (privately owned vehicle) will be on an as-needed basis within the local place of performance.

