
ComboCurve

Senior Data Engineer, Python

Posted 2 Hours Ago
Remote
Hiring Remotely in United States
Senior level
The Senior Data Engineer will design and maintain data pipelines, develop data models, and collaborate with teams on optimization problems, using Python and cloud technologies.

ComboCurve is an industry-leading cloud-based software solution for A&D, reservoir management, and forecasting in the energy sector. Our platform empowers professionals to evaluate assets, optimize workflows, and manage reserves efficiently, all in one integrated environment.
By streamlining data integration and enhancing collaboration, we help operators, engineers, and financial teams make informed decisions faster. Trusted by top energy companies, ComboCurve delivers real-time analytics and exceptional user support, with a world-class customer experience team that responds to inquiries in under 5 minutes.

We are seeking a highly analytical and experienced Senior Data Engineer to help optimize production forecasting and operations scheduling within the petroleum engineering domain. You’ll bridge the gap between complex mathematical models (reservoir dynamics, optimization, logistics) and robust, cloud-scale data systems.

This role requires a unique combination of deep Python expertise, mastery of modern data processing and API frameworks, and a strong foundational understanding of mathematics, reasoning, and petroleum engineering principles.


Responsibilities

Data Architecture & Engineering

  • Design, build, and maintain scalable data pipelines for ingesting, transforming, and validating time-series data related to well performance, sensor readings, and operational logs.
  • Develop robust, high-performance data models using PyArrow and Pandas for efficient analysis and transfer.
  • Implement data quality and schema validation using Pydantic to ensure data integrity across all stages of the pipeline (see the validation sketch after this list).
  • Manage and optimize data storage and retrieval in MongoDB, and integrate with cloud-native platforms like GCP BigQuery or Snowflake where applicable.
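
As a rough illustration of the validation and data-modeling work described above, here is a minimal sketch (assuming the Pydantic v2 API) of schema-checking raw well time-series rows before handing them to Pandas and PyArrow. The field names, units, and thresholds are hypothetical examples, not ComboCurve's actual schema.

# Illustrative sketch: validating raw well time-series records with Pydantic
# before loading them into Pandas/PyArrow. Field names and limits are
# hypothetical placeholders.
from datetime import datetime
from pydantic import BaseModel, Field, ValidationError
import pandas as pd
import pyarrow as pa


class WellReading(BaseModel):
    well_id: str
    timestamp: datetime
    oil_rate_bbl_per_day: float = Field(ge=0)   # negative rates rejected
    gas_rate_mcf_per_day: float = Field(ge=0)
    water_cut: float = Field(ge=0, le=1)        # fraction between 0 and 1


def validate_batch(raw_rows: list[dict]) -> pd.DataFrame:
    """Validate a batch of raw rows, dropping (and reporting) bad records."""
    good, bad = [], []
    for row in raw_rows:
        try:
            good.append(WellReading(**row).model_dump())
        except ValidationError as exc:
            bad.append((row, exc))
    if bad:
        print(f"rejected {len(bad)} of {len(raw_rows)} rows")
    return pd.DataFrame(good)


df = validate_batch([
    {"well_id": "W-001", "timestamp": "2024-01-01T00:00:00",
     "oil_rate_bbl_per_day": 412.5, "gas_rate_mcf_per_day": 820.0, "water_cut": 0.32},
    {"well_id": "W-002", "timestamp": "2024-01-01T00:00:00",
     "oil_rate_bbl_per_day": -5.0, "gas_rate_mcf_per_day": 0.0, "water_cut": 0.9},
])
table = pa.Table.from_pandas(df)  # columnar handoff for efficient transfer/analysis
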

API & Application Development

  • Build, deploy, and maintain high-performance asynchronous microservices and prototypes using FastAPI or Flask to serve complex optimization and scheduling model predictions (see the service sketch after this list).
  • Use Postman for testing, documenting, and automating API workflows.
  • Containerize and orchestrate applications using Docker and manage deployment on Google Cloud Platform (GCP).
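
The sketch below shows a minimal asynchronous FastAPI service of the kind described above: one endpoint that accepts a scheduling request and returns model output. The endpoint path, payload fields, and predict() stub are hypothetical placeholders, not ComboCurve's API.

# Illustrative sketch: a minimal async FastAPI service exposing a scheduling
# prediction endpoint. All names below are made-up placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scheduling-service")


class ScheduleRequest(BaseModel):
    well_ids: list[str]
    horizon_days: int


class ScheduleResponse(BaseModel):
    well_id: str
    start_day: int


async def predict(req: ScheduleRequest) -> list[ScheduleResponse]:
    # Placeholder for the real optimization/scheduling model call.
    return [ScheduleResponse(well_id=w, start_day=i) for i, w in enumerate(req.well_ids)]


@app.post("/schedule", response_model=list[ScheduleResponse])
async def schedule(req: ScheduleRequest) -> list[ScheduleResponse]:
    return await predict(req)

# Run locally with: uvicorn app:app --reload  (assuming the file is app.py)

In practice a service like this would be containerized with a standard Dockerfile and deployed on GCP (for example Cloud Run), with Postman collections used to test and document the endpoint.
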

Quantitative Analysis & Optimization

  • Collaborate with reservoir and operations teams to translate complex scheduling and logistics problems into mathematical models (e.g., linear programming, resource allocation); a toy formulation is sketched after this list.
  • Implement numerical routines and simulations efficiently using NumPy for use in production environments.
  • Apply strong logical and analytical reasoning to debug, validate, and interpret the outputs of operational scheduling algorithms.
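
As a toy example of translating a resource-allocation question into a mathematical model, the sketch below formulates a small binary (mixed-integer) selection problem with PuLP, one of the libraries named under the preferred qualifications. The well names, production estimates, and crew-day budget are made-up numbers.

# Illustrative sketch: pick which wells to work over to maximize expected
# incremental production within a crew-day budget. All data are invented.
import pulp

wells = ["W-001", "W-002", "W-003"]
est_production = {"W-001": 420, "W-002": 310, "W-003": 505}  # bbl/day uplift if worked over
crew_days = {"W-001": 3, "W-002": 2, "W-003": 4}             # crew-days required
available_crew_days = 5

prob = pulp.LpProblem("workover_selection", pulp.LpMaximize)
x = pulp.LpVariable.dicts("select", wells, cat="Binary")

# Objective: maximize expected production uplift.
prob += pulp.lpSum(est_production[w] * x[w] for w in wells)
# Constraint: stay within the crew-day budget.
prob += pulp.lpSum(crew_days[w] * x[w] for w in wells) <= available_crew_days

prob.solve(pulp.PULP_CBC_CMD(msg=False))
selected = [w for w in wells if x[w].value() == 1]
print("selected wells:", selected, "| expected uplift:", pulp.value(prob.objective))
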

Requirements

  • Education: Bachelor’s or Master’s degree in Petroleum Engineering, Computer Science, Mathematics, Operations Research, or related quantitative field, or equivalent experience.
  • Quantitative Strength: Proven ability to work with mathematical modeling, optimization, and time-series analysis, including:
      • Linear and Mixed-Integer Programming
      • Probability and Statistics
      • Algorithmic Complexity and Performance Reasoning

  • Collaborative mindset — experience working closely with data scientists, product owners, and domain experts to deliver production-ready systems.

Preferred Qualifications

  • Domain Expertise: Solid understanding of well operations, drilling logistics, production data, and scheduling workflows.
  • Experience working with large-scale or streaming datasets.
  • Experience with mathematical modeling and optimization libraries (SciPy, PuLP, OR-Tools).
  • Experience setting up CI/CD pipelines and container deployments on GCP.

Top Skills

CI/CD
Docker
FastAPI
Flask
GCP BigQuery
MongoDB
NumPy
OR-Tools
Pandas
Postman
PuLP
PyArrow
Pydantic
Python
SciPy
Snowflake
