
DeleteMe

Data & Analytics Engineer

Posted Yesterday
Remote
Hiring Remotely in United States
$100K Annually
Mid level

About DeleteMe: 

DeleteMe is the leader in proactive privacy protection. We help individuals, families, businesses, and security teams reduce their human attack surface by continuously monitoring and removing exposed personal data (PII) from the open web — the very data threat actors use for social engineering, phishing, Gen-AI deepfakes, doxxing campaigns, physical threats, and identity fraud.

Operating as a fast-growing, global SaaS company, DeleteMe serves both consumers and enterprises. DeleteMe has completed over 100 million opt-out removals, helping customers reduce risks associated with identity theft, spam, doxxing, and other cybersecurity threats. We deliver detailed privacy reports, continuous monitoring, and expert support to ensure ongoing protection.

DeleteMe acts as a scalable, managed defense layer for your most vulnerable attack vector: your people. That’s why 30% of the Fortune 100, top tech firms, major banks, federal agencies, and U.S. states rely on DeleteMe to protect their workforce.

DeleteMe is led by a passionate and experienced team and driven by a powerful mission to empower consumers with privacy.


Job Summary:

This position is a key partner across the organization, sitting within the Data Warehouse team to bridge the gap between raw data engineering and business strategy. The Data & Analytics Engineer is responsible for designing, building, and optimizing scalable data models in Snowflake using dbt, ensuring data integrity and high performance. This role balances technical warehouse architecture with the ability to translate complex business requirements into actionable data products.

Job Responsibilities:

  • Data Modeling & Development: Architect and maintain robust, modular data models in Snowflake using dbt, following industry-standard modeling methodologies (e.g., Kimball).
  • Warehouse Optimization: Write and tune advanced SQL to ensure optimal query performance, cost-efficiency, and resource management within the Snowflake environment.
  • Data Observability & Quality: Implement and manage automated testing, monitoring, and alerting frameworks to ensure data accuracy, freshness, and lineage.
  • Stakeholder Collaboration: Partner with business units to define KPIs, capture requirements, and translate business logic into technical data specifications.
  • End-to-End Delivery: Own the full data lifecycle, from ingestion to production-grade data marts, BI visualizations, and dashboards.
  • Engineering Excellence: Apply software engineering best practices to data development, including version control (Git), CI/CD, and detailed technical documentation.
  • Process Improvement: Continuously refactor legacy code and data structures to improve the maintainability and scalability of the analytics stack.
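To give a concrete sense of the data observability work described above, here is a minimal sketch of automated not-null and uniqueness checks — the kind of tests dbt runs declaratively. It uses Python's built-in sqlite3 in place of Snowflake, and the `orders` table and its columns are hypothetical, not part of DeleteMe's actual stack:

```python
import sqlite3

# Hypothetical "orders" table standing in for a warehouse model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 25.0), (2, 10, 40.0), (3, 11, 15.0);
""")

def assert_not_null(conn, table, column):
    """Fail if any row has a NULL in the column (a hand-rolled dbt-style not_null test)."""
    bad = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL").fetchone()[0]
    assert bad == 0, f"{table}.{column}: {bad} NULL rows"

def assert_unique(conn, table, column):
    """Fail if the column contains duplicate values (a hand-rolled dbt-style unique test)."""
    dupes = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    assert dupes == 0, f"{table}.{column}: {dupes} duplicated values"

assert_not_null(conn, "orders", "customer_id")
assert_unique(conn, "orders", "order_id")
print("all checks passed")
```

In a real dbt project these checks would live in a model's YAML file as `not_null` and `unique` tests rather than hand-written functions; the sketch only shows what those tests verify.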

Job Requirements:

  • Mastery of complex SQL, including window functions, CTEs, and performance tuning for large-scale datasets.
  • Proven experience building production-grade dbt projects, including macros, seeds, and testing suites.

  • Strong understanding of Snowflake-specific features such as clustering, virtual warehouses, and zero-copy cloning.

  • Deep knowledge of dimensional modeling, fact/dimension design, and data warehousing principles.

  • Availability during US Eastern Time (ET) business hours.

  • Ability to understand organizational drivers and communicate technical details effectively to non-technical stakeholders.

  • Strong problem-solving skills with the ability to identify root causes in data discrepancies or performance bottlenecks.
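As an illustration of the SQL skills listed above, the query below combines a CTE with a window function to rank customers by total spend. It runs against Python's built-in sqlite3 (SQLite ≥ 3.25) purely for demonstration; the schema and data are made up, not taken from the role:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 25.0), (2, 10, 40.0), (3, 11, 15.0), (4, 11, 60.0);
""")

# The CTE aggregates spend per customer; the window function ranks customers by it.
query = """
WITH customer_totals AS (
    SELECT customer_id, SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer_id
)
SELECT customer_id,
       total_spend,
       RANK() OVER (ORDER BY total_spend DESC) AS spend_rank
FROM customer_totals
ORDER BY spend_rank
"""
for row in conn.execute(query):
    print(row)  # (customer_id, total_spend, spend_rank)
```

On Snowflake the same pattern applies unchanged, with performance tuning layered on top via clustering keys and warehouse sizing.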

Qualifications:

  • Bachelor’s degree in Computer Science, Data Science, Statistics, Business, or a related field.

  • 3+ years of experience in Analytics Engineering, Data Engineering, or a highly technical BI role.

  • Proficiency in Snowflake, dbt (with strong SQL), and data architecture.

  • Proven track record of delivering end-to-end data solutions in a cloud warehouse environment.

  • Strong data storytelling and presentation skills.

  • Experience supporting business functions such as Finance, Operations, Sales, and Marketing, preferably in SaaS.

Nice to Have:

  • Experience with Python for data scripting or automation.

  • Familiarity with data observability tools (e.g., Monte Carlo, Elementary).

  • Experience in a high-growth startup environment.

  • Cybersecurity experience.

What We Offer:

  • Comprehensive health benefits - medical, vision, and dental

  • Flexible work schedule

  • Generous 401k matching up to 6%

  • 20 days paid time off

  • 15 sick days

  • 12 company-paid holidays

  • Childcare expense reimbursement

  • Fitness and cell phone reimbursement

  • Birthday time off

  • Competitive salary - We publish salary ranges to promote transparency and ensure fair compensation. Final offers are based on skills, experience, and internal equity.

  • This role may require occasional domestic and international travel. All standard travel expenses will be covered in accordance with the company's travel reimbursement policy.

Top Skills

CI/CD
dbt
Git
Snowflake
SQL


