
MeridianLink

Principal Data Architect - AI

Posted Yesterday
Remote
Hiring Remotely in US
126K-215K Annually
Expert/Leader

Principal Data Architect

About the Role

Reporting to the Vice President of Data, the Principal Data Architect is the senior technical authority for how data is modeled, integrated, governed, and consumed across MeridianLink. You will design our enterprise data architecture end-to-end — from source-system ingestion through our Azure Databricks lakehouse and into the analytical, operational, and customer-facing data products our business depends on.

This is a hands-on role. You will not only define the meta-models, conceptual models, logical models, and physical schemas that govern our data — you will build them, prove them out in code, partner closely with data engineers as they implement them, and evolve them as the business grows. The right candidate has done this before in a FinTech SaaS environment and understands the trade-offs that come with multi-tenant data, regulated workloads, and customer-facing analytics.

What You Will Do

• Define the enterprise data architecture: Own the conceptual, logical, and physical data models for MeridianLink's analytical and operational data platform, including source-aligned, integrated, and consumption-ready layers.

• Build the meta-model: Design and maintain a meta-model that captures entities, relationships, business definitions, ownership, lineage, sensitivity classifications, and SLAs — and make sure it is wired into our tooling, not stuck in a slide deck.

• Drive the lakehouse strategy: Architect our medallion (bronze / silver / gold) Delta Lake patterns on Databricks; define standards for partitioning, clustering, schema evolution, slowly changing dimensions, and historical reproducibility.

• Be hands-on: Write PySpark, SQL, and Delta Lake code. Build reference implementations, prototype patterns, review pull requests, and personally model critical domains rather than delegating every detail.

• Lead data integration design: Set patterns for ingestion through Informatica Data Management Cloud (IDMC) and direct Databricks pipelines, including CDC, batch, streaming, and API-based sourcing from our SaaS products and third-party systems.

• Champion data governance and lineage: Partner with data governance, security, and compliance leaders to operationalize cataloging, lineage, classification, masking, and access controls across the platform (Unity Catalog, IDMC, and adjacent tools).

• Standardize data modeling practices: Establish the standards, naming conventions, and review processes used by the Data Engineering team. Coach engineers on dimensional modeling, Data Vault, and other techniques where they best fit the use case.

• Partner across the business: Work closely with Product, Engineering, Analytics, ML, Finance, Risk, and Customer-facing teams to translate business needs into durable data designs.

• Influence the roadmap: Identify gaps in tooling, capability, and skill; propose investments; and drive multi-quarter initiatives that materially improve how MeridianLink uses its data.
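As a concrete illustration of the "build the meta-model" responsibility above, here is a minimal Python sketch (all names and fields are hypothetical, not MeridianLink's actual schema) of entity metadata — ownership, sensitivity, SLAs, and lineage — captured as code rather than slideware:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """One entry in a hypothetical meta-model catalog."""
    name: str
    owner: str
    sensitivity: str                 # e.g. "public", "internal", "restricted"
    sla_hours: int                   # freshness SLA for the entity
    definition: str = ""
    upstream: list = field(default_factory=list)  # lineage: names of source entities

# Register two example entities and express lineage between them.
applicant = Entity("applicant", owner="lending-domain", sensitivity="restricted",
                   sla_hours=24, definition="A person applying for a loan")
application = Entity("application", owner="lending-domain", sensitivity="restricted",
                     sla_hours=4, upstream=["applicant"])

catalog = {e.name: e for e in (applicant, application)}

def lineage(name: str) -> list:
    """Walk upstream lineage transitively from a named entity."""
    seen, stack = [], list(catalog[name].upstream)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.append(n)
            stack.extend(catalog[n].upstream)
    return seen
```

Because the catalog is plain data, it can be queried, validated in CI, and synced into governance tooling — which is what "wired into our tooling, not stuck in a slide deck" implies in practice.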

Required Qualifications

• 12–15+ years of progressive experience in data engineering, data warehousing, and data architecture roles, with the most recent several years at the architect level.

• Demonstrated experience as a Data Architect at a SaaS company in the FinTech or financial services software space (lending, banking, payments, capital markets, insurance, or a closely related domain).

• Deep, hands-on expertise with Databricks and PySpark on Azure, including Delta Lake, Unity Catalog, structured streaming, and performance tuning at scale.

• Production experience with Informatica Data Management Cloud (IDMC) — or comparable enterprise integration platforms — for ingestion, transformation, and metadata-driven pipelines.

• Proven track record of designing and implementing detailed meta-models and end-to-end data models (conceptual, logical, and physical) that have shipped to production and stood up over time.

• Strong command of dimensional modeling (Kimball), Data Vault 2.0, and modern lakehouse patterns, including the ability to choose the right approach for the right use case.

• Expert SQL skills and strong proficiency in Python/PySpark; comfortable writing the code, not just the diagrams.

• Demonstrated experience implementing data governance, lineage, and metadata management programs (e.g., Unity Catalog, IDMC Data Governance, Collibra, Atlan, or similar).

• Working knowledge of FinTech-relevant regulatory and compliance considerations (e.g., GLBA, SOC 2, PCI, NIST, state lending regulations) and how they shape data design.

• Excellent written and verbal communication skills; able to explain complex data concepts to engineers, executives, customers, and auditors.
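One of the modeling techniques named above — slowly changing dimensions, Type 2 — can be sketched in a few lines. This toy version is plain Python rather than PySpark/Delta `MERGE` so it stands alone; the column names (`customer_id`, `valid_from`, `valid_to`, `is_current`) are illustrative conventions, not a prescribed schema:

```python
def scd2_merge(dimension, updates, key="customer_id", as_of="2024-01-02"):
    """Apply SCD Type 2: close out changed rows and append new versions.

    dimension: list of dicts with key, attributes, valid_from/valid_to, is_current
    updates:   list of dicts with key and the latest attribute values
    """
    out = [dict(row) for row in dimension]          # work on copies
    for upd in updates:
        current = next((r for r in out
                        if r[key] == upd[key] and r["is_current"]), None)
        attrs = {k: v for k, v in upd.items() if k != key}
        if current and all(current.get(k) == v for k, v in attrs.items()):
            continue                                # no change: keep current row
        if current:
            current["valid_to"] = as_of             # close the old version
            current["is_current"] = False
        out.append({key: upd[key], **attrs,         # open a new version
                    "valid_from": as_of, "valid_to": None, "is_current": True})
    return out

dim = [{"customer_id": 1, "state": "CO",
        "valid_from": "2023-06-01", "valid_to": None, "is_current": True}]
merged = scd2_merge(dim, [{"customer_id": 1, "state": "CA"}])
```

The same close-and-append semantics carry over to a Delta Lake `MERGE` at scale; the point of the sketch is only the versioning logic that preserves historical reproducibility.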

Preferred Qualifications

• Prior experience designing data architectures for multi-tenant SaaS platforms with customer-facing analytics or embedded reporting.

• Experience supporting loan origination, deposit account opening, or other consumer lending workflows and the underlying data domains (applicants, applications, decisions, funding, servicing, credit data, fraud, KYC/AML).

• Experience building feature stores or curated data products that serve both ML/AI workloads and BI consumers.

• Familiarity with Azure data services (ADLS Gen2, Azure Data Factory, Event Hubs, Synapse, Purview) and their interplay with Databricks.

• Experience with dbt, Great Expectations, or other modern data quality and transformation tooling layered on top of Databricks.

• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent professional experience.

Our Data Stack

• Lakehouse: Azure Databricks, Delta Lake, Unity Catalog, PySpark, SQL

• Integration: Informatica Data Management Cloud (IDMC)

• Cloud: Microsoft Azure (ADLS Gen2, Azure Data Factory, Event Hubs, Key Vault)

• BI & Consumption: Modern BI tooling, embedded analytics, ML feature delivery

• Governance: Unity Catalog, IDMC governance, lineage, and data quality controls


