Anvilogic is a Palo Alto-based AI cybersecurity startup founded in 2019 by security veterans and data scientists from Fortune 500 companies.
Our mission is to democratize threat detection and hunting so that today’s SOC teams can work across hybrid, multi-cloud environments and security data lakes without needing to centralize data or rip and replace tools. Further, with our investments in AI-powered automation of detection-as-code to create, test, tune, and deploy detections, SOC users can implement high-efficacy detection and hunting techniques without writing a single line of code or manually wrangling data.
Anvilogic raised funding in April 2024 and is backed by top-tier VC firms and prominent industry executives. Anvilogic’s AI-powered Multi-Data Platform SIEM is used by many of the industry’s most advanced security teams.
JOB DESCRIPTION
As a Senior Software Engineer, Data at Anvilogic, you are responsible for designing, building, and operating our high-scale data ingestion pipelines. You will work directly on company-critical projects covering how our users’ data is brought into our data stores, normalized across those stores, and exposed for analysis, using technologies such as AWS, Azure, Snowflake, Databricks, and Splunk.
- Design, build, and operate data ingestion and normalization pipelines
- Work with product management teams to map out non-functional requirements and implement those requirements in your services
- Deploy and monitor resources in cloud and data lake providers
- Share data pipeline best practices across Anvilogic through mentorship, documentation, and brown bag sessions
Minimum Qualifications
- 4+ years of software development experience
- Excellent written and verbal communication skills
- Experience with data lakes such as Snowflake or Databricks
- Experience with cloud providers such as AWS, GCP, or Azure
- Experience with defining non-functional requirements, measuring SLOs, and balancing tech foundation and product timelines
- Ability to quickly come up to speed on our data pipeline tech stack, which uses Python deployed on Snowflake and Databricks across AWS, Azure, and GCP
Preferred Qualifications
- Experience with ingesting large amounts of user data into Snowflake or Databricks
- Experience deploying services using infrastructure-as-code (Terraform, AWS SAM, CloudFormation, or CDK)
- Experience with observability technologies like Grafana and Sentry
- Some experience with LLMs, implementing standard patterns (Agents, RAG, Tools), and leveraging popular frameworks
- Familiarity with security data (e.g., endpoint and network logs)
Benefits
- Competitive salary with equity in the company
- Comprehensive medical, dental, and vision insurance
- Unlimited paid time off policy for work-life balance
- 401(k) retirement plan with company match
- Monthly stipend for home internet and cell phone expenses