We are looking for exceptional Lead Engineers to build the engine that powers Salesforce's enterprise intelligence. In this role, you will be a hands-on technical contributor responsible for modernizing our core data ecosystem. You will move beyond simple ETL scripts to build a robust, software-defined Data Mesh using Snowflake, dbt, Airflow, and Informatica.
You will bridge the gap between "Data Engineering" and "Software Engineering": treating data pipelines as production code, automating infrastructure with Terraform, and optimizing high-scale distributed systems to enable AI and Analytics across the enterprise.
Key Responsibilities

Core Platform Engineering & Architecture
- Build & Ship: Design and implement scalable data pipelines and transformation logic using Snowflake (SQL) and dbt. Replace legacy hardcoded scripts with modular, testable, and reusable data components.
- Orchestration: Engineer robust workflows in Airflow. Write custom Python operators and ensure DAGs are dynamic, factory-generated, and resilient to failure.
- Performance Tuning: Own the performance of your datasets. Deep dive into query profiles, optimize pruning/clustering in Snowflake, and reduce credit consumption while improving data freshness.
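As a rough illustration of the "dynamic, factory-generated DAGs" idea above, the sketch below builds one pipeline definition per config entry. The config shape and names are hypothetical, and plain dicts stand in for Airflow objects; a real implementation would return airflow.DAG instances with wired-up operators.

```python
# Hypothetical per-source configs; in practice these might live in YAML.
PIPELINE_CONFIGS = [
    {"domain": "sales",   "source": "crm_events",  "schedule": "@hourly"},
    {"domain": "finance", "source": "gl_postings", "schedule": "@daily"},
]

def build_pipeline(cfg: dict) -> dict:
    """Turn one config entry into a pipeline definition.

    In Airflow this function would return a DAG whose tasks
    (extract -> transform -> validate) are generated from the config,
    so onboarding a new source is a config change, not a new DAG file.
    """
    dag_id = f"{cfg['domain']}__{cfg['source']}"
    tasks = [f"{dag_id}.{step}" for step in ("extract", "transform", "validate")]
    return {"dag_id": dag_id, "schedule": cfg["schedule"], "tasks": tasks}

# Factory loop: one pipeline per config entry, keyed by dag_id.
# (Airflow discovers DAGs exposed in a module's global namespace similarly.)
pipelines = {p["dag_id"]: p for p in (build_pipeline(c) for c in PIPELINE_CONFIGS)}
```

The point of the pattern is that failure handling, retries, and naming conventions live once in the factory, not copy-pasted across dozens of DAG files.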
DevOps, Reliability & Standards
- Infrastructure as Code: Manage the underlying platform infrastructure (warehouses, roles, storage integrations) using Terraform or Helm. Click-ops is not an option.
- CI/CD & Quality: Enforce a strict DataOps culture. Ensure every PR has unit tests, schema validation, and automated deployment pipelines.
- Reliability (SRE): Build monitoring and alerting (Monte Carlo, Grafana, New Relic, Splunk) to detect data anomalies before stakeholders do.
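To make the "every PR has unit tests and schema validation" point above concrete, here is a minimal sketch of the kind of contract check a CI pipeline might run against transformed records. The schema, column names, and function are hypothetical, not an actual dbt or Snowflake API.

```python
# Hypothetical data contract: column name -> expected Python type.
EXPECTED_SCHEMA = {"order_id": int, "amount_usd": float, "region": str}

def validate_record(record: dict, schema: dict = EXPECTED_SCHEMA) -> list[str]:
    """Return a list of schema violations; an empty list means the record passes."""
    errors = []
    for column, expected_type in schema.items():
        if column not in record:
            errors.append(f"missing column: {column}")
        elif not isinstance(record[column], expected_type):
            errors.append(f"{column}: expected {expected_type.__name__}, "
                          f"got {type(record[column]).__name__}")
    return errors
```

In a real DataOps setup the same idea is usually expressed declaratively (e.g., dbt schema tests), with CI failing the PR whenever the check returns violations.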
Collaboration & Modernization
- Data Mesh Implementation: Work with domain teams (Sales, Marketing, Finance) to onboard them to the platform, helping them decentralize their data ownership while adhering to platform standards.
- AI Readiness: Prepare structured data for AI consumption, ensuring high-quality, governed datasets are available for LLM agents and advanced analytics models.
- Focus: System Design & Technical Leadership. You proactively identify problems (e.g., "Our ingestion pattern won't scale 10x") and design the architectural solution. You lead the technical direction for a squad.
- Scope: You own entire subsystems or domain architectures. You are the Tech Lead for a group of engineers, driving technical consensus, RFCs, and coordinating cross-team dependencies.
What We're Looking For

Core Qualifications
- Engineering Roots: Strong background in software engineering (Python/Java/Go) applied to data. You are comfortable writing custom API integrations and complex Python scripts.
- The Modern Stack: Deep production experience with Snowflake (architecture/tuning) and dbt (Jinja/Macros/Modeling).
- Workflow Orchestration: Advanced proficiency with Airflow, including Amazon Managed Workflows for Apache Airflow (MWAA).
- Cloud Native: Hands-on experience with AWS services (S3, Lambda, IAM, ECS) and containerization (Docker/Kubernetes).
- DevOps Mindset: Experience with Git, CI/CD (GitHub Actions/Jenkins), and Terraform.
Experience Requirements
- 8+ years of experience, with a proven track record of leading technical projects or small teams.
Nice to Have
- Knowledge Graph Experience: Familiarity with graph databases (e.g., Neo4j) or semantic standards (RDF/SPARQL, TopQuadrant) is a strong plus as we integrate these technologies into the platform.
- Open Table Formats: Experience with Apache Iceberg or Delta Lake.
- Streaming: Experience with Kafka or Snowpipe Streaming.
- AI Integration: Experience using AI coding assistants (Copilot, Cursor) to accelerate development.

San Francisco, CA, United States of America
JS26489_25304_CD046BF9161DA0716C955D5C5FDCC9C7
1/28/2026 7:31:45 AM
We strongly recommend that you never provide your bank account details to an advertiser during the job application process. Should you receive a request of this nature, please contact support, giving the advertiser's name and job reference.