Technical Program Manager (Data Platform)
Location: Mountain View, CA / Plano, TX (Hybrid)
About the Role
We are seeking an experienced Technical Program/Product Manager (TPM) to lead the end-to-end delivery of our modern data platform. This role involves driving multi-team programs focused on designing, building, and scaling a secure, cost-efficient Lakehouse architecture on AWS.
You’ll work cross-functionally with data engineering, platform, security, and product teams to implement capabilities in Databricks (Delta Lake, Unity Catalog, Workflows/Jobs, Delta Live Tables) and support event-driven data products using AWS EventBridge, Kafka, and Kinesis — across both real-time streaming and batch pipelines.
Key Responsibilities
- Own the program delivery lifecycle for enterprise data platform initiatives.
- Coordinate roadmaps, manage dependencies, and mitigate risks across teams.
- Define and track program metrics, SLAs, and success KPIs.
- Build dashboards and executive reports to communicate program progress and business impact.
- Lead quarterly roadmap planning and prioritization efforts.
- Partner with engineering leads on design decisions, scaling strategy, and cost optimization.
- Manage delivery of new data capabilities in Databricks and event-driven environments.
- Oversee migration efforts from legacy ETL tools (Informatica or equivalent) to modern lakehouse patterns.
- Drive FinOps practices to optimize data compute spend.
- Apply AI-native and GenAI-agent product thinking to improve program execution.
Required Skills & Experience
- 7+ years of experience in technical program or product management within data platform environments.
- Deep understanding of Databricks, AWS, and event-driven architectures.
- Proven experience managing large-scale, cross-functional programs in Agile settings.
- Hands-on experience with data lakehouse architectures, streaming pipelines, and batch ETL processes.
- Strong executive communication and stakeholder management skills.
- Experience with Informatica or similar enterprise ETL tools.
- Familiarity with AI-native tooling (Builder.io, Cursor, etc.) and modern data ecosystems.
- Ability to prototype, experiment, and drive execution independently.
Preferred Qualifications
- Prior experience managing migrations from legacy ETL to Databricks or AWS-based lakehouse systems.
- Exposure to enterprise internal data platforms and FinOps cost governance.
- Deep understanding of semantic models, analytical consumption layers, and modern BI tools.
What You’ll Bring
- A combination of technical fluency and delivery excellence.
- The ability to bridge the gap between data engineers, architects, and business stakeholders.
- A forward-thinking mindset, blending data, AI, and product innovation.