Job Summary
Job Title: Senior Technical GCP Delivery Head (Agentic AI & Data)
Level: Senior Leadership (Director Level)
Core Mission: To lead the architectural design and delivery of GCP-native platforms where Data is the Foundation and Agentic AI is the Engine.
1. Role Executive Summary
As the Senior Technical Delivery Head, you are the visionary lead for our most complex Google Cloud engagements. You will move beyond "cloud migration" to "intelligent transformation." Your primary responsibility is to architect and deliver Agentic Workflows—autonomous AI systems that can reason, use tools, and access real-time data to solve business problems. You will bridge the gap between Data Engineering and Generative AI to create scalable, production-grade agentic ecosystems.
2. Key Responsibilities (The "Agentic & Data" Core)
- Agentic Orchestration & Delivery: Lead the deployment of multi-agent systems using Vertex AI Agents and LangGraph. You will be responsible for the technical delivery of agents that don't just "chat," but execute complex tasks (e.g., automated procurement, intelligent supply chain rerouting); a minimal orchestration sketch follows this list.
- Data-Centric AI Strategy: Oversee the architecture of BigQuery-centric Data Clean Rooms and Data Fabrics (using Dataplex) to ensure AI models are grounded in high-quality, real-time enterprise data.
- RAG & Vector Architecture: Direct the implementation of advanced Retrieval-Augmented Generation (RAG) patterns, utilizing Vertex AI Vector Search and AlloyDB for high-performance context retrieval.
- LLMOps & Governance: Establish the "Factory Floor" for AI delivery, including automated model evaluation, prompt versioning, and safety guardrails using Google Cloud's Model Armor and Sensitive Data Protection services.
- Technical Governance: Conduct deep-dive architectural reviews to ensure every project adheres to the Google Cloud Well-Architected Framework, with a specific focus on the AI/ML Pillar.
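To make the orchestration bar concrete, the sketch below shows a single tool-calling agent loop in LangGraph; the `check_inventory` tool, the state shape, and the Gemini model choice are illustrative assumptions, not a prescribed design.

```python
# Minimal single-agent tool-calling loop in LangGraph.
# Assumptions: langgraph and langchain-google-vertexai are installed, and
# check_inventory is a hypothetical stand-in for a real business system.
from typing import Annotated, TypedDict

from langchain_core.messages import BaseMessage
from langchain_core.tools import tool
from langchain_google_vertexai import ChatVertexAI
from langgraph.graph import END, StateGraph
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode


@tool
def check_inventory(sku: str) -> str:
    """Return current stock for a SKU (illustrative stub)."""
    return f"SKU {sku}: 42 units in stock"


class AgentState(TypedDict):
    # add_messages appends to the history rather than overwriting it.
    messages: Annotated[list[BaseMessage], add_messages]


llm = ChatVertexAI(model_name="gemini-1.5-pro").bind_tools([check_inventory])


def call_model(state: AgentState) -> dict:
    # The model either answers or emits tool calls for the ToolNode to run.
    return {"messages": [llm.invoke(state["messages"])]}


def route_after_model(state: AgentState) -> str:
    # Keep looping through the tool node while the model requests tools.
    return "tools" if state["messages"][-1].tool_calls else END


graph = StateGraph(AgentState)
graph.add_node("agent", call_model)
graph.add_node("tools", ToolNode([check_inventory]))
graph.set_entry_point("agent")
graph.add_conditional_edges("agent", route_after_model)
graph.add_edge("tools", "agent")
app = graph.compile()

# Usage: app.invoke({"messages": [("user", "Is SKU-123 in stock?")]})
```

Production agents replace the stub tool with governed service integrations, which is where the delivery and review responsibilities above come in.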
3. Technical Stack Requirements
| Pillar | Focus Areas |
| --- | --- |
| Agentic AI | Vertex AI Agent Builder, LangChain, Function Calling, Reasoning Engine, Model Garden (Gemini 1.5 Pro/Flash). |
| Data Engineering | BigQuery (BigLake, Omni), Dataflow (Streaming), Pub/Sub, Dataplex for Data Governance. |
| Vector & Search | Vertex AI Search and Conversation, Vector Search, pgvector on Cloud SQL/AlloyDB. |
| Cloud Native | GKE (for hosting custom agent services), Cloud Run, Terraform (Infrastructure as Code). |
| AI Safety | Vertex AI Model Monitoring, Data Masking, VPC Service Controls for AI workloads. |
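For the Vector & Search pillar, candidates should be comfortable with patterns like the minimal pgvector retrieval sketch below; the `documents` table, the 768-dimension embeddings, and the connection details are illustrative assumptions.

```python
# Minimal pgvector top-k retrieval on AlloyDB / Cloud SQL for PostgreSQL.
# Assumptions: the documents table, 768-dim embeddings, and DSN are illustrative.
import psycopg

DDL = """
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS documents (
    id        bigserial PRIMARY KEY,
    content   text NOT NULL,
    embedding vector(768)
);
"""


def top_k_chunks(conn: psycopg.Connection, query_embedding: list[float], k: int = 5):
    """Return the k chunks nearest to the query embedding."""
    # pgvector accepts vectors as '[x1,x2,...]' literals; <=> is cosine distance.
    vec = "[" + ",".join(str(x) for x in query_embedding) + "]"
    with conn.cursor() as cur:
        cur.execute(
            "SELECT content, embedding <=> %s::vector AS distance "
            "FROM documents ORDER BY distance LIMIT %s",
            (vec, k),
        )
        return cur.fetchall()


# Usage sketch (embed() is a hypothetical embedding helper, DSN a placeholder):
# with psycopg.connect("postgresql://app@10.0.0.5/rag") as conn:
#     context = top_k_chunks(conn, embed("Where is order 42?"), k=5)
```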
4. Qualifications
- Experience: 12+ years in Technical Delivery/Architecture, with at least 4 years focused on Data/AI at scale on GCP.
- Architectural Depth: Proven track record of taking AI models from proof of concept (PoC) to production with full CI/CD and LLMOps pipelines.
- Leadership: Experience managing a "Technical Office" of 20+ Lead Architects and Data Engineers.
- Education: Master’s in CS, AI, or Data Science preferred.
- Certifications:
- Required: GCP Professional Cloud Architect.
- Preferred: GCP Professional Data Engineer OR Professional Machine Learning Engineer.
5. Success Metrics (KPIs)
- Agent Autonomy Score: Success rate of deployed agents in completing multi-step business processes without human intervention.
- Time-to-Insight: Reducing the latency between raw data ingestion into BigQuery and its availability for AI agent grounding.
- Inference Unit Economics: Optimizing token usage and model selection (e.g., Gemini Flash vs. Pro) to maintain project margins; a model-routing sketch follows this list.
- Architecture Reusability: Creation of modular "Agentic Blueprints" that can be deployed across multiple clients.
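One illustrative way to operationalize the unit-economics metric is request-level model routing; the token cutoff, project placeholder, and routing rule below are hypothetical assumptions, not tuned values.

```python
# Hypothetical model router: short, tool-free prompts go to Gemini Flash;
# long or tool-heavy prompts go to Gemini Pro.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # placeholder values

FLASH = GenerativeModel("gemini-1.5-flash")
PRO = GenerativeModel("gemini-1.5-pro")
TOKEN_CUTOFF = 2000  # hypothetical; calibrate against quality/cost evaluations


def route(prompt: str, needs_tools: bool = False) -> GenerativeModel:
    """Pick the cheapest model expected to meet quality for this request."""
    prompt_tokens = FLASH.count_tokens(prompt).total_tokens
    return PRO if needs_tools or prompt_tokens > TOKEN_CUTOFF else FLASH


def answer(prompt: str) -> str:
    return route(prompt).generate_content(prompt).text
```

In practice the routing rule would be calibrated against the same evaluation harness used for the Agent Autonomy Score, so cost savings never come at the expense of task completion.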