Why this role exists
The AI Center needs a single accountable leader for portfolio strategy and industrialisation: turning experiments into adopted products with controlled run‑costs, clear governance, and decision‑grade evidence. This role operationalises a “single‑door” direction: close the door on ad‑hoc tools and drive consistent, scalable AI product delivery through a governed factory model.
Portfolio scope
- AI use‑cases moving through Innovation Board gates
- Recipe.AI, Chef.AI, Portfolio.AI, Marketing Translation
- AI Center enablement services and assets
- Innovation Board operating rhythm and gate criteria
- AI Lab Principles/Playbook (PoC vs pilot vs MVP vs BAU)
- AI Toolkit + Sandbox adoption (standard experimentation route)
- Ethics enablement
- FinOps/showback, cost governance, run‑cost strategy
Outcomes you will be measured on
- Roadmap, investment logic, and value clarity
  - 12–18 month portfolio roadmap with sequencing, dependencies, business cases (ROI + adoption assumptions), and operating model per product.
  - Clear prioritisation logic based on value, feasibility, risk, and run‑cost.
- Innovation Board runs as a product factory
  - Board has a clear charter, RACI, cadence, scorecard (funnel volume, pass/fail rates, cycle time), and decision templates.
  - Every gate decision is supported by decision‑quality artifacts: data readiness, evaluation plan/benchmarks, security posture, cost forecast, adoption plan.
- Service catalog clarity (standardisation at scale)
  - AI Lab services defined with SLAs, required artifacts, and volume forecasts (Sandbox, MLOps, Ethics, FinOps, AI CoE patterns).
  - The “default route” for experimentation is the Toolkit/Sandbox; exceptions are explicit and time‑boxed.
- Value realisation discipline (BAU products)
  - Quarterly value reviews for BAU products (Portfolio.AI, Marketing Translation) with funded optimisation backlog and measurable KPI movement.
- Governance scaled (risk, ethics, and delivery discipline)
  - Clear, enforceable principles distinguishing PoC vs pilot vs MVP vs GA; “done” standards include release readiness, wired‑in telemetry, and operational handover.
  - Joint‑success principles with vendors and business sponsors are explicit and measurable.
- FinOps and run‑cost control
  - Cross‑charging model maintained; GPU/compute strategy options presented early, with trade‑offs among performance, cost, and reliability (see the showback sketch after this list).
  - No “silent run‑cost blowups”: predictable budgets and proactive mitigations.
- Singapore government co‑funding success
  - Maintain the co‑funding deliverable map (milestones, KPI targets, eligible cost tracking).
  - Produce structured quarterly reporting, demo days, and audit‑ready evidence packs.
  - Manage expectations and stakeholder engagement with government counterparts and ecosystem partners.
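To make the cross‑charging and showback expectation concrete, here is a minimal sketch in Python. The meters, unit rates, and usage figures are hypothetical illustrations, not the AI Center’s actual rate card; a real model would aggregate metered usage from billing exports.

```python
from collections import defaultdict

# Hypothetical unit rates: a real cross-charging model would source these
# from cloud billing exports and an internal rate card.
RATES = {"gpu_hour": 2.40, "cpu_hour": 0.08, "api_1k_tokens": 0.015}

# Illustrative usage records: (product, meter, quantity consumed this month).
usage = [
    ("Portfolio.AI", "gpu_hour", 120),
    ("Portfolio.AI", "api_1k_tokens", 5_000),
    ("Marketing Translation", "api_1k_tokens", 22_000),
    ("Marketing Translation", "cpu_hour", 800),
]

def showback(records):
    """Roll metered usage up into a per-product run-cost statement."""
    totals = defaultdict(float)
    for product, meter, quantity in records:
        totals[product] += quantity * RATES[meter]
    return dict(totals)

for product, cost in sorted(showback(usage).items()):
    print(f"{product}: ${cost:,.2f} this month")
```

Publishing a statement like this per product each month is what makes “no silent run‑cost blowups” auditable: the same records feed budget forecasts, mitigation triggers, and the quarterly value reviews.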
What you do
- Portfolio strategy and prioritisation
  - Define product strategy per initiative: problem statement, target users, adoption plan, KPIs, operating model.
  - Run prioritisation with Sonal/Dan; explicitly manage trade‑offs among value, feasibility, risk, time‑to‑market, and run‑cost.
- Governance and operating model
  - Own Innovation Board mechanics: gate criteria, templates, decision logs, scorecards, and escalation paths.
  - Define and enforce portfolio “definition‑of‑done” standards (including data/model contracts, security controls, observability, BAU handover).
  - Ensure the Toolkit/Sandbox is not an optional “nice‑to‑have” but the standard platform path.
- Stakeholder leadership (business + architecture + security)
  - Drive alignment and sign‑offs with CDAO LT and Enterprise Architecture; ensure the solution strategy is board‑endorsed and operationally viable.
  - Own the narrative and executive‑ready decision packs: trade‑offs, cost, risk, adoption constraints, and a recommended decision.
- Vendor and partner shaping (without doing procurement’s job)
  - Translate roadmap into outcome‑based requirements and success metrics that can be contracted.
  - Ensure vendor delivery aligns with “industrialisation” expectations (security‑by‑design, documentation, monitoring, handover, cost controls).
  - Define joint success measures and governance cadence with partners.
- Adoption, ethics, and community flywheel
  - Drive enterprise enablement motions that materially affect adoption: ethics hub, Copilot/GenAI usage guidance, community ideation intake, playbooks for responsible AI rollout.
  - Ensure adoption is measurable (telemetry + feedback loops) and drives roadmap decisions; a minimal telemetry sketch follows.
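As one way to make adoption measurable, the sketch below derives an active‑user count from raw telemetry events. The event schema, field names, and sample data are assumptions for illustration; the real feedback loop would read from whatever telemetry the products actually emit.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AdoptionEvent:
    """Illustrative telemetry event; this schema is an assumption."""
    product: str   # e.g. "Recipe.AI"
    user_id: str
    day: date
    kind: str      # "session" or "feedback"

def active_users(events, product):
    """Count distinct users with at least one session for the product."""
    return len({e.user_id for e in events
                if e.product == product and e.kind == "session"})

# Hypothetical events for a single week.
events = [
    AdoptionEvent("Recipe.AI", "u1", date(2024, 5, 6), "session"),
    AdoptionEvent("Recipe.AI", "u2", date(2024, 5, 7), "session"),
    AdoptionEvent("Recipe.AI", "u1", date(2024, 5, 7), "feedback"),
]

print(active_users(events, "Recipe.AI"))  # -> 2
```

Counts like this, tracked per product over time alongside feedback volume, are what let adoption evidence, rather than anecdote, drive roadmap decisions.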