Game Portfolio & Ecosystem Strategy
Defined the investment strategy, shared infrastructure, and a durable performance signal across a multi-title mobile game portfolio.

Situation
PlayEmber had a growing roster of mobile games. Some were performing. Others were stalled or quietly dying. Resources were tight, and every title ran on its own metrics, its own standards, its own definition of success. It was hard to compare games honestly, let alone decide which ones deserved more investment.
The problem wasn't a lack of games. It was a lack of conviction. We had plenty of data but no shared definition of what "good" looked like. Without that, investment decisions stayed reactive and engineering time got spread across bets nobody fully believed in.
My Role
I owned portfolio-level product strategy. That meant looking at performance across every title, building the frameworks to decide what mattered, shaping roadmaps, and getting design, engineering, and leadership aligned on where we invested, what we tested, and what we killed.
My job was the health of the ecosystem. Not any single game. The whole thing.
Key Actions
Portfolio diagnosis & triage
Ran deep dives across every game in the portfolio. Combined performance data with hands-on gameplay to separate what was dead from what was recoverable. No title got a pass based on history alone.
Roadmap strategy
Restructured the portfolio roadmap into three tracks: Invest (strong fundamentals, clear upside), Validate (promising but uncertain, scoped to short experiments), and Maintain or Sunset (limited long-term potential). Titles moved between tracks as new signal came in. The roadmap adapted without constant resets or reactive pivots.
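The three-track triage above can be sketched as a simple classifier. This is a minimal, hypothetical illustration: the metric names (`d30_retention`, `roas_trend`, `upside_confidence`) and every threshold are assumptions standing in for whatever criteria the real framework used, not the actual rules.

```python
from dataclasses import dataclass
from enum import Enum


class Track(Enum):
    INVEST = "invest"                        # strong fundamentals, clear upside
    VALIDATE = "validate"                    # promising but uncertain; short experiments
    MAINTAIN_OR_SUNSET = "maintain_or_sunset"  # limited long-term potential


@dataclass
class TitleSignal:
    """Hypothetical per-title metrics; names and thresholds are illustrative."""
    name: str
    d30_retention: float      # 30-day retention rate
    roas_trend: float         # month-over-month change in return on ad spend
    upside_confidence: float  # team's 0-1 estimate of remaining headroom


def assign_track(t: TitleSignal) -> Track:
    # Strong fundamentals and clear upside -> Invest
    if t.d30_retention >= 0.10 and t.roas_trend > 0 and t.upside_confidence >= 0.6:
        return Track.INVEST
    # Promising but uncertain -> scope to a short Validate experiment
    if t.upside_confidence >= 0.4:
        return Track.VALIDATE
    # Everything else -> Maintain or Sunset
    return Track.MAINTAIN_OR_SUNSET
```

Because assignment is a pure function of current metrics, re-running it as new signal arrives moves titles between tracks without resetting the roadmap.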
Ecosystem & infrastructure
Shifted from isolated optimization to shared leverage. Standardized analytics and success metrics, introduced shared SDKs and UX patterns, aligned monetization models, and set clear criteria for investment decisions across the portfolio. Product decisions got faster. Conviction went up.
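What "standardized analytics and success metrics" buys is that every title reports the same core fields, so portfolio-level numbers are computed one way. A minimal sketch, with an assumed event schema and an illustrative ARPDAU calculation rather than the real SDK's:

```python
# Hypothetical shared event schema: field names are assumptions, not the
# actual SDK contract. Every title must emit these fields on every event.
REQUIRED_FIELDS = {"title_id", "player_id", "event", "ts"}


def validate_event(event: dict) -> bool:
    """True if the event carries every field the shared schema requires."""
    return REQUIRED_FIELDS.issubset(event)


def arpdau(events: list[dict]) -> float:
    """Average revenue per active user, computed identically for every title."""
    players = {e["player_id"] for e in events}
    revenue = sum(e.get("revenue_usd", 0.0) for e in events)
    return revenue / len(players) if players else 0.0
```

With one schema and one metric definition, comparing games across the portfolio stops being an argument about whose dashboard is right.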
Results
The portfolio got easier to reason about and easier to operate. High-potential titles got focused attention earlier. Engineering effort went where the signal pointed instead of where the loudest opinion was. Roadmap discussions moved from opinion-driven debate to decisions grounded in shared data and product judgment.
Most importantly, the team gained conviction. We knew what we believed in and why.
Key Learnings
Portfolios fail from lack of conviction, not lack of ideas
Clear signal enables better judgment. But judgment still requires context, taste, and the willingness to commit. Data alone doesn't get you there.
Standardization is leverage, not bureaucracy
Shared frameworks didn't slow things down. They made decisions faster and more defensible. The goal was clarity, not control.