In-Game Revenue Optimization

Period: Feb 2022 – May 2022
Type: Data science contract

Problem

A major sports video game needed to optimize when and what microtransaction offers to show players. The challenge: new transaction types are introduced regularly, and a model trained on historical transactions cannot score items it has never seen.

What Was Novel

I designed a hybrid Set Transformer + Two-Tower architecture that solves two problems at once. The Set Transformer encodes the player’s current marketplace, which varies in size from player to player and changes over time, as a permutation-invariant set. This means the model handles a marketplace with 5 items the same way it handles one with 50, without padding or truncation. The Two-Tower component then embeds both the player state and individual transaction features into a shared scoring space.
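The set-encoding side can be sketched as below. This is a minimal illustration, not the production model: the class name, dimensions, and the single attention block are all assumptions. In practice a batch still pads items to a common length, but a key-padding mask excludes the padding from attention, so a 5-item and a 50-item marketplace are scored by the same weights with no truncation.

```python
import torch
import torch.nn as nn

class MarketplaceEncoder(nn.Module):
    """Permutation-invariant encoder for a variable-size set of item
    feature vectors (simplified Set Transformer-style block; all
    dimensions are illustrative)."""

    def __init__(self, item_dim: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.proj = nn.Linear(item_dim, d_model)
        # Self-attention with no positional encoding: output does not
        # depend on the order of items in the set.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # A learned seed query pools the set into one fixed-size summary
        # (pooling-by-attention, in the spirit of the Set Transformer's PMA).
        self.seed = nn.Parameter(torch.randn(1, 1, d_model))
        self.pool = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, items: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # items:    (batch, max_items, item_dim)
        # pad_mask: (batch, max_items), True where a slot is padding
        x = self.proj(items)
        x, _ = self.attn(x, x, x, key_padding_mask=pad_mask)
        seed = self.seed.expand(x.size(0), -1, -1)
        summary, _ = self.pool(seed, x, x, key_padding_mask=pad_mask)
        return summary.squeeze(1)  # (batch, d_model) marketplace summary
```

Because neither attention pass sees item positions, shuffling the marketplace leaves the summary vector unchanged, which is exactly the permutation invariance described above.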

Two things made this architecture novel:

  1. Variable-size marketplace handling. Standard architectures either demand fixed-length, padded inputs or impose an arbitrary ordering on the items. The Set Transformer treats the marketplace as an unordered set, so the model naturally processes whatever items are currently available to each player, regardless of marketplace size.
  2. Generalization to unseen transactions. Because transactions are embedded by their features (price, category, rarity, timing) rather than by ID, the model scores new item types it has never seen during training. A new item with similar features to past successful offers gets a meaningful score without retraining.
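The second point, feature-based scoring, can be sketched as a small two-tower model. This is a hypothetical illustration of the design choice, not the actual implementation; the tower widths and feature dimensions are assumptions. The key property is that the item tower consumes a feature vector (price, category, rarity, timing), never an item ID, so a brand-new item type produces a score without retraining.

```python
import torch
import torch.nn as nn

class TwoTowerScorer(nn.Module):
    """Scores (player, item) pairs by dot product in a shared embedding
    space. Items are embedded from features rather than IDs, so item
    types absent from training still receive meaningful scores.
    Dimensions are illustrative."""

    def __init__(self, player_dim: int, item_dim: int, d_embed: int = 32):
        super().__init__()
        self.player_tower = nn.Sequential(
            nn.Linear(player_dim, 64), nn.ReLU(), nn.Linear(64, d_embed))
        self.item_tower = nn.Sequential(
            nn.Linear(item_dim, 64), nn.ReLU(), nn.Linear(64, d_embed))

    def forward(self, player: torch.Tensor, items: torch.Tensor) -> torch.Tensor:
        # player: (batch, player_dim), e.g. the marketplace/behavior summary
        # items:  (batch, n_items, item_dim) candidate offer features
        p = self.player_tower(player)            # (batch, d_embed)
        i = self.item_tower(items)               # (batch, n_items, d_embed)
        return torch.einsum('bd,bnd->bn', p, i)  # (batch, n_items) scores
```

In this setup, scoring a newly introduced item type is just a forward pass over its feature vector; no ID embedding table needs to grow or be retrained.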

Results

  • Projected a potential 10% revenue uplift from targeted offer timing
  • Model generalized to transaction types not present in training data
  • Designed A/B testing framework for production validation

Technical Stack

  • PyTorch (Transformer encoder + Two-Tower scoring)
  • Player behavioral sequence modeling
  • Feature-based transaction embeddings
