Modular AI tooling framework · for Unity studios

Modular AI tooling
for Unity.

Two problems block AI automation in gamedev: bottlenecks that agents can't solve on their own (QA, UI, tech debt) and blockers that prevent agents from forming a pipeline at all. We're building a modular framework that addresses both.

Get Early Access → or write directly: contact@aigamedev.fi
Tooling Hub
Roslyn · Editor · Runtime · uGUI
Validator Engine
Deterministic boundary · Tech debt control
Problem 01 · The Blocker

Why current agents and assistants don't scale.

They solve a narrow set of tasks. They can't be chained into a pipeline. And vibe-coded internal tools only make integration harder.

01

Weak tool support

No native access to engine or codebase. Agents talk to chat endpoints, not to UnityEditor or Roslyn.

no.AssetDatabase · no.SyntaxTree · no.PrefabUtility
02

Fragile and complex

Break on edge cases. Hard to debug. One unexpected stack trace and the whole loop falls apart.

stacktrace → context window → silence
03

Not modular, not extensible

Can't compose into pipelines. Each tool ships its own chat UI; nothing speaks the same protocol.

tool A → human → tool B → human → tool C
04

Vibe-coded internal tools

In-house glue scripts integrate poorly and add their own debt. Built in a weekend, owned forever.

scripts/agent_v3_FINAL_actually_works.py
Problem 02 · Three Bottlenecks

QA, UI, Tech Debt — the three things agents can't solve alone.

Even with a working agent, three categories of work block the pipeline. Each needs its own native tooling.

01 · Bottleneck

QA

→ Tooling Hub · runtime layer

Agent can write code, but can it play the game? QA needs typed access to the running build — debug overlays, cheat commands, frame capture.

02 · Bottleneck

UI

→ Tooling Hub · uGUI layer

Agents reverse-engineer layout from videos and screenshots, then assemble it in uGUI under your conventions. Layout is no longer a handoff.

03 · Bottleneck

Tech Debt

→ Validator Engine

LLMs absorb the worst patterns in your repo and reproduce them at 10× speed. The Validator catches them deterministically and feeds back fixes.

Schema 01 · The Tech Debt Multiplier

Why human code review breaks at LLM speed.

LLMs absorb the bad practices already present in a legacy codebase. Human reviewers cannot keep up with the rate of generation. Debt compounds silently.

— The past · human pipeline

Legacy code + Human dev = linear debt
Chart · debt over time: tech debt grows at human review pace.

Review cadence roughly matches generation cadence. Bad patterns are caught in PR.

— The AI reality · naive LLM pipeline

Legacy code + LLM agent = 10× compounding
Chart · debt over time: tech debt multiplied 10× by generation rate.

The agent learns how not to write code by looking at the legacy codebase — then reproduces exactly that. Human PR becomes the bottleneck; debt compounds.
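As a back-of-the-envelope illustration of the two curves, a sketch in Python. The rates here are invented numbers, not measurements; the point is the shape of the curves, not their scale.

```python
def debt_linear(weeks: int, per_week: float = 1.0) -> float:
    """Human pace: review cadence matches generation cadence."""
    return per_week * weeks

def debt_compounding(weeks: int, per_week: float = 1.0,
                     reproduction: float = 1.2) -> float:
    """Agent pace: each week's output imitates last week's debt,
    so existing debt is reproduced on top of the new increment."""
    debt = 0.0
    for _ in range(weeks):
        debt = debt * reproduction + per_week
    return debt
```

With these toy rates the curves start identical, but after a quarter the compounding pipeline carries several times the linear pipeline's debt.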

Schema 02 · Pipeline architecture

Human as middleman, vs. the Validator talking to the Agent directly.

In the legacy setup the human ferries errors back into chat. In the native pipeline the Validator speaks to the agent in actionable, deterministic items — and the loop closes itself.

— Legacy · ping-pong

Human review · prompts · stale docs
01 Agent generates
02 Human PR · bottleneck
03 Human manually corrects
04 Agent guesses at a fix?
The human is reduced to a relay, ping-ponging with the bot. Every cycle loses context.

+ Native · agent harness

Self-closing loop · deterministic feedback
01 Human configures validators
02 Agent acts via Tooling Hub
03 Validator → actionable items
loop · agent auto-corrects
04 Agent applies fix
05 All checks pass
The agent closes the loop itself. Human sets the rules, not the fixes.
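The self-closing loop can be sketched in a few lines. Everything here is hypothetical: the `agent` and `validator` clients and the item shape are placeholders for whatever the real protocol exposes, not its API.

```python
from dataclasses import dataclass

@dataclass
class ActionableItem:
    rule_id: str        # which diagnostic fired
    location: str       # file:line or asset path
    suggested_fix: str  # deterministic, machine-applicable

def close_the_loop(agent, validator, max_rounds: int = 5) -> bool:
    """Agent acts, validator answers with items, agent fixes - no human relay."""
    for _ in range(max_rounds):
        items = validator.check()   # deterministic: same input, same items
        if not items:
            return True             # all checks pass
        for item in items:
            agent.apply_fix(item)   # agent corrects itself
    return False                    # escalate to a human only after N rounds
```

Note the only human-shaped decision left is `max_rounds`: when the loop does not converge, a person is pulled in, not on every cycle.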
Product 01 · Tooling Hub

The Agent's Hands.

A single integration point. The agent gets typed access to coder tools, the editor, and the running game build over the network.

Caller

Agent

claude · cursor · custom
TOOLING HUB
typed RPC · network
01
Codebase
Roslyn
02
Unity Editor
native API
03
Runtime Game
over network
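A sketch of what "typed RPC over the network" could look like from the agent side. The wire format, `target` values and method names are assumptions for illustration, not the actual Tooling Hub protocol.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class RpcRequest:
    target: str   # "codebase" | "editor" | "runtime" - the three hub layers
    method: str   # e.g. "FindSymbol", "InstantiatePrefab", "CaptureFrame"
    params: dict  # typed payload, no free-form chat text

def encode(req: RpcRequest) -> bytes:
    """One wire format for all three layers; the agent never parses prose."""
    return json.dumps(asdict(req)).encode("utf-8")

req = RpcRequest(target="codebase", method="FindSymbol",
                 params={"name": "HealthBar"})
payload = encode(req)
```

The point of the shape: one schema, three backends, so chaining tools into a pipeline is composition, not glue scripts.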
01 · Codebase

Roslyn & proger-tooling

Symbol-level access. The agent reads, analyses and rewrites C# with the same primitives the IDE uses.

Roslyn · SyntaxTree · SemanticModel · CodeFix
02 · Unity Editor

Native Editor APIs

Typed access to UnityEditor workflows: AssetDatabase, Prefab pipeline, scenes, build graph.

AssetDatabase · PrefabUtility · SceneManager · BuildPipeline
03 · Runtime Game

QA tools in the actual build

Connects to the build over the network. Debug overlays, cheat commands, frame capture — the agent can test the game, not just generate code.

DebugOverlay · CheatConsole · QAHarness · FrameCapture
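A guess at what one runtime QA call could look like: a typed command into the running build, a structured answer back. The command names, fields, and transport are placeholders, not a real API.

```python
import json

def qa_command(send, name: str, **args) -> dict:
    """`send` is any transport that round-trips bytes to the build
    (socket, HTTP, ...). The agent reads a structured reply, not a log."""
    reply = send(json.dumps({"cmd": name, "args": args}).encode("utf-8"))
    return json.loads(reply)

# Hypothetical usage:
#   qa_command(send, "set_health", target="Player", value=1)
#   qa_command(send, "capture_frame", tag="after_damage")
```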
Product 02 · Validator Engine

Spot the pattern. Configure. Let the agent fix.

Three steps. The Validator turns a recurring problem into a deterministic rule, then turns matches into actionable items the agent applies on its own.

01
Spot · human

Find a bad pattern

An engineer notices recurring debt in the codebase: presentation logic in domain layer, deprecated APIs, brittle scene refs.

02
Rule · human

Add a diagnostic

Configure the Validator to detect it deterministically. Same input → same verdict, every time.

03
Fix · agent

Auto-correct all instances

Every match becomes an actionable item. The agent applies the fix; the dashboard tracks the result.
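What "deterministic" means in practice: a rule is a pure function over the source, no model in the loop. The rule below, flagging a UnityEngine.UI dependency outside a Presentation folder, is an invented example of "presentation logic in the domain layer", not a shipped diagnostic.

```python
import re
from typing import NamedTuple

class Finding(NamedTuple):
    path: str
    line: int
    message: str

UI_IMPORT = re.compile(r"^\s*using\s+UnityEngine\.UI\s*;")

def check_file(path: str, source: str) -> list:
    """Same input, same verdict, every time - a text rule, not a judgment call."""
    if "/Presentation/" in path:
        return []   # allowed layer
    return [Finding(path, i + 1, "UI dependency outside Presentation layer")
            for i, line in enumerate(source.splitlines())
            if UI_IMPORT.match(line)]
```

Each `Finding` is the raw material for an actionable item: a location plus a machine-checkable condition the agent can fix and re-verify.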

Tech Debt Dashboard

5 violated · 2 trending up · last 30 days
Diagnostics 38 · +3
Passing 31 · stable
Violated 5 · +2
Trending up 2 · +1
Performance test regression · scene/CombatArena · 12 · +4
Business logic in UI layer · HealthBar.cs + 7 more · 8 · +2
! Scene validator: missing refs · 3 scenes · 5 · 0
Game config schema mismatch · resolved 12d ago · 0 · −3
Code convention #1: naming · whole repo · 0 · −6
Code convention #2: structure · whole repo · 0 · −2
Acceptance test failures · CI, last 30 days · 0 · −1
Bottleneck · UI

Reverse-engineer the screen, assemble in uGUI.

Layout handoff is the third chronic bottleneck. Agents reverse-engineer layout and behaviour from videos or screenshots and assemble it in uGUI — bounded by your conventions.

The Agent does the layout work.

Drop a screen recording, a static mock, or a competitor's reference. The agent infers hierarchy, spacing, anchors and interaction states, then produces real uGUI prefabs against the assemblies you permit.

Input
Video · screenshot · mock
Infer
Hierarchy · anchors · states
Generate
uGUI prefab + presenter
Output
Bounded by conventions
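One way to picture what "infer hierarchy, anchors and states" might produce before any prefab is written: a plain tree the generator can turn into uGUI objects. Every field name here is invented for illustration; only the uGUI anchor convention (normalized 0–1 rects) is real.

```python
from dataclasses import dataclass, field

@dataclass
class UINode:
    name: str
    anchor_min: tuple           # uGUI RectTransform-style normalized anchors
    anchor_max: tuple
    states: list = field(default_factory=list)    # "default", "damaged", ...
    children: list = field(default_factory=list)

# Hypothetical inference result for a health bar pinned to the top-left:
health_bar = UINode(
    name="HealthBarOverlay",
    anchor_min=(0.0, 0.9), anchor_max=(0.3, 1.0),
    states=["default", "damaged"],
    children=[UINode("Fill", (0.0, 0.0), (1.0, 1.0))],  # stretches to parent
)
```

An intermediate tree like this is also where conventions can bite: validators can reject a node before a single prefab is serialized.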
Generated · HealthBarOverlay.prefab · passes
01 Input: video / screenshot / mock
02 Reverse-engineer hierarchy
03 Generate uGUI prefabs
04 Conventions + tooling enforce rules
05 ✓ Ready