Commercial iOS Game Portfolio

Agentic AI engineering applied to commercial mobile development

Live on App Store: 3 Shipped · 1 In Development · 2 In Pipeline

Overview

Three commercial iOS games live on the App Store, each built using agentic AI engineering practices and monetised via Google AdMob. This is not a hobby portfolio - it is a systematic, deliberate programme of commercial product development that proves a repeatable methodology for taking ideas from concept to revenue-generating App Store products in days rather than months.

Each game deliberately tests a different mobile platform capability. The result is a growing library of reusable components, documented learnings, and production-tested patterns that compound with every release.

The Portfolio

| Game | Status | Platform Capability | Key Technology |
|---|---|---|---|
| Shade Shift | Live on App Store | Colour perception, HSB theory | SwiftUI, @Observable, AdMob |
| Minesweeper Zen | Live on App Store | Procedural puzzle generation | SpriteKit, CoreHaptics, AdMob |
| Signal Zen | Live on App Store | Real-time audio synthesis | AVAudioEngine, CoreHaptics, AdMob |
| Fledgling | In Development | Gyroscope, physics, procedural graphics | SpriteKit, CoreMotion |

What Makes This Commercial

Every game in this portfolio is a monetised product, not a demo or prototype.

Revenue model

Google AdMob integration with rewarded video ads and interstitials. Rewarded ads are always opt-in - players choose when to watch - and interstitials are kept infrequent, with no paywalls. The monetisation respects the zen design philosophy across the portfolio while generating real advertising revenue.

Production quality

These are fully featured App Store products with haptic feedback, procedurally generated audio, particle animations, privacy policies, App Tracking Transparency compliance, and multi-device screenshot sets. Each game went through Apple's full review process and meets App Store commercial standards.

Ongoing operation

Live games receive updates, and the portfolio continues to grow. This is an active commercial product line, not a one-off experiment.

Agentic Engineering in Action

The real story here is not the games themselves - it is the engineering methodology behind them. "Vibe coding" gets the headlines, but agentic engineering is the real skill to demonstrate. This portfolio proves a factory model for building commercial software products.

The Methodology

Specs before code

Product specification, design documentation, theming system - all written before touching Xcode. Decomposing the problem into clear, buildable features is where the value lies.

Persistent learnings

Every gotcha is documented. New projects start with accumulated knowledge from previous builds, eliminating repeated mistakes and accelerating delivery.

Reusable components

The AdManager, design system, sound engine, and haptics framework all carry forward from one game to the next. Each release extends a shared component library.

Decomposition over prompting

Problems are broken into features: "design the theming system", then "architect the tile state machine", then "implement the reveal algorithm". This mirrors how enterprise architects decompose complex systems - same discipline, different domain.

Idea to execution speed

Traditional development takes weeks from concept to working prototype. Agentic development takes days from concept to App Store submission.

The Pattern That Scales

Each project builds upon the foundations of the last. The specs get tighter. The components get reused. The learnings accumulate.

This is not "AI writes code for me" - it is a collaboration pattern that scales. The same practices apply whether building a mobile game, a multi-agent data pipeline, or an enterprise architecture deliverable.

The Games

Shade Shift

Pipeline validation: zero to App Store in 3 days

Shade Shift gameplay

The first game in the series - a deliberate pipeline validation project. The question: how fast can you go from zero to a commercially published App Store product with AI-assisted development? The answer: three days and approximately 950 lines of code.

Shade Shift is a colour perception game where players spot the tile that is a slightly different shade. The grid grows as scores increase and colours become more subtle. No timers, no lives, no stress. The game mechanic was chosen for technical simplicity - grid logic, no physics, no AI, no pathfinding - because the point was proving the end-to-end commercial pipeline, not the game itself.

What it proved

This is a native iOS app with haptic feedback on every tap, procedurally generated sound effects, particle animations, AdMob integration for rewarded video ads, and a published privacy policy. The full commercial pipeline works.

Technical details

MVVM architecture with SwiftUI and @Observable. HSB colour shifting for difficulty progression. Programmatically generated audio. Rewarded video ads via AdMob.
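
The shipped game is Swift/SwiftUI, but the HSB difficulty idea is language-agnostic: hold hue and saturation fixed and shrink the brightness gap between the base tiles and the odd tile as the score rises. A minimal Python sketch, assuming a hypothetical decay curve (a gap that falls from 0.20 towards a 0.02 floor) - the function name and constants are illustrative, not the production values:

```python
import colorsys

def tile_colours(score: int, base_hue: float = 0.6):
    """Return (base RGB, odd-tile RGB); the brightness gap shrinks as the score rises."""
    # Hypothetical difficulty curve: gap decays from 0.20 towards a 0.02 floor.
    delta = max(0.02, 0.20 - 0.01 * score)
    base = colorsys.hsv_to_rgb(base_hue, 0.7, 0.6)
    odd = colorsys.hsv_to_rgb(base_hue, 0.7, 0.6 + delta)
    return base, odd

easy_base, easy_odd = tile_colours(score=0)   # clearly visible difference
hard_base, hard_odd = tile_colours(score=15)  # much subtler difference
```

Because hue and saturation are shared, the odd tile differs only in perceived brightness, which is what makes late-game rounds genuinely hard to spot.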

Minesweeper Zen

The satisfaction of Minesweeper without the stress

Minesweeper Zen gameplay

Classic Minesweeper is a punishment game - one wrong click equals death. Minesweeper Zen flips it: you are discovering treasure. No explosions, no fail state, no anxiety. Same logical deduction, completely different emotional experience.

The design philosophy is ASMR-like satisfaction rather than excitement. Soft sounds, gentle haptics, warm colours. The cascade reveal - where tiles dissolve in a ripple pattern like falling dominoes - is the signature moment. Mesmerising, not jarring.

Agentic engineering showcase

This game's writeup explicitly documents the agentic methodology - specs before code, persistent learnings, reusable components, decomposition over prompting. Every puzzle is 100% solvable, validated by a BFS solver before presentation. Daily puzzles are seeded by date so all players get the same board.
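
The date-seeding idea can be sketched in a few lines of Python (the shipped game is Swift; the hash choice, board size, and mine count here are illustrative assumptions, not the production code). Hashing the date string to a seed means every player who opens the daily puzzle derives the identical layout:

```python
import hashlib
import random

def daily_board(date_iso: str, rows: int = 8, cols: int = 8, mines: int = 10) -> set:
    """Deterministic mine layout: every player hashing the same date gets the same board."""
    seed = int.from_bytes(hashlib.sha256(date_iso.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    return set(rng.sample(cells, mines))

board_a = daily_board("2025-01-15")
board_b = daily_board("2025-01-15")  # identical to board_a on any device
```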

Technical highlights

BFS solver validates 100% solvability. Cascade animation with wave-based timing at 0.05s stagger between tiles. Procedural marimba/xylophone tones generated programmatically. Rewarded ads for hints, interstitials after 5 levels.
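
The wave-based cascade timing can be sketched as a breadth-first search from the tapped tile, with each ring of tiles delayed a further 0.05 s. A minimal Python sketch of that scheduling idea (the shipped animation is SpriteKit/Swift; the function and parameter names are hypothetical):

```python
from collections import deque

def cascade_delays(rows: int, cols: int, start: tuple, stagger: float = 0.05) -> dict:
    """BFS from the tapped tile; each tile's reveal delay grows with its wave distance."""
    delays = {start: 0.0}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in delays:
                delays[(nr, nc)] = delays[(r, c)] + stagger
                queue.append((nr, nc))
    return delays

delays = cascade_delays(4, 4, start=(0, 0))  # corner tap ripples across the board
```

Scheduling each tile's reveal at its BFS distance times the stagger is what produces the domino-like ripple rather than a simultaneous flash.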

Signal Zen

A zen memory game where sound IS the puzzle

Signal Zen gameplay

The most technically ambitious and visually impressive game in the portfolio. Most memory games treat sound as decoration. Signal Zen makes sound the entire puzzle. Players memorise musical sequences synthesised in real-time via AVAudioEngine - no audio files, just programmatic synthesis. Each hexagon has its own tone, and the sequences you memorise are genuinely musical.

The space theme draws from Close Encounters of the Third Kind - that iconic moment where communication happens through music, not words. A glowing hexagon centrepiece floats in space with particle effects drifting across a starfield. The opening animation rivals games with ten times the budget.

What makes it unique

The audio mechanic is genuinely rare on mobile. Most musical games are rhythm games trying to be Guitar Hero. Signal Zen is not competing with those at all. It sits in its own pocket of "musical puzzle" that barely has any competition.

Development timeline

Four days from concept to App Store submission, plus four days for App Store review. Eight days total from zero to live.

Technical highlights

AVAudioEngine for programmatic sound synthesis with zero audio files. CoreHaptics integration for tactile feedback. Particle effects and glow animations. Three difficulty modes for flow state, challenge, or relaxation.
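
Programmatic synthesis means computing samples rather than loading files. A minimal Python sketch of the idea (the shipped game renders in real time via AVAudioEngine in Swift; the pentatonic note mapping here is an assumption to illustrate why sequences sound musical rather than random):

```python
import math

def hex_tone(index: int, sample_rate: int = 44_100, duration: float = 0.25) -> list:
    """Synthesise one hexagon's tone as raw samples: a sine wave on a pentatonic step."""
    pentatonic = [0, 2, 4, 7, 9]  # semitone offsets; any subset sounds consonant
    semitones = pentatonic[index % 5] + 12 * (index // 5)
    freq = 440.0 * 2 ** (semitones / 12)  # equal-temperament pitch from A4
    n = int(sample_rate * duration)
    return [math.sin(2 * math.pi * freq * t / sample_rate) for t in range(n)]

tone = hex_tone(0)  # hexagon 0 plays A4 at 440 Hz
```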

Fledgling (In Development)

Bouncy baby bird landing

Fledgling gameplay

The fourth game in the portfolio, currently in active development. A cosy casual game where a fluffy baby bird learns to land. Players tilt their phone to guide the fledgling from nest to forest floor - no fail state, just bounce. Features the CHONK growth system (a bigger bird bounces higher but falls faster), procedural levels, and 100% procedural graphics with zero image files in the bundle.

Platform capability being tested

Gyroscope and accelerometer input via CoreMotion, physics simulation, and fully procedural graphic generation.
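
The tilt-to-velocity mapping can be sketched in a few lines (the shipped input path is CoreMotion in Swift; the dead zone and maximum speed here are hypothetical tuning values). A small dead zone around level stops hand tremor from drifting the bird:

```python
def tilt_to_velocity(gravity_x: float, max_speed: float = 300.0, dead_zone: float = 0.05) -> float:
    """Map a tilt reading (-1..1, like CoreMotion's gravity.x) to horizontal velocity."""
    if abs(gravity_x) < dead_zone:
        return 0.0  # ignore hand tremor near level
    clamped = max(-1.0, min(1.0, gravity_x))
    return clamped * max_speed
```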

Technical highlights

CoreMotion accelerometer maps phone tilt to bird velocity. A Xorshift64 generator creates infinite replayable content with date-seeded daily challenges. Six-layer parallax scrolling. Baby-schema design principles (head 1.2x larger than the body, large eyes placed low) trigger the "cute" response.
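
Xorshift64 itself is tiny, which is part of its appeal for deterministic daily content. A Python sketch of the classic generator with a date-derived start state (the Swift implementation and the exact seed mapping are not shown here; `daily_seed` is an illustrative assumption):

```python
def xorshift64(state: int):
    """One Marsaglia xorshift64 step: returns (next_state, value); state must be non-zero."""
    state ^= (state << 13) & 0xFFFFFFFFFFFFFFFF
    state ^= state >> 7
    state ^= (state << 17) & 0xFFFFFFFFFFFFFFFF
    return state, state

def daily_seed(yyyymmdd: int) -> int:
    """Date-derived start state so every player sees the same day's level."""
    return yyyymmdd or 1  # xorshift must never start at zero

state = daily_seed(20250115)
state, a = xorshift64(state)
state, b = xorshift64(state)

replay = daily_seed(20250115)        # same date on another device...
replay, a2 = xorshift64(replay)      # ...reproduces the same sequence
```

Because the state transition is deterministic, the same date yields the same level on every device with no server required.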

Technology Stack

| Category | Technologies |
|---|---|
| UI Frameworks | SwiftUI, SpriteKit |
| Audio | AVAudioEngine, programmatic synthesis |
| Input | CoreMotion, CoreHaptics |
| Monetisation | Google AdMob (rewarded video, interstitials) |
| Architecture | MVVM, @Observable |
| Generation | Procedural graphics, procedural audio, BFS puzzle validation |

Portfolio Context

This is not merely a set of puzzle games. It is a systematic demonstration of mobile platform capabilities, each game proving a different technical dimension - built in weeks, not months.

| Game | Capability Tested |
|---|---|
| Shade Shift | Colour perception, HSB theory |
| Minesweeper Zen | Procedural puzzle generation |
| Signal Zen | Audio system, programmatic synthesis |
| Fledgling | Gyroscope, physics, procedural graphics |

The methodology behind this portfolio - agentic AI engineering with specs-first decomposition, reusable components, and persistent learnings - is the same approach applied to enterprise architecture, data modernisation strategies, and multi-agent AI pipelines. The domain changes. The discipline does not.