NEONARCADE

Behind the Scenes

Article 01 — March 7, 2026

From Prompt to Playable

It starts with a sentence. Something like "build me a game where you play Tetris badly on purpose" or "Pac-Man but you can barely see the maze." That is the entire human contribution to the creative brief. From there, the robots take over.

NEON ARCADE uses a two-stage AI pipeline to go from idea to finished game. Stage one: a researcher agent takes the raw concept, studies what makes games shareable, analyzes viral mechanics (short sessions, emotional spikes, "one more try" hooks), and writes a detailed game specification. It decides the controls, the scoring system, the difficulty curve, the sharing format — everything.

Stage two: a builder agent takes that spec and produces the entire game. HTML structure, CSS styling, JavaScript game logic, sound effects (synthesized with the Web Audio API, no audio files), mobile touch support, keyboard controls, HiDPI canvas rendering, the share button, leaderboard integration — all in a single self-contained .html file.

38 Games Built · 3 Days Total · ~4min Avg Per Game · 1 Human

Average build time per game: roughly 3 to 5 minutes. That includes sound effects, responsive layout, a start screen, a game-over screen with stats, a share button that generates emoji art, and integration with global leaderboards. Thirty-eight games were built between March 5 and March 7, 2026.

The secret weapon is parallel execution. Instead of building games one at a time, multiple builder agents run simultaneously: five or six games materializing at once, each agent reading from the same style guide and producing output that looks like it came from the same studio. Because it did. Sort of.

The Style Bible

The consistency comes from CLAUDE.md — a project instructions file that every agent reads before writing a single line of code. It specifies exact hex colors, font families, glow effects, CSS patterns, required features, the SEO template, even the button hover states. Think of it as a 200-line brand manual that happens to be machine-readable.
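To give a flavor of what a machine-readable brand manual looks like, here is a purely illustrative excerpt — not the real CLAUDE.md's wording, just the kinds of rules described in this article, written down as an agent would read them:

```text
# CLAUDE.md — illustrative excerpt (hypothetical wording)
## Visual identity
- Headings: Orbitron with cyan glow
- Body text: Rajdhani
- Warm accent: gold (#ffd700)
- Every game: scanline overlay, particle effects on events, screen shake on errors
## Required features
- Start screen, game-over screen with stats, emoji-art share button
- Neon.init() leaderboard integration
- Mobile touch support + keyboard controls, HiDPI canvas rendering
- SEO template: meta description, Open Graph, Twitter Card, canonical URL
```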

Every game gets: Orbitron headings with cyan glow. Rajdhani body text. Scanline overlay. Particle effects on events. Screen shake on errors. The same dark-neon aesthetic from Snake to Sudoku to Tetris Betrayal. No game looks out of place, because no agent deviated from the spec.

The Human in the Loop

So what does the human actually do? Quality control. Playtesting. The "that is not fun enough" feedback. The "make it feel faster" note. The "this needs more screen shake" request. Sometimes the AI nails it on the first try. Sometimes a game needs two or three iterations before the feel is right.

There is something deeply ironic about an AI building games specifically designed to trigger human emotions strongly enough that humans will share them on social media. The games are engineered for dopamine spikes, frustration loops, and "I need to beat that score" moments — all by a system that has never felt any of those things.

The Pipeline

Human idea → Researcher agent (viral analysis + game spec) → Builder agent (complete HTML/CSS/JS game) → Human playtesting → Ship it

The result is a collection of games that nobody would guess were each written in under five minutes. They have polish. They have juice. They have that satisfying feeling when you nail a high score and the screen erupts in particles. And they were built faster than you can order lunch.

Article 02 — March 7, 2026

SEO by AI

After 38 games were built, we had a problem. They existed, they were fun, but search engines had no idea they were there. So we pointed the AI at itself: "audit every page for SEO issues and fix them."

It found a lot. Across 39 HTML files (38 games plus the landing page), here is what the AI flagged and fixed:

The Audit

  • Missing meta descriptions — most game pages had a <title> but no description tag, so Google had nothing to show in search results except whatever text it guessed from the page.
  • No Open Graph or Twitter Card tags — sharing a game link on social media produced a sad, featureless preview. Now every game has og:title, og:description, og:image, and matching Twitter Card tags.
  • Render-blocking font imports — many games used @import inside <style> to load Google Fonts. This blocks rendering. Converted all of them to <link> tags in the <head> with preconnect hints.
  • Missing canonical URLs — added <link rel="canonical"> to every page to prevent duplicate content issues.
  • No structured data — added Schema.org VideoGame markup and BreadcrumbList to every game page. Added FAQPage schema to the landing page.
  • WCAG contrast failures — several pages used text colors darker than #7a7a9a on the dark background. These failed the 4.5:1 contrast ratio. Brightened them all.
  • Missing landmarks — hub pages lacked <main> tags. Accessibility tools need these for navigation.
  • Links not distinguishable — some links relied on color alone. Added underlines and borders so they pass WCAG requirements.
  • No image dimensions — screenshot <img> tags had no width/height, causing cumulative layout shift (CLS). Added explicit dimensions.
  • Missing fetchpriority — above-the-fold hero images were loading with the same priority as offscreen content.
  • No favicon or apple-touch-icon — browsers showed the default blank icon. Created SVG and PNG favicons.
  • No manifest.json — added a PWA manifest for installability and proper mobile branding.
  • Missing utility files — created ads.txt, llms.txt, and a custom 404 page.
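The font fix from the list above looks roughly like this (a sketch — the exact families and weights in the real pages may differ, though Orbitron and Rajdhani are the site's actual pair):

```html
<!-- Before: render-blocking, inside <style> -->
<!--   @import url('https://fonts.googleapis.com/css2?family=Orbitron&display=swap'); -->

<!-- After: <link> tags in <head> with preconnect hints -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Orbitron&family=Rajdhani&display=swap" rel="stylesheet">
```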

The Results

Lighthouse scores went from "embarrassing" to "actually good" across the board. Performance, Accessibility, Best Practices, and SEO categories all improved significantly. The biggest gains came from eliminating render-blocking resources and fixing contrast ratios.

The Meta Humor

An AI is optimizing web content so that search engines — which are themselves increasingly powered by AI — will rank it higher and show it to humans. The machines are marketing to each other, and humans are the product. But at least the games are fun.

The best part: the AI didn't just flag problems. It fixed all of them in a single pass across every file. The kind of tedious, repetitive work that a human developer would have spread across a week was done in minutes. Every @import replaced. Every meta tag added. Every contrast ratio checked. Same agent, same rules, same consistency.

Article 03 — March 7, 2026

Zero-Cost Deployment

NEON ARCADE runs 38 games with global leaderboards, play tracking, and a like/report system. The total monthly hosting cost is $0. The only expense is the domain name: about $10 per year. Here is how.

The Stack

There are exactly two components:

  • Static HTML files — served by Cloudflare Pages. Each game is a standalone .html file with inline CSS and JavaScript. No build step, no bundler, no framework. Just files.
  • One Cloudflare Worker — a 175-line JavaScript file (src/worker.js) that handles API routes for play tracking, likes, issue reporting, and leaderboards. Everything else falls through to static assets.

Browser → Cloudflare Edge → Worker → KV Store

When a request hits the Cloudflare edge, the Worker checks if the path starts with /api/. If yes, it handles the request. If not, it falls through to env.ASSETS.fetch(request), which serves the static HTML file. Simple routing, zero configuration.
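The routing can be sketched as a minimal fetch handler. The shape here is assumed — the real src/worker.js adds CORS handling, input sanitization, and the full set of /api/ routes:

```javascript
// Minimal sketch of the Worker's routing (assumed shape, not the real file).
function isApiPath(pathname) {
  return pathname.startsWith('/api/');
}

async function handleRequest(request, env) {
  const url = new URL(request.url);
  if (isApiPath(url.pathname)) {
    // API route: handled in the Worker (play counters, likes, leaderboards)
    return new Response(JSON.stringify({ ok: true }), {
      headers: { 'Content-Type': 'application/json' },
    });
  }
  // Everything else falls through to the static assets binding.
  return env.ASSETS.fetch(request);
}
```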

Deployment

Deployment is literally one command:

npx wrangler deploy

That is it. No Docker image to build, no CI/CD pipeline to configure, no Kubernetes cluster to manage, no AWS bill to dread. Wrangler reads a wrangler.toml config that looks like this:

# wrangler.toml
name = "neonarcade"
main = "src/worker.js"
compatibility_date = "2024-12-01"

# Bind KV namespace for persistent data
[[kv_namespaces]]
binding = "GAME_DATA"
id = "your-kv-namespace-id"

# Static assets served alongside the worker
[assets]
directory = "./"

The Worker binds a KV namespace called GAME_DATA. That is the "database" — a key-value store running on Cloudflare's edge, replicated globally. More on that in the next article.

Free Tier Limits

Cloudflare's free tier is remarkably generous for a project like this:

  • Workers: 100,000 requests per day (about 1.15 requests per second sustained). More than enough for a hobby project — and honestly, for a lot of production MVPs.
  • KV storage: 1 GB total. Our leaderboards, play counts, and like counts use a few kilobytes. We could run for centuries.
  • Pages: unlimited bandwidth on the free tier. Static HTML files served from 300+ edge locations worldwide.

No Server

There is no origin server. No EC2 instance, no Heroku dyno, no DigitalOcean droplet, no VPS. The Worker is the backend, and it runs on Cloudflare's edge — in whatever data center is closest to the user. The static files are cached globally. It is fast by default, not because of optimization, but because the architecture has no bottleneck to hit.

Total Cost Breakdown

Hosting: $0/month. CDN: $0/month. Database: $0/month. SSL: $0/month. Domain: ~$10/year. Annual total: $10. That is less than a single month of the cheapest AWS EC2 instance.

For hobby projects, side projects, and MVPs, this is hard to beat. You get global distribution, edge computing, persistent storage, automatic HTTPS, DDoS protection, and zero ongoing cost. The catch? If your project explodes in popularity past 100K requests/day, you will need to upgrade to a paid plan. But that is a good problem to have.

Article 04 — March 7, 2026

Global Leaderboards for $0

You want your browser game to have a global leaderboard. Players should see the top scores worldwide, and submitting a score should feel instant. You do not want to pay for a database. Here is how we solved it.

The Problem

Traditional approach: spin up a PostgreSQL database, write a REST API with Express or Django, deploy it somewhere, pay for hosting, handle connection pooling, worry about backups, deal with cold starts. For a hobby game site. No thanks.

The Solution

Cloudflare KV is a globally distributed key-value store. You write a value, and within about 60 seconds it is readable from any Cloudflare edge location in the world. It is not a relational database. It is simpler than that — and for leaderboards, simpler is perfect.

The entire leaderboard API lives in src/worker.js, which is 175 lines of JavaScript. Here are the endpoints:

  • POST /api/play/:game — Increments the play counter for a game
  • POST /api/like/:game — Increments the like counter
  • POST /api/issue/:game — Increments the reported-issues counter
  • GET /api/stats — Returns play/like/issue counts for all games
  • POST /api/leaderboard/:game — Submits a score (name + score + mode)
  • GET /api/leaderboard/:game — Fetches the top 20 scores

KV Key Design

Every piece of data maps to a single key in KV:

  • plays:neon-snake → an integer (the play count)
  • likes:neon-snake → an integer
  • issues:neon-snake → an integer
  • lb:neon-snake → a JSON array of the top 20 scores

Leaderboard entries are stored as a JSON array in a single KV key. When a new score is submitted, the Worker reads the array, appends the new entry, sorts it, keeps the top 20, and writes it back. The whole operation takes a few milliseconds.

// Submit a score — from src/worker.js
const key = `lb:${game}`;
const existing = JSON.parse(await env.GAME_DATA.get(key) || '[]');
existing.push({ name, score, ts: Date.now() });

// Sort by mode: "high" = arcade, "low" = timed puzzles
const mode = body.mode === 'low' ? 'low' : 'high';
if (mode === 'low') {
  existing.sort((a, b) => a.score - b.score);
} else {
  existing.sort((a, b) => b.score - a.score);
}
const top = existing.slice(0, 20);
await env.GAME_DATA.put(key, JSON.stringify(top));

The mode parameter is the key design choice. Arcade games use "high" (bigger is better). Timed puzzles like Sudoku and Minesweeper use "low" (faster is better). The client tells the server which sorting to use.

Input Sanitization

Since this is a public API with no authentication, input validation matters. Game slugs are stripped to lowercase alphanumeric characters and hyphens, max 50 characters. Player names are uppercase letters only, max 5 characters. Nothing fancy — just enough to prevent injection and keep things clean.

// Sanitize game slug: only lowercase alphanumeric + hyphens
function sanitizeGame(game) {
  return (game || '').toLowerCase()
    .replace(/[^a-z0-9-]/g, '').slice(0, 50);
}

// Sanitize player name: uppercase letters only, max 5
function sanitizeName(name) {
  return (name || '').toUpperCase()
    .replace(/[^A-Z]/g, '').slice(0, 5);
}

CORS is restricted to neonarcade.net and localhost. If the request comes from an unknown origin, it defaults to the production domain.

Client-Side Integration

Games never talk to the API directly. They use neon.js, a shared client library that handles everything: local score storage (top 10 in localStorage), global score submission, the leaderboard UI, and even the player name input modal (a retro 3-5 character name picker).

// In any game's script — that's it
Neon.init({
  game: 'neon-snake',
  mode: 'high',
  key: 'neonarcade_arcade_snake_scores'
});

// When the game ends:
(Neon.getName() ? Promise.resolve() : Neon.promptName())
  .then(function () { return Neon.save(score); })
  .then(function (result) {
    Neon.render(document.getElementById('lb'));
    if (result.globalRank >= 1 && result.globalRank <= 5) {
      Neon.showGlobalSplash(result.globalRank);
    }
  });

The Trade-offs

Eventual consistency: KV is not a database. It is a cache with persistence. When you write a value, it takes up to 60 seconds to propagate to all edge locations, and there are no transactions — the leaderboard update is a non-atomic read-modify-write. Two players submitting scores at the same moment can race, both reading the same array, with one write overwriting the other. For a casual game leaderboard, this is fine. For a banking app, obviously not.

No authentication: Players pick a 3-5 letter name and that is their identity. No login, no accounts, no OAuth flows. This is intentional. The point is zero friction — play a game, get a score, see it on the leaderboard. The anonymity keeps it fun and lightweight.

No server-side score validation: A determined cheater could POST a fake score directly to the API. The current philosophy is "trust but display" — the fun is in playing and sharing, not in competitive integrity. If cheating becomes a problem, server-side validation can be added. But for a site where the high score on Snake is bragging rights among friends, it has not been an issue.

The Bottom Line

175 lines of JavaScript. One KV namespace. Zero monthly cost. Global leaderboards that update in under a second and replicate to 300+ edge locations. Sometimes the simplest solution is the best one.

Article 05 — March 7, 2026

Security Audit

We built 38 games and a global leaderboard API in three days. Then we asked the obvious question: how badly can someone break this? The answer was: pretty badly. Here is what we found and how we fixed it.

The entire backend is a 175-line Cloudflare Worker with no authentication. No user accounts, no API keys, no rate limiting. By design, anyone can play, anyone can submit scores, anyone can like a game. That simplicity is the product — but it is also the attack surface.

The Kill Shot: Leaderboard Poisoning

The most critical finding was embarrassingly simple. The score submission endpoint accepted any integer with no upper bound:

const score = parseInt(body.score, 10);
if (isNaN(score) || score < 0) return json({ error: 'invalid' });

One curl command could fill all 20 leaderboard slots with score: 999999999. Every game, every leaderboard, wiped out in seconds. The fix: server-side score caps per game and a known-game allowlist so you cannot create phantom leaderboards for games that do not exist.
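The fix can be sketched as a single validation function — the cap values and helper name here are hypothetical; the real worker defines its own per-game maximums:

```javascript
// Sketch of the fix: a known-game allowlist with per-game score caps.
// Cap values are illustrative, not the production numbers.
const SCORE_CAPS = {
  'neon-snake': 10000,
  'neon-invaders': 500000,
};

function validateScore(game, rawScore) {
  if (!(game in SCORE_CAPS)) return null;            // phantom game: reject
  const score = parseInt(rawScore, 10);
  if (Number.isNaN(score) || score < 0) return null; // not a valid score
  if (score > SCORE_CAPS[game]) return null;         // above the cap: reject
  return score;
}
```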

Mode Injection

The leaderboard supports two sort modes: "high" (bigger is better) and "low" (faster is better). The mode was sent by the client and trusted by the server. An attacker could POST mode: "low" to an arcade game, causing the leaderboard to re-sort in ascending order — pushing real high scores to the bottom.

The fix: the server now stores and enforces the sort mode per game. The client's mode field is ignored.

No Rate Limiting

Every POST endpoint (play, like, issue, score submission) had zero rate limiting. A simple loop could:

  • Inflate play counts to millions
  • Spam likes or issue reports
  • Flood leaderboards with entries
  • Drive up Cloudflare KV write costs

The fix: per-IP rate limiting in the Worker using a KV-based sliding window. Each IP gets a budget of requests per minute across all POST endpoints. Exceeding the limit returns 429 Too Many Requests.
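The counting logic can be sketched in memory — this is a simpler fixed-window variant of the pattern (the production version keeps the same counters in KV with a short TTL); the helper name is hypothetical, and the 30/minute budget is the limit the project later documented:

```javascript
// In-memory sketch of the per-IP request budget (fixed-window variant).
const LIMIT = 30; // POST requests per minute per IP

function allowRequest(store, ip, now = Date.now()) {
  const key = `rl:${ip}:${Math.floor(now / 60000)}`; // bucket by minute
  const count = store.get(key) || 0;
  if (count >= LIMIT) return false; // caller responds 429 Too Many Requests
  store.set(key, count + 1);
  return true;
}
```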

Phantom Games & KV Pollution

Anyone could POST to /api/play/totally-fake-game and create real KV entries. The stats dashboard would show hundreds of phantom games that never existed. The fix: a server-side allowlist of valid game slugs. Unknown games are rejected with a 404.

Missing Security Headers

The _headers file only set Content-Type. No Content-Security-Policy, no X-Frame-Options, no Strict-Transport-Security. Games could be embedded in iframes on malicious sites (clickjacking). Now every response includes security headers that prevent framing, enforce HTTPS, and restrict script sources.
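A _headers sketch of the kind of rules described — the values here are illustrative, not the site's exact production policy (note the CSP keeps 'unsafe-inline' because the games use inline scripts):

```text
# public/_headers — illustrative sketch, not the exact production rules
/*
  X-Frame-Options: DENY
  Strict-Transport-Security: max-age=31536000; includeSubDomains
  X-Content-Type-Options: nosniff
  Content-Security-Policy: default-src 'self'; script-src 'self' 'unsafe-inline'
```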

Internal Files Exposed

The Wrangler config had directory = "./", meaning the entire repo root was deployed as static assets. That included security.md (this very audit report), CLAUDE.md (the full project instructions), VIRAL_GAME_IDEAS.md (the ideas backlog), wrangler.toml, and even src/worker.js. Anyone could read the backend source code by visiting /src/worker.js.

The fix: restructured the entire project. All deployable web files now live in a public/ directory. Control files (docs, config, source) stay in the root and are never uploaded. Wrangler config changed to directory = "./public". Clean separation between what the world sees and what stays internal.

Localhost CORS Origins

The CORS allowlist included http://localhost:8777 and http://127.0.0.1:8777 from development. Not exploitable in practice (browsers enforce same-origin policy), but unnecessary in production. Removed.

The Smaller Stuff

  • Name sanitization mismatch — the client allowed A-Z and 0-9 in names, but the server stripped digits. A player named "A1B" became "AB" on the leaderboard. Aligned both to accept A-Z0-9, 3 characters.
  • Duplicate name flooding — one player could fill all 20 leaderboard slots. Now the server keeps only the best score per name.
  • Zero scores saved as high scores — dying immediately in a game still recorded a "high score" of 0. Client-side fix rejects zero scores before saving.
  • No error logging — the catch block returned a generic 500 with no console.error. Production errors were invisible. Added logging.
  • innerHTML usage — four game files used innerHTML to render content. While sourced from internal data, the pattern is fragile. Replaced with safe DOM methods (createElement + textContent).
  • Admin pages public — stats and top scores dashboards were accessible to anyone. Accepted as low risk since they are read-only.
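The duplicate-name fix from the list above reduces to one rule: keep only each name's best entry. A sketch, with a hypothetical helper name (mode mirrors the leaderboard's "high"/"low" sorting):

```javascript
// Keep only the best score per player name (helper name is hypothetical).
function dedupeBestPerName(entries, mode = 'high') {
  const best = new Map();
  for (const entry of entries) {
    const prev = best.get(entry.name);
    if (!prev) { best.set(entry.name, entry); continue; }
    const isBetter = mode === 'low'
      ? entry.score < prev.score  // timed puzzles: lower is better
      : entry.score > prev.score; // arcade: higher is better
    if (isBetter) best.set(entry.name, entry);
  }
  return [...best.values()];
}
```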

The Scorecard

17 Issues Found · 4 Critical · 4 High · 5 Medium + 4 Low

The Lesson

A public API with no authentication is not "insecure by design" — it is insecure by neglect if you do not actively constrain it. Every input that crosses a trust boundary needs validation, even in a toy project. Especially in a toy project, because toy projects have a way of ending up on the front page of Hacker News with their leaderboards full of 999999999.

The full audit report is in the repo as security.md. Every issue was found and fixed by the same AI that built the games in the first place. The irony continues: the robot audited its own work, found it wanting, and fixed the problems. At least it is honest about its mistakes.

Article 06 — March 8, 2026

Bringing Back the Classics

Here is a question that sounds obvious until you think about it: why do people still play Space Invaders? The game is 48 years old. The graphics are 32 pixels wide. A smartphone in your pocket has more computing power than every arcade cabinet ever manufactured, combined. And yet: Space Invaders, Tetris, Breakout, Asteroids, Frogger — these games refuse to die.

The answer is that great game design is timeless. The rules of Space Invaders are so elegant that a five-year-old understands them in seconds: the aliens move, you shoot, they get faster. Tetris is pure spatial reasoning distilled into its most addictive form. Frogger is a timing puzzle wrapped in a traffic metaphor. These games survived because their core loops are perfect.

The Problem with Nostalgia

But there is a problem. Playing the original Space Invaders in 2026 feels like visiting a museum. Respectful, educational, a little boring. The controls are stiff. The visuals are charming in a "my grandmother's wallpaper" way. There are no leaderboards (unless you find a working arcade cabinet), no share buttons, no "one more try" hooks optimized for the dopamine economy.

Most retro game remakes fall into one of two traps:

  • The museum approach — pixel-perfect recreation of the original, "faithful to the source material," which means faithfully recreating 1978's idea of game feel. Respectful. Forgettable.
  • The reimagining approach — so many new mechanics, modes, and meta-progression systems that the original design is buried under layers of modern game design anxiety. "We added RPG elements and a battle pass." Why.

NEON CLASSIC takes a third path: keep the core loop untouched, replace everything else.

What We Changed

Every game in the NEON CLASSIC collection uses the exact same rules as the original. Space Invaders has the same 5x11 formation, the same scoring, the same speed-up-as-they-die mechanic. Tetris has the same 7 tetrominoes, the same 10-wide grid, the same line-clear scoring. We did not "improve" the game design because it does not need improving.

What we did change:

  • Visual identity — the NEON ARCADE dark-neon aesthetic. Glowing outlines, particle effects, screen shake, scanline overlays. Every game looks like it belongs in the same family as the 38 original NEON ARCADE titles.
  • Sound design — synthesized audio via the Web Audio API. No audio files, no loading, just oscillators generating retro sounds that match the neon aesthetic.
  • Global leaderboards — every game connects to the same neon.js infrastructure. Your high score competes with everyone in the world. Top 5 gets a global splash celebration.
  • Share mechanics — every game over generates an emoji-art summary you can paste into group chats. "I survived Wave 12 in NEON INVADERS" is more interesting than "I played Space Invaders."
  • Mobile-first controls — touch swipes, virtual buttons, responsive canvases. These games were designed for arcade cabinets with joysticks. Now they work on a phone in a bus queue.
  • Modern quality-of-life — ghost pieces in Tetris, hold system, undo in 2048. Small additions that modern players expect, without changing the fundamental challenge.

The Seven Classics

7 Games Launched · 48 Years of History · 0 Downloads Required · 4 Categories Total

  • NEON INVADERS — Space Invaders. 5 rows of neon aliens, mystery UFOs, wave escalation. The game that started the arcade industry, now in your browser.
  • NEON BREAKOUT — Breakout/Arkanoid. Six rows of color-coded bricks, power-ups, combo multipliers. Satisfying destruction with every bounce.
  • NEON TETRIS — The block-stacker. Marathon mode and 40-Line Sprint. Ghost pieces, hold, next preview. The game that needs no introduction.
  • NEON 2048 — The sliding tile puzzle. Each tile value has a unique neon glow. Smooth merge animations. The most "one more try" game ever made, now with global competition.
  • NEON PAIRS — Memory/Concentration. 3D card flip animations, three difficulty sizes, star ratings. Deceptively simple, brutally honest about your memory.
  • NEON ASTEROIDS — The vector space shooter. Momentum physics, screen wrap, asteroid splitting. The original game was literally neon lines on a vector display — we are staying true to the source.
  • NEON FROGGER — The road-crossing puzzle. Five lanes of traffic, five lanes of river, five home slots. Timer pressure, round escalation, and the eternal question: why did the frog cross the road?

Why a Fourth Category?

NEON ARCADE started with three categories: action games (NEON ARCADE), brain puzzles (NEON MIND), and skill challenges (NEON GRIND). Adding a fourth — NEON CLASSIC — was a deliberate choice.

These games do not fit neatly into "action" or "puzzle." They are their own thing: universally recognized game designs that everyone already knows how to play. A new visitor sees "NEON INVADERS" and instantly knows what it is. Zero onboarding. Zero explanation needed. That is the superpower of classics — the marketing is built into the name.

It also means NEON CLASSIC has a different discovery pattern. People do not search for "neon browser game" when they want to play Tetris. They search for "Tetris online" or "free Breakout game." The SEO game changes completely when you are rebuilding games that have Wikipedia pages.

The Thesis

The best game designs are finished. They do not need new mechanics, progression systems, or battle passes. They need new surfaces: better visuals, modern platforms, social sharing, global competition. Give a timeless game a fresh coat of neon paint and get out of its way.

Seven games launched today. Seven games that have collectively been played by billions of people over nearly five decades. The difference now: you can play them in a browser tab, compete globally, and share your scores with a single tap. The classics are back. They just glow a little brighter.

Article 07 — March 9, 2026

Quality Gates

At 64 HTML files and counting, NEON ARCADE passed a threshold where "just open it in a browser and click around" stopped being a testing strategy. A single stray curly brace in a thousand-line game file can silently break everything — and you will not notice until a player reports it. So this week we built the thing every growing project eventually needs: automated quality gates.

What Broke (and How We Found It)

The trigger was simple. We wrote a Node.js validator that extracts every inline <script> block from every HTML file and runs it through V8's syntax checker. On the first run, it found a real bug: Pac-Man Amnesia had a stray } on line 176 that prematurely closed its IIFE wrapper. The game still "worked" in browsers because the variables fell to global scope, but the code was technically broken and would fail under strict mode or any future refactoring.

One bug in 64 files sounds good. But it was a bug that had shipped to production, was invisible to manual testing, and would have been caught instantly by a syntax checker. That is the argument for automation in one sentence.

The Pre-Commit Pipeline

We use the pre-commit framework (the Python one, not husky). Three hooks run on every commit:

  • HTML/JS validator — For every staged .html file: checks HTML structure (matching tags, DOCTYPE), extracts inline scripts and validates JS syntax via vm.Script, checks required patterns (neon.js inclusion, Neon.init call, viewport meta, SEO tags), and flags dangerous patterns (innerHTML, eval, @import). Errors block the commit; warnings pass through.
  • Worker syntax check — node --check on src/worker.js if it changed.
  • neon.js syntax check — node --check on public/neon.js if it changed.

The validator runs in under 2 seconds across all 64 files. Fast enough that you forget it is there — which is exactly how pre-commit hooks should feel.

The Second Security Audit

We also ran a fresh security audit, comparing against the 15-issue report from March 7. The results were satisfying:

0 Critical Issues · 15 Issues Fixed · 64 Files Scanned · 3 Auditors (AI)

All four critical issues from the original audit are resolved: score caps enforce per-game maximums, rate limiting blocks abuse at 30 POST requests per minute per IP, leaderboard sort mode is server-enforced (client cannot inject), and the game slug allowlist prevents KV namespace pollution.

New hardening this round:

  • Origin validation on POST — The worker now rejects POST requests from unknown origins, preventing cross-site counter manipulation. Previously, any website could trigger play/like increments by sending a fetch request.
  • CORS tightened — Non-matching origins no longer receive Access-Control-Allow-Origin in the response. Clean behavior instead of a misleading fallback.
  • localStorage guards — Every JSON.parse(localStorage.getItem(...)) across the codebase is now wrapped in try/catch. If a player's browser storage gets corrupted, the game gracefully resets instead of crashing.
  • Dead code removal — Removed an unused esc() function from the admin dashboard and a dead mode parameter that the client was still sending to the API (the server already ignored it, but dead params confuse future developers).
  • Zero innerHTML — The last three innerHTML = '' usages (safe but inconsistent) were replaced with textContent = ''. The codebase now has zero innerHTML assignments anywhere.
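The localStorage guard from the list above follows one pattern everywhere. A sketch — the helper name is hypothetical, and taking `storage` as a parameter (anything with the getItem/removeItem shape) keeps it testable outside a browser:

```javascript
// Guarded localStorage read: corrupted JSON resets instead of crashing.
function safeLoadScores(storage, key) {
  try {
    const parsed = JSON.parse(storage.getItem(key) || '[]');
    return Array.isArray(parsed) ? parsed : [];
  } catch (err) {
    storage.removeItem(key); // corrupted entry: reset gracefully
    return [];
  }
}
```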

What Remains

One issue that three independent AI auditors all flagged: admin pages have no authentication. Anyone who knows /admin/stats.html can view all game analytics and leaderboard data. The fix is straightforward — Cloudflare Access or a bearer token check in the worker — but it is the last meaningful security gap.

Everything else is architectural: CSP needs 'unsafe-inline' because the games use inline scripts (single-file architecture), and KV's read-modify-write pattern has inherent race conditions that only matter at much higher traffic than we see today.

The Lesson

Testing infrastructure pays for itself the first time it catches something. Our validator found a production bug on its first run. The security audit closed 15 issues. The total time to set up both: less than an hour. The math is obvious — yet most side projects skip it because "we will add tests later." Later never comes. Add the hooks on day one.

Article 08 — March 9, 2026

The Self-Audit

At 54 games across four categories, NEON ARCADE hit the point where no single person — human or AI — can hold the entire project in their head. Every new game adds another set of meta tags to maintain, another screenshot to capture, another hub page card to update, another entry in the worker registry. The surface area for drift is enormous.

So we did something that sounds circular but turned out to be one of the most productive hours of the entire project: we asked the AI to audit its own work.

Five Agents, One Mission

Instead of one monolithic audit, we launched five specialized agents in parallel, each focused on a different dimension:

5 Parallel Agents · 54 Games Scanned · 15 Issues Found · 15 Issues Fixed

  • SEO Agent — Checked every page for meta tags, Open Graph, Twitter Cards, structured data, canonical URLs, and screenshot file existence. Found 3 games missing og:locale and twitter:url, all 7 neonclassic games referencing wrong screenshot paths, and 15 games without screenshots entirely.
  • Design Agent — Verified color palette adherence, typography, WCAG contrast, scanline overlays, screen shake, and particle effects across all games. Flagged 4 games with off-palette colors: Afterglow used yellow (#ffff00) instead of gold, Palette had custom colors that deviated from the brand, Flappy Rewind used the wrong orange, and Truecolor had a pure black background.
  • Neon.js Agent — Audited score integration, leaderboard rendering, share buttons, sound effects, storage key conventions, and the neon.js library itself. Found 100% coverage across all features, with only 3 games using a slightly inconsistent localStorage key pattern.
  • Hub & Navigation Agent — Cross-referenced every game file against hub page cards, game counts, screenshot files, and cross-linking. Found the main hub claimed "45+ games" when we had 54, NEON ARCADE said "25 GAMES" with 28 actual, and NEON GRIND said "4 GAMES" with 7 actual.
  • Code Quality Agent — Scanned for security issues (innerHTML, eval, XSS), performance patterns (requestAnimationFrame, HiDPI), mobile support (touch events, viewport), and the Cloudflare Worker backend. Gave the codebase 9/10 overall. The one gap: 33 canvas games lacked devicePixelRatio scaling.

What We Found

The results split cleanly into two categories: things that were wrong and things that were drifting.

Wrong things are bugs. The game counts were incorrect — we had added games without updating the hub. The neonclassic screenshot paths pointed to a subdirectory that did not exist. Three games were missing SEO tags that every other game had. These are the kind of issues that a checklist prevents, but only if the checklist gets followed.

Drifting things are subtler. Four games had crept away from the standard color palette. Not broken, but inconsistent. Afterglow used bright yellow because it "fit the theme" — but at the brand level, gold (#ffd700) is the correct warm accent. Palette used custom CSS variable names that looked right but were slightly different hex values. These are the issues that accumulate silently until your design system looks like it was built by committee.

The Fix

Everything found was fixed in two passes:

Pass 1 — The Obvious Stuff

  • Game counts — Updated main hub (45+→54), NEON ARCADE (25→28), NEON GRIND (4→7). Added 6 missing games to the CATALOG array.
  • Screenshot paths — Standardized all screenshot references to category subdirs ({category}/screenshots/). The old root /screenshots/ folder had duplicates of everything.
  • Missing screenshots — Automated Playwright captures at 1280×800 for all 15 missing games.
  • SEO tags — Added og:locale and twitter:url to tango, chimp, and clickspeed.
  • Admin pages — Added noindex, nofollow to prevent search engines from indexing dashboards.
  • Color palette — Corrected afterglow, palette, flappy-rewind, and truecolor to use standard neon colors.

Pass 2 — The Deep Stuff

  • Visual effects — Added missing particles, screen shake, and scanline overlays to 15 games. Every game now has the full neon effects set.
  • HiDPI canvas — Added devicePixelRatio scaling to 5 canvas games that were rendering blurry on Retina displays.
  • ESC to quit — Every game (all 54) now lets you press ESC to bail out to the start screen without saving score. Universal rage-quit support.
  • Sound effects — Added Web Audio oscillator sounds to neon-snake (the OG game, somehow the only one missing audio).
  • Touch support — Added touch-action: none to 10 games that were letting mobile browsers hijack touch events.
  • Performance — Replaced setInterval game loops with requestAnimationFrame in 4 games for smoother rendering and less battery drain.
  • Accessibility — Added <main> landmarks to 3 hub pages for screen reader navigation.
  • Sitemap — Was missing 15 games, the entire NEON CLASSIC category, and the blog/updates pages. Now has all 61 URLs.
  • llms.txt — Updated from "35+ games" to full listing of all 54 games with descriptions.
  • Worker registry — Removed phantom tetris-sprint entry (game file never existed).
  • Chromaself leaderboard — Added missing Neon.render() call so players can see scores.
  • CLAUDE.md — Updated with ESC requirement, correct screenshot paths, mandatory effects, and stricter new-game checklist.

Why Self-Auditing Works

The interesting insight is not that we found bugs — every project has bugs. It is that the AI found its own bugs, in its own code, by reading its own rules.

The CLAUDE.md file is both the specification and the enforcement mechanism. When the audit agents checked "does every game use #0a0a12 as the background?" they were reading a rule that the build agents were also supposed to follow. The drift happened because the build agents were focused on making games work, not on cross-referencing every hex value against the style guide.

This is exactly the same failure mode as human teams. Developers read the style guide when they start, then rely on memory as they get deeper into the work. Colors get approximated. Tags get skipped. Counts get stale. The fix is the same for both AI and human teams: periodic automated audits with explicit checklists.
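
A checklist like this is easy to mechanize. As a minimal sketch (the rule list and function names here are hypothetical, not the project's actual audit code), an agent-style check can be a pure function from page source to a list of violations:

```javascript
// Hypothetical audit rules: each rule names itself and tests the raw HTML.
// The required substrings mirror the checks described above (SEO tags,
// the standard background color); the exact rule list is illustrative.
const RULES = [
  { name: "og:locale tag", test: (html) => html.includes('property="og:locale"') },
  { name: "twitter:url tag", test: (html) => html.includes('name="twitter:url"') },
  { name: "standard background", test: (html) => html.includes("#0a0a12") },
];

function auditPage(html) {
  // Return the name of every rule the page violates.
  return RULES.filter((rule) => !rule.test(html)).map((rule) => rule.name);
}
```

Run that over every game file and aggregate, and you have the SEO agent's report in miniature.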

The Lesson

Build agents are optimized for creation. Audit agents are optimized for verification. They are different cognitive modes, and they are complementary. The most reliable system is one where the builder and the checker are separate processes — even when they are the same underlying model. Never skip the review just because the author is "smart enough." Intelligence and thoroughness are different axes.

The Result

One audit with five parallel agents, two fix passes, 54 games touched. Every issue found was fixed in the same session. The audit turned unknown unknowns into a prioritized backlog, and then cleared the entire backlog. Zero items deferred. That is the power of automated auditing: when finding and fixing have the same cost, there is no reason to postpone anything.

Article 09 — March 9, 2026

Battleship: Our First Online Multiplayer Game

We added Battleship — the classic naval strategy game — with a twist: real online 2-player mode. For the first time in NEON ARCADE, you can play against someone on a completely different device.

How It Works

The flow is dead simple. Create a room → get a 4-letter code → share it with a friend → they join → you battle. No accounts, no friend lists, no matchmaking queues. Just a code and a browser.

Both players place their 5-ship fleet on a 10×10 grid, then take turns firing shots. You see your own board (with your ships and incoming hits) and the enemy board (a fog of war that reveals hits and misses as you fire). First to sink all 5 enemy ships wins.

The Technical Approach

Online multiplayer sounds like it requires WebSockets, game servers, and a monthly hosting bill. We did it with none of those. The entire multiplayer system runs through our existing Cloudflare Worker backend using simple HTTP polling.

Here is the architecture: when a player creates a room, the Worker generates a random 4-letter code and stores the room state in KV. Both players poll the Worker every 800ms asking "is it my turn? what happened?" The Worker relays messages between players — fire coordinates, hit/miss results, ship sunk notifications — but never makes game decisions. Each client validates moves locally.

Room codes auto-expire after 10 minutes of inactivity. No zombie rooms, no cleanup cron jobs, no database maintenance. KV's built-in TTL handles garbage collection for free.
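
In sketch form (hypothetical function names; the real backend stores rooms in Cloudflare KV, whose put() accepts an expirationTtl so expired rooms vanish without any cleanup job), room creation is just a random code plus a stored record:

```javascript
const LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";

// Mint a random 4-letter room code.
function makeRoomCode() {
  let code = "";
  for (let i = 0; i < 4; i++) {
    code += LETTERS[Math.floor(Math.random() * LETTERS.length)];
  }
  return code;
}

// A Map stands in for KV here; expiresAt emulates the 10-minute TTL.
function createRoom(store, ttlMs = 10 * 60 * 1000) {
  const code = makeRoomCode();
  store.set(code, { messages: [], expiresAt: Date.now() + ttlMs });
  return code;
}
```

The clients then poll an endpoint keyed by that code every 800ms and append their moves to the message list.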

The AI Opponent

Not everyone has a friend online at the same time, so the single-player AI needed to be good. It uses a hunt-and-target algorithm with two modes:

  • Hunt mode — Fires on a checkerboard pattern. Why? Every ship is at least 2 cells long, so firing only at cells of one parity still guarantees a hit on every possible ship placement while covering just half the board. It is the most efficient blind search pattern.
  • Target mode — When a shot hits, the AI switches to probing adjacent cells. If it gets a second hit, it follows the line (horizontal or vertical) until it sinks the ship, then returns to hunt mode.
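
The hunt-mode half can be sketched in a few lines (hypothetical helper names, not the game's actual code):

```javascript
// Hunt mode: fire only at cells of one parity. Since every ship spans
// at least 2 cells, any placement must cover at least one such cell.
function checkerboardCells(size) {
  const cells = [];
  for (let r = 0; r < size; r++) {
    for (let c = 0; c < size; c++) {
      if ((r + c) % 2 === 0) cells.push([r, c]);
    }
  }
  return cells;
}

// Pick the first untried parity cell; null means the pattern is exhausted.
function nextHuntShot(size, alreadyFired) {
  const fired = new Set(alreadyFired.map(([r, c]) => r + "," + c));
  return checkerboardCells(size).find(([r, c]) => !fired.has(r + "," + c)) || null;
}
```

On a 10×10 board that is 50 candidate cells instead of 100, which is exactly the "half the total shots" guarantee.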

The result is an AI that feels smart without being unfair. It does not cheat — it never peeks at your board. It just uses the same logical deduction a skilled human player would.

Privacy by Design

Ship positions never leave your browser. Only fire coordinates and hit/miss results are transmitted. The Worker relays messages between players but never sees your fleet layout. Even if someone intercepted the network traffic, they would only see where shots landed, not where ships are. Your fleet is your secret.

A Generic Room System

This multiplayer infrastructure is not just for Battleship. It is a generic room-based relay system that any future game can use. The Worker does not know or care what game is being played — it just passes messages between two players in a room. Tetris Battle, Puzzle Races, Word Duels, or any game where players need to exchange moves could plug into the same system.

The room API is simple: create (returns a code), join (joins with a code), send (posts a message), poll (reads new messages). Four endpoints, no game-specific logic on the server.

What's Next

Battleship is game 55, and the first with cross-device multiplayer. But the real milestone is the infrastructure. We are working on engagement features like challenge links (share a URL that drops someone straight into a game against you), daily modes with streaks, and percentile rankings in share text. The multiplayer room system opens up an entire category of games we could not build before.

The Takeaway

You do not need WebSockets or a game server to build online multiplayer. A Cloudflare Worker, KV storage, and 800ms polling intervals give you a surprisingly smooth turn-based multiplayer experience — for $0/month. Sometimes the simplest architecture is the best one.

Article 10 — March 13, 2026

Multiplayer Without Servers

NEON ARCADE hit 62 games. Most of them are solo. You play, you score, you share. But games become something different when another person is involved. The tension of a head-to-head match, the satisfaction of beating a friend's time, the drama of a close finish — those are the moments that make people actually send a link.

So we asked: how many of our games can become multiplayer without adding servers, accounts, or matchmaking? The answer turned out to be 10 — and we built three different multiplayer patterns to make it work.

10
Multiplayer Games
3
Multiplayer Modes
$0
Server Cost for Challenge Links
0
Accounts Required

The Three Modes

Not all multiplayer is the same. A fighting game and a crossword puzzle need completely different approaches. We settled on three modes that cover most cases without overcomplicating things:

  • VS AI — A computer opponent good enough to make you think. Battleship, Mine Finder, and Tic Tac Toe all have AI opponents with proper strategy, not random moves.
  • 2P Local — Same device, pass and play. Two humans, one screen, taking turns. Pairs, Tic Tac Toe, Veto, and Mimic all support this. No network needed.
  • Online (Challenge Links) — Send a URL. Your friend gets the exact same puzzle. Compare results. No room codes, no server relay, no waiting. Battleship is the only game using real-time room codes — everything else uses challenge links encoded in the URL.

Every game on the site now has mode tags visible on its card — colored badges showing SOLO, VS AI, 2P LOCAL, or ONLINE. A new multiplayer landing page groups all 10 games with filter tabs.

Challenge Links: The Zero-Server Multiplayer

Challenge links are the most interesting pattern because they require zero server infrastructure. The entire multiplayer state is encoded in the URL itself. When you complete a puzzle in Connections or Minesweeper, you get a "CHALLENGE FRIEND" button. It generates a link like:

neonarcade.net/neonmind/connections.html#c=17
neonarcade.net/neonmind/minesweeper.html#c=839274516-3

The hash fragment encodes which puzzle to load. For Connections, it is a puzzle index. For Minesweeper, it is a seed + difficulty level. When your friend opens the link, JavaScript reads the hash, and they get the exact same challenge. No server involved — the URL is the entire protocol.
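
Reading that protocol back out is a one-regex job. A sketch (the function name is ours; the format follows the Minesweeper example above, seed optionally followed by a difficulty):

```javascript
// Parse a challenge hash like "#c=839274516-3" into { seed, difficulty }.
// Returns null for anything that is not a challenge link.
function parseChallenge(hash) {
  const m = /^#c=(\d+)(?:-(\d+))?$/.exec(hash);
  if (!m) return null;
  return {
    seed: Number(m[1]),
    difficulty: m[2] !== undefined ? Number(m[2]) : null,
  };
}
```

On page load the game calls something like parseChallenge(location.hash); a null result means normal solo play.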

This means challenge links are:

  • Free — no API calls, no bandwidth, no KV storage
  • Instant — no waiting for the other player
  • Permanent — the link works forever, not just during a session
  • Shareable anywhere — text messages, Discord, Slack, social media, email

The tradeoff? Both players play independently. There is no live connection, no real-time view of the other player. You share a link, they play, they share back. It is asynchronous by nature. But for puzzles like Minesweeper and Connections, that is exactly the right model. You do not need to watch someone solve a Sudoku in real time. You just want to know if they were faster than you.

Seeded PRNG: Making Randomness Reproducible

Minesweeper posed a specific problem. Its boards are randomly generated — so how do you give two players the exact same random board? The answer is a seeded pseudo-random number generator.

We used mulberry32, a fast 32-bit PRNG. Given the same seed, it produces the exact same sequence of "random" numbers every time. The board generation uses this instead of Math.random():

function mulberry32(a) {
  return function () {
    a |= 0;
    a = (a + 0x6D2B79F5) | 0;
    var t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

When a player finishes a Minesweeper game, we save the seed. When they challenge a friend, the seed and board size go into the URL hash. The friend's browser seeds the PRNG with the same value, runs the same Fisher-Yates shuffle, and gets the same mine placement. Same board, different player, fair competition.
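
Pairing mulberry32 with a Fisher-Yates shuffle is what makes the boards reproducible. A minimal sketch (mulberry32 as above; the shuffle helper name is ours):

```javascript
function mulberry32(a) {
  return function () {
    a |= 0;
    a = (a + 0x6D2B79F5) | 0;
    var t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Fisher-Yates driven by the seeded PRNG instead of Math.random():
// the same seed always yields the same permutation.
function seededShuffle(items, seed) {
  const rand = mulberry32(seed);
  const a = items.slice();
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(rand() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}
```

Shuffle the cell indices, drop mines into the first N slots, and two browsers with the same seed build the same board.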

First-click safety is preserved too. Normally, if the first click lands on a mine, the mine swaps to a safe cell, which would make the final board depend on where each player clicked first. Challenge mode sidesteps this by starting from a pre-revealed board, so the swap has already been applied and both players get an identical layout.

Same-Device 2P: The Couch Multiplayer

Sometimes multiplayer means two people staring at the same screen. For Neon Pairs (the memory card game), we added a VS Friend mode that turns it into a competitive card-matching duel.

The rules are simple: players take turns flipping two cards. If they match, you score a point and go again. If they do not match, the cards flip back and it is the other player's turn. The person with the most matches wins.

The UI makes it obvious whose turn it is. Player 1 glows cyan. Player 2 glows pink. A brief transition overlay announces each turn change. The top bar shows both scores updating live. It is minimal UI on top of the existing solo game — same cards, same animations, same sounds, just with turn tracking layered on.

Tic Tac Toe already had this mode from launch (3×3 or 4×4 grids, two humans on one screen). Veto and Mimic also support same-device play alongside their online challenge modes.

The Tagging System

With 62 games and 10 supporting multiplayer, players need a way to find the right ones. We added a tagging system: every game in the catalog now carries a modes array with values like ['1p', 'ai', 'local', 'online']. Game cards render these as colored badges:

  • SOLO — grey, every game has this
  • VS AI — purple, for games with computer opponents
  • 2P LOCAL — green, same-device pass-and-play
  • ONLINE — cyan, challenge links or room codes

The badges only show when a game has more than one mode. Solo-only games (52 of them) get no tags — keeping the cards clean. The dedicated multiplayer hub shows all 10 games with filter tabs for each mode.
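
The whole tagging system reduces to array operations over the catalog. A sketch with illustrative data (not the real CATALOG entries):

```javascript
// Each catalog entry carries a modes array, as described above.
const catalog = [
  { name: "Battleship", modes: ["1p", "ai", "online"] },
  { name: "Neon Pairs", modes: ["1p", "local"] },
  { name: "Neon Snake", modes: ["1p"] },
];

// Filter tabs on the multiplayer hub are just mode membership tests.
function gamesWithMode(games, mode) {
  return games.filter((g) => g.modes.includes(mode));
}

// Badges only render when a game has more than one mode,
// so solo-only cards stay clean.
function showBadges(game) {
  return game.modes.length > 1;
}
```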

Why Not WebSockets?

Battleship uses real-time room codes via the Cloudflare Worker relay (see Article 09). That works because Battleship needs live turn-by-turn communication — you fire, they react, you see the result. It is genuinely synchronous.

But most puzzle games are not. Minesweeper does not need the other player present. Connections does not either. They are solo experiences that become competitive when you compare results. For these, challenge links are strictly better than real-time connections:

  • No timing coordination (play whenever you want)
  • No server load (the link contains all state)
  • No connection drops or lobby timeouts
  • Works in group chats (send one link, everyone plays)

The lesson: multiplayer does not always mean real-time. For most competitive puzzle experiences, "play the same thing and compare" is more fun, less friction, and infinitely cheaper to build.

What's Next

We are eyeing more games for multiplayer upgrades. Sudoku and KenKen could get challenge links (same puzzle, race the clock). TypeBlitz could support same-device speed duels. Reflex Chain could compare reaction times via shared seeds. The patterns are established — adding multiplayer to a new game now takes hours, not days.

The multiplayer hub also sets up future possibilities: tournament brackets, daily challenge leaderboards, and "play what your friends played" social features. All without adding a single WebSocket connection.

The Takeaway

Most multiplayer does not need servers. A seeded PRNG and a URL hash give you reproducible shared experiences for free. Real-time connections are for real-time games — everything else can be a link. The best multiplayer architecture is the one where the URL is the entire protocol.

Article 11 — March 13, 2026

NEON QUEST — Why We Built Games You Can’t Finish in 60 Seconds

NEON ARCADE started as a collection of 30-second viral hits. Open a link, play, share your score, move on. That formula works — 62 games later, it clearly works. But something was missing.

Players kept asking: “Do you have anything I can play on my commute?”

A 30-second game does not fill a 45-minute train ride. It fills the first 30 seconds. So we built NEON QUEST — a fifth category with 5 games designed for 5-60 minute sessions. Games you come back to. Games with progression, save states, and the kind of depth that keeps you thinking about strategy when you are not playing.

5
New Deep Games
67
Total Games
5
Categories
60m
Max Session Length

The Design Tension

Every rule in CLAUDE.md says “sessions under 60 seconds.” NEON QUEST throws that out. But the deeper principles still apply: learnable in 30 seconds (tutorial pacing, not upfront rules), a shareable moment in the first session, and the “one more try” hook that makes you restart instead of close the tab.

The key adaptation was save/resume. You cannot ask someone on a train to finish a 25-minute tower defense run in one sitting. Games with sessions over 15 minutes auto-save between natural break points. Close the tab, come back tomorrow, pick up where you left off. It is the same localStorage we already use for scores — just saving more state.
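
A save/resume wrapper can stay tiny. As a sketch (hypothetical names; the storage argument is anything with getItem/setItem, which is localStorage in the browser):

```javascript
// Persist a versioned snapshot of game state as JSON.
function saveGame(storage, key, state) {
  storage.setItem(key, JSON.stringify({ v: 1, t: Date.now(), state }));
}

// Load a snapshot; any failure means "start fresh" rather than crash.
function loadGame(storage, key) {
  const raw = storage.getItem(key);
  if (!raw) return null;
  try {
    const parsed = JSON.parse(raw);
    return parsed.v === 1 ? parsed.state : null; // ignore stale formats
  } catch {
    return null; // corrupted save
  }
}
```

The version field is the important part: when a game's state shape changes, old saves are silently discarded instead of loading garbage mid-run.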

The Five Games

FUSION — Suika Game meets the periodic table. Drop atoms into a reactor. When two identical atoms touch, they fuse into the next element. Physics-based chain reactions are the hook — watching a cascade of fusions ripple through a packed reactor is the screenshot moment. The hardest engineering problem here was stable circle-circle collision resolution. We used Verlet integration with iterative constraint solving. No physics library — just inline JavaScript computing forces 60 times per second.
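
The heart of that constraint solver is a single relaxation step, repeated over every touching pair several times per frame. A sketch (hypothetical names, not FUSION's actual code):

```javascript
// One relaxation step of circle-circle overlap resolution:
// push each circle half the overlap apart along the contact normal.
function resolveOverlap(a, b) {
  // a and b are { x, y, r }.
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  const dist = Math.hypot(dx, dy) || 1e-9; // avoid divide-by-zero on exact overlap
  const overlap = a.r + b.r - dist;
  if (overlap <= 0) return; // already separated
  const nx = dx / dist;
  const ny = dy / dist;
  a.x -= (nx * overlap) / 2;
  a.y -= (ny * overlap) / 2;
  b.x += (nx * overlap) / 2;
  b.y += (ny * overlap) / 2;
}
```

With Verlet integration, moving positions directly like this implicitly adjusts velocities too, which is why the iterative approach stays stable in a packed reactor.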

JACKPOT — A roguelike deckbuilder where your deck is a slot machine. Spin 3 reels, match symbols to deal damage, buy new symbols between fights, bend probability with weights and relics. 15 floors, 5 bosses, each with a mechanic that breaks your strategy. The Ransomware boss encrypts one of your reels. The Zero-Day boss is immune to your most common symbol. Every run is 10-15 minutes and ends with “if I had just bought that Lightning symbol on floor 4…”

FIREWALL — Tower defense on a circuit board. Place security programs on nodes, watch enemies stream along glowing circuit traces. The twist: every 5 waves, the board evolves. You choose from 3 circuit expansions that add new paths and new tower slots, reshaping the battlefield. Overclocking lets any tower fire at 2x speed — but 3 overclocks without cooling and it burns out permanently. Save/resume between waves.

SWARM — Vampire Survivors, but you control a swarm of drones instead of one character. The core mechanic is split/merge. Right-click to leave a splinter group guarding a position. Left-click to recall everything. More drones together means stronger attacks (cohesion bonus), but splitting covers more ground. 15-minute survival runs with bosses at 5, 10, and 15 minutes. Performance was the engineering challenge — 50 drones, 200 enemies, and hundreds of projectiles at 60fps required spatial hashing and object pooling.

SYNTHESIS — Infinite Craft in a neon lab. Start with 4 base elements. Drag two items onto the workbench to discover new substances. 200+ items across 3 tiers, organized in 5 themed branches (Chemistry, Electronics, Energy, Biology, Space). Discovering specific items unlocks new equipment: a Furnace enables heat combinations, a Centrifuge splits items back into components for reverse-engineering hints. Full save/resume — this is the game you play for weeks, discovering a few new items each session.

Still Single-File HTML

Every NEON QUEST game is a single .html file. No build step, no framework, no external dependencies beyond Google Fonts and neon.js. SYNTHESIS is the largest at around 200KB — the recipe database is a JSON object embedded directly in the source. SWARM is the most technically demanding, running a full entity-component system with spatial hashing inside an inline <script> tag.

This is either impressive discipline or questionable judgment. We choose to interpret it as both.

The Category System

NEON QUEST is the fifth category, joining NEON ARCADE (action), NEON MIND (puzzles), NEON GRIND (skill), and NEON CLASSIC (retro). Each has its own hub page, accent color, and design philosophy. NEON QUEST uses orange — warm, distinct, and fitting for the “deeper/longer” vibe. The main hub page now shows all five categories with 3 random games each.

What We Learned

Building games with 15-minute sessions is fundamentally different from building 30-second viral hits. Three things surprised us:

  • Balancing is everything. JACKPOT needed dozens of tuning passes. If swords are too strong, nobody buys other symbols. If bosses are too hard on floor 6, nobody sees floor 7. In a 30-second game, balance barely matters. In a 15-minute roguelike, it is the game.
  • Save state is harder than gameplay. Serializing a FIREWALL board with 12 towers, 3 upgrade levels each, active cooldowns, and mid-wave enemy positions into localStorage is more complex than the pathfinding algorithm.
  • The “one more try” hook works differently. In NEON SNAKE, you die and restart in 1 second. In JACKPOT, you die and spend 5 seconds thinking about what you would do differently. That reflection time is what creates the “I could do better” feeling that drives retention.

The Takeaway

Viral games and deep games need different design instincts, but the same core principle: the player should always know what they would do differently next time. In a 30-second game, that is “tap faster.” In a 15-minute roguelike, that is “buy the Lightning symbol instead of the Shield next run.” Both drive the same behavior — hitting play again.

Article 12 — March 14, 2026

Building a Casino With Fake Money and Real Architecture

NEON ARCADE is 72 games now. Most are quick hits — 30 seconds, one mechanic, done. But we kept hearing the same request: "Make something I can sit with." Games where you settle in, build something up, come back later. Casino games are perfect for that. They have natural session breaks (between hands, between spins), persistent progression (your bankroll), and a dopamine loop that is older than video games themselves.

So we built NEON CASINO — five games, one shared bankroll, and the sixth category on the site.

5
Casino Games
1
Shared Bankroll
6
Poker Seats
$0
Real Money

The Shared Bankroll Problem

Every other game on the site is self-contained. Your Tetris score has nothing to do with your Sudoku time. Casino games are different. You want to win chips at Blackjack, blow them at Roulette, and grind them back on Slots — all from the same wallet. Five HTML files needed to share state.

The solution is localStorage. All five games read and write to the same key: neoncasino_bankroll. The value is a JSON object with two fields: the chip count and a hash. Open Blackjack, win 500 chips, close it, open Slots — your 1,500 chips are right there.

But localStorage is client-side. Anyone with a browser console can type localStorage.setItem and give themselves a million chips. Since the bankroll feeds into the global leaderboard, we needed some integrity checking.

FNV-1a: Hashing for Honesty

We chose FNV-1a — a fast, non-cryptographic hash. The bankroll is stored as:

{ "chips": 1500, "hash": "a7f3k2" }

The hash is computed from the chip count plus a salt. When any game loads, it reads the bankroll, recomputes the hash, and checks if they match. If someone edits the chip count without knowing the salt, the hash won't verify and the bankroll resets to 1,000.

Is this bulletproof? No. The salt is in the source code — a determined player can find it. But it stops casual tampering, which is all we need for a free game with no real money. The goal is not cryptographic security. It is keeping the leaderboard honest enough that high scores feel earned.
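
The scheme fits in a dozen lines. A sketch (FNV-1a itself is the standard algorithm; the salt value and the sign/verify names here are ours):

```javascript
// FNV-1a (32-bit) over a string, returned in base-36 to keep it short.
function fnv1a(str) {
  let h = 0x811c9dc5; // FNV offset basis
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0; // FNV prime, kept in uint32 range
  }
  return h.toString(36);
}

const SALT = "neon"; // hypothetical; the real salt lives in the source

function signBankroll(chips) {
  return { chips, hash: fnv1a(chips + ":" + SALT) };
}

function verifyBankroll(bankroll) {
  return !!bankroll && fnv1a(bankroll.chips + ":" + SALT) === bankroll.hash;
}
```

On load, each game runs verifyBankroll; a failed check resets the wallet to 1,000 chips.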

Five Games, Five Challenges

Blackjack was first. Standard rules: 6-deck shoe, hit/stand/double/split, dealer stands on soft 17, blackjack pays 3:2. The interesting part is the card rendering — DOM elements styled to look like neon playing cards, with cyan for hearts and diamonds, pink for spades and clubs. CSS 3D transforms handle the flip animation when the dealer reveals.

Roulette needed a wheel. A real, spinning, ball-bouncing wheel. This meant canvas rendering with HiDPI support. The wheel has 37 pockets (European single-zero), each drawn as an arc segment in alternating red and black with green for zero. The ball follows a decelerating arc with bounce physics when it lands. Sound effects tick faster as the ball slows down. The betting table supports all standard bet types — straight, split, street, corner, line, dozen, column, and even-money positions.

Slots is the most visually complex. Five reels, three rows, twenty paylines. Seven neon vector symbols drawn on canvas — no emoji, no images. Each reel spins independently with staggered stops (left to right, 0.3 seconds apart) and a spring-back bounce on landing. Diamond wilds, lightning scatters, and a free spins bonus round with retriggering. The payline visualization traces glowing lines across the grid when you win.

Video Poker (Jacks or Better) is the cleanest design. Five cards, hold what you want, draw the rest. A visible paytable updates to highlight your current hand. The double-or-nothing feature after each win is the hook — do you take your 40-chip win or risk it for 80?

Poker is the ambitious one. A 6-seat Texas Hold'em table with AI opponents. Three AI personalities — tight, loose, and aggressive — each evaluating hand strength, pot odds, and bluff frequency with realistic thinking delays. The table is an oval with CSS-positioned seats, a community card area, and a pot display. And it is the only casino game with multiplayer: create a room, share the code, play with up to 5 friends.

Scaling the Room System

NEON ARCADE already had a 2-player room system for Battleship (see Article 09). But Poker needs 6 seats. The existing endpoints assumed exactly two players — creator and joiner, with a simple token pair.

We built a new set of endpoints under /api/mroom/* (multi-player room). The room data structure uses numbered seats:

{
  "seats": {
    "0": { "token": "abc...", "name": "ACE" },
    "1": { "token": "def...", "name": "PIX" }
  },
  "messages": [],
  "state": "waiting",
  "maxSeats": 6
}

Players join by taking the next available seat. The poll endpoint returns messages from other players filtered by timestamp, plus seat info (names only, never tokens). The room lives in Cloudflare KV with a 10-minute TTL — if everyone leaves, it cleans itself up.
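
The seat-claiming logic is the only part that differs from the 2-player rooms. A sketch (hypothetical names; the real logic lives in the /api/mroom/* Worker endpoints):

```javascript
// Join by taking the lowest free numbered seat. Returns the seat key,
// or null when the room is full.
function joinRoom(room, token, name) {
  for (let i = 0; i < room.maxSeats; i++) {
    const seat = String(i);
    if (!room.seats[seat]) {
      room.seats[seat] = { token, name };
      return seat;
    }
  }
  return null;
}
```

Because seats are numbered rather than paired, the same structure works for 2 players or 6; maxSeats is the only knob.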

In solo mode, the table renders all 6 seats. You sit at seat 0. Three AI opponents take seats 1, 3, and 4. Seats 2 and 5 show empty "JOIN" placeholders — a visual hint that multiplayer exists without cluttering the table.

The Daily Bonus Safety Net

Casino games have a problem that arcade games do not: you can go to zero. In Tetris, losing means starting over with the same nothing you started with. In Blackjack, losing means your bankroll is empty and you literally cannot play anymore.

The daily bonus handles this. If your bankroll drops below 100 chips, you get a one-time +500 bonus once per day. It is enough to play a few more hands without feeling punished. The threshold is low enough that it only triggers when you genuinely need it — not when you are doing fine at 890 chips. (We caught that bug in testing. The original code showed the bonus modal regardless of bankroll. Oops.)
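
The fixed gate is two conditions joined by AND; the bug was checking only the date and never the chip count. As a sketch (hypothetical names):

```javascript
// Offer the +500 bonus only when the player is genuinely broke
// AND has not already claimed it today.
function shouldOfferBonus(chips, lastClaimDay, today) {
  return chips < 100 && lastClaimDay !== today;
}
```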

There is also a nuclear option: a RESET BANKROLL button with a confirmation dialog that wipes everything back to 1,000 chips. Fresh start, no questions asked. Sometimes you just want a clean slate.

Building Five Games in Parallel

The implementation used five parallel AI agents, one per game. Each agent got the same design spec, bankroll system code, and visual style guide. They built independently, committed separately, and we merged the results.

This worked because casino games are surprisingly independent. They share a bankroll and a visual language, but the game logic is completely isolated. Blackjack's shoe shuffling has nothing to do with Roulette's ball physics. The only integration point was the shared localStorage key — and since that was defined upfront in the spec, there were no merge conflicts.

Total build time for all five games: under 15 minutes of wall clock, running in parallel. Integration (hub page, main index, sitemap, worker registration) took one more agent pass.

The Takeaway

Casino games are a different beast from quick arcade hits. They need shared state, session persistence, and enough integrity checking to keep leaderboards fair. But the same single-HTML-file architecture scales to handle it. FNV-1a hashes in localStorage, a 6-seat room system on Cloudflare KV, and canvas-rendered wheels and reels — all running without a framework, without a build step, without a database. Open the file, play some cards.