Browser-first 2D / 2.5D game engine for TheWorldTable.ai. Canvas2D primary backend, ECS, render-graph stages, Director-bridge SSE integration. No external engine reuse - built from scratch in TypeScript.
Repo: sadhaka/loom-engine.
API docs: loom-engine.pages.dev.
The design spec (LOOM-ENGINE-SPEC.md) lives in the consuming
TheWorldTable.ai repo and is the canonical source for phase
plans and architectural decisions.
npm install @sadhaka/loom-engine
Pre-alpha. ESM-only, browser-first. TypeScript types ship in the
package (dist/index.d.ts). Node 18+ for the build toolchain;
the runtime targets evergreen browsers (Canvas2D + Web Audio +
EventSource).
API reference (TypeDoc) - generated from the public surface in
src/index.ts on every push to main:
https://loom-engine.pages.dev/
Build it locally with npm run docs (writes to ./docs/).
See Docs deploy for the hosting chain and one-time activation steps (Cloudflare Pages, since GitHub Pages is unavailable on private repos for free user plans).
// 1. Install
// npm install @sadhaka/loom-engine
import {
Engine,
SpriteRenderSystem,
InputSystem,
VeilBudgetSystem,
SYSTEM_PHASE_INPUT,
SYSTEM_PHASE_RENDER,
} from '@sadhaka/loom-engine';
// 2. Attach to a canvas. Engine.create wires Canvas2DDevice, World,
// TransformPool, SpritePool, Time + Camera resources, and the
// default SpriteRenderSystem in SYSTEM_PHASE_RENDER.
const canvas = document.querySelector('canvas');
const engine = Engine.create({ canvas });
// 3. Register the systems your game needs. Order within a phase is
// deterministic; phases run INPUT -> LOGIC -> PHYSICS -> ANIMATION
// -> RENDER -> POST_RENDER per frame.
engine.world.addSystem(new InputSystem(), SYSTEM_PHASE_INPUT);
engine.world.addSystem(new VeilBudgetSystem(), SYSTEM_PHASE_INPUT);
// 4. Drive the frame loop. engine.tick advances Time, beginFrame on
// the device, world.update across all phases, endFrame.
function tick(now) {
engine.tick(now);
requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
SSEDirectorBridge and SnapshotRecoveryHelper send credentials with
their network requests by default. The default eventSourceFactory
constructs new EventSource(url, { withCredentials: true }) and
SnapshotRecoveryHelper calls fetch(url, { credentials: 'include' }).
This is the right default for the embedded TheWorldTable.ai
same-origin use case: cookies and auth headers flow with the request
to the same origin. A third-party consumer pointing the bridge at a
URL configured from user input, however, could end up sending their
own site's credentials cross-origin. The browser still requires the
target server to opt in via Access-Control-Allow-Credentials: true
plus a specific Access-Control-Allow-Origin, so this is not a
one-sided SSRF; it requires attacker control of the target server's
CORS policy.
If you do not want credentials to flow with director-bridge requests, override the seams the engine already exposes - no engine code change needed:
import {
SSEDirectorBridge,
SnapshotRecoveryHelper,
} from '@sadhaka/loom-engine';
// Credential-free SSE subscription.
// Credential-free SSE subscription.
const bridge = new SSEDirectorBridge({
  baseUrl: directorUrl,
  characterId,
  eventSourceFactory: (u) => new EventSource(u, { withCredentials: false }),
});
// Credential-free snapshot recovery.
const recovery = new SnapshotRecoveryHelper({
  baseUrl: snapshotUrl,
  characterId,
  fetchImpl: (input, init) => fetch(input, { ...init, credentials: 'omit' }),
});
The override hooks have always existed; 0.10.1 documents them. Internal security audit references are kept in the repository, not shipped with the npm package.
Pre-alpha, productized as of 0.10.0 (Phase 11B.3 - npm publish under MIT). Every phase in the table below, through 15.1, is shipped; the engine runs the public TheWorldTable.ai pre-alpha. Productization is a fund-raising and distribution decision, not a stability claim - the public API surface will evolve until 1.0.
| Phase | Status | Surface |
|---|---|---|
| 0 | shipped | scaffolding, package.json, tsconfig, PRIOR-ART log |
| 1 | shipped | Canvas2D iso renderer, camera, transform pool (SoA) |
| 2 | shipped | ECS World, system scheduler, resource registry, Engine facade, asset pipeline |
| 3 | shipped | clip-aware sprite-sheet manifests, AnimationStatePool, AnimationSystem |
| 4 | shipped | particle pool, emitter component, three-system VFX pipeline, additive blend |
| 5 | shipped | Web Audio bus mixer with VE-budget gating, unified keyboard / mouse / touch input |
| 6 | shipped | Director-bridge: SSE event-stream subscription, eventSourceFactory hook, snapshot-recovery |
| 7 | shipped | Survivor combat layer (projectile pool, hit resolution, damage application) ported onto Loom Engine |
| 8 | shipped | 2.5D ARPG hub-and-spoke per LOOM-CLASS-SYSTEM-SPEC, plaza narrator, mobile + touch input (virtual D-pad, tap-to-walk) |
| 9.1 | shipped | perf pass: alloc-churn fixes + bench harness |
| 9.3 | shipped | TypeDoc public-API site with auto-deploy |
| 11A.2 | shipped | docs hosting migrated to Cloudflare Pages |
| 11B.3 | shipped | MIT license + npm publish posture |
| 12.4 | shipped | License pivot from MIT to BUSL 1.1 with $1M revenue cap |
| 13.2 | shipped | Engine hardening: 12.6 audit lows L-08..L-12 closed |
| 14.1 | shipped | WebGL2 instanced sprite batcher |
| 15.1 | shipped | Multiplayer presence: pluggable bridge (SSE / Mock), peer pool with per-peer linear interpolation, render system (this release) |
See LOOM-ENGINE-SPEC.md Section 7 for the full phase plan with effort estimates.
Two backends ship behind the same IGraphicsDevice contract:
| Backend | Status | When to use |
|---|---|---|
| canvas2d | default, stable | Hundreds to ~2k sprites; broadest browser coverage |
| webgl2 | 0.12.0+, opt-in | Thousands of sprites; instanced batching with atlas grouping |
Existing consumers do not need to change anything - Engine.create({ canvas })
keeps the Canvas2D path it has always had, with unchanged behavior
on every public API.
import { Engine, WebGL2Device } from '@sadhaka/loom-engine';
// Importing WebGL2Device side-effect-registers the 'webgl2' backend
// factory. The string-based form then works:
const engine = Engine.create({ canvas: myCanvas, backend: 'webgl2' });
Or inject a pre-built device for absolute control over construction (useful when sharing the GL context with another renderer):
import { Engine, WebGL2Device } from '@sadhaka/loom-engine';
const device = new WebGL2Device(myCanvas);
const engine = Engine.create({ canvas: myCanvas, device });
Every drawSprite / drawTile / drawParticle / drawText call
flows through SpriteBatcher, which accumulates per-instance data
(screen-space origin + size, atlas UV rect, RGBA tint) into a single
Float32Array. A flush triggers when the next call uses a different
atlas or blend mode, and at endFrame. Each flush issues exactly
one drawArraysInstanced for the batch.
For maximum throughput, group draws by atlas at the system level
(e.g. render all hamlet props from one atlas, all NPCs from another).
The default SpriteRenderSystem already sorts globally by iso depth
key; submission order within an atlas-bounded run is preserved
through the batcher.
engine.ts deliberately does not statically import
WebGL2Device. A consumer that only uses Canvas2D never pulls
WebGL2-specific code into their bundle - the only way the WebGL2
path enters the dependency graph is via an explicit
import { WebGL2Device }, which also triggers backend registration
through a /*#__PURE__*/-marked side effect.
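The registration pattern can be sketched as follows; the registry shape and names here are illustrative, not the engine's internals.

```typescript
// Module-level backend registry, keyed by the string passed to
// Engine.create({ backend }).
type DeviceFactory = (canvas: unknown) => unknown;
const backendRegistry = new Map<string, DeviceFactory>();

function registerBackend(name: string, factory: DeviceFactory): string {
  backendRegistry.set(name, factory);
  return name;
}

// In a hypothetical webgl2-device.ts this call sits next to the class
// declaration, so importing WebGL2Device pulls the registration in;
// the PURE marker lets bundlers drop it when the import is unused.
const WEBGL2_BACKEND = /*#__PURE__*/ registerBackend('webgl2', (c) => ({ canvas: c }));

console.log(WEBGL2_BACKEND, backendRegistry.has('webgl2')); // webgl2 true
```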
The WebGL2 device handles webglcontextlost /
webglcontextrestored automatically: every atlas re-uploads from
its cached source image, and frames during the lost interval no-op.

Not every browser or device can complete WebGL2Device construction.
Wrap the upgrade in a feature check and fall back to Canvas2D:

function pickBackend() {
  const probe = document.createElement('canvas').getContext('webgl2');
  return probe ? 'webgl2' : 'canvas2d';
}
const engine = Engine.create({ canvas: myCanvas, backend: pickBackend() });
npm install
npm run build # tsc src/ -> dist/
npm run build:demo # tsc demo/*.ts -> demo/*.js
npm run build:all # both
npm run watch # rebuild src on change
npm run test # tsx tests/*.test.ts
npm run clean # remove dist + compiled demo
npm run build:all
python -m http.server 8765
# browse http://localhost:8765/demo/
http://localhost:8765/demo/ is the gallery index. The same tree is
published to loom-engine.pages.dev/demo/
on every push to main.
Minimal, copy-paste-ready starters live under demo/. Each is
roughly 150 lines of TypeScript, imports from @sadhaka/loom-engine
(resolved via importmap to the local engine bundle), and runs in
the browser without a build step on the consumer side:

- MOB_CATALOG, projectile physics, system-phase ordering.
- A narrator.line event drained from MockDirectorBridge. Demonstrates iso projection, input snapshot, and the bridge / event-log / DOM-overlay boundary.
- A custom Resource, a custom System reading both InputSnapshot and DOM events, DOM as the primary UI.
- MockMultiplayerBridge. WASD to walk; the local player broadcasts position at 10 Hz and the three peers (Alice / Bob / Carol) lerp smoothly between presence updates. Demonstrates the pluggable multiplayer bridge, PeerPool linear interpolation, and the PeerPresenceSystem / PeerRenderSystem pipeline. See the Multiplayer section below for the wire protocol.

The legacy reference demos (Phase 6 director, Phase 7 combat, Phase 8 ARPG slice) stay accessible from the gallery index.
Controls in the legacy director demo (demo/director.html):
Phase 15.1 adds a thin presence layer for showing other players in
real time on the same world. The transport is pluggable: the engine
ships an SSEMultiplayerBridge (server-sent events) and a
MockMultiplayerBridge (in-process; tests + offline demos), and the
IMultiplayerBridge interface is small enough to swap in WebSocket
or WebRTC without touching anything above it. No CRDT - peers carry
position only, and conflict resolution is "last write wins" at the
server. Shared state beyond position is deferred to a later phase.
The bridge layer hides this from gameplay code, but for anyone implementing a server (or a custom transport) the contract is:
Server -> client (SSE event types):

- presence.snapshot { peers: [{ character_id, x, y, zone, ts_ms, name? }] } - emitted once on connect with the full current peer roster. The client treats this as authoritative and drops any peer not in the snapshot.
- presence.update { character_id, x, y, zone, ts_ms, name? } - emitted as peers move.
- presence.depart { character_id } - emitted when a peer disconnects.

Client -> server (HTTP POST):

- POST <broadcastUrl> { character_id, x, y, zone, ts_ms } - the engine rate-limits to 10 Hz (BROADCAST_HZ); excess calls to broadcastPosition() are silently dropped and counted in bridge.stats().rateLimitedDrops. Calling once per frame is fine.

ts_ms is the wall clock at which the position was true. The
PeerPool uses it to interpolate between successive samples so peers
don't jitter at the network rate. Acceptable lag is ~150 ms (about
one and a half update intervals at 10 Hz), imperceptible at walk speed.
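The interpolation idea can be sketched standalone; the real PeerPool API differs, and the function below is a hypothetical stand-in for its per-peer lerp.

```typescript
interface Sample { x: number; y: number; tsMs: number; }

// Lerp between the two most recent timestamped samples for a peer.
function interpolate(prev: Sample, next: Sample, renderTsMs: number): { x: number; y: number } {
  if (next.tsMs <= prev.tsMs) return { x: next.x, y: next.y };
  // Clamp to [0, 1] so a stalled stream holds the last known position
  // instead of extrapolating off into space.
  const t = Math.min(1, Math.max(0, (renderTsMs - prev.tsMs) / (next.tsMs - prev.tsMs)));
  return { x: prev.x + (next.x - prev.x) * t, y: prev.y + (next.y - prev.y) * t };
}

const prev: Sample = { x: 0, y: 0, tsMs: 1000 };
const next: Sample = { x: 10, y: 20, tsMs: 1100 }; // one 10 Hz interval later
console.log(interpolate(prev, next, 1050)); // midway: { x: 5, y: 10 }
```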
import {
Engine,
// Multiplayer
MockMultiplayerBridge, // or SSEMultiplayerBridge
PeerPool,
PeerSpritePool,
PeerPresenceSystem,
PeerRenderSystem,
POOL_PEER_SPRITE,
RESOURCE_MULTIPLAYER_BRIDGE,
RESOURCE_PEER_POOL,
SYSTEM_PHASE_INPUT,
SYSTEM_PHASE_RENDER,
} from '@sadhaka/loom-engine';
const engine = Engine.create({ canvas });
// 1. Create a bridge. Production code uses SSEMultiplayerBridge:
//
// const bridge = new SSEMultiplayerBridge({
// baseUrl: '/api/v1/loom/presence/events',
// characterId: 'me',
// zone: 'plaza',
// });
//
// Tests + offline dev use the in-process mock:
const bridge = new MockMultiplayerBridge();
bridge.connect();
// 2. PeerPool stores all known peers + their interpolated positions.
// Self-filter: tell the pool which character_id is the local player
// so it isn't rendered as a ghost when the server echoes it back.
const peerPool = new PeerPool();
peerPool.setLocalCharacterId('me');
// 3. PeerSpritePool maps character_id -> rendering hint. A default
// atlas + frame is enough for an undifferentiated demo; setOverride()
// lets you assign per-class sprites or cosmetic shards.
const peerSprites = new PeerSpritePool({ defaultAtlas: peerAtlas });
engine.world.resources.set(RESOURCE_MULTIPLAYER_BRIDGE, bridge);
engine.world.resources.set(RESOURCE_PEER_POOL, peerPool);
engine.world.registerPool(POOL_PEER_SPRITE, peerSprites);
// 4. Wire the systems. PeerPresenceSystem drains the bridge each
// frame; PeerRenderSystem submits a drawSprite + name label per
// peer at their interpolated position.
engine.world.addSystem(new PeerPresenceSystem(), SYSTEM_PHASE_INPUT);
engine.world.addSystem(new PeerRenderSystem(), SYSTEM_PHASE_RENDER);
// 5. Inside your walk-system, call broadcastPosition() each frame.
// The bridge enforces the 10 Hz wire rate.
bridge.broadcastPosition(playerX, playerY, 'plaza', Date.now());
The bridge interface is small: connect, disconnect, status,
pollMessages, broadcastPosition, plus stats. To use WebSocket or
WebRTC, implement IMultiplayerBridge with the same PresenceMessage
shape (update / depart / snapshot); none of the systems above the
bridge change.
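A custom transport behind that contract might look like the sketch below. The interface is transcribed from this README, not imported from the package, so the real declaration may differ in detail; the in-process queue stands in for a socket.

```typescript
type PresenceMessage =
  | { type: 'snapshot'; peers: Array<{ character_id: string; x: number; y: number; zone: string; ts_ms: number; name?: string }> }
  | { type: 'update'; character_id: string; x: number; y: number; zone: string; ts_ms: number; name?: string }
  | { type: 'depart'; character_id: string };

interface IMultiplayerBridge {
  connect(): void;
  disconnect(): void;
  status(): 'disconnected' | 'connected';
  pollMessages(): PresenceMessage[];
  broadcastPosition(x: number, y: number, zone: string, tsMs: number): void;
  stats(): { rateLimitedDrops: number };
}

// A WebSocket version would push parsed frames into `inbox` from
// socket.onmessage instead of via push().
class QueueBridge implements IMultiplayerBridge {
  private inbox: PresenceMessage[] = [];
  private connected = false;
  private lastBroadcastMs = -Infinity;
  private drops = 0;

  connect() { this.connected = true; }
  disconnect() { this.connected = false; }
  status() { return this.connected ? 'connected' as const : 'disconnected' as const; }

  // Server/test side injects messages here.
  push(msg: PresenceMessage) { this.inbox.push(msg); }

  // PeerPresenceSystem drains this once per frame.
  pollMessages(): PresenceMessage[] { return this.inbox.splice(0); }

  broadcastPosition(x: number, y: number, zone: string, tsMs: number) {
    // 10 Hz wire rate, mirroring BROADCAST_HZ: drop calls < 100 ms apart.
    if (tsMs - this.lastBroadcastMs < 100) { this.drops++; return; }
    this.lastBroadcastMs = tsMs;
    // A real transport would POST { character_id, x, y, zone, ts_ms } here.
  }

  stats() { return { rateLimitedDrops: this.drops }; }
}

const wire = new QueueBridge();
wire.connect();
wire.push({ type: 'update', character_id: 'alice', x: 1, y: 2, zone: 'plaza', ts_ms: 0 });
wire.broadcastPosition(0, 0, 'plaza', 0);
wire.broadcastPosition(0, 1, 'plaza', 50); // < 100 ms later: dropped
const drained = wire.pollMessages();
console.log(drained.length, wire.stats().rateLimitedDrops); // 1 1
```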
loom-engine/
src/
util/ math, color, typed-arrays
components/ transform, sprite, particle-emitter
renderer/ graphics-device, canvas2d-device, camera, iso-projection
animation/ animation-clip, animation-state-pool
asset/ sprite-sheet-loader
audio/ audio-bus
input/ input-manager
systems/ sprite-render, animation, particle-{simulation,emitter,render}, input, veil-budget
vfx/ particle-pool
entity.ts entity allocator (32-bit handle, generation guard)
world.ts ECS World class
system.ts System interface + phase constants
resources.ts ResourceRegistry + Time + VeilBudget
engine.ts Engine facade
index.ts public API barrel
demo/ browser demo (one tile + animated knight + sparkles + click-to-burst)
tests/ node-based smoke tests (tsx --test)
assets/ placeholder game assets (knight walk-cycle PNG + JSON)
tools/ helper scripts (gen-knight.py - Pillow generator)
PRIOR-ART.md cumulative inspirations log (clean-room defense)
package.json tsc + tsx as only dev deps
tsconfig.json ES2022 strict + noUncheckedIndexedAccess
dist/ tsc output (gitignored)
node_modules/ npm install output (gitignored)
The VeilBudget resource carries particleBudget, audioBudget,
shaderBudget, and eventBudget. VeilBudgetSystem propagates updates to
ParticlePool, AudioBus, etc. The Director-bridge mutates the budget;
subsystems read it.

engine.tick(now) runs in this order: advance Time, beginFrame on the
device, world.update across all phases, endFrame.
import {
Engine,
// ECS
POOL_TRANSFORM, POOL_SPRITE, POOL_ANIMATION, POOL_PARTICLE,
POOL_EMITTER,
TransformPool, SpritePool, AnimationStatePool, ParticlePool,
ParticleEmitterPool,
SYSTEM_PHASE_INPUT, SYSTEM_PHASE_LOGIC, SYSTEM_PHASE_PHYSICS,
SYSTEM_PHASE_ANIMATION, SYSTEM_PHASE_RENDER, SYSTEM_PHASE_POST_RENDER,
// Default systems
AnimationSystem, SpriteRenderSystem,
ParticleEmitterSystem, ParticleSimulationSystem, ParticleRenderSystem,
InputSystem, VeilBudgetSystem,
// Resources
RESOURCE_TIME, RESOURCE_CAMERA, RESOURCE_DEVICE,
RESOURCE_VEIL_BUDGET, RESOURCE_INPUT, RESOURCE_AUDIO_BUS,
// Renderer
Canvas2DDevice, ISO_TILE_WIDTH, ISO_TILE_HEIGHT,
// Asset
loadSpriteSheet, computeFrameIndex,
// Audio
AudioBus, AUDIO_BUDGET_AMBIENT_FLOOR, AUDIO_BUDGET_ESSENTIAL_FLOOR,
// Input
InputManager,
// Math + color
vec2, vec3, rect, clamp, lerp,
hexToRgba, rgbaToCssString,
COLOR_KNOT_STR, COLOR_KNOT_DEX, COLOR_KNOT_INT, COLOR_KNOT_CENTER,
// Iso
tileToIso, worldToIso, isoToTile, isoDepthKey,
} from '@sadhaka/loom-engine';
const engine = Engine.create({ canvas });
engine.world.addSystem(new InputSystem(), SYSTEM_PHASE_INPUT);
engine.world.addSystem(new VeilBudgetSystem(), SYSTEM_PHASE_INPUT);
// ... game systems ...
engine.world.addSystem(new SpriteRenderSystem(), SYSTEM_PHASE_RENDER);
function tick(now: number) {
engine.tick(now);
requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
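The iso helpers imported above (tileToIso, isoDepthKey) follow the classic 2:1 diamond projection. A standalone sketch of the math, assuming 64x32 tiles (the engine's actual ISO_TILE_WIDTH / ISO_TILE_HEIGHT and signatures may differ):

```typescript
const TILE_W = 64;
const TILE_H = 32;

// Tile coordinates -> screen-space diamond position.
function tileToIsoSketch(tx: number, ty: number): { x: number; y: number } {
  return { x: (tx - ty) * (TILE_W / 2), y: (tx + ty) * (TILE_H / 2) };
}

// Painter's-order depth key: tiles with larger x + y sit "closer"
// to the camera and draw later.
function isoDepthKeySketch(tx: number, ty: number): number {
  return tx + ty;
}

console.log(tileToIsoSketch(2, 1)); // { x: 32, y: 48 }
```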
The engine's defensible novelty is in the Loom integration layer, not the rasterizer. Director-driven scene state, Veil Essence economy gating render budget, knot-aware encounter generation, event-sourced rendering. The renderer underneath uses public-domain techniques (sprite batching, isometric projection, ECS) implemented from scratch.
See PRIOR-ART.md for the cumulative inspirations log (public talks, papers, OSS architecture - took / declined per source).
Every architectural commit names its inspirations in plain text. No copy-paste from any external engine source. PRIOR-ART.md is the audit trail any future productization or patent dispute would lean on.
208 / 208 tests pass on Node 24 via tsx --test. Coverage spans
all twelve test files in tests/:

- smoke.test.ts - public API barrel, version stamp
- world.test.ts - ECS world, system scheduling, sprite pool, sprite render, time
- asset-loader.test.ts - sprite-sheet manifest, frame stepper, error discriminator
- animation.test.ts - animation clip math, state pool, AnimationSystem end-to-end
- vfx.test.ts - particle pool, emitter pool, simulation, emitter system, veil budget
- audio-input.test.ts - audio bus + ducking, input manager, input system, budget propagation
- director.test.ts - SSE bridge, eventSourceFactory hook, scene-state derivation
- combat.test.ts - hit resolution, damage application, knockback
- projectile.test.ts - projectile pool, lifetime, collision
- arpg.test.ts - ARPG hub-and-spoke, plaza narrator, encounter scheduling
- snapshot-recovery.test.ts - SnapshotRecoveryHelper for Director reconnect
- touch-input.test.ts - virtual D-pad, tap-to-walk, multi-touch arbitration

Run via npm test. Each suite is fully node-based; no DOM dependency.
Browser-only paths (Canvas2DDevice rasterization, AudioContext
unlock, DOM event listeners) are exercised via the demo's preview
verification, not unit tests.
The TypeDoc site at https://loom-engine.pages.dev/ is served by
Cloudflare Pages from the gh-pages branch of this repo. The chain:
- A push to main triggers .github/workflows/docs.yml
- The workflow runs npm ci, npm test, npm run docs:ci, then publishes ./docs-build/ to the gh-pages branch via peaceiris/actions-gh-pages
- Cloudflare Pages watches the gh-pages branch and auto-deploys on every push, typically within 1-2 min
every push, typically within 1-2 minGitHub Pages itself is not used: the repo is private and free user
plans do not include Pages on private repos. The 422 error from the
Pages create API is the canonical signal: "Your current plan does not support GitHub Pages for this repository."
If the Cloudflare Pages project is ever deleted or the repo is forked to a new owner, re-activate as follows:
- Create a Cloudflare Pages project connected to the loom-engine repo and name the project loom-engine (the default URL becomes loom-engine.pages.dev)
- Set the production branch to gh-pages, output directory / (root)
- Trigger the first deploy from gh-pages; subsequent deploys auto-trigger on push to that branch
- Optionally add a custom domain (e.g. engine.theworldtable.ai) under the project's Custom domains tab. CF DNS for theworldtable.ai is already on the same account, so this is a one-click CNAME add

If the workflow ever stops updating gh-pages (CF Pages will keep
serving the last successful build but go stale), check
gh run list --repo sadhaka/loom-engine --workflow=docs.yml.
Versions 0.11.0 and later are licensed under the Business Source License 1.1 ("BUSL-1.1"). Copyright (c) 2026 Misha Mitiev.

For commercial licensing, contact licensor@theworldtable.ai. Standard
terms include a 5% royalty on excess revenue; lump-sum buyouts and
equity-for-license arrangements are negotiable. See
COMMERCIAL_LICENSE_TERMS.md.

Version 0.10.0 (the only previously-published release) remains
permanently licensed under MIT for backwards compatibility. Projects
pinned to 0.10.0 are unaffected by the license change but will not
receive future updates without accepting BUSL-1.1.
Tagged releases publish to npm via
.github/workflows/npm-publish.yml.
The workflow runs npm test and npm run build, then
npm publish --access public, whenever a tag matching v* is pushed to
main. It needs the NPM_TOKEN repo secret to authenticate.
Manual publish from a local checkout:
npm login # one-time, npm account named sadhaka
npm test # 208/208 must pass
npm run build # tsc -> dist/
npm publish --dry-run # inspect tarball contents first
npm publish --access public # scoped packages default to private; flag is required
prepublishOnly in package.json re-runs npm test && npm run build
before any publish, so the dry-run and the final publish always rebuild
from a clean source tree.
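The guard described above corresponds to a scripts block along these lines; the exact tsc/tsx invocations are assumptions, only the prepublishOnly chaining is stated by this README.

```json
{
  "name": "@sadhaka/loom-engine",
  "scripts": {
    "test": "tsx --test tests/*.test.ts",
    "build": "tsc -p tsconfig.json",
    "prepublishOnly": "npm test && npm run build"
  }
}
```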
This is a single-author project (Misha Mitiev) for TheWorldTable.ai.
The license (BUSL-1.1 from 0.11.0; MIT for 0.10.0) permits forking and
modification; pull requests are welcome but not actively triaged - the
canonical roadmap is the spec file (LOOM-ENGINE-SPEC.md in the parent
repo) and capacity is limited. For bug reports, file an issue with a
minimal repro.