App
TriFrost's `App` class is the entrypoint to your server. It manages routing, runtime integration, observability, lifecycle hooks, and core configuration like cookies, cache, tracing, and JSX hydration.
You typically instantiate `App` once per deployment, whether that's on Node, Bun, or Workerd, and everything flows through it.
See also: Routing Basics | Request Lifecycle
Creating an App
TriFrost apps are created via `new App(...)`. You can optionally pass runtime modules, caching, cookies, tracing, and more:
```ts
import {App, RedisCache, ConsoleExporter} from '@trifrost/core';
import {css} from './css';
import {script} from './script';

const app = new App({
  tracing: {
    exporters: () => [new ConsoleExporter()],
  },
  cache: ({env}) => new RedisCache({
    store: redis, /* Your redis instance */
  }),
  client: {css, script},
});
```
You can configure:
- Runtime (Node, Bun, Workerd, etc)
- Tracing and exporters
- Cookie config (default options)
- Cache store
- Default timeout
- JSX hydration behavior (client)
- ...
Note: the runtime is automatically detected; you should not need to pass it manually.
See also: Hello World Example | Runtime Adapters
Typing your App's Environment
TriFrost lets you define your environment shape via the generic `<Env>` parameter. This ensures your `ctx.env` is fully typed everywhere in your app.
```ts
type Env = {
  DB_URL: string;
  COOKIE_SECRET: string;
};

const app = new App<Env>({});
```
Now, `ctx.env.DB_URL` and others are fully typed across:
- Middleware
- Handlers
- Routers
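For example, a handler can then lean on those types directly. A minimal sketch reusing the `Env` shape above (the `/db-info` route is purely illustrative):

```ts
app.get('/db-info', (ctx) => {
  /* ctx.env.DB_URL is typed as string thanks to App<Env> */
  const host = new URL(ctx.env.DB_URL).host;
  return ctx.text(`connected to ${host}`);
});
```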
See also: Context & State Management
Prefer a guided setup?
You can let the CLI scaffold everything for you, including runtime setup, middleware, styling, and more.
Run:
```sh
# Bun
bun create trifrost@latest

# NPM
npm create trifrost@latest
```
... and you'll get a fully functional project in under a minute.
Configuration Options
When constructing an app, you can pass any of the following options:
```ts
new App<Env>({
  cache,      // Cache adapter (Redis, Memory, etc)
  client,     // Client css/script setup for auto-mounting atomics
  cookies,    // Global cookie defaults
  env,        // Custom object merged ON TOP OF the detected environment
  rateLimit,  // Rate limiter instance
  runtime,    // Custom runtime if auto-detection is not wanted
  timeout,    // Global maximum timeout in milliseconds (default: 30_000)
  tracing,    // Tracing config (exporters, requestId)
});
```
Example:
```ts
new App<Env>({
  cache: ({env}) => new DurableObjectCache({
    store: env.MainDurable,
  }),
  tracing: {
    exporters: ({env}) => {
      if (isDevMode(env)) return [new ConsoleExporter()];
      return [
        new JsonExporter(),
        new OtelHttpExporter({
          logEndpoint: 'https://otlp.uptrace.dev/v1/logs',
          spanEndpoint: 'https://otlp.uptrace.dev/v1/traces',
          headers: {
            'uptrace-dsn': env.UPTRACE_DSN,
          },
        }),
      ];
    },
  },
  client: {css, script},
});
```
cache
TriFrost uses this cache to power `ctx.cache`. If omitted, it falls back to a `MemoryCache`. You can plug in Redis, Durable Objects, KV, or any custom adapter that implements the `TriFrostCache` interface.
```ts
cache: ({env}) => new RedisCache({
  store: ... /* your redis instance */
}),
```
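Once configured, handlers read and write through `ctx.cache`. A minimal sketch, assuming promise-based `get`/`set` methods and a `ttl` option (names not confirmed by this page); `loadUserFromDb` is a hypothetical helper:

```ts
app.get('/users/:userId', async (ctx) => {
  /* Try the cache first (key naming is up to you) */
  const cached = await ctx.cache.get(`user:${ctx.state.userId}`);
  if (cached) return ctx.json(cached);

  const user = await loadUserFromDb(ctx.state.userId); /* hypothetical helper */
  await ctx.cache.set(`user:${ctx.state.userId}`, user, {ttl: 60});
  return ctx.json(user);
});
```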
Also see: Caching
client
Provides JSX hydration support via `css` and `script` mounts. If set, TriFrost will automatically register routes for `/__atomics__/client.css` and `/__atomics__/client.js` to serve these fragments.
```ts
client: {
  css: atomicCssInstance,
  script: compiledScript,
}
```
This enables progressive hydration and style rehydration in SSR flows.
Also see: JSX Basics | JSX Atomic
cookies
Sets global defaults for all cookies set via `ctx.cookies.set(...)`.
Default:
```ts
cookies: {
  path: '/',
  secure: true,
  httpOnly: true,
  sameSite: 'Strict',
}
```
You can override these for all cookie calls globally:
```ts
cookies: {
  sameSite: 'Lax',
  secure: true,
}
```
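With those defaults in place, individual `ctx.cookies.set(...)` calls only need per-cookie specifics. A sketch; the `maxAge` option name is an assumption, not confirmed by this page:

```ts
app.post('/login', async (ctx) => {
  /* path/secure/httpOnly/sameSite come from the global cookie config */
  ctx.cookies.set('session', 'some-session-token', {maxAge: 3600}); /* maxAge assumed */
  return ctx.json({ok: true});
});
```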
Also see: Cookies
env
An object that extends the current runtime's environment. Useful for injecting additional configuration that does not live in the environment itself.
```ts
env: {
  MYHAPPY_VARIABLE: true,
  ...
}
```
This is merged on top of the runtime's `process.env` (Node), `Bun.env` (Bun), or `env` object (Workerd).
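After the merge, injected and runtime-provided values are read the same way. A sketch, assuming `MYHAPPY_VARIABLE` was injected as above and `DB_URL` comes from the runtime environment (as in the typing example earlier):

```ts
app.get('/debug/env', (ctx) => ctx.json({
  happy: ctx.env.MYHAPPY_VARIABLE, /* injected via the env option */
  db: ctx.env.DB_URL,              /* provided by the runtime */
}));
```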
rateLimit
Optional rate limiter instance. Supports Redis, KV, Durable Objects, etc. If provided, routes/groups can call `.limit(...)` and TriFrost will auto-enforce quotas.
```ts
rateLimit: ({env}) => new RedisRateLimit({
  store: ... /* your redis instance */
}),
```
Use `app.limit(...)`, `router.limit(...)`, or `route.limit(...)` to apply limits.
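A sketch of applying a quota on a chain; that `.limit(...)` accepts a max-requests count is an assumption based on this page, not a confirmed signature:

```ts
app
  .limit(100) /* assumed: cap this part of the chain at 100 requests per window */
  .get('/api/search', (ctx) => ctx.json({results: []}));
```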
Also see: RateLimiting
runtime
Override the runtime manually if needed (rare). Usually you don't need to do this; TriFrost auto-detects Bun, Node, or Workerd.
Example:
```ts
import {NodeRuntime} from '@trifrost/core/runtimes/Node/Runtime';
...
runtime: new NodeRuntime(),
```
Also see: Supported Runtimes
timeout
Sets the default timeout for all routes and requests, in milliseconds. This can be overridden per route or group.
```ts
timeout: 15_000, // 15 seconds
```
If `null`, the timeout is disabled globally (not recommended unless you're building a streaming service). Defaults to `30_000` (30 seconds).
tracing
TriFrost supports structured logging and tracing via configurable exporters and trace ID propagation.
```ts
tracing: {
  exporters: () => [new ConsoleExporter()],
  requestId: {
    inbound: ['x-request-id'],
    outbound: 'x-request-id',
    validate: id => id.length > 8,
  },
}
```
You can return a single exporter, or an array. TriFrost will normalize it.
Also see: Console Exporter | JSON Exporter | OTEL Exporter
Environment Variables
The TriFrost App automatically picks up configuration from a number of environment variables. These control things like app name/version, networking port, proxy trust, dev mode, debug, and more.
For a full overview see:
Routing
The `App` class inherits from Router, meaning you can call any routing methods directly:
```ts
app.get('/status', (ctx) => ctx.text('ok'));

app.patch('/users/:userId', async (ctx) => {
  console.log('patching user', ctx.state.userId, 'with', ctx.body);
  return ctx.json({received: ctx.body});
});
```
All of these work:
```ts
app.get(...)        // Add an HTTP GET route
app.post(...)       // Add an HTTP POST route
app.put(...)        // Add an HTTP PUT route
app.patch(...)      // Add an HTTP PATCH route
app.del(...)        // Add an HTTP DELETE route
app.health(...)     // Add a GET route specifically for health checks
app.route(...)      // Add a subroute with a builder approach
app.group(...)      // Add a subrouter with dynamic path handling
app.onNotFound(...) // Configure a catch-all not-found handler
app.onError(...)    // Configure a catch-all error handler
app.limit(...)      // Configure a limit for this part of the chain
app.bodyParser(...) // Configure the body parser for this part of the chain
app.use(...)        // Add a middleware to the router
```
Routes registered directly on `app` are equivalent to root-level routes: they live at the top of the route tree.
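For instance, the catch-all handlers can be registered right on the app. A minimal sketch; the handler signatures are assumed to mirror route handlers, and the responses stick to the `ctx.text(...)` helper shown elsewhere on this page:

```ts
app.onNotFound((ctx) => ctx.text('Not Found'));
app.onError((ctx) => ctx.text('Something went wrong'));
```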
See: Router & Route | Routing Basics
𧬠Lifecycle
app.boot()
Boots the runtime and wires up the server.
```ts
await app.boot();
```
This:
- Detects the runtime (node/bun/workerd/...) based on heuristics, if none was provided
- Boots up the runtime
- Registers the `.onIncoming(...)` handler with the runtime
- Attaches route handling, tracing, and lifecycle hooks
- Resolves once the runtime is listening (or ready)
- Is safe to call multiple times (noop after first run)

Required before requests will be handled.
app.shutdown()
Shuts down the runtime (e.g. closes the server on Node/Bun).

```ts
await app.shutdown();
```
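A common pattern on Node/Bun is wiring these into process signals for graceful shutdown. A sketch; the signal handling here is standard Node API, not TriFrost-specific:

```ts
await app.boot();

/* Shut the server down cleanly on SIGTERM/SIGINT (e.g. container stop, Ctrl+C) */
for (const signal of ['SIGTERM', 'SIGINT'] as const) {
  process.on(signal, async () => {
    await app.shutdown();
    process.exit(0);
  });
}
```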
Inbound Tracing
When `tracing.requestId.inbound` is set (default: `['x-request-id', 'cf-ray']`), TriFrost will:
- Accept an inbound trace ID header
- Use it as the root of the logger trace context
- Auto-propagate it to outbound fetches
Set it to `[]` to disable entirely.
```ts
tracing: {
  requestId: {
    inbound: ['x-request-id'], // accepted inbound headers
    outbound: 'x-request-id',  // header sent on ctx.fetch(...) with the trace id
    validate: val => ...,      // validates the id format
  },
}
```
The default setting is:
```ts
requestId: {
  inbound: ['x-request-id', 'cf-ray'],
  outbound: 'x-request-id',
  validate: val => /^[a-z0-9-]{8,64}$/i.test(val),
}
```
See: Logging & Observability
Best Practices
- Call `await app.boot()` before handling any requests
- Use `.health(...)` for readiness/liveness probes; they are excluded from tracing/logging
- Never manually call `runtime.boot()`; use `app.boot()` instead
- For custom exporters, prefer a function that returns an array (TriFrost will wrap singletons automatically)
- Avoid setting `runtime` and `env` unless you're fully overriding behavior
- Make use of the `isDevMode` utility to switch between configurations for dev/prod
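A sketch of such a health probe; that `.health(...)` takes a path plus handler exactly like `.get(...)` is an assumption here, and the `/healthz` path is illustrative:

```ts
/* Excluded from tracing/logging, per the practice above */
app.health('/healthz', (ctx) => ctx.text('ok'));
```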