inference.sh

Hono

Set up the inference.sh proxy with Hono for edge and serverless deployments.


Installation

```bash
npm install @inferencesh/sdk hono
```

Basic Setup

```typescript
import { Hono } from "hono";
import { createRouteHandler } from "@inferencesh/sdk/proxy/hono";

const app = new Hono();
const proxyHandler = createRouteHandler();

app.all("/api/inference/proxy", proxyHandler);

export default app;
```

Environment Variable

Set your API key in your runtime environment:

```toml
# Cloudflare Workers (wrangler.toml)
[vars]
INFERENCE_API_KEY = "inf_your_key_here"
```

```bash
# Deno Deploy
export INFERENCE_API_KEY="inf_your_key_here"
```

Cloudflare Workers

```typescript
// src/index.ts
import { Hono } from "hono";
import { createRouteHandler } from "@inferencesh/sdk/proxy/hono";

const app = new Hono();

const proxyHandler = createRouteHandler();
app.all("/api/inference/proxy", proxyHandler);

export default app;
```

Deploy:

```bash
npx wrangler deploy
```
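For reference, a minimal `wrangler.toml` for the worker above might look like the following; the `name` and `compatibility_date` values are placeholders to adjust for your project:

```toml
name = "my-worker"
main = "src/index.ts"
compatibility_date = "2024-01-01"

[vars]
INFERENCE_API_KEY = "inf_your_key_here"
```

For production, prefer `npx wrangler secret put INFERENCE_API_KEY` over a plaintext `[vars]` entry so the key stays out of source control.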

Deno Deploy

```typescript
// main.ts
import { Hono } from "https://deno.land/x/hono/mod.ts";
import { createRouteHandler } from "@inferencesh/sdk/proxy/hono";

const app = new Hono();
const proxyHandler = createRouteHandler();

app.all("/api/inference/proxy", proxyHandler);

Deno.serve(app.fetch);
```

Client Configuration

```typescript
import { inference } from "@inferencesh/sdk";

const client = inference({
  proxyUrl: "https://my-worker.workers.dev/api/inference/proxy"
});

const result = await client.run({
  app: "infsh/flux",
  input: { prompt: "A sunset" }
});
```

Custom API Key Resolution

Load the key from Cloudflare KV, a secrets manager, or another external source instead of a plain environment variable:

```typescript
import { createRouteHandler } from "@inferencesh/sdk/proxy/hono";

const proxyHandler = createRouteHandler({
  resolveApiKey: async () => {
    // Load from Cloudflare KV, D1, or an external secrets service.
    // Note: `env` must be in scope here -- on Workers, bindings are not
    // globals, so capture them from the request context.
    const key = await env.SECRETS.get("INFERENCE_API_KEY");
    return key;
  }
});
```

With Middleware

```typescript
import { Hono } from "hono";
import { cors } from "hono/cors";
import { rateLimiter } from "hono-rate-limiter";
import { createRouteHandler } from "@inferencesh/sdk/proxy/hono";

const app = new Hono();

// CORS
app.use("/api/*", cors({
  origin: "https://myapp.com",
  allowHeaders: ["Content-Type", "x-inf-target-url"],
}));

// Rate limiting
app.use("/api/*", rateLimiter({
  windowMs: 15 * 60 * 1000,
  limit: 100,
}));

// Proxy
const proxyHandler = createRouteHandler();
app.all("/api/inference/proxy", proxyHandler);

export default app;
```

Why Hono?

Hono is ideal for:

  • Edge deployments (Cloudflare Workers, Vercel Edge)
  • Minimal bundle size (~12 kB)
  • Fast cold starts for serverless
  • Multi-runtime support (Node, Deno, Bun, Workers)
