
SDK Overview

The official inference.sh SDKs for Python and JavaScript/TypeScript.


Installation

```bash
# Installation
pip install inferencesh

# With async support
pip install inferencesh[async]
```

Quick Start

```python
from inferencesh import inference

client = inference(api_key="inf_your_key")

# Run an app and wait for the result
result = client.run({
    "app": "infsh/flux",
    "input": {"prompt": "A sunset over mountains"}
})

print(result["output"])
```

What's in the SDK?

Running Apps

Execute AI apps on inference.sh infrastructure.
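`client.run` from the Quick Start is the core call here: it submits a task and blocks until the result is ready. A minimal sketch of running the same app over a few inputs, reusing only that call (the app slug and prompts are placeholders):

```python
from inferencesh import inference

client = inference(api_key="inf_your_key")

prompts = [
    "A sunset over mountains",
    "A lighthouse in a storm",
]

outputs = []
for prompt in prompts:
    # client.run blocks until the task finishes and returns the result dict
    result = client.run({
        "app": "infsh/flux",  # app slug, as in the Quick Start example
        "input": {"prompt": prompt},
    })
    outputs.append(result["output"])

print(outputs)
```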

Agent SDK

Build and interact with AI agents programmatically.
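This overview doesn't show the agent API surface, so the sketch below is purely illustrative: `client.agents.create` and `agent.chat` are hypothetical placeholder names, not confirmed SDK methods. See the Agent SDK guide for the real interface.

```python
from inferencesh import inference

client = inference(api_key="inf_your_key")

# Hypothetical sketch only -- these method names are placeholders,
# not the documented Agent SDK interface.
agent = client.agents.create(
    name="support-bot",
    instructions="Answer questions about the product docs.",
)

reply = agent.chat("How do I rotate my API key?")
print(reply)
```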

Server Proxy

Protect API keys in frontend applications.
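The pattern: the browser never holds your inference.sh key. The frontend calls your own backend, which keeps the key server-side and forwards requests with the SDK. The SDK's Server Proxy helper is covered in its own guide; the sketch below is a generic, hand-rolled version of the pattern, assuming FastAPI (not part of the SDK):

```python
# Generic proxy sketch using FastAPI (an assumption -- the SDK's own
# Server Proxy helper is documented separately).
import os

from fastapi import FastAPI
from pydantic import BaseModel

from inferencesh import inference

app = FastAPI()
client = inference(api_key=os.environ["INFERENCE_API_KEY"])  # key stays server-side


class RunRequest(BaseModel):
    app: str
    input: dict


@app.post("/api/run")
def run_app(req: RunRequest):
    # The browser calls this endpoint; only the server talks to inference.sh.
    result = client.run({"app": req.app, "input": req.input})
    return {"output": result["output"]}
```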

Building Apps

Create your own apps to run on inference.sh. → See Extending Apps for the app development guide.


Environment Variables

```python
import os
from inferencesh import inference

client = inference(api_key=os.environ["INFERENCE_API_KEY"])
```

TypeScript Support

The JavaScript SDK includes full TypeScript definitions:

```typescript
import type { Task, ApiTaskRequest, InferenceConfig } from '@inferencesh/sdk';
```

Browser & Node.js

The JavaScript SDK works in both environments:

```typescript
// Node.js (CommonJS)
const { inference } = require('@inferencesh/sdk');

// ES Modules / TypeScript
import { inference } from '@inferencesh/sdk';

// Browser (ESM CDN)
import { inference } from 'https://esm.sh/@inferencesh/sdk';
```

Note: Never expose API keys in client-side code. Use the Server Proxy in production.


Requirements

  • Python SDK: Python 3.8+, requests for the sync client, aiohttp for the async client (optional)
  • JavaScript SDK: Node.js 18.0.0+, or a modern browser with fetch support
