Build inference.sh apps with AI coding assistants like Claude Code, Cursor, or Windsurf.
Setup
1. Install CLI
```bash
curl -fsSL https://cli.inference.sh | sh
infsh login
```
2. Create App
```bash
infsh app init my-app
cd my-app
```
This creates your app with all the files it needs, including skills that teach your coding agent how to build inference.sh apps.
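The exact scaffold may vary with CLI version, but based on the files referenced in this guide (`inference.py`, `inf.yml`, and the `skills/` directory), a freshly initialized app looks roughly like this illustrative sketch:
```
my-app/
├── inference.py   # app logic (see the writing-app-logic skill)
├── inf.yml        # resource configuration (see configuring-resources)
└── skills/        # guidance your coding agent reads (detailed below)
```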
Skills Included
When you run `infsh app init`, the CLI creates a `skills/` directory containing guidance for AI coding assistants (a sketch of its on-disk layout follows the table):
| Skill | What it teaches |
|---|---|
| `building-inferencesh-apps` | Overview, CLI commands |
| `writing-app-logic` | How to write `inference.py` |
| `configuring-resources` | How to set up `inf.yml` |
| `managing-secrets` | Handling API keys |
| `using-oauth-integrations` | Google Sheets, Drive |
| `tracking-usage` | Output metadata for billing |
| `handling-cancellation` | Graceful task cancellation |
| `optimizing-performance` | Best practices |
| `debugging-issues` | Troubleshooting |
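On disk, each skill generally gets its own entry under `skills/`. The exact file layout is a CLI implementation detail and may differ; this sketch simply mirrors the skill names listed above:
```
skills/
├── building-inferencesh-apps/
├── writing-app-logic/
├── configuring-resources/
├── managing-secrets/
├── using-oauth-integrations/
├── tracking-usage/
├── handling-cancellation/
├── optimizing-performance/
└── debugging-issues/
```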
How It Works
- You describe what you want — Tell your coding agent what app to build
- Agent reads skills — The agent uses the skills to understand inference.sh patterns
- Agent writes code — Creates `inference.py`, updates `inf.yml`, adds dependencies
- Test locally — Run `infsh app run` to verify
- Deploy — Run `infsh app deploy` to publish (the full command sequence is sketched after this list)
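Putting the steps above together, the terminal side of the loop uses only commands already covered in this guide; your coding agent does its work between scaffolding and local testing. A minimal sketch (the app name is a placeholder):
```bash
# Scaffold the app (this also creates the skills/ directory for your agent)
infsh app init my-app
cd my-app

# In your coding agent, describe the app you want to build; it reads the
# skills and edits inference.py, inf.yml, and dependencies for you.

# Test locally
infsh app run

# Publish
infsh app deploy
```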
Example Prompts
```
Create an image resizing app that takes an image and dimensions,
and returns the resized image.
```
```
Build an LLM app that calls OpenAI's API.
It should track token usage for billing.
```
```
Make a YouTube audio downloader that extracts audio from videos
and returns MP3 files.
```
Supported Agents
The skills work with any AI coding assistant that supports project-level context:
- Claude Code — Reads skills from `.claude/skills/` or project files
- Cursor — Uses project context
- Windsurf — Uses project context
- GitHub Copilot — Uses project context
Next
→ CLI Setup — Manual CLI installation
→ Creating an App — Manual app creation