Scaffold a new app with the CLI.
## Create App
```bash
# Python app (default)
infsh app init my-app

# Node.js app
infsh app init my-app --lang node
```

Or interactive mode:
```bash
infsh app init
```

## What's Created
**Python**

```
my-app/
├── inf.yml             # Configuration
├── inference.py        # Your code
├── requirements.txt    # Python dependencies
├── packages.txt        # System dependencies (optional)
├── skills/             # AI coding agent guidance
└── README.md
```

**Node.js**

```
my-app/
├── inf.yml             # Configuration
├── src/
│   └── inference.js    # Your code
├── package.json        # Node.js dependencies
├── packages.txt        # System dependencies (optional)
├── skills/             # AI coding agent guidance
└── README.md
```

### Files Explained
| File | Python | Node.js | Purpose |
|---|---|---|---|
| `inf.yml` | Yes | Yes | App settings, resources |
| `inference.py` | Yes | — | App logic (Python) |
| `src/inference.js` | — | Yes | App logic (Node.js) |
| `requirements.txt` | Yes | — | Python packages (pip) |
| `package.json` | — | Yes | Node.js packages (npm/pnpm) |
| `packages.txt` | Yes | Yes | System packages (apt), optional |
| `skills/` | Yes | Yes | Guidance for AI coding assistants |
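Because the two scaffolds differ only in which entrypoint file they contain, you can tell them apart programmatically; here is a small sketch based on the table above (the `app_lang` helper is hypothetical, not part of the `infsh` CLI):

```shell
# Detect which scaffold a directory contains, based on the files listed above.
# app_lang is a hypothetical helper, not part of the infsh CLI.
app_lang() {
  if [ -f "$1/inference.py" ]; then
    echo python
  elif [ -f "$1/src/inference.js" ]; then
    echo node
  else
    echo unknown
  fi
}
```

After a default `infsh app init my-app`, `app_lang my-app` would print `python`; with `--lang node` it would print `node`.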
## Dependencies
### App Packages

Declare Python packages in `requirements.txt` (pip) or Node.js packages in `package.json` (npm/pnpm). For example:
```
torch>=2.0
transformers
accelerate
```
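`requirements.txt` uses standard pip syntax, so comments and blank lines are allowed; a quick sketch for previewing the effective install list (the sample file path is illustrative):

```shell
# Write a sample requirements file (illustrative; your scaffold already has one)
cat > /tmp/requirements.txt <<'EOF'
# ML stack
torch>=2.0
transformers
accelerate
EOF

# Drop comments and blank lines to see exactly what pip will install
grep -vE '^[[:space:]]*(#|$)' /tmp/requirements.txt
# → torch>=2.0
#   transformers
#   accelerate
```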
### System Packages (packages.txt)

For system-level dependencies (installed via apt):
```
ffmpeg
libgl1-mesa-glx
```
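Inside the container, each line of `packages.txt` becomes an apt package to install; you can preview the equivalent command locally (a sketch: the real build does this for you, and `xargs -a` assumes GNU xargs):

```shell
# Sample packages.txt matching the example above
printf '%s\n' ffmpeg libgl1-mesa-glx > /tmp/packages.txt

# Preview the apt command the container build would effectively run
xargs -a /tmp/packages.txt echo sudo apt-get install -y
# → sudo apt-get install -y ffmpeg libgl1-mesa-glx
```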
## Base Images

Apps run in containers with these base images:
| Type | Image | Use Case |
|---|---|---|
| GPU | `docker.inference.sh/gpu:latest-cuda` | CUDA GPU apps |
| CPU | `docker.inference.sh/cpu:latest` | CPU-only apps |
**Note:** Currently only NVIDIA CUDA GPUs are supported for GPU apps.
## Next
→ App Code