
Running Apps

Execute AI apps on inference.sh.


Basic Usage

```python
from inferencesh import inference

client = inference(api_key="inf_your_key")

result = client.run({
    "app": "infsh/flux",
    "input": {"prompt": "A sunset over mountains"}
})

print(f"Task ID: {result['id']}")
print(f"Output: {result['output']}")
```

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `app` | string | App identifier (`namespace/name` or `namespace/name@version`) |
| `input` | object | Input matching the app schema |
| `setup` | object | Setup parameters (affects worker warmth) |
| `infra` | `'cloud' \| 'private'` | Infrastructure type |
| `variant` | string | App variant |
| `workers` | string[] | Specific worker IDs (for private infra) |
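The `app` field follows a `namespace/name[@version]` format. A minimal sketch of splitting such an identifier into its parts (the `parse_app_id` helper is illustrative, not part of the SDK):

```python
# Illustrative helper (not part of the SDK): split an app identifier
# of the form "namespace/name" or "namespace/name@version" into parts.
def parse_app_id(app: str):
    path, _, version = app.partition("@")
    namespace, _, name = path.partition("/")
    return namespace, name, version or None

print(parse_app_id("infsh/flux"))        # ('infsh', 'flux', None)
print(parse_app_id("infsh/flux@1.2.0"))  # ('infsh', 'flux', '1.2.0')
```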

Setup Parameters

Setup parameters configure the app instance. Workers whose setup matches your request are "warm" and start faster, since they skip re-initialization:

```python
result = client.run({
    "app": "infsh/flux",
    "setup": {"model": "schnell"},
    "input": {"prompt": "A sunset"}
})
```

Private Workers

Run on your own infrastructure:

```python
result = client.run({
    "app": "my-team/my-app",
    "input": {...},
    "infra": "private",
    "workers": ["worker-id-1"]  # Optional: specific workers
})
```

Task Status

```python
from inferencesh import TaskStatus

TaskStatus.QUEUED      # 2 - Waiting
TaskStatus.RUNNING     # 7 - Executing
TaskStatus.CANCELLING  # 8 - Cancelling
TaskStatus.COMPLETED   # 10 - Done
TaskStatus.FAILED      # 11 - Error
TaskStatus.CANCELLED   # 12 - Cancelled
```
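The numeric codes above distinguish in-flight states from terminal ones. A minimal sketch of a terminal-state check, using plain integers that mirror the listed `TaskStatus` values (in real code, compare against the SDK enum instead):

```python
# Status codes mirroring TaskStatus above (illustrative; prefer the SDK enum).
QUEUED, RUNNING, CANCELLING = 2, 7, 8
COMPLETED, FAILED, CANCELLED = 10, 11, 12

# COMPLETED, FAILED, and CANCELLED are terminal: the task cannot change state again.
TERMINAL = {COMPLETED, FAILED, CANCELLED}

def is_terminal(status: int) -> bool:
    """Return True once a task has finished (successfully or not)."""
    return status in TERMINAL

print(is_terminal(RUNNING))    # False
print(is_terminal(COMPLETED))  # True
```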

Next Steps

  • Streaming — Real-time progress updates
  • Files — File uploads and downloads
