# Building Web Endpoints
Targon makes it easy to deploy HTTP-facing services by decorating your functions. This guide covers the available patterns and shows how to adapt the examples from the SDK.
## FastAPI Endpoint
Expose a single handler with automatic request parsing and optional documentation.
```python
# targon-sdk/examples/getting_started/web_endpoint_simple.py
@app.function()
@targon.fastapi_endpoint(method="GET", docs=True)
def greet(name: str = "World"):
    return {"greeting": f"Hello, {name}!"}
```
- `docs=True` enables Swagger UI at `/greet/docs`.
- Add `requires_auth=True` to gate the endpoint behind API keys.
Run locally:
```bash
targon run .../web_endpoint_simple.py --name Alice
```
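Once deployed, the function is reachable as a plain HTTP endpoint. The sketch below is a hypothetical client: the base URL is a placeholder for whatever your deployment output reports, and the `/greet` route is inferred from the `/greet/docs` path mentioned above.

```python
import requests

# Placeholder URL; substitute the one printed when you deploy the function.
BASE_URL = "https://example-targon-deployment.example.com"

# GET /greet?name=Alice is parsed into greet(name="Alice") by FastAPI.
resp = requests.get(f"{BASE_URL}/greet", params={"name": "Alice"})
print(resp.json())  # {"greeting": "Hello, Alice!"}
```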
## Full ASGI Application
Deploy a complete ASGI application (FastAPI, Starlette, Quart) by returning the app instance from the decorated function.
```python
# targon-sdk/examples/web/web_endpoint_asgi.py
@app.function()
@targon.asgi_app(label="api")
def hello():
    from fastapi import FastAPI

    api = FastAPI(title="Targon Demo API")

    @api.get("/")
    def root():
        return {"message": "Welcome to Targon ASGI Demo API"}

    return api
```
Tips:
- Add multiple routes, middleware, or dependency injection as needed (see the sketch after this list).
- Use the optional `label` to group endpoints in the dashboard.
- Set `requires_auth=True` to protect the entire app.
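For the first tip, here is a sketch of how the same `hello()` function can register middleware and extra routes before returning the app. Everything inside the function body is standard FastAPI; only the decorators are Targon-specific.

```python
@app.function()
@targon.asgi_app(label="api")
def hello():
    from fastapi import FastAPI
    from fastapi.middleware.cors import CORSMiddleware

    api = FastAPI(title="Targon Demo API")

    # Ordinary FastAPI middleware works unchanged inside the Targon function.
    api.add_middleware(CORSMiddleware, allow_origins=["*"], allow_methods=["*"])

    @api.get("/")
    def root():
        return {"message": "Welcome to Targon ASGI Demo API"}

    # Additional routes, including ones with path parameters, register the same way.
    @api.get("/items/{item_id}")
    def read_item(item_id: int):
        return {"item_id": item_id}

    return api
```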
## Custom Web Server Process
For frameworks that manage their own servers (Uvicorn, Gradio, vLLM), wrap the startup command with `@targon.web_server`.
```python
# targon-sdk/examples/llm/vllm_example.py
@app.function(resource="h200-small", max_replicas=1)
@targon.web_server(port=8080)
def serve():
    import subprocess

    cmd = [
        "vllm",
        "serve",
        "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
        "--host",
        "0.0.0.0",
        "--port",
        "8080",
    ]
    # Launch vLLM in the background; it must listen on the port declared above.
    subprocess.Popen(" ".join(cmd), shell=True)
```
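Once the server is up, Targon routes HTTP traffic to the declared port, so the wrapped process can be queried directly. The sketch below hits vLLM's OpenAI-compatible chat route; the base URL is a placeholder for your deployment's address.

```python
import requests

BASE_URL = "https://example-targon-deployment.example.com"  # placeholder; use your deploy output

# vLLM's server speaks the OpenAI-compatible API, so /v1/chat/completions works as-is.
resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    json={
        "model": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```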
Guidelines:
- Ensure the process listens on the declared port.
- Adjust `startup_timeout` if the server needs longer to boot.
- Combine with GPU tiers via `resource=Compute.H200_SMALL` (a sketch combining both follows this list).
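Putting the guidelines together, the decorator stack might look like the following sketch. Treat the `startup_timeout` keyword placement and its value as assumptions drawn from the guideline above; check the Web Decorators reference for the authoritative signature.

```python
@app.function(resource=targon.Compute.H200_SMALL, max_replicas=1)
@targon.web_server(port=8080, startup_timeout=300)  # assumed kwarg: allow slow model loads
def serve():
    import subprocess

    # The launched process must bind 0.0.0.0:8080 to match the declared port.
    subprocess.Popen(
        "vllm serve TinyLlama/TinyLlama-1.1B-Chat-v1.0 --host 0.0.0.0 --port 8080",
        shell=True,
    )
```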
## Authentication Options
All web decorators accept `requires_auth`. When enabled, requests must include a valid Targon API key via the `Authorization` header. Use this for internal services or to restrict usage to trusted clients.
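A minimal client sketch, assuming the key lives in an environment variable and is sent as a bearer token; the exact header scheme Targon expects is an assumption, so consult the API reference if requests are rejected.

```python
import os
import requests

API_KEY = os.environ["TARGON_API_KEY"]  # assumed variable name, for illustration only
BASE_URL = "https://example-targon-deployment.example.com"  # placeholder deployment URL

resp = requests.get(
    f"{BASE_URL}/greet",
    params={"name": "Alice"},
    headers={"Authorization": f"Bearer {API_KEY}"},  # Bearer scheme is an assumption
)
resp.raise_for_status()
print(resp.json())
```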
## Next Steps
- Explore the Web Decorators API reference for parameter details.
- For LLM-specific deployments, continue with the LLM deployment guide.
- Review `targon.Compute` to pick the right resource tier for your endpoint.