Creating Serverless Apps
Deploy autoscaling applications without managing infrastructure. Targon Serverless scales your app from zero to peak traffic automatically, so you only pay for the compute you use.
Quick Start (Dashboard)
The easiest way to launch an app is through the Targon Dashboard.
- Create App: Navigate to Serverless > New App.
- Select Hardware: Pick the CPU or GPU tier that fits your workload.
- Choose a Starting Point:
  - Public Templates: Ready-to-use apps pre-configured by Targon.
  - Custom: Deploy your own image.
Custom Configuration
If you choose Custom, you will need to provide the Image, Port, and optional Command.
See the configuration reference for details on arguments, environment variables, and auto-scaling.
- Image: The Docker image URL (e.g., python:3.11 or ghcr.io/my-org/my-app).
- Port: The internal port your app listens on (Targon routes public traffic here).
- Command (Optional): The startup command (e.g., python main.py).
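To make the Port setting concrete, here is a minimal sketch of an app that listens on an internal port (8000 is an assumed value, not a Targon default). Targon would route public traffic to whatever port you configure here.

```python
# Minimal HTTP app listening on an internal port (hypothetical port 8000).
# This is an illustrative sketch, not a Targon-provided template.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET with a small JSON body.
        body = b'{"status": "ok"}'
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run(port: int = 8000) -> None:
    # Bind to 0.0.0.0 so traffic from outside the container can reach the app.
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```

Whatever port you pass to run() is the value to enter in the Port field.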
Infrastructure as Code (SDK)
For production workflows, define your infrastructure in Python alongside your application code. This ensures your deployment is reproducible and version-controlled.
# targon-sdk/examples/getting_started/getting_started.py
import targon
# Define the environment
app = targon.App("getting-started", image=targon.Image.debian_slim("3.12"))
# Define the endpoint
@app.function()
def hello(message: str) -> dict[str, str]:
    return {"message": message, "status": "success"}
Deployment Workflow
- Test Locally: targon run main.py
- Deploy to Cloud: targon deploy main.py
- Manage: targon app list
Next Steps
- Compute Resources: Compare CPU vs. GPU options.
- Web Endpoints: Learn how to host FastAPI, Flask, or other web servers.
- LLM Deployment: Run large language models serverlessly.