Guides
Local Development
This guide shows how to develop and run a Nobs project entirely on your local machine. When you start your local environment with `nobs up`, Nobs automatically provisions all required resources, including PostgreSQL, Redis, NATS, and any worker or app containers. It also mirrors the compute limits you define for the cloud environment, making local development representative of production.
The project definition
Assume you have the following project definition at `project.py`:
```python
from nobs.models import Compute, Project, MlflowServer, StreamlitApp, Worker
from nobs.secrets import MlflowConfig, S3StorageConfig

from src.pokemon_app import main

background = Worker("background")

project = Project(
    name="mlops-example",
    shared_secrets=[S3StorageConfig],
    workers=[background],
    mlflow_server=MlflowServer(
        domain_names=[
            "example.aligned.codes",
        ],
    ),
    is_legendary_app=StreamlitApp(
        main,
        secrets=[MlflowConfig],
        compute=Compute(
            mb_memory_limit=4 * 1024,
        ),
    ),
)
```
Any jobs, pub/sub workers, or model services you define here will be launched automatically in local mode.
Start the Local Environment
Run:
```shell
nobs up
```
Nobs will:
- Build and start all services
- Create local infrastructure (PostgreSQL, NATS, Redis, S3 Storage, etc.)
- Enable hot-reloading for Python app containers (FastAPI, Streamlit, etc.)
- Launch pub/sub workers and attach them to their subscriptions
- Allocate compute limits exactly as defined, matching what would run in the cloud
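To make the compute mirroring concrete: the `mb_memory_limit=4 * 1024` setting in the project definition above caps the app at 4096 MB both locally and in the cloud. The sketch below illustrates how such a megabyte limit could map onto a Docker-style memory flag; the helper and the flag mapping are illustrative assumptions, not part of the Nobs API:

```python
def memory_limit_flag(mb_memory_limit: int) -> str:
    """Translate a megabyte memory limit into a Docker-style --memory value.

    Illustrative only: this helper is not part of the Nobs API.
    """
    return f"--memory={mb_memory_limit}m"

# The StreamlitApp above requests 4 * 1024 MB:
print(memory_limit_flag(4 * 1024))  # --memory=4096m
```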
Example Output
A typical startup log might look like:
```
INFO:nobs.cli:Updating source code from /Users/john/dev/mlops-example/src
INFO:nobs.docker:Creating app mlflow_server
INFO:nobs.docker:Creating app is_legendary_app
INFO:nobs.docker:Available Containers:
INFO:nobs.docker:Container mlflow_server (4ee7deab4) is accessible at http://localhost:8000
INFO:nobs.docker:Container is_legendary_app (bad32a5ba) is accessible at http://localhost:8501
[is_legendary_app] You can now view your Streamlit app in your browser.
[is_legendary_app] Local URL: http://localhost:8501
[mlflow_server] [INFO] Starting gunicorn...
```
Pub/Sub Workers in Local Mode
If your project includes Pub/Sub subscribers, Nobs will automatically start workers that listen for events. For example, if you have:
```python
project = Project(
    ...
    verify_email=on_user_created.subscriber(send_verify_email),
)
```
Nobs launches a worker that processes messages from the on_user_created subject. As soon as messages are published, your subscriber handler functions will execute locally.
All you have to do is send an event:
```python
await on_user_created.publish(User(...))
```
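The subject-to-handler pattern above can be sketched with a minimal in-memory dispatcher. This is a stand-in for illustration only, not the Nobs API: `Subject`, `User`, and `send_verify_email` here are assumed names modeling the same publish/subscribe flow.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class User:
    email: str


class Subject:
    """Minimal in-memory stand-in for a pub/sub subject (illustrative only)."""

    def __init__(self, name: str):
        self.name = name
        self.handlers = []

    def subscriber(self, handler):
        # Register a handler; in Nobs this would attach a worker to the subject.
        self.handlers.append(handler)
        return handler

    async def publish(self, message):
        # Fan the message out to every registered handler.
        for handler in self.handlers:
            await handler(message)


sent = []
on_user_created = Subject("on_user_created")


@on_user_created.subscriber
async def send_verify_email(user: User):
    sent.append(user.email)


asyncio.run(on_user_created.publish(User(email="ada@example.com")))
print(sent)  # ['ada@example.com']
```

In local mode, Nobs plays the role of the dispatcher: it starts a worker per subscription and routes published messages to your handler functions.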
Updating Code While Running
The dev environment supports hot reloading. Any change in `src/` is synced into running containers automatically, and containers restart when needed, similar to how `uvicorn --reload` behaves.
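Conceptually, this kind of reload loop amounts to watching file modification times and restarting when they change. A minimal sketch of that idea (not the actual Nobs sync mechanism) might look like:

```python
import time
from pathlib import Path


def snapshot_mtimes(root: str) -> dict:
    """Record the modification time of every Python file under root."""
    return {str(p): p.stat().st_mtime for p in Path(root).rglob("*.py")}


def changed_files(before: dict, after: dict) -> list:
    """Return paths that were added or modified between two snapshots."""
    return [path for path, mtime in after.items() if before.get(path) != mtime]


# Usage sketch: poll src/ and act when something changes.
# before = snapshot_mtimes("src")
# while True:
#     time.sleep(1)
#     after = snapshot_mtimes("src")
#     if changed_files(before, after):
#         ...  # sync code into the container and restart it
#     before = after
```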
Stopping the Environment
Use:
```shell
nobs down
```
This stops and removes the local containers while keeping volumes intact unless configured otherwise.
Summary
Running locally with Nobs provides:
- Automatic provisioning of infrastructure
- Hot-reloading developer experience
- Local workers for pub/sub subscriptions
- Local model servers, Streamlit apps, FastAPI apps, etc.
- Same compute configuration as cloud deployments
- A single unified command: `nobs up`
Once you're happy with your project locally, you can deploy with:
```shell
nobs deploy
```