07 - Extending Triggers (Consumption)¶
Extend your Consumption (Y1) app with queue and blob triggers. On this plan, scaling is app-level (not per-function), and the standard blob trigger polling model works well.
Prerequisites¶
| Tool | Version | Purpose |
|---|---|---|
| Azure CLI | 2.61+ | Create storage queues/containers and test messages |
| Azure Functions Core Tools | v4 | Publish updated functions |
| Function App | Consumption (Y1) | Existing deployed app |
What You'll Build¶
You will extend the existing blueprint-based Python app with queue and blob triggers, publish the update, and validate trigger execution from live logs.
Infrastructure Context
Plan: Consumption (Y1) | Network: Public internet only | VNet: ❌ Not supported
Consumption has no VNet integration or private endpoint support. All traffic flows over the public internet. Storage uses connection string authentication.
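For local runs, the `AzureWebJobsStorage` connection the bindings reference must also exist in your local settings. A minimal local.settings.json sketch (assuming the Azurite local storage emulator; this file is local-only and is not deployed):

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true"
  }
}
```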
flowchart TD
INET[Internet] -->|HTTPS| FA[Function App\nConsumption Y1\nLinux Python 3.11]
FA -->|System-Assigned MI| ENTRA[Microsoft Entra ID]
FA -->|"AzureWebJobsStorage__accountName\n+ connection string"| ST[Storage Account\npublic access]
FA --> AI[Application Insights]
subgraph STORAGE[Storage Services]
ST --- FS[Azure Files\ncontent share]
end
NO_VNET["⚠️ No VNet integration\nNo private endpoints"] -. limitation .- FA
style FA fill:#0078d4,color:#fff
style NO_VNET fill:#FFF3E0,stroke:#FF9800
style STORAGE fill:#FFF3E0
flowchart LR
A["Add queue/blob blueprints"] --> B["Register in apps/python/function_app.py"]
B --> C[Publish to Consumption app]
C --> D[Send queue message and upload blob]
D --> E[Validate trigger logs]
Steps¶
Step 1 - Set variables¶
export RG="rg-func-consumption-demo"
export APP_NAME="func-consumption-demo-001"
export STORAGE_NAME="stconsumptiondemo001"
export LOCATION="koreacentral"
Step 2 - Add a queue trigger¶
import logging
import azure.functions as func
bp = func.Blueprint()
@bp.function_name(name="queue_worker")
@bp.queue_trigger(arg_name="msg", queue_name="work-items", connection="AzureWebJobsStorage")
def queue_worker(msg: func.QueueMessage) -> None:
payload = msg.get_body().decode("utf-8")
logging.info("Queue item processed: %s", payload)
Save this in apps/python/blueprints/queue_blob_worker.py, then register it in apps/python/function_app.py with:
from blueprints.queue_blob_worker import bp as queue_blob_worker_bp
app.register_blueprint(queue_blob_worker_bp)
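Because `queue_worker` only decodes the message body and logs it, that logic can be sanity-checked without the Functions host. A minimal sketch using a hypothetical stand-in for `func.QueueMessage`:

```python
class FakeQueueMessage:
    """Hypothetical stand-in for azure.functions.QueueMessage (test use only)."""

    def __init__(self, body: bytes) -> None:
        self._body = body

    def get_body(self) -> bytes:
        return self._body


# Mirror the decode step performed inside queue_worker.
msg = FakeQueueMessage(b'{"id":"1001","action":"reindex"}')
payload = msg.get_body().decode("utf-8")
print(payload)  # → {"id":"1001","action":"reindex"}
```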
Step 3 - Add a blob trigger (standard polling)¶
@bp.function_name(name="blob_worker")
@bp.blob_trigger(arg_name="blob", path="uploads/{name}", connection="AzureWebJobsStorage")
def blob_worker(blob: func.InputStream) -> None:
logging.info("Blob processed: %s (%s bytes)", blob.name, blob.length)
Append this trigger to the same blueprint file (apps/python/blueprints/queue_blob_worker.py). The standard polling blob trigger is supported on Consumption, though when the app has scaled to zero a new blob can take several minutes to be noticed. An Event Grid-based blob trigger is an optional upgrade for low-latency, event-driven routing scenarios.
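The {name} token in the path is a binding expression resolved by the Functions host. As an illustration only (not the extension's actual implementation), the mapping behaves roughly like this sketch:

```python
import re


def resolve_binding(path_pattern: str, blob_path: str):
    """Mimic how a trigger path like 'uploads/{name}' captures the blob name."""
    # Turn "uploads/{name}" into the named-group regex "uploads/(?P<name>.+)".
    regex = "^" + re.sub(r"\{(\w+)\}", r"(?P<\1>.+)", path_pattern) + "$"
    match = re.match(regex, blob_path)
    return match.groupdict() if match else None


print(resolve_binding("uploads/{name}", "uploads/sample.txt"))  # → {'name': 'sample.txt'}
```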
Step 4 - Publish changes¶
func azure functionapp publish "$APP_NAME" --build remote
Use --build remote on Linux Consumption
The --build remote flag is required on Linux Consumption so that Python dependencies from requirements.txt are resolved and installed on the server during deployment. Without it, the publish may fail or produce an app with missing packages.
Step 5 - Send queue message and upload blob¶
az storage queue create \
--name "work-items" \
--account-name "$STORAGE_NAME" \
--auth-mode login
az storage message put \
--queue-name "work-items" \
--content '{"id":"1001","action":"reindex"}' \
--account-name "$STORAGE_NAME" \
--auth-mode login
az storage container create \
--name "uploads" \
--account-name "$STORAGE_NAME" \
--auth-mode login
az storage blob upload \
--container-name "uploads" \
--name "sample.txt" \
--file "apps/python/host.json" \
--account-name "$STORAGE_NAME" \
--auth-mode login
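If queue_worker reports a message-decoding error instead of logging the payload, check message encoding: the Storage queues extension expects base64-encoded messages by default (extensions.queues.messageEncoding in host.json), while az storage message put sends --content as-is. A sketch for pre-encoding the payload before passing the result to --content:

```python
import base64
import json

# Build the same test payload used above, then base64-encode it so the
# queue trigger's default (base64) message encoding can decode it.
payload = json.dumps({"id": "1001", "action": "reindex"})
encoded = base64.b64encode(payload.encode("utf-8")).decode("ascii")
print(encoded)
```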
Step 6 - Confirm trigger activity¶
Stream live logs while the test message and blob are processed:
az webapp log tail --name "$APP_NAME" --resource-group "$RG"
Consumption scaling reminder:
- Scale-to-zero when idle.
- App-level scaling up to 100 instances on Linux Consumption.
- Queue and blob workloads scale together at app scope; there is no per-function scaling.
- Default timeout is 5 minutes, maximum 10 minutes.
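Within a single instance, queue throughput can also be tuned in apps/python/host.json. A sketch showing the queues extension's documented defaults, which you can adjust per workload:

```json
{
  "version": "2.0",
  "extensions": {
    "queues": {
      "batchSize": 16,
      "newBatchThreshold": 8,
      "maxDequeueCount": 5,
      "visibilityTimeout": "00:00:00"
    }
  }
}
```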
Not available on Consumption
VNet integration requires Flex Consumption, Premium, or Dedicated plan.
Not available on Consumption
Private endpoints require Flex Consumption, Premium, or Dedicated plan.
Verification¶
Publish output excerpt:
Deployment successful.
Functions in func-consumption-demo-001:
queue_worker - [queueTrigger]
blob_worker - [blobTrigger]
Log stream excerpt:
Executing 'Functions.queue_worker' (Reason='New queue message detected on work-items.', Id=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)
Queue item processed: {"id":"1001","action":"reindex"}
Executed 'Functions.queue_worker' (Succeeded, Duration=42ms)
Executing 'Functions.blob_worker' (Reason='New blob detected(uploads/sample.txt)', Id=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)
Blob processed: uploads/sample.txt (1234 bytes)
Executed 'Functions.blob_worker' (Succeeded, Duration=58ms)
Next Steps¶
You completed the Consumption tutorial track. Continue with core runtime concepts.
Next: How Functions Works
See Also¶
- Tutorial Overview & Plan Chooser
- Python Language Guide
- Platform: Hosting Plans
- Operations: Deployment
- Recipes Index