07 - Extending Triggers (Premium)¶
Add non-HTTP triggers to a Premium Function App, focusing on Blob polling and production-safe trigger behavior on always-warm Elastic Premium instances.
Prerequisites¶
- You completed 06 - CI/CD.
- You exported `$RG`, `$APP_NAME`, `$PLAN_NAME`, `$STORAGE_NAME`, `$LOCATION`.
- Your app settings include `AzureWebJobsStorage` (connection string or identity-based).
What You'll Build¶
- Two additional triggers (Timer and Blob) using Python blueprints.
- Updated blueprint registration in `apps/python/function_app.py`.
- A trigger validation flow using blob upload and live log checks.
Infrastructure Context
Plan: Premium (EP1) | Network: VNet + Private Endpoints | Always warm: ✅
Premium deploys with VNet integration (delegated subnet), a private endpoint for inbound access, private DNS zone, and pre-warmed instances. Storage uses connection string or identity-based authentication.
```mermaid
flowchart TD
    INET[Internet] -->|HTTPS| FA[Function App\nPremium EP1\nLinux Python 3.11]
    subgraph VNET["VNet 10.0.0.0/16"]
        subgraph INT_SUB["Integration Subnet 10.0.1.0/24\nDelegation: Microsoft.Web/serverFarms"]
            FA
        end
        subgraph PE_SUB["Private Endpoint Subnet 10.0.2.0/24"]
            PE_BLOB[PE: blob]
            PE_QUEUE[PE: queue]
            PE_TABLE[PE: table]
            PE_FILE[PE: file]
        end
    end
    PE_BLOB --> ST["Storage Account\nallowPublicAccess: false\nallowSharedKeyAccess: true"]
    PE_QUEUE --> ST
    PE_TABLE --> ST
    PE_FILE --> ST
    subgraph DNS[Private DNS Zones]
        DNS_BLOB[privatelink.blob.core.windows.net]
        DNS_QUEUE[privatelink.queue.core.windows.net]
        DNS_TABLE[privatelink.table.core.windows.net]
        DNS_FILE[privatelink.file.core.windows.net]
    end
    PE_BLOB -.-> DNS_BLOB
    PE_QUEUE -.-> DNS_QUEUE
    PE_TABLE -.-> DNS_TABLE
    PE_FILE -.-> DNS_FILE
    FA -.->|System-Assigned MI| ENTRA[Microsoft Entra ID]
    FA --> AI[Application Insights]
    subgraph STORAGE[Content Backend]
        SHARE[Azure Files\ncontent share]
    end
    ST --- SHARE
    WARM["🔥 Pre-warmed instances\nMin: 1, Max: 20-100"] -.- FA
    style FA fill:#ff8c00,color:#fff
    style VNET fill:#E8F5E9,stroke:#4CAF50
    style ST fill:#FFF3E0
    style DNS fill:#E3F2FD
    style WARM fill:#FFF3E0,stroke:#FF9800
```

```mermaid
flowchart LR
    A[Add timer blueprint] --> B[Add blob blueprint]
    B --> C[Register in function_app.py]
    C --> D[Upload test blob]
    D --> E[Publish and verify logs]
```

Steps¶
- Add a Timer trigger blueprint.

```python
# apps/python/blueprints/scheduled.py
import azure.functions as func
import logging

bp = func.Blueprint()

@bp.timer_trigger(schedule="0 */5 * * * *", arg_name="timer", run_on_startup=False)
def scheduled_cleanup(timer: func.TimerRequest) -> None:
    if timer.past_due:
        logging.warning("Timer trigger is past due")
    logging.info("Scheduled cleanup executed")
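The timer schedule uses the six-field NCRONTAB format (`second minute hour day month day-of-week`), so `0 */5 * * * *` fires at second 0 of every fifth minute. A minimal sketch of how the `*/5` minute field expands (pure Python, no Functions dependency):

```python
# Interpret the minute field of the NCRONTAB expression "0 */5 * * * *".
# Field order is: second minute hour day month day-of-week.
expr = "0 */5 * * * *"
second, minute, hour, day, month, dow = expr.split()

step = int(minute.split("/")[1])  # "*/5" -> a step of 5 minutes
fire_minutes = [m for m in range(60) if m % step == 0]

# The timer fires 12 times per hour, at minutes 0, 5, 10, ..., 55.
print(fire_minutes)
```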
- Add a standard polling Blob trigger (supported on Premium).

```python
# apps/python/blueprints/blob_processor.py
import azure.functions as func
import logging

bp = func.Blueprint()

@bp.blob_trigger(arg_name="blob", path="uploads/{name}", connection="AzureWebJobsStorage")
def process_blob(blob: func.InputStream) -> None:
    logging.info("Processing blob: %s", blob.name)
    logging.info("Blob size: %d bytes", blob.length)
```

On Premium, the polling blob trigger works by default; the Event Grid blob trigger is optional when you need lower-latency eventing.
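Before publishing, the handler's logic can be smoke-tested locally by calling it with a stand-in for `func.InputStream`. The `FakeBlob` class below is a hypothetical stub (not part of the Functions SDK), sketched so the test needs no Azure connection or running host:

```python
import io
import logging

class FakeBlob:
    """Hypothetical stand-in for func.InputStream: exposes name, length, read()."""
    def __init__(self, name: str, data: bytes):
        self.name = name
        self.length = len(data)
        self._stream = io.BytesIO(data)

    def read(self, size: int = -1) -> bytes:
        return self._stream.read(size)

def process_blob(blob) -> None:
    # Mirrors the blueprint handler's body, minus the trigger decorator.
    logging.info("Processing blob: %s", blob.name)
    logging.info("Blob size: %d bytes", blob.length)

blob = FakeBlob("uploads/sample.txt", b"hello premium blob\n")
process_blob(blob)
print(blob.length)  # 19 bytes, matching the sample file uploaded later
```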
- Register the new blueprints in `apps/python/function_app.py`.
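Registration in `apps/python/function_app.py` might look like the sketch below; the module paths and `bp` export names are assumptions about your project layout, and any existing HTTP blueprint registrations should be kept alongside these:

```python
# apps/python/function_app.py (sketch; keep your existing registrations)
import azure.functions as func

from blueprints.scheduled import bp as scheduled_bp
from blueprints.blob_processor import bp as blob_bp

app = func.FunctionApp()
app.register_blueprint(scheduled_bp)
app.register_blueprint(blob_bp)
```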
- Create containers and upload a blob for trigger testing.
```bash
az storage container create \
  --name "uploads" \
  --account-name "$STORAGE_NAME" \
  --auth-mode login

az storage container create \
  --name "processed" \
  --account-name "$STORAGE_NAME" \
  --auth-mode login

python3 -c "from pathlib import Path; Path('/tmp/sample.txt').write_text('hello premium blob\n', encoding='utf-8')"

az storage blob upload \
  --container-name "uploads" \
  --name "sample.txt" \
  --file "/tmp/sample.txt" \
  --account-name "$STORAGE_NAME" \
  --auth-mode login
```
- Publish updated code to Premium.
- Verify trigger execution with the log stream.
- Validate Premium trigger behavior.
- Pre-warmed instances keep the host warm, so trigger cold starts are largely eliminated.
- Scale remains plan-level for Premium apps (not per-function).
- Standard blob polling trigger is supported; Event Grid trigger remains optional.
- Account for Premium timeout defaults in long-running jobs: `functionTimeout` defaults to 30 minutes, with no upper limit when raised.
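The timeout can be raised in `host.json`; the snippet below is an illustrative sketch (the one-hour value is an example, not a recommendation):

```json
{
  "version": "2.0",
  "functionTimeout": "01:00:00"
}
```

On Premium and Dedicated plans, setting `functionTimeout` to `-1` removes the limit entirely.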
Verification¶
```text
2026-01-01T00:05:00.000 [Information] Executing 'Functions.scheduled_cleanup' (Reason='Timer fired', Id=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)
2026-01-01T00:05:00.120 [Information] Executed 'Functions.scheduled_cleanup' (Succeeded, Id=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx, Duration=120ms)
2026-01-01T00:05:10.000 [Information] Processing blob: uploads/sample.txt
```
Next Steps¶
See Also¶
- Tutorial Overview & Plan Chooser
- Python Language Guide
- Platform: Hosting Plans
- Operations: Deployment
- Recipes Index