Event-Driven Triggers
Martha can automatically start workflows when events happen in the system. Upload a document, and a compliance analysis workflow runs. Ingestion fails, and a notification workflow fires. No polling, no cron jobs -- events flow through Redis Streams and the trigger dispatcher matches them to workflows in real time.
Concepts
Events
Every significant action in Martha emits a CloudEvent -- a standard envelope with a type, source, timestamp, and data payload.
Built-in event types:
| Type | When it fires |
|---|---|
| document.uploaded | A document is uploaded to a collection |
| document.ingested | The ingestion pipeline completes successfully |
| document.ingestion_failed | The ingestion pipeline fails |
| workflow.completed | A workflow execution finishes successfully |
| workflow.failed | A workflow execution fails |
| approval.created | An approval case is created (workflow gate or agent escalation) |
| approval.resolved | An approval case is approved or rejected |
| webhook.received | An external webhook POST is received |
| schedule.fired | A scheduled trigger fires on its configured cron schedule |
| deadman.fired | An expected event did not occur within the configured time window |
| chat.session.created | A chat session is created via the API |
| chat.session.ended | A chat session workflow completes or times out |
| chat.tool.called | An agent invokes a tool during a chat session |
| chat.tool.completed | A tool execution succeeds during a chat session |
| chat.tool.failed | A tool execution fails during a chat session |
| agent.loop.started | An agent loop begins execution |
| agent.loop.iteration | Each agent reasoning cycle (opt-in via emit_iteration_events) |
| agent.loop.completed | An agent loop finishes successfully |
| agent.loop.failed | An agent loop errors or exceeds its iteration/token budget |
| definition.created | A function, workflow, or agent definition is created |
| definition.updated | A definition is updated (includes changed_fields in payload) |
| definition.deleted | A definition is deleted |
Custom events can be emitted by platform functions, workflows, or the manual emit API. Tenants can also define their own event types with descriptions and JSON Schema validation via the custom event types API (see Custom Event Types below).
Browse all event types, their sample payloads, and which triggers listen to them on the Events page in the admin UI (under AUTOMATIONS).
Triggers
A trigger is a rule: "when event X happens, start workflow Y with these inputs." Triggers are tenant-scoped definitions, managed through the API or admin UI.
A trigger has:
- Event type -- which events to listen for (exact match or wildcard like `document.*`)
- Event filter -- optional conditions on event data fields
- Target workflow -- the workflow to start when the trigger fires
- Input mapping -- how to transform event data into workflow inputs
Fan-out
One event can match multiple triggers. Each trigger starts its own independent workflow execution. This is by design -- compose automation from independent, reusable pieces.
Creating Triggers
Via Admin UI
Navigate to Triggers in the sidebar (under AUTOMATIONS). Click New Trigger.
| Field | Description |
|---|---|
| Name | Unique identifier (lowercase, hyphens allowed). Example: on-document-ingested |
| Event Type | Select a built-in type or enter a custom event type |
| Event Filter | Optional conditions (see Filtering below) |
| Target Type | workflow (function dispatch planned for Phase 2) |
| Target Name | Name of the workflow to start |
| Input Mapping | JSON template mapping event data to workflow inputs |
| Active | Toggle to enable/disable without deleting |
Via API
```bash
curl -X POST https://martha.example.com/api/admin/triggers/ \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "analyze-on-ingest",
    "event_type": "document.ingested",
    "target_name": "compliance-analysis",
    "input_mapping": {
      "document_id": "{{event.data.document_id}}",
      "collection_id": "{{event.data.collection_id}}"
    }
  }'
```

Filtering
By default, a trigger matches all events of its type. Add an event filter to narrow the match.
Filter syntax
Filters use an EventBridge-inspired declarative syntax. Each key is a dot-path into the event, each value is an array of conditions.
```json
{
  "data.content_type": ["application/pdf"],
  "data.page_count": [{"numeric": [">", 0]}]
}
```

- Multiple keys = AND (all must match)
- Multiple values in an array = OR (any can match)
Operators
| Operator | Syntax | Example |
|---|---|---|
| Exact match | ["value"] | "data.status": ["ready"] |
| Match any | ["val1", "val2"] | "data.content_type": ["application/pdf", "image/png"] |
| Starts with | [{"prefix": "report_"}] | "data.filename": [{"prefix": "report_"}] |
| Greater than | [{"numeric": [">", 10]}] | "data.page_count": [{"numeric": [">", 10]}] |
| Less than | [{"numeric": ["<", 100]}] | |
| Greater or equal | [{"numeric": [">=", 1]}] | |
| Less or equal | [{"numeric": ["<=", 50]}] | |
| Between | [{"numeric": [">=", 10, "<=", 50]}] | |
| Field exists | [{"exists": true}] | "data.metadata": [{"exists": true}] |
| Field absent | [{"exists": false}] | |
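The semantics above — keys AND-ed together, conditions within a key's list OR-ed — can be sketched in a few lines of Python. This is an illustrative re-implementation for understanding only; the function names are made up and are not Martha's internals.

```python
# Sketch of the EventBridge-style filter matcher (illustrative, not Martha's code).

def get_path(obj, dot_path):
    """Walk a dot-path like 'data.page_count'; return (value, exists)."""
    cur = obj
    for part in dot_path.split("."):
        if not isinstance(cur, dict) or part not in cur:
            return None, False
        cur = cur[part]
    return cur, True

def condition_matches(value, exists, cond):
    if isinstance(cond, dict):
        if "exists" in cond:
            return exists == cond["exists"]
        if "prefix" in cond:
            return exists and isinstance(value, str) and value.startswith(cond["prefix"])
        if "numeric" in cond:
            ops = cond["numeric"]  # e.g. [">=", 10, "<=", 50]
            if not exists or not isinstance(value, (int, float)):
                return False
            cmp = {"<": value.__lt__, "<=": value.__le__,
                   ">": value.__gt__, ">=": value.__ge__}
            return all(cmp[op](bound) for op, bound in zip(ops[::2], ops[1::2]))
        return False
    return exists and value == cond  # plain value = exact match

def filter_matches(event, event_filter):
    """All keys must match (AND); any condition in a key's list may match (OR)."""
    return all(
        any(condition_matches(*get_path(event, path), cond) for cond in conds)
        for path, conds in event_filter.items()
    )
```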
Wildcard event types
Use `*` as a suffix to match multiple event types:

- `document.*` matches `document.uploaded`, `document.ingested`, `document.ingestion_failed`
- `*` matches any event type

!!! note
    Martha's wildcard matches across dots (`document.*` also matches `document.ingestion.completed`). This differs from AWS EventBridge, where `*` matches exactly one segment.
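The wildcard rule, including the match-across-dots behavior the note describes, fits in a few lines. A hypothetical sketch (`event_type_matches` is not a real Martha function):

```python
# Suffix-wildcard matching as described above (illustrative sketch).

def event_type_matches(pattern, event_type):
    """'*' matches everything; 'document.*' matches any type starting with
    'document.', including multi-segment types; otherwise exact match."""
    if pattern == "*":
        return True
    if pattern.endswith(".*"):
        return event_type.startswith(pattern[:-1])  # keep the trailing dot
    return pattern == event_type
```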
Input Mapping
Input mapping transforms event data into workflow inputs using the same {{template}} syntax used in workflow node configs.
```json
{
  "document_id": "{{event.data.document_id}}",
  "collection_id": "{{event.data.collection_id}}",
  "filename": "{{event.data.filename}}",
  "triggered_by": "{{event.type}}",
  "static_param": "always-this-value"
}
```

Available template paths:
| Path | Description |
|---|---|
| event.type | Event type string |
| event.source | Event source |
| event.id | Unique event ID |
| event.time | ISO 8601 timestamp |
| event.tenantid | Tenant ID |
| event.data.* | Any field in the event data payload |
Static values (no {{}}) are passed through unchanged.
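A minimal sketch of how such a mapping could be resolved, assuming a template is always the whole value and missing paths resolve to null. All names here are hypothetical; the real resolver lives in Martha's workflow engine.

```python
import re

# Illustrative input-mapping resolver (not Martha's actual implementation).
TEMPLATE = re.compile(r"^\{\{\s*(.+?)\s*\}\}$")

def resolve_mapping(input_mapping, event):
    """Replace '{{event.data.x}}' values with fields from the event;
    pass static values through unchanged; missing paths become None."""
    def lookup(path):
        parts = path.split(".")
        if parts[0] != "event":
            return None
        cur = event
        for part in parts[1:]:
            if not isinstance(cur, dict) or part not in cur:
                return None
            cur = cur[part]
        return cur

    resolved = {}
    for key, value in input_mapping.items():
        m = TEMPLATE.match(value) if isinstance(value, str) else None
        resolved[key] = lookup(m.group(1)) if m else value
    return resolved
```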
Deduplication
If multiple events arrive rapidly for the same logical operation (e.g., a document re-uploaded), dedup prevents duplicate workflow starts.
Set Dedup Key to a template that resolves to a unique identifier:
```
{{event.data.document_id}}
```

Within the Dedup Window (default 300 seconds), events with the same resolved dedup key will not start duplicate workflows: the first event starts the workflow; subsequent events within the window are silently deduplicated.
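The window behaviour can be modelled with a tiny in-memory class. This is for intuition only; the real dispatcher presumably tracks keys in Redis rather than process memory.

```python
import time

# Toy model of the dedup window described above (illustrative only).
class Deduplicator:
    def __init__(self, window_seconds=300):
        self.window = window_seconds
        self.seen = {}  # resolved dedup key -> time last dispatched

    def should_dispatch(self, dedup_key, now=None):
        """First event for a key dispatches; repeats within the window do not."""
        now = time.time() if now is None else now
        last = self.seen.get(dedup_key)
        if last is not None and now - last < self.window:
            return False  # duplicate within the window: drop silently
        self.seen[dedup_key] = now
        return True
```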
Testing Triggers
Dry-run test
Test a trigger against a sample event without actually starting a workflow:
Admin UI: Click the play button on any trigger, paste an event JSON, and click Run Test. The result shows whether the event matched the trigger's type and filter, and what the resolved input mapping would be.
API:
```bash
curl -X POST https://martha.example.com/api/admin/triggers/analyze-on-ingest/test \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "event": {
      "type": "document.ingested",
      "data": {"document_id": "abc-123", "content_type": "application/pdf"}
    }
  }'
```

Response:

```json
{
  "matched": true,
  "filter_result": true,
  "input_mapping_result": {
    "document_id": "abc-123",
    "collection_id": null
  }
}
```

Manual event emission
Emit a test event to see the full trigger -> workflow pipeline:
```bash
curl -X POST https://martha.example.com/api/admin/triggers/events/emit \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "document.ingested",
    "source": "manual/admin",
    "data": {"document_id": "test-123", "collection_id": "col-456"}
  }'
```

This emits a real event to the Redis Stream. If any active triggers match, they will start workflows.
API Reference
| Method | Path | Description |
|---|---|---|
| POST | /api/admin/triggers/ | Create a trigger |
| GET | /api/admin/triggers/ | List triggers (supports active_only, event_type, skip, limit) |
| GET | /api/admin/triggers/{name} | Get a trigger by name |
| PUT | /api/admin/triggers/{name} | Update a trigger |
| DELETE | /api/admin/triggers/{name} | Delete a trigger |
| POST | /api/admin/triggers/{name}/test | Dry-run test against a sample event |
| POST | /api/admin/triggers/{name}/backfill | Replay historical events against a trigger |
| POST | /api/admin/triggers/events/emit | Manually emit a CloudEvent |
| GET | /api/admin/triggers/events/types | List built-in event types |
| GET | /api/admin/triggers/events/recent | Recent events from the Redis Stream (supports limit) |
| POST | /api/webhooks/events/{tenant_id}/{webhook_name} | Receive an external webhook (shared secret auth) |
Webhooks
External systems can push events into Martha's event bus via the webhook receiver endpoint. This lets you trigger workflows from GitHub pushes, Stripe payments, or any system that can send HTTP POST requests.
```bash
curl -X POST https://martha.example.com/api/webhooks/events/my-tenant/github-push \
  -H "X-Webhook-Secret: $WEBHOOK_SECRET" \
  -H "Content-Type: application/json" \
  -d '{"action": "push", "ref": "refs/heads/main", "repository": "my-repo"}'
```

The endpoint:

- Validates the `X-Webhook-Secret` header against the server's `WEBHOOK_SECRET` environment variable
- Emits a `webhook.received` event with the request body as `data`
- Returns `202 Accepted` immediately
The event flows through the same Redis Stream -> Dispatcher -> Trigger pipeline as internal events. Create a trigger with `event_type: webhook.received` to react to webhooks.

!!! note
    No JWT/Keycloak auth is required -- webhooks use a shared secret. The webhook name (`github-push` in the example) becomes part of the event source for filtering.
Backfill
Triggers only fire for events that arrive after the trigger is created. The backfill API lets you retroactively replay historical events against a trigger.
```bash
curl -X POST https://martha.example.com/api/admin/triggers/analyze-on-ingest/backfill \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"since": "2026-03-30T00:00:00Z", "limit": 100}'
```

Response:

```json
{
  "scanned": 150,
  "matched": 12,
  "dispatched": 12,
  "errors": []
}
```

| Parameter | Default | Max | Description |
|---|---|---|---|
| since | beginning of stream | -- | ISO 8601 timestamp -- only replay events after this time |
| limit | 100 | 500 | Maximum events to scan |

!!! warning
    Backfill is not idempotent: running it twice starts workflows twice, unless the trigger has a dedup_key set. Triggers with dedup keys are naturally safe.
How It Works
```text
Document Upload
      |
      v
emit_event("document.uploaded") --> Redis Stream (martha:events:{tenant_id})
      |
      v
TriggerDispatcherWorkflow (Temporal, singleton per tenant)
      |
      v
Match against trigger_definitions (event type + filter)
      |
      v
Resolve input_mapping templates
      |
      v
client.start_workflow() on Temporal
```

The dispatcher runs as a Temporal workflow, one per tenant. It polls the Redis Stream, matches events against triggers, and starts target workflows. It uses continue-as-new to prevent history growth and acknowledges events only after successful dispatch.
Events are durable (Redis Streams with MAXLEN 10,000 per tenant). If the dispatcher is briefly down, events queue up and are processed when it restarts.
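One dispatcher pass over pending stream entries can be sketched as below. The names (`dispatch_once`, `start_workflow`, `ack`) are hypothetical stand-ins for the Redis and Temporal clients, and only suffix-wildcard type matching is shown, not filters or input mapping:

```python
# Illustrative, self-contained sketch of one dispatcher pass (not Martha's code).

def type_matches(pattern, event_type):
    if pattern == "*":
        return True
    if pattern.endswith(".*"):
        return event_type.startswith(pattern[:-1])
    return pattern == event_type

def dispatch_once(entries, triggers, start_workflow, ack):
    """Fan-out: every active trigger whose event_type matches starts its own
    workflow; each stream entry is acknowledged only after dispatch."""
    for entry_id, event in entries:
        for trig in triggers:
            if trig.get("active", True) and type_matches(trig["event_type"], event["type"]):
                start_workflow(trig["target_name"], event)
        ack(entry_id)  # ack after all matching triggers were dispatched
```

The nested loop also illustrates the fan-out behaviour from the Concepts section: one event, several independent workflow starts.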
Custom Event Types
Tenants can define their own event types beyond the built-in ones. Custom types appear alongside built-in types in the Event Type Browser and trigger configuration UI.
API
```bash
# Create a custom event type
curl -X POST https://martha.example.com/api/admin/triggers/events/types/custom \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "order.placed",
    "description": "An order was placed in the system",
    "sample_payload": {"order_id": "123", "amount": 99.99},
    "schema": {"type": "object", "required": ["order_id"]}
  }'

# List custom event types
curl https://martha.example.com/api/admin/triggers/events/types/custom \
  -H "Authorization: Bearer $TOKEN"

# Update
curl -X PUT https://martha.example.com/api/admin/triggers/events/types/custom/order.placed \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"description": "Updated description"}'

# Delete
curl -X DELETE https://martha.example.com/api/admin/triggers/events/types/custom/order.placed \
  -H "Authorization: Bearer $TOKEN"
```

Custom type names cannot collide with built-in types. The `schema` field (JSON Schema) is optional and used for documentation; validation is warn-only and never blocks event emission.
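Warn-only validation means a failing check logs a warning but the event is still emitted. A simplified sketch covering just the `required` keyword (the real implementation presumably uses a full JSON Schema validator; the function name is made up):

```python
import logging

# Sketch of warn-only payload validation: returns warnings, never raises.
def validate_custom_event(schema, payload):
    warnings = []
    if schema and schema.get("type") == "object" and not isinstance(payload, dict):
        warnings.append("payload is not an object")
    for field in (schema or {}).get("required", []):
        if not isinstance(payload, dict) or field not in payload:
            warnings.append(f"missing required field: {field}")
    for w in warnings:
        logging.warning("custom event validation: %s", w)  # log, do not block
    return warnings
```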
Per-Webhook Secrets
Each webhook endpoint can have its own secret instead of sharing the global WEBHOOK_SECRET environment variable. Secrets are server-generated and shown only once on creation.
Managing webhooks
```bash
# Create a webhook (returns secret once)
curl -X POST https://martha.example.com/api/admin/webhooks \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"name": "github-push", "description": "GitHub push events"}'
# Response: {"webhook": {...}, "secret": "a1b2c3...64-hex-chars"}

# List webhooks (secrets never shown)
curl https://martha.example.com/api/admin/webhooks -H "Authorization: Bearer $TOKEN"

# Rotate secret (old secret immediately invalidated)
curl -X POST https://martha.example.com/api/admin/webhooks/github-push/rotate \
  -H "Authorization: Bearer $TOKEN"

# Delete
curl -X DELETE https://martha.example.com/api/admin/webhooks/github-push \
  -H "Authorization: Bearer $TOKEN"
```

Auth fallback chain
The webhook receiver tries authentication in this order:
- Per-webhook secret: if a `webhook_definitions` record exists for the tenant+name, verify the `X-Webhook-Secret` header against the stored hash
- Shared secret: fall back to the `WEBHOOK_SECRET` environment variable (backward compatible)
- Neither: return 401/503

When the optional headers `X-Webhook-Timestamp` and `X-Webhook-Id` are present, the receiver also validates timestamp freshness and idempotency.
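A sketch of the per-webhook verification step, under two assumptions not documented above: secrets are stored as hex SHA-256 hashes, and timestamp freshness tolerance is 300 seconds. All names are illustrative.

```python
import hashlib
import hmac
import time

# Hedged sketch: constant-time secret comparison plus optional freshness check.
def hash_secret(secret):
    """How a secret might be stored at creation time (hex SHA-256) -- an assumption."""
    return hashlib.sha256(secret.encode()).hexdigest()

def verify_webhook(stored_hash, header_secret, header_timestamp=None,
                   max_skew_seconds=300, now=None):
    if header_secret is None:
        return False  # no X-Webhook-Secret header at all
    if not hmac.compare_digest(hash_secret(header_secret), stored_hash):
        return False  # wrong secret (constant-time compare)
    if header_timestamp is not None:
        now = time.time() if now is None else now
        if abs(now - float(header_timestamp)) > max_skew_seconds:
            return False  # stale or future-dated request
    return True
```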
Scheduled Triggers
A trigger with schedule_config emits schedule.fired events on a cron schedule. The existing dispatcher matches these events to triggers and starts workflows -- no special dispatch path needed.
```bash
curl -X POST https://martha.example.com/api/admin/triggers/ \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "name": "daily-report",
    "event_type": "schedule.fired",
    "target_name": "generate-report",
    "schedule_config": {
      "cron": "0 9 * * MON-FRI",
      "timezone": "Europe/Lisbon",
      "note": "Weekday 9am report"
    },
    "input_mapping": {
      "report_type": "daily",
      "triggered_at": "{{event.data.scheduled_time}}"
    }
  }'
```

Schedule lifecycle
- Creating a trigger with `schedule_config` creates a Temporal Schedule
- Deactivating the trigger pauses the schedule
- Reactivating unpauses it
- Deleting the trigger removes the schedule
- Updating `schedule_config` updates the Temporal Schedule spec
Cron syntax
Standard five-field cron expressions: `minute hour day-of-month month day-of-week`. Examples:
| Cron | Meaning |
|---|---|
| 0 9 * * MON-FRI | Weekdays at 9:00 AM |
| 0 * * * * | Every hour |
| 0 0 1 * * | First day of each month at midnight |
| */15 * * * * | Every 15 minutes |
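For intuition, here is a minimal matcher for the cron subset used in these examples (`*`, lists, ranges, steps, day-of-week names, with SUN=0). Production schedulers -- Temporal Schedules here -- handle many more edge cases, so treat this only as a sketch:

```python
# Minimal 5-field cron matcher (illustrative sketch, not a real scheduler).
DOW_NAMES = {"SUN": 0, "MON": 1, "TUE": 2, "WED": 3, "THU": 4, "FRI": 5, "SAT": 6}

def _to_num(token, names):
    if names and token.upper() in names:
        return names[token.upper()]
    return int(token)

def _field_matches(spec, value, names=None):
    for part in spec.split(","):          # comma list: any part may match
        base, _, step = part.partition("/")
        step = int(step) if step else 1
        if base == "*":
            lo, hi = 0, 9999
        elif "-" in base:
            a, b = base.split("-")
            lo, hi = _to_num(a, names), _to_num(b, names)
        else:
            lo = hi = _to_num(base, names)
        if lo <= value <= hi and (value - lo) % step == 0:
            return True
    return False

def cron_matches(expr, minute, hour, dom, month, dow):
    """True if the given instant matches the 5-field expression."""
    m, h, d, mo, w = expr.split()
    return (_field_matches(m, minute) and _field_matches(h, hour)
            and _field_matches(d, dom) and _field_matches(mo, month)
            and _field_matches(w, dow, DOW_NAMES))
```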
Deadman Switches (Proactive Triggers)
A deadman switch fires when an expected event does NOT happen within a time window. Use this for monitoring: "alert me if no document is ingested within 1 hour."
```bash
curl -X POST https://martha.example.com/api/admin/triggers/ \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "name": "daily-ingestion-check",
    "event_type": "deadman.fired",
    "target_name": "alert-ops-team",
    "deadman_config": {
      "watch_event_type": "document.ingested",
      "within_seconds": 3600,
      "after_event_type": "document.uploaded",
      "recurring": true
    }
  }'
```

Configuration
| Field | Required | Description |
|---|---|---|
| watch_event_type | Yes | Event type to watch for |
| within_seconds | Yes | Fire if no matching event arrives within this window (minimum 60s) |
| watch_filter | No | Optional EventBridge-style filter on watched events |
| after_event_type | No | Only start watching after this event occurs (prevents cold-start false positives) |
| recurring | No | Reset and watch again after firing (default: true) |
How it works
Each deadman trigger runs as a Temporal workflow with a durable timer. When a matching event arrives, the dispatcher signals the workflow to reset its timer. If the timer expires without a signal, the workflow emits a deadman.fired event and (if recurring) starts watching again.
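The reset-or-fire behaviour can be modelled deterministically. The real switch is a Temporal workflow with a durable timer; this toy class (all names invented) just makes the timing rules concrete:

```python
# Deterministic toy model of a deadman switch (illustrative sketch).
class DeadmanSwitch:
    def __init__(self, within_seconds, recurring=True, start_at=0.0):
        self.window = within_seconds
        self.recurring = recurring
        self.deadline = start_at + within_seconds
        self.armed = True

    def on_watched_event(self, now):
        """A matching event arrived: push the deadline out by one window."""
        if self.armed:
            self.deadline = now + self.window

    def tick(self, now):
        """Return True exactly when the switch fires (deadline passed)."""
        if not self.armed or now < self.deadline:
            return False
        if self.recurring:
            self.deadline = now + self.window  # fire, then re-arm
        else:
            self.armed = False                 # fire once, stay disarmed
        return True
```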
!!! note
    A trigger cannot have both `schedule_config` and `deadman_config`. A trigger is one of: reactive (default), scheduled, or proactive.
Complete API Reference
| Method | Path | Description |
|---|---|---|
| POST | /api/admin/triggers/ | Create a trigger |
| GET | /api/admin/triggers/ | List triggers (supports active_only, event_type, skip, limit) |
| GET | /api/admin/triggers/{name} | Get a trigger by name |
| PUT | /api/admin/triggers/{name} | Update a trigger |
| DELETE | /api/admin/triggers/{name} | Delete a trigger |
| POST | /api/admin/triggers/{name}/test | Dry-run test against a sample event |
| POST | /api/admin/triggers/{name}/backfill | Replay historical events against a trigger |
| POST | /api/admin/triggers/events/emit | Manually emit a CloudEvent |
| GET | /api/admin/triggers/events/types | List built-in event types |
| GET | /api/admin/triggers/events/recent | Recent events from the Redis Stream |
| POST | /api/admin/triggers/events/types/custom | Create a custom event type |
| GET | /api/admin/triggers/events/types/custom | List custom event types |
| PUT | /api/admin/triggers/events/types/custom/{type} | Update a custom event type |
| DELETE | /api/admin/triggers/events/types/custom/{type} | Delete a custom event type |
| POST | /api/admin/webhooks | Create a webhook definition (returns secret) |
| GET | /api/admin/webhooks | List webhook definitions |
| GET | /api/admin/webhooks/{name} | Get a webhook definition |
| DELETE | /api/admin/webhooks/{name} | Delete a webhook definition |
| POST | /api/admin/webhooks/{name}/rotate | Rotate a webhook secret |
| POST | /api/webhooks/events/{tenant_id}/{webhook_name} | Receive an external webhook |