
Event-Driven Triggers

Martha can automatically start workflows when events happen in the system. Upload a document, and a compliance analysis workflow runs. Ingestion fails, and a notification workflow fires. No polling, no cron jobs -- events flow through Redis Streams and the trigger dispatcher matches them to workflows in real time.


Concepts

Events

Every significant action in Martha emits a CloudEvent -- a standard envelope with a type, source, timestamp, and data payload.

Built-in event types:

| Type | When it fires |
|---|---|
| `document.uploaded` | A document is uploaded to a collection |
| `document.ingested` | The ingestion pipeline completes successfully |
| `document.ingestion_failed` | The ingestion pipeline fails |
| `workflow.completed` | A workflow execution finishes successfully |
| `workflow.failed` | A workflow execution fails |
| `approval.created` | An approval case is created (workflow gate or agent escalation) |
| `approval.resolved` | An approval case is approved or rejected |
| `webhook.received` | An external webhook POST is received |
| `schedule.fired` | A scheduled trigger fires on its configured cron schedule |
| `deadman.fired` | An expected event did not occur within the configured time window |
| `chat.session.created` | A chat session is created via the API |
| `chat.session.ended` | A chat session workflow completes or times out |
| `chat.tool.called` | An agent invokes a tool during a chat session |
| `chat.tool.completed` | A tool execution succeeds during a chat session |
| `chat.tool.failed` | A tool execution fails during a chat session |
| `agent.loop.started` | An agent loop begins execution |
| `agent.loop.iteration` | Each agent reasoning cycle (opt-in via `emit_iteration_events`) |
| `agent.loop.completed` | An agent loop finishes successfully |
| `agent.loop.failed` | An agent loop errors or exceeds its iteration/token budget |
| `definition.created` | A function, workflow, or agent definition is created |
| `definition.updated` | A definition is updated (includes `changed_fields` in payload) |
| `definition.deleted` | A definition is deleted |

Custom events can be emitted by platform functions, workflows, or the manual emit API. Tenants can also define their own event types with descriptions and JSON Schema validation via the custom event types API (see Custom Event Types below).

Browse all event types, their sample payloads, and which triggers listen to them on the Events page in the admin UI (under AUTOMATIONS).

Triggers

A trigger is a rule: "when event X happens, start workflow Y with these inputs." Triggers are tenant-scoped definitions, managed through the API or admin UI.

A trigger has:

  • Event type -- which events to listen for (exact match or wildcard like document.*)
  • Event filter -- optional conditions on event data fields
  • Target workflow -- the workflow to start when the trigger fires
  • Input mapping -- how to transform event data into workflow inputs

Fan-out

One event can match multiple triggers. Each trigger starts its own independent workflow execution. This is by design: it lets you compose automation from independent, reusable pieces.
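A minimal sketch of fan-out (Python; trigger names and fields here are illustrative, not Martha's internal representation): each matching trigger yields its own planned execution.

```python
def fan_out(event, triggers):
    """Return one planned workflow execution per trigger whose type matches."""
    return [
        {"trigger": trigger["name"], "workflow": trigger["target_name"]}
        for trigger in triggers
        if trigger["event_type"] == event["type"]
    ]

triggers = [
    {"name": "analyze", "event_type": "document.ingested",
     "target_name": "compliance-analysis"},
    {"name": "index", "event_type": "document.ingested",
     "target_name": "search-index"},
    {"name": "alert", "event_type": "document.ingestion_failed",
     "target_name": "alert-ops"},
]
# One document.ingested event plans two independent executions.
executions = fan_out({"type": "document.ingested"}, triggers)
```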


Creating Triggers

Via Admin UI

Navigate to Triggers in the sidebar (under AUTOMATIONS). Click New Trigger.

| Field | Description |
|---|---|
| Name | Unique identifier (lowercase, hyphens allowed). Example: `on-document-ingested` |
| Event Type | Select a built-in type or enter a custom event type |
| Event Filter | Optional conditions (see Filtering below) |
| Target Type | `workflow` (function dispatch planned for Phase 2) |
| Target Name | Name of the workflow to start |
| Input Mapping | JSON template mapping event data to workflow inputs |
| Active | Toggle to enable/disable without deleting |

Via API

```bash
curl -X POST https://martha.example.com/api/admin/triggers/ \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "analyze-on-ingest",
    "event_type": "document.ingested",
    "target_name": "compliance-analysis",
    "input_mapping": {
      "document_id": "{{event.data.document_id}}",
      "collection_id": "{{event.data.collection_id}}"
    }
  }'
```

Filtering

By default, a trigger matches all events of its type. Add an event filter to narrow the match.

Filter syntax

Filters use an EventBridge-inspired declarative syntax. Each key is a dot-path into the event, each value is an array of conditions.

```json
{
  "data.content_type": ["application/pdf"],
  "data.page_count": [{"numeric": [">", 0]}]
}
```

  • Multiple keys = AND (all must match)
  • Multiple values in an array = OR (any can match)

Operators

| Operator | Syntax | Example |
|---|---|---|
| Exact match | `["value"]` | `"data.status": ["ready"]` |
| Match any | `["val1", "val2"]` | `"data.content_type": ["application/pdf", "image/png"]` |
| Starts with | `[{"prefix": "report_"}]` | `"data.filename": [{"prefix": "report_"}]` |
| Greater than | `[{"numeric": [">", 10]}]` | `"data.page_count": [{"numeric": [">", 10]}]` |
| Less than | `[{"numeric": ["<", 100]}]` | |
| Greater or equal | `[{"numeric": [">=", 1]}]` | |
| Less or equal | `[{"numeric": ["<=", 50]}]` | |
| Between | `[{"numeric": [">=", 10, "<=", 50]}]` | |
| Field exists | `[{"exists": true}]` | `"data.metadata": [{"exists": true}]` |
| Field absent | `[{"exists": false}]` | |
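The filter semantics above (AND across keys, OR within a value array, plus the operators) can be sketched in Python. This is a simplified re-implementation for illustration, not Martha's actual matcher:

```python
NUMERIC_OPS = {
    ">": lambda a, b: a > b,
    ">=": lambda a, b: a >= b,
    "<": lambda a, b: a < b,
    "<=": lambda a, b: a <= b,
}

def get_path(event, path):
    """Resolve a dot-path like 'data.page_count'; returns (value, present)."""
    cur = event
    for part in path.split("."):
        if not isinstance(cur, dict) or part not in cur:
            return None, False
        cur = cur[part]
    return cur, True

def condition_matches(value, present, cond):
    if isinstance(cond, dict):
        if "prefix" in cond:
            return present and isinstance(value, str) and value.startswith(cond["prefix"])
        if "exists" in cond:
            return present == cond["exists"]
        if "numeric" in cond:
            if not present or not isinstance(value, (int, float)):
                return False
            ops = cond["numeric"]  # e.g. [">=", 10, "<=", 50]
            return all(NUMERIC_OPS[op](value, bound)
                       for op, bound in zip(ops[0::2], ops[1::2]))
        return False
    return present and value == cond  # exact match

def filter_matches(event, flt):
    """Every key must match (AND); any condition in its array may match (OR)."""
    for path, conditions in flt.items():
        value, present = get_path(event, path)
        if not any(condition_matches(value, present, c) for c in conditions):
            return False
    return True
```

For example, the filter shown earlier matches a 12-page PDF but rejects an event with no `page_count` field.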

Wildcard event types

Use `*` as a suffix to match multiple event types:

  • `document.*` matches `document.uploaded`, `document.ingested`, and `document.ingestion_failed`
  • `*` matches any event type

!!! note
    Martha's wildcard matches across dots (`document.*` also matches `document.ingestion.completed`). This differs from AWS EventBridge, where `*` matches exactly one segment.
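The matching rule described above can be sketched in a few lines (assumption: only suffix wildcards are supported, as the examples suggest):

```python
def event_type_matches(pattern, event_type):
    """'*' matches everything; 'prefix.*' matches any event type that starts
    with 'prefix.' -- crossing further dots, unlike EventBridge."""
    if pattern == "*":
        return True
    if pattern.endswith(".*"):
        return event_type.startswith(pattern[:-1])  # keep the trailing dot
    return event_type == pattern
```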


Input Mapping

Input mapping transforms event data into workflow inputs using the same {{template}} syntax used in workflow node configs.

```json
{
  "document_id": "{{event.data.document_id}}",
  "collection_id": "{{event.data.collection_id}}",
  "filename": "{{event.data.filename}}",
  "triggered_by": "{{event.type}}",
  "static_param": "always-this-value"
}
```

Available template paths:

| Path | Description |
|---|---|
| `event.type` | Event type string |
| `event.source` | Event source |
| `event.id` | Unique event ID |
| `event.time` | ISO 8601 timestamp |
| `event.tenantid` | Tenant ID |
| `event.data.*` | Any field in the event data payload |

Static values (no {{}}) are passed through unchanged.
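The resolution rules can be sketched as follows (a simplified re-implementation; it assumes each value is either one whole-string `{{event...}}` template or a static value, and that a missing path resolves to null, as the dry-run example later shows):

```python
import re

TEMPLATE = re.compile(r"^\{\{\s*(.+?)\s*\}\}$")

def resolve_value(value, event):
    """Resolve one mapping value against the event, or pass it through."""
    if not isinstance(value, str):
        return value
    m = TEMPLATE.match(value)
    if m is None:
        return value  # static value, passed through unchanged
    parts = m.group(1).split(".")
    if parts[0] != "event":
        return value  # unknown root: leave untouched
    cur = event
    for part in parts[1:]:
        if not isinstance(cur, dict) or part not in cur:
            return None  # missing path resolves to null
        cur = cur[part]
    return cur

def resolve_input_mapping(mapping, event):
    return {key: resolve_value(value, event) for key, value in mapping.items()}
```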


Deduplication

If multiple events arrive rapidly for the same logical operation (e.g., a document re-uploaded), dedup prevents duplicate workflow starts.

Set Dedup Key to a template that resolves to a unique identifier:

`{{event.data.document_id}}`

Within the Dedup Window (default 300 seconds), events with the same resolved dedup key will not start duplicate workflows. The first event starts the workflow; subsequent events within the window are silently deduplicated.
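An in-memory sketch of the window behaviour (Martha persists this state; a plain dict stands in here):

```python
class Deduper:
    """First event with a given key starts a workflow; later events with the
    same key inside the window are silently dropped."""

    def __init__(self, window_seconds=300.0):
        self.window = window_seconds
        self.started_at = {}  # resolved dedup key -> time the workflow started

    def should_start(self, dedup_key, now):
        last = self.started_at.get(dedup_key)
        if last is not None and now - last < self.window:
            return False  # deduplicated
        self.started_at[dedup_key] = now
        return True
```

Note that a deduplicated event does not extend the window; the window is anchored at the event that started the workflow.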


Testing Triggers

Dry-run test

Test a trigger against a sample event without actually starting a workflow:

Admin UI: Click the play button on any trigger, paste an event JSON, and click Run Test. The result shows whether the event matched the trigger's type and filter, and what the resolved input mapping would be.

API:

```bash
curl -X POST https://martha.example.com/api/admin/triggers/analyze-on-ingest/test \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "event": {
      "type": "document.ingested",
      "data": {"document_id": "abc-123", "content_type": "application/pdf"}
    }
  }'
```

Response:

```json
{
  "matched": true,
  "filter_result": true,
  "input_mapping_result": {
    "document_id": "abc-123",
    "collection_id": null
  }
}
```

Manual event emission

Emit a test event to see the full trigger -> workflow pipeline:

```bash
curl -X POST https://martha.example.com/api/admin/triggers/events/emit \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "document.ingested",
    "source": "manual/admin",
    "data": {"document_id": "test-123", "collection_id": "col-456"}
  }'
```

This emits a real event to the Redis Stream. If any active triggers match, they will start workflows.


API Reference

| Method | Path | Description |
|---|---|---|
| POST | `/api/admin/triggers/` | Create a trigger |
| GET | `/api/admin/triggers/` | List triggers (supports `active_only`, `event_type`, `skip`, `limit`) |
| GET | `/api/admin/triggers/{name}` | Get a trigger by name |
| PUT | `/api/admin/triggers/{name}` | Update a trigger |
| DELETE | `/api/admin/triggers/{name}` | Delete a trigger |
| POST | `/api/admin/triggers/{name}/test` | Dry-run test against sample event |
| POST | `/api/admin/triggers/{name}/backfill` | Replay historical events against trigger |
| POST | `/api/admin/triggers/events/emit` | Manually emit a CloudEvent |
| GET | `/api/admin/triggers/events/types` | List built-in event types |
| GET | `/api/admin/triggers/events/recent` | Recent events from Redis Stream (supports `limit`) |
| POST | `/api/webhooks/events/{tenant_id}/{webhook_name}` | Receive external webhook (shared secret auth) |

Webhooks

External systems can push events into Martha's event bus via the webhook receiver endpoint. This lets you trigger workflows from GitHub pushes, Stripe payments, or any system that can send HTTP POST requests.

```bash
curl -X POST https://martha.example.com/api/webhooks/events/my-tenant/github-push \
  -H "X-Webhook-Secret: $WEBHOOK_SECRET" \
  -H "Content-Type: application/json" \
  -d '{"action": "push", "ref": "refs/heads/main", "repository": "my-repo"}'
```

The endpoint:

  1. Validates the X-Webhook-Secret header against the server's WEBHOOK_SECRET environment variable
  2. Emits a webhook.received event with the request body as data
  3. Returns 202 Accepted immediately
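The three steps can be sketched in Python (the event envelope's `source` and `tenantid` shapes are assumptions, and real request handling lives in the API layer):

```python
import hmac

def handle_webhook(tenant_id, webhook_name, body, secret_header, expected_secret):
    """Sketch of the receiver: validate secret, build event, return 202."""
    # 1. Validate X-Webhook-Secret against the configured secret, in constant time.
    if not (secret_header and expected_secret and
            hmac.compare_digest(secret_header.encode(), expected_secret.encode())):
        return 401, None
    # 2. Build a webhook.received event carrying the request body as data.
    event = {
        "type": "webhook.received",
        "source": f"webhook/{webhook_name}",  # assumed source format
        "tenantid": tenant_id,
        "data": body,
    }
    # 3. Return 202 Accepted immediately; dispatch happens asynchronously.
    return 202, event
```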

The event flows through the same Redis Stream -> Dispatcher -> Trigger pipeline as internal events. Create a trigger with event_type: webhook.received to react to webhooks.

!!! note
    No JWT/Keycloak auth required -- webhooks use a shared secret. The webhook name (`github-push` in the example) becomes part of the event source for filtering.


Backfill

Triggers only fire for events that arrive after the trigger is created. The backfill API lets you retroactively replay historical events against a trigger.

```bash
curl -X POST https://martha.example.com/api/admin/triggers/analyze-on-ingest/backfill \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"since": "2026-03-30T00:00:00Z", "limit": 100}'
```

Response:

```json
{
  "scanned": 150,
  "matched": 12,
  "dispatched": 12,
  "errors": []
}
```

| Parameter | Default | Max | Description |
|---|---|---|---|
| `since` | beginning of stream | -- | ISO 8601 timestamp; only replay events after this time |
| `limit` | 100 | 500 | Maximum events to scan |

!!! warning
    Backfill is not idempotent. Running it twice will start workflows twice, unless the trigger has a `dedup_key` set. Triggers with dedup keys are naturally safe.


How It Works

```
Document Upload
     |
     v
emit_event("document.uploaded") --> Redis Stream (martha:events:{tenant_id})
     |
     v
TriggerDispatcherWorkflow (Temporal, singleton per tenant)
     |
     v
Match against trigger_definitions (event type + filter)
     |
     v
Resolve input_mapping templates
     |
     v
client.start_workflow() on Temporal
```

The dispatcher runs as a Temporal workflow, one per tenant. It polls the Redis Stream, matches events against triggers, and starts target workflows. It uses continue-as-new to prevent history growth and acknowledges events only after successful dispatch.

Events are durable (Redis Streams with MAXLEN 10,000 per tenant). If the dispatcher is briefly down, events queue up and are processed when it restarts.
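The match-then-acknowledge behaviour can be approximated with an in-memory sketch (the real dispatcher consumes a Redis Stream from inside a Temporal workflow; the trigger fields here are illustrative and wildcard/filter matching is omitted):

```python
def dispatch(events, triggers, start_workflow):
    """Match pending events against active triggers, start target workflows,
    and acknowledge each event only after dispatching it."""
    acked = []
    for event in events:
        for trigger in triggers:
            if not trigger.get("active", True):
                continue  # deactivated triggers never fire
            if trigger["event_type"] != event["type"]:
                continue  # exact-match only in this sketch
            start_workflow(trigger["target_name"], {"event_id": event["id"]})
        acked.append(event["id"])  # ack after all matching triggers dispatched
    return acked

started = []
acked = dispatch(
    events=[{"id": "e1", "type": "document.ingested"}],
    triggers=[
        {"name": "analyze", "event_type": "document.ingested",
         "target_name": "compliance-analysis"},
        {"name": "legacy", "event_type": "document.ingested",
         "target_name": "old-pipeline", "active": False},
    ],
    start_workflow=lambda name, inputs: started.append(name),
)
```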


Custom Event Types

Tenants can define their own event types beyond the built-in ones. Custom types appear alongside built-in types in the Event Type Browser and trigger configuration UI.

API

```bash
# Create a custom event type
curl -X POST https://martha.example.com/api/admin/triggers/events/types/custom \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "order.placed",
    "description": "An order was placed in the system",
    "sample_payload": {"order_id": "123", "amount": 99.99},
    "schema": {"type": "object", "required": ["order_id"]}
  }'

# List custom event types
curl https://martha.example.com/api/admin/triggers/events/types/custom \
  -H "Authorization: Bearer $TOKEN"

# Update
curl -X PUT https://martha.example.com/api/admin/triggers/events/types/custom/order.placed \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"description": "Updated description"}'

# Delete
curl -X DELETE https://martha.example.com/api/admin/triggers/events/types/custom/order.placed \
  -H "Authorization: Bearer $TOKEN"
```

Custom type names cannot collide with built-in types. The schema field (JSON Schema) is optional and used for documentation; validation is warn-only and never blocks event emission.
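Warn-only validation can be sketched like this (a deliberately tiny stand-in that checks only top-level `required` fields, not full JSON Schema):

```python
import warnings

def validate_payload(payload, schema):
    """Warn about missing required top-level fields, but never block emission."""
    missing = [field for field in schema.get("required", []) if field not in payload]
    if missing:
        warnings.warn(f"event payload missing required fields: {missing}")
    return not missing  # informational only; the event is emitted either way
```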


Per-Webhook Secrets

Each webhook endpoint can have its own secret instead of sharing the global WEBHOOK_SECRET environment variable. Secrets are server-generated and shown only once on creation.

Managing webhooks

```bash
# Create a webhook (returns secret once)
curl -X POST https://martha.example.com/api/admin/webhooks \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"name": "github-push", "description": "GitHub push events"}'
# Response: {"webhook": {...}, "secret": "a1b2c3...64-hex-chars"}

# List webhooks (secrets never shown)
curl https://martha.example.com/api/admin/webhooks -H "Authorization: Bearer $TOKEN"

# Rotate secret (old secret immediately invalidated)
curl -X POST https://martha.example.com/api/admin/webhooks/github-push/rotate \
  -H "Authorization: Bearer $TOKEN"

# Delete
curl -X DELETE https://martha.example.com/api/admin/webhooks/github-push \
  -H "Authorization: Bearer $TOKEN"
```

Auth fallback chain

The webhook receiver tries authentication in this order:

  1. Per-webhook secret: If a webhook_definitions record exists for the tenant+name, verify X-Webhook-Secret header against the stored hash
  2. Shared secret: Fall back to the WEBHOOK_SECRET environment variable (backward compatible)
  3. Neither: Return 401/503

When optional headers are present (X-Webhook-Timestamp, X-Webhook-Id), the receiver also validates timestamp freshness and idempotency.
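A sketch of those optional checks (the header value formats -- epoch seconds for the timestamp -- and the 300-second freshness window are assumptions, and the real implementation would persist seen IDs rather than hold them in memory):

```python
import time

SEEN_WEBHOOK_IDS = set()  # stand-in for persistent idempotency storage

def accept_webhook(timestamp, webhook_id, now=None, max_skew=300.0):
    """Reject stale timestamps and replayed webhook IDs; an absent header
    simply skips the corresponding check."""
    now = time.time() if now is None else now
    if timestamp is not None and abs(now - float(timestamp)) > max_skew:
        return False  # timestamp too old or too far in the future
    if webhook_id is not None:
        if webhook_id in SEEN_WEBHOOK_IDS:
            return False  # duplicate delivery
        SEEN_WEBHOOK_IDS.add(webhook_id)
    return True
```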


Scheduled Triggers

A trigger with schedule_config emits schedule.fired events on a cron schedule. The existing dispatcher matches these events to triggers and starts workflows -- no special dispatch path needed.

```bash
curl -X POST https://martha.example.com/api/admin/triggers/ \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "name": "daily-report",
    "event_type": "schedule.fired",
    "target_name": "generate-report",
    "schedule_config": {
      "cron": "0 9 * * MON-FRI",
      "timezone": "Europe/Lisbon",
      "note": "Weekday 9am report"
    },
    "input_mapping": {
      "report_type": "daily",
      "triggered_at": "{{event.data.scheduled_time}}"
    }
  }'
```

Schedule lifecycle

  • Creating a trigger with schedule_config creates a Temporal Schedule
  • Deactivating the trigger pauses the schedule
  • Reactivating unpauses it
  • Deleting the trigger removes the schedule
  • Updating schedule_config updates the Temporal Schedule spec

Cron syntax

Standard cron expressions: minute hour day-of-month month day-of-week. Examples:

| Cron | Meaning |
|---|---|
| `0 9 * * MON-FRI` | Weekdays at 9:00 AM |
| `0 * * * *` | Every hour |
| `0 0 1 * *` | First day of each month at midnight |
| `*/15 * * * *` | Every 15 minutes |

Deadman Switches (Proactive Triggers)

A deadman switch fires when an expected event does NOT happen within a time window. Use this for monitoring: "alert me if no document is ingested within 1 hour."

```bash
curl -X POST https://martha.example.com/api/admin/triggers/ \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "name": "daily-ingestion-check",
    "event_type": "deadman.fired",
    "target_name": "alert-ops-team",
    "deadman_config": {
      "watch_event_type": "document.ingested",
      "within_seconds": 3600,
      "after_event_type": "document.uploaded",
      "recurring": true
    }
  }'
```

Configuration

| Field | Required | Description |
|---|---|---|
| `watch_event_type` | Yes | Event type to watch for |
| `within_seconds` | Yes | Fire if no matching event arrives within this window (minimum 60 s) |
| `watch_filter` | No | Optional EventBridge-style filter on watched events |
| `after_event_type` | No | Only start watching after this event occurs (prevents cold-start false positives) |
| `recurring` | No | Reset and watch again after firing (default: `true`) |

How it works

Each deadman trigger runs as a Temporal workflow with a durable timer. When a matching event arrives, the dispatcher signals the workflow to reset its timer. If the timer expires without a signal, the workflow emits a deadman.fired event and (if recurring) starts watching again.
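The timer logic can be sketched without Temporal (plain floats stand in for durable timers, and method calls for workflow signals):

```python
class DeadmanWatch:
    """Sketch of a deadman switch: matching events push the deadline out;
    if the deadline passes without one, the switch fires."""

    def __init__(self, within_seconds, now, recurring=True):
        self.within = within_seconds
        self.recurring = recurring
        self.deadline = now + within_seconds
        self.fired = []  # times at which deadman.fired would be emitted

    def on_event(self, now):
        """Dispatcher signal: a watched event arrived, reset the timer."""
        self.deadline = now + self.within

    def tick(self, now):
        """Timer check: fire if the window elapsed without a matching event."""
        if now >= self.deadline:
            self.fired.append(now)
            if self.recurring:
                self.deadline = now + self.within
        return len(self.fired)
```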

!!! note
    A trigger cannot have both `schedule_config` and `deadman_config`. A trigger is one of: reactive (default), scheduled, or proactive.


Updated API Reference

| Method | Path | Description |
|---|---|---|
| POST | `/api/admin/triggers/` | Create a trigger |
| GET | `/api/admin/triggers/` | List triggers (supports `active_only`, `event_type`, `skip`, `limit`) |
| GET | `/api/admin/triggers/{name}` | Get a trigger by name |
| PUT | `/api/admin/triggers/{name}` | Update a trigger |
| DELETE | `/api/admin/triggers/{name}` | Delete a trigger |
| POST | `/api/admin/triggers/{name}/test` | Dry-run test against sample event |
| POST | `/api/admin/triggers/{name}/backfill` | Replay historical events against trigger |
| POST | `/api/admin/triggers/events/emit` | Manually emit a CloudEvent |
| GET | `/api/admin/triggers/events/types` | List built-in event types |
| GET | `/api/admin/triggers/events/recent` | Recent events from Redis Stream |
| POST | `/api/admin/triggers/events/types/custom` | Create a custom event type |
| GET | `/api/admin/triggers/events/types/custom` | List custom event types |
| PUT | `/api/admin/triggers/events/types/custom/{type}` | Update a custom event type |
| DELETE | `/api/admin/triggers/events/types/custom/{type}` | Delete a custom event type |
| POST | `/api/admin/webhooks` | Create a webhook definition (returns secret) |
| GET | `/api/admin/webhooks` | List webhook definitions |
| GET | `/api/admin/webhooks/{name}` | Get a webhook definition |
| DELETE | `/api/admin/webhooks/{name}` | Delete a webhook definition |
| POST | `/api/admin/webhooks/{name}/rotate` | Rotate webhook secret |
| POST | `/api/webhooks/events/{tenant_id}/{webhook_name}` | Receive external webhook |

Martha is built by aiaiai-pt.