Logfire
Set up Logfire for AI Observability
Triggers
On Alert Received: Trigger when a Logfire alert is received via webhook
Actions
Query Logfire: Execute a read-only SQL query against the Logfire Query API
Instructions
Create a Logfire API key for SuperPlane
- Open Settings in Logfire.
- Under ORG: <your-username>, select API Keys.
- Click New API Key.
- Enter a key name.
- Enable these five scopes:
  - Organization scope: organization:write_channel (required for auto-creating webhook channels)
  - Project scopes: project:read, project:read_token, project:read_alert, and project:write_alert
- Select All Projects or a specific project from the dropdown.
- Click Create API Key.
- Copy the API key and paste it into SuperPlane when configuring the Logfire integration.
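Once the key exists, you can sanity-check query access outside SuperPlane. A minimal sketch, assuming the US-region API host and a project read token (both values below are placeholders; SuperPlane uses the API key above to generate such a read token per project):

```python
import urllib.parse
import urllib.request

# Hypothetical values -- substitute your own host region and read token.
LOGFIRE_API = "https://logfire-api.pydantic.dev"
READ_TOKEN = "pylf_v1_us_example_read_token"

def build_query_request(sql: str) -> urllib.request.Request:
    """Build an authenticated GET request against the Logfire Query API."""
    url = f"{LOGFIRE_API}/v1/query?{urllib.parse.urlencode({'sql': sql})}"
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {READ_TOKEN}", "Accept": "application/json"},
    )

req = build_query_request("SELECT start_timestamp, message FROM records LIMIT 10")
# urllib.request.urlopen(req) would return the JSON query results once the token is valid.
```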
On Alert Received
The On Alert Received trigger starts a workflow execution when Logfire sends an alert payload to your SuperPlane webhook URL.
Configuration
Select the Logfire Project and Alert that should trigger the workflow.
Webhook setup
After you save this trigger, SuperPlane provides a webhook URL. Add that URL as a Logfire notification webhook target so alert events are sent to this workflow.
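The payload delivered to that webhook becomes the trigger's event data for downstream steps. As an illustration, a sketch that condenses such an event into a one-line summary suitable for chat or email (`summarize_alert` is a hypothetical helper, not part of SuperPlane; field names follow the Example Data in this doc):

```python
def summarize_alert(event: dict) -> str:
    """Render a Logfire alert event as a one-line summary."""
    d = event["data"]
    return f'[{d["severity"].upper()}] {d["alertName"]}: {d["message"]} ({d["url"]})'

event = {
    "data": {
        "alertId": "alt_123",
        "alertName": "Latency spike",
        "eventType": "firing",
        "message": "p95 latency exceeded threshold",
        "severity": "warning",
        "url": "https://logfire-us.pydantic.dev/my-org/my-project/alerts/alt_123",
    },
    "timestamp": "2026-03-23T12:00:00.000000000Z",
    "type": "logfire.alert.received",
}
print(summarize_alert(event))
# [WARNING] Latency spike: p95 latency exceeded threshold (https://logfire-us.pydantic.dev/my-org/my-project/alerts/alt_123)
```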
Example Data
{
  "data": {
    "alertId": "alt_123",
    "alertName": "Latency spike",
    "eventType": "firing",
    "message": "p95 latency exceeded threshold",
    "severity": "warning",
    "url": "https://logfire-us.pydantic.dev/my-org/my-project/alerts/alt_123"
  },
  "timestamp": "2026-03-23T12:00:00.000000000Z",
  "type": "logfire.alert.received"
}
Query Logfire
The Query Logfire component executes a read-only SQL query against Logfire and returns query results for use in downstream steps.
Use Cases
- Investigate traces and spans: Query recent records for errors, latency spikes, or specific services
- Build reporting workflows: Export Logfire data into Slack, email, dashboards, or data stores
- Conditional automation: Query for specific conditions, then branch workflow logic based on returned rows
- Scheduled analytics: Run recurring SQL queries to monitor usage and operational metrics
Configuration
- Project: Required Logfire project to query (scopes the generated read token)
- SQL: Required SQL query (supports expressions). Example: SELECT start_timestamp, message FROM records LIMIT 10
- Time Window: Optional preset time window (e.g., Last 5 minutes, Last 1 hour). Select “Custom” to specify exact timestamps
- Limit: Optional maximum rows to return. If omitted, Logfire defaults to 500; maximum is 10000
- Row Oriented: Optional JSON format toggle. false returns column-oriented JSON; true returns row-oriented JSON
Output
Emits one logfire.query event containing the Logfire query response (for example columns and/or rows, depending on format options).
Use this output to transform, filter, or route query results to other components.
Example Output
{
  "data": {
    "columns": [
      { "name": "start_timestamp", "type": "timestamp" },
      { "name": "message", "type": "text" }
    ],
    "rows": [
      [ "2026-01-01T00:00:00Z", "Example Logfire record" ]
    ]
  },
  "timestamp": "2026-03-23T12:00:00.000000000Z",
  "type": "logfire.query"
}
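When Row Oriented is false, downstream steps often want the columns/rows pairing re-zipped into per-row records before filtering or routing. A sketch using the example output above (`to_records` is a hypothetical helper, not a SuperPlane feature):

```python
def to_records(result: dict) -> list[dict]:
    """Zip column-oriented query output into a list of row dicts."""
    names = [col["name"] for col in result["columns"]]
    return [dict(zip(names, row)) for row in result["rows"]]

data = {
    "columns": [
        {"name": "start_timestamp", "type": "timestamp"},
        {"name": "message", "type": "text"},
    ],
    "rows": [["2026-01-01T00:00:00Z", "Example Logfire record"]],
}
to_records(data)
# [{'start_timestamp': '2026-01-01T00:00:00Z', 'message': 'Example Logfire record'}]
```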