
Batch LLM Status Tracker

Send an array through the OpenAI Batch API, branch on live status updates, and collect the final per-item results.

Use one LLM node for bulk prompting and a second branch for progress-aware side effects. This template demonstrates Heym's Batch API mode on the plain LLM node, including the dedicated batchStatus branch.

What this workflow does

  1. Variable node seeds a demo array of prompts
  2. LLM node sends that array through Batch API mode
  3. STATUS branch maps live progress updates such as pending, processing, and completed
  4. Main path reshapes the final per-item results and returns them as the workflow output
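The STATUS branch in step 3 is essentially a mapping from batch status to a progress-aware side effect. A minimal TypeScript sketch of that mapping, using the status names from the list above (the message strings are illustrative, not Heym's actual output):

```typescript
// Status names taken from this template's description: pending,
// processing, completed. The messages below are illustrative only.
type BatchStatus = "pending" | "processing" | "completed";

function progressMessage(status: BatchStatus): string {
  switch (status) {
    case "pending":
    case "processing":
      // Batch is still running on the provider side.
      return `Batch still running (${status})`;
    case "completed":
      // Final results are ready to hand to the main path.
      return "Batch completed, collecting per-item results";
  }
}
```

In the workflow, each STATUS update would drive a side effect such as a notification node, while the main path only fires once the completed results arrive.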

Use cases

  • Lower-cost bulk prompting via OpenAI's Batch API
  • Queue-style AI enrichment jobs
  • Progress notifications while the provider batch is still running
  • Batch JSON extraction with downstream per-item handling
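For context on the bulk-prompting use case: OpenAI's Batch API accepts a JSONL payload in which each line is one self-contained request. A sketch of how an array of prompts could be flattened into that shape (the model name and the `custom_id` scheme are illustrative choices, not part of this template):

```typescript
// Build an OpenAI Batch API JSONL payload: one chat-completion
// request per prompt. Model name and custom_id scheme are
// illustrative assumptions, not taken from the template.
function toBatchJsonl(prompts: string[], model = "gpt-4o-mini"): string {
  return prompts
    .map((prompt, i) =>
      JSON.stringify({
        custom_id: `item-${i}`, // lets you match results back to inputs
        method: "POST",
        url: "/v1/chat/completions",
        body: { model, messages: [{ role: "user", content: prompt }] },
      })
    )
    .join("\n");
}
```

The per-item `custom_id` is what makes "downstream per-item handling" possible: results come back out of order, so you re-join them to the original array by id.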

Setup

Use an OpenAI API credential and a model Heym shows as batch-capable. Replace the demo $vars.promptList array with a dynamic expression such as $input.items.map("item.text") when you turn this into a production workflow.
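The swap from the static array to the dynamic expression is a plain per-item projection. The same transformation written out in TypeScript, where the item shape is an assumed example (the actual Heym syntax is the $input.items.map("item.text") expression above):

```typescript
// Assumed item shape for illustration; real upstream items may
// carry more fields.
interface InputItem { text: string }

// Equivalent of the Heym expression $input.items.map("item.text"):
// project each upstream item to the text that becomes one prompt.
function toPromptList(items: InputItem[]): string[] {
  return items.map((item) => item.text);
}
```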

Notes

Batch mode is available only on the LLM node and only for text: it cannot be combined with image input or image output.

How to import this template

  1. Click Import → Copy JSON on this page.
  2. Open your Heym and navigate to a workflow canvas.
  3. Press Cmd+V / Ctrl+V and the nodes appear instantly.
  4. Add your API keys in the node config panels and click Run.


7 nodes · Free & source-available
