Thrubit 1.3.2 is a major step forward for developers building AI-powered workflows with AWS Step Functions. This release expands Bedrock support beyond simple model invocation and brings the full range of Bedrock service integrations into local execution.
The result is a development experience where you can design, run, and debug complex AI workflows locally, without deploying to AWS and without guessing how your state machine will behave in production.
Why This Release Matters
Until now, most local development workflows for Step Functions treated AI as a black box. You could simulate a model call, but not the full orchestration around retrieval, guardrails, or agents.
Thrubit 1.3.2 changes that.
You can now build complete AI pipelines locally, including retrieval-augmented generation, content moderation, and agent-driven workflows, all using the same service integrations that run in AWS.
This closes a major gap between development and production for teams building with Bedrock.
Expanded Bedrock Service Support
Thrubit now supports the full set of Step Functions Bedrock integrations, enabling realistic orchestration patterns that mirror AWS behavior.
Knowledge Base Retrieval with bedrock:retrieve
You can query a Bedrock Knowledge Base directly and return ranked document chunks without generating a response.
This is critical for:
- Validating ingestion pipelines
- Debugging retrieval quality
- Building custom generation layers on top of retrieved data
It gives developers visibility into what the model sees before generation happens.
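As a minimal sketch, a Task state using this integration might look like the following. The resource ARN shorthand, the knowledge base ID, and the parameter shape (modeled on the Bedrock Agent Runtime Retrieve API) are illustrative assumptions, not confirmed by this release:

```json
{
  "QueryKnowledgeBase": {
    "Type": "Task",
    "Resource": "arn:aws:states:::bedrock:retrieve",
    "Parameters": {
      "KnowledgeBaseId": "kb-EXAMPLE123",
      "RetrievalQuery": { "Text.$": "$.question" }
    },
    "ResultPath": "$.retrieval",
    "Next": "GenerateAnswer"
  }
}
```

Because the ranked chunks land at $.retrieval, downstream states can inspect or transform them before any generation step runs.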
Full RAG Workflows with bedrock:retrieveAndGenerate
This integration enables true retrieval-augmented generation in a single step.
You can:
- Retrieve relevant context from a knowledge base
- Generate a grounded response based on that context
Locally.
That means you can test hallucination handling, prompt grounding, and response quality without deploying or incurring costs.
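A single Task state can cover both steps. This sketch assumes the parameter shape of the Bedrock Agent Runtime RetrieveAndGenerate API; the knowledge base ID and model ARN are placeholders:

```json
{
  "RetrieveAndGenerate": {
    "Type": "Task",
    "Resource": "arn:aws:states:::bedrock:retrieveAndGenerate",
    "Parameters": {
      "Input": { "Text.$": "$.question" },
      "RetrieveAndGenerateConfiguration": {
        "Type": "KNOWLEDGE_BASE",
        "KnowledgeBaseConfiguration": {
          "KnowledgeBaseId": "kb-EXAMPLE123",
          "ModelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"
        }
      }
    },
    "End": true
  }
}
```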
Content Safety with bedrock:applyGuardrail
Guardrails are now part of your local workflow.
You can run content through Bedrock Guardrails and receive:
- Action results such as NONE or INTERVENED
- Policy-level assessments
This enables:
- Pre-processing safety checks
- Enforcement of compliance rules
- Early detection of risky inputs
Instead of discovering issues in production, you can validate moderation logic during development.
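A pre-processing safety check could pair the guardrail Task with a Choice state that branches on the returned action. The guardrail ID, field names, and the INTERVENED value (taken from the action results described above) are illustrative assumptions:

```json
{
  "ScreenInput": {
    "Type": "Task",
    "Resource": "arn:aws:states:::bedrock:applyGuardrail",
    "Parameters": {
      "GuardrailIdentifier": "gr-EXAMPLE",
      "GuardrailVersion": "1",
      "Source": "INPUT",
      "Content": [ { "Text": { "Text.$": "$.userInput" } } ]
    },
    "ResultPath": "$.guardrail",
    "Next": "CheckAction"
  },
  "CheckAction": {
    "Type": "Choice",
    "Choices": [
      {
        "Variable": "$.guardrail.Action",
        "StringEquals": "INTERVENED",
        "Next": "RejectInput"
      }
    ],
    "Default": "InvokeModel"
  }
}
```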
Agent Workflows with bedrock-agent:invokeAgent
Thrubit now supports invoking Bedrock Agents directly from your state machines.
Both standard and .waitForTaskToken patterns are recognized and executed synchronously in local mode.
This unlocks:
- Multi-step agent orchestration
- Tool usage simulation
- Complex decision flows driven by AI agents
You can now debug agent behavior visually in your execution graph, seeing exactly how decisions are made across states.
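An agent invocation might be expressed as a Task state like this sketch. The agent and alias IDs are placeholders, and the parameter names follow the Bedrock Agent Runtime InvokeAgent API as an assumption:

```json
{
  "InvokeAgent": {
    "Type": "Task",
    "Resource": "arn:aws:states:::bedrock-agent:invokeAgent",
    "Parameters": {
      "AgentId": "AGENT123",
      "AgentAliasId": "ALIAS123",
      "SessionId.$": "$$.Execution.Name",
      "InputText.$": "$.request"
    },
    "End": true
  }
}
```

Using the execution name as the session ID keeps multi-step conversations with the same agent grouped per execution.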
Smarter Mock Responses That Actually Match Reality

One of the biggest challenges in local AI development is mismatched response structures.
Thrubit 1.3.2 solves this with model-aware mock responses.
When Mock Bedrock AI is enabled, responses are shaped exactly like the real models, including:
- Claude 3 and Nova using $.Body.output.message.content[0].text
- Claude 2 using $.Body.completion
- Titan using $.Body.results[0].outputText
- Llama models using $.Body.generation
- Mistral and Mixtral using $.Body.outputs[0].text
- Cohere using $.Body.generations[0].text
- Image models returning base64 artifacts
This means your ResultSelector logic behaves identically in mock and real environments.
No more rewriting JSON paths when switching from local to AWS.
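For example, a ResultSelector written against the Claude 3 / Nova response shape listed above can stay the same in mock and real runs. The model ID and request body here are illustrative placeholders:

```json
{
  "InvokeNova": {
    "Type": "Task",
    "Resource": "arn:aws:states:::bedrock:invokeModel",
    "Parameters": {
      "ModelId": "amazon.nova-lite-v1:0",
      "Body": {
        "messages": [
          { "role": "user", "content": [ { "text.$": "$.prompt" } ] }
        ]
      }
    },
    "ResultSelector": {
      "answer.$": "$.Body.output.message.content[0].text"
    },
    "End": true
  }
}
```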
Unified Execution Modes in the UI
The Settings UI now introduces a consistent toggle for Bedrock:
- Real AWS
- Mock
This matches the existing Lambda execution mode, making it easier to control how your workflows run without changing your code.
You can:
- Develop quickly in mock mode
- Switch to real AWS for validation
- Move to production with confidence
Updated Sample Workflows
Thrubit’s bundled workflows now demonstrate real-world AI orchestration patterns using the new integrations.
Content Moderation Pipeline
The updated moderation workflows now:
- Run input through bedrock:applyGuardrail as a first-pass filter
- Classify content using bedrock:invokeModel
- Fall back to Lambda-based classification on failure
This pattern reflects how production systems handle safety and reliability.
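The three stages could be wired together roughly like this sketch; state names, guardrail and model IDs, and the fallback function name are all placeholders:

```json
{
  "StartAt": "ApplyGuardrail",
  "States": {
    "ApplyGuardrail": {
      "Type": "Task",
      "Resource": "arn:aws:states:::bedrock:applyGuardrail",
      "Parameters": {
        "GuardrailIdentifier": "gr-EXAMPLE",
        "GuardrailVersion": "1",
        "Source": "INPUT",
        "Content": [ { "Text": { "Text.$": "$.input" } } ]
      },
      "ResultPath": "$.guardrail",
      "Next": "ClassifyContent"
    },
    "ClassifyContent": {
      "Type": "Task",
      "Resource": "arn:aws:states:::bedrock:invokeModel",
      "Parameters": {
        "ModelId": "amazon.nova-lite-v1:0",
        "Body": {
          "messages": [
            { "role": "user", "content": [ { "text.$": "$.input" } ] }
          ]
        }
      },
      "Catch": [
        { "ErrorEquals": [ "States.ALL" ], "Next": "LambdaFallback" }
      ],
      "End": true
    },
    "LambdaFallback": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "FunctionName": "classify-content-fallback",
        "Payload.$": "$"
      },
      "End": true
    }
  }
}
```

The Catch on States.ALL is what gives the pipeline its reliability story: a model failure routes to the Lambda classifier instead of failing the execution.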
RAG Knowledge Base Workflow
The RAG workflows now include:
- Validation of document indexing using bedrock:retrieve
- Grounded summary generation with bedrock:retrieveAndGenerate
- Express workflows that confirm index propagation before updates
This provides a complete lifecycle view of how knowledge bases are built and queried.
What This Means for Developers
Thrubit 1.3.2 fundamentally changes how AI workflows are built with Step Functions.
Build Complete AI Pipelines Locally
You are no longer limited to simulating a single model call. You can develop full pipelines including retrieval, generation, guardrails, and agents in one place.
Eliminate Cloud Guesswork
Instead of deploying to AWS to test behavior, you can:
- Inspect execution paths visually
- Validate JSON transformations
- Debug failures instantly
Reduce Costs While Iterating
AI workflows are expensive to test in the cloud. Every retrieval, generation, and orchestration step adds up.
Running these workflows locally removes that cost during development.
Ship with Confidence
Because Thrubit mirrors real Bedrock integrations and response structures, what works locally will behave the same in AWS.
Closing Thoughts
Thrubit 1.3.2 is more than a feature release. It is a shift toward fully local AI workflow development.
By bringing Bedrock’s orchestration capabilities into a local environment, developers can move faster, reduce risk, and build more sophisticated systems without the friction of constant cloud deployment.
For teams working with AWS Step Functions and Bedrock, this release removes one of the biggest bottlenecks in modern AI development.
You can now design, test, and refine intelligent workflows where they should have always been built first. Locally.