update docs

zachary62 2025-01-01 19:37:50 +00:00
parent 976a266cd3
commit 27a86e8568
4 changed files with 16 additions and 16 deletions


@@ -1,11 +1,11 @@
---
layout: default
title: "Async"
title: "(Advanced) Async"
parent: "Core Abstraction"
nav_order: 5
---
# Async
# (Advanced) Async
**Mini LLM Flow** supports fully asynchronous Nodes: implement `prep_async()`, `exec_async()`, `exec_fallback_async()`, and/or `post_async()`. This is useful for:
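
A minimal sketch of such an async Node, assuming `AsyncNode` is the base class exported by `minillmflow` and that the async methods mirror the sync `prep(shared)`/`exec`/`post` signatures; `call_llm_async` is a hypothetical helper:

```python
# Hedged sketch: only the method names come from this page; the base class,
# import path, and exact signatures are assumptions. call_llm_async is a placeholder.
import asyncio

from minillmflow import AsyncNode  # assumed export


async def call_llm_async(prompt: str) -> str:
    await asyncio.sleep(0)  # stand-in for a real non-blocking LLM call
    return f"(summary of) {prompt[:40]}"


class SummarizeAsync(AsyncNode):
    async def prep_async(self, shared):
        # Read the input text from the shared store without blocking.
        return shared["text"]

    async def exec_async(self, text):
        # The LLM call happens here, awaited so other tasks can run meanwhile.
        return await call_llm_async(f"Summarize: {text}")

    async def exec_fallback_async(self, text, exc):
        # Graceful degradation if exec_async keeps failing (signature assumed).
        return "Sorry, summarization failed."

    async def post_async(self, shared, prep_res, exec_res):
        # Store the result and pick the next action.
        shared["summary"] = exec_res
        return "default"
```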


@@ -6,16 +6,15 @@ nav_order: 1
# Mini LLM Flow
A [100-line](https://github.com/zachary62/miniLLMFlow/blob/main/minillmflow/__init__.py) minimalist LLM framework for agents, task decomposition, RAG, etc.
A [100-line](https://github.com/zachary62/miniLLMFlow/blob/main/minillmflow/__init__.py) minimalist LLM framework for *Agents, Task Decomposition, RAG, etc.*
We model the LLM workflow as a **Nested Flow**:
- Each **Node** handles a simple LLM task.
- Nodes are chained together to form a **Flow** for compute-intensive tasks.
- One Node can be chained to multiple Nodes through **Actions** as an agent.
- A Flow can be treated as a Node for **Nested Flows**.
- Both Nodes and Flows can be **Batched** for data-intensive tasks.
- Nodes and Flows can be **Async** for user inputs.
- **Async** Nodes and Flows can be executed in **Parallel**.
We model the LLM workflow as a **Nested Directed Graph**:
- **Nodes** handle simple (LLM) tasks.
- Nodes connect through **Actions** (labeled edges) for *Agents*.
- **Flows** orchestrate a directed graph of Nodes for *Task Decomposition*.
- A Flow can be used as a Node (for **Nesting**).
- **Batch** Nodes/Flows handle data-intensive tasks.
- **Async** Nodes/Flows allow waiting (e.g., for user input) or **Parallel** execution (see the toy sketch below).
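
To make the bullets above concrete, here is a tiny, self-contained toy version of the Node/Action/Flow graph idea. It is not minillmflow's code, only an illustration of the abstraction:

```python
# Toy illustration only: NOT minillmflow's implementation, just the directed-graph idea.

class ToyNode:
    def __init__(self, name, run_fn):
        self.name, self.run_fn = name, run_fn
        self.successors = {}  # action label -> next node (the labeled edges)

    def then(self, action, node):
        self.successors[action] = node
        return node


class ToyFlow:
    def __init__(self, start):
        self.start = start

    def run(self, shared):
        node = self.start
        while node is not None:
            action = node.run_fn(shared) or "default"
            node = node.successors.get(action)  # follow the labeled edge


# A two-node graph: the first node's returned action decides the next node.
decide = ToyNode("decide", lambda s: "greet" if s["lang"] == "en" else "salute")
greet = ToyNode("greet", lambda s: s.update(out="hello"))
salute = ToyNode("salute", lambda s: s.update(out="bonjour"))
decide.then("greet", greet)
decide.then("salute", salute)

shared = {"lang": "en"}
ToyFlow(decide).run(shared)
print(shared["out"])  # -> hello
```

Wrapping a `ToyFlow.run` inside another `ToyNode` gives the nesting described above.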
<div align="center">
<img src="https://github.com/zachary62/miniLLMFlow/blob/main/assets/minillmflow.jpg?raw=true" width="400"/>
@@ -27,8 +26,8 @@ We model the LLM workflow as a **Nested Flow**:
- [Flow](./flow.md)
- [Communication](./communication.md)
- [Batch](./batch.md)
- [Async](./async.md)
- [Parallel](./parallel.md)
- [(Advanced) Async](./async.md)
- [(Advanced) Parallel](./parallel.md)
## Preparation


@@ -7,7 +7,7 @@ nav_order: 1
# Node
A **Node** is the smallest building block of Mini LLM Flow. Each Node has three lifecycle methods:
A **Node** is the smallest building block of Mini LLM Flow. Each Node has 3 steps:
1. **`prep(shared)`**
- Reads and preprocesses data from the `shared` store for LLMs.
@@ -25,6 +25,7 @@ A **Node** is the smallest building block of Mini LLM Flow. Each Node has three
- Examples: finalize outputs, trigger next steps, or log results.
- Returns a **string** to specify the next action (`"default"` if nothing or `None` is returned).
All 3 steps are optional. For example, you might only need to run `prep(shared)` without calling the LLM.
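
A minimal sketch of a Node using all three steps, assuming `Node` is importable from `minillmflow`, that `exec` receives `prep`'s return value, and that `post` receives `shared, prep_res, exec_res`; `call_llm` is a placeholder helper:

```python
# Sketch under assumptions: only prep(shared) and the action-string return of
# post are confirmed by the docs above; the other signatures are assumed.
from minillmflow import Node  # assumed import path


def call_llm(prompt: str) -> str:
    return f"(LLM answer to) {prompt[:40]}"  # placeholder for a real LLM call


class SummarizeText(Node):
    def prep(self, shared):
        # 1. Read and preprocess data from the shared store.
        return shared["text"]

    def exec(self, text):
        # 2. Do the work, typically the LLM call (assumed to receive prep's result).
        return call_llm(f"Summarize: {text}")

    def post(self, shared, prep_res, exec_res):
        # 3. Write results back and return the next action as a string.
        shared["summary"] = exec_res
        return "default"
```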
## Fault Tolerance & Retries


@@ -1,11 +1,11 @@
---
layout: default
title: "Parallel"
title: "(Advanced) Parallel"
parent: "Core Abstraction"
nav_order: 6
---
# Parallel
# (Advanced) Parallel
**Parallel** Nodes and Flows let you run multiple tasks **concurrently**—for example, summarizing multiple texts at once. Unlike a regular **BatchNode**, which processes items sequentially, **AsyncParallelBatchNode** and **AsyncParallelBatchFlow** can fire off tasks in parallel. This can improve performance by overlapping I/O and compute.
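
As a hedged sketch of the parallel variant, assuming `AsyncParallelBatchNode` calls `exec_async` once per item returned by `prep_async` and gathers the results for `post_async` (these semantics, the import path, and `call_llm_async` are assumptions beyond the class name given above):

```python
# Sketch: the per-item fan-out semantics assumed here may differ from the library.
import asyncio

from minillmflow import AsyncParallelBatchNode  # assumed export


async def call_llm_async(prompt: str) -> str:
    await asyncio.sleep(0.1)  # stand-in for a real async LLM call
    return f"(summary of) {prompt[:40]}"


class SummarizeAllInParallel(AsyncParallelBatchNode):
    async def prep_async(self, shared):
        # Each returned item becomes one parallel task (assumed behavior).
        return shared["texts"]

    async def exec_async(self, text):
        # Runs concurrently for every text, overlapping the LLM I/O.
        return await call_llm_async(f"Summarize: {text}")

    async def post_async(self, shared, prep_res, exec_res_list):
        # Collect the gathered results (order assumed to match the input).
        shared["summaries"] = exec_res_list
        return "default"
```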