Pocket Flow - LLM Framework in 100 Lines



A 100-line minimalist LLM framework for (Multi-)Agents, Prompt Chaining, RAG, etc.

  • Install via pip install pocketflow, or just copy the source code (it's only 100 lines)

  • If the 100 lines feel too terse and you'd prefer a friendlier intro, check this out

Documentation: https://the-pocket.github.io/PocketFlow/

Why Pocket Flow?

Pocket Flow is designed to be the framework used by LLMs. In the future, LLM projects will be self-programmed by LLMs themselves: users specify the requirements, and LLMs design, build, and maintain them. To build LLM projects with LLM assistants (ChatGPT, Claude, Cursor.ai, etc.):

(🫵 Click to expand) Use Claude to build LLM apps
  • Create a project and upload the docs to project knowledge

  • Set project custom instructions. For example:

    1. Check "tool.md" and "llm.md" for the required functions.
    2. Design the high-level (batch) flow and nodes in an artifact using Mermaid.
    3. Design the shared memory structure: define its fields, data structures, and how they will be updated.
    Think out loud about the above first, and ask the user whether the design makes sense.
    4. Finally, implement. Start with simple, minimal code without, for example, type annotations. Write the code in an artifact.
    
  • Ask it to build LLM apps (Claude 3.5 Sonnet strongly recommended)! For example, with the prompt below; a hypothetical shared-store sketch for it follows.

    Help me build a chatbot based on a directory of PDFs.
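
For a request like the one above, step 3 of the example instructions (the shared memory design) might come back as something along these lines. This is purely a hypothetical sketch; every field name below is made up for illustration:

    # Hypothetical shared-store design an assistant might propose for the
    # PDF-chatbot request above. All field names are illustrative only.
    shared = {
        "pdf_dir": "./pdfs",   # input: directory of PDF files
        "chunks": [],          # list of {"source": str, "text": str} pieces
        "index": None,         # vector index built over the chunks
        "question": "",        # the user's current question
        "retrieved": [],       # chunks retrieved for the question
        "answer": "",          # final LLM answer shown to the user
    }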
    
(🫵 Click to expand) Use ChatGPT to build LLM apps

How does it work?

The 100 lines capture what we see as the core abstraction of LLM frameworks: a Graph that breaks tasks down into multiple (LLM) steps, with branching and recursion for agent-like decision-making, and a Shared Store through which the steps communicate.
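
As a rough sketch of what that looks like in code (the class and method names here, such as Node, Flow, prep, exec, and post, and the >> transition syntax, are assumptions based on the documentation rather than a verbatim excerpt of the API):

    # Minimal two-step flow: each node reads from and writes to a shared store.
    # Names and syntax are assumptions; consult the documentation for the real API.
    from pocketflow import Node, Flow

    def call_llm(prompt):
        # Placeholder for your own LLM utility (e.g. an OpenAI or Anthropic call).
        return f"<LLM response to: {prompt}>"

    class Summarize(Node):
        def prep(self, shared):
            return shared["text"]                 # read input from the shared store
        def exec(self, text):
            return call_llm(f"Summarize in one sentence: {text}")
        def post(self, shared, prep_res, exec_res):
            shared["summary"] = exec_res          # write the result back
            return "default"                      # action string selects the next edge

    class Translate(Node):
        def prep(self, shared):
            return shared["summary"]
        def exec(self, summary):
            return call_llm(f"Translate to French: {summary}")
        def post(self, shared, prep_res, exec_res):
            shared["translation"] = exec_res

    summarize, translate = Summarize(), Translate()
    summarize >> translate                        # graph edge: summarize -> translate
    flow = Flow(start=summarize)

    shared = {"text": "Pocket Flow is a 100-line LLM framework."}
    flow.run(shared)                              # nodes communicate only through `shared`
    print(shared["translation"])

The point is the shape rather than the exact names: each step only touches the shared store, so nodes stay independent and easy to rearrange.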



From there, it's easy to implement popular design patterns like (Multi-)Agents, Prompt Chaining, RAG, etc.
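
Agent-style behavior, for instance, falls out of the action strings returned by each step: whatever a node returns selects which outgoing edge the flow follows next. Again a hedged sketch, with the same assumed names and the conditional-transition syntax (node - "action" >> other) taken as an assumption:

    # Hypothetical agent loop: decide whether to search for more context or answer.
    from pocketflow import Node, Flow

    class Decide(Node):
        def post(self, shared, prep_res, exec_res):
            # A real agent would ask the LLM; here we branch on the store contents.
            return "search" if "context" not in shared else "answer"

    class Search(Node):
        def post(self, shared, prep_res, exec_res):
            shared["context"] = "placeholder search results"
            return "default"

    class Answer(Node):
        def post(self, shared, prep_res, exec_res):
            shared["answer"] = f"Answer based on: {shared['context']}"

    decide, search, answer = Decide(), Search(), Answer()
    decide - "search" >> search    # branch: gather more context
    decide - "answer" >> answer    # branch: enough context, answer now
    search >> decide               # loop back for another decision

    Flow(start=decide).run({"question": "What is Pocket Flow?"})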



  • For more details, please check out the documentation
  • For an in-depth dive into the design choices, check out the essay