PocketFlow FastAPI WebSocket Chat

Real-time chat interface with streaming LLM responses, built with PocketFlow, FastAPI, and WebSockets.

Features

  • Real-time Streaming: See AI responses typed out in real-time as the LLM generates them
  • Conversation Memory: Maintains chat history across messages
  • Modern UI: Clean, responsive chat interface with gradient design
  • WebSocket Connection: Persistent connection for instant communication
  • PocketFlow Integration: Uses PocketFlow AsyncNode and AsyncFlow for streaming
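The streaming feature above can be sketched roughly as follows. This is a minimal illustration, not the example's actual code: `stream_chunks` stands in for an LLM streaming API, and `send` stands in for `websocket.send_json`; the message shapes and chunk sizes are assumptions.

```python
import asyncio

async def stream_chunks(prompt):
    # Stand-in for an LLM streaming API: yields the reply a few
    # characters at a time, the way token chunks arrive.
    reply = f"Echo: {prompt}"
    for i in range(0, len(reply), 4):
        await asyncio.sleep(0)  # yield control, as real network I/O would
        yield reply[i:i + 4]

async def handle_message(prompt, send):
    # Forward each chunk to the client as soon as it arrives,
    # then signal that the response is complete.
    async for chunk in stream_chunks(prompt):
        await send({"type": "chunk", "content": chunk})
    await send({"type": "done"})

async def demo():
    sent = []

    async def send(msg):  # collects messages instead of a real socket
        sent.append(msg)

    await handle_message("hi", send)
    return sent

messages = asyncio.run(demo())
```

The browser side simply appends each `chunk` payload to the current message bubble until it sees `done`, which is what produces the typed-out effect.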

How to Run

  1. Set OpenAI API Key:

    export OPENAI_API_KEY="your-openai-api-key"
    
  2. Install Dependencies:

    pip install -r requirements.txt
    
  3. Run the Application:

    python main.py
    
  4. Access the Web UI: Open http://localhost:8000 in your browser.

Usage

  1. Type Message: Enter your message in the input field
  2. Send: Press Enter or click the Send button
  3. Watch Streaming: See the AI response appear in real-time
  4. Continue Chat: Conversation history is maintained automatically
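The conversation memory mentioned in step 4 can be kept as a simple per-connection message list. A minimal sketch, assuming OpenAI-style role/content messages; the example's actual storage keys and system prompt may differ:

```python
# Per-connection chat history; each entry is a role/content dict.
history = []

def add_user_message(text):
    history.append({"role": "user", "content": text})

def add_assistant_message(text):
    history.append({"role": "assistant", "content": text})

def build_messages(system_prompt="You are a helpful assistant."):
    # Prepend the system prompt so every LLM call sees the full
    # conversation so far, which is what preserves context across turns.
    return [{"role": "system", "content": system_prompt}] + history
```

Each new user message and each completed assistant reply are appended before the next LLM call, so no context is lost between turns.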

Files