# LLM Wrappers
We **don't** provide built-in LLM wrappers. Instead, please implement your own or check out libraries like [litellm](https://github.com/BerriAI/litellm).
Here, we provide some minimal example implementations:
1. OpenAI
```python
def call_llm(prompt):
    from openai import OpenAI
    client = OpenAI(api_key="YOUR_API_KEY_HERE")
    r = client.chat.completions.create(
        model="gpt-4o",  # use any chat model available to your key
        messages=[{"role": "user", "content": prompt}]
    )
    return r.choices[0].message.content

# Example usage
call_llm("How are you?")
```
> Store the API key in an environment variable like OPENAI_API_KEY for security.
{: .note }
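As a concrete sketch of that note (the helper name `get_api_key` is just for illustration, not part of any library):

```python
import os

def get_api_key(var="OPENAI_API_KEY"):
    # Read the key from the environment; fail fast with a clear error if missing
    key = os.environ.get(var)
    if key is None:
        raise RuntimeError(f"Set the {var} environment variable")
    return key
```

You can then construct the client with `OpenAI(api_key=get_api_key())` instead of hard-coding the key.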
2. Claude (Anthropic)
```python
def call_llm(prompt):
    from anthropic import Anthropic
    client = Anthropic(api_key="YOUR_API_KEY_HERE")
    response = client.messages.create(
        model="claude-2",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=100
    )
    # response.content is a list of content blocks; return the first block's text
    return response.content[0].text
```
3. Google (Generative AI Studio / PaLM API)
```python
def call_llm(prompt):
    import google.generativeai as genai
    genai.configure(api_key="YOUR_API_KEY_HERE")
    response = genai.generate_text(
        model="models/text-bison-001",
        prompt=prompt
    )
    return response.result
```
4. Azure (Azure OpenAI)
```python
def call_llm(prompt):
    from openai import AzureOpenAI
    client = AzureOpenAI(
        azure_endpoint="https://<YOUR_RESOURCE_NAME>.openai.azure.com/",
        api_key="YOUR_API_KEY_HERE",
        api_version="2023-05-15"
    )
    r = client.chat.completions.create(
        model="<YOUR_DEPLOYMENT_NAME>",
        messages=[{"role": "user", "content": prompt}]
    )
    return r.choices[0].message.content
```
5. Ollama (Local LLM)
```python
def call_llm(prompt):
    from ollama import chat
    response = chat(
        model="llama2",
        messages=[{"role": "user", "content": prompt}]
    )
    return response.message.content
```
## Improvements
Feel free to enhance your `call_llm` function as needed. Here are examples:
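One common enhancement is retrying on transient API errors. A minimal sketch (the `with_retry` helper and its parameters are illustrative, not part of any library):

```python
import time

def with_retry(fn, prompt, retries=3, delay=1.0):
    # Call fn(prompt), retrying on failure with simple exponential backoff
    for attempt in range(retries):
        try:
            return fn(prompt)
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts; surface the last error
            time.sleep(delay * (2 ** attempt))
```

Usage: `with_retry(call_llm, "How are you?")`.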