Supported models
Models and context lengths available through the Meta API:
| Model | Context Length (tokens) |
|---|---|
| meta-llama/Llama-3-8b-chat-hf | 8000 |
| meta-llama/Llama-3-70b-chat-hf | 8000 |
| codellama/CodeLlama-13b-Instruct-hf | 16384 |
| codellama/CodeLlama-34b-Instruct-hf | 16384 |
| codellama/CodeLlama-70b-Instruct-hf | 4096 |
| codellama/CodeLlama-7b-Instruct-hf | 16384 |
| meta-llama/Llama-2-70b-chat-hf | 4096 |
| meta-llama/Llama-2-13b-chat-hf | 4096 |
| meta-llama/Llama-2-7b-chat-hf | 4096 |
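
The endpoint is OpenAI-compatible, so you can call these models with the OpenAI Python SDK by pointing `base_url` at the zgiai API, for example: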
```python
from openai import OpenAI

# Point the OpenAI client at the zgiai API endpoint
client = OpenAI(
    api_key="YOUR_API_KEY",  # replace with your own API key
    base_url="https://api.zgiai.com/v1",
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Who are you?",
        }
    ],
    model="meta-llama/Llama-3-70b-chat-hf",
)

print(chat_completion.choices[0].message.content)
```
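
The context lengths in the table above cover the prompt plus the completion. A minimal sketch of bounding the response size with the standard OpenAI `max_tokens` parameter follows; whether the zgiai endpoint enforces it exactly this way is an assumption, and the prompt text is illustrative only.

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",               # assumption: same key as above
    base_url="https://api.zgiai.com/v1",
)

# Llama 3 models in the table have an 8000-token context window,
# which the prompt and the completion share.
response = client.chat.completions.create(
    model="meta-llama/Llama-3-8b-chat-hf",
    messages=[{"role": "user", "content": "Explain what a context window is."}],
    max_tokens=512,  # keep the completion well inside the 8000-token limit
)

print(response.choices[0].message.content)
```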