# Mistral AI API
## API Key

```python
import os

# env variable
os.environ['MISTRAL_API_KEY']
```
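A missing or empty key only surfaces as an error when the first request is made, so it can help to fail fast at startup. A minimal sketch of such a check (the `require_mistral_key` helper is illustrative, not part of litellm):

```python
import os


def require_mistral_key() -> str:
    """Return the Mistral API key from the environment, failing fast if unset.

    Illustrative helper, not a litellm API.
    """
    key = os.environ.get("MISTRAL_API_KEY")
    if not key:
        raise RuntimeError("MISTRAL_API_KEY is not set")
    return key


os.environ["MISTRAL_API_KEY"] = "sk-example"  # placeholder, not a real key
print(require_mistral_key())
```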
## Sample Usage

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""

response = completion(
    model="mistral/mistral-tiny",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
## Sample Usage - Streaming

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""

response = completion(
    model="mistral/mistral-tiny",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True,
)
for chunk in response:
    print(chunk)
```
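Each streamed chunk carries only a fragment of the reply, so a common pattern is to accumulate the deltas into the full text. A sketch under the assumption that chunks follow the OpenAI-style stream layout (`choices[0].delta.content`, empty on the final chunk); plain dicts stand in for real chunks here:

```python
def collect_stream(chunks) -> str:
    """Concatenate the text deltas from an OpenAI-style streaming response.

    Assumes each chunk exposes choices[0]["delta"]["content"]; the final
    chunk typically carries no content. Dicts are used for illustration.
    """
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            parts.append(delta)
    return "".join(parts)


# Hypothetical chunks mimicking a streamed reply:
fake_chunks = [
    {"choices": [{"delta": {"content": "hello"}}]},
    {"choices": [{"delta": {"content": " from litellm"}}]},
    {"choices": [{"delta": {}}]},  # final chunk carries no content
]
print(collect_stream(fake_chunks))  # hello from litellm
```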
## Supported Models

All models listed at https://docs.mistral.ai/platform/endpoints are supported. We actively maintain the list of models, pricing, token window, etc. here.

| Model Name | Function Call |
|------------|---------------|
| mistral-tiny | `completion(model="mistral/mistral-tiny", messages)` |
| mistral-small | `completion(model="mistral/mistral-small", messages)` |
| mistral-medium | `completion(model="mistral/mistral-medium", messages)` |
| mistral-large-latest | `completion(model="mistral/mistral-large-latest", messages)` |
| open-mixtral-8x22b | `completion(model="mistral/open-mixtral-8x22b", messages)` |
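As the table shows, every model is addressed by prefixing its name with the `mistral/` provider route. A small illustrative helper (not a litellm API) that validates a bare model name against the list above and adds the prefix:

```python
# Chat models from the table above (illustrative; the authoritative list
# lives in the litellm model registry).
SUPPORTED_CHAT_MODELS = {
    "mistral-tiny",
    "mistral-small",
    "mistral-medium",
    "mistral-large-latest",
    "open-mixtral-8x22b",
}


def qualify(model: str) -> str:
    """Prepend the "mistral/" provider prefix litellm routes on.

    Hypothetical convenience wrapper; litellm itself simply expects the
    already-prefixed string.
    """
    if model not in SUPPORTED_CHAT_MODELS:
        raise ValueError(f"unknown Mistral chat model: {model}")
    return f"mistral/{model}"


print(qualify("mistral-tiny"))  # mistral/mistral-tiny
```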
## Sample Usage - Embedding

```python
from litellm import embedding
import os

os.environ['MISTRAL_API_KEY'] = ""

response = embedding(
    model="mistral/mistral-embed",
    input=["good morning from litellm"],
)
print(response)
```
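Embedding vectors are typically compared with cosine similarity, e.g. for semantic search over the returned vectors. A self-contained pure-Python sketch (tiny 2-d vectors are used for illustration; real `mistral-embed` vectors are much higher-dimensional):

```python
import math


def cosine_similarity(a, b) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Identical directions score 1.0; orthogonal directions score 0.0.
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```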
## Supported Models

All models listed at https://docs.mistral.ai/platform/endpoints are supported.

| Model Name | Function Call |
|------------|---------------|
| mistral-embed | `embedding(model="mistral/mistral-embed", input)` |