# Mistral TypeScript Client

## Summary

Mistral AI API: Our Chat Completion and Embeddings APIs specification. Create your account on La Plateforme to get access and read the docs to learn how to use it.
## SDK Installation

The SDK can be installed with either the npm, pnpm, bun or yarn package managers.

### NPM

```bash
npm add @mistralai/mistralai
```

### PNPM

```bash
pnpm add @mistralai/mistralai
```

### Bun

```bash
bun add @mistralai/mistralai
```

### Yarn

```bash
yarn add @mistralai/mistralai zod

# Note that Yarn does not install peer dependencies automatically. You will need
# to install zod as shown above.
```
## Requirements

For supported JavaScript runtimes, please consult RUNTIMES.md.
## API Key Setup

Before you begin, you will need a Mistral AI API key.

- Get your own Mistral API Key: https://docs.mistral.ai/#api-access
- Set your Mistral API Key as an environment variable. You only need to do this once.

```bash
# set Mistral API Key (using zsh for example)
$ echo 'export MISTRAL_API_KEY=[your_key_here]' >> ~/.zshenv

# reload the environment (or just quit and open a new terminal)
$ source ~/.zshenv
```
## SDK Example Usage

### Create Chat Completions

This example shows how to create chat completions.

```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.chat.complete({
    model: "mistral-small-latest",
    messages: [
      {
        content:
          "Who is the best French painter? Answer in one short sentence.",
        role: "user",
      },
    ],
  });

  console.log(result);
}

run();
```
### Upload a file

This example shows how to upload a file.

```typescript
import { Mistral } from "@mistralai/mistralai";
import { openAsBlob } from "node:fs";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.files.upload({
    file: await openAsBlob("example.file"),
  });

  console.log(result);
}

run();
```
### Create Agents Completions

This example shows how to create agents completions.

```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.agents.complete({
    messages: [
      {
        content:
          "Who is the best French painter? Answer in one short sentence.",
        role: "user",
      },
    ],
    agentId: "<id>",
  });

  console.log(result);
}

run();
```
### Create Embedding Request

This example shows how to create an embedding request.

```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.embeddings.create({
    model: "mistral-embed",
    inputs: [
      "Embed this sentence.",
      "As well as this one.",
    ],
  });

  console.log(result);
}

run();
```
## Providers' SDKs

We have dedicated SDKs for the following providers:
## Available Resources and Operations
## Server-sent event streaming

Certain SDK methods stream their results over [server-sent events][mdn-sse]. These methods return the stream as an async iterable that can be consumed using a [`for await...of`][mdn-for-await-of] loop. The loop will terminate when the server no longer has any events to send and closes the underlying connection.
```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.beta.conversations.startStream({
    inputs: [
      {
        object: "entry",
        type: "agent.handoff",
        previousAgentId: "<id>",
        previousAgentName: "<value>",
        nextAgentId: "<id>",
        nextAgentName: "<value>",
      },
    ],
  });

  for await (const event of result) {
    // Handle the event
    console.log(event);
  }
}

run();
```
[mdn-sse]: https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
[mdn-for-await-of]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of
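Chat completions can be streamed the same way. The following is an illustrative sketch rather than canonical usage: it assumes `chat.stream` returns an async iterable of completion events whose `data.choices[0].delta.content` field carries the incremental text.

```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  // chatStream returns an async iterable of server-sent events.
  const stream = await mistral.chat.stream({
    model: "mistral-small-latest",
    messages: [{ role: "user", content: "Name three French painters." }],
  });

  for await (const event of stream) {
    // Assumption: each event wraps a completion chunk whose delta holds the
    // newly generated text fragment (it may also be structured content).
    const delta = event.data.choices[0]?.delta?.content;
    if (typeof delta === "string") {
      process.stdout.write(delta);
    }
  }
}

run();
```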
## File uploads
Certain SDK methods accept files as part of a multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.
> [!TIP]
>
> Depending on your JavaScript runtime, there are convenient utilities that return a handle to a file without reading the entire contents into memory:
>
> - Node.js v20+: Since v20, Node.js comes with a native `openAsBlob` function in `node:fs`.
> - Bun: The native `Bun.file` function produces a file handle that can be used for streaming file uploads.
> - Browsers: All supported browsers return a `File` instance when reading the value from an `<input type="file">` element.
> - Node.js v18: A file stream can be created using the `fileFrom` helper from `fetch-blob/from.js`.
```typescript
import { Mistral } from "@mistralai/mistralai";
import { openAsBlob } from "node:fs";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.beta.libraries.documents.upload({
    libraryId: "a02150d9-5ee0-4877-b62c-28b1fcdf3b76",
    requestBody: {
      file: await openAsBlob("example.file"),
    },
  });

  console.log(result);
}

run();
```
## Retries
Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.
To change the default retry strategy for a single API call, simply provide a `retryConfig` object to the call:
```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.models.list({
    retries: {
      strategy: "backoff",
      backoff: {
        initialInterval: 1,
        maxInterval: 50,
        exponent: 1.1,
        maxElapsedTime: 100,
      },
      retryConnectionErrors: false,
    },
  });

  console.log(result);
}

run();
```
If you'd like to override the default retry strategy for all operations that support retries, you can provide a `retryConfig` at SDK initialization:
```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  retryConfig: {
    strategy: "backoff",
    backoff: {
      initialInterval: 1,
      maxInterval: 50,
      exponent: 1.1,
      maxElapsedTime: 100,
    },
    retryConnectionErrors: false,
  },
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.models.list();

  console.log(result);
}

run();
```
## Error Handling
`MistralError` is the base class for all HTTP error responses. It has the following properties:

| Property            | Type       | Description                                                            |
| ------------------- | ---------- | ---------------------------------------------------------------------- |
| `error.message`     | `string`   | Error message                                                          |
| `error.statusCode`  | `number`   | HTTP response status code eg `404`                                     |
| `error.headers`     | `Headers`  | HTTP response headers                                                  |
| `error.body`        | `string`   | HTTP body. Can be empty string if no body is returned.                 |
| `error.rawResponse` | `Response` | Raw HTTP response                                                      |
| `error.data$`       |            | Optional. Some errors may contain structured data. See Error Classes.  |
### Example
```typescript
import { Mistral } from "@mistralai/mistralai";
import * as errors from "@mistralai/mistralai/models/errors";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  try {
    const result = await mistral.models.list();

    console.log(result);
  } catch (error) {
    // The base class for HTTP error responses
    if (error instanceof errors.MistralError) {
      console.log(error.message);
      console.log(error.statusCode);
      console.log(error.body);
      console.log(error.headers);

      // Depending on the method different errors may be thrown
      if (error instanceof errors.HTTPValidationError) {
        console.log(error.data$.detail); // ValidationError[]
      }
    }
  }
}

run();
```
### Error Classes
**Primary error:**

- `MistralError`: The base class for HTTP error responses.

**Network errors:**

- `ConnectionError`: HTTP client was unable to make a request to a server.
- `RequestTimeoutError`: HTTP request timed out due to an AbortSignal signal.
- `RequestAbortedError`: HTTP request was aborted by the client.
- `InvalidRequestError`: Any input used to create a request is invalid.
- `UnexpectedClientError`: Unrecognised or unexpected error.

**Inherit from `MistralError`:**

- `HTTPValidationError`: Validation Error. Status code `422`. Applicable to 47 of 68 methods.*
- `ResponseValidationError`: Type mismatch between the data returned from the server and the structure expected by the SDK. See `error.rawValue` for the raw value and `error.pretty()` for a nicely formatted multi-line string.

\* Check the method documentation to see if the error is applicable.
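To make the `ResponseValidationError` entry above concrete, here is a hedged sketch of inspecting one; it assumes the class is exported from `@mistralai/mistralai/models/errors` alongside the other error classes.

```typescript
import { Mistral } from "@mistralai/mistralai";
import * as errors from "@mistralai/mistralai/models/errors";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  try {
    const result = await mistral.models.list();
    console.log(result);
  } catch (error) {
    if (error instanceof errors.ResponseValidationError) {
      // The server replied, but the payload did not match the schema the SDK expects.
      console.log(error.rawValue); // raw, unparsed value returned by the server
      console.log(error.pretty()); // formatted multi-line description of the mismatch
    } else {
      throw error;
    }
  }
}

run();
```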
## Server Selection

### Select Server by Name

You can override the default server globally by passing a server name to the `server: keyof typeof ServerList` optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the names associated with the available servers:

| Name | Server                   | Description          |
| ---- | ------------------------ | -------------------- |
| `eu` | `https://api.mistral.ai` | EU Production server |
#### Example

```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  server: "eu",
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.models.list();

  console.log(result);
}

run();
```
### Override Server URL Per-Client

The default server can also be overridden globally by passing a URL to the `serverURL: string` optional parameter when initializing the SDK client instance. For example:

```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  serverURL: "https://api.mistral.ai",
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.models.list();

  console.log(result);
}

run();
```
## Custom HTTP Client

The TypeScript SDK makes API calls using an `HTTPClient` that wraps the native
Fetch API. This client is a thin wrapper around `fetch` and provides the ability
to attach hooks around the request lifecycle that can be used to modify the
request or handle errors and responses.

The `HTTPClient` constructor takes an optional `fetcher` argument that can be
used to integrate a third-party HTTP client or when writing tests to mock out
the HTTP client and feed in fixtures.

The following example shows how to use the `"beforeRequest"` hook to add a
custom header and a timeout to requests and how to use the `"requestError"` hook
to log errors:
```typescript
import { Mistral } from "@mistralai/mistralai";
import { HTTPClient } from "@mistralai/mistralai/lib/http";

const httpClient = new HTTPClient({
  // fetcher takes a function that has the same signature as native `fetch`.
  fetcher: (request) => {
    return fetch(request);
  }
});

httpClient.addHook("beforeRequest", (request) => {
  const nextRequest = new Request(request, {
    signal: request.signal || AbortSignal.timeout(5000)
  });

  nextRequest.headers.set("x-custom-header", "custom value");

  return nextRequest;
});

httpClient.addHook("requestError", (error, request) => {
  console.group("Request Error");
  console.log("Reason:", `${error}`);
  console.log("Endpoint:", `${request.method} ${request.url}`);
  console.groupEnd();
});

const sdk = new Mistral({ httpClient });
```
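The same `fetcher` option can also be pointed at a stub instead of the network, which is one way to feed in fixtures during tests. The sketch below is illustrative only; the canned response body is made up and may not satisfy every operation's response schema.

```typescript
import { Mistral } from "@mistralai/mistralai";
import { HTTPClient } from "@mistralai/mistralai/lib/http";

// A fetcher that returns a canned response instead of hitting the network.
const mockClient = new HTTPClient({
  fetcher: async () => {
    return new Response(JSON.stringify({ object: "list", data: [] }), {
      status: 200,
      headers: { "content-type": "application/json" },
    });
  },
});

const mistral = new Mistral({
  httpClient: mockClient,
  apiKey: "test-key",
});

async function run() {
  // Any call now resolves against the canned response above.
  const result = await mistral.models.list();
  console.log(result);
}

run();
```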
## Authentication

### Per-Client Security Schemes

This SDK supports the following security scheme globally:

| Name     | Type | Scheme      | Environment Variable |
| -------- | ---- | ----------- | -------------------- |
| `apiKey` | http | HTTP Bearer | `MISTRAL_API_KEY`    |

To authenticate with the API the `apiKey` parameter must be set when initializing the SDK client instance. For example:
```typescript
import { Mistral } from "@mistralai/mistralai";

const mistral = new Mistral({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const result = await mistral.models.list();

  console.log(result);
}

run();
```
## Standalone functions

All the methods listed above are available as standalone functions. These functions are ideal for use in applications running in the browser, serverless runtimes or other environments where application bundle size is a primary concern. When using a bundler to build your application, all unused functionality will be either excluded from the final bundle or tree-shaken away.

To read more about standalone functions, check FUNCTIONS.md.
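As a rough sketch of how a standalone function is called (the import paths, the `MistralCore` client and the result shape shown here are assumptions; FUNCTIONS.md is authoritative):

```typescript
// Assumed import paths, following the standalone-function pattern.
import { MistralCore } from "@mistralai/mistralai/core.js";
import { modelsList } from "@mistralai/mistralai/funcs/modelsList.js";

// MistralCore carries only the client configuration, so bundlers can
// tree-shake away any operations you never import.
const mistral = new MistralCore({
  apiKey: process.env["MISTRAL_API_KEY"] ?? "",
});

async function run() {
  const res = await modelsList(mistral);

  // Standalone functions return a result object instead of throwing.
  if (!res.ok) {
    throw res.error;
  }

  console.log(res.value);
}

run();
```

The following standalone functions are available: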
- `agentsComplete` - Agents Completion
- `agentsStream` - Stream Agents completion
- `audioTranscriptionsComplete` - Create Transcription
- `audioTranscriptionsStream` - Create streaming transcription (SSE)
- `batchJobsCancel` - Cancel Batch Job
- `batchJobsCreate` - Create Batch Job
- `batchJobsGet` - Get Batch Job
- `batchJobsList` - Get Batch Jobs
- `betaAgentsCreate` - Create an agent that can be used within a conversation.
- `betaAgentsGet` - Retrieve an agent entity.
- `betaAgentsList` - List agent entities.
- `betaAgentsUpdate` - Update an agent entity.
- `betaAgentsUpdateVersion` - Update an agent version.
- `betaConversationsAppend` - Append new entries to an existing conversation.
- `betaConversationsAppendStream` - Append new entries to an existing conversation.
- `betaConversationsGet` - Retrieve a conversation's information.
- `betaConversationsGetHistory` - Retrieve all entries in a conversation.
- `betaConversationsGetMessages` - Retrieve all messages in a conversation.
- `betaConversationsList` - List all created conversations.
- `betaConversationsRestart` - Restart a conversation starting from a given entry.
- `betaConversationsRestartStream` - Restart a conversation starting from a given entry.
- `betaConversationsStart` - Create a conversation and append entries to it.
- `betaConversationsStartStream` - Create a conversation and append entries to it.
- `betaLibrariesAccessesDelete` - Delete an access level.
- `betaLibrariesAccessesList` - List all of the accesses to this library.
- `betaLibrariesAccessesUpdateOrCreate` - Create or update an access level.
- `betaLibrariesCreate` - Create a new Library.
- `betaLibrariesDelete` - Delete a library and all of its documents.
- `betaLibrariesDocumentsDelete` - Delete a document.
- `betaLibrariesDocumentsExtractedTextSignedUrl` - Retrieve the signed URL of text extracted from a given document.
- `betaLibrariesDocumentsGet` - Retrieve the metadata of a specific document.
- `betaLibrariesDocumentsGetSignedUrl` - Retrieve the signed URL of a specific document.
- `betaLibrariesDocumentsList` - List documents in a given library.
- `betaLibrariesDocumentsReprocess` - Reprocess a document.
- `betaLibrariesDocumentsStatus` - Retrieve the processing status of a specific document.
- `betaLibrariesDocumentsTextContent` - Retrieve the text content of a specific document.
- `betaLibrariesDocumentsUpdate` - Update the metadata of a specific document.
- `betaLibrariesDocumentsUpload` - Upload a new document.
- `betaLibrariesGet` - Detailed information about a specific Library.
- `betaLibrariesList` - List all libraries you have access to.
- `betaLibrariesUpdate` - Update a library.
- `chatComplete` - Chat Completion
- `chatStream` - Stream chat completion
- `classifiersClassify` - Classifications
- `classifiersClassifyChat` - Chat Classifications
- `classifiersModerate` - Moderations
- `classifiersModerateChat` - Chat Moderations
- `embeddingsCreate` - Embeddings
- `filesDelete` - Delete File
- `filesDownload` - Download File
- `filesGetSignedUrl` - Get Signed Url
- `filesList` - List Files
- `filesRetrieve` - Retrieve File
- `filesUpload` - Upload File
- `fimComplete` - Fim Completion
- `fimStream` - Stream fim completion
- `fineTuningJobsCancel` - Cancel Fine Tuning Job
- `fineTuningJobsCreate` - Create Fine Tuning Job
- `fineTuningJobsGet` - Get Fine Tuning Job
- `fineTuningJobsList` - Get Fine Tuning Jobs
- `fineTuningJobsStart` - Start Fine Tuning Job
- `modelsArchive` - Archive Fine Tuned Model
- `modelsDelete` - Delete Model
- `modelsList` - List Models
- `modelsRetrieve` - Retrieve Model
- `modelsUnarchive` - Unarchive Fine Tuned Model
- `modelsUpdate` - Update Fine Tuned Model
- `ocrProcess` - OCR
## Debugging

You can set up your SDK to emit debug logs for SDK requests and responses.

You can pass a logger that matches `console`'s interface as an SDK option.

> [!WARNING]
> Beware that debug logging will reveal secrets, like API tokens in headers, in log messages printed to a console or files. It's recommended to use this feature only during local development and not in production.

```typescript
import { Mistral } from "@mistralai/mistralai";

const sdk = new Mistral({ debugLogger: console });
```

You can also enable a default debug logger by setting an environment variable `MISTRAL_DEBUG` to `true`.
## Development

### Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. We look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release.