Commit 40c4974

docs: expand MIGRATION.md with v0→v1 and v1→v2 guides
- Add v1→v2 section explaining PEP 420 namespace change
- Explain motivation (azure/gcp companion packages)
- Include automated migration sed commands
- Preserve v0→v1 method mapping tables and examples
1 parent a594141 commit 40c4974

1 file changed: +161 additions, -14 deletions

MIGRATION.md

Lines changed: 161 additions & 14 deletions
# Migration Guide

This guide covers migrating between major versions of the Mistral Python SDK.

---

## Migrating from v1.x to v2.x

Version 2.0 introduces a **namespace change** to support PEP 420 implicit namespace packages. This enables future companion packages like `mistralai-azure` and `mistralai-gcp` to coexist under the `mistralai` namespace.
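The mechanics behind this are generic Python packaging: once the top-level package ships without an `__init__.py`, separately installed distributions can each contribute a subpackage under the same name. A runnable sketch (the `nsdemo` name and directory layout are invented purely for illustration, not part of the SDK):

```python
# Sketch of PEP 420: two independent directories contribute subpackages to one
# top-level namespace. "nsdemo" and all names below are illustrative only.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for dist, sub in [("dist_core", "client"), ("dist_azure", "azure")]:
    pkg_dir = os.path.join(root, dist, "nsdemo", sub)
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, "__init__.py"), "w") as fh:
        fh.write(f"NAME = {sub!r}\n")
    sys.path.insert(0, os.path.join(root, dist))

# No nsdemo/__init__.py exists anywhere, so "nsdemo" becomes an implicit
# namespace package assembled from both sys.path entries.
from nsdemo import azure, client

print(client.NAME, azure.NAME)  # -> client azure
```

This is exactly the pattern that lets `mistralai-azure` and `mistralai-gcp` install alongside the core SDK under one `mistralai` namespace.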
### Import Changes

All imports move from `mistralai` to `mistralai.client`:

```python
# v1
from mistralai import Mistral
from mistralai.models import UserMessage, AssistantMessage
from mistralai.types import BaseModel

# v2
from mistralai.client import Mistral
from mistralai.client.models import UserMessage, AssistantMessage
from mistralai.client.types import BaseModel
```

### Quick Reference

| v1 | v2 |
|---|---|
| `from mistralai import Mistral` | `from mistralai.client import Mistral` |
| `from mistralai.models import ...` | `from mistralai.client.models import ...` |
| `from mistralai.types import ...` | `from mistralai.client.types import ...` |
| `from mistralai.utils import ...` | `from mistralai.client.utils import ...` |
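To inventory which files still use v1-style imports before migrating, a small standard-library `ast` scan can help (this helper is illustrative, not part of the SDK):

```python
import ast

def find_v1_imports(source: str) -> list[str]:
    """List `from mistralai...` imports that still lack the v2 `mistralai.client` prefix."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ImportFrom) and node.module:
            is_old_root = node.module == "mistralai"
            is_old_sub = (node.module.startswith("mistralai.")
                          and not node.module.startswith("mistralai.client"))
            if is_old_root or is_old_sub:
                hits.append(node.module)
    return hits

print(find_v1_imports(
    "from mistralai import Mistral\n"
    "from mistralai.models import UserMessage\n"
    "from mistralai.client import Mistral\n"
))  # -> ['mistralai', 'mistralai.models']
```

Using `ast` rather than plain text matching avoids false positives inside strings and comments.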

### What Stays the Same

- All method names and signatures remain identical
- The `Mistral` client API is unchanged
- All models (`UserMessage`, `AssistantMessage`, etc.) work the same way

### Automated Migration

You can use a simple find-and-replace to update your codebase:

```bash
# BSD sed (macOS); on GNU/Linux, use `sed -i` without the empty '' argument
find . -name "*.py" -exec sed -i '' 's/from mistralai import/from mistralai.client import/g' {} +
find . -name "*.py" -exec sed -i '' 's/from mistralai\.models/from mistralai.client.models/g' {} +
find . -name "*.py" -exec sed -i '' 's/from mistralai\.types/from mistralai.client.types/g' {} +
```
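Because `sed -i` syntax differs between BSD and GNU variants, a short Python script is a portable alternative that applies the same rewrites (this script is illustrative, not shipped with the SDK):

```python
# Cross-platform alternative to the sed commands: rewrite v1 imports to v2.
import re
from pathlib import Path

RULES = [
    # `from mistralai import X` -> `from mistralai.client import X`
    (re.compile(r"\bfrom mistralai import\b"), "from mistralai.client import"),
    # `from mistralai.models/types/utils ...` -> `from mistralai.client.<same> ...`
    (re.compile(r"\bfrom mistralai\.(models|types|utils)\b"),
     r"from mistralai.client.\1"),
]

def migrate_source(text: str) -> str:
    """Apply all v1 -> v2 import rewrites to one file's source text."""
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text

if __name__ == "__main__":
    # Dry run over the current directory; uncomment write_text() to apply.
    for path in Path(".").rglob("*.py"):
        original = path.read_text(encoding="utf-8")
        updated = migrate_source(original)
        if updated != original:
            # path.write_text(updated, encoding="utf-8")
            print(f"would update {path}")
```

The rewrites are idempotent: already-migrated `mistralai.client` imports are left untouched, so the script is safe to run more than once.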

---

## Migrating from v0.x to v1.x

Version 1.0 introduced significant changes to improve usability and consistency.

### Major Changes

1. **Unified Client Class**: `MistralClient` and `MistralAsyncClient` consolidated into a single `Mistral` class
2. **Method Structure**: Methods reorganized into resource-based groups (e.g., `client.chat.complete()`)
3. **Message Classes**: `ChatMessage` replaced with typed classes (`UserMessage`, `AssistantMessage`, etc.)
4. **Streaming Response**: Stream chunks now accessed via `chunk.data.choices[0].delta.content`
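The streaming change is only an extra level of nesting on each chunk; schematically, with plain namespaces standing in for the SDK's response types:

```python
from types import SimpleNamespace as NS

# v0.x chunk shape (schematic): delta text lives at chunk.choices[0].delta.content
v0_chunk = NS(choices=[NS(delta=NS(content="Brie"))])
print(v0_chunk.choices[0].delta.content)  # -> Brie

# v1.x chunk shape (schematic): the same payload is wrapped one level down in .data
v1_chunk = NS(data=NS(choices=[NS(delta=NS(content="Brie"))]))
print(v1_chunk.data.choices[0].delta.content)  # -> Brie
```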

### Method Mapping

#### Sync Methods

| v0.x | v1.x |
|---|---|
| `MistralClient` | `Mistral` |
| `client.chat` | `client.chat.complete` |
| `client.chat_stream` | `client.chat.stream` |
| `client.completions` | `client.fim.complete` |
| `client.completions_stream` | `client.fim.stream` |
| `client.embeddings` | `client.embeddings.create` |
| `client.list_models` | `client.models.list` |
| `client.delete_model` | `client.models.delete` |
| `client.files.create` | `client.files.upload` |
| `client.jobs.create` | `client.fine_tuning.jobs.create` |
| `client.jobs.list` | `client.fine_tuning.jobs.list` |
| `client.jobs.retrieve` | `client.fine_tuning.jobs.get` |
| `client.jobs.cancel` | `client.fine_tuning.jobs.cancel` |

#### Async Methods

| v0.x | v1.x |
|---|---|
| `MistralAsyncClient` | `Mistral` |
| `async_client.chat` | `client.chat.complete_async` |
| `async_client.chat_stream` | `client.chat.stream_async` |
| `async_client.completions` | `client.fim.complete_async` |
| `async_client.completions_stream` | `client.fim.stream_async` |
| `async_client.embeddings` | `client.embeddings.create_async` |
| `async_client.list_models` | `client.models.list_async` |
| `async_client.files.create` | `client.files.upload_async` |
| `async_client.jobs.create` | `client.fine_tuning.jobs.create_async` |
| `async_client.jobs.list` | `client.fine_tuning.jobs.list_async` |
| `async_client.jobs.retrieve` | `client.fine_tuning.jobs.get_async` |
| `async_client.jobs.cancel` | `client.fine_tuning.jobs.cancel_async` |

### Example: Non-Streaming Chat

**v0.x:**
```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

client = MistralClient(api_key=api_key)

messages = [ChatMessage(role="user", content="What is the best French cheese?")]
response = client.chat(model="mistral-large-latest", messages=messages)

print(response.choices[0].message.content)
```

**v1.x:**
```python
from mistralai import Mistral, UserMessage

client = Mistral(api_key=api_key)

messages = [UserMessage(content="What is the best French cheese?")]
response = client.chat.complete(model="mistral-large-latest", messages=messages)

print(response.choices[0].message.content)
```

### Example: Streaming Chat

**v0.x:**
```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

client = MistralClient(api_key=api_key)
messages = [ChatMessage(role="user", content="What is the best French cheese?")]

for chunk in client.chat_stream(model="mistral-large-latest", messages=messages):
    print(chunk.choices[0].delta.content)
```

**v1.x:**
```python
from mistralai import Mistral, UserMessage

client = Mistral(api_key=api_key)
messages = [UserMessage(content="What is the best French cheese?")]

for chunk in client.chat.stream(model="mistral-large-latest", messages=messages):
    print(chunk.data.choices[0].delta.content)  # Note: chunk.data
```

### Example: Async Streaming

**v0.x:**
```python
from mistralai.async_client import MistralAsyncClient
from mistralai.models.chat_completion import ChatMessage

client = MistralAsyncClient(api_key=api_key)
messages = [ChatMessage(role="user", content="What is the best French cheese?")]

async for chunk in client.chat_stream(model="mistral-large-latest", messages=messages):
    print(chunk.choices[0].delta.content)
```

**v1.x:**
```python
from mistralai import Mistral, UserMessage

client = Mistral(api_key=api_key)
messages = [UserMessage(content="What is the best French cheese?")]

async for chunk in await client.chat.stream_async(model="mistral-large-latest", messages=messages):
    print(chunk.data.choices[0].delta.content)
```
