Merge pull request #140 from mistralai/speakeasy-sdk-regen-1724922548
chore: 🐝 Update SDK - Generate MISTRALAI MISTRALAI-SDK 1.0.3
hrjn authored Sep 4, 2024
2 parents 00fea16 + 951f0bb commit c3b2f84
Showing 15 changed files with 62 additions and 86 deletions.
8 changes: 4 additions & 4 deletions .speakeasy/gen.lock
@@ -3,10 +3,10 @@ id: 2d045ec7-2ebb-4f4d-ad25-40953b132161
management:
docChecksum: ad1a7d6946828a089ca3831e257d307d
docVersion: 0.0.2
-speakeasyVersion: 1.376.0
-generationVersion: 2.402.5
-releaseVersion: 1.0.2
-configChecksum: ed07f7fc253047a5a4dd2c0f813b8ea4
+speakeasyVersion: 1.382.0
+generationVersion: 2.404.11
+releaseVersion: 1.0.3
+configChecksum: 818970b881ec69b05f6660ca354f26f5
repoURL: https://github.com/mistralai/client-python.git
installationURL: https://github.com/mistralai/client-python.git
published: true
2 changes: 1 addition & 1 deletion .speakeasy/gen.yaml
@@ -12,7 +12,7 @@ generation:
auth:
oAuth2ClientCredentialsEnabled: true
python:
-version: 1.0.2
+version: 1.0.3
additionalDependencies:
dev:
pytest: ^8.2.2
2 changes: 1 addition & 1 deletion .speakeasy/workflow.lock
@@ -1,4 +1,4 @@
-speakeasyVersion: 1.376.0
+speakeasyVersion: 1.382.0
sources:
mistral-azure-source:
sourceNamespace: mistral-openapi-azure
12 changes: 11 additions & 1 deletion RELEASES.md
@@ -28,4 +28,14 @@ Based on:
### Generated
- [python v1.0.2] .
### Releases
-- [PyPI v1.0.2] https://pypi.org/project/mistralai/1.0.2 - .
+- [PyPI v1.0.2] https://pypi.org/project/mistralai/1.0.2 - .
+
+## 2024-08-29 09:09:05
+### Changes
+Based on:
+- OpenAPI Doc
+- Speakeasy CLI 1.382.0 (2.404.11) https://github.com/speakeasy-api/speakeasy
+### Generated
+- [python v1.0.3] .
+### Releases
+- [PyPI v1.0.3] https://pypi.org/project/mistralai/1.0.3 - .
9 changes: 3 additions & 6 deletions docs/sdks/agents/README.md
@@ -38,8 +38,6 @@ if res is not None:

```



### Parameters

| Parameter | Type | Required | Description | Example |
@@ -56,17 +54,18 @@ if res is not None:
| `tool_choice` | [Optional[models.AgentsCompletionRequestToolChoice]](../../models/agentscompletionrequesttoolchoice.md) | :heavy_minus_sign: | N/A | |
| `retries` | [Optional[utils.RetryConfig]](../../models/utils/retryconfig.md) | :heavy_minus_sign: | Configuration to override the default retry behavior of the client. | |


### Response

**[models.ChatCompletionResponse](../../models/chatcompletionresponse.md)**

### Errors

| Error Object | Status Code | Content Type |
| -------------------------- | -------------------------- | -------------------------- |
| models.HTTPValidationError | 422 | application/json |
| models.SDKError | 4xx-5xx | */* |


## stream

Mistral AI provides the ability to stream responses back to a client in order to allow partial results for certain requests. Tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message. Otherwise, the server will hold the request open until the timeout or until completion, with the response containing the full result as JSON.
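The event framing described above can be sketched without the SDK itself: a minimal consumer of data-only server-sent-event lines that collects each `data:` payload and stops at the `data: [DONE]` sentinel. This is an illustrative stand-in (the function name and the raw-line input are hypothetical), not the generated client's actual parser.

```python
def collect_sse_data(lines):
    """Collect payloads from data-only SSE lines until the [DONE] sentinel."""
    chunks = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alives and non-data fields
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # stream terminator; nothing after it is consumed
        chunks.append(payload)
    return chunks
```

For example, `collect_sse_data(["data: a", "", "data: b", "data: [DONE]", "data: c"])` yields only the payloads before the terminator.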
@@ -96,8 +95,6 @@ if res is not None:

```



### Parameters

| Parameter | Type | Required | Description | Example |
@@ -114,10 +111,10 @@ if res is not None:
| `tool_choice` | [Optional[models.AgentsCompletionStreamRequestToolChoice]](../../models/agentscompletionstreamrequesttoolchoice.md) | :heavy_minus_sign: | N/A | |
| `retries` | [Optional[utils.RetryConfig]](../../models/utils/retryconfig.md) | :heavy_minus_sign: | Configuration to override the default retry behavior of the client. | |


### Response

**[Union[Generator[models.CompletionEvent, None, None], AsyncGenerator[models.CompletionEvent, None]]](../../models/.md)**

### Errors

| Error Object | Status Code | Content Type |
9 changes: 3 additions & 6 deletions docs/sdks/chat/README.md
@@ -38,8 +38,6 @@ if res is not None:

```



### Parameters

| Parameter | Type | Required | Description | Example |
@@ -59,17 +57,18 @@ if res is not None:
| `safe_prompt` | *Optional[bool]* | :heavy_minus_sign: | Whether to inject a safety prompt before all conversations. | |
| `retries` | [Optional[utils.RetryConfig]](../../models/utils/retryconfig.md) | :heavy_minus_sign: | Configuration to override the default retry behavior of the client. | |
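The `retries` parameter above overrides the client's default retry behavior. As a rough sketch of what a capped exponential-backoff schedule with an elapsed-time budget looks like (plain Python for illustration, not the SDK's actual `RetryConfig` implementation; all parameter names are assumptions):

```python
def backoff_delays(initial_ms, max_ms, exponent, max_elapsed_ms):
    """Yield capped exponential backoff delays until the elapsed budget is spent."""
    delay, elapsed = initial_ms, 0
    while True:
        step = min(delay, max_ms)          # cap each individual wait
        if elapsed + step > max_elapsed_ms:
            return                          # total retry budget exhausted
        yield step
        elapsed += step
        delay = int(delay * exponent)       # grow the next wait geometrically
```

With `backoff_delays(100, 400, 2, 1000)` the schedule is 100 ms, 200 ms, 400 ms, then the budget runs out.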


### Response

**[models.ChatCompletionResponse](../../models/chatcompletionresponse.md)**

### Errors

| Error Object | Status Code | Content Type |
| -------------------------- | -------------------------- | -------------------------- |
| models.HTTPValidationError | 422 | application/json |
| models.SDKError | 4xx-5xx | */* |


## stream

Mistral AI provides the ability to stream responses back to a client in order to allow partial results for certain requests. Tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message. Otherwise, the server will hold the request open until the timeout or until completion, with the response containing the full result as JSON.
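Each streamed event above carries a partial result; reassembling the final message from token deltas can be sketched as follows. The `Delta` class is a hypothetical stand-in for the payload shape of the SDK's `CompletionEvent` objects, not the generated model itself.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Delta:
    """Stand-in for a streamed partial result: content may be absent."""
    content: Optional[str]

def assemble_stream(deltas: Iterable[Delta]) -> str:
    """Concatenate partial token deltas into the final completion text."""
    return "".join(d.content for d in deltas if d.content)
```

For instance, `assemble_stream([Delta("Hel"), Delta(None), Delta("lo")])` reconstructs `"Hello"`.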
@@ -99,8 +98,6 @@ if res is not None:

```



### Parameters

| Parameter | Type | Required | Description | Example |
@@ -120,10 +117,10 @@ if res is not None:
| `safe_prompt` | *Optional[bool]* | :heavy_minus_sign: | Whether to inject a safety prompt before all conversations. | |
| `retries` | [Optional[utils.RetryConfig]](../../models/utils/retryconfig.md) | :heavy_minus_sign: | Configuration to override the default retry behavior of the client. | |


### Response

**[Union[Generator[models.CompletionEvent, None, None], AsyncGenerator[models.CompletionEvent, None]]](../../models/.md)**

### Errors

| Error Object | Status Code | Content Type |