
release: 1.13.4 #1200

Merged · 15 commits · Mar 13, 2024
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "1.13.3"
".": "1.13.4"
}
29 changes: 29 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,34 @@
# Changelog

## 1.13.4 (2024-03-13)

Full Changelog: [v1.13.3...v1.13.4](https://github.com/openai/openai-python/compare/v1.13.3...v1.13.4)

### Bug Fixes

* **streaming:** improve error messages ([#1218](https://github.com/openai/openai-python/issues/1218)) ([4f5ff29](https://github.com/openai/openai-python/commit/4f5ff298601b5a8bfbf0a9d0c0d1329d1502a205))


### Chores

* **api:** update docs ([#1212](https://github.com/openai/openai-python/issues/1212)) ([71236e0](https://github.com/openai/openai-python/commit/71236e0de4012a249af4c1ffd95973a8ba4fa61f))
* **client:** improve error message for invalid http_client argument ([#1216](https://github.com/openai/openai-python/issues/1216)) ([d0c928a](https://github.com/openai/openai-python/commit/d0c928abbd99020fe828350f3adfd10c638a2eed))
* **docs:** mention install from git repo ([#1203](https://github.com/openai/openai-python/issues/1203)) ([3ab6f44](https://github.com/openai/openai-python/commit/3ab6f447ffd8d2394e58416e401e545a99ec85af))
* export NOT_GIVEN sentinel value ([#1223](https://github.com/openai/openai-python/issues/1223)) ([8a4f76f](https://github.com/openai/openai-python/commit/8a4f76f992c66f20cd6aa070c8dc4839e4cf9f3c))
* **internal:** add core support for deserializing into number response ([#1219](https://github.com/openai/openai-python/issues/1219)) ([004bc92](https://github.com/openai/openai-python/commit/004bc924ea579852b9266ca11aea93463cf75104))
* **internal:** bump pyright ([#1221](https://github.com/openai/openai-python/issues/1221)) ([3c2e815](https://github.com/openai/openai-python/commit/3c2e815311ace4ff81ccd446b23ff50a4e099485))
* **internal:** improve deserialisation of discriminated unions ([#1227](https://github.com/openai/openai-python/issues/1227)) ([4767259](https://github.com/openai/openai-python/commit/4767259d25ac135550b37b15e4c0497e5ff0330d))
* **internal:** minor core client restructuring ([#1199](https://github.com/openai/openai-python/issues/1199)) ([4314cdc](https://github.com/openai/openai-python/commit/4314cdcd522537e6cbbd87206d5bb236f672ce05))
* **internal:** split up transforms into sync / async ([#1210](https://github.com/openai/openai-python/issues/1210)) ([7853a83](https://github.com/openai/openai-python/commit/7853a8358864957cc183581bdf7c03810a7b2756))
* **internal:** support more input types ([#1211](https://github.com/openai/openai-python/issues/1211)) ([d0e4baa](https://github.com/openai/openai-python/commit/d0e4baa40d32c2da0ce5ceef8e0c7193b98f2b5a))
* **internal:** support parsing Annotated types ([#1222](https://github.com/openai/openai-python/issues/1222)) ([8598f81](https://github.com/openai/openai-python/commit/8598f81841eeab0ab00eb21fdec7e8756ffde909))
* **types:** include discriminators in unions ([#1228](https://github.com/openai/openai-python/issues/1228)) ([3ba0dcc](https://github.com/openai/openai-python/commit/3ba0dcc19a2af0ef869c77da2805278f71ee96c2))


### Documentation

* **contributing:** improve wording ([#1201](https://github.com/openai/openai-python/issues/1201)) ([95a1e0e](https://github.com/openai/openai-python/commit/95a1e0ea8e5446c413606847ebf9e35afbc62bf9))

## 1.13.3 (2024-02-28)

Full Changelog: [v1.13.2...v1.13.3](https://github.com/openai/openai-python/compare/v1.13.2...v1.13.3)
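For anyone consuming this release, a quick sanity check after upgrading from PyPI is to compare the version reported by the package against the manifest bump above (a minimal illustrative snippet, not part of the PR itself):

```python
# after `pip install --upgrade openai`, the installed version should match this release
import openai

print(openai.__version__)  # expected: 1.13.4
```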
6 changes: 3 additions & 3 deletions CONTRIBUTING.md
@@ -59,7 +59,7 @@ If you’d like to use the repository from source, you can either install from git
To install via git:

```bash
pip install git+ssh://[email protected]:openai/openai-python.git
pip install git+ssh://[email protected]/openai/openai-python.git
```

Alternatively, you can build from source and install the wheel file:
@@ -82,7 +82,7 @@ pip install ./path-to-wheel-file.whl

## Running tests

Most tests will require you to [setup a mock server](https://github.com/stoplightio/prism) against the OpenAPI spec to run the tests.
Most tests require you to [set up a mock server](https://github.com/stoplightio/prism) against the OpenAPI spec to run the tests.

```bash
# you will need npm installed
@@ -117,7 +117,7 @@ the changes aren't made through the automated pipeline, you may want to make rel

### Publish with a GitHub workflow

You can release to package managers by using [the `Publish PyPI` GitHub action](https://www.github.com/openai/openai-python/actions/workflows/publish-pypi.yml). This will require a setup organization or repository secret to be set up.
You can release to package managers by using [the `Publish PyPI` GitHub action](https://www.github.com/openai/openai-python/actions/workflows/publish-pypi.yml). This requires a setup organization or repository secret to be set up.

### Publish manually

1 change: 1 addition & 0 deletions README.md
@@ -18,6 +18,7 @@ The REST API documentation can be found [on platform.openai.com](https://platfor
> The SDK was rewritten in v1, which was released November 6th 2023. See the [v1 migration guide](https://github.com/openai/openai-python/discussions/742), which includes scripts to automatically update your code.

```sh
# install from PyPI
pip install openai
```

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "openai"
version = "1.13.3"
version = "1.13.4"
description = "The official Python library for the openai API"
readme = "README.md"
license = "Apache-2.0"
4 changes: 2 additions & 2 deletions requirements-dev.lock
@@ -17,7 +17,7 @@ argcomplete==3.1.2
# via nox
attrs==23.1.0
# via pytest
azure-core==1.30.0
azure-core==1.30.1
# via azure-identity
azure-identity==1.15.0
certifi==2023.7.22
@@ -96,7 +96,7 @@ pydantic-core==2.10.1
# via pydantic
pyjwt==2.8.0
# via msal
pyright==1.1.351
pyright==1.1.353
pytest==7.1.1
# via pytest-asyncio
pytest-asyncio==0.21.1
4 changes: 3 additions & 1 deletion src/openai/__init__.py
@@ -6,7 +6,7 @@
from typing_extensions import override

from . import types
from ._types import NoneType, Transport, ProxiesTypes
from ._types import NOT_GIVEN, NoneType, NotGiven, Transport, ProxiesTypes
from ._utils import file_from_path
from ._client import Client, OpenAI, Stream, Timeout, Transport, AsyncClient, AsyncOpenAI, AsyncStream, RequestOptions
from ._models import BaseModel
@@ -37,6 +37,8 @@
"NoneType",
"Transport",
"ProxiesTypes",
"NotGiven",
"NOT_GIVEN",
"OpenAIError",
"APIError",
"APIStatusError",
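With `NotGiven` and `NOT_GIVEN` re-exported from the package root (see #1223 above), callers no longer need to import them from the private `openai._types` module. A minimal sketch of how the sentinel is typically used; the endpoint and the `max_tokens` parameter are only illustrative, and running it requires a valid API key:

```python
from openai import OpenAI, NOT_GIVEN

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# NOT_GIVEN marks an optional parameter as "omitted from the request entirely",
# as opposed to None, which would send an explicit null value.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
    max_tokens=NOT_GIVEN,  # equivalent to not passing max_tokens at all
)
print(completion.choices[0].message.content)
```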
15 changes: 14 additions & 1 deletion src/openai/_base_client.py
@@ -79,7 +79,7 @@
RAW_RESPONSE_HEADER,
OVERRIDE_CAST_TO_HEADER,
)
from ._streaming import Stream, AsyncStream
from ._streaming import Stream, SSEDecoder, AsyncStream, SSEBytesDecoder
from ._exceptions import (
APIStatusError,
APITimeoutError,
@@ -431,6 +431,9 @@ def _prepare_url(self, url: str) -> URL:

return merge_url

def _make_sse_decoder(self) -> SSEDecoder | SSEBytesDecoder:
return SSEDecoder()

def _build_request(
self,
options: FinalRequestOptions,
@@ -777,6 +780,11 @@ def __init__(
else:
timeout = DEFAULT_TIMEOUT

if http_client is not None and not isinstance(http_client, httpx.Client): # pyright: ignore[reportUnnecessaryIsInstance]
raise TypeError(
f"Invalid `http_client` argument; Expected an instance of `httpx.Client` but got {type(http_client)}"
)

super().__init__(
version=version,
limits=limits,
@@ -1319,6 +1327,11 @@ def __init__(
else:
timeout = DEFAULT_TIMEOUT

if http_client is not None and not isinstance(http_client, httpx.AsyncClient): # pyright: ignore[reportUnnecessaryIsInstance]
raise TypeError(
f"Invalid `http_client` argument; Expected an instance of `httpx.AsyncClient` but got {type(http_client)}"
)

super().__init__(
version=version,
base_url=base_url,
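The two `isinstance` guards above give the improvement described in #1216: an invalid `http_client` now fails fast at construction time with a descriptive `TypeError` instead of erroring later on the first request. A rough illustration of the resulting behaviour (the error text is taken from the diff; the API key and timeout values are placeholders):

```python
import httpx
from openai import OpenAI, AsyncOpenAI

# anything that is not an httpx.Client is rejected immediately
try:
    OpenAI(api_key="sk-placeholder", http_client="not a client")  # type: ignore[arg-type]
except TypeError as err:
    print(err)
    # Invalid `http_client` argument; Expected an instance of `httpx.Client` but got <class 'str'>

# passing a real httpx client (e.g. to customise timeouts or proxies) still works
client = OpenAI(api_key="sk-placeholder", http_client=httpx.Client(timeout=60.0))

# the async client performs the mirror-image check against httpx.AsyncClient
async_client = AsyncOpenAI(api_key="sk-placeholder", http_client=httpx.AsyncClient(timeout=60.0))
```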
5 changes: 5 additions & 0 deletions src/openai/_files.py
@@ -13,12 +13,17 @@
FileContent,
RequestFiles,
HttpxFileTypes,
Base64FileInput,
HttpxFileContent,
HttpxRequestFiles,
)
from ._utils import is_tuple_t, is_mapping_t, is_sequence_t


def is_base64_file_input(obj: object) -> TypeGuard[Base64FileInput]:
return isinstance(obj, io.IOBase) or isinstance(obj, os.PathLike)


def is_file_content(obj: object) -> TypeGuard[FileContent]:
return (
isinstance(obj, bytes) or isinstance(obj, tuple) or isinstance(obj, io.IOBase) or isinstance(obj, os.PathLike)
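The new `is_base64_file_input` guard narrows `Base64FileInput` values to open file handles and path-like objects; raw `bytes` or `str` content does not qualify. A standalone sketch of the predicate's behaviour (re-implemented here rather than imported, since `openai._files` is a private module):

```python
import io
import os
import pathlib

def is_base64_file_input(obj: object) -> bool:
    # mirrors the predicate added to src/openai/_files.py
    return isinstance(obj, io.IOBase) or isinstance(obj, os.PathLike)

print(is_base64_file_input(pathlib.Path("image.png")))       # True  (os.PathLike)
print(is_base64_file_input(io.BytesIO(b"raw image bytes")))  # True  (io.IOBase)
print(is_base64_file_input(b"raw image bytes"))              # False (plain bytes)
```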
23 changes: 20 additions & 3 deletions src/openai/_legacy_response.py
@@ -13,7 +13,7 @@
import pydantic

from ._types import NoneType
from ._utils import is_given
from ._utils import is_given, extract_type_arg, is_annotated_type
from ._models import BaseModel, is_basemodel
from ._constants import RAW_RESPONSE_HEADER
from ._streaming import Stream, AsyncStream, is_stream_class_type, extract_stream_chunk_type
@@ -107,6 +107,8 @@ class MyModel(BaseModel):
- `list`
- `Union`
- `str`
- `int`
- `float`
- `httpx.Response`
"""
cache_key = to if to is not None else self._cast_to
@@ -172,6 +174,10 @@ def elapsed(self) -> datetime.timedelta:
return self.http_response.elapsed

def _parse(self, *, to: type[_T] | None = None) -> R | _T:
# unwrap `Annotated[T, ...]` -> `T`
if to and is_annotated_type(to):
to = extract_type_arg(to, 0)

if self._stream:
if to:
if not is_stream_class_type(to):
@@ -213,13 +219,24 @@ def _parse(self, *, to: type[_T] | None = None) -> R | _T:
)

cast_to = to if to is not None else self._cast_to

# unwrap `Annotated[T, ...]` -> `T`
if is_annotated_type(cast_to):
cast_to = extract_type_arg(cast_to, 0)

if cast_to is NoneType:
return cast(R, None)

response = self.http_response
if cast_to == str:
return cast(R, response.text)

if cast_to == int:
return cast(R, int(response.text))

if cast_to == float:
return cast(R, float(response.text))

origin = get_origin(cast_to) or cast_to

if inspect.isclass(origin) and issubclass(origin, HttpxBinaryResponseContent):
@@ -307,7 +324,7 @@ def to_raw_response_wrapper(func: Callable[P, R]) -> Callable[P, LegacyAPIRespon

@functools.wraps(func)
def wrapped(*args: P.args, **kwargs: P.kwargs) -> LegacyAPIResponse[R]:
extra_headers = {**(cast(Any, kwargs.get("extra_headers")) or {})}
extra_headers: dict[str, str] = {**(cast(Any, kwargs.get("extra_headers")) or {})}
extra_headers[RAW_RESPONSE_HEADER] = "true"

kwargs["extra_headers"] = extra_headers
@@ -324,7 +341,7 @@ def async_to_raw_response_wrapper(func: Callable[P, Awaitable[R]]) -> Callable[P

@functools.wraps(func)
async def wrapped(*args: P.args, **kwargs: P.kwargs) -> LegacyAPIResponse[R]:
extra_headers = {**(cast(Any, kwargs.get("extra_headers")) or {})}
extra_headers: dict[str, str] = {**(cast(Any, kwargs.get("extra_headers")) or {})}
extra_headers[RAW_RESPONSE_HEADER] = "true"

kwargs["extra_headers"] = extra_headers
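The `_parse` changes are internal, but their effect is easy to sketch: an `Annotated[T, ...]` cast target is unwrapped to `T` before any other handling, and plain `int`/`float` targets (newly supported per #1219) are parsed straight from the response text. A standalone approximation of that logic, not the SDK's actual code path:

```python
from typing import Annotated, Any, get_args  # typing.Annotated requires Python 3.9+

def parse_scalar(response_text: str, cast_to: Any) -> Any:
    # unwrap Annotated[T, ...] -> T; Annotated aliases expose __metadata__,
    # and get_args() returns (T, *metadata)
    if hasattr(cast_to, "__metadata__"):
        cast_to = get_args(cast_to)[0]

    # scalar handling mirroring the diff: str, and now int / float
    if cast_to is str:
        return response_text
    if cast_to is int:
        return int(response_text)
    if cast_to is float:
        return float(response_text)
    raise TypeError(f"unsupported cast target in this sketch: {cast_to!r}")

print(parse_scalar("42", Annotated[int, "token count"]))  # 42
print(parse_scalar("0.75", float))                        # 0.75
```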