diff --git a/README.md b/README.md index 0b907131..656bc5b8 100644 --- a/README.md +++ b/README.md @@ -1,14 +1,14 @@ -# Finch Python API Library +# Finch Python API library [![PyPI version](https://img.shields.io/pypi/v/finch-api.svg)](https://pypi.org/project/finch-api/) The Finch Python library provides convenient access to the Finch REST API from any Python 3.7+ -application. It includes type definitions for all request params and response fields, +application. The library includes type definitions for all request params and response fields, and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx). ## Documentation -The API documentation can be found [here](https://developer.tryfinch.com/). +The API documentation can be found at [https://developer.tryfinch.com/](https://developer.tryfinch.com/). ## Installation @@ -34,7 +34,7 @@ directory = page.individuals[0] print(directory.first_name) ``` -## Async Usage +## Async usage Simply import `AsyncFinch` instead of `Finch` and use `await` with each API call: @@ -58,11 +58,11 @@ asyncio.run(main()) Functionality between the synchronous and asynchronous clients is otherwise identical. -## Using Types +## Using types -Nested request parameters are [TypedDicts](https://docs.python.org/3/library/typing.html#typing.TypedDict). Responses are [Pydantic models](https://docs.pydantic.dev), which provide helper methods for things like serializing back into json ([v1](https://docs.pydantic.dev/1.10/usage/models/), [v2](https://docs.pydantic.dev/latest/usage/serialization/)). To get a dictionary, you can call `dict(model)`. +Nested request parameters are [TypedDicts](https://docs.python.org/3/library/typing.html#typing.TypedDict). Responses are [Pydantic models](https://docs.pydantic.dev), which provide helper methods for things like serializing back into JSON ([v1](https://docs.pydantic.dev/1.10/usage/models/), [v2](https://docs.pydantic.dev/latest/usage/serialization/)). To get a dictionary, call `dict(model)`. -This helps provide autocomplete and documentation within your editor. If you would like to see type errors in VS Code to help catch bugs earlier, set `python.analysis.typeCheckingMode` to `"basic"`. +Typed requests and responses provide autocomplete and documentation within your editor. If you would like to see type errors in VS Code to help catch bugs earlier, set `python.analysis.typeCheckingMode` to `basic`. ## Pagination @@ -169,10 +169,10 @@ async def handler(request: Request): ## Handling errors -When the library is unable to connect to the API (e.g., due to network connection problems or a timeout), a subclass of `finch.APIConnectionError` is raised. +When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `finch.APIConnectionError` is raised. -When the API returns a non-success status code (i.e., 4xx or 5xx -response), a subclass of `finch.APIStatusError` will be raised, containing `status_code` and `response` properties. +When the API returns a non-success status code (that is, 4xx or 5xx +response), a subclass of `finch.APIStatusError` is raised, containing `status_code` and `response` properties. All errors inherit from `finch.APIError`. @@ -210,11 +210,11 @@ Error codes are as followed: ### Retries -Certain errors will be automatically retried 2 times by default, with a short exponential backoff. +Certain errors are automatically retried 2 times by default, with a short exponential backoff. 
Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict, -429 Rate Limit, and >=500 Internal errors will all be retried by default. +429 Rate Limit, and >=500 Internal errors are all retried by default. -You can use the `max_retries` option to configure or disable this: +You can use the `max_retries` option to configure or disable retry settings: ```python from finch import Finch @@ -231,8 +231,8 @@ client.with_options(max_retries=5).hris.directory.list_individuals() ### Timeouts -Requests time out after 1 minute by default. You can configure this with a `timeout` option, -which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration): +By default requests time out after 1 minute. You can configure this with a `timeout` option, +which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object: ```python from finch import Finch @@ -254,7 +254,7 @@ client.with_options(timeout=5 * 1000).hris.directory.list_individuals() On timeout, an `APITimeoutError` is thrown. -Note that requests which time out will be [retried twice by default](#retries). +Note that requests that time out are [retried twice by default](#retries). ## Default Headers @@ -276,7 +276,7 @@ client = Finch( ### How to tell whether `None` means `null` or missing -In an API response, a field may be explicitly null, or missing entirely; in either case, its value is `None` in this library. You can differentiate the two cases with `.model_fields_set`: +In an API response, a field may be explicitly `null`, or missing entirely; in either case, its value is `None` in this library. You can differentiate the two cases with `.model_fields_set`: ```py if response.my_field is None: @@ -286,27 +286,30 @@ if response.my_field is None: print('Got json like {"my_field": null}.') ``` -### Configuring custom URLs, proxies, and transports +### Configuring the HTTP client -You can configure the following keyword arguments when instantiating the client: +You can directly override the [httpx client](https://www.python-httpx.org/api/#client) to customize it for your use case, including: + +- Support for proxies +- Custom transports +- Additional [advanced](https://www.python-httpx.org/advanced/#client-instances) functionality ```python import httpx from finch import Finch client = Finch( - # Use a custom base URL base_url="http://my.test.server.example.com:8083", - proxies="http://my.test.proxy.example.com", - transport=httpx.HTTPTransport(local_address="0.0.0.0"), + http_client=httpx.Client( + proxies="http://my.test.proxy.example.com", + transport=httpx.HTTPTransport(local_address="0.0.0.0"), + ), ) ``` -See the httpx documentation for information about the [`proxies`](https://www.python-httpx.org/advanced/#http-proxying) and [`transport`](https://www.python-httpx.org/advanced/#custom-transports) keyword arguments. - ### Managing HTTP resources -By default we will close the underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__) is called but you can also manually close the client using the `.close()` method if desired, or with a context manager that closes when exiting. +By default the library closes underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__). 
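+If you need deterministic cleanup, here is a minimal sketch of both options described below: explicit `.close()` and the context-manager form (the `access_token` value is a placeholder):
+
+```python
+from finch import Finch
+
+client = Finch(access_token="my-access-token")
+try:
+    page = client.hris.directory.list_individuals()
+finally:
+    # release the underlying HTTP connections explicitly
+    client.close()
+
+# or let the context manager close the client on exit
+with Finch(access_token="my-access-token") as client:
+    page = client.hris.directory.list_individuals()
+```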
You can manually close the client using the `.close()` method if desired, or with a context manager that closes when exiting. ## Versioning diff --git a/src/finch/_base_client.py b/src/finch/_base_client.py index 4f46f6f0..1ef2efa0 100644 --- a/src/finch/_base_client.py +++ b/src/finch/_base_client.py @@ -6,10 +6,12 @@ import email import inspect import platform +import warnings import email.utils from types import TracebackType from random import random from typing import ( + TYPE_CHECKING, Any, Dict, Type, @@ -52,11 +54,12 @@ AnyMapping, ProxiesTypes, RequestFiles, + AsyncTransport, RequestOptions, UnknownResponse, ModelBuilderProtocol, ) -from ._utils import is_dict, is_mapping +from ._utils import is_dict, is_given, is_mapping from ._compat import model_copy from ._models import ( BaseModel, @@ -84,6 +87,15 @@ _StreamT = TypeVar("_StreamT", bound=Stream[Any]) _AsyncStreamT = TypeVar("_AsyncStreamT", bound=AsyncStream[Any]) +if TYPE_CHECKING: + from httpx._config import DEFAULT_TIMEOUT_CONFIG as HTTPX_DEFAULT_TIMEOUT +else: + try: + from httpx._config import DEFAULT_TIMEOUT_CONFIG as HTTPX_DEFAULT_TIMEOUT + except ImportError: + # taken from https://github.com/encode/httpx/blob/3ba5fe0d7ac70222590e759c31442b1cab263791/httpx/_config.py#L366 + HTTPX_DEFAULT_TIMEOUT = Timeout(5.0) + # default timeout is 1 minute DEFAULT_TIMEOUT = Timeout(timeout=60.0, connect=5.0) @@ -303,11 +315,12 @@ async def get_next_page(self: AsyncPageT) -> AsyncPageT: class BaseClient: _client: httpx.Client | httpx.AsyncClient _version: str + _base_url: URL max_retries: int timeout: Union[float, Timeout, None] _limits: httpx.Limits _proxies: ProxiesTypes | None - _transport: Transport | None + _transport: Transport | AsyncTransport | None _strict_response_validation: bool _idempotency_header: str | None @@ -315,16 +328,18 @@ def __init__( self, *, version: str, + base_url: str, _strict_response_validation: bool, max_retries: int = DEFAULT_MAX_RETRIES, timeout: float | Timeout | None = DEFAULT_TIMEOUT, limits: httpx.Limits, - transport: Transport | None, + transport: Transport | AsyncTransport | None, proxies: ProxiesTypes | None, custom_headers: Mapping[str, str] | None = None, custom_query: Mapping[str, object] | None = None, ) -> None: self._version = version + self._base_url = self._enforce_trailing_slash(URL(base_url)) self.max_retries = max_retries self.timeout = timeout self._limits = limits @@ -335,6 +350,11 @@ def __init__( self._strict_response_validation = _strict_response_validation self._idempotency_header = None + def _enforce_trailing_slash(self, url: URL) -> URL: + if url.raw_path.endswith(b"/"): + return url + return url.copy_with(raw_path=url.raw_path + b"/") + def _make_status_error_from_response( self, response: httpx.Response, @@ -391,6 +411,19 @@ def _prepare_request(self, request: httpx.Request) -> None: """ return None + def _prepare_url(self, url: str) -> URL: + """ + Merge a URL argument together with any 'base_url' on the client, + to create the URL used for the outgoing request. + """ + # Copied from httpx's `_merge_url` method. 
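+        # A relative path such as "/foo" is appended onto `base_url`, whose raw
+        # path `_enforce_trailing_slash` normalized to end in "/" at construction
+        # time, so "custom/path" and "custom/path/" yield the same request URL.
+        # An absolute URL (for example "https://myapi.com/foo") is returned as-is.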
+ merge_url = URL(url) + if merge_url.is_relative_url: + merge_raw_path = self.base_url.raw_path + merge_url.raw_path.lstrip(b"/") + return self.base_url.copy_with(raw_path=merge_raw_path) + + return merge_url + def _build_request( self, options: FinalRequestOptions, @@ -432,7 +465,7 @@ def _build_request( headers=headers, timeout=self.timeout if isinstance(options.timeout, NotGiven) else options.timeout, method=options.method, - url=options.url, + url=self._prepare_url(options.url), # the `Query` type that we use is incompatible with qs' # `Params` type as it needs to be typed as `Mapping[str, object]` # so that passing a `TypedDict` doesn't cause an error. @@ -570,6 +603,7 @@ def auth_headers(self) -> dict[str, str]: @property def default_headers(self) -> dict[str, str | Omit]: return { + "Accept": "application/json", "Content-Type": "application/json", "User-Agent": self.user_agent, **self.platform_headers(), @@ -590,12 +624,11 @@ def user_agent(self) -> str: @property def base_url(self) -> URL: - return self._client.base_url + return self._base_url @base_url.setter def base_url(self, url: URL | str) -> None: - # mypy doesn't use the type from the setter - self._client.base_url = url # type: ignore[assignment] + self._client.base_url = url if isinstance(url, URL) else URL(url) @lru_cache(maxsize=None) def platform_headers(self) -> Dict[str, str]: @@ -687,6 +720,7 @@ def _idempotency_key(self) -> str: class SyncAPIClient(BaseClient): _client: httpx.Client + _has_custom_http_client: bool _default_stream_cls: type[Stream[Any]] | None = None def __init__( @@ -695,34 +729,79 @@ def __init__( version: str, base_url: str, max_retries: int = DEFAULT_MAX_RETRIES, - timeout: float | Timeout | None = DEFAULT_TIMEOUT, + timeout: float | Timeout | None | NotGiven = NOT_GIVEN, transport: Transport | None = None, proxies: ProxiesTypes | None = None, - limits: Limits | None = DEFAULT_LIMITS, + limits: Limits | None = None, + http_client: httpx.Client | None = None, custom_headers: Mapping[str, str] | None = None, custom_query: Mapping[str, object] | None = None, _strict_response_validation: bool, ) -> None: - limits = limits or DEFAULT_LIMITS + if limits is not None: + warnings.warn( + "The `connection_pool_limits` argument is deprecated. The `http_client` argument should be passed instead", + category=DeprecationWarning, + stacklevel=3, + ) + if http_client is not None: + raise ValueError("The `http_client` argument is mutually exclusive with `connection_pool_limits`") + else: + limits = DEFAULT_LIMITS + + if transport is not None: + warnings.warn( + "The `transport` argument is deprecated. The `http_client` argument should be passed instead", + category=DeprecationWarning, + stacklevel=3, + ) + if http_client is not None: + raise ValueError("The `http_client` argument is mutually exclusive with `transport`") + + if proxies is not None: + warnings.warn( + "The `proxies` argument is deprecated. The `http_client` argument should be passed instead", + category=DeprecationWarning, + stacklevel=3, + ) + if http_client is not None: + raise ValueError("The `http_client` argument is mutually exclusive with `proxies`") + + if not is_given(timeout): + # if the user passed in a custom http client with a non-default + # timeout set then we use that timeout. 
+ # + # note: there is an edge case here where the user passes in a client + # where they've explicitly set the timeout to match the default timeout + # as this check is structural, meaning that we'll think they didn't + # pass in a timeout and will ignore it + if http_client and http_client.timeout != HTTPX_DEFAULT_TIMEOUT: + timeout = http_client.timeout + else: + timeout = DEFAULT_TIMEOUT + super().__init__( version=version, limits=limits, - timeout=timeout, + # cast to a valid type because mypy doesn't understand our type narrowing + timeout=cast(Timeout, timeout), proxies=proxies, + base_url=base_url, transport=transport, max_retries=max_retries, custom_query=custom_query, custom_headers=custom_headers, _strict_response_validation=_strict_response_validation, ) - self._client = httpx.Client( + self._client = http_client or httpx.Client( base_url=base_url, - timeout=timeout, - proxies=proxies, # type: ignore - transport=transport, # type: ignore + # cast to a valid type because mypy doesn't understand our type narrowing + timeout=cast(Timeout, timeout), + proxies=proxies, + transport=transport, limits=limits, - headers={"Accept": "application/json"}, ) + self._has_custom_http_client = bool(http_client) def is_closed(self) -> bool: return self._client.is_closed @@ -1040,6 +1119,7 @@ def get_api_list( class AsyncAPIClient(BaseClient): _client: httpx.AsyncClient + _has_custom_http_client: bool _default_stream_cls: type[AsyncStream[Any]] | None = None def __init__( @@ -1049,18 +1129,62 @@ def __init__( base_url: str, _strict_response_validation: bool, max_retries: int = DEFAULT_MAX_RETRIES, - timeout: float | Timeout | None = DEFAULT_TIMEOUT, - transport: Transport | None = None, + timeout: float | Timeout | None | NotGiven = NOT_GIVEN, + transport: AsyncTransport | None = None, proxies: ProxiesTypes | None = None, - limits: Limits | None = DEFAULT_LIMITS, + limits: Limits | None = None, + http_client: httpx.AsyncClient | None = None, custom_headers: Mapping[str, str] | None = None, custom_query: Mapping[str, object] | None = None, ) -> None: - limits = limits or DEFAULT_LIMITS + if limits is not None: + warnings.warn( + "The `connection_pool_limits` argument is deprecated. The `http_client` argument should be passed instead", + category=DeprecationWarning, + stacklevel=3, + ) + if http_client is not None: + raise ValueError("The `http_client` argument is mutually exclusive with `connection_pool_limits`") + else: + limits = DEFAULT_LIMITS + + if transport is not None: + warnings.warn( + "The `transport` argument is deprecated. The `http_client` argument should be passed instead", + category=DeprecationWarning, + stacklevel=3, + ) + if http_client is not None: + raise ValueError("The `http_client` argument is mutually exclusive with `transport`") + + if proxies is not None: + warnings.warn( + "The `proxies` argument is deprecated. The `http_client` argument should be passed instead", + category=DeprecationWarning, + stacklevel=3, + ) + if http_client is not None: + raise ValueError("The `http_client` argument is mutually exclusive with `proxies`") + + if not is_given(timeout): + # if the user passed in a custom http client with a non-default + # timeout set then we use that timeout. 
+ # + # note: there is an edge case here where the user passes in a client + # where they've explicitly set the timeout to match the default timeout + # as this check is structural, meaning that we'll think they didn't + # pass in a timeout and will ignore it + if http_client and http_client.timeout != HTTPX_DEFAULT_TIMEOUT: + timeout = http_client.timeout + else: + timeout = DEFAULT_TIMEOUT + super().__init__( version=version, + base_url=base_url, limits=limits, - timeout=timeout, + # cast to a valid type because mypy doesn't understand our type narrowing + timeout=cast(Timeout, timeout), proxies=proxies, transport=transport, max_retries=max_retries, @@ -1068,14 +1192,15 @@ def __init__( custom_headers=custom_headers, _strict_response_validation=_strict_response_validation, ) - self._client = httpx.AsyncClient( + self._client = http_client or httpx.AsyncClient( base_url=base_url, - timeout=timeout, - proxies=proxies, # type: ignore - transport=transport, # type: ignore + # cast to a valid type because mypy doesn't understand our type narrowing + timeout=cast(Timeout, timeout), + proxies=proxies, + transport=transport, limits=limits, - headers={"Accept": "application/json"}, ) + self._has_custom_http_client = bool(http_client) def is_closed(self) -> bool: return self._client.is_closed diff --git a/src/finch/_client.py b/src/finch/_client.py index 013c8161..864333ee 100644 --- a/src/finch/_client.py +++ b/src/finch/_client.py @@ -18,15 +18,16 @@ NotGiven, Transport, ProxiesTypes, + AsyncTransport, RequestOptions, ) +from ._utils import is_given from ._version import __version__ from ._streaming import Stream as Stream from ._streaming import AsyncStream as AsyncStream from ._exceptions import APIStatusError from ._base_client import ( DEFAULT_LIMITS, - DEFAULT_TIMEOUT, DEFAULT_MAX_RETRIES, SyncAPIClient, AsyncAPIClient, @@ -66,16 +67,18 @@ def __init__( webhook_secret: str | None = None, base_url: Optional[str] = None, access_token: Optional[str] = None, - timeout: Union[float, Timeout, None] = DEFAULT_TIMEOUT, + timeout: Union[float, Timeout, None, NotGiven] = NOT_GIVEN, max_retries: int = DEFAULT_MAX_RETRIES, default_headers: Mapping[str, str] | None = None, default_query: Mapping[str, object] | None = None, + # Configure a custom httpx client. See the [httpx documentation](https://www.python-httpx.org/api/#client) for more details. + http_client: httpx.Client | None = None, # See httpx documentation for [custom transports](https://www.python-httpx.org/advanced/#custom-transports) - transport: Optional[Transport] = None, + transport: Transport | None = None, # See httpx documentation for [proxies](https://www.python-httpx.org/advanced/#http-proxying) - proxies: Optional[ProxiesTypes] = None, + proxies: ProxiesTypes | None = None, # See httpx documentation for [limits](https://www.python-httpx.org/advanced/#pool-limit-configuration) - connection_pool_limits: httpx.Limits | None = DEFAULT_LIMITS, + connection_pool_limits: httpx.Limits | None = None, # Enable or disable schema validation for data returned by the API. # When enabled an error APIResponseValidationError is raised # if the API responds with invalid data for the expected schema. 
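Taken together with the deprecation warnings above, the intended migration looks roughly like the sketch below (a non-authoritative example; the credential, proxy, and limit values are placeholders): options that used to be keyword arguments on the client move onto an explicit `httpx` client passed via `http_client`. The async client accepts an `httpx.AsyncClient` in the same way.

```python
import httpx
from finch import Finch

# Deprecated form: still works, but now emits a DeprecationWarning.
client = Finch(
    access_token="my-access-token",
    proxies="http://my.test.proxy.example.com",
    connection_pool_limits=httpx.Limits(max_connections=100),
)

# Preferred form: configure the same options on an httpx.Client instead.
client = Finch(
    access_token="my-access-token",
    http_client=httpx.Client(
        proxies="http://my.test.proxy.example.com",
        limits=httpx.Limits(max_connections=100),
    ),
)
```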
@@ -112,6 +115,7 @@ def __init__( base_url=base_url, max_retries=max_retries, timeout=timeout, + http_client=http_client, transport=transport, proxies=proxies, limits=connection_pool_limits, @@ -164,7 +168,8 @@ def copy( access_token: str | None = None, base_url: str | None = None, timeout: float | Timeout | None | NotGiven = NOT_GIVEN, - connection_pool_limits: httpx.Limits | NotGiven = NOT_GIVEN, + http_client: httpx.Client | None = None, + connection_pool_limits: httpx.Limits | None = None, max_retries: int | NotGiven = NOT_GIVEN, default_headers: Mapping[str, str] | None = None, set_default_headers: Mapping[str, str] | None = None, @@ -195,7 +200,24 @@ def copy( elif set_default_query is not None: params = set_default_query - # TODO: share the same httpx client between instances + if connection_pool_limits is not None: + if http_client is not None: + raise ValueError("The 'http_client' argument is mutually exclusive with 'connection_pool_limits'") + + if self._has_custom_http_client: + raise ValueError( + "A custom HTTP client has been set and is mutually exclusive with the 'connection_pool_limits' argument" + ) + + http_client = None + else: + if self._limits is not DEFAULT_LIMITS: + connection_pool_limits = self._limits + else: + connection_pool_limits = None + + http_client = http_client or self._client + return self.__class__( client_id=client_id or self.client_id, client_secret=client_secret or self.client_secret, @@ -203,10 +225,9 @@ def copy( base_url=base_url or str(self.base_url), access_token=access_token or self.access_token, timeout=self.timeout if isinstance(timeout, NotGiven) else timeout, - connection_pool_limits=self._limits - if isinstance(connection_pool_limits, NotGiven) - else connection_pool_limits, - max_retries=self.max_retries if isinstance(max_retries, NotGiven) else max_retries, + http_client=http_client, + connection_pool_limits=connection_pool_limits, + max_retries=max_retries if is_given(max_retries) else self.max_retries, default_headers=headers, default_query=params, ) @@ -216,6 +237,13 @@ def copy( with_options = copy def __del__(self) -> None: + if not hasattr(self, "_has_custom_http_client") or not hasattr(self, "close"): + # this can happen if the '__init__' method raised an error + return + + if self._has_custom_http_client: + return + self.close() def get_access_token( @@ -331,16 +359,18 @@ def __init__( webhook_secret: str | None = None, base_url: Optional[str] = None, access_token: Optional[str] = None, - timeout: Union[float, Timeout, None] = DEFAULT_TIMEOUT, + timeout: Union[float, Timeout, None, NotGiven] = NOT_GIVEN, max_retries: int = DEFAULT_MAX_RETRIES, default_headers: Mapping[str, str] | None = None, default_query: Mapping[str, object] | None = None, + # Configure a custom httpx client. See the [httpx documentation](https://www.python-httpx.org/api/#asyncclient) for more details. 
+ http_client: httpx.AsyncClient | None = None, # See httpx documentation for [custom transports](https://www.python-httpx.org/advanced/#custom-transports) - transport: Optional[Transport] = None, + transport: AsyncTransport | None = None, # See httpx documentation for [proxies](https://www.python-httpx.org/advanced/#http-proxying) - proxies: Optional[ProxiesTypes] = None, + proxies: ProxiesTypes | None = None, # See httpx documentation for [limits](https://www.python-httpx.org/advanced/#pool-limit-configuration) - connection_pool_limits: httpx.Limits | None = DEFAULT_LIMITS, + connection_pool_limits: httpx.Limits | None = None, # Enable or disable schema validation for data returned by the API. # When enabled an error APIResponseValidationError is raised # if the API responds with invalid data for the expected schema. @@ -377,6 +407,7 @@ def __init__( base_url=base_url, max_retries=max_retries, timeout=timeout, + http_client=http_client, transport=transport, proxies=proxies, limits=connection_pool_limits, @@ -429,7 +460,8 @@ def copy( access_token: str | None = None, base_url: str | None = None, timeout: float | Timeout | None | NotGiven = NOT_GIVEN, - connection_pool_limits: httpx.Limits | NotGiven = NOT_GIVEN, + http_client: httpx.AsyncClient | None = None, + connection_pool_limits: httpx.Limits | None = None, max_retries: int | NotGiven = NOT_GIVEN, default_headers: Mapping[str, str] | None = None, set_default_headers: Mapping[str, str] | None = None, @@ -460,7 +492,24 @@ def copy( elif set_default_query is not None: params = set_default_query - # TODO: share the same httpx client between instances + if connection_pool_limits is not None: + if http_client is not None: + raise ValueError("The 'http_client' argument is mutually exclusive with 'connection_pool_limits'") + + if self._has_custom_http_client: + raise ValueError( + "A custom HTTP client has been set and is mutually exclusive with the 'connection_pool_limits' argument" + ) + + http_client = None + else: + if self._limits is not DEFAULT_LIMITS: + connection_pool_limits = self._limits + else: + connection_pool_limits = None + + http_client = http_client or self._client + return self.__class__( client_id=client_id or self.client_id, client_secret=client_secret or self.client_secret, @@ -468,10 +517,9 @@ def copy( base_url=base_url or str(self.base_url), access_token=access_token or self.access_token, timeout=self.timeout if isinstance(timeout, NotGiven) else timeout, - connection_pool_limits=self._limits - if isinstance(connection_pool_limits, NotGiven) - else connection_pool_limits, - max_retries=self.max_retries if isinstance(max_retries, NotGiven) else max_retries, + http_client=http_client, + connection_pool_limits=connection_pool_limits, + max_retries=max_retries if is_given(max_retries) else self.max_retries, default_headers=headers, default_query=params, ) @@ -481,6 +529,13 @@ def copy( with_options = copy def __del__(self) -> None: + if not hasattr(self, "_has_custom_http_client") or not hasattr(self, "close"): + # this can happen if the '__init__' method raised an error + return + + if self._has_custom_http_client: + return + try: asyncio.get_running_loop().create_task(self.close()) except Exception: diff --git a/src/finch/_types.py b/src/finch/_types.py index a48c2298..d32a6871 100644 --- a/src/finch/_types.py +++ b/src/finch/_types.py @@ -18,12 +18,13 @@ import httpx import pydantic -from httpx import Proxy, Timeout, Response, BaseTransport +from httpx import URL, Proxy, Timeout, Response, BaseTransport, 
AsyncBaseTransport if TYPE_CHECKING: from ._models import BaseModel Transport = BaseTransport +AsyncTransport = AsyncBaseTransport Query = Mapping[str, object] Body = object AnyMapping = Mapping[str, object] @@ -31,7 +32,7 @@ _T = TypeVar("_T") # Approximates httpx internal ProxiesTypes and RequestFiles types -ProxiesDict = Dict[str, Union[None, str, Proxy]] +ProxiesDict = Dict["str | URL", Union[None, str, URL, Proxy]] ProxiesTypes = Union[str, Proxy, ProxiesDict] FileContent = Union[IO[bytes], bytes] FileTypes = Union[ diff --git a/tests/test_client.py b/tests/test_client.py index 520dcf53..d2f1cee4 100644 --- a/tests/test_client.py +++ b/tests/test_client.py @@ -18,7 +18,12 @@ from finch._types import Omit from finch._models import BaseModel, FinalRequestOptions from finch._exceptions import APIResponseValidationError -from finch._base_client import BaseClient, make_request_options +from finch._base_client import ( + DEFAULT_TIMEOUT, + HTTPX_DEFAULT_TIMEOUT, + BaseClient, + make_request_options, +) base_url = os.environ.get("TEST_API_BASE_URL", "http://127.0.0.1:4010") access_token = os.environ.get("API_KEY", "something1234") @@ -155,6 +160,57 @@ def test_copy_signature(self) -> None: copy_param = copy_signature.parameters.get(name) assert copy_param is not None, f"copy() signature is missing the {name} param" + def test_request_timeout(self) -> None: + request = self.client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == DEFAULT_TIMEOUT + + request = self.client._build_request( + FinalRequestOptions(method="get", url="/foo", timeout=httpx.Timeout(100.0)) + ) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == httpx.Timeout(100.0) + + def test_client_timeout_option(self) -> None: + client = Finch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, timeout=httpx.Timeout(0) + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == httpx.Timeout(0) + + def test_http_client_timeout_option(self) -> None: + # custom timeout given to the httpx client should be used + with httpx.Client(timeout=None) as http_client: + client = Finch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, http_client=http_client + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == httpx.Timeout(None) + + # no timeout given to the httpx client should not use the httpx default + with httpx.Client() as http_client: + client = Finch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, http_client=http_client + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == DEFAULT_TIMEOUT + + # explicitly passing the default timeout currently results in it being ignored + with httpx.Client(timeout=HTTPX_DEFAULT_TIMEOUT) as http_client: + client = Finch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, http_client=http_client + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert 
timeout == DEFAULT_TIMEOUT # our default + def test_default_headers_option(self) -> None: client = Finch( base_url=base_url, @@ -350,10 +406,24 @@ class Model2(BaseModel): assert isinstance(response, Model1) assert response.foo == 1 - def test_base_url_trailing_slash(self) -> None: - client = Finch( - base_url="http://localhost:5000/custom/path/", access_token=access_token, _strict_response_validation=True - ) + @pytest.mark.parametrize( + "client", + [ + Finch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + ), + Finch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + http_client=httpx.Client(), + ), + ], + ids=["standard", "custom http client"], + ) + def test_base_url_trailing_slash(self, client: Finch) -> None: request = client._build_request( FinalRequestOptions( method="post", @@ -363,10 +433,24 @@ def test_base_url_trailing_slash(self) -> None: ) assert request.url == "http://localhost:5000/custom/path/foo" - def test_base_url_no_trailing_slash(self) -> None: - client = Finch( - base_url="http://localhost:5000/custom/path", access_token=access_token, _strict_response_validation=True - ) + @pytest.mark.parametrize( + "client", + [ + Finch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + ), + Finch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + http_client=httpx.Client(), + ), + ], + ids=["standard", "custom http client"], + ) + def test_base_url_no_trailing_slash(self, client: Finch) -> None: request = client._build_request( FinalRequestOptions( method="post", @@ -376,6 +460,124 @@ def test_base_url_no_trailing_slash(self) -> None: ) assert request.url == "http://localhost:5000/custom/path/foo" + @pytest.mark.parametrize( + "client", + [ + Finch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + ), + Finch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + http_client=httpx.Client(), + ), + ], + ids=["standard", "custom http client"], + ) + def test_absolute_request_url(self, client: Finch) -> None: + request = client._build_request( + FinalRequestOptions( + method="post", + url="https://myapi.com/foo", + json_data={"foo": "bar"}, + ), + ) + assert request.url == "https://myapi.com/foo" + + def test_transport_option_is_deprecated(self) -> None: + with pytest.warns( + DeprecationWarning, + match="The `transport` argument is deprecated. 
The `http_client` argument should be passed instead", + ): + transport = httpx.MockTransport(lambda: None) + + client = Finch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, transport=transport + ) + + assert client._client._transport is transport + + def test_transport_option_mutually_exclusive_with_http_client(self) -> None: + with httpx.Client() as http_client: + with pytest.raises(ValueError, match="The `http_client` argument is mutually exclusive with `transport`"): + with pytest.warns(DeprecationWarning): + Finch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + transport=httpx.MockTransport(lambda: None), + http_client=http_client, + ) + + def test_connection_pool_limits_option_is_deprecated(self) -> None: + with pytest.warns( + DeprecationWarning, + match="The `connection_pool_limits` argument is deprecated. The `http_client` argument should be passed instead", + ): + connection_pool_limits = httpx.Limits( + max_connections=101, max_keepalive_connections=76, keepalive_expiry=23 + ) + + client = Finch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + connection_pool_limits=connection_pool_limits, + ) + + assert isinstance(client._client._transport, httpx.HTTPTransport) + assert client._client._transport._pool._max_connections == 101 + assert client._client._transport._pool._max_keepalive_connections == 76 + assert client._client._transport._pool._keepalive_expiry == 23 + + def test_connection_pool_limits_option_mutually_exclusive_with_http_client(self) -> None: + with httpx.Client() as http_client: + with pytest.raises( + ValueError, match="The `http_client` argument is mutually exclusive with `connection_pool_limits`" + ): + with pytest.warns(DeprecationWarning): + Finch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + connection_pool_limits=httpx.Limits( + max_connections=101, max_keepalive_connections=76, keepalive_expiry=23 + ), + http_client=http_client, + ) + + def test_proxies_option_is_deprecated(self) -> None: + with pytest.warns( + DeprecationWarning, + match="The `proxies` argument is deprecated. 
The `http_client` argument should be passed instead", + ): + proxies = "https://www.example.com/proxy" + + client = Finch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, proxies=proxies + ) + + mounts = list(client._client._mounts.keys()) + assert len(mounts) == 1 + + pattern = mounts[0].pattern + assert pattern == "all://" + + def test_proxies_option_mutually_exclusive_with_http_client(self) -> None: + with httpx.Client() as http_client: + with pytest.raises(ValueError, match="The `http_client` argument is mutually exclusive with `proxies`"): + with pytest.warns(DeprecationWarning): + Finch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + proxies="https://www.example.com/proxy", + http_client=http_client, + ) + def test_client_del(self) -> None: client = Finch(base_url=base_url, access_token=access_token, _strict_response_validation=True) assert not client.is_closed() @@ -384,6 +586,18 @@ def test_client_del(self) -> None: assert client.is_closed() + def test_copied_client_does_not_close_http(self) -> None: + client = Finch(base_url=base_url, access_token=access_token, _strict_response_validation=True) + assert not client.is_closed() + + copied = client.copy() + assert copied is not client + + copied.__del__() + + assert not copied.is_closed() + assert not client.is_closed() + def test_client_context_manager(self) -> None: client = Finch(base_url=base_url, access_token=access_token, _strict_response_validation=True) with client as c2: @@ -575,6 +789,57 @@ def test_copy_signature(self) -> None: copy_param = copy_signature.parameters.get(name) assert copy_param is not None, f"copy() signature is missing the {name} param" + async def test_request_timeout(self) -> None: + request = self.client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == DEFAULT_TIMEOUT + + request = self.client._build_request( + FinalRequestOptions(method="get", url="/foo", timeout=httpx.Timeout(100.0)) + ) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == httpx.Timeout(100.0) + + async def test_client_timeout_option(self) -> None: + client = AsyncFinch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, timeout=httpx.Timeout(0) + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == httpx.Timeout(0) + + async def test_http_client_timeout_option(self) -> None: + # custom timeout given to the httpx client should be used + async with httpx.AsyncClient(timeout=None) as http_client: + client = AsyncFinch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, http_client=http_client + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == httpx.Timeout(None) + + # no timeout given to the httpx client should not use the httpx default + async with httpx.AsyncClient() as http_client: + client = AsyncFinch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, http_client=http_client + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == DEFAULT_TIMEOUT + 
+ # explicitly passing the default timeout currently results in it being ignored + async with httpx.AsyncClient(timeout=HTTPX_DEFAULT_TIMEOUT) as http_client: + client = AsyncFinch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, http_client=http_client + ) + + request = client._build_request(FinalRequestOptions(method="get", url="/foo")) + timeout = httpx.Timeout(**request.extensions["timeout"]) # type: ignore + assert timeout == DEFAULT_TIMEOUT # our default + def test_default_headers_option(self) -> None: client = AsyncFinch( base_url=base_url, @@ -770,10 +1035,24 @@ class Model2(BaseModel): assert isinstance(response, Model1) assert response.foo == 1 - def test_base_url_trailing_slash(self) -> None: - client = AsyncFinch( - base_url="http://localhost:5000/custom/path/", access_token=access_token, _strict_response_validation=True - ) + @pytest.mark.parametrize( + "client", + [ + AsyncFinch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + ), + AsyncFinch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + http_client=httpx.AsyncClient(), + ), + ], + ids=["standard", "custom http client"], + ) + def test_base_url_trailing_slash(self, client: AsyncFinch) -> None: request = client._build_request( FinalRequestOptions( method="post", @@ -783,10 +1062,24 @@ def test_base_url_trailing_slash(self) -> None: ) assert request.url == "http://localhost:5000/custom/path/foo" - def test_base_url_no_trailing_slash(self) -> None: - client = AsyncFinch( - base_url="http://localhost:5000/custom/path", access_token=access_token, _strict_response_validation=True - ) + @pytest.mark.parametrize( + "client", + [ + AsyncFinch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + ), + AsyncFinch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + http_client=httpx.AsyncClient(), + ), + ], + ids=["standard", "custom http client"], + ) + def test_base_url_no_trailing_slash(self, client: AsyncFinch) -> None: request = client._build_request( FinalRequestOptions( method="post", @@ -796,6 +1089,124 @@ def test_base_url_no_trailing_slash(self) -> None: ) assert request.url == "http://localhost:5000/custom/path/foo" + @pytest.mark.parametrize( + "client", + [ + AsyncFinch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + ), + AsyncFinch( + base_url="http://localhost:5000/custom/path/", + access_token=access_token, + _strict_response_validation=True, + http_client=httpx.AsyncClient(), + ), + ], + ids=["standard", "custom http client"], + ) + def test_absolute_request_url(self, client: AsyncFinch) -> None: + request = client._build_request( + FinalRequestOptions( + method="post", + url="https://myapi.com/foo", + json_data={"foo": "bar"}, + ), + ) + assert request.url == "https://myapi.com/foo" + + def test_transport_option_is_deprecated(self) -> None: + with pytest.warns( + DeprecationWarning, + match="The `transport` argument is deprecated. 
The `http_client` argument should be passed instead", + ): + transport = httpx.MockTransport(lambda: None) + + client = AsyncFinch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, transport=transport + ) + + assert client._client._transport is transport + + async def test_transport_option_mutually_exclusive_with_http_client(self) -> None: + async with httpx.AsyncClient() as http_client: + with pytest.raises(ValueError, match="The `http_client` argument is mutually exclusive with `transport`"): + with pytest.warns(DeprecationWarning): + AsyncFinch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + transport=httpx.MockTransport(lambda: None), + http_client=http_client, + ) + + def test_connection_pool_limits_option_is_deprecated(self) -> None: + with pytest.warns( + DeprecationWarning, + match="The `connection_pool_limits` argument is deprecated. The `http_client` argument should be passed instead", + ): + connection_pool_limits = httpx.Limits( + max_connections=101, max_keepalive_connections=76, keepalive_expiry=23 + ) + + client = AsyncFinch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + connection_pool_limits=connection_pool_limits, + ) + + assert isinstance(client._client._transport, httpx.AsyncHTTPTransport) + assert client._client._transport._pool._max_connections == 101 + assert client._client._transport._pool._max_keepalive_connections == 76 + assert client._client._transport._pool._keepalive_expiry == 23 + + async def test_connection_pool_limits_option_mutually_exclusive_with_http_client(self) -> None: + async with httpx.AsyncClient() as http_client: + with pytest.raises( + ValueError, match="The `http_client` argument is mutually exclusive with `connection_pool_limits`" + ): + with pytest.warns(DeprecationWarning): + AsyncFinch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + connection_pool_limits=httpx.Limits( + max_connections=101, max_keepalive_connections=76, keepalive_expiry=23 + ), + http_client=http_client, + ) + + def test_proxies_option_is_deprecated(self) -> None: + with pytest.warns( + DeprecationWarning, + match="The `proxies` argument is deprecated. 
The `http_client` argument should be passed instead", + ): + proxies = "https://www.example.com/proxy" + + client = AsyncFinch( + base_url=base_url, access_token=access_token, _strict_response_validation=True, proxies=proxies + ) + + mounts = list(client._client._mounts.keys()) + assert len(mounts) == 1 + + pattern = mounts[0].pattern + assert pattern == "all://" + + async def test_proxies_option_mutually_exclusive_with_http_client(self) -> None: + async with httpx.AsyncClient() as http_client: + with pytest.raises(ValueError, match="The `http_client` argument is mutually exclusive with `proxies`"): + with pytest.warns(DeprecationWarning): + AsyncFinch( + base_url=base_url, + access_token=access_token, + _strict_response_validation=True, + proxies="https://www.example.com/proxy", + http_client=http_client, + ) + async def test_client_del(self) -> None: client = AsyncFinch(base_url=base_url, access_token=access_token, _strict_response_validation=True) assert not client.is_closed() @@ -805,6 +1216,19 @@ async def test_client_del(self) -> None: await asyncio.sleep(0.2) assert client.is_closed() + async def test_copied_client_does_not_close_http(self) -> None: + client = AsyncFinch(base_url=base_url, access_token=access_token, _strict_response_validation=True) + assert not client.is_closed() + + copied = client.copy() + assert copied is not client + + copied.__del__() + + await asyncio.sleep(0.2) + assert not copied.is_closed() + assert not client.is_closed() + async def test_client_context_manager(self) -> None: client = AsyncFinch(base_url=base_url, access_token=access_token, _strict_response_validation=True) async with client as c2: