
feat: support client["/endpoint"].GET() style calls #1791

Merged
drwpow merged 4 commits into openapi-ts:main from the path-index branch
Aug 10, 2024

Conversation

@gzm0 (Contributor) commented Jul 31, 2024

Changes

Allow calls of the form client["/path"].GET(...) by using a JavaScript proxy object.
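
For illustration, a minimal sketch of the two call styles side by side (the schema import path is hypothetical; the endpoint matches the examples used in the tests below):

import createClient from "openapi-fetch";
import type { paths } from "./my-schema"; // hypothetical generated types

const client = createClient<paths>({ baseUrl: "https://api.example.com" });

// Existing style: the path is the first argument.
await client.GET("/blogposts/{post_id}", {
  params: { path: { post_id: "1234" } },
});

// Proposed path-index style: index by path, then call the HTTP method.
await client["/blogposts/{post_id}"].GET({
  params: { path: { post_id: "1234" } },
});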

How to Review

Suggested order:

  • index.js
  • index.d.ts
  • Everything else.

Points to watch out for:

  • Performance
  • Reliance on additional ECMAScript features (Proxy).
  • Sufficient unit test coverage (IMO light but sufficient given the implementation).
  • Exporting / non-exporting of type helpers.

Checklist

  • Unit tests updated
  • docs/ updated (if necessary)
  • pnpm run update:examples run (only applicable for openapi-typescript)


changeset-bot bot commented Jul 31, 2024

🦋 Changeset detected

Latest commit: f570294

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 2 packages
Name                 Type
openapi-fetch        Minor
openapi-react-query  Major


@gzm0 (Contributor, Author) commented Jul 31, 2024

CI failure is being addressed in #1792.

@gzm0 gzm0 changed the title feat: support path-index call style (client["/path"].GET(...)) feat: support client["/endpoint"].GET()` style calls Jul 31, 2024
@gzm0 gzm0 changed the title feat: support client["/endpoint"].GET()` style calls feat: support client["/endpoint"].GET() style calls Jul 31, 2024
@gzm0 gzm0 changed the title feat: support client["/endpoint"].GET() style calls feat: support client["/endpoint"].GET() style calls Jul 31, 2024
@gzm0 gzm0 marked this pull request as ready for review July 31, 2024 11:01
@gzm0 gzm0 requested a review from a team as a code owner July 31, 2024 11:01
@drwpow (Contributor) commented Aug 1, 2024

Thanks for submitting this! But I don’t see an attached issue discussing why this is needed or what problem it solves. I’m seeing a significant performance regression from using Proxies and additional logic, for what seems to be a minor API change. Could you talk more about why this direction was sought? What prior art does it reference? Does it solve any problems for you?

I’m not necessarily against adding this, but as stated in the CONTRIBUTING guidelines, changing the API in a PR without any prior discussion isn’t recommended, especially if it impacts performance for all users like this PR does (sidenote: for many libraries, yes, things like Proxies are negligible; while we don’t want to micro-optimize to no end, this library being a fetch client strives to shave off every nanosecond possible, and this introduces the biggest slowdown, performance-wise, in this library to-date so we’d better weigh the cost of what we get in return!).

Edit: note on performance

To dig more into my thoughts on performance, Optimizing JavaScript for fun and profit is a great article written recently on V8 performance. Particularly this note:

The proxy benchmark is particularly brutal on V8 at the moment. Last time I checked, proxy objects were always falling back from the JIT to the interpreter, seeing from those results it might still be the case.

In your code, every fetch will be noticeably slower because the compiler can’t optimize the static entries like we have currently. And again, this is where library code differs from application code. If you’re writing this in one place, in an app, that only gets called once, who cares? Proxies are great. But this is a widely-used, lower-level library where every millisecond matters (and if you read the project goals, performance is a major concern). Here it does matter micro-optimizing everything, because this will get called hundreds if not thousands of times.

There are ways to avoid this perf hit with lazy/“smart” getters, and I’d probably be OK with that trade-off! You get a slow first hit and “free” subsequent hits, but it’s still worth profiling.
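
For reference, a minimal sketch of the lazy/“smart” getter idea (names here are illustrative, not the library’s API): the getter replaces itself with a plain data property on first access, so only the first read pays the cost.

// Illustrative only: a self-replacing ("smart") getter.
function defineLazyPath(client: object, path: string, buildMethods: (path: string) => unknown) {
  Object.defineProperty(client, path, {
    configurable: true,
    enumerable: true,
    get() {
      const value = buildMethods(path); // slow first hit
      // Swap the getter for a plain data property: subsequent reads are "free".
      Object.defineProperty(client, path, { value, enumerable: true });
      return value;
    },
  });
}

The catch, noted later in the thread, is that this only works for statically known property names, which is why the discussion moves toward memoizing behind a Proxy instead.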

@gzm0 (Contributor, Author) commented Aug 1, 2024

🤦 sorry, my bad. This fixes (and is discussed in) #1526.

The primary use case is better type inference / go to definition. Another reason why I'm interested in this is thinner mocking (e.g. with jest-mock-extended).

import { mock } from "jest-mock-extended";
import type { Client } from "openapi-fetch";
import type { paths } from "...";

const mockClient = mock<Client<paths>>();

// Stub a single endpoint without touching the rest of the client.
mockClient["/path"].GET.calledWith(...).mockResolvedValue(...);

If you have discouraging performance numbers, we can investigate a pay-only-if-you-use-it alternative (probably wrapping the result of createClient).

(Please excuse my brevity, I'm in between things, but that missing cross-ref was embarrassing.)

@drwpow (Contributor) commented Aug 1, 2024

🤦 sorry, my bad. This fixes (and is discussed in) #1526.

Ah, thanks! That issue came to mind but just wanted to check that this built off that convo. No worries—I think there are some great arguments in there.

If you have discouraging performance numbers we can investigate a pay-only-if-you-use it alternative (probably wrapping the result of createClient).

I’d love to explore this, yeah! Just edited my original comment with more info. Benchmarking in general is really tricky (it’s very hard to automate regression tests for it without investing significant time and resources into maintaining it), but in general I’d like to explore all possible options for avoiding a Proxy on every call because of the performance hit. It slows down everything by quite a lot (see the naïve implementation; I may have made a mistake, but hopefully this gives some reference).

If this API just took a separate codepath entirely I think that would be ideal.

}

// Assume the property is a URL.
return Object.fromEntries(methods.map((method) => [method, (init) => coreFetch(property, { ...init, method })]));

Contributor

Performance regression: taking previously-static properties and iterating over them on every fetch call adds up to a big performance hit. Every time we convert a static object with known properties into another loop, the overall runtime extends by a little bit. Prefer “WET” code here when possible (and again, this isn’t the norm on all codebases; just library code)!
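
For illustration, a sketch of the static, spelled-out (“WET”) shape this comment is asking for, as opposed to rebuilding the method map in a loop on every call (coreFetch stands in for the client’s internal fetch wrapper shown in the diff above; the helper name is illustrative):

type CoreFetch = (url: string, init: { method: string }) => Promise<unknown>;

// Illustrative: a fixed object literal keeps a stable shape the engine can optimize,
// instead of Object.fromEntries over a methods array on every access.
const staticMethodsFor = (coreFetch: CoreFetch, url: string) => ({
  GET: (init?: object) => coreFetch(url, { ...init, method: "GET" }),
  PUT: (init?: object) => coreFetch(url, { ...init, method: "PUT" }),
  POST: (init?: object) => coreFetch(url, { ...init, method: "POST" }),
  DELETE: (init?: object) => coreFetch(url, { ...init, method: "DELETE" }),
  // ...remaining HTTP methods written out the same way.
});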

expect(client).toHaveProperty("OPTIONS");
expect(client).toHaveProperty("HEAD");
expect(client).toHaveProperty("PATCH");
expect(client).toHaveProperty("TRACE");

Contributor

Suggestion: OK, this is kind of a dumb test, I’ll admit, but we should at least improve it rather than remove it outright (it tests both runtime code and type code in one test, which is a recurring theme in the current test suite).

Contributor Author

I removed it because it was failing (due to the replacement with a proxy). If we go for a more static alternative, I'll be able to add it again.

await client["/blogposts/{post_id}"].GET({
params: { path: { post_id: "1234" } },
});
expect(getRequestUrl().pathname).toBe("/blogposts/1234");

Contributor

Comment: these are great tests, by the way! 🙂 I would also like to test this API for POST calls and fetches with request bodies in addition to what you have here. But this is a great start.

Contributor Author

I have added some more. They are all adapted from the base tests above.

I'm a bit unsure how much we want to cover this API: on the one hand, it is a new public API, so we should probably err on the "too many tests" side. On the other hand, the integration is quite thin, so a lot of the core type inference and calling logic is reused. So I'm not sure it makes sense to essentially test it twice.

WDYT?

@gzm0 (Contributor, Author) commented Aug 1, 2024

TY for the quick feedback. I'll attempt a separate factory alternative tomorrow.

RE performance:

  • If the JIT falls back to interpretation, that's definitely an issue.
  • TBC we're on the same page: performance is more important than code size (at least in the KB range?). (Although with a separate createClient function it will not matter if a decent bundler is used downstream).

@drwpow (Contributor) commented Aug 1, 2024

TBC we're on the same page: performance is more important than code size (at least in the KB range?). (Although with a separate createClient function it will not matter if a decent bundler is used downstream).

Yes that’s correct! In the README and docs we do mention code size as a proxy for performance, but at the end of the day performance > code size. Because this library is meant to be a typed fetch() replacement for a user’s OpenAPI schema, the goal is that users should be able to swap fetch() calls for this without any drop in performance (while that’s technically unattainable—any code introduced has some runtime—we want to at least make the performance hit so negligible it’s not a concern).

Also, taking into account that some people may wrap this with another library such as TanStack Fetch, openapi-fetch usually won’t be the only slowdown for every request (and to that extent, every nanosecond we introduce here counts against the “budget” that users allot for how slow their entire SDK stack can be, so let’s minimize our impact).

@gzm0 gzm0 force-pushed the path-index branch 2 times, most recently from 800efd6 to ad69d62 on August 2, 2024 09:09
@gzm0 (Contributor, Author) left a comment

I've added a commit that proposes an opt-in alternative.

I've flagged the individual points I think might be problematic in inline comments.
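
For orientation while reviewing, a minimal usage sketch of the opt-in shape, assuming the wrapper ends up exported as wrapAsPathBasedClient as discussed later in this thread (exact export name and signature are the PR's to define, and the schema import path is hypothetical):

import createClient, { wrapAsPathBasedClient } from "openapi-fetch";
import type { paths } from "./my-schema"; // hypothetical generated types

const client = createClient<paths>({ baseUrl: "https://api.example.com" });

// Opt in: only callers who want the path-index style pay for the Proxy.
const pathClient = wrapAsPathBasedClient(client);

await pathClient["/blogposts/{post_id}"].GET({
  params: { path: { post_id: "1234" } },
});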

@@ -169,6 +169,25 @@ const { data, error, response } = await client.GET("/url");
| `error` | `5xx`, `4xx`, or `default` response if not OK; otherwise `undefined` |
| `response` | [The original Response](https://developer.mozilla.org/en-US/docs/Web/API/Response) which contains `status`, `headers`, etc. |

### Path-property style
Contributor Author

We can also remove this section here, if you feel it is too prominent.

Contributor

No, this is great! Docs aren’t too prescriptive. Better to just add it anywhere to start; it’s always easy to clean up after the fact. But missing information is worse than information in the wrong place (the Algolia search helps with that bit).


const clientProxyHandler = {
  // Assume the property is a URL.
  get: (coreClient, url) => new UrlCallForwarder(coreClient, url),

Contributor Author

We could memoize UrlCallForwarder here on either:

  • An internal map
  • The proxied object itself (that would probably imply creating another intermediary object in wrapAsPathBasedClient to not pollute the core client).

However, I'm not sure it is worth the effort:

The lazy getter technique is primarily for statically known getter names. We can adjust it to work with proxies, but we'll still be left with an indirection through the proxy:

const clientProxyHandler = {
  // Assume the property is a URL.
  get: (coreClient, url) => {
    if (!(url in coreClient)) {
      // BAD: pollutes client, but OK for the example here.
      coreClient[url] = new UrlCallForwarder(coreClient, url);
    }
    return coreClient[url];
  }
}

Based on the Optimizing JavaScript article, on modern Chrome Proxies lead to a ~30x slowdown (even with a trivial no-op indirection). At that point, I doubt that the additional object allocation is noticeable.

It feels like we're a bit stuck in this sub-optimal situation. Something we can do for users who want high performance and path-based calls is to recommend pre-calculating the UrlCallForwarder:

const myMethod = client["/my-path"];

function performanceSensitive() {
  myMethod.GET(...);
}


Contributor Author

Hmm, I just had an idea: we could potentially get around the Proxy limitation by chaining prototypes in a slightly unusual way.

If the object we use to memoize the properties inherits (as in: prototype chain) from the proxy, that might mitigate the performance impact.

I'll give this a try.
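
To make the idea concrete, a rough sketch of the prototype-chain trick with illustrative names (not the exact code in the eventual commit): forwarders are memoized as own properties of a wrapper object whose prototype is the Proxy, so only the first access to each path pays the Proxy indirection.

// A rough sketch of the prototype-chain memoization idea (illustrative names).
function makePathBasedClientSketch(coreClient: any, methods: readonly string[]) {
  const proxy = new Proxy(coreClient, {
    get(_target, url, receiver) {
      // Build the per-path forwarder on demand (stands in for UrlCallForwarder).
      const forwarder: Record<string, (init?: object) => unknown> = {};
      for (const method of methods) {
        forwarder[method] = (init?: object) => coreClient[method](url, init);
      }
      // Memoize it as an own property of the wrapper (the receiver), so the next
      // access to the same path never reaches the Proxy at all.
      Object.defineProperty(receiver, url, { value: forwarder, enumerable: true });
      return forwarder;
    },
  });

  // Own properties shadow the prototype; only first-time misses fall through to the Proxy.
  return Object.create(proxy);
}

This keeps the Proxy on the cold path only, which lines up with the benchmark numbers further down the thread.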

Contributor

Just saw this improvement. Yeah I think this solves my concerns. I hate making “vibe-based” comments, especially when we don’t have solid benchmarking in the project. But once we give people a good DX like this, it’s tough to take it away again, and it’s easy to be stuck with any performance loss permanently.

But your explorations here seem to have found a happy medium: adding great DX without incurring a noticeable perf hit (IMO having a proxy for 100% of calls would have been a noticeable perf hit, and we managed to avoid that).

Overall I’m pretty happy with this approach!


@gzm0 gzm0 requested a review from drwpow August 2, 2024 09:30
@gzm0 (Contributor, Author) commented Aug 2, 2024

It does not look like the proxy has any measurable performance impact:

Without the memoization:

 ✓ test/index.bench.js (16) 11283ms
   ✓ setup (4) 3562ms
     name                                hz     min     max    mean     p75     p99    p995    p999      rme  samples
   · openapi-fetch               963,767.47  0.0008  5.9080  0.0010  0.0009  0.0021  0.0024  0.0290   ±2.39%   481884   fastest
   · openapi-fetch (path based)  958,197.98  0.0009  0.5356  0.0010  0.0010  0.0020  0.0023  0.0029   ±0.44%   479099
   · openapi-typescript-fetch    442,678.98  0.0011  5.8343  0.0023  0.0013  0.0048  0.0055  0.0176  ±10.71%   221340
   · axios                       102,057.14  0.0089  0.5606  0.0098  0.0094  0.0169  0.0192  0.0335   ±0.50%    51029   slowest
   ✓ get (only URL) (6) 3861ms
     name                              hz     min     max    mean     p75     p99    p995    p999     rme  samples
   · openapi-fetch               1,374.44  0.5659  5.6875  0.7276  0.7245  4.4769  5.1658  5.6875  ±4.72%      688
   · openapi-fetch (path based)  1,521.29  0.4680  8.1287  0.6573  0.6306  3.8165  5.9574  8.1287  ±6.24%      761
   · openapi-typescript-fetch    1,616.65  0.4747  6.3194  0.6186  0.5989  1.1297  5.1087  6.3194  ±5.51%      809   fastest
   · openapi-typescript-codegen  1,550.53  0.4916  5.5255  0.6449  0.6238  4.7036  5.0065  5.5255  ±5.04%      776
   · axios                         423.41  1.8487  6.3936  2.3618  2.4666  6.1766  6.2922  6.3936  ±4.03%      212   slowest
   · superagent                    503.85  1.6337  5.9643  1.9847  1.9958  5.3915  5.6255  5.9643  ±3.51%      253
   ✓ get (headers) (6) 3855ms
     name                              hz     min      max    mean     p75     p99    p995     p999      rme  samples
   · openapi-fetch               1,681.39  0.4554   5.8572  0.5947  0.5841  4.6528  4.8856   5.8572   ±5.40%      841
   · openapi-fetch (path based)  1,479.07  0.4598  37.2277  0.6761  0.5962  3.9853  5.4106  37.2277  ±15.19%      740
   · openapi-typescript-fetch    1,709.66  0.4375   7.1737  0.5849  0.5561  4.6950  4.9280   7.1737   ±5.73%      855   fastest
   · openapi-typescript-codegen  1,679.25  0.4550   5.4615  0.5955  0.5749  4.6242  4.8575   5.4615   ±5.27%      840
   · axios                         461.21  1.7771   6.6207  2.1682  2.1944  5.0415  5.9881   6.6207   ±3.53%      231   slowest
   · superagent                    480.56  1.5588   5.8003  2.0809  2.1562  5.3616  5.7713   5.8003   ±3.42%      241


 BENCH  Summary

  openapi-fetch - test/index.bench.js > setup
    1.01x faster than openapi-fetch (path based)
    2.18x faster than openapi-typescript-fetch
    9.44x faster than axios

  openapi-typescript-fetch - test/index.bench.js > get (only URL)
    1.04x faster than openapi-typescript-codegen
    1.06x faster than openapi-fetch (path based)
    1.18x faster than openapi-fetch
    3.21x faster than superagent
    3.82x faster than axios

  openapi-typescript-fetch - test/index.bench.js > get (headers)
    1.02x faster than openapi-fetch
    1.02x faster than openapi-typescript-codegen
    1.16x faster than openapi-fetch (path based)
    3.56x faster than superagent
    3.71x faster than axios

With the memoization:

 ✓ test/index.bench.js (16) 11160ms
   ✓ setup (4) 3438ms
     name                                hz     min      max    mean     p75     p99    p995    p999      rme  samples
   · openapi-fetch               898,597.93  0.0009   2.0690  0.0011  0.0010  0.0021  0.0025  0.0265   ±1.00%   449299   fastest
   · openapi-fetch (path based)  329,820.97  0.0013  10.6270  0.0030  0.0016  0.0045  0.0058  0.0147  ±12.57%   166556
   · openapi-typescript-fetch    425,604.10  0.0011   6.3861  0.0023  0.0013  0.0046  0.0053  0.0309  ±11.77%   212803
   · axios                       111,898.57  0.0083   0.8467  0.0089  0.0088  0.0129  0.0165  0.0228   ±0.50%    55950   slowest
   ✓ get (only URL) (6) 3828ms
     name                              hz     min     max    mean     p75     p99    p995    p999     rme  samples
   · openapi-fetch               1,239.24  0.5681  7.3555  0.8069  0.7880  4.1139  5.2457  7.3555  ±5.21%      620
   · openapi-fetch (path based)  1,550.97  0.4764  5.5317  0.6448  0.6180  4.8220  5.1287  5.5317  ±5.08%      776
   · openapi-typescript-fetch    1,637.86  0.4533  5.6228  0.6106  0.5893  1.0828  5.0218  5.6228  ±5.00%      819   fastest
   · openapi-typescript-codegen  1,604.55  0.4969  5.6166  0.6232  0.6060  2.1447  5.0693  5.6166  ±5.09%      803
   · axios                         357.33  2.0225  6.9643  2.7986  2.9122  6.2065  6.9643  6.9643  ±3.21%      179   slowest
   · superagent                    402.58  1.8439  6.7624  2.4840  2.5649  6.3109  6.6520  6.7624  ±3.72%      202
   ✓ get (headers) (6) 3888ms
     name                              hz     min     max    mean     p75     p99    p995    p999     rme  samples
   · openapi-fetch               1,501.18  0.4718  7.3496  0.6661  0.6251  4.9542  5.3748  7.3496  ±5.75%      755
   · openapi-fetch (path based)  1,736.68  0.4355  5.5869  0.5758  0.5641  2.3819  4.8996  5.5869  ±5.00%      869
   · openapi-typescript-fetch    1,783.00  0.4338  6.3428  0.5609  0.5448  4.4160  4.8103  6.3428  ±5.27%      892   fastest
   · openapi-typescript-codegen  1,694.24  0.4466  5.6741  0.5902  0.5613  4.3757  4.9665  5.6741  ±5.31%      848
   · axios                         375.33  1.9158  6.8901  2.6643  2.8660  6.4472  6.8901  6.8901  ±3.88%      188   slowest
   · superagent                    375.35  1.7544  6.9554  2.6642  2.7840  6.4310  6.9554  6.9554  ±3.90%      188


 BENCH  Summary

  openapi-fetch - test/index.bench.js > setup
    2.11x faster than openapi-typescript-fetch
    2.72x faster than openapi-fetch (path based)
    8.03x faster than axios

  openapi-typescript-fetch - test/index.bench.js > get (only URL)
    1.02x faster than openapi-typescript-codegen
    1.06x faster than openapi-fetch (path based)
    1.32x faster than openapi-fetch
    4.07x faster than superagent
    4.58x faster than axios

  openapi-typescript-fetch - test/index.bench.js > get (headers)
    1.03x faster than openapi-fetch (path based)
    1.05x faster than openapi-typescript-codegen
    1.19x faster than openapi-fetch
    4.75x faster than superagent
    4.75x faster than axios

@gzm0 gzm0 mentioned this pull request Aug 2, 2024
@gzm0 (Contributor, Author) commented Aug 5, 2024

I have opened #1818 to discuss benchmarking in general. @drwpow how would you like to proceed with this one?

@kerwanp kerwanp added the openapi-fetch Relevant to the openapi-fetch library label Aug 10, 2024
@drwpow (Contributor) commented Aug 10, 2024

It does not look like the proxy has any measurable performance impact:

Thanks for taking the time to inspect that! Perhaps you’re right, and the way we’re using it, proxies wouldn’t have slowed this library down at all even in the worst-case scenarios. But at least you minimized any potential impact as much as possible, and I’m happy with it!

Also really appreciate all the great work you’ve done in improving benchmarking as well. I’d be happy to ship this! Thanks for adding such a great feature 🙂

@drwpow (Contributor) left a comment

Great docs, tests, and code!

@drwpow drwpow merged commit a956d5d into openapi-ts:main Aug 10, 2024
7 checks passed
@github-actions github-actions bot mentioned this pull request Aug 10, 2024
@gzm0 gzm0 deleted the path-index branch August 12, 2024 06:53