The LLM Instance Gateway came out of [wg-serving](https://github.com/kubernetes/community/tree/master/wg-serving) and is sponsored by [SIG Apps](https://github.com/kubernetes/community/blob/master/sig-apps/README.md#llm-instance-gateway). This repo contains the load balancing algorithm, [ext-proc](https://www.envoyproxy.io/docs/envoy/latest/configuration/http/http_filters/ext_proc_filter) code, CRDs, and controllers that support the LLM Instance Gateway.