
Conclusion & Future Work

Get Involved


If you haven’t already, we highly recommend updating to the latest vLLM version (see instructions here) and trying it out for yourself! We always love to learn more about your use cases and how we can make vLLM better for you. The vLLM team can be reached at vllm-questions@lists.berkeley.edu. vLLM is also a community project: if you are interested in participating and contributing, we welcome you to check out our roadmap and look for good first issues to tackle. Stay tuned for more updates by following us on X.

If you are in the Bay Area, you can meet the vLLM team at the following events: vLLM’s sixth meetup with NVIDIA (09/09), the PyTorch Conference (09/19), the CUDA MODE IRL meetup (09/21), and the first-ever vLLM track at Ray Summit (10/01–02).