Strak.ch | News and assorted links
GitHub - vllm-project/vllm: A high-throughput and memory-efficient inference and serving engine for LLMs
An alternative to Ollama, which seems to perform better in tests.
February 18, 2026 at 10:30:48 PM GMT+1
https://github.com/vllm-project/vllm
Tags: LLM, IA, AutoHébergement
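Like Ollama, vLLM can serve models behind an OpenAI-compatible HTTP API (started with `vllm serve <model>`, listening on port 8000 by default). A minimal sketch of querying such a server with only the Python standard library; the base URL, port, and model name (`facebook/opt-125m`) are assumptions for illustration:

```python
import json
from urllib import request


def build_completion_request(base_url: str, model: str, prompt: str,
                             max_tokens: int = 64) -> request.Request:
    """Build a POST request for an OpenAI-compatible /v1/completions
    endpoint, such as the one exposed by `vllm serve`."""
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return request.Request(
        f"{base_url}/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Hypothetical usage against a local vLLM server (the network call is
# commented out since it requires a running server and a loaded model):
req = build_completion_request(
    "http://localhost:8000", "facebook/opt-125m", "Hello, my name is"
)
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["text"])
```

Because the API is OpenAI-compatible, the same request shape works against Ollama's OpenAI endpoint or any other compatible server, which makes switching between the two engines straightforward.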