vllm-project/vllm

Python

A high-throughput and memory-efficient inference and serving engine for LLMs

82 / 100

Activity: Stable

Score Breakdown

Issue Response: 25 / 25 pts
PR Merge Rate: 15 / 20 pts
External Contributions: 12 / 15 pts
Commit Distribution: 14 / 15 pts
Good First Issues: 10 / 10 pts
Complexity: 1 / 10 pts
CONTRIBUTING: 4 / 4 pts
Code of Conduct: 1 / 1 pt
No Penalty (repo is code): 0
No Penalty (repo is active): 0
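The 82 / 100 total is consistent with the component points above minus the (zero) penalties. A minimal sketch of that arithmetic, assuming the score is a simple sum (the component names and points come from the breakdown; the summation logic itself is an assumption):

```python
# Score components as (points earned, max points), copied from the card above.
breakdown = {
    "Issue Response":         (25, 25),
    "PR Merge Rate":          (15, 20),
    "External Contributions": (12, 15),
    "Commit Distribution":    (14, 15),
    "Good First Issues":      (10, 10),
    "Complexity":             (1, 10),
    "CONTRIBUTING":           (4, 4),
    "Code of Conduct":        (1, 1),
}
# Both penalty checks passed, so they subtract nothing.
penalties = {"Repo is code": 0, "Repo is active": 0}

score = sum(earned for earned, _ in breakdown.values()) - sum(penalties.values())
max_score = sum(maximum for _, maximum in breakdown.values())
print(f"{score} / {max_score}")  # → 82 / 100
```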

Repository Stats

Stars: 72.3k
Forks: 14k
Open Issues: 1.7k
Lines of Code: 910.7k
Contributors: 2.3k

Contributor Friendliness

Good First Issues: 36
Help Wanted: 45
Avg Issue Response: 1h
Avg PR Merge Time: 13h
Avg PR Review Time: 4m
Issue Close Rate: 18%

Project Info

Created: Feb 9, 2023
Age: 3 years
Last Push: 5d ago
Size: 167.9 MB