vllm-project/vllm

Python

A high-throughput and memory-efficient inference and serving engine for LLMs

Score: 82 / 100

Activity: Stable

Score Breakdown

Issue Response: 25 / 25 pts
PR Merge Rate: 14 / 20 pts
External Contributions: 13 / 15 pts
Commit Distribution: 14 / 15 pts
Good First Issues: 10 / 10 pts
Complexity: 1 / 10 pts
CONTRIBUTING: 4 / 4 pts
Code of Conduct: 1 / 1 pt
Repo is code: no penalty (0)
Repo is active: no penalty (0)

Repository Stats

Stars: 77.2k
Forks: 15.8k
Open Issues: 1.9k
Lines of Code: 1M
Contributors: 2.5k

Contributor Friendliness

Good First Issues: 39
Help Wanted: 57
Avg Issue Response: 10h
Avg PR Merge Time: 19h
Avg PR Review Time: 1m
Issue Close Rate: 11%

Project Info

Created: Feb 9, 2023
Age: 3 years old
Last Push: 20d ago
Size: 185.8 MB