
[tiny] Remove vLLM Colab link and fix Alpaca Eval quickstart #1530

Merged: 9 commits merged into main on Mar 11, 2025

Conversation

wizeng23 (Contributor)

Description

vLLM doesn't work on Colab because the available GPU is too old, so we shouldn't advertise this notebook. Also adds the results/ directory, where eval results are written, to the gitignore.
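The gitignore change described above might look like the following minimal sketch; the exact path is an assumption based on the description (a `results/` directory at the repository root):

```gitignore
# Ignore locally generated evaluation results (e.g. from Alpaca Eval runs)
results/
```

Placing the entry in the repository's top-level .gitignore keeps generated eval output out of version control without affecting tracked files.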

Before submitting

  • This PR only changes documentation. (You can ignore the following checks in that case)
  • Did you read the contributor guideline Pull Request guidelines?
  • Did you link the issue(s) related to this PR in the section above?
  • Did you add / update tests where needed?

@wizeng23 wizeng23 requested review from taenin, kaisopos and nikg4 March 11, 2025 01:25
@wizeng23 wizeng23 merged commit a609a51 into main Mar 11, 2025
3 checks passed
@wizeng23 wizeng23 deleted the wizeng/config branch March 11, 2025 01:39