
vLLM version increment #1502

Merged 4 commits from nikg4/vllm-ver into main on Mar 4, 2025

Conversation

@nikg4 (Collaborator) commented Mar 3, 2025

Description

• Also ran manual tests with `oumi infer -i -c configs/recipes/llama3_2/inference/1b_vllm_infer.yaml` and `oumi infer -i -c configs/recipes/vision/llava_7b/inference/vllm_infer.yaml --image=<...>`; see the fenced commands below.
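For reference, the two manual test invocations, exactly as listed above (the `--image` value is elided in the original):

```shell
# Interactive inference with the Llama 3.2 1B vLLM recipe
oumi infer -i -c configs/recipes/llama3_2/inference/1b_vllm_infer.yaml

# Interactive inference with the LLaVA 7B vision recipe, passing an image
oumi infer -i -c configs/recipes/vision/llava_7b/inference/vllm_infer.yaml --image=<...>
```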

Related issues

Fixes OPE-1106

Before submitting

  • This PR only changes documentation. (You can ignore the following checks in that case)
  • Did you read the contributor guideline Pull Request guidelines?
  • Did you link the issue(s) related to this PR in the section above?
  • Did you add / update tests where needed?

Reviewers

At least one review from a member of oumi-ai/oumi-staff is required.

@nikg4 marked this pull request as ready for review on March 3, 2025, 22:58
@wizeng23 (Contributor) left a comment

Could you also update `configs/examples/bulk_inference/gcp_job.yaml`? I think you can remove the version requirement there.

@wizeng23 (Contributor) commented Mar 3, 2025

Note that our version requirements are only used for CI targets, so those should be the only things affected. We don't have GCP jobs for vLLM aside from the one I mentioned above, so users installing vLLM will get the latest version.
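To illustrate the change being discussed, here is a hypothetical before/after of the vLLM install line in the job's setup script; the actual contents of `configs/examples/bulk_inference/gcp_job.yaml` are not shown in this PR, and the version number below is illustrative only:

```shell
# Hypothetical sketch; the real file contents and pinned version are not shown in this PR.

# Before: vLLM pinned to a specific release, used only for CI reproducibility
pip install "vllm==0.7.2"

# After: pin removed, so the job installs the latest vLLM release
pip install vllm
```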

@nikg4 (Collaborator, Author) commented Mar 4, 2025

> Could you also update `configs/examples/bulk_inference/gcp_job.yaml`? I think you can remove the version requirement there.

Updated.

@nikg4 merged commit 6590974 into main on Mar 4, 2025
2 checks passed
@nikg4 deleted the nikg4/vllm-ver branch on March 4, 2025, 00:32