
[Tiny] Add more warnings for "special" requirements of Qwen2.5-VL #1453

Merged
1 commit merged into main from optas/qwen_vl_micro on Feb 20, 2025

Conversation

optas
Contributor

@optas optas commented Feb 20, 2025

Description

Adds explicit user warnings to the configuration and README files noting that Qwen2.5-VL-3B depends on the latest dev version of transformers.
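
For context, a minimal sketch of the kind of runtime check such a warning points users toward; the minimum version string below is an assumption (the PR does not state an exact version), since at the time Qwen2.5-VL support was only available in a dev build of transformers installed from source:

```python
# Hypothetical version check (not part of this PR); requires the
# `packaging` and `transformers` packages to be installed.
from packaging import version

import transformers

# Assumed placeholder: the exact dev version needed for Qwen2.5-VL
# support is not specified in the PR.
MIN_TRANSFORMERS_VERSION = "4.49.0.dev0"

if version.parse(transformers.__version__) < version.parse(MIN_TRANSFORMERS_VERSION):
    raise RuntimeError(
        "Qwen2.5-VL-3B requires transformers >= "
        f"{MIN_TRANSFORMERS_VERSION} (found {transformers.__version__}). "
        "Install the dev version from source, e.g. "
        "`pip install git+https://github.com/huggingface/transformers.git`."
    )
```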

Related issues

Towards: OPE-988

Before submitting

  • This PR only changes documentation. (You can ignore the following checks in that case)
  • Did you read the contributor guideline (Pull Request guidelines)?
  • Did you link the issue(s) related to this PR in the section above?
  • Did you add / update tests where needed?

Reviewers

At least one review from a member of oumi-ai/oumi-staff is required.

@optas optas requested review from taenin and nikg4 February 20, 2025 02:32
Collaborator

@taenin taenin left a comment

LGTM! Do we need to add this for any other models?

@optas
Contributor Author

optas commented Feb 20, 2025

LGTM! Do we need to add this for any other models?

Thanks for pointing this out, @taenin. I think such an aggressive deviation (the "dev transformers" requirement) is unnecessary for other models, but I have heard from @xrdaukar that some of the VL models we have tried have their own "peculiarities". Writing these things down in the .md files for easy reference is a good idea. Will follow up with @xrdaukar.

@optas optas merged commit eee596f into main Feb 20, 2025
1 check passed
@optas optas deleted the optas/qwen_vl_micro branch February 20, 2025 05:50