
Switch from tiny to smollm:135m #891

Merged 1 commit into main on Feb 27, 2025
Conversation

@ericcurtin (Collaborator) commented Feb 26, 2025

This is probably a consequence of my slow network, but I switched to smollm:135m; it is easier for demos, and tiny was taking too long to download.

Summary by Sourcery

Switch the default model in the demo script from tiny to smollm:135m to improve the out-of-the-box experience for users with slower networks.
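The change itself is a straightforward name swap across the script. A minimal sketch of the substitution, shown here on a hypothetical stand-in copy of the script (the actual PR edits docs/demo/ramalama.sh by hand):

```shell
# Stand-in for docs/demo/ramalama.sh with the old default model name.
printf 'ramalama pull tiny\nramalama run tiny\n' > demo-copy.sh
# Replace every occurrence of 'tiny' with 'smollm:135m', as the PR does.
sed 's/tiny/smollm:135m/g' demo-copy.sh > demo-copy.sh.new
mv demo-copy.sh.new demo-copy.sh
cat demo-copy.sh
```

A blanket sed like this would also touch any unrelated occurrence of the substring "tiny", which is why a hand edit of the real script is the safer route.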

@sourcery-ai bot (Contributor) commented Feb 26, 2025

Reviewer's Guide by Sourcery

This pull request switches the default model used in the demo script from 'tiny' to 'smollm:135m'. The change involves updating the model name in various commands and messages within the script. Additionally, the pull request updates the shellcheck command in the Makefile to include nested directories.

Sequence diagram for pulling the model

sequenceDiagram
    participant User
    participant ramalama script
    participant ramalama CLI
    participant Ollama

    User->>ramalama script: Executes ramalama.sh pull
    ramalama script->>ramalama CLI: ramalama rm --ignore smollm:135m
    ramalama CLI-->>ramalama script: Removes the model (if it exists)
    ramalama script->>ramalama CLI: ramalama pull smollm:135m
    ramalama CLI->>Ollama: Downloads smollm:135m model
    Ollama-->>ramalama CLI: Model downloaded
    ramalama CLI-->>ramalama script: Model pulled
    ramalama script->>ramalama CLI: ramalama ls
    ramalama CLI-->>ramalama script: Lists models, grep for smollm:135m
    ramalama script->>ramalama CLI: podman images
    ramalama CLI-->>ramalama script: Lists container images, grep for ramalama

Sequence diagram for converting the model to OCI content

sequenceDiagram
    participant User
    participant ramalama script
    participant ramalama CLI
    participant podman

    User->>ramalama script: Executes ramalama.sh kubernetes
    ramalama script->>ramalama CLI: ramalama convert smollm:135m quay.io/ramalama/smollm:1.0
    ramalama CLI->>podman: Converts smollm:135m model to OCI content
    podman-->>ramalama CLI: OCI content created
    ramalama CLI-->>ramalama script: Model converted
    ramalama script->>ramalama CLI: podman images
    ramalama CLI->>podman: Lists container images, grep for quay.io/ramalama/smollm
    podman-->>ramalama CLI: Lists images
    ramalama script->>ramalama CLI: ramalama serve --generate kube --name smollm-service oci://quay.io/ramalama/smollm:1.0
    ramalama CLI-->>ramalama script: Generates kubernetes YAML file

File-Level Changes

  • docs/demo/ramalama.sh: Switched the default model from 'tiny' to 'smollm:135m' in the demo script.
      • Replaced all instances of 'tiny' with 'smollm:135m' in the script's commands and messages.
      • Updated the kubernetes and quadlet sections to reflect the new model name.
  • Makefile: Corrected the shellcheck command to include nested directories.
  • docs/demo/ramalama.sh: Replaced deprecated usage of the command and read builtins.
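The Makefile fix means shellcheck now reaches scripts in subdirectories such as docs/demo/, not only top-level *.sh files. A sketch of one common way to do that (the exact Makefile rule is not shown in this PR page, so the find-based form below is an assumption), demonstrated on a hypothetical stand-in tree and guarded so it is a no-op where shellcheck is not installed:

```shell
# Build a stand-in tree mirroring the repo layout this PR cares about.
mkdir -p lint-demo/docs/demo
printf '#!/bin/sh\necho hello\n' > lint-demo/docs/demo/ramalama.sh
cd lint-demo
# Recursively collect every *.sh file and hand the batch to shellcheck.
if command -v shellcheck >/dev/null 2>&1; then
    find . -name '*.sh' -exec shellcheck {} +
fi
# List what the recursive find picks up (includes the nested script).
find . -name '*.sh'
cd ..
```

With `-exec ... {} +`, find batches the file names onto one shellcheck invocation, so a single failing script fails the whole make target.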


@sourcery-ai bot (Contributor) left a comment

Hey @ericcurtin - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Consider renaming ramalama.sh to ramalama_smollm.sh or similar to reflect the specific model it uses.
  • The ramalama rm --ignore commands could be simplified to ramalama rm smollm:135m.
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


This is probably a consequence of my slow network, but I switched
to smollm:135m, it's easier for demos. tiny was taking too long
to download.

Signed-off-by: Eric Curtin <[email protected]>
@rhatdan (Member) commented Feb 27, 2025

LGTM

@rhatdan rhatdan merged commit ff258ae into main Feb 27, 2025
17 checks passed
2 participants