feat(batch prediction): Adds tuned model support comments to samples. #13392
Conversation
Hello @br1mendez, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
Summary of Changes
Hello team, Gemini here to provide a quick summary of this pull request by @br1mendez. The primary goal of this PR, as indicated by the title, is to enhance the existing batch prediction samples by adding comments that guide users on how to specify a tuned model instead of the default base model. This change aims to make the samples more versatile and informative for users who have trained their own models.
Highlights
- Documentation via Comments: Adds inline comments to the batch prediction sample code to demonstrate the format required for using a tuned model resource path.
- Batch Prediction Samples: Specifically targets the Python samples for batch prediction using both BigQuery and Google Cloud Storage as input/output sources.
Changelog
- `genai/batch_prediction/batchpredict_with_bq.py`
  - Added comments explaining how to use a tuned model by providing the full resource path (`projects/{PROJECT_ID}/locations/{LOCATION}/models/{MODEL_ID}`) for the `model` parameter in the `client.batches.create` call.
- `genai/batch_prediction/batchpredict_with_gcs.py`
  - Added comments explaining how to use a tuned model by providing the full resource path (`projects/{PROJECT_ID}/locations/{LOCATION}/models/{MODEL_ID}`) for the `model` parameter in the `client.batches.create` call.
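For illustration, here is a minimal sketch of what the commented call might look like in the BigQuery sample, using the google-genai SDK. The client setup, model name, and BigQuery URIs are assumptions for the example, not the exact contents of `batchpredict_with_bq.py`:

```python
from google import genai
from google.genai.types import CreateBatchJobConfig

# Assumes the Vertex AI project and location are supplied via environment
# variables (GOOGLE_CLOUD_PROJECT, GOOGLE_CLOUD_LOCATION).
client = genai.Client(vertexai=True)

job = client.batches.create(
    # To use a tuned model, replace the base model name with its full
    # resource path: "projects/{PROJECT_ID}/locations/{LOCATION}/models/{MODEL_ID}"
    model="gemini-2.0-flash-001",
    # Hypothetical BigQuery input table containing the prompts to batch-process.
    src="bq://your-project.your_dataset.batch_requests",
    # Hypothetical BigQuery destination table for the predictions.
    config=CreateBatchJobConfig(dest="bq://your-project.your_dataset.batch_results"),
)
print(job.name, job.state)
```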
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a `.gemini/` folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.
You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Code comments appear,
Guiding users, making clear,
Tuned models can run.
Footnotes
1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.
Code Review
This pull request adds helpful comments to the batch prediction samples, guiding users on how to specify a tuned model. The changes are straightforward and improve the clarity of the code for this specific use case. Based on the review settings, I was instructed to only provide comments for issues of medium, high, or critical severity. My review found no issues meeting these criteria.
Summary of Findings
- No issues found at specified severity levels: My review found no issues of medium, high, or critical severity in the code changes. The added comments are clear, correctly formatted, and provide valuable guidance for users wanting to utilize tuned models in the batch prediction samples.
Merge Readiness
Based on my review, which found no issues of medium, high, or critical severity, the code appears to be in good shape and ready for merging. Please note that I am unable to approve this pull request directly; another reviewer will need to provide approval.
Description
Adds a comment that shows how to use tuned models in a batch prediction job.
Internal: b/417300996
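As a companion to the description above, here is a hedged sketch of the Cloud Storage variant of the same call; the bucket paths and model name are illustrative assumptions rather than the actual values in `batchpredict_with_gcs.py`:

```python
from google import genai
from google.genai.types import CreateBatchJobConfig

client = genai.Client(vertexai=True)  # assumes project/location come from env vars

job = client.batches.create(
    # Swap in "projects/{PROJECT_ID}/locations/{LOCATION}/models/{MODEL_ID}"
    # here to run the batch prediction job against a tuned model.
    model="gemini-2.0-flash-001",
    # Hypothetical JSONL request file in Cloud Storage and output prefix.
    src="gs://your-bucket/batch_requests.jsonl",
    config=CreateBatchJobConfig(dest="gs://your-bucket/batch_results/"),
)
print(job.name, job.state)
```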
Checklist
- `nox -s py-3.9` (see Test Environment Setup)
- `nox -s lint` (see Test Environment Setup)