Remove @/lib/encryption from route due to IV error #90
Conversation
Hello @mfbloom, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
Summary of Changes
This pull request aims to remove encryption from the route due to an IV error, which was preventing local LLMs from working correctly. I have removed the encryption and decryption logic from the specified files, specifically in the message route and LLM provider route. This change effectively disables API key encryption when the deployment environment is local, allowing local LLMs to function without the need for decryption.
Highlights
- Encryption Removal: I have removed the encryption and decryption functions from the message route (`src/app/api/chat-rooms/[id]/messages/route.ts`) to resolve issues with local LLMs.
- API Key Handling: I have modified the API key handling logic to use the API key directly, without decryption, when running in a local environment.
- LLM Provider Update: I have updated the LLM provider route (`src/app/api/llm-providers/[id]/route.ts`) so that the API key is no longer encrypted during updates.
Changelog
- `src/app/api/chat-rooms/[id]/messages/route.ts`
  - Removed the import of `decrypt` from `@/lib/encryption` (line 4).
  - Modified the API key assignment logic to use `llmProvider.apiKey` directly when the environment is local (line 92), instead of decrypting it.
- `src/app/api/llm-providers/[id]/route.ts`
  - Removed the import of `encrypt` from `@/lib/encryption` (line 4).
  - Modified the PATCH request handler to assign the API key directly, without encryption (line 21).
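The changelog above can be condensed into a minimal before/after sketch of the key-resolution logic. This is an illustration only: the type and variable names (`LlmProvider`, `currentDeploymentEnv`, `resolveApiKey`) are assumptions based on the diff under review, not the repository's exact code.

```typescript
// Minimal sketch of the message-route change; names are assumed, not the repo's exact code.
type LlmProvider = { apiKey: string };

function resolveApiKey(currentDeploymentEnv: string, llmProvider: LlmProvider): string {
  let apiKey = "";
  if (currentDeploymentEnv === "local") {
    // Before this PR: apiKey = decrypt(llmProvider.apiKey), which failed with an
    // IV error when the stored value was not actually ciphertext.
    apiKey = llmProvider.apiKey;
  } else if (currentDeploymentEnv === "cloud") {
    // The cloud branch's real handling is not shown in this PR; pass-through here
    // is a placeholder for illustration only.
    apiKey = llmProvider.apiKey;
  }
  return apiKey;
}
```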
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.
You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
The cipher's hold,
Loosed for local's command,
LLM takes flight.
Footnotes
¹ Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution. ↩
Code Review
The pull request aims to remove encryption from the route due to an IV error, which is causing issues with local LLMs. The changes involve removing encryption-related code from the messages route and the llm-providers route. Overall, the changes seem straightforward and address the stated problem.
Summary of Findings
- Security Implications of Removing Encryption: Removing encryption, even temporarily, can expose sensitive API keys. It's crucial to understand the security implications and implement alternative safeguards if necessary. Consider adding a warning or documentation about the lack of encryption in local environments.
- Conditional Encryption: The code now conditionally applies encryption based on the deployment environment. While this addresses the immediate issue, it introduces complexity. Ensure that the deployment environment is correctly configured and that the encryption logic is thoroughly tested in cloud environments.
Merge Readiness
The pull request addresses the stated issue of IV errors with local LLMs by removing encryption. However, the removal of encryption introduces potential security concerns, especially around API key exposure. While I am unable to directly approve the pull request, I recommend carefully considering the security implications and ensuring that appropriate safeguards are in place before merging. At a minimum, I would recommend adding a comment or documentation about the lack of encryption in local environments. It is important to have others review and approve this code before merging.
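To make the "IV error" concrete: the repo's `@/lib/encryption` is not shown in this PR, but a common pattern it plausibly follows is AES-256-CBC with a random IV, storing ciphertext as `"<iv-hex>:<ciphertext-hex>"`. The sketch below (an assumption about the scheme, not the actual library code) shows why decrypting a value that was never encrypted fails with an invalid-IV error.

```typescript
// Plausible reconstruction of an encrypt/decrypt scheme like @/lib/encryption's
// (assumed; the repo's actual implementation is not shown in this PR).
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const key = randomBytes(32); // in practice this would come from an environment secret

function encrypt(plain: string): string {
  const iv = randomBytes(16);
  const cipher = createCipheriv("aes-256-cbc", key, iv);
  const data = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  return `${iv.toString("hex")}:${data.toString("hex")}`;
}

function decrypt(stored: string): string {
  const [ivHex, dataHex] = stored.split(":");
  // If `stored` was never encrypted, ivHex does not decode to 16 bytes of hex,
  // and createDecipheriv throws "Invalid initialization vector" -- an IV error.
  const decipher = createDecipheriv("aes-256-cbc", key, Buffer.from(ivHex, "hex"));
  return Buffer.concat([
    decipher.update(Buffer.from(dataHex, "hex")),
    decipher.final(),
  ]).toString("utf8");
}
```

Round-tripping an actually-encrypted value works, but calling `decrypt` on a plaintext key (one saved without encryption, as this PR now does) throws, which matches the failure mode described. A longer-term fix would keep storage and retrieval symmetric per environment rather than removing encryption outright.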
```diff
 if (currentDeploymentEnv === 'local') {
-  apiKey = decrypt(llmProvider.apiKey)
+  apiKey = llmProvider.apiKey
 } else if (currentDeploymentEnv === 'cloud') {
```
Consider adding a comment here explaining why encryption is disabled in the local environment. This will help future developers understand the reasoning behind this conditional logic and potential security implications.
```suggestion
apiKey = llmProvider.apiKey // Encryption disabled in local environment due to IV error
```

```diff
 // API key encryption
-if (body.apiKey) body.apiKey = encrypt(body.apiKey)
+if (body.apiKey) body.apiKey = body.apiKey
```
Consider adding a comment explaining why encryption is bypassed here. This is important for maintainability and understanding the code's behavior in different environments.
```diff
 // API key encryption
-if (body.apiKey) body.apiKey = body.apiKey
+if (body.apiKey) body.apiKey = body.apiKey // Encryption bypassed due to IV error in local LLMs
```
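One small observation on this hunk: `if (body.apiKey) body.apiKey = body.apiKey` is a guarded self-assignment and therefore a no-op, so the line could be deleted entirely. The sketch below is a hypothetical simplification (the `PatchBody` type and `prepareUpdate` helper are illustrative names, not code from the PR):

```typescript
// Hypothetical simplification of the PATCH handler's key handling.
// The guarded self-assignment in the PR is a no-op, so passing the body
// through unchanged is equivalent and clearer.
type PatchBody = { apiKey?: string; [key: string]: unknown };

function prepareUpdate(body: PatchBody): PatchBody {
  // Before the PR: if (body.apiKey) body.apiKey = encrypt(body.apiKey)
  // After the PR:  the key is stored as provided, unencrypted.
  return body;
}
```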
bisonbet left a comment
I made these changes and they seem to work as expected.
Don't forget to also close the related issue! :) #87
name: 🔄 Pull Request
about: Create a pull request for OpenHealth
What's changed?
Removed the encryption routine to allow local LLMs to work again.
✨ Screenshots (optional)