
Conversation

@Trinity-SYT-SECURITY

Added ollama settings; I will also add troubleshooting content in the future.


~~~shell
┌──(venv)─(root㉿kali)-[/home/kali/Downloads/hackingBuddyGPT]
└─# cd src/hackingBuddyGPT/cli
~~~

Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

hm, have you tried using the latest hackingBuddyGPT version (0.3.1)? You can install it with `pip install hackingBuddyGPT` and then just call `hackingBuddyGPT` from the command line. This might make the setup easier!
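For reference, a minimal sketch of that suggestion; the package name and the `hackingBuddyGPT` entry point are taken from the comment above, and running it without arguments is assumed to print the use-case listing the same way `python wintermute.py` does below:

~~~shell
# install the released package instead of running from a source checkout
pip install hackingBuddyGPT

# the entry point is then available on PATH; without arguments it is
# assumed to print the available use-cases, like wintermute.py below
hackingBuddyGPT
~~~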

~~~shell
┌──(venv)─(root㉿kali)-[/home/…/hackingBuddyGPT/src/hackingBuddyGPT/cli]
└─# python wintermute.py
usage: wintermute.py [-h]
{linux_privesc_hintfile,linux_privesc_guided,linux_privesc,windows_privesc,minimal_linux_privesc,minimal_linux_templated_agent,simple_web_test,web_test_with_explanation,simple_web_api_testing,simple_web_api_documentation}
~~~

Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Those are named differently by now. Would it be possible for you to do the same with the latest version of hackingBuddyGPT? Functionally, nothing (or not much) should change, but you should be able to call `hackingBuddyGPT` directly, and the use-cases now have different names.
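A hedged sketch of what re-running the walkthrough against the released CLI could look like; `<use-case-name>` is a placeholder for whatever the chosen use-case is currently called, and the `-h` listing is an assumption based on the argparse usage quoted above:

~~~shell
# list the use-cases under their current names in the installed release
hackingBuddyGPT -h

# re-run the walkthrough against one of them; <use-case-name> is a
# placeholder, substitute a name from the listing above
hackingBuddyGPT <use-case-name>
~~~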

@Trinity-SYT-SECURITY

I will go back to confirm and modify the file again, thank you for the reminder!!


~~~shell
# then copy the selected model into gpt-3.5-turbo
ollama cp gdisney/mistral-uncensored gpt-3.5-turbo
~~~

Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

This will break some of the internal logic of hackingBuddyGPT (e.g., token estimation, which uses the given model name to select the tokenizer). Maybe mention that this is not recommended?
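A possible alternative, sketched here with heavy assumptions: keep the model under its real name and tell hackingBuddyGPT which model and endpoint to talk to, instead of renaming it to gpt-3.5-turbo. The `--llm.api_url` and `--llm.model` parameter names (and whether the base URL needs a path suffix) are assumptions about the current CLI, not confirmed against 0.3.1:

~~~shell
# instead of: ollama cp gdisney/mistral-uncensored gpt-3.5-turbo
# pass the real model name and the local ollama endpoint to the tool
# (flag names are assumptions; check `hackingBuddyGPT -h` for the real ones)
hackingBuddyGPT <use-case-name> \
  --llm.api_url=http://localhost:11434 \
  --llm.model=gdisney/mistral-uncensored
~~~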
