Replies: 2 comments
-
Hi there, thanks for the idea!
-
Thank you for your response. I completely agree with you regarding these potential security considerations. I have another question, please: is it possible to use a different large language model that is open source and available for download, such as Falcon-40B-Instruct from TII or LLaMA from Meta AI, or any other open LLM on this leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard? As for the API, we could use the "LocalAI" repository at https://github.com/go-skynet/LocalAI, which can host an OpenAI-compatible API locally.
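As a rough illustration of that idea, here is a minimal sketch of talking to a LocalAI instance through the standard `openai` Python client. The endpoint `http://localhost:8080/v1`, the placeholder API key, and the model name `falcon-40b-instruct` are assumptions; use whatever host, port, and model name your LocalAI deployment is actually configured with.

```python
# Minimal sketch: point the OpenAI client at a LocalAI endpoint instead of api.openai.com.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed LocalAI OpenAI-compatible endpoint
    api_key="not-needed",                 # LocalAI typically does not require a real key
)

response = client.chat.completions.create(
    model="falcon-40b-instruct",          # assumed name; must match the model loaded in LocalAI
    messages=[
        {"role": "system", "content": "You are a penetration testing assistant."},
        {"role": "user", "content": "Suggest a first reconnaissance step for 10.0.0.5."},
    ],
)
print(response.choices[0].message.content)
```

Because LocalAI mirrors the OpenAI API surface, the same client code should work whether the backend is Falcon, LLaMA, or another model from the leaderboard, provided the model is registered under the name you pass in `model`.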
-
Can we incorporate PentestGPT into a fully automated pipeline, eliminating the need for user interaction? The idea is that the user starts the process by providing target information such as an IP address; PentestGPT then takes this input and begins testing. It generates candidate commands, selects the most suitable one(s), and passes them to a Python script that executes them. This cycle continues until the testing is complete, after which the report is saved to a designated location.
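A minimal sketch of that control loop is below, assuming the LocalAI endpoint from the previous comment. The `ask_llm` helper, the `DONE` stop marker, and the prompts are all hypothetical; PentestGPT does not ship such a pipeline, so this only illustrates the generate/execute/feed-back cycle and the report step. Executing model-generated shell commands is dangerous and would need sandboxing and validation in any real use.

```python
# Sketch of an automated generate -> execute -> feed-back loop (hypothetical helper names).
import subprocess
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")  # e.g. LocalAI

def ask_llm(history):
    """Ask the model for the next shell command (hypothetical helper)."""
    reply = client.chat.completions.create(model="falcon-40b-instruct", messages=history)
    return reply.choices[0].message.content.strip()

def run_pipeline(target_ip, max_steps=10, report_path="pentest_report.txt"):
    history = [
        {"role": "system",
         "content": "You are a penetration testing planner. Reply with exactly one "
                    "shell command per turn, or the single word DONE when finished."},
        {"role": "user", "content": f"Begin testing the target {target_ip}."},
    ]
    transcript = []
    for _ in range(max_steps):
        command = ask_llm(history)
        if command.upper() == "DONE":
            break
        # Execute the suggested command and capture its output.
        # WARNING: running unvalidated model output via shell=True is unsafe; sandbox this.
        result = subprocess.run(command, shell=True, capture_output=True,
                                text=True, timeout=300)
        output = result.stdout + result.stderr
        transcript.append((command, output))
        history.append({"role": "assistant", "content": command})
        history.append({"role": "user", "content": f"Command output:\n{output[:4000]}"})
    # Save a simple report of every command and its output.
    with open(report_path, "w") as f:
        for command, output in transcript:
            f.write(f"$ {command}\n{output}\n\n")

run_pipeline("10.0.0.5")
```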