Additional Observations:
8 Matching Annotations
- Oct 2025
- github.com
- I also just ran into this issue after cloning from master a few hours ago: message_agent went over the limit once, after which subsequent calls also failed. Telling the system to delete and re-create the agent got it past the bottleneck. Maybe some way to restrict the history provided to sub-agents would work? (A rough sketch of that idea follows at the end of the Oct 2025 notes.)
- Basically the max is 8192 tokens in this context; lowering it forces longer content to be split into smaller chunks, i.e. def split_text(text: str, max_length: int = 4192) -> Generator[str, None, None]: would, I believe, split anything above that length. It's wired into messages and other functions. (A chunking sketch along these lines also follows at the end of the Oct 2025 notes.)
- github.com
- I'm also running into this problem. I've confirmed that writing to the workspace from within Docker yields the expected result, so I know it isn't a problem with my use of Docker.
- github.com
- Thanks for the tip. I had to enable Virtual Machine support in the BIOS to get Docker running (...)! I believe it worked! One strange thing, though, as you can see: it first states that it can't find the file, then proceeds to read the output of the file anyway (meaning it did find it):
  Executing file 'generate_dinner_recipe.py' in workspace 'auto_gpt_workspace'
  [2023-04-07T03:22:43.847792900Z][docker-credential-desktop.EXE][W] Windows version might not be up-to-date: The system cannot find the file specified.
  SYSTEM: Command execute_python_file returned: (...)
  BUT, I can now read the output from executed files, which feels amazing; this was a big step, and THANK you.
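Regarding the message_agent annotations above: here is a minimal sketch of what restricting the history handed to a sub-agent might look like. It is an illustration of the idea, not the project's actual code; count_tokens is a hypothetical stand-in for whatever tokenizer the repository uses (e.g. tiktoken).

```python
from typing import Dict, List

def count_tokens(text: str) -> int:
    # Hypothetical stand-in: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages: List[Dict[str, str]], budget: int = 2048) -> List[Dict[str, str]]:
    """Keep only the most recent messages that still fit within a token budget."""
    kept: List[Dict[str, str]] = []
    used = 0
    for msg in reversed(messages):  # walk from newest to oldest
        cost = count_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order
```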
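And here is a rough sketch of the kind of chunking the quoted split_text signature implies. Note that the default of 4192 reads like a character count rather than tokens, and this is only an illustration under that assumption, not the repository's implementation.

```python
from typing import Generator

def split_text(text: str, max_length: int = 4192) -> Generator[str, None, None]:
    """Yield chunks of text no longer than max_length characters,
    breaking on line boundaries where possible."""
    current = ""
    for line in text.splitlines(keepends=True):
        # Hard-split any single line that is longer than the limit by itself.
        while len(line) > max_length:
            if current:
                yield current
                current = ""
            yield line[:max_length]
            line = line[max_length:]
        # Start a new chunk when adding this line would overflow the limit.
        if len(current) + len(line) > max_length:
            yield current
            current = ""
        current += line
    if current:
        yield current
```

Lowering max_length (e.g. to 3000) simply produces more, smaller chunks for the downstream message calls to consume.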
- Sep 2025
- github.com
- Hehe, just remove the software and then there's no problem. That doesn't really fix anything, though. If you're looking for a fix, you could check out my previous comment.
- github.com
- But I tried Pinecone, and it seems useless… I will try again.
- Aug 2025
- github.com
- The solution where I installed tf_keras worked for that section, but I'm encountering a similar error in the "Attach a classification head" section of the same notebook, and the previous fix does not seem to work there. (A possible workaround is sketched below.)
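On the last annotation: if the error in the "Attach a classification head" section is the common Keras 3 / TensorFlow Hub incompatibility (an assumption, since the notebook and traceback aren't shown here), one workaround is to make tf.keras resolve to the legacy Keras 2 implementation from the tf-keras package before TensorFlow is imported. The Hub module URL and class count below are placeholders, not the notebook's actual values.

```python
import os
# Must be set before TensorFlow is imported; requires the tf-keras package.
os.environ["TF_USE_LEGACY_KERAS"] = "1"

import tensorflow as tf
import tensorflow_hub as hub

# Placeholder feature extractor; substitute the module URL the notebook actually uses.
feature_extractor = hub.KerasLayer(
    "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/5",
    input_shape=(224, 224, 3),
    trainable=False,
)

# Attach a classification head on top of the frozen feature extractor.
model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(5, activation="softmax"),  # 5 = placeholder class count
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```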