feat(context optimization): Optimize LLM Context Management and File Handling #578
Conversation
@thecodacus I am going to review this over the next couple of days. This is absolutely fantastic and I want to give it some proper testing!
Fantastic work @thecodacus!! I took a look at everything and tested it on my end.
I think this is ready to merge; I just have a couple of questions:
- Do you intend to remove all the debug messages in the terminal (the ones that print out `processedMessages`)? They were super helpful for me to see what is happening behind the scenes, by the way, but I'm thinking it would be good to remove them before merging.
- Which LLMs have you tested this with? I tested with Qwen 2.5 Coder 32B and seemed to get suboptimal results compared to what I usually get, but then again you never know with local LLMs sometimes; it could be a fluke.
I tested with some local models (7B and Llama 3.2); not much difference there. I also tested with GPT-4o and Claude Sonnet and got almost identical output.
@wonderwhy-er, since this is a small architectural change, I'd like to confirm with you: does this conflict with any changes you have planned for the future? If not, I will merge.
I like this a lot, but yes, the console logging is very verbose. The way I would imagine handling this is to disable console logging for production builds, so the build command would turn it off (or something like that). For development and tracking down issues, this is great!
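The suggestion above could be sketched as a small build-gated logger. This is a hypothetical helper (`debugLog` is not part of the PR), assuming a Node-style `NODE_ENV` check; a Vite-style `import.meta.env.PROD` flag would work the same way:

```typescript
// Hypothetical debug logger: a no-op in production builds, verbose in dev.
const DEBUG = process.env.NODE_ENV !== 'production';

export function debugLog(label: string, payload: unknown): string | undefined {
  if (!DEBUG) {
    return undefined; // silenced when built for production
  }
  const line = `[debug] ${label}: ${JSON.stringify(payload)}`;
  console.log(line);
  return line;
}

// Example: the processedMessages dump mentioned above would become
debugLog('processedMessages', [{ role: 'user', content: 'hello' }]);
```

A bundler-level alternative is to have the production build strip `console.*` calls entirely (many bundlers support dropping console calls at build time), which avoids touching call sites.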
Optimize LLM Context Management and File Handling
Overview
This PR significantly improves how we manage LLM context and file handling in chat interactions. The changes optimize memory usage, extend chat context length, and provide a single source of truth for file content, resulting in more reliable and efficient AI operations.
Key Changes
1. Context Optimization
2. Chat History Management
3. Workbench Improvements
Benefits
Technical Details
- Added `createFilesContext` function to generate structured file contexts
- Added `simplifyBoltActions` for optimizing chat history
Testing
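The two helpers above could look roughly like the following. These are sketches only; the PR's actual signatures, tag names, and file-map shape are assumptions, not the merged implementation:

```typescript
// Assumed shape: a map from file path to the latest file content.
type FileMap = Record<string, string>;

// Sketch of createFilesContext: render the current file map as one
// structured context block, so the latest contents appear exactly once
// (a single source of truth) instead of being repeated per message.
export function createFilesContext(files: FileMap): string {
  const sections = Object.entries(files).map(
    ([path, content]) => `<file path="${path}">\n${content}\n</file>`,
  );
  return `<files_context>\n${sections.join('\n')}\n</files_context>`;
}

// Sketch of simplifyBoltActions: elide stale file contents from earlier
// messages in the chat history, since the files context is authoritative.
export function simplifyBoltActions(messages: string[]): string[] {
  return messages.map((m) =>
    m.replace(
      /<boltAction type="file"([^>]*)>[\s\S]*?<\/boltAction>/g,
      '<boltAction type="file"$1><!-- content elided --></boltAction>',
    ),
  );
}
```

Together these would shrink the token footprint of long chats: history keeps only action markers, while the full file state travels in one compact context block.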
Migration Notes
No breaking changes. Existing chat interactions will automatically benefit from the optimizations.
Future Improvements
Preview
Context.Optimization.demo.mp4