fix: CrewAI-based flows with no extra openai #4683
Conversation
CodSpeed Performance Report: merging #4683 will degrade performance by 26.61%.
Misclicked on approve
I like this.
What do you say we put this logic in the BaseCrewComponent?
The LLM can be passed to the agent or to the Crew. Also, do you know if they support all LLMs we support?
This is the main problem right now, I think. I didn't see an easy way to find the "api_key" attribute in an arbitrary embeddings model (the LangChain version). For example, in OpenAI this is As for putting it in BaseCrewComponent, I think that's a great idea! I can update that.
@ogabrielluiz not sure if this would be a suitable solution in general, but we could do something like:

```python
def _find_api_key(self, model):
    """Attempt to find the API key attribute on a LangChain LLM model instance.

    Args:
        model: LangChain LLM model instance.

    Returns:
        The API key if found, otherwise None.
    """
    # Possible API key attribute name fragments
    key_patterns = ["api_key", "key", "token"]
    # Iterate over the model's attributes
    for attr in dir(model):
        # Check if the attribute name contains any of the key patterns
        if any(pattern in attr.lower() for pattern in key_patterns):
            value = getattr(model, attr, None)
            # Return the first non-empty string value
            if isinstance(value, str) and value:
                return value
    return None
```
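To illustrate how the heuristic behaves, here is a standalone sketch: `find_api_key` is a module-level version of the method above, and `FakeOpenAIModel` is a stand-in class, not a real LangChain model.

```python
class FakeOpenAIModel:
    """Stand-in for a LangChain model that keeps its key on an attribute."""

    def __init__(self):
        self.openai_api_key = "sk-test"  # matches the "key" pattern
        self.temperature = 0.7           # ignored: no pattern match


def find_api_key(model):
    """Scan a model's attributes for something that looks like an API key."""
    key_patterns = ["api_key", "key", "token"]
    for attr in dir(model):
        if any(pattern in attr.lower() for pattern in key_patterns):
            value = getattr(model, attr, None)
            if isinstance(value, str) and value:
                return value
    return None


print(find_api_key(FakeOpenAIModel()))  # sk-test
print(find_api_key(object()))           # None
```

Note the trade-off: the pattern match is purely name-based, so an unrelated string attribute containing "key" would be returned as a false positive.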
I think our best bet for now would be to downgrade crewai to a version where it worked. What do you think?
I could be wrong about this, but I believe one of the reasons for upgrading crewai was the move to langchain~=0.3.x, so I agree it's the best solution; if that's correct, though, it may not be an easy option. I'll double-check that.
I see... Well, I guess for now we focus on the short-term solution. Maybe if the user passes a model that we haven't added support for yet, raise an error.
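The "raise an error for unsupported models" idea could look roughly like this. This is a minimal sketch: the allow-list contents and the function name are assumptions, not the PR's actual code.

```python
# Hypothetical allow-list; the real supported set would live in the component code.
SUPPORTED_MODEL_TYPES = ("ChatOpenAI", "AzureChatOpenAI", "ChatAnthropic")


def ensure_supported(model):
    """Raise a clear error when the model type has no CrewAI mapping yet."""
    name = type(model).__name__
    if name not in SUPPORTED_MODEL_TYPES:
        raise ValueError(
            f"Model type {name!r} is not yet supported for CrewAI components; "
            f"supported types: {', '.join(SUPPORTED_MODEL_TYPES)}"
        )
    return model
```

Failing fast with an explicit message beats silently passing an incompatible LLM through to CrewAI, where the error would surface far from its cause.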
I just updated the PR to factor the code a little better. I search for an [...]. If only there was some sort of [...]
They use litellm, which could be OK to integrate later. @phact has experience with it.
* fix: CrewAI-based flows with no extra openai * [autofix.ci] apply automated fixes * Clean up the location of the crewai model processing * [autofix.ci] apply automated fixes * Properly subclass the tasks and agents method --------- Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
This pull request creates LLM() and Tool() objects that are CrewAI-compatible, based on the components in the CrewAI-based flow (including the OpenAI API key, etc.).
@ogabrielluiz this fixes the crewai issues in my experience. The root problem is that we were (and are) attempting to use tools and agent LLMs that are not in the format CrewAI expects. This PR attempts to build the CrewAI versions from the specifications of the input tool and LLM. It's not perfect, as you'll see, but I think it might be a better short-term solution than setting the env like you pointed out. Let me know what you think. CC @NadirJ: note that I did upgrade CrewAI to the latest release with this, but it is also compatible with the prior release.
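The conversion the PR describes can be sketched as follows. `CrewCompatibleLLM` here is a stand-in dataclass, not the real crewai class, and the attribute names read off the LangChain model (`model_name`, `openai_api_key`, `temperature`) are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class CrewCompatibleLLM:
    """Stand-in for the CrewAI-native LLM object the PR constructs."""
    model: str
    api_key: Optional[str] = None
    temperature: Optional[float] = None


def to_crew_llm(langchain_model):
    """Copy the relevant spec off a LangChain-style model onto the CrewAI shape."""
    return CrewCompatibleLLM(
        model=getattr(langchain_model, "model_name", "unknown"),
        api_key=getattr(langchain_model, "openai_api_key", None),
        temperature=getattr(langchain_model, "temperature", None),
    )
```

The `getattr` defaults mean a model missing one of these attributes still produces a usable object rather than an AttributeError, which matches the "best effort" tone of the PR description.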