I managed to use Ollama with Process.hierarchical; this is my config:
from crewai import Crew, LLM, Process
from crewai.project import CrewBase, crew

manager = LLM(model="ollama/phi4", base_url="http://localhost:11434")

@CrewBase
class DocAutomation:
    @crew
    def crew(self) -> Crew:
        """Creates the DocAutomation crew"""
        # To learn how to add knowledge sources to your crew, check out the documentation:
        # https://docs.crewai.com/concepts/knowledge#what-is-knowledge
        return Crew(
            agents=self.agents,  # Automatically created by the @agent decorator
            tasks=self.tasks,    # Automatically created by the @task decorator
            # process=Process.sequential,
            process=Process.hierarchical,
            manager_llm=manager,
            manager_agent=None,
            verbose=True,
            # planning=True,
        )
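Before blaming the process type, it can help to confirm that the Ollama server is actually reachable at the configured base_url. A minimal stdlib-only sketch, assuming a standard Ollama install (the helper name ollama_reachable is hypothetical; /api/tags is Ollama's endpoint for listing local models):

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    Probes the /api/tags endpoint, which lists locally pulled models.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False

# Example: check the default local endpoint before constructing the LLM.
# ollama_reachable("http://localhost:11434")
```

If this returns False, the failure is environmental rather than specific to the hierarchical process.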
Description
Ollama cannot be used with process=Process.hierarchical, but if I switch to Process.sequential, the same crew works.
Steps to Reproduce
Expected behavior
The hierarchical process should work with Ollama the same way it does with other LLM providers.
Screenshots/Code snippets
Input
Output
Operating System
Other (specify in additional context)
Python Version
3.12
crewAI Version
0.86.0
crewAI Tools Version
0.25.8
Virtual Environment
Venv
Evidence
Possible Solution
None
Additional context
OS: macOS 15 Sequoia