
[BUG] Cannot use Ollama with process=Process.hierarchical #1863

Open

DiogoMartins004 opened this issue Jan 7, 2025 · 1 comment

Labels
bug Something isn't working
DiogoMartins004 commented Jan 7, 2025

Description

Ollama cannot be used with process=Process.hierarchical. If I change the process to sequential, the same configuration works.

Steps to Reproduce

  1. Run the crew configuration shown below
  2. The run fails with the OpenAI authentication error shown under Output

Expected behavior

The hierarchical process should work with Ollama the same way it does with other providers (and the same way the sequential process already does).

Screenshots/Code snippets

Input

crew = Crew(
    tasks=[task],
    agents=[researcher, writer],
    process=Process.hierarchical,
    respect_context_window=True,
    memory=True,
    manager_agent=manager,
    planning=True,
    verbose=True,
)

Output

raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 -
{
  'error':
  {
    'message': 'Incorrect API key provided: sk-proj-1111. You can find your API key at https://platform.openai.com/account/api-keys.',
    'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'
  }
}

Operating System

Other (specify in additional context)

Python Version

3.12

crewAI Version

0.86.0

crewAI Tools Version

0.25.8

Virtual Environment

Venv

Evidence

Same authentication traceback as shown under Output above.

Possible Solution

None

Additional context

OS: macOS 15 Sequoia

@DiogoMartins004 DiogoMartins004 added the bug Something isn't working label Jan 7, 2025
@douglaschalegre commented

I managed to use Ollama with Process.hierarchical; this is my config:

from crewai import Crew, LLM, Process
from crewai.project import CrewBase, crew

manager = LLM(model="ollama/phi4", base_url="http://localhost:11434")

@CrewBase
class DocAutomation:
    @crew
    def crew(self) -> Crew:
        """Creates the DocAutomation crew"""
        # To learn how to add knowledge sources to your crew, check out the documentation:
        # https://docs.crewai.com/concepts/knowledge#what-is-knowledge

        return Crew(
            agents=self.agents,  # Automatically created by the @agent decorator
            tasks=self.tasks,  # Automatically created by the @task decorator
            # process=Process.sequential,
            process=Process.hierarchical,
            manager_llm=manager,
            manager_agent=None,
            verbose=True,
            # planning=True,
        )
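The key difference from the original snippet is passing manager_llm explicitly with an "ollama/..."-prefixed model string. litellm-style model strings route by provider prefix; here is a minimal sketch of that split (parse_model is an illustrative helper, not litellm's actual API):

```python
def parse_model(model: str) -> tuple[str, str]:
    """Split a litellm-style model string into (provider, model name).

    Strings without a provider prefix (e.g. "gpt-4o") are treated as
    OpenAI by default in this sketch, which mirrors why an unset
    manager LLM can end up calling api.openai.com and failing with 401.
    """
    provider, sep, name = model.partition("/")
    if not sep:  # no prefix -> assume the OpenAI default route
        return ("openai", model)
    return (provider, name)

print(parse_model("ollama/phi4"))  # routed to the local Ollama server
print(parse_model("gpt-4o"))       # routed to OpenAI
```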
