
Tool use fails with error 'Tool' object has no attribute 'name' #1109

Open
kjw227 opened this issue Feb 4, 2025 · 0 comments

Hi,

I'm trying to use the tools feature in guidance to track each time a grammar function is called (further down the line I'd like to use this functionality to implement on-the-fly interpreters for programs generated by LLMs).

The bug

The problem is that tool calls are not working. A minimal example:

To reproduce

import guidance
from guidance import Tool, gen, select

MODELFILE = 'codellama/CodeLlama-7b-hf'

@guidance(stateless=True)
def number(lm):
    # Recursive grammar: a number is a single digit or two numbers concatenated.
    base = select(['0', '2', '5'])
    rec = number() + number()
    return lm + select([base, rec], name='tool_arg')

@guidance
def numbertool(lm):
    # Tool callback: log the argument captured by the grammar above.
    exp = lm['tool_arg']
    print('TOOLING: ' + str(exp))
    return lm

def get_model(model):
    return guidance.models.Transformers(model, device_map='auto')

if __name__ == '__main__':
    model = get_model(MODELFILE)
    prompt = 'How many states are there in the United States? '
    ntool = Tool(number(), numbertool)
    query_result = model + prompt + gen(max_tokens=10, tools=[ntool], stop=' ')
    print(query_result)

What I expected to see was TOOLING: n logged to the console for each number generation (for example, if the LLM generates 50, I would expect to see TOOLING: 5\nTOOLING: 0\nTOOLING: 50). Instead, I get the following error trace:

Traceback (most recent call last):
  File "/home/ubuntu/llmtest/ihatellms/grammar_test.py", line 27, in <module>
    query_result = model + prompt + gen(max_tokens=10, tools=[ntool], stop = ' ')
  File "/home/ubuntu/.conda/envs/newllm/lib/python3.9/site-packages/guidance/models/_model.py", line 1212, in __add__
    out = value(lm)
  File "/home/ubuntu/.conda/envs/newllm/lib/python3.9/site-packages/guidance/_grammar.py", line 60, in __call__
    return self.f(model, *self.args, **self.kwargs)
  File "/home/ubuntu/.conda/envs/newllm/lib/python3.9/site-packages/guidance/library/_gen.py", line 147, in gen
    raise ValueError(f"Could not infer unambiguous tool call prefix for tool {tool.name}")
AttributeError: 'Tool' object has no attribute 'name'

The same behavior holds for the more complex constraints I've tried. Interestingly, though, the example from the README works if I copy-paste it verbatim, but that's the only tool I've gotten working.
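
A note that may help with triage: the traceback shows gen() crashing while formatting the message of a ValueError (the f-string reads tool.name, which the Tool instance doesn't have). As a rough, untested sketch (assuming Tool allows plain attribute assignment and that gen() only reads name in this error path), setting the attribute by hand on the reproduction script above should at least let the underlying "Could not infer unambiguous tool call prefix" error surface instead of the AttributeError:

# Continuation of the script above; `name` is an attribute I'm assigning by hand,
# not something I found in the documented Tool API.
ntool = Tool(number(), numbertool)
ntool.name = 'number'  # assumption: gen() only reads this when building its error message
query_result = model + prompt + gen(max_tokens=10, tools=[ntool], stop=' ')

I haven't verified whether tool calling then proceeds; this is only meant to narrow down where the failure happens.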

System info:

  • OS: Ubuntu 22.04
  • Guidance version (guidance.__version__): 0.2.0

Thanks!
