
Potential mapping template mismatch for converse and invokeModel in @aws-appsync/utils/ai #380

Open
malikalimoekhamedov opened this issue Dec 2, 2024 · 3 comments

Comments

@malikalimoekhamedov

I just tried the brand-new @aws-appsync/utils/ai converse and invokeModel helpers to write a quick Bedrock invocation resolver. Here's the prototype code:

import { type Context } from '@aws-appsync/utils';
import { converse } from '@aws-appsync/utils/ai';

const request = (ctx: Context<{ readonly query: string; readonly limit?: number }>) =>
  converse({
    modelId: 'us.anthropic.claude-3-5-haiku-20241022-v1:0',
    system: [
      {
        text: `Do something useful for once.`,
      },
    ],
    messages: [
      {
        role: 'user',
        content: [{ text: `<query>${ctx.arguments.query}</query>` }],
      },
    ],
  });

const response = (ctx: Context) => ctx.result.output.message.content[0].text;

export { request, response };

This results in the following error:

{
  "data": null,
  "errors": [
    {
      "path": [
        "explorerCompleteSearchQuery"
      ],
      "data": null,
      "errorType": "Code",
      "errorInfo": null,
      "locations": [
        {
          "line": 2,
          "column": 3,
          "sourceName": null
        }
      ],
      "message": "Value for field '$[version]' not found."
    }
  ]
}

Here's the object generated by converse:

{
    "system": [
        {
            "guardContent": null,
            "text": "Do something useful for once."
        }
    ],
    "additionalModelRequestFields": null,
    "modelId": "us.anthropic.claude-3-5-haiku-20241022-v1:0",
    "additionalModelResponseFieldPaths": [],
    "guardrailConfig": null,
    "inferenceConfig": null,
    "messages": [
        {
            "role": "user",
            "content": [
                {
                    "text": "<query>What do we know about the effects of</query>",
                    "guardContent": null,
                    "toolResult": null,
                    "toolUse": null
                }
            ]
        }
    ],
    "operation": "Converse",
    "version":  null
}

So, if I manually replace "version": null with "version": "2023-07-27", the service starts complaining about other fields it doesn't recognize, such as "toolConfig" and "system".

The same behaviour is exhibited by invokeModel, too.

There might be a mismatch between what AppSync/Bedrock expect and what converse and invokeModel return.
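For what it's worth, one hypothesis is that the service's mapping template chokes on the explicit null fields in the generated payload. A minimal diagnostic sketch (purely hypothetical, not confirmed by the maintainers; stripNulls is an illustrative helper, not part of @aws-appsync/utils) would be to drop the null fields before the request is sent and see whether the error changes:

```javascript
// Hypothetical workaround sketch: recursively remove null-valued fields
// from the object that converse() produces, in case the service's mapping
// template rejects explicit nulls. Illustrative only; untested against AppSync.
const stripNulls = (value) => {
  if (Array.isArray(value)) {
    return value.map(stripNulls);
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value)
        .filter(([, v]) => v !== null)
        .map(([k, v]) => [k, stripNulls(v)])
    );
  }
  return value;
};

// Applied to a fragment of the payload shown above, only the populated
// fields survive.
const cleaned = stripNulls({
  modelId: 'us.anthropic.claude-3-5-haiku-20241022-v1:0',
  inferenceConfig: null,
  system: [{ text: 'Do something useful for once.', guardContent: null }],
});
```

If the error disappears with the nulls stripped, that would point at serialization of null fields rather than at the helpers' field names themselves.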

What do you think?

@gilbertw1

Hey, thanks for reaching out. I've been unable to reproduce this issue so far. Could you please provide the raw JavaScript resolver code sent to the service? Is this a unit resolver or a function in a pipeline resolver?

Thanks!

@malikalimoekhamedov

malikalimoekhamedov commented Dec 4, 2024

Hi, @gilbertw1,

Certainly, I can. Here's the generated JavaScript code I use for this unit resolver.

// src/stacks/graphql/constructs/endpoint/resolvers/explorer-complete-search-query.resolver.ts
import { converse } from "@aws-appsync/utils/ai";
var request = (ctx) => converse({
  modelId: "us.anthropic.claude-3-5-haiku-20241022-v1:0",
  system: [
    {
      text: `
You are a life science assistant who completes scientific search queries based on the text already entered by the researcher. 
Return one possible search query completion relevant to the text entered by the researcher.
`
    }
  ],
  messages: [
    {
      role: "user",
      content: [{ text: `<query>${ctx.arguments.query}</query>` }]
    }
  ]
});
var response = (ctx) => ctx.result.output.message.content[0].text;
export {
  request,
  response
};
//# sourceMappingURL=data:application/json;base64,…
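As an aside, the response handler above assumes ctx.result.output.message.content[0].text always exists. A defensive variant (a sketch in plain JavaScript; extractText is an illustrative name, and whether every construct here is available in the APPSYNC_JS runtime would need checking) could guard against a missing or empty content array:

```javascript
// Hypothetical defensive response handler: return null instead of throwing
// when the expected content path is absent from the model's result.
const extractText = (result) => {
  const content = result?.output?.message?.content;
  if (!Array.isArray(content) || content.length === 0 || typeof content[0].text !== 'string') {
    return null;
  }
  return content[0].text;
};

// In the resolver this would be used as:
//   var response = (ctx) => extractText(ctx.result);
```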

By the way, I'd be interested in knowing if there are better ways to complete search queries than simply asking Haiku.
Regardless, I'll need these kinds of micro-interactions with AI for other use cases going forward, so resolving this is still very welcome.

@gilbertw1

Hey, sorry for the delayed response here. Are you still experiencing this issue?

I've tested using the JavaScript code you included in your most recent comment verbatim and was unable to produce the same error you reported.

If you are still seeing this error, could you provide some additional information, such as the region you're running in and the resolver runtime configuration?
