docs: Update Qianfan ChatModel docstring #23336

Closed
195 changes: 184 additions & 11 deletions libs/community/langchain_community/chat_models/baidu_qianfan_endpoint.py
@@ -109,22 +109,195 @@ def _convert_dict_to_message(_dict: Mapping[str, Any]) -> AIMessage:


class QianfanChatEndpoint(BaseChatModel):
"""Baidu Qianfan chat models.
"""Baidu Qianfan chat model integration.

Setup:
Install ``qianfan`` and set environment variables ``QIANFAN_AK``, ``QIANFAN_SK``.

.. code-block:: bash

pip install qianfan
export QIANFAN_AK="your-api-key"
export QIANFAN_SK="your-secret-key"
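
If exporting shell variables is inconvenient (for example in a notebook), the same credentials can be set from Python before the client is created. A minimal sketch, using placeholder values:

.. code-block:: python

import os

# Placeholder credentials; substitute your real Qianfan keys.
os.environ["QIANFAN_AK"] = "your-api-key"
os.environ["QIANFAN_SK"] = "your-secret-key"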

Key init args — completion params:
model: str
Name of Qianfan model to use.
temperature: Optional[float]
Sampling temperature.
endpoint: Optional[str]
Endpoint of the Qianfan LLM.
top_p: Optional[float]
Total probability mass of tokens to consider at each step (nucleus sampling).

Key init args — client params:
timeout: Optional[int]
Timeout for requests.
api_key: Optional[str]
Qianfan API KEY. If not passed in will be read from env var QIANFAN_AK.
secret_key: Optional[str]
Qianfan SECRET KEY. If not passed in will be read from env var QIANFAN_SK.

See full list of supported init args and their descriptions in the params section.

Instantiate:
.. code-block:: python

from langchain_community.chat_models import QianfanChatEndpoint

qianfan_chat = QianfanChatEndpoint(
    model="ERNIE-3.5-8K",
    temperature=0.2,
    timeout=30,
    # api_key="...",
    # secret_key="...",
    # top_p="...",
    # other params...
)

Invoke:
.. code-block:: python

messages = [
    # System: "You are a professional translator who renders the user's Chinese into English."
    ("system", "你是一名专业的翻译家,可以将用户的中文翻译为英文。"),
    # Human: "I enjoy programming."
    ("human", "我喜欢编程。"),
]
qianfan_chat.invoke(messages)

.. code-block:: python

AIMessage(content='I enjoy programming.', additional_kwargs={'finish_reason': 'normal', 'request_id': 'as-7848zeqn1c', 'object': 'chat.completion', 'search_info': []}, response_metadata={'token_usage': {'prompt_tokens': 16, 'completion_tokens': 4, 'total_tokens': 20}, 'model_name': 'ERNIE-3.5-8K', 'finish_reason': 'normal', 'id': 'as-7848zeqn1c', 'object': 'chat.completion', 'created': 1719153606, 'result': 'I enjoy programming.', 'is_truncated': False, 'need_clear_history': False, 'usage': {'prompt_tokens': 16, 'completion_tokens': 4, 'total_tokens': 20}}, id='run-4bca0c10-5043-456b-a5be-2f62a980f3f0-0')

Stream:
.. code-block:: python

for chunk in qianfan_chat.stream(messages):
    print(chunk)

.. code-block:: python

content='I enjoy' response_metadata={'finish_reason': 'normal', 'request_id': 'as-yz0yz1w1rq', 'object': 'chat.completion', 'search_info': []} id='run-0fa9da50-003e-4a26-ba16-dbfe96249b8b' role='assistant'
content=' programming.' response_metadata={'finish_reason': 'normal', 'request_id': 'as-yz0yz1w1rq', 'object': 'chat.completion', 'search_info': []} id='run-0fa9da50-003e-4a26-ba16-dbfe96249b8b' role='assistant'

.. code-block:: python

stream = qianfan_chat.stream(messages)
full = next(stream)
for chunk in stream:
    full += chunk
full

.. code-block::

AIMessageChunk(content='I enjoy programming.', response_metadata={'finish_reason': 'normalnormal', 'request_id': 'as-p63cnn3ppnas-p63cnn3ppn', 'object': 'chat.completionchat.completion', 'search_info': []}, id='run-09a8cbbd-5ded-4529-981d-5bc9d1206404')

Async:
.. code-block:: python

await qianfan_chat.ainvoke(messages)

# stream:
# async for chunk in qianfan_chat.astream(messages):
# print(chunk)

# batch:
# await qianfan_chat.abatch([messages])

.. code-block:: python

[AIMessage(content='I enjoy programming.', additional_kwargs={'finish_reason': 'normal', 'request_id': 'as-mpqa8qa1qb', 'object': 'chat.completion', 'search_info': []}, response_metadata={'token_usage': {'prompt_tokens': 16, 'completion_tokens': 4, 'total_tokens': 20}, 'model_name': 'ERNIE-3.5-8K', 'finish_reason': 'normal', 'id': 'as-mpqa8qa1qb', 'object': 'chat.completion', 'created': 1719155120, 'result': 'I enjoy programming.', 'is_truncated': False, 'need_clear_history': False, 'usage': {'prompt_tokens': 16, 'completion_tokens': 4, 'total_tokens': 20}}, id='run-443b2231-08f9-4725-b807-b77d0507ad44-0')]
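
A minimal, self-contained sketch of the async surface shown above (``ainvoke``, ``astream``, ``abatch``), assuming valid credentials are available in the environment:

.. code-block:: python

import asyncio

from langchain_community.chat_models import QianfanChatEndpoint

async def main() -> None:
    chat = QianfanChatEndpoint(model="ERNIE-3.5-8K", temperature=0.2)
    messages = [
        ("system", "You are a helpful assistant."),
        ("human", "I enjoy programming."),
    ]
    # Single async call
    print(await chat.ainvoke(messages))
    # Async streaming, chunk by chunk
    async for chunk in chat.astream(messages):
        print(chunk.content, end="", flush=True)
    # Async batch over several inputs
    print(await chat.abatch([messages, messages]))

asyncio.run(main())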

Tool calling:
.. code-block:: python

from langchain_core.pydantic_v1 import BaseModel, Field


class GetWeather(BaseModel):
    '''Get the current weather in a given location'''

    location: str = Field(
        ..., description="The city and state, e.g. San Francisco, CA"
    )


class GetPopulation(BaseModel):
    '''Get the current population in a given location'''

    location: str = Field(
        ..., description="The city and state, e.g. San Francisco, CA"
    )

chat_with_tools = qianfan_chat.bind_tools([GetWeather, GetPopulation])
ai_msg = chat_with_tools.invoke(
    "Which city is hotter today and which is bigger: LA or NY?"
)
ai_msg.tool_calls

.. code-block:: python

[
    {
        'name': 'GetWeather',
        'args': {'location': 'Los Angeles, CA'},
        'id': '533e5f63-a3dc-40f2-9d9c-22b1feee62e0'
    }
]

Structured output:
.. code-block:: python

from typing import Optional

from langchain_core.pydantic_v1 import BaseModel, Field


class Joke(BaseModel):
    '''Joke to tell user.'''

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: Optional[int] = Field(description="How funny the joke is, from 1 to 10")


structured_chat = qianfan_chat.with_structured_output(Joke)
structured_chat.invoke("Tell me a joke about cats")

.. code-block:: python

Joke(
    setup='A cat is sitting in front of a mirror and sees another cat. What does the cat think?',
    punchline="The cat doesn't think it's another cat, it thinks it's another mirror.",
    rating=None
)

Response metadata:
.. code-block:: python

ai_msg = qianfan_chat.invoke(messages)
ai_msg.response_metadata

.. code-block:: python

{
    'token_usage': {
        'prompt_tokens': 16,
        'completion_tokens': 4,
        'total_tokens': 20
    },
    'model_name': 'ERNIE-3.5-8K',
    'finish_reason': 'normal',
    'id': 'as-qbzwtydqmi',
    'object': 'chat.completion',
    'created': 1719158153,
    'result': 'I enjoy programming.',
    'is_truncated': False,
    'need_clear_history': False,
    'usage': {
        'prompt_tokens': 16,
        'completion_tokens': 4,
        'total_tokens': 20
    }
}

""" # noqa: E501

init_kwargs: Dict[str, Any] = Field(default_factory=dict)
"""init kwargs for qianfan client init, such as `query_per_second` which is
8 changes: 5 additions & 3 deletions libs/community/langchain_community/chat_models/tongyi.py
@@ -254,7 +254,7 @@ def lc_secrets(self) -> Dict[str, str]:
top_p: float = 0.8
"""Total probability mass of tokens to consider at each step."""

dashscope_api_key: Optional[SecretStr] = Field(None, alias="api_key")
dashscope_api_key: SecretStr = Field(default=None, alias="api_key")
"""Dashscope api key provide by Alibaba Cloud."""

streaming: bool = False
@@ -273,11 +273,13 @@ def _llm_type(self) -> str:
"""Return type of llm."""
return "tongyi"

@root_validator()
@root_validator(pre=True)
def validate_environment(cls, values: Dict) -> Dict:
"""Validate that api key and python package exists in environment."""
values["dashscope_api_key"] = convert_to_secret_str(
get_from_dict_or_env(values, "dashscope_api_key", "DASHSCOPE_API_KEY")
get_from_dict_or_env(
values, ["dashscope_api_key", "api_key"], "DASHSCOPE_API_KEY"
)
)
try:
import dashscope
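
The move from a single key to a key list means the validator now resolves the credential from the `dashscope_api_key` field, the `api_key` alias, or the `DASHSCOPE_API_KEY` environment variable, in that order. A rough sketch of that lookup behaviour, with placeholder values:

```python
import os

from langchain_core.utils import get_from_dict_or_env

# An explicit kwarg under the alias name wins over the environment.
values = {"api_key": "sk-from-kwargs"}
print(get_from_dict_or_env(values, ["dashscope_api_key", "api_key"], "DASHSCOPE_API_KEY"))
# -> sk-from-kwargs

# Nothing in the dict: fall back to the environment variable.
os.environ["DASHSCOPE_API_KEY"] = "sk-from-env"
print(get_from_dict_or_env({}, ["dashscope_api_key", "api_key"], "DASHSCOPE_API_KEY"))
# -> sk-from-env
```
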
6 changes: 4 additions & 2 deletions libs/community/langchain_community/embeddings/baichuan.py
@@ -48,12 +48,14 @@ class Config:

allow_population_by_field_name = True

@root_validator(allow_reuse=True)
@root_validator(pre=True, allow_reuse=True)
def validate_environment(cls, values: Dict) -> Dict:
"""Validate that auth token exists in environment."""
try:
baichuan_api_key = convert_to_secret_str(
get_from_dict_or_env(values, "baichuan_api_key", "BAICHUAN_API_KEY")
get_from_dict_or_env(
values, ["baichuan_api_key", "api_key"], "BAICHUAN_API_KEY"
)
)
except ValueError as original_exc:
try:
libs/community/langchain_community/embeddings/baidu_qianfan_endpoint.py
@@ -13,10 +13,10 @@
class QianfanEmbeddingsEndpoint(BaseModel, Embeddings):
"""`Baidu Qianfan Embeddings` embedding models."""

qianfan_ak: Optional[str] = None
qianfan_ak: Optional[str] = Field(default=None, alias="api_key")
"""Qianfan application apikey"""

qianfan_sk: Optional[str] = None
qianfan_sk: Optional[str] = Field(default=None, alias="secret_key")
"""Qianfan application secretkey"""

chunk_size: int = 16
@@ -48,7 +48,12 @@ class QianfanEmbeddingsEndpoint(BaseModel, Embeddings):
model_kwargs: Dict[str, Any] = Field(default_factory=dict)
"""extra params for model invoke using with `do`."""

@root_validator()
class Config:
"""Configuration for this pydantic object."""

allow_population_by_field_name = True

@root_validator(pre=True)
def validate_environment(cls, values: Dict) -> Dict:
"""
Validate whether qianfan_ak and qianfan_sk in the environment variables or
@@ -74,15 +79,15 @@ def validate_environment(cls, values: Dict) -> Dict:
values["qianfan_ak"] = convert_to_secret_str(
get_from_dict_or_env(
values,
"qianfan_ak",
["qianfan_ak", "api_key"],
"QIANFAN_AK",
default="",
)
)
values["qianfan_sk"] = convert_to_secret_str(
get_from_dict_or_env(
values,
"qianfan_sk",
["qianfan_sk", "secret_key"],
"QIANFAN_SK",
default="",
)
@@ -91,16 +96,25 @@ def validate_environment(cls, values: Dict) -> Dict:
try:
import qianfan

default_values = {
name: field.default
for name, field in cls.__fields__.items()
if field.default is not None
}
default_values.update(values)
params = {
**values.get("init_kwargs", {}),
"model": values["model"],
"model": default_values.get("model"),
}
if values["qianfan_ak"].get_secret_value() != "":
params["ak"] = values["qianfan_ak"].get_secret_value()
if values["qianfan_sk"].get_secret_value() != "":
params["sk"] = values["qianfan_sk"].get_secret_value()
if values["endpoint"] is not None and values["endpoint"] != "":
params["endpoint"] = values["endpoint"]
if (
default_values.get("endpoint") is not None
and default_values["endpoint"] != ""
):
params["endpoint"] = default_values["endpoint"]
values["client"] = qianfan.Embedding(**params)
except ImportError:
raise ImportError(
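
With the `api_key`/`secret_key` aliases and `allow_population_by_field_name` in place, `QianfanEmbeddingsEndpoint` accepts either naming scheme at construction time. A usage sketch with placeholder credentials, assuming the `qianfan` package is installed:

```python
from langchain_community.embeddings import QianfanEmbeddingsEndpoint

# New alias names introduced here...
emb = QianfanEmbeddingsEndpoint(api_key="your-ak", secret_key="your-sk")

# ...while the original field names keep working via
# allow_population_by_field_name.
emb = QianfanEmbeddingsEndpoint(qianfan_ak="your-ak", qianfan_sk="your-sk")

# vectors = emb.embed_documents(["hello", "world"])  # performs a real API call
```
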
11 changes: 5 additions & 6 deletions libs/community/langchain_community/embeddings/dashscope.py
@@ -10,7 +10,7 @@
)

from langchain_core.embeddings import Embeddings
from langchain_core.pydantic_v1 import BaseModel, Extra, root_validator
from langchain_core.pydantic_v1 import BaseModel, Extra, Field, root_validator
from langchain_core.utils import get_from_dict_or_env
from requests.exceptions import HTTPError
from tenacity import (
@@ -101,27 +101,26 @@ class DashScopeEmbeddings(BaseModel, Embeddings):
client: Any #: :meta private:
"""The DashScope client."""
model: str = "text-embedding-v1"
dashscope_api_key: Optional[str] = None
dashscope_api_key: Optional[str] = Field(default=None, alias="api_key")
max_retries: int = 5
"""Maximum number of retries to make when generating."""

class Config:
"""Configuration for this pydantic object."""

extra = Extra.forbid
allow_population_by_field_name = True

@root_validator(pre=True)
def validate_environment(cls, values: Dict) -> Dict:
import dashscope

"""Validate that api key and python package exists in environment."""
values["dashscope_api_key"] = get_from_dict_or_env(
values, "dashscope_api_key", "DASHSCOPE_API_KEY"
values, ["dashscope_api_key", "api_key"], "DASHSCOPE_API_KEY"
)
dashscope.api_key = values["dashscope_api_key"]
try:
import dashscope

dashscope.api_key = values["dashscope_api_key"]
values["client"] = dashscope.TextEmbedding
except ImportError:
raise ImportError(
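
Same pattern for DashScope: the `api_key` alias and the environment-variable fallback both populate `dashscope_api_key`, and the `dashscope` import now raises an ImportError only when the package is actually missing. A brief usage sketch with a placeholder key, assuming `dashscope` is installed:

```python
from langchain_community.embeddings import DashScopeEmbeddings

# Either spelling populates the same field after this change.
emb = DashScopeEmbeddings(api_key="sk-placeholder")
emb = DashScopeEmbeddings(dashscope_api_key="sk-placeholder")

# Or omit both and rely on the DASHSCOPE_API_KEY environment variable.
# vector = emb.embed_query("hello world")  # performs a real API call
```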