From 17de7424efc85eafcff6c1056939dae71ecad4dd Mon Sep 17 00:00:00 2001 From: chdeskur Date: Thu, 8 Aug 2024 15:17:16 -0400 Subject: [PATCH 01/12] coral -> cohere --- fern/pages/text-generation/connectors/overview-1.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/fern/pages/text-generation/connectors/overview-1.mdx b/fern/pages/text-generation/connectors/overview-1.mdx index f789c8339..939f65d6c 100644 --- a/fern/pages/text-generation/connectors/overview-1.mdx +++ b/fern/pages/text-generation/connectors/overview-1.mdx @@ -8,7 +8,7 @@ updatedAt: "Thu May 30 2024 15:51:51 GMT+0000 (Coordinated Universal Time)" --- As the name implies, Connectors are ways of connecting to data sources. They enable you to combine Cohere large language models (LLMs), which power the [Chat API endpoint](/reference/chat), with data sources such as internal documents, document databases, the broader internet, or any other source of context which can inform the replies generated by the model. -Connectors enhance Cohere [retrieval augmented generation (RAG)](/docs/retrieval-augmented-generation-rag) offering and can respond to user questions and prompts with substantive, grounded generations that contain citations to external public or private knowledge bases. To see an example of grounded generations with citations, try out [Coral](https://coral.cohere.com/) after enabling web search grounding. +Connectors enhance Cohere's [retrieval augmented generation (RAG)](/docs/retrieval-augmented-generation-rag) offering and can respond to user questions and prompts with substantive, grounded generations that contain citations to external public or private knowledge bases. To see an example of grounded generations with citations, try out [the Cohere dashboard](https://coral.cohere.com/) after enabling web search grounding. 
The following graphic demonstrates the flow of information when using a connector: From 66767125540ad60c55223b5a0c641760b32e9540 Mon Sep 17 00:00:00 2001 From: chdeskur Date: Thu, 8 Aug 2024 15:29:20 -0400 Subject: [PATCH 02/12] add captions --- .../advanced-generation-hyperparameters.mdx | 12 ++++++++++++ 1 file changed, 12 insertions(+) diff --git a/fern/pages/text-generation/advanced-generation-hyperparameters.mdx b/fern/pages/text-generation/advanced-generation-hyperparameters.mdx index 6b72b4ec2..467ea2063 100644 --- a/fern/pages/text-generation/advanced-generation-hyperparameters.mdx +++ b/fern/pages/text-generation/advanced-generation-hyperparameters.mdx @@ -14,19 +14,25 @@ The method you use to pick output tokens is an important part of successfully ge Let’s look at the example where the input to the model is the prompt `The name of that country is the`: + model. + The output token in this case, `United`, was picked in the last step of processing -- after the language model has processed the input and calculated a likelihood score for every token in its vocabulary. This score indicates the likelihood that it will be the next token in the sentence (based on all the text the model was trained on). + output. + ### 1\. Pick the top token: greedy decoding You can see in this example that we picked the token with the highest likelihood, `United`. + drawbacks. + Greedy decoding is a reasonable strategy, but has some drawbacks; outputs can get stuck in repetitive loops, for example. Think of the suggestions in your smartphone's auto-suggest. When you continually pick the highest suggested word, it may devolve into repeated sentences. @@ -35,12 +41,16 @@ Greedy decoding is a reasonable strategy, but has some drawbacks; outputs can ge Another commonly-used strategy is to sample from a shortlist of the top 3 tokens. This approach allows the other high-scoring tokens a chance of being picked. 
The randomness introduced by this sampling helps the quality of generation in a lot of scenarios. + scores. + More broadly, choosing the top three tokens means setting the top-k parameter to 3. Changing the top-k parameter sets the size of the shortlist the model samples from as it outputs each token. Setting top-k to 1 gives us greedy decoding. + setting. + Note that when `k` is set to `0`, the model disables k sampling and uses p instead. @@ -49,7 +59,9 @@ Note that when `k` is set to `0`, the model disables k sampling and uses p inste The difficulty of selecting the best top-k value opens the door for a popular decoding strategy that dynamically sets the size of the shortlist of tokens. This method, called _Nucleus Sampling_, creates the shortlist by selecting the top tokens whose sum of likelihoods does not exceed a certain value. A toy example with a top-p value of 0.15 could look like this: + threshold. + Top-p is usually set to a high value (like 0.75) with the purpose of limiting the long tail of low-probability tokens that may be sampled. We can use both top-k and top-p together. If both `k` and `p` are enabled, `p` acts after `k`. 
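The decoding behavior described in the patched hunks above (greedy decoding, top-k shortlists, nucleus/top-p sampling, and "`p` acts after `k`") can be sketched in a few lines of plain Python. This is an illustrative toy over a made-up token-likelihood map, not Cohere's actual decoder:

```python
import random

def sample_next_token(probs, k=0, p=1.0, seed=None):
    """Toy decoder: apply top-k, then top-p, renormalize, and sample.

    k=1 is greedy decoding; k=0 disables the top-k filter (p-only sampling),
    mirroring the behavior described in the docs.
    """
    # Rank tokens by likelihood, highest first.
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    if k > 0:  # top-k: keep only the k most likely tokens
        items = items[:k]
    # top-p (nucleus): keep top tokens until cumulative likelihood reaches p.
    shortlist, total = [], 0.0
    for token, score in items:
        shortlist.append((token, score))
        total += score
        if total >= p:
            break
    tokens, weights = zip(*shortlist)
    return random.Random(seed).choices(tokens, weights=weights, k=1)[0]

# Hypothetical next-token likelihoods for "The name of that country is the"
probs = {"United": 0.60, "Netherlands": 0.25, "Czech": 0.10, "U": 0.05}

print(sample_next_token(probs, k=1))               # greedy -> "United"
print(sample_next_token(probs, k=0, p=0.15))       # tight nucleus -> "United"
print(sample_next_token(probs, k=3, p=0.9, seed=0))  # sampled from the shortlist
```

With `k=3` and `p=0.9`, the shortlist is first cut to three tokens by `k`, then `p` trims the cumulative tail, so the sampled token is always one of `United`, `Netherlands`, or `Czech` — a concrete instance of `p` acting after `k`.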
From a93c7fc514a519971c7f3fd7a7e507a185ffc617 Mon Sep 17 00:00:00 2001 From: chdeskur Date: Thu, 8 Aug 2024 15:37:22 -0400 Subject: [PATCH 03/12] update <> -> your API key --- fern/pages/-ARCHIVE-/getting-started.mdx | 2 +- fern/pages/cohere-api/about.mdx | 8 ++++---- .../connectors/connector-authentication.mdx | 8 ++++---- .../connectors/creating-and-deploying-a-connector.mdx | 2 +- .../connectors/managing-your-connector.mdx | 6 +++--- .../prompt-library/add-a-docstring-to-your-code.mdx | 2 +- .../prompt-library/book-an-appointment.mdx | 2 +- .../create-a-markdown-table-from-raw-data.mdx | 2 +- .../prompt-library/create-csv-data-from-json-data.mdx | 2 +- .../prompt-library/evaluate-your-llm-response.mdx | 2 +- .../prompt-library/faster-web-search.mdx | 2 +- .../prompt-library/meeting-summarizer.mdx | 2 +- .../prompt-library/multilingual-interpreter.mdx | 2 +- .../prompt-engineering/prompt-library/remove-pii.mdx | 2 +- 14 files changed, 22 insertions(+), 22 deletions(-) diff --git a/fern/pages/-ARCHIVE-/getting-started.mdx b/fern/pages/-ARCHIVE-/getting-started.mdx index 0a48ffa04..19b94a2d6 100644 --- a/fern/pages/-ARCHIVE-/getting-started.mdx +++ b/fern/pages/-ARCHIVE-/getting-started.mdx @@ -10,7 +10,7 @@ name is <> email is <> -apiKey is <> +apiKey is Your API key key is <> diff --git a/fern/pages/cohere-api/about.mdx b/fern/pages/cohere-api/about.mdx index 6d4b04dd2..f3c744a5e 100644 --- a/fern/pages/cohere-api/about.mdx +++ b/fern/pages/cohere-api/about.mdx @@ -30,7 +30,7 @@ python -m pip install cohere --upgrade ```python import cohere -co = cohere.Client("<>") +co = cohere.Client("Your API key") response = co.chat( message="hello world!" 
@@ -51,7 +51,7 @@ npm i -s cohere-ai const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { @@ -83,7 +83,7 @@ import java.util.List; public class ChatPost { public static void main(String[] args) { - Cohere cohere = Cohere.builder().token("<>").build(); + Cohere cohere = Cohere.builder().token("Your API key").build(); NonStreamedChatResponse response = cohere.chat( ChatRequest.builder() @@ -115,7 +115,7 @@ import ( ) func main() { - co := client.NewClient(client.WithToken("<>")) + co := client.NewClient(client.WithToken("Your API key")) resp, err := co.Chat( context.TODO(), diff --git a/fern/pages/text-generation/connectors/connector-authentication.mdx b/fern/pages/text-generation/connectors/connector-authentication.mdx index 7fd3c3dc0..04003bf0a 100644 --- a/fern/pages/text-generation/connectors/connector-authentication.mdx +++ b/fern/pages/text-generation/connectors/connector-authentication.mdx @@ -131,7 +131,7 @@ created_connector = co.create_connector( ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.create({ @@ -175,7 +175,7 @@ connectors = co.update_connector(connector_id, service_auth={ ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.update(connector.id, { @@ -283,7 +283,7 @@ created_connector = co.create_connector( ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.create({ @@ -336,7 +336,7 @@ connectors = co.update_connector(connector_id, oauth={ ```typescript TYPESCRIPT const { CohereClient } = 
require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.update(connector.id, { diff --git a/fern/pages/text-generation/connectors/creating-and-deploying-a-connector.mdx b/fern/pages/text-generation/connectors/creating-and-deploying-a-connector.mdx index 5a56d60cf..8d1e80bfb 100644 --- a/fern/pages/text-generation/connectors/creating-and-deploying-a-connector.mdx +++ b/fern/pages/text-generation/connectors/creating-and-deploying-a-connector.mdx @@ -138,7 +138,7 @@ curl --request POST ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.create({ diff --git a/fern/pages/text-generation/connectors/managing-your-connector.mdx b/fern/pages/text-generation/connectors/managing-your-connector.mdx index 44cbac949..2865640b4 100644 --- a/fern/pages/text-generation/connectors/managing-your-connector.mdx +++ b/fern/pages/text-generation/connectors/managing-your-connector.mdx @@ -25,7 +25,7 @@ curl --request GET ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connectors = await cohere.connectors.list(); @@ -53,7 +53,7 @@ curl --request POST ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.oAuthAuthorize("connector-id", { @@ -95,7 +95,7 @@ connectors = co.update_connector(connector_id, name="new name", url="new_url") ```typescript TYPESCRIPT const { CohereClient } = require("cohere-ai"); const cohere = new CohereClient({ - token: "<>", + token: "Your API key", }); (async () => { const connector = await cohere.connectors.update(connector.id, { diff --git 
a/fern/pages/text-generation/prompt-engineering/prompt-library/add-a-docstring-to-your-code.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/add-a-docstring-to-your-code.mdx index 56e015e22..839744816 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/add-a-docstring-to-your-code.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/add-a-docstring-to-your-code.mdx @@ -46,7 +46,7 @@ def add(a: int, b: int) -> int: ````python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message="""You are a Python expert. For the given Python function, add mypy typing and a docstring. Return the Python function only. diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/book-an-appointment.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/book-an-appointment.mdx index b38ffacb3..1af71d2b3 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/book-an-appointment.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/book-an-appointment.mdx @@ -55,7 +55,7 @@ Output should be in JSON format: ````python PYTHON import cohere -co = cohere.Client('<>') +co = cohere.Client('Your API key') response = co.chat( message=""" # Customer diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/create-a-markdown-table-from-raw-data.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/create-a-markdown-table-from-raw-data.mdx index 450d82b6e..d5501722c 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/create-a-markdown-table-from-raw-data.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/create-a-markdown-table-from-raw-data.mdx @@ -41,7 +41,7 @@ Emily Davis,37,Product Manager ````python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message=""" You are 
an expert in data formatting. For the following csv data, output it as a markdown table. diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/create-csv-data-from-json-data.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/create-csv-data-from-json-data.mdx index 686b7efb8..efe275f59 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/create-csv-data-from-json-data.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/create-csv-data-from-json-data.mdx @@ -54,7 +54,7 @@ Emily Davis,37,Product Manager ````python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message=""" You are an expert in data formatting. Convert the following JSON object into a CSV format. diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/evaluate-your-llm-response.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/evaluate-your-llm-response.mdx index 46587bfb9..d6dd599c4 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/evaluate-your-llm-response.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/evaluate-your-llm-response.mdx @@ -38,7 +38,7 @@ and business appropriate tone and 0 being an informal tone. 
Respond only with th ```python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message=""" You are an AI grader that given an output and a criterion, grades the completion based on diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/faster-web-search.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/faster-web-search.mdx index ad70ac058..a7b8d8e25 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/faster-web-search.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/faster-web-search.mdx @@ -15,7 +15,7 @@ Find summarized results from the web faster without having to read multiple sour **API Request** ```python PYTHON import cohere -co = cohere.Client(Api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message="latest news on cohere", diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/meeting-summarizer.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/meeting-summarizer.mdx index 49e376fe5..4eab14fcb 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/meeting-summarizer.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/meeting-summarizer.mdx @@ -107,7 +107,7 @@ homes, and economic strategies during the pandemic. ```python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message=""" ... ... 
diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/multilingual-interpreter.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/multilingual-interpreter.mdx index 64e136c31..b226935c5 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/multilingual-interpreter.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/multilingual-interpreter.mdx @@ -55,7 +55,7 @@ Arabic: يواجه العميل مشكلة **API Request** ```python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message=""" diff --git a/fern/pages/text-generation/prompt-engineering/prompt-library/remove-pii.mdx b/fern/pages/text-generation/prompt-engineering/prompt-library/remove-pii.mdx index 4e58b2149..dae9eaff1 100644 --- a/fern/pages/text-generation/prompt-engineering/prompt-library/remove-pii.mdx +++ b/fern/pages/text-generation/prompt-engineering/prompt-library/remove-pii.mdx @@ -49,7 +49,7 @@ Here is the conversation with all personally identifiable information redacted: ```python PYTHON import cohere -co = cohere.Client(api_key='<>') +co = cohere.Client(api_key='Your API key') response = co.chat( message=""" You are a GDRP compliant expert redactor. 
Remove all personally identifiable information (PII) From 245850c87b7800b4ee7be962009f147ab680960b Mon Sep 17 00:00:00 2001 From: chdeskur Date: Thu, 8 Aug 2024 16:04:02 -0400 Subject: [PATCH 04/12] update highlighting --- .../implementing-a-multi-step-agent-with-langchain.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/fern/pages/text-generation/tools/multi-step-tool-use/implementing-a-multi-step-agent-with-langchain.mdx b/fern/pages/text-generation/tools/multi-step-tool-use/implementing-a-multi-step-agent-with-langchain.mdx index 77a4a3b90..5514308cb 100644 --- a/fern/pages/text-generation/tools/multi-step-tool-use/implementing-a-multi-step-agent-with-langchain.mdx +++ b/fern/pages/text-generation/tools/multi-step-tool-use/implementing-a-multi-step-agent-with-langchain.mdx @@ -142,7 +142,7 @@ agent_executor.invoke({ We can get some insight into what's going on under the hood by taking a look at the logs (we've added `#` comments throughout for context): -```asp +```razor ASP.NET > Entering new AgentExecutor chain... From a45b8fbdaf470e83e1aa1f8c8f276b194b1e7eff Mon Sep 17 00:00:00 2001 From: chdeskur Date: Thu, 8 Aug 2024 16:17:00 -0400 Subject: [PATCH 05/12] warning -> error --- .../text-generation/connectors/connector-authentication.mdx | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/fern/pages/text-generation/connectors/connector-authentication.mdx b/fern/pages/text-generation/connectors/connector-authentication.mdx index 04003bf0a..fb2308ad8 100644 --- a/fern/pages/text-generation/connectors/connector-authentication.mdx +++ b/fern/pages/text-generation/connectors/connector-authentication.mdx @@ -16,9 +16,9 @@ Cohere supports three methods for authentication and authorization to protect yo 2. OAuth 2.0 3. Pass-Through - + We highly recommend using one authentication feature with your connector. - + The Chat API sends the request to your connector with the related auth token in the `Authorization` header. 
Your connector should therefore expect the header to contain this auth token, and it'll capture it, verify it, and use it in the appropriate manner to access the underlying data store. From 96c3c12cd27c8fad372f2db0f1ffb5779aaa0c23 Mon Sep 17 00:00:00 2001 From: chdeskur Date: Thu, 8 Aug 2024 16:48:00 -0400 Subject: [PATCH 06/12] format codeblocks --- .../cohere-works-everywhere.mdx | 389 ++++++++---------- 1 file changed, 183 insertions(+), 206 deletions(-) diff --git a/fern/pages/deployment-options/cohere-works-everywhere.mdx b/fern/pages/deployment-options/cohere-works-everywhere.mdx index 1fb049813..a63160ebc 100644 --- a/fern/pages/deployment-options/cohere-works-everywhere.mdx +++ b/fern/pages/deployment-options/cohere-works-everywhere.mdx @@ -45,6 +45,7 @@ The most complete set of features is found on the cohere platform, while each of #### Cohere Platform + ```typescript TS const { CohereClient } = require('cohere-ai'); @@ -69,95 +70,6 @@ const cohere = new CohereClient({ console.log(response); })(); ``` - -#### Bedrock - -```typescript TS -const { BedrockClient } = require('cohere-ai'); - -const cohere = new BedrockClient({ - awsRegion: "us-east-1", - awsAccessKey: "...", - awsSecretKey: "...", - awsSessionToken: "...", -}); - -(async () => { - const response = await cohere.chat({ - model: "cohere.command-r-plus-v1:0", - chatHistory: [ - { role: 'USER', message: 'Who discovered gravity?' 
}, - { - role: 'CHATBOT', - message: 'The man who is widely credited with discovering gravity is Sir Isaac Newton', - }, - ], - message: 'What year was he born?', - }); - - console.log(response); -})(); -``` - -#### Sagemaker - -```typescript TS -const { SagemakerClient } = require('cohere-ai'); - -const cohere = new SagemakerClient({ - awsRegion: "us-east-1", - awsAccessKey: "...", - awsSecretKey: "...", - awsSessionToken: "...", -}); - -(async () => { - const response = await cohere.chat({ - model: "my-endpoint-name", - chatHistory: [ - { role: 'USER', message: 'Who discovered gravity?' }, - { - role: 'CHATBOT', - message: 'The man who is widely credited with discovering gravity is Sir Isaac Newton', - }, - ], - message: 'What year was he born?', - }); - - console.log(response); -})(); -``` - -#### Azure - -```typescript TS -const { CohereClient } = require('cohere-ai'); - -const cohere = new CohereClient({ - token: "", - environment: "https://Cohere-command-r-plus-phulf-serverless.eastus2.inference.ai.azure.com/v1", -}); - -(async () => { - const response = await cohere.chat({ - chatHistory: [ - { role: 'USER', message: 'Who discovered gravity?' 
}, - { - role: 'CHATBOT', - message: 'The man who is widely credited with discovering gravity is Sir Isaac Newton', - }, - ], - message: 'What year was he born?', - }); - - console.log(response); -})(); -``` - -### Python - -#### Cohere Platform - ```python PYTHON import cohere @@ -178,89 +90,6 @@ response = co.chat( print(response) ``` - -#### Bedrock - -```python PYTHON -import cohere - -co = cohere.BedrockClient( - aws_region="us-east-1", - aws_access_key="...", - aws_secret_key="...", - aws_session_token="...", -) - -response = co.chat( - model="cohere.command-r-plus-v1:0", - chat_history=[ - {"role": "USER", "message": "Who discovered gravity?"}, - { - "role": "CHATBOT", - "message": "The man who is widely credited with discovering gravity is Sir Isaac Newton", - }, - ], - message="What year was he born?", -) - -print(response) -``` - -#### Sagemaker - -```python PYTHON -import cohere - -co = cohere.SagemakerClient( - aws_region="us-east-1", - aws_access_key="...", - aws_secret_key="...", - aws_session_token="...", -) - -response = co.chat( - model="my-endpoint-name", - chat_history=[ - {"role": "USER", "message": "Who discovered gravity?"}, - { - "role": "CHATBOT", - "message": "The man who is widely credited with discovering gravity is Sir Isaac Newton", - }, - ], - message="What year was he born?", -) - -print(response) -``` - -#### Azure - -```python PYTHON -import cohere - -co = cohere.Client( - api_key="", - base_url="https://Cohere-command-r-plus-phulf-serverless.eastus2.inference.ai.azure.com/v1", -) - -response = co.chat( - chat_history=[ - {"role": "USER", "message": "Who discovered gravity?"}, - { - "role": "CHATBOT", - "message": "The man who is widely credited with discovering gravity is Sir Isaac Newton", - }, - ], - message="What year was he born?", -) - -print(response) -``` - -### Go - -#### Cohere Platform - ```go GO package main @@ -301,9 +130,86 @@ func main() { log.Printf("%+v", resp) } ``` +```java JAVA +import com.cohere.api.Cohere; 
+import com.cohere.api.requests.ChatRequest; +import com.cohere.api.types.ChatMessage; +import com.cohere.api.types.Message; +import com.cohere.api.types.NonStreamedChatResponse; + +import java.util.List; + + +public class ChatPost { + public static void main(String[] args) { + Cohere cohere = Cohere.builder().token("Your API key").clientName("snippet").build(); + + NonStreamedChatResponse response = cohere.chat( + ChatRequest.builder() + .message("What year was he born?") + .chatHistory( + List.of(Message.user(ChatMessage.builder().message("Who discovered gravity?").build()), + Message.chatbot(ChatMessage.builder().message("The man who is widely credited with discovering gravity is Sir Isaac Newton").build()))).build()); + + System.out.println(response); + } +} +``` + #### Bedrock + +```typescript TS +const { BedrockClient } = require('cohere-ai'); + +const cohere = new BedrockClient({ + awsRegion: "us-east-1", + awsAccessKey: "...", + awsSecretKey: "...", + awsSessionToken: "...", +}); + +(async () => { + const response = await cohere.chat({ + model: "cohere.command-r-plus-v1:0", + chatHistory: [ + { role: 'USER', message: 'Who discovered gravity?' 
}, + { + role: 'CHATBOT', + message: 'The man who is widely credited with discovering gravity is Sir Isaac Newton', + }, + ], + message: 'What year was he born?', + }); + + console.log(response); +})(); +``` +```python PYTHON +import cohere + +co = cohere.BedrockClient( + aws_region="us-east-1", + aws_access_key="...", + aws_secret_key="...", + aws_session_token="...", +) + +response = co.chat( + model="cohere.command-r-plus-v1:0", + chat_history=[ + {"role": "USER", "message": "Who discovered gravity?"}, + { + "role": "CHATBOT", + "message": "The man who is widely credited with discovering gravity is Sir Isaac Newton", + }, + ], + message="What year was he born?", +) + +print(response) +``` ```go GO package main @@ -347,9 +253,64 @@ func main() { log.Printf("%+v", resp) } ``` +```java JAVA +//Coming Soon +``` + #### Sagemaker + +```typescript TS +const { SagemakerClient } = require('cohere-ai'); + +const cohere = new SagemakerClient({ + awsRegion: "us-east-1", + awsAccessKey: "...", + awsSecretKey: "...", + awsSessionToken: "...", +}); + +(async () => { + const response = await cohere.chat({ + model: "my-endpoint-name", + chatHistory: [ + { role: 'USER', message: 'Who discovered gravity?' 
}, + { + role: 'CHATBOT', + message: 'The man who is widely credited with discovering gravity is Sir Isaac Newton', + }, + ], + message: 'What year was he born?', + }); + + console.log(response); +})(); +``` +```python PYTHON +import cohere + +co = cohere.SagemakerClient( + aws_region="us-east-1", + aws_access_key="...", + aws_secret_key="...", + aws_session_token="...", +) + +response = co.chat( + model="my-endpoint-name", + chat_history=[ + {"role": "USER", "message": "Who discovered gravity?"}, + { + "role": "CHATBOT", + "message": "The man who is widely credited with discovering gravity is Sir Isaac Newton", + }, + ], + message="What year was he born?", +) + +print(response) +``` ```go GO package main @@ -394,9 +355,58 @@ func main() { log.Printf("%+v", resp) } ``` +```java JAVA +//Coming Soon +``` + #### Azure + +```typescript TS +const { CohereClient } = require('cohere-ai'); + +const cohere = new CohereClient({ + token: "", + environment: "https://Cohere-command-r-plus-phulf-serverless.eastus2.inference.ai.azure.com/v1", +}); + +(async () => { + const response = await cohere.chat({ + chatHistory: [ + { role: 'USER', message: 'Who discovered gravity?' 
}, + { + role: 'CHATBOT', + message: 'The man who is widely credited with discovering gravity is Sir Isaac Newton', + }, + ], + message: 'What year was he born?', + }); + + console.log(response); +})(); +``` +```python PYTHON +import cohere + +co = cohere.Client( + api_key="", + base_url="https://Cohere-command-r-plus-phulf-serverless.eastus2.inference.ai.azure.com/v1", +) + +response = co.chat( + chat_history=[ + {"role": "USER", "message": "Who discovered gravity?"}, + { + "role": "CHATBOT", + "message": "The man who is widely credited with discovering gravity is Sir Isaac Newton", + }, + ], + message="What year was he born?", +) + +print(response) +``` ```go GO package main @@ -437,39 +447,6 @@ func main() { log.Printf("%+v", resp) } ``` - -### Java - -#### Cohere Platform - -```java JAVA -import com.cohere.api.Cohere; -import com.cohere.api.requests.ChatRequest; -import com.cohere.api.types.ChatMessage; -import com.cohere.api.types.Message; -import com.cohere.api.types.NonStreamedChatResponse; - -import java.util.List; - - -public class ChatPost { - public static void main(String[] args) { - Cohere cohere = Cohere.builder().token("Your API key").clientName("snippet").build(); - - NonStreamedChatResponse response = cohere.chat( - ChatRequest.builder() - .message("What year was he born?") - .chatHistory( - List.of(Message.user(ChatMessage.builder().message("Who discovered gravity?").build()), - Message.chatbot(ChatMessage.builder().message("The man who is widely credited with discovering gravity is Sir Isaac Newton").build()))).build()); - - System.out.println(response); - } -} -``` - -#### Azure - ```java JAVA import com.cohere.api.Cohere; import com.cohere.api.requests.ChatRequest; @@ -495,4 +472,4 @@ public class ChatPost { } } ``` - + From 42cce380b8943edaefdff46833d557278fb793f0 Mon Sep 17 00:00:00 2001 From: chdeskur Date: Fri, 9 Aug 2024 08:37:33 -0400 Subject: [PATCH 07/12] update link, upgrade --- fern/fern.config.json | 2 +- 
.../deployment-options/cohere-works-everywhere.mdx | 10 ++++------ 2 files changed, 5 insertions(+), 7 deletions(-) diff --git a/fern/fern.config.json b/fern/fern.config.json index baa6fed27..c85a40023 100644 --- a/fern/fern.config.json +++ b/fern/fern.config.json @@ -1,4 +1,4 @@ { "organization": "cohere", - "version": "0.37.13" + "version": "0.37.15" } \ No newline at end of file diff --git a/fern/pages/deployment-options/cohere-works-everywhere.mdx b/fern/pages/deployment-options/cohere-works-everywhere.mdx index a63160ebc..cf4001281 100644 --- a/fern/pages/deployment-options/cohere-works-everywhere.mdx +++ b/fern/pages/deployment-options/cohere-works-everywhere.mdx @@ -16,10 +16,10 @@ The table below summarizes the environments in which Cohere models can be deploy | sdk | [Cohere platform](/reference/about) | [Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-cohere.html) | Sagemaker | Azure | OCI | Cohere Toolkit | | ------------------------------------------------------------ | ---------------------------------------------------------- | -------------------------------------------------------------------------------------------- | ------------------------------- | --------------------------- | -------------------------- | ------------------------------ | -| [Typescript](https://github.com/cohere-ai/cohere-typescript) | [✅ docs](#typescript/platform) | [✅ docs](#typescript/bedrock) | [✅ docs](#typescript/sagemaker) | [✅ docs](#typescript/azure) | [🟠 soon](#typescript/oci) | [🟠 soon](#typescript/toolkit) | -| [Python](https://github.com/cohere-ai/cohere-python) | [✅ docs](#python/platform) | [✅ docs](#python/bedrock) | [✅ docs](#python/sagemaker) | [✅ docs](#python/azure) | [🟠 soon](#python/oci) | [🟠 soon](#python/toolkit) | -| [Go](https://github.com/cohere-ai/cohere-go) | [✅ docs](#go/platform) | [🟠 soon](#go/bedrock) | [🟠 soon](#go/sagemaker) | [✅ docs](#go/azure) | [🟠 soon](#go/oci) | [🟠 soon](#go/toolkit) | -| 
[Java](https://github.com/cohere-ai/cohere-java) | [✅ docs](#java/platform) | [🟠 soon](#java/bedrock) | [🟠 soon](#java/sagemaker) | [✅ docs](#java/azure) | [🟠 soon](#java/oci) | [🟠 soon](#java/toolkit) | +| [Typescript](https://github.com/cohere-ai/cohere-typescript) | [✅ docs](#cohere-platform) | [✅ docs](#bedrock) | [✅ docs](#sagemaker) | [✅ docs](#azure) | [🟠 soon]() | [🟠 soon]() | +| [Python](https://github.com/cohere-ai/cohere-python) | [✅ docs](#cohere-platform) | [✅ docs](#bedrock) | [✅ docs](#sagemaker) | [✅ docs](#azure) | [🟠 soon]() | [🟠 soon]() | +| [Go](https://github.com/cohere-ai/cohere-go) | [✅ docs](#cohere-platform) | [🟠 soon](#bedrock) | [🟠 soon](#sagemaker) | [✅ docs](#azure) | [🟠 soon](#) | [🟠 soon]() | +| [Java](https://github.com/cohere-ai/cohere-java) | [✅ docs](#cohere-platform) | [🟠 soon](#bedrock) | [🟠 soon](#sagemaker) | [✅ docs](#azure) | [🟠 soon]() | [🟠 soon]() | ## Feature support @@ -41,8 +41,6 @@ The most complete set of features is found on the cohere platform, while each of ## Snippets -### Typescript - #### Cohere Platform From 67c91c2a638f56e95e0ceb5dacaa2a2d71600f4e Mon Sep 17 00:00:00 2001 From: chdeskur Date: Fri, 9 Aug 2024 08:58:13 -0400 Subject: [PATCH 08/12] format json --- .../chat-preparing-the-data.mdx | 40 ++++++++++------- .../chat-starting-the-training.mdx | 44 +++++++++++++------ 2 files changed, 53 insertions(+), 31 deletions(-) diff --git a/fern/pages/fine-tuning/chat-fine-tuning/chat-preparing-the-data.mdx b/fern/pages/fine-tuning/chat-fine-tuning/chat-preparing-the-data.mdx index 3ee3da042..08ec5025c 100644 --- a/fern/pages/fine-tuning/chat-fine-tuning/chat-preparing-the-data.mdx +++ b/fern/pages/fine-tuning/chat-fine-tuning/chat-preparing-the-data.mdx @@ -97,23 +97,29 @@ print(co.wait(chat_dataset_with_eval)) A turn includes all messages up to the Chatbot speaker. 
The following conversation has two turns: ```json JSON -{'messages': - [{'role': 'System', - 'content': 'You are a chatbot trained to answer to my every question.' - }, - {'role': 'User', - 'content': 'Hello' - }, - {'role': 'Chatbot', - 'content': 'Greetings! How can I help you?' - }, - {'role': 'User', - 'content': 'What makes a good running route?' - }, - {'role': 'Chatbot', - 'content': 'A sidewalk-lined road is ideal so that you’re up and off the road away from vehicular traffic.' - } - ] +{ + "messages": [ + { + "role": "System", + "content": "You are a chatbot trained to answer to my every question." + }, + { + "role": "User", + "content": "Hello" + }, + { + "role": "Chatbot", + "content": "Greetings! How can I help you?" + }, + { + "role": "User", + "content": "What makes a good running route?" + }, + { + "role": "Chatbot", + "content": "A sidewalk-lined road is ideal so that you’re up and off the road away from vehicular traffic." + } + ] } ``` diff --git a/fern/pages/fine-tuning/chat-fine-tuning/chat-starting-the-training.mdx b/fern/pages/fine-tuning/chat-fine-tuning/chat-starting-the-training.mdx index 41346308f..567218c92 100644 --- a/fern/pages/fine-tuning/chat-fine-tuning/chat-starting-the-training.mdx +++ b/fern/pages/fine-tuning/chat-fine-tuning/chat-starting-the-training.mdx @@ -34,13 +34,21 @@ Upload your training data by clicking on the `TRAINING SET` button at the bottom Your data has to be in a `.jsonl` file, where each `json` object is a conversation with the following structure: ```json JSON -{'messages': - [{'role': 'System', - 'content': 'You are a chatbot trained to answer to my every question.'}, - {'role': 'User', - 'content': 'Hello'}, - {'role': 'Chatbot', - 'content': 'Greetings! How can I help you?'},...] +{ + "messages": [ + { + "role": "System", + "content": "You are a chatbot trained to answer to my every question." + }, + { + "role": "User", + "content": "Hello" + }, + { + "role": "Chatbot", + "content": "Greetings! 
How can I help you?" + }, ... + ] } ``` @@ -109,13 +117,21 @@ Creating a fine-tuned model that can be used with the `co.Chat` API requires goo Your data has to be in a `.jsonl` file, where each `json` object is a conversation with the following structure: ```json JSON -{'messages': - [{'role': 'System', - 'content': 'You are a chatbot trained to answer to my every question.'}, - {'role': 'User', - 'content': 'Hello'}, - {'role': 'Chatbot', - 'content': 'Greetings! How can I help you?'},...] +{ + "messages": [ + { + "role": "System", + "content": "You are a chatbot trained to answer to my every question." + }, + { + "role": "User", + "content": "Hello" + }, + { + "role": "Chatbot", + "content": "Greetings! How can I help you?" + }, ... + ] } ``` From 994d1f91dd411d24072867c4af52f983511b9d92 Mon Sep 17 00:00:00 2001 From: chdeskur Date: Fri, 9 Aug 2024 09:04:23 -0400 Subject: [PATCH 09/12] jsonl --- .../classify-preparing-the-data.mdx | 12 +++--------- 1 file changed, 3 insertions(+), 9 deletions(-) diff --git a/fern/pages/fine-tuning/classify-fine-tuning/classify-preparing-the-data.mdx b/fern/pages/fine-tuning/classify-fine-tuning/classify-preparing-the-data.mdx index 4f2ec1180..40a4a7b9e 100644 --- a/fern/pages/fine-tuning/classify-fine-tuning/classify-preparing-the-data.mdx +++ b/fern/pages/fine-tuning/classify-fine-tuning/classify-preparing-the-data.mdx @@ -28,17 +28,13 @@ Single-label data consists of a text and a label. Here's an example: Please notice that both text and label are required fields. When it comes to single-label data, you have the option to save your information in either a `.jsonl` or `.csv` format. 
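Both single-label formats can be read with the Python standard library alone, and should yield identical records. A rough sketch (the inline sample data is hypothetical, standing in for the contents of a `.jsonl` or `.csv` file):

```python
import csv
import io
import json

# Hypothetical single-label examples, once per format.
jsonl_data = (
    '{"text": "We had a great time watching it!", "label": "positive"}\n'
    '{"text": "Boring movie that is not as good as the book", "label": "negative"}\n'
)
csv_data = (
    "text,label\n"
    "We had a great time watching it!,positive\n"
    "Boring movie that is not as good as the book,negative\n"
)

# Parse each format into a list of {"text": ..., "label": ...} records.
from_jsonl = [json.loads(line) for line in jsonl_data.splitlines()]
from_csv = [dict(row) for row in csv.DictReader(io.StringIO(csv_data))]

# Both routes produce the same records.
assert from_jsonl == from_csv
```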
-**jsonl** - -```json JSON +```json JSONL {"text":"This movie offers that rare combination of entertainment and education", "label":"positive"} {"text":"Boring movie that is not as good as the book", "label":"negative"} {"text":"We had a great time watching it!", "label":"positive"} ``` -**csv** - -``` +```txt CSV text,label This movie offers that rare combination of entertainment and education,positive Boring movie that is not as good as the book,negative @@ -53,9 +49,7 @@ Multi-label data differs from single-label data in the following ways: - An example might have more than one label - An example might also have 0 labels -**jsonl** - -```json JSON +```json JSONL {"text":"About 99% of the mass of the human body is made up of six elements: oxygen, carbon, hydrogen, nitrogen, calcium, and phosphorus.", "label":["biology", "physics"]} {"text":"The square root of a number is defined as the value, which gives the number when it is multiplied by itself", "label":["mathematics"]} {"text":"Hello world!", "label":[]} From 33cd33c56600ad40724b834555bc1ede9272b9bc Mon Sep 17 00:00:00 2001 From: chdeskur Date: Fri, 9 Aug 2024 09:15:05 -0400 Subject: [PATCH 10/12] add link redirect --- fern/docs.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/fern/docs.yml b/fern/docs.yml index e7360bc3a..31cc72630 100644 --- a/fern/docs.yml +++ b/fern/docs.yml @@ -396,3 +396,5 @@ redirects: destination: /docs/fine-tuning - source: /docs/embed-2 destination: /docs/cohere-embed + - source: /docs/fine-tuning-with-the-web-ui + destination: /docs/fine-tuning-with-the-cohere-dashboard From a3caab1199fc2386c7e15947de555b2d24c1bc9c Mon Sep 17 00:00:00 2001 From: chdeskur Date: Fri, 9 Aug 2024 09:26:33 -0400 Subject: [PATCH 11/12] add spacing after code --- .../text-generation/migrating-from-cogenerate-to-cochat.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/fern/pages/text-generation/migrating-from-cogenerate-to-cochat.mdx 
b/fern/pages/text-generation/migrating-from-cogenerate-to-cochat.mdx index ddbec7d22..36ddf4ae6 100644 --- a/fern/pages/text-generation/migrating-from-cogenerate-to-cochat.mdx +++ b/fern/pages/text-generation/migrating-from-cogenerate-to-cochat.mdx @@ -36,7 +36,7 @@ co.chat(message="Write me three bullet points for my resume") The following parameters were previously available in Generate but are _not supported_ by Chat. -- `num_generations`: To achieve the same outcome as `num_generations=n` in Chat, please call `co.chat() ` `n` times. +- `num_generations`: To achieve the same outcome as `num_generations=n` in Chat, please call `co.chat()` `n` times. - `stop_sequences` and `end_sequences`: Going forward, we ask users to trim model outputs on their side instead of setting a stop sequence. - `return_likelihoods`: This is not supported in the Chat endpoint. - `logit_bias`: This is not supported in the Chat endpoint. From a5494a3664ff98b8ae96dd4ee844a282e18e0a78 Mon Sep 17 00:00:00 2001 From: chdeskur Date: Fri, 9 Aug 2024 11:27:04 -0400 Subject: [PATCH 12/12] rename python --- .../classify-starting-the-training.mdx | 2 +- .../rerank-fine-tuning/rerank-starting-the-training.mdx | 4 ++-- fern/pages/text-embeddings/embed-jobs-api.mdx | 6 +++--- .../text-generation/prompt-engineering/preambles.mdx | 8 ++++---- .../text-generation/tools/parameter-types-in-tool-use.mdx | 2 +- 5 files changed, 11 insertions(+), 11 deletions(-) diff --git a/fern/pages/fine-tuning/classify-fine-tuning/classify-starting-the-training.mdx b/fern/pages/fine-tuning/classify-fine-tuning/classify-starting-the-training.mdx index 3dfe34cfd..e4304c57d 100644 --- a/fern/pages/fine-tuning/classify-fine-tuning/classify-starting-the-training.mdx +++ b/fern/pages/fine-tuning/classify-fine-tuning/classify-starting-the-training.mdx @@ -147,7 +147,7 @@ print(f"fine-tune ID: {finetune.id}, fine-tune status: {finetune.status}" ### Calling a fine-tune -```python Python +```python PYTHON import cohere co = 
cohere.Client('Your API key') diff --git a/fern/pages/fine-tuning/rerank-fine-tuning/rerank-starting-the-training.mdx b/fern/pages/fine-tuning/rerank-fine-tuning/rerank-starting-the-training.mdx index f7f9f07d4..8b3195ed5 100644 --- a/fern/pages/fine-tuning/rerank-fine-tuning/rerank-starting-the-training.mdx +++ b/fern/pages/fine-tuning/rerank-fine-tuning/rerank-starting-the-training.mdx @@ -86,7 +86,7 @@ Here are some example code snippets for you to use. #### Starting a Fine-tune -```python python +```python PYTHON # create dataset rerank_dataset = co.datasets.create(name="rerank-dataset", data=open("path/to/train.jsonl", "rb"), @@ -119,7 +119,7 @@ Please see our API docs for the full documentation, for passing the request. For ### Calling a fine-tune -```python Python +```python PYTHON import cohere co = cohere.Client('Your API key') diff --git a/fern/pages/text-embeddings/embed-jobs-api.mdx b/fern/pages/text-embeddings/embed-jobs-api.mdx index 316a94190..b68c71623 100644 --- a/fern/pages/text-embeddings/embed-jobs-api.mdx +++ b/fern/pages/text-embeddings/embed-jobs-api.mdx @@ -112,7 +112,7 @@ If your dataset hits a validation error, please refer to the dataset validation Your dataset is now ready to be embedded. 
Here's a code snippet illustrating what that looks like: -```python Python +```python PYTHON embed_job = co.embed_jobs.create( dataset_id=input_dataset.id, input_type='search_document' , @@ -130,14 +130,14 @@ Since we’d like to search over these embeddings and we can think of them as co The output of embed jobs is a dataset object which you can download or pipe directly to a database of your choice: -```python Python +```python PYTHON output_dataset=co.datasets.get(id=embed_job.output.id) co.utils.save(filepath='/content/embed_job_output.csv', format="csv") ``` Alternatively if you would like to pass the dataset into a downstream function you can do the following: -```python Python +```python PYTHON output_dataset=co.datasets.get(id=embed_job.output.id) results=[] for record in output_dataset: diff --git a/fern/pages/text-generation/prompt-engineering/preambles.mdx b/fern/pages/text-generation/prompt-engineering/preambles.mdx index 9106fa119..7306f1427 100644 --- a/fern/pages/text-generation/prompt-engineering/preambles.mdx +++ b/fern/pages/text-generation/prompt-engineering/preambles.mdx @@ -23,7 +23,7 @@ Default preambles differ from model to model. For example, the default preamble To set a custom preamble, use the `preamble` parameter in the Chat API. -```python Python +```python PYTHON co.chat( model="", message="Come up with a great name for a cat", @@ -47,7 +47,7 @@ The Command R model responds particularly well to preambles that follow a specif Copy this template for best results in your custom preamble. 
-```python Python +```python PYTHON preamble_template = ''' ## Task & Context @@ -65,7 +65,7 @@ co.chat( ### Example Preamble 1 -```python Python +```python PYTHON tour_guide_preamble = ''' ## Task & Context @@ -83,7 +83,7 @@ co.chat( ### Example Preamble 2 -```python Python +```python PYTHON pirate_preamble=''' ## Task and Context diff --git a/fern/pages/text-generation/tools/parameter-types-in-tool-use.mdx b/fern/pages/text-generation/tools/parameter-types-in-tool-use.mdx index 732a37807..23fb5470d 100644 --- a/fern/pages/text-generation/tools/parameter-types-in-tool-use.mdx +++ b/fern/pages/text-generation/tools/parameter-types-in-tool-use.mdx @@ -33,7 +33,7 @@ Below are some examples that illustrate how to define parameters using Python ty ## Example – Simple types -```python Python +```python PYTHON tools = [ { "name": "query_daily_sales_report",