diff --git a/dev/404.html b/dev/404.html index e96d589..f798963 100644 --- a/dev/404.html +++ b/dev/404.html @@ -16,7 +16,7 @@

404

PAGE NOT FOUND

But if you don't change your direction, and if you keep looking, you may end up where you are heading.
- + \ No newline at end of file diff --git a/dev/advanced.html b/dev/advanced.html index 769d16b..09ddfac 100644 --- a/dev/advanced.html +++ b/dev/advanced.html @@ -52,7 +52,7 @@ result.context

Let's save this result for debugging later into JSON.

julia
using AIHelpMe.PromptingTools: JSON3
 config_key = AIHelpMe.get_config_key() # "nomicembedtext-0-Bool"
 JSON3.write("rag-makie-xyzyzyz-$(config_key)-20240419.json", result)

Now let us know: please share the above JSON, along with a few notes on what you expected and what is wrong, via a GitHub Issue or on Slack (#generative-ai channel)!
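
To sanity-check what the file contains before sharing it, here is a minimal re-loading sketch (it re-uses the config_key and file name from the snippet above):

julia
loaded = JSON3.read(read("rag-makie-xyzyzyz-$(config_key)-20240419.json", String))
keys(loaded) # top-level fields of the serialized result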

- + \ No newline at end of file diff --git a/dev/assets/reference.md.CC7Os28t.js b/dev/assets/reference.md.jv-XqdQl.js similarity index 95% rename from dev/assets/reference.md.CC7Os28t.js rename to dev/assets/reference.md.jv-XqdQl.js index 089e5aa..91d2c7c 100644 --- a/dev/assets/reference.md.CC7Os28t.js +++ b/dev/assets/reference.md.jv-XqdQl.js @@ -1,4 +1,4 @@ -import{_ as s,c as i,o as e,a7 as a}from"./chunks/framework.BTZ5G5k6.js";const u=JSON.parse('{"title":"Reference","description":"","frontmatter":{},"headers":[],"relativePath":"reference.md","filePath":"reference.md","lastUpdated":null}'),t={name:"reference.md"},l=a(`

Reference

# AIHelpMe.ALLOWED_PACKSConstant.
julia
ALLOWED_PACKS

Currently available packs are:

source


# AIHelpMe.ALLOWED_PREFERENCESConstant.

Keys that are allowed to be set via set_preferences!

source


# AIHelpMe.LOADED_PACKSConstant.
julia
LOADED_PACKS

The knowledge packs that are currently loaded in the index.

source


# AIHelpMe.PREFERENCESConstant.
julia
PREFERENCES

You can set preferences for AIHelpMe by using set_preferences!. It will create a LocalPreferences.toml file in your current directory and will reload your preferences from there.

Check your preferences by calling get_preferences(key::String).

Available Preferences (for set_preferences!)

source


# AIHelpMe.RAG_CONFIGURATIONSConstant.
julia
RAG_CONFIGURATIONS

A dictionary of RAG configurations, keyed by a unique symbol (eg, bronze). Each entry contains a dictionary with keys :config and :kwargs, where :config is the RAG configuration object (AbstractRAGConfig) and :kwargs the NamedTuple of corresponding kwargs.

Available Options:

source


# AIHelpMe.aihelpMethod.
julia
aihelp([cfg::RT.AbstractRAGConfig, index::RT.AbstractChunkIndex,]
+import{_ as s,c as i,o as e,a7 as a}from"./chunks/framework.BTZ5G5k6.js";const u=JSON.parse('{"title":"Reference","description":"","frontmatter":{},"headers":[],"relativePath":"reference.md","filePath":"reference.md","lastUpdated":null}'),t={name:"reference.md"},l=a(`

Reference

# AIHelpMe.ALLOWED_PACKSConstant.
julia
ALLOWED_PACKS

Currently available packs are:

  • :julia - Julia documentation, standard library docstrings and a few extras (for Julia v1.10)

  • :tidier - Tidier.jl organization documentation (as of 7th April 2024)

  • :makie - Makie.jl organization documentation (as of 30th March 2024)

source


# AIHelpMe.ALLOWED_PREFERENCESConstant.

Keys that are allowed to be set via set_preferences!

source


# AIHelpMe.LOADED_PACKSConstant.
julia
LOADED_PACKS

The knowledge packs that are currently loaded in the index.

source


# AIHelpMe.PREFERENCESConstant.
julia
PREFERENCES

You can set preferences for AIHelpMe by using set_preferences!. It will create a LocalPreferences.toml file in your current directory and will reload your preferences from there.

Check your preferences by calling get_preferences(key::String).

Available Preferences (for set_preferences!)

  • MODEL_CHAT: The default model to use for aigenerate and most ai* calls. See PromptingTools.MODEL_REGISTRY for a list of available models or define your own with PromptingTools.register_model!.

  • MODEL_EMBEDDING: The default model to use for aiembed (embedding documents). See PromptingTools.MODEL_REGISTRY for a list of available models or define your own with PromptingTools.register_model!.

  • EMBEDDING_DIMENSION: The dimension of the embedding vector. Defaults to 1024 (truncated OpenAI embedding). Set to 0 to use the maximum allowed dimension.

  • LOADED_PACKS: The knowledge packs that are loaded on restart/refresh (load_index!()).
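
A minimal sketch of reading and changing one of these preferences (the value shown is just an example):

julia
using AIHelpMe
AIHelpMe.set_preferences!("EMBEDDING_DIMENSION" => 0) # 0 = use the maximum allowed dimension
AIHelpMe.get_preferences("EMBEDDING_DIMENSION")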

source


# AIHelpMe.RAG_CONFIGURATIONSConstant.
julia
RAG_CONFIGURATIONS

A dictionary of RAG configurations, keyed by a unique symbol (eg, bronze). Each entry contains a dictionary with keys :config and :kwargs, where :config is the RAG configuration object (AbstractRAGConfig) and :kwargs the NamedTuple of corresponding kwargs.

Available Options:

  • :bronze: A simple configuration for a bronze pipeline, using truncated binary embeddings (dimensionality: 1024) and no re-ranking or refinement.

  • :silver: A simple configuration like :bronze, using truncated binary embeddings (dimensionality: 1024), but it also enables a re-ranking step.

  • :gold: A more complex configuration, similar to the simpler pipelines above, but using standard embeddings (dimensionality: 3072, type: Float32). It also leverages re-ranking and refinement with a web search.
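
To list the available options and switch between them, a minimal sketch (see update_pipeline! below):

julia
using AIHelpMe
keys(AIHelpMe.RAG_CONFIGURATIONS) # e.g. :bronze, :silver, :gold
update_pipeline!(:silver) # activate one of the pre-configured pipelines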

source


# AIHelpMe.aihelpMethod.
julia
aihelp([cfg::RT.AbstractRAGConfig, index::RT.AbstractChunkIndex,]
     question::AbstractString;
     verbose::Integer = 1,
     model = MODEL_CHAT,
@@ -18,34 +18,34 @@ import{_ as s,c as i,o as e,a7 as a}from"./chunks/framework.BTZ5G5k6.js";const u
 
 question = "How to make a barplot in Makie.jl?"
 result = aihelp(question; search = true, rerank = true, return_all = true)
-pprint(result) # nicer display with sources for each chunk/sentences (look for square brackets)

source


# AIHelpMe.docdata_to_sourceMethod.
julia
docdata_to_source(data::AbstractDict)

Creates a source path from a given DocStr record

source


# AIHelpMe.docextractFunction.
julia
docextract(d::MultiDoc, sep::AbstractString = "

")

Extracts the documentation from a MultiDoc record (separates the individual docs within DocStr with sep)

source


# AIHelpMe.docextractFunction.
julia
docextract(modules::Vector{Module} = Base.Docs.modules)

Extracts the documentation from a vector of modules.

source


# AIHelpMe.docextractFunction.
julia
docextract(d::DocStr, sep::AbstractString = "

")

Extracts the documentation from a DocStr record. Separates the individual docs within DocStr with sep.

source


# AIHelpMe.docextractMethod.
julia
docextract(mod::Module)

Extracts the documentation from a given (loaded) module.

source


# AIHelpMe.find_new_chunksMethod.
julia
find_new_chunks(old_chunks::AbstractVector{<:AbstractString},
-    new_chunks::AbstractVector{<:AbstractString})

Identifies the new chunks in new_chunks that are not present in old_chunks.

Returns a mask of chunks that are new (not present in old_chunks).

Uses SHA256 hashes to dedupe the strings quickly and effectively.

source


# AIHelpMe.get_config_keyFunction.

Returns the configuration key for the given cfg and kwargs to use the relevant artifacts.

source


# AIHelpMe.get_preferencesMethod.
julia
get_preferences(key::String)

Get preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: set_preferences!

Example

julia
AIHelpMe.get_preferences("MODEL_CHAT")

source


# AIHelpMe.last_resultMethod.
julia
last_result()

Returns the RAGResult from the last aihelp call. It can be useful to see the sources/references used by the AI model to generate the response.

If you're using aihelp() make sure to set return_all = true to return the RAGResult.

source


# AIHelpMe.load_index!Function.
julia
load_index!(packs::Vector{Symbol}=LOADED_PACKS[]; verbose::Bool = true, kwargs...)
-load_index!(pack::Symbol; verbose::Bool = true, kwargs...)

Loads one or more packs into the main index from our pre-built artifacts.

Availability of packs might vary depending on your pipeline configuration (ie, whether we have the correct embeddings for it). See AIHelpMe.ALLOWED_PACKS

Example

julia
load_index!(:julia)

Or multiple packs

julia
load_index!([:julia, :makie,:tidier])

source


# AIHelpMe.load_index!Method.
julia
load_index!(file_path::AbstractString;
-    verbose::Bool = true, kwargs...)

Loads the serialized index in file_path into the global variable MAIN_INDEX.

Supports .jls (serialized Julia object) and .hdf5 (HDF5.jl) files.

source


# AIHelpMe.load_index!Method.
julia
load_index!(index::RT.AbstractChunkIndex;
+pprint(result) # nicer display with sources for each chunk/sentences (look for square brackets)

source


# AIHelpMe.docdata_to_sourceMethod.
julia
docdata_to_source(data::AbstractDict)

Creates a source path from a given DocStr record

source


# AIHelpMe.docextractFunction.
julia
docextract(d::MultiDoc, sep::AbstractString = "

")

Extracts the documentation from a MultiDoc record (separates the individual docs within DocStr with sep)

source


# AIHelpMe.docextractFunction.
julia
docextract(modules::Vector{Module} = Base.Docs.modules)

Extracts the documentation from a vector of modules.

source


# AIHelpMe.docextractFunction.
julia
docextract(d::DocStr, sep::AbstractString = "

")

Extracts the documentation from a DocStr record. Separates the individual docs within DocStr with sep.

source


# AIHelpMe.docextractMethod.
julia
docextract(mod::Module)

Extracts the documentation from a given (loaded) module.

source


# AIHelpMe.find_new_chunksMethod.
julia
find_new_chunks(old_chunks::AbstractVector{<:AbstractString},
+    new_chunks::AbstractVector{<:AbstractString})

Identifies the new chunks in new_chunks that are not present in old_chunks.

Returns a mask of chunks that are new (not present in old_chunks).

Uses SHA256 hashes to dedupe the strings quickly and effectively.
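
An illustrative sketch of the hashing idea (not necessarily the exact implementation):

julia
using SHA

old_chunks = ["chunk A", "chunk B"]
new_chunks = ["chunk B", "chunk C"]
old_hashes = Set(bytes2hex(sha256(c)) for c in old_chunks)
mask = [!(bytes2hex(sha256(c)) in old_hashes) for c in new_chunks]
# mask == [false, true]: only "chunk C" is new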

source


# AIHelpMe.get_config_keyFunction.

Returns the configuration key for the given cfg and kwargs to use the relevant artifacts.
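
For instance, to check which knowledge-pack artifacts your current pipeline maps to (the value shown is just an example):

julia
using AIHelpMe
AIHelpMe.get_config_key() # e.g. "nomicembedtext-0-Bool", depending on your pipeline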

source


# AIHelpMe.get_preferencesMethod.
julia
get_preferences(key::String)

Get preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: set_preferences!

Example

julia
AIHelpMe.get_preferences("MODEL_CHAT")

source


# AIHelpMe.last_resultMethod.
julia
last_result()

Returns the RAGResult from the last aihelp call. It can be useful to see the sources/references used by the AI model to generate the response.

If you're using aihelp() make sure to set return_all = true to return the RAGResult.
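
A minimal usage sketch (the question is only an example; pprint comes from PromptingTools):

julia
using AIHelpMe
using AIHelpMe.PromptingTools: pprint

aihelp"How do I concatenate two vectors in Julia?"
pprint(last_result()) # show the last answer together with the sources used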

source


# AIHelpMe.load_index!Function.
julia
load_index!(packs::Vector{Symbol}=LOADED_PACKS[]; verbose::Bool = true, kwargs...)
+load_index!(pack::Symbol; verbose::Bool = true, kwargs...)

Loads one or more packs into the main index from our pre-built artifacts.

Availability of packs might vary depending on your pipeline configuration (ie, whether we have the correct embeddings for it). See AIHelpMe.ALLOWED_PACKS

Example

julia
load_index!(:julia)

Or multiple packs

julia
load_index!([:julia, :makie,:tidier])

source


# AIHelpMe.load_index!Method.
julia
load_index!(file_path::AbstractString;
+    verbose::Bool = true, kwargs...)

Loads the serialized index in file_path into the global variable MAIN_INDEX.

Supports .jls (serialized Julia object) and .hdf5 (HDF5.jl) files.
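
A minimal sketch (the file path is hypothetical; point it at an index you serialized earlier):

julia
using AIHelpMe
AIHelpMe.load_index!("my_index.jls") # or an .hdf5 file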

source


# AIHelpMe.load_index!Method.
julia
load_index!(index::RT.AbstractChunkIndex;
     verbose::Bool = 1, kwargs...)

Loads the provided index into the global variable MAIN_INDEX.

If you don't have an index yet, use build_index to build one from your currently loaded packages (see ?build_index)

Example

julia
# build an index from some modules, keep empty to embed all loaded modules (eg, \`build_index()\`) 
 index = AIH.build_index([DataFramesMeta, DataFrames, CSV])
-AIH.load_index!(index)

source


# AIHelpMe.load_index_hdf5Method.

Hacky function to load a HDF5 file into a ChunkIndex object. Only bare-bone ChunkIndex is supported right now.

source


# AIHelpMe.set_preferences!Method.
julia
set_preferences!(pairs::Pair{String, <:Any}...)

Set preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: get_preferences

Example

Change your API key and default model:

julia
# EMBEDDING_DIMENSION of 0 means the maximum allowed
-AIHelpMe.set_preferences!("MODEL_CHAT" => "llama3", "MODEL_EMBEDDING" => "nomic-embed-text", "EMBEDDING_DIMENSION" => 0)

source


# AIHelpMe.update_indexFunction.
julia
update_index(index::RT.AbstractChunkIndex = MAIN_INDEX[],
+AIH.load_index!(index)

source


# AIHelpMe.load_index_hdf5Method.

Hacky function to load a HDF5 file into a ChunkIndex object. Only bare-bone ChunkIndex is supported right now.

source


# AIHelpMe.set_preferences!Method.
julia
set_preferences!(pairs::Pair{String, <:Any}...)

Set preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: get_preferences

Example

Change your API key and default model:

julia
# EMBEDDING_DIMENSION of 0 means the maximum allowed
+AIHelpMe.set_preferences!("MODEL_CHAT" => "llama3", "MODEL_EMBEDDING" => "nomic-embed-text", "EMBEDDING_DIMENSION" => 0)

source


# AIHelpMe.update_indexFunction.
julia
update_index(index::RT.AbstractChunkIndex = MAIN_INDEX[],
     modules::Vector{Module} = Base.Docs.modules;
     verbose::Integer = 1,
     kwargs...)

Updates the provided index with the documentation of the provided modules.

Deduplicates against the index.sources and embeds only the new document chunks (as measured by a hash).

Returns the updated index (new instance).

For available configurations and customizations, see the corresponding modules and functions of PromptingTools.Experimental.RAGTools (eg, build_index).

Example

If you loaded some new packages and want to add them to your MAIN_INDEX (or any index you use), run:

julia
# To update the MAIN_INDEX as well
 AIHelpMe.update_index() |> AHM.load_index!
 
 # To update an explicit index
-index = AIHelpMe.update_index(index)

source


# AIHelpMe.update_pipeline!Function.
julia
update_pipeline!(option::Symbol = :bronze; model_chat = MODEL_CHAT,
+index = AIHelpMe.update_index(index)

source


# AIHelpMe.update_pipeline!Function.
julia
update_pipeline!(option::Symbol = :bronze; model_chat = MODEL_CHAT,
     model_embedding = MODEL_EMBEDDING, verbose::Bool = true, embedding_dimension::Integer = EMBEDDING_DIMENSION)

Updates the default RAG pipeline to one of the pre-configuration options and sets the requested chat and embedding models.

This is a good way to switch between OpenAI and Ollama models.

See available pipeline options via keys(RAG_CONFIGURATIONS).

Logic:

  • Updates the global MODEL_CHAT and MODEL_EMBEDDING to the requested models.

  • Update the global EMBEDDING_DIMENSION for the requested embedding dimensionality after truncation (embedding_dimension).

  • Updates the global RAG_CONFIG and RAG_KWARGS to the requested option.

  • Updates the global LOADED_CONFIG_KEY to the configuration key for the given option and kwargs (used by the artifact system to download the correct knowledge packs).

Example

julia
update_pipeline!(:bronze; model_chat = "gpt4t")

You don't need to re-load your index if you just change the chat model.

You can also switch the pipeline to Ollama models. Note: only one Ollama embedding model is currently supported! You must select "nomic-embed-text" and, if you do, set embedding_dimension=0 (the maximum available dimension)

julia
update_pipeline!(:bronze; model_chat = "llama3", model_embedding="nomic-embed-text", embedding_dimension=0)
 
 # You must download the corresponding knowledge packs via \`load_index!\` (because you changed the embedding model)
-load_index!()

source


# PromptingTools.Experimental.RAGTools.build_indexFunction.
julia
RT.build_index(modules::Vector{Module} = Base.Docs.modules; verbose::Int = 1,
-    kwargs...)

Build index from the documentation of the currently loaded modules. If modules is empty, it will use all currently loaded modules.

source


# PromptingTools.Experimental.RAGTools.build_indexMethod.
julia
RT.build_index(mod::Module; verbose::Int = 1, kwargs...)

Build index from the documentation of a given module mod.

source


# AIHelpMe.@aihelp!_strMacro.
julia
aihelp!"user_question"[model_alias] -> AIMessage

The aihelp!"" string macro is used to continue a previous conversation with the AI model.

It appends the new user prompt to the last conversation in the tracked history (in AIHelpMe.CONV_HISTORY) and generates a response based on the entire conversation context. If you want to see the previous conversation, you can access it via AIHelpMe.CONV_HISTORY, which keeps at most the last PromptingTools.MAX_HISTORY_LENGTH conversations.

It does NOT provide new context from the documentation. To do that, start a new conversation with aihelp"<question>".

Arguments

  • user_question (String): The follow up question to be added to the existing conversation.

  • model_alias (optional, any): Specify the model alias of the AI model to be used (see PT.MODEL_ALIASES). If not provided, the default model is used.

Returns

AIMessage corresponding to the new user prompt, considering the entire conversation history.

Example

To continue a conversation:

julia
# start conversation as normal
+load_index!()

source


# PromptingTools.Experimental.RAGTools.build_indexFunction.
julia
RT.build_index(modules::Vector{Module} = Base.Docs.modules; verbose::Int = 1,
+    kwargs...)

Build index from the documentation of the currently loaded modules. If modules is empty, it will use all currently loaded modules.

source


# PromptingTools.Experimental.RAGTools.build_indexMethod.
julia
RT.build_index(mod::Module; verbose::Int = 1, kwargs...)

Build index from the documentation of a given module mod.
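
A minimal sketch, mirroring the load_index! example above (DataFrames is just an example of a loaded package):

julia
import AIHelpMe as AIH
using DataFrames

index = AIH.build_index(DataFrames) # embed the docstrings of a single module
AIH.load_index!(index) # make it the active index for aihelp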

source


# AIHelpMe.@aihelp!_strMacro.
julia
aihelp!"user_question"[model_alias] -> AIMessage

The aihelp!"" string macro is used to continue a previous conversation with the AI model.

It appends the new user prompt to the last conversation in the tracked history (in AIHelpMe.CONV_HISTORY) and generates a response based on the entire conversation context. If you want to see the previous conversation, you can access it via AIHelpMe.CONV_HISTORY, which keeps at most the last PromptingTools.MAX_HISTORY_LENGTH conversations.

It does NOT provide new context from the documentation. To do that, start a new conversation with aihelp"<question>".

Arguments

  • user_question (String): The follow up question to be added to the existing conversation.

  • model_alias (optional, any): Specify the model alias of the AI model to be used (see PT.MODEL_ALIASES). If not provided, the default model is used.

Returns

AIMessage corresponding to the new user prompt, considering the entire conversation history.

Example

To continue a conversation:

julia
# start conversation as normal
 aihelp"How to create a dictionary?" 
 
 # ... wait for reply and then react to it:
 
 # continue the conversation (notice that you can change the model, eg, to more powerful one for better answer)
 aihelp!"Can you create it from named tuple?"gpt4t
-# AIMessage("Yes, you can create a dictionary from a named tuple ...")

Usage Notes

  • This macro should be used when you want to maintain the context of an ongoing conversation (ie, the last ai"" message).

  • It automatically accesses and updates the global conversation history.

  • If no conversation history is found, it raises an assertion error, suggesting to initiate a new conversation using ai"" instead.

Important

Ensure that the conversation history is not too long to maintain relevancy and coherence in the AI's responses. The history length is managed by MAX_HISTORY_LENGTH.

source


# AIHelpMe.@aihelp_strMacro.
julia
aihelp"user_question"[model_alias] -> AIMessage

The aihelp"" string macro generates an AI response to a given user question by using aihelp under the hood. It will automatically try to provide the most relevant bits of the documentation (from the index) to the LLM to answer the question.

See also aihelp!"" if you want to reply to the provided message / continue the conversation.

Arguments

  • user_question (String): The question to be answered by the AI model.

  • model_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).

Returns

AIMessage corresponding to the input prompt.

Example

julia
result = aihelp"Hello, how are you?"
+# AIMessage("Yes, you can create a dictionary from a named tuple ...")

Usage Notes

  • This macro should be used when you want to maintain the context of an ongoing conversation (ie, the last ai"" message).

  • It automatically accesses and updates the global conversation history.

  • If no conversation history is found, it raises an assertion error, suggesting to initiate a new conversation using ai"" instead.

Important

Ensure that the conversation history is not too long to maintain relevancy and coherence in the AI's responses. The history length is managed by MAX_HISTORY_LENGTH.

source


# AIHelpMe.@aihelp_strMacro.
julia
aihelp"user_question"[model_alias] -> AIMessage

The aihelp"" string macro generates an AI response to a given user question by using aihelp under the hood. It will automatically try to provide the most relevant bits of the documentation (from the index) to the LLM to answer the question.

See also aihelp!"" if you want to reply to the provided message / continue the conversation.

Arguments

  • user_question (String): The question to be answered by the AI model.

  • model_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).

Returns

AIMessage corresponding to the input prompt.

Example

julia
result = aihelp"Hello, how are you?"
 # AIMessage("Hello! I'm an AI assistant, so I don't have feelings, but I'm here to help you. How can I assist you today?")

If you want to interpolate some variables or additional context, simply use string interpolation:

julia
a=1
 result = aihelp"What is \`$a+$a\`?"
 # AIMessage("The sum of \`1+1\` is \`2\`.")

If you want to use a different model, eg, GPT-3.5 Turbo, you can provide its alias as a flag:

julia
result = aihelp"What is \`1.23 * 100 + 1\`?"gpt3t
-# AIMessage("The answer is 124.")

source


`,54),n=[l];function p(d,o,h,r,k,c){return e(),i("div",null,n)}const E=s(t,[["render",p]]);export{u as __pageData,E as default}; +# AIMessage("The answer is 124.")

source


`,54),n=[l];function p(d,o,h,r,k,c){return e(),i("div",null,n)}const E=s(t,[["render",p]]);export{u as __pageData,E as default}; diff --git a/dev/assets/reference.md.CC7Os28t.lean.js b/dev/assets/reference.md.jv-XqdQl.lean.js similarity index 100% rename from dev/assets/reference.md.CC7Os28t.lean.js rename to dev/assets/reference.md.jv-XqdQl.lean.js diff --git a/dev/faq.html b/dev/faq.html index bb4d5b5..d722a8c 100644 --- a/dev/faq.html +++ b/dev/faq.html @@ -19,7 +19,7 @@

Frequently Asked Questions

Is it expensive to embed all my documentation?

No, embedding a comprehensive set of documentation is surprisingly cost-effective. Embedding around 170 modules, including all standard libraries and more, costs approximately 8 cents and takes less than 30 seconds. To save you money, we have already embedded the Julia standard libraries and made them available for download via Artifacts. We expect that any further knowledge base extensions should be at most a few cents (see Extending the Knowledge Base).
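
If you want to extend the index with the packages you currently have loaded, a minimal sketch (see update_index in the Reference):

julia
using AIHelpMe
AIHelpMe.update_index() |> AIHelpMe.load_index! # embeds only the new docstrings, then reloads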

How much does it cost to ask a question?

Each query incurs only a fraction of a cent, depending on the length and chosen model.

Can I use the Cohere Trial API Key for commercial projects?

No, a trial key is only for testing purposes. But it takes only a few clicks to switch to a Production API key. The cost is only $1 per 1000 searches (!!!) and it has many other benefits.

How accurate are the answers?

As with any Generative AI output: it depends, and you should always double-check.

Can I use it without the internet?

Not at the moment. It might be possible in the future, as PromptingTools.jl supports local LLMs.

Why do we need Cohere API Key?

Cohere's API is used to re-rank the best matching snippets from the documentation. It's free to use in limited quantities (roughly a thousand requests per month), which should be enough for most users. Re-ranking improves the quality and accuracy of the answers.

Why do we need Tavily API Key?

Tavily is used for the web search results to augment your answers. It's free to use in limited quantities.

Can we use Ollama (locally-hosted) models?

Yes! See the Using Ollama Models section.

- + \ No newline at end of file diff --git a/dev/hashmap.json b/dev/hashmap.json index de0d985..039ea1c 100644 --- a/dev/hashmap.json +++ b/dev/hashmap.json @@ -1 +1 @@ -{"faq.md":"O9UdQwDm","index.md":"BW7u-aFT","introduction.md":"H808J_as","advanced.md":"W3Wxvh9-","reference.md":"CC7Os28t"} +{"index.md":"BW7u-aFT","faq.md":"O9UdQwDm","advanced.md":"W3Wxvh9-","introduction.md":"H808J_as","reference.md":"jv-XqdQl"} diff --git a/dev/index.html b/dev/index.html index 66550e7..836606d 100644 --- a/dev/index.html +++ b/dev/index.html @@ -31,7 +31,7 @@ # you can achieve the same with aihelp"" macros, by simply calling the "last_result" pprint(last_result())

For more information, see the Quick Start Guide section. For setting up AIHelpMe with locally-hosted models, see the Using Ollama Models section.

- + \ No newline at end of file diff --git a/dev/introduction.html b/dev/introduction.html index 4cec88d..28f9d06 100644 --- a/dev/introduction.html +++ b/dev/introduction.html @@ -112,7 +112,7 @@ aihelp"What does this error mean? \$err" # Note the $err to interpolate the stacktrace
plaintext
[ Info: Done generating response. Total cost: \$0.003
 
 AIMessage("The error message "MethodError: no method matching f(::Int8)" means that there is no method defined for function `f` that accepts an argument of type `Int8`. The error message also provides the closest candidate methods that were found, which are `f(::Any, !Matched::Any)` and `f(!Matched::Int64)` in the specified file `embed_all.jl` at lines 45 and 61, respectively.")

How it works

AIHelpMe leverages PromptingTools.jl to communicate with the AI models.

We apply a Retrieval-Augmented Generation (RAG) pattern, ie, we first retrieve the most relevant snippets from the loaded documentation index and then provide them to the LLM as context for answering your question.

This ensures that the answers are not only based on general AI knowledge but are also specifically tailored to Julia's ecosystem and best practices.

Visit an introduction to RAG tools in PromptingTools.jl.
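
To see this in practice, a minimal sketch (the question is just an example):

julia
using AIHelpMe

result = aihelp("How do I create a named tuple?"; return_all = true)
result.context # the documentation snippets passed to the model as context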

Future Directions

AIHelpMe is continuously evolving. Future updates may include:

Please note that this is merely a pre-release to gauge the interest in this project.

- + \ No newline at end of file diff --git a/dev/reference.html b/dev/reference.html index d1c4029..3b1fd9f 100644 --- a/dev/reference.html +++ b/dev/reference.html @@ -12,13 +12,13 @@ - + -

Reference

# AIHelpMe.ALLOWED_PACKSConstant.
julia
ALLOWED_PACKS

Currently available packs are:

  • :julia - Julia documentation, standard library docstrings and a few extras (for Julia v1.10)

  • :tidier - Tidier.jl organization documentation (as of 7th April 2024)

  • :makie - Makie.jl organization documentation (as of 30th March 2024)

source


# AIHelpMe.ALLOWED_PREFERENCESConstant.

Keys that are allowed to be set via set_preferences!

source


# AIHelpMe.LOADED_PACKSConstant.
julia
LOADED_PACKS

The knowledge packs that are currently loaded in the index.

source


# AIHelpMe.PREFERENCESConstant.
julia
PREFERENCES

You can set preferences for AIHelpMe by using set_preferences!. It will create a LocalPreferences.toml file in your current directory and will reload your preferences from there.

Check your preferences by calling get_preferences(key::String).

Available Preferences (for set_preferences!)

  • MODEL_CHAT: The default model to use for aigenerate and most ai* calls. See PromptingTools.MODEL_REGISTRY for a list of available models or define your own with PromptingTools.register_model!.

  • MODEL_EMBEDDING: The default model to use for aiembed (embedding documents). See PromptingTools.MODEL_REGISTRY for a list of available models or define your own with PromptingTools.register_model!.

  • EMBEDDING_DIMENSION: The dimension of the embedding vector. Defaults to 1024 (truncated OpenAI embedding). Set to 0 to use the maximum allowed dimension.

  • LOADED_PACKS: The knowledge packs that are loaded on restart/refresh (load_index!()).

source


# AIHelpMe.RAG_CONFIGURATIONSConstant.
julia
RAG_CONFIGURATIONS

A dictionary of RAG configurations, keyed by a unique symbol (eg, bronze). Each entry contains a dictionary with keys :config and :kwargs, where :config is the RAG configuration object (AbstractRAGConfig) and :kwargs the NamedTuple of corresponding kwargs.

Available Options:

  • :bronze: A simple configuration for a bronze pipeline, using truncated binary embeddings (dimensionality: 1024) and no re-ranking or refinement.

  • :silver: A simple configuration like :bronze, using truncated binary embeddings (dimensionality: 1024), but it also enables a re-ranking step.

  • :gold: A more complex configuration, similar to the simpler pipelines above, but using standard embeddings (dimensionality: 3072, type: Float32). It also leverages re-ranking and refinement with a web search.

source


# AIHelpMe.aihelpMethod.
julia
aihelp([cfg::RT.AbstractRAGConfig, index::RT.AbstractChunkIndex,]
+    

Reference

# AIHelpMe.ALLOWED_PACKSConstant.
julia
ALLOWED_PACKS

Currently available packs are:

  • :julia - Julia documentation, standard library docstrings and a few extras (for Julia v1.10)

  • :tidier - Tidier.jl organization documentation (as of 7th April 2024)

  • :makie - Makie.jl organization documentation (as of 30th March 2024)

source


# AIHelpMe.ALLOWED_PREFERENCESConstant.

Keys that are allowed to be set via set_preferences!

source


# AIHelpMe.LOADED_PACKSConstant.
julia
LOADED_PACKS

The knowledge packs that are currently loaded in the index.

source


# AIHelpMe.PREFERENCESConstant.
julia
PREFERENCES

You can set preferences for AIHelpMe by using set_preferences!. It will create a LocalPreferences.toml file in your current directory and will reload your preferences from there.

Check your preferences by calling get_preferences(key::String).

Available Preferences (for set_preferences!)

  • MODEL_CHAT: The default model to use for aigenerate and most ai* calls. See PromptingTools.MODEL_REGISTRY for a list of available models or define your own with PromptingTools.register_model!.

  • MODEL_EMBEDDING: The default model to use for aiembed (embedding documents). See PromptingTools.MODEL_REGISTRY for a list of available models or define your own with PromptingTools.register_model!.

  • EMBEDDING_DIMENSION: The dimension of the embedding vector. Defaults to 1024 (truncated OpenAI embedding). Set to 0 to use the maximum allowed dimension.

  • LOADED_PACKS: The knowledge packs that are loaded on restart/refresh (load_index!()).

source


# AIHelpMe.RAG_CONFIGURATIONSConstant.
julia
RAG_CONFIGURATIONS

A dictionary of RAG configurations, keyed by a unique symbol (eg, bronze). Each entry contains a dictionary with keys :config and :kwargs, where :config is the RAG configuration object (AbstractRAGConfig) and :kwargs the NamedTuple of corresponding kwargs.

Available Options:

  • :bronze: A simple configuration for a bronze pipeline, using truncated binary embeddings (dimensionality: 1024) and no re-ranking or refinement.

  • :silver: A simple configuration like :bronze, using truncated binary embeddings (dimensionality: 1024), but it also enables a re-ranking step.

  • :gold: A more complex configuration, similar to the simpler pipelines above, but using standard embeddings (dimensionality: 3072, type: Float32). It also leverages re-ranking and refinement with a web search.

source


# AIHelpMe.aihelpMethod.
julia
aihelp([cfg::RT.AbstractRAGConfig, index::RT.AbstractChunkIndex,]
     question::AbstractString;
     verbose::Integer = 1,
     model = MODEL_CHAT,
@@ -38,38 +38,38 @@
 
 question = "How to make a barplot in Makie.jl?"
 result = aihelp(question; search = true, rerank = true, return_all = true)
-pprint(result) # nicer display with sources for each chunk/sentences (look for square brackets)

source


# AIHelpMe.docdata_to_sourceMethod.
julia
docdata_to_source(data::AbstractDict)

Creates a source path from a given DocStr record

source


# AIHelpMe.docextractFunction.
julia
docextract(d::MultiDoc, sep::AbstractString = "

")

Extracts the documentation from a MultiDoc record (separates the individual docs within DocStr with sep)

source


# AIHelpMe.docextractFunction.
julia
docextract(modules::Vector{Module} = Base.Docs.modules)

Extracts the documentation from a vector of modules.

source


# AIHelpMe.docextractFunction.
julia
docextract(d::DocStr, sep::AbstractString = "

")

Extracts the documentation from a DocStr record. Separates the individual docs within DocStr with sep.

source


# AIHelpMe.docextractMethod.
julia
docextract(mod::Module)

Extracts the documentation from a given (loaded) module.

source


# AIHelpMe.find_new_chunksMethod.
julia
find_new_chunks(old_chunks::AbstractVector{<:AbstractString},
-    new_chunks::AbstractVector{<:AbstractString})

Identifies the new chunks in new_chunks that are not present in old_chunks.

Returns a mask of chunks that are new (not present in old_chunks).

Uses SHA256 hashes to dedupe the strings quickly and effectively.

source


# AIHelpMe.get_config_keyFunction.

Returns the configuration key for the given cfg and kwargs to use the relevant artifacts.

source


# AIHelpMe.get_preferencesMethod.
julia
get_preferences(key::String)

Get preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: set_preferences!

Example

julia
AIHelpMe.get_preferences("MODEL_CHAT")

source


# AIHelpMe.last_resultMethod.
julia
last_result()

Returns the RAGResult from the last aihelp call. It can be useful to see the sources/references used by the AI model to generate the response.

If you're using aihelp() make sure to set return_all = true to return the RAGResult.

source


# AIHelpMe.load_index!Function.
julia
load_index!(packs::Vector{Symbol}=LOADED_PACKS[]; verbose::Bool = true, kwargs...)
-load_index!(pack::Symbol; verbose::Bool = true, kwargs...)

Loads one or more packs into the main index from our pre-built artifacts.

Availability of packs might vary depending on your pipeline configuration (ie, whether we have the correct embeddings for it). See AIHelpMe.ALLOWED_PACKS

Example

julia
load_index!(:julia)

Or multiple packs

julia
load_index!([:julia, :makie,:tidier])

source


# AIHelpMe.load_index!Method.
julia
load_index!(file_path::AbstractString;
-    verbose::Bool = true, kwargs...)

Loads the serialized index in file_path into the global variable MAIN_INDEX.

Supports .jls (serialized Julia object) and .hdf5 (HDF5.jl) files.

source


# AIHelpMe.load_index!Method.
julia
load_index!(index::RT.AbstractChunkIndex;
+pprint(result) # nicer display with sources for each chunk/sentences (look for square brackets)

source


# AIHelpMe.docdata_to_sourceMethod.
julia
docdata_to_source(data::AbstractDict)

Creates a source path from a given DocStr record

source


# AIHelpMe.docextractFunction.
julia
docextract(d::MultiDoc, sep::AbstractString = "

")

Extracts the documentation from a MultiDoc record (separates the individual docs within DocStr with sep)

source


# AIHelpMe.docextractFunction.
julia
docextract(modules::Vector{Module} = Base.Docs.modules)

Extracts the documentation from a vector of modules.

source


# AIHelpMe.docextractFunction.
julia
docextract(d::DocStr, sep::AbstractString = "

")

Extracts the documentation from a DocStr record. Separates the individual docs within DocStr with sep.

source


# AIHelpMe.docextractMethod.
julia
docextract(mod::Module)

Extracts the documentation from a given (loaded) module.

source


# AIHelpMe.find_new_chunksMethod.
julia
find_new_chunks(old_chunks::AbstractVector{<:AbstractString},
+    new_chunks::AbstractVector{<:AbstractString})

Identifies the new chunks in new_chunks that are not present in old_chunks.

Returns a mask of chunks that are new (not present in old_chunks).

Uses SHA256 hashes to dedupe the strings quickly and effectively.

source


# AIHelpMe.get_config_keyFunction.

Returns the configuration key for the given cfg and kwargs to use the relevant artifacts.

source


# AIHelpMe.get_preferencesMethod.
julia
get_preferences(key::String)

Get preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: set_preferences!

Example

julia
AIHelpMe.get_preferences("MODEL_CHAT")

source


# AIHelpMe.last_resultMethod.
julia
last_result()

Returns the RAGResult from the last aihelp call. It can be useful to see the sources/references used by the AI model to generate the response.

If you're using aihelp() make sure to set return_all = true to return the RAGResult.

source


# AIHelpMe.load_index!Function.
julia
load_index!(packs::Vector{Symbol}=LOADED_PACKS[]; verbose::Bool = true, kwargs...)
+load_index!(pack::Symbol; verbose::Bool = true, kwargs...)

Loads one or more packs into the main index from our pre-built artifacts.

Availability of packs might vary depending on your pipeline configuration (ie, whether we have the correct embeddings for it). See AIHelpMe.ALLOWED_PACKS

Example

julia
load_index!(:julia)

Or multiple packs

julia
load_index!([:julia, :makie,:tidier])

source


# AIHelpMe.load_index!Method.
julia
load_index!(file_path::AbstractString;
+    verbose::Bool = true, kwargs...)

Loads the serialized index in file_path into the global variable MAIN_INDEX.

Supports .jls (serialized Julia object) and .hdf5 (HDF5.jl) files.

source


# AIHelpMe.load_index!Method.
julia
load_index!(index::RT.AbstractChunkIndex;
     verbose::Bool = 1, kwargs...)

Loads the provided index into the global variable MAIN_INDEX.

If you don't have an index yet, use build_index to build one from your currently loaded packages (see ?build_index)

Example

julia
# build an index from some modules, keep empty to embed all loaded modules (eg, `build_index()`) 
 index = AIH.build_index([DataFramesMeta, DataFrames, CSV])
-AIH.load_index!(index)

source


# AIHelpMe.load_index_hdf5Method.

Hacky function to load a HDF5 file into a ChunkIndex object. Only bare-bone ChunkIndex is supported right now.

source


# AIHelpMe.set_preferences!Method.
julia
set_preferences!(pairs::Pair{String, <:Any}...)

Set preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: get_preferences

Example

Change your API key and default model:

julia
# EMBEDDING_DIMENSION of 0 means the maximum allowed
-AIHelpMe.set_preferences!("MODEL_CHAT" => "llama3", "MODEL_EMBEDDING" => "nomic-embed-text", "EMBEDDING_DIMENSION" => 0)

source


# AIHelpMe.update_indexFunction.
julia
update_index(index::RT.AbstractChunkIndex = MAIN_INDEX[],
+AIH.load_index!(index)

source


# AIHelpMe.load_index_hdf5Method.

Hacky function to load a HDF5 file into a ChunkIndex object. Only bare-bone ChunkIndex is supported right now.

source


# AIHelpMe.set_preferences!Method.
julia
set_preferences!(pairs::Pair{String, <:Any}...)

Set preferences for AIHelpMe. See ?PREFERENCES for more information.

See also: get_preferences

Example

Change your API key and default model:

julia
# EMBEDDING_DIMENSION of 0 means the maximum allowed
+AIHelpMe.set_preferences!("MODEL_CHAT" => "llama3", "MODEL_EMBEDDING" => "nomic-embed-text", "EMBEDDING_DIMENSION" => 0)

source


# AIHelpMe.update_indexFunction.
julia
update_index(index::RT.AbstractChunkIndex = MAIN_INDEX[],
     modules::Vector{Module} = Base.Docs.modules;
     verbose::Integer = 1,
     kwargs...)

Updates the provided index with the documentation of the provided modules.

Deduplicates against the index.sources and embeds only the new document chunks (as measured by a hash).

Returns the updated index (new instance).

For available configurations and customizations, see the corresponding modules and functions of PromptingTools.Experimental.RAGTools (eg, build_index).

Example

If you loaded some new packages and want to add them to your MAIN_INDEX (or any index you use), run:

julia
# To update the MAIN_INDEX as well
 AIHelpMe.update_index() |> AHM.load_index!
 
 # To update an explicit index
-index = AIHelpMe.update_index(index)

source


# AIHelpMe.update_pipeline!Function.
julia
update_pipeline!(option::Symbol = :bronze; model_chat = MODEL_CHAT,
+index = AIHelpMe.update_index(index)

source


# AIHelpMe.update_pipeline!Function.
julia
update_pipeline!(option::Symbol = :bronze; model_chat = MODEL_CHAT,
     model_embedding = MODEL_EMBEDDING, verbose::Bool = true, embedding_dimension::Integer = EMBEDDING_DIMENSION)

Updates the default RAG pipeline to one of the pre-configuration options and sets the requested chat and embedding models.

This is a good way to switch between OpenAI and Ollama models.

See available pipeline options via keys(RAG_CONFIGURATIONS).

Logic:

  • Updates the global MODEL_CHAT and MODEL_EMBEDDING to the requested models.

  • Update the global EMBEDDING_DIMENSION for the requested embedding dimensionality after truncation (embedding_dimension).

  • Updates the global RAG_CONFIG and RAG_KWARGS to the requested option.

  • Updates the global LOADED_CONFIG_KEY to the configuration key for the given option and kwargs (used by the artifact system to download the correct knowledge packs).

Example

julia
update_pipeline!(:bronze; model_chat = "gpt4t")

You don't need to re-load your index if you just change the chat model.

You can also switch the pipeline to Ollama models. Note: only one Ollama embedding model is currently supported! You must select "nomic-embed-text" and, if you do, set embedding_dimension=0 (the maximum available dimension)

julia
update_pipeline!(:bronze; model_chat = "llama3", model_embedding="nomic-embed-text", embedding_dimension=0)
 
 # You must download the corresponding knowledge packs via `load_index!` (because you changed the embedding model)
-load_index!()

source


# PromptingTools.Experimental.RAGTools.build_indexFunction.
julia
RT.build_index(modules::Vector{Module} = Base.Docs.modules; verbose::Int = 1,
-    kwargs...)

Build index from the documentation of the currently loaded modules. If modules is empty, it will use all currently loaded modules.

source


# PromptingTools.Experimental.RAGTools.build_indexMethod.
julia
RT.build_index(mod::Module; verbose::Int = 1, kwargs...)

Build index from the documentation of a given module mod.

source


# AIHelpMe.@aihelp!_strMacro.
julia
aihelp!"user_question"[model_alias] -> AIMessage

The aihelp!"" string macro is used to continue a previous conversation with the AI model.

It appends the new user prompt to the last conversation in the tracked history (in AIHelpMe.CONV_HISTORY) and generates a response based on the entire conversation context. If you want to see the previous conversation, you can access it via AIHelpMe.CONV_HISTORY, which keeps at most the last PromptingTools.MAX_HISTORY_LENGTH conversations.

It does NOT provide new context from the documentation. To do that, start a new conversation with aihelp"<question>".

Arguments

  • user_question (String): The follow up question to be added to the existing conversation.

  • model_alias (optional, any): Specify the model alias of the AI model to be used (see PT.MODEL_ALIASES). If not provided, the default model is used.

Returns

AIMessage corresponding to the new user prompt, considering the entire conversation history.

Example

To continue a conversation:

julia
# start conversation as normal
+load_index!()

source


# PromptingTools.Experimental.RAGTools.build_indexFunction.
julia
RT.build_index(modules::Vector{Module} = Base.Docs.modules; verbose::Int = 1,
+    kwargs...)

Build index from the documentation of the currently loaded modules. If modules is empty, it will use all currently loaded modules.

source


# PromptingTools.Experimental.RAGTools.build_indexMethod.
julia
RT.build_index(mod::Module; verbose::Int = 1, kwargs...)

Build index from the documentation of a given module mod.

source


# AIHelpMe.@aihelp!_strMacro.
julia
aihelp!"user_question"[model_alias] -> AIMessage

The aihelp!"" string macro is used to continue a previous conversation with the AI model.

It appends the new user prompt to the last conversation in the tracked history (in AIHelpMe.CONV_HISTORY) and generates a response based on the entire conversation context. If you want to see the previous conversation, you can access it via AIHelpMe.CONV_HISTORY, which keeps at most the last PromptingTools.MAX_HISTORY_LENGTH conversations.

It does NOT provide new context from the documentation. To do that, start a new conversation with aihelp"<question>".

Arguments

  • user_question (String): The follow up question to be added to the existing conversation.

  • model_alias (optional, any): Specify the model alias of the AI model to be used (see PT.MODEL_ALIASES). If not provided, the default model is used.

Returns

AIMessage corresponding to the new user prompt, considering the entire conversation history.

Example

To continue a conversation:

julia
# start conversation as normal
 aihelp"How to create a dictionary?" 
 
 # ... wait for reply and then react to it:
 
 # continue the conversation (notice that you can change the model, eg, to more powerful one for better answer)
 aihelp!"Can you create it from named tuple?"gpt4t
-# AIMessage("Yes, you can create a dictionary from a named tuple ...")

Usage Notes

  • This macro should be used when you want to maintain the context of an ongoing conversation (ie, the last ai"" message).

  • It automatically accesses and updates the global conversation history.

  • If no conversation history is found, it raises an assertion error, suggesting to initiate a new conversation using ai"" instead.

Important

Ensure that the conversation history is not too long to maintain relevancy and coherence in the AI's responses. The history length is managed by MAX_HISTORY_LENGTH.

source


# AIHelpMe.@aihelp_strMacro.
julia
aihelp"user_question"[model_alias] -> AIMessage

The aihelp"" string macro generates an AI response to a given user question by using aihelp under the hood. It will automatically try to provide the most relevant bits of the documentation (from the index) to the LLM to answer the question.

See also aihelp!"" if you want to reply to the provided message / continue the conversation.

Arguments

  • user_question (String): The question to be answered by the AI model.

  • model_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).

Returns

AIMessage corresponding to the input prompt.

Example

julia
result = aihelp"Hello, how are you?"
+# AIMessage("Yes, you can create a dictionary from a named tuple ...")

Usage Notes

  • This macro should be used when you want to maintain the context of an ongoing conversation (ie, the last ai"" message).

  • It automatically accesses and updates the global conversation history.

  • If no conversation history is found, it raises an assertion error, suggesting to initiate a new conversation using ai"" instead.

Important

Ensure that the conversation history is not too long to maintain relevancy and coherence in the AI's responses. The history length is managed by MAX_HISTORY_LENGTH.

source


# AIHelpMe.@aihelp_strMacro.
julia
aihelp"user_question"[model_alias] -> AIMessage

The aihelp"" string macro generates an AI response to a given user question by using aihelp under the hood. It will automatically try to provide the most relevant bits of the documentation (from the index) to the LLM to answer the question.

See also aihelp!"" if you want to reply to the provided message / continue the conversation.

Arguments

  • user_question (String): The question to be answered by the AI model.

  • model_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).

Returns

AIMessage corresponding to the input prompt.

Example

julia
result = aihelp"Hello, how are you?"
 # AIMessage("Hello! I'm an AI assistant, so I don't have feelings, but I'm here to help you. How can I assist you today?")

If you want to interpolate some variables or additional context, simply use string interpolation:

julia
a=1
 result = aihelp"What is `$a+$a`?"
 # AIMessage("The sum of `1+1` is `2`.")

If you want to use a different model, eg, GPT-3.5 Turbo, you can provide its alias as a flag:

julia
result = aihelp"What is `1.23 * 100 + 1`?"gpt3t
-# AIMessage("The answer is 124.")

source


- +# AIMessage("The answer is 124.")

source


+ \ No newline at end of file