Below you will find code samples that demonstrate the usage of many Semantic Kernel features.

You can run these samples as tests from your IDE or from the command line. To run a single test from the command line, run the following command from the root of the Concepts project:

```shell
dotnet test -l "console;verbosity=detailed" --filter "FullyQualifiedName=NameSpace.TestClass.TestMethod"
```
For example, to run the `ChatPromptSync` test in the `ChatCompletion/OpenAI_ChatCompletion.cs` file:

```shell
dotnet test -l "console;verbosity=detailed" --filter "FullyQualifiedName=ChatCompletion.OpenAI_ChatCompletion.ChatPromptSync"
```
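Besides the exact-match filter shown above, `dotnet test --filter` also supports a `~` (contains) operator, which is convenient for running a whole sample class or a family of related samples at once. A short sketch (the class and keyword below are only illustrative examples):

```shell
# Run all sample tests in one class by matching a substring of the
# fully qualified name with the '~' (contains) operator:
dotnet test -l "console;verbosity=detailed" --filter "FullyQualifiedName~ChatCompletion.OpenAI_ChatCompletion"

# Run every sample whose fully qualified name contains a keyword,
# across all classes (here: all streaming samples):
dotnet test -l "console;verbosity=detailed" --filter "FullyQualifiedName~Streaming"
```

Note that most samples require API keys or service endpoints to be configured before they will pass.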
## Agents - Different ways of using Agents
- ComplexChat_NestedShopper
- Legacy_AgentAuthoring
- Legacy_AgentCharts
- Legacy_AgentCollaboration
- Legacy_AgentDelegation
- Legacy_AgentTools
- Legacy_Agents
- Legacy_ChatCompletionAgent
- MixedChat_Agents
- OpenAIAssistant_ChartMaker
- OpenAIAssistant_CodeInterpreter
- OpenAIAssistant_Retrieval
## AudioToText - Different ways of using AudioToText services to extract text from audio

## FunctionCalling - Examples of function calling with chat completion models
- Gemini_FunctionCalling
- FunctionCalling
- NexusRaven_HuggingFaceTextGeneration
- MultipleFunctionsVsParameters
## ChatCompletion - Examples using ChatCompletion messaging-capable services with models
- AzureAIInference_ChatCompletion
- AzureAIInference_ChatCompletionStreaming
- AzureOpenAI_ChatCompletion
- AzureOpenAI_ChatCompletionStreaming
- AzureOpenAI_CustomClient
- AzureOpenAIWithData_ChatCompletion
- ChatHistoryAuthorName
- ChatHistorySerialization
- Connectors_CustomHttpClient
- Connectors_KernelStreaming
- Connectors_WithMultipleLLMs
- Google_GeminiChatCompletion
- Google_GeminiChatCompletionStreaming
- Google_GeminiGetModelResult
- Google_GeminiVision
- OpenAI_ChatCompletion
- OpenAI_ChatCompletionStreaming
- OpenAI_ChatCompletionWithVision
- OpenAI_CustomClient
- OpenAI_UsingLogitBias
- OpenAI_FunctionCalling
- OpenAI_ReasonedFunctionCalling
- MultipleProviders_ChatHistoryReducer
- MistralAI_ChatPrompt
- MistralAI_FunctionCalling
- MistralAI_StreamingFunctionCalling
- Onnx_ChatCompletion
- Onnx_ChatCompletionStreaming
- Ollama_ChatCompletion
- Ollama_ChatCompletionStreaming

## Filtering - Different ways of filtering
- AutoFunctionInvocationFiltering
- FunctionInvocationFiltering
- Legacy_KernelHooks
- MaxTokensWithFilters
- PIIDetectionWithFilters
- PromptRenderFiltering
- RetryWithFilters
- TelemetryWithFilters

## Functions - Invoking Method or Prompt functions with Kernel
- Arguments
- FunctionResult_Metadata
- FunctionResult_StronglyTyped
- MethodFunctions
- MethodFunctions_Advanced
- MethodFunctions_Types
- MethodFunctions_Yaml
- PromptFunctions_Inline
- PromptFunctions_MultipleArguments
## ImageToText - Using ImageToText services to describe images
## Memory - Using AI Memory concepts
- Ollama_EmbeddingGeneration
- Onnx_EmbeddingGeneration
- HuggingFace_EmbeddingGeneration
- MemoryStore_CustomReadOnly
- SemanticTextMemory_Building
- TextChunkerUsage
- TextChunkingAndEmbedding
- TextMemoryPlugin_GeminiEmbeddingGeneration
- TextMemoryPlugin_MultipleMemoryStore
- TextMemoryPlugin_RecallJsonSerializationWithOptions
- VectorStore_DataIngestion_Simple: A simple example of how to do data ingestion into a vector store when getting started.
- VectorStore_DataIngestion_MultiStore: An example of data ingestion that uses the same code to ingest into multiple vector store types.
- VectorStore_DataIngestion_CustomMapper: An example that shows how to use a custom mapper when your data model and storage model don't match.
- VectorStore_GenericDataModel_Interop: An example that shows how you can use the built-in, generic data model from Semantic Kernel to read and write to a Vector Store.
- VectorStore_ConsumeFromMemoryStore_AzureAISearch: An example that shows how you can use the AzureAISearchVectorStore to consume data that was ingested using the AzureAISearchMemoryStore.
- VectorStore_ConsumeFromMemoryStore_Qdrant: An example that shows how you can use the QdrantVectorStore to consume data that was ingested using the QdrantMemoryStore.
- VectorStore_ConsumeFromMemoryStore_Redis: An example that shows how you can use the RedisVectorStore to consume data that was ingested using the RedisMemoryStore.
- VectorStore_MigrateFromMemoryStore_Redis: An example that shows how you can use the RedisMemoryStore and RedisVectorStore to migrate data to a new schema.
- VectorStore_Langchain_Interop_AzureAISearch: An example that shows how you can use the AzureAISearch Vector Store to consume data that was ingested using Langchain.
- VectorStore_Langchain_Interop_Qdrant: An example that shows how you can use the Qdrant Vector Store to consume data that was ingested using Langchain.
- VectorStore_Langchain_Interop_Redis: An example that shows how you can use the Redis Vector Store to consume data that was ingested using Langchain.
## Plugins - Different ways of creating and using Plugins
- ApiManifestBasedPlugins
- ConversationSummaryPlugin
- CreatePluginFromOpenApiSpec_Github
- CreatePluginFromOpenApiSpec_Jira
- CreatePluginFromOpenApiSpec_Klarna
- CreatePluginFromOpenApiSpec_RepairService
- CustomMutablePlugin
- DescribeAllPluginsAndFunctions
- GroundednessChecks
- ImportPluginFromGrpc
- TransformPlugin
## PromptTemplates - Using templates with parametrization for prompt rendering
- ChatCompletionPrompts
- ChatWithPrompts
- LiquidPrompts
- MultiplePromptTemplates
- PromptFunctionsWithChatGPT
- TemplateLanguage
- PromptyFunction
- HandlebarsVisionPrompts
- SafeChatPrompts
- ChatLoopWithPrompt