Releases: jamesrochabrun/PolyAI
PolyAI v2.1.1
Fixing chat history for Gemini.
Full Changelog: v2.1.0...v2.1.1
PolyAI v2.0.0
Removing Gemini library dependency.
Full Changelog: v2.1.1...v2.0.0
v2.1.0 PolyAI
Full Changelog: 2.0...v2.1.0
Support for Local Models with Ollama
This release adds support for SwiftOpenAI v3.3: https://github.com/jamesrochabrun/SwiftOpenAI/releases/tag/v3.3
Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.
Ollama
To interact with local models through Ollama's OpenAI-compatible endpoints, use the following configuration.
1 - Download Ollama if you don't have it installed already.
2 - Download the model you need, e.g., for llama3, type in the terminal:
ollama pull llama3
Once you have the model installed locally, you are ready to use PolyAI!
let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
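With that configuration in place, creating a service and streaming a reply from the local llama3 model could look like the sketch below; the `.ollama` parameter case and the chunk handling are assumptions modeled on the other examples in these notes, not a confirmed API.

import PolyAI

// Point PolyAI at the local Ollama server (OpenAI-compatible endpoint).
let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
let service = PolyAIServiceFactory.serviceWith([ollamaConfiguration])

// Stream a reply from the locally installed llama3 model.
// Note: the `.ollama` LLMParameter case is an assumption; use whichever
// parameter case your PolyAI version exposes for local models.
let prompt = "How are you today?"
let parameters: LLMParameter = .ollama(model: "llama3", messages: [.init(role: .user, content: prompt)], maxTokens: 1000)
let stream = try await service.streamMessage(parameters)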
More information can be found here.
Gemini support
Adding Gemini support.
public enum LLMConfiguration {

   case openAI(OpenAI)

   public enum OpenAI {

      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - apiKey: The API key for authenticating requests to OpenAI.
      ///   - organizationID: Optional organization ID for OpenAI usage.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case api(key: String, organizationID: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI's API through Azure.
      /// - Parameters:
      ///   - configuration: The AzureOpenAIConfiguration.
      ///   - urlSessionConfiguration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case azure(configuration: AzureOpenAIConfiguration, urlSessionConfiguration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI's API through AIProxy.
      /// - Parameters:
      ///   - aiproxyPartialKey: The partial key provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - aiproxyDeviceCheckBypass: The bypass token provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case aiProxy(aiproxyPartialKey: String, aiproxyDeviceCheckBypass: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
   }

   /// Configuration for accessing Anthropic's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Anthropic.
   ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
   case anthropic(apiKey: String, configuration: URLSessionConfiguration = .default)

   /// Configuration for accessing Gemini's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Gemini.
   case gemini(apiKey: String) // ✨ New
}
Gemini Configuration:
let geminiConfiguration: LLMConfiguration = .gemini(apiKey: "your_gemini_api_key_here")
let service = PolyAIServiceFactory.serviceWith([geminiConfiguration])
Message Stream With Gemini
let prompt = "How are you today?"
let parameters: LLMParameter = .gemini(model: "gemini-pro", messages: [.init(role: .user, content: prompt)], maxTokens: 2000)
let stream = try await service.streamMessage(parameters)
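The stream can then be consumed incrementally. In the sketch below, the chunk's `content` property is an assumption; check the stream response type exposed by your PolyAI version.

// Accumulate the streamed deltas into a single reply string.
var reply = ""
for try await chunk in stream {
   // `content` is assumed to be the optional text delta carried by each chunk.
   if let delta = chunk.content {
      reply += delta
   }
}
print(reply)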
OpenAI support for Azure and AIProxy
Adding an LLMConfiguration definition to allow usage of Azure and AIProxy within PolyAI.
public enum LLMConfiguration {

   case openAI(OpenAI)

   public enum OpenAI {

      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - apiKey: The API key for authenticating requests to OpenAI.
      ///   - organizationID: Optional organization ID for OpenAI usage.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case api(key: String, organizationID: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI's API through Azure.
      /// - Parameters:
      ///   - configuration: The AzureOpenAIConfiguration.
      ///   - urlSessionConfiguration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case azure(configuration: AzureOpenAIConfiguration, urlSessionConfiguration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI's API through AIProxy.
      /// - Parameters:
      ///   - aiproxyPartialKey: The partial key provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - aiproxyDeviceCheckBypass: The bypass token provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case aiProxy(aiproxyPartialKey: String, aiproxyDeviceCheckBypass: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
   }

   /// Configuration for accessing Anthropic's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Anthropic.
   ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
   case anthropic(apiKey: String, configuration: URLSessionConfiguration = .default)
}
Usage
Azure:
let azureConfiguration: LLMConfiguration = .openAI(.azure(configuration: .init(resourceName: "YOUR_RESOURCE_NAME", openAIAPIKey: .apiKey("YOUR_API_KEY"), apiVersion: "THE_API_VERSION")))
AIProxy:
let aiProxyConfiguration: LLMConfiguration = .openAI(.aiProxy(aiproxyPartialKey: "hardcode_partial_key_here", aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"))
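Either configuration is then handed to the factory the same way as the other providers. In the sketch below, the `.openAI` parameter case and the model value are assumptions that follow the pattern of the Gemini example above.

// Build a service backed by the Azure configuration (the AIProxy one works the same way).
let service = PolyAIServiceFactory.serviceWith([azureConfiguration])

// Stream a chat completion through the configured backend.
// The `.openAI` LLMParameter case and the model identifier are assumptions.
let prompt = "How are you today?"
let parameters: LLMParameter = .openAI(model: .gpt4, messages: [.init(role: .user, content: prompt)], maxTokens: 1000)
let stream = try await service.streamMessage(parameters)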
PolyAI 🚀
- Support for messages with the OpenAI and Anthropic APIs (see the sketch below).
- Message streaming.
- README
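For reference, a minimal sketch of what this first release enables, with both providers behind a single service; the configuration shapes, parameter cases, and model identifiers here are assumptions modeled on the later examples above, not the exact initial API.

import PolyAI

// Register both providers with one service.
let configurations: [LLMConfiguration] = [
   .openAI(.api(key: "your_openai_api_key_here")),
   .anthropic(apiKey: "your_anthropic_api_key_here")
]
let service = PolyAIServiceFactory.serviceWith(configurations)

// The same streaming call targets whichever provider the parameters name.
// Parameter cases and model values below are assumptions.
let prompt = "How are you today?"
let openAIParameters: LLMParameter = .openAI(model: .gpt4, messages: [.init(role: .user, content: prompt)], maxTokens: 1000)
let anthropicParameters: LLMParameter = .anthropic(model: .claude3Sonnet, messages: [.init(role: .user, content: prompt)], maxTokens: 1000)

let openAIStream = try await service.streamMessage(openAIParameters)
let anthropicStream = try await service.streamMessage(anthropicParameters)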