
Releases: jamesrochabrun/PolyAI

PolyAI v2.1.1

29 Dec 08:15

Fixing Chat history for Gemini.

Full Changelog: v2.1.0...v2.1.1

PolyAI v2.0.0

29 Dec 08:49

Removing Gemini library dependency.

Full Changelog: v2.1.1...v2.0.0

v2.1.0 PolyAI

03 Nov 05:54

Full Changelog: 2.0...v2.1.0

Support for Local Models with Ollama

30 Jun 04:59

This release adds support for SwiftOpenAI v3.3 https://github.com/jamesrochabrun/SwiftOpenAI/releases/tag/v3.3

Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.

Ollama

To interact with local models through Ollama's OpenAI-compatible endpoints, use the following configuration setup.

1 - Download Ollama if you don't have it installed already.
2 - Download the model you need, e.g. for llama3, run in the terminal:

ollama pull llama3

Once you have the model installed locally you are ready to use PolyAI!

let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
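
With that configuration, a service is created the same way as for any other provider. A minimal sketch follows; note that the `.ollama` parameter case and its argument labels are assumptions, mirroring the `.gemini` case shown in the Gemini release below:

```swift
import PolyAI

// Configuration pointing at a local Ollama server (default port 11434).
let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
let service = PolyAIServiceFactory.serviceWith([ollamaConfiguration])

// Stream a chat completion from the locally pulled llama3 model.
// The `.ollama` case is an assumption by analogy with the `.gemini` case.
let prompt = "Why is the sky blue?"
let parameters: LLMParameter = .ollama(model: "llama3", messages: [.init(role: .user, content: prompt)], maxTokens: 1000)
let stream = try await service.streamMessage(parameters)
```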

More information can be found here.

Gemini support

05 May 06:18

Adding Gemini support.

public enum LLMConfiguration {

   case openAI(OpenAI)
   
   public enum OpenAI {
      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - apiKey: The API key for authenticating requests to OpenAI.
      ///   - organizationID: Optional organization ID for OpenAI usage.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case api(key: String, organizationID: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - configuration: The AzureOpenAIConfiguration.
      ///   - urlSessionConfiguration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case azure(configuration: AzureOpenAIConfiguration, urlSessionConfiguration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - aiproxyPartialKey: The partial key provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - aiproxyDeviceCheckBypass: The bypass token that is provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case aiProxy(aiproxyPartialKey: String, aiproxyDeviceCheckBypass: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
   }
   
   /// Configuration for accessing Anthropic's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Anthropic.
   ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
   case anthropic(apiKey: String, configuration: URLSessionConfiguration = .default)
   
   /// Configuration for accessing Gemini's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Gemini.
   case gemini(apiKey: String) // ✨ New
}
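
For completeness, the simpler cases above can be constructed directly; this sketch follows the case signatures in the enum and the factory call shown below (the placeholder key strings are, of course, yours to supply):

```swift
import PolyAI

// Direct OpenAI access with an API key.
let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))

// Anthropic access with an API key.
let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")

// A single service can be built from multiple provider configurations.
let service = PolyAIServiceFactory.serviceWith([openAIConfiguration, anthropicConfiguration])
```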

Gemini Configuration:

let geminiConfiguration: LLMConfiguration = .gemini(apiKey: "your_gemini_api_key_here")
let service = PolyAIServiceFactory.serviceWith([geminiConfiguration])

Message Stream With Gemini

let prompt = "How are you today?"
let parameters: LLMParameter = .gemini(model: "gemini-pro", messages: [.init(role: .user, content: prompt)], maxTokens: 2000)
let stream = try await service.streamMessage(parameters)
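
The returned stream can then be consumed with `for try await`. A sketch, assuming each streamed chunk exposes an optional text delta (the `content` property name here is an assumption; adapt it to the actual chunk type):

```swift
var reply = ""
for try await chunk in stream {
   // `content` is an assumed property name for the incremental text delta.
   if let delta = chunk.content {
      reply += delta
      print(delta, terminator: "")
   }
}
```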

OpenAI support for Azure and AIProxy

17 Apr 06:02

Adding LLMConfiguration definition to allow usage of Azure and AIProxy within PolyAI.

public enum LLMConfiguration {

   case openAI(OpenAI)
   
   public enum OpenAI {
      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - apiKey: The API key for authenticating requests to OpenAI.
      ///   - organizationID: Optional organization ID for OpenAI usage.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case api(key: String, organizationID: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - configuration: The AzureOpenAIConfiguration.
      ///   - urlSessionConfiguration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case azure(configuration: AzureOpenAIConfiguration, urlSessionConfiguration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - aiproxyPartialKey: The partial key provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - aiproxyDeviceCheckBypass: The bypass token that is provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case aiProxy(aiproxyPartialKey: String, aiproxyDeviceCheckBypass: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
   }
   
   /// Configuration for accessing Anthropic's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Anthropic.
   ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
   case anthropic(apiKey: String, configuration: URLSessionConfiguration = .default)
}

Usage

Azure:

let azureConfiguration: LLMConfiguration = .openAI(.azure(configuration: .init(resourceName: "YOUR_RESOURCE_NAME", openAIAPIKey: .apiKey("YOUR_API_KEY"), apiVersion: "THE_API_VERSION")))

AIProxy

let aiProxyConfiguration: LLMConfiguration = .openAI(.aiProxy(aiproxyPartialKey: "hardcode_partial_key_here", aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"))
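
Either configuration is then passed to the factory. Since `serviceWith` (as shown in the Gemini release above) takes an array, a sketch combining both in one service, assuming the factory routes requests per configured provider:

```swift
let service = PolyAIServiceFactory.serviceWith([azureConfiguration, aiProxyConfiguration])
```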

PolyAI 🚀

16 Apr 05:04
  • Support for Message with OpenAI and Anthropic APIs
  • Stream Message.
  • README