Releases: jamesrochabrun/SwiftOpenAI
SwiftOpenAI v3.9.6
What's Changed
Predicted outputs support:
Usage:
let code = """
ScrollView {
VStack {
textArea
Text(chatProvider.errorMessage)
.foregroundColor(.red)
streamedChatResultView
}
}
"""
let content: ChatCompletionParameters.Message.ContentType = .text("Change this Scrollview to be a list" )
let parameters = ChatCompletionParameters(
messages: [
.init(role: .user, content: content),
.init(role: .user, content: .text(code))],
model: .gpt4o,
prediction: .init(content: .text(code)))
try await openAIService.startChat(parameters: parameters)
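A minimal sketch of reading the result, assuming `startChat` returns a chat completion object whose `choices` expose the message content (verify the property names against the SwiftOpenAI version you use):

```swift
// Sketch: read the first choice from the chat response.
// `choices.first?.message.content` is an assumed shape; confirm it
// against the library's ChatCompletionObject before relying on it.
let completion = try await openAIService.startChat(parameters: parameters)
if let reply = completion.choices.first?.message.content {
   print(reply) // the model's rewritten code
}
```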
Other:
- Update: adds new embeddings models by @macistador in #106
New Contributors
- @macistador made their first contribution in #106
Full Changelog: v3.9.5...v3.9.6
SwiftOpenAI v3.9.5
What's Changed
- Updates from latest OpenAI API. by @jamesrochabrun in #104
  - Added reasoning_effort parameter for o1 models.
  - Added metadata for Chat Completions.
  - Model updates.
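A hedged sketch of the new reasoning-effort parameter, assuming `ChatCompletionParameters` exposes it as `reasoningEffort` mirroring the API's `reasoning_effort` field (the exact property name may differ in your library version):

```swift
// Assumption: `reasoningEffort` mirrors the API's reasoning_effort
// field and accepts "low" | "medium" | "high"; verify the name.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Prove that √2 is irrational."))],
   model: .custom("o1"),
   reasoningEffort: "high")
```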
Full Changelog: v3.9.4...v3.9.5
SwiftOpenAI v3.9.4
SwiftOpenAI v3.9.3
Support for additional parameters; users can now use OpenRouter.
```swift
let service = OpenAIServiceFactory.service(
   apiKey: "${OPENROUTER_API_KEY}",
   baseURL: "https://openrouter.ai",
   proxyPath: "api",
   headers: [
      "HTTP-Referer": "${YOUR_SITE_URL}",
      "X-Title": "${YOUR_SITE_NAME}"
   ])
```
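Once configured, a request can use the `.custom` model case to pass an OpenRouter model identifier as a string, in the same way the Gemini release below does. A sketch (the model name here is illustrative, not prescribed by the release):

```swift
// Sketch: chat through OpenRouter using the `.custom` model case.
// "openai/gpt-4o" is an illustrative OpenRouter model identifier.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Hello"))],
   model: .custom("openai/gpt-4o"))
let completion = try await service.startChat(parameters: parameters)
```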
What's Changed
- Adding additional parameters in API. by @jamesrochabrun in #101
Full Changelog: v3.9.2...v3.9.3
SwiftOpenAI v3.9.2
Gemini
Gemini is now accessible through the OpenAI Library. See the announcement here.
SwiftOpenAI supports all OpenAI endpoints. However, please refer to the Gemini documentation to understand which APIs are currently compatible.
You can instantiate an OpenAIService using your Gemini token like this:
```swift
let geminiAPIKey = "your_api_key"
let baseURL = "https://generativelanguage.googleapis.com"
let version = "v1beta"

let service = OpenAIServiceFactory.service(
   apiKey: geminiAPIKey,
   overrideBaseURL: baseURL,
   overrideVersion: version)
```
You can now create a chat request using the .custom model parameter and pass the model name as a string.
```swift
let parameters = ChatCompletionParameters(
   messages: [.init(
      role: .user,
      content: content)],
   model: .custom("gemini-1.5-flash"))
let stream = try await service.startStreamedChat(parameters: parameters)
```
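A hedged sketch of consuming that stream, assuming each chunk exposes its incremental text at `choices.first?.delta.content` (confirm the chunk type's shape against your SwiftOpenAI version):

```swift
// Sketch: accumulate streamed deltas into the full reply.
// The delta property path is an assumption; verify it in the library.
var reply = ""
for try await chunk in stream {
   if let delta = chunk.choices.first?.delta.content {
      reply += delta
   }
}
print(reply)
```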
SwiftOpenAI v3.9.1
SwiftOpenAI v3.9.0
- Added store parameter for the new Evals framework.
- Added support for Chat Completions Audio generation.
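A hedged sketch of a Chat Completions audio request, assuming `ChatCompletionParameters` exposes `modalities` and an `audio` configuration mirroring the API's `modalities` and `audio` JSON fields (property names and initializers may differ in your library version):

```swift
// Assumptions: `modalities` and `audio` mirror the API's JSON fields,
// and the audio config takes a voice and output format; verify names.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Tell me a short story."))],
   model: .custom("gpt-4o-audio-preview"),
   audio: .init(voice: "alloy", format: "mp3"),
   modalities: ["text", "audio"])
```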
What's Changed
- Chat Completions Audio generation by @jamesrochabrun in #94
Full Changelog: v3.8.2...v3.9.0
SwiftOpenAI v3.8.2
What's Changed
- Fix decoding error triggered by Assistants API responses with data from searching a vector database by @mplawley in #91
- Fixing issue 88 by @jamesrochabrun in #92
- Groq support by @jamesrochabrun in #93
Full Changelog: v3.8.1...v3.8.2