DeepSwiftSeek
🚨 Due to current server resource constraints, DeepSeek has temporarily suspended API service recharges to prevent any potential impact on users' operations. Existing balances can still be used for API calls.
DeepSwiftSeek is a lightweight and efficient Swift client for the DeepSeek API. It supports chat completion, streaming, error handling, and configuring DeepSeek LLMs with advanced parameters.
- Supports chat completion requests
- Supports fill-in-the-middle (FIM) completion requests
- Handles error responses with detailed error descriptions and recovery suggestions
- Streams responses for both chat and FIM completions
- Built-in support for different models and advanced parameters
- Fetches the user balance and the list of available LLM models
- Uses Swift concurrency (async/await) for network calls
To integrate DeepSwiftSeek into your project, you can use Swift Package Manager (SPM):
let package = Package(
    dependencies: [
        .package(url: "https://github.com/tornikegomareli/DeepSwiftSeek.git", exact: "0.0.2")
    ]
)
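If your project has multiple targets, the package product must also be added to the target that imports it. A minimal sketch of a complete `Package.swift` follows — the package name `MyApp` is hypothetical, and the product name is assumed to match the package name `DeepSwiftSeek`:

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyApp", // hypothetical package name
    dependencies: [
        .package(url: "https://github.com/tornikegomareli/DeepSwiftSeek.git", exact: "0.0.2")
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            // Product name assumed to match the package name.
            dependencies: [.product(name: "DeepSwiftSeek", package: "DeepSwiftSeek")]
        )
    ]
)
```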
Or add it via Xcode:
- Open your project in Xcode.
- Navigate to File > Swift Packages > Add Package Dependency.
- Enter the repository URL.
- Choose the latest version and click Next.
import DeepSwiftSeek
let configuration = Configuration(apiKey: "YOUR_API_KEY")
let deepSeekClient = DeepSeekClient(configuration: configuration)
Task {
    do {
        let response = try await deepSeekClient.chatCompletions(
            messages: {
                ChatMessageRequest(role: .user, content: "Tell me a joke.", name: "User")
            },
            model: .deepSeekChat,
            parameters: .creative
        )
        print(response.choices.first?.message.content ?? "No response")
    } catch {
        print("Error: \(error.localizedDescription)")
    }
}
Task {
    do {
        let stream = try await deepSeekClient.chatCompletionStream(
            messages: {
                ChatMessageRequest(role: .user, content: "Write a poem.", name: "User")
            },
            model: .deepSeekChat,
            parameters: .streaming
        )
        for try await chunk in stream {
            print(chunk) // Prints streamed responses
        }
    } catch {
        print("Streaming error: \(error.localizedDescription)")
    }
}
Task {
    do {
        let stream = try await deepSeekClient.fimCompletionStream(
            messages: {
                [
                    ChatMessageRequest(
                        role: .user,
                        content: "function greet() {\n /* FIM_START */\n /* FIM_END */\n return 'Hello world';\n}",
                        name: "User"
                    )
                ]
            },
            model: .deepSeekReasoner,
            parameters: .streaming
        )
        for try await chunk in stream {
            // Each chunk is a streamed part of the fill-in-the-middle response.
            print("FIM Stream Chunk:\n\(chunk)")
        }
    } catch {
        print("FIM Streaming Error: \(error.localizedDescription)")
    }
}
Task {
    do {
        let response = try await deepSeekClient.fimCompletions(
            messages: {
                [
                    ChatMessageRequest(
                        role: .user,
                        content: "function greet() {\n // FIM_START\n // FIM_END\n return 'Hello world';\n}",
                        name: "User"
                    )
                ]
            },
            model: .deepSeekReasoner,
            parameters: .creative
        )
        if let content = response.choices.first?.message.content {
            print("FIM Completion:\n\(content)")
        }
    } catch {
        print("FIM Error: \(error.localizedDescription)")
    }
}
Task {
    do {
        let response = try await deepSeekClient.listModels()
        print(response) // Prints the models currently available through the API
    } catch {
        print("ListModels Error: \(error.localizedDescription)")
    }
}
Task {
    do {
        let response = try await deepSeekClient.fetchUserBalance()
        print(response) // Prints the remaining balance for the account
    } catch {
        print("UserBalance Error: \(error.localizedDescription)")
    }
}
The SDK provides detailed error handling:
do {
    // ... any SDK call, e.g. chatCompletions
} catch let error as DeepSeekError {
    print("DeepSeek API Error: \(error.localizedDescription)")
    print("Recovery Suggestion: \(error.recoverySuggestion ?? "None")")
} catch {
    print("Unexpected error: \(error)")
}
DeepSeek SDK supports multiple models:
public enum DeepSeekModel: String {
    case deepSeekChat = "deepseek-chat"
    case deepSeekReasoner = "deepseek-reasoner"
}
You can configure chat completion parameters:
let parameters = ChatParameters(
    frequencyPenalty: 0.5,
    maxTokens: 512,
    presencePenalty: 0.5,
    temperature: 0.7,
    topP: 0.9
)
Mode | Temperature | Max Tokens | Top P |
---|---|---|---|
Creative | 0.9 | 2048 | 0.9 |
Focused | 0.3 | 2048 | 0.3 |
Streaming | 0.7 | 4096 | 0.9 |
Code Generation | 0.2 | 2048 | 0.95 |
Concise | 0.5 | 256 | 0.5 |
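Assuming the presets in the table are exposed as static properties on `ChatParameters` (the `.creative` and `.streaming` spellings appear in the examples above; `.codeGeneration` and the other camelCase names are an assumption), selecting one is a one-liner:

```swift
Task {
    do {
        let response = try await deepSeekClient.chatCompletions(
            messages: {
                ChatMessageRequest(role: .user, content: "Write a quicksort in Swift.", name: "User")
            },
            model: .deepSeekChat,
            // .codeGeneration is an assumed camelCase spelling of the
            // "Code Generation" preset from the table above.
            parameters: .codeGeneration
        )
        print(response.choices.first?.message.content ?? "No response")
    } catch {
        print("Error: \(error.localizedDescription)")
    }
}
```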
If you need specific configurations, you can define your own parameter presets:
extension ChatParameters {
    static let myCustomPreset = ChatParameters(
        frequencyPenalty: 0.4,
        maxTokens: 1024,
        presencePenalty: 0.6,
        temperature: 0.8,
        topP: 0.85
    )
}
Then use it in your requests:
let parameters = ChatParameters.myCustomPreset
This approach allows you to maintain reusable configurations tailored to different needs.
DeepSeek SDK has built-in error handling for various API failures:
Error Type | Description |
---|---|
invalidFormat | Invalid request body format. |
authenticationFailed | Incorrect API key. |
insufficientBalance | No balance remaining. |
rateLimitReached | Too many requests sent. |
serverOverloaded | High traffic on the server. |
encodingError | Failed to encode the request body. |
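If `DeepSeekError` is an enum with cases named as in the table (an assumption based on the table, not verified against the source), individual failures can be handled with a `switch`:

```swift
Task {
    do {
        _ = try await deepSeekClient.fetchUserBalance()
    } catch let error as DeepSeekError {
        // Case names assumed to match the table above.
        switch error {
        case .authenticationFailed:
            print("Check that your API key is correct.")
        case .rateLimitReached, .serverOverloaded:
            print("Back off and retry later.")
        default:
            print("DeepSeek API Error: \(error.localizedDescription)")
        }
    } catch {
        print("Unexpected error: \(error)")
    }
}
```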
- Improve documentation with more examples
- Full SwiftUI demo covering chat, history, and reasoning
- Reasoning model + OpenAI SDK
This project is available under the MIT License.
This SDK is not affiliated with DeepSeek; it is an independent implementation for interacting with their API.