# SwiftAI

A modern, type-safe Swift library for building AI-powered apps. SwiftAI provides a unified API that works seamlessly across different AI models, from Apple's on-device models to cloud-based services like OpenAI.

## ✨ Features

- 🤖 **Model Agnostic**: Unified API across Apple's on-device models, OpenAI, Anthropic, and custom backends
- 🎯 **Structured Output**: Strongly typed structured outputs with compile-time validation
- 🔧 **Agent Tool Loop**: First-class support for tool use
- 💬 **Conversations**: Stateful chat sessions with automatic context management
- 🏗️ **Extensible**: Plugin architecture for custom models and tools
- ⚡ **Swift-Native**: Built with async/await and modern Swift concurrency

## 🚀 Quick Start

```swift
import SwiftAI

let llm = SystemLLM()
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"
```

## 📦 Installation

### Swift Package Manager

Xcode:

1. Go to File → Add Package Dependencies
2. Enter: `https://github.com/mi12labs/SwiftAI`
3. Click Add Package

Package.swift:

```swift
dependencies: [
  // `from:` expects a semantic version, so track the branch instead.
  .package(url: "https://github.com/mi12labs/SwiftAI", branch: "main")
]
```

## 📚 Getting Started

### 🚀 Step 1: Your First AI Query

Start with the simplest possible example: ask a question and get an answer.

```swift
import SwiftAI

// Initialize Apple's on-device language model.
let llm = SystemLLM()

// Ask a question and get a response.
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"
```

**What just happened?**
- `SystemLLM()` creates Apple's on-device AI model
- `reply(to:)` sends your question and returns a `String` by default
- `try await` handles the asynchronous AI processing
- The response is wrapped in a response object; use `.content` to get the actual text

### 📊 Step 2: Structured Responses

Instead of getting plain text, let's get structured data that your app can use directly:

```swift
// Define the structure you want back.
@Generable
struct CityInfo {
  let name: String
  let country: String
  let population: Int
}

let response = try await llm.reply(
  to: "Tell me about Tokyo",
  returning: CityInfo.self // Tell the LLM what to output.
)

let cityInfo = response.content
print(cityInfo.name)       // "Tokyo"
print(cityInfo.country)    // "Japan"
print(cityInfo.population) // 13960000
```

**What's new here?**

- `@Generable` tells SwiftAI this struct can be generated by AI
- `returning: CityInfo.self` specifies you want structured data, not a string
- SwiftAI automatically converts the AI's response into your struct
- No JSON parsing required!

> 💡 **Key Concept: Type-Safe AI**
> SwiftAI ensures the AI returns data in exactly the format your code expects. If the AI can't generate valid data, you get an error instead of broken data.

### 🛠️ Step 3: Tool Use

Let your AI call functions in your app to get real-time information:

```swift
// Create a tool the AI can use.
struct WeatherTool: Tool {
  let description = "Get current weather for a city"

  @Generable
  struct Arguments {
    let city: String
  }

  func call(arguments: Arguments) async throws -> String {
    // Your weather API logic here.
    return "It's 72°F and sunny in \(arguments.city)"
  }
}

// Use the tool with your AI.
let weatherTool = WeatherTool()
let response = try await llm.reply(
  to: "What's the weather like in San Francisco?",
  tools: [weatherTool]
)
print(response.content) // "Based on current data, it's 72°F and sunny in San Francisco"
```

**What's new here?**

- The `Tool` protocol lets you create functions the AI can call
- The `Arguments` struct defines what parameters your tool needs (also `@Generable`)
- The AI automatically decides when to call your tool
- You get back a natural-language response that incorporates the tool's data

> 💡 **Key Concept: AI Function Calling**
> The AI reads your tool's description and automatically decides whether to call it. You don't manually trigger tools; the AI does it when needed.

### 🔄 Step 4: Model Switching

Different AI models have different strengths. SwiftAI makes switching seamless:

```swift
// Choose your model based on availability.
let llm: any LLM = {
  let systemLLM = SystemLLM()
  return systemLLM.isAvailable
    ? systemLLM
    : OpenaiLLM(apiKey: "your-api-key")
}()

// The same code works with any model.
let response = try await llm.reply(to: "Write a haiku about Berlin.")
print(response.content)
```

**What's new here?**

- `SystemLLM` runs on-device (private, fast, free)
- `OpenaiLLM` uses the cloud (more capable, requires an API key)
- `isAvailable` checks whether the on-device model is ready
- The same `reply()` method works with any LLM

> 💡 **Key Concept: Model-Agnostic API**
> Your code doesn't change when you switch models. This lets you optimize for different scenarios (privacy, capabilities, cost) without rewriting your app.

### 💬 Step 5: Conversations

For multi-turn conversations, use `Chat` to maintain context across messages:

```swift
// Create a chat with tools.
let chat = try Chat(with: llm, tools: [weatherTool])

// Have a conversation.
let greeting = try await chat.send("Hello! I'm planning a trip.")
let advice = try await chat.send("What should I pack for Seattle?")
// The AI remembers context from previous messages.
```

**What's new here?**

- `Chat` maintains conversation history automatically
- `send()` is like `reply()` but remembers previous messages
- Tools work in conversations too
- The AI remembers context from earlier in the conversation

> 💡 **Key Concept: Stateful vs. Stateless**
> `reply()` is stateless: each call is independent. `Chat` is stateful: it builds on the previous conversation.

### 🎯 Step 6: Advanced Constraints

Add validation rules and descriptions to guide AI generation:

```swift
@Generable
struct UserProfile {
  @Guide(
    description: "A valid username starting with a letter",
    .pattern("^[a-zA-Z][a-zA-Z0-9_]{2,}$")
  )
  let username: String

  @Guide(
    description: "User age in years",
    .minimum(13),
    .maximum(120)
  )
  let age: Int

  @Guide(
    description: "One to three favorite colors",
    .minimumCount(1),
    .maximumCount(3)
  )
  let favoriteColors: [String]
}
```

**What's new here?**

- `@Guide` adds constraints and descriptions to fields, which help the LLM generate good content
- `.pattern()` tells the LLM to follow a regex
- `.minimum()` and `.maximum()` constrain numbers
- `.minimumCount()` and `.maximumCount()` control array sizes

> 💡 **Key Concept: Validated Generation**
> Constraints ensure the AI follows your business rules.
## 🎯 Quick Reference

| What You Want        | What To Use            | Example                                      |
| -------------------- | ---------------------- | -------------------------------------------- |
| Simple text response | `reply(to:)`           | `reply(to: "Hello")`                         |
| Structured data      | `reply(to:returning:)` | `reply(to: "...", returning: MyStruct.self)` |
| Function calling     | `reply(to:tools:)`     | `reply(to: "...", tools: [myTool])`          |
| Conversation         | `Chat`                 | `chat.send("Hello")`                         |
| Model switching      | `any LLM`              | `SystemLLM()` or `OpenaiLLM()`               |

## 🔧 Supported Models

| Model       | Type        | Privacy     | Capabilities | Cost        |
| ----------- | ----------- | ----------- | ------------ | ----------- |
| `SystemLLM` | On-device   | 🔒 Private  | Good         | 🆓 Free     |
| `OpenaiLLM` | Cloud API   | ⚠️ Shared   | Excellent    | 💰 Paid     |
| `CustomLLM` | Your choice | Your choice | Your choice  | Your choice |

## 📖 Examples

TODO: Add example projects.

## ⚡ Feature Parity Status vs. FoundationModels SDK

| Feature                      | Status    |
| ---------------------------- | --------- |
| Streaming responses          | ❌ #issue |
| Model prewarming             | ❌ #issue |
| Structured outputs for enums | ❌ #issue |

## 🤝 Contributing

We welcome contributions! Please read our Contributing Guidelines.

### Development Setup

```bash
git clone https://github.com/your-org/SwiftAI.git
cd SwiftAI
swift build
swift test
```

## 📄 License

SwiftAI is released under the MIT License. See LICENSE for details.

## ⚠️ Alpha ⚠️

SwiftAI is alpha 🚧 – rough edges and breaking changes are expected.

Built with ❤️ for the Swift community