The GenAI module provides AI integration capabilities for Soul, supporting multiple AI providers including OpenAI, Anthropic, and Google Gemini. It enables easy AI-powered automation with conversation management, prompt templating, and advanced features like rate limiting and fallback handling.
```
ai = GenAI.chat("openai")
    .model("gpt-3.5-turbo")
    .configure({ api_key: "your-api-key", max_tokens: 150 })

response = ai.invoke("Hello, how are you?")
println(response.text)
```
```
conversation = GenAI.createConversation()

// Add messages to conversation
conversation.addSystemMessage("You are a helpful assistant.")
conversation.addUserMessage("What is the capital of France?")

// Get AI response with conversation context
ai = GenAI.chat("openai").model("gpt-3.5-turbo")
response = ai.invokeConversation(conversation)

// Add AI response to conversation
conversation.addAssistantMessage(response.text)

// Continue conversation
conversation.addUserMessage("What about Germany?")
response = ai.invokeConversation(conversation)
```
```
conversation = GenAI.createConversation()

// Add various message types
conversation.addSystemMessage("You are a coding assistant.")
conversation.addUserMessage("How do I create a function in Soul?")

// Get conversation info
messageCount = conversation.length()
messages = conversation.getMessages()
lastUserMessage = conversation.getLastUserMessage()

// Summarize conversation
summary = conversation.summarize()
println(summary) // "Conversation with 1 user messages and 0 assistant messages"
```
```
// Register a template
GenAI.registerTemplate("greeting", "Hello {name}! How can I help you with {topic}?")

// Use registered template
template = GenAI.getTemplate("greeting")
message = template.render({ name: "Bob", topic: "programming" })

ai = GenAI.chat("openai").model("gpt-3.5-turbo")
response = ai.invoke(message)
```
```
text = "This is a sample text for token counting"
model = "gpt-3.5-turbo"
tokenCount = GenAI.countTokens(text, model)
println("Estimated tokens: " + tokenCount)
```
```
rawText = " This has extra spaces \n\n and newlines "
cleanedText = GenAI.preprocessText(rawText)
println(cleanedText) // "This has extra spaces and newlines"
```
```
prompts = [
    "What is the capital of France?",
    "What is the capital of Germany?",
    "What is the capital of Italy?"
]
batch = GenAI.createBatch(prompts)
println("Batch ID: " + batch.getId())
println("Estimated tokens: " + batch.getEstimatedTokens())
```
- Validate responses: check for errors before processing a result
- Manage conversations: track context across multi-turn chats
- Monitor token usage: keep track of API costs
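A minimal sketch that ties these practices together, built from the conversation and token-counting APIs shown above. The `response.isError()` check is an assumption for illustration; substitute whatever error accessor your response object actually exposes.

```
soul askWithChecks(ai, conversation, question) {
    conversation.addUserMessage(question)
    response = ai.invokeConversation(conversation)

    // Validate the response before processing (isError() is hypothetical)
    if (response.isError()) {
        println("AI call failed")
        return null
    }

    // Keep the conversation context in sync for multi-turn chats
    conversation.addAssistantMessage(response.text)

    // Monitor token usage to track API costs
    println("Tokens this turn: " + GenAI.countTokens(response.text, "gpt-3.5-turbo"))

    return response.text
}
```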
```
// Good - comprehensive AI setup
soul createRobustAI() {
    // Create AI with fallback
    fallbackModels = ["gpt-4", "gpt-3.5-turbo", "claude-3-haiku"]
    fallbackManager = GenAI.createFallbackManager(fallbackModels)

    // Rate limiting
    rateLimiter = GenAI.createRateLimiter(100, 10)

    // Retry handling
    retryHandler = GenAI.createRetryHandler({
        max_retries: 3,
        base_delay: 1000,
        exponential_backoff: true
    })

    // Main AI instance
    ai = GenAI.chat("openai")
        .model(fallbackManager.getCurrentModel())
        .configure({ api_key: "your-api-key", max_tokens: 200 })

    return {
        ai: ai,
        rateLimiter: rateLimiter,
        retryHandler: retryHandler,
        fallbackManager: fallbackManager
    }
}
```
The GenAI module provides a comprehensive AI integration platform for Soul, enabling sophisticated AI-powered applications with robust error handling, conversation management, and multi-provider support.