# GMAI: Gradle Managed AI

A Gradle plugin that automatically manages Ollama LLM instances for your build tasks.

GMAI (Gradle Managed AI) is a Gradle plugin that integrates AI capabilities into your build process by automatically managing Ollama instances. It handles the entire lifecycle of AI services, from installation and startup to model management and cleanup, so you can focus on using AI in your tasks.
## What GMAI Does

- **Automatic Ollama Management**: Installs, starts, and stops Ollama automatically based on your build needs
- **Task Integration**: A simple API to make any Gradle task depend on AI services via `useManagedAi()`
- **Model Management**: Automatically pulls and manages AI models defined in your configuration
- **Lifecycle Management**: Ensures AI services are available when needed and cleaned up afterward
- **Cross-Platform**: Works on macOS, Linux, and Windows with automatic platform detection
## Key Features

### Task-Dependent AI Services

Tasks can declare dependencies on AI services, and GMAI handles everything automatically:

```kotlin
tasks.withType<Test> {
    useManagedAi() // AI services start before tests, stop after
}
```
### Smart Installation

GMAI finds existing Ollama installations or installs per-project for isolation:

- Uses existing installations when available
- Falls back to a project-local installation (`.ollama/bin/ollama`)
- Configurable installation strategies for different environments
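As an illustrative sketch of what pinning a strategy could look like, assuming a configuration block on the `managedAi` extension (the `installation` block and `strategy` property are assumed names for this sketch, not confirmed GMAI API):

```kotlin
managedAi {
    // Hypothetical sketch: block and property names are assumptions.
    installation {
        // e.g. prefer an existing system install, fall back to project-local
        strategy = "PREFER_EXISTING"
    }
}
```

Consult the Configuration page for the strategies the plugin actually supports.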
### Automatic Model Management

Define models in your build script and GMAI handles the rest:

```kotlin
managedAi {
    models {
        "llama3" {
            version = "8b"
        }
        "codellama" {
            version = "7b"
        }
    }
}
```
## Quick Start

```kotlin
// build.gradle.kts
plugins {
    id("se.premex.gmai") version "0.0.1"
}

managedAi {
    models {
        "llama3" {
            version = "8b"
        }
    }
}

// Use AI in your tasks
tasks.withType<Test> {
    useManagedAi()
    systemProperty("ollama.url", "http://localhost:11434")
}
```
## Use Cases

- **AI-Powered Testing**: Use LLMs in your test suites for dynamic test generation or validation
- **Code Generation**: Generate code at build time using AI models
- **Documentation**: Generate or validate documentation with AI assistance
- **CI/CD Integration**: Run AI-powered tasks in continuous integration environments
## Why GMAI?

- **Zero Configuration**: Works out of the box with sensible defaults
- **Build Integration**: Native Gradle task dependencies and lifecycle management
- **Team Consistency**: The same AI environment for every team member
- **CI/CD Ready**: Designed for continuous integration environments
- **Isolation**: Project-specific installations don't interfere with your system setup
## Get Started

- **Getting Started**: Add GMAI to your project and run your first AI-powered task
- **Configuration**: Customize installation strategies and model configurations
- **Examples**: Real-world usage patterns and best practices
- **API Reference**: Complete API documentation
## Available Tasks

GMAI provides several built-in tasks for managing AI services:

- `setupManagedAi`: Start Ollama and ensure all models are available
- `teardownManagedAi`: Stop Ollama and clean up resources
- `startOllama`: Start the Ollama service
- `stopOllama`: Stop the Ollama service
- `ollamaStatus`: Check Ollama status and list available models
- `pullModel{ModelName}`: Pull a specific model (auto-generated for each configured model)
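Custom tasks can be wired to these lifecycle tasks with ordinary Gradle dependencies. A sketch (the `generateDocs` task name is a made-up example; `setupManagedAi` and `teardownManagedAi` are the plugin tasks listed above):

```kotlin
// Hypothetical custom task that needs the managed Ollama instance.
tasks.register("generateDocs") {
    dependsOn("setupManagedAi")      // start Ollama and pull models first
    finalizedBy("teardownManagedAi") // stop Ollama even if this task fails
    doLast {
        // ... call the Ollama HTTP API at http://localhost:11434 here ...
    }
}
```

Using `finalizedBy` for teardown ensures cleanup runs whether the task succeeds or fails, mirroring what `useManagedAi()` does for test tasks.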
## Product by Premex

GMAI is developed and maintained by Premex, a company specializing in innovative software solutions.