Spring AI Playground is a self-hosted web UI that simplifies AI experimentation and testing. It provides Java developers with an intuitive interface for working with large language models (LLMs), vector databases, prompt engineering, and Model Context Protocol (MCP) integrations.
Built on Spring AI, it supports leading model providers and includes comprehensive tools for testing retrieval-augmented generation (RAG) workflows and MCP integrations. The goal is to make AI more accessible to developers, helping them quickly prototype Spring AI-based applications with enhanced contextual awareness and external tool capabilities.
Build and run the app:
./mvnw clean install
./mvnw spring-boot:run
Run the following command to build the Docker image:
./mvnw spring-boot:build-image -Pproduction -DskipTests=true -Dspring-boot.build-image.imageName=jmlab/spring-ai-playground:latest

Then run the container:

docker run -p 8080:8080 -e SPRING_AI_OLLAMA_BASE_URL=http://host.docker.internal:11434 jmlab/spring-ai-playground:latest
The environment variable SPRING_AI_OLLAMA_BASE_URL is set to http://host.docker.internal:11434 to connect to Ollama running on your host machine. If Ollama is running on a different port or host, adjust the URL accordingly.
Spring AI Playground uses Ollama by default for local LLM and embedding models. No API keys are required, which makes it easy to get started.
To enable Ollama, ensure it is installed and running on your system. Refer to the Spring AI Ollama Chat Prerequisites for setup details.
Spring AI Playground supports all major AI model providers, including Anthropic, OpenAI, Microsoft, Amazon, Google, and Ollama. For more details on the available implementations, visit the Spring AI Chat Models Reference Documentation.
Switching to OpenAI is shown below as an example of using a different model provider with Spring AI Playground. To explore the other models Spring AI supports, see the Spring AI Documentation.
To switch to OpenAI, follow these steps:
Modify the pom.xml file, replacing the Ollama starter dependency:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
</dependency>

with the OpenAI starter dependency:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
</dependency>
Update the application.yaml file:
spring:
  profiles:
    default: openai
  ai:
    openai:
      api-key: your-openai-api-key
Spring AI Playground now includes a comprehensive MCP (Model Context Protocol) Playground that provides a visual interface for managing connections to external tools through AI models. This feature leverages Spring AI’s Model Context Protocol implementation to offer client-side capabilities.
Streamable HTTP, officially introduced in the MCP specification dated 2025-03-26 (March 26, 2025), is a single-endpoint HTTP transport that replaces the former HTTP+SSE setup. Clients send JSON-RPC messages via POST to /mcp, and responses may optionally arrive as an SSE stream, with session-ID tracking and resumable connections.
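To make the transport concrete, here is a minimal sketch that sends a raw JSON-RPC initialize request to a Streamable HTTP endpoint using only the JDK's HttpClient. The server URL (http://localhost:3001/mcp), port, and client name are assumptions for illustration, not values used by Spring AI Playground itself.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class StreamableHttpProbe {

    public static void main(String[] args) throws Exception {
        // JSON-RPC initialize message, as defined by the MCP specification.
        String initialize = """
                {"jsonrpc":"2.0","id":1,"method":"initialize",
                 "params":{"protocolVersion":"2025-03-26",
                           "capabilities":{},
                           "clientInfo":{"name":"playground-probe","version":"0.0.1"}}}
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:3001/mcp"))              // hypothetical single /mcp endpoint
                .header("Content-Type", "application/json")
                .header("Accept", "application/json, text/event-stream")   // server may reply with JSON or an SSE stream
                .POST(HttpRequest.BodyPublishers.ofString(initialize))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // If the server returns a session id, it is echoed on later requests to resume the session.
        System.out.println("Mcp-Session-Id: " + response.headers().firstValue("Mcp-Session-Id").orElse("<none>"));
        System.out.println(response.body());
    }
}
```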
This MCP Playground provides developers with a powerful visual tool for prototyping, testing, and debugging Model Context Protocol integrations, making it easier to build sophisticated AI applications with contextual awareness.
Spring AI Playground now provides seamless integration with MCP (Model Context Protocol) tools directly within the chat interface, enabling you to enhance AI conversations with external tools. Here’s how you can leverage this powerful feature:
⚠️ Important for Ollama Users
When using Ollama as your AI provider, ensure you’re using a tool-enabled model that supports external function calling. Not all Ollama models support MCP tool integration.
Pull a tool-enabled model before starting a chat:

ollama pull <model-name>
Tip: Models like Qwen 3 and DeepSeek-R1 offer advanced reasoning capabilities with visible thought processes, making them particularly effective for complex MCP tool workflows.
This integration enables developers to quickly prototype and test tool-enhanced AI interactions, bringing the power of external systems and capabilities directly into your Spring AI conversations through the Model Context Protocol.
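As a rough sketch of how MCP tools reach the model in a Spring AI application, the snippet below wires a ToolCallbackProvider into a ChatClient. It assumes the Spring AI MCP client starter has auto-configured such a provider bean and that ChatClient.Builder exposes a defaultToolCallbacks overload for it; this is illustrative wiring, not the Playground's exact implementation.

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class McpChatClientConfig {

    // 'mcpTools' is assumed to be auto-configured by the MCP client starter and
    // to expose the tools advertised by every connected MCP server.
    @Bean
    ChatClient mcpAwareChatClient(ChatClient.Builder builder, ToolCallbackProvider mcpTools) {
        return builder
                .defaultSystem("You may call the registered MCP tools when they help answer the user.")
                .defaultToolCallbacks(mcpTools) // make MCP tools available to every chat call
                .build();
    }
}
```

A call such as chatClient.prompt().user("List the files in my workspace").call().content() would then let the model decide whether to invoke one of the registered tools.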
Spring AI Playground offers a comprehensive vector database playground with advanced retrieval capabilities powered by Spring AI’s VectorStore API integration.
Supported vector database providers include Apache Cassandra, Azure Cosmos DB, Azure Vector Search, Chroma, Elasticsearch, GemFire, MariaDB, Milvus, MongoDB Atlas, Neo4j, OpenSearch, Oracle, PostgreSQL/PGVector, Pinecone, Qdrant, Redis, SAP Hana, Typesense, and Weaviate.
You can also apply metadata filter expressions (for example, author == 'John' && year >= 2023) to narrow search scopes and refine query results. These features, combined with Spring AI's flexibility, provide a comprehensive playground for vector database testing and advanced integration into your applications.
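As a sketch of what such a filtered query looks like against Spring AI's VectorStore API (assuming the SearchRequest builder style of recent releases; the query text and bean are made up for illustration):

```java
import java.util.List;

import org.springframework.ai.document.Document;
import org.springframework.ai.vectorstore.SearchRequest;
import org.springframework.ai.vectorstore.VectorStore;

class FilteredSearchExample {

    // 'vectorStore' would be whichever VectorStore bean the playground is configured with.
    List<Document> search(VectorStore vectorStore) {
        SearchRequest request = SearchRequest.builder()
                .query("spring ai release notes")                      // free-text similarity query
                .topK(5)                                               // return at most five matches
                .filterExpression("author == 'John' && year >= 2023")  // metadata filter from the example above
                .build();

        return vectorStore.similaritySearch(request);
    }
}
```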
Spring AI Playground now offers a fully integrated RAG (Retrieval-Augmented Generation) feature, allowing you to enhance AI responses with knowledge from your own documents. Here’s how you can make the most of this capability:
This seamless integration enables developers to quickly prototype and optimize knowledge-enhanced AI interactions within a single, intuitive interface, bringing the power of Retrieval-Augmented Generation to your Spring AI applications.
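For orientation, here is a minimal, hand-rolled sketch of the retrieve-then-generate loop using Spring AI's VectorStore and ChatClient abstractions. It is not the Playground's internal pipeline, and method names such as Document#getText follow recent Spring AI releases and may differ in older versions.

```java
import java.util.List;
import java.util.stream.Collectors;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.document.Document;
import org.springframework.ai.vectorstore.SearchRequest;
import org.springframework.ai.vectorstore.VectorStore;

class RagSketch {

    String answer(ChatClient chatClient, VectorStore vectorStore, String question) {
        // 1. Retrieve the chunks most similar to the question from the vector store.
        List<Document> context = vectorStore.similaritySearch(
                SearchRequest.builder().query(question).topK(4).build());

        // 2. Join the retrieved text into a single block of grounding context.
        String contextText = context.stream()
                .map(Document::getText)
                .collect(Collectors.joining("\n---\n"));

        // 3. Ask the model to answer using only the provided context.
        return chatClient.prompt()
                .system("Answer using only the context below:\n" + contextText)
                .user(question)
                .call()
                .content();
    }
}
```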
Here are some features we are planning to develop for future releases of Spring AI Playground:
- Observability: introducing tools to track and monitor AI performance, usage, and errors for better management and debugging.
- Authentication: implementing login and security features to control access to the Spring AI Playground.
- Multimodal Support: supporting embedding, image, audio, and moderation models from Spring AI.
These features will help make Spring AI Playground even better for testing and building AI projects.