In this post, we are going to build a stateless conversational chatbot using Langchain4j integrated with Ollama. The focus is on stateless design: the client maintains the conversation history and sends the entire message trail with every request.
This architecture suits scalable microservices and cloud-native deployments where server-side persistence or session tracking is undesirable or costly.
1. Introduction
Building chatbots is easier than ever with frameworks like Langchain4j. However, most chatbot implementations manage user state on the server side, which is not always ideal for cloud-native, stateless deployments.
In this tutorial, we’ll show how to:
· Build a chatbot using Langchain4j
· Integrate it with Ollama as the LLM backend
· Design a stateless chat interface where the client tracks history
· Deploy a clean REST API
2. Prerequisites
· Java 21+
· Spring Boot
· Langchain4j
· Ollama installed and running locally, with a chat model pulled (this tutorial's configuration uses llama3.2; llama2 or mistral also work) — see the example commands after this list
· Familiarity with REST APIs and JSON
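If Ollama is not set up yet, the commands below are one way to get a model locally. llama3.2 is assumed here because the ModelConfig class later in this post uses it; substitute llama2 or mistral if you prefer.

# download the model used by ModelConfig (one-time)
ollama pull llama3.2

# confirm the model is available and the server is reachable on the default port 11434
ollama list
curl http://localhost:11434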
3. Why Stateless?
Stateless design means the server doesn’t store chat history between requests. Instead:
· Client manages full message history
· Every request is complete and independent
· Perfect for scalable deployments, microservices, and serverless functions
4. JSON Request Structure
Here’s how the client sends the conversation:
{ "chatMessageRequests": [ {"message": "Hi", "messageType": "USER"}, {"message": "How can I assist you today?", "messageType": "AI"}, {"message": "What is the Capital Of India", "messageType": "USER"}, {"message": "The capital of India is New Delhi.", "messageType": "AI"}, {"message": "Tell me more about this country", "messageType": "USER"} ] }
This JSON contains an array called chatMessageRequests, which holds the entire back-and-forth conversation so far. Each object inside the array represents a single message exchanged between the user and the AI. Each message has:
· message: the actual text of the message
· messageType: whether the message was sent by the USER or the AI
Why Is It Designed This Way? (Stateless Architecture)
In a stateless chatbot, the server does not store previous messages or conversation history. So:
· The client must send the full history of messages on every request.
· This history gives the AI the necessary context to generate a coherent response.
In the example above:
· The user starts the conversation: "Hi"
· The AI responds: "How can I assist you today?"
· The user asks: "What is the Capital Of India"
· The AI answers: "The capital of India is New Delhi."
· The user continues: "Tell me more about this country"
Now, when this payload is sent to the server, the server passes the entire history to the AI model via Langchain4j. The AI uses this history as context to answer the latest message: "Tell me more about this country".
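For example, once the application built in this tutorial is running, the same payload can be posted straight to its /api/chat endpoint (port 1235 is assumed here to match the run command at the end of the post):

curl -X POST http://localhost:1235/api/chat \
  -H "Content-Type: application/json" \
  -d '{"chatMessageRequests":[{"message":"Hi","messageType":"USER"},{"message":"How can I assist you today?","messageType":"AI"},{"message":"What is the Capital Of India","messageType":"USER"},{"message":"The capital of India is New Delhi.","messageType":"AI"},{"message":"Tell me more about this country","messageType":"USER"}]}'

The response is a single JSON object of the form {"message": "..."} containing the model's reply; the client appends it to its local history before the next request.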
Defining the DTOs
public class ChatMessageRequest {
    private String message;
    private ChatMessageType messageType;
}

public enum ChatMessageType {
    USER, AI
}

public class ChatRequestBody {
    private List<ChatMessageRequest> chatMessageRequests;
}

public class ConversationResponse {
    private String message;

    public ConversationResponse(String message) {
        this.message = message;
    }
}
Chat Service
We use Langchain4j’s chat model API, implemented here by OllamaChatModel (the Ollama backend).
@Service
public class ChatService {

    @Autowired
    private OllamaChatModel chatModel;

    public ConversationResponse chat(List<ChatMessageRequest> chatMessageRequests) {
        ChatResponse response = chatModel.chat(from(chatMessageRequests));
        return new ConversationResponse(response.aiMessage().text());
    }

    private static List<ChatMessage> from(List<ChatMessageRequest> chatMessageRequests) {
        List<ChatMessage> chatMessages = new ArrayList<>();
        chatMessages.add(new SystemMessage("You are a helpful Chat Assistant"));

        for (ChatMessageRequest chatMessageRequest : chatMessageRequests) {
            if (ChatMessageType.USER == chatMessageRequest.getMessageType()) {
                chatMessages.add(new UserMessage(chatMessageRequest.getMessage()));
            } else if (ChatMessageType.AI == chatMessageRequest.getMessageType()) {
                chatMessages.add(new AiMessage(chatMessageRequest.getMessage()));
            }
        }

        return chatMessages;
    }
}
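To make the conversion concrete, here is a minimal sketch (using only the Langchain4j types that ChatService already imports) of the message list that from(...) builds for the conversation shown in section 4, and how it is handed to the model in a single stateless call:

// chatModel is the injected OllamaChatModel, exactly as in ChatService above
List<ChatMessage> history = List.of(
        new SystemMessage("You are a helpful Chat Assistant"),
        new UserMessage("Hi"),
        new AiMessage("How can I assist you today?"),
        new UserMessage("What is the Capital Of India"),
        new AiMessage("The capital of India is New Delhi."),
        new UserMessage("Tell me more about this country"));

// The full history travels with every call; the model keeps no state between calls.
ChatResponse response = chatModel.chat(history);
String answer = response.aiMessage().text();   // reply text, model-dependent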
REST Controller
@RestController
@RequestMapping("/api/chat")
@CrossOrigin("*")
@Tag(name = "Chat Controller", description = "This section contains APIs related to Chat APIs Powered by Ollama")
public class ChatController {

    @Autowired
    private ChatService chatService;

    @PostMapping
    public ConversationResponse chat(@RequestBody @Valid ChatRequestBody chatRequestBody) {
        return chatService.chat(chatRequestBody.getChatMessageRequests());
    }
}
Follow the step-by-step procedure below to build a working application.
Step 1: Create a new Maven project ‘chat-bot’.
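If you prefer the command line over an IDE, the skeleton can be generated with the standard Maven quickstart archetype (just one way to do it; the coordinates match the pom.xml in the next step):

mvn archetype:generate -DgroupId=com.sample.app -DartifactId=chat-bot \
    -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false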
Step 2: Update pom.xml with the Maven dependencies.
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.sample.app</groupId>
    <artifactId>chat-bot</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>21</maven.compiler.source>
        <maven.compiler.target>21</maven.compiler.target>
        <java.version>21</java.version>
    </properties>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.3.10</version>
    </parent>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>dev.langchain4j</groupId>
                <artifactId>langchain4j-bom</artifactId>
                <version>1.0.1</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-spring-boot-starter</artifactId>
        </dependency>

        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-ollama-spring-boot-starter</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springdoc</groupId>
            <artifactId>springdoc-openapi-starter-webmvc-ui</artifactId>
            <version>2.6.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/jakarta.validation/jakarta.validation-api -->
        <dependency>
            <groupId>jakarta.validation</groupId>
            <artifactId>jakarta.validation-api</artifactId>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <executions>
                    <execution>
                        <goals>
                            <goal>repackage</goal> <!-- Important -->
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
Step 3: Create the com.sample.app.llm package and define all LLM-related classes in it.
ChatMessageRequest.java
package com.sample.app.llm;

public class ChatMessageRequest {

    private String message;
    private ChatMessageType messageType;

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }

    public ChatMessageType getMessageType() {
        return messageType;
    }

    public void setMessageType(ChatMessageType messageType) {
        this.messageType = messageType;
    }
}
ChatMessageType.java
package com.sample.app.llm;

public enum ChatMessageType {
    USER, AI
}
ChatRequestBody.java
package com.sample.app.llm;

import java.util.List;

public class ChatRequestBody {

    List<ChatMessageRequest> chatMessageRequests;

    public List<ChatMessageRequest> getChatMessageRequests() {
        return chatMessageRequests;
    }

    public void setChatMessageRequests(List<ChatMessageRequest> chatMessageRequests) {
        this.chatMessageRequests = chatMessageRequests;
    }
}
ChatService.java
package com.sample.app.llm;

import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.ollama.OllamaChatModel;

@Service
public class ChatService {

    @Autowired
    private OllamaChatModel chatModel;

    public ConversationResponse chat(List<ChatMessageRequest> chatMessageRequests) {
        ChatResponse response = chatModel.chat(from(chatMessageRequests));
        return new ConversationResponse(response.aiMessage().text());
    }

    private static List<ChatMessage> from(List<ChatMessageRequest> chatMessageRequests) {
        List<ChatMessage> chatMessages = new ArrayList<>();
        chatMessages.add(new SystemMessage("You are a helpful Chat Assistant"));

        for (ChatMessageRequest chatMessageRequest : chatMessageRequests) {
            if (ChatMessageType.USER == chatMessageRequest.getMessageType()) {
                chatMessages.add(new UserMessage(chatMessageRequest.getMessage()));
            } else if (ChatMessageType.AI == chatMessageRequest.getMessageType()) {
                chatMessages.add(new AiMessage(chatMessageRequest.getMessage()));
            }
        }

        return chatMessages;
    }
}
ConversationResponse.java
package com.sample.app.llm;

public class ConversationResponse {

    private String message;

    public ConversationResponse() {
    }

    public ConversationResponse(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }
}
ModelConfig.java
package com.sample.app.llm;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import dev.langchain4j.http.client.spring.restclient.SpringRestClientBuilderFactory;
import dev.langchain4j.model.ollama.OllamaChatModel;

@Configuration
public class ModelConfig {

    @Bean
    public OllamaChatModel ollamaLanguageModel() {
        return OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.2")
                .httpClientBuilder(new SpringRestClientBuilderFactory().create())
                .build();
    }
}
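As an alternative to wiring the bean manually, the langchain4j-ollama-spring-boot-starter can auto-configure OllamaChatModel from application.properties. The property names below are assumed from the LangChain4j Spring Boot documentation, so verify them against the starter version you use; if you go this route, the ModelConfig class is not needed.

application.properties

langchain4j.ollama.chat-model.base-url=http://localhost:11434
langchain4j.ollama.chat-model.model-name=llama3.2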
Step 4: Define the Swagger configuration.
SwaggerConfig.java
package com.sample.app.config;

import org.springframework.context.annotation.Configuration;

import io.swagger.v3.oas.annotations.OpenAPIDefinition;
import io.swagger.v3.oas.annotations.info.Info;

@Configuration
@OpenAPIDefinition(info = @Info(title = "Chat service Application", version = "v1"))
public class SwaggerConfig {

}
Step 5: Define the ChatController class.
ChatController.java
package com.sample.app.controller;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.sample.app.llm.ChatRequestBody;
import com.sample.app.llm.ChatService;
import com.sample.app.llm.ConversationResponse;

import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.validation.Valid;

@RestController
@RequestMapping("/api/chat")
@CrossOrigin("*")
@Tag(name = "Chat Controller", description = "This section contains APIs related to Chat APIs Powered by Ollama")
public class ChatController {

    @Autowired
    private ChatService chatService;

    @PostMapping
    public ConversationResponse chat(@RequestBody @Valid ChatRequestBody chatRequestBody) {
        return chatService.chat(chatRequestBody.getChatMessageRequests());
    }
}
Step 6: Define the main application class.
App.java
package com.sample.app;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class App {

    public static void main(String[] args) {
        SpringApplication.run(App.class, args);
    }
}
Step 7: Create a static folder in src/main/resources and define the index.html file.
index.html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>LangChain4j Chatbot</title>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;600&display=swap" rel="stylesheet">
  <style>
    * { box-sizing: border-box; }
    body { font-family: 'Inter', sans-serif; background: #f0f2f5; margin: 0; padding: 0; display: flex; justify-content: center; align-items: flex-start; min-height: 100vh; }
    #chat-box { background: white; border-radius: 12px; padding: 24px; width: 90%; max-width: 600px; margin: 40px auto; box-shadow: 0 8px 24px rgba(0, 0, 0, 0.08); display: flex; flex-direction: column; }
    h2 { text-align: center; margin-bottom: 20px; font-size: 24px; color: #333; }
    #chat-history { flex: 1; max-height: 400px; overflow-y: auto; padding-right: 5px; margin-bottom: 20px; }
    .message { padding: 12px 16px; margin-bottom: 12px; border-radius: 8px; line-height: 1.4; width: fit-content; max-width: 80%; animation: fadeIn 0.3s ease-in-out; }
    .user { background-color: #d1ecf1; align-self: flex-end; text-align: right; }
    .ai { background-color: #f8d7da; align-self: flex-start; text-align: left; }
    #input-container { display: flex; gap: 10px; align-items: center; }
    #user-input { flex: 1; padding: 12px; border-radius: 8px; border: 1px solid #ccc; font-size: 16px; }
    button { padding: 12px 20px; background-color: #007bff; color: white; border: none; border-radius: 8px; font-size: 16px; cursor: pointer; transition: background 0.3s ease; }
    button:hover { background-color: #0056b3; }
    @keyframes fadeIn { from { opacity: 0; transform: translateY(5px); } to { opacity: 1; transform: translateY(0); } }
    /* Scrollbar styling */
    #chat-history::-webkit-scrollbar { width: 8px; }
    #chat-history::-webkit-scrollbar-thumb { background-color: #ccc; border-radius: 4px; }
    #chat-history::-webkit-scrollbar-track { background-color: transparent; }
  </style>
</head>
<body>

<div id="chat-box">
  <h2>LangChain4j Chatbot</h2>
  <div id="chat-history"></div>
  <div id="input-container">
    <input type="text" id="user-input" placeholder="Type your message..." autocomplete="off">
    <button onclick="sendMessage()">Send</button>
  </div>
</div>

<script>
  let chatHistory = [];

  function appendMessage(message, sender) {
    const div = document.createElement('div');
    div.className = 'message ' + (sender === 'USER' ? 'user' : 'ai');
    div.innerText = message;
    document.getElementById('chat-history').appendChild(div);
    document.getElementById('chat-history').scrollTop = document.getElementById('chat-history').scrollHeight;
  }

  function sendMessage() {
    const input = document.getElementById('user-input');
    const userMessage = input.value.trim();
    if (!userMessage) return;

    appendMessage(userMessage, 'USER');
    chatHistory.push({ message: userMessage, messageType: "USER" });
    input.value = '';
    input.focus();

    fetch('/api/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ chatMessageRequests: chatHistory })
    })
      .then(response => response.json())
      .then(data => {
        if (data && data.message) {
          const aiMessage = data.message;
          appendMessage(aiMessage, 'AI');
          chatHistory.push({ message: aiMessage, messageType: "AI" });
        } else {
          appendMessage("No response from server", 'AI');
        }
      })
      .catch(error => {
        console.error('Error:', error);
        appendMessage("Error connecting to server", 'AI');
      });
  }

  // Optional: Send on Enter key
  document.getElementById('user-input').addEventListener('keypress', function (e) {
    if (e.key === 'Enter') {
      sendMessage();
    }
  });
</script>

</body>
</html>
Build the project
Navigate to the project root folder and execute the command below to generate the artifact.
mvn clean install
Upon successful execution of the command, you will find the executable jar ‘chat-bot-0.0.1-SNAPSHOT.jar’ in the target folder.
$ ls ./target/
chat-bot-0.0.1-SNAPSHOT.jar           generated-sources       maven-status
chat-bot-0.0.1-SNAPSHOT.jar.original  generated-test-sources  test-classes
classes                               maven-archiver
Run the Application
Execute the following command to run the application.
java -jar ./target/chat-bot-0.0.1-SNAPSHOT.jar --server.port=1235
Open the URL http://localhost:1235/ in a browser; you should see the chat interface defined in index.html.
Type the following message in the chat window and click the Send button.
Hi
Now type the following message.
What is the Capital Of India?
Now ask, “Tell me something about this country in maximum 5 lines.”
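You can also exercise the API without the web page. Since springdoc is on the classpath, the Swagger UI and the OpenAPI document should be available at their default paths (assumed here; adjust the port if you started the application differently):

http://localhost:1235/swagger-ui/index.html
curl http://localhost:1235/v3/api-docs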
That’s it… Happy Learning! :)