I recently came across the Spring AI project, which “aims to streamline the development of applications that incorporate artificial intelligence functionality without unnecessary complexity.”
Thus far, I’ve been relying on the LangChain4J Framework for my AI projects, but as a Java developer & Spring enthusiast,
I was excited to see what Spring AI had to offer. Unfortunately, at the time of writing, the latest stable release (0.8.1) does not support function calling, which is critical for most advanced use cases. As such, I will be working with the unstable 1.0.0-SNAPSHOT. Function calls enable the Agent to “interact” with the rest of our software & 3rd-party services: the framework intercepts a tool invocation request & calls the appropriate method defined in the callback.
Today’s Project: We will be building a simple Spring AI agent that helps manage a dummy hotel booking system. It will be able to check availability, book rooms, and look up bookings by guest name. Function calls will be used to expose these capabilities to the Agent.
I’m going to keep this project simple and focus more on utilizing the Spring AI framework than on building a bullet-proof, production-ready Agent. The complete code will be available on GitHub.
Getting Started
Dependencies
Before adding the Spring AI dependency to your project, you will need to add the Spring AI Snapshot repository to your pom.xml
file.
<repositories>
    <repository>
        <id>spring-snapshots</id>
        <name>Spring Snapshots</name>
        <url>https://repo.spring.io/snapshot</url>
        <releases>
            <enabled>false</enabled>
        </releases>
    </repository>
</repositories>
Once that is added, we’ll be able to access the 1.0.0-SNAPSHOT version of Spring AI, which is reported as unstable but does support function calling.
Next, we’ll add the Spring AI OpenAI starter dependency to our pom.xml file.
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
    <version>1.0.0-SNAPSHOT</version>
</dependency>
Credentials
I’ll be using OpenAI for this project. The Spring AI framework lets us configure the credentials and model with a few entries in our application.properties file:
spring.application.name=spring-ai-booking
spring.ai.openai.api-key=${OPENAI_API_KEY}
spring.ai.openai.chat.options.model=gpt-4-1106-preview
I’m setting an environment variable OPENAI_API_KEY to my OpenAI API key in my Run Configuration. I have also opted to specify the model: gpt-4-1106-preview is a solid choice for our project, as it boasts improved function-calling capabilities and a larger context window.
Sanity-Check
Now that our dependencies are set up, let’s do a quick sanity-check to ensure that everything is working as expected.
The following code snippet was pulled directly from the Spring AI documentation:
@RestController
public class SimpleAiController {

    private final ChatClient chatClient;

    @Autowired
    public SimpleAiController(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    @GetMapping("/ai/simple")
    public Map<String, String> completion(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return Map.of("generation", chatClient.call(message));
    }
}
With my application running, I want to test the endpoint by sending a GET
request to http://localhost:8080/ai/simple
/tmp ❯ curl http://localhost:8080/ai/simple
{"generation":"Why did the scarecrow win an award?\nBecause he was outstanding in his field!"}%
Looks like we’re in business! Now we can move on to building our Hotel Booking Agent.
Note: Thus far, this is all the code required. The Spring AI framework was able to configure and autowire the ChatClient for us using the API key we provided in the application.properties file. LangChain4J requires a bit more configuration to get started.
Creating the Hotel Booking Agent
The hotel booking agent will be a simple agent that can handle the following commands:
- check availability
- book a room
- look up a booking by guest name
In addition to these command capabilities, we will be using a simple in-memory data store to manage both the hotel bookings and the conversation context. I mentioned earlier that this exploration is more about the Spring AI framework than building a production-ready agent, so we will only support a single conversation at a time.
Booking Service and Function Calls
I’ve created a simple booking service that will manage the hotel bookings. While I will spare you the details of its implementation, as it is not the focus of this post, I do want to point out some conditions I’ve set up for the service:
/**
 * Initializes the availability of rooms for specific dates for demonstration purposes.
 */
@PostConstruct
public void init() {
    // Set availability for January 15, 2025 (available)
    LocalDate availableDate = LocalDate.of(2025, 1, 15);
    setAvailability(availableDate, 5);

    // Set availability for February 28, 2025 (unavailable)
    LocalDate unavailableDate = LocalDate.of(2025, 2, 28);
    setAvailability(unavailableDate, 0);
}
The full service can be found here.
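For reference, the shape of the service looks roughly like the sketch below. Only setAvailability and isAvailable appear verbatim above; the bookRoom and findBooking signatures are my reconstruction from how the tools use the service later in this post, not a copy of the real implementation.
import java.time.LocalDate;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.stereotype.Service;

// Simplified sketch of HotelBookingService; see the linked repository for the full version.
@Service
public class HotelBookingService {

    public record Booking(String guestName, LocalDate checkIn, LocalDate checkOut) {}

    // date -> number of rooms still available
    private final Map<LocalDate, Integer> availability = new ConcurrentHashMap<>();
    // guest name -> booking
    private final Map<String, Booking> bookings = new ConcurrentHashMap<>();

    public void setAvailability(LocalDate date, int rooms) {
        availability.put(date, rooms);
    }

    public boolean isAvailable(LocalDate date) {
        return availability.getOrDefault(date, 0) > 0;
    }

    public String bookRoom(String guestName, LocalDate checkIn, LocalDate checkOut) {
        if (!isAvailable(checkIn)) {
            return "No rooms available on " + checkIn;
        }
        availability.computeIfPresent(checkIn, (date, rooms) -> rooms - 1);
        bookings.put(guestName, new Booking(guestName, checkIn, checkOut));
        return "Booking confirmed for " + guestName + " from " + checkIn + " to " + checkOut;
    }

    public Optional<Booking> findBooking(String guestName) {
        return Optional.ofNullable(bookings.get(guestName));
    }

    // The full service also exposes a cancellation method, omitted here for brevity.
}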
While the service exposes methods to check availability, book a room, and cancel a booking, the LLM Agent will not be able to interact with the service directly. To wire up the service to the LLM Agent, we will need to define a function and expose it to the LLM Agent.
In the official Spring AI function calling documentation, they provide the following example:
public class MockWeatherService implements Function<Request, Response> {

    public enum Unit { C, F }

    public record Request(String location, Unit unit) {}
    public record Response(double temp, Unit unit) {}

    public Response apply(Request request) {
        return new Response(30.0, Unit.C);
    }
}
The above is meant to wrap a 3rd-party service. After spending some time reading the documentation and source code, I was unable to find a way to expose multiple methods on a service without creating multiple classes. I was really hoping that exposing a service call would be as simple as a @Tool annotation on a method, like in the LangChain4J framework.
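For comparison, here is a rough sketch of what that looks like in LangChain4J, where several tool methods can live on one class. The method bodies assume the HotelBookingService sketch from above.
import java.time.LocalDate;

import dev.langchain4j.agent.tool.Tool;

// Illustrative only: this is the LangChain4J style, not Spring AI.
public class HotelBookingTools {

    private final HotelBookingService hotelBookingService;

    public HotelBookingTools(HotelBookingService hotelBookingService) {
        this.hotelBookingService = hotelBookingService;
    }

    @Tool("Check the availability of rooms for a specific date")
    public boolean checkAvailability(String date) {
        return hotelBookingService.isAvailable(LocalDate.parse(date));
    }

    @Tool("Book a room for a guest for a specific check-in and check-out date")
    public String bookRoom(String guestName, String checkInDate, String checkOutDate) {
        return hotelBookingService.bookRoom(guestName, LocalDate.parse(checkInDate), LocalDate.parse(checkOutDate));
    }
}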
Booking Tools
We will define a function that will expose a single method call on the HotelBookingService to the Agent.
@Component
@RequiredArgsConstructor
public class CheckAvailabilityTool implements Function<CheckAvailabilityTool.Request, CheckAvailabilityTool.Response> {

    private final HotelBookingService hotelBookingService;

    public record Request(String date) {}
    public record Response(boolean available) {}

    @Override
    public Response apply(Request request) {
        // LocalDate from a string
        LocalDate date = LocalDate.parse(request.date());
        Boolean isAvailable = hotelBookingService.isAvailable(date);
        return new Response(isAvailable);
    }
}
To test this out, I’ve modified the SimpleAiController; here’s how it looks now:
@RestController
public class SimpleAiController {

    private final ChatClient chatClient;
    private final CheckAvailabilityTool checkAvailabilityTool;

    @Autowired
    public SimpleAiController(ChatClient chatClient, CheckAvailabilityTool checkAvailabilityTool) {
        this.chatClient = chatClient;
        this.checkAvailabilityTool = checkAvailabilityTool;
    }

    @GetMapping("/ai/simple")
    public Map<String, String> completion(@RequestParam(value = "message", defaultValue = "Do you have any rooms available on February 28, 2025") String message) {
        UserMessage userMessage = new UserMessage(message);

        var promptOptions = OpenAiChatOptions.builder()
                .withFunctionCallbacks(List.of(FunctionCallbackWrapper.builder(checkAvailabilityTool)
                        .withName("CheckAvailability")
                        .withDescription("Check the availability of rooms for a specific date")
                        .withResponseConverter((response) -> "" + response.available())
                        .build()))
                .build();

        ChatResponse response = chatClient.call(new Prompt(List.of(userMessage), promptOptions));
        return Map.of("generation", response.getResult().toString());
    }
}
You can see that the ChatClient/LLM is now aware of the CheckAvailabilityTool
function. I’ve also updated the default
message value to include a date that is set to be unavailable in the HotelBookingService
.
Let’s test it out:
/tmp ❯ curl http://localhost:8080/ai/simple
{"generation":"Generation{assistantMessage=AssistantMessage{content='I'm sorry, but we do not have any rooms available on February 28, 2025.', properties={role=ASSISTANT, finishReason=STOP, id=chatcmpl-9658XUpnLnRKZ2LUlbk4DbF00WBBV}, messageType=ASSISTANT}, chatGenerationMetadata=org.springframework.ai.chat.metadata.ChatGenerationMetadata$1@642222bf}"}
Perfect! The LLM was able to call the CheckAvailabilityTool function and respond accordingly. Next, I’ll build the same kind of wrapper functions around the BookRoom and FindBooking methods in the HotelBookingService.
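These follow the exact same pattern as CheckAvailabilityTool. As an illustration, here is a sketch of what the BookRoomTool can look like; the request fields and the bookRoom call are my assumptions (they line up with the service sketch above and the bookingStatus() response converter used by the Agent below).
import java.time.LocalDate;
import java.util.function.Function;

import org.springframework.stereotype.Component;

import lombok.RequiredArgsConstructor;

@Component
@RequiredArgsConstructor
public class BookRoomTool implements Function<BookRoomTool.Request, BookRoomTool.Response> {

    private final HotelBookingService hotelBookingService;

    // Fields the model must supply; dates are expected in ISO-8601 form (e.g. 2025-01-15).
    public record Request(String guestName, String checkInDate, String checkOutDate) {}

    // bookingStatus carries a human-readable confirmation or failure message.
    public record Response(String bookingStatus) {}

    @Override
    public Response apply(Request request) {
        LocalDate checkIn = LocalDate.parse(request.checkInDate());
        LocalDate checkOut = LocalDate.parse(request.checkOutDate());
        String status = hotelBookingService.bookRoom(request.guestName(), checkIn, checkOut);
        return new Response(status);
    }
}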
Chat Management
Spring AI doesn’t appear to provide a mechanism OOTB for managing the conversation context. I will be using a simple in-memory store using another singleton scoped spring service to manage a single conversation.
Note: With LLM APIs, they generally do not manage any of the conversation context. This is typically handled by the client, which will save all relevant messages & provide them upon each request for the model to generate a response.
Below is the simple ConversationService I’ve created:
@Service
@Scope("singleton")
@Slf4j
public class ConversationService {

    private final List<Message> messageList = Collections.synchronizedList(new ArrayList<>());

    /**
     * Adds a message to the conversation.
     *
     * @param message the message to be added to the conversation
     * @return the updated list of messages in the conversation
     */
    public synchronized List<Message> addMessage(Message message) {
        messageList.add(message);
        log.info("Added message to conversation: {}, total messages: {}", message, messageList.size());
        return messageList;
    }

    /**
     * Retrieves all messages in the conversation.
     *
     * @return the list of messages in the conversation
     */
    public synchronized List<Message> getAllMessages() {
        log.info("Retrieved all {} messages", messageList.size());
        return messageList;
    }
}
In a real-world application, this would be replaced with a more robust solution, such as Redis or a database, and would support concurrent conversations. As written, it only saves the context of our current conversation.
The Agent
Now that we have the services we will require, as well as methods on those services exposed as tools, we can build
the Agent. The Agent will need to utilize the ConversationService
to manage the conversation context, and will need to
be aware of the CheckAvailabilityTool
, BookRoomTool
, and CancelBookingTool
capabilities to fulfill its mission as a
hotel booking agent.
Review the code below:
@Component
@RequiredArgsConstructor
public class BookingAgent {

    private final ChatClient chatClient;
    private final CheckAvailabilityTool checkAvailabilityTool;
    private final FindBookingTool findBookingTool;
    private final BookRoomTool bookRoomTool;
    private final ConversationService conversationService;

    /**
     * When the BookingAgent is created, we define the agent's role as a SystemMessage at the top of the conversation.
     */
    @PostConstruct
    public void defineAgentProfile() {
        String agentProfile = "You are a booking agent for an online hotel. You are here to help customers book rooms and check availability. " +
                "Use the tools you have access to in order to help customers with their requests. You can check availability, book rooms, and find bookings.";
        SystemMessage systemMessage = new SystemMessage(agentProfile);
        conversationService.addMessage(systemMessage);
    }

    /**
     * When a message is sent to the agent, the agent will handle the message and return a response.
     *
     * @param message the user's message
     * @return the assistant's response
     */
    public String handleMessage(String message) {
        // Add the user message to the conversation
        UserMessage latestMessage = new UserMessage(message);
        conversationService.addMessage(latestMessage);
        List<Message> messages = conversationService.getAllMessages();

        var promptOptions = getPromptOptions();
        ChatResponse response = chatClient.call(new Prompt(messages, promptOptions));

        // Add the assistant response to the conversation
        conversationService.addMessage(response.getResult().getOutput());

        // Return the assistant response
        return response.getResult().getOutput().getContent();
    }

    /**
     * Expose function callbacks to the OpenAI chat client.
     */
    private OpenAiChatOptions getPromptOptions() {
        return OpenAiChatOptions.builder()
                .withFunctionCallbacks(List.of(FunctionCallbackWrapper.builder(checkAvailabilityTool)
                                .withName("CheckAvailability")
                                .withDescription("Helpful for checking the availability of rooms for a specific date, this should be used before booking a room for a new guest.")
                                .withResponseConverter((response) -> "" + response.available())
                                .build(),
                        FunctionCallbackWrapper.builder(bookRoomTool)
                                .withName("BookRoom")
                                .withDescription("Helpful for booking a room for a new guest for a specific check-in and check-out date")
                                .withResponseConverter((response) -> response.bookingStatus())
                                .build(),
                        FunctionCallbackWrapper.builder(findBookingTool)
                                .withName("FindBooking")
                                .withDescription("Helpful to determine if an existing guest has booked a room")
                                .withResponseConverter((response) -> response.booking())
                                .build()))
                .build();
    }
}
The defineAgentProfile method is used to set the stage for the agent. This would be referred to as “The Profiling Module” in Unified Framework for LLM Based Agents. When the agent is created, a SystemMessage explaining the agent’s role is added to the top of the conversation.
When ChatResponse response = chatClient.call(new Prompt(messages, promptOptions)); is called, the Agent is provided the entire context (note the messages list being passed to the new Prompt) as well as the promptOptions, which include all of the function callbacks that the Agent has access to.
Note: Spring AI provides multiple ways to expose tools to the Agent, including @Bean definitions in a configuration class. I opted for the @Component approach so that readers could see the entire Agent in one place. You can read about the other options in the Spring AI documentation.
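For completeness, here is a rough sketch of the @Bean approach, following the pattern shown in the Spring AI documentation. The bean name and @Description text here are my own.
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration
public class ToolConfiguration {

    // With this approach the tool class would not need the @Component annotation;
    // Spring AI can resolve the function by its bean name ("checkAvailability").
    // The @Description text is what the model sees when deciding whether to call it.
    @Bean
    @Description("Check the availability of rooms for a specific date")
    public Function<CheckAvailabilityTool.Request, CheckAvailabilityTool.Response> checkAvailability(HotelBookingService hotelBookingService) {
        return new CheckAvailabilityTool(hotelBookingService);
    }
}
The prompt options would then reference the function by its bean name (for example, via the builder’s withFunction("checkAvailability")) rather than wrapping the component in a FunctionCallbackWrapper.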
Testing the Agent
In order to test the agent, I created a simple BookingAgentTest
class which will invoke it with a few different messages.
Here are the contents of that test, then we’ll walk through the output:
@SpringBootTest
class BookingAgentTest {

    @Autowired
    private BookingAgent bookingAgent;

    @Test
    public void testBookingConversation() {
        String firstMessage = "Hi, my name is John--Can you see if any rooms are available on February 28, 2025?";
        System.out.println(bookingAgent.handleMessage(firstMessage));

        String availability = "Do you have any availability on January 15th, 2025?";
        System.out.println(bookingAgent.handleMessage(availability));

        // Book a room on the available date
        String alternativeDate = "Please book 1 room for John on January 15h, 2025. The check-out date will be January 21st, 2025.";
        // Expect a successful booking
        System.out.println(bookingAgent.handleMessage(alternativeDate));

        String checkBooking = "Can you see if a guest John has reserved any rooms?";
        // Expect a yes
        System.out.println(bookingAgent.handleMessage(checkBooking));

        // Demonstrate persisted conversation context
        String summarize = "Can you summarize our discussion today?";
        System.out.println(bookingAgent.handleMessage(summarize));
    }
}
In the above, we can see that we set up a few different scenarios for the agent to handle.
- Check on a date that is unavailable - The Agent should look up the availability and respond that there are no rooms available.
- Check on a date that is available - The Agent should look up the availability and respond that there are rooms available.
- Book a room - The Agent should book a room for John on the specified date range.
- Check if a booking exists - The Agent should respond that a booking exists for John.
- Summarize the conversation - The Agent should summarize the conversation.
The final request to summarize the conversation is really just to illustrate that the ConversationService is correctly managing the conversation. It should be housing all messages from both Agent & User, enabling the Agent to provide a complete summary of the conversation.
Here are some selected outputs from the test (I’ve added the comments above the log statements to provide context):
# 1st User message
Added message to conversation: UserMessage{content='Hi, my name is John--Can you see if any rooms are available on February 28, 2025?', properties={}, messageType=USER}, total messages: 2
# Agent response indicating Feb 28, 2025 is unavailable (as expected)
Added message to conversation: AssistantMessage{content='I'm sorry, John, but it looks like there are no rooms available on February 28, 2025. If you have any flexibility with your dates or need assistance with anything else, please let me know, and I'll be happy to help.', properties={role=ASSISTANT, finishReason=STOP, id=chatcmpl-96K38H75d6hEhDErQtBf3UiT5VZRC}, messageType=ASSISTANT}, total messages: 3
I'm sorry, John, but it looks like there are no rooms available on February 28, 2025. If you have any flexibility with your dates or need assistance with anything else, please let me know, and I'll be happy to help.
# User suggests a different date
Added message to conversation: UserMessage{content='Do you have any availability on January 15th, 2025?', properties={}, messageType=USER}, total messages: 4
# Agent response indicating Jan 15, 2025 is available (as expected)
Added message to conversation: AssistantMessage{content='Great news, John! We do have rooms available on January 15th, 2025. If you'd like to proceed with booking a room, please provide me with the details, such as the number of nights you'd like to stay and any specific room preferences you may have.', properties={role=ASSISTANT, finishReason=STOP, id=chatcmpl-96K3CelXXwBTlN0SJrBDFBDS9HmcD}, messageType=ASSISTANT}, total messages: 5
# User requests a booking
Added message to conversation: UserMessage{content='Please book 1 room for John on January 15h, 2025. The check-out date will be January 21st, 2025.', properties={}, messageType=USER}, total messages: 6
# Agent response indicating the booking was successful
Added message to conversation: AssistantMessage{content='Your room has been successfully booked, John! You'll be staying from January 15th to January 21st, 2025. If you need any further assistance or confirmation details, please don't hesitate to ask. Enjoy your stay!', properties={role=ASSISTANT, finishReason=STOP, id=chatcmpl-96K3HwfCdXdiUy7iRvrjo4NKHrWn7}, messageType=ASSISTANT}, total messages: 7
# User checks if a booking exists (Just to demonstrate the FindBookingTool)
Added message to conversation: AssistantMessage{content='Yes, John, you have a reservation with us. Here are the details of your booking:
- Check-in Date: January 15, 2025
- Check-out Date: January 21, 2025
If you need any further assistance regarding your booking or if there's anything else I can help you with, please let me know.', properties={role=ASSISTANT, finishReason=STOP, id=chatcmpl-96K3LfNZJsHYfXCOsfMT6HfTOFn0r}, messageType=ASSISTANT}, total messages: 9
We can see that the Agent is able to handle the different scenarios and provide the appropriate responses. The Agent is invoking tools appropriately, tracking the conversation & relaying accurate information. I’m happy with the results!
To button things up, I’ve adjusted the SimpleAiController to use the BookingAgent, which enables us to interact with the Agent via a simple REST API.
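Roughly, the controller now just delegates to the agent. This is a sketch; the endpoint path is carried over from the earlier version.
import java.util.Map;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SimpleAiController {

    private final BookingAgent bookingAgent;

    public SimpleAiController(BookingAgent bookingAgent) {
        this.bookingAgent = bookingAgent;
    }

    // All chat traffic now flows through the BookingAgent, which tracks the
    // conversation and exposes the booking tools to the model.
    @GetMapping("/ai/simple")
    public Map<String, String> completion(@RequestParam("message") String message) {
        return Map.of("generation", bookingAgent.handleMessage(message));
    }
}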
Conclusion
This has been a fun exploration of the Spring AI framework. I was able to build a simple Hotel Booking Agent that could handle a few different scenarios. As mentioned at the top of the article, Function Calling support is not available in the latest stable release. Compared to the LangChain4J framework, getting a basic chat client running with Spring AI took less configuration, but wiring up function calling was more involved, and Spring AI provides fewer out-of-the-box features. Furthermore, it appears that in its current form Function Calling isn’t very portable between models, but this will require further exploration.
One other gripe I have with the Spring AI framework is that we do not get as much insight into, or control over, the function calling under the hood. With LangChain4J we can stash the function requests/responses in a chat MemoryStore, which enables agents to track their tool executions over a long-form conversation. This doesn’t seem easily possible with the current SNAPSHOT version of Spring AI. The common example I use for this problem is a bot being exposed to a tool that requires pagination. If the bot cannot track the pagination state (by seeing what it invoked a while ago), it will get stuck in a loop requesting the first page.
In all, I’m excited to see where the Spring AI project goes. I think it has a lot of potential, and I’m looking forward to seeing how it evolves. The Spring team & community has a great track record of building robust, developer-friendly tools, and I’m excited to see what the future holds for Spring AI.
The full code for today’s project, as well as a Postman collection to interact with the Agent, can be found here.