Development Practices · Conference · 50 min
Testing AI service providers with local LLMs, Testcontainers, and Microcks
This session demonstrates building an integration testing framework for Quarkus apps using LLM engines and cloud services. It covers efficient local testing with tools like Testcontainers, Microcks, and Ollama, enabling cost-effective, reproducible AI and cloud service tests—minimizing reliance on public resources and credits.
Oleg Nenashev, Independent
When managing services at scale, you often need to automate testing while mocking (or modeling) the parts provided by public clouds, to minimize external resource use and costs. If you do not want to burn your credits with public LLM providers, you may want to do the same for the models, too.
For AWS integration testing, we use LocalStack... And we have local LLM engines like Ollama to test our app with! For common software, we can use Testcontainers, a popular framework for Docker and other container engines. Can we use the same framework for testing AI providers and organizing developer services? Yes! Can we go further, enforce consistent LLM behavior, and inject failures to get reproducible tests? Also yes: there are Microcks, WireMock, and their extensions. And yes, we can put everything in a Dev Container on your local machine!
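To illustrate the "everything in a Dev Container" idea, a minimal `.devcontainer/devcontainer.json` might look like the sketch below. The image tag and feature version are assumptions for illustration, not a verified setup:

```json
{
  "name": "quarkus-llm-testing",
  // Java base image; pick the tag matching your toolchain (assumed here)
  "image": "mcr.microsoft.com/devcontainers/java:21",
  "features": {
    // Docker-in-Docker so Testcontainers can start Ollama, Microcks, etc.
    "ghcr.io/devcontainers/features/docker-in-docker:2": {}
  },
  "containerEnv": {
    // Opt in to Testcontainers' container reuse to speed up repeated test runs
    "TESTCONTAINERS_REUSE_ENABLE": "true"
  }
}
```

With Docker available inside the Dev Container, the same integration tests run identically on a laptop and in CI.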
In this session, we will take a Quarkus-based app that leverages an LLM engine and a few MCP servers, and build an integration testing framework for it from scratch, with the help of Quarkus Dev Services, LangChain4j, Testcontainers, Microcks and Ollama.
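As a rough sketch of what such a setup can look like, the following `application.properties` fragment points a Quarkus app at a local Ollama model; the exact property keys vary by quarkus-langchain4j version, so treat the key names and model id as illustrative assumptions rather than a verified configuration:

```properties
# Point the LangChain4j Ollama extension at a local model.
# With Quarkus Dev Services, missing backing services can be
# started automatically in containers during dev and test mode.
quarkus.langchain4j.ollama.chat-model.model-id=llama3.2
quarkus.langchain4j.ollama.base-url=http://localhost:11434

# Reduce randomness for more reproducible test assertions
# (temperature key assumed; check your extension's docs).
quarkus.langchain4j.ollama.chat-model.temperature=0
```

Pinning the model and lowering the temperature narrows the output space, which makes assertions in integration tests far less flaky.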
Oleg Nenashev
Oleg is a developer tools hacker, community builder, and DevRel consultant. He's a passionate open-source software, open ecosystems, and open hardware advocate. Oleg is a CNCF and CDF ambassador, Testcontainers Champion, Kotlin Foundation Ecosystem Committee Member, and a former Jenkins Board member and CDF TOC Chair. Oleg has a PhD in electronics design and volunteers in the Free and Open Source Silicon Foundation, as well as in Ukrainian support and Russian anti-war organizations.