DevOps
Byte size · 15 min
BEGINNER

Framework Desktop and IncusOS, a perfect combination for running LLMs locally

This talk demonstrates setting up and using an IncusOS desktop server for running AI applications locally, highlighting benefits like privacy, cost control, GPU support, and eco-efficiency. Attendees will learn setup steps, common issues, and how Incus enables efficient, private, and scalable local LLM deployment for developers and DevOps professionals.

Peter Smink
Team Rockstars IT/ASML

When and where

Wednesday, April 1, 17:30-17:45
Zaal 4
Description
If you are seriously developing AI applications, a Framework desktop server running IncusOS is a must-have.
I will discuss the pros and cons of such a setup,
walk through the setup process, and cover the issues I ran into while setting it up and using it.
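As a rough sketch of the kind of setup this involves (the image alias, instance name, resource limits, and port below are illustrative assumptions, not the exact configuration from the talk):

```shell
# Launch a container on the IncusOS server to host the models
# (image alias, instance name, and limits are illustrative).
incus launch images:ubuntu/24.04 llm-host \
    -c limits.cpu=8 -c limits.memory=32GiB

# Pass the host GPU through to the instance so models can run on it.
incus config device add llm-host gpu0 gpu

# Forward the model server's port so a laptop on the network can reach it.
incus config device add llm-host api proxy \
    listen=tcp:0.0.0.0:11434 connect=tcp:127.0.0.1:11434
```

Inside the instance you would then install your model runtime of choice; the proxy device makes its API reachable from the laptop running the AI application.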

I will give a demo of how you can run your own AI application on a laptop that uses AI models running on the Incus server.
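A minimal sketch of what such a laptop-side app might look like, assuming a model server (e.g. Ollama) inside an Incus instance exposes an OpenAI-compatible chat endpoint; the hostname, port, model name, and helper functions are hypothetical:

```python
import json
import urllib.request

# Hypothetical endpoint: a model server running inside an instance on the
# Incus server, exposing an OpenAI-compatible chat completions API.
INCUS_LLM_URL = "http://incus-server.local:11434/v1/chat/completions"


def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask_llm(prompt: str) -> str:
    """Send the prompt to the locally hosted model and return its reply."""
    req = urllib.request.Request(
        INCUS_LLM_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint is OpenAI-compatible, the same application code works whether the model runs on the Incus server or at a public provider, which makes it easy to compare the two setups.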

Key takeaways
  • A guide for setting up your own configuration and the issues you may encounter
  • A good impression of how this setup can help your own AI development
  • A solution for running LLMs locally for privacy reasons
  • A solution for running LLMs locally on a GPU for performance
  • A solution for keeping costs under control: if your application uses many MCP tools, the cost per call is high when using public AI services
  • A solution for running LLMs that do not fit in graphics cards
  • You are a DevOps engineer and just want to see how a modern tool like Incus can be used for running any container or VM
  • You want to be eco-friendly and control or reduce your own energy usage for running AI

Target audience:
AI developers, DevOps engineers, and anyone who wants to run LLMs locally in a relatively efficient way.
privacy
performance
incusos
llm
Speakers
Peter Smink

Team Rockstars IT/ASML

Netherlands

More than 37 years of experience as a software/system developer.
