Future & Robots · Lunch Talk · 15 min
The Intuitive Web: Designing for Multimodal Search Experiences
Multimodal search changes how people interact with machines by combining images, text, and voice into a single search experience. This talk breaks down the core modes of multimodal search, examines how AI is redefining intent signals, and maps out real-world applications, showing developers how to build intuitive systems that match natural information-seeking behavior and simplify queries that traditional methods handle poorly.
Myriam Jessier · Neurospicy Agency & PRAGM
When & Where
Friday, June 20, 12:45-13:00
NT
Multimodal search is a new level of human-machine communication. It lets us interact with information by combining images, text, and voice, adding extra context to create intuitive search experiences. Snap a pic of the ingredients, and boom! You've got instant intel that used to take serious detective work. Is it vegan? Gluten-free? No more relying on brands to spell it out. I've been documenting common combinations for search "modes" and would love to talk about how we can harness multimodal search to make complex queries simpler for users, more in line with how we all intuitively search for things. In this talk, we'll break down the core "modes" of multimodal search, explore how AI is redefining intent signals, and map out real-world applications that push search beyond traditional queries and hashtags. Let's talk about how developers and technologists can tap into this shift to build smarter, more responsive systems that align with the way people naturally seek information.
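The abstract doesn't prescribe any implementation, but to make the idea of blended intent signals concrete, here is a minimal sketch of an image-plus-text query against a product catalog. Everything in it is an assumption for illustration: it uses the sentence-transformers library with its clip-ViT-B-32 checkpoint, an invented three-item catalog, and a hypothetical snack_photo.jpg, and averaging the two embeddings is just one naive way to fuse the signals.

# A minimal sketch, not from the talk: blends an image and a text
# refinement into one query vector and ranks a toy catalog by it.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")  # CLIP maps images and text into one embedding space

# Hypothetical product catalog.
catalog = [
    "vegan gluten-free oat cookies",
    "classic butter shortbread",
    "almond flour crackers, gluten-free",
]
catalog_emb = model.encode(catalog, convert_to_tensor=True)

# Multimodal query: a photo of the product plus a spoken/typed refinement.
image_emb = model.encode(Image.open("snack_photo.jpg"), convert_to_tensor=True)
text_emb = model.encode("vegan and gluten-free", convert_to_tensor=True)

# Naive fusion: average the two intent signals into one query vector.
query_emb = (image_emb + text_emb) / 2

# Cosine-similarity search over the catalog embeddings.
hits = util.semantic_search(query_emb, catalog_emb, top_k=1)
print(catalog[hits[0][0]["corpus_id"]])

In a real system the fusion weights, the retrieval index, and the intent handling would all be tuned per "mode", which is exactly the design space the talk explores.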
Myriam Jessier
Myriam Jessier enjoys going down rabbit holes, hyperfocusing on search-related things, dinosaurs, and anything octopus-related. They have had a long, successful career in search and digital analytics. If you need creative solutions to prickly technical marketing and data problems, you can count on Myriam to tackle them.