Mixed Reality + Diffusion prototype as a tool for exploring concepts, styles, and moods by transforming real-world surroundings into alternate realities.
#MixedReality #MR #AI #StableDiffusion #Quest3
From Montreal to the mountains. Mixed Reality + Nano Banana to transform real-world surroundings. Iterating on my prototype to enable concurrent generation.
#NanoBanana #WithGoogleGemini #MixedReality #AI #Quest3
Mixed Reality + Nano Banana — Quick test using Gemini 2.5 Flash Image in MR to transform real-world surroundings. Not as fast as SDTurbo, but it opens up different possibilities, especially when it comes to preserving the source image. #NanoBanana #WithGoogleGemini #MixedReality #AI
We explored the use of 3D Gaussian splatting (3DGS) for museums, using an @mbamtl gallery as our testbed. A single on-site capture rendered across web, VR, and installation. Thank you to the MBAM for the support.
See & try: labs.dpt.co/article-3dgs.htm…
Here is the web version built using @playcanvas
Two real-time img2img diffusion pipelines running side by side, each with its own video input, blending and alternating between prompts via a MIDI controller.
— Hands-on Hallucination.
#ai #stablediffusion #realtime
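A minimal sketch of how MIDI-driven prompt blending like this could work, assuming each prompt has been encoded to an embedding and a CC knob supplies the mix; the function names here are illustrative, not from the actual prototype:

```python
import numpy as np

def cc_to_weight(cc_value: int) -> float:
    """Map a MIDI CC value (0-127) to a blend weight in [0, 1]."""
    return max(0, min(127, cc_value)) / 127.0

def blend_prompt_embeddings(emb_a: np.ndarray, emb_b: np.ndarray,
                            cc_value: int) -> np.ndarray:
    """Linearly interpolate between two prompt embeddings.

    The result would be fed to an img2img pipeline in place of a
    single prompt's embedding, so turning the knob morphs the output
    between the two prompts in real time.
    """
    w = cc_to_weight(cc_value)
    return (1.0 - w) * emb_a + w * emb_b

# Knob fully clockwise: output is entirely the second prompt.
emb_a = np.zeros(4)  # stand-in for the first prompt's embedding
emb_b = np.ones(4)   # stand-in for the second prompt's embedding
mixed = blend_prompt_embeddings(emb_a, emb_b, 127)
```

Alternating (rather than blending) would just mean snapping the weight to 0 or 1 instead of interpolating.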
Quick glimpse of this weekend’s Hands-on Hallucination exploration. I’ve always been captivated by the power of masks; they feel like portals into other worlds. I’ll share a longer video later.
#AI #StableDiffusion #Realtime #RealTimeAI
Hands-on Hallucination session: Focusing the diffusion process with the Hallucination Scope.
Experimenting with new ways to explore and transform a scene.
#AI #StableDiffusion #Realtime
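The Hallucination Scope's internals aren't described here; one plausible version of "focusing" the diffusion is to composite the generated frame into the source only inside a soft circular region. A sketch of that idea (all names and parameters are assumptions for illustration):

```python
import numpy as np

def radial_mask(h: int, w: int, cx: float, cy: float,
                radius: float, feather: float = 20.0) -> np.ndarray:
    """Soft circular mask: 1 inside the scope, ramping to 0 over `feather` px."""
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2)
    return np.clip(1.0 - (dist - radius) / feather, 0.0, 1.0)

def focus_composite(source: np.ndarray, generated: np.ndarray,
                    mask: np.ndarray) -> np.ndarray:
    """Show the diffused frame inside the scope, the source outside it."""
    m = mask[..., None]  # broadcast over the color channels
    return m * generated + (1.0 - m) * source

# Example: a 64x64 frame, scope centered in the image.
src = np.zeros((64, 64, 3))  # stand-in for the camera frame
gen = np.ones((64, 64, 3))   # stand-in for the diffused frame
out = focus_composite(src, gen, radial_mask(64, 64, 32, 32, 10.0, 5.0))
```

The same mask could alternatively be passed to an inpainting pipeline so the model only denoises inside the scope, rather than compositing after the fact.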
From the previous session, I pulled a few frames of generated creatures and turned them into 3D models using Hunyuan 3D 2.5. Thankfully, I don’t have a resin 3D printer.
#AI #StableDiffusion #Realtime #3D #Hunyuan3D
2nd half of the previous Hands-on Hallucination session: New camera angle, magic clay in motion, and an additional light source.
#AI #StableDiffusion #Realtime