Real-Time Experiences with Jean-François Larouche, Moment Factory
Immersive Experiences

Founded in 2001, Moment Factory has grown into a global leader in creating immersive, tech-driven experiences that bring people together in public spaces. Blending art, innovation, and storytelling, the studio continues to push the boundaries of experiential design. In this interview, we speak with Jean-François Larouche, Director of Real-Time & Creative Director, about Moment Factory's journey, the rise of real-time content, and the technologies shaping the future of immersive environments.
Can you tell us a few words about Moment Factory - a brief history, guiding principles and recent evolution?
Moment Factory is a multimedia studio founded in 2001 with a mission to bring people together by creating shared, immersive experiences. Guided by the principles of creativity, collaboration, and innovation, we specialise in blending art and technology to craft unforgettable moments. Over the years, we’ve evolved from being a pioneer in projection mapping and interactive installations to a global leader in experiential design, embracing technologies like real-time content and artificial intelligence to redefine what’s possible in public spaces.
And can you share how you first got started in the industry and your Moment Factory journey?
I began my career at the intersection of design and technology, fuelled by a fascination for how digital tools can elevate and transform human experiences.
Joining Moment Factory was a natural extension of this passion, providing a platform to explore groundbreaking ideas and collaborative innovation. Over the years, I’ve had the privilege of leading and contributing to multidisciplinary teams, combining expertise in UX, creative direction, and software development to craft immersive environments that push the boundaries of interactive content and deeply resonate with audiences worldwide.

Phish at the Sphere. Image credits: Alive Coverage
What trends have you seen develop and grow in the experiential space over the last five years?
Over the past five years, there has been a significant surge of interest in real-time audiovisual content, in more intelligent, personalised, and connected experiences, and in interactivity. Recently, the integration of AI has been transformative, making environments more adaptive and allowing us to create large-scale modular systems and transform our production pipelines.
Audiences today expect deeper engagement, often delivered through multi-sensory narratives and participatory design that invites them to actively shape their experience.
Real-time content has seen a huge upswing in recent years - can you define what this means to you and why it is important for modern immersive experiences?
Real-time content is dynamic, adaptable media that evolves in response to live inputs such as audience interaction, environmental data, or external triggers.
It’s a game changer for experiential design because it fosters a two-way dialogue between the installation and its audience, transforming passive spectators into active participants.
This immediacy enables us to craft deeply engaging, personalised experiences that are always fresh, relevant, and responsive, seamlessly bridging the physical and digital worlds.
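To make the idea concrete, here is a minimal sketch (not Moment Factory's actual stack) of what a real-time content system looks like in principle: a loop that polls live inputs each frame and maps them to content parameters. The sensor names and mappings are hypothetical placeholders for illustration only.

```python
import random
import time

def read_live_inputs():
    """Stand-in for real sensors: audience count and ambient noise level."""
    return {"audience": random.randint(0, 50), "noise": random.random()}

def update_content(state, inputs):
    """Map live inputs to content parameters (brightness and tempo here)."""
    state["brightness"] = min(1.0, 0.2 + inputs["audience"] / 50)
    state["tempo"] = 60 + inputs["noise"] * 60  # BPM rises with ambient noise
    return state

state = {"brightness": 0.2, "tempo": 60}
for frame in range(3):  # a real installation would loop indefinitely
    state = update_content(state, read_live_inputs())
    time.sleep(0.016)  # ~60 fps frame budget
```

The essential point is the feedback loop: output parameters are recomputed from fresh inputs every frame, so the content is never a fixed playback timeline.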
Could you share an example of a project where you implemented real-time content? How did the flexibility of the content enhance your project's experience?
A notable example is our recent collaboration with the band Phish on a four-night concert series at Sphere in Las Vegas. For this project, we faced the unique challenge of aligning the visual content with the band’s improvisational style while meeting the technical precision required by Sphere’s cutting-edge technologies. To address this, we developed an innovative real-time platform that fused pre-rendered visuals with real-time rendering, creating a dynamic "live VJing set." This approach allowed the visuals to adapt fluidly to the band’s spontaneous performances across four 3.5-hour shows.
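The "live VJing" idea of fusing pre-rendered and real-time layers can be sketched as a simple crossfade: an operator's fader sets how much of the generative layer shows through. This is an illustrative toy, not the platform built for Sphere; frames here are lists of RGB tuples standing in for GPU textures.

```python
def blend_layers(pre_rendered, generative, mix):
    """Crossfade two RGB frames: mix=0 -> pre-rendered, mix=1 -> generative."""
    if not 0.0 <= mix <= 1.0:
        raise ValueError("mix must be in [0, 1]")
    return [
        tuple(p * (1 - mix) + g * mix for p, g in zip(pp, gg))
        for pp, gg in zip(pre_rendered, generative)
    ]

# Operator pushes the fader to 25% live content:
frame = blend_layers([(255, 0, 0)], [(0, 0, 255)], 0.25)
```

Because the mix value can be driven live, the visuals can lean into the generative layer during improvised passages and back toward authored content for precise cues.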

Phish at the Sphere. Image credits: Alive Coverage
How do you design content that can adapt in real-time to audience interactions?
Designing adaptive content begins with a deep understanding of the audience and the context of interaction. User experience is our primary benchmark, guiding every creative and technical decision.
Inspired by agile development methodologies, we prioritise rapid prototyping to explore ideas, embrace failures, and refine solutions—following the "fail fast, fail often" philosophy.
Prototyping is essential: it allows us to quickly test and adjust how different responses feel in context, ensuring that the interaction remains intuitive, engaging, and meaningful.
How does real-time data integration impact the creation and execution of real-time content in your location-based experience projects?
Real-time data integration brings a new dimension of depth and responsiveness to LBX projects, making them feel dynamic and alive. It enables installations to react in real time to environmental changes, audience density, or even live social media trends.
For example, integrating weather data can adjust the mood of a space, while real-time audience feedback can dynamically shift content priorities to maintain engagement. A prime example of this approach is our recent project at Changi Airport Terminal 2, where we created a digital sky. This innovative feature opens an overhead window to the world above, emulating daylight and meteorological conditions in real time through integration with the airport weather system.
However, this level of responsiveness requires meticulous planning to ensure a smooth operation. Robust data pipelines, low-latency processing, and reliable fallback systems are essential to address technical challenges and maintain performance.
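The fallback principle mentioned above can be sketched as follows. This is not the Changi Airport system's actual integration, just a hypothetical illustration: a live weather lookup that, on failure, degrades gracefully to a default condition instead of freezing the display.

```python
def fetch_weather():
    """Stand-in for a live airport weather feed; may fail or time out."""
    raise TimeoutError("weather service unreachable")

# Hypothetical sky colours per weather condition (RGB)
SKY_PALETTES = {
    "clear": (135, 206, 235),
    "rain": (105, 105, 120),
    "storm": (60, 60, 75),
}
DEFAULT_CONDITION = "clear"  # a reliable fallback keeps the sky believable

def current_sky_colour():
    try:
        condition = fetch_weather()
    except (TimeoutError, ConnectionError):
        condition = DEFAULT_CONDITION  # fall back rather than show an error
    return SKY_PALETTES.get(condition, SKY_PALETTES[DEFAULT_CONDITION])
```

The design choice worth noting is that the fallback path produces plausible content, not an error state, so a data outage is invisible to the audience.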

Changi Airport Terminal 2. Image Credits: Moment Factory.
What are the main challenges of creating real-time content, and how do you overcome them?
One of the primary challenges in creating real-time content is balancing the technical complexity of these systems with the artistic vision. Real-time systems require high computational efficiency, seamless integration, and precise data handling, which can sometimes impose constraints on creative ambitions.
We address these challenges by fostering close collaboration between creative and technical teams from the very beginning. By adopting agile workflows and prioritising iterative prototyping, we ensure that both the artistic and technical aspects are aligned. Extensive testing helps us optimise performance and identify potential issues early.
Scalability and adaptability are equally important, as many installations need to evolve over time to remain relevant and engaging. This forward-thinking approach allows us to deliver projects that are both technically robust and creatively compelling.
Where do you see the future of real-time content for LBX heading? What innovations are you most excited about?
The future of real-time content lies in its ability to create hyper-personalised and contextually aware experiences. Innovations like AI-driven narrative design, real-time audience sentiment analysis, and the integration of wearables will make experiences even more immersive.
I’m particularly excited about the potential of mixed reality and the convergence of virtual worlds with physical spaces, where real-time content can provide harmonious transitions between the two realms, unlocking entirely new forms of storytelling.
Not only does this allow us to “bridge the physical and digital worlds,” but it also reinforces our core mission: to create experiences “in the real world where audiences can connect together and live collective experiences.” This focus on shared, tangible moments remains at the heart of everything we do, even as we explore the frontiers of emerging technologies.
And what other technologies or working practices are you experimenting with that show real promise for the future?
Our philosophy has always been to hijack technology and use it to create new magic tricks and illusions that resonate with people on an emotional level. AI is no exception—it’s a thrilling new tool to experiment with, opening up creative possibilities we could only dream of before.
We’re exploring AI tools for generative content creation, which significantly enhances and accelerates the ideation and prototyping process. These tools empower our team and enable us to take our experiences to new levels that were previously unattainable.
Additionally, AI is transforming our workflows by streamlining pipelines and addressing tasks we may not want—or need—to focus on manually. This “Everyday AI” empowers us to dedicate more energy to creativity and innovation while ensuring that repetitive or time-consuming tasks are handled efficiently.
That said, our approach to AI remains deeply human-centered. For us, AI is made by humans, for humans. Our motto, “We do it in public,” reflects our commitment to transparency and collaboration. Ultimately, we believe a human must always make the final decisions, ensuring that the creative vision and production cues remain authentic and aligned with our values.