We presented BFF at the 2025 NeurIPS Creative AI Track, Dec 2-7.
Details
NeurIPS 2025: The Thirty-Ninth Annual Conference on Neural Information Processing Systems. San Diego Convention Center, Tuesday, Dec 2 through Sunday, Dec 7; Mexico City, Sunday, Nov 30 through Friday, Dec 5.
Twomey, R., Fleming, J.R. (2025) Best Friends Forever. In NeurIPS 2025 Creative AI Track.
This performance-lecture (with a robot dog) presents artist and engineer Robert Twomey’s artistic research into the ways we live with, think through, and feel alongside machines. From his dissertation “Machines for Living”—a study of the smart home as an intimate site of technological cohabitation—to recent work on “communing” with creative AI, Twomey explores how emerging technologies shape domestic, perceptual, and emotional life. Central to his practice is the design of introspective technologies: hybrid systems that act as mirrors, surrogates, and partners, producing mutually revelatory encounters between human and machine. Twomey introduces BFF, a new media artwork with Jesse Fleming, in which two artist-researchers co-parent and converse with quadruped robot dogs running local LLMs. Structured as a Batesonian metalogue, BFF stages recursive, embodied dialogue about AI alignment, simulation, and attachment, offering a poetic exploration of machine intimacy at the frontiers of art, AI, and the everyday.
Professor Twomey’s Athenaeum presentation is the keynote for the Third Annual Meeting of the World Imagination Network.
I will present my project Becoming BFFs: Developing Cinematic Autoethnography with a Robot Dog as part of the SICCA Fellowship Forum in Fall 2025.
“Becoming BFFs” explores intimacy, embodiment, and AI alignment in the evolving relationship between a human parent and their robot dog. Using LIDAR imaging, 360° video, and snapshots of internal AI model states, I have piloted a hybrid cinematic language that blends human and machine perception. The project extends traditions of cinéma vérité in an algorithmic register, asking what it means to see ourselves through the perceptual and cognitive systems of machines. It turns these introspective technologies toward describing an emerging human-robot relationship. This project is a collaboration with Jesse Fleming.
Our paper on Quantum Theater is live in the ACM Proceedings of SIGGRAPH 2025:
Quantum Theater takes quantum phenomena and re-imagines them as playable theater using generative AI to expand narrative possibilities in real-time. Working with archival materials and recent literature, it engages quantum science both as a subject and a means for creating playable experience, exploring the history and current development of the field. Phenomena like entanglement, superposition, coherence, and collapse are used as models for experiential and narrative effects manifest in theatrical performance. Through XR techniques multiple realities are layered on stage, branching and collapsing as the action develops over the course of the performance. Functional quantum modules shape this narrative logic in a post-AI exploration of liveness, variability, and improvisation. Quantum Theater explores simultaneous narratives in a space of competing realities, casting the audience as observer-participants actively cohering a story through their choices.
Robert Twomey, Ash Eliza Smith, Reid Brockmeier, and Samantha Bendix. 2025. “Quantum Theater: Extending Realities for Post-AI Liveness”. In Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Spatial Storytelling (SIGGRAPH Spatial Storytelling ’25). Association for Computing Machinery, New York, NY, USA, Article 10, 1–3. https://doi.org/10.1145/3721244.3742446
Quantum Theater: Extending Realities for Post-AI Liveness
Quantum Theater takes quantum science as subject and method for playable theater. Phenomena like entanglement, superposition, coherence, and collapse shape the performance in a post-AI exploration of liveness, variability, and improvisation. Multiple realities are layered on stage, where the audience as observer-participant plays an active role in cohering singular narratives.
Join us for Live Action Robotic Role Play (LARRP) with robots and help shape the future of multispecies care at the Kiewit Luminarium in Omaha, Nebraska. July 17, 2025, 7-10pm.
We will have two robots on hand for guests to interact with: a robotic dog and a robotic arm. Guests will be prompted to interact through different scenarios and play out our robotic futures. We will be gathering data, ideating on, and embodying new ways of giving and receiving care in more-than-human futures. This is an extension of the Speculative Robotics Lab pop-up at UNL in 2025.
I’m piloting a new course on creative code for Fall 2025 in the Department of Visual Arts: VIS42 Intro to Creative Code. This will replace our existing Computer Science (CS) requirement for Interdisciplinary Computing and the Arts (ICAM) Majors.
Description: This course provides students with a foundation in programming and computational thinking, and their application in creative projects. Topics covered may include generative graphics and sound, interactive media, and others. Students will gain practical skills through hands-on experience and experimentation, learning to integrate computing into artistic practices. No prior programming experience is required.
It was a pleasure to host Mendi + Keith Obadike at UCSD this past week on the occasion of their show “The Skeuomorph” at the Gallery QI. Talk and panel discussion embedded below.
I served as a guest judge for the Triton XR hackathon held at the Design Innovation Building on March 1-2, 2025. Triton XR is the UCSD VR/AR/XR student organization I serve as faculty advisor for. The theme was Mindfulness, and I judged along with Trish and Joe from Maveric Studio, Cassie Vietan from the Stanford Compassion Institute/Center for Mindfulness, Jessica D’Elena Tweed, and external guests from Unity.