Together with Prof. Karcher Morris and postdoctoral scholar Jon Paden, we have been awarded a $45,000 grant from the UC San Diego Course Development and Instructional Improvement Program (CDIIP) to develop and pilot imagination-centered modules for engineers within STEM curricula. This builds on my work as a lecturer in Data Science and Electrical and Computer Engineering (ML for the Arts), cultivating human imagination within STEM education, with a focus on imagination as a driver of engagement, retention, and a broadened scope for STEM disciplines. The modules and resources we develop (and publish) will be shaped with an eye toward broad applicability across diverse educational fields.
I am co-organizing a workshop on Computational Measurements of Machine Creativity (CMMC) for CVPR21.
Bridging the Gap between Subjective and Computational Measurements of Machine Creativity
While the methods for producing machine creativity have improved significantly, the discussion toward a scientific consensus on measuring the creative abilities of machines has only begun. As Artificial Intelligence becomes capable of solving more abstract and advanced problems (e.g., image synthesis, cross-modal translation), how do we measure the creative performance of a machine? In the world of visual art, subjective evaluations of creativity have been discussed at length. In the CVPR community, by comparison, evaluating a creative method has not been as systematized. Our goal in this workshop is to discuss current methods for measuring creativity with experts in creative artificial intelligence as well as artists. We do not wish to narrow the gap between how humans evaluate creativity and how machines do; instead, we wish to understand the differences and create links between the two so that our machine creativity methods improve.
June 20, 2021, 11:00am – 2:30pm EDT | http://cmmc-cvpr21.com/
I gave a workshop with faculty and graduate students from the University of Chicago Digital Media Workshop and Poetry & Poetics Workshop on Machine Imagination: Text to Image Generation with Neural Networks.
Description: With recent advancements in machine learning techniques, researchers have demonstrated remarkable achievements in image synthesis (BigGAN, StyleGAN), textual understanding (GPT-3), and other areas of text and image manipulation. This hands-on workshop introduces state-of-the-art techniques for text-to-image translation, where textual prompts are used to guide the generation of visual imagery. Participants will gain experience with OpenAI’s CLIP network and Google’s BigGAN, using free Google Colab notebooks that they can apply to their own work after the event. We will discuss other relationships between text and image in art and literature; consider the strengths and limitations of these new techniques; and relate these computational processes to human language, perception, and visual expression and imagination. Please bring a text you would like to experiment with!
Workshop link here: https://github.com/roberttwomey/machine-imagination-workshop
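The core idea behind this kind of text-guided generation can be illustrated without the real models: optimize a latent vector so that the embedding of the generated image moves toward the embedding of the text prompt. Below is a toy NumPy sketch in which random linear maps stand in for BigGAN (the generator) and CLIP (the encoders); every name and matrix here is an illustrative assumption, not the actual CLIP or BigGAN API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions, not the real models):
# G maps a latent vector to a flat "image"; E embeds an image;
# in the workshop, G is BigGAN and E is CLIP's image encoder.
G = rng.normal(size=(64, 16))   # generator: latent (16,) -> image (64,)
E = rng.normal(size=(8, 64))    # embedder: image (64,) -> embedding (8,)
text_emb = rng.normal(size=8)   # pretend CLIP text embedding of the prompt
text_emb /= np.linalg.norm(text_emb)

def clip_score(z):
    """Cosine similarity between the embedded 'image' and the prompt."""
    img_emb = E @ (G @ z)
    return img_emb @ text_emb / np.linalg.norm(img_emb)

def grad(f, z, eps=1e-5):
    """Finite-difference gradient (a stand-in for autograd)."""
    g = np.zeros_like(z)
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        g[i] = (f(z + dz) - f(z - dz)) / (2 * eps)
    return g

z = rng.normal(size=16)
before = clip_score(z)
for _ in range(200):            # gradient ascent on the latent vector
    z += 0.1 * grad(clip_score, z)
after = clip_score(z)
print(f"score before: {before:.3f}, after: {after:.3f}")
```

In the real Colab notebooks, autograd replaces the finite-difference gradient and the similarity is computed by CLIP over actual BigGAN samples, but the optimization loop has this same shape.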
I spoke at the April 30 ACM SIGGRAPH Digital Arts Community SPARKS event on Robotics, Electronics, AI, moderated by Hye Yeon Nam and Jan Searleman.
My talk, From Experimental Human Computer Interaction to Machine Cohabitation: New Directions in Art, Technology, and Intimate Life, explored human-computer cohabitation:
How do we prepare for a future living, working, and learning with machines? What new possibilities arise from the advent of always-on intelligent assistants, affordable co-robotic platforms, and ubiquitous AI? Now that we have invited the machines into our homes, our workplaces, our intimate everyday, how can we reimagine the terms of our human-computer interactions?
Through the presentation of a series of experimental arts projects, this talk addresses our machine cohabitant future. I will show key previous works building affective surrogates, developing inhabitable smart spaces, and situating machine observers with varying degrees of agency within shared environments. These projects lead into a discussion of my current work building embodied interfaces and staging experimental Human-Robot Interactions. I will raise critical concerns with language and communication, embodied intelligence, and the dynamics of model-limited experience within these contexts.
April 30, 2021 | https://dac.siggraph.org/robotics-electronics-ai/
We did it! We’ve received a three-year grant from the National Science Foundation to develop and test an Augmented Reality (AR) environment for collaborative coding. After years working on NSF-funded projects, this is my first time serving as co-PI:
We’ll be working with high school students from underserved communities in San Diego to study the efficacy of visual, embodied coding, compared to traditional approaches, in promoting computational interest and ability. Can’t wait to start!
This is the second project with my collaborator Ying Wu.
I’m excited that our workshop, Measuring Computational Creativity: Collaboratively Designing Metrics to Evaluate Creative Machines, will be featured at ISEA2020 – Why Sentience? in Montreal in October. Eunsu Kang, Jean Oh, and I, together with ISEA participants, will develop metrics to assess computational creativity. We will address questions including:
How do we make a creative machine? Creativity is not a sudden burst out of blank space. It involves “a multitude of definitions, conceptualizations, domains, disciplines that bear on its study, empirical methods, and levels of analysis, as well as research orientations that are both basic and applied – and applied in varied contexts.” From Newell, Shaw, and Simon’s insights on computational creativity to Boden’s definitions of combinational, exploratory, and transformational creativity, defining which kind of creativity is appropriate for a machine’s specific task would be a sensible first step in building a creative algorithm or machine. Yet some questions remain. Can we computationally model ambiguity? Would a novelty search result in valuable discoveries? Where is the threshold between randomness and creativity? Last but not least, how do we evaluate the creativity of an algorithm? This workshop is a first attempt to establish evaluation metrics assessing computational creativity in our current international Arts and Machine Learning (ML) research renaissance.
You can read more about the preliminary programming here: http://isea2020.isea-international.org/preliminary-programming/
I’m thrilled to offer my new course through the Halicioglu Data Science Institute at UC San Diego. It’s been in the works for about a year now.
This course addresses the intersection of data science and contemporary arts and culture, exploring four main themes: authorship, representation, visualization, and data provenance. The course is not solely an introduction to data science techniques, nor merely an arts practice course, but explores significant new possibilities for both fields arising from their intersection. Students will examine problems from the complementary perspectives of artist-researchers and data scientists.
Read more here: dsc160.roberttwomey.com
How can data science and the arts and humanities learn from one another?
Two days of events, February 7–8, consider how the growing digitization of the cultural record and the explosion of new data generation, collection, and analysis practices create a new state of cultured data: culture as data, and data as a driver of culture. Our symposium examines this emerging condition, considering both how analytic techniques enable new understandings of culture, and how the proliferation of data in everyday life changes how culture is produced, distributed, and influenced. In these panels, we wrestle with new modes of scholarship and cultural production enabled by data-forward analysis methods, and consider perspectives from the arts and humanities for data science practice. What can these disciplines teach one another about their possibilities and limits toward realizing a more just, informed, and culturally rich future?
With 200 RSVPs for both days, and a robust and diverse turnout, the event was a success!
Day 1 Talks @ Atkinson Hall, UC San Diego: https://youtu.be/3qBd5t0iV8c?t=1365
Stay tuned for complete archives of the talks and performances on the website: cultureddata.net
Curated by DELUGE (Stephanie E. Sherman, Ash E. Smith) and Robert Twomey http://gallery.calit2.net/portal/
January 14-March 13, 2020
gallery@calit2 will be closed on UCSD observed holidays
Tuesday, January 14th, 2020
5:00-6:00pm Discussion with Jordan Crandall, DELUGE (Stephanie E. Sherman, Ash E. Smith) Robert Twomey, and guests; Calit2 Auditorium. Stream Online: https://www.youtube.com/watch?v=kR3DZh9RjWs
Cultured Data Symposium
Friday, February 7, 2020
1:00pm-5:00pm, Keynote from Shannon Mattern
Calit2 Auditorium 5:00pm-7:00pm Reception and gallery open
Thursday, March 12, 2020
5:00pm Eco-streaming with Calum Bowden; gallery@calit2
Stream the exhibition online http://streaming.energy