MAT EoYS 2024 | soft AI+M

The Media Arts and Technology
End of Year Show 2024






SketchPath

Devon Frost [@SBCAST | @ELINGS]

This collection of ceramics demonstrates the combination of manual and digital qualities possible with SketchPath, a system for clay 3D printing design through digitally drawn tool paths (https://devnfrost.com/projects/skcam.html).
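
For readers curious about the underlying idea, here is a minimal, hypothetical sketch of turning a digitally drawn 2D path into stacked G-code layers for a clay printer. This is not SketchPath's actual code; the function name and parameters are illustrative assumptions.

```python
# Hypothetical sketch: stack a drawn 2D path into G-code layers for a clay
# 3D printer. Not SketchPath's implementation; parameters are assumptions.

def path_to_gcode(points, layers=40, layer_height=1.5, feed=1200):
    """points: list of (x, y) tuples traced from a digital drawing."""
    lines = ["G21 ; millimeters", "G90 ; absolute positioning"]
    for layer in range(layers):
        z = (layer + 1) * layer_height
        lines.append(f"G1 Z{z:.2f} F{feed}")
        for x, y in points:
            lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")
    return "\n".join(lines)

if __name__ == "__main__":
    square = [(0, 0), (50, 0), (50, 50), (0, 50), (0, 0)]
    print(path_to_gcode(square, layers=3))
```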

@devnfrost






M259 Data Visualization 3D Interactive Projects

Jing Peng | Paul Kim | Nefeli Manoudaki | Shaw Xiao [@ELINGS]

These projects, realized in winter 2024, explore techniques of data retrieval and the language of algorithmic visualization. They reflect fundamentals of data visualization and design, with an emphasis on data query, data analysis and processing, and both 2D frequency and interactive 3D spatial visualization.

https://vislab.mat.ucsb.edu/2024/p2/JingPeng/Project2_JingPeng.html
https://vislab.mat.ucsb.edu/2024/p2/PaulKim/Project2_PaulKim.html
https://vislab.mat.ucsb.edu/2024/p2/NefeliManoudaki/Project2_NefeliManoudaki.html
https://vislab.mat.ucsb.edu/2024/p2/ShawXiao/Project2_ShawXiao.html
https://vislab.mat.ucsb.edu/2024/p3/JingPeng/Project3_JingPeng.html
https://vislab.mat.ucsb.edu/2024/p3/PaulKim/Project3_PaulKim.html
https://vislab.mat.ucsb.edu/2024/p3/NefeliManoudaki/Project3_NefeliManoudaki.html
https://vislab.mat.ucsb.edu/2024/p3/ShawXiao/Project3_ShawXiao.html






AI and the Art of Failure

Pratyush ‘Rumi’ Bhattacharyya [@SBCAST | @ELINGS]

My project, ‘AI and the Art of Failure,’ is an introspective foray into the integration of AI within the arts, as a component of the (media) artist’s toolkit. These works emerged during our coursework for MAT255 in Fall 2023 under the guidance of Professor George Legrady. The objective was to assess the extent to which we could engage with AI generative software like Midjourney and Stable Diffusion. While the course aimed to examine our ability to generate and manipulate such software through textual prompts, the AI systems often struggled to interpret our instructions accurately. Consequently, this suspension of human control over the collaborative creation of artworks with machines yielded unexpected aesthetic results, notwithstanding some students’ preference for greater autonomy over their works. I contend that my ‘failed’ artworks delineate the limits of human agency when interfacing with machines as artistic mediums. They unveil unforeseen interactions with machines, indicating alternative approaches to viewing machines and art as avenues for radical expression that defy conventional norms of art production. This unconventional approach echoes Jack Halberstam’s thesis in ‘The Queer Art of Failure’ (2011), where failure is posited as a potent tool for challenging capitalist and heteronormative structures, offering insights into alternative frameworks to challenge individualism and conformity through analyses of both ‘high and low art’ within popular culture.

https://vimeo.com/channels/802109/videos
@yusher_name






Layering Colors in Space

Emilie Yu | Fanny Chevalier | Karan Singh | Adrien Bousseau [@ELINGS]

Our interactive virtual reality system will invite visitors to contribute to a 3D painting by layering colored brush strokes, each person adding on to the collective creation by taking up the virtual paint brush. Thus, layer after layer, stroke after stroke, the virtual 3D space will be altered and made into something new.

https://em-yu.github.io
http://fannychevalier.net
https://www.dgp.toronto.edu/~karan
https://www-sop.inria.fr/members/Adrien.Bousseau






badperson.gcode(s)

Lucy Bell | Devon Frost [@SBCAST | @ELINGS]

Bad Person is the result of a collaboration between MAT PhD Devon Frost and Art MFA Lucy Bell. Using Frost’s SketchPath software, Bell drew a [bad] rendering of a person to be clay 3D printed. Bad Person is structurally unsound, making the standing figures extra [good].

@notlucybelll
@devnfrost






Shadows

Deniz Caglarcan [@SBCAST]

Shadows is a transdisciplinary artwork by Deniz Çağlarcan, featuring selected oil paintings from Güneş Çağlarcan’s Shadow Collection. The piece contemplates the intricacies of human social connections, recognizing that while individuals have unique traits, they naturally seek relationships. Maslow’s hierarchy of needs plays a key role in shaping the artwork, correlating social status with mental states. The artwork uses human figures to create texture, with the brain interpreting acrylic forms as body parts, shadows as directions, and figure mass as detail. Deniz enlarges these figures, enhancing details perceivable by the audience and establishing a narrative. The composition’s flow applies a Schaefferian approach, linking gestures by syntactical content and unique morphology. Themes drawn from the paintings provide a formal, semantic structure. The audience’s perception ultimately coalesces these elements into a unified artwork.

http://www.denizcaglarcan.com @denizcaglarcan






Millipath

Sam Bourgault [@SBCAST | @ELINGS]

Millipath is an action-oriented programming web application enabling the parametric design of machine toolpaths for surface texture production on CNC milling machines. Millipath was created through a research-through-design process informed by contemporary materialist theory related to the notion of action, the place where tools and material meet. With this software, we investigated how operationalizing this theory in digital fabrication systems can support expressive modes of production and design decisions in response to material behaviors. We found that working at different levels of the machine toolpath enabled the use of machine properties to leverage material qualities and understand their limitations. The artifacts presented (a stool, an inlay tray, and a series of textured cups) demonstrate the opportunities of designing with machine actions and circumventing the typical digital fabrication workflow, which focuses on geometry. The surface textures produced on these artifacts engage with superposition, insertion, and transfer through the fabrication of aesthetic and functional features.
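
As a rough illustration of what a parametric texture toolpath can look like (an assumed sketch, not Millipath's code), the following generates parallel milling passes whose cutting depth varies sinusoidally along each pass.

```python
# Illustrative sketch: parallel CNC passes with sinusoidally modulated depth,
# one simple way to parameterize a surface texture. Parameter names are assumed.
import math

def textured_passes(width, length, stepover=2.0, base_depth=-1.0,
                    amplitude=0.5, wavelength=10.0, points_per_mm=1.0):
    passes = []
    y = 0.0
    while y <= width:
        pass_points = []
        steps = int(length * points_per_mm)
        for i in range(steps + 1):
            x = i / points_per_mm
            z = base_depth + amplitude * math.sin(2 * math.pi * x / wavelength)
            pass_points.append((x, y, z))
        passes.append(pass_points)
        y += stepover
    return passes

# Each inner list is one milling pass; a post-processor would emit
# machine-specific G-code from these points.
paths = textured_passes(width=40, length=100)
```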

https://sambourgault.com @farwest1138






Mys

Stejara Dinulescu | Nefeli Manoudaki | Iason Paterakis [@SBCAST | @ELINGS]

“Mys” is a wearable sensing system that enhances human perception through interaction with an intelligent environment. It captures human kinesthetic movements via three accelerometers and translates these movements into extended reality architectural transformations. Kinesthesia is analyzed into a unique inkblot signature via artificial intelligence diffusion models, which is then projected back onto the individual. “Mys” offers a visionary glimpse into the future of augmented interactions between biological and machine intelligence, fostering a deeper, more immersive connection between humans and the spaces they inhabit. This work is part of a body of submissions to the 2024 Synthesizer Design Hackathon.
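
A hedged sketch of the sensing-to-signature idea: reduce the three accelerometer streams to a compact movement signature that could then condition an image generator. The function and its mapping are assumptions for illustration, not Mys's pipeline.

```python
# Assumed sketch: summarize three accelerometer streams as a movement-energy
# vector that a downstream diffusion model could use as a seed or guidance.
import math

def movement_signature(acc_a, acc_b, acc_c):
    """Each argument: list of (x, y, z) samples from one accelerometer."""
    def energy(stream):
        return sum(math.sqrt(x*x + y*y + z*z) for x, y, z in stream) / max(len(stream), 1)
    return [energy(acc_a), energy(acc_b), energy(acc_c)]

# How this 3-vector maps to the "inkblot" generation is the artistic part of
# the work and is not reproduced here.
sig = movement_signature([(0.1, 0.2, 9.8)], [(0.0, 0.0, 9.8)], [(1.2, 0.3, 9.5)])
```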

http://stejarasart.com @stejaraiulia
@nefeliman
@deejay_tekton






HIGH FIVE (lo-fi)

Sam Bourgault | the worm [@ELINGS]

HIGH FIVE (lo-fi) is an interactive robotic demonstration in which human participants teach a robot arm to do a handshake by pushing it around and creating dynamic movements. The robot records the movements and repeats them on demand. As more and more handshakes are recorded, the library of hand movements grows, allowing people to learn and experience others’ handshakes. This work interrogates the nature of agency and communication across living and non-living entities and tries to bridge human-to-human physical connections across time through embodied robotic engagement.
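
A minimal record-and-replay sketch of the interaction logic, assuming a generic robot-arm object with placeholder get_joint_positions() and move_to() methods (not a real SDK, and not the piece's actual control code).

```python
# Assumed sketch: record timestamped joint poses during a taught handshake,
# store them in a growing library, and replay any entry on demand.
import time

class HandshakeLibrary:
    def __init__(self):
        self.handshakes = []          # each entry: list of (timestamp, joint_angles)

    def record(self, robot, duration=5.0, rate=50):
        samples, start = [], time.time()
        while time.time() - start < duration:
            samples.append((time.time() - start, robot.get_joint_positions()))
            time.sleep(1.0 / rate)
        self.handshakes.append(samples)

    def replay(self, robot, index):
        start = time.time()
        for t, joints in self.handshakes[index]:
            while time.time() - start < t:
                time.sleep(0.001)
            robot.move_to(joints)
```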

https://sambourgault.com @farwest1138






SculptAR: Direct Manipulations of Machine Toolpaths in Augmented Reality for 3D Clay Printing

Joyce Passananti | Ana María Cárdenas Gasca | Jennifer Jacobs | Tobias Höllerer [@ELINGS]

SculptAR is a HoloLens application that demonstrates the direct manipulation of machine toolpaths in augmented reality for clay 3D printing. SculptAR relies on hand interactions to edit path control points. It also provides a set of options that let the user control how changes to one control point are broadcast to others to determine surface shapes and textures. By situating AR interactions in a physical context, SculptAR aims to build on existing physical workflows and enable practitioners to apply their understanding of material properties and their visual understanding of physical 3D objects.
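
One plausible way to broadcast a single control-point edit to its neighbors is a smooth falloff along the path; the sketch below illustrates that idea and is an assumption, not SculptAR's algorithm.

```python
# Assumed sketch: apply an edit to one toolpath control point and spread it
# to nearby points with a cosine falloff.
import math

def broadcast_edit(points, edited_index, delta, radius=4):
    """points: list of (x, y, z); delta: (dx, dy, dz) applied at edited_index."""
    new_points = []
    for i, (x, y, z) in enumerate(points):
        d = abs(i - edited_index)
        w = 0.0 if d > radius else 0.5 * (1 + math.cos(math.pi * d / radius))
        new_points.append((x + w * delta[0], y + w * delta[1], z + w * delta[2]))
    return new_points
```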

https://amcard.myportfolio.com
http://joycepassananti.website
https://ecl.mat.ucsb.edu
https://ilab.cs.ucsb.edu






Fencing Hallucination 2024

Weihao Qiu [@ELINGS]

Fencing Hallucination 2024 is a re-imagination of the ontology of photographs in the age of generative AI. Image-generative AI deconstructs the traditional role of photographs as records of reality, bringing about issues of artistic value and authorship. The multi-screen interactive installation, driven by an AI image-generative system, invites the audience into a dialogue about human-machine co-creation and cameraless photography. It provides real-time human-AI interaction in the form of a virtual fencing game that uses AI to generate chronophotographs from audience movements.

http://www.q-wh.com @qwh_7






BioModular

Sabina Hyoju Ahn [@SBCAST]

BioModular is an instrument created based on the BCO (Bioelectricity-Controlled Oscillator) circuit that uses bioelectricity from living beings as control voltage. This instrument has two functions: it operates as a modular synthesizer with an eight-step sequencer and a bionoise-controlled mode. Its design draws inspiration from electronic music devices, including modular synthesizers and step sequencers, as well as devices emerging from the DIY culture. This instrument has been designed to fit the Eurorack system, enhancing its compatibility and functionality.
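
As a conceptual sketch only, the loop below shows an eight-step sequencer whose stepping responds to a control voltage; in BioModular that voltage comes from bioelectricity, and read_control_voltage() here is a stand-in rather than the instrument's firmware.

```python
# Assumed sketch: an 8-step sequencer whose step selection scales with a
# control voltage read each clock tick.
import random

def sequencer(steps, read_control_voltage, clock_ticks=32):
    """steps: list of 8 pitch values; yields (step_index, pitch) per tick."""
    assert len(steps) == 8
    for tick in range(clock_ticks):
        cv = read_control_voltage()            # e.g. 0.0-5.0 V from the BCO circuit
        rate_scale = 1 + int(cv)               # higher voltage -> faster stepping
        index = (tick * rate_scale) % 8
        yield index, steps[index]

for i, pitch in sequencer([60, 62, 63, 65, 67, 68, 70, 72],
                          lambda: random.uniform(0, 5)):
    pass  # send `pitch` to an oscillator here
```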

https://sabinaahn.com @sabina_ahn






Spatial Orchestra

You-Jin Kim | Myungin Lee | Marko Peljhan | JoAnn Kuchera-Morin | Tobias Höllerer [@ELINGS]

Spatial Orchestra demonstrates how easy it is to play a musical instrument using basic input like natural locomotion, which is accessible to most. Unlike many musical instruments, our work allows individuals of all skill levels to effortlessly create music by walking into virtual bubbles. In our augmented reality experience, the user wears a standalone AR headset and steps into color-coded, ever-shifting sound bubbles within the assigned area. Each bubble corresponds to a cello note and emits sound from the center of the bubble, letting the user hear and express themselves in spatial audio and effectively transforming participants into musicians. This interactive element enables users to explore the intersection of spatial awareness and musical rhythm, extending to bodily expression through playful movements and dance-like gestures within the bubble-filled environment. This unique experience illuminates the intricate relationship between spatial awareness and the art of musical performance.
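
A toy sketch of the bubble-triggering logic, with made-up bubble data and a placeholder play_note callback; the installation itself runs on an AR headset, so this is only an illustration of the interaction.

```python
# Assumed sketch: when the listener is inside a bubble, trigger that bubble's
# cello note, louder toward the bubble's center.
import math

bubbles = [
    {"center": (0.0, 0.0, 1.5), "radius": 1.0, "note": "C2"},
    {"center": (2.0, 0.0, 1.5), "radius": 1.0, "note": "G2"},
]

def update(listener_pos, play_note):
    for b in bubbles:
        d = math.dist(listener_pos, b["center"])
        if d <= b["radius"]:
            gain = 1.0 - d / b["radius"]       # louder near the bubble's center
            play_note(b["note"], gain)

update((0.3, 0.0, 1.5), lambda note, gain: print(note, round(gain, 2)))
```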

https://www.yujnkm.com @yujnkm
https://www.myunginlee.com @lee_myungin
https://www.arts.ucsb.edu/peljhan @systemics.mx
https://allosphere.ucsb.edu
https://sites.cs.ucsb.edu/~holl






PR1M0RDIUM

Ryan Millett [@SBCAST]

Infinite variations of the ineffable seep into the material. Reality flickers, consumed by the computational gyre. The old gradients are vanishing, and the new models struggle to converge: now is the time of algogorgons.

https://rmillett.myportfolio.com






TouchPulseBitBox

Sabina Hyoju Ahn | Ryan Millett [@SBCAST]

TouchPulseBitBox is a dual pulsar synthesizer built on the Daisy Seed platform using Gen. It features an interactive design with four knobs and seven light sensors, enabling dynamic parameter modulation through physical interaction. As users adjust the knobs, they simultaneously disrupt the light sensors, integrating tactile feedback directly into sound manipulation. Developed for the Synthux Academy 2024 Synth Design Hackathon.
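
The instrument itself runs as Gen code on the Daisy Seed; purely as a conceptual sketch, the Python below renders a basic pulsar train in which the duty cycle stands in for a light-sensor-modulated parameter.

```python
# Assumed sketch of pulsar synthesis: a short sine "pulsaret" at the start of
# each pulsar period, followed by silence for the rest of the period.
import math

def pulsar_train(freq=80.0, duty=0.25, pulsaret_freq=600.0,
                 duration=1.0, sr=48000):
    samples, period = [], sr / freq
    for n in range(int(duration * sr)):
        phase = (n % period) / period
        if phase < duty:                       # pulsaret portion of the period
            t = (n % period) / sr              # seconds into this pulsar period
            samples.append(math.sin(2 * math.pi * pulsaret_freq * t))
        else:                                  # silent remainder
            samples.append(0.0)
    return samples

audio = pulsar_train(duty=0.3)                 # duty could track a light sensor
```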

http://sabinaahn.com
https://rmillett.myportfolio.com






Genuine Horsefeathers

Jazer Giles [@SBCAST | @ELINGS]

Genuine Horsefeathers uses machine learning to analyze incoming microphone signals and play back the closest match from a curated audio corpus. This corpus, sourced from the Internet Archive, comprises audio from news broadcasts, talk shows, videos, and various other media. Engage by speaking into the microphone, and experience human communication as it is modified and distorted through media.
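
One plausible way to implement the closest-match retrieval (an assumed sketch, not the artist's code) is to summarize each corpus clip and the live microphone buffer as mean MFCC vectors and do a nearest-neighbor lookup, as below.

```python
# Assumed sketch: MFCC-based nearest-neighbor retrieval over a small corpus.
# File names are placeholders for the curated Internet Archive audio.
import numpy as np
import librosa
from sklearn.neighbors import NearestNeighbors

def clip_feature(path):
    y, sr = librosa.load(path, sr=22050, mono=True)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

corpus_paths = ["clip_001.wav", "clip_002.wav"]          # placeholder filenames
index = NearestNeighbors(n_neighbors=1).fit(
    np.stack([clip_feature(p) for p in corpus_paths]))

def closest_clip(mic_buffer, sr=22050):
    feat = librosa.feature.mfcc(y=mic_buffer, sr=sr, n_mfcc=13).mean(axis=1)
    _, idx = index.kneighbors(feat.reshape(1, -1))
    return corpus_paths[idx[0][0]]                        # path to play back
```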

http://jazergiles.com @jazergiles






Projections

Jazer Giles [@ELINGS]

A body of pen plotter prints that use isometric projection to explore visual perception of space.
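
For reference, a standard isometric mapping from 3D points onto the 2D plotting plane looks like the sketch below; the prints' actual pipeline may differ.

```python
# Standard isometric projection of 3D points onto 2D pen-plotter coordinates.
import math

def isometric(x, y, z, angle=math.radians(30)):
    u = (x - y) * math.cos(angle)
    v = (x + y) * math.sin(angle) - z
    return u, v

corners = [(i, j, k) for i in (0, 1) for j in (0, 1) for k in (0, 1)]
projected = [isometric(*c) for c in corners]   # 2D points ready for plotting
```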

http://jazergiles.com @jazergiles






know how you feel

Jazer Giles [@SBCAST | @ELINGS]

know how you feel is an audio/video performance installation in which a colorful feedback-driven pixel shader is sonified. ”

http://jazergiles.com @jazergiles






TAISAH - A Song of Your Dream

Olifa Ching-Ying Hsieh | Timothy Wood | Weihao Qiu [@ELINGS]

TAISAH - A Song of Your Dream is a participatory immersive audio-visual healing space created in collaboration by Olifa Ching-Ying Hsieh, Timothy Wood, and Weihao Qiu, which combines neural and physical feedback data to generate spatial audio, AI graphics, and interactive design in the immersive space. The Experimental Forest of National Taiwan University supported the artists in entering the mountainous areas of Taiwan to experience specific landscapes and indigenous cultures. The Bunun people live in harmony with nature and believe in “Taisah”, the practice of looking for omens of fate in their dreams. Dreams are a bridge that connects our inner subconscious to the reality of our environment. This experience guides us toward one meaning of “Healing”: a movement towards balancing our inner and outer worlds, through releasing, receiving, listening to the voices felt deeply within, giving space to external voices and what cannot be known, and recalibrating our body-mind to natural frequencies to find ourselves from deep inside. We will invite the audience to participate as the dreamer in this shared immersive environment through guided meditation, interactive EEG brainwave devices, and motion capture sensors. Brainwave and movement data streams will modulate our audio-visual environments in real time, moving between illusory AI-generated forests and real imagery of the mountain forests of Taiwan. You can see, listen, and explore personal melodies of sound, vision, and movement inside this dream-like forest.

http://olifahsieh.net
http://embodiedworlds.com @fishuyo
http://www.q-wh.com @qwh_7






Attention Manifold

Weihao Qiu | Shaw (Yiran) Xiao | Grace Feng [@ELINGS]

Beauty is in the eye of the beholder. Our perceptions define how we view the world and indicate who we are at our core. In today’s era of big data, every fragment of our interaction with the virtual world is converted into data: analyzed, categorized, and learned by machines, transforming our unique identities into mere vectors in a vast digital latent space. This data is then fed into a machine learning algorithm, which crafts a virtual sculpture and sound that poetically portray the viewer’s individuality and identity. This project creates a dialogue between the audience and the machine through the machine’s digital abstraction of our identity.

http://www.q-wh.com @qwh_7
https://gracefeng05.github.io/






SILICONE DREAM

Jenni Hutson | Marcel Rodriguez-Riccelli [@SBCAST | @ELINGS]

SILICONE DREAM is a novel soft shape display device designed and built by Jenni Hutson. SILICONE DREAM can receive arbitrary data values over OSC and translate them into changing shapes using a grid of actuator-controlled shafts underneath a silicone top layer. We have added programmable LED lighting underneath the silicone layer to extend the possibilities of data representation. The soft shape display system is controlled by an embedded system with custom software and contributes a generalizable system for future shape display design for both artistic and scientific use cases. During the show, SILICONE DREAM is controlled by generated topographic data and will also be driven by musical data during a live performance by Marcel Rodriguez-Riccelli.
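
A minimal sketch of the OSC-to-actuator data path, assuming the python-osc library and a hypothetical set_shaft_height() driver; the installation's own embedded software differs.

```python
# Assumed sketch: receive /display messages over OSC and map normalized
# values onto shaft heights in an 8x8 actuator grid.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

MAX_TRAVEL_MM = 40.0

def set_shaft_height(row, col, mm):
    print(f"shaft ({row},{col}) -> {mm:.1f} mm")   # placeholder actuator driver

def handle_value(address, row, col, value):
    # Expect messages like: /display 3 5 0.72  (row, col, normalized value)
    value = max(0.0, min(1.0, float(value)))
    set_shaft_height(int(row), int(col), value * MAX_TRAVEL_MM)

dispatcher = Dispatcher()
dispatcher.map("/display", handle_value)
server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```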

http://jennihutson.com
@marcelrodriguezriccelli






Stretching Genders

Angelos Floros | Video Performance: Mariana T Kotsanou [@ELINGS]

Step into a realm where the boundaries of gender blur and the harmony of opposing energies intertwine. Our interactive installation, inspired by the concept of Izutsu, invites you to delve into the fluidity of gender and the interconnectedness of masculine and feminine forces within the same body. Through the utilization of two sensors, representing the masculine and feminine, visitors are encouraged to engage with the installation and explore the dynamic interplay between these two fundamental aspects of human existence. As you approach each sensor, you will experience a sensory journey that reflects the essence of masculine and feminine energies. The masculine sensor may evoke imagery or sounds associated with strength, action, and vitality, while the feminine sensor may evoke sensations of nurture, intuition, and receptivity. By interacting with both sensors simultaneously, participants will witness the convergence of these energies, creating a harmonious balance within themselves. Through this experience, we aim to provoke contemplation on the fluidity of gender roles and the beauty of embracing the complexity and diversity of human identity. Join us on this immersive exploration of gender harmony, where the lines between male and female dissolve, and the essence of unity prevails.






Aquatic well-being

Angelos Floros [@SBCAST]

In the heart of bustling urban life, there exists a sanctuary dedicated to inner peace and tranquility. This unique space invites visitors to embark on a journey of self-discovery, focusing inward to explore the depths of one’s own psyche. At its core lies an aquarium symbolizing the internal landscape of the human mind. As visitors gaze into the crystal-clear waters, they contemplate their thoughts and emotions. The gentle sway of aquatic life mirrors the ebb and flow of inner thoughts, offering insight into the complexities of the human experience. The representation of the self within the aquarium serves as a powerful symbol: a reminder of the interconnectedness between the external environment and our internal state of being. Through this immersive experience, visitors explore the profound connection between psychological well-being and our relationship with the environment. The tranquil setting fosters introspection, nurturing a sense of inner calm amidst the chaos of modern life.






Always + AlloSphere Showcase

Curtis Roads | JoAnn Kuchera-Morin | Kon-Hyong Kim | Timothy Wood | Dennis Adderton | Gustavo Rincon | Myungin Lee | Graham Wakefield | Haru Ji | Lance Putnam [@ELINGS]

This AlloSphere showcase features Roads’s composition ‘Always’, which will be spatialized live by Curtis alongside an immersive visualization. The show will proceed with research projects from the AlloSphere Research Group, past and present, including: ‘Probably/Possibly’ (JoAnn Kuchera-Morin), ‘Fractal’ (Kon-Hyong Kim), ‘Three Sphere’ (Kon-Hyong Kim, Dennis Adderton), ‘Fabric’ (Timothy Wood), ‘Endless Current’ (Graham Wakefield, Haru Ji), ‘Adrift’ (Lance Putnam), and ‘Volume Viewer’ (AlloSphere Research Group).






AlloPortal Showcase

JoAnn Kuchera-Morin | Kon-Hyong Kim | Timothy Wood | Gustavo Rincon | Dennis Adderton [@ELINGS]

How can one find patterns in complex information and work with this information creatively and intuitively, leading to new and unique innovation? Building our computational language and representing very complex information through our senses, via visual and audio representations, will facilitate the uncovering of new patterns in this information and allow scientists and engineers to work with their data perceptually and intuitively, the way that artists do. In this installation, explore the complex worlds of quantum mechanics, fractals, and spring-mass equations that move like fabric.

http://allosphere.ucsb.edu






Michael Candy Robotics Workshop

Jenni Hutson | Weihao Qiu | Sam Bourgault | Devon Frost | Emma Brown | Marcel Rodriguez-Riccelli | Jazer Sibley-Schwartz | Sabina Hyoju Ahn [@ELINGS]

These projects were created in the spring during a workshop led by Systemics Lab resident researcher Michael Candy. Students designed and filmed mechanical critters using upcycled materials and found objects.

https://michaelcandy.com/
jhutson@ucsb.edu
wqiu@ucsb.edu
samuellebourgault@ucsb.edu
dfrost@ucsb.edu
emma_brown@ucsb.edu
riccelli@ucsb.edu
jazer@ucsb.edu
sabina_ahn@ucsb.edu






Reality Agents

Emma Brown [@ELINGS]

“Reality Agents” is a reality TV script generator. As opposed to simply generating a script using a Large Language Model (LLM), this work extends Generative Agent technology, developed this past summer at Stanford University, to power two completely autonomous agents with their own personalities, emotional states, intentions, and objectives. Drawing on cognitive science research, particularly hot cognition, which explores the interaction between emotions and decision-making, the software aims to create a generative agent-based system with believable outbursts.
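
A conceptual sketch of the two-agent loop follows. The call_llm() function is a placeholder for whatever language-model backend the piece uses, and the prompts, state fields, and emotion-update rule are illustrative assumptions rather than the work's actual design.

```python
# Assumed sketch: two autonomous agents with personalities and emotional
# state take turns generating dialogue via a placeholder LLM call.

def call_llm(prompt):
    raise NotImplementedError("plug in an LLM client here")

class Agent:
    def __init__(self, name, personality, objective):
        self.name, self.personality, self.objective = name, personality, objective
        self.emotion = "neutral"

    def speak(self, transcript):
        prompt = (f"You are {self.name}, a reality TV contestant. "
                  f"Personality: {self.personality}. Objective: {self.objective}. "
                  f"Current emotion: {self.emotion}.\n"
                  f"Scene so far:\n{transcript}\n{self.name}:")
        line = call_llm(prompt)
        # Hot-cognition-style update: let the utterance shift the agent's emotion.
        self.emotion = call_llm(
            f"In one word, how does {self.name} feel after saying: {line}")
        return line

def generate_scene(agent_a, agent_b, turns=6):
    transcript = ""
    for i in range(turns):
        speaker = agent_a if i % 2 == 0 else agent_b
        transcript += f"{speaker.name}: {speaker.speak(transcript)}\n"
    return transcript
```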

http://nworb.io @nworb____






Organoid_Protonoesis 1

Iason Paterakis | Nefeli Manoudaki | Diarmid Flatley | Ryan Millett | Jazer Sibley-Schwartz | Marcos Novak | Ken Kosik | Tjitse van der Molen | Eve Bodnia [@SBCAST | @ELINGS]

Organoid_Protonoesis 1 is a neurobiological-data-driven interactive transmodal installation that gives form and expression to the intricate relationship between brain-cell organoid firings and AI image generation. It is a work-in-progress that extends previous efforts such as the transLAB’s Synaptic Time Tunnel (SIGGRAPH 2023) in fusing transmodal media arts, neurobiology and cognitive science, and artificial intelligence and machine learning.

This project introduces a real-time interface between spontaneous brain organoid activity and external stimuli, enabling unique and novel interactions between human intellect and emergent and self-organizing neural connectivity and activation. Neuronal firing data, converted to hyperedges within hypergraphs, are transformed into computational architectural morphologies, generative spatialized sound, and AI virtual entities.
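
As a hedged illustration of one way spike data could become hyperedges (not the project's actual pipeline): electrodes that fire within the same short time window are grouped into a single hyperedge.

```python
# Assumed sketch: bin spike events by time window; co-firing electrodes in a
# window form one hyperedge of the hypergraph.
from collections import defaultdict

def spikes_to_hyperedges(spikes, window_ms=10.0):
    """spikes: list of (timestamp_ms, electrode_id). Returns list of frozensets."""
    bins = defaultdict(set)
    for t, electrode in spikes:
        bins[int(t // window_ms)].add(electrode)
    return [frozenset(members) for members in bins.values() if len(members) > 1]

spikes = [(1.2, "e3"), (4.8, "e7"), (5.1, "e9"), (22.0, "e3"), (27.5, "e1")]
hyperedges = spikes_to_hyperedges(spikes)   # e.g. [{'e3','e7','e9'}, {'e3','e1'}]
```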

Organoid_Protonoesis 1 showcases the potential of integrating real-time generative AI with biological systems, inviting participants to explore and reflect on the evolving boundaries of cognition, signal processing, and contemporary media.

Organoid_Protonoesis 1 is a collaboration between the MAT transLAB and the Neurobiology Lab, with additional support from the MAT AlloPlex Studio/Lab at SBCAST.

https://sbcast.org
https://www.iasonpaterakis.com @deejay_tekton
https://nefeliman.com @nefeliman
https://www.diarmidflatley.com @nulltools
https://rmillett.myportfolio.com
https://jazergiles.com @jazergiles






Osmosis //Episode 04 // Emerging A(I)rchitecture, Fluctuating Space

Iason Paterakis | Nefeli Manoudaki [@SBCAST]

Osmosis is an immersive projection mapping installation that embodies the essence of liminal space by seamlessly blending real-time AI-generated visuals with the urban environment. Like osmosis in biology, this artwork envisions a transformative journey where virtual elements permeate the building facade, blurring the boundaries between the tangible and the virtual. Through its captivating exploration of AI’s creative potential, Osmosis invites contemplation on the evolving relationship between technology and art in our interconnected world.

https://www.iasonpaterakis.com @deejay_tekton
https://nefeliman.com @nefeliman






Vitrified Sounds

Jazer Sibley-Schwartz | Devon Frost | Marcel Rodriguez-Riccelli | Sam Bourgault [@SBCAST | @ELINGS]

Pottery practices stretch back to early human civilizations. While the technology we use to fabricate with clay has evolved, the process through which it vitrifies remains the same. The crystalline structure of ceramic provides both strength and beauty. As an ode to this ubiquitous material that transcends human experience, we augment two 3D-printed clay vessels with electronic components. We process the physical sound of these vessels being struck by motors through systems of oscillators, which, like groups of fireflies, exhibit coupling behavior. This is the Vitrified Sounds Instrument. The instrument processes the ringing of clay 3D-printed vessels struck by motors. The motors run using pulse width modulation from a Daisy board and are amplified by a transistor circuit. As they strike the vessels, piezo mics installed on the ceramic surface record the ringing into delay lines that are resampled at varying times and fed into an amplifier assembly. The signal is sent to the Daisy board for processing. Free-running accumulators sequence eight overtones, four per vessel, which are synthesized. Knobs allow interactive control of the delay times and the speed of the sequencers through the Daisy. All processing is done in the gen environment inside of Max. After processing, the final audio outputs from both channels are fed to a final amplifier, which drives the speakers.
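
For readers curious about the signal flow, here is a rough Python rendering of the free-running accumulator idea; the actual instrument is written in gen inside Max on a Daisy, so this is only a conceptual sketch with assumed overtone values.

```python
# Assumed sketch: a free-running phasor (accumulator) wraps at a controllable
# rate; its value selects which of a vessel's overtones is synthesized.
import math

def accumulator_sequence(overtones, rate_hz=0.5, duration=4.0, sr=48000):
    phase, out = 0.0, []
    for n in range(int(duration * sr)):
        phase = (phase + rate_hz / sr) % 1.0           # free-running accumulator
        index = int(phase * len(overtones))            # which overtone is active
        f = overtones[index]
        out.append(0.2 * math.sin(2 * math.pi * f * n / sr))
    return out

vessel_a = accumulator_sequence([220.0, 275.0, 330.0, 440.0])   # four overtones
```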

http://jazergiles.com @jazergiles
https://devnfrost.com @devnfrost
http://marcelrodriguezricc.github.io/portfolio @marcelrodriguezriccelli
https://sambourgault.com @farwest1138






Nice Guy

Emma Brown [@SBCAST | @ELINGS]

“Theory of mind” is the cognitive faculty that enables individuals to attribute mental states like beliefs, intents, desires, emotions, and knowledge to themselves and others. Nice Guy is a public experiment attempting to model theory of mind in an agent powered by a large language model. The machine agent identifies as a “nice guy” and attempts to empathize with humans in an unbroken stream of consciousness.

http://nworb.io @nworb____






Emotoscope

Emma Brown | Marcel Rodriguez-Riccelli [@ELINGS]

Two microcontrollers stream output from a fine-tuned Stable Diffusion model to oscilloscopes.

http://nworb.io @nworb____
@marcelrodriguezriccelli






CREATE ensemble

Ryan Millett | Yvonne Yuan | Deniz Caglarcan | Marcel Rodriguez-Riccelli | Karl Yerkes [@SBCAST]

CREATE Ensemble is a dynamic group focused on composition, improvisation, critique, research, and live performance. We embrace diverse artistic expressions including music for acoustic and electronic instruments, audiovisual art, live coding, networked performances, interactive dance, performance art, and trans-categorical live performances. For EoYS 2024, CREATE presents a structured improvisation where the ensemble reads from a live stream of abstract symbols representing musical gestures. Each performer uniquely interprets these symbols according to the nature of their instruments and based on their individual musical notions, resulting in a collaborative and dynamic performance.