PROJECT: Ambulant Sounds / Sounds in Space [2015-19]

Sounds in Space was a 3.5-year project exploring the nature of sound, location and orientation in space. Essentially: what happens when we allow sound to exist relative to you, and to have a tangible presence?

The Sound in Space technology allows you to place virtual sounds in a 3D space using a mobile phone. Once the sounds are placed, audience members can walk around the space wearing headphones to hear the sounds, all of which are binaural and respond to the person’s movement within the space.
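The core idea can be sketched in a few lines. This is an illustrative reduction, not the actual API of the open-sourced app: given the listener's position and heading (which the phone's tracking provides), each placed sound is rendered from a head-relative direction and distance, which a binaural renderer would then feed into an HRTF and distance-based gain.

```python
import math

def relative_sound(listener_pos, listener_yaw, sound_pos):
    """Return (azimuth_deg, distance) of a placed sound relative to the listener.

    listener_pos, sound_pos: (x, y) floor-plane coordinates in metres.
    listener_yaw: heading in radians, 0 = +y axis, increasing clockwise.
    """
    dx = sound_pos[0] - listener_pos[0]
    dy = sound_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    # Bearing of the sound in world space, then made head-relative.
    bearing = math.atan2(dx, dy)  # 0 = straight ahead of +y, clockwise positive
    azimuth = math.degrees(bearing - listener_yaw)
    # Normalise to (-180, 180]: negative = sound to the listener's left.
    azimuth = (azimuth + 180.0) % 360.0 - 180.0
    return azimuth, distance
```

As the listener walks and turns, re-running this per audio frame is what makes a sound appear anchored in the room rather than inside the headphones.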

We open-sourced all the code so that it can be used for your own experiments - it is (surprisingly) well documented at the GCL GitHub: https://github.com/googlecreativelab/sounds-in-space

On Sound.

- excerpted from Debbie’s Design Matters podcast.

Tea:  So we're in your studio, but we're wearing headphones, so the sound gets very distorted. It's not natural sound, and it feels unnatural to be wearing headphones in this small little room. And you can move between those sounds, and that can allow you to understand the sound, understand the context of the space you're in differently. It can affect how you experience the information.

Tea:  Just the idea of moving the sounds so that we could make it sound like we're in a vast concert hall. And then you begin to understand how if you can do that with objects in space, all of this stuff with AR and VR, we're always listening. We might be able to see it on a screen in the thing, but unless you can hear it in exactly the right place, and we can't hear it in exactly the right place because the sound isn't coming from that place.

Tea:  It's being generated artificially, and it can't be artificially generated specifically for you, because your ears are a funny shape. So, because sound is subjective to you, that thing of putting a sound in space precisely so that it works, becomes an incredibly challenging exercise. And it's one that I'm enjoying enormously. I think we'll get our ears scanned.

Tea:  In a few years' time, I think you'll get your ears scanned just like you get your eyes tested, and you'll have headphones which literally adapt to the inner structure of your ear, allowing, mainly for the purposes of augmented reality, sounds to begin to come from very precise points in space. And they're going to need to know the shape of your ear in order for the sound to come from that precise space.

Debbie:  So, customized listening.

Tea:  Yeah, customized listening. Because then information doesn't need to come into you from a screen. It can begin to exist as sound.

Debbie:  Is this something that you're working on at Google?

Tea:  We’re really interested in all the AR core stuff, and anchoring sounds in space, and how you orient. There are things that humans do really well. Filtering sensory perception being one, biases and filtering being another one, and understanding your location in space is another one.

You begin to realize how our understanding of reality, objective reality, is really reliant on those skills happening in our brain. So until we can begin to allow the little supercomputers in our pockets to understand where they are in space, and to understand how we experience sensory perception, which is not a given outside of humans, these kinds of experiences are still limited.

And when we do begin to play with those spaces, it's amazing, because you start to put information into the real world, in a way that you are experiencing it. I always talk about sliding doors, the opening and shutting doors, in that it's a great example of augmented reality. You walk towards the thing, the thing understands your intent, which is to walk through the door, and the mechanics of the door open the door. Or sometimes they don't.

There's no real reason where that can't ... That sort of intent, and you think about how much information we know about you, and the patterns that you exist in on a daily basis. And for me, I like playing in cultural spaces. So theater gets massively changed when you move into a space where you've got whole other metalayers of information around the performance, or books move into an interesting space.

PROJECT: Collector Cards [2014]

Collector Cards was another pre-NFT experiment into creating (in this instance) fungible tokens of value that fed back into creator ecosystems and that could be shared between phones easily.

It was entirely inspired and powered by Google’s purchase of Bump - a brilliant bit of near-field tech to which something horrible happened once we bought it.

Sadly the whole project was deader on arrival than the teams that actually worked on Bump, although we did persevere longer than we should have (again).

PROJECT: BITESIZE [2015]

Bitesize is another of those projects that died because however intuitive, inventive, daring or simply fun they were - they weren’t what the company wanted us to make. Also (always) because the platform we used (Sheets in this case), and the platform we were using it to replace (Sites) didn’t see what we were doing as genius. They just saw it as a bug, a backdoor, a security threat. A threat basically.

Which is, in my opinion, why apps should never leave beta and should always be ready to become something else.

There is no documentation on Bitesize. It was, over an extended period, strangled at birth. It was sad. My thoughts and prayers are still with Jonny Richards and Jude Osborn.

PROJECT: SIS - Agatha Gothe-Snape [2019]

'Wet Matter', 2020 was one of two major new commissions produced for 'Agatha Gothe-Snape: The Outcome Is Certain' at Monash University Museum of Art, the first survey exhibition of Sydney-based artist Agatha Gothe-Snape.

'Wet Matter', 2020 was an augmented sonic reality experience that bridges the embodied experience of performing with the act of being a witness. Based on the artist's understanding of the body as being both physically and metaphorically porous—continuous with the fluid world around us—the work comprises a wall painting, point clouds applied by members of Google’s Creative Lab, exhibition furniture and approximately 140 sounds ranging from field recordings to spoken monologues that can be experienced by wearing headphones and a harness fitted with a smartphone.

Featuring sound by Evelyn Ida Morris and contributions by Lizzie Thomson and Brian Fuata, the work is driven by the Sounds in Space app made by Google's Creative Lab and reveals collaboration as an important site for experimentation—where new ideas and languages are created, and where the previously unimaginable is made actual.

PROJECT: ARIA [2019]

ARIA is one of those projects that are monumental, career-works for everyone working on them or around them — but get lost when exposed to the stupefying dazedness of everyday digital flotsam.

For a real insight into why this project mattered I recommend Sean Kelly’s blog - the actual genius behind the scenes.

Opera Queensland Partners with Google Creative Lab to Bring the Opera Home with AR and AI

Australia’s Opera Queensland worked with Google Creative Lab on a demo project that brings the stage experience of the opera to the four walls of your home using augmented reality and artificial intelligence.

With the prototype AR app, opera fans no longer have to line up or even travel to the theater to enjoy an opera production. The project will allow opera enthusiasts to turn their homes into a theater stage and experience the opera via augmented reality.

Opera enthusiasts will now enjoy performances from the comfort of their homes

The experimental AR app, dubbed Project AR-ia, was created with a version of The Magic Flute performed by the Australian opera singers Emma Pearson, Brenton Spiteri and Wade Kernot, accompanied by Brisbane’s chamber orchestra, Camerata.

Opera Queensland hopes to use the technology to demystify opera and make it more accessible, bringing it to new audiences. The Project AR-ia experimental augmented reality app shows how the opera art form can adapt with the times and be enjoyed by audiences anywhere and at any time.

The app engages audiences in various ways and revolutionizes how we experience the music form. While attending the opera has entailed dressing up and travelling with loved ones to the theater, future opera enthusiasts will be able to simply watch the performances from the comfort of their homes with AR avatars performing for them.

Opera Queensland approached Google in early 2018 in a bid to explore the creation of diverse opera experiences through new technology. The opera wanted to give all enthusiasts a front-row seat in the opera from the comfort of their homes. To realize this, the performances of the opera singers have been completely digitized in 3D as a first step. The technology leverages Google’s AI-based volumetric recording system known as The Relightables which was unveiled recently.

The images of the opera actors were captured using 90 high-resolution cameras and numerous depth sensors from multiple perspectives. An algorithm subsequently created an animated 3D model of the person captured from the many individual images.

The digital person or avatar created is then projected into the user’s real environment through Google’s smartphone AR software. This is how the user is able to get a private screening of Mozart’s Magic Flute in their living rooms.

The technical challenge faced by the developer was that of the high data outlay. According to the developer, just 20 seconds of the spatial film material consumed more than a terabyte of data which had to be computed and compressed even further.

Even after the data compression, one frame of a single volumetric singer still required about five megabytes. Three opera singers animated at 30 frames per second therefore required about 450 megabytes of data per second. This is equivalent to about 4.5 gigabytes for every ten seconds, which roughly corresponds to the data volume of a complete high-resolution 2D film.
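The figures above are mutually consistent if the five-megabyte number is read as per frame, per singer (my reading, since five megabytes per second would not produce the stated totals):

```python
# Back-of-the-envelope check of the quoted volumetric data rates.
MB_PER_FRAME_PER_SINGER = 5   # compressed volumetric frame, figure from the text
FPS = 30                      # playback frame rate
SINGERS = 3                   # digitised performers on stage

mb_per_second = MB_PER_FRAME_PER_SINGER * FPS * SINGERS  # megabytes per second
gb_per_ten_seconds = mb_per_second * 10 / 1000           # gigabytes per 10 s

print(mb_per_second)       # 450
print(gb_per_ten_seconds)  # 4.5
```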

This massive data throughput required for streaming the opera must have played a role in the developers not releasing the prototype augmented reality opera. The project itself began as a storytelling experiment exploring how to make art accessible to a greater number of people. Work on the AR formats is set to continue, but the demo app, while at the forefront of creative technology, will require significant investment and development over time to move beyond the prototype stage.

PROJECT: PRECURSORS to a DIGITAL MUSE [2019]

Remember when we all got excited about GPT-2?

The Experiments: https://experiments.withgoogle.com/collection/aiwriting

No, we never made them publicly accessible. I felt [and continued to feel up to my departure in 2023] that generative AI is a net negative. That’s why we do experiments. Not that anyone listens to that part.

PROJECT: SIS - Displayed with Heidi Latsky [2018]

Promo: Heidi Latsky Dance & Google Creative Lab from Heidi Latsky Dance on Vimeo.

Filmed at City College, this short film portrays the essence of the partnership between Heidi Latsky Dance and Google Creative Lab begun in the fall of 2018.

From: https://heidilatskydance.org/displayed

In development since 2017, D.I.S.P.L.A.Y.E.D is an immersive experience at the intersection of dance, fashion and art, transforming each venue into a unique gallery space. Towards the end of 2018, HLD embarked on an exciting partnership with Google's Creative Lab to enhance the installation with Audio Augmented Reality (AR). With the support of the Disability Forward Fund, the HLD app was created for increased accessibility. With the support of the MAP Fund, three Volumetric Videos (holograms) were created to complement the exhibit.

The D.I.S.P.L.A.Y.E.D sculpture court provided a unique platform to experiment with Google’s Creative Lab audio AR kit. The kit allows you to place virtual sound spheres in a 3D space using a mobile phone. Once the sounds are placed, audience members can walk around the space wearing headphones to hear the sounds, all of which are binaural and respond to the person’s movement within the space.

D.I.S.P.L.A.Y.E.D was part of the Sounds In Space [SIS] series of experiments.

PROJECT: The Cube [2014-2016]

Play With Google’s Psychedelic New Interactive Music Video Cube

Tech Crunch, July 2014

"It’s called The Cube, and it’s a trip. Built by Google Creative Labs as “an experimental platform for interactive storytelling”, it debuted online today with indie dance band The Presets’ new single “No Fun”. You decide what to watch and hear by clicking and dragging The Cube to show a single side or a combination."


PROJECT: Hangouts in History [2013-2016]

‘Hangouts in History’ - Bringing the past to life

Between 2012 & 2015 Creative Lab explored different ways to use, well, Zoom. But back then we called it “Hangouts” (and later “Google Meet”).
For example, bringing history into the classroom by using actors to recreate it, helping teachers who are trying to transport their students back more than 700 years in time to plague-era Europe. We had a hunch that technology might be one way to make this subject a bit more participatory. With this in mind, Google's Creative Lab teamed up with Grumpy Sailor to help a class of year 8 students from Bowral ‘video conference’ with 1348, in what became the first of five “Hangouts in History”.
[Unfortunately we were 8 yrs too early for the pandemic.]

PROJECT: Ghosts, Toast, and The Things Unsaid [2016]

This is one of my favourite projects of my career - I will always be immensely grateful to all who made it happen - the developers at Grumpy Sailor, the cast, but most of all Dan Koerner & Sam Haren at Sandpit.
I wrote a lot about it here: “We’re doing a play. Again…”

"an immersive performance with 360˚ sound using Google’s invisible, embedded technology." 

The Australian, Feb 2016 || Guardian, Mar 2016

In collaboration with Google’s Creative Lab, Sandpit’s intimate performance features two ghosts (the audience) revisiting the kitchen(s) where they grew up, fell in love and refurbished, over 50 years. Audiences of two will (digitally) tune in to the inner thoughts of a couple at three stages of their life.