PROFILE: SBS Audio: SEEN [2024]

Tea Uglow is an influential voice in the tech world. A pioneering force behind Google's Creative Lab, she led experimental tech projects there for 17 years, and her brilliance extends beyond technology. Join Yumi Stynes as the pair walk through Tea's personal journey, and how her experience as a trans autistic woman has become a powerful lens through which she reshapes the tech landscape.

Read More

PROJECT: Ambulant Sounds / Sounds in Space [2015-19]

Sounds in Space was a 3.5-year project exploring the nature of sound, location and orientation in space: essentially, what happens when we allow sound to exist relative to you, and to have a tangible presence.

The Sound in Space technology allows you to place virtual sounds in a 3D space using a mobile phone. Once the sounds are placed, audience members can walk around the space wearing headphones to hear the sounds, all of which are binaural and respond to the person’s movement within the space.

We open-sourced all the code so that it can be used for your own experiments - it is (surprisingly) well documented at the GCL GitHub: https://github.com/googlecreativelab/sounds-in-space
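The core idea, rendering each placed sound relative to the listener's position and heading, can be sketched in a few lines. This is a toy model, not the repo's actual code; the function name and the simple inverse-distance gain are illustrative:

```python
import math

def spatialize(listener_pos, listener_yaw_deg, source_pos):
    """Toy per-frame spatial-audio update: where should a placed sound
    appear to come from, and how loud, given the listener's pose?
    Positions are (x, z) metres on the floor plane; yaw 0 faces +z."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dz)
    # Bearing of the source in world space (0 = straight ahead at yaw 0),
    # then made relative to the listener's heading; positive = to the right.
    world_bearing = math.degrees(math.atan2(dx, dz))
    azimuth = (world_bearing - listener_yaw_deg + 180) % 360 - 180
    # Crude inverse-distance attenuation, clamped so near sounds don't blow up.
    gain = 1.0 / max(distance, 1.0)
    return azimuth, distance, gain

# A sound two metres to the right of a listener facing "north" (+z):
print(spatialize((0.0, 0.0), 0.0, (2.0, 0.0)))  # (90.0, 2.0, 0.5)
```

In the actual app, an azimuth/distance pair like this would feed a binaural (HRTF) renderer rather than a bare gain, which is what makes the sounds feel anchored in the room as you move.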

On Sound.

- excerpted from Debbie’s Design Matters podcast.

Tea:  So we're in your studio, but we're wearing headphones, so the sound gets very distorted. It's not natural sound, and it feels unnatural to be wearing headphones in this small little room. And you can move between those sounds, and that can allow you to understand the sound, to understand the context of the space you're in differently. It can affect how you experience the information.

Tea:  Just the idea of moving the sounds so that we could make it sound like we're in a vast concert hall. And then you begin to understand how if you can do that with objects in space, all of this stuff with AR and VR, we're always listening. We might be able to see it on a screen in the thing, but unless you can hear it in exactly the right place, and we can't hear it in exactly the right place because the sound isn't coming from that place.

Tea:  It's being generated artificially, and it can't be artificially generated specifically for you, because your ears are a funny shape. So, because sound is subjective to you, that thing of putting a sound in space precisely so that it works, becomes an incredibly challenging exercise. And it's one that I'm enjoying enormously. I think we'll get our ears scanned.

Tea:  In a few years' time, I think you'll get your ears scanned just like you get your eyes tested, and you'll have headphones which literally adapt to the inner structure of your ear and allow, mainly for the purpose of augmented reality, sounds to begin to come from very precise spaces. And they're going to need to know the shape of your ear in order for the sound to come from that precise space.

Debbie:  So, customized listening.

Tea:  Yeah, customized listening. Because then information doesn't need to come into you from a screen. It can begin to exist as sound.

Debbie:  Is this something that you're working on at Google?

Tea:  We’re really interested in all the AR core stuff, and anchoring sounds in space, and how you orient. There are things that humans do really well. Filtering sensory perception being one, biases and filtering being another one, and understanding your location in space is another one.

You begin to realize how our understanding of reality, objective reality, is really reliant on those skills happening in our brain. So until we can begin to allow our little supercomputers in our pocket to understand where they are in space, and to understand how we experience sensory perception, not a default for humans, then these kind of experiences are still limited.

And when we do begin to play with those spaces, it's amazing, because you start to put information into the real world, in a way that you are experiencing it. I always talk about sliding doors, the opening and shutting doors, in that it's a great example of augmented reality. You walk towards the thing, the thing understands your intent, which is to walk through the door, and the mechanics of the door, open the door. Or sometimes they don't.

There's no real reason where that can't ... That sort of intent, and you think about how much information we know about you, and the patterns that you exist in on a daily basis. And for me, I like playing in cultural spaces. So theater gets massively changed when you move into a space where you've got whole other metalayers of information around the performance, or books move into an interesting space.

PROJECT: Collector Cards [2014]

Collector Cards was another pre-NFT experiment into creating (in this instance) fungible tokens of value that fed back into creator ecosystems and that could be shared between phones easily.

It was entirely inspired and powered by Google’s purchase of Bump - a brilliant bit of near-field tech that something horrible happened to once we bought it.

Sadly the whole project was deader on arrival than the teams that actually worked on Bump, although we did persevere longer than we should have (again).

PROJECT: BITESIZE [2015]

Bitesize is another of those projects that died because however intuitive, inventive, daring or simply fun they were - they weren’t what the company wanted us to make. Also (always) because the platform we used (Sheets in this case), and the platform we were using it to replace (Sites) didn’t see what we were doing as genius. They just saw it as a bug, a backdoor, a security threat. A threat basically.

Which is, in my opinion, why apps should never leave beta and should always be ready to become something else.

There is no documentation on Bitesize. It was, over an extended period, strangled at birth. It was sad. My thoughts and prayers are still with Jonny Richards and Jude Osborn.

TALK: Pathology of Identity [2020]

PROFILE: Cannes Lions President of Glass [2023] - Campaign Brief

The Cannes Lions International Festival of Creativity has announced the names of the Jury Presidents who will lead juries to award this year’s Lions and set the global benchmark for excellence in creativity.

Tea Uglow, founder of Dark Swan, former CD at Google will represent Australia as jury president of Glass Lions: The Lion for Change. The Glass Lion recognises work that implicitly or explicitly addresses issues of gender inequality or prejudice, through the conscious representation of gender in advertising.

Read More

PROFILE: Design Matters [2020]

Design Matters: Tea Uglow

It’s hard to describe exactly what Tea Uglow does. But know this: She has your dream job. Within Google, as has been said before, she is, essentially, paid to play.

The gig didn’t come easily. Uglow started as a Fine Art student, came across design and navigated the tides of the Dotcom boom and Dotcom bust, and then grabbed the laptop from her severance package and taught herself HTML. She bounced around a few design jobs, and then happened upon a one-month contract position to make Powerpoints for the sales team at Google.

… And then she founded Google’s Creative Lab in Europe.

How?! In this episode of Design Matters, Debbie Millman explores just that—and, of course, digs into what Uglow does today as creative director of Google’s Creative Lab in Sydney.

Uglow’s work may not be the easiest thing to nail down in a nutshell, and it’s best seen in action. So here we present a tapestry of Tea, from her personal writings to a medley of her striking projects that reveal the key to her swift rise and all the rest of it: her raw brilliance.

Midsummer Night’s Dreaming “The Royal Shakespeare Company put on a unique, one-off performance of ‘Midsummer Night’s Dream’ in collaboration with Google’s Creative Lab. It took place online, and offline—at the same time. It was the culmination of an 18-month project looking at new forms of theater with digital at the core.”

XY-Fi “XY-Fi allows you to mouse-over the physical world, with your phone.”

Editions at Play “Editions At Play is the Peabody Futures–award-winning initiative by Visual Editions and Google’s Creative Lab to explore what a digital book might be: one which makes use of the dynamic properties of the web.”

Hangouts in History “Google’s Creative Lab teamed up with Grumpy Sailor to help a class of year 8 students from Bowral ‘video conference’ with 1348, in what became the first of five ‘Hangouts in History.’”

The Oracles “The Oracles is a cross-platform experience, developed for primary school children in Haringey. Digital and physical environments are blended, alternating between gameplay and visits to Fallow Cross, where enchanted objects know where you are so that your moves trigger the story.”

Story Spheres “Story Spheres is a way to add stories to panoramic photographs. It’s a simple concept that combines the storytelling tools of words and pictures with a little digital magic.”

Bar.Foo “Google has a secret interview process …”

Debbie talks with Tea Uglow about experimental digital projects that are pushing the boundaries of tech and art. “This is very much the principle with all my projects, just draw a line in the sand and put a flag there. Then at least someone might come over and look at the line and ask ‘What’s the flag for?’”

PROJECT: SIS - Agatha Gothe-Snape [2019]

'Wet Matter', 2020 was one of two major new commissions produced for 'Agatha Gothe-Snape: The Outcome Is Certain' at Monash University Museum of Art, the first survey exhibition of Sydney-based artist Agatha Gothe-Snape.

'Wet Matter', 2020 was an augmented sonic reality experience that bridges the embodied experience of performing with the act of being a witness. Based on the artist's understanding of the body as being both physically and metaphorically porous—continuous with the fluid world around us—the work comprises a wall painting, point clouds applied by members of Google’s Creative Lab, exhibition furniture and approximately 140 sounds ranging from field recordings to spoken monologues that can be experienced by wearing headphones and a harness fitted with a smartphone.

Featuring sound by Evelyn Ida Morris and contributions by Lizzie Thomson and Brian Fuata, the work is driven by the Sounds in Space app made by Google's Creative Lab and reveals collaboration as an important site for experimentation—where new ideas and languages are created, and where the previously unimaginable is made actual.

PROJECT: ARIA [2019]

ARIA is one of those projects that are monumental, career-works for everyone working on them or around them — but get lost when exposed to the stupefying dazedness of everyday digital flotsam.

For a real insight into why this project mattered I recommend Sean Kelly’s blog - the actual genius behind the scenes.

Opera Queensland Partners with Google Creative Lab to Bring the Opera Home with AR and AI

Australia’s Opera Queensland worked with Google Creative Lab on an augmented reality and AI-powered demo project that brings the stage experience of the opera into the four walls of your home.

With the prototype AR app, opera fans no longer have to line up or even travel to the theater to enjoy an opera production. The project will allow opera enthusiasts to turn their homes into a theater stage and experience the opera via augmented reality.

Opera enthusiasts will now enjoy performances from the comfort of their homes

The experimental AR app, dubbed Project AR-ia, was created with a version of The Magic Flute performed by the Australian opera singers Emma Pearson, Brenton Spiteri and Wade Kernot, accompanied by Brisbane’s chamber orchestra, Camerata.

Opera Queensland hopes to use the technology not only to demystify opera and make it more accessible, but also to bring it to new audiences. The Project AR-ia experimental augmented reality app shows how the art form can adapt with the times and be enjoyed by audiences anywhere, at any time.

The app engages audiences in new ways and reimagines how we experience the art form. While attending the opera has traditionally entailed dressing up and travelling with loved ones to the theater, future opera enthusiasts will be able to simply watch performances from the comfort of their homes, with AR avatars performing for them.

Opera Queensland approached Google in early 2018 in a bid to explore the creation of diverse opera experiences through new technology. The company wanted to give all enthusiasts a front-row seat at the opera from the comfort of their homes. To realize this, the opera singers' performances were completely digitized in 3D as a first step. The technology leverages Google’s AI-based volumetric recording system known as The Relightables, which was unveiled recently.

The images of the opera actors were captured using 90 high-resolution cameras and numerous depth sensors from multiple perspectives. An algorithm subsequently created an animated 3D model of the person captured from the many individual images.

The digital person or avatar created is then projected into the user’s real environment through Google’s smartphone AR software. This is how the user is able to get a private screening of Mozart’s Magic Flute in their living rooms.

The technical challenge faced by the developers was the sheer volume of data. According to the developers, just 20 seconds of the spatial footage consumed more than a terabyte of data, which then had to be processed and compressed even further.

Even after the data compression, a single frame of one volumetric singer still required about five megabytes. Three opera singers animated at 30 frames per second therefore added up to about 450 megabytes of data per second. This is equivalent to about 4.5 gigabytes for every ten seconds, which roughly corresponds to the data volume of a complete high-resolution 2D film.
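The arithmetic only works if the five-megabyte figure is read as per frame, per singer; under that reading, the quoted numbers can be checked in a few lines (the figures are the approximate ones above, not measured values):

```python
# Back-of-envelope check of the volumetric streaming numbers quoted above.
MB_PER_FRAME_PER_SINGER = 5    # approx. compressed size of one volumetric frame
SINGERS = 3
FPS = 30

mb_per_second = MB_PER_FRAME_PER_SINGER * SINGERS * FPS
gb_per_ten_seconds = mb_per_second * 10 / 1000

print(mb_per_second)        # 450 (MB/s)
print(gb_per_ten_seconds)   # 4.5 (GB per ten seconds)
```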

This massive data throughput required for streaming the opera must have played a role in the developers not releasing the prototype augmented reality app. The project itself began as a storytelling experiment for a new kind of experience, exploring how to make art accessible to a greater number of people. Work on the AR formats is set to continue: the demo app is at the forefront of creative technology and will require significant investment and development over time to propel it beyond the prototype stage.

PROJECT: PRECURSORS to a DIGITAL MUSE [2019]

Remember when we all got excited about GPT-2?

The Experiments: https://experiments.withgoogle.com/collection/aiwriting

No, we never made them publicly accessible. I felt [and continued to feel up to my departure in 2023] that generative AI is a net negative. That’s why we do experiments. Not that anyone listens to that part.