Sarah Kane: How a Blind Astrophysicist Looks at the Stars through Sonification and Generative Music

00:00

Joe Patitucci

This episode is brought to you by PlantWave. PlantWave turns a plant's biorhythms into music. You just attach two sensors to a plant's leaves. PlantWave connects wirelessly to a mobile device running the PlantWave app, and the app has instruments on it that are built for plants to play. Check it out at plantwave.com and share it with your friends.

Welcome to The Nature of Now podcast, where consciousness becomes form. My name is Joe Patitucci, and today we have an amazing guest. Her name is Sarah Kane, and she's an astrophysicist. With most astronomers or astrophysicists, you get this image of a person staring through a telescope, looking at the night sky, all that good stuff. Sarah doesn't do that. Now, why? Well, Sarah is legally blind, and Sarah uses sonification as part of her practice of understanding and studying the universe.

01:00

Joe Patitucci

So I thought that this was really cool. When you listen to this music of the stars, it's actually not much different from plant music in certain ways. It's a way of translating data from the cosmos into sound so people can have a better understanding of it, a better understanding of the universe, just like we do with PlantWave, with plants. And it's really cool to hear from Sarah how useful this is and how NASA researchers are using sonification to understand the cosmos. So without further ado, let's join Sarah Kane from the panel Sonification and the Future of Generative Music at South by Southwest 2023. Hey there. Hello, everyone. Welcome to the Sonification and the Future of Generative Music panel, our conscious conversation here. I'm here with Sarah Kane. Hello, Sarah.

02:03

Sarah Kane

Hello. Excited to be here.

02:05

Joe Patitucci

Yeah. So today we're going to talk about different ways of translating data into music, different practices, some of the value that sonification can provide, and the different approaches and ways that we come to it. So, just as an intro of myself, my name is Joe Patitucci. I'm CEO of Data Garden. We design data sonification systems, one of which you may have experienced here, called PlantWave. It's a device that translates plant data into music. And so you can check it out over there at booth 1545, down here in the expo hall. And then also upstairs in 11B, we have a plant music zone, the Plant Music Lounge, which has multiple plants, all connected to PlantWaves. It's a really immersive chill space to go and relax, and it really showcases the power of sonification. So, yeah, that's my intro.

03:15

Joe Patitucci

And then we have Sarah Kane. And so, Sarah, would you like to introduce yourself?

03:19

Sarah Kane

So, yeah. My name is Sarah. I am a fourth-year undergraduate studying astrophysics at the University of Pennsylvania. I'm primarily interested in how we can use the current structure and status of the Milky Way to understand the formation history of the galaxy. But I am also legally blind, and so I'm really interested in this emerging field of sonification: how can we represent astronomical data as sound, both to make it more accessible to people like me who are blind and visually impaired, but also to make it more usable or enjoyable to sighted people, too? I will be starting my PhD at the University of Cambridge next year in astronomy as a Marshall Scholar, selected by the British government.

04:06

Joe Patitucci

Wonderful. So I think we'll start just by defining sonification. Sonification is a way of taking non-audio data and representing it as audio. So it can sound musical; it also can sound like noise. There are lots of ways of doing data sonification. You can think of it as similar to a weather map. A weather map is data visualization; we're all familiar with that. Similarly, we're taking information and then we're making it sound. Is there anything that I missed there?

04:48

Sarah Kane

I think one of the things I like to emphasize with sonification is that it's just a different way of representing data or ideas. Because sighted people, most humans, are inherently visual, it feels very instinctive to represent things visually. Think about a line chart or a bar graph. These might feel like instinctively natural ways to represent data, but there's no reason that same data cannot be represented in sound instead. So I think that gives us a good framework with which to think about sonification: it's not necessarily an alternative to some more natural form. It's just another option that we have.

05:36

Joe Patitucci

Yeah, that's a great point. One of the things I really love about sonification, and this kind of comes from studying film and stuff in college, is that I remember my one professor would always say, when we'd be creating films, start with the audio, because the audio is the way you're going to tell the story, whether or not someone's looking at the screen or whether they're paying attention. And you can kind of build the images on top of it. What I really appreciate about sonification, even for sighted people, is that it's a way of monitoring data without having to pay direct visual attention to it. And that can be really powerful, especially if you're monitoring some kind of system that maybe only has an anomaly happen once every couple of hours. You don't want to be staring at a screen the whole time or something.

06:35

Joe Patitucci

So, yeah, it has that benefit as well.

06:38

Sarah Kane

Yeah, I love that you brought that up, because I'm talking about a lot of these sonification ideas from the perspective of a blind person. But there is all sorts of emerging research suggesting that, as you said, sighted people benefit immensely from sonification. There's a paper from 2014 saying that doctors understand EEG data better when they have it sonified. It's more intuitive. There has been a recent slew of papers, not NASA but Nature papers, suggesting that sighted people can understand data better when they have both the sonification and the visualization. And this all comes back to this idea of universal design, this concept that when you make things that are accessible for disabled people, so in this case blind people, you're actually just making things better for everyone involved.

07:33

Sarah Kane

So I think this is a sort of perfect example of that. In some circles, sonification arises initially for blind people, but then we start to see it actually helps everyone.

07:46

Joe Patitucci

Yeah, it's a great point. There are lots of different ways people learn, different ways people absorb information, and different ways that our sensory system is connected to aspects of our brain. Visual stimulation is more related to the prefrontal cortex, way more intellectual in a way, whereas, and I think you used the word intuitive there when you were talking about sonification, that makes a lot of sense, because sound is more connected to the emotional and feeling centers of our system. So it can also be a really powerful tool; it's like a direct connection to our nervous system in terms of how we're feeling. And we'll get into that a little bit, too. That plays into how we design for sonification as well. But let's have a look, or a listen, at some different examples of sonification.

08:57

Joe Patitucci

We'll start with the plant. How many of you have checked out the Plant Music Lounge today, or at all this whole time? Okay, we got some folks. I highly recommend checking that out. Definitely. Here's some plant music. It has, like, a little pad and a little bass. So what you're hearing there is a plant, I think it's an orchid, actually, connected to a PlantWave. And so you're hearing changes in conductivity through a plant represented as music. It's mostly changes in water content; we're graphing that as a wave and translating it into pitch. And so each instrument you hear is actually being controlled by the same data. It's just being represented at different resolutions.

10:07

Joe Patitucci

So the bass is allowed to be triggered like once every 10 seconds, whereas that little staccato EP thing is allowed to be triggered more often. So you're getting a higher resolution of the, quote unquote, image of the wave from the staccato sound than you are from the flute or the bass. So that's one example of data sonification. And, yeah, you can check that out with PlantWave. And then we're going to have a look, I believe, and a listen to some of the space sonifications that we have.
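
A minimal sketch of that "same data, different resolutions" idea, in Python. The signal, the instrument names, and the trigger intervals here are hypothetical stand-ins, not the PlantWave app's actual implementation.

```python
# One slowly changing signal drives several hypothetical instruments, each
# allowed to fire no more often than its own minimum interval.
import math

def sample_signal(t_seconds: float) -> float:
    """Stand-in for the plant's conductivity reading, normalized to 0..1."""
    return 0.5 + 0.5 * math.sin(t_seconds / 7.0)

def to_pitch(value: float, low_note: int = 48, high_note: int = 84) -> int:
    """Map a 0..1 value onto a MIDI note number within a fixed range."""
    return round(low_note + value * (high_note - low_note))

# Each "instrument" is just a name plus the minimum seconds between notes.
instruments = {"bass": 10.0, "pad": 2.5, "staccato_ep": 0.25}
last_trigger = {name: float("-inf") for name in instruments}

events = []
t = 0.0
while t < 30.0:                      # simulate 30 seconds of readings
    value = sample_signal(t)
    for name, interval in instruments.items():
        if t - last_trigger[name] >= interval:
            events.append((round(t, 2), name, to_pitch(value)))
            last_trigger[name] = t
    t += 0.05                        # 20 Hz polling

# The staccato voice traces the wave in fine detail; the bass only catches
# its broad shape -- the "different resolutions" described above.
for event in events[:10]:
    print(event)
```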

10:47

Sarah Kane

Yeah. So I've picked out some sonifications from NASA's Universe of Sound. This is an incredible sonification project headed by Kim Arcand and Matt Russo. So I've picked out a couple. I don't know if we'll listen to all of them, but we have both the visualization, so this is probably what you're more familiar with if you've seen telescope images before, these really beautiful images of astronomical objects, and we'll see sort of a visual representation of how the scientists are turning this data into sound. So maybe we will start with the galactic center. So this is a visualization, an image, and a sonification of that image of our galactic center.

11:40

[The galactic center sonification plays.]

12:06

Sarah Kane

So in this sonification, those little twinkling noises you hear, the sort of compact notes, those are stars, or what we call compact sources; those become individual notes. And then those more extended sounds, those are like the clouds of gas, what you see as the more, for lack of better phrasing, blurred parts of the image. You might have heard a bit of a crescendo towards the end of it, when we hit the right side of the image. That is actually the supermassive black hole at the center of our galaxy, Sagittarius A*, that you hear as that sort of crescendo of notes.

12:45

Sarah Kane

In this particular sonification, the scientists chose to pan from left to right along the image, with higher pitches representing light at higher points of the image and lower pitches being lower points on the image, to sort of give us that full 2D picture. And then how loud the noise is corresponds to the intensity of light, or how much light we're getting, how bright it is. And in that way, the scientists are trying to capture all aspects of the visualization in the sonification as well.
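
A rough sketch of that mapping in Python, using a synthetic image rather than NASA's actual data or pipeline: pan across columns left to right, let row position set pitch, and let pixel brightness set volume. The threshold and pitch range are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 128))          # stand-in for a telescope image (rows x cols)

low_note, high_note = 40, 90           # MIDI pitch range mapped onto image rows
threshold = 0.8                        # only sonify the brighter pixels

events = []                            # (time_step, midi_pitch, velocity)
n_rows, n_cols = image.shape
for col in range(n_cols):              # pan left -> right, one column per time step
    for row in range(n_rows):
        brightness = image[row, col]
        if brightness < threshold:
            continue
        # Row 0 is the top of the image, so flip it: higher in the image = higher pitch.
        frac_height = 1.0 - row / (n_rows - 1)
        pitch = round(low_note + frac_height * (high_note - low_note))
        velocity = round(brightness * 127)     # louder = brighter
        events.append((col, pitch, velocity))

print(f"{len(events)} note events from a {n_rows}x{n_cols} image")
print(events[:5])
```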

13:23

Joe Patitucci

Amazing. How has sonification changed your way of seeing space data? Because my understanding is you got into sonification kind of through this. Maybe you can tell us a little bit about how you got into it, and then what it's meant for you.

13:47

Sarah Kane

Yeah. So I stumbled into the world of sonification completely by accident. I was doing my first ever research project. I had just finished my freshman year of college, and I was looking at something called exoplanet transits, which are when planets pass in between us and their host star, and we see these little blips in how bright the star looks to us because the planet keeps blocking some of the light. And I was trying to get some data from a NASA telescope, hosted by the Space Telescope Science Institute. And because I was a freshman, I was very confused. I was very lost. I didn't know what I was doing. So I tried to find someone at the Space Telescope Science Institute to ask for help.

14:30

Sarah Kane

And then I happened to see in his bio that he's the project lead for something called Astronify, which was a project that tries to turn the type of data that I was working with into sound. And I couldn't believe it. So I sent him this email, first with my questions, whatever I was confused about, and then I go, by the way, I saw that you're the head of this project Astronify, and I'm blind, and I would love to do anything at all to help with this project. And they were very welcoming. They let me be a usability tester for them. And things sort of snowballed from there. I got more and more involved in sonification. I heard about projects like this one from the Universe of Sound, and it means a lot of things to me.

15:19

Sarah Kane

I think there's a lot of potential in sonification to be a real research tool for people like me, to be a scientific tool that I would use in my day-to-day life when I'm doing data analysis. Are we quite there yet? Not yet. I think we're still in the beginning stages of sonification as a scientific tool. But even more than that, it shows me that people care. People care about inclusion. People care about making things accessible to blind people. Before I got into sonification, I really thought that I was the only blind astronomer in the world and that I was going to have to be the first one to figure out how to do things and how things worked, and that I was alone and that no one would care about making things accessible.

16:07

Sarah Kane

And I feel, pardon the pun, sonification opened my eyes to the fact that people do care about accessibility in science, and that caring about that accessibility makes things better for sighted people, too.

16:24

Joe Patitucci

Yeah. Wow. Thank you for that. There's a lot of evidence that when we remove one way of absorbing information, we can go deeper into another. And I would imagine that your attunement to listening is much deeper than most people's, as is true for a lot of musicians and people who have to prioritize one sense over another; we kind of get into that space. It's for everyone. Sonification has a lot of power. And so when you're listening to something like we just heard, what are you comparing? I'd imagine you're comparing this image to that image. What are some of the things that you're listening for?

17:38

Sarah Kane

Yeah. So what's interesting about the world of astronomy sonification as it stands right now is that there are roughly two, maybe three, classes of sonifications, and this is true of visualizations too. So there are sonifications and visualizations for education and public outreach, and then there are the visualizations and sonifications that are research tools. If you have seen, for instance, the beautiful images from the James Webb Space Telescope recently, that's not necessarily what images in astro papers look like. I'm sorry to say they're not, well, not the same sort of beautiful. So when I'm listening to something like this, which is material made for outreach, I am chasing that sense of awe that drew me, and I think a lot of other people, to astronomy: that sense of wonder, the sense of smallness in the face of the universe.

18:40

Sarah Kane

So I don't want to discount the fact that the sonification is just beautiful, and that is an end unto itself, because that is what inspires people to study astronomy, to study other sciences. If I'm listening instead to a sonification of data, for instance something like Astronify, which still sounds very cool and is lovely to listen to but is a very different sort of tool, then I might be listening for changes in the data. Where do I hear changes in pitch? That could correspond to changes in how bright a star looks. What could that say? Does it mean the star is getting brighter? Could it be like a flare on the surface of the star? Or is it a dip in the brightness of the star? Could that be a planet passing in front?

19:25

Sarah Kane

So depending on what the goal is of our sonification, we could be listening for very different things.

19:31

Joe Patitucci

And is there a way to delineate between whether it's a flare or whether it's a planet? Because it sounds like what you're looking for in that situation is kind of like the delta, or there's a big change. So, yeah, is there any way of indicating the direction of change, whether it's light or dark, or the density of whatever the body is?

19:58

Sarah Kane

Yeah. So this is a great question, and this is something we think about a lot in the astronomical sonification community: how do we standardize what these changes in pitch mean, so that it's an intuitive sort of process? So if we're talking about these flares versus transits, these brighter changes versus darker changes, the type of data we're looking at is something called a light curve, which is a measure of how bright the star is as a function of time. It literally just looks like a line graph that goes up and down as the star looks brighter and dimmer to us here on Earth. And a sonification of that data would just sort of pan along. You hear the line as sound, where higher points in the line graph are higher pitches and lower points in the line graph are lower pitches.

20:50

Sarah Kane

And that makes it quite intuitive to hear the difference between a flare versus a transit, because in a flare you hear this sudden whir of higher pitches, not louder, higher, whereas in a transit you hear a dip into lower pitches.
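
A small sketch of that light-curve mapping in Python, on synthetic data (this is not Astronify's code): brightness over time becomes pitch over time, so the flare shows up as a run of higher notes and the transit as a dip into lower ones.

```python
import numpy as np

time = np.linspace(0.0, 10.0, 400)                 # days
flux = np.ones_like(time)                          # flat baseline brightness
flux += 0.3 * np.exp(-((time - 3.0) ** 2) / 0.01)  # a flare: brief brightening
flux[(time > 7.0) & (time < 7.3)] -= 0.02          # a transit: shallow dip

# Normalize the flux into a fixed MIDI pitch range.
low_note, high_note = 50, 80
norm = (flux - flux.min()) / (flux.max() - flux.min())
pitches = np.round(low_note + norm * (high_note - low_note)).astype(int)

print("pitches around the flare  :", pitches[115:125])   # jump upward
print("pitches around the transit:", pitches[280:290])   # dip downward
```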

21:07

Joe Patitucci

Cool. All right, that makes a lot of sense. And this is all being done similarly with PlantWave and making plant music. We're following this wave that we're getting from the plant. That's basically changes in impedance graphed over time, in real time. And so when the conductivity goes up, when there's more water between the two points, the notes go up. When it goes down, they go down, and then the delta kind of thing comes into play as well. So how quickly it changes will result in different effects being applied. Maybe if it changes quickly, the pacing of the notes may increase, or if it decreases, pacing could go down. And that's all done in real time.

22:04

Joe Patitucci

And because PlantWave is done in real time, one of the challenges that we have to deal with is that, well, I'll start with astronomical data, especially when you're dealing with an image and you're sonifying the image, you kind of know what the maximum and minimum values will be, right, because it's all there in front of you. It already exists. With plant sonification, one of the challenges is that it's changing all the time.

22:33

Joe Patitucci

You kind of think you know what the baseline is, you know about what the range of data is that you're going to get, but you have that thing hooked up for a few days, and before you know it, it's way out of that range. And if you had it as a direct sonification of tone, like you would do with an image, what would end up happening is that at a certain point the notes would go outside of the audible range. So one of the things that we have to do with real-time sonification is have some way of either measuring and recalibrating the system, or doing what we do with PlantWave, which is that we actually wrap the notes, so we'll have a set range that the plant is allowed to play in.

23:19

Joe Patitucci

And then when it hits a certain point, it just has to wrap back down under. Yeah. I'm wondering what kind of design challenges appear on the space data sonification side.
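
A minimal sketch of that wrapping idea, with made-up numbers rather than PlantWave's actual code: when the live mapping drifts outside the allowed window, fold the pitch back into the window with modulo arithmetic instead of letting it leave the audible range.

```python
def wrap_to_range(pitch: float, low_note: int = 48, high_note: int = 72) -> int:
    """Fold an unbounded pitch value back into [low_note, high_note)."""
    span = high_note - low_note
    return low_note + int(round(pitch - low_note)) % span

# Suppose the raw mapping is "pitch = 60 + drift", and over a few days the
# plant's baseline drifts far outside the range chosen at calibration time.
for drift in [0, 5, 14, 27, 55, -40]:
    raw_pitch = 60 + drift
    print(raw_pitch, "->", wrap_to_range(raw_pitch))
```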

23:33

Sarah Kane

That's incredible. So I had never thought about the challenges of live sonification. I'm over here complaining about standardizing sonifications of data we already have, instead of dealing with an ever-changing data set. That was very interesting to hear about. I suppose the challenge of astronomical sonification is that there are so many different types of data, so many different types of visualization, and it becomes a challenge to find the most intuitive sonification when you have a hundred, a thousand different plots that are wildly different. So, for instance, what we just listened to, the galactic center, is a sonification where we pan from left to right, and how high or low the pitch is corresponds to how high we are on the plot, whether we're in the top or the bottom part of the plot. And then the volume corresponds to how much light we're getting.

24:38

Sarah Kane

But we could look at just an image, so not even a different type of plot, but still just a picture, of a very different object. I'm going to ask you to pull up the Cassiopeia A sonification in a second, please. We have still an image of an object, just a different object, and the method we use to sonify the data already has to be wildly different, because the image is so different. So if you could play the Cas A sonification: it's a sonification of Cassiopeia A, which is a supernova remnant, the remnant of a star that died explosively. And as you can see, it's roughly circular-ish, in a way that the galactic center image didn't look.

25:41

Sarah Kane

And this means that it makes much more sense to sort of scan out from the center of the image, rather than panning from left to right as we did before. And since we're moving out from the center, it no longer makes sense to have pitch correspond to whether we're hearing noise from the top or the bottom of the image. So now the different pitches, or the different instruments you're hearing, are actually mapped to the different elements in that image, not elements you can see, but elements that astronomers know are there. So that's like calcium and silicon and iron, remnants of this supernova. So each instrument you hear, sort of in that chorus, is mapping how much of that element is at the point of the image where we are currently sonifying.
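
A sketch of that radial, per-element mapping in Python, on synthetic maps rather than the Chandra team's actual data or code: step outward from the center in rings and give each element map its own voice, with loudness tracking how much of that element sits in the current ring.

```python
import numpy as np

size = 101
rng = np.random.default_rng(1)
# One 2D intensity map per element, as if each came from its own X-ray band.
element_maps = {name: rng.random((size, size)) for name in ("silicon", "calcium", "iron")}

yy, xx = np.mgrid[0:size, 0:size]
radius = np.hypot(xx - size // 2, yy - size // 2)     # distance from the image center

for r in range(0, size // 2, 10):                     # scan outward, ring by ring
    ring = (radius >= r) & (radius < r + 10)
    volumes = {name: round(float(m[ring].mean()) * 127)   # 0..127 "velocity" per voice
               for name, m in element_maps.items()}
    print(f"ring {r:2d}-{r + 9:2d}:", volumes)
```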

26:35

Joe Patitucci

Okay. Wow. So there are just many different ways that one would sonify. And when you're doing research, or when you're checking out these different images, is it common for the different practices to all be available? Is it kind of like, visually, it would be like a histogram or something, and you say, oh, I'm looking at these certain things? Do they normally provide the image as a sonification in multiple ways?

27:10

Sarah Kane

Yeah. So this is a really interesting thing in astronomy, and I think one of the most common misconceptions about astronomy. Growing up, I would often hear: how in the world are you going to be an astronomer? You can't see through a telescope. Most astronomers don't look through telescopes, and, in fact, most of us don't look at images either, not in the same way. So, for instance, in my research, I unfortunately never actually get to look at those pretty pictures. That's not really what my research looks like. Instead, it's many line graphs or other types of bizarre and unusual plots. And one of the challenges of being a blind person in astronomy is that there are not sonifications available for all of these different plots we have.

27:57

Sarah Kane

So Astronify, which I've talked about, is a sonification of one very specific type of plot, a light curve, and it works really well at that. But suddenly you move to a slightly different graph, and it no longer works. So it's a really great question of, are all of these sonifications available? And the answer is, not yet, but I hope someday. Yes, I think people are working towards that.

28:24

Joe Patitucci

Got it. And it sounds like there isn't yet really a standardized way of sonifying, or there aren't necessarily, like, named types of sonification one could point to. Yeah, it sounds like there's a lot of opportunity there. And maybe, I don't know, we have a pretty cool real-time sonification engine. Maybe we could plug it in one of these days and let it rip. Yeah, that's really cool. So one of the things I'm hearing from you about the way that you're doing research is that astronomers aren't necessarily looking at pretty pictures all day, right? Or just staring into telescopes, listening to planetarium music.

29:12

Joe Patitucci

But one of the things that I find interesting, and that people might not know, is that with a lot of these images you're talking about, some of these things are meant to be more internal for scientific research and some of them are more for outreach. So with those images that we see of galaxies and all these things, how true to the visual are they? And how much interpretation is done in that space, on the visual side? Is that something that maybe most people aren't aware of?

29:51

Sarah Kane

That's an excellent question and something I was really hoping to get to talk about, because I think this really hammers home how sonification is no less intuitive, no less natural, than visualization. Again, I'm going to talk about James Webb because, you know, shiny new toy. Maybe you saw those incredible pictures from James Webb that came out this summer, and it's easy to think that those are just like pointing a camera and taking a picture. But in reality, a lot of telescopes are actually looking at light that we cannot see. Our eyes, if they work better than mine, can only see a very narrow, tiny little strip of the electromagnetic spectrum, of all the light that's out there. So a lot of telescopes like James Webb, which looks at infrared light, I believe, are not looking at light that we can see.

30:48

Sarah Kane

So anytime you see an image from James Webb, there has to be some interpretation there, in terms of how we translate this light we can't see into light we can see. I'm no expert in that process, and I think it's pretty faithful to the actual data, that's a primary goal there, but when we have this light we can't see, we are already translating it into something we can see. It's already a change from the original. That's something I like to emphasize: sonification is just a different sort of translation. We are still taking something we can't see naturally and making it into a form that we can see or hear naturally.

31:42

Joe Patitucci

Totally. You mean that the galaxy doesn't sound like celestial flutes? Is that what you're saying, Sarah?

31:51

Sarah Kane

Yeah, I know. That's something I get asked a lot. Like how can you hear something in space? Especially a lot of these black hole mergers are represented as sound, like a really cool chirp noise. I don't know if any of you have heard that. And one of the questions we get a lot is like, how do you hear anything if space is a vacuum? We're not hearing anything. We are representing some data we're getting as sound. It is a representation of something we observe, just as an image is a representation of something we observe. It's not literally music from the stars. It's more music of the stars that we're making.

32:32

Joe Patitucci

Exactly. Yeah. That's an important point. With PlantWave, we get a lot of that. Sometimes people are angry on the Internet. Some people like doing that; that's a hobby. And they'll write, in all caps, plants don't sound like flutes. And it's like, okay, yeah, no, but we can represent the data from plants with flutes so that we can hear these changes that are happening. Yeah. It's just really important to note that sonification is a way of representing this information to humans in a relatable way. And a lot of what we see out there, even in the visual land, is in some ways a translation as well. And that doesn't make it any more or less real. Let's be clear about that, because what it is is a tool. We're expanding our ability to perceive outside of the apparatus we've been given as humans.

33:43

Joe Patitucci

And, for instance, plants are able to absorb light and respond to light that's outside of our visible light spectrum. And so if we're able to listen to plants as they're responding to changes in their environment, then in a way, we are opening our ears to be able to see with the leaves of a plant. And there are lots more applications we can get into. But do you have anything else to add to that?

34:17

Sarah Kane

No, I think you put that very well. I think, as you were saying, one of the things, and I know I've said this already, but one of the things I just always want to emphasize with sonification is that it's just a different way to represent data. It's just another way to do the same thing.

34:35

Joe Patitucci

Sure. And so a lot of people might get into it like I did. I love Carl Sagan. I used to just go to sleep listening to the Cosmos series every night. Like, atoms as massive as suns, universes smaller than atoms. And I just blissed out into wonder. You know, I think a lot of people that are into astronomy and into space get pulled in through the wonder of images and things like that. How is that for you? Is the sonification doing it for you? When you hear the sonification of space, does it bring you into this feeling, this sense of wonder and awe and good vibes?

35:25

Sarah Kane

I definitely think it does for me. I think my inspiration to get into astronomy was a little different from a lot of sighted astronomers. People will talk about looking up at the night sky or watching these documentaries and whatnot, and a lot of those things weren't accessible to me. And so I got into astronomy because I watched a lot of Star Trek. But in the past few years, as I've gotten into sonification and I've heard these images, I guess I feel that sense of wonder, and I can get a sense of the beauty that other people are able to see. And it makes me think of the next little blind kid who wants to be an astronomer and who will hear that and be able to get that sense of wonder from something real, from real data.

36:21

Sarah Kane

This is the sound that we use to represent real astronomy and not just a tv show, as good as it might be. So I definitely get that sense of wonder. And I'm excited for a new generation of blind people to get to experience it at a younger age than I did and hopefully to inspire more blind astronomers to pursue the career and to know just by hearing that sonification, that people want them in this community, that they're welcome here, that people are working to make sure that they can experience that sense of wonder that will make them want to be a part of this scientific community.

37:00

Joe Patitucci

That's really cool. I'm thinking now, too, about the way that the data is represented as sounding celestial and things. I'm wondering who the first person was that started to make the sonifications of the space data. I would imagine they were likely sighted, and they were taking that feeling they have and communicating it through sound, which is now translated through a human, through the technology, back into wonder for you to experience, and all the next generation coming up.

37:43

Sarah Kane

I think that's one of my favorite things about sonification that you've sort of just touched on: it's such an incredible opportunity for cross-discipline collaboration, because you don't just have astronomers working on sonification. You have musicians, you have psychologists to understand how we interpret sound, you have sound engineers. You have all these different people who have to sort of work together to make a sonification that is not only faithful to the data but is also enjoyable, that captures the feelings that we want it to. I would be remiss in talking about astronomical sonification without mentioning Kate Meredith and GLAS Education, an incredible nonprofit that organizes a monthly Sonification World Chat, where people from all over the world talk about sonification. And it is not just astronomers. It is people from all different disciplines. It's very exciting.

38:41

Sarah Kane

I think sometimes us scientists get a little stuck in our bubbles of hard sciences. And it's exciting to get to collaborate with artists and sound engineers and psychologists who have perspectives on how to make sonifications better than we could ever make them as just a bunch of astronomers sitting in a room, clacking away at a keyboard.

39:06

Joe Patitucci

Right on. Yeah. It's important to have the qualitative aspects of our work available to us, right? The environment that we work in is important. So always having that connection to wonder, that connection to the kind of magic that is qualitative, as we're studying the quantitative. So I think we might be ready to open up for questions. Does that sound good? All right. Yeah, I think we're ready, so we can open it up. Are there any questions from the audience on sonification, plants or space, or where we're going, what's happening?

Speaker 3

Who's setting the harmonics and the limits, like the scaling and that stuff, with the plants? With both the plants and, in this case, the astronomical image, because it's so beautiful. Clearly there was some thought put into making sure that the harmonics work despite whatever frequency is being activated.

40:25

Joe Patitucci

Yeah, I can give a little on the plant side. So I'm the guy; I made those decisions in the PlantWave app. I've been working on it for, what, eleven years? I've developed a sonification practice and framework, so I can give you a little insight on that. There are a few core values with it. One core value is the value of harmony. The value of harmony is that if something sounds nice, people are going to spend more time with it. If you're looking to observe data over a long period of time, whether it's with your eyes or with your ears, you've got to make sure it's non-fatiguing, right? So, value of harmony. Then I say, okay, pentatonic scales. So everything's always scaled to a pentatonic key. That way, no matter what order the notes are played in, it always stays kind of harmonious.
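
A small sketch of that "everything scaled to a pentatonic key" idea in Python. The helper below is hypothetical, not the PlantWave app's code: it snaps whatever note the raw data mapping produces to the nearest pitch of a major pentatonic scale, so any ordering of notes stays consonant.

```python
MAJOR_PENTATONIC = [0, 2, 4, 7, 9]          # scale degrees in semitones

def quantize_to_pentatonic(note: int, root: int = 60) -> int:
    """Snap a MIDI note to the nearest note of the major pentatonic scale on `root`."""
    scale_notes = [octave * 12 + root % 12 + degree
                   for octave in range(11)
                   for degree in MAJOR_PENTATONIC
                   if 0 <= octave * 12 + root % 12 + degree <= 127]
    return min(scale_notes, key=lambda p: abs(p - note))

raw_notes = [61, 63, 66, 70, 71]            # whatever the raw data mapping produced
print([quantize_to_pentatonic(n) for n in raw_notes])   # -> [60, 62, 67, 69, 72]
```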

41:27

Joe Patitucci

The next aspect is to provide multiple levels of resolution of the data all at once. And so you can do that with multiple instruments. So people will comment on an Instagram post or something and they'll be like, yeah, cool, that orchid sounds great, but I wish you'd remove the piano because I was just trying to listen to the plant, when it's maybe piano, bass, and whatever else, and the plant is controlling multiple instruments. So they've missed the fact that it's all a representation of the plant, just providing different resolutions of the data as music. So you might have the bass allowed to be triggered once every 10 seconds.

42:29

Joe Patitucci

You're getting a change every 10 seconds on the bass instrument, whereas with a marimba or something that's way more staccato, you're getting a finer level of resolution. So that's why with PlantWave, we have four different instruments that the plant can play all at one time. And so you're hearing the same data, but at different resolutions. And together it ends up sounding like a symphony. So those are two: harmony, and providing different levels of resolution. The third one is providing insight into large changes, like allowing novel events to be very clearly novel events. So we designed for that. The notes are generated by the change in conductivity, so you can think of that as like a lie detector wave. Imagine that wave. It's funny because it's the same circuitry, basically.

43:33

Joe Patitucci

So, lie detector wave: imagine that being translated into pitch, right? But then imagine that wave goes from something that's a really low frequency, where the wave is really long, to something really high, really fast, so there's a huge velocity in the change. An event like that with PlantWave, or with our sonification system, would change a knob, which can be mapped to one of many things. You think of a knob on your car stereo or something as volume, but a knob could also be turning up the reverb, or a knob could make it so that the sample rate becomes faster and it's able to trigger more notes faster. So we're using CC messages. MIDI is musical instrument digital interface; it's like a computer language for synthesizers. So we're using the MIDI CC signal to represent changes in activity.
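
A sketch of that "speed of change turns a knob" idea, with illustrative numbers: estimate how fast the signal is changing, scale that rate onto 0-127, and use the result as a MIDI CC value (which could be routed to reverb, note density, or anything else). In practice the value would be sent as a MIDI control change message; here it is just printed.

```python
def delta_to_cc(previous: float, current: float, dt: float,
                max_rate: float = 0.5) -> int:
    """Map the absolute rate of change (units per second) onto a 0-127 CC value."""
    rate = abs(current - previous) / dt
    scaled = min(rate / max_rate, 1.0)       # clamp at the calibrated maximum rate
    return round(scaled * 127)

# A slowly drifting signal gives small CC values; a sudden jump turns the knob up hard.
readings = [0.50, 0.51, 0.52, 0.53, 0.80, 0.82]
for prev, curr in zip(readings, readings[1:]):
    print(f"{prev:.2f} -> {curr:.2f}  CC value: {delta_to_cc(prev, curr, dt=0.1)}")
```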

44:44

Joe Patitucci

So there's a lot of information there when you're listening with a PlantWave. And we're trying to make it as easy to listen to for as long as possible, so that you can be with the data for those emergent happenings of large changes in the delta. That's how we approach it. And then we're also applying that to other things, like wearable data. So we have some patents related to that. That's the bigger thing for Data Garden moving forward beyond PlantWave: using that same kind of sonification engine to make real-time biofeedback music for wellness, for meditation, for exercise, high-intensity interval training, all of these. You know, our lead developer Carl over here has been amazing in ensuring that we're building it in a way that it can scale.

45:46

Joe Patitucci

And, yeah, we've developed a language, and I think there are different languages, maybe a different language for biological systems in real time, which have to recalibrate, than the language that's used for other kinds of sonification. But that's how we're doing it. I'd love to hear more on your side.

46:06

Speaker 3

Yeah.

46:07

Sarah Kane

So, unfortunately, I am not the guy who does that, so I don't know if I'll be able to give you as in-depth an answer there. What I can tell you is that oftentimes, for especially these outreach sonifications, we use real musical instruments, which will set some limits on the pitches just based on what noises those instruments can make. And that's because that's aesthetically pleasing to the ear. A lot of effort goes into making sure that things are aesthetically pleasing. As you said, people won't listen to or look at things that they find grating. And that's where a lot of our collaboration with sound engineers and musicians comes in. In terms of our data, we don't have live data, so we know what the brightest and the dimmest points are.

46:56

Sarah Kane

So how do we map that onto a range of pitches that is broad enough that you can hear the differences, but narrow enough that we're not getting to any pitches that are absolutely miserable to listen to? In terms of sonifications that are maybe more research oriented, where maybe we're not using instruments, in that case a lot of thought still goes into the sort of default pitches that are used to map things, again so it isn't painful to the ears of whatever poor astronomer is listening to it. And a lot of that comes down to usability testing, a lot of testing, which definitely needs to include blind and visually impaired people for best results, because we're good at listening to things.

47:43

Sarah Kane

So a lot of usability testing. And then sometimes, I believe with Astronify, because you can make your own sonifications of your own data, you have some control over the pitch range. So maybe if you need to modify the pitch range so you can hear something else, you as the user, I believe, have the option to do that. At the very least, with Astronify, you have the option to invert the pitches. Like inverting colors on your phone, you can invert the pitches, so higher points on a line graph, brighter moments for the star, will actually be lower pitches. And that's all in terms of making things more usable, making things more pleasant, and again, that comes down to a lot of usability testing.
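
A tiny sketch of that pitch-inversion option, as generic Python rather than Astronify's actual API: reflect each pitch around the chosen range, so points that would have been the highest notes become the lowest instead.

```python
def invert_pitch(pitch: int, low_note: int = 50, high_note: int = 80) -> int:
    """Mirror a pitch inside the range [low_note, high_note]."""
    return low_note + high_note - pitch

pitches = [52, 60, 75, 80]          # from a normal "brighter = higher" mapping
print([invert_pitch(p) for p in pitches])   # -> [78, 70, 55, 50]
```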

48:31

Sarah Kane

So, again, another wonderful opportunity for collaboration, not only with musicians, but now with astronomers, with the blind, visually impaired community, to include them, to include us in the creation of these tools that will help them. That idea of nothing about us without us.

48:55

Joe Patitucci

Yeah, great question. Along with that, there's also this kind of spacing between the notes. We arpeggiate things just to make it so that it has more of a time feel. Sometimes it's not arpeggiated. I know that with a lot of the sonification stuff for NASA, a lot of times it's not, because you want to know precisely where the star is or where that thing is on the image, right?

49:21

Sarah Kane

Yeah. So in these sonifications we listened to, these compact sources, like stars, little points of light, those would be a single note, whereas other things are sort of more of a wave, or, how do I describe this, more interconnected points of notes, I suppose. Something like Astronify is a bit different. I likened it in one interview to a glissando on a piano, where you run your hand down the piano and you hit every key. You can still hear the individual notes, but they sort of blend together. It doesn't sound like a piano, but it sort of has that feel: we hit individual notes because they're individual points on this line graph, it's not a continuous line, it's each individual observation.

50:11

Sarah Kane

So we hit individual notes, but they all sort of blend together in something that gives you a good picture of the continuous whole. So, yeah, it's interesting to think about how all of these aspects of the sonification play together. The pitches, the spacing between notes, how loud it is, how all of these things interplay, to give us a comprehensible image, quote unquote.

50:36

Joe Patitucci

Right on. Thanks for that. We have another question over here. Yes.

50:40

Speaker 3

Hi. My name is Honey. Thank you both for sharing your practice and your findings so concisely and generously with us. I'm really interested in playing with sonification, and I'm wondering what tools are available to us lay people to collect and translate data for sonification, and probably most particularly for biofeedback, since we are the instrument that we have available.

51:05

Joe Patitucci

Sure. So for biofeedback sonification, we have PlantWave for plants. You could technically use PlantWave on your body, in a way. We have an alternate firmware version we might release eventually that will allow for more of that kind of thing. We started experimenting with sonifying breath patterns and things like that. We're working on connecting our sound engine to different wearable devices. There's nothing out there right now. And honestly, I don't think that right now there's a market for it, which is great, because we're building it. That's what we want to do. We want to build the market for it. We just feel like this is something that we want in our lives. But there's nothing really out there commercially that does sonification.

52:08

Joe Patitucci

There are a lot of Arduino projects you can find to play around with things, but I don't know of anything right now on the market that's doing direct sonification in that way. Most of it is like, even with the biofeedback devices, it might unlock certain sounds, or you might get little bird sounds if you're wearing a Muse headband or something like that, but not in the same way as real-time, direct biofeedback music. Do you know of any?

52:42

Sarah Kane

Certainly not in the realm of biofeedback, I'm sorry to say. A little outside my wheelhouse. There are certain astronomical tools that will let you do your own sonification, one of them being Astronify, another being SonoUno. But these are primarily for astronomical data, and I will say, I don't know how accessible they are to lay people. For instance, Astronify will require you to know how to program in Python. I would say, if you're interested in getting into this kind of thing, I would definitely point you at the Sonification World Chat again, hosted by GLAS Education. And you can get a lot of different perspectives, not just from astronomers, but from people sonifying all sorts of data. So that's probably where I'd point you as a starting point. Unfortunately, it's not a tool, it's just a community. Thank you. Who else has a question?

53:39

Joe Patitucci

Awesome. We have the sonification of this ending soon with bagpipes, I think. But we still have like five minutes left. Go for it.

53:49

Speaker 3

Hi, I'm Caroline from Australia. Thank you for your talk. I was just wondering, sonification has been around for quite a long time, and, I think I bombarded you a bit the other day when I first met you because I was so excited this was here, it seems to me that it's still not taken very seriously. What do you think emerging tech might bring to that conversation? Could it change things?

54:12

Joe Patitucci

Yeah, I think one of the biggest challenges is actual audio literacy in general. To me, it's like a media literacy problem; we're such a visual culture. So to me, the biggest challenge is that there's a lot of room for humans in post-industrial societies to step into learning how to listen. That's part of what I'm looking to do with PlantWave. We're creating this thing where some people walk up and they hear it and they're just like, I don't know, it's just playing the same music it was playing before; I hook up this plant, it's playing the same thing as when I hook it up to that plant. Because they're like, oh, it's the same instruments. I thought it would be like a totally different band if I hooked it up to this versus that.

55:19

Joe Patitucci

But even sometimes you have to say, well, do you notice that the flute isn't playing now, and that it's in the same key, but this other thing isn't happening? I feel like it's listening literacy, and it's really easy for us musicians and folks to think, oh, how hasn't this taken off? And the reason is that people need to be educated, because this is a tool that you need to learn how to use, like riding a bike or anything. For us musicians, it's easy. For people that are hyper-intuitive, it's easy.

55:59

Joe Patitucci

But if it's somebody that's really super analytical, that isn't into as many of the qualitative aspects of being a human being, unfortunately, some of those people are the ones that are making the decisions of what gets funded and what doesn't get funded, because they're looking for the hockey stick graph on the investment and all that.

56:28

Speaker 3

You think it could? Certainly, one of its first uses could be in areas where people are looking at whole systems, or listening to whole systems. Cybersecurity, for instance, has a lot of anomalies in the system, and if you're someone who listens to that on a daily basis, then you could hear those distinctions when something might go awry.

56:48

Joe Patitucci

I've thought about that, too. That's a great point. I think that for those systems, though, they might not need a human to listen to it. They have the AI in place for other aspects of the system to manage itself. I think its greatest potential is really in allowing people to connect more deeply to their own bodies through biofeedback music. And I'm sure there are going to be other areas too. We'll find out. Do you have anything you'd like to add?

57:22

Sarah Kane

Yeah, it's interesting. I think we come at sonification from very different perspectives and backgrounds, which makes things interesting. I think within the scientific community, it's not so much an issue of it not being taken seriously, because people are generally very receptive to this idea of sonification, so much as it not, as you said, being mainstream. There aren't sonifications or sonification tools available for everything that we use in astronomy. And outside of the actual sonification community itself, there's not much of a push for there to be tools. From my perspective, I think the biggest solution to that is going to be inclusion. I am one blind person, one of, mind you, a relatively small number of blind people in the scientific community calling for these changes to be made.

58:26

Sarah Kane

And it's hard for things to become mainstream when you have, what, five, ten blind astronomers in the entire world. I don't know how many there actually are, but not many. So I often talk about this idea of a snowball effect: something like the sonifications we played for you are going to inspire more blind people to become astronomers. And the more of us there are, the more of a push there will be for more sonifications. It will become more mainstream. You'll see more blind astronomers in the field, and still more sonifications, and eventually there will be so many of them that it will be mainstream not just for blind people, but for the whole community. We have all sorts of things that started out as accessibility tools and eventually just became so popular that they're universal.

59:17

Sarah Kane

For instance, I talked about invert colors on your phone or dark mode. How many of you guys like dark mode? That was initially a visual accommodation for people with various reading or sight issues. Eventually, when you have enough people with disabilities pushing for these changes, they become mainstream, and people start to see that actually, again, this accessible, universal design is better for everyone.

59:43

Joe Patitucci

I love that. That's so wonderful. There are so many different ways that our systems are configured right as human beings. We all have different hardware here, and all of our hardware is configured specifically for our highest contribution as humans and as individuals. And the more that we can create ways for human beings to fully express themselves and fully engage in the world around us, the better it's going to be for everyone, because everybody has a really unique gift, and I'm really glad. It's really great to have met you and to hear about sonification from your perspective. It does feel really promising also to use this technology to really illuminate deeper connection with the universe and our solar system and galaxies and ourselves, our own bodies and the natural world around us. Thanks so much.

01:00:50

Joe Patitucci

I think we're going to close out because we have a party that's rolling right into this place, but we'll be around here. Thank you so much for attending, and go on upstairs to the Plant Music Lounge. It's such a zone. It's such a good space. And meanwhile, I guess let's check this out. Peace. Thank you.

01:01:11

Sarah Kane

Thank you.

01:01:13

Joe Patitucci

Thanks for listening to The Nature of Now podcast. Of course, we will have links in the show notes to Sarah Kane and all the information. Really, really cool conversation. I'm looking forward to staying abreast of what Sarah is up to, because she's just such a brilliant human being. So, yeah, thanks again for listening. I hope you learned a lot from this show, and we'll catch you all soon. This episode is brought to you by PlantWave. PlantWave translates a plant's biorhythms into music. All you have to do is connect two sensors to a plant's leaves. PlantWave pairs wirelessly to a mobile device running our free PlantWave app. And the app has instruments on it that are designed for plants to play. So every single note that you're hearing right now is an expression of real-time data that is coming from the plant.

01:02:07

Joe Patitucci

These are the impulses in the plant being expressed as music. Nothing's pre-recorded. It's just real. It's happening. And what's kind of cool about that is that a lot of people, when they hear this, think it's kind of pre-composed music. And I'll get accused in the comments, like, yo, man, this is definitely something that's been composed, and I find that really cool. That's flattering. I wish I wrote millions of hours of music and just put it in an app. Fact is, I didn't. I'm not that great of a musician. I'm better at thinking of how to design algorithms and things like that. PlantWave is all scaled to a key. That's why it sounds harmonious like this. I did that on purpose, because it's designed for human beings to be able to tune into plants.

01:03:00

Joe Patitucci

And in order for a human being to want to spend time tuning into the plants through these impulses, I made a design decision to put it in a key. So that's why you're hearing this. So this is the plant playing this collection of instruments, and I have other collections of instruments. So, like, what if we heard this plant's biorhythms triggering piano samples? What would that sound like? Let's check this out. Cool. That's what that would sound like. And then what would happen if I touch it? Cool, the notes are going up a little more. That's fun. What's cool is that you don't even have to touch the plant. If you just hang out with it all day, you'll notice every once in a while that it's doing completely different things. It's going into a whole other octave range, or it's switching what instruments it's triggering.

01:04:06

Joe Patitucci

And that is because a plant is a living being. Newsflash: it is not a thing. And my hope is that by sharing PlantWave with the world, we can help us all have a greater understanding of and compassion for the life around us, and realize that we're all a part of Earth and we're all in this together, floating around through space. So if you're interested in connecting to nature in a new way, check out plantwave.com. And, yeah, I'm just happy that you're experiencing this podcast and this music, in any way that you want to experience or support PlantWave, whether that's just enjoying our videos on TikTok or Instagram or any of those things, or checking out our YouTube videos. Or how about you just send a field of love to us for doing this?

01:05:06

Joe Patitucci

That's all we need, really. All we need is love, right? And food and water and all that stuff too. Unless you're a pranatarian. Still skeptical about that, but I digress. Check out plantwave.com and thanks so much for listening.
