The Power and Manipulation Behind Free Platforms and Search Engines
April 24, 2024


In this compelling episode of This Anthro Life, we engage with filmmaker David Donnelly, delving into the depths of his documentary "Cost of Convenience." Donnelly's exploration uncovers the intricate web of consequences spawned by technology, spotlighting the covert toll of social media and search engine usage. Through a narrative woven with two years of intensive interviews and research, Donnelly unveils the unseen impacts of our digital age. The conversation traverses the power dynamics inherent in data collection, emphasizing the imperative of transparency. We advocate for a cultural shift, calling for an evaluation of our relationship with technology and its pervasive influence on modern society.

Explore the intricate repercussions of technology through David Donnelly's lens in "Cost of Convenience." Delve into two years of intensive research revealing unseen impacts.


How do the hidden costs of technology, like social media and search engines, shape our lives in unforeseen ways?



Timestamps:
00:00 The impact of digital technology on human experience, exploring themes of power, data collection, and mental health.
06:44 Data collection and surveillance in the tech industry.
14:27 The impact of technology on critical thinking and society.
17:48 Interdisciplinary approach to understanding complex problems.
23:10 The impact of algorithms on mental health and privacy.
30:37 The impact of online interactions on human connection and well-being.
33:28 The purpose of education and critical thinking.
38:04 The impact of algorithms on society, including privacy, wealth inequality, and discrimination.
43:47 AI bias, responsibility, and accountability in technology development.
49:08 Tech's impact on society, including mental health, aging, and consciousness.

Key takeaways

  • Technology platforms that we use for free collect and sell our data, which can be used to predict our behavior more accurately than our own minds can.
  • The lack of transparency in data collection and algorithmic decision-making gives those with access to the data a tremendous amount of power and influence.
  • Our relationship with technology is causing a range of societal issues, including mental health problems, polarization, and misinformation.
  • There is a need for a cultural paradigm shift and more informed citizenship to address the consequences of technology and advocate for transparency and control over data.


About This Anthro Life:
This Anthro Life is a thought-provoking podcast that explores the human side of technology, culture, and business. Hosted by Adam Gamwell, we unravel fascinating narratives and connect them to the wider context of our lives. Tune in to https://thisanthrolife.org and subscribe to our Substack at https://thisanthrolife.substack.com for more captivating episodes and engaging content.  

Connect with David Donnelly:
Instagram: https://www.instagram.com/dav_donnelly/?hl=en
Twitter: https://twitter.com/daviddonnelly?lang=en
  
Connect with This Anthro Life:
Instagram: https://www.instagram.com/thisanthrolife/ 
Facebook: https://www.facebook.com/thisanthrolife 
LinkedIn: https://www.linkedin.com/company/this-anthro-life-podcast/
This Anthro Life website: https://www.thisanthrolife.org/  
Substack blog: https://thisanthrolife.substack.com 

Transcript

Adam  00:00

If you get the feeling that life online feels a bit like living in fast forward, you're not alone. For many people around the world, scrolling through news feeds has replaced morning newspapers, and conference calls connect us more frequently than handshakes. We're also constantly bombarded by ads and notifications from every corner of our screens, and somehow it all seems to move faster every year. And it can feel like the more connected we are, the more alone we can be. I'm Adam Gamwell, and you're listening to This Anthro Life, where we peel back the layers of the mundane to reveal the extraordinary insights that they hold. Today's episode takes us to the heart of how our digital lives are reshaping the human experience. I'm excited to introduce you to David Donnelly, the documentary filmmaker who has been exploring the question of what's truly behind our screen-illuminated lives. He's got a new film out called Cost of Convenience, and after checking it out, I wanted to talk to him about the ways that convenient tech changes flows of power and data to shape the narrative of our daily lives. The challenge we explore is that the free platforms and search engines we rely on are not just tools for connection, but a canvas on which our behaviors are subtly painted, directed, and even manipulated. This conversation isn't just a sneak peek behind the curtain of digital platforms; it's a call to examine the strings attached to our digital puppetry, we might say. From the algorithms dictating our movie picks to the silent auction of our personal data, David challenges us to consider not just the convenience, but the cost of our convenience. This is not a utopian or dystopian story. It's just the story of us and of technology, and of the silent revolution that's unfolding in our pockets and in our living rooms.
This is This Anthro Life, where anthropology and the exploration of human potential converge. Stay tuned.

 

David Donnelly  01:38

Pretty much what happened was, during the pandemic — well, prior to that, I was doing a lot of documentaries in the world of classical music, and those were focused around physical events and concerts. So when the pandemic hit, suddenly everything got canceled, in many cases permanently. And during that window of time, people also started using technology at a rate of adoption that would usually take seven years. So we had this acceleration happen in a small window of time, and I just started looking inward instead of traveling around doing the kind of filmmaking I was doing prior. And then we started to, you know, pull the thread, and it just kept taking us to a deeper and deeper side of technology, in particular these platforms. I think people have a general sense of anxiety that there's something kind of off, and we wanted to get to the root of the problem. So we spent two years interviewing people. It was completely exploratory. There wasn't a script in advance, like a lot of documentaries; it wasn't centered around an event or a particular person. It really was just our team of people starting to ask questions, then seeing where those questions led us, and continuing to trust the cast to introduce us to more people to try to, you know, deconstruct this very complex web of consequences and actions.

 

Adam  03:10

No, I think that's really interesting to note. It makes my researcher brain happy to think about how we can make film this way: as we explore a topic, we deep dive and find different rabbit holes, and we work with our interlocutors, or folks in the cast, to help us find those next levels of depth. That's a really cool way to think about making a film, and it's not an easy way to do it.

 

David Donnelly  03:34

Yeah, I did feel that that was the only way to retain any sort of integrity, as far as just asking honest questions. And also, the public deserves that. Clearly, a lot of the finished content that you see anywhere has an end goal, or has kind of a marketing aspect created before it's even complete. And that's understandable, because, you know, these films have to generate revenue. But we just felt like this one had to be made independently. So it was independently financed, and we really waited until it was completely done before we even started to think about how to get it out there.

 

Adam  04:11

Cool, and that makes sense. And I think the subject matter, the in-depth format that you take, and who you're talking to — different scholars, early investors in Facebook, a bunch of folks — I think that's actually a really good method for approaching a film with integrity: we'll finance it ourselves, especially because a lot of this is going to get into advertising. But then also, what does that mean for business models? So, one of the things I came away from the film with — no spoiler alerts here — is that there's an important theme of power throughout the film, right? In terms of who has the power to collect data, and what does that mean? So if you don't mind, for folks who haven't seen the film yet, give us a broad overview of what we're exploring in Cost of Convenience, and then let's talk a bit about power as one of the themes that comes through the film.

 

David Donnelly  05:01

Yeah, I think most people have a general idea that these platforms that we use in search, and also in social media, are free. And yet we're not realizing that there's a lot of data being collected, and that data is sold off to third parties and run through algorithms that can predict your behavior — your future behavior — in some cases more accurately than your own mind can. And the people that have access to that data have a tremendous amount of power and influence, because we don't have that same data ourselves. There's a lack of transparency. Carissa Véliz, who's in the film, wrote a fantastic book called Privacy Is Power; she is Associate Professor of artificial intelligence ethics at Oxford. I think you can see from some of the examples that she gave just how easily your data can be given to other people. I think people should be alarmed, and in some cases enraged, especially if you're a parent. People aren't understanding the gravity of this power vacuum. You know, if you're a teenager, your brain's not even fully developed yet, and you've got an algorithm that is influencing what kind of job you might want to pursue, where to live, what kind of university you might want to go to — or whether to go to university at all — who to hang out with, what kind of activities to do. I mean, it's endless. And these algorithms don't have, you know, empathy; they have just a binary success-or-failure ability. And we're seeing the consequences of that throughout our entire society right now.
And, you know, as a researcher, it's very challenging to accurately say that one thing is the cause of some of these things, like the mental illness and the increase in suicide rates that we're seeing. Causation is difficult to isolate when there are so many different things that people are doing, especially on their phones today. But at the same time, you could just look at some basic measurements and see that as screen time is increasing, and as these things are becoming more ubiquitous, the mental health issues that we're having, the polarization issues that we're having, misinformation — so many of the existential crises that we face today, I think, have a root in our relationship with technology. And there's not a lot of information about the costs of that. There are billions of dollars invested in marketing the benefits of these things, but the other side of that coin is less known.

 

Adam  07:44

Yeah, it's an important point, right? As you point out in the film, we can see it in the business practices of the tech industry, especially social media — Facebook, Instagram, LinkedIn — and then we have TikTok and a bunch of other platforms. Even Reddit kind of falls into this category; they're just going public right now as we're recording this, which will have huge implications when shareholders start dictating a bit more about what happens with the content. And I think the really interesting piece, which was mentioned in the film, is that there's sometimes a general awareness that data is being captured by social media platforms, but there are things we don't realize, and that you point out in the film: things like your Tesla measuring your weight in your seat when you're sitting down, or your TV when you watch a program. We think, okay, Netflix is watching how long I watch a program for, but then your television itself is also recording things and sending that to Sony or whatever the brand is. So it's this idea that we don't even realize how much data goes out from everything that we do. It's not just the, quote unquote, more expected case of, okay, Facebook's going to serve me an ad based on what I click on. It's way deeper than that. And I think the one thing that was interesting to me, that a lot of folks probably don't realize, is the depth of data collection across ubiquitous platforms. Your blender, if it's a smart blender, is calculating and sending stuff to SharkNinja, you know?

 

David Donnelly  09:08

It's absolutely disturbing to try to understand the scope of how we're surveilled, and how surveillance capitalism has become the engine of the modern economy. And we're just starting to be able to share this information, pretty much because some of these companies were forced to reveal the data they had. Unfortunately, by the time you see these trials happen, and legislators start to go after some of these CEOs, it's already too late; there's already a new threat. And so this iteration is going to keep happening, with us chasing it, until we can get ahead of it. And that's why this is so connected to anthropology, because it requires a massive cultural paradigm shift that we need to inspire — to get across to families and friends, and to come together as informed citizens to let the politicians know that we're not okay with this, and that we can still have a competitive economy and also have transparency and control over data. So many elements of this mark a major revolution in the way that we talk to each other, the way that we live our daily lives — how much this has been revolutionized by these internet platforms. We just haven't had time, our brains have not had time, to process these changes. And so we're living through this period of confusion. I think it's going to require a lot of clarity, and the only way to get that is through transparency.

 

Adam  10:44

Yeah, I think that's a great point. And it's funny that a typical business model in late capitalism today is built around this idea that you can't have transparency, because then you'll know my trade secrets or my IP, right? And somehow that's going to stifle innovation, or mean I can't make new products, or somebody else can copy me. And that, in essence, is a falsehood. In terms of why we can't have transparency, there are a couple of ironic things that come to mind. One is that a lot of algorithmic systems, and obviously AI today, are black boxes, to the point that at times, you know, even OpenAI doesn't have total insight into how their algorithms are working. Some super-expert data scientists might know most of it, but a lot of these systems do a bit of their own work once we plug them in and set them going. So given this point, when we talk about how we're creating parts of technology that are shaping social life, if companies themselves don't even have, or don't even take responsibility for, an understanding of how their systems are shaping things and doing their work — that obviously has huge implications, right? And so there's this question of transparency, where they say, we can't share our trade secrets, but then it's like, do you even know how all of your trade secrets work on the inside?

 

David Donnelly  12:01

That's a great point. And in many cases, the people that are creating it don't understand it. And in many cases, the kids in Silicon Valley whose parents create these technologies aren't even allowed to use the technologies their parents created, because they're not even sure exactly how it's going to impact them — which is a red flag. I mean, people should be a little bit concerned about that. Why are there so many tech-free schools in Silicon Valley for young kids? What do they know that everybody else doesn't? And then, like you said, it is a falsehood, I think, because as Roger McNamee says in the film, data is not like an organ; you can't just give your personal data away. If we were to have more control over that, you could choose to give your data away, and then maybe you should get compensated for it. That's how I think it should be, and we can still have the ability to choose. But we need an informed citizenship in order for democracy to work, and when that's removed, and things are getting taken, all of that transfer of wealth is what's changing this power dynamic. It's very lopsided, and I think we have to fight for it to be more balanced. And the other side of this is how it's changing our day-to-day lives, how we've lived for so long, and how dramatic that is. I mean, Dunbar's number, I'm sure you're aware of it — we're just not meant to have thousands of friends, or to feel like we need thousands of friends.
That's just one of the countless ways. Friendship is this evolutionary mechanism that was really important when you were living in a village somewhere, and if a stranger came in, that stranger was a threat. So our brains developed to recognize a certain number of faces. And now suddenly we're forced to feel like we have to have all these relationships, and we kind of start to, you know, break down. You can see that happening all the time. And you can look at dating — the whole way of finding a partner, when suddenly you have to swipe, and it's just visual; there's not this hormonal exchange occurring. A good example of a company taking advantage of this reverse business model is Hinge, because we realized, way after something like Tinder was created, that those apps were designed for you not to find a partner. If you do find a long-term partner, you're going to stop using the app. So of course they don't want you to find a long-term partner; they want to keep you guessing and finding somebody else. So I think, ultimately, the future of this — especially for the people who are going to be harnessing AI — is going to involve finding ways to revert back to some of the more natural ways and put those two together somehow. I think that's going to be a huge success, when people can start to do that.

 

Adam  15:04

You know, it's a great point too. And in case folks aren't familiar, Dunbar's number refers to this idea that we have wired into our brains a certain number of people, about 150, that we can know on an intimate level. Once we get beyond that, it just, you know, breaks down. Think about that: if you have an Instagram feed or a Facebook feed, how many followers or friends do you have? Probably more than 150. And that's, to your point, why it makes us a little bit crazy, right? A little bit scattered. And it's an interesting point, because you mentioned surveillance capitalism before. The other side of that, which we hear talked about kind of on the positive side, is the attention economy, right? That's where audience capture comes in — this idea of, I want to pay attention to this, listen to that, and, you know, a nice personalized ad that makes me buy some shoes, or whatever it is. But I think there are two important sides of this coin. One is that people are basically monetizing attention; that's what tech companies are doing. But the flip side is this notion that you're talking about. Shoshana Zuboff came up with this idea of surveillance capitalism — that's the economic model we're operating under now. And those two words don't sound fun together, right?

 

David Donnelly  16:10

I mean, as we talk about the attention economy, the big problem with what's going on is that it's interfering with our ability to think critically. And as a result, it's hard to hold two ideas in your head at the same time. So if you're talking about the consequences of some of these platforms, people immediately think you're a Luddite — like, you know, you might be Amish or something like that. And that's just not the case. I think we have to acknowledge that technology is amazing, and it's great in so many ways; there are so many benefits. Look at AI and what it can do to detect something like cancer. There's an endless amount of positive benefits. At the same time, it can be harmful. And so we have to be able to hold these two things in our head at the same time, and we're seeing that less and less often.

 

Adam  17:09

Yeah, that's such an important point, because we're kind of brain-drained, right, from getting so much information all the time. And it's funny — I don't know about you, but I struggle with this every day. I'm like, okay, I'm just going to check my emails, and then suddenly it's 35 emails, and my brain is like, okay, now I'm tired, and I still need to do work. I've just taken in too many things right away. So even these very simple everyday things give us this kind of brain drain — or decision fatigue, that's the other one, right? The very common thing of, what should I watch on YouTube or Netflix? And then you end up watching nothing, and you just scroll for 30 minutes. And that's an interesting thing to think about too: what are these technologies doing? We need to be able to think critically about them. What does this mean for society — both how we act with one another, and how we approach something much bigger, like democracy? Especially because we're in an era now where we have a lot of complex problems that need nuanced solutions; we need critical thinking to deal with them. So how do we cultivate more critical thinking skills to navigate this? Have you seen some spaces of hope, or folks who are finding ways around that?

 

David Donnelly  18:19

I think one of the many answers lies in the way that we approach problems and the way that we approach history. In the past, there were a couple of big ideas that really dominated the world, whether it was communism versus capitalism, or Christianity versus Islam, whatever it might be. Now there's a countless amount of ideas populating people's minds, and now that you can reach anybody, there's an endless amount of niche audiences and different ideas. And so you see somebody like Jared Diamond or Yuval Noah Harari, and they're going through history and putting together pieces of the puzzle. I think that really is an approach that's going to be effective in the future — a big history approach, combining a lot of different fields to show what's happening. Anthropology is certainly a part of that, but you've got to be able to combine it with technology, with science, with history. And that's what we tried to do with the film: take this big history approach to show how all of these come together — biology and psychology too — and then put that together and see how it's all impacting us. That big history approach, I think, is how you're going to see more stories evolve. With every major revolution, there's a revolution in the way that we tell stories; that goes all the way back to moving from hunter-gatherers to agriculture. And I think what we're seeing now is a need to start telling stories that put together pieces of a puzzle and help us figure out something that has grown increasingly complex, rather than just focusing on one of those elements. And it's still important to have that deep dive into one thing, right?
Yeah, I mean, there's no shortage of documentaries that focus on, for example, the Facebook hack, showing that we can't allow it to happen again once that data is out — or on social media itself, or on the mental health crisis that's occurring. But there aren't too many things trying to put all of those together, because honestly, it's a tougher sell for people to watch. I'm hoping that changes, and that we can inspire people to start wanting to learn this way, which I think is also very fascinating — to see how everything connects together. Yeah.

 

Adam  20:50

And I think that's great, and I 100% agree. It's interesting, because we do see those kinds of arguments echoed in universities and higher education, as people ask: should we be siloing anthropology and biology and sociology and history? Obviously you can take classes in different areas, and there obviously are linkages between them, but the way that we've siloed education in the past is worth noting. We chose those silos to put knowledge into; that was a choice, and we can choose not to do that — we can choose to do something else. So I appreciate your point that, as we approach larger, complex problems, we need more complex answers and more complex solutions. And I agree that part of that is being more interdisciplinary in how we explore them.

 

David Donnelly  21:35

Well, I think there was a need for those silos. You wanted a pilot to be the best possible pilot and know every single computational element on the plane. But as we shift more and more knowledge onto devices and computers, we're going to be forced to use the critical side of our brain even more. And that requires an understanding of philosophy — there's just so much more. I think we're going to be forced to reexamine our entire educational system. Right now, kids need to learn how making a single post can change the rest of their life and potentially hurt their career. That's not something we're teaching, and that's not the teachers' fault. There's an amazing group of people in the United States and across the world who take a path of trying to inspire people, which is very hard to do, with very few benefits anymore for doing it. But we have to rethink what information, knowledge, and mechanics of thinking we want future generations to have, for our success as a society, as a nation, and even as a species. Those are the kinds of questions we should be constantly asking, in a healthy way — not a political way, not a partisan way. I'm hoping this is an opportunity for everyone to come together, which is why we deliberately did not take any kind of political route in the film, because that's not what this is about. This is about the fact that we are part of something larger than ourselves, and we're all living through this revolution, and we need to have conversations about that as humans. Yeah.

 

Adam  23:24

Yeah, 100% agree there. I think you raised a ton of interesting points and really important issues. One of them that I find interesting: today you have to say there's no political agenda to this film, that this is actually about something bigger than all of us, right? How does that reflect on how we've been telling stories, and also on what we expect as an audience? We expect there to be some kind of spin when we're consuming content or media, and that's an interesting thing for us to recognize too. Again, thinking of critical thinking: how am I consuming media? Not just what data am I giving away — that's one side — but what do I expect when I'm getting it? To your point, it's like, okay, is this a lefty film? Is this a righty film? What are we trying to do here? And it's like, no, we're actually trying to present an encompassing perspective on the cost of thinking these platforms are free, and what that might mean for how we behave and engage with one another interpersonally. So I'm curious about your thoughts on that. I don't know if it's a film-reception question, but has this come up? Were people expecting that?

 

David Donnelly  24:25

Exactly, yeah. Because the easiest way to market something is to divide the public into segments and then target one of those segments. And as you can see from the film, we don't have a clear demographic, because we're trying to deep-dive into all of these different things that connect us as humans. We did get a lot of flak from agents and distributors early on, who were kind of like, well, you could change it, maybe make it a little more this or that, because the documentaries that do really well are the ones with political angles. Then you can just talk shit about the other side, and it's an easy sell, because that audience is very tribal on both sides. Because we didn't take that route, it did eliminate some of the earlier opportunities we had. But once again, we really wanted to retain the integrity of the story and the importance of the issue, which is why we financed independently. I think the polarization aspects and the cancel-culture aspect all really connect to the way that we're connected to and influenced by these algorithms. Because as Roger McNamee points out in the film, if we can just step back and get out of our own heads, we can know that when we see something that elicits an emotional response in us, that was deliberate. It's activating our fight-or-flight system, and it wants you to either click on it because you disagree with it, or click on it because you agree with it, and then share it because you're a member of this tribe. And that mindset is based upon a binary function.
And that's what allows these algorithms to be successful. They just want a click or not a click, and whether you hate it or love it doesn't matter. They just want you to have an emotional reaction to it. And so, as we're digesting this data and thinking about these things, it's always important to pull back and understand how this is impacting our actual brains.

 

Adam  26:40

Yeah, right. And to your point, it's about making the business model behind that visible, understanding that what it's doing to us is a secondary effect of what the business model is trying to accomplish, which is click or not click. That's a stark thing to realize, right? If it had really positive effects on all of us, would we feel differently? Maybe, but maybe not, if we didn't even recognize it. A business model has a huge effect on how we feel and think and act. What does that mean for the control or power we do or don't have? And then how does that affect, because we can think about this, how you and I engage, how we engage with friends, with neighbors, with someone politically different from us? How might we engage with an idea as abstract as democracy, which we all participate in here in the US? There's this question of what role algorithms and these platforms play in shaping that. To your point, we've seen the Cambridge Analytica scandal with Facebook, and the recognition of this giant spike in mental health issues, especially among teenagers who've been on these platforms their whole lives and weren't alive before them, for better or for worse. You know, I remember Facebook came out when I was in college, but I didn't get

 

David Donnelly  27:50

Yeah, right. We're about the same age. You know,

 

Adam  27:54

and I remember it required a .edu address for a number of years, so it was a school-only thing for a while. And I refused to join until my friend finally signed me up, like three years in or something. But there's this interesting point: I can remember the first time somebody said to me, just be aware of what you post, because it could affect, again, a job, an employer looking for something. And I was like, what? This was, like, 2010 or something. But realize, now there's so much more posting and so much more happening,

 

David Donnelly  28:24

Not only that, a lot of these employers have automated software that scrubs the internet for information on potential employees and gives them reports. It's absolutely crazy how much data is constantly being transferred in just our daily interactions, and the average person has no idea how much they're giving away throughout that process. One of the things I wanted to ask you, from an anthropological perspective: when we're under surveillance, when we know that we're being surveilled, as Chris talks about in the film, it goes back to this concept that we're on the savanna being watched by a predator, and we feel like we're being hunted. If you know you're being watched, you're going to behave differently. Like the panopticon, right? This idea of a prison designed around being watched. I think we're living in one of those right now, and that's what's causing this lingering sense of anxiety, because we know that we're kind of always being watched or listened to. How do you feel that impacts us on a larger scale? Yeah,

 

Adam  29:31

That's a great question. Yeah, it is this idea that, with the panopticon, the prisoners can be seen but they cannot see the person watching them. It's the perfect example of what we're talking about here: technology always capturing our data while we never really quite know. And that's actually one of the main things we have to think about as researchers going out into the field, and as documentarians, too: us being there will change the behavior of the people we're talking with. You see this in user experience research, in any kind of corporate research, any kind of academic research. You're supposed to be aware of how you ask your questions and what you're asking. Are you leading somebody on, or are you asking an open-ended question? How does your being there matter? What is the power dynamic in that place? So yeah, I think that's a huge piece at play. What's interesting is thinking about that tension of feeling like we're always being watched, and that leading to some level of latent anxiety. It's basically the idea that anything I do on a digital device could get me into trouble at some point. That's a weird feeling, right?

 

David Donnelly  30:33

Yeah, yeah. Everyone feels that. Or just making a post and then realizing, God, how stupid was I, what was I thinking, or nobody liked it, or it just comes off wrong. Or if you're like me, sometimes you're awkward, and you write something and then, in retrospect, it didn't land. We all have those feelings. And also, we just don't behave the same way online as we do in person. We 100% don't. So this is causing the cycle of confirmation bias and tribalism to keep escalating, when what we need is the complete opposite. That's why we have to have some kind of cultural reset and paradigm shift, in order for us to not be completely, you know, destroyed.

 

Adam  31:24

Yeah, it reminds me, I was talking to a biological anthropologist, David Samson, about this on an episode a few months ago. He has this really provocative book called Our Tribal Future, which is interesting because we tend to think of tribalism as a bad thing, and in a political sense it is dangerous, especially today. But the tribal instincts we have as people can actually be quite powerful, to your point: things like intentional proximity with people we care about, being face to face, the idea of having your "fire team," as he calls it. Who are the five people you can call when shit goes down? If you need help, if your baby's sick and your wife's out, if I broke my leg and need to go to the hospital, whatever it is. There's immense data pointing out how much that gives you a baseline of well-being, because you know that if you need it, someone's there for you. So, to this point, the way we act differently online is a huge challenge, because we do see the rise of toxic fandoms, toxic cultures, these very intense communities based around hatred. If you put those people in a room together, actually talking and communicating, it's very hard to sustain that same level of hatred and vitriol. It takes work to get past it at this point, which is one of the unfortunate parts, because it calcifies in people. But there are these ideas of recognizing when we can activate the positive aspects of our human capacity for being with others.
And, you know, understanding that we have things like, again, you mention this in the film too, limbic resonance and mirror neurons: when you smile, I smile. We can feel good together, and it humanizes us to one another. That's one of the ways we recognize, not only

 

David Donnelly  33:07

Not only that: as Dr. Kesh explains in the film, it's actually hormonal regulation. We need those experiences in order to have proper releases of serotonin and dopamine. So we need that feeling of having a secure conversation with somebody in a safe space. That's why tribalism does have positive aspects, which goes back to the idea that two things can be true at the same time. Look at sports, which is a great example, or look at business. If you're on a team with somebody else, it creates competition, and competition is good for everyone, especially in America. It pushes us further toward our limitations and boundaries, endlessly forcing us to innovate. That's a fascinating concept for a book; I really would want to read that. The other side of it is what happens when you have too much of something. What Roger explains in the film is that someone hijacked the way our brains work, all of these things that evolved for positive reasons, to increase our chances of survival, and took those mechanisms and used them to extract data and manipulate our behavior. And if that sounds like a sci-fi film or Black Mirror, well, if you're a fan of Black Mirror like I was, you watch those episodes and think, okay, some of that stuff is not so crazy. And it's tough, because I wasn't setting out to make a dystopian film. I'm very optimistic about the future. The mission of what this really is, is a call to minds, to get people to come together and say, look, all this division that we see, all this polarization, all this hatred:
what if this is the opportunity for us to come together and have a conversation about what parts of the human experience we want to preserve and fight for? That's the conversation I want to be a part of, and one I think everyone should be having, in our schools and our universities and our households, instead of the mindless shit that is fed to you all day long. And in order to have that conversation, you have to have the vernacular, and you have to have a basic understanding of how these different things work. Yeah,

 

Adam  35:33

No, I think that's a fundamentally important point, and I agree. I was laughing about this; I was listening to a podcast with Nora Bateson, the daughter of Gregory Bateson, the 20th-century anthropologist and, I guess, kind of dilettante. He dabbled in a bit of everything, to the point of working with an organization that injected dolphins with LSD (he didn't do it himself) to see if they could psychically communicate with dolphins. But he's also a father of cybernetics in terms of the thinking process. And Nora Bateson said this thing on the podcast: when she was being sent to school, her father would weep, because he was like, they're going to destroy your mind. They're going to teach you very basic things and teach you not to think. I'm not saying this to shit on schools, but to your point, one of the things we do need to think about is: what's the purpose of an education, especially today? On the one hand, as we've been saying, we do have to rethink the disciplinary boundaries we've used. They are useful, but we can also think about how we need to transcend them. The other part is what you're saying here: how do we ask questions about what it means to live a good life? What is it we want out of life? What if education helped us at least inquire about that? That's a powerful question, I think. Because even the idea of critical thinking got discussed, if I remember, maybe in my Anthropology 101 class, but not really in depth until later, in graduate school.
So when is critical thinking actually taught? Because we're seeing, as we know, the challenge against the humanities and social sciences, with some politicians saying they're not important arenas because they don't make money or help you get a job. This, I think, is the paradigm we have to question: what's the point of education? Is it just to get a job? Vocational school is totally fine, but is that what everybody needs to be doing, or should be doing? There should be more conversations about what we can use education to do. That can be vocation, that can be critical thinking, but having these bigger questions be part of it, too, would be a huge leap: just clearing the space to ask, what does it mean, what do we want out of life?

 

David Donnelly  37:38

I couldn't agree more. There's a huge need for philosophy right now. Imagine you're developing software that's going to be used in navigation systems for planes: how do redundancy problems get handled? Who has control over what happens in certain situations? These are huge questions. You can't just allow the people who are creating technologies to also be the people who are creating the power dynamics in those technologies, because they're effectively forming the future of civilization, and that's a conversation all of us should be a part of. When it comes to education, I was very fortunate. I was in a program called International Baccalaureate, which was an amazing experience, where a lot of it was about asking tough questions and thinking critically from a very young age, really heavy on reading. It opened my mind to the world; we read things from all around the world, and it got me really interested in culture at a young age. And I think now we do have to ask questions, because there's so much information available that was harder to get to when we were growing up. We're the last of the analogs, as I like to say; I can still remember having to look through an actual encyclopedia to find things. Once again, two things can be true at the same time. It's great to have access at our fingertips to all the information that's ever been collected in the history of our species. At the same time, we have to understand ourselves, and start educating younger people that data and information are two different things, and information and knowledge are two different things, and knowledge and wisdom are two different things, along with what each of them is and how to utilize them.
That's something an algorithm can never teach us.

 

Adam  39:34

Yeah, that's really important and exactly right: data, information, knowledge, and wisdom are all different, and they take different amounts of time to acquire. You can't, like, hack your way through ChatGPT to get wisdom. "Distill Socrates for me" doesn't mean you're wise; you just got some information. To that point, in terms of raising people's consciousness and awareness, one thing you brought up in the film, which I thought was brilliantly said, was this idea that what we think of as a search engine is actually an advertising engine. Recognizing that, again, is making the business model visible. And the challenging point that was raised alongside it is that we use search engines to fact-check things. Think about those two things together: it's an advertising engine, and I'm using it to check facts when I look up something I read. I'm curious about your thoughts on this, from when you put it out.

 

David Donnelly  40:27

I was really fascinated, because there are two people who focus on that aspect in the film. When Roger was telling us the story behind Google: obviously the founders are clearly genius-level guys, but they came together very idealistically, wanting to create this thing that, they said, could never have advertising in it, because if a search engine ran on advertising, it would lose its integrity. That's in their original paper; you can read it. So why did it change? Because they raised all this money, and once you raise all that money, you have to find a way to make it profitable, because it was a for-profit, not a nonprofit, venture. As a result, the shareholders started leaning on them, and of course they introduced advertising into the equation. And now you've got the highest concentration of wealth in the history of our existence, and the fastest accumulation of wealth, in these tech sectors. The other person who does amazing work in this field is Dr. Safiya Noble, who wrote a book called Algorithms of Oppression. And it's really quite terrifying when you start to see that algorithms are not treating everyone the same. There's a lot of discrimination occurring between people of different races and different classes. Divide and conquer is what these algorithms do; that's how they have success as algorithms, getting the clicks or whatever their function is supposed to be. For example, this was leading minorities to buy homes in the same subdivisions, away from primarily white subdivisions. It's called digital redlining, and that's just one example.
It's pushing people into certain universities, or away from the decision to go to college; whether or not to buy a house; who to get a loan from at the bank; what kind of interest rate. Now it's used in, God, there was something a couple of weeks ago I read about algorithms used in prison sentencing, and how a critical error resulted in all these problems. It's an endless amount of influence over very serious aspects of our lives. And I think, once again, it goes back to transparency: there has to be more of it in order for this to balance out.

 

Adam  42:52

Yeah. Those are some great examples to have us think about, because we put a lot of trust into these systems. And to the point of how they function at a scale we've never seen before: before, if you had one police district that was racist, that was not great; but if police are using a system developed by some other company, which can then be sold around the state or beyond, that's a huge problem. And then on top of that, they say, well, the system said it, I didn't say it. Okay, but you're putting all the people of color in one community when you're selling houses. That's a problem to think about. The scale also makes you ask, okay, how do I even approach this? But I think that's the important point to realize and remember: algorithms are made by people, and this is where biases come in. It's not that someone necessarily programs a racist sentiment into a device; they're capturing representations from the internet, and if they don't filter them or tag them in certain ways, the system just takes it in and spits it back out. ChatGPT is just an applied statistics machine throwing back the most likely next word, which is why it makes things up.
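Adam's "applied statistics machine" description can be sketched with a toy model. To be clear, this is not how ChatGPT is actually built (that involves a neural network over tokens, not word counts); the corpus, names, and functions below are invented for illustration. The sketch just shows how always emitting the statistically most likely next word echoes whatever the majority of the training text says, true or not:

```python
from collections import Counter, defaultdict

# Toy "internet text" the model learns from: the majority claim is false.
corpus = (
    "the moon is made of rock . "
    "the moon is made of cheese . "
    "the moon is made of cheese ."
).split()

# Count which word follows which (a bigram frequency table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Return the most frequent word seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

# Generate by always emitting the most likely next word.
words = ["the"]
while words[-1] != "." and len(words) < 10:
    words.append(most_likely_next(words[-1]))

print(" ".join(words))  # prints: the moon is made of cheese .
```

The model fluently "makes things up" because "cheese" follows "of" twice in the corpus and "rock" only once; frequency, not truth, decides the output.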

 

David Donnelly  44:01

And then we saw what happened when they tried to do the opposite, with, what do they call it, God, the "woke bot" or whatever, that created all those imaginary figures. Yeah, it's a tough situation. Because, once again, the thing that makes us human is that we have emotions that are not binary. "How do you feel?" is not a question of one to ten. But that's what an algorithm is: it's numbers, and they're binary, ones and zeros. And that's what people forget. As we're having this conversation, as you're clicking on your phone, everything is ones and zeros. That's why I think there's so much need for philosophy and for these kinds of major conversations, because we might want to leave parts of what we consider to be human behind us and evolve into something else, and we also want to preserve some of those aspects. But we have to start selectively figuring out which things, so we can continue to improve as a species and as a society.

 

Adam  45:12

And who are we letting control those conversations? If we're letting the OpenAIs and Googles and Facebooks, CEOs who are all white men, and ultimately the shareholders behind them, make those choices for us, we have to ask: who are we letting decide what the next iteration of humanity is, or should be? That's the conversation, to your point, that needs to involve everybody, right?

 

David Donnelly  45:36

Well, the one analogy that I think is important, and I know we don't have much time left, is one we like to use: if you want to drive a car, you have to get a license, because cars are amazing but can also be very dangerous, as we know, and there are rules and regulations for what you do behind the wheel. Now, if you want to go online, whether through your phone or whatever, anybody can do it. There's little information about how it actually works and hardly any regulation, and yet the dangers can still have life-altering consequences. That's absolutely crazy. There should be a basic understanding, and mandatory education that is technologically and digitally oriented, for people's own safety and for the safety of others.

 

Adam  46:35

Yeah, and this idea, to come back to our Dunbar's-number thinking: if you know the people you're giving a cake to, you're going to be concerned about giving them food poisoning. But if you're just making that screen, making a device, and you never see who gets it, you may not think about it the same way. So there's this other point: accountability and responsibility should also be part of this. You have a responsibility as a driver, and accountability: if you get into an accident, you can be at fault, you can hurt somebody or yourself. It's interesting that that sort of thinking doesn't always apply to businesses. I think that is, to your point, a huge area we have to bring into the conversation, too. If you're going to make things, at any scale, there's a level of responsibility you have to carry, and it cannot be offloaded through the terms of service I signed as a consumer, a giant 48-page document that I also can't realistically read, which basically just says the company is held harmless. Like, you can't

 

David Donnelly  47:34

Yeah. The power dynamic that we opened the conversation with is very important here, too, because with a lot of the technologies being created, it's a new luxury to not have them in your life: kids going to tech-free schools, adults going on holiday to places where there are no phones. Whereas if you're a mom working multiple jobs and you come home exhausted, it's very easy to give a kid an iPad. So we're starting to see this transition where freedom from screen time is associated with wealth. Technology is becoming a tool to extract data from those who don't have as much freedom in their lives to not use it. And I think that's a trend that just can't end well.

 

Adam  48:37

Yeah, that's a brilliant point. It's reinventing social inequality in what feels like a very weird way, right? I cannot put down my phone, because I have to either survive or look for work or whatever it is. Exactly: I don't have the freedom to not use the devices. Yeah,

 

David Donnelly  48:54

I mean, for somebody who doesn't answer an email the same day, that could potentially cost them their job or cause a problem. Or if they're a gig worker and there's an opportunity to suddenly go out and drive for Uber or deliver food, there's that constant pressure to be on. And so, once again, it comes back to basic decisions: what elements of being human do we want to fight for and preserve, and what elements do we think we can work on?

 

Adam  49:22

Yeah, well, I'm super excited for the next conversation on this, too. So, one thing: as I read it, "Cost of Convenience" is the first in a trilogy that you're making. So what's the next story there?

 

David Donnelly  49:35

Yeah, so the way we looked at it is as kind of a mind-body-spirit trilogy. "Cost of Convenience" is the first big deep dive, into how the merger of technology and culture is impacting our minds. The next one, which we've already started filming and have been working on for the last year, is called "Forever Young," and it's about how technology is impacting our bodies. What we've been doing is interviewing top scientists from around the world who are focused on aging research, and we're quickly learning that there's a massive revolution occurring in aging science that is going to have all kinds of consequences we aren't prepared for. People are going to be living to 120 more frequently; life expectancy is going to continue to increase; and everything from Social Security, to the way our economies are structured, to healthspan versus lifespan, to equal access to technologies that can detect early tumors and help us live longer, healthier lives; the list goes on and on. So that's the next exploration. The third part of the trilogy is called "A History of Consciousness," which we have not started filming yet, and it's about how technology is impacting our spirits. It goes into spiritual technology, mindfulness and meditation, as well as how artificial intelligence is becoming more and more a part of our lives. It asks the question: if we start incorporating these technologies into our lives, at what point are we no longer human? It explores the idea of speciation, the human race potentially splitting into different branches, one with people who have access to certain technologies or choose to become cyborgs, and others who do not. We explore uploading digital consciousness to the cloud. Transhumanism is a big part of that story. So

 

Adam  51:43

Well, okay, that all sounds awesome. I'm super excited and enthused by all of that. So let's talk about the next film soon, too. Yeah,

 

David Donnelly  51:50

yeah, I appreciate it.

 

Adam  51:51

I imagine, and I'm just thinking, you may already be doing this, but especially for the consciousness one, the oncoming revolution of psychedelic psychotherapies combined with technology seems like a very interesting frontier. What's happening in that space?

 

David Donnelly  52:07

That's a big part of it. We touch upon that in Forever Young, because we talk about what people are going to do if there's going to be a pill that can solve virtually any problem. It's only a matter of time before we start utilizing psychedelics to help with our emotional issues. So it's like, how far do we take that, you know? And then A History of Consciousness, I think, if I could describe it as anything, it's pretty much an acid trip.

 

Adam  52:31

Yeah, that's a trip. Cool, David, thank you. This has been a great conversation. I appreciate you rabbit-holing with me across the idea landscape, and getting the film out to folks so they can check it out. That's good feedback to pass along your way as well. Thanks for making the film; I really enjoyed it, keep doing good work, and I'd love to talk again.

 

David Donnelly  52:49

Yeah. People can see the film at cost of convenience dot film.

 

Adam  52:54

Cool. Yeah, I'll put the link for the film in the show notes, folks. Awesome. Well, thank you so much. As we wrap today's journey through the tangled web of technology, culture, and humanity, I want to extend a huge thanks again to David Donnelly for offering us this front-row seat to the making of The Cost of Convenience and sharing insights that both challenge and enlighten. Today's conversation has taken us from the rapid acceleration of technology adoption during the pandemic to the profound impacts of surveillance capitalism on our daily lives and the very essence of human connection. We've uncovered the stark reality that our digital footprints are far more than mere echoes in cyberspace; rather, they're commodities traded in the bustling marketplaces of data brokerage, shaping everything from the ads we see to the political narratives that shape society. David's work, taking us deep into the complex dance of documentary filmmaking and anthropological inquiry, exposes the raw nerve of our digital existence and helps us think about what it means to live authentically in an increasingly algorithm-driven world. So as you go about your day, glued to your screens large or small, take a moment to reflect on the cost of that convenience. Are we paying with more than just our wallets? Are we sacrificing a piece of our humanity for the sake of efficiency and entertainment? Let's not forget the power of human connection, the warmth of a smile that no emoji can replicate (as much as we love them), and the critical thinking that can flourish away from the glow of the screen, or at least away from an ad-driven experience. And of course, I highly recommend you check out David's film, The Cost of Convenience; you'll find it linked in our show notes below. Let's continue the conversation both online and off. Be sure to drop your comments on our Substack blog and/or on social media.
Thank you once again for tuning in to This Anthro Life, and until next time, keep exploring, keep questioning, and most importantly, keep connecting. Remember that in a world eager to digitize and categorize us, our human experiences are perhaps some of the most important things we should not give up. Again, we'll see you next time.

 


David Donnelly

Director/Filmmaker

DAVID DONNELLY is an American filmmaker renowned for his impactful documentaries in the classical music realm, notably his award-winning debut, Maestro, featuring stars like Paavo Järvi, Joshua Bell, Hilary Hahn, and Lang Lang. Translated into multiple languages and broadcast worldwide, the film is widely used as a teaching tool in music education. Following Maestro, Donnelly directed Nordic Pulse and Forte, completing a trilogy that offers an unparalleled glimpse into the world of classical music. His work, newly relevant amid the invasion of Ukraine, includes narratives on Estonia's Singing Revolution, showcasing the depth of his storytelling. Donnelly's films have been screened at prestigious venues like the Whitney Museum and the Kennedy Center, underlining his standing in both the art and film communities. In 2021, he co-founded CultureNet and announced The Cost of Convenience, the first in a new trilogy exploring technology's cultural implications. Donnelly's career extends beyond filmmaking: he is a sought-after speaker, sharing insights from interviews with thought leaders across more than 30 countries.