#002: Curate or Be Curated – Bailey Parnell on Social Media, AI, Mindfulness & Control

Bailey Parnell
Bailey Parnell is the founder of the Center for Digital Wellbeing and a recognized expert in digital wellness, media literacy, and mental health. Her TED Talk on the effects of social media has garnered millions of views, and her work focuses on helping individuals and institutions navigate technology in healthier, more intentional ways. She frequently speaks about online harm, digital boundaries, and the intersection of tech and emotional well-being. Bailey is also the founder of SkillsCamp, a soft skills and leadership development company.
In this conversation, Bailey Parnell explores the concept of digital wellness and the idea that social media should be considered a risky behavior. She outlines four major stressors tied to social media use, including FOMO, online harassment, and the pursuit of validation, and discusses how these platforms affect mental health across all age groups. Parnell shares strategies for regaining control over your digital environment and emphasizes the importance of building a fulfilling offline life. She also discusses the illusion of privacy online, the role of government regulation, and the ethical responsibility of tech companies to prioritize user well-being. The conversation expands into the future of AI, with Parnell highlighting its potential to support soft skill development and mental health, while cautioning against unchecked bias and corporate influence. She offers a vision for a more mindful relationship with technology and explains why digital habits are deeply connected to our sense of self, community, and purpose.
Siara (00:01.382)
Bailey Parnell, thank you so much for being here. I'm really excited to have you on the show. I came across your TED Talk a while ago while searching for answers about social media and stress and how it all relates. And I just really resonated with everything that you had to say, and I had to have you come on here and share your wisdom. So thank you for joining us.
Bailey Parnell (00:20.268)
Of course, yeah. Happy to be here and talk about how it's all evolved since that talk.
Siara (00:25.21)
Yes, yes, it's only been a couple of years, but I feel like so much has changed. So let's get into it. First off, how did you become so interested in digital well-being? What was your journey here?
Bailey Parnell (00:39.214)
Well, I'm kind of a crossover generation like yourself, I think, between millennial and Gen Z. And so I'm just at those years where my early youth was without mobile technology and social media being what they are today, but they very much started proliferating through society in my high school years and then into my young college years. And then, of course, that just became even more so.
I was working in a university and I was working primarily in student affairs, which is kind of everything outside the classroom that supports student success. And half of my role was working with them to build things like emotional intelligence programming and resilience with health and wellness departments and then career centers and learning support and all these departments, right? But the other half of my role was very much social marketing and digital marketing.
Storytelling and digital student engagement were part of this as well. And so we were using, at the time, mostly Twitter and Facebook, and then Instagram was popping off. And we were like, okay, marketers seem to have figured this out. How can we use these same tools to actually engage students in their education,
especially when they're not here, because that university was a commuter school. So 96% of our students lived off campus. And we're thinking, okay, this is amazing. Why it's amazing is because it was just a little bit of luck, opportunity, and preparation. If we weren't at a university that was a city school that was primarily commuter, maybe we wouldn't have had as much incentive to understand how social media could actually help our goals. And then, dun, dun, dun. But then, of course,
Siara (02:25.076)
Mm.
Bailey Parnell (02:30.348)
with the proximity of health and wellness departments, I mean, one time I remember they came to me and they're like, hey, it's social media week, or it's mental health week, and you're the social media person, can you talk about how they connect? And we were like, yeah, absolutely. At the beginning of that TED Talk, you also saw that I had been going through, this was all kind of coalescing at the same time, I went on my own trip.
On this trip, I went without mobile and without social, and I was kind of putting my phone away. And I realized that I was having what in the TED Talk I call phantom vibration syndrome, where you think your phone is going off, you sometimes literally feel it, and you check and it hasn't. And I was having sort of subconscious reactions to my phone and to social. And I don't know about you, but I like to be in control, and in control of my brain. And so I was just thinking to myself at the time,
Siara (03:18.515)
Yes.
Bailey Parnell (03:23.372)
So you now know that I was working in social marketing and I was studying social marketing. My bachelor's was in media production and my master's was in communications and culture. So I was literally working in it, studying it, and personally using it. And I just thought to myself, I'm supposed to be the person who knows what's going on here. And this is really starting to affect people's mental health. At the time, just a little bit of research was coming out about
Siara (03:43.607)
Mmm.
Bailey Parnell (03:51.704)
how Facebook affects teens, if you can imagine that. And so I was kind of like, well, I should be the person that knows what's going on here. And that led us to doing some research within the university about even just simple things like what do our students like most and least about each platform. And that sort of spiraled into, well, if I'm doing this research anyways, I may as well get a degree out of it. And that's the master's. And then of course, it's just snowballed from there.
Siara (03:55.338)
Yes.
Siara (04:14.665)
Nice.
Okay, so I think something I can relate to you on is being behind the scenes a bit with the social media and the marketing, and seeing that even when you know how it's all working, even when you're acting as a bit of an Oz there, you are still impacted by it yourself. So you're like, if I'm feeling this way, what does it feel like on the other side? So I really...
Bailey Parnell (04:25.614)
Hmm?
Bailey Parnell (04:36.269)
Mmm.
Bailey Parnell (04:39.574)
Yeah. Having an understanding of it and the way that it works is an important first step. Don't get me wrong. It is actually a very important first step. And when you're in marketing, and you are in social marketing particularly, you do kind of understand the behind the scenes. But what you'll find is that a lot of these folks will still go home and still be users themselves.
Siara (04:49.495)
Hmm.
Bailey Parnell (05:03.404)
So there are sometimes different strategies, you know. For these folks, it's like you never get off, because sometimes their personal time and their professional time are tied up with each other. And that's still very much digital wellbeing, but a slightly different conversation than if I was just talking to someone who is using it for personal entertainment and happiness and fulfillment and is not getting that.
Siara (05:25.471)
Right, yeah. So do you think that, how do you strike that balance then? Being someone who's active on social media, being someone who has this consistent presence, but someone who's also very aware of their own internal digital wellness, do you have certain ways that you split those two lives?
Bailey Parnell (05:46.092)
Yeah, yeah, I do have digital wellness practices, absolutely, that are just a part of my life now. Here's the thing: we've centered on social media, and I know why we would, of course, even just from my own research, but digital wellbeing goes beyond social media and even beyond mental health, right? Digital wellbeing is how all of these technologies, you know, things like computers and
things like how we're filming this podcast right now.
Bailey Parnell (06:24.77)
Sorry, I lost you.
Siara (06:29.57)
I can't hear you. You've cut out slightly. I'm sorry. I think I lost you for a minute there. You were saying digital well-being and computers.
Bailey Parnell (06:31.288)
Okay, yeah, no, I think I lost you there.
Bailey Parnell (06:40.482)
Yeah, the irony is just like that, where the internet gets in the way. Digital wellbeing is about how all of this technology affects the things that we are trying to do as humans. That includes: how does it affect our democracies, our societies, our relationships, our marriages, all of these sorts of things that we want to achieve for ourselves as humans? How did it just affect, in real time, our ability to have a meaningful conversation?
Siara (07:01.806)
Mm-hmm.
Bailey Parnell (07:10.594)
This is digital wellbeing, right? And how does it keep us well or how does it not? And so for me, it goes beyond just social media, but I do absolutely have social media practices. Like I'm very serious about who gets to enter my digital world and who gets to stay in that feed. So just because I met you at some random event, I'm sorry, it doesn't mean that I'm following you back or you're getting muted or something like this. Like it's just...
Siara (07:32.856)
Yes.
Bailey Parnell (07:36.662)
I have people who add me on LinkedIn and they're like, I really like what you post, and they add me. I'm discerning, you know. I know it sounds a little cold, but I'm like, but do I like what you post? Right? You know, I get it, love it, so glad you love what I share, but you can follow without being a connection on LinkedIn. I know that's specific, but there are these choices that you can make in these spaces that
Siara (07:51.215)
Yes.
Bailey Parnell (08:02.324)
now control, when I log into my LinkedIn, what I experience in that digital world. And there's some control I can exert there. And then there's also how this stuff affects your offline life. And it seems counterintuitive to give this piece of advice, but there's no better advice that I could give than have an offline life that you're obsessed with, whatever that looks like. That will be the best piece of digital wellbeing advice I can give you, in the social media world, I should say. Because if I'm being really honest,
I'm very passionate about my work and my businesses. I'm in school right now, and I'm very passionate about my research, and that stuff is probably mostly on a screen. So then I'm engaging different digital wellbeing strategies. I'm not thinking as much about social media in those times, and that comes with its own set of risks, things that are more like interpersonal comparison, maybe even
Siara (08:44.25)
Right.
Bailey Parnell (08:57.454)
confidence related, and I'm thinking more about, like, I need to figure out how to do some of my life outside of a screen. That's my biggest digital wellbeing consideration at this very moment, because my work is mostly through a screen, my schooling is mostly through a screen, and even one of the main things we do for fun is watch movies. So I'm like, all right, I need to actually make an intentional decision to go do offline things, or sometimes I'm even like, I just need to clean my house,
because I need something manual to do.
Siara (09:30.147)
Dude, what is your favorite zero screens activity?
Bailey Parnell (09:35.304)
I really like walking. That sounds kind of funny, but I used to live in Toronto, same thing there. Now I'm in New York City, and I do really love New York, and I find that I get grounded by just going out, walking to parks, being in nature. And sometimes I get some really good ideas as well. So that's an off-screen activity that is done with intention. I do go out to eat a lot as well.
Siara (09:39.847)
That counts.
Bailey Parnell (10:05.086)
And I was, you know, it's funny, I was gonna say like, we go to the movies a lot, but that would technically be a screen too. You know what counts though? Broadway. Broadway is not a screen, so that counts.
Siara (10:10.995)
Yes, but it feels so, yeah.
That's amazing. So, you talked a little bit about your discernment for who you connect with on LinkedIn, which I respect so much. I think I looked at my LinkedIn and realized I have 500-plus connections. I remember back in the day, the app was definitely pushing us all to do that. Now we all kind of know why; it's way better for LinkedIn's engagement. I can't even fault them for that. But...
Bailey Parnell (10:25.315)
Okay.
Bailey Parnell (10:32.662)
Mm. Yeah.
Siara (10:43.168)
I am like, maybe I need to purge a bit. My algorithm's kind of all over the place. And so I have been really toying with this idea of controlling the algorithms on any social media that I choose to participate in a bit more. Do you have any tips or advice for us on the best ways to do that? I know that you are on Instagram. I know that you're on TikTok. How do you do it?
Bailey Parnell (11:03.438)
Yeah.
Bailey Parnell (11:07.054)
I have so many strategies, but you know, two are popping to mind right now: one that is technical, and one that's with you and your offline self. One strategy would be what we've already referenced. I know, for example, that on LinkedIn there are technical things you can do, like following someone versus connecting with them,
and that is different, right? If you wanna follow my content, that's different than I actually know you, which is the purpose of LinkedIn. Now, let's say on something like Instagram, they also have sort of technical things that you can do, like muting someone when you don't want to unfollow them because you don't want them to know that you unfollowed them, right? And I would still consider that a wellbeing practice, or in this case, a digital wellbeing practice.
Siara (11:32.831)
Mm-hmm.
Bailey Parnell (11:57.102)
And even if you go on TikTok, you will see people say, it always makes me laugh when I see this, when people are saying, "commenting for the algorithm," or they'll say, "commenting to see more of this in my feed." And they're not wrong, that actually does work, because the things that you engage with are the things that the algorithm is going to receive. And to be clear, we refer to this algorithm as this thing out there,
Siara (12:07.695)
Yes.
Bailey Parnell (12:24.642)
but it actually is a form of curation AI, curation artificial intelligence. And now that people have more understanding of AI, I think it'll be helpful for you to understand that when you're gaming the algorithm, you're actually teaching your curation AI. And that might make more sense to people now that they've interacted with things like ChatGPT. So here's the trick of that though. A lot of what you interact with, you're doing in a mindless way.
Siara (12:40.928)
Mm-hmm.
Bailey Parnell (12:52.94)
I hope that's not true, but honestly, for most people it's true. And you're in a scroll, and you might be spending even half a second longer on something because it's sensational and because you hate it. Like, actually the opposite: you're spending time on it because you're shocked by it or you don't like it. Well, that is still a recorded transaction attributing value to that content in the attention economy. So you've got,
Siara (13:07.955)
Mm-hmm.
Bailey Parnell (13:22.828)
That's where I would say, you know, what's a strategy for mindlessness? Mindfulness. And that seems like an odd thing and it seems counterintuitive, but actually a digital social media strategy would probably be to build some of these offline skills like mindfulness. And you will consume less content because you will be mindfully consuming it. And not just less content, you will be more attuned with, do I even like this? Do I even like the poster? Do I need to get my news
Siara (13:42.763)
Mm-hmm.
Bailey Parnell (13:51.38)
on my Instagram feed? No, everybody, you don't. You actually don't. That can be a choice you make. I'm not saying that you're choosing to be ignorant. You can get your news in other ways. I get it in an email news digest, right? So that's a choice where it's not all linked up and you're going back and forth mentally with, like, oh, that's my niece, so cute, love that. Ooh, this is a really inspiring travel blogger, I didn't know I wanted to travel there too. Oh,
Siara (14:01.666)
Right.
Hehe.
Bailey Parnell (14:21.226)
this person, there's kind of a racist protest happening here. And then you're all over the place in 30 minutes. That doesn't help focus, and it certainly doesn't help professionally. Even in the world of your professional goals and aspirations, it most certainly won't help that.
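To make that mechanism a little more concrete, here is a minimal illustrative sketch in Python of a feed ranker that treats even a brief pause as an engagement signal. It is a toy model, not any platform's actual code; the signal names, weights, and scoring rule are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class PostStats:
    """Toy engagement record for one post (signal names are assumed for illustration)."""
    likes: int = 0
    comments: int = 0
    dwell_seconds: float = 0.0  # total time the user lingered on the post
    score: float = 0.0          # value the toy "curation AI" assigns to the post


def record_interaction(stats: PostStats, dwell_seconds: float,
                       liked: bool = False, commented: bool = False) -> None:
    """Every interaction, even lingering half a second longer, is recorded as value.

    Note that hate-scrolling or shock still adds dwell time, which is exactly
    the dynamic described in the conversation.
    """
    stats.dwell_seconds += dwell_seconds
    stats.likes += int(liked)
    stats.comments += int(commented)
    # Illustrative fixed weights; real systems learn these rather than hard-coding them.
    stats.score = 1.0 * stats.likes + 2.0 * stats.comments + 0.5 * stats.dwell_seconds


def rank_feed(posts: dict) -> list:
    """Return post ids ordered by the score the toy curation model has accumulated."""
    return sorted(posts, key=lambda pid: posts[pid].score, reverse=True)


if __name__ == "__main__":
    feed = {"cute_niece": PostStats(), "outrage_bait": PostStats()}
    record_interaction(feed["cute_niece"], dwell_seconds=2.0, liked=True)
    # Pausing on something you dislike still counts as attention.
    record_interaction(feed["outrage_bait"], dwell_seconds=6.0)
    print(rank_feed(feed))  # ['outrage_bait', 'cute_niece']
```

The only point of the toy is that dwell time gets recorded as value whether or not you enjoyed the content, which is why mindless scrolling can train a feed in directions you would not consciously endorse.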
Siara (14:28.962)
Mm-hmm and
Siara (14:38.403)
Right. I'm so curious, because I've had this internal conspiracy. Just, yeah, I know. Well, you get to reading about how good products are made. Good meaning, when I say a good product, what I really mean is a product that is getting users to be very active and engaged. And so I think of these "not interested" buttons.
Bailey Parnell (14:44.878)
Yeah, I love a good conspiracy. I love how many of them are actually true these days.
Siara (15:08.341)
I think every social media platform has a version of this. You can click "I'd like to see fewer posts like this" or "I'd like to see more posts like this," you know. So sometimes when I click "not interested," I'm seeing something where I'm like, this obviously has my attention, but I don't really want to see this on my feed, so let me click "not interested." I feel like it does nothing most of the time, and it's so frustrating. And then sometimes, here's where the conspiracy comes in, I think
Bailey Parnell (15:15.337)
That would be a technical one.
Bailey Parnell (15:30.601)
Bye.
Bailey Parnell (15:35.385)
Mm-hmm.
Siara (15:36.226)
You know, with rage bait and getting our attention and knowing sometimes the most anxiety inducing content is the most addictive content, is it possible that sometimes they're like, not interested? Are you sure? And they push it a little bit more? I don't know. This is, again, this is my little conspiracy, but at the very least, I wonder if those buttons do anything.
Bailey Parnell (15:54.21)
Yeah.
I can't tell you that. I have met a lot of people in the digital wellbeing movement that are former tech employees. And you know, the odd thing is that when you meet them, most people are good, I tend to believe. And so it's hard because there could be someone in that platform that is actually trying to make it better.
Siara (15:59.462)
Yeah.
Siara (16:08.538)
Yeah.
Bailey Parnell (16:21.88)
But then the system is how all the humans work together, right? And the system of the company, or what you might call company culture or this sort of stuff, that's the part I'm not really, entirely sure about. It's possible, yeah. However, as I'm saying that, if we're not sure, that means that there has been a depletion of trust in corporate services.
Siara (16:42.648)
Right.
Bailey Parnell (16:45.726)
And I would say that's actually also a really huge part of the digital wellbeing movement: how government is regulating and auditing companies, just the same way we would regulate and audit companies that are providing us products or services in other fields, right? You know, if you go to a restaurant, they have a health symbol on their door, and they are not allowed to operate if they don't meet certain health codes and health conditions.
And that would be a service that you're getting. And then the products: you can't even sit in the chair that you're sitting in unless it's been tested to meet human welfare standards, right? So even the fact that we don't know, and that we're questioning it, is a red flag in and of itself, because trust has been eroded.
Siara (17:31.213)
Absolutely. I mean, similarly, I'm thinking about some ways that governments are trying to incentivize these companies to think about well-being a little bit more. There's a new bill that was just announced in the UK, at the time of this recording less than 48 hours ago, called the Safer Phones Bill.
Bailey Parnell (17:52.398)
Mm-hmm.
Siara (17:52.656)
So the bill would force social media platforms to make content less addictive for teens under the age of 16 by banning those algorithms that push that doom scrolling and the addictive content. Do you think legislation like this is the right approach to tackle the addictive social media problem?
Bailey Parnell (18:11.394)
Yes, because you have to understand that social media is a risky behavior, like sex or drugs or alcohol. And I know you've heard me say this before, because a risky behavior is quite simply, when you participate, you expose yourself to potential harm, like driving or drinking. And that doesn't mean that it's all bad, it means that there are things that you can put in place to make sure that people can do this thing safely and get the benefits of it with less risk.
Siara (18:21.767)
Mm-hmm.
Bailey Parnell (18:41.67)
And if active social media is now 20 years old, and really popular social media is maybe, I don't know, 15 years old, well, we've had enough time now to know what some of those risks are. And we know that it is huge capital, like it's a big capital project. That's what big tech is, and big social media is making a lot of money. Now, what governments are supposed to do for the people is protect the people,
Siara (19:11.197)
Mm-hmm.
Bailey Parnell (19:11.374)
is prioritize human welfare, and I think most people would agree, most especially child welfare, in the face of XYZ. And in this case, that includes businesses. I mean, XYZ can include a lot of things, in the face of what? But certainly in the face of businesses, governments are meant to be there to protect the people. And so if they're even trying right now, I'm saying, yeah, try it, try it.
Try to see if you have to force those companies to figure out what makes healthier content and improve it, and prove it, I guess it's prove it and improve it. Then yeah, go for it, because every other business has had to figure that out.
Siara (19:51.175)
Mm-hmm.
Siara (19:56.953)
I'm curious, do you think younger generations are impacted differently than older generations? Because when I hear laws like this, I have a bit of FOMO, because I'm wishing, when I was first introduced to social media as a teenager, that these bills had been in place. I know it impacted me and my peers in certain ways, but maybe laws like this could be the shift that we all need for all social media to become healthier for all of us. So I'm curious
how you think it differs.
Bailey Parnell (20:26.466)
Yeah, I have so much to say about that actually. I can think of like three different areas to go with that insight. Because on one side, great, if you think that these laws would have helped you, that's precisely why you and everyone listening should be fighting to make sure that they're in place. Because just because something didn't happen for me in the year that I was born does not mean that I should not be fighting for that for today and for the future. And also, I should be fighting for it even if I'm not gonna see it,
Siara (20:55.687)
Yes.
Bailey Parnell (20:55.724)
because I'm going to be dead, and guess what? The kids are still gonna be here. And that's kind of the main thing, so there's one. Two is, you know, of course, when parents' kids start getting into social media is often when they will find me or our content or maybe even a show like this, because then they realize that they are responsible for the well-being of someone else. However, I'm telling ya,
I have worked with people of every age, including seniors, on social media wellbeing and digital wellbeing. And that's why I would say, to your point, with legislation like this, the reason why they're able to pass it, and it just bothers me a little bit, is that it has to be for the kids in order for people to even get it through. But I'm like, okay, great, sure. We can all rally around that. But actually it probably will help adults as well.
Siara (21:34.057)
Really.
Bailey Parnell (21:54.38)
Like for example, when Europe was passing laws on data privacy and their GDPR and all this sort of stuff, that will help everybody. That will help kids and it will help adults. I mean, kids will maybe understand it less; data privacy is a big thing mostly for adults. So there are kind of all these aspects of digital wellbeing.
Siara (22:10.322)
Uh-huh.
Siara (22:18.621)
And then I have also heard the debate of: do we want to look to these companies who are creating the product as the line of responsibility for making them well? Or, for kids specifically, should we be looking more at the parents or the educators, who have a little bit more control over how those devices are used?
Bailey Parnell (22:37.826)
Yeah. Well, here's the thing. Like any risky behavior, a multi-pronged approach is required, actually. And this is one thing that was not covered in the TED Talk, because as that was blowing up and I started going around talking about this, it was very clear that once we all were like, hey, this is important, I would always engage with people on questions about who's responsible for it. And so step five of safe social is now holding responsible parties accountable. And
to your point, if we're talking about something like driving or drinking, well then all parties who have a stake in human welfare are responsible. So that includes the companies themselves, 100%. Because if you're making all this money, and your best-case scenario is that people are addicted in a risky behavior, a potentially harmful space, not that it's all bad, then you have some responsibility to make sure they can do it safely.
Siara (23:22.601)
Hmm.
Bailey Parnell (23:37.014)
And then you enter government. Well, you have some responsibility to make sure that they are making sure that it's being done safely, because that's what you're supposed to do: regulate big business so that they're not hurting the people in the face of capitalist aims. Parents, guess what? You also have a responsibility. Educators do as well. Parents have a responsibility because, if you're letting your kids engage in this risky behavior, you of course
Siara (23:45.006)
Mm.
Bailey Parnell (24:03.854)
at the end of the day are also responsible for making sure that they can do these things safely, that they're equipped with the skills offline and the digital literacy online to do it. And if you don't understand it, this is where I do have the most empathy. I do have the most empathy, I think, for parents and educators, because we're just at a really weird time in history right now where, still, those people largely did not grow up with it themselves. So they have a really different relationship,
Siara (24:18.028)
Yeah.
Siara (24:28.854)
Okay.
Bailey Parnell (24:32.832)
a deeply different relationship with digital technologies and with social media. Whereas for young folks right now, there is no digital life. There's no my offline life and my digital life. It's kind of just life. This has always been here. There has always been a digital; digital is life. It's hard to describe. It's like if our parents were growing up saying, my TV life. It's like, no, TVs are just in life. Well, what do you watch? How do you use it? I mean, it's a small distinction,
Siara (24:45.57)
Mm-hmm.
Siara (24:56.108)
You're right.
Bailey Parnell (25:03.104)
it seems, but it becomes a big distinction when parents are like, just get off your phone, who even cares? And it's like, trust me, they care. Not only do they care, it's actually deeply woven into sense of self, sense of relationships with others, social hierarchies. Things have changed there, and humans have always abided by those systems, just delivered a little differently. So I do have empathy for them, but at the same time,
it will still be their responsibility at the end of the day. Like, if you have a cabinet of alcohol in the house, it is still your responsibility to make sure it's protected, to make sure kids understand what can happen if you engage with this, and what to do if something happens. We call them guardrails, like guardrails on a highway. It's like, yeah, you can drive, but you can't drive as fast as you want, because you're gonna put other people in harm's way.
Siara (25:53.118)
huh.
Siara (25:58.776)
Bye bye.
Yeah, and it feels like it's just evolving so rapidly that parents might almost want to learn these technologies while their kids are using them. My parents certainly didn't know much about social media. I remember when I joined. I was so young too. And had they known, they probably would have been thinking, all right, let's think about this, let's educate ourselves. So I'm just happy it's becoming more of a conversation that parents can prepare themselves for. But I totally agree with you. It's
Bailey Parnell (26:13.934)
Yeah.
Siara (26:30.988)
we need to look to the companies, which brings me to my next question, which is: right now it feels, and it might be fact, that algorithms are designed for our engagement, to entertain us. The incentive is not our wellness. It's not our mental health. If an algorithm were designed for the benefit of our wellness, a social media one specifically,
Bailey Parnell (26:35.352)
Mm.
Siara (26:58.808)
Do you have any ideas of what that might look like? Blank slate.
Bailey Parnell (27:06.567)
It could. There's a lot, actually, that people will suggest in this space, like, how could we improve it? And I'm very cognizant of the fact that some of these suggestions might be things that would hurt the bottom line, and I'm suggesting them anyway. Things like balanced content delivery, like algorithms limiting exposure to the more divisive content and sensational content. You can tell if it's divisive.
ChatGPT can tell if a comment section is divisive these days. Okay, so no excuses, especially in the post-AI world. Oh my gosh, right? But a balance like this matters because maybe you've not even realized it, you've never even heard of this thing called the negativity bias in the brain. Maybe you didn't know that your brain stores negative information in long-term memory faster and for longer
because it registers as a threat. Then you might not realize that you've been on social media for this hour and you didn't think about what percentage of what you consumed is negative. Like, who thinks about that stuff? So who should think about it? The companies should think about it. The governments, because you might not know everything about how this content interacts with your brain. This is not out of the realm of normality though, because if you think about television and how we were releasing
Siara (28:18.208)
Mm-hmm.
Bailey Parnell (28:32.458)
shows. The, what am I saying, I'm blanking on the name right now, but the producers and the networks who were releasing the shows had to abide by certain standards for the public. For example, you couldn't have certain things on kids' television; you couldn't advertise smoking, or even drinking a lot of the time. It was like,
well, where I grew up in Canada, you couldn't do that stuff. You had to do it in safer ways. So you had to provide training even in your advertisements. This stuff doesn't happen as much on social media. Maybe there are personalized wellbeing prompts. You know everything else about me. You know, you're advertising, I don't know, weight loss to this person. And you know that this person is pregnant before even they know, because of their searches. Well,
could we use the same technology to actually help someone's wellbeing when they're searching things like suicide? Yes, the answer is yes, by the way. The answer is yes. And it's been done in research, actually. Maybe it's prioritizing content based not just on popularity, like how many likes a post has, for example, but maybe on whether posts are encouraging conversation, I don't know.
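As a rough illustration of the kind of "balanced content delivery" being suggested here, the sketch below re-ranks a feed by penalizing divisiveness and rewarding posts that encourage conversation, rather than sorting on popularity alone. It is a hypothetical sketch under assumed signal names (divisiveness, replies), not a description of how any existing platform works.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int           # raw popularity
    replies: int         # rough proxy for "encourages conversation"
    divisiveness: float  # 0.0-1.0, e.g. estimated by a text classifier (assumed signal)


def engagement_rank(posts):
    """Status quo in this sketch: sort by popularity alone."""
    return [p.post_id for p in sorted(posts, key=lambda p: p.likes, reverse=True)]


def wellbeing_rank(posts, divisive_penalty=1000.0, reply_bonus=2.0):
    """Hypothetical wellbeing-aware ranking: limit exposure to divisive content
    and give weight to posts that spark conversation, not just likes."""
    def score(p):
        return p.likes + reply_bonus * p.replies - divisive_penalty * p.divisiveness
    return [p.post_id for p in sorted(posts, key=score, reverse=True)]


if __name__ == "__main__":
    feed = [
        Post("rage_bait", likes=900, replies=40, divisiveness=0.9),
        Post("local_event", likes=300, replies=120, divisiveness=0.1),
        Post("how_to_guide", likes=500, replies=60, divisiveness=0.0),
    ]
    print(engagement_rank(feed))  # ['rage_bait', 'how_to_guide', 'local_event']
    print(wellbeing_rank(feed))   # ['how_to_guide', 'local_event', 'rage_bait']
```

The penalty and bonus values are arbitrary; the point is only that the ranking objective, not the content itself, is what changes under a wellbeing-oriented design.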
Siara (29:31.989)
Hmm.
Siara (29:55.625)
Yeah.
Bailey Parnell (29:55.66)
So here's the thing, I can't tell you exactly how it's going to go, but I can tell you that it's worth a try, right? It's just like it was worth a try to see, should we do this thing called seatbelts? Should we make it so that bars are not allowed to keep serving you? Like, bars in Canada are partially responsible if you are drunk and they keep serving you and you drive home and kill someone; they will take your keys, and they are legally allowed to do so. So.
Siara (30:02.774)
Mm.
Siara (30:07.136)
Right.
Siara (30:23.754)
Mm-hmm.
Bailey Parnell (30:25.23)
So it's like, again, not out of the realm of normality that if you're creating a risky space that people operate in, well, you know what? If someone is abusing people for a long time, maybe they should just be cut off. Maybe they should be tied to their IP address so they can't go make another account. And maybe there are mental health check-ins, actually, just like the same way you have to get through a data policy. If there's a pop-up that says,
Siara (30:36.445)
Yes.
Siara (30:40.465)
Right.
Bailey Parnell (30:49.364)
if you're not feeling well right now, we don't recommend that you use this, the same way drinking and driving campaigns have to do that stuff, right? So again, you're thinking, like, the alcohol companies have to put out these messages. Yes, they do. And guess what? They still make a lot of money. They're doing just fine. So there's that. I mean, yeah, I feel like with this one, everybody would probably have good ideas, and everybody would probably have a counter, like, sure,
Siara (30:56.19)
huh.
Bailey Parnell (31:15.246)
yeah, we're just gonna close the mental health reminder and then, you know, move on. But that's okay. You know, if it's being repeated once a week and you have to keep closing it, maybe you will read it that one time. Maybe now people understand, not exactly what is in the data policies, because they have to click accept or reject, but they understand that their data is being collected. And to even understand that this can affect your mental health is, trust me, more than a lot of people do.
Siara (31:19.505)
Right.
Siara (31:43.606)
Mm-hmm. And who wants to be flagged anywhere as someone who is consistently abusive on social media? I mean, I definitely think of certain, I won't call them out, but certain social media platforms where I go to the comments and I'm like, we've lost a sense of kindness when we go to our phones. There are people who I think are forgetting themselves. There's no way that there are this many mean people in the world, because when I go about in the world,
I just don't see it. So I think people have their mask on; whether or not their face is on their profile picture, or it's almost a completely ghost social media account, people get a little bit more harsh when it comes to social media. So if people could just get that reminder, like, maybe your comments have been leaning this way lately, or maybe you should consider it, I think a lot more people than we might expect would say,
Bailey Parnell (32:37.036)
Mm-hmm.
Siara (32:43.343)
That's probably true. I probably should chill out a little bit. Maybe they type the thing and then they erase it and realize, really, I'm just having a bad day and this didn't need to be published. So.
Bailey Parnell (32:53.582)
Yeah, yeah, I think so. I think so. And you know, there's another suggestion from many, which is that you have to validate your identification to be on these networks. And a lot of people are saying they wouldn't like that from a data privacy perspective. But then someone made an interesting argument to me. They said that your privacy is fake. They said,
Siara (33:06.746)
Hmm.
Siara (33:19.352)
Hmm.
Bailey Parnell (33:19.992)
They said, the government already knows. The minute you landed here, they knew where you were. They could follow your phone. They could do all these things. But now, the minute that privacy, or lack of privacy, is being used to keep you healthy, it's an issue. So I go back and forth. I go back and forth between these ones. But I do think that if it had to be tied to your personal identity, I kind of like this. I kind of like this.
And maybe brands can submit a brand application as they do and they get brand approval or something, an organization that is not a person. But for the person accounts, it's like, well, part of the reason we're able to keep social harmony is because you do have to be answerable to what you say to people in real life. And on social media, you don't have to be answerable. So I play around with this one too.
Siara (34:11.78)
Mm-hmm.
Siara (34:15.609)
It might be the case for those blockchain-based social media platforms too, because even if it's not that complete separation of identity, at least it's tied to some sort of wallet. It's a little too expensive to try and create all of these different
blockchain-based accounts. I'm not 100% sure how it all works, but I've definitely heard the argument that that could be a way to maintain a bit of that privacy but still be held accountable and not be making, you know, 10 or 20 different accounts on one platform. And the bot issue, of course.
Bailey Parnell (34:46.488)
Yeah, and then who decides what is abuse and what is not is another natural question. And these should be the questions, everybody. These should be the questions, just the same way that, you know, we both have lived in democracies and we're fortunate for that, because the question should be: is the outcome that we are healthier? What is the way to get there? We should agree on the outcome.
Siara (34:57.082)
Yeah.
Bailey Parnell (35:14.016)
And then we should talk a little bit with people about what the strategies to get there should be. Right now, there's also an erosion of trust in politicians and whatnot, and in governments. And a great part of that is because of what's happening on social media.
Siara (35:19.472)
Mm-hmm.
Siara (35:36.991)
We talked about TikTok. I just want to share some numbers. So in 2023, users collectively spent over 4.43 billion minutes per day on TikTok. Instagram, about 3.9 billion minutes per day.
Bailey Parnell (35:49.718)
Hmm? Hmm?
Siara (35:55.134)
It's one of those numbers where it's surprising, but it's also not surprising. There's a lot of use of social media across all generations. And so it feels like, I feel like that number goes higher every year. And the type of content that we get is always shifting towards even more addictive content than before. And it's only a matter of time until we get a new, even more addictive social media app or feature or device.
Bailey Parnell (36:18.19)
Mm-hmm.
Bailey Parnell (36:24.174)
Thank
Siara (36:25.042)
We have the goggles coming out and whatever. So for those of us who are chasing this digitally well life, how do we keep up? How do we shift our habits over time? Because the tech companies are getting better and better at grabbing our attention. It feels like it's becoming a little bit harder to manage all of it.
Bailey Parnell (36:27.309)
No.
Bailey Parnell (36:48.283)
It is getting harder to manage all of it, and it's even getting harder for me, like I said, because I'm attached to a screen all day. It's not just about social media. There are other things that we do in digital that capture our attention, like streaming and gaming and these things. So it is getting harder, and I contend with this question a lot. And, you know, I don't think
Siara (37:08.381)
Good.
Bailey Parnell (37:17.132)
Right now there has been any replacement for two things that seem to be very positive. Spending offline physical time with people that you love, being in community with other humans is just something that can't be replaced. Like for example, you can't recreate the oxytocin that is released. Even if you are connecting with someone online, you can't recreate the same.
oxytocin or dopamine that's released when you're connecting with that same person offline. That would suggest, in maybe a very obvious way to some, that the physical, this biological aspect of us, is not just some piece that you can extrapolate from the rest of our condition. And why I say that is because I think we are increasingly trying to extrapolate different parts of ourselves through technology.
And you naturally, when you start talking about things like this and digital representations and then, my goodness, when you get to AI, you will naturally end up at spirituality. Or we can call it something else, but people's beliefs about what it means to be human, what is purpose, what is our purpose here.
What does it mean to have mind-body connection? And these questions sound quote-unquote spiritual to some, but this is the only antidote that I've really found for people who are struggling in their digital lives. It's because so much can be done here and there's so much good here, but we are actually still organic entities. And, I sometimes say, you know, it's like we're living in a version 13 world with version 7 brains and bodies. So there's that offline
Siara (39:03.391)
Mm-hmm. Yeah.
Bailey Parnell (39:07.415)
time in community with other humans, it's super valuable. Valuable to your digital life too. I'm actually saying this, remember, in response to your question: you will have a better digital time online, I promise you, because your digital networks will start to reflect the things that you care about offline. You will be happier when you go in. You'll be able to set up a network that better serves you. You'll be able to have grace for people that haven't figured it out yet. You'll be able to move on from stress because you've built resilience.
So this all follows you online. And then the second thing that's of great interest to me, it's kind of similar, is nature. Offline time in nature cannot be recreated in the same way completely. And again, that would suggest that there is this biological, mind-body connection element to it that is maybe not even fully explained in science yet. There are many people
that have many, many historical theories for this. Some would find those theories in the disciplines of religion and some would find them in the disciplines of spirituality or sort of secular mindfulness or biology and physiology and how these things connect to one another. Psychology, like there are many disciplines that have been trying to explore these questions, but one thing we know for sure is that it works, as in time spent in nature.
seems to reduce people's cortisol and makes them feel more connected and improves their life satisfaction levels. Even your proximity to a park, how close you live to one, seems to improve life satisfaction levels. So those are two things I'd recommend for anybody as this gets more complicated. You will have to be more intentional about offline time, just as I'm having to do right now.
Siara (40:57.989)
So when they start telling us that we can put on goggles, feel like we're in nature, even get the scent of nature and hear all of the sounds, we can make the argument that spiritually we're not sure if it will have the same impact.
Bailey Parnell (41:14.27)
Yeah, I actually love this conversation. It's one of my favorite things to talk about because it starts blending all the things, like all of life. That's why I'm so passionate about, even passionate about AI or, you know, collected intelligence. I'm passionate about our digital extensions because it really comes all down to what it should always be about, which is what does it mean to be human and what does that mean in the grander scheme of things? And...
It seems unconnected when you're listening to this podcast. Maybe you didn't expect that we would go this deep, but I do think it is this deep. You know, when we have these goggles that create, you're getting into philosophical arguments too. So when you have goggles that recreate all of your senses and you feel like you are in nature, I can tell you firsthand, I have an Apple Vision Pro, and there is some effect. I'll tell you that much.
Siara (41:42.02)
I love it.
Siara (42:04.131)
Mm-hmm.
Bailey Parnell (42:07.564)
There is an effect. When I was surrounded by Mount Hood and I could see it all around me and I could hear it, well, this is how humans take in data, right, through our senses. So at least some of those senses were absolutely positively stoked. And I was definitely feeling calmer, and there was an affective reaction to it and a physical reaction to it. So I actually think that's really positive.
Siara (42:26.531)
Mm-hmm.
Bailey Parnell (42:36.194)
However, will that replace me actually being at Mount Hood? I'm not sure that it can, actually. I'm not sure that it can, because I would always be missing that one scent. Well, sorry, maybe scent, sure, the actual smell, sure. But I might always be missing touch and the other ways that humans take in data. And perhaps one of the most interesting ways we take in data is not fully understood, and that's the sixth sense beyond our five senses:
how we, you know, like the emotionally aware, the emotionally intelligent, how your brain deduces what's happening very quickly in ways you can't always explain, and how being in a physical space, with our very organic physical body, in spaces made up of the exact same chemicals, is not fully explained, I would say.
Siara (43:27.482)
This is gonna seem random, but I do remember seeing something, reading something somewhere where someone was struggling with how they looked in pictures. They thought that they looked great, then they would take a picture and be like, just, this picture isn't turning out right. And I think that might be the perfect metaphor for this conversation where you're just never gonna be able to capture what it's like to be in person and experience beauty right here.
in the moment and all of the things that come with it, not just the visual beauty. So yeah, I do love where this conversation is going. I want to get into your thoughts on AI. It seems like you have some really positive things to share, specifically about generative AI. You run Skills Camp, which is a company that's focused on helping people build their soft skills. And so...
Bailey Parnell (43:57.816)
That's right.
Siara (44:21.552)
I'm curious about how you've figured out generative AI can help people build those soft skills, and what else you think it has to offer us right now.
Bailey Parnell (44:32.63)
Yeah, so let me just give a preamble before we get into this. Everyone, I am very aware, more aware than most people you know in the whole world, about the ethical risks of AI, okay? We can absolutely talk about that after, and I know about all of them. However, I have also found
Siara (44:47.271)
Mm-hmm.
Bailey Parnell (44:59.042)
that when I talk to everyday people, even when I talk to some people in this digital wellbeing space, most of what they interact with every day is the fears around AI, what they're nervous about, and maybe their generator of choice. And I've actually come to say, well, there's a reason that it's worth fighting for, actually. And some of those reasons have to do with SkillsCamp, but there's also just
Siara (45:12.776)
Mm.
Bailey Parnell (45:28.626)
the positive aspects that can be done with it and maybe we'll talk more about that too. So let me start here. Humans forever have been trying to collect our intelligence and pass it on to other humans. Whether that was the stone tablet or the book or a library or a computer, we were trying to collect our intelligence and pass it on to other humans for the success of the species.
In fact, this was the success of the species: our ability to communicate and collaborate early on, literally as a species. So that has always happened. I think that AI was always going to happen. We were always on the path to collecting our intelligence as best we can and trying to access that intelligence as a collective hive mind. Now, that's why sometimes I'll call AI collected intelligence.
Siara (46:07.338)
Hmm.
Bailey Parnell (46:19.884)
And I make the distinction between collected and collective: it is not collective yet. It is not a result of all people yet. It is collecting stuff, though. And so there's that. Now, the way that we have used collected intelligence in the past has been amazing, like actually amazing.
Yes, it has also been used for the worst things in history. And I need your listeners to be able to hold both in your mind at the same time. Both things are true. The way that you have used a knife can be really good or really bad. The way that we have used collected intelligence has done some really special things. And the same thing is true for AI. Now, in the world of learning and where I am, it's kind of funny, because some people would otherwise call soft skills human skills.
Siara (46:48.586)
Mmm.
Bailey Parnell (47:11.938)
or they would call them other things. And what does it look like to have that be AI assisted? Well, now you kind of get into instructional design. How could AI support learning anywhere? Right now, if we're doing things like teaching feedback scenarios, I know that as an adult learner, you are more likely to learn if these scenarios are directly connected to you, they're relevant to examples in your life, maybe they're timely and in your industry or something like that. I know you're more likely to remember them. Well, now we can do that very, very quickly.
Siara (47:12.411)
Mm-hmm.
Bailey Parnell (47:41.59)
maybe even in the session that I'm with you in. So you're more likely to learn this stuff. That would be something I would do if we were teaching conflict, if we were teaching stress management: scenario-based learning. Maybe it's personalized feedback and coaching. That's another way that we would teach people, and that's actually a really high-touch way, imagine a one-on-one interaction. So these days you can now have AI, well, I mean, it's being worked on literally as we're recording this.
Siara (48:10.06)
Mmm.
Bailey Parnell (48:10.572)
You can have AI give you feedback on your public speaking as you are delivering the speech. That might freak some of you out, but it might be like, hey, slow down, you know, keep it cool. Like, tone, pacing, clarity, all this sort of stuff. Emotional intelligence training is a really interesting one. We teach emotional intelligence and we teach empathy and practice, and we did it before AI. Now people would say, how do you do that? Well, we break them down
Siara (48:20.842)
Mm-hmm.
Bailey Parnell (48:40.428)
like any other skill that humans do, sorry. And so, how does one kick a soccer ball? How does one present signs of emotional intelligence? How do you build it? Break it down into learnable steps and then practice those steps. So AI could probably help with each of those steps, like the situations, like being able to prompt you with questions that we know increase self-awareness.
Being able to pick up on emotional cues in Zoom calls, we're already there, actually. There's an interesting company called Hume AI, which is looking specifically at more empathic and emotionally intelligent algorithms, which kind of freaks people out a little bit sometimes and also excites them sometimes, because, well, I think it would be a good thing if our AI buddies could speak to us in emotionally intelligent, kind ways.
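As a rough sketch of the scenario-based learning and personalized coaching ideas described here, the Python example below asks a large language model to generate a practice scenario tailored to a learner's role and industry. The prompt wording, the model name, and the use of the OpenAI Python client are assumptions for illustration; this is not a description of SkillsCamp's actual tooling.

```python
from openai import OpenAI  # assumes the OpenAI Python client is installed and an API key is configured

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def personalized_feedback_scenario(industry: str, role: str, skill: str) -> str:
    """Ask an LLM for a short practice scenario tailored to the learner.

    Hypothetical sketch of scenario-based learning: adults tend to learn better
    when examples are relevant to their own role and industry.
    """
    prompt = (
        f"Write a 120-word workplace scenario for a {role} in {industry} "
        f"who must practice {skill}. End with one open question asking the "
        f"learner what they would say next."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; swap for whatever is available
        messages=[
            {"role": "system", "content": "You are a soft-skills coach."},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(personalized_feedback_scenario("healthcare", "nurse manager",
                                         "giving constructive feedback"))
```

The same pattern could, in principle, be pointed at conflict or stress-management scenarios; the personalization is just parameters in the prompt.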
Siara (49:13.55)
Mmm.
Bailey Parnell (49:34.85)
Adaptive learning pathways. So let's say you're taking online courses and you didn't pass the test, but you're just not sure why. Why should you have to wait a month to get feedback from a professor somewhere in the world? You could get help straight away, and it could say, now I see where you went wrong. Khan Academy is already doing this with their quote-unquote tutors, their AI tutors. And
Siara (49:55.074)
Mm-hmm.
Bailey Parnell (50:01.848)
there's really, I mean, that list will go on and on, and it will continue to go on and on. And as I'm saying this, everyone is probably thinking, well, that sounds pretty good, right? Well, let me tell you about some other good things going on in the world of AI. AI has been shown to be able to dramatically improve health outcomes. For example, Google's sequencing of proteins, which would have taken literally thousands of years if you counted up the human time that it took
Siara (50:12.792)
Mm-hmm.
Bailey Parnell (50:27.512)
to identify proteins against others and then find so many more. And they've been able to sequence these proteins. And if you understand proteins, then doctors can do a lot more with them, right? You can build a lot more on each other. We can understand how one might affect another part of the body. Farmers in India are using AI to decrease water use and pesticide use and to increase crop yields.
Let me give a little side note too. AI is the collected intelligence. When you pair that with robotics and sensors and all these other forms of technology, that's when you start getting smart robots or smart cars. You have to pair them. So with the Indian farmers, what they're able to do is have cameras that can identify plants. And if you can identify the plant, that is already artificial intelligence. And then you can say, well, I can give pesticides or water at a plant level as opposed to an acre level.
And that means: you need water, but you don't. So you get water, and you don't. Imagine saving that water, right? Let's see, ooh, when I was at AI for Good at the United Nations in Geneva, there were people gathered from all over the world using some kind of AI to solve SDGs. So a couple of the ones that stood out to me were little robots that go clean coral.
Siara (51:23.171)
Mmm.
Siara (51:27.578)
Wow.
Siara (51:33.263)
Incredible.
Siara (51:46.04)
Mmm.
Bailey Parnell (51:52.162)
They clean the coral that we're messing up. So that's amazing. There was another one. I could literally go on forever, actually. It gets me really excited. It gets everyone excited.
Siara (51:55.248)
Ha!
Siara (52:00.336)
That's incredible. I need to see that on video.
Bailey Parnell (52:06.454)
Yeah, you have to look it up. There was one that was like glasses for the blind. You could put on a sensor, and through sensor technology that is now really, really good, the same sensor technology that would end up in, like, a Tesla car so that you don't hit things, they were starting to be able to pair with, you know, brains, so that even if you're blind and you can't see, the sensors would be able to guide you and help you walk and do all this sort of stuff.
Siara (52:10.672)
Mmm.
Bailey Parnell (52:35.192)
There were even robots that were helping care for dementia patients. And they were seeing really good signs of positive outcomes with dementia patients by being able to have an intelligent robot. And part of the reason, you might be thinking, that's so sad, why don't we just have a human doing that? Well, I'll tell you, actually. Humans were still part of these examples. But humans, we're not the best at everything. A couple of things we're not the best at: being consistent.
So with a dementia patient or Alzheimer's, when things were changing, the robot would remain consistent, which was kind of comforting. And the second thing we're not good at as humans is indefinite patience. So even if you love someone more than anything in the world and you're there every single day, if you're repeating something for the tenth time, humans aren't very good at patience. Whereas the robot will repeat it for the tenth time. So it was creating kind of like feelings of connection. And so anyways, I could go on forever there. When I start saying this though,
Siara (53:22.227)
You
Siara (53:31.507)
you
Bailey Parnell (53:33.272)
when I start saying that AI has the potential, when paired with the good side of humans, to improve health, to actually help fix the environment we've messed up, to help kids learn better, to do all these things, you can make the argument that it is a moral imperative to explore this. It is a moral imperative to do this right.
Siara (53:56.953)
Mm-hmm. Yeah, absolutely. And I mean, I do want to hear some of the concerns you have about generative AI, but when you mentioned the sustainable development goals, I'm thinking of equity. I don't know exactly how they
worded it, but I know equity is one of them. And I get worried because I know that bias is such a rampant problem in AI. I've experienced bias when playing around with AI tools; I play around with a lot of them. So I'm curious, I want to hear a positive. Do you know of any AI that's being used to help with
Bailey Parnell (54:22.104)
Mm-hmm, that's right.
Siara (54:35.092)
you know, the problems we have with equity as it pertains to race, gender, sexuality, all of those? But then I would love to get into some concerns that you might have.
Bailey Parnell (54:48.088)
Yes, I have lots of concerns as well. Lots. And one of them is exactly that, the perpetuation of bias. Remember, we framed this as collected intelligence, which means it is being collected from certain places, and right now it is being collected mostly by these tech companies that we just spent an hour talking about, how concerning they are and how they are in a capitalist paradigm, which means
there is still money to be made at the end of the day, right? Theoretically, theoretically, governments are in a different paradigm, but you know, we love our conspiracy theories. Okay, so with collected intelligence, if it's mostly coming from these companies, then it's mostly coming from men, and it's mostly coming from white men, and let's be honest, it's mostly coming from a Western paradigm. That is very worrying.
Siara (55:29.691)
We do.
Bailey Parnell (55:45.196)
Like, American imperialism is basically finding its way into these networks. And sometimes they don't even know they're doing it. That's just what bias is: if you're not aware of it, you don't know how it might present, and then you can't intentionally counter it. So I'm 100% worried about that. But you asked for a good news story as well. I do have a friend who is working in the Middle East and he's from
Nigeria, and he's saying that right now in Africa, and in parts of the Middle East as well, they seem to be having a leapfrog effect, they're calling it leapfrogging, because of AI. What this means is that whereas these places may have been behind technologically before, they are now able to skip a whole step, skip a stage of digitalization. Actually, I'm not gonna...
This is his area of expertise, not mine. I'll do my best to explain it, but basically, all this infrastructure that countries like Canada and the US have had to build up over ages looks different now, and sometimes it is easier to bring in a new technology than it is to transition a society off an old one. You know, it's just like if I was starting fresh with
Siara (57:03.896)
Mm.
Bailey Parnell (57:07.178)
my hairstyle as opposed to having to dye my old hair. I don't know. So the idea, to use the phrase leapfrogging, is that they will leapfrog ahead of other countries and other places because they don't have to spend the time transitioning everybody over from the old way of doing things. And that's happening because of AI right now. And I just told you how it was being used in,
Siara (57:11.064)
you
Bailey Parnell (57:34.424)
you know, India, and that's also true elsewhere. So that's, I think, something good to look forward to. The other thing, I just put this in my dissertation recently, came up when I was exploring the risks of AI in leadership, because that's what my dissertation is about: how leaders can use generative AI for more human-centric leadership practices. And of course I explore the ethical risk of bias, but there were also people saying, actually,
just the same way there's algorithmic exclusion, there might also be algorithmic inclusion. You might be able to code inclusion the same way we would practice it offline. And I think that's a really interesting area to explore. I don't know exactly what it looks like yet, but it means there are possibilities here, right?
Siara (58:22.94)
I definitely think about, you know, even just hiring, say in Canada or in America. And I know that in HR tech, AI is a hot topic, so I worry about that. And I worry about, you know, people. I love the idea of getting
Bailey Parnell (58:29.25)
Mm-hmm. Mm-hmm.
Bailey Parnell (58:35.436)
Mm-hmm. That's right.
Siara (58:43.941)
quick advice when you need to make a more informed and thoughtful decision. I've done it. You go to ChatGPT and say, this is a tough decision, how do I word this correctly? How do I make sure this is coming from a place of empathy when I'm dealing with maybe a tough customer or a tough coworker?
Bailey Parnell (58:49.454)
So for sure.
Bailey Parnell (59:00.334)
It probably gave you a pretty good reaction, didn't it?
Siara (59:02.754)
Yeah, it gave me a great reaction. It always agrees with me. And that's where I pause, because, have you ever heard the golden retriever boyfriend metaphor on social media? Sometimes I feel like certain AI models, again, I won't call anyone out, but certain AI models are the golden retriever boyfriend. Yes, so the golden retriever boyfriend sounds like a lovely person.
Bailey Parnell (59:16.536)
Yeah. Yeah.
Bailey Parnell (59:24.044)
You should probably explain to your listeners what the golden retriever boyfriend is.
Siara (59:32.788)
It's someone, and it could be a girlfriend, who pretty much agrees with everything you have to say and doesn't really give you a problem with anything you do. They're just going to cheer you on no matter what, even if you're doing the wrong thing or saying the wrong thing. And so I've found that certain models can be a bit of that golden retriever, my golden retriever AI, where they're sort of...
Bailey Parnell (59:55.681)
on there.
Siara (59:58.656)
validating things that sometimes shouldn't be validated, especially in the workplace, where you have blinders on and it's really hard to see things objectively. I worry about bias coming out in that way. Yeah, these are just, you know, thoughts that run through my head all day. But I've gotten to the point where there are some models I'll go to for certain use cases versus others, which, well, that brings me to my next
Bailey Parnell (01:00:23.596)
Interesting. Like what?
Siara (01:00:28.56)
next question, which is: I want to name a use case, and I want you to tell me which generative AI model you would run to for that specific scenario.
Bailey Parnell (01:00:40.11)
Oh geez. Well, I can tell you what I would use, but I'm already feeling like I'm going to fail people in this exercise, because I don't use every model just because I study this. I always say at the start of a talk, I don't code; I'm a humanities social scientist. Do you want to know how people have affectively experienced this technology? That's me. But yeah, we can try.
Siara (01:00:44.295)
Okay.
Siara (01:00:58.202)
Mm-hmm.
Siara (01:01:05.853)
I'm with you. I also don't code, but I think people who don't code get to talk about it too. So here we are. All right, ready? How about this: if you were getting advice on how to deal with a tough customer or coworker, which model would you use?
Bailey Parnell (01:01:11.8)
Yeah.
Bailey Parnell (01:01:27.874)
I'd probably use Claude, by Anthropic. It seems to have more emotional intelligence, and it seems to be more people-centric, which is probably reflected in the company's structure as well.
Siara (01:01:31.613)
Mmm.
Siara (01:01:38.865)
Okay.
Siara (01:01:45.901)
Okay. My first thought was ChatGPT, simply because I think it has more context about me, but I like the more objective, human-centric angle you can get from Anthropic, so I'll have to try that.
Bailey Parnell (01:02:01.846)
Yeah, that would be my recommendation. I mean, I'll use ChatGPT as well. And it's funny you say that, because my sister is a wonderful use case of what I would consider AI-assisted emotional intelligence and communication. She told me she's always struggled with communication. She has really high emotions, and she said, I've not been able to communicate clearly. And when she was in an argument with some, I don't know, extended family member,
Siara (01:02:22.961)
Mmm.
Bailey Parnell (01:02:30.508)
she started putting her responses into GPT and asking, how do I make this sound less angry? Because, sorry, let me be clear: she feels anger, but she doesn't want her delivery to get in the way of communicating what she wants to communicate. And I'm like, wow, that's really interesting, right? So, okay, what's next?
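For listeners who want to try what Bailey's sister describes, here is a minimal sketch using the OpenAI Python SDK. The model name, prompt wording, and function name are assumptions for illustration, and an OPENAI_API_KEY must be set in the environment.

```python
# A minimal sketch of the tone-rewrite workflow described above:
# paste a drafted reply into a chat model and ask for a calmer version.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def soften_message(draft: str) -> str:
    """Ask the model to keep the meaning but lower the emotional temperature."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any capable chat model would do
        messages=[
            {"role": "system",
             "content": "Rewrite the user's message so it is calm and clear, "
                        "keeping the original intent and requests intact."},
            {"role": "user", "content": draft},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(soften_message("I can't believe you did this AGAIN. Fix it NOW."))
```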
Siara (01:02:41.117)
Siara (01:02:55.911)
Mm-hmm. Okay, what model would you use to generate ideas for social media posts and why?
Bailey Parnell (01:03:04.534)
I mean, I would probably just go to GPT for that, because it's nice when it has memory of you. And I would probably also, you know, give it links or something. I do that a lot. My main paid version is GPT right now for sure, and I'll give it everything: here are PDFs, here are Word documents. My husband, too, has kind of trained GPT; he's done more
Siara (01:03:13.578)
Mm-hmm.
Siara (01:03:27.008)
Mm-hmm.
Bailey Parnell (01:03:32.182)
intentional work on training GPT to respond like him. He's a writer, so he'll say, here's how I would write this, for social media posts and stuff like that. Basically, GPT was able to help him articulate how he writes so that he could hand that to his social media team and say, okay, here's my style.
Siara (01:03:36.416)
Siara (01:03:46.112)
Mm.
Bailey Parnell (01:04:00.012)
What they're doing with it now, I don't even really know. But let's see, yeah, I'd probably even give it my own papers and ask, what here would be interesting to post about? I haven't yet, but I would.
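One way to approximate the "write like me" setup described above is a two-step flow: distill writing samples into a short style guide, then reuse that guide as a system prompt when drafting posts. This is a sketch of how one might do it with the same SDK, not a description of how her husband actually set it up; the model name and prompt text are assumptions.

```python
# Sketch: capture an author's voice once, then reuse it for drafting posts.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # assumption

def build_style_guide(samples: list[str]) -> str:
    """Distill a few writing samples into a short, reusable voice description."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": "Describe this author's voice (tone, sentence length, "
                        "vocabulary, quirks) in under 120 words."},
            {"role": "user", "content": "\n\n---\n\n".join(samples)},
        ],
    )
    return response.choices[0].message.content

def draft_post(style_guide: str, topic: str) -> str:
    """Draft a social media post in the captured voice."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system", "content": f"Write in this voice:\n{style_guide}"},
            {"role": "user", "content": f"Draft a short social post about: {topic}"},
        ],
    )
    return response.choices[0].message.content
```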
Siara (01:04:12.87)
Mm-hmm. I think I would go for, and this one is paid, so it's not as accessible, but Writer.com. All of this is unsponsored, audience, by the way. As you just said, you can train a custom GPT with a paid plan, but with Writer, I would say it can have a little more access to your knowledge base, so it can have
Bailey Parnell (01:04:23.95)
Ha
Bailey Parnell (01:04:28.738)
Hmm.
Bailey Parnell (01:04:35.246)
Yeah.
Siara (01:04:41.28)
all of the context of, say, your company and product, everything your product can do, and you can feed it all the thought leadership you've ever written. You can do that with a GPT too; I just like the way Writer organizes it, and they have little scales you can set for how professional or casual you want it to sound, or how friendly versus clinical.
Bailey Parnell (01:04:55.086)
Mm.
Bailey Parnell (01:05:05.547)
Yeah.
Siara (01:05:10.654)
Yeah, I've definitely tried both, being in content marketing myself.
Bailey Parnell (01:05:10.67)
That makes sense.
Bailey Parnell (01:05:15.246)
I use Grammarly, and Grammarly is kind of always on. It's always a funny thing in academia, because academia is really scared of AI right now, specifically generative AI, and I say, well, do you use Grammarly? Yeah. And does it suggest other ways to write your sentence? Yeah. Do you use those suggestions? Yeah. So you already use generative AI in your writing, and what you're trying to figure out, or not just you, what everyone will have to figure out, is where
Siara (01:05:24.287)
Hmm.
Siara (01:05:37.771)
Yes, absolutely.
Bailey Parnell (01:05:43.842)
the line is, and the line in academia is very strict. Obviously it can't write your stuff for you straight up, but you are allowed to use Grammarly, and it is allowed to suggest other ways to write your sentence. Academia is going to have a reckoning in the next decade, and it'll be very interesting. I can tell you that most of my friends are AI-assisted, and trust me, they want to be moral about this, they want to
Siara (01:05:49.891)
Mm-hmm.
Siara (01:05:59.991)
rules.
Yeah.
Siara (01:06:08.608)
Okay.
Bailey Parnell (01:06:12.962)
do this the right way.
Siara (01:06:14.773)
Absolutely. Okay, so how about this: you're watching a YouTube video, you think it's interesting, and there are some pretty juicy facts. Where would you go to fact-check that video?
Bailey Parnell (01:06:28.556)
I'd probably go to Snopes. I'm not sure, you know, I haven't been there in a minute. I don't even know if they've added an AI component to it, but I would probably go to Snopes. And what's the other one, I haven't saved it... Yeah, maybe FactCheck.org. Anyway, that's what I would do. What would you do?
Siara (01:06:33.771)
Yeah.
Siara (01:06:46.185)
Or PolitiFact, or...
Siara (01:06:53.656)
Okay. If I were using one of the popular generative AI models, when I go for more fact-based information, I go to Perplexity, because they give you the sources when they send you an answer. I don't know how accurate it all is, but it definitely gives me more trust every time they give me an output. But an even better one, yeah, an even better one is Consensus.
Bailey Parnell (01:07:07.214)
Mmm.
Bailey Parnell (01:07:16.702)
Yeah, GPT is the worst for that.
Siara (01:07:22.859)
I think it's consensus.ai or consensus.com, where they use AI over a bunch of scholarly articles to generate an answer. So you're asking maybe a yes-or-no question, or a quantitative question, but it's...
Bailey Parnell (01:07:35.982)
Hmm.
Bailey Parnell (01:07:39.66)
Yeah, here's another one my friend introduced me to recently called Scite, S-C-I-T-E, which is for searching and researching academic scholarly articles. But here's the thing: even as you're saying this, these use cases, I'm thinking, nobody listening is ever going to know all of the most recent platforms. There's now a company coming out for almost everything, right?
Siara (01:08:03.365)
Mm-hmm.
Bailey Parnell (01:08:08.238)
And so I almost feel like this conversation is gonna become, well, what are you using? What? They have that now? They're doing that? I can use it for this? So I feel like you're even teaching me a lot, which is nice.
Siara (01:08:14.742)
Yeah.
Siara (01:08:20.899)
Yeah, there's so much I can't keep up. I sign up for those newsletters that show you the cool companies. And sometimes, you know, I get disappointed; sometimes it's really just a GPT wrapper and not true innovation. But sometimes it's really cool. So...
Bailey Parnell (01:08:25.949)
Same.
Bailey Parnell (01:08:35.596)
Right. Right.
Or sometimes they've taken GPT and added to it in a very specific way. And that's great. That's actually probably why this technology is advancing so rapidly and why it's going to be part of everything, literally everything.
Siara (01:08:44.407)
Mm-hmm.
Siara (01:08:56.365)
Yeah, yeah, totally agree. Okay, one last one. How would you identify a bird in the wild? I chose this because I did it the other day.
Bailey Parnell (01:09:06.49)
Oh no, my dad just showed me one of these.
Siara (01:09:13.111)
My answer is not special or nature specific. It's very basic.
Bailey Parnell (01:09:19.242)
Oh snap, okay. Well, me in this moment, because I can't remember the name of the app, I would probably just use Google Images. But my dad showed me this app recently that was specifically about nature and plants, and he could take a picture; he was so excited to take a picture of a plant and have it tell him the name of it. And I was like, that's artificial intelligence, right? So, yeah, what would you do?
Siara (01:09:27.863)
Nice.
Siara (01:09:42.074)
Yeah.
Siara (01:09:45.731)
I just took a picture and dropped it right into the GPT mobile app and I trusted its answer.
Bailey Parnell (01:09:51.244)
Yeah, okay. I'm not fully there yet with GPT, just because I've proved it wrong so many times that I don't trust it fully. So with Google Images in this case, I would be able to see what it most looks like and then ask, where did you get that from? You know?
Siara (01:09:57.689)
Mmm. Fair.
Siara (01:10:08.161)
Yeah, no, that's very fair. I should probably double-check. It was just here in the city, and I thought, it's probably a mourning dove. That's probably what it is. I don't know my birds.
Bailey Parnell (01:10:15.532)
Yeah. No, but yours is probably speedier, so I could just do that too. Like, let's say you drop it in and it gives you some options. What I could do very quickly is ask, can you give me three other possibilities? And then Google only those possibilities. Right.
Siara (01:10:23.385)
Mm-hmm.
Siara (01:10:34.487)
Right, exactly. Yeah. Okay, amazing. Well, that was fun. I've learned about some new models that I need to check out. I'm curious what's next for the Center for Digital Wellbeing. Are there any upcoming projects or initiatives that you're particularly excited about this year or in 2025?
Bailey Parnell (01:10:55.118)
Well, right now, so the center is a nonprofit, and I'm kind of in the full swing of Skillscamp and my doctorate right now, so it's taking a bit of a backseat. However, I'm also in the process of transitioning it to be more about advocate support. When I started it, it used to be called Safe Social, and it transitioned to become the Center for Digital Wellbeing.
Siara (01:11:10.437)
Mm.
Bailey Parnell (01:11:24.866)
And when I started that, I was more interested in ambassadorship, where people were kind of down for this message. But then I realized through practice, along the way, that the people coming to us actually already had communities of their own. They were saying, I'm a parent, or I'm a teacher, or I'm a community worker, or I want to bring this to my school. So they were actually into the work. It wasn't just about saying, hey, I'm into this thing.
Siara (01:11:30.629)
Hmm.
Bailey Parnell (01:11:50.658)
And then I realized that that is actually advocacy, and that maybe we should focus more on advocate support as opposed to ambassadorship. So what we want to focus on is creating resources, like lesson plans, teachers' and parents' guides, how to talk about this, how to deal with AI, and then giving those to the people who can actually bring them to the rest of the people. Because we can only do so much; it's only a few of us right now.
Siara (01:12:15.695)
Mm-hmm.
Bailey Parnell (01:12:20.771)
And that's okay, because I expect that as I become more successful, the nonprofit can become more successful too.
Siara (01:12:30.661)
Yeah, absolutely. And it feels like a bit of a niche; I don't find many people who are specifically focused on digital wellness and digital wellbeing, so it's a new landscape. I'm super excited for that organization. And then with Skillscamp, is there a crossover? Are there digital wellness skills you can build through Skillscamp, or maybe even digital etiquette, which is something I'm really interested in and feel is a soft skill?
Bailey Parnell (01:12:39.96)
Mm-hmm, that's right.
Bailey Parnell (01:12:48.43)
Hmm.
Bailey Parnell (01:12:57.228)
Yep. There's actually more crossover than ever these days, especially since the pandemic. I used to feel like they were two separate worlds for me, but these days they're not. On one side of things, step three in practicing safe social and building digital wellbeing, as we've actually already covered on the podcast, is building the offline soft skills, things like resilience, confidence, self-awareness, mindfulness. These absolutely follow you online and are a longer-term solution for your digital wellbeing.
And Skillscamp teaches that. So Skillscamp was already building the solution, in a way, and that was kind of a poetic thing in my life. And it's gone the other direction as well. When we go into workplaces with Skillscamp, honestly, you can't have a communication program anymore without including digital communication. When we do leadership, it's: are you leading remote and hybrid teams? You're stressed. Do you understand that
you're spending 80% of your workday on screens? Is it possible that might have something to do with it, right? So it's all kind of woven together now, and increasingly so.
Siara (01:14:08.965)
Amazing. Okay, so I ask everyone this before we wrap up the show. This year, this week, this month, whatever feels comfortable: what are you logging out of, and what are you logging into, however you may interpret that?
Bailey Parnell (01:14:28.014)
Are you saying like, what am I canceling? What am I getting into? I love that. What am I done with? Let's see.
Siara (01:14:31.245)
Yeah, sure. Yeah. What are you done with?
Bailey Parnell (01:14:43.662)
I am logging into a really heavy year. If all goes according to plan, I'll be graduating from my doctorate a year from now. And so I'm about to be logging in to, thank you, hopefully we'll be back here in a year saying it was successful, I will be dialing into a very heavy year
Siara (01:14:54.81)
Wow.
Siara (01:15:01.856)
I know we will be.
Bailey Parnell (01:15:11.166)
of writing and screens, actually. So it's funny we're talking about this in this context. Given what this last summer was like, I'm mentally preparing for the year ahead. And so what I'm going to be thinking about, I think, is how I will be logging out with more intention. I mean, it's kind of a meta conversation, because it's literally logging out. But those things I was saying earlier about physical offline time, I'm going to have to be rigid about:
Siara (01:15:30.871)
Yeah.
you
Bailey Parnell (01:15:40.564)
offline time out in nature, by myself or with my husband. Because as much as I love them, if I'm around a bunch of other people in my offline time, say I'm with my family for a week, it's not the same kind of rest. I'm engaged in another way. But I think over these past couple of months I've been overly engaged in everything, and I need a break. Yeah, so that's it.
Siara (01:15:59.897)
Yeah.
Siara (01:16:08.805)
I'm picturing you at a typewriter, writing in the forest. Something to consider.
Bailey Parnell (01:16:12.558)
I actually bought one for my husband. We have a typewriter here because he's a writer and I thought it was interesting, and because sometimes you just want to do the writing and not the other parts, or even the screen part. I don't know. I'll figure it out.
Siara (01:16:18.393)
That's amazing.
Siara (01:16:24.547)
Mm-hmm.
Siara (01:16:32.093)
I know you will. I can't wait to hear what you've learned and how you figure out how to balance it all. Okay, well, thank you so much for coming on the show. I learned so much. I'm thinking about spirituality; I have so much to think about after this. To everyone listening, thank you for tuning in. If you enjoyed this episode, don't forget to subscribe, leave a review, and share it with someone who might need to hear this. But before we wrap up,
Bailey Parnell (01:16:33.166)
sorry.
Bailey Parnell (01:16:45.248)
In text.
Siara (01:16:58.617)
Bailey, can you let our listeners know how they can learn more about your work or stay connected with you?
Bailey Parnell (01:17:03.598)
Sure, I'm @BaileyParnell on everything, hopefully the good side of social media. You can also go to BaileyParnell.com, or if you want to check out what the center is doing, that's thecenterfordigitalwellbeing.org.
Siara (01:17:19.319)
Okay, perfect. Thank you. All right. See you all soon.