Evan and Brad give us their review of “The Social Dilemma”. Netflix’s new documentary sheds light on some techniques tech companies use to get us hooked on social media and the information they’re collecting on us. Hear what a couple of information security and privacy experts have to say about a topic that’s not going away any time soon.
Protect Your Organization from Cybersecurity Threats
SecurityStudio helps information security leaders at organizations ensure they're protected against cybersecurity threats, stay insurable, and remain legally defensible with our risk assessment and risk management software. Schedule a demo to learn how we can help.
[00:00:22] Evan Francen: Good morning everyone. Thanks for tuning in to episode 99 of the Unsecurity Podcast. Today is September 30th, 2020, and joining me is my co-host and friend, Brad Nigh. Good morning, Brad.
[00:00:37] Brad Nigh: Good morning Evan. How you doing man?
[00:00:44] Evan Francen: You know, the weather is nice, but allergies are a little, uh...
[00:00:46] Brad Nigh: I like this weather.
[00:00:48] Evan Francen: Yeah. You and I have a different definition of nice. I think that’s okay.
[00:00:51] Brad Nigh: It's cool in the morning and gets nice in the afternoon. Nice and warm, mid-sixties, sunny. Yeah,
[00:00:59] Evan Francen: I suppose. All right, we've got a special show planned for our listeners this week, Brad. We both watched The Social Dilemma on Netflix. It's a documentary about social media in our society. And interestingly, it premiered back in January. It's funny how neither of us watched it until recently. You watched it when, Sunday? And I watched it a couple weeks ago. Now it's trending; I think it was the most popular video on Netflix, but that changes every day. I guess it's better late to the party than not showing up at all. Before we jump in, and I am really excited about talking about this because it was kind of an eye-opener for me, let's catch up quick. It's customary for us. What's up, what's new with you, what's going on at FRSecure, at home, whatever.
[00:02:00] Brad Nigh: Uh, you know, like I said, it's nice weather. I did a ton of yard work over the weekend and then just kind of relaxed and enjoyed it, you know, some family time. And we've got just so much going on, big things coming for FRSecure. We're working on the new vCISO program and how that's going to look, and the IR maturity assessment, and just a ton of stuff going on.
[00:02:31] Evan Francen: Well, that's cool. Yeah, keeps you busy. Every time I'm on these videos, you know, I turn them on and I look at myself and think, my God, I look tired as hell. Do I look tired? Are my eyes like this or something?
[00:02:48] Brad Nigh: Nah, you look like the owl behind you.
[00:02:51] Evan Francen: Yeah, the owl. I like the owl. He keeps an eye on things, I'm sure. For people who are just listening to the podcast, we run video too; I've got an owl in the background over my shoulder, checking things out. Uh, yeah man, there's so much going on. Good stuff, a bunch of stuff at the State of Minnesota. Um, I think I wrote a couple new things last week, I can't remember what they were anymore. And SecurityStudio, new features coming. Always new features coming; you're never done. So, automatic population of the crime index is coming. Did you know that? Yeah.
[00:03:38] Brad Nigh: Uh I think well I didn’t know it was confirmed. I know that you guys have been working on that.
[00:03:44] Evan Francen: Right? Yeah. Well, because that's one of the things where, when you're doing the risk assessment, when we're doing the S2Org risk assessment, you have to go outside of the program to get data, and you have to get it from a place you have to subscribe to. And I didn't like that. I didn't like an additional expense for anybody, plus it's just inconvenient. So instead we built a table, a pretty big database actually, with every zip code in the United States. For the metrics we used a variety of data: data from ADP, data from, you know, the FBI, which ADP pulls from, just a bunch of different data sources, and put it into our own table. Nice. Yeah. So now, when you enter in the address... because again, for the listeners, information security is not just IT, right? We also have to take into account physical controls, the people part, administrative controls. And the physical side of things is so much influenced by the crime rate of the location you're in. We're also going to pull that data into S2Me, which will be kind of cool. So I think a lot of people aren't really aware of the crime rate in their own backyards, in their own neighborhoods. You only see what you see,
[00:05:16] Brad Nigh: right?
[00:05:18] Evan Francen: If you don't hear sirens every day, well, it must be good. Yeah, especially property crimes. Property crimes are ones that typically happen late at night. There's not an immediate victim; in a violent crime, you know right away, right? That's when you hear the sirens. But a property crime typically gets reported the next day, the next week, the next month. And the typical hours for most property crimes, like theft and burglary, are between two a.m. and four a.m.
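For listeners curious what a zip-code crime index like the one Evan describes might look like under the hood, here is a rough sketch. The source names, weights, and scores are all invented for illustration; this is not the actual SecurityStudio implementation.

```python
# Illustrative sketch: a locally stored crime index keyed by zip code,
# pre-built by blending several (hypothetical) data sources, so a risk
# assessment tool never has to call out to a paid subscription service.

CRIME_INDEX = {
    # zip_code: blended 0-100 crime index (higher = more crime); made up
    "55101": 72,
    "55331": 18,
    "57701": 41,
}

def blend_sources(scores, weights):
    """Combine per-source crime scores into one 0-100 index."""
    total_weight = sum(weights.values())
    return round(
        sum(scores[src] * w for src, w in weights.items()) / total_weight
    )

def crime_index_for(zip_code, default=50):
    """Look up the pre-built index; fall back to a neutral score."""
    return CRIME_INDEX.get(zip_code, default)

# Building one table entry from multiple (hypothetical) source scores:
entry = blend_sources(
    scores={"fbi_ucr": 70, "local_pd": 80, "commercial": 66},
    weights={"fbi_ucr": 0.5, "local_pd": 0.3, "commercial": 0.2},
)
print(entry)                     # blended index for one zip code
print(crime_index_for("55331"))  # 18
print(crime_index_for("00000"))  # 50: unknown zip falls back to neutral
```

The point of pre-building the table is exactly what Evan says: the lookup at assessment time is a local read, with no per-query subscription cost.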
[00:05:59] Brad Nigh: Yeah while
[00:06:02] Evan Francen: you’re sleeping,
[00:06:03] Brad Nigh: What is it everyone always says? Nothing good happens after two a.m. Yep,
[00:06:11] Evan Francen: Yep. And sadly, a friend of a friend of ours from church, he actually passed away. He was only 21 years old. Speaking of after two a.m., it was 2:30 a.m. He and a buddy were drunk driving, you know, and weren't wearing their seatbelts, flipped the car. They were both ejected, and he didn't make it. That was 2:30 in the morning. So again, two a.m., man, it's time to maybe be locked up at home doing work, right? Or sleep.
[00:06:42] Brad Nigh: Either working or sleeping, that's about all you should be doing, right?
[00:06:48] Evan Francen: or burglary.
[00:06:49] Brad Nigh: You shouldn’t be doing
[00:06:51] Evan Francen: that. No, you shouldn't burgle. Don't burgle, don't...
[00:06:56] Brad Nigh: Burglars burgle... is that a word?
[00:06:58] Evan Francen: I don’t know the word.
[00:06:59] Brad Nigh: It’s fun to say.
[00:07:01] Evan Francen: It's kind of fun to say. I have never burgled. Nope. Done a lot of things, but that's one I haven't. All right, so, family? You're healthy, other than your sinus infection today?
[00:07:18] Brad Nigh: Yeah, feeling pretty good.
[00:07:21] Evan Francen: Cool. All right. So you watched The Social Dilemma on Sunday night, correct? Yeah. All right, what did you think?
[00:07:31] Brad Nigh: You know, I'm a little torn, because a lot of what was in there wasn't terribly surprising or new. You know, we've known that these companies have these algorithms and they're doing these things, so it wasn't anything terribly surprising to me. I'm glad that maybe it's going to shine some light on this, expose it to others that don't think that way. But I don't know. I didn't like the fact that the people that made the money off of it and built these companies were like, well gosh, look what's happening. It almost came across as self-righteous. They profited, they built the stuff, right? If they really had a problem with it, why would you stay there for five years, ten years?
[00:08:34] Evan Francen: Yeah. Well, it's interesting, the different takes that I get from people who have watched it, because I've heard everything. I like hearing other people's perspective on stuff; we don't do that enough, I think, in today's society, where it's okay to disagree with people. I talked to some people who were like, oh my god, really? I mean, their eyes were opened. I talked to others who were like, okay, yeah, that makes a lot of sense. And I talked to one person who just simply disagreed with the whole thing. And I talked to some, one person in particular, and I'm not going to mention any names here because I didn't get permission, who was like, well, I don't use social media, so it doesn't really matter to me. And I was like, well, it does, because you don't live in a bubble. The people you interact with every day, at the grocery store, at the gas station, they are using social media and they are influenced by social media. And actually it's beyond influence, it's manipulation. They're manipulated by social media. So if you want to interact with others, it helps to understand where they're coming from, where they're getting their data, where they're getting their information. If they're getting it from social media, that's going to be different than you. I'll give you an example. I could talk all day about some of this stuff, but one of the things that struck me was, you know, I went to Sturgis. And I was responsible; I wanted to be a responsible adult. I don't want to put people at risk, I don't want to hurt anybody. I went there for my own mental health. You know, I needed to get some sense of normalcy; this year has been insane for me.
And so I went to Sturgis, and nobody was wearing masks, people weren't distancing. You had two camps, so diametrically opposed. One was like, it's a super-spreader event, everybody there could expose the whole world. And the others were like, it's way overblown, decide for yourself. So I went to Sturgis, and when I came back I socially distanced, actually isolated, for two weeks. I didn't get COVID, I didn't bring COVID back with me. I'm cool, at least I was after the two weeks for sure, right? I mean, I was outside of that window. So then, you know, the news stories, and I haven't read them recently, but it was, you know, 300-ish confirmed cases came from Sturgis. Alleged, right? One study. And, yes, one death. And there were 460,000 people at Sturgis, so you do the math. It doesn't seem like much of a super-spreader event to me. Yeah. And then there was another study that came out, right, that said 260,000 people were
[00:11:57] Brad Nigh: exposed or whatever,
[00:11:59] Evan Francen: right? That Sturgis was responsible for 260,000 COVID infections. It's like, huh? Yeah, 300 on one side, 260,000 on the other side. And then I read that study word for word, all 53 pages of it, and I was like, no, this is crap. But the reason why this is important, and I'm getting to the point, I promise, is if I am getting news fed to me that says 300 people, and you're getting news that says 260,000 people,
[00:12:41] Brad Nigh: Yeah. And we,
[00:12:42] Evan Francen: right? And we meet at the grocery store and you find out that I went to Sturgis. If you got the 260,000 story, you're like, you son of a bitch, you exposed everybody, look what you did, you were part of this. And I'm thinking, what are you talking about? There was no super-spreader, man, because I'm getting my news saying 300 people, right? So you already think I'm the biggest jerk, and I'm wondering what the hell is your problem. You know what I mean? And that's some of the stuff that happens with social
[00:13:23] Brad Nigh: media. Yeah. Well, you see it all the time. You get into an echo chamber, right? Where you're only following people that you know you agree with and who support your views. Well, that's not great. You need to be looking at as much as you can, looking at opposing views and understanding, trying to get the whole picture. Not just, like you said, well, the thing I saw said 300, or the study I saw said 260,000. You can't just look at one and be like, well, there we go.
[00:14:10] Evan Francen: Well, that's why I was trying to make a point to this person who says, I don't do much social media. Well, you should know that this stuff is happening, and you need to take the time. Like, if somebody is judging you and all over your ass about something and you're wondering, what's the deal? What's their reality, man? Maybe that's part of it. That's what they get fed every single day. All right, so, backing up a little bit. I thought it was interesting... I really do think that a lot of these guys and gals started these platforms for a good purpose. Like the guy who invented the Like button, you know, he said he wanted to spread positivity in the world, right? I mean, liking, that's...
[00:15:09] Brad Nigh: Yeah, you know, I don't disagree with that. I don't think these people started it with malicious intent.
[00:15:20] Evan Francen: Right? Well, I think when social media first started, the customer was you and me. We weren't the product; the product was, you know, whatever the platform was. And then one of the guys said, if any product is free, then you're the product, or something like that.
[00:15:44] Brad Nigh: Yeah, if it's free, you're not the consumer, you're the product.
[00:15:49] Evan Francen: Yeah. And so that's when the advertisers sort of came in and figured out how to monetize social media. Then it became, well, if I want my ads to have the most impact, and I want to sell them at a premium and make the most amount of money, then my data better be really, really good. So then we start tracking every single thing that people do, how much time they spend everywhere, so that my targeted ads hit you with the best possible chance of you doing the action that I want you to do, right? And if I get better at it, if I'm Facebook and I'm better at this than, you know, Instagram, then the advertisers are going to spend their money on Facebook instead of Instagram, or vice versa, or whatever, right? So whoever's got the best data is going to fetch the top dollar, and then it just escalates. And then it crosses another line. Capturing data about somebody and trying to use it to target them, we've been doing that for years. The issue is when I start feeding data back to you to manipulate you into doing things; that's when it becomes unhealthy,
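The loop Evan walks through here (track behavior, build a profile, serve whichever ad is predicted to get the most engagement) can be sketched in a few lines. Everything below, the events, topics, and ads, is invented for illustration; real platforms use far more elaborate models, but the shape of the incentive is the same.

```python
# Toy sketch of engagement-driven ad targeting: tracked behavior
# becomes an interest profile, and the best-scoring ad wins the slot.
from collections import Counter

def build_profile(events):
    """Tracked behavior (topic, seconds) -> topic -> attention share."""
    time_by_topic = Counter()
    for topic, seconds in events:
        time_by_topic[topic] += seconds
    total = sum(time_by_topic.values()) or 1
    return {topic: t / total for topic, t in time_by_topic.items()}

def score_ad(ad, profile):
    """Predicted engagement: overlap between ad topics and profile."""
    return sum(profile.get(topic, 0.0) for topic in ad["topics"])

def pick_ad(ads, profile):
    """The ad auction, crudely: highest predicted engagement wins."""
    return max(ads, key=lambda ad: score_ad(ad, profile))

# Invented tracking data: time spent on each topic, in seconds.
events = [("motorcycles", 600), ("security", 300), ("cooking", 100)]
profile = build_profile(events)

ads = [
    {"name": "bike_gear", "topics": ["motorcycles"]},
    {"name": "kitchenware", "topics": ["cooking"]},
]
print(pick_ad(ads, profile)["name"])  # "bike_gear"
```

The "better data fetches top dollar" dynamic falls straight out of this: the more behavior tracked, the sharper the profile, and the more reliably the winning ad converts.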
[00:17:16] Brad Nigh: Yeah. You know
[00:17:19] Evan Francen: and makes it addictive. They made social media addictive.
[00:17:24] Brad Nigh: Yeah, it definitely is. You know, personally, I have a Twitter account, but I'm barely on it other than to kind of use it as a news aggregator. But you do see it, where people have to post immediately about everything that happens. It's that instant gratification of posting and getting the likes, or whatever internet points the platform you're using has. So I do think you have to be careful about that, right?
[00:18:09] Evan Francen: Uh, there are so many things to account for. It was funny when they started the show. They asked these people, who are super smart people, right, that have been in the social media industry, in the tech industry, for a long time: what's the problem? And they all, like, couldn't put it into words at first, you know, just a simple statement. The problem is bigger and deeper than that, right? How do you explain the suicide rate in teenage girls aged 11 to 14 nearly tripling over the last decade, right?
[00:18:56] Brad Nigh: Oh, there’s
[00:18:57] Evan Francen: In the decades before that, it was flat,
[00:19:00] Brad Nigh: right? No, there's definitely issues. And I mean, we've talked about it; a big part of it is cyberbullying, right? Before, you had to actually physically be around someone to bully them. Now it's as close as your phone, right? Right around the clock.
[00:19:22] Evan Francen: Well, bullies often bully because they feel insecure about themselves, right? If I put you down, that elevates me, kind of thing. We dealt with bullies on the playground, and usually they're just, you know, immature, little people inside, you know what I mean? They come out fighting because if they put you down, they feel less bad about themselves. And when you do it online, you know, I'm trying to get more likes, trying to get more followers, or you're somebody I view as a potential threat, or I can get other people on my side. I mean, there are just so many motivations for bullying now that they didn't have before.
[00:20:17] Brad Nigh: Yeah. Well, I think it's easier to do because it's anonymous, too. Or it can be, yeah.
[00:20:27] Evan Francen: Yeah. And it lives forever. Like, when I would get bullied on the playground, we'd just fight.
[00:20:35] Brad Nigh: Yeah. And then you go home and it’s done.
[00:20:38] Evan Francen: My neighbor's name was Matt; I'll never forget this kid. We were best friends, and maybe once every couple of months, you know, big fight. I'd beat the crap out of him or he'd beat the crap out of me, and then later that afternoon we're out riding dirt bikes again. Now, online, it doesn't happen like that.
[00:21:07] Brad Nigh: Right, right. Well, like I said, you just can't escape it, right?
[00:21:14] Evan Francen: So there's the bullying part, certainly. And there's the part where I'm so attached to something that that's where I find my value. You know, for some people it's food, right? They're addicted to food; that's where their attachment is, they feel good when they eat certain foods. And I thought it was really interesting in The Social Dilemma how they've used and mastered the psychology of behaviors to design their applications in a way that influences behaviors. One of those that came up, and this is a side effect of it, is that Snapchat dysmorphia.
[00:21:56] Brad Nigh: Yeah. Well, and you do see that, and it doesn't help that, you know, people are doing all these filters and Photoshopping, and it's just incredibly easy to do, right? It's that false impression of what real is.
[00:22:18] Evan Francen: Well, totally. And like, I have a 15-year-old daughter, and you have daughters, one that's a teenager now. I think it's different for boys and girls, right? Girls like to be complimented on their looks, you know what I mean? Boys, it might be accomplishments or sports, and sometimes there's a mix between the two. I'm not saying that it's right; I'm speaking in generalities. But it's sad to see that I get my gratification from what you think I look like on a computer screen and not from who I am as a person,
[00:23:07] Brad Nigh: right? Yeah, I agree. I don't like it, but it's kind of where we're at right now.
[00:23:21] Evan Francen: Well, but we're going to get to that, because I think we can undo some of this; we can make it better. What I'm not cool with is when there's something that's broken and people just sit there going, well, shit, excuse my language, that's just the way it is. That's not good enough. Because even if it's not influencing me, it's influencing the people I interact with every day. True,
[00:23:44] Brad Nigh: you know, going back to The Social Dilemma, the movie, and what you mentioned about not doing anything: I did find it interesting that Netflix is producing this, and, you know, they had a prize, it was like a million dollars, to build a better recommendation algorithm, right? But they don't mention themselves in the film, and they do the exact same thing. And the other thing that came across: you had all these people saying, well, it's horrible, but with no real solution other than just put your phone down, quit doing it. There were no action items other than, well, just don't do it. Right? So how do we...
[00:24:42] Evan Francen: I think, I think I've got some action items for us towards the end, and I'd like to talk about those, because I agree. I think we need to do something, and it needs to be us, it needs to be people, not the companies. If you're expecting Twitter or Facebook or Instagram to fix this themselves,
[00:25:04] Brad Nigh: know what I mean?
[00:25:06] Evan Francen: They’re making billions and billions of dollars, why are they going to change it?
[00:25:11] Brad Nigh: Oh, absolutely. You know, they're in it to make money. But, I don't know, like I said at the very beginning, there were some good parts of the movie. It brought to light a lot of things for, I'm going to use the air quotes, the "normal" people, that they may not have realized. You know, there were some points where my wife was like, oh, I didn't know that, and to me, I was like, this has been the case forever. So I think it could potentially bring some positive light to it. But overall, I don't know, it just felt preachy. Honestly, it felt like kind of an ego boost for the people that were on it, right? Like, well, we're so good because we're calling this out now.
[00:26:09] Evan Francen: I love the fact that you and I see it differently, because that's what makes it cool, man. Honestly, I don't take offense at people who see things differently. I saw it as a group of people who wanted to do good and then realized, oh my God, what did we just create? And had a change of heart. That's the way I read it.
[00:26:36] Brad Nigh: And I could totally see that, right? But there was just a bunch of stuff. So I started looking at, you know, the list of all the people that were featured on there, and I started doing some digging. Like, I had to look up who it was... Tim Kendall. He's the CEO of a company that helps you reduce your screen time, but they don't mention that. It feels like it wasn't completely truthful, right? Like it was doing some manipulation itself. And, and again,
[00:27:24] Evan Francen: I think everybody’s got a bias, but
[00:27:26] Brad Nigh: again, overall, I'm hoping that it can bring some light to some of these things that people don't realize, but okay.
[00:27:39] Evan Francen: Yeah. Well, okay, so, trying to get past the motivation, or trying to understand the motivation of the people that were in the movie: if you look at the movie just for the facts, for the things that are actually happening with social media, what in there can we dispute? Because I can't really find anything; it's all completely feasible. And what it did for me was tie together a lot of things. Like you, I don't like social media much. I try to get other points of view. I really dislike making decisions based on opinions; I like making decisions based on facts. I don't make decisions on secondhand information. I'll give you an example. Last week, one of our executives came to me and was explaining what they think another executive is thinking or feeling. I was like, hold up. If that executive is having an issue, they can come talk to me. I use firsthand information, not secondhand information. I'll consider it, but if you're telling me something that somebody else said, and I'm going to operate under that premise, I'm probably going to get it wrong, because I'm interpreting what you're saying about what somebody else is thinking. Why don't I just go to the person who's thinking the things you say they're thinking? You know what I mean? So when I looked at The Social Dilemma, you see all these things happening in the world around you, and I'm always trying to figure out the facts, what the hell is actually going on. Take COVID, take any of the events of this year, take the social justice movements, whether it be Black Lives Matter or antifa or whatever else. You look at this stuff and I'm like, what the hell? How does this make any sense?
[00:29:51] Brad Nigh: Well, I think one of the big issues with social media is it makes it really hard to differentiate between fact and opinion, right? Like, there are just
[00:30:03] Evan Francen: alternate realities,
[00:30:05] Brad Nigh: right? Exactly. People are throwing so much out there, it's like, what's the reality, like you said? And so I think that's a big, big issue. And that's the other thing: if you look at, you know, reputable news sources, they have a vetting process, they have editorial review, and
[00:30:26] Evan Francen: who’s a reputable news source now?
[00:30:29] Brad Nigh: Well, I'm just talking traditionally. Like, I'm just looking at the Washington Post or the New York Times, some of those bigger newspapers or news organizations, right? But they have a process, regardless of who it is. There's an actual process they have to follow.
[00:30:47] Evan Francen: Yeah. So the thing that I've learned is everything is for sale. Everything. Your information, you as a human being, are for sale. That one doctor, what was her name? I can't remember. She was the one that talked about...
[00:31:10] Brad Nigh: the human futures.
[00:31:11] Evan Francen: Yeah. Because the Washington Post, the Washington Times, Fox News, CNN, all these news organizations: go through one of their websites, right? Do you not see ads? If I go to the website, I probably see different ads than you see, because the ads are served up specifically for me. They're also influenced; it's all biased, right? So I don't even know what a reputable news source is anymore, because they feed me different realities. You know what I mean? So if you want to find real facts, you have to do the work and go find real facts.
[00:31:52] Brad Nigh: Oh, I would agree. Don't just trust one source, right?
[00:31:58] Evan Francen: Yeah. Go back to the 260,000 study, or whatever the number was, of people that were infected because of Sturgis. Every major news organization that I was able to find went with that story as-is, basically. CNN, Fox News. It was one that crossed over the political spectrum, which is interesting; left and right were both taking it and running with it. And I'm like, what the hell? So you read the damn report and you're like, this is a crap study. Sorry. It's not peer-reviewed, and the metrics that go into the conclusion don't correlate. So I don't know how the hell you came up with 260,000, and everybody ran with it like it's fact. And then after a while, after the dust settled a little bit, you see a couple of news stories come out like, yeah, that report was probably BS. But nobody follows the retractions.
[00:33:07] Brad Nigh: That's, I think, part of the issue. We've gotten into this 24/7 news cycle, probably driven by social media more than anything, and it's just a rush to be the first. So many people just want to be the first that they don't always take the time to do the checking, to verify what's going on.
[00:33:41] Evan Francen: Right? So why do they want to be the first?
[00:33:45] Brad Nigh: Well, because that's what drives your money, right? If you are the first, you get cited; it drives revenue, drives traffic to your site.
[00:33:53] Evan Francen: It's back to money, right?
[00:33:55] Brad Nigh: Right. Oh, absolutely.
[00:33:58] Evan Francen: So, in the information age... I think you and I remember when news seemed to actually be news, right? When it wasn't so heavily influenced by advertisers, by social media. My wife and I were talking, and she says, well, money is the root of all evil. And I'm like, that's not true; it's the love of money that's the root of all evil. Money's not bad, right? It's when you start to love money, when you compromise your morals and ethics and all these things for the dollars. And it seems to me that social media started off probably with good intentions. Even Facebook, right? Facebook was a campus tool for interaction. I doubt that Zuckerberg thought, oh, this is going to be a multi-multi-billion-dollar idea where I can influence elections and all this stuff. That's what it became. Somewhere along the line, I think there was a compromise: oh, money seems pretty damn cool, I can do things with it, insulate myself from the rest of the world. You know what I
[00:35:20] Brad Nigh: mean? Yeah, I know. Yeah,
[00:35:29] Evan Francen: it’s
[00:35:31] Brad Nigh: a mess, is what it is,
[00:35:33] Evan Francen: and it's not going to get better, right? You and I know how math works, you and I know how algorithms work, you and I know how AI works. AI just continues to progress: more data, more learning, refinement of the decision trees or whatever. Feed more data to the AI, and the better the AI gets.
[00:36:02] Brad Nigh: Yeah, but again, like we've talked about, the big issue with AI is it's built with the biases of whoever created it, you know. Right, and
[00:36:11] Evan Francen: in this case, if the motivation is, I don't give a crap about you, you're the product, right? Yep, that's a scary AI, man.
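Brad's point, that an AI inherits the biases baked into its training data, holds even for the most trivial possible "model": a per-group majority vote. The groups, labels, and counts below are entirely made up; the only point is that skewed labels in means skewed decisions out.

```python
# Minimal illustration of bias inheritance: a "model" that just learns
# the majority label per group will faithfully automate whatever skew
# existed in the labels it was trained on.
from collections import Counter, defaultdict

def train(examples):
    """examples: (group, label) pairs -> per-group majority label."""
    labels_by_group = defaultdict(Counter)
    for group, label in examples:
        labels_by_group[group][label] += 1
    return {g: c.most_common(1)[0][0] for g, c in labels_by_group.items()}

# Invented, deliberately skewed training data: whoever produced these
# labels mostly denied group "b", so the model learns to deny group "b".
training = (
    [("a", "approve")] * 9 + [("a", "deny")] * 1
    + [("b", "approve")] * 2 + [("b", "deny")] * 8
)

model = train(training)
print(model["a"])  # "approve"
print(model["b"])  # "deny": the labelers' bias, now automated
```

A real system uses far richer features and models, but the failure mode scales right along with it: more data makes the model better at reproducing whatever the data contains, including the bias.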
[00:36:21] Brad Nigh: Yeah. Yeah, I don't think we disagree that there are some, you know, unsettling things out there. At this point these companies see people as the product, as a... I'm not sure "consumable" is the right word; I'm struggling here. But it is an issue, and like I said, the one thing I do like about The Social Dilemma is I think it's bringing a lot of this stuff to light for people that had no idea or weren't aware of it. Whether it's intentional or they just didn't know, and I would be willing to bet the vast majority just didn't realize it, right? They just don't understand this.
[00:37:21] Evan Francen: Was there anything in the documentary that you disagreed with, any of the conclusions? I mean, I know the bias is a thing. Yeah,
[00:37:30] Brad Nigh: no, I mean... there were certain things about the way they did it. Like, you know, they had those three guys standing at that
[00:37:40] Evan Francen: that was done. But
[00:37:42] Brad Nigh: You know, so I didn't like some of that stuff. I felt like that was misleading. You know, it's like, "oh, well, we're gonna do this stuff to make this happen." That's not how that works, and
[00:37:55] Evan Francen: But it's not all that unlike AI, is it? I mean, for the normal people who don't understand AI, if you were to show them the math, they would never understand that, but they can relate to this.
[00:38:07] Brad Nigh: Yeah, yeah. I don't know, I just didn't like that; I felt like it wasn't real good. But overall, I think there were a lot of really good messages out of it.
[00:38:29] Evan Francen: So you didn't like the three guys and the rest of it. But again, is there anything that you disagreed with?
[00:38:37] Brad Nigh: No. Overall, like I said, it's going to bring to light a lot of things that some of us, you know, security, privacy, tech-focused people, probably knew about and understood. So I think there's a lot of really good that could come from this.
[00:39:02] Evan Francen: What do you think? What bad do you think could come from it?
[00:39:09] Brad Nigh: You know, I think the biggest concern that I would have is that people are gonna see this and take it as gospel, right? Like you said, they're not gonna go look at and understand maybe another source. Like I mentioned, what was the motivation of some of these people? I think that's important regardless of what it is, whether it truly is virtuous, "I made a mistake, I tried to do something good and now I want to bring it to light," or "hey, you should put your phone down, and guess what, my company helps you put your phone down," right? So I would hope that people who watch this, and it is eye-opening, start doing the research. Because if they just take it as gospel and say, "no, this is how it is," well, that's exactly what's going on with social media. The movie is doing exactly the same thing.
[00:40:08] Evan Francen: Well, that's why I'm asking you. And it may come off like it's a challenge, and it's not meant to be that. I truly do want to know if there's anything you disagree with that was in that documentary, because that might push me towards doing additional research. Because one of the things I've learned crystal clear this year, and I've always known it, but it really rose to the surface, is that when people can't defend their point, they typically do one of two things: either change the subject, or start attacking your character. And the reason I'm asking this isn't to see if you can defend your point. I want to learn more so I can defend mine. You know what I mean?
[00:40:54] Brad Nigh: So funny. So I watched it, and, I mean, you could ask my wife, she'll tell you, I was getting a little fired up at some of the stuff, just because of how it was presented. Not necessarily the content, but how it was presented, and some of the biases behind it, because that is important, right? I mean, that's the whole point of social media, and they were doing that same thing. So I did do some digging and looking, and there was what I thought was a really good article on Techdirt. I don't know if you've ever really followed that; I'll throw it in the show notes. I thought it did a good job of expressing some of the frustrations I have with it. They're not saying that what was in there is not true, but it calls out some of the hypocrisy, I think. So that was my biggest issue with it: this movie was meant to call out and say, "hey, look, did you know this stuff is happening?" But it turns around and is doing the exact same thing.
[00:42:11] Evan Francen: Yeah, I don't know, is it the exact same thing? Because the thing that I take issue with is people using my information, or information of other people, in an unsuspecting way to manipulate them psychologically.
[00:42:36] Brad Nigh: And I agree that that is,
[00:42:39] Evan Francen: I didn't feel like the movie was trying to manipulate me psychologically to do anything. I think what it was doing was calling awareness to it. And the challenge in making the movie this way is you have to reach a varied audience, right? You have the tech audience, who's like, "yeah, I could do without the story of the family and all that other BS," whatever. But then you've got the non-tech people who resonated with that story. I'm guessing that there were probably people who had no idea how AI actually works at all.
[00:43:15] Brad Nigh: Yeah,
[00:43:16] Evan Francen: They look at those three guys and go, "is that sort of how AI works?" Well, sort of, not really, but we're getting you closer.
[00:43:25] Brad Nigh: Yeah, well, I'm not disagreeing with the overall point, right? I think that it did have good information. I just didn't agree with how it was presented.
[00:43:42] Evan Francen: Well, the one thing I like about this is, what's our common ground? Our common ground is we both agree with the point.
[00:43:53] Brad Nigh: Oh, absolutely. Like, there's a reason I've never had Facebook and I don't have Instagram or Snapchat or whatever the, you know, thing is, for that exact reason. What they said in there, it's true; I absolutely, fully understand that. The reason I have Twitter is for my professional requirements more than anything, not because I want to do it or enjoy it, right? So I'm not disagreeing with that.
[00:44:25] Evan Francen: Yeah, and this is like anything, right? There's a difference between using something and being addicted to something,
[00:44:29] Brad Nigh: right?
[00:44:31] Evan Francen: Like, that's the reason why I don't drink, right? I'm not good at using alcohol, I abuse it, I'm addicted to it. But with social media, it's something that's never been addicting to me. It just doesn't latch for some reason, which is great, because I don't want it to be. So I can use social media. I have a Facebook account, and I may be on there once a week, just, you know, what's up. I actually follow some Harley-Davidson groups on there just to see what's going on with motorcycles and such. And Twitter, you know, is a pain in the ass, because there are just so many stupid things said on Twitter.
[00:45:22] Brad Nigh: All right, well, that's what we were just talking about. With all this stuff, I totally agree with the premise, right? If you're not paying for it, you are the product, 100%. You know, it allows for this cyberbullying, and the anonymity allows for just hatred, misinformation, whatever it might be. I'm not arguing that at all. But yeah, like you said, I think it's interesting: we do agree on the central premise. I think the big disagreement is maybe how it was presented and how it was portrayed. Which, hey, that's what makes it great, is we can have different viewpoints, but at the end of the day we're both kind of saying the same thing, with the same goal, right?
[00:46:22] Evan Francen: Well, that's the exercise too, I think, in the podcast today. Disagreeing with the way things are presented is one thing; disagreeing with the central point is another thing. It's about trying to find common ground so that you can progress: okay, what are we going to do about it? And, you know, the way things are presented, I get that, man. I mean, there are certain things that I watch, like I'll sit down and watch a show with my wife and I'm like, "what the hell are we watching, this is stupid," and she's enthralled with it. Well, that's okay, that's taste, right? But the central point is that social media is being used to manipulate people, and we gotta do something about it.
[00:47:19] Brad Nigh: Yeah. Well, I think the awareness, shining a light on it, is going to be one of the biggest factors, and that's what this movie is doing, and I think that's a good positive out of it. I don't know how many times I've already said it today: I think it's going to bring this to people who never even considered it, who had no understanding of it at all. And if they are now aware of it, understand it, and that makes a positive change for them, then that's great. That's a great first step.
[00:48:01] Evan Francen: Well, the addictive nature of social media, too. You know, I have a 15-year-old daughter, and I am maybe a little more liberal than other information security people in how she can use social media. And there are different ways of raising kids, and there's nothing wrong with that. I'm one of those parents who will let their kids fail in order to learn, right? But I'm always a safety net. Like, in our house, when you're 18 years old, if you're not going to college or doing other things, then you're out, right? I've taught you everything I can teach you; you can make a living. So with my daughter and her social media use: a few years ago I grounded her, because she snuck out and I caught her, because I get up at all hours of the night, right? I had gotten up at like two o'clock in the morning to go do some work, and here comes my daughter walking in with her friends, and I'm like, "what are you doing?" She got grounded for a week, and part of the grounding was, I'm taking your phone away. Knowing the psychological impacts of taking a phone away from a teenager, there's actually research that you can cause some serious trauma, not unlike PTSD, from taking a phone away from a kid. So I said, from this hour till this hour every night, you're not going to have your phone. Well, she heard me say that she wasn't gonna have her phone for a week. She didn't hear me say "between this hour and this hour." So she darn near had a nervous breakdown, because she thought I was taking her phone away, period, for an entire week.
[00:50:03] Brad Nigh: Sorry, the kids just went to the bus stop and the dogs are going insane, so I had to go yell at them real quick. But yeah, no, I agree. I think you do have to trust your kids, right? You hope you've raised them right and they're going to do the right things.
[00:50:28] Evan Francen: You know, the part that surprised me was just the mental reaction to thinking she wasn't gonna have her phone for a week. You would have thought I'd lopped off her right arm.
[00:50:41] Brad Nigh: Yeah. You know, it's funny, my daughters aren't like that. We'll go out and they'll leave their phones here, or they'll let their phone's battery die and then plug it in. Which, by the way, really? I'm like, it's at 2%? How? What? But you know…
[00:51:03] Evan Francen: 100%
[00:51:04] Brad Nigh: thanks. But you know, I think, you know, setting expectations up front. You know, my oldest one does have uh an instagram, You know, if her friends all had it and she was kind of getting left out, so, you know that she got it when she was 13, which is the terms of service and I know she has friends who have younger siblings like 67 years old that have instagram accounts that, to me is an issue you’re as a parent, you’re now, also you’re failing your kids. If you let them on social media at 67 years old, that’s not cool. Like you’re just setting them up for all this stuff we’ve been talking about. So, you know, I think getting setting those expectations up front, uh, you know, being open and honest with your kids, talking to them, communicating with them, understanding their behaviors and being able to tell when there’s a difference when something changes because if they’re being bullied at school, it’s usually pretty easy to tell, right? Somebody’s gonna see if they’re gonna, it’ll get around, they’re getting bullied online, it’s a lot harder to see. You may not even know. Right? So just being, having open communication and trust with your kids is, you know, really important, especially with ground social media.
[00:52:36] Evan Francen: So we're coming up on time. I think we do a part two for The Social Dilemma; this has been a great discussion, and I've learned some things in talking with you. And the reason why this is important from an information security perspective, well, there are lots of things, actually. We've got some serious privacy issues here, where people are using my data without my explicit consent. Yes, there is a terms of service; yes, there was something you clicked. But when I say explicit, I mean known to you, that you made a conscious decision: "I am going to trade all of my data for this thing, knowing the consequences of that." That really hasn't been fully understood by, you know, many people. And I'd still like to discuss what we can potentially do about this, because I'm not one of the people who's like, "oh well, it sucks." You and I are both not like that.
[00:53:42] Brad Nigh: agree. I think part two where we think more focused on what are the options, What are the solutions? How do we move forward from here would be a really good discussion as well.
[00:53:56] Evan Francen: Yeah, I agree. So let's do that for next week. This week was a great discussion. What's that, next week?
[00:54:05] Brad Nigh: next week? 101 100. That’s insane. I
[00:54:09] Evan Francen: know, man, I don’t yeah, crazy. Uh so if you haven’t looked yet, uh if you go to Evan francine dot com, you’ll find the episode 99. These show notes, we’ve got, you know, some additional notes that I made. I did create a list of the sources including Tristan Harris, Jeff Seibert, bailey, Richardson so on. And you know, the people that were actually in that movie, if you want to go down the path of trying to figure out bias is a little bit more, you know, maybe that’ll help uh you haven’t seen the social dilemma yet. I highly suggest you do sit down, spend the hour and a half.
[00:54:51] Brad Nigh: Yeah. And I’ll, even though I didn’t necessarily proof that was presented, I agree you should, it’s a good, it has a good overall, you know, I think intent, but, and I think it’s gonna bring a lot of light to a lot of people.
[00:55:07] Evan Francen: Yeah. Social media is so ingrained into our society, and one of the things I don't want is for people to be ignorant. I don't want you to just think, "oh, it's just the way it is, I open Facebook every morning and blah, blah, blah." People are not sheep, even though they act like it. We're not. We have a left brain and a right brain, we have logic and reason, we have creativity. I'm not willing to sit by and let people be used like sheep, man. I'm just not.
[00:55:42] Brad Nigh: Well, I still go back to the men in black quote of a person is not is not done but crowd or a group of people are, you know, panicky, not great. So it’s individual. Sure. But as a group,
[00:56:00] Evan Francen: right? But so sit down and spend the hour and a half consider at all. Just like we are right. And it’s okay to have different points of view. I don’t know why we got to this point in our society where it’s not ok for somebody to disagree with help. You
[00:56:15] Brad Nigh: Just have a yeah. Have a civil conversation like you, you know, I’ve had you said some things that I’m like, oh, you know, I didn’t consider that. That’s a good point. But it’s pretty clear. We didn’t we don’t we’re not seeing 100% on on this. It was okay. Yeah. It’s like it was it was a constructive conversation.
[00:56:38] Evan Francen: Absolutely. And one of them and I like that too because you set the example hopefully for others to have the same because one of the things you opened my eyes to which I didn’t consider enough I think is the bias behind things. I will go do some research and I will think okay because I want to I’ve got it’s like a meal right? I got to meet, you know, and now everything else is kind of like the spices on the meat. You know, it’s really going to influence how it tastes. It’s really going to influence how it affects me in my brain.
[00:57:07] Brad Nigh: That’s a great analogy.
[00:57:09] Evan Francen: Yeah. Thanks man. See the left, the left brain and the right brain work. We’re not sheep. Yeah, none of us are. Uh so go watch it. I had three news stories. I wanted to talk about their on show notes. We’re not going to talk about them now because we did spend a really good conversation today
[00:57:29] Brad Nigh: and we didn’t even talk about the U. H. S. Ransomware. No, God,
[00:57:36] Evan Francen: there’s so much to talk about what we could do a daily podcast easily. You know, if we had a yeah, maybe maybe that’s what I do in retirement. Let’s just do a daily podcast. Just what the hell is going on today? Um Yeah. All right. But the three news stories was Windows XP Windows Server 2003 source code leaks online. If you’re using XP or 2003. My God, what the hell? Get off them. It’s 17 years, 18, 20 years past that now uh from when those were released. Uh So if you whatever, it’s an interesting story, go, you know, google it and you’ll find some stuff. The trump administration’s ticktock ban has been delayed uh again more interesting stuff, but to other social media app that it’s crazy. Yeah. And I thought another thing that was relevant uh, to today’s discussion, maybe we talk about it next week is FBI and sees a warn of election results. Disinformation campaigns. That’s really interesting because our information is being sold to the highest bidder and then I can feed that I can feed you information or disinformation, which that leads to population.
[00:58:52] Brad Nigh: Yeah. Right into what we just into the movie and the point of it. So I think that’s a that would be part of that conversation next week.
[00:59:00] Evan Francen: Yeah. So I’m looking forward to, man. This is they succeeded in my expectations. I came in this morning thinking one I was frustrated because my schedule is all messed up because you lost internet access. But nothing you can do about that, nothing you could do. So I don’t blame, I blame whoever you’re sp is uh so
[00:59:19] Brad Nigh: frustrated.
[00:59:20] Evan Francen: I know, man, isn’t it? It’s just got and it throws everything off. But we’re good. And this was a better episode than I was expecting. I really enjoyed it. And he shot out for you this week.
[00:59:32] Brad Nigh: Uh you know, it’s just so many. I can’t take one. Um I’m gonna give you a shout out for this conversation today. I agree with you. This was thanks to I didn’t I didn’t know what to expect coming into this and this was a lot of fun.
[00:59:52] Evan Francen: Thank you, appreciate it. Right? Yeah, yeah, I could return the favor man, I give a shout out to you, then uh I’ll tell you, I’ll tell you why you’re in a position where it’s difficult. Uh people don’t realize the stuff that you do every day. Um You’re not frontlines anymore, meaning you’re not out, you know, serving customers and being the face of our secure anymore. And you’re also not truly like managing a group, You influence a group really heavily uh and you’re kind of in between the executive leadership team and the senior management team and it’s a difficult kind of spot that you operate in every day and I really appreciate what you do, man. Thank you. Yeah. All right. We’re very grateful for our listeners and we love hearing from you, send us messages by email at un security at proton mail dot com or check us out on twitter. It’s @UnsecurityP. Uh we promised to use our twitter account to manipulate you as much as we can. Uh so if you follow us watch out, we’re gonna start sending you a bunch of weird stuff
[01:01:00] Brad Nigh: like the show notes and you know, right?
[01:01:05] Evan Francen: Exactly. If you want to socialize with me or brad directly. We uh we sort of there you I’m @EvanFrancen Brad’s @BradNigh. We work for people, we get paid by uh you know, we get paychecks mostly. And if you want to follow those people. security studio @StudioSecurity and FRSecure is @FRSecure. Uh that’s it. We’ll talk to you again next week.