
The Edge Episode 3: I Know Where You Live

Subscribe to The Edge on Apple Podcasts, Spotify, and YouTube.

Laura and Leah worry about their digital presence. How much could someone find out about their private lives based on their online behavior? With the help of Steve Trush and Sean Brooks of the UC Berkeley Citizen Clinic, they discover their cyber-insecurities and clean up their acts.

Show Notes:

  • The Citizen Clinic at UC Berkeley
  • Have I Been Pwned?—Site that identifies if you’ve been part of a data breach
  • MyLife—Website that gathers personal data and forms a dubious reputation score 
  • Berkeley News Center article about the data breach at UC Berkeley
  • Get your own YubiKey
  • Get a password manager—1Password is the one Laura now uses

This episode was produced by Coby McDonald and written and hosted by Laura Smith and Leah Worthington.

Special thanks to Pat Joseph, Steve Trush, Sean Brooks, Ralphie the dog, Brooke Kottmann, and California magazine interns, Steven Rascón and Grace Vogel. Art by Michiko Toki, and original music by Mogli Maureal.

The Transcript:

LEAH WORTHINGTON: Laura, can I ask you something … personal?

LAURA SMITH: Always.

LEAH: Are you sure? 

LAURA: Yeah, Leah, of course. Go ahead. 

LEAH: Has your … data been breached? 

LAURA:  Oh. Is that really a PERSONAL question? I thought you were going to ask me how often I shower or something. 

LEAH: What could be more personal than your PERSONAL INFORMATION falling into the hands of nefarious actors? 

LAURA: I guess it’s just that “data breach” doesn’t sound like a personal problem—It sounds more like a tech issue; you know what I mean? 

LEAH: Oh, it’s personal. I’m talking about your address, your Social Security number, your baby’s birthday …

LAURA: Okay, leave my baby out of this. 

LEAH: Think about it: Every day you spread your personal data all over town with hardly a second thought. 

LAURA: Leah, what kind of person do you think I am? 

LEAH: It’s not just you, Laura. We all do it. We share information about ourselves with banks, credit card companies, our doctors, our schools. And we assume that it’s secure. Take UC Berkeley for instance. Tens of thousands of students, teachers and staff entrust the university with their personal info. You know, because you have to submit that stuff when you apply and register, or when you set up your tax info with HR, or whatever. And all that data is stored somewhere. 

LAURA: Yeah, hopefully somewhere safe. Like … in a safe. Behind a painting. 

LEAH: Well … But it’s all digitized, so, instead of a safe behind a painting, it’s more like a computer with encryption. 

LAURA: You lost me. 

LEAH: Okay, how about we do this with a story. 

LAURA: Great. Now you’re talkin’ my language.  

[THEME MUSIC]

LAURA: This is The Edge, a podcast produced by California magazine (and the Cal Alumni Association). 

LEAH: Where we bring in UC Berkeley experts to talk about ideas on the cutting edge and a few that go too far.

LAURA: In this episode, we’re going to find out just how cyber-insecure we are and what, if anything, we can do to protect ourselves. I’m your host Laura Smith …

LEAH: And I’m also your host, Leah Worthington. 

[MUSIC OUT]

LEAH: So in December of 2015, some hackers broke into the UC Berkeley financial system and accessed the personal data of 80,000 people. All of a sudden, the Social Security numbers, bank data, and other info of students, employees, vendors, etc. was in the hands of dark-web tech pirates. 

LAURA: Whoa, that does not sound good.  

LEAH: Yeah, it was a really big deal. 

LAURA: So, how did they get the data? I assume they didn’t, like, break in and steal computers, right? 

LEAH: Well, funny you should ask. That actually DID happen at UC Berkeley, but that was a few years later, and I’ll tell you about that in a minute. In this case, you’re right. They didn’t grab the computers. Basically, there was a flaw in the financial system software, and Berkeley actually KNEW about it and was working on fixing it when the attackers … exploited the weakness. 

LAURA: Those jerks. They totally caught us with our pants down.  

LEAH: It’s true. And it was a huge security headache. The university had to provide everyone with identity-theft monitoring and identity-theft insurance. And then, of course, they had to fix the software. And it wasn’t the first time. In 2009 hackers made off with the data of 160 THOUSAND Cal students, alums, and others.

LAURA: Oh, no.

LEAH: In that case the breach wasn’t discovered for SIX months, so these hackers had access to the data for that entire time. 

LEAH: AND …

LAURA: Really? There’s more? 

LEAH: In 2005, a guy just walked in and swiped a laptop that had personal data for over 100 thousand students. Although, in this case, the university says they got it back before the info could be taken. 

LAURA: Okay, small victory. 

LEAH: AND … 

LAURA: Leah, stop! This is just far too much hacking. 

LEAH: Six months before THAT, hackers got access to Berkeley research that had the data of 600 THOUSAND— 

LAURA: Okay, okay, okay … I’m done. This is just far too much hacking. But seriously. If Berkeley, a massive research institution with some of the world’s leading info-security experts, is vulnerable to cyberattacks, then what’s the hope for someone like me? 

LEAH: I know. Okay, but before you lose all hope, I do have an idea that might make you feel a bit better.

LAURA: Oh, yeah? What’s that?

LEAH: Okay, what if we treated our online health the same way we treat our physical health? Think of it like this: Just like our bodies, our “digital selves” need regular maintenance to stay healthy. We go in for checkups, we immunize ourselves against dangerous bugs …

LAURA: Okay, I think I see where you’re going with this. Hackers are like viruses that damage your online immune system, leaving you vulnerable to worse illnesses … like credit card fraud.

LEAH: If we think of our virtual selves as something you have to take care of proactively, in the same way you might go to the doctor for a checkup, there are actually some pretty simple things you could do to make sure you’re at peak online health. 

LAURA: Okay, so … how do you do that? 

LEAH: Well I know a magical place filled with benevolent hackers who I think can help us figure this out. 

LAURA: Okay, so do you guys want to introduce yourself? 

STEVE TRUSH: Yeah. I'm Steve Trush. I am the Deputy Director of the Citizen Clinic here at UC Berkeley's Center for Long-Term Cybersecurity.

SEAN BROOKS: And I'm Sean Brooks, the Director of the Citizen Clinic.

LAURA: Oh, yeah, we forgot to introduce Ralphie. Ralphie is a really cool-looking dog. Like, he looks like the creature from The Neverending Story.

LEAH: Oh, my God. He does. Yeah, I know. You're right. Yeah.

STEVE: Ralphie is the Citizen Clinic Mascot.

LAURA: So the Citizen Clinic is a public-interest cybersecurity clinic at UC Berkeley that works with “politically vulnerable” and low-resourced groups. Think: journalists, human rights organizations, nonprofits—groups that might be targeted for online harassment, or malware or security breaches, but don’t have the financial resources to protect themselves. 

LEAH: And my first question is: Who are these hackers anyway?

LEAH: Personally, I’m picturing, like, masked guys in an abandoned arcade maybe somewhere on Coney Island …

LAURA: Do you get ALL of your information from Mr. Robot?

LEAH: No comment. Back to the experts …

SEAN: When you look at big, corporate email hacks where you're talking about hundreds of millions of addresses, I mean, by and large, these are members of organized crime groups. 

SEAN: Sometimes you'll hear about sort of big breaches done by activists or by, you know, people doing it for the lulz just, you know, trying to make a point. By and large, those are sort of less insidious. 

LEAH: I imagine that hackers aren't going after people, individual people, one by one unless they have some reason to?

SEAN: Yeah, one of the things we tell our students a lot is that all attackers have a boss and a budget. And those things mean different things to different attackers, right. That budget might not necessarily be money, but it could be time. 

STEVE: So what would you be like most worried about someone finding?

LEAH: I'd be worried about, like, bank information, like Social Security numbers. 

LAURA: I also would be super pissed if my Social was available online. 'Cause that seems ... you guys are kind of smirking.

SEAN: I mean, the reality is, is that Social Security numbers are available really widely for everybody. I mean, it's a really commonly breached piece of information. And one of the reasons is, is that it can't change. Right, you know, you have an email address—it gets compromised. If you feel like it's super unsafe, you get a new email address. The things that can't change: your birthday, your Social Security number. You know, those things get out there. And it's really hard, if not impossible, to claw them back.

LAURA: Speaking of things that get out there … passwords. According to Steve, we’ve been doing passwords all wrong.

LAURA: How bad is it if my password is the exact same ... for everything?

STEVE: Really bad. It's probably the worst thing.

LAURA: Oh, God. Oh, I don't do that.

LEAH: Asking for a friend. 

LEAH: What about, like, not the same but like, a pattern?

STEVE: Not the same, but a pattern ... So, common things are like, you change the last number, or, like, you add some, like, punctuation at the end of it, things like that? That's also pretty bad. If I knew what the root of your password was, I can easily write a script that can try out, you know, adding numbers to the end of it or adding different combinations of punctuation.

SEAN: You know, there's a concept for password strength, right, about entropy. 

LAURA: The general idea is that, as you increase the complexity (or entropy) of your passwords by using a longer, more varied string of characters, you increase the number of attempts it would take an attacker to break it.
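For a rough sense of the math, here is a back-of-the-envelope sketch in Python. The password shapes are made-up examples, and the estimate (length times the log of the character set) is an upper bound for human-chosen passwords, but it shows why tacking a guessable digit onto a known root adds almost nothing compared with simply making the password longer and more varied.

```python
# Back-of-the-envelope password strength: entropy ≈ length * log2(alphabet size).
# Illustrative only; real attackers use wordlists and known patterns, so these
# figures are an upper bound for human-chosen passwords.
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    return length * math.log2(alphabet_size)

base = entropy_bits(8, 26)                       # eight lowercase letters
print(f"8 lowercase letters:           {base:.1f} bits")
print(f"...plus one guessable digit:   {base + math.log2(10):.1f} bits")
print(f"16 chars, 94-symbol alphabet:  {entropy_bits(16, 94):.1f} bits")
```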

LEAH: So then we should all change our passwords to super long, complicated things like “xyz!..*”, right? 

LAURA: If you can’t say it, then it can’t be your password.

LEAH: That’s a really great rule.

SEAN: Generally, a strong password doesn't get you very far. You can have a strong password; it is a good thing to have. But given the frequency at which they're breached, and frankly how, like, bad of a tool they are, right, because passwords are hard to remember, it's hard to remember a lot of them. It's hard to constantly be using them and logging in. So we develop lots of shortcuts because generally they're a very bad tool.  

LAURA: How would someone know ... Oh, God, they're smiling. 

LEAH: They just opened up their laptops. 

[LAUGHTER] 

SEAN: Well, so, you know, we've been talking about sort of going after your accounts by finding your passwords, right? But honestly, the easiest way to get access to someone's account is for them to just give it to you. Right. And so what you would do is send a phishing email. And the best way to send this phishing email is with sort of targeted information about that person. Really tailored to their interests. That's called a spear-phishing email. So if I wanted to send you a spear-phishing email, maybe I would look for your Spotify account. And send you something that looks like an email from Ticketmaster and say, “Hey, you know, there are a few last minute tickets available to this band that it seems like you're listening to a lot recently." You know, "Do you want to buy tickets?” and send you to a page that looks like a Ticketmaster login and have you login with your credentials there. And I can use those credentials to see if I can get access to something else.

LAURA: Okay, but sometimes the phishing emails are a little less friendly. Sometimes, rather than prey on our interests, they prey on our fears.

SEAN: A really famous example from the 2016 attacks on the [Democratic National Committee] is the email that got John Podesta—it was sort of a Google account security email—which was, you know, "Someone from Ukraine has logged into your email. Click here to reset your password." You know, those types of urgent things. It really bypasses a lot of our ability to kind of, sort of think rationally and be like, "Well, is this real?" Very famously, John Podesta actually did, you know, question the legitimacy of that email and sent it to his IT team. And they said it was legitimate, and so, you know, he clicked on the thing after that.

LAURA and LEAH: Oh, wow.

SEAN: You know, there's lots of different techniques, but the goal is, right, is to be something benign enough, so that it's not suspicious, or to use one of these social-engineering techniques to get you to sort of ignore warning signs and just kind of dive in.

LEAH: I see. So instead of you having to do all the work, you just manipulate me so that I give you everything you need.

SEAN: Right.

LEAH: Great. That makes sense.

LEAH: So should we talk more about what you guys can find on us?

SEAN: I think the next thing I would probably do is, if I was trying to find your, like, physical addresses, I would look at finding who your parents are and look at their kind of social media profiles. We can do that.

LAURA: Yeah. Let's find your parents.

LEAH: My parents? I think he was talking to you!

SEAN: Looking back at historical photos, also. You know, people forget what they post. I found one, Laura, on your Twitter page, which appears to be a bunch of parking tickets. The image is not high enough quality for me to see any of the information on these parking tickets, but I was really hoping that I would get something from that, because I could have gotten a license plate number? A VIN? ... That would have been pretty good.

LEAH: With fancy software would you be able to extract information from a photo like that?

SEAN: I don't know. I have my doubts that we'd get anything from this. It doesn't look like a very ...

LAURA: Just to be clear here, how many parking tickets were on the car?

SEAN: Five.

LAURA: Five? Yeah, I got five. I remember that day was such a bad day.

LEAH: You got five in one day, in one day?

LAURA: In one day. 

LEAH: What the hell were you doing?

LAURA: Just working.

LAURA: What's happening now?

STEVE: So because you have a website that's registered to your name—I assume it belongs to you, and you registered it—there are websites that I can look at the registration history and see if you put your physical address as well as your phone number.

LEAH: Apparently if you don’t opt out of sharing that information, and the website company isn’t protecting themselves, that information is totally searchable.
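For the curious, the lookup Steve is describing ultimately rests on the WHOIS protocol: a plain-text query sent over TCP port 43 (RFC 3912). A minimal sketch follows, with a placeholder domain; for .com domains the registry's reply is "thin," meaning it mostly points you to the registrar's own WHOIS server, and contact details show up only if the registrant hasn't opted for privacy protection.

```python
# Minimal WHOIS query (RFC 3912): send the domain name over TCP port 43 and
# print whatever the registry returns. "example.com" is a placeholder.
import socket

def whois(domain: str, server: str = "whois.verisign-grs.com") -> str:
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

print(whois("example.com"))
```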

STEVE: Looks like it was registered in New York. But I can look through the history and see if that was always the case and whether something was leaked, such as ...

LAURA: Oh, my God. 

LEAH: A street address!

STEVE: Old street address.

LAURA: That was my apartment in New York.

STEVE: Is this your current phone number?

LAURA: Yeah. Yeah, that would be that. 

STEVE: So there's, there's a bunch of these data aggregators that are on the internet. And one of them is, like, MyLife, and they will have what they've gathered from other, you know, data sources. They'll buy information from the DMV, and they'll put together profiles. They're actually allowed to resell it to the data brokers.

LAURA: The DMV? That is so wack.

LEAH: Yeah, that's crazy.

STEVE: That's one of the ways they stay in business.

LAURA: The Department of Motor Vehicles?

LEAH: That seems—I don't know how all of this works, but since they're like a federal agency—that seems really questionable.

SEAN: State. You know, each state has different laws around what is and isn't permissible.

LAURA: There’s a lot of information on this ... First of all, they’re so wrong about my net worth. I have a negative net worth.

LEAH: Damn, look at your income!

LAURA: And my income is very wrong.

LEAH: Did you see that your home is worth $625?

LAURA: I know, I’m curious what that even means. Like what does that even mean? Is that my car?

SEAN: It's also a really good example of just how fundamentally unreliable a lot of this stuff is. Right? And people might be making decisions about you based on this. Right? You know, if you're doing pre-employment screenings, background checks, a lot of these services sell this data as fairly authoritative. And clearly it's not.

LEAH: Okay, quick pause. I have to say, Laura, no offense, but it does seem like you’re putting out quite a bit of damaging information on yourself. Without any help from anyone else.

LAURA: I know. At this point Steve and Sean haven’t really even tried that hard. They’ve looked through my social media profiles, done some Google searches … but I’m just giving my information away.

LAURA: But, as Sean explained, maybe it’s not even your fault. The world we live in is pressuring us to make our lives more and more public.

SEAN: In a lot of career paths now, people are told that they have to build a personal brand, they have to have an online presence so that they’re findable. That they're identifiable, not only by information that they would share about themselves, but, you know, they have to have a personality associated with that. And in order to do that, we have to share things about ourselves and our lives, that, in the past, might not have been accessible to people that we’re professionally associated with. 

LEAH: Yeah, I mean, it makes me think of—I'm very afraid of being murdered, and I did this test online to find out what my odds were of being murdered. And a lot of the questions were like, "Do you run the same route all the time?" And I was like, "Yes, I do!" And it makes me think that the way I live online is sort of, like, I run the same route. I go to the same rehearsals, I go to the same workplace, I work in the same cafes ... But it's also all accessible online. If I post a story on Instagram, or I, you know, say, tag myself as going to an event or whatever, it's just so easy to map my route.

LAURA: Well Leah is very boring. So.

LEAH: I'm just waiting to be murdered.

LAURA: She's trying to be murdered.

LEAH: Okay, so I know we were sort of joking about the murder bit, but there’s some truth to that. And it made us wonder: What’s the worst-case scenario when it comes to cyberattacks? 

LAURA: So my fear is that because I do write about white supremacy sometimes, and some of the people that I write about are quite dangerous, like, they'll come to my house and, like, I don’t know, murder me. Or my baby.

STEVE: Or they won't come to your house. They may spoof your phone number and call the police and say that there's a hostage situation at your home address and let the SWAT team come to your house.

LEAH: Dox you.

LAURA: Oh, God. 

SEAN: I mean, the reality is, I think, also that the stakes are just higher for women. Right? Women, members of racial, religious, cultural minorities. And certainly for women journalists. There's been plenty of examples in the last couple of years of sustained trolling campaigns, making it just fundamentally difficult to do your job. And so that is not necessarily a worst-case scenario, but it is a highly likely scenario, particularly for someone who's reporting on white supremacy. 

LAURA: Sean says that the threat level can really vary from person to person, company to company. And that’s something they work on assessing at the Citizen Clinic.

SEAN: One of the things we work on with our student teams when they're working with nonprofits is something called threat modeling: thinking about who the bad guy is and what the bad thing is, specifically, that we're worried about, so that we can help the organizations that we work with spend the limited time and resources that they have to address those needs. 

LEAH: But for all of our listeners who are at home and might not have access to a professional, there are some things you can do on your own. Like, if you want to figure out if you’ve been a part of a large-scale data breach, there are places you can check.

LAURA: For example, if you go to haveibeenpwned.com—and “pwned” is spelled P-W-N-E-D—you can enter your email address and see a list of your accounts that have been part of cyberattacks. 

SEAN: So this is a service run by a guy named Troy Hunt. And what he does is, when there's a major breach, he'll get access to all of the breach data and scan basically all of the email addresses, and then create, like, a giant archive: okay, here's a searchable database of what breaches my information has appeared in. 
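Troy Hunt also runs a companion Pwned Passwords service with a free, keyless API built on k-anonymity: your computer sends only the first five characters of the password's SHA-1 hash and compares the returned hash suffixes locally, so the password itself never goes over the wire. Here is a minimal sketch; note that the per-email breach lookups Sean is describing use a separate API that requires a key.

```python
# Check a password against the Pwned Passwords corpus via the k-anonymity
# range API: only the first 5 hex characters of the SHA-1 hash are sent.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode()
    for line in body.splitlines():            # each line is "HASH_SUFFIX:COUNT"
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

print(pwned_count("password1"))  # a deliberately terrible example password
```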

LEAH: What does that mean? 

STEVE: Here, like it says, you can check to see if your account has been compromised in a breach. 

LAURA: Do it! Do it! Did I get "pwned"?

STEVE: So you just click on the pwned button, and it says, "Oh, no, you’ve been pwned!” You were pwned on nine breached sites.

SEAN: And so, Leah, you use the service Canva. In May of this year, that was breached. And so in that one, email address, geographic location, name, password, and username were released. But your data also appeared in a number of other services. 

LEAH: This made me start to think, like, I’ve put my name and email address into so many different websites, and they are all potential entry points to my personal information. And I can’t even honestly remember the names of most of them or why I even signed up in the first place.

LAURA: Yeah, that’s such a good point. I mean, I bet most of us have. So, then, here we are with all of our passwords and email addresses scattered about the interwebz. And the question is: What, if anything, can we do about it?   

LEAH: I wouldn't even know where to begin with, you know, deleting accounts or resetting passwords for all these random services.

STEVE: So that's why we have a thing called password managers, which, if you've been around, you may have heard of them.

LEAH: Felt a little passive-aggressive.

LEAH: I've heard of password managers and also heard of those little like-key things that—

STEVE: A yubikey? I have a couple on my keychain.

LAURA: Wait. So what is a "yubikey"?

STEVE: So yubikey is a form of multi-factor authentication.

LEAH: It’s just a little something you can keep on your keychain and plug right into your computer. 

LAURA: So, let’s say I’m trying to sign into Twitter from a computer I don’t normally use—after I type in my password, Twitter will prompt me to plug in my yubikey and confirm that it’s me.

LEAH: And it makes your accounts much more difficult to hack because you need the actual, physical thing in order to get in.

LEAH: See, I know some things. I knew about yubikeys. 

LAURA: You did know that you said that, you know, he's at first, and then you're like, yeah, that's what I was. 

LEAH: I said a key thing.

STEVE: You’ve been around in the security—

LAURA: Leah’s been around the block.

LEAH: So I gotta be honest, I think part of the barrier for me to being more cybersecure is that I don't actually feel that vulnerable. Like I feel aware that my information is vulnerable. But I don't really know, like Laura said, I'm pretty boring. I don't make a ton of money. I don't see that there are things in my life that are that interesting to other people ...

LAURA: She's trying to throw people off the trail right now.

LEAH: Don't look for me, there's nothing to find! But seriously.

STEVE: Yeah, well, so I think there's like a basic level of cybersecurity, you know, practice that you should take, regardless of what you think your adversaries and threats may be. And that’s things like having the multi-factor authentication that we talked about, using a password manager at least, using unique passwords on every site. And if you have to, write them down in a book that works for you. But using a VPN, virtual private network, that's another just basic kind of hygiene. Where if we're talking about like, you know, safe sex, right? Like, you wear a condom, you know? And it doesn't matter, like, who your partner necessarily is. You don't have to have a threat profile every time just like, okay, there are certain things that we do, that we know are safer practices.

LEAH: And to take that metaphor one step further: When you use a condom, you’re not just protecting yourself from any potential STDs, you’re also preventing transmission to all future partners.

STEVE: There's like a herd immunity that we need to encourage. Where, not only are you worried about yourself, but if someone was targeting you, they could also be targeting your family members. And that kind of raises the stakes.

SEAN: We are all essentially the gateway to each other's information, right? I think that one of the things that's really challenging about the internet is that we make decisions for other people all the time based on our personal judgment of our own privacy and security that they wouldn't necessarily buy into, right? You know, I have two kids, and it’s really hard to think about the kinds of decisions I’m making for them, posting their pictures on social media platforms, so my distant relatives can see my kids, when I know, from a very young age I’m entering their faces into facial recognition databases that will be able to track them conceivably for the rest of their lives.

LAURA: Okay, but, Leah.

LEAH: Yes, Laura?

LAURA: So, I’m making strong, unique passwords for every account I have, storing them in a secure password manager and even using multi-factor authentication. But what do I do about all my data that’s already out there?

LEAH: That’s such a great question! And you know, I’d answer it, but I think Steve wants to.

STEVE: It's really hard to deal with, like, this public information, like our phone numbers and our addresses, because we give them to the DMV or utility companies and some of that stuff is sold to other data brokers, which is resold. And so there's, if you go to privacyduck.com, there are lists of instructions on how to opt out of each and every one of these data brokers, and it is actually like a full-time job to do that. 

SEAN: And, as Steve pointed out, you know, going through these things and opting out on a continuous basis is fairly time intensive, so you can pay someone to do that for you through some of these services. But if you want to show them how much this stuff costs …

LAURA: Oh, God.

STEVE: Yeah, if you want the paid service it is ... VIP privacy is $1,000 a year. Just to pay someone to go through and opt out. 

SEAN: Which is more than the value of your home. 

[LAUGHTER]

LAURA: True!

LEAH: Laura, you’re going to have to sell that tent. And more.

LAURA: I’m going to have to sell my home and come up with some extra money.

SEAN: And so, you know, when people talk about privacy being a luxury good, this is kind of the example, right? Because sure there are people who can afford this. But most of us can’t. 

LEAH: So then, okay, most of us can't afford this. So what can most of us afford?

STEVE: I would say each of you should make a list of your most important accounts, both personal and work, and make sure that each of those accounts has a strong, unique password. 

SEAN: If you can think of four generally unrelated words—toddlers are a great password generator—but four generally unrelated words, that's a pretty good password. You know, if you were trying to attack that with a computer, it would take a number of years to break that just by brute force. You know it's much easier to remember, too, because you can write a little story in your head around that. Makes it a little more human friendly. 
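To put rough numbers on that: four words drawn at random from the EFF's 7,776-word passphrase list give 7,776 to the fourth power, roughly 3.7 quadrillion combinations, or about 52 bits of entropy. Here is a quick sketch using Python's secrets module; the tiny wordlist is just a stand-in for a real one.

```python
# Build a four-random-word passphrase and estimate its strength.
# WORDS is a stand-in; a real list (e.g. the EFF long wordlist) has ~7,776
# entries, so four random picks give 7776**4 combinations (~51.7 bits).
import math
import secrets

WORDS = ["correct", "horse", "battery", "staple", "lantern", "pickle"]

def passphrase(wordlist: list[str], n_words: int = 4) -> str:
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

print(passphrase(WORDS))
print(f"Entropy with a 7,776-word list: {4 * math.log2(7776):.1f} bits")
```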

LEAH: Sean also reminded us to set up multi-factor authentication. That might include a yubikey, a fingerprint, or even just a text message that we get on our phones.

SEAN: And, usually, when you set up multi-factor authentication, it also turns on things like login alerts. So your phone will be notified that “Hey, someone tried to log into your Google account. Was it you?” And most of the time, it will be you, and it will be fine. But if you get a strange alert, you'll know that either somebody has tried to access your account or something strange is going on.

STEVE: So I think it would be important for you to think of, like, your phone numbers and protecting that. So there are things, like, you know, signing up for Google Voice and having a phone number that's tied to your actual phone number. So, I use a phone number, it's called a VOIP number. It's essentially kind of like a made-up phone number that will forward to my actual phone number, and no one knows, like, my actual phone number. Except for the phone company.

LAURA: Really?

LAURA: I know, I know, you’re thinking, “But I’ve had this same number since middle school! It’s part of my identity!” And, actually, that’s the whole problem. Phone numbers, passwords, email addresses, they are all part of our identity. Which means that if someone gets access to them, they can use them to access other, more private parts of your identity as well. 

SEAN: You know, there, there are a lot of trade-offs. And I think that some of this is just the development of social norms that takes time for us to start understanding how attached are we to these specific phone numbers or these addresses? And how much friction are we willing to go through to change these things over time to kind of keep ahead of this?

LAURA: They also mentioned a few tricks that hackers and phishers use that we should watch out for. And what to do if you see them. 

LEAH: The No. 1 thing to remember is that these attackers prey on our fears and on our vulnerabilities. So if you notice something suspicious, don’t start wiring money to Kazakhstan! Slow down, and talk to someone you trust before you act. 

STEVE: Generally these targeted attempts and harassment happen at, like, the worst time in our lives, where we're already stressed out, you know. You're probably dealing with a move or, you know, a bunch of other work commitments, and then you get, like, a phishing email, or you get a notification that someone tried to log into your account.

LEAH: Also ask yourself if you expected that email from Google or your bank or from whomever, and if you didn’t––be really suspicious about it. Definitely don’t click on any links. 

STEVE: There's a common tactic of using a security alert as the phishing attempt. So, for instance, you’ll get an email, supposedly from Facebook, saying, oh, “There was a suspicious login attempt, you should change your password immediately.” And those things I always ... take seriously, but I'll go to the website itself, and I won't follow the links that are in the phishing email. If Facebook or Spotify or Google is saying that "There's a suspicious activity," I'll go to, you know, Google, log in myself, go to the security settings and change my password.

LEAH: That actually makes me feel ... good. 

LAURA: Yeah, I actually feel pretty good. I feel like I can find the four things that I want to manage. 

LEAH: Mhm. Yeah, this seems doable. I was feeling a little despondent at a certain point when we were looking at the cost of security.

SEAN: We see a lot of ... a friend of ours calls it "security nihilism," with the organizations that we work with, right? Where there’s just, like, there's so much scary stuff out there. And it's really hard to know where to start, to know if anything makes any difference. And it's really easy to just kind of throw up your hands and say, you know, what can I possibly do? Finding those little bite-sized chunks to make some meaningful progress on, can be very affirming and encouraging and does, in fact, I mean the things that we're talking about—better authentication security—will harden your online presence in a really meaningful fashion. It makes you a lot harder to compromise.

LEAH: So how secure do you guys feel, how cybersecure, on a scale of one to Mr. Robot?

SEAN: Not. I mean, part of it is the nature of the work that we do. I am just as susceptible to a good phishing email. You know, my information is out there, just like it's out there about everybody else. And so, you know, I think we all have to operate at a certain level of assumed insecurity because of that. 

LEAH: So, Laura, you went back to the Citizen Clinic for your cybersecurity check-up?

LAURA: I got a password manager, I set up two-factor authentication on my phone, and I also got a yubikey. And then I went through my social media and tried to remove anything that clearly stated where I lived: like an old Craigslist post, or a wedding registry, you know, things like that. 

LEAH: So how do you feel? Do you actually feel more … secure?

LAURA: I mean, I definitely feel more secure. And I was surprised by how easy it was to do all of that. And now that I just do it everyday, it’s just like one of the things I do, and I don’t think much about it. But also, at the same time, learning more about this, has made me think that no one is totally secure. And a lot of companies have my credit card information, and I’m just sort of trusting them that they’re taking the precautions to prevent any major breaches, like what happened at Berkeley. 

LAURA: It occurs to me that one thing I can do is make sure that I’m more careful with one-time transactions or account logins, like when I sign up for some random food delivery service and use one of my generic “throwaway passwords.”

LEAH: Yeah, totally, I do that all the time. Hopefully no one is listening to this …

LEAH: So, should we hear from our neighborhood kid now?

LAURA: Yup! Here’s Marco [Joseph] with the last word. 

MARCO JOSEPH: I just use the same password for everything, so I don’t have to write it down really. 

LAURA: Marco, that’s bad. Did you know that? That you’re supposed to have different passwords for everything? 

MARCO: I know, but, like, honestly, I’m just too lazy to really care.

LAURA: This is The Edge, brought to you by California magazine and the Cal Alumni Association. I’m Laura Smith.

LEAH: And I’m Leah Worthington.

LEAH: This episode was produced by Coby McDonald with support from Pat Joseph. Special thanks to … Steve Trush, Sean Brooks, Brooke Kottmann, California magazine interns Grace Vogel and Steven Rascón, and of course, Ralphie the dog. Original music by Mogli Maureal. 

LEAH: And, for God’s sake, change your password.
