Episode 25

Published on:

6th Jan 2024

EP25: The Impact of the 23andMe Breach: Protecting Personal Information Online

This episode covers the recent breach of 23andMe, a DNA ancestry testing company. It looks at the impact of the breach, the potential risks of sharing personal information, and the company's lack of adequate security measures. It also suggests security practices that could have prevented a breach like this, and it closes with steps individual users can take to protect themselves, including freezing their credit to prevent identity theft.

---

I do hope you enjoyed this episode of the podcast. Here are some helpful resources, including any sites that were mentioned in this episode.

--

Find subscriber links on my site, add to your podcast player, or listen on the web players on my site:

Listen to Byte Sized Security

--

Support this Podcast with a Tip:

Support Byte Sized Security

--

If you have questions for the show, feedback, or topics you want covered, please send a short email to marc@bytesizedsecurity.show with the subject line "Byte-Sized Security" so I know it's about the podcast.

Transcript
Speaker:

So by now, you know that 23andMe was breached. 23andMe is basically one of those "who am I related to" ancestry-type services. And it got breached because there were somewhere around 14,000 people, or accounts, that had reused passwords. Whoever, or whatever, logged into those was then able to use this DNA family tree sharing, something or other, and scrape a bunch of data from, I don't know, 6.9 million accounts, which is kind of a lot.

But you do have to wonder. Like, I understand that you log into these sites and you've got to give away all this PII, personally identifiable information. I get that. First of all, I don't know why you're giving it away just to check that out, but I understand people do that. Whatever. But you're giving it away to a private company, and then they get breached and all this data gets scraped. Now, I don't think that whoever, or whatever, did this is getting actual DNA. That's not what it is. You weren't giving DNA samples, but you are giving away a lot of information about who you're related to, things like that, that can be used later for really good social engineering tactics. And that's going to make a huge difference. And I don't know what the ramifications of that are.

But the thing that's a little bit disheartening is twofold. One, quite a lot of the cybersecurity community, at least the pulse I got, was, well, basically it's the user's fault because they're reusing passwords and they shouldn't be doing that. And while I agree that you should not be reusing passwords, I completely disagree that it's on the users. We're talking about PII and personal information. This is not just some news site or whatever. This company had an obligation to do something. And what they're falling back on is the definition of "reasonable security," which there really is no definition for.

Right. You could argue that, hey, reasonable security is username and password. Okay. But there are so many things they could have done from the start that they only did afterwards, after the breach. And this is about ten things, and I got these ten things from ChatGPT, my $20 subscription. But I'm sure they paid a lot of money not to do any of these things, or to be told that they should do them, say, "Okay, we'll do it," and then never do it.

One: mandatory two-factor authentication. They could have made two-factor authentication mandatory. Now, they did it afterwards, but they didn't do it in the beginning. I'm not even sure it was even an option in the beginning. To be honest with you, I didn't sign up for the service, so I don't know. But it's weird that they did it much later. And I think some of the excuses were, well, it would stop people from adopting as quickly as they normally would. I completely disagree. ChatGPT made me do 2FA. Now, I am in tech, so I'm going to adopt that anyway. But if people are really interested in their ancestry, which they were, and that was something that was just so hot three or whatever years ago, they absolutely would have set up 2FA on their phone to get past that, to continue on, to figure out who they're related to. There are just some things that are not going to stop a user, and I think when you're really, really interested in figuring out who you're related to, setting up 2FA on your phone is not going to stop you.

Two: advanced password policies. You know, they could have enforced a strong password policy: a mix of letters, numbers, and special characters, and a minimum password length. That most likely would have gotten rid of the reused passwords, because those aren't things that people are going to remember necessarily. Now, they may just say, oh, I'll just use the same password and copy and paste it in there. But it would have been a little bit harder for people to just type things into a box when you force them to do something crazy. They may still write it down somewhere, but it may not be a password that was used in another breach, things like that. You could also check against known breached-password lists, like Have I Been Pwned, to make sure: oh, you can't use this password because it's been found in a previous breach, you need to use a new one, right? Things like that.
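
As a rough illustration of that last idea, here is a minimal Python sketch (not anything 23andMe actually runs) that checks a candidate password against the public Have I Been Pwned "Pwned Passwords" range API. Only the first five characters of the password's SHA-1 hash ever leave the machine, which is the k-anonymity trick the API is built around.

```python
# Minimal sketch: reject passwords that appear in known breach corpora.
# Only the 5-character SHA-1 prefix is sent to the API (k-anonymity).
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    """Return how many times this password appears in known breaches (0 = not found)."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    print(breach_count("password123"))  # almost certainly a large number
```

A signup or password-change form that rejects anything with a non-zero count would have blocked most of the reused credentials involved here.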

Three: account lockout mechanisms. They could have, you know, temporarily locked accounts after a number of failed login attempts. Now, what happened is considered to be brute force, but it was really credential stuffing. They were just using usernames and passwords that had been reused in previous breaches. So they weren't necessarily brute forcing their way into accounts, like trying someone's account over and over and over again. But credential stuffing does fall under the umbrella of brute force. So you could have had account lockout mechanisms. You don't know if somebody tried those passwords and, in the end, the user had changed them. You could have locked the account out and seen a high number of lockouts, which would be kind of weird.
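
For what that might look like in practice, here is a minimal in-memory sketch of a lockout policy. The thresholds are made up for illustration, and a real implementation would live in a shared datastore rather than a Python dict.

```python
# Minimal sketch of a temporary account lockout policy.
# Thresholds are illustrative only; production code would use a shared datastore.
import time
from collections import defaultdict

MAX_FAILURES = 5            # failed attempts allowed inside the window
WINDOW_SECONDS = 15 * 60    # rolling window, which doubles as the lockout period

_failures = defaultdict(list)   # username -> timestamps of recent failed logins

def record_failed_login(username: str) -> None:
    _failures[username].append(time.time())

def is_locked_out(username: str) -> bool:
    cutoff = time.time() - WINDOW_SECONDS
    _failures[username] = [t for t in _failures[username] if t >= cutoff]
    return len(_failures[username]) >= MAX_FAILURES
```

Even when credential stuffing guesses right on the first try for some accounts, a site-wide spike in lockouts from the accounts that had changed their passwords is exactly the "kind of weird" signal worth alerting on.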

Four: behavioral analytics and machine learning. It would have been nice if they had behavioral analytics: unusual login patterns from unfamiliar locations, or multiple failed login attempts in a short period, or 14,000 accounts logging in within a short period of time, things like that. Even if you didn't have 2FA enabled. I've had this on a financial site where someone knew my username and password, shame on me, but the site knew it was not normal for me to log in from there, and they locked my account. So you could have had behavioral analytics working on that, and they did not.
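
A full behavioral analytics pipeline is a big system, but even one crude rule catches the scenario described here. The sketch below flags a login from a country the account has never used; the data model is assumed purely for illustration.

```python
# One crude behavioral rule: flag logins from a country this account has never used.
# Real systems add device fingerprints, velocity checks, and ML risk scores on top.
from collections import defaultdict

_seen_countries = defaultdict(set)   # username -> countries of past trusted logins

def is_unusual_location(username: str, country: str) -> bool:
    known = _seen_countries[username]
    return bool(known) and country not in known

def record_trusted_login(username: str, country: str) -> None:
    # Call only after the login is confirmed legitimate (e.g. it passed 2FA).
    _seen_countries[username].add(country)
```

Flag it, then lock the account or step up to a second factor, which is what the financial site in that anecdote effectively did.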

Five: you could have had a CAPTCHA challenge. They're kind of lame for spam, but it's likely they were using automation, because no one's going to sit down and do 14,000 logins by hand. If you'd had a CAPTCHA challenge, you potentially could have blocked a lot of this stuff.
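
Server-side CAPTCHA verification is only a few lines. The sketch below uses Google reCAPTCHA's public siteverify endpoint as one common example; the secret key is a placeholder, and nothing here implies which provider, if any, 23andMe should have used.

```python
# Hedged sketch of verifying a CAPTCHA token on the server, using Google
# reCAPTCHA's siteverify endpoint as one common example.
import json
import urllib.parse
import urllib.request

RECAPTCHA_SECRET = "replace-with-your-secret-key"   # placeholder

def captcha_passed(token: str, client_ip: str) -> bool:
    data = urllib.parse.urlencode({
        "secret": RECAPTCHA_SECRET,
        "response": token,      # token produced by the widget in the browser
        "remoteip": client_ip,
    }).encode("utf-8")
    req = urllib.request.Request("https://www.google.com/recaptcha/api/siteverify", data=data)
    with urllib.request.urlopen(req) as resp:
        return bool(json.load(resp).get("success"))
```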

Six: IP address monitoring and blocking. Did they really use 14,000 different IP addresses to log in from different locations across the world? Highly unlikely. So if the attempts were coming from a VPN or a small pool of addresses, you could have flagged that and blocked that activity. Which goes back to the account lockout mechanisms: you're seeing something coming in, the web application firewall sees that, and you could have blocked it. So they could have done something like that.
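
One cheap signal that pairs well with this: a single source IP, or a small pool of them, attempting logins against many distinct usernames is a classic credential-stuffing fingerprint. A minimal sketch, with an illustrative threshold:

```python
# Minimal sketch: flag source IPs that attempt logins against many distinct accounts.
from collections import defaultdict

DISTINCT_ACCOUNT_LIMIT = 25   # illustrative threshold, not a real policy

_accounts_per_ip = defaultdict(set)   # source IP -> usernames it has tried

def record_attempt(ip: str, username: str) -> None:
    _accounts_per_ip[ip].add(username)

def looks_like_credential_stuffing(ip: str) -> bool:
    return len(_accounts_per_ip[ip]) >= DISTINCT_ACCOUNT_LIMIT
```

A WAF rule that blocks or CAPTCHA-challenges any IP tripping this check would have made walking through 14,000 accounts much noisier and slower.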

Seven: user education and awareness. Now, nobody does this. I mean, they do it internally, right? But you could have encouraged your users over time, because this is a sensitive site, not to reuse their passwords, and to check whether their email address had been part of any data breach, like on Have I Been Pwned, and then use a different password. But no. To be fair, no company does this, right? You just sign in, and they're not going to educate you with security training. That's for your employer to do. That could change, though.

Eight: regular security audits and updates. Somehow they got an ISO 27001 certification. I don't understand that, and it doesn't give me much confidence that it's a useful cert. But they've had it for a while. There could have been regular security audits and updates to find and fix these kinds of things. We don't know. We don't know. You don't know whether that actually happened and then no one did anything with it. Who knows? So I don't know about this one.

Nine: a multilayered security approach, so defense in depth: firewalls, intrusion detection systems, regular vulnerability scanning. Now, vulnerability scanning wouldn't have helped this, but honestly, you could have looked at the logs and picked this stuff up. And intrusion detection systems may have flagged that, hey, this is unusual activity: you've got all these people logging in from all different places, all over the world, and that's not normally where they log in from. Right? If I'm from California and all of a sudden I'm logging in from, you know, a different country, don't you think that's kind of weird? I mean, it's not impossible, but isn't it weird enough that you'd want to, like, maybe prompt me for something else? Like, hey, you don't normally log in from Europe, maybe we should ask you for some other thing, like your 2FA, which they didn't enable.
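
That "ask me for something else" idea is step-up authentication, and the decision logic itself is simple. A sketch, with the account's known-countries set assumed as part of the data model for illustration:

```python
# Minimal sketch of a step-up decision: familiar logins pass straight through,
# unfamiliar ones must present a second factor. The data model is assumed.
from dataclasses import dataclass

@dataclass
class LoginAttempt:
    username: str
    country: str
    password_ok: bool
    known_countries: set    # countries this account has logged in from before

def decide(attempt: LoginAttempt) -> str:
    if not attempt.password_ok:
        return "deny"
    if attempt.known_countries and attempt.country not in attempt.known_countries:
        return "require_2fa"    # "you don't normally log in from Europe"
    return "allow"

print(decide(LoginAttempt("alice", "DE", True, {"US"})))   # require_2fa
```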

And ten: of course, encouraging the use of password managers, promoting them to help people maintain strong, unique passwords for each of their online accounts. Again, it goes back to user education and awareness. Nobody's doing that, right? The people doing that are the cybersecurity people who have been telling you for years to use a password manager. But if you want adoption to happen, sometimes companies have to force adoption, and then everyone just starts doing it. So if you want 2FA to happen, kind of like ChatGPT did at some point, probably because they had some kind of incident, you just force everyone to start doing it, and they're going to do it, because they'd rather use the service than not use the service. So you force them to do that, and it ends up being something that they can do.
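
Under the hood, the "strong, unique password for every account" part is trivial for software to do, which is the whole argument for a manager. A sketch using Python's secrets module:

```python
# What a password manager does on every signup: generate a long random password
# that is never reused anywhere else.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())   # different on every run, nothing to reuse or remember
```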

So when I look at this list of things they could have done but didn't do, it's just concerning. And of course, they've already turned around and blamed it on the users, saying, hey, you guys reused passwords, that's on you, we have reasonable security measures in place, and if you're going to just reuse usernames and passwords, there's not much we can do about it, so that's on you. And I completely disagree with that.

There's also the argument that it doesn't fall under the HIPAA laws because it's not medical information. Right. It's personally identifiable information, but it's not medical information, since they're not actually getting DNA. And then there are some other obscure laws that are going to say, well, they had this information, but what exactly did they take that was used against you and caused you harm? Who knows, man.

And then they changed their terms of service at the end, so that you had to opt out if you wanted to keep the ability to join a class action lawsuit. I forget, it had to be individual arbitration or something, but they did it right after Thanksgiving. They switched it and gave you 30 days to opt out. So if you didn't opt out, you automatically agreed to not being part of some class action. A lot of shady stuff. But, you know, it's an 85-cent stock. My guess is that the lawyers on probably both sides are gonna make a lot of money. They'll go out, they'll just file for bankruptcy, and they'll either stay in business and just restructure or they'll just completely go away. Who cares? It doesn't matter.

And then you, the end user, the person that was using this, you'll just move on to something else. Everybody got your data. You're going to get nothing. You're going to get some credit monitoring for three years, right? From another agency that was breached and leaked all your data. So, I mean, it's unfortunate. At the end, the end user here is, like usual, going to be the loser in this particular battle.

:

And.

253

:

You just have to start thinking if you,

yourself and your friends and your family,

254

:

you've got to just be better at security.

255

:

Use the tools that are

available on the site.

256

:

Most most major sites, maybe, obviously

not this one, but most, most major sites.

257

:

If you go to the number two.

258

:

F a.directory.

259

:

That's the number two F a doc directory.

260

:

You're going to see all

the different sites.

261

:

This is not, you know, this is not a

hundred percent, but you're going to

262

:

see a lot of sites and how to set up

to a Fe on all these major accounts,

263

:

all your social media, your email.

264

:

You should just have an

authenticator app on your phone.

265

:

They're free.

266

:

You should just go set up to a Fe

as much as you can just do, make a

267

:

goal, like one a day or something.

268

:

And just don't be that low hanging fruit.

269

:

Right.

270

:

Even if you reuse the username

and password, if you had

271

:

to have a, on your account.

272

:

That would not happen now.
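
Those free authenticator apps all implement the same open standard, TOTP (RFC 6238). A minimal sketch of the code your phone computes every 30 seconds; the base32 secret here is a well-known documentation example, not a real account secret.

```python
# Sketch of the TOTP algorithm (RFC 6238 / RFC 4226) behind authenticator apps.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period             # 30-second time step
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # matches what an authenticator app shows for this secret
```

Because the server and your phone both derive the code from a shared secret and the current time, a stolen password on its own is not enough to log in.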

Now, in this case, there was that DNA, whatever, sharing kind of thing. So even these poor people who maybe really did have really long and complex passwords and were super secure, the person they were connected with obviously got hacked, and then their information got scraped too. So this goes back to one of the thoughts that I had: you've got to assume a breach mentality, and it doesn't matter how safe and secure you are if the place that's holding your data isn't. So again, you could have had 2FA, you could have had a really strong password on this particular site. But because somehow on the back end somebody opted into this DNA sharing or family tree sharing, and maybe you were or were not involved in that, your information gets scraped because you're connected with someone who's completely insecure. And this is one of those rare use cases, I guess you could say, but you've just got to adopt better security practices, because this gets worse over time.

And identity theft is real, and identity theft is real painful. And the simplest thing you can possibly do, the simplest thing you can possibly do, is freeze your credit, online, for free, at the major credit bureaus. What that does is give you an additional PIN code, a long one, that you must use to log into those agencies, so that you can then unfreeze your credit for a certain period of time, whatever you want to call it, and then it gets frozen again. And what that means is that if somebody has all my information and they go to get a loan for a house or a car or whatever, and they apply for that loan and pretend that they're me, when the lender goes to run those credit checks, those companies won't be able to get a credit check. They're going to say, oh, it looks like, you know, Experian or TransUnion or whoever isn't giving it back, you're going to need to go and unfreeze that. And that person, that thing, is not going to have the PIN code that you set up, and therefore you're not going to have a loan outstanding in your name.

And this goes for kids too, which is odd. But if you think about it, if your children get their credentials stolen as well, whatever, a person could theoretically get a loan, and you wouldn't even really know about it until much later. It could be 18 years later when you find out your child's credit has been used by somebody and they have terrible credit. Right. I mean, there are all these possibilities.

People always talk about, oh, you can get this thing that will monitor your credit and do all this kind of stuff, and you pay a yearly fee. Why? Why? You just go freeze your credit at the three major credit bureaus, you get that PIN code, and that's it. It's easy peasy. And you know what, yeah, it'll stop you temporarily from applying for credit cards online. But it's not hard to unfreeze it. It's quick. You can put a time limit on it. It's not a big deal at all.

And in fact, you still get the promos and bonuses. I have my credit frozen, and I wanted some airline credit card, and I went and applied for it. And of course it said, hey, it looks like your credit's frozen, so we're not able to do this online for you, but we're going to send you some mail, and then you can actually call in, and blah, blah, blah. And when I did actually call in, I said, hey, I need to unfreeze my credit so I can get this promotion, and I just want to make sure I'm still going to get it. And they said, oh yeah, you're still gonna get it, because you applied for it during the promotional period; just because the promo closes and you haven't finished it online or whatever, you still get it. So I just unfroze my credit, they were able to check it online, yep, and then they went ahead and did it, and I still got the promotion.

So if you are worried, like, well, this promo is a window and I want to get this thing: you still applied during that time, and just because your credit's frozen doesn't mean you're not going to still hit that window. It just takes, you know, a couple more days. They're going to send you a piece of mail with some number that you have to call in with so they can do an additional verification check. So it's not a big deal. But people act like it's a big deal and a huge hassle. And I'm telling you, the hassle is going to be when identity theft happens to you or your kid. That's going to be a hassle for you.

So you're going to want to go out and, you know, get better security hygiene. It's not that hard to do. It really isn't. You get an authenticator app and a password manager. Easy peasy. You go to 2fa.directory (that's the number 2, "fa", dot directory), and you just do one thing a day. You lock down all your social media accounts, you lock down your email accounts and things like that. And that just stops you from being that low-hanging fruit. So I hope this has been an informative podcast for you, and I hope you weren't part of the breach. It's unfortunate, but, you know, 2024 is just starting, and I'm sure there are plenty more breaches where that came from.

Support the Podcast with a Tip

If you're enjoying Byte-Sized Security and finding these practical tips useful, please consider supporting the podcast with a small contribution. It costs $17 per month just to cover podcast hosting fees, and your support helps offset the costs of producing this security resource and keeping episodes free. Even a tip of $1-5 per month from loyal listeners adds up and allows me to continue providing great cybersecurity info. Please consider a donation. I appreciate you helping sustain Byte-Sized Security!
Support the Podcast

About the Podcast

Byte Sized Security
Snackable advice on cyber security best practices tailored for professionals on the go
In a world where cyberattacks are becoming more commonplace, we all need to be vigilant about protecting our digital lives, whether at home or at work. Byte Sized Security is the podcast that provides snackable advice on cybersecurity best practices tailored for professionals on the go.

Hosted by information security expert, Marc David, each 15-20 minute episode provides actionable guidance to help listeners safeguard their devices, data, and organizations against online threats. With new episodes released every Monday, Byte Sized Security covers topics like social engineering, password management, multi-factor authentication, security awareness training, regulatory compliance, incident response, and more.

Whether you're an IT professional, small business owner, developer, or just someone interested in learning more about cybersecurity, Byte Sized Security is the quick, easy way to pick up useful tips and insights you can immediately put into practice. The clear, jargon-free advice is perfect for listening on your commute, during a lunch break, or working out.

Visit bytesizedsecurity.com to access episodes and show notes with key takeaways and links to useful resources mentioned in each episode. Don't let cybercriminals catch you off guard - get smart, fast with Byte Sized Security! Tune in to boost your cybersecurity knowledge and help secure your part of cyberspace.
Support This Show

About your host


Marc David

Marc David is a Certified Information Systems Security Professional (CISSP) and the host of the cybersecurity podcast, Byte-Sized Security. He has over 15 years of experience in the information security field, specializing in network security, cloud security, and security awareness training. Marc is an engaging speaker and teacher with a passion for demystifying complex security topics. He got his start in security as a software developer for encrypted messaging platforms. Over his career, Marc has held security leadership roles at tech companies like Radius Networks and Vanco Payment Solutions. He now runs his own cybersecurity consulting and training firm helping businesses and individuals implement practical security controls. When he’s not hosting his popular security podcast, you can find Marc speaking at industry conferences or volunteering to teach kids cyber safety. Marc lives with his family outside of Boston where he also enjoys running, reading, and hiking.