Meta's Smart Glasses Just Got Creepier
Ep. 41


Episode description

Meta plans to add facial recognition technology to their smart glasses, popular password managers fall short of their “zero knowledge” claims, Apple is testing end-to-end encryption with RCS in iOS 26.4 beta, and much more. Join us for This Week In Privacy #41!

0:23

Welcome back to This Week in Privacy,

0:25

our weekly series where we discuss the

0:27

latest updates with what we're working on

0:29

within the Privacy Guides community and

0:31

this week's top stories in the data

0:32

privacy and cybersecurity space,

0:34

including Meta's AI glasses are getting

0:36

creepier somehow,

0:38

vulnerabilities in popular cloud-based

0:39

password managers,

0:41

reminders about the privacy concerns of

0:42

AI, and more.

0:44

I'm Nate,

0:44

and with me this week is Jordan.

0:46

Hello, Jordan.

0:47

How are you?

0:48

I'm good, thanks.

0:49

Ready to dive into this week's top stories

0:51

in data privacy and cybersecurity.

0:56

All righty.

0:57

Before we do that,

0:58

for those of you just joining us,

0:59

Privacy Guides is a nonprofit which

1:01

researches and shares privacy-related

1:03

information and facilitates a community on

1:05

our forum and matrix where people can ask

1:07

questions and get advice about staying

1:09

private online and preserving their

1:11

digital rights.

1:13

One more quick piece of business before we

1:14

jump in.

1:14

We want to thank misanthropic forty two on

1:17

YouTube for becoming a member.

1:21

If you become a member on YouTube,

1:22

you get videos a week early.

1:23

It gets you a little badge in the

1:24

chat.

1:25

And if you want to support us and

1:27

you don't use YouTube,

1:28

we will talk about how to do that

1:29

a little bit later.

1:30

But first,

1:31

we're going to jump into our story about

1:33

Meta's facial recognition glasses.

1:35

Jordan,

1:36

why don't you go ahead and take it

1:37

away?

1:39

Yes,

1:39

so for anyone who's sort of out of

1:41

the loop,

1:42

Meta has been creating a new smart glasses

1:46

brand for the last couple of years now.

1:49

And now they've announced their plans to

1:51

add facial recognition to its smart

1:53

glasses.

1:54

Basically how these glasses worked before

1:56

was that they had a camera built in

1:58

and they allowed you to record and do

2:00

all sorts of

2:01

you know, smart activities on them.

2:03

But now they're planning to add facial

2:05

recognition.

2:06

So quoting from this article by The New

2:09

York Times here, five years ago,

2:11

Facebook shut down the facial recognition

2:13

system for tagging people in photos on its

2:15

social network,

2:17

saying it wanted to find the right balance

2:19

for a technology that raises privacy and

2:21

legal concerns.

2:23

Now it wants to bring facial recognition

2:26

back.

2:27

Meta, Facebook's parent company,

2:29

plans to add the feature to its smart

2:31

glasses,

2:32

which it makes with the owner of Ray-Ban

2:34

and Oakley.

2:36

As soon as this year,

2:37

according to four people involved with

2:39

the plans who are not authorized to speak

2:41

publicly about confidential discussions,

2:44

the feature, internally called Name Tag,

2:47

would let wearers of smart glasses

2:49

identify people and get information about

2:52

them via Meta's artificial intelligence

2:55

assistant.

2:56

Meta's plans could change.

2:57

The Silicon Valley company has been

3:00

conferring since early last year about how

3:02

to release a feature that carries safety

3:05

and privacy risks,

3:07

according to an internal document viewed

3:09

by the New York Times.

3:11

The document from May described plans to

3:13

first release Name Tag to attendees of a

3:16

conference for the blind,

3:18

which the company did not do last year

3:21

before making it available to the general

3:23

public.

3:26

Meta's internal memo said the political

3:28

tumult in the United States was good

3:31

timing for the feature's release.

3:33

Really?

3:34

Really?

3:34

I don't know about that.

3:35

We will launch during the dynamic

3:37

political environment where many civil

3:39

society groups that we would expect to attack

3:42

us would have their resources focused on

3:44

other concerns,

3:45

according to the document from Meta

3:47

Reality Labs, which works on hardware,

3:49

including smart glasses.

3:54

So I guess we should say,

3:55

from a privacy perspective,

3:59

these are glasses that let you basically

4:04

walk around and record people.

4:06

It does have, you know,

4:07

protections against recording people,

4:09

obviously.

4:10

Like there is...

4:13

a single light on one side of the

4:14

glasses,

4:15

which is meant to alert somebody if

4:17

they're being recorded.

4:18

But it's extremely common for people to

4:21

basically do a DIY hack to disable the

4:24

light so they can record people without

4:26

their consent.

4:28

And I think this would also be quite

4:30

a huge problem if we are using facial

4:34

recognition on

4:35

glasses because, again,

4:37

you would be able to use the glasses

4:40

to identify people without their consent

4:42

and they wouldn't know that they're being

4:44

recorded or being identified.

4:46

I think it goes without saying, though,

4:47

Meta's glasses were

4:50

already creepy, and this is just a move

4:54

to make them even more creepy and invade

4:57

people's privacy.

4:59

So there was,

5:00

I don't know if people remembered,

5:02

but there was a

5:04

a video floating around a couple of years

5:06

ago,

5:07

and just quoting from the New York Times

5:09

article here,

5:11

Meta's smart glasses have been used to

5:12

identify people before.

5:14

In 2024,

5:15

two Harvard students used Ray-Ban Metas

5:17

with a commercial facial recognition tool

5:20

called PimEyes to identify strangers on the

5:23

subway in Boston,

5:24

and then they released a viral video about

5:27

it.

5:27

At the time,

5:28

Meta pointed to the importance of a small

5:30

white LED light on the top right corner

5:33

of the frames that indicates to people

5:35

that the user is recording.

5:37

So I think this is an extremely flimsy

5:40

response,

5:41

especially because of how easy it is to

5:43

disable and cover.

5:46

I think

5:47

That's a little bit ridiculous that that's

5:50

the only protection that the company is

5:52

sort of pointing to.

5:56

So yeah,

5:58

basically this is what they're saying,

5:59

the AI assistant.

6:01

So Meta's smart glasses require a wearer to

6:04

activate them to ask the AI assistant a

6:06

question or to take a photo or video.

6:09

The company is also working on glasses

6:11

internally called super sensing that would

6:13

continually run cameras and sensors to

6:15

keep a record of someone's day,

6:17

similar to how AI note takers summarize

6:20

video call meetings,

6:21

three people involved with the plan said.

6:24

And they're saying that the facial

6:25

recognition would be a key feature for

6:28

super sensing glasses.

6:30

So they could, for example,

6:31

remind wearers of tasks when they saw a

6:33

colleague.

6:34

Mark Zuckerberg has questioned if the

6:36

glasses should keep their LED light on to

6:39

show people that they are using the super

6:40

sensing feature or if they should use

6:43

another signal.

6:44

One person involved with the plan said,

6:47

I think this is

6:50

Obviously there needs to be more

6:51

protections,

6:52

but I think this sort of tool shouldn't

6:54

actually be allowed, right?

6:55

Because I feel like a lot of the

6:59

laws we have around like photography and

7:03

recording in public spaces are based

7:06

on very old things, right?

7:10

Like back,

7:12

fifty years ago.

7:14

The only people that had cameras were

7:16

journalists taking photos for, you know,

7:18

newspapers and stuff like that.

7:20

That obviously makes sense,

7:21

but basically recording every single

7:24

person you interact with and then using

7:26

facial recognition on them is clearly an

7:29

invasion of not only that person's

7:32

privacy,

7:32

but everyone you're interacting with,

7:34

so...

7:35

I can't believe that they would

7:37

actually consider putting this out. It kind

7:39

of makes sense because, you know, like they

7:41

said, they're trying to push it out at

7:43

a time when people aren't as fully engaged

7:45

on this stuff, and I'm sure this is

7:47

what a lot of companies do when they're

7:49

pushing out a lot of this awful

7:51

stuff. So yeah, those

7:55

were my primary thoughts on this,

7:57

but Nate, do you have any thoughts

7:59

after having a look at this article?

8:03

I have several as usual, um,

8:07

where to begin, uh,

8:08

I guess just to add a little bit

8:10

more, um,

8:11

a little bit more of the facts to

8:12

the context of this, actually, no,

8:15

let me start with this.

8:16

This is a solution in search of a

8:18

problem.

8:18

And the reason I say that, uh,

8:21

for people who disagree with me for some

8:23

reason, um,

8:24

the reason I say that is because

8:26

this article details basically that

8:29

this plan is super early in development.

8:32

And meta is really trying to figure out

8:34

what this is going to look like.

8:36

And basically what's happening is OpenAI

8:39

has announced that they're going to

8:40

release their own glasses for some reason.

8:41

Um,

8:43

because, like any great company,

8:45

they keep adding things

8:45

nobody asked for to a product that

8:47

everybody was perfectly happy with.

8:49

And then I think Snap has been wanting

8:51

to release glasses for years as well.

8:54

And, um,

8:56

who I think there's someone else,

8:58

but basically the space is starting to

9:01

have competitors.

9:01

And so they realize like, well,

9:03

we need something that makes us stand out.

9:05

And so now they've literally thrown around

9:08

the facial recognition,

9:09

I think is the leading idea,

9:11

but I want to say there were other

9:13

ideas they were tossing around too.

9:17

And so, yeah,

9:18

there's a lot of different like

9:21

there's a lot of different discussions

9:22

that they're still having internally.

9:24

Like what is this facial recognition gonna

9:25

look like?

9:25

Like, for example,

9:26

they say here in the article,

9:27

possible options include recognizing

9:29

people a user knows because they are

9:30

connected on a Meta platform.

9:32

So like, for example,

9:33

if you're friends with somebody

9:35

on Instagram and you're out shopping and

9:37

they walk past you in the grocery store,

9:39

your glasses will ping and be like, Oh,

9:40

Hey, that's that person.

9:41

Which to me is ridiculous because like,

9:43

I don't necessarily need to know every

9:44

time one of my friends walks past me.

9:46

And also like,

9:47

what if you're really not that close?

9:49

Like there's just,

9:49

there's so many problems with this.

9:52

And, um,

9:53

I've actually had this happen to me.

9:54

Not obviously not with this, but, uh,

9:56

years and years and years ago,

9:58

I befriended somebody on Tumblr and yes,

10:01

I used to use Tumblr once upon a

10:02

time.

10:03

And then

10:06

I think we ended up like texting or

10:07

something.

10:08

And then they showed up on my people

10:09

you may know on Facebook.

10:10

And this was years before I ever cared

10:12

about privacy.

10:13

I think this was even before Snowden.

10:15

And even back then I was like,

10:16

that feels really creepy.

10:17

And that feels like too much.

10:19

And I'm really uncomfortable with that.

10:21

And just, I don't know, like, and that's,

10:24

again, these are people, you know,

10:25

these are people, quote unquote,

10:26

these are people you're somehow connected

10:28

to.

10:28

And how long before Meta just starts

10:29

rolling this out in general,

10:30

where it goes into like, it's not just,

10:33

you know, you're connected on Facebook.

10:34

It's because that's how it started, right?

10:36

Like Facebook was your feed of people you

10:37

follow.

10:38

And then it became, you know,

10:40

somebody you tangentially know,

10:43

like a friend of a friend.

10:44

And now you're seeing posts that people

10:46

liked that you don't even follow that

10:47

page.

10:47

And how long before this turns into that?

10:49

where people are showing up in your little

10:53

glasses HUD, your heads up display,

10:55

just because you're tangentially connected

10:56

somehow.

10:58

So yeah,

10:59

I just wanna point that out first.

11:03

I also, I need to point out,

11:04

I wanna point this out every single time.

11:08

They invented or discovered, I don't know.

11:11

I don't know if they invented it,

11:13

but they workshopped facial recognition

11:17

years ago, years and years and years ago.

11:20

And they shelved it because it was too

11:21

creepy, even for them.

11:23

And then once Clearview AI came along,

11:25

suddenly they were cool with it.

11:26

And so I just need to point that

11:27

out that, like, Meta has no moral compass,

11:30

which that quote was probably indicative of.

11:32

I'll get to the quote again in a

11:33

second.

11:33

Meta has no moral compass and they

11:36

basically just wait for something to

11:37

become socially acceptable enough that

11:41

it's okay.

11:41

Okay.

11:42

Like Meta would probably popularize the

11:44

Hunger Games if they thought they could

11:45

get away with it.

11:45

They don't care.

11:46

They just want a dollar,

11:47

which is shown in just to state it

11:49

again, that quote,

11:52

I actually laughed when you read that,

11:53

Jordan, in your response.

11:54

You're like, really?

11:55

Are you sure about that?

11:56

Because like, oh,

11:57

we're going to launch during a dynamic

11:58

political environment where many civil

11:59

liberty groups that we would expect to

12:01

attack us would have the resources.

12:02

Like they literally said the quiet part

12:04

out loud.

12:04

Like, hey,

12:05

now's the perfect time to do this because

12:06

we know that nobody's going to like this

12:08

and they're going to be busy paying

12:09

attention to everything else that is wrong

12:11

in the world.

12:12

And we can do our evil thing.

12:13

It's like cartoon villains are less

12:17

cartoonishly evil than that.

12:18

I just,

12:19

I don't know how else to put it.

12:21

But the last thing I want to touch

12:22

on is you mentioned that story that was

12:25

originally covered by 404 Media,

12:26

where when Meta launched their Ray-Ban

12:28

glasses,

12:29

they didn't have facial recognition

12:30

hooked into them originally.

12:32

And some researchers hooked it up to

12:34

PimEyes

12:34

and started identifying random people on

12:36

the subway.

12:37

And Meta got super pissed about this

12:40

coverage because they told 404 Media,

12:42

they're like, well, that's not us.

12:44

The researchers did that.

12:45

We have this quote unquote safety feature

12:46

built in, which is a little light.

12:48

That's apparently super easy to bypass.

12:50

To be fair,

12:51

it's a little bit more advanced than like

12:52

just put a piece of tape over it.

12:53

Apparently you can tell when you do that,

12:55

but it's not hard to do.

12:56

There's tutorials online everywhere.

12:58

And, you know,

12:58

we've got safety features and we don't do

13:00

that.

13:01

And, you know,

13:01

there's always going to be bad people who

13:03

do things with technology.

13:04

Like they were so defensive and like,

13:06

how dare you accuse us of doing something

13:10

so nefarious?

13:11

And now they're doing the exact same

13:12

thing.

13:13

And how long before somebody finds a way

13:15

to jailbreak this?

13:17

And now it doesn't just show people you

13:19

know.

13:20

Now it does exactly what those researchers

13:22

did,

13:22

except you just made it a thousand times

13:24

easier because the capability is already

13:25

there.

13:25

They just have to jailbreak it.

13:27

And I mean,

13:28

I hope we don't have to say the

13:29

obvious here,

13:30

but like this will be used for stalkers.

13:32

This will be used for, you know,

13:34

I mean,

13:35

mainly I think that's the big one.

13:36

I'm sure it'll be used for all kinds

13:37

of other things.

13:38

And it's, I don't know,

13:40

this is just so, like, I

13:42

don't even know what to say. This

13:44

just, to me, seems like such an obviously

13:46

bad idea and an overstep. And again, just

13:49

that quote, the fact that they just said

13:51

the quiet part out loud, I don't even

13:52

know where to go from there. It's like

13:53

they've shown their true colors as

13:55

being downright evil. Real quick,

13:59

actually, let me share this real quick.

14:02

The EFF did notice. Despite Meta's best

14:05

efforts, the EFF noticed, and they wrote this

14:08

blog post called 'Seven Billion Reasons for

14:10

Facebook to Abandon Face Recognition

14:12

Plans.'

14:12

And what did they say at the,

14:14

I think it was at the end here.

14:16

Yeah.

14:17

Meta's conclusion that it can avoid

14:18

scrutiny by releasing a privacy invasive

14:20

product during a time of political crisis

14:22

is craven and morally bankrupt.

14:24

And that was like such a good way

14:26

to put it.

14:26

Like there is absolutely no way anyone at

14:29

Meta can pretend to have a shred of

14:31

ethics right now because nobody with a

14:34

moral compass would do this.

14:35

It's just, yeah,

14:37

I think those are all my thoughts.

14:38

That's just so crazy.

14:41

Yeah, it is kind of surprising, especially

14:44

because I know there's a lot of

14:45

stuff people are already concerned about, like

14:48

ICE agents wearing Meta smart glasses, and

14:51

then it's like, now they're planning on

14:53

adding facial recognition. I don't know, it

14:55

just seems very dystopian, very creepy.

15:01

I think people need to be a

15:02

little bit more loud about this, because

15:04

there's people that are...

15:07

I don't know if we're reading the,

15:09

I was reading through information about

15:11

these Meta smart glasses,

15:13

like in preparation for this episode.

15:15

And it seems like they're actually selling

15:17

more than ever.

15:18

Like they sold triple the amount in 2025

15:19

that they did

15:20

in 2023

15:21

and 2024.

15:24

So it's kind of concerning how popular

15:26

these things are getting,

15:27

because I think the worst possible outcome

15:29

here is if these products become a

15:33

very

15:36

popular thing that a lot of people

15:38

have and that would basically almost

15:40

create like a dragnet surveillance.

15:42

Like we talked about this last week with

15:45

Amazon's, uh, pet-searching, uh,

15:48

Amazon Ring's, uh, pet-searching system,

15:50

Search Party. Search Party.

15:52

Yes.

15:53

Um, so I think it's, you know,

15:55

it's a similar thing.

15:56

It's like Nate said,

15:57

it's going to always be used for like

15:59

the most nefarious purposes,

16:01

like stalking people or, you know, just,

16:05

I think the fact that someone could

16:06

identify your name and other information

16:10

about you without your consent is kind of

16:14

going against the whole idea of privacy,

16:15

right?

16:16

Because you should be able to control who

16:19

knows that information.

16:21

And it kind of blurs the line.

16:25

Yes,

16:25

I think it's great that the EFF has

16:27

come out with a like, I guess,

16:30

statement against this.

16:31

And we're really big fans of the EFF.

16:34

So I hope they keep up the great

16:35

work on that.

16:36

And yeah, I mean,

16:39

I hope that this doesn't become a reality

16:42

because like we said,

16:43

it was part of a plan.

16:45

So I mean, it's still not...

16:48

actually implemented so we can only hope

16:50

that people like Meta will get the idea

16:53

once a lot of this stuff is leaked

16:55

that this is a really bad idea and

16:56

people really don't want it but I'm kind

16:59

of afraid that some people just

17:02

don't seem to see the issue with a

17:04

lot of this stuff. Just how popular

17:06

these smart glasses already are is kind of

17:09

indicative of the climate of

17:14

people caring about their privacy,

17:16

so it's kind of a little bit unfortunate.

17:21

But I guess with that being said,

17:24

we should move on to the next article

17:26

here, Nate.

17:29

Alrighty.

17:30

Yeah.

17:30

Let's go ahead and head over to our

17:33

next article,

17:33

which I believe is about password

17:36

managers.

17:36

Yes, it is.

17:38

Let me get this pulled up here.

17:40

Alrighty.

17:41

So several popular, well,

17:43

three specifically,

17:44

three popular password managers fall short

17:46

of quote unquote zero knowledge claims.

17:48

So this came from researchers at ETH

17:51

Zurich,

17:52

which we have seen them do quite a

17:53

bit of good research in the past on

17:55

cryptography and cybersecurity.

17:57

And they did audits with permission of

18:00

Bitwarden, LastPass, and Dashlane.

18:02

And so they basically had a – I

18:06

thought I saw it in here somewhere.

18:07

They had a –

18:09

Yeah, in controlled tests,

18:10

the team was able to recover passwords and

18:12

tamper with vault data,

18:14

challenging longstanding zero-knowledge

18:15

encryption claims made by vendors.

18:18

And then the findings were published in a

18:19

technical paper and disclosed to vendors

18:21

under a coordinated 90-day process.

18:23

So usually the way these audits work is

18:26

the vendor will set up

18:29

basically like a parallel environment,

18:30

but it won't actually have any user data.

18:33

So that way,

18:33

if they do find any problems,

18:34

it's like nobody's actually exposed,

18:36

but now we know these problems exist.

18:38

So they probably did something similar

18:39

here where they set up like a test

18:41

server and ETH Zurich found some pretty

18:44

troubling stuff.

18:45

So unfortunately, Bitwarden did the worst.

18:47

They had twelve attacks against Bitwarden,

18:50

seven against LastPass and six against

18:52

Dashlane.

18:52

And real quick,

18:53

why those three is because apparently

18:55

those three are the most popular password

18:57

managers out there currently.

18:59

And they account for more than 60

19:01

million users and about 23

19:02

percent of the market.

19:03

So those three account for about a quarter

19:06

of all password manager users.

19:09

Um, yeah, if you're watching,

19:11

you can see here,

19:12

there's like a list of the type of

19:15

attacks they found.

19:16

And you can kind of tell the BW

19:17

is Bitwarden.

19:18

The LP is LastPass.

19:19

The DL is Dashlane.

19:20

Um, Bitwarden,

19:23

I should say Bitwarden and Dashlane have

19:26

fixed most of these, I believe.

19:29

Uh, LastPass is working on fixing them.

19:32

The fact that LastPass is still in the

19:34

top three at this point makes me sad.

19:37

But anyways, um,

19:39

Yeah, so it's, well, real quick,

19:42

let me just say, so Bitwarden,

19:44

I personally found their blog post to be

19:46

the best because they did actually give a

19:49

full explanation of all twelve

19:51

vulnerabilities.

19:53

I believe that they said all of them

19:54

were medium or low impact,

19:57

required an attacker to already have full

19:58

server control, which is worth noting,

20:00

but at the same time, it's, in theory,

20:04

we would hope that these are designed in

20:05

such a way where it doesn't matter.

20:06

Like,

20:07

that's kind of why they did this research

20:08

is,

20:10

Products like Bitwarden, Signal,

20:14

I'm trying to think of some others,

20:15

Proton, in theory.

20:17

In theory,

20:17

the way these products are designed is

20:19

that it doesn't matter if the server is

20:21

malicious because everything happens on

20:22

device, everything is really secure,

20:24

and the server being malicious is more

20:27

kind of like a bummer than an actual

20:29

problem.

20:30

And that was not the case here.
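
To make the zero-knowledge design being described here concrete, below is a minimal sketch of client-side vault encryption in Python, using the third-party cryptography library. It is only an illustration of the general idea, not Bitwarden's, LastPass's, or Dashlane's actual scheme; the KDF parameters, key handling, and data format are simplified assumptions.

```python
import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_vault_key(master_password: str, salt: bytes) -> bytes:
    # The key only ever exists on the client; the server never sees the
    # master password or anything derived from it.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(master_password.encode()))

# Client side: encrypt an entry before it ever leaves the device.
salt = os.urandom(16)
key = derive_vault_key("correct horse battery staple", salt)
ciphertext = Fernet(key).encrypt(b'{"site": "example.com", "password": "hunter2"}')

# Server side: all it can store (or leak) is the salt and an opaque blob.
stored_on_server = {"salt": salt, "blob": ciphertext}
```

The point of the design is that a malicious or breached server holds only the salt and the ciphertext, so in principle it should learn nothing about the entries, which is exactly the property the researchers were testing.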

20:32

So again,

20:33

getting back to the vendor response,

20:35

Bitwarden did fix...

20:37

I think nine of them and three of

20:39

them, they, uh,

20:42

I guess the term is they accepted it.

20:43

They basically said like, we hear you,

20:45

we acknowledge it.

20:45

And here's why we're not fixing it.

20:46

Um,

20:49

The reasons they gave made sense

20:51

in my opinion.

20:52

Like one of them was, uh,

20:54

they basically said like,

20:55

we need this functionality for shared

20:57

vaults to work.

20:57

Like if you share with another user or

20:59

a family member, which I hear,

21:01

but at the same time,

21:02

all three of them that they didn't fix,

21:03

they also said like,

21:05

We'd be open to looking into this in

21:06

the future,

21:07

which I appreciate the humility.

21:09

But at the same time, it's like,

21:11

why not just fix it now?

21:12

I don't know.

21:13

I just I don't like that they left

21:14

stuff open,

21:14

even though I understood their answers.

21:15

It's like, but can you fix it anyways?

21:17

There's got to be a way to do

21:18

it.

21:21

Dashlane was a lot less open on their

21:24

blog post.

21:25

They said that they did fix some stuff,

21:26

but they didn't really give that same

21:28

detailed breakdown that Bitwarden did,

21:30

which makes sense because Dashlane and

21:32

LastPass are not open source.

21:33

So they're just not as transparent,

21:36

I guess.

21:37

And LastPass, like I said,

21:39

I think they fixed one of the issues.

21:41

And I think they've got a couple others

21:44

that they've got the fixes ready for,

21:45

but they haven't rolled out yet.

21:46

And then they've got a few more that

21:47

are still in progress.

21:49

Um, although again,

21:51

personal biased opinion,

21:53

I would not use LastPass if you

21:54

paid me after their last big data breach.

21:57

So yeah.

21:59

And I,

21:59

I think this is really disappointing

22:00

because again, the,

22:04

the idea of an attack like this is

22:06

we want to make sure that your,

22:10

your vaults are protected no matter what,

22:13

like that is the whole point of a

22:14

password manager is that you can trust

22:16

this.

22:17

And again,

22:18

It's very frustrating when that is not the

22:20

case and that does not turn out to

22:21

be true.

22:23

I really don't have an excuse for that.

22:24

It's very frustrating.

22:25

But at the same time, I think,

22:27

because I know already there's probably

22:30

some of our more hardcore veteran

22:32

listeners or viewers,

22:34

they're thinking like, oh, well,

22:35

this is why I use KeePass.

22:36

This is why I use offline password

22:37

managers, which is great.

22:38

If you have the kind of organizational

22:40

skill to do that, that's fantastic.

22:42

And I'm totally in favor of it.

22:43

We do recommend KeePass (KeePassXC

22:46

specifically, I think) on Privacy Guides.

22:50

But for a lot of people,

22:52

offline password managers are

22:55

too much work.

22:55

And the problem with security is security

22:59

requires you to trade convenience,

23:01

but everybody has a different threshold of

23:02

convenience.

23:03

And for a lot of people,

23:04

well, for everybody,

23:05

once something becomes too inconvenient,

23:07

they're going to stop doing it because

23:09

it's just too much work and it's not

23:10

worth it.

23:11

And everybody has a different level.

23:12

You know, some people don't mind KeePass.

23:13

Some people do.

23:14

So that's kind of the concern or the

23:16

unfortunate side of KeePass,

23:17

because yes, in a perfect world,

23:19

that would be great.

23:20

But for a lot of people that requires

23:21

you to manually sync up across multiple

23:23

devices and

23:24

that requires you to manually keep really

23:26

good backups.

23:27

And the nice thing about cloud-based

23:28

password managers is it's just easy.

23:30

Bitwarden syncs across every device.

23:32

It looks really clean.

23:33

I don't have to worry about keeping it

23:34

updated.

23:35

Well, I mean,

23:36

I have to keep the app updated,

23:37

but you know what I mean?

23:38

It's just such a seamless,

23:41

easy user experience.

23:44

So it is really unfortunate to see

23:46

when these kind of things happen.

23:47

And I'm really glad that Bitwarden

23:48

especially took this to heart and they

23:50

took the criticism and they fixed it.

23:52

I hope that they will fix the remaining

23:53

vulnerabilities if possible,

23:55

because I feel like it's one thing when

24:00

there's a vulnerability and you say like,

24:01

okay, we hear you,

24:02

but the odds of that are pretty low.

24:04

It's kind of out of scope.

24:05

It's very unlikely.

24:07

But this is, again,

24:07

this is like the whole thing the product

24:08

is supposed to do is keep your vault

24:10

safe, even if the server is compromised.

24:12

So I feel like this one's a pretty

24:13

big deal.

24:14

And the last thing I want to throw

24:15

in there real quick is

24:16

Uh, 1Password was not audited,

24:19

but they went ahead and released a blog

24:20

post and basically said like,

24:21

this wouldn't impact us because they have

24:23

that.

24:23

Like, I,

24:25

I still don't fully understand 1Password's

24:26

structure,

24:27

even though I've read about it like a

24:29

million times,

24:29

but they basically have some kind of, um,

24:33

like a two-password system where you sign

24:38

up and it's not quite your recovery key,

24:40

but it kind of is.

24:40

I don't know.

24:41

Either way,

24:42

the way that they have their setup,

24:43

they said that this would not have

24:44

affected them.

24:45

So if you are a 1Password user,

24:47

go ahead and pat yourself on the back.

24:48

And as usual,

24:50

1Password continues to have great

24:51

security.

24:53

And ProtonPass, I don't think,

24:54

has released a blog post, surprisingly,

24:56

and they were not part of this audit,

24:59

so I don't know how they fare, but...

25:01

I think those are my thoughts.

25:03

Did you have any additional takeaways from

25:04

that, Jordan?

25:06

Yes.

25:06

So I did end up putting together a

25:09

post on our social media channels and I

25:12

did a little bit more research into this

25:14

article and sort of like the timelines of

25:16

things.

25:17

And one interesting thing that I did find

25:19

was LastPass was sort of

25:23

downplaying some of the severity risks of

25:27

these vulnerabilities that were found by

25:29

ETH Zurich.

25:30

So they said,

25:32

our own assessment of these risks may not

25:34

fully align with the severity ratings

25:36

assigned by the ETH Zurich team.

25:40

And I think the interesting thing to think

25:42

about here is I don't think we should

25:44

be trusting LastPass,

25:46

especially because in 2022,

25:49

they basically had a breach which impacted

25:51

1.6 million of their users

25:53

because they didn't adequately secure

25:56

their infrastructure.

25:57

And it also showed that a lot of

25:59

the fields in LastPass weren't actually

26:02

encrypted and were stored in plain text,

26:04

which basically allowed for, you know...

26:08

There was a breach of the server, like

26:10

we're talking about in this circumstance.

26:12

If something is zero knowledge, then you

26:14

know, you should expect that every single

26:17

piece of data is actually protected, right?

26:20

So

26:22

Zero knowledge needs to cover every single

26:25

data field.

26:26

It needs to cover metadata.

26:29

It needs to cover everything, right?

26:30

Otherwise there's information that the

26:33

provider has and it's no longer zero

26:34

knowledge.
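
As a toy illustration of why coverage matters, the sketch below contrasts encrypting only the password field with encrypting the whole record. The field names and the use of Fernet are arbitrary assumptions for the example; the "partial" case is only loosely analogous to reports that older LastPass vaults left URL fields unencrypted.

```python
import json
from cryptography.fernet import Fernet

vault_key = Fernet.generate_key()   # stands in for a key derived from the master password
f = Fernet(vault_key)

entry = {"url": "https://mybank.example", "username": "alice", "password": "hunter2"}

# Partial encryption: the password is protected, but the server (or anyone who
# breaches it) still learns every site you have an account with.
partial = {"url": entry["url"], "username": entry["username"],
           "password": f.encrypt(entry["password"].encode())}

# Full-record encryption: the server stores one opaque blob and learns nothing
# about the fields, metadata included.
full = f.encrypt(json.dumps(entry).encode())
```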

26:35

I think there's definitely been an

26:37

interesting debate that we've been having

26:41

on the team about, you know,

26:43

zero knowledge, zero access,

26:47

all these buzzwords that a lot of

26:49

companies like to throw around.

26:51

You know,

26:53

they're becoming the 'military grade

26:55

encryption' sort of

26:58

thing that we always kind of make fun

27:00

of because it doesn't really mean anything

27:02

unless the implementation is actually

27:04

correct.

27:05

So I think one thing as a takeaway

27:08

from this is

27:10

If you're still using LastPass,

27:12

please stop.

27:15

There's so many great options now,

27:16

especially because, you know,

27:19

you've got all sorts of options that you

27:21

can pick.

27:21

Like Nate was saying,

27:23

if you're not sort of,

27:26

if you don't need that high level of

27:27

security of a local password manager,

27:29

like KeePass,

27:30

then you can also use a bunch of

27:32

these reputable cloud-based ones.

27:34

And I think the way that Bitwarden handled

27:36

this was incredibly professional.

27:39

It showed that they have a good

27:42

understanding of how to disclose fixes,

27:46

how to actually show and be transparent

27:48

about fixing things.

27:49

So I think they had a great response

27:52

and, you know,

27:52

1Password wasn't affected, and Dashlane

27:55

also had a good response,

27:56

but I think we should try and center

27:58

this back on some of the recommendations

28:00

that we have on the site.

28:01

So we do recommend ProtonPass,

28:04

which Nate talked about a little bit.

28:06

They weren't included in this,

28:07

so we're not sure if that affects them

28:08

or not,

28:09

but

28:11

It's another password manager that we

28:13

recommend.

28:13

They've been audited.

28:15

They've passed rigorous checks from our

28:17

community members and our staff members to

28:19

be recommended on privacy guides.

28:21

And we also recommend Bitwarden because

28:24

they're open source.

28:25

They're transparent.

28:26

They offer a high level of security.

28:31

There's a couple of other ones that we

28:32

do recommend, such as 1Password,

28:35

which, like Nate said, does have

28:36

great security but does come with the

28:39

unfortunate side effect of being

28:41

proprietary. And there's also the Psono

28:46

password manager,

28:47

which is a German password manager.

28:49

It's definitely more of a niche

28:51

recommendation because

28:53

It's less popular than Bitwarden,

28:55

1Password, and ProtonPass,

28:57

but it still meets all of our criteria

28:58

as well.

28:59

And of course,

29:00

when we move on to the local password

29:02

managers, there's KeePassXC,

29:04

which is basically the gold standard of

29:06

KeePass clients,

29:07

and it's available on all desktop

29:10

platforms,

29:11

which is

29:11

Great.

29:13

And there's also KeePassDX,

29:14

which is available on Android,

29:16

which allows you to access your KeePass

29:18

databases on your Android device.

29:21

And we also recommend KeePassium,

29:23

which is available on iOS and macOS to

29:26

access your KeePass databases there.

29:30

So I think this is a great opportunity

29:33

to push people towards safer tools,

29:37

tools that

29:38

follow proper security

29:41

protocols, right?

29:44

I think that's all we can kind of

29:46

take away from this.

29:47

I think every service is gonna have people

29:50

that find issues.

29:53

The best thing we can hope for is

29:54

how quickly these companies respond,

29:57

how well they respond and how

29:58

transparently they respond.

29:59

So I think the gold standard there was

30:02

Bitwarden.

30:02

They took it very seriously.

30:04

They actually,

30:06

implemented all the changes.

30:08

Most of the changes, actually,

30:09

I should say,

30:10

there's a couple that they weren't able to

30:11

fix.

30:12

But I think that is what we should

30:14

be looking for when these things happen,

30:16

because every tool is going to have an

30:19

issue,

30:19

it's always going to have vulnerabilities,

30:21

it's going to always have issues.

30:22

It's just the way the company treats these

30:25

vulnerabilities.

30:26

That is the most important thing.

30:27

So I think this is, I guess,

30:32

a great

30:36

a great time to kind of segue into

30:40

our next story here.

30:41

Unless, Nate,

30:42

you have anything else you want to add?

30:45

I just wanted to say I'm really glad

30:47

you mentioned that about LastPass.

30:50

I didn't really notice that their response

30:53

contradicted the researchers' severity ratings.

30:53

I just skimmed it real quick to see

30:54

if they were like, here's what we found,

30:56

here's what we fixed,

30:57

here's what kind of like Bitwarden did.

30:59

So yeah,

31:00

they are not the most trustworthy.

31:04

Yeah, that's crazy for them to be like,

31:06

oh, this isn't as bad.

31:08

And it's like, yeah,

31:09

let's take your word for it.

31:11

Yeah.

31:14

So I guess with that being said,

31:15

we can move into some exciting iOS-based

31:18

news here.

31:19

And after that,

31:19

we'll talk a little bit about Microsoft

31:22

Copilot sending confidential files,

31:25

but first let's dive into the iOS.

31:28

news here.

31:29

So iOS 26.3 adds a

31:31

unique new privacy feature and it's Apple

31:34

at its best.

31:35

This is an article from

31:37

9to5Mac.

31:38

And basically this is an update that

31:40

allows people to have additional privacy

31:45

against their cellular provider,

31:46

which is like, you know,

31:47

the company you pay for your mobile plan.

31:50

And basically it's because of this new

31:52

C1X modem in Apple's new products.

31:55

So basically before they were using like

31:57

Qualcomm modems instead of having their

32:00

own custom Silicon,

32:01

but now Apple's developed their own custom

32:04

modem,

32:05

which I guess means that they've

32:06

decoupled from a third party company and

32:08

they're keeping more things inside.

32:11

Um,

32:13

So I think that's it's definitely an

32:16

interesting move from Apple.

32:18

And I think this sort of feature is

32:21

basically I can quote from the article

32:23

here.

32:23

Cellular networks can determine your

32:25

location based on which cell towers your

32:27

device connects to.

32:29

The limit precise location setting

32:31

enhances your location privacy by reducing

32:34

the precision of location data available

32:37

to cellular networks.

32:38

With this setting turned on,

32:40

some information made available to

32:41

cellular networks is limited.

32:44

As a result,

32:45

they might be able to determine only a

32:47

less precise location, for example,

32:49

the neighborhood where your device is

32:51

located,

32:52

rather than a more precise location such

32:54

as a street address.

32:56

The setting doesn't impact signal quality

32:59

or user experience.

33:00

So not entirely sure how this feature

33:03

works, I guess.

33:06

I guess it's not super clear,

33:07

especially because this is

33:11

basically a brand new feature that

33:13

only just came out last week.
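
Apple hasn't documented how the modem-level limiting works, but as a rough intuition for "neighborhood rather than street address," here is a toy Python sketch of coordinate coarsening. The coordinates and the rounding approach are purely illustrative assumptions; the real feature operates between the modem and the cellular network, not in app code.

```python
def coarsen(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    # Two decimal places is roughly 1 km of precision (neighborhood level),
    # versus about 1 m at five decimal places (street-address level).
    return round(lat, decimals), round(lon, decimals)

precise = (37.33182, -122.03118)    # hypothetical street-level fix
print(coarsen(*precise))            # (37.33, -122.03): same neighborhood, not the same address
```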

33:15

Um, so basically

33:19

it's important to remember, though,

33:20

with this feature that it's only

33:22

available on very specific devices,

33:24

which have the C1X modem.

33:26

So that would be the iPhone Air,

33:29

the iPhone,

33:31

and the iPad Pro with cellular connection.

33:34

And of course,

33:35

it's only supported by very specific

33:37

carriers.

33:38

So in Germany, Telekom is supported.

33:40

In the United Kingdom,

33:41

EE and BT are supported.

33:43

In the United States,

33:44

Boost Mobile is supported.

33:46

And in Thailand,

33:47

AIS and True are supported.

33:51

So

33:52

It's basically an additional privacy

33:56

setting that Apple has added to their

33:58

devices.

33:58

I think this is definitely a positive,

34:01

especially because now with 5G

34:04

connections,

34:05

it enables much closer tracking of your

34:07

location.

34:10

Especially because the towers have to be

34:11

closer together.

34:13

It's much easier to identify your location

34:17

based on your cellular signals.

34:19

So I think this is definitely a step

34:21

in the right direction that I think we

34:23

should see other companies also following

34:25

suit because this is sort of an issue

34:28

that some people deal with.

34:30

And reducing the amount of data points

34:33

that your carrier has is definitely a

34:35

benefit.

34:37

What are you thinking about this one,

34:38

Nate?

34:41

No, I agree.

34:42

I think this is really, really cool.

34:45

To me,

34:45

this reminds me a lot of how in

34:48

modern smartphone OSs, Android, iOS,

34:53

you can... Maybe not Android,

34:55

but I know iOS for sure.

34:57

You can tell an app if you want

34:59

it to have precise location or coarse

35:02

location.

35:03

C-O-A-R-S-E, coarse, like rough location.

35:06

And I think that's amazing because...

35:10

I mean, first of all,

35:11

I think a lot of apps shouldn't require

35:12

you to have location anyways.

35:13

Like a lot of, you know,

35:16

rewards apps for fast food.

35:18

They're like, oh,

35:18

what's your precise location?

35:19

Just to function.

35:20

And it's like, no,

35:20

I don't want to find the nearest store.

35:22

I know what store I'm going to.

35:24

And so it's really cool to see them

35:26

roll out this feature.

35:29

The one thing I didn't find that I'm

35:30

a little curious about is if it

35:33

if it will be bypassed for emergency

35:35

services,

35:36

like if you call, here in the U.S.,

35:37

911,

35:38

I know it's something else in other

35:40

countries,

35:41

but if you call emergency services,

35:42

will that bypass it and give precise

35:45

location or will it continue to only give,

35:47

um, rough location?

35:48

I have to assume it's, it'll,

35:50

it'll bypass it, but yeah, I, I'm also,

35:54

I guess I'm curious to see what exactly

35:56

this defends against.

35:57

My money says probably things like

35:59

geofence warrants and stuff like that.

36:01

Um,

36:02

But yeah, I don't know.

36:03

Overall,

36:04

I think this is a really cool feature.

36:07

And the thing I'm excited about is that

36:10

in my experience,

36:11

phones are always an arms race, right?

36:15

We've seen that a lot,

36:16

especially with the privacy stuff.

36:17

Like Apple rolled out,

36:19

I think it was Apple that first rolled

36:20

out granular app permissions.

36:23

And then Android came in later.

36:25

And then

36:26

Apple rolled out the privacy dashboard and

36:28

then in screen time,

36:29

and then Android rolled out the same

36:31

thing.

36:31

And I think Android actually beat Apple to

36:32

one of them.

36:33

I can't remember which one,

36:34

but even GrapheneOS, you know,

36:35

GrapheneOS rolled out storage scopes and

36:37

contact scopes.

36:38

And then it took a couple of years,

36:39

but Apple rolled out contact scopes now.

36:42

Or, no, storage scopes.

36:43

I think contact is still coming,

36:45

but you know,

36:45

and now everybody gets to benefit from

36:46

that.

36:47

So this is to me,

36:50

this is one of those things where a

36:52

rising tide lifts all ships.

36:54

And so it's,

36:56

We obviously would prefer for people to

36:58

use like GrapheneOS or something,

37:00

but we've covered this many times.

37:01

There's a lot of countries where pixels

37:03

are not available.

37:04

Pixels are expensive, whatever the case.

37:06

Maybe you just bought a brand new phone

37:07

and this is when you decided to get

37:08

into privacy.

37:09

And I totally don't blame you for throwing

37:11

away a brand new phone and running out

37:12

to buy another one.

37:13

So privacy is for everybody,

37:15

regardless of what phone OS they're using.

37:16

Some make it easier than others.

37:18

And it's really cool when we see features

37:20

like this roll out

37:21

that help everybody.

37:23

And my hope is that now Android will

37:24

be forced to copy this and we'll get

37:26

something similar on the Android side as

37:27

well.

37:28

So I think that's my main thought with

37:29

that one.

37:31

Yeah, I mean, it's definitely interesting.

37:34

I think with Android,

37:36

they don't have the same level of control

37:39

that Apple does because they are doing

37:41

this through their new C1X modem.

37:43

I think a lot of Android devices,

37:45

they're all reliant on these massive

37:47

silicon companies like Qualcomm, Broadcom,

37:51

et cetera, et cetera.

37:53

So I think the chances of seeing it

37:54

are definitely lower because, you know,

37:56

Apple is in this position of control here

37:58

where they have the ability to basically,

38:01

I think this is one of the benefits

38:02

of Apple really,

38:03

because they have such control over

38:05

everything.

38:06

They can make these bespoke solutions that

38:09

other companies just wouldn't be able to

38:11

do.

38:12

So it's definitely,

38:14

it's good that they're using this new chip

38:17

for additional privacy protection.

38:22

But yeah,

38:23

I feel like that's that story out.

38:25

Do you want to talk a little bit

38:26

more about some more upcoming iOS features

38:28

here, Nate?

38:31

Sure.

38:32

Let's talk about,

38:33

this will be pretty quick,

38:34

but the iOS 26.4 beta,

38:38

the first version has already been

38:39

released.

38:41

And like I said,

38:41

we'll keep this quick because there's

38:42

really just a couple of things that we

38:44

have talked about in the past.

38:46

The first one we'll go ahead and talk

38:47

about is end to end encryption for RCS.

38:49

So I want to say Jonah and I

38:50

talked about this a few episodes ago.

38:53

RCS is

38:56

the new standard that's supposed to be

38:57

replacing SMS,

38:59

and it brings a lot of really fun

39:01

little features.

39:02

All the same stuff you enjoy with

39:04

iMessage, really. Bigger attachment sizes.

39:06

You can emoji react to messages,

39:08

like you can give a thumbs up reaction.

39:10

I think GIF support,

39:11

but don't quote me on that.

39:12

It's just all around a better user

39:14

experience.

39:14

But one of the cool things it brings

39:15

with it is the ability to have end-to-end

39:17

encrypted messages.

39:18

However, comma,

39:20

people need to enable that.

39:22

So originally,

39:23

Apple said that they were not going to...

39:26

to support end-to-end encryption with

39:29

Android.

39:29

And I'm told, uh,

39:30

I didn't look into that too closely

39:31

admittedly,

39:32

but I'm told from multiple people that's

39:34

because Google was trying to force like

39:36

their proprietary version of it.

39:38

And Apple was like, no,

39:39

this is an open standard.

39:40

We're not going to play ball with you.

39:42

And eventually Google backed down and they

39:43

went with the open standard again.

39:44

That's just what I'm told.

39:46

But either way, uh,

39:47

Apple has since changed course and said,

39:49

yes,

39:49

we will support end-to-end encrypted RCS.

39:52

And I believe the code for this originally

39:54

showed up in this last beta that just

39:56

came out, the 26.3.

39:59

But even at the time,

40:01

whoever I was hosting with,

40:02

I'm pretty sure it was Jonah I was

40:02

hosting with.

40:04

They pointed out it's like this isn't a

40:05

guarantee.

40:05

It's just, you know, it's coming.

40:07

It may be this one.

40:08

It may be the next one.

40:09

And now it's looking like it's going to

40:10

be the next one.

40:11

26.4.

40:12

We are seeing actual code for encrypted

40:15

RCS, which is fantastic.

40:18

The drawback is that

40:21

this still has to be enabled by the

40:23

carrier as well, from what I've been told.

40:26

So just like with this location thing we

40:32

were just talking about for cell phones,

40:34

the capability will be there,

40:36

at least a lot more widely than the

40:37

cell phone thing.

40:38

The capability will be there,

40:40

but the carriers will have to choose to

40:42

support it.

40:44

And unfortunately,

40:46

I don't know if there's enough incentive

40:47

for them to.

40:48

I don't know how much they're going to

40:49

care.

40:50

I hope they will, but no guarantees.

40:53

Fingers crossed.

40:54

And then the other thing that's really

40:55

cool is Apple.

40:57

Again, Apple and Google, arms race.

40:58

They're always copying each other.

41:00

Apple has this thing called stolen device

41:01

protection,

41:02

which is supposed to protect your phone if

41:04

it gets stolen,

41:05

if it gets snatched out of your hand

41:06

or something.

41:07

And it basically does a lot of like...

41:11

I think, uh,

41:12

like it requires additional

41:14

authentication to access, like, Apple Pay or

41:17

your iCloud account, things like that.

41:18

If it detects that it's not in a

41:19

familiar location, things of that nature.

41:21

So it's,

41:21

it's a pretty neat little feature if

41:22

you're an iPhone user.

41:24

I think it requires iCloud,

41:26

but don't quote me.

41:27

But anyways, up until now,

41:28

that has been an opt-in feature.

41:30

You have to go enable it,

41:32

and now it will be enabled by default.

41:34

It will become opt-out,

41:35

which normally we are not fans when things

41:37

are opt-out,

41:38

but I think this is kind of one

41:40

of the good times where something should

41:42

be that way by default.

41:46

And there's some other stuff in that

41:47

article as well,

41:48

if you guys are interested,

41:49

but this is a podcast about privacy.

41:50

So that's kind of what we focused on.

41:54

Do you have anything you want to add

41:55

to those, Jordan?

41:57

Yeah,

41:57

I think this is the... I especially want

42:00

to talk about the stolen device

42:01

protection,

42:01

because what we were seeing with that is

42:05

some people would basically look over

42:07

people's shoulders and see them entering

42:09

their pin.

42:09

And basically that would allow them to

42:11

take full control of someone's device.

42:13

Right.

42:13

Because there were no restrictions on

42:17

accessing a device.

42:18

It would basically

42:21

It would allow if you had the pin,

42:24

you could change iCloud settings,

42:25

you could drain someone's bank account

42:27

using the Apple Pay.

42:28

It was a little bit ridiculous.

42:31

But someone in the chat, Lucas Truman,

42:35

said it exists on Samsung.

42:37

So the stolen device protection

42:40

on Google devices is different to Apple's

42:42

implementation.

42:44

The stolen device protection on Android

42:46

devices actually uses like proximity

42:48

sensors and stuff like that to basically

42:51

identify if someone is running away with

42:53

your phone and then it will lock the

42:55

phone automatically.

42:56

Whereas Apple's implementation is actually

43:00

more that it just requires a security

43:02

delay if you start to try and make

43:04

sensitive changes on your device,

43:06

like Nate was saying,

43:06

like changing your iCloud password,

43:09

disabling Find My,

43:11

these sort of sensitive things,

43:12

it adds a security delay.
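
To make the behavior being described concrete, here is a hypothetical Python sketch of a security-delay gate: sensitive changes need biometrics, and away from familiar locations they also need a waiting period and a second check. The action names, the delay length, and the function itself are illustrative assumptions, not Apple's actual implementation.

```python
import time

SENSITIVE_ACTIONS = {"change_icloud_password", "disable_find_my", "disable_protection"}
SECURITY_DELAY_SECONDS = 30 * 60   # illustrative; the real delay length is Apple's choice

def attempt_change(action: str, biometric_ok: bool, at_familiar_location: bool,
                   delay_started_at: float | None) -> str:
    if action not in SENSITIVE_ACTIONS:
        return "allowed"
    if not biometric_ok:
        return "denied: biometrics required, passcode alone is not enough"
    if at_familiar_location:
        return "allowed"                    # the delay only applies away from familiar places
    if delay_started_at is None:
        return "security delay started"     # first successful check starts the timer
    if time.time() - delay_started_at < SECURITY_DELAY_SECONDS:
        return "denied: security delay still running"
    return "allowed after a second biometric check"
```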

43:14

I think this is really important to be

43:16

enabled by default because this is the

43:18

sort of thing that is basically destroying the incentive for phone theft.

43:23

Because if they have this extra barrier

43:27

that they now have to worry about to

43:29

basically break into someone's device to

43:33

steal their money,

43:34

they're going to be

43:36

much less likely to actually steal

43:38

devices.

43:39

And I think that's why it's so important

43:40

that these features are enabled by

43:42

default,

43:43

because

43:44

It's a great way to basically stop thieves

43:47

from deciding to do this and steal

43:49

people's phones.

43:50

I don't understand why people do that.

43:52

It's like a literal tracking device.

43:53

I'm not really sure why people are still

43:55

doing this,

43:55

but

43:57

I guess you can drain people's bank

43:59

accounts,

44:00

but I guess after

44:02

iOS 26.4,

44:03

it's going to be a lot more difficult

44:04

to do these things.

44:06

I would also just mention that this isn't

44:10

a silver bullet.

44:10

Obviously,

44:11

it gives you thirty minutes after you're

44:13

in a not familiar location.

44:15

I would set this to always just in

44:17

case.

44:19

So it always activates.

44:20

I would not leave it on

44:21

'away from familiar locations.'

44:24

it makes it a little bit more annoying

44:26

to change things.

44:27

It's fine.

44:28

I don't think it's a problem.

44:30

Most people, I think, you know, if you're,

44:32

you're probably not making sensitive

44:34

changes on your phone very often.

44:35

And I think it's better to have that

44:37

protection because, you know,

44:39

if someone did steal your device,

44:40

they could just follow you home and then

44:42

unlock your device and steal all your

44:44

money.

44:44

It doesn't really make that much sense,

44:46

but it could be,

44:47

and it would stop them from getting quicker

44:50

access to your device.

44:52

So yeah,

44:54

I think this is why this is quite

44:55

an important feature.

44:56

I don't really have anything more to add

44:58

on the RCS front.

44:59

I think Nate did a pretty good job

45:02

of covering that.

45:03

But I think this stolen device protection

45:05

thing is pretty important.

45:11

Cool.

45:11

Okay.

45:14

Well, before we jump into our next story,

45:16

which I think is about AI,

45:18

we're actually going to take a quick pause

45:20

and a detour.

45:21

And we're going to talk about some updates

45:24

here behind the scenes at Privacy Guides.

45:27

We have been chugging along a lot behind

45:30

the scenes lately.

45:32

And we have a bunch of new videos

45:34

for one to share with you guys.

45:36

So if you are not subscribed to us

45:38

on YouTube,

45:40

a lot of you are already watching.

45:41

Actually, whatever platform you're on,

45:43

whatever platform you're watching on,

45:44

except maybe Twitch.

45:45

But go ahead and subscribe and follow us

45:47

because we do post about

45:49

new releases, news, and stuff like this.

45:52

So definitely.

45:53

But on YouTube, on PeerTube,

45:55

we now have our private browsing video

45:58

out.

45:58

Actually,

46:00

I think the private browsing one is still

46:01

syncing to PeerTube.

46:03

But that one should be up on PeerTube

46:04

any day now.

46:04

But that is out to the public now.

46:06

And it's all about private browsing.

46:10

Again,

46:10

I want to reiterate for a lot of

46:11

the veterans in the crowd,

46:12

you already know this stuff.

46:13

But believe it or not,

46:14

there are people who still think that

46:15

incognito mode is actually private.

46:17

So this is a great video to share

46:19

with people and

46:20

explain to them, like, why it's not. And

46:23

in addition to that, we go through a

46:25

lot of the other popular browsers, Vivaldi,

46:27

Opera, Safari. We talk about how they

46:29

measure up, and then of course our top

46:32

recommendations, which I think you guys

46:34

probably know, but I'm going to go ahead

46:36

and pretend it's a secret and not spoil it.

46:38

so

46:40

And then our smartphone privacy and

46:42

security course is still going strong.

46:44

The intermediate level just published to

46:47

everybody as well.

46:48

It is now public.

46:49

And that is iOS and Android.

46:51

So again, whichever phone you're on,

46:53

go ahead and check that out.

46:54

That's all about how to replace the stock

46:55

apps with much more privacy respecting

46:57

versions.

46:58

So like calendars, email, browsers.

47:00

And if you're on Android,

47:02

how to get those apps in a much

47:03

more private fashion.

47:05

Unfortunately,

47:05

that same option does not currently exist

47:08

on iOS, at least outside of Europe.

47:11

But yeah.

47:13

And then one more thing before I turn

47:15

it over to Jordan is we did make

47:19

some updates to the website.

47:21

We removed Yattee,

47:22

which is a YouTube front end for iOS

47:24

because it appears that it's no longer

47:26

being maintained.

47:28

According to the comments and the issue

47:29

that was opened,

47:30

it doesn't work very well.

47:31

It's even been removed from the App Store.

47:33

Which is a shame.

47:35

We have removed Dataveria from a list of

47:37

people search websites.

47:39

We have updated our information about uBlock Origin Lite's

47:44

capabilities.

47:45

So definitely check that out.

47:47

And the BitLocker command line workaround

47:49

no longer works in Windows Home.

47:51

So we have updated our instructions on

47:53

that.

47:54

as well as some stuff about Firefox.

47:56

And then I think the rest of it

47:57

is kind of like code behind the scenes

47:58

stuff.

47:59

But that is all in the show notes.

48:02

And actually,

48:04

that was a last-minute addition.

48:05

So it's not in the newsletter yet,

48:06

but I will make sure to add it

48:07

after this episode.

48:10

Jordan,

48:10

did you have anything to add to this

48:12

section?

48:14

Yes, thank you. So basically,

48:16

there were some people asking whether the

48:18

iOS and Android advanced sections for

48:22

our smartphone security guide are coming

48:23

out, and I can say that the iOS

48:27

advanced video is pretty much done. It's in

48:29

the pre-production stage right

48:32

now. It just needs some more small edits

48:34

to make sure it's

48:35

ready to go up. And then the Android

48:38

one is also at a similar draft

48:41

level, I guess. It still needs some changes

48:43

before it goes up, so I'm hoping to

48:45

work on that a bit today and also

48:47

tomorrow, and then we can also get that

48:50

out to our members next week.

48:53

That's the plan on that. But

48:57

I think there's also a,

49:00

we had some more news articles going out.

49:02

So Freya was working on posting those on

49:06

Ghost and that gets shared to our forum

49:08

and stuff.

49:09

So if you haven't been catching those,

49:11

there was one that they wrote about

49:13

Project Toscana,

49:15

which is basically an upgraded Google Face

49:18

Unlock.

49:19

They're basically planning to upgrade the

49:21

Face Unlock system on Google Pixel

49:22

devices.

49:23

They also had another one about iOS

49:26

twenty six point four beta RCS.

49:28

So if you're interested in reading more

49:30

about that,

49:31

definitely check out those articles from

49:33

Freya.

49:34

And there's also another one going up

49:36

soon,

49:36

which

49:39

is about AI.

49:41

So definitely check that out.

49:43

Make sure you follow our news feed because

49:45

Freya's managing that as well

49:48

as Nate.

49:48

Nate does some great posts there like data

49:50

breach roundups and all sorts of stuff

49:52

like that.

49:53

So if you're interested in keeping up to

49:55

date on the latest news,

49:57

definitely check out the Privacy Guides news

49:59

page.

50:01

But yeah,

50:01

that's sort of everything we're working

50:03

on.

50:04

We're hoping to work on some more

50:07

less course-related stuff and move into

50:10

some more, you know,

50:12

I think we had a private email video

50:14

planned.

50:15

So we're looking at that.

50:19

And yeah,

50:20

we're sort of finishing out all the

50:21

projects that we had going so far.

50:24

So that's sort of where we are at

50:26

on the video front.

50:30

Awesome.

50:32

Do you want to take this next story

50:35

or would you like me to jump into

50:36

that one?

50:38

Yes.

50:39

This next story,

50:40

like we were talking about before,

50:41

we hinted on it.

50:42

Microsoft says

50:43

bug causes Copilot to summarize

50:46

confidential emails.

50:47

Microsoft says,

50:48

a Microsoft three sixty-five co-pilot bug

50:51

has been causing the AI assistant to

50:53

summarize confidential e-mails,

50:55

since late January,

50:57

bypassing data loss prevention policies

51:00

that organizations rely on to protect

51:03

sensitive information.

51:05

According to a service alert seen by

51:07

Bleeping Computer,

51:08

this bug, tracked under

51:10

CW1226324.

51:13

That's a mouthful.

51:14

And first detected on January twenty first

51:17

affects the Copilot work tab chat feature.

51:20

which incorrectly reads and summarizes

51:22

emails stored in users' sent items and

51:25

draft folders,

51:26

including messages that carry

51:28

confidentiality labels explicitly designed

51:31

to restrict access by automated tools.

51:34

Copilot Chat,

51:36

or short for Microsoft Office,

51:39

is the company's AI-powered content-aware

51:42

chat

51:43

that lets users interact with AI agents,

51:45

Microsoft began rolling out co-pilot chat

51:47

to Word, Excel, PowerPoint, Outlook,

51:50

and OneNote for paying Microsoft three

51:53

sixty five business customers in September

51:56

twenty twenty five.

51:58

So the problem was that the user's email

52:00

messages with a confidential label applied

52:03

were being incorrectly processed by

52:05

Microsoft three sixty five co-pilot chat.

52:08

Microsoft said when it confirmed this

52:10

issue.

52:12

So this is obviously a major problem.

52:15

And I think this is sort of the

52:16

issue with integrating AI into so many of

52:19

these things, right?

52:20

If you don't implement these things

52:21

correctly,

52:23

you're basically just sharing confidential

52:26

or private stuff with an AI chat company,

52:29

which, you know,

52:31

their policies around what they're using

52:33

to train their models,

52:34

their policies around the chats that

52:36

you're sending are somewhat vague.

52:38

So

52:40

This is like a breach of confidentiality

52:43

agreements.

52:45

A lot of companies utilize these

52:46

confidentiality labels to make sure that

52:49

people aren't sending these emails outside

52:52

the company.

52:53

And the fact that Microsoft Copilot was

52:56

just scanning and summarizing this is

53:01

just...

53:02

absolutely ridiculous.

53:03

I'm sure that a lot of companies who

53:05

had specifically super sensitive stuff

53:08

they were discussing are probably really

53:10

mad at Microsoft right now because,

53:12

you know,

53:12

they now have technically they've sent all

53:14

of their confidential information to

53:17

copilot chat.

53:20

So it's just a kind of fail by

53:22

Microsoft.

53:23

This should have been caught.

53:24

I think the fact that we've got all

53:26

these AI chat bots that are like

53:28

summarizing people's entire inboxes is

53:31

kind of bad.

53:32

Like it's,

53:33

this is sort of inevitable when you

53:35

basically grant entire access to inboxes.

53:39

I think we should try and avoid these

53:41

sort of tools because this is sort of

53:43

an outcome of, you know,

53:45

unless someone sets it up perfectly or it

53:48

ends up sending things to an AI server

53:50

that it wasn't supposed to,

53:52

I think there's not really a good way

53:54

to implement these sort of tools.

53:56

What do you think, Nate?

54:00

I think my favorite part of that that

54:03

I somehow missed when I was reading that

54:05

article is the part where it summarizes

54:08

the sent and the drafts.

54:10

So not only was it doing something that

54:13

it explicitly was not supposed to,

54:15

it wasn't even being useful in the

54:17

process.

54:18

Like,

54:18

I don't need you to summarize my sent

54:20

and draft emails when I'm the one writing

54:22

them.

54:23

Or, I don't know,

54:24

I guess maybe a lot of people are

54:25

using Copilot to write their emails

54:26

nowadays.

54:26

So, like, it's Copilot reading Copilot.

54:29

But to me,

54:30

that was the moment where I was just

54:31

like, oh, my God, are you serious?

54:33

Like, again,

54:34

not only are you putting users' data at

54:36

risk,

54:37

you're not even doing a good job while

54:39

you're at it.

54:40

That's just...

54:42

Oh, classic.

54:43

What are the kids calling it these days?

54:45

Microslop.

54:47

So yeah, that cracked me up.

54:49

But yeah, there it is.

54:52

Lucas at Microslop.

54:53

Yep.

54:53

Ten out of ten.

54:55

No, it's just, yeah.

54:56

And that's, I don't understand how,

55:00

like you said, yeah,

55:00

that should have been something you would

55:02

catch in testing.

55:03

And it's just so...

55:06

I don't know.

55:06

'Cause

55:08

I want to be fair and I want

55:10

to acknowledge that like, okay,

55:11

there's always going to be bugs.

55:12

There's always going to be mistakes,

55:14

but there's certain bugs and mistakes that

55:15

it's just like,

55:16

how did it get that far through the

55:18

life cycle?

55:19

And no one caught it.

55:20

Like we see that with all kinds of

55:21

like product names and ads and slogans.

55:24

We see that all the time where it's

55:25

like,

55:25

how did this get all the way from

55:27

the boardroom to the TV and not one

55:30

person spoke up?

55:32

which I'm sure usually somebody did,

55:33

but they were told to shut up and

55:34

do your job.

55:35

And maybe that's what happened here.

55:36

So yeah, that's just crazy.

55:39

But yeah, I mean,

55:41

the privacy aspect of this is very

55:43

obvious.

55:43

What if you're sending sensitive medical

55:46

data, sensitive national secrets?

55:47

The government uses Microsoft for reasons

55:49

that are completely beyond me at this

55:50

point.

55:50

But the government uses Microsoft Windows

55:53

and Outlook and Azure and all kinds of

55:56

stuff.

55:56

And so like...

55:58

It's bad enough we've got our own

56:00

government officials adding journalists to

56:01

signal chats.

56:02

Now we've got Microsoft itself is training

56:04

on sensitive war plans.

56:06

It's just – it's crazy.

56:07

This is so not good for anybody.

56:10

So –

56:11

Yeah.

56:12

I think it also goes,

56:13

it's like the double whammy, you know,

56:14

we're like burning down forests and

56:16

building like power plants to fund all of

56:19

this.

56:20

And like,

56:20

it's like an endless money pit of like

56:23

AI data centers.

56:24

And then it's just being used for like

56:26

the most unnecessary stuff,

56:28

like summarizing people's sent and draft

56:31

folders.

56:31

Like what?

56:32

It's just so ridiculous.

56:34

I think this timeline is,

56:36

It's very scuffed.

56:38

I mean, yeah,

56:39

I don't really have too much more to

56:40

add on it.

56:41

Like,

56:41

I feel like we've kind of talked about

56:44

this for quite a bit, but I think,

56:46

you know, obviously no one here,

56:47

please don't use Microsoft products.

56:49

I think it goes without saying.

56:52

It would probably be interesting to see if

56:54

you,

56:55

a lot of companies use these at their

56:58

businesses.

56:59

So, you know,

57:00

if you work for a company that uses

57:01

these tools, I'd probably look at,

57:06

maybe letting them know that there might

57:10

be a confidentiality breach because this

57:12

is probably pretty common, right?

57:13

A lot of companies are using Microsoft

57:15

tools.

57:16

It's extremely common.

57:17

So, yeah,

57:18

there might be some massive breaches in

57:21

the future from someone accidentally

57:23

sending their war plans to Microsoft

57:26

Copilot or something.

57:28

I don't know.

57:29

We'll see.

57:31

I just want to add real quick,

57:32

you reminded me of,

57:33

I don't know if you've seen that meme

57:34

going around.

57:35

It's like a cartoon,

57:37

like the kind they draw in newspapers,

57:39

where the guy is like, oh,

57:40

we've invented this machine that answers

57:41

questions,

57:42

but you have to feed it twelve giraffes

57:44

per day.

57:45

And the other guy is like, wow,

57:47

that's a lot of giraffes,

57:48

but you said it answers questions, right?

57:50

And it's like, oh, no.

57:51

No, no, no, no, no, no, no, no,

57:53

no.

57:54

And that's just, yeah,

57:56

that's what came to mind when you talked

57:57

about burning forests and stuff.

57:59

Yeah, the cost is insane.

58:01

And it doesn't even do the thing it's

58:02

supposed to do well.

58:03

Anyways.

58:06

Yeah, it is kind of frustrating.

58:07

And that's kind of the atmosphere at the

58:11

moment.

58:14

All righty.

58:15

And then we had one more AI story.

58:19

I guess I'll go ahead and take this

58:21

one.

58:22

Yeah.

58:23

So this, this kind of tying in with,

58:25

uh,

58:26

AI and remembering the AI is a privacy

58:29

invasion.

58:29

So the headline,

58:30

this comes from 404 Media.

58:31

It says Grok exposed a porn

58:34

performer's

58:34

legal name and birth date without even

58:36

being asked.

58:36

And I think, um, sorry guys,

58:38

I didn't sign in before,

58:40

before we recorded,

58:41

but I remember this one,

58:41

I was going to show you the screenshot,

58:43

but literally, so somebody, um,

58:46

I guess somebody posted a picture of this

58:47

girl online and she is,

58:49

she's an adult performer.

58:50

Um,

58:52

But they posted a picture and somebody

58:53

else replied and they asked Grok, like,

58:55

who is this?

58:57

And that was literally it.

58:57

It was like, who is this?

58:59

And Grok said like, oh, that's a,

59:01

what is her stage name?

59:03

Siri Dahl, I guess.

59:05

And that's like, oh, that's Siri Dahl.

59:06

She's an adult performer,

59:08

but her legal name is this and her

59:09

birthday is this.

59:10

And it's just so like,

59:12

like on the one hand,

59:13

I understand how AI doesn't really

59:15

understand what you're asking because just

59:18

in case anybody's under the delusion out

59:19

there, AI is not sentient.

59:21

I don't care how convincing it looks.

59:23

I'm certainly not anywhere remotely

59:24

convinced.

59:26

But it,

59:27

I understand the concept of like,

59:28

it didn't know if you were asking like,

59:30

who is this person for real?

59:31

Who is this person's name?

59:33

But just the fact that it threw that

59:34

out there,

59:35

and especially this goes back to, again,

59:37

we mentioned this in previous episodes,

59:38

the idea that AI just trains on everything

59:41

indiscriminately without consent.

59:43

And I guarantee you...

59:45

In some kind of world,

59:47

I can see a world where this woman

59:49

may have said like, sure,

59:51

it can know who – like my stage

59:53

name.

59:53

It can know the sites I've been on.

59:54

It can know the videos I've been in.

59:56

I don't mind that.

59:57

But I think she would have been like,

59:58

I would prefer it not tell people my

1:00:00

legal name, right?

1:00:01

Like that's why so many performers use

1:00:03

stage names is to give themselves a little

1:00:05

bit of a layer of privacy.

1:00:06

Yeah.

1:00:07

just the fact that it scooped that

1:00:09

up and just threw it out there.

1:00:10

And you know, your personal morals aside,

1:00:13

like that's fine.

1:00:13

You can, it's not fine,

1:00:15

but your personal morals aside,

1:00:16

you can say like, well,

1:00:17

she's an adult performer, whatever.

1:00:19

I'm using a fake name.

1:00:20

Nate is not my real name.

1:00:21

I'm very open about that fact.

1:00:23

And what happens when you ask the AI,

1:00:24

like, Hey, who's this?

1:00:25

And it doxes me.

1:00:27

Like anybody, any of you,

1:00:28

like every single one of you in the

1:00:30

chat right now are using like fake names.

1:00:32

I'm hoping,

1:00:32

I'm hoping Lucas Trauman isn't your real

1:00:34

name, but maybe it is.

1:00:37

We have handles and we have usernames and

1:00:39

we have those for a reason.

1:00:41

And the whole point of privacy is that

1:00:42

you're supposed to have that...

1:00:45

that consent to be able to say who

1:00:48

you want to share data with and what

1:00:50

data you want to share with them.

1:00:51

And when AI scrapes all that up,

1:00:54

even if it never shares it like it

1:00:55

did here,

1:00:56

it's still taking away that agency from

1:00:58

you and it's taking away your right to

1:01:00

privacy and your control over that

1:01:01

information.

1:01:03

Yeah,

1:01:03

I wanted to throw this one into our

1:01:05

AI section because I felt like that was

1:01:06

a really important reminder.

1:01:08

And just, again,

1:01:09

the fact that it did so without being

1:01:10

prompted.

1:01:11

Grok has been completely insane from the

1:01:13

get-go.

1:01:15

Some people like that for some reason,

1:01:16

but the fact that it just threw that

1:01:18

out there completely unchecked really blew

1:01:20

me away.

1:01:20

And I was like, wow,

1:01:22

we should talk about this.

1:01:25

That one, I think, was pretty quick.

1:01:26

That's all I had on that one.

1:01:27

Do you have any thoughts on that, Jordan?

1:01:32

I mean,

1:01:33

I think this is sort of

1:01:35

a very unfortunate case of AI doing

1:01:40

something that you don't intend,

1:01:42

which seems to be very common with these

1:01:44

tools.

1:01:44

Like it's,

1:01:46

I guess I'm kind of interested to know

1:01:48

like I also don't have access to this

1:01:50

full article so I think I might need

1:01:53

to reload it or something but I think

1:01:56

you brought up some good points like I

1:01:57

think you know the whole point of these

1:02:01

adult performers using these like

1:02:02

pseudonyms or like stage names and you

1:02:06

know trying to have some level of privacy

1:02:08

because

1:02:10

Obviously, when you're in that industry,

1:02:12

I think the chances of, you know,

1:02:14

stalking and doxing and swatting are

1:02:19

substantially higher and your safety

1:02:21

concerns would be much worse than the

1:02:23

average person.

1:02:24

Like, you would need much more security.

1:02:27

So it's very concerning that there's

1:02:29

basically been...

1:02:32

a breach of, you know, this person's

1:02:34

information against their consent that's

1:02:37

probably causing them a bunch of issues

1:02:39

right now. So that's really unfortunate. But

1:02:44

yeah, I don't really have too much more

1:02:45

to add. I think this is

1:02:48

just a very unfortunate and sad story

1:02:50

because, yeah,

1:02:51

I think many people wouldn't like their

1:02:54

personal name being connected to their

1:02:56

activities.

1:02:57

And I think this is also especially

1:02:59

important with, like,

1:03:00

adult performers because, you know,

1:03:03

there's stigma around that industry and I

1:03:06

think some people don't want to deal with

1:03:08

that.

1:03:09

So, yeah,

1:03:11

that's kind of my take on it.

1:03:13

It's understandable that somebody...

1:03:17

Yeah,

1:03:18

there's a tweet that Nate's showing on the

1:03:20

screen.

1:03:20

So someone asked Grok, who is she?

1:03:23

What is her name?

1:03:24

And Grok appears to have responded,

1:03:27

she appears to be Siri Dahl,

1:03:29

an American adult film actress,

1:03:32

and then all the information about her.

1:03:35

So it's kind of unfortunate that...

1:03:41

this AI tool could identify somebody so

1:03:43

easily?

1:03:44

I guess it is somebody whose face is

1:03:47

pretty widely shared on the internet,

1:03:49

but still connecting that back to an

1:03:52

actual person's name.

1:03:53

How exactly did that happen?

1:03:55

That's kind of my question.

1:03:58

Where did it get this information?

1:04:01

I guess it's, you know,

1:04:03

possible that there was data brokers with

1:04:06

her information that were listed and an AI

1:04:09

scraped up that information and associated

1:04:11

it or something,

1:04:12

or doxing sites that found this

1:04:14

information and then basically just

1:04:16

published it.

1:04:17

I'm not really sure how that, you know,

1:04:25

happened.

1:04:27

Do you have any,

1:04:28

did you see anything in the article that

1:04:30

like suggested that or explained how that

1:04:32

happened?

1:04:33

Because it seems pretty, pretty terrible.

1:04:37

No.

1:04:38

And I'm scrolling through the article

1:04:40

again.

1:04:40

That's what's horrifying is she said that

1:04:42

up until now,

1:04:43

she's been able to keep her real name

1:04:45

kind of unknown.

1:04:48

And it says she's been paying for like

1:04:49

data removal services for years and stuff

1:04:52

like that.

1:04:52

And it's, it's unfortunate that

1:04:56

You know,

1:04:56

the thing that got me into privacy was

1:04:58

it was actually Michael Bazzell's podcast

1:05:00

back when there were two of them,

1:05:02

like him and his co-host.

1:05:03

And they said they were talking about why

1:05:06

they split up their data instead of like,

1:05:08

you know,

1:05:09

because I will admit I was that guy

1:05:10

that I use Gmail, Google search,

1:05:12

Google Chrome, Google Drive,

1:05:13

Google Calendar.

1:05:14

I used Google everything.

1:05:16

And they were talking about that and they

1:05:17

were like, yeah,

1:05:18

but the defender needs to get it right

1:05:19

every single time.

1:05:21

The attacker only needs to get it right

1:05:22

once.

1:05:23

And that was the moment it clicked for

1:05:24

me.

1:05:24

And I'm like, wow,

1:05:25

like not obviously like all of us,

1:05:27

I have nothing to hide, but I'm like,

1:05:30

if my Google account gets breached,

1:05:32

that's my entire life.

1:05:33

That's again, my calendar,

1:05:35

my browsing history, my searches,

1:05:37

my YouTube, like everything, my files,

1:05:39

everything.

1:05:40

And so that's when I kind of started

1:05:41

to diversify a little bit.

1:05:43

And when I did, I realized I'm like,

1:05:45

oh,

1:05:45

this is actually a lot easier than you

1:05:47

would think it is.

1:05:47

And that's kind of what got me started

1:05:48

in privacy.

1:05:49

But

1:05:51

It's unfortunate.

1:05:52

So when we think about that phrase,

1:05:54

you know,

1:05:54

the defender needs to get it right every

1:05:55

single time,

1:05:56

we think about it in terms of like

1:05:57

data breaches, right?

1:05:58

And so we want to, like,

1:06:00

or at least I do,

1:06:01

I guess I shouldn't speak for everybody,

1:06:02

but I think of it in terms of

1:06:03

data breaches.

1:06:04

We want to minimize how much data we

1:06:05

put out there.

1:06:05

We want to make sure we're diversifying so

1:06:07

that the fallout is reduced.

1:06:09

But

1:06:11

it also goes for this stuff.

1:06:12

It also goes for these data removal

1:06:15

services.

1:06:15

It goes for, and it's unfortunate because,

1:06:17

you know, you were asking, like,

1:06:18

do we know how that got out there?

1:06:22

Throw a coin and take your pick, right?

1:06:24

Or like throw a rock and, you know,

1:06:26

it's like, there's so many ways,

1:06:27

especially in America where our data laws

1:06:29

can best be,

1:06:30

or our privacy laws can best be described

1:06:31

as LOL.

1:06:32

Like there's just so many opportunities

1:06:35

and it's impossible to defend against them

1:06:38

all.

1:06:38

We try our best, you know, which is,

1:06:40

One of the reasons we recommend like data

1:06:42

removal services and Michael Bazzell's

1:06:43

workbook and Yael Grauer's Big Ass Data

1:06:45

Broker Opt Out List.

1:06:46

And whether you want to do it automated,

1:06:48

whether you want to do it personally,

1:06:50

whether you want to do a hybrid,

1:06:51

it's just there's so much to defend

1:06:53

against and it can be so exhausting.

1:06:55

And unfortunately,

1:06:56

I've personally seen some people burn out

1:06:58

and quit because it is so much work

1:06:59

and it's so exhausting.

1:07:00

And it's just, yeah,

1:07:01

who knows where they got it from?

1:07:02

There's a million places they could have

1:07:04

gotten it from.

1:07:04

And it's really hard.

1:07:06

just horrendously unfortunate.

1:07:08

Um,

1:07:09

especially when somebody seems to be doing

1:07:11

everything right and trying their best and

1:07:13

using the data services.

1:07:14

And I don't know which ones she used.

1:07:15

Maybe she was using one, but I mean,

1:07:17

we just got that study from consumer

1:07:18

reports, right?

1:07:19

Like a couple of years ago where we

1:07:21

finally got some transparency about which

1:07:23

ones work and which ones don't.

1:07:24

And that was one study.

1:07:25

And we really need a lot more of

1:07:27

them because like,

1:07:28

we still don't really know for sure which

1:07:30

ones are the most effective and

1:07:32

And yeah,

1:07:32

this could have come from so many,

1:07:33

so many places.

1:07:34

And it's just so unfortunate that somebody

1:07:36

is trying to do everything right and still

1:07:38

failing.

1:07:38

And now they're it says in this article,

1:07:40

they're like making a game out of it,

1:07:42

asking Grok, like,

1:07:43

what kind of car does she drive and

1:07:44

what's her address?

1:07:45

And she's like, yeah,

1:07:46

how long before Grok guesses?

1:07:48

They said so far it hasn't been able

1:07:50

to reply accurately yet,

1:07:52

but she worries it's only a matter of

1:07:53

time.

1:07:53

And it's like, cool, that's awesome.

1:07:56

So I don't know.

1:07:58

Yeah, that's that's a depressing story.

1:07:59

I feel so sorry for her.

1:08:02

Yeah,

1:08:02

I think it's like I was saying before,

1:08:04

this leads to, you know,

1:08:07

this can lead to violence,

1:08:08

it can lead to abuse.

1:08:09

Like it said,

1:08:10

almost instantly harassers started opening

1:08:13

Facebook accounts in her name and posting

1:08:15

stolen photos.

1:08:18

adult clips with her real name on sites

1:08:20

for leaking OnlyFans content. So, you know,

1:08:23

there's people... I think these people that

1:08:25

post on, you know, adult websites, I think

1:08:28

they're at much higher risk of receiving

1:08:31

harassment and abuse. So I think it's

1:08:34

especially important for someone like this

1:08:36

to have this protection. But I think when

1:08:38

it comes back to the data broker

1:08:40

stuff, I think

1:08:42

that, with the tools, it doesn't

1:08:42

matter so much about the tools and stuff.

1:08:44

I think there really needs to be a

1:08:47

change where that data isn't

1:08:48

allowed to be collected in the first place,

1:08:50

because you're basically just playing like

1:08:52

a cat and mouse game. In plenty of

1:08:53

countries, this information is not

1:08:55

allowed to be used for this purpose. Like,

1:08:59

that's the whole point of a lot of

1:09:01

data protection laws: they stop this

1:09:03

sort of thing happening. And

1:09:09

you know,

1:09:10

maybe it was great back in eighteen twenty

1:09:14

five when everyone needed to have access

1:09:16

to the public records of everyone because,

1:09:18

you know,

1:09:18

they needed to work out where to go

1:09:20

to, I don't know, find someone.

1:09:23

That's great.

1:09:23

But we live in the interconnected like

1:09:25

Internet age,

1:09:26

like people are instantly able to access

1:09:31

information.

1:09:31

They can find out things significantly

1:09:34

faster.

1:09:34

You don't have to go down to like

1:09:36

a courthouse.

1:09:36

You don't have to go down to like

1:09:38

a

1:09:39

government building to access this

1:09:41

information. It's readily available on the

1:09:42

internet. It shouldn't be, and it

1:09:46

shouldn't be allowed to be used for, like,

1:09:49

advertising, for all this creepy

1:09:51

stuff like training AI and stuff like that.

1:09:54

I honestly wouldn't be surprised if, you

1:09:58

know... I feel like Grok has the least

1:10:00

amount of ethical guidelines set for

1:10:04

it. Like, it'll just answer absolutely

1:10:05

everything, and it won't have any concern

1:10:08

over, like, the ethics. Like, where does this

1:10:10

person live? What's their address? It'll

1:10:12

just be like,

1:10:13

Certainly, here's the address of blah,

1:10:15

blah, blah.

1:10:16

It's just like, it's, you know,

1:10:18

AI is basically...

1:10:20

the guardrails are not very great.

1:10:23

Like we talked about with the Microsoft

1:10:25

copilot issue,

1:10:26

the guardrails are very thin.

1:10:29

People can use them for malicious

1:10:30

purposes.

1:10:31

Like we saw with this, you know,

1:10:32

finding out a person's identity or finding

1:10:36

out where they live, you know,

1:10:37

this is all concerning stuff.

1:10:38

And,

1:10:40

I think it's like a combination of things.

1:10:43

AI tools don't have really any ethical

1:10:46

frameworks.

1:10:47

They can be used for like really abusive

1:10:50

stuff.

1:10:50

And also just like the US's lack of

1:10:53

any national privacy laws restricting

1:10:56

companies from using people's information.

1:10:59

And I'm sure that, you know,

1:11:00

these AI chatbots have also been trained

1:11:04

on...

1:11:05

like a bunch of personal information has

1:11:07

been sucked up by these chatbots, right?

1:11:10

Like, I'm sure that, you know,

1:11:12

if you asked Grok about where Elon Musk

1:11:15

lives, it probably wouldn't say,

1:11:16

but I'm sure if you said some other

1:11:18

famous person whose address is somewhat

1:11:20

public,

1:11:20

it would come up and tell you exactly

1:11:22

where they live.

1:11:24

So it's, you know,

1:11:27

I don't think this is

1:11:29

great,

1:11:30

especially because it's so accessible now.

1:11:33

Like it's basically available to anyone.

1:11:35

Like you can use an AI chatbot for

1:11:37

free.

1:11:37

Anyone can access this.

1:11:39

So it becomes really concerning when it's

1:11:43

used to dox people.

1:11:46

So I think we'll only continue to see

1:11:48

more of this unless some changes are made,

1:11:50

which is kind of unfortunate, but yeah,

1:11:54

You can only protect yourself as much as

1:11:56

possible.

1:11:56

Like Nate was saying,

1:11:57

using all these data removal tools and

1:12:00

getting DMCA requests and all sorts of

1:12:02

stuff,

1:12:03

there's only so much you can do to

1:12:05

protect yourself.

1:12:06

And if the laws in the country don't

1:12:11

prioritize people's privacy,

1:12:13

then there's only so much you can do.

1:12:16

You're basically just removing stuff for

1:12:18

them to add it back again.

1:12:20

Um,

1:12:20

so I can definitely understand why someone

1:12:22

might feel worn out after they've just

1:12:25

constantly been removing the information

1:12:27

from the same sites over and over again.

1:12:30

Um, it definitely makes sense.

1:12:34

Yeah.

1:12:34

Thank you for completing my thought there.

1:12:35

Cause that's what I was going towards and

1:12:37

I forgot to like close the loop on

1:12:38

that is.

1:12:40

We shouldn't need to pay for these data

1:12:42

removal services.

1:12:43

We should just have strong data privacy

1:12:45

laws.

1:12:46

And I know this is really not the

1:12:47

best example,

1:12:48

but it makes me think of years ago,

1:12:52

somebody tracked...

1:12:55

I think it was a journalist at Wired.

1:12:56

They got the phone records of every phone

1:13:00

that went in and out of Epstein's Island

1:13:02

for like a year or a month or

1:13:04

something.

1:13:05

And the ones that went back to Europe,

1:13:08

as soon as they hit European airspace,

1:13:10

the record stopped because GDPR is so

1:13:12

strong there.

1:13:13

that they just basically stopped keeping

1:13:15

the records or they had been deleted by

1:13:16

that point or something.

1:13:18

Um,

1:13:18

like everywhere else we could track the

1:13:19

phones right back to their front door.

1:13:21

But that was the only one that like,

1:13:23

as soon as they hit European airspace,

1:13:24

they stopped.

1:13:25

And so like, it's laws are,

1:13:28

I know laws are really contentious in the

1:13:29

privacy space.

1:13:30

Cause some people are like, Oh,

1:13:31

nobody pays attention to laws.

1:13:32

They just don't work.

1:13:33

And, and yeah, some,

1:13:34

some companies will bypass them.

1:13:36

And then when that happens,

1:13:38

you have the right to sue them.

1:13:39

Hopefully if they're well-written laws and

1:13:41

you have private right of action, like,

1:13:43

It's not a silver bullet.

1:13:44

We definitely need all these layers.

1:13:45

We need the technical layers that enforce

1:13:48

the laws,

1:13:49

but we also need the laws that give

1:13:51

you that protection in the first place.

1:13:53

And I would be – I'm not going

1:13:55

to say I guarantee you.

1:13:57

I would be very shocked for something like

1:13:58

this to happen in Europe.

1:14:00

I don't believe she was European.

1:14:02

Because they just have stronger privacy

1:14:04

laws.

1:14:05

Yeah, it said she was American, right?

1:14:06

Yeah, an American film actress.

1:14:07

So yeah, that's definitely that.

1:14:11

And actually one more thing that occurred

1:14:13

to me real quick while you were talking

1:14:14

is the article says that the reason she

1:14:17

and a lot of other adult actors use

1:14:20

fake names and try to protect their

1:14:21

privacy is because they don't want their

1:14:24

family being harassed, which is a thing.

1:14:27

that I could absolutely see terrible

1:14:29

people doing.

1:14:30

You don't have to like porn.

1:14:31

That's fine.

1:14:32

I'm not here to convince you that you

1:14:33

should.

1:14:34

But a lot of people will take it

1:14:35

way too far and say,

1:14:37

I'm going to call your family and send

1:14:39

them pictures of you naked, clips of you,

1:14:41

and basically just harass them because

1:14:44

it'll guilt you.

1:14:45

And I think that's something I see more

1:14:49

in authoritarian governments, I think,

1:14:52

or at least that I read about more,

1:14:53

is there was a human rights lawyer in

1:14:55

Iran that...

1:14:56

her daughter was basically arrested one

1:14:59

time.

1:14:59

Um, I read this book years ago,

1:15:00

so I may have the finer details wrong,

1:15:02

but like her daughter was arrested coming

1:15:04

in and out of the country.

1:15:05

And she,

1:15:06

Basically,

1:15:07

the government was trying to pressure the

1:15:08

author, the actual lawyer, the mom.

1:15:11

They were trying to pressure her to quit

1:15:12

her job and stop being a lawyer.

1:15:15

And she told her daughter straight up.

1:15:16

She's like,

1:15:17

I have to pretend like I don't care

1:15:19

because if I bend and I give in

1:15:22

to their demands,

1:15:23

every time they want to pressure me,

1:15:25

they're going to go straight to you and

1:15:26

they're going to harass you.

1:15:28

And it took months,

1:15:30

but finally the government stopped

1:15:31

harassing her daughter and left her alone

1:15:33

and they've never bothered her since.

1:15:35

And that's something that, I don't know,

1:15:39

that's just a thought I guess I'm trying

1:15:40

to get at is like,

1:15:41

we tend to think of privacy as a

1:15:43

very individual thing.

1:15:45

And a lot of the work is,

1:15:47

you have to download Signal,

1:15:48

you have to sign up for the services,

1:15:50

you have to be mindful what data you

1:15:52

put out there,

1:15:53

but it is important to think about the

1:15:54

impact on the people around you as well.

1:15:56

And yeah.

1:15:56

Yeah.

1:15:57

I don't know if that necessarily applies

1:15:58

to everybody,

1:15:59

but I thought that was an interesting

1:16:01

thing that the article pointed out.

1:16:05

I think we've talked about that story

1:16:07

plenty,

1:16:07

unless you have any more thoughts to add.

1:16:12

No, I'm good.

1:16:13

I guess we could move on here to

1:16:15

some,

1:16:15

we got some quick stories here just to

1:16:17

quickly cover because

1:16:19

We've talked about this every week.

1:16:21

There's more of these age verification

1:16:24

laws for children coming out.

1:16:26

It's really unfortunate.

1:16:28

More countries are doing it.

1:16:30

There was even some movement in the US

1:16:36

as well.

1:16:37

I saw today, this morning from Politico.

1:16:40

I think the governor of California was

1:16:42

considering it.

1:16:43

So there's definitely movement in the US

1:16:45

as well.

1:16:45

This doesn't apply just to...

1:16:47

Australia and Europe. It's now happening

1:16:51

in the United States. Well, it's moving

1:16:53

to happen. We'll see how that happens. But,

1:16:56

so yeah, Newsom backs social media

1:16:59

restrictions for teens under... So he didn't

1:17:02

entirely say this was going to happen. He

1:17:05

just was less against it than usual.

1:17:09

So,

1:17:11

I'm not really certain about this.

1:17:12

I don't really know much about this guy,

1:17:14

but, um, I think this is basically,

1:17:18

he said that, uh,

1:17:20

he was convinced that this was a good

1:17:22

idea by some of the Australian, uh,

1:17:24

movements, where we

1:17:26

basically implemented a social media ban

1:17:28

for under sixteens.

1:17:30

Personally,

1:17:30

I think it has been largely pretty bad.

1:17:35

It hasn't worked that well.

1:17:36

Um,

1:17:38

So basically now they're having,

1:17:40

according to this article,

1:17:41

they're now having a debate over whether

1:17:44

this would be a good idea to implement.

1:17:48

And there's also another article that we

1:17:51

have here, which covers basically,

1:17:53

it's from TechCrunch,

1:17:55

and it's basically covering all the

1:17:57

countries that are moving to implement

1:18:00

social media bans and age verification and

1:18:02

identity verification.

1:18:04

So Australia, obviously,

1:18:06

we talked about that when it happened.

1:18:08

Denmark is set to ban social media

1:18:11

platforms for children under fifteen.

1:18:14

And France is also pushing for it for

1:18:17

kids under fifteen.

1:18:19

So it needs to get through the Senate,

1:18:22

though, before it actually gets passed.

1:18:25

In Germany,

1:18:25

there's also a movement to add a social

1:18:28

media ban for under sixteens.

1:18:32

There's still, you know,

1:18:34

it still needs to go through and be

1:18:35

approved before it actually happens.

1:18:38

Greece is also close to announcing a

1:18:40

social media ban for children under

1:18:42

fifteen.

1:18:43

Malaysia is also considering one for under

1:18:45

sixteen year olds.

1:18:46

Slovenia is also drafting legislation to

1:18:49

prohibit people under the age of fifteen

1:18:51

from accessing social media.

1:18:53

Spain has also announced

1:18:56

that they plan to ban social media for

1:18:58

children under the age of sixteen.

1:19:00

And the United Kingdom is weighing a ban

1:19:03

on social media for children under

1:19:05

sixteen.

1:19:06

So the UK, I would assume,

1:19:09

would be pretty likely just because

1:19:10

they've had the Online Safety Act for a

1:19:12

while, which kind of required this stuff.

1:19:16

So it's pretty much a...

1:19:21

it's just a continuation of what we've

1:19:22

been saying before here.

1:19:24

This is not great for people's privacy.

1:19:26

There's no way that you do this without,

1:19:29

you know, identifying people.

1:19:31

And not everyone has ID as well.

1:19:33

So you're locking people out of platforms.

1:19:35

And I think you're also locking children

1:19:38

out of communities that they, you know,

1:19:40

some people have very niche interests.

1:19:42

They're from a minority group.

1:19:46

They find solace in, you know, these,

1:19:52

these platforms,

1:19:53

these online platforms to talk and meet

1:19:56

people.

1:19:57

So taking away that access is probably

1:20:00

going to be pretty detrimental.

1:20:01

I think we're going to see that in

1:20:02

the next couple of years,

1:20:04

but it doesn't seem to have been super

1:20:06

effective.

1:20:06

At least in Australia,

1:20:07

a lot of people are still bypassing it,

1:20:10

but it could be that further restrictions

1:20:12

need to be put in place before it

1:20:14

becomes effective.

1:20:15

So we can only hope that that won't

1:20:17

happen because we probably don't want,

1:20:20

you know,

1:20:23

people to have to upload their ID in every

1:20:25

case,

1:20:26

because right now they're using age

1:20:27

assurance technology,

1:20:28

which we've talked about and said,

1:20:29

you know, that doesn't work very well.

1:20:31

It's kind of racist.

1:20:32

It's not great at determining people's

1:20:35

age.

1:20:35

And you basically have to send a biometric

1:20:37

scan of your face.

1:20:39

But yeah,

1:20:39

do you have anything you want to add

1:20:40

here, Nate?

1:20:41

I feel like we've definitely talked about

1:20:42

this a lot.

1:20:43

We probably don't need to go into super

1:20:45

detail.

1:20:48

Yeah, I don't really have anything to add.

1:20:50

I just, it's,

1:20:52

like you said, and, and I mean,

1:20:53

I'm not Australian, but from what I hear,

1:20:55

Australia's ban has not really been that

1:20:58

successful.

1:20:59

Um,

1:20:59

the UK's Online Safety Act has been a

1:21:01

downright catastrophe, but you know,

1:21:03

politicians are going to be politicians

1:21:05

and they're going to, who, who was it?

1:21:06

Um,

1:21:08

I should know this one; he's American.

1:21:09

George Bush Jr.

1:21:11

Like there's that famous picture of him

1:21:12

standing on the aircraft carrier saying

1:21:14

mission accomplished like three days into

1:21:16

the Iraq war.

1:21:17

And we were there for another twenty

1:21:18

years.

1:21:18

So or maybe it was three months,

1:21:21

but still, it was.

1:21:22

And that's that's just what comes to mind

1:21:24

when, you know,

1:21:24

all these politicians are like, oh, yeah,

1:21:27

we should do what Australia did.

1:21:28

And it's like, OK, come on, guys, like.

1:21:32

Yeah, I don't know.

1:21:33

Changing your mind is not cool these days,

1:21:35

I guess.

1:21:36

But yeah, from what I've seen,

1:21:37

it does not seem to have been successful

1:21:39

anywhere at all.

1:21:41

And I cannot imagine that that's going to

1:21:45

bear out any different for anybody else.

1:21:47

I don't know why they think it's going

1:21:48

to work better for them.

1:21:49

But yeah, it is.

1:21:53

I will echo real quick.

1:21:56

Somebody said in our private Signal chat,

1:21:59

like,

1:22:00

it's really concerning how much this is

1:22:02

spreading like wildfire.

1:22:03

So I can sit here and laugh at

1:22:04

them and I can be like, God,

1:22:05

they're so stupid.

1:22:06

And I'm going to do that.

1:22:07

But also it's, you know,

1:22:08

it's one of those things where like,

1:22:10

we can't,

1:22:12

we can't not take it seriously, because

1:22:14

unfortunately even stupid people are

1:22:16

dangerous in the right situations and

1:22:19

they're gonna push this through if nobody

1:22:21

stops them.

1:22:22

And it's really important that we keep

1:22:24

trying to like,

1:22:24

this is why we keep sharing this every

1:22:25

week,

1:22:26

even though we don't really have anything

1:22:27

new to add.

1:22:28

It's just to remind you guys that like,

1:22:29

this is happening.

1:22:30

And especially if you live in one of

1:22:31

these countries or one of these States,

1:22:33

you need to like contact your politicians

1:22:35

and speak up and be like, Hey,

1:22:36

this is bad.

1:22:38

And you know,

1:22:39

I'm not going to get too far into

1:22:40

the whole thing, but like,

1:22:41

Try not to come at them like they're

1:22:44

idiots.

1:22:45

I know I was just making fun of

1:22:46

politicians,

1:22:46

but try not to come at them and

1:22:48

be like, oh, you guys are so dumb.

1:22:49

Try to come at them in good faith

1:22:51

because if nothing else,

1:22:52

you're probably going to win them over

1:22:53

better that way.

1:22:54

You certainly stand a better chance.

1:22:57

Yeah,

1:22:57

we really need to fight back against this

1:22:59

and educate everyone on why this is a

1:23:00

terrible idea and get people to understand

1:23:03

that so that there's pushback because

1:23:05

otherwise they're just going to rush it

1:23:06

right through and everybody's going to be

1:23:08

like, yeah, the children, sure.

1:23:10

And it's going to hurt everyone in the

1:23:12

long run.

1:23:12

So yeah, that's all I got.

1:23:19

And with that,

1:23:20

if you don't have anything else to add

1:23:22

to that,

1:23:23

we are actually going to move into our

1:23:26

forum updates.

1:23:26

So in a minute,

1:23:28

we will be taking viewer questions.

1:23:30

So if you're watching,

1:23:32

go ahead and be sure to...

1:23:34

to go ahead and leave those.

1:23:36

There we go.

1:23:37

That's the one I was trying to do.

1:23:39

Be sure to go ahead and leave those

1:23:40

in the live chat.

1:23:41

Or if you're on the forum,

1:23:42

you can leave them on the forum too.

1:23:43

We will be checking that.

1:23:45

But for now,

1:23:46

we are going to go to the forum

1:23:47

and we're going to talk about a few

1:23:49

of the hot topics that people have been

1:23:51

discussing this week.

1:23:52

And the first one,

1:23:53

this actually came in pretty last minute.

1:23:55

So I don't know if you had a

1:23:56

chance to look at this one, Jordan,

1:23:57

but it says,

1:23:59

I verified my LinkedIn identity and here's

1:24:00

what I actually handed over.

1:24:04

We thought this one might be a good

1:24:06

one to discuss in light of all this

1:24:07

age verification stuff.

1:24:09

This is not the forum post,

1:24:10

but real quick,

1:24:11

I am going to share what it links

1:24:12

to.

1:24:13

This is like a blog post that somebody

1:24:14

wrote.

1:24:16

Um, so this person says that, uh,

1:24:17

they wanted the blue check mark on

1:24:19

LinkedIn.

1:24:20

The one that says this person is real

1:24:21

in a sea of fake recruiters, bot accounts,

1:24:23

and AI generated headshots.

1:24:24

It seemed like the smart thing to do.

1:24:25

So I tapped verify.

1:24:26

I scanned my passport.

1:24:27

I took a selfie. Three minutes later,

1:24:29

done. Badge acquired.

1:24:30

Then I did what apparently nobody does.

1:24:32

I went and read the privacy policy and

1:24:33

terms of service,

1:24:34

not LinkedIn's, the other company's.

1:24:36

Um,

1:24:37

Which I know this author probably did this

1:24:40

on purpose,

1:24:41

but I think it's funny that they did

1:24:42

the thing and then read the privacy

1:24:43

policy.

1:24:45

But yeah,

1:24:46

so apparently LinkedIn uses this company

1:24:48

called Persona,

1:24:48

which I believe is the same one that

1:24:50

Discord is going to be using.

1:24:51

Don't quote me on that.

1:24:53

But yeah, you can see here,

1:24:55

they went and read the privacy policy and

1:24:57

basically Persona collects full name,

1:24:59

passport photos, selfie, facial geometry,

1:25:02

NFC chip data.

1:25:03

I do want to point out a lot

1:25:05

of the time privacy policies say they may

1:25:08

collect this stuff.

1:25:09

So like, for example, he says like,

1:25:12

where is it?

1:25:12

Postal address?

1:25:14

I don't know how they would get that

1:25:15

from just his passport because U.S.

1:25:17

passports do not have your postal address

1:25:20

on them.

1:25:20

I think most passports don't.

1:25:23

But maybe they ask for it during signup

1:25:24

or maybe it's one of those like if

1:25:25

you submit a driver's license,

1:25:26

we'll read the address off that.

1:25:28

So I just want to point out like

1:25:29

maybe not everything was taken.

1:25:31

And obviously we know things like IP

1:25:32

address.

1:25:33

You can obfuscate by by using a VPN.

1:25:35

But then like MAC address,

1:25:38

OS version, language,

1:25:39

all of this is completely insane.

1:25:41

Um, he said,

1:25:42

here's the weirdest: hesitation

1:25:43

detection, copy and paste detection.

1:25:46

Um, which quick unrelated note,

1:25:48

if you run a website and you disable

1:25:49

copy and paste,

1:25:50

you suck because password managers are a

1:25:52

thing.

1:25:53

I hate when companies do that.

1:25:54

Um, but yeah, it's,

1:25:55

and then they shared all this stuff with

1:25:57

their quote-unquote

1:25:58

global network of trusted third-party data

1:26:01

sources.

1:26:02

And they use it for training AI,

1:26:04

I think he said.

1:26:05

And yeah, right there, uh,

1:26:06

to train their AI,

1:26:07

where does my face go?

1:26:08

Here's what LinkedIn gets.

1:26:09

Here's what Persona gets and the company

1:26:13

that they, good God.

1:26:16

Oh man.

1:26:17

Yeah.

1:26:17

This is just really crazy.

1:26:18

And, um,

1:26:19

just on the topic of all this, like,

1:26:22

uh,

1:26:23

like age verification stuff.

1:26:24

It's, it's really worth,

1:26:26

and he does have some actionable stuff

1:26:27

towards the end for the record.

1:26:29

Like, you know,

1:26:29

if you can try to contact these companies,

1:26:31

tell them to delete your data because

1:26:32

they've already verified you.

1:26:33

There's no reason they should be holding

1:26:35

onto it in theory, but, um,

1:26:37

and he's European.

1:26:38

So this is very much written for the

1:26:39

perspective of like, you know,

1:26:40

the GDPR says this and you can contact

1:26:42

this person, but, um,

1:26:43

It's just very eye-opening, because I

1:26:45

think a lot of us, especially those of

1:26:47

us who aren't really that

1:26:50

into privacy, kind of think, especially

1:26:52

when the company says, like Discord, you

1:26:53

know, it's like, oh, you'll submit it and

1:26:54

we'll delete it as soon as we're done.

1:26:56

But they don't all say that, and there

1:26:58

are situations where they won't do that.

1:27:00

And so, yeah, it's just kind of a,

1:27:02

it's kind of a very crazy thing.

1:27:04

I thought it was a good read.

1:27:06

Do you have anything you wanted to add

1:27:07

to that?

1:27:09

So, yeah, you're right about Discord.

1:27:11

So Discord was using k-ID originally.

1:27:14

And because people kept bypassing it,

1:27:17

they switched to Persona.

1:27:18

So the benefit of k-ID is allegedly,

1:27:22

in massive quotation marks,

1:27:25

the picture doesn't leave your device and

1:27:27

the scanning process is done on your

1:27:29

device, right?

1:27:29

That allowed people to bypass it.

1:27:31

So...

1:27:32

with Persona,

1:27:33

you actually are sending it to them,

1:27:36

the image to them.

1:27:37

And people looked a little bit more into

1:27:39

Persona, which I didn't even know.

1:27:42

Unfortunately,

1:27:43

it's very common, Persona.

1:27:46

It's like a very common, like, uh,

1:27:48

age verification,

1:27:49

identity verification thing.

1:27:51

Um, so people are bringing up, uh,

1:27:53

basically this,

1:27:54

this persona service actually has links to

1:27:58

Peter Thiel's Palantir.

1:28:00

So if you don't know Palantir,

1:28:01

they make like spyware and like mass

1:28:05

surveillance technology.

1:28:07

Um, so yeah,

1:28:08

It's kind of concerning that there was a

1:28:11

link here.

1:28:12

I'm reading this article from Kotaku,

1:28:15

and it says that one of Persona's biggest

1:28:21

investors was Peter Thiel.

1:28:28

So I'm not really sure.

1:28:29

I don't know.

1:28:31

This is very sus, obviously.

1:28:33

It seems like there's some...

1:28:37

connection here.

1:28:38

I'm not entirely sure what the whole

1:28:40

context is for that, but, you know,

1:28:42

I think it's also great that this person

1:28:44

was able to put together a list of

1:28:46

all of these data points that Persona is

1:28:51

collecting,

1:28:52

because I feel like a lot of people

1:28:53

just assume that this data is

1:28:58

probably collected,

1:28:59

but it's good to get a validation that

1:29:01

they are collecting passport photos,

1:29:03

selfies, facial geometry.

1:29:05

I'm not sure about NFC chip data.

1:29:07

I feel like you would have to scan

1:29:09

the chip itself,

1:29:09

which as far as I know,

1:29:11

it doesn't ask you to do.

1:29:13

National ID number, nationality, sex,

1:29:15

birth date, et cetera.

1:29:17

So a lot of that does make sense

1:29:19

for the service they're providing,

1:29:21

but

1:29:22

Obviously,

1:29:23

you shouldn't need to provide this in the

1:29:25

first place.

1:29:26

This is a lot of sensitive information.

1:29:28

Someone could steal your identity with all

1:29:29

this information.

1:29:34

Yeah, so Draken Blacklight said,

1:29:37

Peter Thiel is evil, and yeah,

1:29:41

I would say so.

1:29:41

Like, yeah, evil CEO vibes, definitely.

1:29:44

I don't really know too much about him,

1:29:47

but I know quite a bit about Palantir,

1:29:50

and it's, like,

1:29:52

one of the worst companies to ever exist.

1:29:54

Like,

1:29:54

why would you name your company Palantir?

1:29:56

Like, it's, like...

1:29:58

That's like, are we the baddies?

1:30:01

You know what I mean?

1:30:03

So yeah, Nate,

1:30:05

did you have anything you want to add

1:30:06

there?

1:30:09

No, no, I don't think so.

1:30:13

I love that sketch for the record.

1:30:14

That's one of my favorite sketches of all

1:30:15

time.

1:30:16

Are we the baddies?

1:30:18

Yeah.

1:30:19

No, that's all I got on that one.

1:30:21

I think we can move on to,

1:30:22

we had one other quick forum post,

1:30:24

I think.

1:30:26

Did you have a chance to look at

1:30:27

that one at all or?

1:30:29

No, you can take it for sure.

1:30:32

Okay.

1:30:33

Yeah, it's just Ente.

1:30:36

I swear I never pronounce it right.

1:30:37

I think it's Ente.

1:30:38

It might be Ente.

1:30:39

I think it's Ente.

1:30:40

Ente, the popular photo manager.

1:30:42

They also have a password manager.

1:30:44

No, not a password manager.

1:30:46

Authenticator app.

1:30:47

They have a two FA authenticator app.

1:30:49

But Ente, really cool people.

1:30:51

They have released Ente Locker.

1:30:54

which let me go...

1:30:58

I can go ahead and share the blog

1:31:01

post here real quick.

1:31:02

So this is kind of...

1:31:05

This is kind of like a very low,

1:31:08

I haven't played with it myself and I

1:31:09

feel bad I should have because admittedly

1:31:11

I did get an email from them like

1:31:13

two weeks ago that was like, Hey,

1:31:14

we're giving people early access.

1:31:16

Like, cause you know,

1:31:17

I'm an influencer and they're like,

1:31:19

just don't release it until this day.

1:31:21

And I've just been so busy.

1:31:22

I haven't looked at it at all.

1:31:24

And I feel bad.

1:31:25

because I want to.

1:31:26

But it's supposed to be like a little

1:31:29

vault to organize and store your important

1:31:32

documents.

1:31:32

So for example,

1:31:33

he cites like medical records,

1:31:34

insurance policies, identity cards,

1:31:37

passwords, and notes.

1:31:39

They say that you can share them with

1:31:41

trusted people.

1:31:42

You can set up trusted contacts.

1:31:43

So kind of like a lot of password

1:31:45

managers these days have legacy contacts

1:31:48

where they can request access to your

1:31:49

vault.

1:31:50

And if you don't reject it

1:31:51

within a certain amount of time,

1:31:52

like three or seven days, then they

1:31:54

get access.

1:31:54

And the idea is God forbid,

1:31:56

you step outside tomorrow,

1:31:57

you get hit by a bus,

1:31:58

then somebody can still get access to your

1:32:00

passwords and, you know,

1:32:01

make sure the bills are paid or whatever.
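As a rough illustration of that waiting-period mechanism, here is a minimal sketch of how a service could decide whether to grant a pending emergency-access request. This is a hypothetical example only, not how Ente Locker or any particular password manager actually implements it.

from datetime import datetime, timedelta, timezone

# Minimal sketch of an emergency-access ("legacy contact") check.
# Hypothetical illustration only -- not Ente Locker's actual logic.
def should_grant_access(requested_at: datetime,
                        waiting_period: timedelta,
                        rejected_by_owner: bool) -> bool:
    """Grant access only if the owner has not rejected the request
    before the waiting period (e.g. three or seven days) runs out."""
    if rejected_by_owner:
        return False
    return datetime.now(timezone.utc) - requested_at >= waiting_period

# Example: a request made eight days ago against a seven-day waiting period.
request_time = datetime.now(timezone.utc) - timedelta(days=8)
print(should_grant_access(request_time, timedelta(days=7), rejected_by_owner=False))  # True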

1:32:02

I think it's really more for, um,

1:32:04

like head of household kind of scenarios

1:32:06

or caretakers.

1:32:07

Um,

1:32:08

They say it's free for up to a

1:32:09

hundred items, all features included.

1:32:11

If you're a subscriber,

1:32:12

you can store up to a thousand items.

1:32:14

It's fully end-to-end encrypted,

1:32:15

fully open source.
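For anyone wondering what end-to-end encrypted means in practice for a vault like this: documents are encrypted on your device before they are uploaded, so the server only ever stores ciphertext. Here is a minimal, hypothetical sketch of that idea using AES-GCM from the Python cryptography library; it is not Ente's actual scheme (since they are open source you can read the real implementation), just an illustration of client-side encryption.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Minimal sketch of client-side (end-to-end) encryption.
# Illustrative only -- not Ente Locker's actual scheme.
key = AESGCM.generate_key(bit_length=256)   # key stays on the user's device
aead = AESGCM(key)

def encrypt_document(plaintext: bytes) -> bytes:
    nonce = os.urandom(12)                  # unique nonce per document
    return nonce + aead.encrypt(nonce, plaintext, None)

def decrypt_document(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, None)

# The server would only ever see the output of encrypt_document(),
# never the key or the plaintext.
document = b"insurance policy PDF bytes..."
stored = encrypt_document(document)
assert decrypt_document(stored) == document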

1:32:17

Uh, I don't know if this is self-hostable,

1:32:19

but honestly,

1:32:19

I wouldn't be surprised if it is in

1:32:20

the future because I know Ente Photos is

1:32:23

already self-hostable and they even have a

1:32:25

blog post about how to do that.

1:32:26

So yeah.

1:32:28

Um,

1:32:29

I think the question I've been seeing a

1:32:31

lot of people ask and one I admit

1:32:33

I will ask and, uh,

1:32:35

I might email them back and ask this

1:32:36

question, actually.

1:32:37

But basically,

1:32:38

why would I use this instead of something

1:32:40

like Nextcloud or Proton Drive?

1:32:41

Well,

1:32:42

I think Nextcloud is obvious for several

1:32:44

reasons.

1:32:44

But something like Proton Drive or one of

1:32:47

the other encrypted cloud storage providers

1:32:50

that we would normally recommend.

1:32:52

And I think in my personal opinion,

1:32:55

I would say maybe the answer there is

1:32:56

like,

1:32:58

maybe it's like simpler or maybe it's kind

1:33:01

of a more minimal version.

1:33:03

Let me put it that way.

1:33:03

Cause I will admit if my password manager,

1:33:06

as much as I preach digital minimalism and

1:33:08

I try to live by it,

1:33:09

it still has hundreds of entries.

1:33:16

It's cause you know,

1:33:16

there's all kinds of things that like I

1:33:19

use a couple times a year,

1:33:20

like my medical portal.

1:33:22

I'm relatively healthy right now.

1:33:23

So I don't,

1:33:25

really log into the doctor's office a lot

1:33:27

other than like to schedule physicals and,

1:33:29

you know, stuff like that.

1:33:31

What else am I, I don't even know,

1:33:33

but there's things that I just don't log

1:33:34

into very often,

1:33:35

but I have accounts for them.

1:33:36

And so, you know, if again, God forbid,

1:33:39

if I were to get hit by a

1:33:40

bus,

1:33:41

I think

1:33:42

I could see how it'd be stressful for

1:33:44

my wife to have to go through hundreds

1:33:45

of entries and try to figure out like,

1:33:48

you know, okay,

1:33:48

which one is where we pay the rent,

1:33:50

which she should have access to that too.

1:33:51

But you know,

1:33:51

which one's where we pay the rent,

1:33:52

which one's utilities,

1:33:53

which one's insurance,

1:33:54

which one's banking this, that,

1:33:56

and the other.

1:33:57

And I could definitely see it being really

1:33:59

useful where there's a space where like,

1:34:00

look, here's like,

1:34:02

the ten or fifteen things you need to

1:34:05

keep a roof over our head, keep groceries,

1:34:07

and just figure everything else out later.

1:34:09

So I think for me,

1:34:12

or maybe if you don't use the cloud,

1:34:15

if you just keep really steady backups,

1:34:16

this could be the one minimal cloud you

1:34:18

use,

1:34:18

especially if you're already an Entei

1:34:20

user.

1:34:21

I don't know.

1:34:21

Those are kind of my thoughts,

1:34:22

just off the cuff.

1:34:23

But do you have any thoughts on this

1:34:26

product, Jordan?

1:34:28

I mean, yeah.

1:34:29

I kind of see it as...

1:34:32

I guess,

1:34:32

an extension to a password manager or just

1:34:34

like an alternative, I guess.

1:34:36

I don't know.

1:34:39

I think this is an interesting idea.

1:34:43

I'm not sure if this sort of,

1:34:46

I feel like there have been a couple of

1:34:48

companies that have tried to do a similar

1:34:49

product.

1:34:52

I guess it's nice to have everything in

1:34:53

a separate spot.

1:34:55

It kind of makes sense that

1:34:58

Ente is kind of trying to have like

1:35:01

a drive-ish sort of thing.

1:35:02

I guess it's more like,

1:35:05

I feel like it's more like a password

1:35:06

manager, but yeah, I mean,

1:35:10

this is a password manager.

1:35:11

I think if, if we,

1:35:13

if we're being honest,

1:35:14

like this is basically just like a

1:35:16

password manager.

1:35:18

So, I mean,

1:35:19

it'll be interesting to see how this

1:35:21

product works in practice, you know.

1:35:25

I haven't personally tried it out.

1:35:27

It says it is available right now in

1:35:30

the app stores, all the popular app stores.

1:35:32

So it could be an interesting alternative,

1:35:37

but I think we'll have to watch it

1:35:40

closely.

1:35:41

It seems like it's less about

1:35:43

auto-filling,

1:35:43

like it's more just like a specific place

1:35:46

to store private documents and stuff.

1:35:50

So I'm not sure if this needed to

1:35:52

be a separate product or not.

1:35:54

I feel like there's already products that

1:35:56

do this.

1:35:56

But I think if they build out the

1:36:01

features to really cover a lot of these

1:36:03

important things that people need a

1:36:06

separate app for. For instance,

1:36:09

compartmentalization is good.

1:36:11

I think, you know,

1:36:11

not having all those documents in your

1:36:13

password manager,

1:36:14

but instead in this separate app would be

1:36:17

a security benefit to some people.

1:36:19

So, I mean,

1:36:19

I don't think it's a terrible idea.

1:36:22

I think it's something we'll have to watch

1:36:24

because right now it does look relatively

1:36:27

basic,

1:36:27

but it is kind of expected considering

1:36:29

they only just released the product.

1:36:32

So yeah, overall, pretty interested in it.

1:36:36

But I think it will need some extra

1:36:38

testing,

1:36:38

and we'll see how the development process

1:36:41

goes.

1:36:44

Yeah, I'm with you.

1:36:45

I'll be interested to see where they take

1:36:46

it.

1:36:48

I will say, to me,

1:36:49

it doesn't strike me as a password

1:36:50

manager, because the photos they show,

1:36:52

the screenshots are like, there's a JPEG,

1:36:55

there's a couple of PDFs.

1:36:58

I don't know.

1:36:59

This one here looks interesting.

1:37:01

It says "Thing," and then the subtext is...

1:37:05

uh, save location of real world items.

1:37:08

So like,

1:37:08

I guess you could drop like a GPS

1:37:10

pin, like, Oh,

1:37:10

this is the storage unit or something,

1:37:12

which I mean, for the record,

1:37:13

of course there's workarounds.

1:37:14

Like you could put that in the note

1:37:15

section of your password manager.

1:37:16

Right.

1:37:17

But I don't know.

1:37:18

That's interesting.

1:37:18

I kind of like that.

1:37:20

I think I do want to dig into

1:37:20

this a little more.

1:37:21

I'm interested.

1:37:22

I'm not, like you said,

1:37:23

I'm not totally sold.

1:37:24

Um,

1:37:25

but I would be interested to know what

1:37:26

they think the use cases are for this.

1:37:29

And yeah,

1:37:30

I think I'll shoot him an email this

1:37:31

weekend, because I do like it.

1:37:32

It's just, yeah.

1:37:36

And real quick, actually,

1:37:38

I think that does take us to our

1:37:39

questions.

1:37:42

So I was going to mention Lucas here.

1:37:44

Lucas said,

1:37:45

would a thousand passwords mean one

1:37:48

thousand items?

1:37:49

Like a one thousand character password?

1:37:51

I hope not.

1:37:54

But yeah, I mean,

1:37:55

I think a password would count as an

1:37:57

item, to be honest.

1:38:00

So yeah,

1:38:03

let's go ahead and transition to our Q&A

1:38:06

section here.

1:38:09

Unfortunately,

1:38:09

it does not look like we have any

1:38:11

questions on the forum.

1:38:13

Might be kind of a light week.

1:38:15

But were there any questions you

1:38:18

specifically noticed, Jordan,

1:38:20

or any comments you wanted to shout out?

1:38:23

Someone said, floating head,

1:38:26

question mark.

1:38:29

Yeah.

1:38:30

You know, it's good.

1:38:31

You need to protect your privacy.

1:38:33

You know,

1:38:33

everyone has certain requirements.

1:38:36

So that's just my requirement.

1:38:39

I feel like everyone can understand that

1:38:41

in this community, you know.

1:38:43

So not really surprising, I don't think.

1:38:47

Is it a locked note app or a

1:38:49

password manager, Lucas says.

1:38:52

I guess it's...

1:38:53

I feel like it's in between that because

1:38:55

it does store passwords technically and it

1:38:57

also stores...

1:38:58

like Nate was saying,

1:38:59

like that other information.

1:39:02

If you watch the actual,

1:39:03

like there's like a one minute intro video

1:39:06

from the CEO, Vishnu.

1:39:08

If you watch that,

1:39:09

the reason he decided to create

1:39:11

it was because his dad was struggling with

1:39:14

like storing documents securely and having

1:39:17

a way to access them.

1:39:19

So it does seem like, you know,

1:39:22

It is a tool that is applicable to

1:39:25

some people,

1:39:27

maybe not as much for people who are

1:39:29

really into privacy and security stuff,

1:39:31

but I think, you know,

1:39:35

It's clear that they've thought out this

1:39:37

product quite a lot.

1:39:39

So I don't know.

1:39:41

I think it's worth downloading it and

1:39:43

trying it yourself if it fits for what

1:39:45

you need,

1:39:45

if you need this sort of app.

1:39:47

I mean,

1:39:47

I don't need every single app that you

1:39:49

need.

1:39:50

Everyone has different needs.

1:39:51

So if this is something you do need,

1:39:54

maybe check it out.

1:39:55

And maybe it might actually be something

1:39:57

that...

1:39:59

fits your use case.

1:40:00

Obviously,

1:40:01

Privacy Guides is not going to recommend

1:40:03

this until we've had it analyzed by the

1:40:06

community and it passes all of our

1:40:08

criteria.

1:40:08

But I would say that's pretty likely that

1:40:11

will happen just because Ente has a

1:40:14

couple of their products listed on Privacy

1:40:16

Guides already because they've been so

1:40:17

great about auditing their software and,

1:40:20

you know,

1:40:21

being generally proactive.

1:40:23

So I wouldn't be surprised if the same

1:40:27

thing happened for this new app.

1:40:28

Yeah, for sure.

1:40:34

Um, not really a question,

1:40:36

but earlier when we were talking about

1:40:37

Persona,

1:40:38

Lucas said that apparently Discord has

1:40:39

stated they will not continue to use

1:40:41

Persona for age verification because of

1:40:42

all the pushback.

1:40:43

So, um,

1:40:45

I've heard a lot of conflicting things.

1:40:46

I've heard they,

1:40:47

they started working with Persona and then

1:40:49

they're not.

1:40:50

And I've heard there's like you said,

1:40:51

there's like ties to Peter Thiel and, um,

1:40:54

yeah.

1:40:54

So I hope they stop working with them,

1:40:56

but I haven't done any digging on that

1:40:58

myself.

1:40:59

And then just one other one real quick.

1:41:02

Hello Hello said,

1:41:03

they're straight from Mordor.

1:41:04

For those who don't know,

1:41:05

because you're not a massive nerd like me,

1:41:08

Palantir is a name from Lord of the

1:41:10

Rings.

1:41:12

It's been a while.

1:41:12

I'm due for a rewatch because I know

1:41:14

it's like the twenty fifth anniversary of

1:41:15

Fellowship of the Ring this year.

1:41:18

I think Palantir,

1:41:19

the Palantir is like the little orb that

1:41:23

the bad guy Sauron,

1:41:24

he's like the big bad guy and his

1:41:26

like second-in-command Saruman.

1:41:27

They use it to communicate long distances,

1:41:29

kind of like a crystal ball.

1:41:31

Um, I could be getting that wrong.

1:41:32

Like I said,

1:41:32

it's been a long time and I haven't

1:41:33

read the book since I was really young,

1:41:35

so that's not going to save me either.

1:41:36

But, um, yeah,

1:41:37

but it is literally like there's a company

1:41:39

in China that named themselves Skynet

1:41:42

because the CEO really liked the

1:41:43

Terminator series.

1:41:44

And it's the exact same thing.

1:41:46

It's just like, oh, I'm a huge,

1:41:47

and they all do it.

1:41:48

There's another one called Anduril,

1:41:50

which is another Lord of the Rings

1:41:51

reference.

1:41:52

Um, there's others I'm forgetting,

1:41:53

but they all do.

1:41:54

And I feel like they do it to

1:41:56

be like cheeky and funny.

1:41:57

Like, Oh,

1:41:58

we're just a bunch of harmless nerds.

1:42:00

And it's like, okay,

1:42:01

but that's like in a thousand years.

1:42:02

And actually it's not like this because

1:42:04

this actually happened, but like,

1:42:06

imagine in a thousand years,

1:42:07

some companies like, Oh yeah,

1:42:08

we named ourselves like Third Reich

1:42:09

Industries.

1:42:10

It's like,

1:42:11

what the hell is wrong with you?

1:42:13

Like, why would you do that?

1:42:14

Like you said, it's like,

1:42:15

are we the baddies?

1:42:16

Like, yes, you're trying to prove you are.

1:42:17

It's, it's insane, but.

1:42:20

Yeah, I digress.

1:42:22

I mean,

1:42:22

I personally thought that their logo

1:42:24

looked a little bit like the Eye of

1:42:26

Sauron, but I mean... Oh,

1:42:28

hold on.

1:42:28

I've never seen their logo.

1:42:30

I have to go look this up now.

1:42:31

I don't think I've seen their logo.

1:42:32

I feel like it does look like the

1:42:33

Palantir thing that you were saying.

1:42:35

It does look like that.

1:42:38

Yeah, it's like him holding the orb.

1:42:39

I see it.

1:42:42

Oh my God.

1:42:44

Yeah.

1:42:44

I almost wonder if they are like genuinely

1:42:46

malicious and this is them just like

1:42:48

trying to, like,

1:42:48

you know how a lot of times people

1:42:50

in conspiracy theories will be like, oh,

1:42:52

they're leaving all these breadcrumbs and

1:42:53

it's like, no, you're reading into it.

1:42:55

And this almost makes me wonder if like,

1:42:57

are they right?

1:42:58

Like,

1:42:58

are they genuinely leaving breadcrumbs and

1:43:00

we're just ignoring it?

1:43:00

Because that is so spot on.

1:43:03

That is horrifying.

1:43:05

I don't want to live in this timeline

1:43:06

anymore.

1:43:06

Okay.

1:43:09

I digress.

1:43:11

Do you have any other questions or

1:43:12

comments you want to highlight?

1:43:16

Not really.

1:43:16

I think this has just sort of been

1:43:18

a quiet week this week.

1:43:21

So thanks, everybody,

1:43:22

for chatting in the chat with us and

1:43:24

stuff like that.

1:43:25

But yeah,

1:43:25

it doesn't seem like we have any questions

1:43:27

on the forum, unfortunately.

1:43:30

Yeah.

1:43:30

Thank you, guys,

1:43:31

who showed up and chatted.

1:43:32

Thank you for the regulars.

1:43:32

It's always nice to see you guys.

1:43:35

Um,

1:43:35

all the updates from This Week in Privacy

1:43:37

will be shared on the blog every week,

1:43:39

uh, which is already out now.

1:43:41

And in case you guys didn't know,

1:43:42

we've started sending that out right when

1:43:44

we start streaming.

1:43:45

So, um,

1:43:45

if you want to go sign up for

1:43:46

that as a newsletter or subscribe on RSS,

1:43:49

that's a good little indicator right

1:43:51

there.

1:43:51

Reminder, um,

1:43:52

to get that notification for people who

1:43:54

are audio listeners.

1:43:55

We also offer the podcast available on all

1:43:58

audio platforms as well as RSS again,

1:44:00

and the video will be synced to PeerTube

1:44:02

shortly after this.

1:44:04

Privacy Guides is an impartial nonprofit

1:44:06

organization that is focused on building a

1:44:08

strong privacy advocacy community and

1:44:10

delivering the best digital privacy and

1:44:12

consumer technology rights advice on the

1:44:14

internet.

1:44:14

If you wanna support our mission,

1:44:16

you can make a donation on our website,

1:44:17

privacyguides.org.

1:44:19

To make a donation,

1:44:20

click the red heart icon located in the

1:44:21

top right corner of the page.

1:44:23

You can contribute using standard fiat

1:44:25

currency via debit or credit card,

1:44:27

or you can donate anonymously using Monero

1:44:30

or with your favorite cryptocurrency.

1:44:31

Becoming a paid member unlocks exclusive

1:44:33

perks like early access to videos,

1:44:35

priority during the live stream Q&A,

1:44:37

and on our forum,

1:44:38

you get a cool little badge in your

1:44:39

profile and the warm,

1:44:41

fuzzy feeling of supporting independent

1:44:42

media.

1:44:43

So thank you all so much for watching,

1:44:45

and we will be back next week with

1:44:47

more news.

1:44:47

Bye.