GrapheneOS has a Hardware Partner!
Ep. 43


Episode description

GrapheneOS has announced who their official hardware partner is, a new California law says all operating systems must have age verification, and much more. Join us for This Week In Privacy #43!

Download transcript (.vtt)
0:22

A major Android OEM agreeing to create a

0:24

future Graphene OS compatible phone,

0:27

ProtonMail sharing data with the FBI,

0:30

the OpenAI Pentagon deal, and more.

0:33

These are the stories that we'll be

0:34

discussing in this episode of This Week in

0:36

Privacy,

0:37

our weekly live stream where we discuss

0:39

the latest updates within the Privacy

0:41

Guides community,

0:42

and this week's top stories in the data

0:44

privacy and cybersecurity space.

0:46

I'm Jonah,

0:47

and with me this week is Nate.

0:49

How's your week been going, Nate?

0:52

Been keeping really, really busy,

0:54

but could be worse, so I can't complain.

0:57

Oh, good.

0:58

Let's hop right into it.

1:00

We'll start off with the biggest news

1:03

story I think that we've seen in privacy

1:05

and security from the past week.

1:07

Of course,

1:07

it's Motorola confirming Graphene OS

1:10

support for a future phone.

1:12

and bringing over features to their

1:14

lineup.

1:15

This article we have is from

1:17

9to5Google.

1:18

They published it on March first,

1:20

and they said,

1:21

following rumors swirling for quite some

1:23

time,

1:23

Motorola has announced a partnership with

1:25

Graphene OS that will see the

1:26

privacy-focused,

1:27

de-Googled version of Android

1:28

pre-installed on upcoming Motorola

1:31

devices.

1:32

A new long-term partnership between

1:35

Motorola and GrapheneOS was announced at

1:36

Mobile World Congress,

1:40

earlier this week on Monday,

1:41

with plans for both a future smartphone to

1:44

have GrapheneOS pre-installed and certain

1:45

features from GrapheneOS coming over to

1:47

other Motorola devices,

1:49

the company said in a media briefing in

1:51

Barcelona.

1:52

In a press release, Motorola said,

1:55

Motorola is introducing a new era of

1:57

smartphone security through a long-term

1:59

partnership with the GrapheneOS

2:00

Foundation,

2:01

the leading nonprofit in advanced mobile

2:03

security and creators of a hardened

2:06

operating system based on the Android open

2:08

source project.

2:09

Together,

2:10

Motorola and the GrapheneOS Foundation

2:12

will work to strengthen smartphone

2:14

security and collaborate on future devices

2:17

engineered with GrapheneOS compatibility.

2:20

In the coming months,

2:21

Motorola and the Graphene OS Foundation

2:23

will continue to collaborate on joint

2:25

research, software enhancements,

2:27

and new security capabilities with more

2:30

details and solutions to roll out as the

2:32

partnership evolves.

2:34

All of this comes after some leaks at

2:38

the end of February that we saw on

2:40

Reddit and also discussed on our own

2:43

Privacy Guides forum where some Motorola

2:46

or Lenovo media slides were

2:49

leaked ahead of this showing Graphene OS

2:53

being referenced in their roadmap for

2:55

future devices.

2:56

And so those rumors did prove to be

2:59

true this week.

3:04

It's not...

3:06

It's unclear how this partnership is going

3:11

to work,

3:11

especially with Motorola saying that

3:13

they're going to be bringing over features

3:14

from GrapheneOS into their devices.

3:16

We do know right now that none of

3:20

Motorola's current devices will be

3:22

compatible with GrapheneOS.

3:24

That will be coming as a future device.

3:27

We've seen social media updates from the

3:30

GrapheneOS team confirming that none of

3:33

Motorola's

3:34

devices currently meet their security

3:35

standards.

3:36

And they're saying that a future Motorola

3:39

device that can run GrapheneOS will have

3:42

similar specs to the flagship end of

3:44

Motorola's devices,

3:45

like the Motorola Signature,

3:47

but the current Motorola Signature will

3:49

not be supported.

3:52

GrapheneOS

3:55

social media team members have also said

3:59

that we can expect a device to come

4:03

out in twenty twenty seven.

4:04

So this is not an immediate launch by

4:08

any means,

4:09

but it is now confirmed that they will

4:11

be working with Motorola,

4:14

putting to rest all of the rumors

4:17

of all the other OEMs they could possibly

4:18

be working with.

4:19

I know there's a lot of speculation for

4:22

the past few months since Graphene OS

4:23

originally announced they would be working

4:25

with an unnamed hardware device partner.

4:28

And now that's confirmed.

4:30

But yeah,

4:30

this will definitely be a big shift for

4:33

Graphene OS and how they've always done

4:35

things in the past.

4:37

So Nate,

4:38

you've taken a look at this story.

4:40

It's been big news throughout the week.

4:41

Were there any key takeaways that you

4:43

wanted to discuss here?

4:48

Um, no,

4:49

I think you kind of covered it.

4:50

I mean, at this point,

4:50

it's still so early on that there's,

4:53

I mean,

4:55

I don't want to say there's a lot

4:55

of speculation.

4:56

I mean, it is,

4:57

there is a lot of speculation.

4:58

Like, um,

4:58

you kind of covered everything we know for

5:00

sure.

5:01

I'm interested.

5:05

You know,

5:06

Jordan said here in the comments that

5:07

Motorola was an interesting choice,

5:09

which I totally agree with, but also like,

5:11

I'm not sure.

5:13

I'm not much of a hardware guy,

5:15

especially when it comes to phones.

5:16

I know that Pixels, of course,

5:18

have like the best security,

5:19

which is why we recommend Pixels.

5:20

And also iPhones have good security,

5:22

but obviously that's never going to

5:24

happen.

5:25

That would be interesting.

5:26

But I think I'm notoriously critical of

5:29

Samsung security.

5:30

So I've seen some people saying like, oh,

5:33

I wish they'd work with Samsung.

5:34

I cynically do not see a world where

5:37

Samsung security will ever be good enough.

5:39

to run a GrapheneOS device, in my opinion.

5:41

They would have to really do a lot

5:42

of work there.

5:43

But yeah, it's really just...

5:48

I can't think of anybody off the top

5:49

of my head that I'm like, oh,

5:50

it's weird they didn't go with these

5:51

people.

5:52

I definitely was not expecting Motorola,

5:54

but again,

5:54

I don't know who I was expecting.

5:56

I think I will be really impatient to

6:01

see what comes next.

6:03

I'm really interested...

6:05

Because Motorola's official announcement

6:08

for this had a very heavy emphasis on

6:16

enterprise features.

6:18

And I know that's historically something

6:20

that's been missing from a lot of FOSS

6:22

projects.

6:23

With all the stuff about age verification

6:24

going on,

6:25

a lot of people have pointed out that

6:26

a lot of FOSS projects like Linux are

6:28

missing...

6:30

parental controls.

6:31

And so it kind of makes it harder

6:33

to, uh,

6:35

pull yourself out of those systems,

6:36

but still maybe monitor what your kids do.

6:39

And so where I'm going with this is

6:41

I will be interested to see if maybe

6:42

GrapheneOS is able to pull

6:45

some of those optionally, of course,

6:46

some of those like enterprise features to

6:48

create like some kind of parental control

6:50

thing in a secure way.

6:52

Also,

6:54

I've seen some rumors that

6:57

there's not necessarily a guarantee that

6:59

these phones will come pre-shipped with

7:01

GrapheneOS,

7:01

but they will be GrapheneOS-compatible.

7:03

I've also seen other rumors that GrapheneOS

7:05

will be an option,

7:06

like when you buy it,

7:07

you can select GrapheneOS.

7:09

I hope that will be commercially available

7:11

and not just for enterprise users.

7:14

So yeah, I dunno,

7:19

again,

7:19

I feel like a lot of things we

7:20

could say at this point would be

7:21

speculation, but I'm really hopeful.

7:23

I'm really excited to see where this goes.

7:24

I'm happy that GrapheneOS has access.

7:26

I'm assuming they now have access to

7:28

Android.

7:29

Um, in a,

7:30

in a more stable kind of way.

7:31

Cause I know that was a big thing

7:32

is Google's been locking down Android

7:34

slowly and making it less available and

7:35

less open source in practice,

7:37

if not officially.

7:39

Um,

7:40

and a lot of ROMs have struggled with

7:42

trying to get ahold of Android so that

7:43

they can modify it and get it ready

7:45

for releases.

7:45

And that's been slowing down cycles.

7:47

So I'm assuming now they have better

7:49

access to that kind of stuff.

7:50

And they'll, you know,

7:51

of course they'll have,

7:52

I'm assuming access to the hardware to be

7:55

able to like modify that and they don't

7:56

have to reverse engineer things.

7:58

I'll be interested to see if they continue

7:59

to support the pixel or not.

8:00

So just a lot of questions, but I'm,

8:02

I'm really hopeful to see where this goes.

8:04

Yeah, for sure.

8:07

I've definitely seen a lot of conflicting

8:09

reports on this.

8:09

I know the 9to5Google article said

8:11

that Graphene OS would come pre-installed.

8:13

I'm not sure if Motorola said that because

8:14

they didn't mention it in their press

8:15

release,

8:16

but maybe they did at the in-person event.

8:20

I wasn't at Mobile World Congress,

8:21

so I wouldn't know.

8:23

I do think, yeah,

8:25

I'm definitely interested to see what this

8:27

phone looks like because Graphene OS has

8:29

for a very long time touted the idea

8:33

of

8:34

like the Titan security chip in

8:37

Pixels being like the gold standard

8:39

for smartphone security, and a lot of their

8:41

features do rely on that, whereas all

8:44

of these other existing devices don't

8:46

really have a comparable security chip in

8:50

place that has all the same features. So

8:52

if we look at all of Motorola's

8:54

devices right now, which use Qualcomm chips,

8:56

You know,

8:59

Qualcomm has some sort of secure element,

9:04

which the name of is escaping me off

9:06

the top of my head,

9:07

but it's not as comprehensive as the Titan

9:12

M chips in the Pixels.

9:15

in terms of what they can do.

9:17

And so I'm really interested to see if

9:21

Motorola is going to be able to provide

9:23

an alternative in these future phones and

9:25

what that will look like.

9:28

I don't know what sort of secure element

9:31

requirements

9:32

would be needed in this case.

9:35

I don't know what commercially available

9:37

options there are for Motorola to choose

9:39

from.

9:40

That's kind of... Oh,

9:42

that would be above my pay grade,

9:43

but I'm sure GrapheneOS and their team

9:46

is figuring all that stuff out and

9:47

probably...

9:48

has been working with Motorola on this for

9:50

quite some time.

9:51

I mean, obviously,

9:53

this news was released today,

9:54

but GrapheneOS has been talking about this

9:56

for a while.

9:57

And they've obviously been planning this

10:00

behind the scenes for quite some time.

10:03

It's also,

10:05

it's an interesting relationship that they

10:07

seem to have with Motorola.

10:08

And I think it explains why they haven't

10:10

gone with other OEMs because I've seen

10:14

Graphene OS representatives on social

10:16

media say that Motorola essentially came

10:20

to them asking for the partnership and

10:22

committing these resources as opposed to

10:24

them reaching out and trying to find one

10:26

that's most suitable for them.

10:30

Which makes sense because you would really

10:32

need

10:33

a pretty high level of buy-in from

10:35

whatever OEM you partner with to take on

10:39

a lot of the responsibility.

10:40

GrapheneOS is of course a very small team

10:43

still and can't exactly make all of these

10:45

hardware decisions and software changes to

10:47

support a new device just like

10:50

on a whim, resources are limited.

10:53

So being able to work with Motorola and

10:56

kind of maybe direct their team in a

11:00

security-minded focus is really

11:02

interesting.

11:02

And it's a really cool opportunity for

11:03

them.

11:05

Yeah,

11:06

I think we'll just have to wait and

11:08

see what this looks like.

11:10

I know I've seen some people disappointed.

11:12

The OEM wasn't some of the other top

11:14

picks.

11:15

I know people were hoping for OnePlus or

11:17

Nothing, or perhaps Sony.

11:20

But I don't think Motorola is the worst

11:23

choice out there.

11:23

And I think it's a very positive sign

11:26

that Motorola...

11:28

seemingly initiated this partnership or at

11:31

the very least is very invested in making

11:33

this happen. So it's a good level

11:37

of commitment on their end, as far

11:39

as we can tell. Yeah, I agree. While

11:44

you were talking, I was thinking

11:45

about some of the more

11:47

The more, I guess,

11:48

open source aligned phone makers out

11:51

there,

11:51

like Nothing isn't really open source,

11:53

but I think they have the whole modular

11:55

thing going on.

11:56

I might be thinking of somebody else,

11:57

but like Fairphone, Purism,

11:59

what's the other one?

12:02

The Pine phone,

12:03

which I know those were probably never

12:05

even on the table for security reasons.

12:07

But yeah, I mean,

12:09

it's one thing worth noting is I did

12:12

see a video this week that dove into

12:14

this topic a little more and showed also

12:17

the

12:18

Hacker News Y Combinator forum

12:21

where Daniel was pretty active responding

12:23

to some people, and he made a point

12:24

of saying this is not an exclusive

12:25

partnership. So he said at this time

12:28

there's no plans for GrapheneOS to work with

12:30

any other OEMs, but it's not off the

12:32

table. And I actually didn't know that

12:33

about Motorola coming to them, but I

12:36

think, I mean, I'm sure you said this

12:39

and I'm sure this is a given, but

12:40

I think this is great for GrapheneOS.

12:42

I think this is great for

12:43

open source.

12:44

I think this is great for, uh,

12:46

the general consumer to have this easily

12:48

accessible, um,

12:49

potentially ships-with-GrapheneOS device,

12:52

uh, especially if that is again,

12:53

a consumer accessible option at checkout.

12:56

So I think if this phone does really

12:59

well, um,

13:01

I think that will show other OEMs that

13:02

there is an interest in this and being

13:04

that again, for GrapheneOS,

13:06

this is not an exclusive relationship,

13:08

then that would be, uh,

13:11

that would potentially be on the table

13:13

that they could go to GrapheneOS and be

13:14

like, oh,

13:14

we want to work with you to make

13:15

our phones GrapheneOS-compatible as well,

13:18

which would just give us even more option

13:20

for other manufacturers.

13:21

So, I mean, I know I'm getting really,

13:23

really ahead of myself.

13:24

This is probably years down the road if

13:25

that ever happens, but, you know,

13:27

we can dream, right?

13:28

So...

13:28

Yeah, absolutely.

13:30

I know I see some chats here about

13:32

PinePhone.

13:34

It would have been nice certainly to see

13:37

a partnership with a more niche or

13:39

especially like repairable phone.

13:42

Fairphone, I think,

13:43

would have been a top choice for a

13:45

lot of people for sure,

13:47

especially in this community,

13:48

because a lot of these values, I think,

13:52

go hand in hand a lot of the

13:54

time between open source privacy security

13:57

repairability like a lot of people are

13:59

very passionate in this community in this

14:01

community about all of those things um but

14:07

yeah no matter no matter which way you

14:09

look at this um any sort of partnership

14:11

i think with uh with an oem and

14:13

especially one that's big name as motorola

14:15

is is huge for any custom rom but

14:18

especially graphene os it's definitely

14:21

The Android realm of choice that we would

14:24

want to see partnering with OEM versus a

14:29

lot of the other options out there.

14:31

So yeah, it's very cool news.

14:37

Yeah, I don't have much else to add.

14:38

Like I said,

14:39

everything at this point is kind of a

14:40

speculation.

14:42

We'll just have to wait and see where

14:43

things go.

14:43

Yeah.

14:45

I think in the meantime,

14:46

we can talk about a different phone if

14:49

we're ready to move on,

14:50

which is the iPhone.

14:52

And this is pretty exciting news,

14:54

but apparently the iPhone and the iPad are

14:57

now approved to handle classified NATO

15:00

information.

15:01

Um, I'm not gonna lie.

15:02

the headline kind of says it

15:03

all.

15:04

For audio listeners,

15:06

this is a new press release directly

15:09

from Apple.

15:10

So, um, it kind of,

15:12

there's a little bit of information in

15:14

there, nothing super technical,

15:15

but you know,

15:16

Apple kind of touts all of the security

15:17

features they built into their phones

15:19

recently.

15:19

Like, um,

15:21

biometric authentication with Face ID,

15:23

Memory Integrity Enforcement.

15:26

They say best-in-class encryption.

15:28

I mean, I guess.

15:31

Governments have struggled to crack Lockdown

15:33

Mode, and even in the past,

15:34

just the regular encryption.

15:35

So that's probably not terribly

15:37

misleading.

15:39

Yeah, they say that

15:41

they have gone through,

15:43

did they say there was an audit here?

15:44

I mean,

15:44

I'm assuming there was some kind of audit

15:46

certification process, but, um, yeah,

15:48

iPhones and iPads running iOS and iPadOS

15:51

are certified for NATO use in all

15:55

nations.

15:56

Um,

15:58

I don't think I have too much to

15:59

add to that.

16:00

Again,

16:00

it's a pretty self-explanatory headline,

16:02

but I think it just really,

16:05

really attests to Apple's security,

16:08

which this is going to come up again

16:09

later in the show.

16:10

But I want to remind everyone watching

16:12

that privacy and security and anonymity

16:14

are all very different things.

16:16

They're very distinct things.

16:18

And they do complement each other.

16:19

They do work together.

16:21

And some of them,

16:21

like security is how we enforce our

16:23

privacy wishes, right?

16:24

You know, with things like...

16:26

just as a really low hanging fruit

16:27

example,

16:28

a password at its most basic form is

16:30

designed to control who has access to an

16:33

account.

16:33

So that is kind of a form of

16:35

privacy controlling who has that password

16:37

in theory, at least.

16:38

So yeah, Apple,

16:41

we would definitely like to see them do

16:43

more on the privacy front.

16:44

There is of course room for improvement,

16:46

but again,

16:47

they do make incredibly secure devices.

16:49

And I think this is just kind of

16:51

a further testament to that.

16:53

One thing that's interesting is they say

16:54

that these are the

16:56

first and only consumer devices in

16:58

compliance with the information assurance

17:00

requirements of NATO nations.

17:02

So yeah, like I said,

17:05

I don't have too much to add to

17:06

that.

17:07

Jonah,

17:07

did you have any thoughts on this story?

17:10

Yeah, so it's very cool. I think,

17:14

like you said, according to this press

17:16

release, and as far as I know, these

17:17

are the only consumer devices that can

17:19

handle any sort of NATO classified

17:22

information, which is a big

17:24

accomplishment for Apple. The auditing

17:28

process for any of this is

17:32

fairly extensive, and I think it's probably

17:34

no

17:36

surprise that one of the best phones we

17:40

already know of in terms of security can

17:43

pass this.

17:43

But it is just more evidence that a

17:47

lot of the safeguards in place on these

17:49

devices are functional and work as

17:52

expected and can be trusted.

17:54

Audits like this aren't the end-all be-all

17:56

of security by any means.

17:59

And they mostly make sure there's no like

18:01

super obvious mistakes,

18:02

but they don't test for everything.

18:03

And so it's not like a complete assurance

18:07

that these phones are unhackable.

18:09

And indeed,

18:10

like if we look at the level of

18:12

classified

18:14

data that these phones are now able to

18:16

handle,

18:17

which is the NATO restricted level.

18:19

That's out of the four classification

18:22

levels that NATO has.

18:23

That's the lowest one.

18:25

You don't even necessarily need a

18:28

specialized security clearance in order to

18:30

access NATO restricted information.

18:32

So

18:34

you know,

18:35

the

18:37

most top secret documents that NATO has

18:40

are not going to be stored on iPhones

18:42

anytime soon.

18:44

But it is interesting that like a full

18:46

operating system and especially a consumer

18:49

one is now able to handle this data

18:52

because typically you would see like a

18:55

NATO restricted classification limited to

18:58

something like a

19:01

A lot of those USB drives that have

19:03

hardware encryption and a pin that you

19:05

enter,

19:07

some of those will be NATO-restricted in

19:12

terms of security, which is good,

19:13

but those are obviously much simpler

19:15

devices.

19:15

They just have to handle encryption,

19:17

and that's pretty much it.

19:19

Whereas an iPhone is a...

19:21

complicated device and obviously more

19:24

challenging to guarantee the security of

19:27

those documents on it.

19:30

And so yeah,

19:31

it is a big step for Apple to

19:35

have this done.

19:36

I don't know what the process is for

19:41

a company like Apple or an OS

19:44

developer to get NATO certified. I don't

19:47

know if that is something that

19:50

the company itself would have to reach out

19:53

for and pay to get certified. I would

19:55

imagine it typically is. And so thinking

20:00

about this being the first consumer

20:02

device to be certified to handle

20:06

NATO restricted information,

20:09

That might not be that surprising because

20:11

I would imagine a lot of consumer devices

20:12

probably are not willing to undergo the

20:14

effort to get this certification and audit

20:17

in the first place.

20:19

Thinking about like Graphene OS we just

20:21

talked about,

20:24

I can't imagine they would have the

20:26

resources to do like a comprehensive audit

20:29

to be certified to handle NATO restricted

20:32

information,

20:33

even if the operating system is

20:36

theoretically secure enough to do that.

20:37

So there is that takeaway that I would

20:41

think about.

20:41


20:43

And for that reason,

20:44

I wouldn't consider iPhones to be the most

20:47

secure devices in the world now or

20:49

anything like that.

20:50

But it is certainly a good sign for

20:54

them at the very least.

20:58

For sure.

21:01

I don't have anything to add to that,

21:02

but we did have a few questions in

21:03

the chat I thought might be fun to

21:05

talk about.

21:06

Yeah.

21:07

Dyson Fan said,

21:09

do you think this will be affected by

21:10

the war in the Middle East?

21:15

I don't think so.

21:15

I think overall,

21:17

I know there's a big push in Europe

21:18

right now for digital sovereignty.

21:21

I think one of the reasons that NATO

21:22

would view Apple as a maybe less risky

21:25

company compared to someone like Microsoft

21:27

is...

21:28

Putting aside the fact that Microsoft has

21:30

been hacked by China more times than I

21:31

can count.

21:33

I think Apple does have a history of

21:35

pushing back.

21:37

Not all the time.

21:38

Definitely not all the time.

21:39

I'm not defending Apple here.

21:40

There's times they should have pushed back

21:42

that they didn't.

21:43

But they do have a history,

21:44

especially in the U.S.,

21:45

of pushing back against government data

21:47

requests.

21:47

And I don't know.

21:49

I would just imagine that kind of...

21:52

makes the geopolitical landscape a

21:53

little bit more nuanced, I guess,

21:57

in terms of why they might be willing

21:58

to trust someone like Apple. And

22:03

then Jordan, just real quick, said, I

22:06

wonder what they use for computers, because

22:07

Mac wasn't included. I don't know, that's a

22:08

good question. I know Germany

22:10

specifically, I know there's a few states

22:12

in Germany that are switching over to

22:13

Linux and LibreOffice and stuff

22:15

like that, but I don't know about NATO

22:17

as a whole. That is a really good

22:18

question. So

22:21

Yeah, I'm not sure. I mean, as far

22:23

as the war in the Middle East,

22:26

I know the US is

22:29

a part of NATO, but the US typically,

22:32

when it comes to classified

22:33

information or military stuff, they

22:37

kind of do their own thing, and they

22:38

have their own requirements for all of

22:39

this.

22:40

A lot of the NATO specific stuff like

22:43

this certification, for example,

22:45

is going to apply more to European

22:47

countries than the U.S.

22:50

in its own interest.

22:51

So...

22:52

There is that to think about too.

22:54

I believe iOS and other Apple devices have

22:58

been certified for a variety of US

23:01

government security standards for quite

23:03

some time,

23:03

but I don't remember exactly what level

23:07

they would be certified at or if it's

23:09

comparable to this.

23:10

I'd have to do more research into that.

23:13

Cool.

23:18

Yeah.

23:19

I mean, that was a pretty quick story,

23:21

but

23:22

That was all I had on that one.

23:25

Yeah, before we go on... Oh, yeah.

23:29

Let's talk about this.

23:30

So this story was reported by TechCrunch

23:33

here.

23:34

Meta sued over AI smart glasses privacy

23:37

concerns after workers reviewed nudity,

23:39

sex, and other footage.

23:42

According to TechCrunch,

23:43

Meta is facing a new class action lawsuit

23:45

over its AI smart glasses and their lack

23:47

of privacy after an investigation by

23:50

Swedish newspapers found that workers at a

23:52

Kenya-based subcontractor are reviewing

23:55

footage from customers' glasses,

23:57

which included sensitive content like

23:59

nudity, people having sex,

24:00

and using the toilet.

24:02

Meta claimed it was blurring faces and

24:04

images,

24:05

but sources disputed that this blurring

24:07

consistently worked.

24:09

The news prompted the UK regulator,

24:11

the Information Commissioner's Office,

24:14

to investigate the matter.

24:17

Now the tech giant is facing a lawsuit

24:19

in the United States as well.

24:20

In the newly filed complaint,

24:21

plaintiffs Gina Barton of New Jersey and

24:24

Mateo Canu of California,

24:26

represented by the public interest-focused

24:28

orgs and law firms, alleged that Meta

24:31

violated privacy laws and engaged in false

24:34

advertising.

24:38

So, I mean, looking at this story,

24:41

my immediate reaction is like, well, yeah,

24:43

of course this would happen if you strap

24:46

cameras to your face that are constantly

24:47

streaming to a big tech company.

24:51

And this is really a problem that we've

24:54

seen over and over before.

24:58

The one that most immediately comes to

25:00

mind is

25:02

was pretty much a very similar situation

25:04

with Siri recordings.

25:07

And those weren't video at the very least,

25:09

unlike this,

25:10

but they were being sent to a bunch

25:12

of contractors for review when that was

25:16

not clearly stated in Apple's privacy

25:18

policy.

25:19

I believe there have been similar cases

25:21

with other voice recording systems like

25:24

Alexa.

25:26

And so

25:30

it's just a sign that these

25:33

big tech companies are not going to be

25:35

treating your data properly, and they're

25:37

not going to be giving it the protection

25:39

that it needs, because they are more

25:40

interested in consuming all of this data

25:42

as much as possible, and having a

25:46

bunch of random people, contractors, whoever,

25:49

review all of it to supposedly

25:52

improve their AI services and other things

25:54

that they

25:55

are working on, with complete

25:58

disregard for your own privacy or personal

26:01

data.

26:02

And so, yeah,

26:05

hopefully there's a big punishment for

26:07

meta here,

26:07

but I can't imagine a lot is going

26:10

to change.

26:11

Unfortunately,

26:12

I think that we need to be aware

26:16

of these dangers and we really need to

26:18

just eliminate devices like this from

26:23

everyday use.

26:24

It's a bit crazy to me how

26:28

much things have changed in the past ten

26:29

years, because I remember back when

26:32

Google Glass originally came out, and

26:35

there was this "glassholes" term for people

26:37

who wore it and were constantly recording

26:39

in public spaces. And now all of this

26:42

stuff is kind of being normalized,

26:44

unfortunately, and there isn't as much

26:46

pushback anymore. And I think that we

26:50

need to revisit that, because I don't think

26:51

we were wrong back in those

26:54

days.

26:54

I think that we were on to

26:55

something, and maybe we should remember how

26:59

much we dislike products like this again.

27:04

Yeah, totally agree that.

27:06

Honestly,

27:07

that was something that really confused me

27:08

too.

27:10

With the whole like you mentioned Google

27:12

Glasses.

27:12

I remember when those came out,

27:16

and they were such a flop.

27:18

And so when Meta announced their AI

27:20

glasses, I was like, okay,

27:22

we've already been down this road.

27:24

And I know, I think even before Meta,

27:25

I think Snap had announced their glasses,

27:27

and then I never heard of them again,

27:28

which I think those exist.

27:30

But I don't know.

27:32

I never hear about them anymore.

27:33

So my point being, I was like, oh,

27:35

this isn't going to go anywhere.

27:36

And now I think this article said that

27:38

last year they shipped like seven million

27:40

of these things.

27:41

Hold on, where was it in this thing?

27:43

But...

27:45

Yeah, while I look for that,

27:46

it just blew my mind that it's like,

27:48

wait, yeah, in twenty twenty five,

27:50

over seven million people bought meta

27:52

smart glasses.

27:52

And it's like,

27:54

how did it like what's different this time

27:56

that it worked when it did not last

27:58

time?

27:59

I'm very confused.

28:01

I think it's got to be like.

28:05

Are they making it fashionable?

28:06

I know the Ray-Ban partnership must have a

28:09

lot to do with that.

28:10

Are people willing to give in and use

28:14

it?

28:14

Yeah, if they're partnered with like,

28:17

more recognizable brands.

28:21

Kind of an unfortunate way to shop,

28:22

but I think that might be it for

28:25

a lot of people.

28:27

I mean, that does, yeah,

28:28

that could be it.

28:30

I mean, maybe it's the AI part.

28:31

Like, I have said before that, like,

28:34

I get on paper, I get the idea,

28:36

because I'm convinced I have, like,

28:38

a mild form of face blindness,

28:41

and I run into people all the time.

28:43

I mean, not obviously, like,

28:44

with someone like you that I work with

28:45

all the time and I see every week,

28:46

I know you, but, like,

28:48

I run into people all the time that

28:49

they're like, oh, hey, Nate, it's me,

28:51

so-and-so, and I'm just like,

28:53

who are you?

28:54

And then when they're like, oh,

28:55

we like did this thing together.

28:56

And I'm like, oh yes, yes.

28:57

Like I'm a contextual person.

28:59

When you tell me like how I know

29:00

you, then I remember,

29:01

but I'm so bad with names and faces.

29:03

So I would love the idea of like

29:04

AI glasses that tell me like,

29:07

do the facial recognition, like, oh,

29:08

you know, this person from this,

29:09

like save me that whole step.

29:11

But I don't want it pinging back to

29:12

the cloud,

29:13

which of course it would have to do

29:14

to do that.

29:15

But my point being, like,

29:15

I get it on paper,

29:16

but I still can't believe that like they

29:18

managed to

29:19

to actually like make it stick this time

29:21

it's so weird to me.

29:23

Well, I mean, it doesn't have to

29:24

necessarily ping to the cloud. I know, not

29:27

that I would advocate for this product to

29:31

exist necessarily, but certainly facial

29:34

recognition, that's something that has been

29:35

around for quite some time. And

29:38

well,

29:38

you would need to have a local database

29:40

in your contacts or whatever.

29:42

I do think a lot of people will

29:43

already use this feature in the Apple

29:45

Photos app or the Photos app on their

29:47

Android phone that automatically

29:49

classifies faces, and you can put a name

29:52

to it.

29:52

I think that's a fairly popular feature

29:54

that runs entirely locally.

29:56

And extending that to a basic device like

30:00

this,

30:01

even if it has to ping your phone

30:02

to run this computation,

30:06

certainly it's not necessary to ping

30:10

servers if you don't want it to,

30:12

but big tech companies are very

30:16

disincentivized to do anything locally

30:19

because there is so much data that they

30:22

can slurp up with their servers and use

30:25

for all sorts of AI and other purposes.

30:28

And of course,

30:30

we'll talk about a future story here in

30:33

the show about these AI companies

30:35

partnering with people who you probably

30:38

don't want them to be.

30:40

So that's the kind of direction that all

30:42

of this puts us in.

30:46

And yeah, it's not great.

30:50

And it certainly doesn't have to be this

30:52

way.

30:54

Just because this is the way that Meta

30:56

has decided to make this product doesn't

30:57

mean it's the only way that this product

30:59

has to exist.

31:00

And I think that that's really important

31:02

to remember.

31:05

For sure.

31:07

Yeah,

31:07

two things I wanted to add real quick

31:09

in response to Unredacted, who

31:11

said someone needs to make glasses that

31:12

beam lasers at cameras as you walk

31:14

around.

31:15

That's probably destruction of property.

31:17

There is an app,

31:18

this is not an official recommendation

31:20

because we haven't really vetted it,

31:22

but I know there is an app that's

31:24

supposed to warn you if there are people

31:27

nearby wearing smart glasses,

31:29

not just the meta ones,

31:30

but also the Snap ones.

31:32

Apparently,

31:32

there's more than just those two,

31:33

but I do have it on my phone.

31:36

It has not pinged me yet,

31:39

although I don't know if I live in

31:40

an area where people are not using them.

31:42

I don't know if it's maybe just false

31:44

negatives.

31:45

Your mileage may vary,

31:46

but it is fully open source.

31:47

You can go take a look at it.

31:48

I will say,

31:52

I've never seen any of these in person

31:54

myself.

31:54

I don't know what area these are super

31:56

popular in, but not around me yet.

32:00

Yeah,

32:00

and I've had situations where somebody's

32:03

got the big glasses and there's a screw

32:05

in the front, and I've asked them,

32:07

and I try not to sound like I'm

32:08

upset about it,

32:09

because if they think I'm angry,

32:11

they're definitely going to say no.

32:12

But I've asked people, I'm like, hey,

32:14

this is totally out of left field,

32:15

but are those the meta smart glasses?

32:17

And they're always like, no, no,

32:18

they're just whatevers.

32:20

So I haven't run into anybody yet,

32:22

but yeah.

32:24

And then the other thing I was going

32:25

to say just real quick to add some

32:26

context to this article,

32:27

it says that the reason there's a lawsuit

32:30

is because Meta's advertising specifically

32:33

says, and I quote,

32:35

you're in control of your data and

32:36

content.

32:37

And then there was like another quote

32:38

there too.

32:39

Yeah.

32:42

I don't know.

32:42

I lost it.

32:42

Oh, built for privacy,

32:44

designed for privacy, controlled by you.

32:46

So, yeah,

32:47

it's, I think they've got it, I hope.

32:48

I'm not a lawyer,

32:49

but I feel like they've got a really

32:51

solid case here that if Meta is going

32:53

to.

32:53

And I mean,

32:55

all of the veterans listening know that

32:56

this is like, oh, no, Meta lied.

32:58

Like, what's the Captain Kirk, William

33:01

Shatner, like...

33:02

Shocked face.

33:03

But when you explicitly say in your

33:06

advertising that like you control your

33:08

data and then find out that there was

33:10

no toggle not to submit the footage and

33:12

people are reviewing it.

33:14

I think I put this in the newsletter

33:15

that went out actually for this episode

33:17

that.

33:19

As much as we've talked about these

33:20

things,

33:20

we kind of blew over that part where

33:22

it's like part of training AI is that

33:25

people have to review it,

33:26

even if only every now and then.

33:28

People have to review it and make sure

33:30

it's working and correct it,

33:33

which is a whole other thing.

33:34

bag of worms that we're not going

33:36

to get into right now. But I

33:38

think it's funny that, like, for

33:39

you and me, that never even came

33:41

up once, because we just thought that was

33:42

kind of a given, I guess, or for

33:44

whatever reason. Like, we never even thought

33:45

to mention that, like, hey, by the

33:47

way, there is no world in which people

33:49

will not see at least some of

33:51

the images and footage taken by these

33:52

devices. So, yeah.

33:58

One of our team members, Jordan,

34:01

asked (sorry, I'll let you do it): what

34:05

protection do people have against being

34:06

recorded in public? Which is a great

34:10

question. Unfortunately, I think the answer

34:12

in most countries, including here in the

34:13

United States, is not much. But I think

34:15

that this is a good example of

34:18

why data privacy concerns are

34:22

certainly not only a technical issue,

34:24

because people very often get caught up in

34:27

trying to think of technical

34:28

solutions. And I do like Unredacted's

34:30

suggestion of lasers being beamed at

34:33

cameras as you walk around, but at the

34:35

end of the day, the best

34:38

way to prevent something like this is to

34:43

get strong data privacy laws

34:43

out there that would prevent people from

34:46

doing this and using your data without

34:49

your consent.

34:49

Because I don't think that just being out

34:51

in public or walking around is necessarily

34:57

consent to be recorded and filmed and that

35:01

footage stored permanently for the rest of

35:03

time, right?

35:04

It's

35:08

We really have to rethink our relationship

35:10

with technology and privacy.

35:12

And we can't just apply past norms to

35:18

the current state of what we're in.

35:19

But of course,

35:20

there are so many incentives to not do

35:22

this that I think people need to be

35:24

more vocal about.

35:27

You know,

35:28

we've talked about this in the past few

35:30

episodes,

35:30

but even governments are getting in on

35:33

this like constant mass surveillance via

35:35

companies like Flock, for example,

35:38

just constantly trying to collect as much

35:40

data as possible and seeing

35:42

what they can do with it.

35:43

And in a lot of cases,

35:44

I think they don't really know what they

35:46

can do with it yet.

35:47

I think meta with these glasses probably

35:49

doesn't know what they can do with the

35:51

data yet.

35:52

But they're collecting it all in the hopes

35:54

that they can do something with it.

35:58

And that's, that's not good.

36:01

And I don't think we should allow that.

36:04

So hopefully,

36:06

that can change.

36:09

Yeah, the only technical,

36:10

quote-unquote, solution that came to mind

36:12

was I really want to buy some and

36:14

review them one of these days.

36:15

But I know you've probably heard of

36:17

there's a company that makes glasses, and

36:19

they've got a few different models and one

36:21

of them is supposed to reflect IR.

36:24

So they look like relatively normal

36:26

glasses,

36:26

depending on how you feel about the style

36:28

of them.

36:28

But the frames are designed to very

36:31

invisibly reflect light back to a camera.

36:34

And it's mostly for facial recognition if

36:35

I've read... Granted,

36:37

this all came from their website,

36:38

so it may not be a hundred percent

36:39

accurate.

36:40

But according to their marketing

36:41

materials, it's like some cameras,

36:43

like surveillance cameras,

36:45

they'll use IR to, like, better map your

36:47

face for facial recognition purposes.

36:49

And it's designed to throw those off.

36:51

But the nice thing is, again,

36:52

if I pose for like a family photo,

36:53

my glasses look normal as opposed to they

36:56

have another model that like will

36:57

explicitly like if you take a flash photo,

36:59

it'll reflect and block you.

37:01

And so anyways,

37:03

my point is like something like that comes

37:04

to mind.

37:05

But I mean,

37:05

that comes with so many, like... let's just

37:07

assume it works, for the record.

37:09

But you shouldn't have to like if you

37:11

don't wear glasses,

37:12

why are you going to buy them just

37:13

to throw off facial recognition?

37:15

You shouldn't have to buy them because I

37:16

think they're pretty expensive.

37:17

The frames are like two or three hundred

37:19

dollars,

37:21

which I guess is how much frames normally

37:22

cost without insurance.

37:23

But either way,

37:24

it's, I guess my point is, like,

37:26

it's one of those like I agree with

37:27

you.

37:27

Like, I don't like

37:31

when ordinary people, just trying to live

37:33

their lives,

37:34

have this unnecessary burden put upon

37:36

them. And I understand that, like...

37:40

Like, it...

37:42

I understand that there's a limit to that,

37:44

right?

37:44

Like we're not all entitled to like free

37:47

DoorDash or anything like that, right?

37:49

Like there's gonna be times you have to

37:51

put in some work and you have to

37:52

put in some effort and learn some things.

37:54

But I mean, in this situation,

37:55

like I feel like

37:57

these companies are just so out of control

37:59

and there is no data privacy law in

38:01

the U.S., at least not universally.

38:02

There's a patchwork of limited laws.

38:05

Like somebody here said,

38:06

there's some states in the U.S. which

38:07

don't allow facial recognition without

38:09

explicit consent.

38:10

Yeah.

38:10

There's like two or three that I'm aware

38:11

of.

38:12

I think there's like Texas, Illinois, um,

38:14

probably California with the,

38:16

their privacy law and maybe like a couple

38:18

others, but you know,

38:19

overall there is no like

38:22

U.S. version of GDPR that says, like, hey,

38:25

here's the bare minimum.

38:26

And I,

38:28

the more we go through this stuff,

38:29

the more I feel like we really need

38:31

something like that,

38:31

that just kind of sets a standard,

38:33

which for the record,

38:34

it will not be good enough.

38:35

I guarantee you that,

38:36

but at least something,

38:37

some kind of bare minimum thing so that

38:40

people,

38:41

ordinary people don't have to jump through

38:43

a hundred hoops just to try to

38:45

have like a basic level of privacy.

38:47

It's so insane.

38:48

And it's really important that like,

38:51

you can't just claim to be working around

38:53

these privacy restrictions by like

38:55

anonymizing that data or whatever,

38:56

because in cases like this, for example,

38:59

we know that that technology doesn't

39:01

really exist or it will, like,

39:03

if you want to blur faces, um,

39:04

in all of these videos,

39:05

it probably relies on AI, which again,

39:08

I'd point out Meta said that they were

39:10

doing in this case and it didn't work

39:11

consistently.

39:12

That's just going to be inherent to all

39:14

of this technology.

39:15

You're never going to be able to

39:17

one hundred percent, uh,

39:20

ensure that all of this data is being

39:21

handled privately no matter what Meta is

39:23

claiming about this.

39:24

And really the only solution here is to

39:28

not collect that data in the first place

39:30

and to not give Meta that data in

39:32

the first place.

39:35

So yeah,

39:36

this whole thing's a bummer because it

39:38

really puts a bad spin on AI glasses

39:41

in general,

39:42

which is probably a good thing because it

39:43

seems like every single one that's come

39:45

out lately has been...

39:48

just in the form of cameras strapped to

39:50

your face, right?

39:51

Which is always like,

39:52

that's never been what I wanted from smart

39:55

glasses, even before I got into privacy.

39:59

I've always just been a huge fan of

40:00

future technology, and I was like,

40:02

smart glasses, that could be cool,

40:03

because I would want a heads-up display to

40:06

see navigation or live translation or a

40:10

ton of stuff that does not at all

40:12

require cameras.

40:14

Recording people constantly,

40:15

that's probably...

40:17

at the very bottom of the list

40:19

of things I would ever want to do

40:20

with my glasses.

40:21

Um,

40:22

but that is the direction that all of

40:26

these tech companies are going in rather

40:27

than, um, something more,

40:32

more useful and less privacy invasive,

40:34

unfortunately.

40:35

So it's a shame.

40:37

Yeah.

40:39

I really, just real quick,

40:40

I want to drill home what you were

40:41

saying about like how the face blur isn't

40:43

enough.

40:44

Like.

40:45

It takes a shockingly small amount of data

40:49

to de-anonymize somebody.

40:50

And it always cracks me up when it's

40:51

something like location, right?

40:53

Like, oh, but we anonymize the location.

40:55

And how many other people in the world

40:58

spend eight hours a night at this location

41:00

and then eight hours a day at that

41:02

location?

41:03

Like that alone tells you who I am.

41:05

And then this one with like the whole,

41:06

oh, but we blur faces.

41:08

Hi, hello.

41:09

I don't think that matters for some

41:11

people.

41:12

For audio listeners,

41:13

I'm showing off my arm tattoos.

41:14

Like even if you blurred my face, it's,

41:16

I don't, it's pretty obvious, you know?

41:18

And so, yeah.

41:19

Um, I could see,

41:21

I'm thinking back in my own history.

41:22

I could see a few small scenarios where

41:24

like having a camera strapped to my face

41:26

would be super useful,

41:27

but that was like three times a year

41:29

at my old job just for me.

41:31

Like,

41:31

I don't think most people really need it

41:33

that much.

41:33

So yeah.

41:34

And certainly, you know,

41:35

that could be a separate product that

41:39

like,

41:39

what if I just have a little camera

41:40

that clips onto my glasses if I want

41:42

to record something, right?

41:43

I don't need it constantly.

41:44

Yeah.

41:45

Constantly on and recording.

41:47

This is a very niche use case,

41:50

I think, for a lot of people.

41:52

Yeah, super crazy.

41:56

But on that note,

41:57

we do have some site updates before we

42:00

launch into our next story.

42:01

We are going to talk a little bit

42:02

later about ProtonMail.

42:05

I know that story just came out the

42:06

other day.

42:07

But first,

42:08

here's what's going on at Privacy Guides.

42:10

And for those of you who may not

42:12

know,

42:12

Privacy Guides is a nonprofit which shares

42:14

data privacy related information.

42:16

And we facilitate a community over on our

42:18

forum and on Matrix where people can ask

42:20

questions and get advice about staying

42:22

private online and preserving their

42:23

digital rights.

42:25

So first up, big news,

42:26

our smartphone privacy and security course

42:28

that we have been talking about for months

42:29

now.

42:30

We've been releasing videos little by

42:32

little.

42:32

It is finally one hundred percent

42:34

available in full.

42:36

No membership required.

42:38

You can go over to YouTube.

42:39

I believe it's on PeerTube now.

42:40

If it's not, it will be very,

42:41

very soon.

42:44

We have,

42:44

for those of you who may not be

42:45

aware of this,

42:46

we basically built a three-part smartphone

42:49

course about how to make your smartphone

42:51

more private and more secure.

42:53

And there's a beginner, intermediate,

42:55

and advanced level.

42:56

And there is also an iPhone and an

42:58

Android version.

42:59

So yeah, whichever one you use.

43:02

And you can watch them all and you

43:03

can decide maybe some of the stuff in

43:04

the advanced level doesn't apply to me.

43:06

Maybe some of it does.

43:08

If nothing else,

43:08

it lets you know what your options are

43:10

out there and our official recommendations

43:12

at this point in time about how to

43:14

make your smartphone as private and secure

43:16

as possible.

43:17

And again, that is out now.

43:19

So go ahead and check that out.

43:21

And then some big exciting news.

43:23

Myself and Jonah next week will be in

43:26

Austin, Texas.

43:26

We will be at an unofficial South by Southwest

43:29

party being hosted by EFF Austin.

43:32

We will be doing a little workshop about

43:35

how to improve the privacy and security of

43:36

your phone.

43:37

So, um,

43:40

if anyone's in the area and you have

43:42

never tried GrapheneOS and you're, like, kind

43:44

of worried about it,

43:45

we will actually have a little demo device

43:47

that has GrapheneOS on it so that people

43:49

can play around with it and kind of

43:50

see like, oh,

43:51

this is just like a normal Android.

43:52

Like there's nothing to be scared of.

43:54

I can use it just like an Android.

43:56

Um, so we'll have that little demo device,

43:57

but also we'll just be answering questions

43:59

and, you know,

43:59

offering our advice about how to harden

44:01

your phone.

44:02

And full disclosure,

44:04

I am on the board of EFF Austin.

44:06

So yeah,

44:07

we will be there for anyone who's in

44:08

the area.

44:09

Yeah,

44:10

come stop by if you're around, and it'll

44:11

be super fun, I think.

44:13

And we'll share a link to

44:15

the event information and meetup stuff

44:18

in the sources of the show.

44:19

So yeah, if you're in the area,

44:21

definitely check it out.

44:23

It should be fun.

44:26

And also, I will say,

44:28

since it will be taking place next Friday,

44:31

we will be hosting this show in person

44:34

there.

44:34

So that'll be fun for people who watch

44:37

this as well.

44:40

In other news,

44:41

we have a bunch of big stuff that

44:43

we announced on our website this week.

44:45

The biggest thing that we launched was a

44:47

new section related to privacy activism.

44:51

So if you go to privacyguides.org slash

44:54

activism right now,

44:55

you can find all of these

44:57

resources. Our staff writer Em has been

45:01

working super hard on getting all these up,

45:02

and it has a ton of useful advice,

45:05

not just for activists in

45:08

particular, but activists for privacy:

45:11

people who want to advocate for data

45:13

privacy in their local communities, or in

45:18

terms of legislation, or in terms of

45:20

anywhere else that you might want to be

45:23

an activist for privacy rights.

45:26

And so all of these tools are meant

45:27

to empower the kind of digital rights

45:30

community that we are in.

45:31

And the first tool that we released in

45:33

this section is the privacy activist

45:36

toolbox,

45:37

which it looks like Nate is

45:39

scrolling through now here on the screen.

45:42

Essentially,

45:42

this toolbox is a list of resources and

45:47

articles that give you advice on how to

45:49

be the most effective privacy activist you

45:52

can be and how to effectively and clearly

45:56

and sustainably advocate for privacy and

46:00

digital rights.

46:02

And so if that is interesting to you,

46:04

if you've been in the privacy community

46:06

for a while and you're wondering how to

46:08

best make a difference yourself,

46:11

definitely check out these articles.

46:12

They're extremely extensive and just a

46:15

wonderful resource.

46:16

We've gotten a ton of positive feedback

46:19

from people in this space and elsewhere

46:22

who have been reading these and learning

46:24

new things or sharing these with other

46:27

privacy activists and privacy related

46:29

organizations.

46:31

in this space.

46:34

The activism section in general is

46:36

something that we hope to continue

46:38

expanding.

46:39

We have a few things on the roadmap

46:41

and hopefully we can share a bit more

46:43

information about that soon.

46:46

But for now,

46:47

I think that all of these tips will

46:49

prove to be super helpful for some of

46:52

you out there.

46:53

And if any of that sounds interesting to

46:55

you,

46:56

definitely go to privacyguides.org slash

46:59

activism and check out that resource.

47:04

Other site changes,

47:05

we've done a few very minor things.

47:09

The most notable one was that we removed

47:12

mention of zero knowledge encryption or

47:16

zero access encryption from our site

47:18

because those terms are not very...

47:20

clear and we found them to be confusing.

47:22

So we're kind of transitioning to being

47:24

more descriptive.

47:26

Zero access encryption is kind of a

47:27

marketing term that gets thrown around a

47:30

lot.

47:30

And zero knowledge encryption is not

47:33

really technically accurate.

47:36

It doesn't make a lot of sense outside

47:37

of like zero knowledge proofs,

47:38

which are totally different things.

47:40

So

47:41

Hopefully some of our resources around

47:45

encrypted tools that we recommend,

47:48

et cetera,

47:48

are more clear and we hope to use

47:51

better terminology to describe that stuff

47:55

going forward.

47:55

That's not just marketing jargon.

47:58

That's a big thing that we want to

47:59

try to eliminate from all of our resources

48:01

as much as possible.

48:02

So that was a big change.

48:04

Related to our news section, our

48:08

volunteer journalist Freya has been

48:11

publishing a ton of articles lately, so you

48:13

can go to privacyguides.org/news and check

48:16

those out. There are a lot of stories

48:18

that we don't get a chance to discuss

48:20

here on the show

48:22

but are still important nonetheless,

48:25

and that is the best way to stay

48:28

up to date with those in addition to

48:29

our community forum.

48:31

Some of the articles include a full-length

48:34

article on how to game privately,

48:37

which might be interesting to the gamers

48:40

out there,

48:40

as well as more news briefs like Samsung

48:43

TVs halting data collection in Texas,

48:45

a spyware maker going to jail,

48:47

TikTok refusing to add end-to-end

48:50

encrypted direct messages, and a lot more.

48:53

So again,

48:54

that's at privacyguides.org slash news if

48:57

you want to stay up to date on

48:59

all of those topics.

49:01

All of the stuff that we do at

49:02

Privacy Guides is made possible by our

49:05

supporters.

49:06

So you can sign up for a membership

49:08

or donate at privacyguides.org.

49:10

Or if you want to promote privacy in

49:13

your own life and you want to support

49:15

us as well,

49:16

you can buy some swag from

49:18

shop.privacyguides.org.

49:23

I think that does it for all the

49:25

updates from us this week.

49:27

So let's talk about ChatGPT and the

49:32

Pentagon.

49:33

Nate, what do you got for us here?

49:36

Yes.

49:37

OK, so for those who missed the memo,

49:40

which I wouldn't blame you for, because there

49:41

is so much freaking news going on right

49:43

now,

49:44

it's hard to stay on top of it

49:45

all.

49:46

Like I actually forgot part one of this

49:48

story until I was reading the article and

49:49

refreshed my memory.

49:52

So the Pentagon used to have a contract

49:55

with Anthropic, who makes the AI Claude,

49:58

which I've heard good things about as far

50:00

as AI goes.

50:01

I guess it's pretty good at what it

50:02

does.

50:04

But Anthropic had some stipulations in

50:08

their contract,

50:09

specifically that you could not use Claude

50:12

for mass surveillance on Americans,

50:14

and you could not use it in autonomous

50:16

weapons.

50:17

And the government tried to pressure

50:20

Anthropic into dropping those stipulations

50:23

and doing whatever they wanted.

50:28

I will admit I'm not fully versed in

50:30

the nuance of this story.

50:31

So I apologize if any of my opinions

50:33

are a little wrong here,

50:34

but to their credit,

50:36

Anthropic stuck to their guns,

50:38

no pun intended,

50:40

and said, no,

50:40

we're not going to do that.

50:41

And the government dropped them and said,

50:43

we're not doing business with you anymore.

50:46

They went on to declare them a supply chain

50:47

risk.

50:47

That's a whole other thing that we're not

50:49

going to get into. But OpenAI,

50:55

as they do, swooped right in and said,

50:58

hey, we'll do business with you.

50:59

I mean,

51:01

I don't know how else to put it.

51:02

So Sam Altman, the CEO of OpenAI,

51:07

basically he's clarifying the terms of

51:10

this deal now because he recognizes that

51:15

that was not a good look to just

51:16

come in.

51:18

Here's what he says.

51:19

We were genuinely trying to deescalate

51:20

things and avoid a much worse outcome,

51:22

but I think it just looked opportunistic

51:23

and sloppy.

51:25

You can take that at face value if

51:26

you want or not.

51:27

You can probably tell how I feel from

51:29

my tone,

51:29

but that's neither here nor there.

51:31

But either way,

51:32

he's clarifying that they are still

51:35

holding to the terms that OpenAI cannot be

51:39

used for mass surveillance.

51:41

Noticeably,

51:42

I don't think this article said anything

51:43

about the autonomous weapons.

51:45

But yeah,

51:47

and I think that's kind of the...

51:51

Again,

51:51

that's kind of the bare bones of the

51:53

story.

51:55

We don't know a lot more.

51:56

We know that AI,

52:00

and I'm sure a lot of our veteran

52:01

viewers know this,

52:02

but AI is so much more than LLMs,

52:04

right?

52:05

And there's a lot of people who don't

52:06

even like the term AI because it's been

52:08

around for a long time.

52:12

AI research goes all the way back to

52:13

like the sixties, I think,

52:14

which is pretty crazy when you think about

52:16

it.

52:16

But I mean,

52:17

even before it was called AI,

52:19

we've had targeted ads,

52:20

we've had machine learning,

52:21

we've had algorithms determining all kinds

52:25

of, I mean, for years,

52:27

algorithms have been determining whether

52:28

or not you get approved for a loan,

52:30

your insurance rates.

52:31

And it's just, this is like,

52:34

the next step. Um,

52:36

I've had to explain that to a few

52:37

people, that, like,

52:38

it seems, from the outside,

52:40

it seems like ChatGPT just came out

52:42

of nowhere, right.

52:43

In 2022, I think it was,

52:45

but I mean,

52:46

it's kind of been building towards that

52:47

behind the scenes.

52:48

It's just,

52:48

that was like the next leap forward,

52:50

at least publicly and visibly.

52:52

So, um,

52:54

Yeah, AI is being used by the military,

52:56

which is, again,

52:57

probably not a shocker to our veteran

52:58

listeners, but it's being used for, again,

53:02

it's more than just LLMs and chatbots.

53:04

It's being used to identify targets.

53:06

It's being used to calculate how sure are

53:08

we that this is a target?

53:09

Where do we think this person is going

53:10

to be next?

53:11

All that kind of stuff.

53:12

And so I think

53:15

I'm not going to lie.

53:16

This has actually been on my mind for

53:17

a long time.

53:18

Back on Surveillance Report,

53:19

Henry used to tell a famous story from

53:23

Edward Snowden where it was the – I

53:26

believe it was the Boston Marathon

53:27

bombings.

53:28

It's like him and one of his coworkers

53:29

were in a bar,

53:31

and they saw the news about the Boston

53:32

Marathon bombings.

53:34

And I think it was his coworker was

53:35

like,

53:36

how much you want to bet that guy's

53:37

in our system?

53:38

Like we flagged him.

53:39

We knew he was a threat and we

53:41

did nothing.

53:42

And when they went back to work the

53:44

next day, sure enough,

53:45

they looked him up and it's like, oh,

53:46

he was in the system.

53:47

Yes, absolutely.

53:49

And I think that has long been a

53:51

criticism that I personally have heard

53:53

from intelligence people.

53:55

Not that I know any,

53:56

but I've just like,

53:56

I've seen it around in articles and stuff:

53:58

they're so inundated with data that

54:01

they cannot sort through it to make sense

54:03

of it.

54:05

Which tells me you should stop

54:06

collecting so much data.

54:08

But I think that's one of the most

54:09

obvious uses of AI is to sort through

54:11

that data,

54:13

which raises a lot of concerns that the

54:14

article did actually address here that AI

54:17

is known for getting it wrong or

54:19

hallucinating.

54:20

Like it says right here,

54:20

AI large language models can make mistakes

54:22

or even make things up known as

54:23

hallucinating, which...

54:25

Fun fact,

54:26

that was actually my first experience with

54:27

AI.

54:28

Back in the day, I was like, well,

54:29

let me try this out and see if

54:30

it's any good.

54:31

And so what I used it for was,

54:33

this was back when I used to recommend

54:34

Threema over on The New Oil,

54:36

and I was writing a review.

54:37

And so I was like, okay,

54:39

give me the pros and cons of Threema.

54:41

And one of the pros, it was like,

54:42

it has a password manager built in.

54:44

And I'm like,

54:46

can you cite your source for that?

54:47

And of course it couldn't.

54:48

And it just went, oh, you're right.

54:49

I'm sorry.

54:50

It doesn't have a password manager.

54:51

And I'm just like,

54:52

Where did that even come from?

54:54

So yeah, AI,

54:57

that's one of the big concerns with AI

55:01

in this context.

55:01

I mean,

55:02

aside from just the privacy in general

55:03

is...

55:05

I mean,

55:05

I think there's so many issues with

55:07

privacy in general, right?

55:08

Concerns about privacy in general.

55:10

Aside from the fact that it's just a

55:11

given human right,

55:13

I think it was also Edward Snowden or

55:14

somebody said that you never have to

55:16

justify why you deserve a right.

55:17

Someone else has to justify why they need

55:19

to infringe on it.

55:22

But in addition to that,

55:23

I think something that should be said is

55:26

that, and again,

55:28

we know this thanks to Snowden:

55:31

a lot of the time,

55:32

the loophole for spying on American

55:34

citizens is that once data leaves the

55:36

country's borders,

55:37

it becomes subject to surveillance.

55:40

So last year I went to Europe, right?

55:45

Suddenly you can spy on me because if

55:46

I, you know,

55:47

had to call my wife back home,

55:48

that data's crossing borders.

55:50

Or even on a much more innocuous note,

55:52

he would talk about how data centers like

55:54

Gmail, for example,

55:56

completely unbeknownst to you,

55:57

they might move a server,

55:59

like copy the data somewhere else

56:02

temporarily to like do maintenance on that

56:04

physical server, right?

56:05

And that data might go to Canada, Mexico,

56:08

whatever, or even just sending an email.

56:11

You know, the internet...

56:13

as far as I understand,

56:14

like it tries to optimize and take the

56:15

fastest route to something,

56:17

which let's say hypothetically,

56:19

for some reason, the fastest route from,

56:21

I don't know,

56:22

Texas to California is jammed up.

56:24

It might, again,

56:25

bounce over to a server in Mexico and

56:26

then bounce back over to California to use

56:29

the fastest route.

56:30

And now again,

56:31

your data is open for interception.

56:33

So it's, yeah, there's just so,

56:36

so many privacy concerns with AI.

56:38

And the fact that they...

56:42

The fact that this is even a discussion

56:45

or a question from the military of like,

56:48

well,

56:48

can we use it for mass surveillance on

56:49

Americans?

56:51

Why?

56:52

Just, yeah, I don't know.

56:53

That's...

56:55

I think that's kind of all my thoughts

56:56

on that one.

56:57

Yeah, I...

57:00

I would definitely, and you said we

57:02

wouldn't talk too much about this,

57:04

but I would want to highlight the

57:06

idea that the US government was going to

57:08

flag Anthropic as a national security

57:12

threat for making these demands.

57:16

I think it is very concerning that the

57:18

US government was so insistent originally

57:20

that like the ability to spy on US

57:24

citizens domestically was like a hard line

57:26

that they needed to have

57:28

not roped off in this application,

57:30

especially because this is an agreement

57:33

between AI companies and the military.

57:36

Certainly not the people you would want

57:39

surveilling your own citizens.

57:42

But

57:44

Yeah, I mean,

57:45

there's problems with AI everywhere.

57:47

I think Jordan brings up a good point

57:49

here that even if there are safeguards

57:51

against US citizens that eventually get

57:53

added on, all of this technology,

57:55

which we already know is extremely

57:57

unreliable,

57:58

is going to be used in military operations

58:01

around the world.

58:02

And all of this AI stuff,

58:04

like you mentioned,

58:06

It's come out very recently.

58:08

I mean,

58:08

none of this stuff is like super well

58:10

tested by any means.

58:13

It's all just a lot of tech companies

58:15

really trying to jam this product into as

58:18

many possible segments as they can.

58:20

And of course,

58:21

that would include the government and the

58:22

military.

58:23

And it's all about getting a return on

58:27

this massive,

58:28

massive investment that they've all made

58:30

into AI development.

58:34

It's becoming an actively

58:40

dangerous situation, I think we can see

58:42

from this story here. And I totally

58:46

agree with you that it really makes no

58:49

sense that this AI use and all

58:52

the data collection that they're doing

58:54

will make a real difference in terms of

58:57

stopping terrorist threats or plots

59:01

or affecting people's everyday lives.

59:03


59:04

And this is an argument that people have

59:06

known about and people have been making

59:07

for literal decades,

59:10

even before like the Internet and

59:12

computers were commonplace or used by

59:14

everyone.

59:17

It reminds me of like all of the

59:18

reports that came out following nine

59:20

eleven in the US about how certain

59:23

government agencies had intelligence that

59:26

indicated this might be happening,

59:28

whether or not that was passed along to

59:29

the FBI.

59:30

Like before this happened,

59:32

were people aware?

59:34

I think the general consensus there was

59:36

like, you know, nothing was that definitive. It

59:40

wasn't completely reasonable for

59:42

anyone to expect that that event was going

59:45

to happen ahead of time. But certainly

59:46

these people were in the systems and that

59:49

data didn't lead to anything actionable

59:52

happening. And it's similar to the

59:54

case you talked about, where the

59:56

perpetrator was in their systems and was

59:58

already flagged.

1:00:00

And that didn't lead to anything being

1:00:02

stopped because all of this data

1:00:05

collection,

1:00:05

it isn't leading to any positive outcomes

1:00:09

here.

1:00:09

They're using national security, I think,

1:00:14

as a front for what they really want

1:00:19

to do with all of this data.

1:00:20

But much like a lot of

1:00:24

security protections that we have,

1:00:27

like the TSA, for example.

1:00:29

This is just a matter of security theater

1:00:32

in a lot of cases that isn't actually

1:00:35

doing the things that it sets out to

1:00:36

do.

1:00:36

You know,

1:00:39

they have plenty of other reasons to want

1:00:40

this data.

1:00:41

And I think national security or stopping

1:00:44

threats or stopping terrorists or

1:00:47

protecting children or whatever excuse you

1:00:48

want to come up with

1:00:49

these days.

1:00:52

All of that is just an easy way

1:00:55

to put a bow on things and describe

1:00:58

it without having to really get into the

1:01:00

details.

1:01:00

But if you did get into these details,

1:01:02

you would see that all of the stuff,

1:01:04

the AI stuff that we're introducing into

1:01:06

the military,

1:01:07

all of the data collection that we're

1:01:08

doing on US citizens and people all around

1:01:12

the world, really,

1:01:14

all of this stuff is just completely

1:01:16

unnecessary.

1:01:17

And it's

1:01:20

bad. It's bad for citizens of the US,

1:01:22

it's bad for everyone else in the

1:01:25

world, and it's becoming actively dangerous.

1:01:28

And I think more people need to

1:01:29

be concerned about all of that. Yeah, I

1:01:34

mean, we could make a whole podcast,

1:01:37

not even just an episode, we could make

1:01:38

a whole series out of all the problems

1:01:40

with AI. But

1:01:43

One of the things also that Jordan said

1:01:45

that I thought was pretty good is AI

1:01:46

is pretty biased based on its training

1:01:47

data.

1:01:48

That's historically been a big problem,

1:01:50

especially in a policing context,

1:01:51

is a lot of people have accused it

1:01:53

of...

1:01:55

One thing I've learned is if you go

1:01:57

looking for a problem, you will find one.

1:01:59

Generally speaking,

1:02:00

whatever you go looking for, you find.

1:02:02

And so if police, for example, feed

1:02:08

AI all of these

1:02:11

arrest records, right, and let's say they

1:02:13

all happen on the east side of town,

1:02:15

then this AI is going to be

1:02:17

like, oh, all the crime is on the

1:02:18

east side of town. More cops are going

1:02:19

to go to the east side of town,

1:02:20

they're going to find more crime because

1:02:22

there are more cops. Meanwhile, the west side

1:02:24

of town is where all the white-collar

1:02:25

crime is happening. But you know, it's

1:02:28

just such an

1:02:30

imperfect thing and

1:02:32

So far,

1:02:33

there have not been any studies that have

1:02:35

shown that all this mass surveillance

1:02:37

actually stops crime or has any meaningful

1:02:40

impact on lowering crime rates.

1:02:43

And one of the big things that concerns

1:02:45

me with relying so much on AI for

1:02:47

everything is,

1:02:48

if you guys have never seen the movie

1:02:49

Brazil, I highly recommend it.

1:02:51

The ending's a little bleak,

1:02:52

I'm just gonna warn you.

1:02:54

But it's basically this very absurdist

1:02:57

sci-fi movie where this guy gets

1:03:00

wrongfully arrested

1:03:02

And his neighbor witnesses the arrest and

1:03:04

he's like,

1:03:05

I don't think they got the right guy.

1:03:06

Like I've lived next to this guy for

1:03:07

twenty years or whatever.

1:03:08

He's never been an issue.

1:03:10

And so he basically goes off on a

1:03:12

quest to try and deal with the bureaucracy

1:03:14

of like you arrested the wrong guy.

1:03:16

And he keeps running into people who are

1:03:18

basically just like, well,

1:03:19

that's what the computer said.

1:03:20

Like, that's what my paperwork says.

1:03:22

That's that's just like, no,

1:03:23

but that's what it says.

1:03:24

And like,

1:03:25

that's one of the big concerns that I

1:03:27

have with all this stuff, and all this

1:03:29

letting the machines do the thinking for

1:03:31

us, shout out to the Dune fans in

1:03:33

the room, is that we're entering this

1:03:35

world where, when the AI gets

1:03:38

it wrong, what happens? They're just going

1:03:39

to be like, well, that's what the computer

1:03:41

said. Yes, but the computer's wrong. Yeah, but

1:03:43

that's what the computer said. It's like, oh

1:03:44

my god, dude. So yeah, it's a

1:03:47

very scary time we're entering into. Yes.

1:03:53

We are going to get into some questions

1:03:57

from live streamers in a bit.

1:03:59

But before we do that,

1:04:01

we have an article here from

1:04:03

404 Media.

1:04:04

The headline is "ProtonMail Helped FBI

1:04:07

Unmask Anonymous Stop Cop City Protester."

1:04:11

A court record reviewed by 404

1:04:13

Media shows privacy-focused email provider

1:04:15

ProtonMail handed over payment data

1:04:17

related to a Stop Cop City email account to

1:04:20

the Swiss government,

1:04:21

which handed it to the FBI.

1:04:26

So I'll read the beginning of this article

1:04:28

quick.

1:04:28

Privacy-focused email provider ProtonMail

1:04:30

provided Swiss authorities with the

1:04:32

payment data that the FBI then used to

1:04:34

determine who was allegedly behind an

1:04:35

anonymous account affiliated with the Stop

1:04:38

Cop City movement in Atlanta,

1:04:40

according to a court record reviewed by

1:04:43

404 Media.

1:04:45

The records that they reviewed provide

1:04:47

insight into the sort of data that

1:04:49

ProtonMail,

1:04:50

which prides itself on both its end-to-end

1:04:52

encryption and that it is only governed by

1:04:53

Swiss privacy law,

1:04:54

can and does provide to third parties.

1:04:58

Um, so pretty much this

1:05:01

entire story, I

1:05:04

kind of disagree with

1:05:05

the headline a bit,

1:05:06

although obviously FBI involvement was

1:05:08

here.

1:05:09

It is important, I think,

1:05:10

to draw this distinction

1:05:12

between

1:05:14

a foreign government asking Proton for

1:05:16

this information versus the

1:05:18

Swiss courts

1:05:20

asking Proton for this information because

1:05:23

in this case,

1:05:25

the FBI did go through those channels and

1:05:28

the Swiss courts demanded that Proton hand

1:05:31

this data over.

1:05:32

And I think that this is a big

1:05:34

difference from a lot of like big tech

1:05:36

companies, for example,

1:05:37

which will comply with court orders from

1:05:41

from other countries where they're

1:05:45

Like they might not necessarily fall under

1:05:47

their jurisdiction,

1:05:48

but they will comply with them anyways,

1:05:50

rather than like demanding everything go

1:05:51

through the U.S.

1:05:53

in a lot of big tech cases.

1:05:55

And so,

1:05:56

I do think you have to

1:05:59

draw this distinction, because,

1:06:01

you know,

1:06:04

the Swiss courts do limit things a bit

1:06:10

as far as what information can

1:06:12

be requested.

1:06:12

But obviously we've seen a number of times

1:06:15

that they have been willing to demand the

1:06:19

data of activists in this case who aren't

1:06:22

necessarily

1:06:26

doing anything illegal.

1:06:27

I don't know exactly what these people are

1:06:29

being accused of,

1:06:32

but I do know that charges against a

1:06:34

lot of the people in this case,

1:06:36

according to 404 Media in this article,

1:06:38

actually,

1:06:38

they said that they've been dropped.

1:06:39

So it's not clear like who's involved or

1:06:42

like what level of certainty the FBI even

1:06:45

had in the first place as to like

1:06:47

what crimes the person behind this email

1:06:51

supposedly committed.

1:06:53

At the end of the day,

1:06:56

kind of similar to the big story with

1:06:59

Proton revealing the IP address of a

1:07:02

French activist a little while ago,

1:07:09

the issue isn't necessarily the fact that

1:07:11

they're handing over information,

1:07:13

although it's certainly not great that

1:07:14

they have this information to hand over in

1:07:16

the first place because we can look at

1:07:19

court cases

1:07:22

from Signal, for example, where the amount

1:07:25

of information that they have and do

1:07:26

hand over is extremely limited,

1:07:29

whereas it seems like a lot of

1:07:31

data that Proton has is not protected

1:07:34

as you would expect. But I think

1:07:39

it really just highlights the importance

1:07:42

of

1:07:44

understanding what data you have is

1:07:47

protected and isn't protected when you use

1:07:49

any service, including Proton.

1:07:52

Because the encryption that is used in a

1:07:55

lot of cases,

1:07:55

and certainly in the case of Proton,

1:07:57

which is an email provider,

1:07:58

which is already not a great technology

1:08:00

for protecting this sort of metadata.

1:08:05

The encryption that's used even in

1:08:07

end-to-end encrypted products varies

1:08:09

widely.

1:08:10

So we could think about Signal again,

1:08:13

just for a simpler example,

1:08:16

compared to WhatsApp.

1:08:18

They actually use very similar encryption

1:08:21

technologies.

1:08:21

WhatsApp has famously used the Signal

1:08:25

protocol to encrypt those messages for a

1:08:27

while,

1:08:27

but

1:08:28

unlike signal,

1:08:29

which has put in a lot of effort

1:08:30

into minimizing the amount of metadata

1:08:33

that's collected and logged by the

1:08:34

company,

1:08:36

WhatsApp and their parent company Meta are

1:08:39

collecting and storing all sorts of

1:08:41

information about like,

1:08:43

who's registered on their service,

1:08:44

when they're using the app,

1:08:46

who they're communicating with,

1:08:47

they have all of that information.

1:08:49

And that places you at risk,

1:08:51

even though WhatsApp is end to end

1:08:53

encrypted.

1:08:56

And similarly here,

1:09:00

At the end of the day,

1:09:01

I don't think it's reasonable to expect

1:09:03

Proton to not comply with court orders,

1:09:08

of course.

1:09:09

I don't know.

1:09:11

Maybe you saw this in the comments,

1:09:12

but I don't know if I saw in

1:09:14

this article whether Proton fought back

1:09:17

against this court order or to what

1:09:18

extent.

1:09:19

And so I'd be interested to know about

1:09:21

that.

1:09:21

But I will say,

1:09:23

at the end of the day,

1:09:27

looking at the...

1:09:30

I think especially after the French

1:09:33

activist thing,

1:09:34

Proton has made a bit of this more

1:09:35

clear and it is pretty clear in their

1:09:37

privacy policy,

1:09:38

like what information they have.

1:09:40

And I think that people just need to

1:09:41

go into situations like this,

1:09:43

assuming that any data that they give to

1:09:46

a third party service provider could

1:09:47

potentially be either leaked in a data

1:09:51

breach or handed over in a case like

1:09:52

this.

1:09:54

and need to plan accordingly because the

1:09:57

only protection that you can really rely

1:10:00

on is strong encryption of all of the

1:10:03

data you want to protect.

1:10:05

You can't rely on privacy policies.

1:10:08

You can't rely on companies avoiding court

1:10:11

orders.

1:10:13

if they have the data,

1:10:16

it will eventually be leaked,

1:10:17

whether it's the company giving it away or

1:10:19

whether it's a hack,

1:10:22

which seems inevitable.

1:10:23

I mean, Nate,

1:10:24

you publish like a data breach roundup

1:10:26

every single week, right?

1:10:28

With all sorts of companies that are

1:10:31

hacked all the time.

1:10:32

I think it's more than most people would

1:10:34

expect.

1:10:35

And yeah,

1:10:36

you can find that on our website if

1:10:37

you want to

1:10:38

go back in time and see all of

1:10:40

these happening. But yeah, you have to

1:10:45

rely on encryption and you have to really

1:10:47

take a look at what these companies are

1:10:49

encrypting, because Proton is taking in a lot

1:10:52

of data that they do not encrypt at

1:10:55

the end of the day, and you need

1:10:56

to plan around that.

1:11:05

Yeah, real quick,

1:11:05

fun story on the data breach note.

1:11:08

I started doing that back many,

1:11:09

many moons ago.

1:11:11

I started my own just solo podcast.

1:11:14

And when I ended up teaming up with

1:11:16

Henry at Surveillance Report,

1:11:17

that was my one stipulation: I want

1:11:20

to bring the data breach section

1:11:22

And that's kind of why I started doing

1:11:23

it here as well is because,

1:11:25

like you said,

1:11:26

I think people don't realize how

1:11:28

frighteningly common data breaches are.

1:11:30

And that was kind of like my thing

1:11:32

is like I wanted people to realize, like,

1:11:34

take your privacy seriously,

1:11:35

if for no other reason than the fact

1:11:37

that this happens literally every day.

1:11:40

But yeah, it's...

1:11:42

I think the reason I always like to

1:11:45

share these stories about Proton sharing

1:11:48

data is not to beat up on Proton

1:11:51

necessarily, but I mean, for one,

1:11:53

I already know there's going to be a

1:11:54

lot of people out there spreading

1:11:56

conspiracy theories about how Proton's a

1:11:57

honeypot and this just proves it.

1:12:00

But it's like you're saying, like email...

1:12:04

So many.

1:12:06

I think this is actually in one of

1:12:08

our upcoming videos here that should be

1:12:09

coming out soon.

1:12:11

So many of the technologies that run the

1:12:13

internet were invented literally in like

1:12:16

the nineteen sixties when there were ten

1:12:18

people online and they were all like

1:12:20

college kids and there was no need for

1:12:23

security because nobody was doing banking

1:12:25

transactions.

1:12:26

Nobody was doing sensitive military plans.

1:12:28

Nobody was sharing like

1:12:29

intimate communication.

1:12:31

It was all just literally like research

1:12:32

that was all going to be made public

1:12:33

at some point anyways.

1:12:34

Right.

1:12:35

And like maybe a few notes here and

1:12:36

there about, you know, hey,

1:12:37

did you get the document or whatever?

1:12:39

So

1:12:40

security was really kind of an

1:12:41

afterthought.

1:12:43

And unfortunately as the internet grew and

1:12:44

scaled,

1:12:46

we kind of just kept bolting afterthoughts

1:12:48

onto this, this stuff.

1:12:50

And that's how we end up with things

1:12:51

like encrypted email, which, you know,

1:12:53

Proton does great.

1:12:55

But both of them, and Mailbox, and like

1:12:58

all of these,

1:12:58

they're really just applying band aids to

1:13:00

technologies that were never really

1:13:01

designed to be secure.

1:13:03

And that's why we like things

1:13:05

like Signal that were kind of like,

1:13:07

what if we went in at the ground floor

1:13:09

and tried to be as secure as possible?

1:13:12

But even then, those have use cases.

1:13:13

Like, I always push back on that.

1:13:15

A personal pet peeve of mine,

1:13:16

I hate when people are like, oh, well,

1:13:18

you shouldn't use encrypted email because

1:13:21

email was never designed to be secure.

1:13:22

Use Signal instead.

1:13:23

And it's like, great.

1:13:25

The day my bank agrees to send me

1:13:26

a Signal message,

1:13:28

I will be in agreement with you.

1:13:29

But we're just not there.

1:13:30

Like, unfortunately, again,

1:13:32

we still have all these legacy

1:13:34

technologies that are floating around

1:13:36

because they just are.

1:13:38

And I think...

1:13:41

I think these stories are unfortunate

1:13:44

because Proton,

1:13:48

like every company is going to try to

1:13:50

market why you should use them, right?

1:13:52

And I think for,

1:13:54

especially for the target audience of

1:13:55

something like Proton,

1:13:57

it's very difficult to explain to people

1:13:59

in a nutshell why they need something like

1:14:04

Proton or PGP or anything.

1:14:06

It's very difficult to explain to them why

1:14:08

Gmail and Yahoo are not secure.

1:14:11

And also to explain nuance, right?

1:14:14

It's a very fine line to tread,

1:14:15

especially when you're talking to the

1:14:16

masses.

1:14:18

And I think there's definitely places

1:14:21

where Proton could do better.

1:14:22

Like I think with that French activist

1:14:23

one,

1:14:24

Proton did actually change some of the

1:14:26

wording on their website because it wasn't

1:14:30

technically wrong,

1:14:31

but I could see how somebody could get

1:14:33

the wrong impression.

1:14:33

And I don't know, this stuff,

1:14:37

I'm trying to put my thoughts in order

1:14:40

here.

1:14:42

It's frustrating because I don't think

1:14:45

Proton necessarily did anything wrong

1:14:47

here,

1:14:48

but I could see how people could be

1:14:51

lulled into a false sense of security.

1:14:53

And I do want to point out,

1:14:56

somebody pointed out here in the chats,

1:14:58

they said like no end-to-end encrypted

1:15:00

data was given away.

1:15:01

The account owner simply had bad OPSEC.

1:15:04

It's this person, like I will admit,

1:15:08

I pay for my Proton account with a

1:15:09

card.

1:15:10

I use a privacy.com card.

1:15:12

which is linked to my name.

1:15:13

Like, if

1:15:13

I was the person in this scenario,

1:15:15

for whatever reason,

1:15:17

the FBI could request data from Proton.

1:15:19

Proton:

1:15:20

here's their card info.

1:15:21

They could trace that back to privacy.com

1:15:23

who could trace it back to me.

1:15:24

I know that's not fully anonymous,

1:15:26

but also I'm not an activist.

1:15:27

If I was doing like serious,

1:15:29

heavy activism work,

1:15:31

I would probably take some more steps.

1:15:34

I don't really want to victim blame here,

1:15:35

but I guess, um,

1:15:38

And Proton pointed that out too.

1:15:39

They said like, we do accept cash.

1:15:40

We do accept cryptocurrency.

1:15:42

They don't accept Monero.

1:15:43

I'm always going to call them out on that,

1:15:46

but yeah,

1:15:48

it's important to know the limitations of

1:15:50

a tool.

1:15:50

And again,

1:15:51

like I mentioned this earlier in the show,

1:15:53

there's a difference between privacy and

1:15:54

anonymity, right?

1:15:55

Proton is not promising you anonymity,

1:15:57

at least not by default.

1:15:57

You're

1:16:01

So I think it's just really important to

1:16:02

keep in mind the limitations of these

1:16:04

tools.

1:16:05

And I just remembered what you said. From

1:16:07

what I understand,

1:16:08

Proton did not push back on this order

1:16:11

because they were informed that apparently

1:16:13

this person,

1:16:14

I don't know if charges were dropped.

1:16:16

The article said that charges hadn't been

1:16:17

filed.

1:16:19

What exactly did they say?

1:16:21

Uh,

1:16:21

404 Media is not publishing the

1:16:22

person's name because they don't appear to

1:16:24

have been charged with a crime according

1:16:25

to searches of court databases.

1:16:27

So maybe they haven't been charged with a

1:16:28

crime yet.

1:16:29

Um, but yeah,

1:16:30

Apparently,

1:16:31

Proton was informed that the person in

1:16:33

this situation was violent,

1:16:35

that they had already shot at one officer,

1:16:37

that they had explosives on them.

1:16:40

I don't know how true that is.

1:16:41

That's Proton's justification,

1:16:43

and you are welcome to have your own

1:16:44

opinions on whether or not that was

1:16:47

justification enough.

1:16:48

But it is...

1:16:52

Yeah,

1:16:53

it's – Proton does push back sometimes.

1:16:55

They kind of do it on a case-by-case

1:16:56

basis,

1:16:56

which I don't know how I feel about

1:16:58

that.

1:16:58

But they try to get as much of

1:16:59

the facts of the case as they can

1:17:01

before deciding whether or not they want

1:17:02

to push back on a court order.

1:17:04

But yeah, it's –

1:17:08

I don't know.

1:17:09

I think for me,

1:17:10

the big thing again is I hate seeing

1:17:12

people confuse privacy with anonymity and

1:17:15

get really upset and be like, oh,

1:17:17

Proton shouldn't have complied.

1:17:18

Proton even said this.

1:17:19

I don't know if it was in here,

1:17:20

but there was a Reddit thread where Proton

1:17:23

issued an official statement,

1:17:24

which was very professional.

1:17:26

I was impressed by it.

1:17:27

And they did mention basically that, look,

1:17:29

nobody can operate above the law.

1:17:31

There's not a country in the world where

1:17:32

we're not subject to somebody's laws.

1:17:34

And

1:17:35

They choose to be under Swiss laws.

1:17:36

They feel that Swiss laws are very

1:17:38

thorough and set a very high bar.

1:17:41

But yeah, I mean, ultimately,

1:17:42

at the end of the day,

1:17:44

I personally would be more worried by a

1:17:45

company who ignores the law because

1:17:47

they're going to get shut down eventually.

1:17:48

Like, they just can't keep operating

1:17:50

outside the law.

1:17:51

So, yeah.

1:17:53

Yeah, I, I agree.

1:17:55

It's a very fine line for them to

1:17:57

be treading here.

1:18:01

At the end of the day,

1:18:02

like the headline is accurate.

1:18:04

They did help the authorities.

1:18:06

And you might not expect that from a

1:18:08

company that markets itself so heavily

1:18:12

around privacy.

1:18:13

And a lot of people in the privacy

1:18:15

community, especially,

1:18:16

I even saw a comment here from our

1:18:19

team member, Jordan,

1:18:20

saying they could make it more obvious the

1:18:21

data isn't encrypted,

1:18:23

which I think is certainly true.

1:18:26

But at the same time,

1:18:29

I think you have a really good point

1:18:30

about

1:18:33

like Proton needing to market this product

1:18:36

towards an extremely broad audience who

1:18:40

does not care about these problems and who

1:18:42

isn't like going to be affected by court

1:18:45

orders because the demographic that Proton

1:18:48

is targeting is

1:18:51

primarily businesses and people who are

1:18:54

switching away from the Google Workspace

1:18:56

suite of things.

1:18:57

And it is just objectively true that

1:19:02

switching from Google to Proton is a huge

1:19:04

benefit for those people.

1:19:07

No matter what they do, really,

1:19:09

it's always going to be an improvement in

1:19:10

their privacy and security.

1:19:12

And a lot of these people are not

1:19:15

going to be

1:19:17

concerned about the nitty gritty details

1:19:20

of some of this stuff.

1:19:21

And also to Proton's credit,

1:19:25

between their privacy policy and their

1:19:27

blog and some pages on their website about

1:19:31

transparency,

1:19:32

for the people who are concerned about all

1:19:34

of this stuff,

1:19:35

you can find all of this information

1:19:38

pretty accessibly on their site and in

1:19:41

their resources.

1:19:42

You do have to look for it.

1:19:44

Which you can certainly argue is

1:19:48

unfortunate,

1:19:49

but also you can see that as a

1:19:53

legitimate decision for them to make

1:19:56

because it doesn't probably make a lot of

1:19:59

sense to overwhelm the type of person or

1:20:02

business that's switching from Google and

1:20:04

Microsoft to Proton with all of this stuff

1:20:06

that isn't going to impact them.

1:20:12

It's a very hard problem to solve.

1:20:14

And I think that for people who are

1:20:17

in this situation,

1:20:20

making it more clear that you need to

1:20:21

be using tools like Signal or SimpleX or

1:20:26

other messengers that are designed from

1:20:28

the beginning to be secure rather than

1:20:31

like you said,

1:20:31

sixties technologies that have had a ton

1:20:33

of stuff just bolted on over time.

1:20:38

like that is the actual solution here and

1:20:41

i think that like more tools that are

1:20:45

designed to be as private as possible by

1:20:47

default without having to worry about this

1:20:49

makes a lot more sense than than proton

1:20:51

like trying to describe every possible

1:20:54

case where your data could be could be

1:20:56

leaked or shared like this

1:21:00

So yeah, it's kind of unfortunate,

1:21:02

but I'd agree that I don't really know

1:21:06

what else Proton can do in a situation

1:21:09

like this.

1:21:13

It's very challenging,

1:21:15

and they've created this challenge for

1:21:16

themselves because they chose to make an

1:21:19

email service,

1:21:20

but that is what they're doing at the

1:21:22

end of the day,

1:21:23

and there isn't a great way to handle

1:21:26

this, unfortunately.

1:21:30

Yeah, I agree.

1:21:31

I mean,

1:21:31

it's I think we hit a certain point

1:21:33

where it becomes

1:21:37

It becomes kind of a personal opinion

1:21:38

thing in the sense that like, for example,

1:21:40

this person here on YouTube said that I

1:21:43

think that doesn't justify the move

1:21:45

they've made.

1:21:47

And I could see that argument where like,

1:21:49

again,

1:21:49

if you're saying like they shouldn't have

1:21:50

handed over any data period,

1:21:51

no matter what,

1:21:52

I completely disagree because they will.

1:21:54

If you go with a bulletproof provider who

1:21:55

does that,

1:21:56

eventually they will be shut down.

1:21:58

And now even if you didn't do anything

1:22:00

wrong,

1:22:00

your data is sitting in an evidence locker

1:22:02

alongside everybody else.

1:22:04

We've seen that happen time and time

1:22:05

again,

1:22:05

but I could see the argument of like,

1:22:07

well, they still,

1:22:07

they should push back on every court order

1:22:09

by default.

1:22:10

And I can see that argument.

1:22:11

I don't know if I necessarily agree with

1:22:13

that for the record, but like,

1:22:15

I definitely see where you're coming from.

1:22:17

So that's what I mean when I say

1:22:17

like,

1:22:18

we kind of get to a point where

1:22:19

it becomes personal preference.

1:22:20

Like, should they have pushed back harder?

1:22:21

Should they push back every time?

1:22:23

Because there's also a part of me that

1:22:24

says, well, if they cooperate,

1:22:25

let's say they cooperate on,

1:22:27

objectively awful cases,

1:22:29

like we know this person was genuinely a

1:22:32

terrorist in the wrong,

1:22:33

we know this person is trafficking CSAM,

1:22:35

we know this person is doing awful,

1:22:36

awful things,

1:22:37

then I feel like that kind of improves

1:22:39

Proton's position when they get a BS

1:22:42

request that's like, oh,

1:22:43

we just don't like that this journalist

1:22:44

wrote mean things about us.

1:22:46

Okay.

1:22:47

Cry me a river, go home.

1:22:48

We're not turning over the data.

1:22:50

So I don't know.

1:22:51

It's just, it's personal preference,

1:22:54

but yeah, it's that same person just said,

1:22:56

there's a reason I've always avoided

1:22:57

email.

1:22:57

I'm kind of backing up what you were

1:22:58

saying.

1:22:58

It's less, uh... but, you know,

1:23:02

we

1:23:03

need to focus on things whenever

1:23:04

possible.

1:23:04

Again,

1:23:05

I mentioned that my bank is never going

1:23:07

to send me a signal message,

1:23:08

at least not anytime soon.

1:23:09

And I wish they would, but, um, yeah,

1:23:11

trying to avoid email when you can trying

1:23:13

not to.

1:23:15

I don't know,

1:23:16

just trying to move to those more private

1:23:18

or more secure from the ground up

1:23:20

alternatives where possible is kind of the

1:23:22

only solution.

1:23:22

But it has its limitations for sure.

1:23:29

But I think that was all of our

1:23:30

stories this week.

1:23:32

I was poking around Proton's website.

1:23:33

Let me close these tabs.

1:23:36

Those were all the questions.

1:23:40

So it's time to start taking viewer

1:23:42

questions, actually.

1:23:44

If you've been holding on to any questions

1:23:45

about any of the stories we've talked

1:23:46

about,

1:23:47

go ahead and start leaving them in either

1:23:49

the forum thread or the comments section

1:23:51

of the livestream.

1:23:52

And we're actually going to go ahead and

1:23:54

start with the forum thread,

1:23:56

which

1:23:57

Last I checked only got one question.

1:24:01

Yes, that is correct.

1:24:02

So we have a question from anonymous five,

1:24:05

seven, one.

1:24:06

First of all,

1:24:06

big thanks for the work that you do.

1:24:08

Thank you.

1:24:09

You said in the past,

1:24:10

I used a single Gmail address,

1:24:11

which was not your main email address for

1:24:13

all sorts of random account signups for

1:24:14

things like discord, Amazon.

1:24:16

Netflix, news websites, one-off trials,

1:24:18

et cetera.

1:24:19

You said,

1:24:20

I've used this email address for many,

1:24:21

many years.

1:24:21

Needless to say,

1:24:22

it's a bit of a cluster.

1:24:23

Younger me thought that I was being smart,

1:24:25

not having these accounts fill up my main

1:24:26

email address with spam.

1:24:28

Cut forward to today and being more

1:24:30

privacy and security aware, you got,

1:24:33

ironically,

1:24:33

a Proton subscription with a custom

1:24:35

domain.

1:24:35

You've been updating all your old accounts

1:24:37

to either Proton or simple login aliases

1:24:39

and aliases on your custom domain.

1:24:41

Got me thinking, however,

1:24:42

is this merely updating my email with a

1:24:44

unique alias a waste of time?

1:24:46

Should I rather be creating completely new

1:24:48

accounts for all these websites?

1:24:50

The thinking is that they likely keep

1:24:51

version history of my email address so I

1:24:53

could still be linked or profiled based on

1:24:55

previous email addresses.

1:24:56

A data breach could also expose the email

1:24:58

history,

1:24:58

so it doesn't help in that respect either.

1:25:00

Updating my email with a unique alias on

1:25:02

all these websites is one thing,

1:25:03

but creating new accounts and closing the

1:25:04

old ones gives me goosebumps just thinking

1:25:07

about it.

1:25:09

I have some complicated thoughts on this

1:25:11

one.

1:25:14

Well,

1:25:15

complicated in the sense that I feel like

1:25:16

it's very nuanced.

1:25:17

You know, it's always nuanced, right?

1:25:19

So, I don't know.

1:25:21

Do you want to go first, Jonah?

1:25:23

I mean, yeah,

1:25:24

I could give a few thoughts on this.

1:25:26

We might be thinking about the same thing

1:25:28

here,

1:25:28

but I do think certainly it's a good

1:25:32

thing to switch to Proton,

1:25:35

start using simple login aliases for all

1:25:38

your accounts because it is super

1:25:39

important to use

1:25:42

a different email for every site that you

1:25:44

use for the same reason,

1:25:46

pretty much that you'd use a different

1:25:47

password for every site that you use,

1:25:48

which is that, you know,

1:25:50

especially, you don't even

1:25:52

necessarily have to be concerned about the

1:25:55

website itself tracking you,

1:25:56

although that is definitely a concern with

1:25:58

some websites.

1:25:59

But as we talked about

1:26:03

previously in the show,

1:26:04

data breaches are super common.

1:26:06

And these sites will,

1:26:10

like when these data breaches are out,

1:26:12

if your email is shared between data

1:26:14

breaches,

1:26:14

that does create a pattern that can be

1:26:19

used to track you across these sites and

1:26:21

create a profile of like the kind of

1:26:22

sites that you're using.

1:26:23

And these data breaches are super common.

1:26:25

So you don't want to have any information

1:26:27

between data breaches that can potentially

1:26:29

be linked together.

1:26:30

That is a privacy concern.
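The linkage risk Jonah describes can be sketched with a toy example (the dumps and addresses below are made up; real breach data varies widely in format):

```python
# Two hypothetical breach dumps from unrelated sites. A reused address
# lets anyone join the dumps into one profile; per-site aliases do not.
breach_shop = {"jane@example.com", "shop.k3x9@alias.example"}
breach_forum = {"jane@example.com", "forum.q7p2@alias.example"}

def linkable_addresses(dump_a, dump_b):
    """Return addresses that appear in both breach dumps."""
    return dump_a & dump_b

# Only the reused address links the two breaches; the aliases are unique.
print(linkable_addresses(breach_shop, breach_forum))  # → {'jane@example.com'}
```

With a unique alias at every site, the intersection would be empty, and the breaches could no longer be correlated by email alone.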

1:26:32

Um,

1:26:35

As far as updating your email with

1:26:38

accounts you already use or deleting

1:26:41

accounts and starting over,

1:26:43

that is something that is going to really

1:26:44

depend on what you think is worth it.

1:26:48

I think the person who has this question

1:26:50

really laid out a lot of the reasons

1:26:54

why you might want to do that and

1:26:55

also the reasons that you wouldn't want to

1:26:58

do that,

1:26:58

especially like just the effort involved

1:27:01

in having to recreate all of these

1:27:03

accounts.

1:27:04

And it really depends on how you feel

1:27:08

about that site.

1:27:10

I don't think for a lot of websites

1:27:13

that you would sign up with,

1:27:15

it's probably fairly unlikely that they

1:27:17

are tracking like email history,

1:27:19

for example.

1:27:20

And if we're talking about like a big

1:27:22

tech company or a data company like Amazon

1:27:24

or Facebook,

1:27:25

I would think that that is more

1:27:26

likely.

1:27:27

But if you're talking about like a general

1:27:29

e-commerce shop or a random form or

1:27:31

whatever,

1:27:32

It's probably unlikely that they're

1:27:34

storing that historical data forever, and

1:27:37

so changing that might be fine. But of

1:27:39

course, that is a case where you

1:27:42

would have to trust that is happening,

1:27:46

and you'll never know for sure. So

1:27:50

I think the way I would sum this

1:27:52

up is just, at the end

1:27:54

of the day, you have to decide

1:27:58

whether

1:28:00

recreating all of these accounts

1:28:02

is worth it for you,

1:28:02

but that's going to be an individual and

1:28:04

maybe even a site-by-site basis,

1:28:08

which I couldn't really tell you.

1:28:10

I don't know if you have more actionable

1:28:12

advice than that, Nate,

1:28:13

if that's kind of what you're thinking,

1:28:14

but definitely share your thoughts.

1:28:17

Yeah, very similar.

1:28:19

I will say this isn't necessarily proof,

1:28:24

but...

1:28:25

In all the years that my brain has

1:28:27

become an encyclopedia for companies that

1:28:28

have had data breaches,

1:28:30

I've only ever seen one that had a

1:28:34

breach that exposed the email you signed

1:28:37

up with.

1:28:39

I can't remember who it was,

1:28:40

but I remember it does stick out in

1:28:42

my mind because I remember thinking like,

1:28:43

oh, that's weird.

1:28:44

We've never seen that before.

1:28:48

So, I mean, I...

1:28:50

I find it kind of hard to believe

1:28:53

that if this was a common practice of

1:28:55

companies keeping a history of your email

1:28:57

addresses,

1:28:58

I find it

1:29:03

hard to believe that if companies were

1:29:04

doing that,

1:29:06

that we wouldn't have seen more of those

1:29:07

breaches by now with how common these

1:29:09

breaches are.

1:29:11

Um, it's certainly possible, obviously,

1:29:13

but I don't know.

1:29:15

That's the thing,

1:29:15

I've only ever seen one that did.

1:29:16

I do agree.

1:29:17

I would just add onto that really quick

1:29:19

that like, in my experience,

1:29:21

hosting software,

1:29:22

like thinking about open source software,

1:29:24

we're talking about the major platforms

1:29:25

like WordPress or forum software,

1:29:28

all the stuff that like all these tiny

1:29:29

sites would be using.

1:29:30

I've also never seen, um,

1:29:33

really any situations where like that is

1:29:35

commonplace in software.

1:29:36

So I would imagine you'd only really see

1:29:39

that from like a big custom made website,

1:29:41

maybe from a big tech company,

1:29:43

but it seems pretty unlikely.

1:29:45

I would agree just from the software side

1:29:46

of things as well.

1:29:47

I've never really seen features like that

1:29:50

personally.

1:29:52

And also that story that I referenced,

1:29:54

it was literally only the sign-up email.

1:29:56

So if you signed up with Gmail and

1:29:58

then you changed your email like,

1:30:00

it would only have that Gmail and then

1:30:03

your current email.

1:30:04

It was really weird.

1:30:05

I wish I could remember who that was.

1:30:07

But anyways, my only concern with this,

1:30:10

if you want to make all new accounts,

1:30:12

I certainly don't think that's a bad idea.

1:30:14

I know there's a lot of people in

1:30:15

the privacy community that actually like

1:30:18

just periodically nuke their accounts and

1:30:20

start over all the time.

1:30:22

I think we have a regular in our

1:30:23

forum who did that recently, actually.

1:30:26

But I think my concern would be,

1:30:31

especially with some of the more

1:30:32

mainstream platforms you mentioned,

1:30:33

like Discord and Amazon,

1:30:36

I notice it's becoming increasingly hard

1:30:39

to make new accounts,

1:30:40

especially privately.

1:30:41

Like a lot of them will ding you

1:30:43

for using VPNs.

1:30:46

A lot of them will ding you if

1:30:47

you're on like Linux or an uncommon

1:30:49

browser.

1:30:51

So you run into that. And

1:30:53

some of them, even like Reddit,

1:30:55

Oh my God.

1:30:56

I get more and more pissed at Reddit

1:30:57

with every passing day because Reddit now

1:30:58

has this little user score and it's totally

1:31:00

invisible.

1:31:00

There's subreddits you can go find and

1:31:02

check it.

1:31:03

It's called like CQS or something.

1:31:05

It's basically like a user score.

1:31:06

And if you're not active enough,

1:31:08

if you're not messaging enough,

1:31:09

if you're not using the platform enough,

1:31:11

your score lowers and they think you're a

1:31:14

scammer or a spammer bot, whatever,

1:31:16

which I guess kind of makes sense because

1:31:17

that is the thing.

1:31:18

If you're like someone who spends too much

1:31:20

time on Reddit,

1:31:20

which I have in the past,

1:31:21

you're

1:31:22

that is a thing where like people will

1:31:23

literally make accounts and then sit on

1:31:26

them dormant for like six months.

1:31:27

And then they'll sell the account to

1:31:29

somebody who will start spamming.

1:31:31

Because, you know,

1:31:31

now they're not like a brand new account

1:31:33

and they don't look suspicious or,

1:31:34

you know,

1:31:35

they'll go out and they'll like get a

1:31:36

whole bunch of karma and then they'll sell

1:31:37

the account to someone else.

1:31:39

So I kind of get why they do

1:31:40

that.

1:31:40

Or, you know,

1:31:41

people lurking that just like only send

1:31:42

DMs or whatever.

1:31:45

It makes it frustrating.

1:31:46

I shared this story a couple of weeks

1:31:47

ago.

1:31:47

I logged into,

1:31:48

I have an account where I've identified

1:31:50

myself as The New Oil.

1:31:51

I used to be really active in like

1:31:52

r/privacy.

1:31:53

And I logged in for something.

1:31:54

I don't even remember what,

1:31:55

but I logged in for something.

1:31:57

And on my homepage was r/privacy.

1:31:59

And it was a question that I was

1:32:00

like, oh,

1:32:01

I can leave an answer to that real

1:32:02

quick.

1:32:02

Like I'm qualified to answer this.

1:32:04

This person seems like they're asking a

1:32:05

good question.

1:32:06

So I went in and I typed out

1:32:07

my answer.

1:32:08

And when I hit post, it was like,

1:32:09

oh, your score is too low.

1:32:10

You can't post in here.

1:32:11

And I'm just like, all right, whatever.

1:32:12

Don't care because I haven't posted in

1:32:14

like a year.

1:32:15

So yeah, it's just,

1:32:17

I guess where I'm going with that is,

1:32:18

that would be my main concern: if

1:32:20

it's something like,

1:32:21

you know,

1:32:21

Domino's and you're ordering pizza, right?

1:32:23

They don't care.

1:32:24

As long as the card goes through,

1:32:25

make a new account, whatever,

1:32:27

if you want to.

1:32:28

But if it's something like, again,

1:32:30

like Reddit, Discord,

1:32:31

they're probably going to put up some

1:32:32

blocks and like make it,

1:32:34

probably more of a pain in the ass

1:32:35

than it's worth, in my opinion. And

1:32:37

especially some of them, like Gmail and Discord,

1:32:40

they might require a phone number, and

1:32:42

they're kind of strict about not allowing

1:32:43

voice over IP. So at the end of

1:32:45

the day, it's probably going to be more

1:32:46

work than it's worth, in my opinion, but

1:32:49

it does depend on your threat model.

1:32:52

Yeah, I guess that it really depends

1:32:53

on your threat model and how much work

1:32:54

you're willing to put in. But I

1:32:56

don't think you have to. I think if

1:32:58

you want to, it's not a bad idea,

1:32:59

but in some cases you might get

1:33:00

diminishing returns.

1:33:02

The other thing I would say is I

1:33:06

certainly don't think you have to do this

1:33:08

all right away unless you have a

1:33:10

particularly good reason to.

1:33:12

And kind of similarly to how we handle

1:33:15

opting out of data broker databases in the

1:33:19

US.

1:33:19

We typically recommend,

1:33:21

unless you have an immediate concern right

1:33:23

away of some threat against you,

1:33:28

Just taking your time with it.

1:33:29

I think you definitely

1:33:31

don't want to burn out like spending many

1:33:34

hours straight just constantly recreating

1:33:36

all these accounts.

1:33:37

Right?

1:33:38

This is something you could do over the

1:33:40

course of, I mean,

1:33:41

even a few months if

1:33:43

you want:

1:33:44

just do

1:33:47

a few accounts a day.

1:33:48

I find if you already use a password

1:33:50

manager,

1:33:52

that is a really helpful way to find

1:33:55

all of your accounts.

1:33:56

So you can go through basically a list

1:33:59

and update the email on them at whatever

1:34:01

pace you want.

1:34:02

If you aren't using a password manager

1:34:05

yet,

1:34:05

definitely start using one because that's

1:34:08

super helpful for just, I mean,

1:34:10

not only like all of the typical benefits

1:34:13

of a password manager in terms of

1:34:14

security,

1:34:14

but also just having a list of like

1:34:16

all the places you have an account in

1:34:17

the first place.

1:34:18

That comes in handy very often.

1:34:20

And it's a huge benefit of using a

1:34:22

password manager like that.
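One way to get that account inventory, sketched below: most password managers can export a CSV, and a few lines of Python turn it into a site list (the `name` column and the sample rows are assumptions here; real export formats differ by manager):

```python
import csv
import io

# Hypothetical export snippet; real exports differ by manager.
export_csv = """name,url,username
Discord,https://discord.com,jane
Amazon,https://www.amazon.com,jane
Old Forum,https://forum.example,jane
"""

def list_accounts(csv_text):
    """Return sorted site names from a password-manager CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sorted(row["name"] for row in reader)

print(list_accounts(export_csv))  # → ['Amazon', 'Discord', 'Old Forum']
```

Working through a list like this a few entries at a time is exactly the slow-and-steady approach described above; remember to delete the plaintext export afterward, since it contains credentials metadata.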

1:34:25

So yeah, just going through things,

1:34:28

taking your time is probably fine.

1:34:33

But yeah, it really depends on your

1:34:35

situation. You mean you don't have to be

1:34:39

me, the psychopath who changed all my

1:34:42

passwords in one weekend, in one sitting? I

1:34:46

don't think you have to be. I would

1:34:48

say if that gets you going, then good

1:34:52

for you. Yeah, I wouldn't recommend it, but

1:34:55

I definitely did that. It was not wise.

1:35:01

All right, so going through the chat here,

1:35:03

just to address a few of the chats.

1:35:05

Back with the headline stories,

1:35:06

somebody asked,

1:35:07

will Graphene OS have two flavors now,

1:35:09

or will there remain one flavor?

1:35:11

As far as we know,

1:35:11

there's still just going to be one version

1:35:12

of Graphene.

1:35:14

There's not going to be multiple versions

1:35:15

per device.

1:35:17

Yeah,

1:35:17

and I believe it's been confirmed that

1:35:19

you'll be able to install Graphene OS from

1:35:22

their website like usual on these devices,

1:35:24

which I would expect because Graphene OS

1:35:26

places such an emphasis on...

1:35:29

You have to trust every single aspect of

1:35:31

the installation process to know that your

1:35:34

phone is secure.

1:35:35

And so doing it from a trustworthy source

1:35:37

that you can verify from the very

1:35:39

beginning is important for your security.

1:35:41

And I can't imagine Graphene OS would give

1:35:43

that up.
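That verify-from-the-start idea boils down to comparing the bytes you downloaded against a digest published over a channel you already trust. A minimal illustration (the file and its contents are stand-ins; for a real install, follow GrapheneOS's official instructions, which use signed releases rather than a bare hash check):

```python
import hashlib
import tempfile

def sha256_of(path):
    """Stream a file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for a downloaded OS image.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"factory image bytes")
    image_path = f.name

# In practice the expected digest comes from the project's site over a
# trusted channel (or a signature check), never from the same mirror
# that served the file itself.
expected = hashlib.sha256(b"factory image bytes").hexdigest()
print(sha256_of(image_path) == expected)  # → True
```

If the digests do not match, the download is corrupted or tampered with and should not be flashed.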

1:35:45

They've also said that...

1:35:47

I believe GrapheneOS has confirmed in one

1:35:51

of their social media posts.

1:35:51

It's so hard to find some of this

1:35:53

information about GrapheneOS because it's

1:35:56

in a lot of sporadic social media posts

1:35:59

rather than one place.

1:36:00

So I don't have the post pulled up,

1:36:01

but I believe I've seen that they're not

1:36:05

going to be including any Motorola

1:36:08

bloatware in GrapheneOS or anything like

1:36:10

that.

1:36:11

I think it is still an open question

1:36:14

as to whether Motorola will pre-install it

1:36:16

as we discussed earlier.

1:36:18

And if that will be called Graphene OS

1:36:21

or if like Motorola will be pre-installing

1:36:24

maybe a fork of Graphene OS that does

1:36:26

have their security tools and maybe they

1:36:29

won't

1:36:31

call it Graphene OS.

1:36:31

Maybe they'll do it for different branding

1:36:33

reasons.

1:36:33

So it's not considered to be a second

1:36:35

flavor of Graphene OS,

1:36:36

but maybe their stock operating system

1:36:39

will incorporate a lot of Graphene OS

1:36:41

features and you could maybe consider it

1:36:44

similar to Graphene OS in that regard.

1:36:46

I don't know if that will happen or

1:36:48

not.

1:36:48

It's very unclear what the final product

1:36:52

will look like.

1:36:53

But I think that

1:36:56

we're pretty certain that there will

1:36:58

always be just the standard Graphene OS

1:37:01

that we're all familiar with right now

1:37:03

available across the board with this

1:37:05

device and with Pixels as long as Google

1:37:07

decides to support this and that the

1:37:10

experience shouldn't change.

1:37:13

So you'll always have just the standard

1:37:15

Graphene OS option no matter what Motorola

1:37:19

decides to do with the stock stuff on

1:37:20

their end.

1:37:23

You know, that just occurred to me,

1:37:24

this is totally off the cuff.

1:37:26

So maybe I'm being stupid here.

1:37:28

I wonder if this will in a way

1:37:29

pressure Google to maybe,

1:37:33

maybe not full-on reverse course,

1:37:35

but maybe be a little kinder

1:37:37

to custom operating systems.

1:37:41

I can't imagine it's a huge, huge...

1:37:43

I doubt like,

1:37:44

fifty percent of people that buy Pixels do

1:37:45

it to put graphene on their phone or

1:37:47

something,

1:37:47

but I have to imagine there is a

1:37:49

not insignificant portion of people,

1:37:52

and I wonder if this opening of

1:37:54

competition...

1:37:54

Because graphene is really the only one

1:37:57

that's pixel-only, right?

1:37:58

Calyx people can go to the Fairphone,

1:38:00

there's a couple of Motorolas,

1:38:02

Lineage people can choose every device

1:38:04

ever made, practically. But so I feel

1:38:07

like now that Graphene has competition,

1:38:10

you know what I mean, now

1:38:11

that there's other options, I wonder if

1:38:13

that'll kind of make Google hesitate

1:38:14

a little bit, like, oh, maybe we should

1:38:17

not be quite so aggressive because we

1:38:19

might actually drive some people away. I

1:38:20

don't know, maybe it's just me

1:38:21

dreaming, but...

1:38:22

True. And kind of relatedly, I brought this

1:38:25

up in some of the GrapheneOS discussions

1:38:27

on our forum this week, but I almost

1:38:28

wonder if this partnership with Motorola

1:38:31

can maybe convince Google to change their

1:38:34

policies around Google Play

1:38:36

certification, especially when it comes to

1:38:38

banking apps. I know people replied to

1:38:40

me saying, you know, under the

1:38:43

current

1:38:43

policies,

1:38:44

they'll never accept something like

1:38:45

GrapheneOS for a variety of reasons.

1:38:47

And that's certainly true.

1:38:49

But Google's policies,

1:38:50

especially when it comes to like Google

1:38:52

Play certification,

1:38:53

they're not like an inherent law of the

1:38:56

universe that's written in stone, right?

1:38:57

It's Google's.

1:38:59

It's up to Google's whims to decide what

1:39:01

they allow for Google Play or not.

1:39:02

And maybe

1:39:04

Maybe Motorola can, like, whisper in

1:39:06

Google's ear through some back channels

1:39:09

and get some changes made to the Google

1:39:11

Play policies and somehow get an exception

1:39:14

or a rule change or something for Graphene

1:39:16

OS that would get that approved.

1:39:20

I don't know if that'll happen.

1:39:22

I would agree.

1:39:23

It's probably extremely unlikely,

1:39:25

but it's probably the closest we've come

1:39:26

to it.

1:39:27

And if that's possible,

1:39:28

that would be huge for

1:39:29

Graphene, you know,

1:39:30

because I know a huge issue that people

1:39:32

have is especially banking apps,

1:39:35

but other apps that

1:39:37

unnecessarily use Google Play's SafetyNet

1:39:40

API and other services that don't work

1:39:43

on uncertified products like GrapheneOS.

1:39:46

So that could be

1:39:48

a game changer if Google decides to allow

1:39:51

sandboxed Google Play into that program.

1:39:54

Seems unlikely, but you know, you never know.

1:39:57

I can always hope. Yeah, for sure.

1:40:02

We had another question early on.

1:40:04

Question for question time.

1:40:05

How do I choose a laptop?

1:40:06

Any suggestions?

1:40:07

Definitely going to be a Linux distro.

1:40:10

We do have a page about how to

1:40:12

pick your laptop hardware, don't we?

1:40:18

I can't remember off the top of my

1:40:19

head.

1:40:19

I'm going to check. It sounds familiar,

1:40:22

but I feel like we could have had

1:40:24

an article about it.

1:40:26

I would say, I don't know,

1:40:28

it really depends on what you're looking

1:40:30

for.

1:40:30

Because there's such a

1:40:31

wide variety of hardware out there.

1:40:32

And thankfully, you know,

1:40:34

Linux will run on like all of that.

1:40:38

So you have a lot of options.

1:40:40

For me, it'd be like very challenging.

1:40:44

I think to use any of the Intel

1:40:46

and AMD stuff lately,

1:40:47

just because like power efficiency has

1:40:50

turned out to be a really big,

1:40:52

big thing for me.

1:40:53

It's nice to have like a laptop that

1:40:54

lasts all day.

1:40:55

And something like Asahi Linux on a Mac

1:41:00

is probably one of my favorite Linux

1:41:02

experiences.

1:41:03

But there are definitely limitations to

1:41:05

that.

1:41:05

So it's not something I could recommend to

1:41:06

anyone.

1:41:08

Or rather, not to everyone, certainly.

1:41:11

When it comes to other stuff, I know,

1:41:14

and it looks like Nate just pulled that

1:41:15

up,

1:41:16

we do have a general guide on choosing

1:41:18

hardware,

1:41:18

and there is a picking computer section.

1:41:20

So you could take a look at that

1:41:22

for some

1:41:23

advice.

1:41:24

There's a variety of things to look for,

1:41:27

like researching how easy it is to patch

1:41:31

the firmware on your computer from Linux,

1:41:33

because that is important for security

1:41:35

reasons,

1:41:38

or what kind of secure element they have

1:41:41

for encryption.

1:41:42

Typically,

1:41:43

all of these will come with that built

1:41:45

into the CPU,

1:41:46

so it's not a huge concern.

1:41:49

But yeah, definitely, whatever provider

1:41:52

or whatever manufacturer you decide you

1:41:55

probably want to go with, I would research

1:41:59

their track record with non-OS stuff,

1:42:02

like firmware updates, for example,

1:42:04

that you might want to have on Linux,

1:42:06

because a lot of that will

1:42:07

come down to the specific manufacturer. But

1:42:09

as far as specific brands of what

1:42:13

laptops you can choose, I don't have

1:42:16

Any specific advice?

1:42:17

Unfortunately,

1:42:19

that would be a good question.

1:42:19

Like,

1:42:20

if you have a lot of specific

1:42:23

requirements,

1:42:24

or want to share more information about

1:42:26

that,

1:42:26

I think if you ask on our forum

1:42:28

at discuss.privacyguides.net,

1:42:31

and can share a bit more about what

1:42:33

exactly you're looking for,

1:42:34

what's important to you in a laptop,

1:42:36

I think that the community would probably

1:42:38

be able to come up with a lot

1:42:40

of answers for you that you could

1:42:42

consider.

1:42:44

Yeah,

1:42:45

that was kind of my thought while you

1:42:46

were talking.

1:42:47

I feel like which Linux distro is going

1:42:49

to determine a lot, and your

1:42:50

threat model and everything.

1:42:51

Right.

1:42:52

And somebody else here.

1:42:53

Um,

1:42:53

somebody else has price limit and then

1:42:55

shared a link to Nova Custom, which, uh,

1:42:57

Yeah.

1:42:58

Nova Custom sent Henry from Techlore one.

1:43:00

He was telling me he's put the video

1:43:01

out by now,

1:43:02

but they sent him a laptop that had

1:43:03

like ninety two gigs of RAM or something

1:43:06

and this was way before the RAM

1:43:07

shortage.

1:43:08

And I was just like, bro,

1:43:09

what are you going to do with it

1:43:10

when you're done?

1:43:10

You want to give it to me?

1:43:11

But yeah,

1:43:12

it really depends because like I'm like

1:43:14

I'm a Qubes user, for example.

1:43:15

Right.

1:43:15

And so if I'm going to buy a

1:43:17

laptop,

1:43:18

it has to meet very specific requirements

1:43:20

about the TPM and it has to have

1:43:21

an SSD and it has to have a

1:43:23

certain amount of RAM and

1:43:24

apparently also has to have a more modern

1:43:25

processor,

1:43:26

because older processors really slow it

1:43:28

down.

1:43:30

Versus if you're going to install

1:43:31

something like Ubuntu,

1:43:33

that'll run on anything,

1:43:35

though we don't recommend Ubuntu.

1:43:36

There are better distros out there.

1:43:38

But maybe you have a use case,

1:43:40

and for some reason,

1:43:40

that's the one you want to use.

1:43:42

Yeah,

1:43:42

I think I'm glad you mentioned the forum.

1:43:45

Like definitely if you post in the forum

1:43:47

and you're like, hey,

1:43:48

here's my threat model.

1:43:49

Here's my budget.

1:43:50

Here's kind of what my values are.

1:43:52

I'm sure people will give you all kinds

1:43:54

of every perspective you can imagine about

1:43:56

the pros and cons of everything out there.

1:43:59

So.

1:44:02

Moving on from that question, first name,

1:44:04

last name in our chat asked if there's

1:44:07

any statistics we can share about the

1:44:10

growth of the community or anything like

1:44:12

that.

1:44:13

I could pull up and take a look

1:44:14

at some of this really quick.

1:44:16

Unfortunately,

1:44:17

some of our platforms that we were using

1:44:18

for just tracking the amount of page views

1:44:21

and stuff that we get aren't fully working

1:44:24

right now.

1:44:25

But overall, for the past year,

1:44:33

Everything has been trending up by

1:44:35

quite a bit.

1:44:35

If I look at our forum, for example,

1:44:38

we typically averaged around seven

1:44:41

hundred thousand page views a month, to

1:44:46

pretty much over a million. In January, one

1:44:50

point two million.

1:44:51

But every month.

1:44:52

That's on the forum anyways,

1:44:54

and that excludes known crawlers and other

1:44:57

traffic.

1:44:58

So that's very good.

1:44:59

We've also seen the amount of people who

1:45:04

just log in every day and post often.

1:45:08

That has gone up quite a bit.

1:45:12

so yeah we don't have like a ton

1:45:14

of super detailed stats beyond that

1:45:17

because uh we don't track a lot of

1:45:21

that stuff but in terms of uh page

1:45:24

views um that's up and i could look

1:45:27

at like the number of members um that

1:45:31

we have uh who sign up for either

1:45:34

being a paid member and supporting our

1:45:36

work or just signing up for a newsletter

1:45:38

to get updates from our website about

1:45:41

either about the show or new articles or

1:45:43

videos that we publish.

1:45:45

And all of that is going up.

1:45:48

You can see the total number of

1:45:50

people who signed up for those

1:45:52

notifications is up.

1:45:54

Seventeen percent from just last month.

1:45:56

So yeah,

1:45:59

everything is on an upswing and we hope

1:46:01

to continue putting out

1:46:03

even more content that people find super

1:46:06

useful in their privacy journeys.

1:46:09

And we hope that people will stick around

1:46:11

because I think we've got a lot of

1:46:12

good stuff going on,

1:46:14

on our forum and in our communities that

1:46:16

make it just a great place to discuss

1:46:19

all of this stuff and hang out without

1:46:23

any kind of negativity across the board,

1:46:26

which I think is a really great thing.

1:46:31

The next comment actually came from that

1:46:33

same user.

1:46:34

They said,

1:46:34

a big story this week was the LLM

1:46:36

de-anonymization.

1:46:37

I did see that passed around a couple

1:46:39

times.

1:46:40

I was going to tell you to go

1:46:42

check out privacyguides.org slash news,

1:46:44

which I do still recommend.

1:46:46

But weirdly, I did not see that story.

1:46:48

We did not write about it.

1:46:49

Or maybe it's queued up and it hasn't

1:46:50

published yet.

1:46:51

Because I swear I thought I saw Freya

1:46:53

post that one in the news chat.

1:46:56

Yeah.

1:46:58

I'm actually looking here at the... Oh,

1:47:00

no, we haven't written about that one.

1:47:02

Crazy.

1:47:02

We need to write about that one.

1:47:05

But yeah, another...

1:47:08

I would say as far as,

1:47:10

I know we keep pushing the forum,

1:47:11

but even if you don't want to sign

1:47:14

up for it and you don't want to

1:47:14

participate, the forum works with RSS.

1:47:17

So I actually,

1:47:18

long before I came to work for Privacy

1:47:20

Guides,

1:47:21

I've had the news section of the forum

1:47:22

in my RSS feed just kind of as

1:47:24

a safety net in case there are any articles

1:47:26

that don't show up in my usual news

1:47:27

feed.

1:47:28

If somebody posts about it on the forum,

1:47:30

I will get it in my RSS feed

1:47:33

and I'll be able to

1:47:36

go ahead and see that.
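The workflow described here can be sketched in a few lines of Python. This is a minimal, illustrative example: it assumes the forum is a standard Discourse instance, which typically exposes an RSS feed when you append .rss to a category URL, so the exact feed URL below is an assumption to verify for yourself.

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen  # only needed for live fetches

# Hypothetical feed URL -- Discourse forums generally expose RSS by
# appending ".rss" to a category URL; check the actual forum for the path.
FEED_URL = "https://discuss.privacyguides.net/c/news.rss"

def parse_feed(xml_text: str) -> list[dict]:
    """Extract (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [
        {"title": item.findtext("title"), "link": item.findtext("link")}
        for item in root.iter("item")
    ]

def fetch_feed(url: str = FEED_URL) -> list[dict]:
    # Network call; run this only when you want the live feed.
    with urlopen(url) as resp:
        return parse_feed(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Offline demo with an inline sample document.
    sample = (
        '<rss version="2.0"><channel>'
        "<item><title>Example story</title>"
        "<link>https://example.org/a</link></item>"
        "</channel></rss>"
    )
    for entry in parse_feed(sample):
        print(entry["title"], "->", entry["link"])
```

In practice, any RSS reader does this for you; the point is just that the forum's news section is machine-readable without an account.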

1:47:37

So I think that one probably got posted

1:47:39

because I've seen it in a few different

1:47:41

places.

1:47:42

But I mean, if nobody did,

1:47:43

then you can go and post it and

1:47:44

be that person.

1:47:45

So yeah, that was a big story.

1:47:47

Jonah said earlier in the show,

1:47:48

like there's so many stories,

1:47:50

it's hard to...

1:47:51

kind of pick. Like, I'm not kidding.

1:47:53

Every week we end up with like seven

1:47:56

stories and we're like,

1:47:57

we have to trim this down or this

1:47:59

is going to be like a ten hour

1:48:00

podcast.

1:48:01

So it's really hard.

1:48:02

Yeah,

1:48:04

it's really hard to pick which stories to

1:48:05

prioritize.

1:48:06

And, you know, I'll be honest,

1:48:08

like even me,

1:48:09

sometimes when I'm editing the clips over

1:48:11

the weekend, I'm like, man, you know,

1:48:12

I kind of wish we'd talked about this

1:48:13

other story.

1:48:15

So it

1:48:16

happens. Sometimes it's hard to

1:48:17

prioritize them.

1:48:18

There's a lot of stories out there.

1:48:19

So definitely find reliable sources,

1:48:23

whether that's the forum,

1:48:23

whether that's privacyguides.org slash

1:48:25

news, or a trusted outlet.

1:48:28

We don't cover it all.

1:48:29

We try to bring you

1:48:31

the big,

1:48:32

important ones.

1:48:35

Yeah.

1:48:35

And I'll also say on this show,

1:48:37

like the stories that we can

1:48:39

discuss and have

1:48:42

good things to add to, and probably that

1:48:45

people have questions about that we can

1:48:46

answer on the live stream.

1:48:48

We're certainly aware that we don't

1:48:50

cover a ton of stories.

1:48:51

I know there are other shows that people

1:48:55

might find similar to this one that really

1:48:57

are more news focused and kind of cover

1:48:58

every single headline throughout the week.

1:49:00

And we explicitly haven't been doing that.

1:49:03

But we know that people want to stay

1:49:05

up to date with that stuff.

1:49:06

So we are thinking about more ways

1:49:08

that we can get

1:49:10

just headlines in front of people and

1:49:13

get that content shared even if we don't

1:49:14

discuss it here on the show, whether that's

1:49:16

through privacyguides.org slash news or

1:49:19

other things that we

1:49:22

are thinking about working on, that we can

1:49:25

maybe see if people are interested in soon.

1:49:27

So yeah.

1:49:32

Yeah,

1:49:32

I was just going to add on to

1:49:32

that real quick.

1:49:33

Even back when I was at Surveillance

1:49:34

Report, where we regularly covered

1:49:36

thirty to forty stories a week,

1:49:38

there were still times that I was just

1:49:39

like, man, we missed this story.

1:49:41

We should have covered this story.

1:49:43

It is so hard to pick which stories

1:49:46

are the most important ones that people

1:49:48

are going to resonate with.

1:49:50

Going back to community statistics really

1:49:53

quick,

1:49:53

Jordan just shared that we hit nine

1:49:55

thousand subscribers on YouTube.

1:49:56

So that's cool.

1:49:58

Over fourteen hundred of those subscribers

1:50:00

are from just the last month.

1:50:01

So that's definitely growing quite a bit.

1:50:05

So, yeah.

1:50:05

And of course,

1:50:07

we're constantly getting new followers,

1:50:08

whether it's on PeerTube or Mastodon or

1:50:10

other social media platforms, too.

1:50:12

So all of those

1:50:14

numbers are up as well and continue to

1:50:15

grow.

1:50:16

So I'm very happy that more people are

1:50:19

becoming interested in all of the topics

1:50:21

that we're talking about here because I

1:50:22

think it's important.

1:50:25

Yeah, for sure.

1:50:27

We got a quick question from Twitter.

1:50:30

Do you guys see mixnets and DAITA-type

1:50:32

traffic,

1:50:32

like the Mullvad DAITA traffic

1:50:35

obfuscation tools becoming popular now

1:50:38

that countries are coming for VPNs more

1:50:40

and more?

1:50:41

And then follow up,

1:50:42

do you think these tools should be in

1:50:43

more threat models?

1:50:46

I don't know much about mixnets.

1:50:49

I know DAITA...

1:50:52

I think DAITA was designed... I mean,

1:50:54

it's in the name.

1:50:54

DAITA was designed more to combat AI

1:50:57

traffic correlation as opposed to

1:50:59

censorship.
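To illustrate the idea behind tools like DAITA: to be clear, this is a toy sketch, not Mullvad's actual algorithm. Traffic-correlation attacks often key on distinctive packet sizes, so padding every packet to one fixed size removes that signal from an observer's view.

```python
# Toy illustration of size padding, one idea used by traffic
# obfuscation tools. NOT Mullvad's real DAITA implementation;
# it only shows why uniform sizes defeat size-based correlation.

PAD_TO = 1024  # pad every packet to a fixed size (bytes)

def pad_packet(payload: bytes, size: int = PAD_TO) -> bytes:
    """Pad a payload to a fixed length. Real tools also split
    oversized packets and inject decoy ("cover") traffic on a
    schedule, so timing leaks less too."""
    if len(payload) > size:
        raise ValueError("payload larger than pad size; would need splitting")
    return payload + b"\x00" * (size - len(payload))

if __name__ == "__main__":
    stream = [b"GET /index.html", b"tiny", b"A" * 900]
    padded = [pad_packet(p) for p in stream]
    print([len(p) for p in stream])   # distinctive, fingerprintable sizes
    print([len(p) for p in padded])   # [1024, 1024, 1024]
```

The padded stream is indistinguishable by size alone, at the cost of bandwidth overhead, which is the usual trade-off these tools make.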

1:51:04

Personally,

1:51:05

I would like to see something like that

1:51:07

become more common just with the rise of

1:51:11

AI.

1:51:12

Earlier,

1:51:12

we talked about how historically defense

1:51:15

contractors struggled with...

1:51:18

having too much data and not knowing how

1:51:21

to parse through it.

1:51:22

And for better or worse,

1:51:24

I think that is coming to an end

1:51:26

with AI,

1:51:28

which is why I brought up the danger

1:51:30

of trusting AI implicitly,

1:51:32

that AI just says, well,

1:51:33

here's all this traffic correlation,

1:51:35

so he's guilty.

1:51:36

And if nobody's double-checking that,

1:51:38

things are going to get real bad real

1:51:40

quick.

1:51:40

I mean, yeah.

1:51:42

But I think...

1:51:46

The last I heard,

1:51:48

the UK was very heavily favoring

1:51:51

regulating VPNs.

1:51:52

And I think if that happens,

1:51:53

we're definitely going to see a spike in

1:51:56

censorship obfuscation and resistance

1:51:58

tools for sure.

1:51:59

But that's just my two cents.

1:52:03

No, for sure.

1:52:04

The tricky thing with all of this,

1:52:06

with all the tools like that,

1:52:08

is that they typically are easy to detect,

1:52:13

just from your ISP's standpoint.

1:52:15

So, similar to a VPN...

1:52:19

It's challenging to see what you're doing

1:52:22

with those connections and even more

1:52:24

challenging with something like Tor or

1:52:28

other mixnets because there isn't a single

1:52:30

VPN provider that legal authorities can go

1:52:32

after.

1:52:35

Hiding what you're doing,

1:52:37

hiding just the fact that you're trying to

1:52:41

maintain your privacy and trying to

1:52:43

protect your security and your data on the

1:52:45

internet,

1:52:45

hiding the fact that you want to do

1:52:47

all of that in general from your ISP

1:52:48

is a very challenging thing to do.

1:52:51

And again,

1:52:53

similar to what I talked about

1:52:55

earlier in the show,

1:52:57

I think it's just incredibly important to

1:53:00

remember that

1:53:03

this isn't only a technical issue that can

1:53:05

be solved with something like Mixnets.

1:53:06

It's really a case where people need to

1:53:09

demand from their governments and from

1:53:14

politicians that the right to maintain

1:53:17

your security online and the right to

1:53:19

maintain your privacy when you're browsing

1:53:21

the web and avoid trackers and all of

1:53:24

this stuff

1:53:25

That is something that needs to be

1:53:28

enshrined in law and upheld by these

1:53:31

institutions.

1:53:33

It's not something that technical people

1:53:36

are going to be able to just thwart

1:53:38

forever if the governments are really

1:53:40

going after this super hard.

1:53:44

And so it's very challenging, I think,

1:53:47

in a lot of places.

1:53:48

And if you're in a particularly oppressive

1:53:52

regime,

1:53:52

you don't have a lot of options and

1:53:54

you kind of just have to

1:53:56

go with what works,

1:53:57

but we're seeing all of these laws like

1:54:02

age verification and other privacy

1:54:04

invasive things, proposed VPN bans,

1:54:08

et cetera,

1:54:09

happening in countries that are supposedly

1:54:11

very democratic and should give you a lot

1:54:13

of control.

1:54:14

And these are wildly unpopular ideas,

1:54:18

especially when people fully understand

1:54:20

what these laws are asking for.

1:54:23

I think people need to recognize that you

1:54:25

actually do have a lot of power if

1:54:26

you don't want these laws to be passed

1:54:28

and you need to demand more heavily of

1:54:35

your own government that this sort of law

1:54:38

is completely unacceptable.

1:54:41

That is what we have to

1:54:44

do in a democracy at the end of

1:54:46

the day.

1:54:47

And more people need to take up the

1:54:48

mantle on that.

1:54:53

Yeah,

1:54:54

I don't have much to add to that.

1:55:00

I did see one quick follow-up.

1:55:01

I think this is probably our last question

1:55:03

here.

1:55:03

But first name, last name,

1:55:05

who asked the laptop question earlier.

1:55:07

They said they were thinking about Qubes.

1:55:09

Yeah,

1:55:09

somebody else mentioned the HSI score.

1:55:13

Qubes does have that;

1:55:14

it should be fairly easy to find.

1:55:15

I think

1:55:16

if you just go to their

1:55:17

documentation,

1:55:18

it's like one of the first topics.

1:55:19

They have really good documentation for

1:55:21

Qubes.

1:55:22

They have a list of all the different

1:55:26

laptops they've tested,

1:55:28

whether or not they're compatible,

1:55:29

which ones are.

1:55:31

They'll even tell you which components:

1:55:35

The graphics card drivers don't work,

1:55:37

but the CPU works.

1:55:39

It gets pretty granular,

1:55:41

and you can look up whatever specific

1:55:43

laptop you're thinking about getting or

1:55:44

desktop or whatever.

1:55:46

And they'll tell you if it's compatible.

1:55:48

They'll also tell you if it's been tested

1:55:49

or not.

1:55:50

Like, yes,

1:55:51

one of our team members bought this and

1:55:52

confirmed it.

1:55:53

It works.

1:55:54

Yes, it works.

1:55:55

But there's caveats or like, no,

1:55:57

it doesn't work or like it should work,

1:55:58

but we haven't tested it.

1:55:59

It's really good.

1:56:00

So I would definitely start there for

1:56:02

sure.

1:56:03

So.

1:56:04

Not a question,

1:56:05

but an anonymous viewer said that Em's

1:56:07

new activism project is going to be a

1:56:09

read for the weekend.

1:56:10

Absolutely.

1:56:11

I think this is an incredible resource,

1:56:13

especially if you are interested in some

1:56:15

of the stuff I was just talking about

1:56:16

being an activist or an advocate for

1:56:19

privacy rights in your area,

1:56:21

1:56:22

or starting a local organization like EFF

1:56:26

Austin, for example,

1:56:27

where we're going to be next week.

1:56:30

But organizing groups like that,

1:56:32

I think a lot of the resources that

1:56:35

Em has published at privacyguides.org

1:56:38

slash activism are super useful.

1:56:42

And even if you're not sure if

1:56:44

you are a privacy activist or you're not

1:56:46

super into that,

1:56:47

I think a lot of it is very

1:56:49

good advice.

1:56:50

If you are interested in any of these

1:56:52

topics, it's definitely worth a read.

1:56:55

So yeah, totally check it out.

1:57:00

Yep.

1:57:01

Yeah, I saw that comment too.

1:57:03

Thank you.

1:57:04

We're super excited about it back here as

1:57:05

I'm sure you guys can tell.

1:57:07

So

1:57:08

But I think that's all we've got,

1:57:11

actually.

1:57:13

So I guess we'll go ahead and call

1:57:17

it here.

1:57:18

All right.

1:57:18

Well,

1:57:20

all of the updates from This Week in

1:57:22

Privacy,

1:57:23

we share them on our website on the

1:57:24

blog every week.

1:57:25

So you can sign up for the newsletter

1:57:28

or you can subscribe with your favorite

1:57:30

RSS reader if you want to stay tuned

1:57:31

and get links to all of the stuff

1:57:33

that we talked about

1:57:35

in the show.

1:57:36

For people who prefer an audio version of

1:57:39

this,

1:57:39

we do put the audio version of this

1:57:42

recording on all podcast platforms and

1:57:46

RSS.

1:57:47

We also sync the video recording of this

1:57:49

to PeerTube after the fact,

1:57:51

so you can find this video later without

1:57:54

having to go to YouTube if you don't

1:57:56

want to.

1:57:57

Privacy Guides is an impartial nonprofit

1:58:00

organization that is focused on building a

1:58:03

strong privacy advocacy community and

1:58:05

delivering the best digital privacy and

1:58:07

consumer technology rights advice on the

1:58:09

internet.

1:58:10

If you want to support our mission,

1:58:11

then you can make a donation on our

1:58:13

website at privacyguides.org slash donate.

1:58:17

You can contribute using standard fiat

1:58:19

currency via debit or credit card,

1:58:21

or you can opt to donate anonymously using

1:58:22

Monero,

1:58:24

or you can donate with your favorite

1:58:26

cryptocurrency, whatever that may be.

1:58:28

Becoming a paid member can unlock

1:58:31

exclusive perks like early access to video

1:58:34

content that we publish on our channel,

1:58:36

and priority during the live stream Q&A.

1:58:40

You'll also get a cool badge on your

1:58:42

profile in the Privacy Guides forum and the

1:58:44

warm,

1:58:44

fuzzy feeling of supporting independent

1:58:47

media.

1:58:48

Thank you all for watching,

1:58:50

and we will see you next week live

1:58:53

from Austin, Texas.

1:58:55

Very exciting.

1:58:56

See you, everyone.