WhatsApp Is Not Really Encrypted?
Ep. 38


Episode description

WhatsApp encryption is coming under scrutiny, French lawmakers are pushing an under-15s social media ban, TikTok’s privacy policy has gotten worse after it was bought by American investors, and much more! Join us for This Week In Privacy #38.

Download transcript (.vtt)
0:22

Welcome back, everyone,

0:23

to This Week in Privacy,

0:25

our weekly series where we discuss the

0:27

latest updates with what we're working on

0:29

within the Privacy Guides community,

0:31

and this week's top stories in the data

0:33

privacy and cybersecurity space,

0:35

including allegations that WhatsApp is not

0:38

end-to-end encrypted,

0:39

France and the UK restricting some online

0:42

tools for minors,

0:43

TikTok's new US ownership, and more.

0:46

I'm Jonah, and with me today is Nate.

0:49

How are you doing today, Nate?

0:52

I'm good.

0:53

I'm good.

0:53

How are you?

0:54

I'm doing excellent.

0:55

Thank you.

0:56

For those of you who don't know,

0:58

Privacy Guides is a nonprofit which

1:00

researches and shares privacy-related

1:02

information,

1:03

and we facilitate a community on our forum

1:06

and Matrix where people can ask questions

1:08

and get advice about staying private

1:10

online and preserving their digital

1:12

rights.

1:13

Before we dive into our first WhatsApp

1:15

stories,

1:16

I want to give some quick updates with

1:18

what we've been working on at Privacy Guides

1:19

this week.

1:20

Why don't I start off by handing it

1:22

over to you, Nate,

1:23

to talk about the video side of things?

1:26

Sure.

1:27

There's not too much new with the videos.

1:31

Let's see,

1:31

the smartphone course for Android that...

1:36

We're adding on to this,

1:36

so it's hard for me to know how

1:38

to describe it.

1:38

The smartphone course we're doing,

1:40

the intermediate tier,

1:41

the Android section is done.

1:44

And I believe we're just trying to work

1:45

out some technical issues with PeerTube.

1:48

Once it's on PeerTube,

1:49

we will be posting that for members.

1:51

The iOS version we're hoping to have done

1:53

next week.

1:55

And Jordan has begun editing the private

2:01

browsing video that I've been talking

2:03

about.

2:03

And that will hopefully also be coming out

2:05

here

2:06

soon. And in the meantime, I have moved

2:08

on to scripting a video about private

2:11

messaging, so I'm excited to share that one

2:13

with you guys. And, um, again, we've just

2:16

been putting out a lot of clips. We

2:17

started putting out, um,

2:19

horizontal clips as well,

2:20

like regular aspect ratio clips on the

2:23

Privacy Guides Shorts channel.

2:24

So I know a lot of you guys,

2:26

it's a really common thing when we do

2:27

these kinds of news shows that people want

2:30

something where they can quickly

2:32

and easily share just that story with people.

2:34

So definitely check that out if that's

2:35

something that you would like.

2:38

Nice. Um, in other Privacy Guides news,

2:42

things have been, again, pretty active

2:45

on our forum lately. Lots of good

2:46

discussions going on. I know that you, Nate,

2:49

and Freya as well have been working on

2:52

a lot of news brief articles lately, um,

2:54

so those have been coming out. Um, other

2:56

stuff is still being worked on, again,

2:59

behind the scenes, but I know I've been

3:00

talking to Em about a big project that

3:04

she has been working on for the past

3:05

few weeks now,

3:06

and that's coming out relatively soon.

3:08

Hopefully,

3:09

within the next few weeks or so,

3:11

we'll have more updates to share with you

3:13

on the stream about that.

3:15

But yeah, lots of progress is being made,

3:18

lots of big plans for the site and

3:21

for the videos in twenty twenty six,

3:24

especially as we get into the new year.

3:27

I feel like a lot of people who

3:30

have been working on all this stuff have

3:32

been feeling pushed pretty hard lately.

3:33

We've been doing a lot of work,

3:35

but hopefully it all pays off and people

3:38

like it and we can reach new people

3:40

with all of this privacy stuff.

3:43

But in terms of specific updates,

3:45

I don't think we've pushed a new release

3:47

of the website on GitHub or anything like

3:50

that.

3:50

So no changes to the recommendations or

3:53

anything so far.

3:54

But yeah,

3:56

all of that stuff is still being worked

3:57

on in the background.

3:58

And if you are hoping to see something

4:00

in particular, definitely join our forum,

4:03

join the community and talk with us a

4:06

bit about what you want to see because

4:09

a lot of the stuff that we're doing

4:10

is really

4:12

built on this community and what you all

4:14

want to see and what would make the

4:16

most impact in the privacy rights space.

4:22

With all these updates out of the way,

4:23

I think we can move on to some

4:25

of the biggest news stories that we've

4:28

seen in privacy and security in the past

4:30

week.

4:30

I know you wanted to start off with

4:32

the headline story here,

4:34

so why don't I pass it off to

4:35

you, Nate, to talk about that.

4:39

Yeah.

4:39

Sounds good.

4:40

Let's talk about WhatsApp.

4:42

So, uh, WhatsApp for,

4:44

I'm sure most of our listeners know,

4:46

you know,

4:46

it's an encrypted messenger brought to you

4:48

from Meta,

4:49

the same people who make Facebook and

4:51

Instagram and, well, bought Instagram.

4:54

And, um,

4:56

Yeah, WhatsApp, as far as we know,

4:59

is end-to-end encrypted and it uses the

5:01

Signal protocol.

5:02

So there are a lot of concerns about

5:04

the metadata collection of WhatsApp.

5:05

But up until now, we've always believed,

5:07

well, the content itself is encrypted,

5:09

which is better than nothing.

5:11

Although there is now a new lawsuit that

5:12

alleges that, no, actually,

5:14

that's not the case.

5:14

And they don't mean that in like a,

5:16

well,

5:16

technically kind of sort of like they mean

5:18

it literally like, no,

5:19

it is not end-to-end encrypted.

5:21

And this lawsuit claims that if you are

5:25

a meta or WhatsApp employee,

5:27

all you need to do to access the

5:30

messages is you send a task, which is,

5:33

I guess,

5:33

just what they call like their internal

5:34

tickets or requests in Meta's internal

5:36

systems.

5:37

You send a task to a Meta engineer

5:39

and you just say, hey,

5:40

I need access to these users' messages for

5:42

whatever reason.

5:43

And they say that the engineering team

5:45

will then grant access,

5:46

often without any scrutiny at all,

5:48

and the worker's workstation will then

5:49

have a new window or widget where they

5:51

can pull up any WhatsApp user's messages

5:54

based on the user's ID number.

5:56

which is a unique number.

5:58

And then once they have the access,

5:59

they can read messages.

6:01

They say there's no separate decryption

6:03

step.

6:03

It's just available right there,

6:05

which I'll come back to that, I guess.

6:09

They say that these messages are

6:12

commingled with additional messages from

6:14

unencrypted sources.

6:16

Not entirely sure what that means.

6:17

Maybe they're talking about the DMA.

6:18

I think WhatsApp now has to

6:21

federate or combine with other third-party

6:23

messengers as part of the DMA.

6:24

But I could be wrong about that.

6:25

I'm speculating.

6:27

They also say messages appear almost as

6:29

soon as they are communicated.

6:30

So this is essentially a real-time tool.

6:32

And they say the access is unlimited and

6:34

you are able to go back indefinitely in

6:37

time to view messages all the way back

6:39

to the user's first messages when they

6:41

open the account,

6:42

including messages the users believe they

6:43

have deleted.

6:46

So it is important to note that this

6:48

lawsuit does not provide any technical

6:50

details to back up these claims.

6:53

They say that there were some courageous

6:55

whistleblowers and –

6:58

Yeah, I mean, obviously,

7:00

Meta is disputing this.

7:01

They say that these claims are, quote,

7:04

categorically false and absurd.

7:07

And they even say WhatsApp has been

7:08

encrypted using the Signal protocol for a

7:10

decade.

7:11

So, yeah,

7:14

this is definitely a big if-true kind of

7:16

moment.

7:17

And it's very concerning because WhatsApp

7:20

is...

7:21

incredibly popular around the world. Here

7:22

in the U.S., not so much,

7:23

but in other parts of the world,

7:24

in Europe, in Asia,

7:26

it's incredibly popular.

7:28

And again, up until now,

7:31

I want to reiterate,

7:31

we have concerns with WhatsApp.

7:33

I'm not saying it's great and you should

7:34

use it, but at least it was like,

7:35

well, you know,

7:36

at least the messages themselves are

7:37

encrypted.

7:37

And that's something that's more than we

7:38

can say for SMS or anything like that.

7:41

And apparently we can't even say that now,

7:42

uh, potentially.

7:44

So yeah,

7:45

The only other thought I wanted to point

7:47

out is I mentioned the whole widget thing

7:49

where they say there's no separate

7:50

decryption step.

7:51

In theory,

7:53

that doesn't necessarily mean the messages

7:55

aren't encrypted because maybe the

7:56

decryption is happening within the widget.

7:58

However,

7:59

the whole point of end-to-end encryption

8:01

is that that shouldn't be possible

8:03

regardless,

8:04

whether they're being stored in plain

8:05

text,

8:05

whether they're being stored encrypted.

8:09

The whole point of end-to-end encryption

8:10

is that the only people who should have

8:11

access are the ends.

8:13

And the server is not supposed to be

8:14

one of those ends, cough, cough, Zoom.

8:18

Sorry,

8:18

I had to take a shot at them

8:19

for that one.

8:21

But yeah, like I said,

8:22

this really is big if true and would

8:25

really be bad because of WhatsApp's really

8:28

large user base.

8:29

I think that's a really good point you

8:31

just said about how the server shouldn't

8:33

be one of the ends,

8:34

especially because we know with WhatsApp

8:37

in particular,

8:37

but also with some other end-to-end

8:39

encrypted messengers,

8:42

Most notably, iMessage,

8:43

unless you have Advanced Data Protection

8:45

enabled,

8:47

even if end-to-end encryption is working

8:49

perfectly fine,

8:52

very often they will have these backup

8:54

features which are not end-to-end

8:55

encrypted,

8:56

and that potentially acts as a backdoor

8:58

for service providers to get into it.

9:00

As far as I know,

9:00

that is the case with WhatsApp,

9:02

and that is a potential way that this

9:04

could be true without the...

9:07

end-to-end encryption of the transmission

9:09

itself being broken.

9:10

Maybe they have a way to access these

9:13

backups easily.

9:15

But again, that's speculation.

9:17

I think it's important to remember with

9:18

this story,

9:19

and actually one of our community members

9:21

just left a comment about this as well,

9:23

which is that this is...

9:27

a legal complaint right now. It's not at

9:30

the stage where any evidence has been

9:32

presented at all. There's no technical

9:34

evidence within the document that's been

9:36

shared that demonstrates any sort of back

9:38

door or that there's any sort of

9:42

compromise with the encryption of WhatsApp.

9:44

Um, so...

9:47

That being said,

9:50

with an app like WhatsApp that's closed

9:53

source and completely under the control of

9:56

Facebook, this is always a danger,

9:58

especially because Facebook has this

10:00

history of extensive metadata collection,

10:04

extensive...

10:06

you know, just general data collection,

10:07

actually.

10:08

And they are a company that's built

10:11

entirely on this like data driven

10:14

advertising model where collecting as much

10:16

data as they can is really paramount to

10:19

their business.

10:22

That creates a situation where it's very

10:26

hard to trust that they've implemented

10:28

end-to-end encryption correctly,

10:30

that they're not trying to weaken it

10:32

behind the scenes,

10:33

or that this is completely impossible.

10:34

So I don't think that this is...

10:37

out of the question.

10:38

But again, this hasn't been proven.

10:41

This just goes to show, I think,

10:42

that encryption in these apps needs to be

10:46

completely verifiable.

10:48

It needs to be open source,

10:50

it needs to use standard protocols,

10:52

and it can't just be a matter of

10:54

trust in the publisher of these apps

10:56

themselves.

10:57

Compared to a messenger like

10:59

Signal, which is open source,

11:02

like if this story had come out about

11:03

Signal right now, many people,

11:06

security experts,

11:07

auditors could be poring over that source

11:10

code,

11:11

trying to see if there's any way that

11:13

this could be true, right?

11:15

And that just isn't possible with

11:16

WhatsApp.

11:16

And that's the danger of using these

11:18

proprietary closed source applications

11:22

like this for your communications instead

11:24

of more secure alternatives.

11:28

Um,

11:30

the other thing I wanted to say about

11:31

this whole WhatsApp story is that even if

11:35

this isn't true,

11:36

even if they can't read your messages

11:38

themselves,

11:40

it's well known at this point that

11:42

WhatsApp is not doing anything in terms of

11:46

preventing the collection of metadata,

11:48

which is, um, you know,

11:49

data about who you're talking with data

11:51

about when you're using the app, um,

11:54

all of that stuff.

11:54

That's not like the message content

11:56

itself.

11:56

Right.

11:57

And so.

12:00

I mean,

12:01

there's this famous quote from a U.S.

12:04

government official where he goes like,

12:06

we kill people based on metadata, right?

12:09

Because they don't actually need the

12:12

content of your messages.

12:14

If people have access to this data,

12:16

they can infer a lot about you,

12:18

who you talk to.

12:19

That says a lot about what you're probably

12:21

talking about,

12:22

especially if you're doing it on a regular

12:24

basis or anything like that.

12:25

And all of that can be determined without

12:27

breaking end-to-end encryption at all.

12:28

And that's part of why I think WhatsApp

12:30

is such a dangerous application to use

12:33

because none of that metadata is

12:38

protected.

12:38

And Facebook is the last company on earth

12:43

who I would trust with that metadata in

12:46

question.

12:47

So...

12:49

Even if it's not true,

12:51

I would really encourage people to not use

12:54

WhatsApp personally.

12:55

But yeah, if it is true,

12:58

that is even worse.

13:00

We will have to keep an eye on

13:01

this story for sure because it is a

13:03

big if true moment.

13:05

Absolutely.

13:08

Yeah,

13:08

I agree with everything you just said.

13:09

This is one of the reasons, like,

13:11

we know that open source is not the

13:13

end-all be-all.

13:14

It doesn't guarantee that something is

13:15

private or secure.

13:16

But like you said,

13:17

if this was an allegation made against,

13:19

like, Signal or SimpleX, like,

13:22

this wouldn't even really be a story

13:24

because we could, I mean,

13:25

well, I certainly couldn't,

13:26

and I don't know enough code for that,

13:27

but

13:28

we as a community could easily just go

13:30

pore through the source code and be like

13:31

yeah, that's not what's happening here. We

13:32

can, we can prove that's not happening.

13:36

Um, but yeah. And, um, I

13:39

do thank that listener for pointing out

13:40

like, yes, these are allegations. They

13:42

haven't presented any evidence. I will be

13:43

really interested to see what sort of

13:46

evidence, uh, they present, if any. Um...

13:51

And yeah, it's, and like you said,

13:53

the metadata is so, so, so important.

13:56

The EFF has an amazing page where they

13:59

talk about the importance of metadata and

14:00

they use some examples, like, um,

14:02

some really sensitive examples.

14:04

Like if you call the suicide hotline and

14:06

sorry, I probably just got us demonetized,

14:08

but you know, you call the...

14:09

the hotline at two in the morning from

14:10

the Golden Gate Bridge.

14:12

Do you really need the contents of the

14:13

phone call to know what was probably going

14:15

on there?

14:16

And

14:17

Yeah,

14:17

I forget who it was that said that

14:19

quote,

14:20

but that is a really famous quote you

14:21

can find very easily with a web search.

14:24

And that's what he was saying is exactly

14:25

that.

14:26

Metadata is so revealing that you can make

14:29

a really convincing argument without the

14:30

content.

14:31

And at that point,

14:32

you can authorize a military strike.

14:34

Like, yeah.

14:34

I mean,

14:36

is it possible that something else is

14:38

going on?

14:38

Sure, of course, but...

14:40

Yeah, it's pretty wild.

14:46

It's good enough for most people, I think.

14:50

For sure.

14:51

This is why we encourage things like

14:54

Signal, SimpleX,

14:55

things that are metadata resistant,

14:57

are fully open source,

14:59

that go above and beyond to protect users

15:01

and their data, for sure.

15:03

Before we move on really quick,

15:04

we did get a question.

15:05

How do you convince your peers to stop

15:07

using WhatsApp?

15:08

Do you have any thoughts on that, Nate?

15:10

Send them this story.

15:13

I think – okay.

15:15

I mean there's – we get questions like

15:16

this all the time,

15:17

and they're great questions.

15:19

But unfortunately,

15:20

there is no one-size-fits-all answer.

15:22

If we had the one secret answer that

15:25

could get people to take their privacy

15:26

seriously, we would have used it by now.

15:29

But I think –

15:32

One thing,

15:32

so I'm thinking particularly in the

15:34

context of like Europeans and Asians,

15:37

like people where WhatsApp is like a

15:39

common way to connect with businesses and

15:41

stuff like that.

15:41

And it's, I hate to say it,

15:43

but it's quote unquote kind of a

15:44

necessity.

15:45

I think for those people, there's,

15:48

I forget where I heard this,

15:49

but somebody really floated the idea of

15:52

instead of trying to get people off

15:55

WhatsApp,

15:56

trying to get them onto something else in

16:00

the sense that like,

16:02

You can keep WhatsApp and you can use

16:04

it for when you have to contact a

16:07

business in Germany or something,

16:09

but all your friends are also on Signal

16:13

and we could use that too.

16:15

And then it turns into a thing where

16:16

like,

16:17

in my life, I still use SMS.

16:20

I still have some services.

16:21

I logged into a bank this morning that

16:22

texted me an SMS code,

16:24

not happy about it,

16:25

but there's nothing I'm going to do about

16:26

it.

16:26

So I still have to use SMS.

16:29

I can't stop using it,

16:30

but I've got ninety,

16:31

ninety-five percent of my friends and my

16:32

family on Signal.

16:34

And that's where I do most of my

16:35

work.

16:35

And so I think trying to

16:38

encourage people rather than like, oh,

16:39

stop using WhatsApp,

16:41

try to encourage people like, oh,

16:42

we're all over here on Signal.

16:44

And sorry,

16:45

this just popped into my head real quick

16:46

while I was talking.

16:48

I have had amazing success by focusing on

16:51

features.

16:52

Like I hate to say it,

16:53

but let's be honest.

16:54

Most people don't care about privacy and

16:55

security enough that that's their driving

16:57

factor to move.

16:59

It's just kind of a happy bonus.

17:01

So my wife used to be a wizard

17:04

at this and I swear to God,

17:05

she should teach a class.

17:06

She was so good at getting people to

17:08

switch to Signal and she never brought up

17:10

privacy and like, you know,

17:11

she'd mentioned like, yeah,

17:12

it's this encrypted messenger,

17:13

but like

17:14

It's got bigger, you know,

17:16

bigger attachment sizes and we're

17:18

comparing to SMS here.

17:19

So I don't know how it compares to

17:20

WhatsApp,

17:20

but like it's got bigger attachment sizes.

17:22

We can send GIFs,

17:23

we can send reactions because this is

17:24

before RCS was a thing.

17:26

It's like all these amazing quality of

17:28

life features.

17:28

And I swear to God, five minutes later,

17:30

I would get a text from the person

17:31

she was talking to like, Hey,

17:32

I'm on Signal now.

17:32

I'm like, damn,

17:34

I've been trying to get this person on

17:35

Signal for two years.

17:36

How did you do this?

17:38

So, yeah,

17:39

I think that's probably unfortunately how

17:40

we're going to get like, quote unquote,

17:42

the average person to want to switch is

17:43

by showing them the quality of life

17:46

advantages.

17:47

And, you know, yeah, absolutely.

17:49

Figure out what signal does better than

17:50

WhatsApp.

17:51

I totally agree.

17:53

You kind of stole the thing that I

17:54

wanted to talk about,

17:55

so I won't spend too long on it,

17:56

but I mean, that is definitely my, my,

18:00

I truly believe that like all of these

18:02

private alternatives,

18:03

pretty much in most of these sectors,

18:05

if you really look at them and if

18:06

you really start to use them,

18:09

they are also quality of life improvements

18:12

because

18:13

I think people are fed up with technology

18:16

and all of this surveillance and all of

18:17

these anti-features that like all of our

18:20

computers are doing things that we didn't

18:21

ask them to do now or AI is

18:24

being jammed into them or that it's

18:26

popping up.

18:27

It was annoying decades ago in Microsoft

18:30

Word with Clippy, and Copilot is just as

18:33

annoying now in all of those products.

18:36

People just want functional tools,

18:39

I think,

18:39

and focusing on that aspect

18:43

I think that's probably the best way to

18:44

drive adoption of these things because

18:46

it's just, it's simpler,

18:48

it's more reliable,

18:49

and it works better in my experience.

18:51

And finding ways that it works better than

18:53

WhatsApp and focusing on that rather than

18:58

trying to compare like the security

19:00

features that they already have,

19:01

like you said,

19:01

I think that that is the way to

19:03

go.

19:05

Moving on to our next story here.

19:08

This was reported by The Guardian.

19:10

French lawmakers vote to ban social media

19:14

use by under-fifteens.

19:18

So this starts out,

19:19

legislation which also bans mobile phones

19:22

in high schools would make France the

19:24

second country after Australia to take

19:27

such a step.

19:29

French lawmakers have passed a bill that

19:31

would ban social media use by

19:33

under-fifteens,

19:34

a move championed by President Emmanuel

19:36

Macron as a way to protect children from

19:39

excessive screen time.

19:41

The Lower National Assembly adopted the

19:43

text by a vote of one-thirty to twenty-one

19:46

in a lengthy overnight session from Monday

19:49

to Tuesday.

19:50

It will now go to the Senate,

19:51

France's upper house ahead of becoming

19:53

law.

19:55

The legislation,

19:56

which also provides a ban on mobile phones

19:58

in high schools,

19:58

which I think is a great idea personally

20:01

as a former educator,

20:04

would make France the second country to

20:07

take such a step following Australia's ban

20:09

for under-sixteens in

20:10

December.

20:11

As social media has grown,

20:12

so has concern that too much screen time

20:14

is harming child development and

20:15

contributing to mental health problems.

20:20

And so my big question coming out of

20:22

this, I think,

20:23

is how they plan to enforce this,

20:27

because we've talked a lot in the past

20:29

about age verification,

20:31

and I know this is a huge issue

20:33

in Australia right now,

20:34

as is

20:34

I mean, this article doesn't mention that,

20:36

but they're the first country to really

20:39

take an approach like this,

20:41

banning not only very young children,

20:44

but teenagers from social media.

20:49

That's very challenging to do without

20:53

these invasive age verification things

20:55

that we have always been very concerned

20:58

against, because age verification and ID

21:00

verification, it's not just a matter of

21:03

like affecting children. Um, it forces

21:07

everyone who's signing up for these

21:09

platforms to be verified, which includes

21:11

adults, so there's no opt-out process here.

21:13

Um, and that's a very dangerous privacy

21:17

concern: how these IDs are going to be

21:19

implemented in the first place, I think, and

21:20

also, um,

21:23

what data is going to be shared with

21:25

all of these platforms.

21:26

That's something that we'll have to keep

21:27

an eye on.

21:28

So I'm not seeing in this particular

21:31

article how the French plan to deal with

21:35

this question.

21:36

I know that this is a pretty common

21:39

issue with a lot of legislation like this,

21:41

where lawmakers kind of

21:44

put some arbitrary goal together without

21:47

any steps or plan on how to make

21:52

it happen in a reasonable, secure,

21:55

and private way.

21:59

But yeah,

22:00

that's my biggest question out of this

22:01

story.

22:03

Did you have a chance to look into

22:05

this story any more than that, Nate?

22:10

No, just the article itself that you read.

22:16

If I remember correctly,

22:17

I don't have it pulled up in front

22:18

of me like you do.

22:19

If I remember correctly,

22:20

they did say that towards the end,

22:22

what you said there, where it's like, oh,

22:24

they don't really have a plan for how

22:26

they're going to implement this.

22:28

That's something they're going to talk

22:29

about next week.

22:31

I do find that so funny.

22:33

Yeah.

22:35

My favorite example of this,

22:36

New York City did that a few years

22:38

ago where they banned the sale of internal

22:40

combustion engine cars.

22:42

And then like the next year they went,

22:45

hey,

22:45

where are we going to put all the

22:46

chargers for these electric vehicles?

22:49

And I'm just like, seriously,

22:51

nobody had that conversation.

22:53

Come on, guys.

22:55

So, you know, yeah,

22:56

it's and it goes to show.

22:58

just this is something I harp on a

23:00

lot personally.

23:01

It's like,

23:02

I think we need better tech literacy in

23:04

general worldwide.

23:05

And cause we, we have a lot of,

23:08

I know I've said this before,

23:09

but we have a lot of elderly people

23:10

who, you know, to their defense,

23:12

I get it.

23:13

Like a lot of them existed in the

23:15

days where like color TV was the newest,

23:16

fanciest thing.

23:17

And now we've got LLMs and that's,

23:20

that's a lot to wrap your head around.

23:22

But then on the other hand of this

23:23

end of the spectrum,

23:24

we've got these people who are,

23:26

I love to cruise r slash tales from

23:29

tech support on Reddit,

23:30

but it also really makes me facepalm

23:32

because on more than one occasion,

23:34

I've seen stories like my Wi-Fi isn't

23:36

working.

23:36

And then when they're like, okay, well,

23:38

are the lights on in the router?

23:39

And they're like,

23:39

I'm not at home right now.

23:41

Well, of course your Wi-Fi is not working.

23:43

Or, you know,

23:44

I've also seen the ones where they're

23:46

like, again, you know,

23:47

my computer won't turn on.

23:48

And it's like, okay, well,

23:48

is it plugged in?

23:49

I can't see under the desk.

23:50

The lights are off.

23:52

You don't say,

23:52

and I've seen those stories multiple

23:54

times.

23:55

And so it's like multiple people,

23:56

and that's just on Reddit,

23:57

multiple people are having this issue.

23:59

So I think my point being,

24:02

we need better tech literacy,

24:04

at least in the basics.

24:05

I'm not saying everybody needs to know how

24:06

to code and self-host their own

24:08

everything, but just to understand...

24:10

That like you were saying,

24:12

that's a big thing.

24:12

It's age verification.

24:13

No, it's not.

24:14

It's identity verification.

24:15

And just to give credit,

24:16

I got that one from Taylor Lorenz.

24:17

And, you know,

24:19

it's you're going to have to upload your

24:20

ID to people watching this in France,

24:22

regardless of your age and the UK,

24:24

which we'll talk about in a minute.

24:25

Like it's not just minors,

24:27

because how else are they supposed to know

24:29

that you're not a minor?

24:29

Yeah.

24:30

And a lot of these politicians just think

24:33

like, oh, that's a technical problem.

24:34

Just as one of my other friends likes

24:35

to say,

24:36

nerd harder and we'll find a solution.

24:38

And it's like, no, there is no solution.

24:39

There is no magic bullet.

24:41

Technology is not magic.

24:45

I feel like on this show,

24:46

we've talked quite a bit about age

24:48

verification and these ID verification

24:50

problems.

24:51

And I would definitely encourage people,

24:53

if you are unfamiliar with some of those

24:55

problems,

24:56

with some of this topic to check out

24:59

the interview that you did with Taylor

25:01

Lorenz,

25:01

because I think you really covered a lot

25:03

of good stuff that was more focused on

25:05

how that's going to affect the US and

25:07

some legislation that's going on.

25:09

But it really does apply to all of

25:10

this stuff going on

25:11

around the world.

25:13

We see it not just in France and

25:14

Australia, but the UK, for example,

25:16

has very strict ID verification laws.

25:19

It's becoming a real problem.

25:22

Ignoring the implementation side of this,

25:29

Do you have any opinions of your own

25:31

on this social media ban for children in

25:34

general?

25:35

Is that something you support just as a

25:41

general concept?

25:42

Or what do you think?

25:44

I mean,

25:44

I have some thoughts on this if you

25:45

don't,

25:45

but I'll pass it off to you first.

25:47

Oh, I have thoughts.

25:48

My thoughts... Honestly,

25:50

it's complicated because on the one

25:52

hand...

25:55

I, like most people,

25:57

I do not neatly fit into one particular

26:00

political label or another.

26:02

I have thoughts that are left-leaning and

26:05

thoughts that are right-leaning.

26:05

And one of my more libertarian thoughts is

26:07

that parents should be in charge of their

26:09

kids.

26:10

And I don't mean that in the sense

26:11

of like, well,

26:11

parents should just raise their kids.

26:13

I mean like parents should have the

26:14

autonomy and the freedom to decide if they

26:17

think their kids are ready to see a

26:18

movie, ready to play a game,

26:20

ready to engage with the internet.

26:22

I think parents should have that freedom.

26:24

But at the same time,

26:25

I think the internet is very distinctly

26:28

different from a movie or a video game.

26:30

Well, maybe not an online game,

26:32

but like an offline game in the sense

26:35

that the internet is a much,

26:37

much bigger place with much more

26:40

disturbing content on it.

26:42

I'm sure whatever the worst thing you've

26:43

seen in a horror movie is,

26:44

there's probably something worse on the

26:45

internet.

26:47

And I think it's a lot to ask

26:49

parents to constantly know,

26:52

even if it's the most well-behaved,

26:54

well-meaning, good kid,

26:56

that doesn't necessarily mean that the

26:58

people they're interacting with online are

27:00

also acting in good faith.

27:01

And I think that's a lot to put

27:04

parents in the position of having to

27:06

constantly try to monitor all of that.

27:09

Yeah.

27:11

It's tough because I don't want to take

27:13

away the autonomy of the parents to make

27:15

those choices,

27:16

but that's also a lot of work for

27:20

people who work full-time and may not

27:21

necessarily have the tech skills and

27:23

everything.

27:24

Um, just one more thing real quick.

27:25

Somebody here said in the comments that,

27:27

you know,

27:27

regulation is how we get clean water,

27:29

clean food, you know, safe food.

27:32

And it's obviously it's not perfect.

27:33

You know,

27:33

things get recalled all the time,

27:34

but I think we can all agree.

27:36

It's a lot better.

27:37

The term snake oil comes from the old

27:39

West days when people would literally roll

27:42

into town with literal snake oil and be

27:44

like, yeah,

27:44

this will cure your cancer and arthritis

27:47

and this, that,

27:48

and the other and everything.

27:49

And like,

27:49

just give me your money and I'm going

27:50

to be

27:51

Fifty miles away by the time you realize

27:53

I ripped you off and you don't have

27:54

a way to get me because of the

27:56

technology limitations at the time.

27:57

And so regulations aren't always bad,

28:00

but it's definitely – I don't know.

28:02

I think it's a mix.

28:03

I think there's pros and cons,

28:05

and I don't really know what the right

28:06

answer is.

28:07

That comment that you pointed out is a

28:08

good one because it does sum up a

28:10

bit of how I feel about social media,

28:12

which is –

28:14

If it's such a problem,

28:15

I think what we've seen in society is

28:18

that this isn't a problem that only

28:19

affects children.

28:21

And personally,

28:21

I don't think that children are...

28:25

significantly worse off than anyone else

28:27

who's constantly being exposed to these

28:29

social media algorithms.

28:30

And so from this perspective,

28:33

we have food regulation,

28:35

we have clean water regulation.

28:37

Could we have algorithmic regulation that

28:39

applies to all of these users of the

28:41

platform to protect ourselves in general

28:45

as a society against the harms of social

28:47

media?

28:47

I think that that could be

28:49

an approach because I think what people

28:54

don't think about or realize is that the

28:57

algorithms that make up something like

28:59

Facebook and Twitter are not

29:04

like they're not necessary for social

29:06

media to function. Um, by the way, Facebook,

29:09

you know, from a user's perspective, was

29:11

probably totally fine before they

29:12

implemented like the news feed and stuff

29:14

and people generally liked Twitter and the

29:17

chronological ordering of tweets from only

29:19

people that you follow, before, you know, all

29:22

this discovery stuff was baked in and it

29:23

really tried to

29:25

get you into these bubbles and echo

29:28

chambers that I think is causing a lot

29:30

of people harm, not just children.

29:32

And I think we're focusing on children

29:34

because children are, you know,

29:35

growing up in this and it's preventing

29:38

them from like building the skills that

29:40

they need to survive in adult society,

29:42

unfortunately,

29:43

that most people like adults already have.

29:46

But

29:48

Beyond that, like, I think the harms to

29:51

all people are pretty apparent from social

29:54

media. And I think that some social media

29:56

platforms, like, um, Mastodon, for example,

30:02

demonstrate that building communities that

30:06

you can interact with in a more healthy

30:08

way, um...

30:11

It's possible.

30:12

And I think that regulation on that front,

30:14

which would make these big tech companies

30:18

more

30:20

like Mastodon, for example,

30:21

and Mastodon isn't the perfect social

30:23

media, by the way,

30:24

but it's a direction that we could go

30:25

in.

30:28

We need to, I think,

30:29

get back to the internet being a place

30:32

where we share information and we share

30:37

knowledge and make it less of a place

30:41

where we just consume whatever information

30:44

the overlords of the internet have put on

30:46

the screen in front of us, right?

30:48

It needs to be more intentional,

30:49

and I think that that's the sort of

30:51

thing which could be done through

30:52

legislation that doesn't involve age

30:57

verification or anything like that,

30:58

because I think

31:02

Banning algorithms like that is really a

31:05

lot like enforcing clean food and water

31:10

regulations and that sort of thing.

31:13

It's a public health issue at the end

31:14

of the day.

31:18

I also definitely agree with the sentiment

31:21

that I've seen from some people in the

31:23

chat and also in this article from some

31:26

people that they interviewed,

31:27

which is that bans like this,

31:32

they are overly simplistic,

31:36

as this group said in the article here.

31:40

But it's also a form of digital

31:41

paternalism.

31:42

I think that is true.

31:45

It's not really the government's place to

31:51

make these decisions, I think,

31:53

in terms of parenting children.

31:54

And you got into this before.

31:56

And it is hard.

31:59

Exactly like you said,

32:00

there is a balance because there's so much

32:02

going on in people's lives.

32:06

It's so common for both parents to be

32:08

working now.

32:09

Some people have to work two jobs.

32:12

Society is just crazy at the moment,

32:14

right?

32:14

And so...

32:16

Yes, it's hard,

32:17

but I don't think that that should be

32:19

an excuse for the government to step in

32:21

in this way.

32:22

The government should be stepping in and

32:23

making people's lives easier so that they

32:26

have time to parent their children

32:27

themselves, right?

32:28

That would be an ideal outcome here.

32:30

What if we all made enough money where

32:33

we had the time to educate our children

32:36

properly?

32:36

What if the government tried to do

32:38

something about that?

32:39

I don't know.

32:41

Just a thought.

32:44

So, yeah,

32:45

I don't think it's... when somebody could

32:46

actually afford to stay home and be a

32:48

parent.

32:49

Right.

32:51

Now,

32:52

I especially, I really agree with, and Henry

32:56

used to say this a lot on Surveillance

32:57

Report too, exactly what you just said, that

32:59

we keep focusing on, like, social media

33:03

is bad for kids, social media is bad

33:04

for kids.

33:05

Social media is bad for everyone.

33:07

Yeah.

33:07

Social media is bad for me.

33:09

I notice it even like when I spend

33:11

too much time on social media,

33:13

I start to get that FOMO and I

33:14

start to, you know,

33:16

it really starts to consume me.

33:17

And I,

33:17

I'm sure that some people are more

33:19

susceptible to that than others.

33:20

Like I know some people that their

33:23

relationship with Facebook is like my

33:24

relationship with my phone where like half

33:26

the time I'm like, where is it?

33:27

I don't even remember.

33:28

But

33:30

These are companies that are paid full

33:32

time to figure out how can we keep

33:33

people on the platform longer.

33:35

That is their job.

33:36

I really want to stress that.

33:38

However good you are at your job,

33:40

that's how good they are.

33:43

It's just not a fair fight is what

33:45

I'm getting at.

33:46

It's really unfortunate that we keep

33:48

focusing on...

33:50

This is bad for kids and ignoring the

33:52

fact that everyone is impacted by this.

33:54

And I think it would be,

33:56

to your point,

33:56

if we're going to regulate anything,

33:57

we need to regulate the companies and the

33:59

algorithms and make it less harmful for

34:02

everyone.

34:03

And then maybe we wouldn't need to resort

34:05

to these extreme measures.

34:07

The other thing I would say about this

34:08

ban is that we are in the...

34:11

really early days of the internet still if

34:14

you really think about it. Um, and this

34:16

was, I didn't really think about this

34:18

a lot until I heard, um, I think

34:19

I was watching a Hank Green video where

34:21

he said something to this effect where

34:23

like, in terms of, like, society in

34:27

general, like, this many-to-many

34:29

communication system that we have with the

34:32

internet is extremely new. And if you

34:34

really think about it like most people

34:36

have probably only been

34:39

in like the social media mass

34:41

communication landscape for maybe ten,

34:44

fifteen years.

34:46

I know like you've probably been on the

34:48

internet longer,

34:48

like some of us people who have been

34:50

into technology have been a bit longer

34:53

than that.

34:53

But for most people,

34:55

it's only been around like fifteen years

34:57

and like some whole countries even today

35:00

are still like just getting connected to

35:02

the internet and just getting phones and

35:04

it's just becoming a problem.

35:06

Like this is

35:07

an extremely new development in society

35:11

and I don't think that we know like

35:13

what works and what doesn't work, right? And

35:15

I don't think that we've given enough

35:17

thought into all of this stuff because

35:19

there are so many benefits. To be

35:22

honest, there's so many benefits to even

35:23

social media if it's done properly,

35:26

um, that outright banning it just doesn't

35:30

make a ton of sense to me. But

35:32

clearly something has to be done, and I

35:34

hope that

35:36

other governments outside France and

35:40

Australia try and think about these more

35:43

nuanced approaches to how all of this

35:46

technology can be improved and a better

35:50

tool in like people's lives rather than

35:53

just like seeing the problems that these

35:56

especially these American big tech

35:58

companies have created on the internet and

36:00

like

36:01

quickly reacting to it and just banning it

36:03

outright.

36:05

I think there's some middle ground to be

36:07

found here that I would really try to

36:09

encourage.

36:12

Totally agree.

36:13

With that out of the way,

36:14

in a little bit,

36:15

we're going to talk about TikTok.

36:18

But first,

36:20

we're going to talk about stories from the

36:23

UK.

36:25

That is correct.

36:27

So keeping with the vein of age-gating the

36:29

internet,

36:31

the UK House of Lords has voted to

36:33

ban VPNs for children as the pressure on

36:36

privacy tools increases.

36:39

So this is...

36:40

I mean,

36:41

the headline kind of says it all.

36:42

The House of Lords,

36:44

I'm not intimately familiar with the UK's

36:47

legislative system,

36:48

but it's one part of their legislative

36:51

branch.

36:51

I believe they said the House of Commons

36:53

is the other one, if I remember correctly.

36:54

Yes, that's correct.

36:56

Okay, yeah.

36:56

So basically,

36:57

the House of Lords has passed this.

36:59

Now it's going to go on to the

37:00

House of Commons.

37:01

And I may be mixing this one up

37:04

with France, but I want to say that...

37:07

The president or prime minister or whoever

37:09

has expressed support for this,

37:11

so if it passes the House of Commons,

37:12

that is probably not good.

37:15

But the good news is it says here

37:17

the Labour government has a large majority

37:19

in the Commons,

37:20

but it's not clear whether it will attempt

37:21

to overturn the amendment or support it.

37:23

So, yeah,

37:25

this may face scrutiny or it may just

37:27

fly right on through.

37:28

We don't know at this time.

37:29

But it says that the vote was passed

37:31

two oh seven to one fifty nine and

37:33

that within twelve months, VPNs,

37:36

let's see,

37:37

regulations which prohibit the provision

37:38

to UK children of a relevant VPN

37:40

service must be enacted.

37:42

And this is specifically in response to

37:45

the Online Safety Act,

37:46

which has not gone well.

37:50

Within, God,

37:51

I think within days of the Online Safety

37:53

Act taking effect,

37:53

there were stories about how a VPN could

37:56

get around it.

37:57

People were using their parents' IDs.

37:59

I think some people were even using

38:00

screenshots from video games,

38:02

specifically the game Death Stranding.

38:03

So yeah, that did not go over well.

38:09

There was also, let me see,

38:11

if I remember here,

38:12

I think there was a second law.

38:13

Again,

38:13

I may be thinking of the France one.

38:15

I read all of these stories yesterday,

38:17

so they may have jumbled up in my

38:18

mind a little bit.

38:20

Um, yeah,

38:20

I'm not seeing anything about that.

38:22

So yeah, this is a, this is unfortunate.

38:25

This is kind of like we were just

38:27

saying it.

38:28

And I love when governments just pile

38:32

band-aids on top of each other.

38:34

Like we passed the online safety act.

38:35

Oh, that didn't work.

38:37

Well, let's, let's ban VPNs.

38:39

And then they're going to find a way.

38:40

Cause it's a cat and mouse.

38:41

They're going to find a way around VPNs.

38:42

And you know,

38:43

like we were just talking about a minute

38:44

ago,

38:44

the source of the issue is the harmful

38:46

content online.

38:47

Right.

38:47

So.

38:49

Why don't you address the content online?

38:51

And to their defense,

38:52

some of the stuff that's harmful,

38:54

if harmful at all,

38:55

is out of their reach.

38:56

If a website is based in another part

39:00

of the EU, India, America,

39:02

they can't really do anything about that.

39:04

But I don't know.

39:05

This just feels to me like, oh,

39:06

it didn't work.

39:07

We need to add a Band-Aid.

39:10

Yeah, it's a cat and mouse,

39:11

so I don't know where they think this

39:12

is going to end in its logical conclusion.

39:17

And it's not great,

39:18

because obviously VPNs are not a total

39:22

anonymity tool.

39:23

They definitely do get hyped up a little

39:26

bit too much,

39:27

especially in a lot of sponsor segments.

39:29

But they do still have a legitimate use

39:31

case, and they are...

39:33

I would say they're an easy way to

39:35

make some improvements to your privacy.

39:37

Like a lot of the VPN providers we

39:38

recommend have DNS block lists that will

39:42

block known trackers, known ads,

39:44

known malware.

39:45

It will change your IP address,

39:46

which is part of the way that companies

39:49

fingerprint you online.

39:50

And again, not perfect.

39:53

Definitely leaves a lot to be desired,

39:54

but it's a great start.

39:55

And especially if your threat model is

39:57

like you don't want your ISP selling your

39:59

internet history,

40:00

you don't want your ISP knowing where you

40:01

go online, which is totally fair.

40:04

Yeah, I do think they serve a purpose,

40:06

and it's really unfortunate to see them

40:07

losing a major benefit,

40:10

which is that you don't need to turn

40:12

over ID,

40:12

because that kind of defeats the whole

40:14

privacy thing, in my opinion.

40:15

I think that's about all I got on

40:18

that one.

40:19

Absolutely.

40:22

I saw this story earlier...

40:26

this week and I sent out some posts

40:28

on social media about it that have been

40:33

pretty popular.

40:39

But basically I was talking about these

40:40

VPN bans

40:42

in general,

40:43

because I think that a lot of people,

40:45

and especially techie people in this

40:47

space, hear about bans on technology.

40:50

They hear about a VPN ban,

40:52

or they hear about a ban on end-to-end

40:54

encrypted messengers, like Signal,

40:56

if something like chat control were to be

40:57

rolled out.

40:58

And they think, like, oh,

41:00

I can still continue to use these tools,

41:02

and...

41:05

And I'll be fine.

41:06

Even if this affects other people,

41:08

I'm smart enough to know how to bypass

41:10

all of this stuff,

41:12

and it won't be an issue for me.

41:13

And that's what we've seen with age

41:14

verification rolling out.

41:15

A lot of people are just using VPNs,

41:18

right?

41:18

But I think...

41:21

The problem with banning and criminalizing

41:24

very common,

41:25

very mundane and very legitimately useful

41:28

technologies like VPNs, for example,

41:30

is that it makes crimes very easy to

41:36

commit and very commonplace.

41:37

And this is the first step in what

41:39

we see in these authoritarian

41:41

regimes where, you know,

41:45

they try to fill the books with as

41:48

many, you know,

41:48

potential crimes or violations as possible

41:51

so that even if you're doing something

41:55

completely unrelated to the crime at hand,

41:56

like if you're using a VPN and the

41:59

government decides they don't like it,

42:00

like if you're protesting your government,

42:02

for example, in the UK,

42:05

they can very easily like look at your

42:07

technology.

42:08

They can look at you

42:09

being a VPN user or using this end-to-end

42:12

encrypted tool or whatever,

42:13

if any of these laws pass,

42:15

and they can use that fact against you,

42:18

not only in the courts as a crime,

42:21

but also in the courts of public opinion,

42:25

so to speak,

42:26

where they can really label you as

42:29

something which you probably aren't.

42:32

And people will judge you for that.

42:34

And that comes from making these

42:37

legitimate tools

42:40

seem evil and villainizing them and really

42:44

just changing their reputation.

42:46

It affects people in a lot of ways,

42:48

and it affects people in non-technical

42:49

ways.

42:50

It's the big point that I wanted to

42:51

make here.

42:52

So I just... Yeah,

42:55

I would be worried about any...

42:58

This is the same argument that we had

42:59

with Chat Control a while ago,

43:00

which I'm sure will crop up again,

43:02

but with any of these total bans on

43:04

technology...

43:05

I just want people to remember that if

43:07

you live in these countries,

43:08

this is not just a technical issue.

43:09

And you need to be keeping an eye

43:13

on this stuff and keeping up with it

43:15

and speaking out against it because this

43:17

will end up affecting everyone.

43:19

It's a bit of a slippery slope argument,

43:22

but we're definitely at the top of some

43:24

slippery slopes right now.

43:29

Yeah.

43:29

And the other thing I want to add

43:30

onto that, that's,

43:31

that's all absolutely true.

43:32

And you're absolutely right.

43:33

A lot of the time we don't think

43:34

about the non-technical side of this,

43:37

but also I personally, I really hate that.

43:39

Like, Oh,

43:40

well I know how to get around this.

43:41

That's great.

43:42

A lot of people don't.

43:44

And privacy, you know,

43:45

privacy is a team sport and privacy is

43:47

a human right.

43:48

Like, right.

43:48

Like we have that in the merch store.

43:50

For those of you who don't know,

43:51

we have a merch store

43:51

shop.privacyguides.org.

43:53

And we have a shirt that's super awesome.

43:55

That has article twelve.

43:56

I don't,

43:57

God, I'm such a nerd.

43:58

I have this memorized.

43:59

It's the nineteen forty eight United

44:01

Nations Declaration of Human Rights.

44:03

Article twelve says that everyone has a

44:05

right to privacy.

44:05

I don't have it memorized.

44:06

That's going to be my new project is

44:07

I'm going to memorize the actual article.

44:09

But it's like it says like this is

44:12

a human right.

44:13

We're talking about like water, food,

44:15

shelter, the right to live,

44:17

the right to education and also the right

44:19

to privacy.

44:21

And so if we're going to believe that,

44:22

if we're going to sit here and say,

44:23

yes, privacy is a right,

44:25

the government is infringing on my rights

44:27

by taking away my privacy,

44:28

then that's really messed up to say, oh,

44:31

well, this doesn't affect me, so meh.

44:34

No, everybody should have that right.

44:36

Even if you know how to get around

44:38

it, lots of people don't.

44:40

Yeah.

44:41

At this point,

44:41

I don't care what form your compassion

44:43

takes.

44:43

If you're like, well,

44:44

then I'm going to teach people how to

44:45

get around it.

44:46

There may be legal repercussions for that.

44:47

I'm not endorsing that.

44:49

You do you.

44:50

But whether that's I'm going to teach

44:51

people how to get around it,

44:52

whether that's I'm going to write my

44:53

politician, whatever it is,

44:55

don't just sit back and go, oh, well,

44:56

this doesn't affect me, so I don't care.

44:58

Because what's that classic poem about the

45:00

Holocaust?

45:00

First, they came for everyone else.

45:01

And by the time they came for me,

45:02

there was no one left.

45:03

And

45:04

I wouldn't be surprised if that happens in

45:06

some places because we keep saying it

45:08

doesn't affect me until it does,

45:10

and it's incredibly selfish,

45:11

and we need to get out of that.

45:12

Sorry.

45:14

While we're on the topic of these bans

45:18

of technology for children,

45:19

I saw this comment in our YouTube chat

45:23

where they said,

45:23

as far as I can tell,

45:25

eIDAS will be used for age verification

45:27

within the EU.

45:29

Basically,

45:29

these digital ID systems will allow

45:34

websites to request some sort of ID on

45:37

your phone or computer and only get

45:39

certain information about it in a

45:41

supposedly privacy-respecting way.

45:44

And I think we've talked a bit about

45:46

digital IDs in the past,

45:47

but I just want to reiterate.

45:49

This is certainly a better solution than

45:51

the current setup that a lot of websites

45:53

are doing where you have to scan your

45:54

face and you have to scan pictures of

45:56

your ID because that is

45:59

a privacy nightmare.

46:00

It's also a security nightmare.

46:01

We've already seen, I think,

46:04

multiple data breaches of all of these age

46:06

verification and ID databases being

46:09

leaked.

46:10

And now all of this public information is

46:11

out there.

46:11

That is a huge security problem.

46:15

It's an economic problem because there

46:16

will be identity theft,

46:17

like the government is enabling

46:19

extremely scary stuff by promoting these

46:22

technologies.

46:23

And in the US here,

46:25

I know that the government uses vendors

46:28

like real.me or all these other identity

46:33

verification companies.

46:33

ID.me?

46:34

Yes, thank you.

46:36

Something like that.

46:37

I'm confusing it with Real ID,

46:39

which is separate.

46:42

But yeah,

46:42

they use these for official government

46:44

things instead of making their own ID and

46:46

login system.

46:47

And like,

46:49

that is extremely concerning from a

46:51

security perspective.

46:52

But,

46:54

The overall point that I want to make

46:55

with this is that it's not only about

46:58

the privacy of the individual transaction

47:01

being made here.

47:02

This is also a censorship issue because to

47:05

get this ID in the first place,

47:07

you need to give away a lot of

47:09

your information.

47:11

So that's a privacy issue right there.

47:12

Maybe in the EU,

47:14

a lot of people already have national ID

47:15

cards.

47:16

You might be used to it.

47:18

Here in the US,

47:18

that isn't necessarily commonplace.

47:22

I know that...

47:23

The current administration is really

47:24

pushing for it to be,

47:25

and they're really supporting everyone

47:27

getting a passport and having digital IDs

47:29

on their phone,

47:30

which is a whole separate thing.

47:31

But the issue being created is that as

47:36

these governments try to age-gate as many

47:39

services and as many sites as possible,

47:41

as they can possibly justify –

47:43

it really creates a wall around all

47:45

of these things, and the government has

47:46

absolute control over whether you can

47:50

cross that wall and access that site.

47:53

They can do something like revoke your ID

47:55

if they want to, for whatever reason, and

47:59

kick you off of practically half the

48:03

internet. And we're seeing these

48:06

ID verification

48:08

laws and directives expand far beyond

48:12

their original intent of like protecting

48:15

adult services.

48:15

Now we're talking about social media

48:17

sites.

48:17

Now we're talking about VPNs.

48:19

We've seen them affect in the UK,

48:21

potentially Wikipedia, for example,

48:23

which is just a knowledge sharing service.

48:26

That's something that

48:28

people should have a right to access,

48:30

frankly, and it's crazy that the government

48:32

would step in and get in the middle

48:34

of that. And that is what we're enabling

48:36

with these digital ID concepts. It is a

48:40

whole system that the government has sole

48:43

control over, and it is really, in my

48:45

opinion, antithetical to

48:50

the internet and what computers and the

48:52

internet were made for.

48:55

We just cannot accept these restrictions

48:59

on the free flow of information sharing

49:03

and knowledge.

49:03

And so even with these like private and

49:07

zero knowledge digital ID solutions,

49:10

it creates a real danger to society.

49:12

I don't think we should tolerate using

49:14

this technology at all for gating access

49:18

to information, especially.

49:23

Yeah, totally agree.

49:24

Great point.

49:25

Information wants to be free.

49:27

And there are many who argue that

49:28

information should, I mean,

49:29

I think that's why, um,

49:31

I don't know if this is the ethos

49:32

for privacy guides, but at the new oil,

49:34

like I've never charged for articles or

49:38

blog posts.

49:38

Like I'll do early access,

49:39

but then like a week later it goes

49:41

public, you know, but I,

49:42

there's no part of my website that is,

49:44

Oh,

49:44

you got to join a membership to access

49:45

this premium stuff.

49:47

It's like, no,

49:48

cause it's information and it should be

49:49

free.

49:51

And yeah,

49:52

to put up those barriers to information is

49:54

really scary for the potential.

49:56

Absolutely.

49:58

Going back to something you said a little

50:00

while ago now about the government just

50:02

constantly putting band-aids on their

50:05

current bad solutions,

50:07

we have this story here from Independent.

50:11

AI and facial recognition to be rolled out

50:14

as Britain's broken policing system faces

50:16

sweeping reforms.

50:18

Officials say using AI will free up six

50:20

million hours of police time,

50:22

the equivalent of three thousand officers

50:24

each year.

50:26

This article says the Home Secretary has

50:28

announced plans to ramp up the use of

50:30

AI and live facial recognition as she

50:33

unveils sweeping reforms to fix Britain's

50:36

broken policing system.

50:39

Shabana Mahmood,

50:40

sorry if I pronounced that wrong,

50:41

is investing a hundred forty million

50:43

pounds to roll out technology which she

50:45

hopes will free up six million police

50:46

hours each year,

50:47

the equivalent of three thousand officers,

50:49

as part of the biggest overhaul of a

50:51

quote,

50:52

outdated policing model designed for

50:54

another century.

50:57

AI technology will be deployed to rapidly

50:59

analyze CCTV, doorbell,

51:01

and mobile phone footage,

51:02

detect deepfakes,

51:04

carry out digital forensics,

51:05

and speed up administration such as form

51:08

filling, redaction, and transcription.

51:13

These measures are part of a bigger

51:15

overhaul to policing that it seems like

51:19

England is seeing right now.

51:22

But I think I saw somewhere in this

51:24

article.

51:26

Now I can't find it.

51:31

Well,

51:32

I think the overall point is that these

51:37

AI tools are well known already to be

51:41

quite unreliable, right?

51:44

We're going to see a lot of

51:48

Like when we've seen this rolled out in

51:51

other law enforcement jurisdictions,

51:52

and especially like even here in the US,

51:54

for example, recently,

51:56

we talked a lot about this last week.

51:59

These AI tools being rolled out,

52:00

they're not reliable.

52:02

They're making mistakes and people are...

52:07

taking the claims of these systems at face

52:09

value.

52:10

And I think it's a really dangerous

52:11

situation that the UK is putting

52:13

themselves in by enabling this technology.

52:17

So definitely something to be wary about,

52:21

I think.

52:24

When you were reading this article,

52:25

did you see any other points you wanted

52:27

to point out here?

52:30

I think just to back up what you're

52:32

saying, yeah,

52:32

I have a friend here in the US

52:34

who works in law enforcement.

52:36

He's not a cop,

52:36

but he's like a civilian employee.

52:38

And he sends me stories,

52:40

I swear to God,

52:41

a couple times a month where he's like,

52:43

oh, so one of our cops used AI.

52:45

And I'm pretty sure this is the official

52:47

sanctioned system they're allowed to use.

52:49

Like,

52:49

I don't even think this is somebody being

52:51

like, quote unquote,

52:51

lazy and going outside the system.

52:53

He's like,

52:53

yeah,

52:54

so this cop tried to use AI to

52:56

like do his police report and it just

52:58

got everything completely wrong.

53:00

Like you said,

53:01

it said it was two in the morning and

53:02

just all these little things that like,

53:03

you know, don't sound that bad to us,

53:05

but it's like, yeah,

53:05

that means this case gets thrown out in

53:07

court because the prosecutor will

53:08

absolutely tear this apart.

53:10

And just, yeah,

53:11

they're completely unreliable.

53:12

And he sends me these stories all the

53:14

time.

53:15

And I'm assuming these are just the really

53:16

bad ones he sends me that are like,

53:18

wow, they got this really wrong.

53:19

But yeah, AI is

53:21

so, so bad. Yeah, I found this article

53:24

I was looking for. It was just one

53:25

sentence, but they're creating this

53:29

national center dedicated to using the new

53:31

technology, called police AI, despite just

53:34

recently an AI hallucination influencing

53:38

a decision by one of their police

53:40

departments to ban fans of an Israeli

53:44

football

53:45

team from a match in Birmingham

53:47

last year.

53:47

So they're already experienced with the

53:51

problems that this can cause and the

53:52

problems that you see when you really just

53:58

take these at face value.

53:59

People aren't giving this AI oversight and

54:03

it causes real problems.

54:04

And I cannot imagine that they've really

54:07

learned from these mistakes.

54:08

I think that this sort of thing

54:11

as we've seen, it's,

54:12

it's only going to become more frequent

54:14

and more of a problem.

54:15

I think that that is the biggest problem

54:19

with this, that I would,

54:20

that I would point out for sure.

54:24

It's crazy.

54:24

And it's such high stakes too.

54:25

It's one thing when like, you know,

54:27

I'm cause I've,

54:29

I've admitted to this before.

54:30

I'll use, like, Brave's Leo.

54:31

If I'll go to the search engine first

54:32

and I'll be like, you know,

54:33

I'll type in the keywords or whatever I

54:35

think should pop up the thing I'm looking

54:36

for, but then I'll get like, Oh God,

54:38

actually, what was it?

54:40

Um,

54:42

I think it was with WhatsApp.

54:43

Yeah,

54:43

it was this whole WhatsApp thing that was our

54:44

headline story, actually,

54:45

as I'm working on this script for private

54:48

messaging,

54:48

I was looking for a story about how

54:50

WhatsApp tried to change the terms of

54:53

service so that they could share data with

54:55

other meta properties like Instagram for

54:57

targeted advertising.

54:58

And everybody got really mad.

54:59

And so I went to Brave and I

55:00

typed in like, you know,

55:01

WhatsApp data sharing, whatever, whatever.

55:03

And all I got was this week's headline

55:05

story.

55:05

And I'm just like, oh my God, okay,

55:07

forget this.

55:08

And so I went to Leo and I

55:09

was like, hey, I'm looking for this story,

55:10

blah, blah, blah.

55:11

And it was like, oh,

55:12

you're thinking of this from twenty twenty

55:14

one or whatever.

55:15

And so, yeah,

55:16

but I've had these times where, like,

55:18

I ask Leo a question and it works.

55:21

And then ten minutes later,

55:22

I have the same problem.

55:23

So I ask it another question.

55:25

But for some reason,

55:26

it loops back into the original question

55:28

and just literally word for word answers

55:30

the first question.

55:30

And I'm like, no, that's.

55:32

all right,

55:32

let me close this window and start a

55:33

new one.

55:34

And just the point being that like,

55:36

it's amazing that they see that kind of

55:38

behavior and they're like, yeah,

55:39

this will be great for determining whether

55:41

people go to jail, have a criminal record,

55:43

possibly end up on death row.

55:45

I don't think they have death row in

55:46

the UK, but you know,

55:47

just like we can completely ruin

55:48

somebody's life.

55:49

And we know that this thing is not

55:50

perfect, but we're willing to do that.

55:52

That's wow.

55:53

That's insane.

55:54

Yeah.

55:56

No, the,

55:57

the other thing in answer to your question

55:58

that jumped out at me was the,

55:59

the facial recognition vans.

56:01

These are,

56:02

Yikes.

56:03

These have been covered extensively by

56:05

groups like...

56:06

I think they're called Big Brother Watch

56:08

in the UK.

56:09

And the police will just randomly take...

56:12

They have these mobile...

56:14

I don't even know what you want to

56:15

call them.

56:15

They're mobile facial recognition vans.

56:17

They'll go out to a public street out

56:19

of the blue and they'll set them up

56:21

and just scan everybody that walks by.

56:23

And the reason they're so problematic is

56:25

because they'll put signs up at the end

56:27

of the street that say, Hey,

56:29

we're using facial recognition because

56:31

legally they have to,

56:31

they have to put those signs up so

56:33

that you can quote unquote consent.

56:35

And the reason I put quote unquote is

56:36

because there've been so many stories of

56:38

people will like turn down the street and

56:39

see that sign and

56:41

And decide like, oh,

56:42

I don't want to go down the street.

56:43

So they'll turn and walk away.

56:45

And the police will go follow that person

56:48

and hunt them down and be like,

56:49

why'd you walk away?

56:50

What do you have to hide?

56:51

What's your name?

56:52

Show me your ID.

56:52

And sometimes they'll even facial

56:54

recognition them anyways.

56:55

And it's like, dude,

56:56

it's not consent if you're going to chase

56:57

me down the street and make me do

56:59

it anyways.

57:00

And so, yeah,

57:01

I think they're going from like ten of

57:02

those to like fifty of them.

57:04

Five zero.

57:06

It's completely insane.

57:07

And those things scare the crap out of

57:08

me.

57:09

Yeah, my...

57:12

My wife made a friend in the UK

57:13

last year and she was like,

57:14

we should go sometime.

57:15

And I was like, never.

57:16

No, I'm not going to the UK.

57:18

It is a bit of a scary place.

57:21

I just saw we got a comment from

57:22

Witherlead here who said that these

57:25

cases are hilarious; the people involved

57:26

in using them are too lazy to look

57:29

back at what the AI really generated for

57:30

them.

57:31

And I think that's very true.

57:32

And it's ridiculous,

57:34

but I think it really highlights a huge

57:36

problem that we see with AI right now,

57:40

which is that I don't think people...

57:43

When,

57:43

when we see AI used in these

57:45

circumstances,

57:46

it always needs to be done under like

57:48

the oversight of a real person with

57:49

experience and knowledge in this space,

57:51

because AI will lie to you straight to

57:54

your face without batting an eye because

57:57

it can't, it doesn't know any better.

57:59

And you know,

58:00

if you're going to use AI at all,

58:01

the only way to do it is to,

58:04

um,

58:05

be aware of that and be able to

58:07

catch AI and maybe

58:10

I don't know, in terms of police,

58:12

but maybe some police officers can do

58:14

that right now, like more experienced

58:17

ones. They might be able to look at

58:18

this and say, like, oh, that's not quite

58:20

right. But what we're missing right now, I

58:22

think, is all these younger people, the new

58:26

generations entering the workforce or in

58:28

college right now, who are really reliant

58:31

on AI. They're going to be using AI

58:33

more in their jobs, and they aren't being

58:36

trained on how to

58:39

properly oversee AI.

58:40

I think we're losing a lot of that

58:42

knowledge and we're putting a lot of trust

58:43

in AI and that is simply not a

58:46

tenable

58:48

solution to this problem. We're

58:51

not properly training anybody who's using

58:53

these tools right now, in my opinion, to

58:57

be aware of this in a way that

58:59

makes sense. We're kind of offloading a lot

59:02

of jobs to AI right now when that

59:05

is not something that AI can do. It's

59:07

never really going to be something that AI

59:09

can do. There are certainly, like, AI

59:11

optimists who can

59:14

argue, and they may be right,

59:16

that AI will be a big part of

59:19

people's jobs in the future,

59:20

but it'll always be under this human

59:23

supervision.

59:24

And it'll always be like a force

59:26

multiplier, basically,

59:28

but you have to

59:30

You have to have the ability to recognize

59:34

the problems with AI and control it.

59:35

And people simply don't right now.

59:37

I think that's a huge problem.

59:39

I think it's only going to become more

59:40

and more of a problem as more people

59:43

with real world experience retire and they

59:45

don't pass that knowledge down to new

59:48

trainees who are just doing everything

59:50

through AI.

59:51

So that worries me quite a bit about

59:54

AI, not just in policing,

59:56

but in pretty much any field where they're

59:58

trying to apply it right now.

1:00:01

And I think we really need to be

1:00:03

aware of that and we need to do

1:00:05

more about that to make it better.

1:00:09

Yeah,

1:00:09

and that kind of goes back to what

1:00:11

I said earlier about we have a low

1:00:13

level of tech literacy.

1:00:15

We have people who...

1:00:18

You know,

1:00:18

I just mentioned the issues that I have

1:00:21

with Brave's Leo,

1:00:22

which somebody else said in the comments,

1:00:23

like Leo's pretty good.

1:00:24

And I agree.

1:00:25

I'm very happy with the results.

1:00:26

It cites its sources.

1:00:27

So I always double check it.

1:00:28

And I'm like, okay,

1:00:29

let me make sure this actually says what

1:00:30

you're telling me it says.

1:00:32

But even then it's still, you know,

1:00:33

it gets things wrong.

1:00:34

It repeats itself.

1:00:35

It does things.

1:00:36

And I don't understand how like,

1:00:38

like the whole AI girlfriend thing,

1:00:40

you know,

1:00:40

like some people are really thinking like,

1:00:41

oh, this,

1:00:42

and I'm sure they realize it's software,

1:00:44

but they're like,

1:00:44

this software is sentient and really

1:00:46

cares about me.

1:00:46

And it's like,

1:00:47

I have to imagine it has the same

1:00:49

glitches that Leo does.

1:00:50

And I don't understand how you can look

1:00:52

at that and still think that this is

1:00:54

the way to go.

1:00:54

And just that level of tech literacy to

1:00:57

not understand what's going on under the

1:00:59

hood and that it's just a prompt and

1:01:01

that it's just, you know, autocorrect.

1:01:04

And it's just,

1:01:06

It scares me that, yeah, like you said,

1:01:07

this is becoming such a – it's something

1:01:12

that people are relying on for such

1:01:13

important decisions.

1:01:14

And on top of it, like you said,

1:01:16

losing the ability to understand what it

1:01:18

is, what it does, the limitations,

1:01:21

things like that.

1:01:23

It's a tech literacy problem,

1:01:24

but it's also,

1:01:25

I think –

1:01:26

like an intentional deception issue.

1:01:30

And this almost ties back to what we

1:01:31

were talking about social media earlier,

1:01:33

where the way that AI companies,

1:01:36

in particular OpenAI, I think,

1:01:38

are treating their customers and are

1:01:40

designing these models is

1:01:43

becoming a legitimate public health hazard

1:01:45

more than anything.

1:01:46

And that's the sort of thing where, again,

1:01:49

we probably want to see more safeguards

1:01:51

and more regulation and more thought put

1:01:53

into how people interact with these

1:01:55

things.

1:01:55

Because I think people,

1:01:58

there are so many people out there,

1:01:59

I think,

1:01:59

who naturally just want to humanize and

1:02:03

anthropomorphize any technology they want.

1:02:08

They're going to be sucked into these

1:02:10

relationships like you were talking about,

1:02:11

for example,

1:02:11

or the advice that you're giving because

1:02:13

it can sound so human.

1:02:14

And I think that playing on that fact

1:02:18

to sell more subscriptions or to get more

1:02:21

users I think is really, really dangerous.

1:02:23

I think it is pretty much all of

1:02:26

the problems that we've seen with

1:02:27

algorithms and social media,

1:02:28

but like

1:02:30

ramped up to eleven,

1:02:31

it's bad stuff that I think really needs

1:02:38

to be thought of more carefully.

1:02:39

We're in a classic Silicon Valley move

1:02:42

fast and break things moment,

1:02:43

but the things that we're breaking right

1:02:45

now are extremely serious,

1:02:46

and that's not maybe the mentality we can

1:02:49

take when we're rolling out this sort of

1:02:50

technology nationwide or globally or

1:02:54

whatever.

1:02:55

It's crazy stuff.

1:02:59

Yeah, for sure.

1:03:03

All right.

1:03:05

I think in a little bit here,

1:03:07

we're going to talk about some of the

1:03:08

popular discussions on the forum and start

1:03:11

answering questions.

1:03:12

But first,

1:03:13

we're going to talk about TikTok on the

1:03:15

topic of public health crises and AI.

1:03:20

Yeah, so TikTok,

1:03:22

in case you guys haven't been paying

1:03:23

attention, which for the record,

1:03:24

I wouldn't blame you.

1:03:25

I don't really pay much attention to it

1:03:28

myself, if we're being honest.

1:03:30

But TikTok was sold.

1:03:33

Believe it or not,

1:03:33

the deal finally went through.

1:03:35

I know Trump's been trying to get that

1:03:37

pushed through for a couple of years now.

1:03:39

And finally,

1:03:41

I think the twenty second last week,

1:03:43

the deal, like, officially closed.

1:03:46

It's all like the handoff has started.

1:03:49

And I know the handoff has started because

1:03:51

I overheard my wife and one of her

1:03:52

friends saying that that TikTok was

1:03:56

basically broken for like a week.

1:03:58

Well,

1:03:59

I think we've had a problem ourselves with

1:04:01

our own shorts, right?

1:04:02

We can't post.

1:04:03

We haven't been able to post.

1:04:04

Oh, that's right.

1:04:04

I forgot about that.

1:04:05

Yeah.

1:04:06

In case you guys are on TikTok,

1:04:08

our shorts stopped posting there because

1:04:11

of the technical issues they were having,

1:04:13

and it couldn't post for some reason.

1:04:16

And I actually went to go try and

1:04:18

look for the live and see if there

1:04:19

were any comments,

1:04:20

and I can't even see the live,

1:04:21

but that could be my phone.

1:04:23

So, yeah.

1:04:24

Who knows?

1:04:25

But...

1:04:26

Yeah, so anyways, TikTok sold.

1:04:29

And of course, when you open the app,

1:04:32

you have to accept the terms of service,

1:04:33

which I'm assuming you have to do even

1:04:34

if you want to delete it now,

1:04:36

which is a dark pattern that's not cool.

1:04:37

But anyways,

1:04:39

where we're going with this is there were

1:04:41

some privacy changes to TikTok.

1:04:42

Believe it or not, it got worse.

1:04:45

If you are one of the people who

1:04:46

didn't think it could, it did.

1:04:48

So this article from Wired here talks

1:04:51

about three of the biggest changes.

1:04:53

One of them is that TikTok is now

1:04:54

capable of precise location tracking.

1:04:57

Before this,

1:04:58

it did not collect precise data.

1:05:00

Yeah, precise location.

1:05:02

So now if you are a TikTok user

1:05:04

for whatever reason, like, you know,

1:05:05

we post stuff there,

1:05:07

make sure you double check and disable

1:05:09

that.

1:05:10

the precise location.

1:05:11

I mean, disable location in general,

1:05:12

but especially precise location.

1:05:14

It now tracks your AI interactions,

1:05:17

which TikTok is loaded with AI slop,

1:05:21

but I guess there's also like AI tools

1:05:23

and I don't understand what those are

1:05:26

because I don't use them.

1:05:27

Again, like I show up,

1:05:28

I post a video, I check for comments,

1:05:30

I leave, I don't hang out there.

1:05:32

So I guess there are AI tools now

1:05:34

that formerly did not fall under the

1:05:36

privacy policy,

1:05:37

but now TikTok has started tracking

1:05:39

analytics and metadata from the usage of

1:05:41

those tools.

1:05:43

And if you're a video viewer,

1:05:45

you can see here,

1:05:46

it shows how the old privacy policy did not

1:05:47

explicitly mention them,

1:05:48

compared to the new privacy policy.

1:05:50

So that is one cool thing about this

1:05:51

article.

1:05:51

It shows you what the old privacy policy

1:05:53

says and what the new one says.

1:05:55

And then next up, not quite last,

1:05:59

because there's one more,

1:06:00

a couple more things we're going to talk

1:06:01

about.

1:06:01

But next up,

1:06:02

TikTok has expanded its ad network.

1:06:04

So

1:06:05

So previously, I want to say,

1:06:09

let me double check here.

1:06:10

So rather than using, well,

1:06:12

you use the app TikTok.

1:06:13

Yeah.

1:06:13

So now basically they're going to be able

1:06:16

to advertise to you in other places and

1:06:17

use the data from TikTok to advertise to

1:06:20

you in other places.

1:06:21

And I would assume collect that data from

1:06:23

other places to advertise to you on TikTok

1:06:25

because I know that TikTok does have its

1:06:27

own analytics tool, like the Meta Pixel or

1:06:28

Google Analytics.

1:06:30

So yeah, that advertising has expanded.

1:06:33

Another privacy concern we should mention

1:06:36

that I have seen making the rounds.

1:06:37

Let me go ahead and change my tab

1:06:41

I'm sharing here.

1:06:42

This one comes from TechCrunch and it

1:06:44

says,

1:06:45

TikTok users freak out over app's

1:06:46

immigration status collection.

1:06:48

Here's what it means.

1:06:50

I don't like this article.

1:06:51

I'm gonna say that upfront.

1:06:53

Because basically there's – I guess

1:06:55

there's – again,

1:06:56

I don't hang out there, so I wouldn't know.

1:06:57

But I guess there's a lot of videos

1:06:58

going around TikTok about how TikTok now

1:07:02

is collecting your immigration status,

1:07:04

which is probably already being reported

1:07:07

to ICE.

1:07:07

I feel like I've reported on a story

1:07:08

about that before, but I could be wrong.

1:07:10

But anyways, according to this article,

1:07:14

TikTok has always done that.

1:07:16

The difference is that now with the

1:07:18

updated privacy policy,

1:07:20

because of the way that laws in California

1:07:23

are worded,

1:07:24

specifically with the CCPA and

1:07:25

California's Privacy Act,

1:07:27

now they have to specifically disclose it.

1:07:29

And it's – let me see if I

1:07:31

can find it here.

1:07:33

It's a very subtle, like, basically –

1:07:38

Yeah,

1:07:39

the policy specificity around types of

1:07:40

sensitive information has to do with state

1:07:42

privacy laws such as California's CCPA and CPRA.

1:07:46

The CCPA, for instance,

1:07:48

requires businesses to inform consumers

1:07:49

when they collect sensitive information,

1:07:51

which the law defines as including the

1:07:53

following things.

1:07:54

And there's a...

1:07:55

Huge list of things here,

1:07:57

precise location, genetic data,

1:08:00

things that I think we would all agree

1:08:01

are sensitive information.

1:08:02

And it says, of note,

1:08:04

citizenship and immigration status were

1:08:06

specifically added to the category in

1:08:08

twenty twenty three.

1:08:10

Um,

1:08:10

so basically it was probably always

1:08:13

collecting this data.

1:08:14

It just didn't have to tell you that

1:08:15

before.

1:08:16

And the reason I don't like this article

1:08:18

is just the author's tone.

1:08:19

She takes this very like, guys, calm down.

1:08:22

They were always doing this.

1:08:23

It's not a big deal.

1:08:24

Now they're just being more honest about

1:08:25

it.

1:08:26

And I really don't like that tone because

1:08:27

it's like, no, it was bad then too.

1:08:29

It's still bad.

1:08:30

It was bad.

1:08:32

This is not a calm down moment just

1:08:34

because we know about it now.

1:08:35

So yeah, that, uh,

1:08:39

What's up?

1:08:41

Sorry,

1:08:42

my camera is apparently not working.

1:08:43

Maybe I have to fix this.

1:08:44

It's all good.

1:08:49

Well,

1:08:50

the only thing I was going to say

1:08:51

is I read this article and I was

1:08:52

thinking the exact same thing.

1:08:53

Like this TechCrunch article,

1:08:54

they really framed this as like, hey,

1:08:58

you know, it's actually not a big deal.

1:09:00

They have to put this in their privacy

1:09:02

policy because they're collecting it and

1:09:04

it's the law.

1:09:05

But that's not an excuse for them to

1:09:07

collect it in the first place, obviously.

1:09:09

I see we saw a question in here,

1:09:11

how did they determine immigration status?

1:09:13

I think when it comes to this and

1:09:14

also the other sensitive information that

1:09:16

was mentioned in this article,

1:09:17

like sexual life or sexual orientation,

1:09:21

I think that that stuff is kind of

1:09:23

being determined by algorithms,

1:09:25

most likely.

1:09:26

And it's probably a situation that we

1:09:28

see...

1:09:30

Similar to that stuff showing up in the

1:09:32

privacy policy of cars and vehicles,

1:09:35

for example, when we saw Mozilla's things.

1:09:37

A bit of it, I think,

1:09:38

is going to be overzealousness.

1:09:41

I think a lot of lawyers think we

1:09:43

should put everything in there just to...

1:09:45

cover our butts just in case something

1:09:47

happens.

1:09:47

But also I think they are collecting this

1:09:50

information and they're inferring it based

1:09:51

on not only the content you post,

1:09:54

but also the content that you consume.

1:09:56

And I think that they can probably get

1:09:57

a pretty good idea of all of this

1:09:59

information just based on the content you

1:10:03

consume alone.

1:10:05

And so

1:10:08

Yeah, ideally,

1:10:09

they wouldn't be collecting any of that

1:10:11

information at all.

1:10:12

I definitely don't think that just because

1:10:14

it's in state privacy laws,

1:10:15

that's an excuse to put it in there.

1:10:17

Ideally,

1:10:17

the algorithm wouldn't be able to know

1:10:19

that information.

1:10:20

And once again,

1:10:22

I think that's the theme of this episode.

1:10:25

That's the sort of thing where the social

1:10:28

media algorithms are overreaching and are

1:10:32

very dangerous and need to be reined in

1:10:36

a bit.

1:10:38

Yeah, for sure.

1:10:40

Yeah,

1:10:41

it would be nice if they just said,

1:10:43

here's how we determine that information.

1:10:45

But it's probably so many different ways.

1:10:47

Because like you said,

1:10:47

some people disclose it.

1:10:49

Some people upload a video where they're

1:10:51

like, hey,

1:10:52

I'm an immigrant here and I moved here

1:10:54

in

1:10:56

you know, but other people, yeah,

1:10:58

it's probably a lot of signals.

1:10:59

Like I would have to imagine if I

1:11:00

moved out of the US, it would

1:11:02

probably still be pretty easy to tell

1:11:04

based on the way I spell things,

1:11:05

the language.

1:11:06

I mean, there have been studies into like,

1:11:08

you know,

1:11:09

one of the most common examples is like

1:11:10

soda versus pop, right.

1:11:12

Depending on which phrase you use or Coke

1:11:13

or some specific things,

1:11:15

like depending on which word you use,

1:11:17

it's a pretty good indicator.

1:11:18

Like, Oh,

1:11:18

you're probably from up North or you're

1:11:19

probably from down South or something.

1:11:21

So

1:11:21

That's just when you add up enough of

1:11:23

those little signals,

1:11:24

you can start to reveal things that may

1:11:26

not be a hundred percent accurate,

1:11:27

but they're probably right more often than

1:11:29

they're wrong.

1:11:30

Exactly.

1:11:31

Like every single one of those pieces of

1:11:32

data, it's like a Venn diagram for like,

1:11:35

you just keep adding more circles.

1:11:36

And at the end of the day,

1:11:37

there's only going to be one person in

1:11:39

the middle of all of those circles, right?

1:11:40

You can get very specific with very broad

1:11:42

data.
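
To put a number on the "keep adding circles" point, here is a tiny sketch with made-up people and made-up signals. No single attribute identifies anyone on its own, but intersecting a few of them leaves one candidate, which is the whole Venn diagram argument.

```python
# Illustrative only: a fake four-person population and fake broad signals
# (dialect, spelling, timezone). Each filter is a "circle"; the intersection shrinks fast.
population = {
    "alice": {"dialect": "soda", "spelling": "US", "timezone": "UTC-5"},
    "bob":   {"dialect": "pop",  "spelling": "US", "timezone": "UTC-5"},
    "carol": {"dialect": "soda", "spelling": "UK", "timezone": "UTC+0"},
    "dave":  {"dialect": "soda", "spelling": "US", "timezone": "UTC-8"},
}

def narrow(candidates: dict, signal: str, value: str) -> dict:
    """Keep only the people consistent with one more observed signal."""
    return {name: attrs for name, attrs in candidates.items() if attrs[signal] == value}

candidates = dict(population)
for signal, value in [("dialect", "soda"), ("spelling", "US"), ("timezone", "UTC-5")]:
    candidates = narrow(candidates, signal, value)
    print(f"after {signal}={value!r}: {len(candidates)} candidate(s) left")

print(list(candidates))  # ['alice'] -- one person left in the middle of all the circles
```

Scale the same intersection up to billions of users and thousands of signals, and inferring something like immigration status from posting and viewing habits stops looking far-fetched.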

1:11:43

Dude,

1:11:44

that is a really good way to put

1:11:45

it.

1:11:45

I like that.

1:11:46

That was good.

1:11:47

The next story in here I think answers

1:11:50

Jordan's question in the chat.

1:11:52

Does this affect the U.S.

1:11:53

only or the whole world?

1:11:55

Do you have that story pulled up on

1:11:56

your screen here?

1:11:57

Let me see.

1:11:58

I do, yeah.

1:11:59

So this is just kind of rounding off

1:12:01

our trio of TikTok stories.

1:12:02

So because TikTok – and I'll be honest.

1:12:06

I don't even know the full answer to

1:12:07

this story myself.

1:12:08

I am very unclear.

1:12:10

Did –

1:12:12

all of TikTok just get sold to a

1:12:14

bunch of US and one UAE investment

1:12:18

companies or did only part of it?

1:12:20

It is only the American one.

1:12:23

Only American TikTok is sold to this.

1:12:25

The worldwide TikTok continues to be owned

1:12:27

by ByteDance.

1:12:30

But how this affects TikTok, I think,

1:12:32

is still a good question and it's unclear.

1:12:34

And I think that that is the point

1:12:35

of this story here.

1:12:37

It's like Canada is now looking into this.

1:12:39

I think especially because I would imagine

1:12:42

just proximity to the US could get like

1:12:45

some Canadian users lumped into this

1:12:47

American version of the platform because

1:12:49

of, I don't know,

1:12:50

geolocation settings of their phone or

1:12:51

whatever.

1:12:52

I don't know how exactly this split works.

1:12:55

The whole TikTok and especially like

1:12:57

America,

1:12:59

the American TikTok being its own thing

1:13:01

doesn't make a lot of sense to me

1:13:02

because it's unclear whether it federates

1:13:04

with like the global TikTok.

1:13:06

Do you see the same thing?

1:13:08

content?

1:13:08

Is it just a different algorithm?

1:13:10

Can people outside the US see American

1:13:13

TikToks?

1:13:15

I unfortunately don't know enough about

1:13:16

TikTok.

1:13:17

But anyways, going back to this story,

1:13:20

and you can share more about it.

1:13:22

I think that that is the question that

1:13:24

Canada is asking right now.

1:13:26

I think it's unclear to everyone.

1:13:29

Yeah,

1:13:29

you asked a whole bunch of questions that

1:13:31

are scary.

1:13:33

Yeah,

1:13:33

and that's the headline for audio

1:13:35

listeners.

1:13:35

It says,

1:13:35

Canada's privacy czar seeks answers on

1:13:37

TikTok policy updates.

1:13:38

I don't know when we started calling

1:13:40

everybody a czar.

1:13:41

I don't know when that took off,

1:13:42

and I don't like it,

1:13:43

to be totally honest with you.

1:13:44

But yeah, it's Canada's, oh my God,

1:13:46

what are they?

1:13:47

The Office of Privacy,

1:13:49

the Office of the Privacy Commissioner of

1:13:50

Canada, the OPC.

1:13:51

And yeah, like you said,

1:13:54

I know this is way more common in

1:13:56

Europe, but even here in the U.S.,

1:13:58

I don't know about nowadays,

1:13:59

but historically, we've had areas,

1:14:01

especially in the south,

1:14:02

where people will be right on the border

1:14:04

of Mexico and some state,

1:14:06

and people will come back and forth.

1:14:08

Maybe they live in Mexico,

1:14:09

but they work in the US or vice

1:14:10

versa.

1:14:11

I don't know how that works,

1:14:12

but I do know it's a thing,

1:14:14

and I'm sure it was probably a thing

1:14:15

in Canada.

1:14:15

I've known a few Canadians who,

1:14:18

growing up –

1:14:20

Went to school or maybe not went to

1:14:21

school,

1:14:22

but like went to church in Seattle and

1:14:24

maybe not Seattle.

1:14:25

That was probably pretty far down for

1:14:26

them.

1:14:26

But, you know,

1:14:26

like they were back and forth pretty

1:14:27

regularly and they were almost like dual

1:14:29

citizens because they're just so close to

1:14:31

the border that they have a lot of

1:14:33

friends and connections in the other

1:14:34

country.

1:14:35

And so, yeah,

1:14:36

Canada is rightfully so trying to

1:14:38

understand with all this.

1:14:40

this sale going through now,

1:14:41

what does that mean for Canadians?

1:14:42

Will they get looped into this stuff?

1:14:45

Will their privacy rights still be

1:14:46

respected if they get looped in?

1:14:48

Is TikTok going to make any effort to

1:14:50

separate Canadian users and American

1:14:51

users?

1:14:53

Yeah,

1:14:54

so we don't really have much on this

1:14:55

story because this is just kind of the

1:14:56

initial announcement that, hey,

1:14:58

we're asking these questions.

1:14:59

But I think they are very good questions.

1:15:01

And like Jonah said,

1:15:03

there's so many questions right now.

1:15:06

We're trying to figure out how any of

1:15:09

this is going to work.

1:15:11

What are people going to see?

1:15:14

I know Trump said that he wanted the

1:15:15

algorithm to be retrained once America

1:15:16

bought it.

1:15:17

So yeah,

1:15:18

there's a lot of things that are kind

1:15:20

of up in the air right now.

1:15:25

I also, real quick,

1:15:26

I appreciate the people in the comments

1:15:28

when I asked about sodas.

1:15:29

And somebody said, we call it soft drink.

1:15:32

And someone else said,

1:15:33

we call it by its chemical compounds.

1:15:34

So thank you, guys.

1:15:38

But I believe Jonah is trying to fix

1:15:42

his camera right now.

1:15:45

And in a minute,

1:15:47

we will start taking some viewer

1:15:49

questions.

1:15:50

And Jonah will return to us very shortly.

1:15:53

Yes.

1:15:55

Oh, he's back.

1:15:55

He's back.

1:15:58

All right.

1:16:03

Did you have anything you wanted to add

1:16:04

to the TikTok story or are you ready

1:16:05

to move on to forum updates?

1:16:07

No,

1:16:07

I think I could point out this comment.

1:16:10

Jordan just mentioned this really quick.

1:16:12

I don't think we talked about it too

1:16:13

much,

1:16:14

but I definitely have seen a lot of

1:16:16

stories about how the algorithm is

1:16:18

changing.

1:16:18

There's definitely been allegations of the

1:16:21

American version of TikTok now censoring

1:16:25

posts that are critical of the American

1:16:27

government and that sort of thing.

1:16:31

Yeah, very concerning for Americans.

1:16:33

I don't think that it was the right

1:16:35

move to sell TikTok to Larry Ellison,

1:16:39

of all people,

1:16:40

and to Saudi Arabian private equity

1:16:44

companies and all that stuff.

1:16:46

To be fair, they are Emirati,

1:16:48

not Saudi Arabian.

1:16:49

Oh, sorry.

1:16:51

I did a little bit of digging.

1:16:52

It turns out we're actually really good

1:16:54

allies with the United Arab Emirates.

1:16:56

So is China. So, you know, it's like,

1:16:59

I don't like you, but I like your

1:17:01

best friend, which, I don't know if that

1:17:03

makes you really mature or... I don't know.

1:17:06

I'm just asking questions. Yeah.

1:17:12

So yeah, as you were saying,

1:17:13

we're going to get into questions that we

1:17:15

see on our forum.

1:17:15

We'll also get into questions that we've

1:17:17

seen in the chat.

1:17:17

I know there's some questions.

1:17:19

We've answered some questions as they've

1:17:21

come up,

1:17:21

but I've seen some questions that we've

1:17:23

moved on and we'll get back to those.

1:17:24

So stay tuned for that.

1:17:27

But yeah, in the meantime,

1:17:30

let's talk about a couple top posts that

1:17:32

we've seen in our community and on our

1:17:35

forum.

1:17:36

I think the...

1:17:40

Big one for this week is, of course,

1:17:42

it's Data Privacy Week,

1:17:44

which is always an exciting time for all

1:17:46

of us in privacy.

1:17:48

Oh yeah, you have it pulled up here.

1:17:51

We had a Data Privacy Day post,

1:17:54

but basically on Wednesday the

1:17:55

twenty-eighth,

1:17:57

it was International Data Privacy Day.

1:17:59

Kind of just

1:18:01

A yearly event that a lot of organizations

1:18:06

in the privacy space,

1:18:07

both on the business and consumer side,

1:18:10

really try to focus on personal privacy

1:18:13

improvements and switching to private

1:18:15

alternatives and all of that sort of

1:18:16

stuff.

1:18:17

And so we've been posting some things

1:18:19

throughout the week on our social media

1:18:20

channels about

1:18:22

Data Privacy Day and Data Privacy Week,

1:18:25

ways that people can get into switching to

1:18:28

more private alternatives.

1:18:31

And we talked a bit on our

1:18:33

forum, I think,

1:18:35

as Nate looks through that,

1:18:36

about how people are preparing for

1:18:39

twenty twenty six.

1:18:43

Yeah,

1:18:43

what people are doing to be more private,

1:18:45

which is super cool.

1:18:47

So I don't know if there's any specific

1:18:49

posts you wanted to highlight,

1:18:50

but

1:18:53

Well, I do love that hail privacy one.

1:18:54

That cracks me up.

1:18:57

But yeah, I mean, no, there's, I mean,

1:18:59

it runs the gamut here, right?

1:19:00

Like,

1:19:01

let me scroll back to the top here.

1:19:03

You know,

1:19:04

one person said one goal for twenty twenty

1:19:06

six is to fully move over to ProtonMail,

1:19:07

which, you know, whether it's Proton,

1:19:09

Tuta, something else.

1:19:12

What is it?

1:19:12

Mailbox?

1:19:13

Is that the other one we recommend?

1:19:15

Um, whether it's one of those services,

1:19:17

whichever one it's, it's, you know,

1:19:18

it's no small feat to move email.

1:19:21

And fortunately that is something you can

1:19:23

do yourself.

1:19:23

You know, it's not like Signal,

1:19:25

which, thankfully, Signal is getting really

1:19:26

common, but it's still,

1:19:28

you have to have other people to talk

1:19:29

to you.

1:19:29

Right.

1:19:29

We talked about that earlier with the

1:19:30

WhatsApp story.

1:19:31

Email, you can move that by yourself.

1:19:33

Nobody's stopping you,

1:19:34

but it is still a lot of work.

1:19:35

And actually, uh,

1:19:37

many years ago I moved from Yahoo to

1:19:39

Gmail and I spent years still getting like

1:19:43

this account that I forgot about that

1:19:45

went to Yahoo instead of Gmail.

1:19:46

So, you know, it's, it's, um,

1:19:49

it's a lot of work and, uh,

1:19:52

Yeah.

1:19:52

One person gave the advice about the

1:19:54

rabbit hole is very deep and it's an

1:19:55

understandable temptation to give up,

1:19:57

but don't. Start with the low-hanging fruit

1:19:59

and work your way up the privacy tree

1:20:00

one step at a time.

1:20:01

So, um, yeah,

1:20:03

they talked about smart TVs and they're

1:20:06

trying to replace it with something that's

1:20:07

a little bit more privacy friendly.

1:20:10

Um,

1:20:12

Yeah, they talked about, let's see,

1:20:13

just kind of harm reduction.

1:20:15

I know that's a big thing for me

1:20:16

is they said that their partner has

1:20:17

certain disabilities.

1:20:18

So unfortunately,

1:20:20

they can't really get away from like a

1:20:21

normal phone and stuff.

1:20:22

But they said they're researching a way to

1:20:24

run some old Linux computers and get the

1:20:25

same television programming with more

1:20:27

privacy.

1:20:28

And so, yeah, pretty cool stuff.

1:20:37

We can get into some viewer questions.

1:20:39

I think it's about that time.

1:20:42

First one I saw in the chat,

1:20:44

this was for you, Nate.

1:20:46

What ThinkPad are you using right now and

1:20:48

why?

1:20:50

I am using a ThinkPad X-TX because it

1:20:55

was a gift.

1:20:57

And it was free, and I run Qubes

1:21:00

on it.

1:21:00

So I'm actually reading the chat from a

1:21:03

Qubes computer from,

1:21:04

I have a little VM just for my

1:21:06

work in privacy guides.

1:21:07

And yeah, I mean, I like it.

1:21:15

It's a little bit slow.

1:21:16

I think the processor is,

1:21:18

I think this computer is from like, so,

1:21:20

you know,

1:21:20

this processor struggles a little bit,

1:21:21

but yeah, but you know, it's not bad.

1:21:24

And it's definitely like, I couldn't,

1:21:27

I can never do any kind of video

1:21:28

editing or gaming.

1:21:29

And also, the screen's a little bit small.

1:21:31

But it's great when I travel.

1:21:33

I went to Europe late last year.

1:21:34

And last week,

1:21:36

you guys saw me with the,

1:21:37

what did I call it earlier?

1:21:41

I called my other computer something.

1:21:43

It's like a billboard or something.

1:21:44

I don't know.

1:21:44

But yeah,

1:21:45

my other computer is massive and covers my

1:21:47

whole face.

1:21:48

And this guy's like fourteen inches.

1:21:49

So it sat right in front of me

1:21:50

on the plane.

1:21:51

Nice and neat.

1:21:54

And that was really, really handy.

1:21:55

And it's a little slow,

1:21:57

but it runs everything just fine.

1:21:58

And it's great for surfing the web and

1:21:59

writing.

1:22:00

And so, yeah, I mean,

1:22:02

I'm going to use it until the day

1:22:03

it stops booting or something.

1:22:05

So, yeah.

1:22:09

Let's see.

1:22:10

I'm looking through here.

1:22:13

We didn't seem to get any chats on

1:22:15

the forum, which is unfortunate.

1:22:19

There's a couple more chats in here.

1:22:21

And yeah,

1:22:21

if anyone's watching and has questions,

1:22:23

this is a good time to leave them

1:22:25

in the chat.

1:22:29

Got another one here from Dread Pirate

1:22:31

Roberts.

1:22:31

Do you guys think that all of these

1:22:33

bad laws like chat control,

1:22:34

ID verification,

1:22:35

facial recognition are done in bad faith

1:22:37

to gain more control over the population

1:22:39

or just bumbling politicians making these

1:22:42

mistakes?

1:22:44

That one I think I did answer a

1:22:46

bit, I guess,

1:22:47

when I pulled up our recent tweet about

1:22:49

that.

1:22:50

I do think that the direction that a

1:22:54

lot of Western countries are going in

1:22:56

right now is towards more authoritarian

1:22:59

practices and towards more control over

1:23:01

their own citizens,

1:23:02

which I think is really unfortunate.

1:23:04

I think that that is a driving factor

1:23:06

behind a lot of them.

1:23:10

So, yeah, I don't think it's great.

1:23:13

Did you have any additional thoughts on

1:23:15

that, Nate?

1:23:17

I think in answer to the actual question,

1:23:19

I think it's both because, you know,

1:23:20

something that one of the podcasts I

1:23:22

listen to,

1:23:23

something the host says a lot is

1:23:25

everybody's the hero of their own story,

1:23:26

right?

1:23:27

Like, nobody...

1:23:29

nobody thinks they're the bad guy.

1:23:32

Even,

1:23:32

even when they are doing objectively evil

1:23:34

things in their mind, it's like, well,

1:23:36

this is a means to an end, right?

1:23:37

Like this is going to make the world

1:23:38

a better place in the long run.

1:23:40

And I have to literally kill people to

1:23:42

do it, but you know, that's their logic.

1:23:44

And I think there are a lot of

1:23:46

politicians who want to protect children

1:23:50

and just don't understand that this is not

1:23:52

the best way to do it.

1:23:55

You know,

1:23:56

whether that's technical misunderstanding

1:23:58

or whatever.

1:24:01

And don't get me wrong.

1:24:02

There's definitely a lot of politicians

1:24:03

that are also just like, hey, man,

1:24:04

whatever lines my pocket,

1:24:05

whatever makes me more powerful,

1:24:06

more prestigious, whatever.

1:24:08

I don't want to let those guys off

1:24:09

the hook.

1:24:10

But yeah, I mean,

1:24:11

even the people who are genuinely doing

1:24:13

this because they're like, oh,

1:24:14

this will make me more powerful.

1:24:15

I think in their head, they're like,

1:24:17

this will make me more powerful and I

1:24:20

can make the world a better place by

1:24:21

my definition,

1:24:23

which unfortunately means a lot of other

1:24:25

people tend to suffer along the way.

1:24:27

So, yeah.

1:24:28

Yeah.

1:24:31

One I did see here that I wanted

1:24:33

to kind of touch on a little bit.

1:24:35

I think this was right before the one

1:24:36

you shared.

1:24:38

Computer's going slow here.

1:24:41

Um, this captain haddock said, uh,

1:24:43

surely mass-level awareness and

1:24:44

education on privacy is necessary.

1:24:46

The majority of the public simply don't have the

1:24:47

capacity to understand how privacy works

1:24:49

in a technical sense.

1:24:51

I disagree.

1:24:51

Um, I mean, I don't,

1:24:52

I don't want to get too pedantic here.

1:24:53

I mean, people are smart, right?

1:24:55

Well,

1:24:55

to borrow the line from Men in Black,

1:24:57

a person is smart.

1:24:58

I will agree with that.

1:24:59

But, um, you know, I don't,

1:25:00

I don't think anybody is incapable of

1:25:02

learning this stuff,

1:25:02

but I do agree that I,

1:25:03

I think most people don't want to learn

1:25:05

this stuff and it's very, um,

1:25:07

Some of this stuff is really hard to

1:25:08

wrap your head around,

1:25:09

even for those of us who are interested

1:25:11

in it and really passionate about it.

1:25:13

So yeah, I mean,

1:25:15

I just wanted to point that out.

1:25:16

I think when we discredit people,

1:25:19

that's not helpful personally.

1:25:22

But yeah, I mean, people can learn.

1:25:25

It's hard stuff to learn.

1:25:31

What else?

1:25:32

We talked about regulation a little bit.

1:25:36

We did talk about the slippery slope. I

1:25:37

know there's one user here who mentioned,

1:25:39

you know, I see this as a

1:25:41

stepping stone towards banning and

1:25:42

restricting more of the internet. I agree.

1:25:44

That kind of goes back to what I

1:25:45

said about how nobody thinks they're the

1:25:47

bad guy. But sorry, I see you were

1:25:49

trying to pull one up there.

1:25:51

I was trying to pull up this question

1:25:52

about VPN bans.

1:25:53

I realized they're probably not asking us,

1:25:56

but other people in the chat,

1:25:58

because that is our own tweet that blew

1:26:01

up about VPN bans.

1:26:03

But yeah,

1:26:04

it is interesting which of our posts

1:26:06

become popular and which ones...

1:26:10

not so much. Seems kind of random to

1:26:13

me at times, unfortunately. But you know,

1:26:16

that's the problem with social

1:26:17

media and these algorithms: they're

1:26:19

unpredictable, and a lot of the time, I

1:26:22

don't think they

1:26:23

get our message out in front of

1:26:26

people who are interested in reading it,

1:26:28

but sometimes it works out. So, social media.

1:26:33

Yeah.

1:26:35

Here's one from culpable six, seven, five,

1:26:38

zero.

1:26:39

And they said,

1:26:40

do you think privacy has been getting

1:26:41

harder and harder to achieve over the past

1:26:43

couple of years,

1:26:44

as well as getting more inconvenient?

1:26:45

For example,

1:26:46

I keep trying to use Mullvad Browser,

1:26:47

but there's no dark mode on most websites

1:26:49

and it hurts my eyes,

1:26:49

which is minor for me,

1:26:50

but it impacts people.

1:26:54

I gotta be honest.

1:26:55

I think it's both.

1:26:55

I think on the one hand we have

1:26:59

a proliferation of

1:27:02

of user-friendly tools like Signal,

1:27:04

like the Brave browser, like ProtonMail.

1:27:08

And I realize that in a lot of

1:27:10

cases,

1:27:15

these tools still have shortcomings.

1:27:16

Like I think...

1:27:18

I'm a Tuta user,

1:27:19

but I will objectively admit that I think

1:27:21

Proton is the better user experience.

1:27:23

I hate saying that.

1:27:27

So where I'm going with that is I

1:27:29

think we can all admit that a lot

1:27:30

of these tools may still leave some things

1:27:31

to be desired.

1:27:32

Oh, right, where I was going with that.

1:27:34

And even Proton is still missing things

1:27:36

compared to Gmail or Google or Apple or

1:27:40

Linux users.

1:27:42

Anyways, but on the other hand, you know,

1:27:46

there's also like you mentioned you want

1:27:48

to use Mullvad Browser,

1:27:49

which I think is perfectly legit.

1:27:50

Mullvad Browser is great.

1:27:51

I have Mullvad.

1:27:52

Mullvad is fantastic.

1:27:53

And so on the one hand,

1:27:55

it could be like, well, use Brave.

1:27:56

Brave has dark mode,

1:27:57

but maybe Mullvad has things, like maybe you

1:28:00

agree with their definition.

1:28:01

Their privacy method of trying to make

1:28:03

everybody look the same like the whole Tor

1:28:05

browser thing does.

1:28:06

Maybe you just don't like Brave as a

1:28:07

company,

1:28:08

which is a totally valid take as well,

1:28:10

in my opinion.

1:28:13

It sucks that we don't have more really

1:28:15

good options.

1:28:15

When you're in mainstream technology,

1:28:17

you have so many options that you can

1:28:20

almost pick any of them and they work.

1:28:22

And it sucks that we don't have that

1:28:23

same freedom of choice with privacy

1:28:25

issues.

1:28:25

um,

1:28:26

that we would have with the mainstream

1:28:27

stuff.

1:28:27

But also the other thing is it's,

1:28:30

I think when we're going up against the

1:28:31

more high level threats,

1:28:33

I definitely worry that our privacy is not

1:28:36

as easily achieved there.

1:28:37

Like,

1:28:37

I think it's really easy to opt out

1:28:39

of the targeted advertising,

1:28:44

mass surveillance, automated stuff.

1:28:46

It's when you get up into the more,

1:28:47

you know, like, um,

1:28:49

Oh, what was that company called?

1:28:51

Augury, I think.

1:28:52

This was like five years ago.

1:28:54

There was a company that was basically AI

1:28:59

correlating traffic

1:29:00

Years ago,

1:29:02

this is probably actually why Mullvad

1:29:03

rolled out DAITA,

1:29:05

because their whole selling point was

1:29:06

they would sell to the DOD and the

1:29:08

military and law enforcement,

1:29:10

like federal law enforcement.

1:29:11

And they're like, yeah,

1:29:12

we can even unmask people that are using

1:29:13

VPNs.

1:29:14

We can correlate the traffic and figure

1:29:15

out where everybody's going.

1:29:16

And pretty much your only defense was like

1:29:17

a multi-hop VPN or Tor.
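
Since traffic correlation came up, here is a deliberately naive sketch of the underlying idea, with invented timestamps and a toy similarity score; it is not how Augury or any real product works, and real attacks and defenses are far more sophisticated. It just shows why matching packet timing on both sides of a VPN can link a user to a destination, and why cover traffic and added delay (multi-hop, Tor, Mullvad's DAITA) are aimed at breaking that match.

```python
# Toy traffic-correlation sketch: flows with similar packet timing on the entry and
# exit side of a VPN get a high score. All numbers here are made up for illustration.
def timing_similarity(flow_a: list, flow_b: list, bin_size: float = 1.0) -> float:
    """Jaccard overlap of the one-second time bins in which each flow was active."""
    bins_a = {int(t / bin_size) for t in flow_a}
    bins_b = {int(t / bin_size) for t in flow_b}
    if not bins_a or not bins_b:
        return 0.0
    return len(bins_a & bins_b) / len(bins_a | bins_b)

# Packet timestamps (seconds) seen on the user's side of the VPN...
user_flow = [0.1, 0.9, 2.2, 2.3, 5.0, 5.1, 8.4]
# ...and two flows seen leaving the VPN toward different destinations.
exit_flow_a = [0.2, 1.0, 2.3, 2.4, 5.1, 5.2, 8.5]   # same shape, slight delay
exit_flow_b = [3.0, 3.5, 6.0, 7.1, 9.9]             # unrelated traffic

print(timing_similarity(user_flow, exit_flow_a))  # high score -> likely the same session
print(timing_similarity(user_flow, exit_flow_b))  # low score  -> probably not
```

Padding both flows with dummy packets or routing through extra hops makes the two sides stop lining up, which is the entire point of those defenses.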

1:29:19

And I think when you're talking about

1:29:21

DAITA,

1:29:22

that level of cutting edge,

1:29:24

I think it's getting more competitive and

1:29:26

more difficult.

1:29:27

But also, that's probably, to be fair,

1:29:29

just the cat and mouse of it.

1:29:31

They invent that, Mullvad invents DAITA.

1:29:33

And then they invent something else,

1:29:34

and Mullvad invents something else.

1:29:37

So I don't know.

1:29:38

I try to focus on what's within our

1:29:43

ability to control and defend against.

1:29:46

And I try to be grateful that we

1:29:47

do have so many good options,

1:29:49

even if they're not perfect.

1:29:54

Yeah.

1:29:55

Really quick, not a question,

1:29:56

but somebody said ByteDance is still

1:29:58

keeping around twenty percent of the

1:29:59

U.S.-based TikTok, but most of its U.S.

1:30:01

ops are sold.

1:30:02

Yeah.

1:30:02

I don't know.

1:30:05

It's the whole thing is like clearly not

1:30:08

about getting ByteDance out of TikTok

1:30:10

either.

1:30:11

Like it's just a pure political thing

1:30:13

going on.

1:30:14

So I'm pretty sure that is true.

1:30:16

I think I have I have heard that.

1:30:18

And like the fact that it's all working

1:30:21

together and they're maintaining this this

1:30:23

ownership,

1:30:23

but they're also just like

1:30:25

partially being taken over by all these US

1:30:27

companies.

1:30:28

It makes no sense.

1:30:31

And I don't know.

1:30:33

The TikTok thing is crazy because I don't

1:30:35

know how much you got into this in

1:30:37

your interview with Taylor Lorenz,

1:30:40

but I know she's been talking about this

1:30:41

lately.

1:30:42

I know other people have pointed it out

1:30:43

on social media.

1:30:45

The whole TikTok thing was started

1:30:49

and really pushed for by the Democrats

1:30:51

during Biden's administration.

1:30:52

And I think a lot of people like

1:30:55

people in our position at the time were

1:30:56

saying like, oh, if we let this happen,

1:30:59

if we let the Democrats

1:31:02

do this and push this forward, this is

1:31:04

going to obviously be misused by some

1:31:06

government in the future. And then, lo and

1:31:08

behold, you know, a few years later, that

1:31:11

is exactly what is happening, right? I

1:31:15

don't know, that could be a whole

1:31:17

political discussion, but yeah. We're

1:31:21

really, I think that the state

1:31:26

of America right now is just concerning

1:31:28

because

1:31:31

A lot of people are working towards like

1:31:33

all of the problems that we're seeing now.

1:31:35

It's not just like the current

1:31:37

administration right now decided to do

1:31:39

this, right?

1:31:39

This was a long time in the making.

1:31:40

It was a bipartisan effort to take over

1:31:43

TikTok.

1:31:44

And now we're seeing the results of that.

1:31:46

And I think that that's really

1:31:49

unfortunate.

1:31:50

Exactly like Jordan,

1:31:52

one of our producers just said,

1:31:54

seems like they wanted to control the

1:31:55

algorithm.

1:31:55

Yeah,

1:31:56

that was pretty much the only goal with

1:31:59

all this TikTok stuff, which...

1:32:01

I don't know.

1:32:01

It shouldn't be in the control of any of

1:32:05

these governments,

1:32:05

you can definitely argue.

1:32:07

It wasn't great in China's hands either,

1:32:10

but we haven't improved the situation for

1:32:13

sure.

1:32:15

And this is, yeah,

1:32:16

just to agree with you, this is why...

1:32:19

I love the analogy from Carey Parker of

1:32:22

Firewalls Don't Stop Dragons.

1:32:23

He refers to personal data as like

1:32:25

radioactive waste.

1:32:26

And he's like,

1:32:27

you want as little of it as possible

1:32:29

because you can't handle it safely.

1:32:31

And the stakes are too high if something

1:32:32

goes wrong.

1:32:33

And it drives me insane that America has

1:32:35

this attitude of, like, well,

1:32:37

it's okay if

1:32:38

Facebook collects all this data.

1:32:40

Never mind that there's literally a

1:32:41

Wikipedia page full of their data breaches

1:32:43

and privacy scandals.

1:32:45

But, you know, it's like, oh,

1:32:46

it's okay when we do it,

1:32:47

but when China does it, it's bad.

1:32:49

And it's like, or...

1:32:50

We could just outlaw this entirely and

1:32:54

it'll stop being a problem.

1:32:55

I know this is really not the best

1:32:56

example,

1:32:57

but only because it's morally not okay.

1:33:00

But I remember after Epstein's death,

1:33:04

somebody did an investigation.

1:33:05

Some reporter pulled all the location

1:33:09

tracking data for all the cell phones that

1:33:11

went in and out of his private island.

1:33:13

And every single one of them that went

1:33:16

back to Europe,

1:33:16

as soon as they hit European airspace,

1:33:18

the tracking data disappeared because of

1:33:19

GDPR.

1:33:21

And it's like, okay, yes,

1:33:22

not a great example because it's not great

1:33:24

that bad people got away with bad things,

1:33:26

but that proves that GDPR works.

1:33:30

And it's like, why can't we do that?

1:33:32

Why can't we just get rid of the

1:33:33

data?

1:33:34

And then China can't use it either.

1:33:36

Nobody can use it because it's not there.

1:33:38

And for the record, yes,

1:33:39

I know there will always be espionage and

1:33:41

people who flout the rules,

1:33:43

but it'll drop so significantly.

1:33:45

And it would be at the very least a

1:33:46

huge step towards fixing the problem,

1:33:49

if not a perfect solution.

1:33:50

And it drives me insane.

1:33:53

Absolutely.

1:33:54

I think I don't want to bring up

1:33:56

this whole Epstein case and the

1:33:58

morality of that situation.

1:34:00

And like, obviously, you know,

1:34:02

could somebody argue that GDPR is not

1:34:05

really helping in that case?

1:34:06

Maybe.

1:34:06

I don't know.

1:34:07

But yeah.

1:34:08

I've talked about this.

1:34:10

I don't remember if it was in a post or another

1:34:13

video a while ago.

1:34:15

Basically, I think in the privacy space,

1:34:18

in the security space,

1:34:19

something we have to keep in mind is

1:34:21

that we see a lot of stories in

1:34:24

the news like that one that you just

1:34:26

talked about, for example,

1:34:27

where we're talking about criminal

1:34:29

activity and how either, you know,

1:34:32

they had an OPSEC failure and they were

1:34:35

caught because they weren't private

1:34:36

enough,

1:34:36

or how privacy laws are protecting

1:34:38

criminals.

1:34:38

You see both sides of this, right?

1:34:40

And that is the most abundant form of

1:34:44

coverage about privacy in general.

1:34:46

I think it's talking about how criminals

1:34:47

are impacted, because that's probably the

1:34:50

most newsworthy stuff.

1:34:52

But just like you were saying about how

1:34:55

it proves that GDPR is effective.

1:34:57

Did GDPR maybe hinder this one specific

1:35:00

case?

1:35:00

Yes.

1:35:00

But all of that data that was being

1:35:02

used,

1:35:03

is very commonly behind the scenes and

1:35:06

perfectly legally being used by all of

1:35:08

these big tech companies and all of these

1:35:10

other organizations to do all sorts of

1:35:11

things that aren't catching criminals,

1:35:13

like sell you ads or try and implement

1:35:16

algorithmic pricing or try to just

1:35:19

influence your opinions in general,

1:35:20

especially on social media.

1:35:22

And GDPR also helps prevent all of those

1:35:26

things.

1:35:26

But you don't hear about those stories in

1:35:28

the news because it's not newsworthy right

1:35:31

now, unfortunately.

1:35:33

You only hear about these criminal cases.

1:35:34

And so I just want to remind people,

1:35:37

because I think there is this association

1:35:41

between privacy rights and

1:35:44

internet freedoms and digital rights and

1:35:46

all this stuff and criminals.

1:35:47

It's like just because you see it in

1:35:50

the context of like all of these things

1:35:52

being proven in court cases or in criminal

1:35:55

trials or in like law enforcement

1:35:58

investigations,

1:35:59

that doesn't mean it's the only place it's

1:36:00

happening.

1:36:01

It just means that's the only place that

1:36:02

the mainstream news media wants to write

1:36:06

about it.

1:36:08

But you can look at all of these

1:36:10

cases and you can extrapolate

1:36:13

into like regular everyday life,

1:36:15

how people can improve their privacy.

1:36:18

You can learn from the OPSEC mistakes

1:36:20

of these criminals,

1:36:21

but also how these laws can impact and

1:36:24

protect you in other situations that

1:36:26

aren't

1:36:28

related to crime, right? GDPR protecting all

1:36:31

of that data certainly hinders all of

1:36:34

these bad things that I just talked about

1:36:35

happening, even if it's not widely

1:36:39

reported on. And so I always just want

1:36:41

to make that reminder, because when we talk

1:36:43

about criminal activity a lot, I think that

1:36:45

always comes up. It's like, why are you

1:36:46

just defending criminals? And that's not

1:36:48

the case, but they just have

1:36:51

the best cases to learn from.

1:36:54

Yeah, it's like that whole,

1:36:56

like you were saying,

1:36:57

news by definition is out of the ordinary.

1:37:02

Like we don't talk about how, you know,

1:37:04

the ten thousand people today used GrapheneOS

1:37:07

and went to work and went home and

1:37:09

were completely normal, law-abiding

1:37:11

citizens.

1:37:12

It's, you know, when it's, oh,

1:37:13

this guy was arrested and he was using

1:37:14

this weird phone that erases itself.

1:37:17

And it's like, OK, cool.

1:37:19

Yeah.

1:37:19

Like, obviously, that's interesting,

1:37:21

but that's not reflective of reality.

1:37:22

Yeah.

1:37:23

Yeah.

1:37:25

Let's get to our last few questions here.

1:37:27

Here's one from DQ.

1:37:30

Sorry, I think I clicked on one.

1:37:32

Sorry, you can do yours first then.

1:37:33

That's fine.

1:37:35

Okay, sorry.

1:37:35

Real quick.

1:37:36

Yeah, Dread Pirate Roberts said,

1:37:37

as more and more services block VPNs,

1:37:39

are there any solutions to be able to

1:37:40

have privacy without being blocked?

1:37:42

Again, VPNs aren't everything,

1:37:46

but I think we will see like,

1:37:47

I know Proton and I think Mullvad and

1:37:50

I think a couple others also do.

1:37:51

They have obfuscation to try and make it

1:37:53

so you can use it and it won't

1:37:55

be blocked.

1:37:55

And I know when India started requiring

1:37:58

VPNs to keep logs,

1:37:59

they did some kind of trickery where they

1:38:00

were able to move servers out of the

1:38:02

country,

1:38:03

but somehow address them in a way where

1:38:05

they looked like they were in the country.

1:38:06

So basically Indian users could still use

1:38:09

Proton and be in India,

1:38:10

quote unquote in India,

1:38:12

but Proton wasn't in India.

1:38:13

So they didn't have to comply with the

1:38:14

laws.

1:38:15

I don't know.

1:38:15

That's way over my head, but yeah,

1:38:18

I think we'll still see, and you know,

1:38:20

we'll still have things like Tor now

1:38:22

until they outlaw that too.

1:38:24

I mean, I,

1:38:25

It's a cat and mouse.

1:38:26

I think we'll have options.

1:38:27

But yeah,

1:38:28

it will definitely get harder and be more

1:38:30

difficult.

1:38:32

Just to reply to the residential IP aspect

1:38:34

of this really quick,

1:38:35

I want to say you're correct that the

1:38:38

residential IP and proxy space is very

1:38:41

shady.

1:38:42

I definitely wouldn't support it even if

1:38:44

it does work because a lot of these

1:38:48

residential IP proxy brokers,

1:38:50

they are basically running criminal

1:38:53

organizations and they are

1:38:55

tricking people into installing malware on

1:38:58

their computers or tricking people into

1:39:00

buying these cheap

1:39:02

Android TV boxes on Amazon and

1:39:05

other marketplaces to connect to their

1:39:06

networks. And that's how they get all these

1:39:08

residential IPs, right? And funding those

1:39:11

operations puts a lot of regular

1:39:14

people in danger, because law

1:39:15

enforcement goes after those people all

1:39:17

the time, because they're hosting

1:39:19

basically an exit node for a VPN that's

1:39:21

handling all sorts of crazy traffic, right?

1:39:23

And it's

1:39:26

not an ideal situation for anyone

1:39:29

involved,

1:39:29

so it's definitely not something that I

1:39:31

would pursue personally if I were you.

1:39:35

I would avoid the whole residential IP

1:39:37

space because it's pretty much all malware

1:39:41

that's driving that,

1:39:43

and that's not something that should be

1:39:44

really supported, I think.

1:39:47

Real quick on a personal note,

1:39:48

Dread Pirate Roberts said there's a

1:39:51

documentary from Vice that shows the

1:39:52

facial recognition capabilities in China

1:39:54

six years ago,

1:39:55

and the people I've shown it to have

1:39:56

been very receptive.

1:39:57

Please send that my way because I want

1:39:59

to watch that.

1:39:59

That sounds cool.

1:40:00

All right, what's the next one?

1:40:03

I think you had a question lined up.

1:40:04

I think this will be our last question

1:40:06

of the show here,

1:40:08

but this is from DQ.

1:40:10

They asked,

1:40:10

have you come across the OPSEC Bible by

1:40:13

Nihilist?

1:40:15

First of all, just stopping there,

1:40:16

have you heard of this?

1:40:17

Because I actually have not,

1:40:18

unfortunately.

1:40:20

I'm not sure if you're familiar.

1:40:20

I don't think so.

1:40:22

I want to say it sounds familiar,

1:40:24

but I could be making that up.

1:40:26

If I've heard of it,

1:40:27

I've definitely never read it.

1:40:28

Okay,

1:40:29

that's definitely something I will have to

1:40:30

look into.

1:40:31

But continuing your message,

1:40:32

I'd love to hear your thoughts on its

1:40:33

extreme all-or-nothing privacy philosophy,

1:40:35

especially since the guide criticizes the

1:40:37

closed-source recommendations that appear

1:40:39

on Privacy Guides.

1:40:39

It seems to push a very different approach

1:40:41

from the more mainstream privacy advice

1:40:43

you usually promote.

1:40:46

And just based on that...

1:40:50

That is a pretty common thing that we

1:40:51

see with a lot of privacy guides out

1:40:53

there,

1:40:53

especially ones that are published

1:40:56

anonymously or are catered towards a more

1:41:00

hardcore audience.

1:41:02

It's definitely a different audience than

1:41:03

we're going for.

1:41:06

I think that the biggest thing that we

1:41:07

try to do at Privacy Guides is try

1:41:10

to find all of these tools in different

1:41:11

categories that can really

1:41:14

raise the bar for privacy as a whole.

1:41:17

We can't solve every problem at once.

1:41:19

And I think this ties into a lot

1:41:20

of the things that we were talking about

1:41:24

earlier in the show as far as convincing

1:41:26

people to switch from WhatsApp to Signal,

1:41:28

for example.

1:41:31

People are using

1:41:34

things that are crazy for your privacy and

1:41:36

extremely privacy invasive.

1:41:37

People are using Windows, for example,

1:41:40

which I think is not something people

1:41:42

should be doing in twenty twenty six.

1:41:44

Like that's the state that most people who

1:41:48

haven't heard of any of this

1:41:49

are at right now.

1:41:51

And so switching... I mean,

1:41:53

even switching from Windows to macOS is

1:41:56

not

1:41:57

ideal. And it's not like, you know, if

1:41:59

somebody comes up to me and asks, what's

1:42:00

the most private operating system, macOS

1:42:03

is far behind what the actual better

1:42:07

ones are, by a wide margin. But

1:42:09

compared to what people are coming from

1:42:12

which is Windows in this case,

1:42:15

it's a huge advantage and people are more

1:42:18

apt to switch to it.

1:42:19

And I think that encouraging people to

1:42:22

switch to, in some cases,

1:42:26

some proprietary systems over time is

1:42:31

better than the situation that I think

1:42:34

privacy guides like the OPSEC Bible, in

1:42:37

this case, probably...

1:42:40

I personally think the outcome of a guide

1:42:42

like that,

1:42:42

if I put it in the hands of

1:42:43

a normal person,

1:42:44

is that they will not follow any of

1:42:46

the advice.

1:42:47

Because we see this even in our forum,

1:42:50

but definitely less so recently.

1:42:53

And we've made some changes to improve

1:42:57

this.

1:42:58

But it's a very common complaint, I think,

1:43:00

in the privacy community where people feel

1:43:03

burned out because they...

1:43:05

need to switch all of these things

1:43:07

all at once, and they feel the need

1:43:09

to completely cut off less

1:43:13

private alternatives or things that their

1:43:14

friends are using, and they feel socially

1:43:16

isolated. And that's not really the goal of

1:43:18

being private. Like, privacy, I think,

1:43:23

it gives...

1:43:24

it's a right that you should have,

1:43:25

you should be able to exercise this,

1:43:26

but it's not like you need to be

1:43:29

completely private in all aspects of your

1:43:31

life.

1:43:31

Some people still need to,

1:43:33

everyone still needs to have a social life

1:43:34

and interact with other people and that

1:43:36

sort of thing.

1:43:37

And yeah, at the end of the day,

1:43:40

when we recommend something like

1:43:42

1Password, for example,

1:43:44

it's because we've looked at that and

1:43:47

we've decided that compared to what other

1:43:50

people are using,

1:43:50

which is either no password manager at all

1:43:52

or something like LastPass,

1:43:54

which notoriously has a ton of data

1:43:57

breaches and security issues,

1:43:59

solid proprietary tools that respect your

1:44:01

privacy relatively well are acceptable to

1:44:05

us.

1:44:05

And we would rather people switch to that

1:44:07

than not follow the advice at all.

1:44:09

And for people who are looking for more

1:44:11

advanced or more customized

1:44:13

recommendations or for any of that,

1:44:14

I think we have the forum, which is

1:44:16

going to be able to answer those sorts

1:44:19

of questions for people who have moved

1:44:21

beyond the general advice that we have on

1:44:24

our site and who don't need the approach

1:44:27

that we take for the general population

1:44:29

where we try to balance privacy, security,

1:44:32

and user experience.

1:44:33

And you can really hone in on a

1:44:35

good situation for you

1:44:37

through these discussions.

1:44:38

And I think that that's the value of

1:44:40

the Privacy Guides community forum that

1:44:44

none of these guides are going to be

1:44:45

able to provide.

1:44:46

Because at the end of the day,

1:44:48

all of this tailored advice is going to

1:44:51

be better in general than any of these

1:44:55

guides, to be honest.

1:44:57

So those are my thoughts on that.

1:45:01

Yeah, you kind of said what I'm thinking,

1:45:05

so I'll keep this quick.

1:45:06

But I think in addition to – y'all

1:45:11

are going to get tired of hearing me

1:45:12

say the words harm reduction.

1:45:13

In addition to the harm reduction mindset,

1:45:15

which I'm a huge, huge fan of,

1:45:17

I think there's also the idea that two

1:45:19

things can be real.

1:45:20

I really don't like the idea –

1:45:22

or I don't like the narrative that some

1:45:24

people push where it's like,

1:45:25

if you're not doing privacy my way and

1:45:27

you're not going a hundred percent,

1:45:28

then you're wrong.

1:45:29

Because the fact of the matter is they're

1:45:31

wrong too.

1:45:32

Like the only way to really be private

1:45:33

is to just like throw away your computer,

1:45:35

never get on the internet,

1:45:36

go live in a cabin in the woods,

1:45:38

which, I would like to reiterate,

1:45:39

is not foolproof either, because they

1:45:41

did find Ted Kaczynski.

1:45:42

So yeah,

1:45:44

I don't know.

1:45:45

I really reject that whole extreme all or

1:45:48

nothing.

1:45:49

This is the only way to do it.

1:45:50

I think it's really arrogant.

1:45:51

I think it's really disrespectful.

1:45:54

Again, I want to reiterate,

1:45:54

I haven't read this Bible,

1:45:55

so I'm not passing judgment on Nihilist.

1:45:58

I'm just speaking in general.

1:46:00

I think when people do that,

1:46:01

it's really...

1:46:04

I don't know, like Jonah was saying,

1:46:05

like people have,

1:46:06

I've seen people look at certain guides

1:46:08

and websites and just be like, yeah,

1:46:09

I'm not doing, and straight up say that,

1:46:11

like, I'm not doing that.

1:46:12

And I would rather people make small steps

1:46:15

that do something, rather than nothing.

1:46:18

And I think some people,

1:46:20

not all of them,

1:46:20

but I think some people will take those

1:46:22

small steps and go, oh,

1:46:24

that wasn't so bad.

1:46:25

That was actually kind of fun.

1:46:27

What else can I do?

1:46:28

And they'll go above and beyond.

1:46:29

Like, I don't need to be using Qubes.

1:46:30

That is not part of my threat model.

1:46:32

I like it.

1:46:33

I think it's fun.

1:46:35

So,

1:46:36

and I think kind of going back to

1:46:38

what I said about like two things can

1:46:39

be real.

1:46:40

I think it's great that there are these

1:46:42

really hardcore guides

1:46:44

For the people who want to be hardcore

1:46:46

or even like when Michael Bazzell was doing

1:46:49

his podcast, I would listen all the time.

1:46:51

And I still read his books,

1:46:52

his Extreme Privacy books,

1:46:54

because I like the thought experiment.

1:46:57

That's what I'm looking for.

1:46:57

I like the thought experiment.

1:46:58

I like the like knowing how deep the

1:47:00

rabbit hole goes and just knowing what the

1:47:02

options are.

1:47:03

Even though ninety percent of the time I

1:47:05

walked away going,

1:47:06

I'm not going to do any of that.

1:47:07

But it's really cool to know that that's

1:47:09

a thing, and it's really interesting,

1:47:10

and it's fun to learn about.

1:47:12

And, you know, some people would do it,

1:47:14

and there were some things that I would

1:47:15

listen to and be like, oh,

1:47:16

I think I might want to try that,

1:47:17

actually.

1:47:18

So I don't think it's a bad thing

1:47:19

that this stuff is out there.

1:47:20

I think it's really cool,

1:47:22

as long as they're not adopting that

1:47:23

attitude of like, well,

1:47:24

this person's wrong.

1:47:26

I mean, unless somebody's actually wrong,

1:47:27

then like, hey, please,

1:47:28

if you think we're wrong,

1:47:29

open a thing on the forum.

1:47:31

Like, let us know.

1:47:32

But...

1:47:33

It's, you know,

1:47:34

it's respecting that there's different

1:47:35

priorities,

1:47:36

there's different threat models,

1:47:37

there's different resources.

1:47:39

Like, you know,

1:47:39

somebody posted in the forum recently

1:47:41

talking about how they disagree with our

1:47:42

Android recommendations because not

1:47:44

everybody

1:47:45

lives in a country where they can get

1:47:46

a Pixel, or not everybody can afford one.

1:47:48

And, you know,

1:47:49

that's true of these more extreme privacy

1:47:51

things too.

1:47:51

So, yeah.

1:47:53

Yeah.

1:47:54

Again, haven't read it,

1:47:55

but if he's coming at it from the

1:47:57

perspective of like,

1:47:58

this is how I think you can get

1:47:59

the maximum level of privacy, then great.

1:48:02

I think that's really cool that there are

1:48:03

those guides,

1:48:04

but I don't think that invalidates things

1:48:06

like privacy guides where we say,

1:48:08

this is probably good enough for most

1:48:09

people.

1:48:10

And I think both of those things can

1:48:12

exist.

1:48:13

And DQ, thanks for sharing in the chat.

1:48:16

I'll link to this.

1:48:18

I want to reiterate,

1:48:20

nothing that I was saying before is any

1:48:21

judgment of this guide in particular,

1:48:23

because again, I haven't read it.

1:48:24

Neither of us have read it.

1:48:25

I was going to say,

1:48:26

he might have been talking to me.

1:48:27

It certainly could have...

1:48:31

Good advice.

1:48:32

Right.

1:48:32

And I'll definitely check it out.

1:48:33

So thanks again for sharing.

1:48:35

I just want to say, that's

1:48:38

that's just my experience based on other

1:48:39

guides and based on, like,

1:48:41

how you described it.

1:48:43

Um,

1:48:43

I've definitely seen guides like that

1:48:44

where yes,

1:48:47

it's probably not the target audience

1:48:50

that we are trying to go for.

1:48:52

Um,

1:48:52

I kind of have a philosophy that like

1:48:55

being accessible and also

1:49:00

sort of being like a more public face

1:49:02

when it comes to all of this,

1:49:03

like obviously I do this under my name,

1:49:06

for example,

1:49:07

and not under a pseudonym like this.

1:49:08

I think that that is a difference in

1:49:09

approach and it reaches different people.

1:49:11

And I think that guides like that and

1:49:14

projects like Privacy Guides both serve

1:49:18

their own purpose.

1:49:19

But yeah,

1:49:22

for anything more than like just the basic

1:49:24

stuff that we have on our site,

1:49:25

that is the point of our forum.

1:49:28

Because, yeah,

1:49:29

I really don't know if any of these

1:49:30

guides can really be everything for

1:49:33

everyone, right?

1:49:34

But I'm sure for a certain group of

1:49:36

people, that guide could be very good.

1:49:39

And I will definitely take a look at

1:49:40

it because I like to read other guides

1:49:43

out there.

1:49:46

Yeah,

1:49:46

I perused some of the articles on the

1:49:48

website.

1:49:48

I didn't go straight to the Bible.

1:49:49

I went to the Root website.

1:49:51

Some of it is pretty extreme,

1:49:52

like which countries don't have

1:49:53

extradition laws.

1:49:55

Which, no offense to this guy,

1:49:57

but if that's my threat model,

1:49:58

I'm not going to trust a random website

1:49:59

on the internet.

1:50:00

I'm going to talk to an actual lawyer.

1:50:03

But then others were like,

1:50:05

how to get started with I2P,

1:50:06

which I don't really have strong opinions

1:50:11

on I2P, but I don't know.

1:50:14

I'll peruse it.

1:50:16

I'll check it out later this weekend.

1:50:18

Some light reading for the weekend.

1:50:19

Yes.

1:50:20

Yes.

1:50:22

Well, Nate,

1:50:23

I think this probably about wraps things

1:50:25

up here.

1:50:27

As a quick reminder to everyone,

1:50:28

Privacy Guides is a nonprofit.

1:50:29

We're dedicated to protecting our digital

1:50:32

rights.

1:50:32

If you want to support the show and

1:50:34

our mission,

1:50:35

a donation at privacyguides.org would be

1:50:37

much appreciated.

1:50:39

I want to thank Nate for joining me

1:50:42

this week.

1:50:44

Before we wrap up this broadcast here,

1:50:47

I want to deliver a quick message as

1:50:52

the program director of Privacy Guides

1:50:53

about the current state of the United

1:50:55

States of America.

1:51:00

As a Minnesotan and a resident of the

1:51:03

city of Minneapolis myself,

1:51:04

this is a very important issue to me.

1:51:06

We're only one month into twenty twenty

1:51:09

six right now.

1:51:10

And already this year,

1:51:11

ICE agents of the federal government of

1:51:14

the United States are responsible for the

1:51:16

extrajudicial killings of two American

1:51:18

citizens right here in my city for

1:51:21

exercising their constitutional rights.

1:51:24

This happened as part of a larger ICE

1:51:26

campaign to terrorize my neighbors and

1:51:29

this country,

1:51:30

which is a campaign that I know many

1:51:32

Minnesotans protested in force last week,

1:51:34

and I know many American patriots are

1:51:37

protesting today.

1:51:39

Our mission at Privacy Guides has always

1:51:41

been to support the right of privacy for

1:51:44

all people,

1:51:45

regardless of political views or the

1:51:47

country that people live in.

1:51:49

It's also our mission to speak out against

1:51:51

government overreach,

1:51:53

particularly when it comes to surveillance

1:51:55

and especially when government agencies

1:51:57

are being pitted against the very

1:52:00

taxpayers and citizens that they are meant

1:52:02

to protect and serve.

1:52:04

Here in the United States,

1:52:05

that's meant recently speaking out against

1:52:08

the Democrats who aim to increase

1:52:09

surveillance and censorship through bills

1:52:11

like KOSA or the planned repeals of

1:52:13

Section two thirty.

1:52:14

But now against Republicans, certainly, in

1:52:17

our government,

1:52:18

who are weaponizing the state surveillance

1:52:21

systems and law enforcement bodies like

1:52:23

ICE to target their perceived political

1:52:25

enemies and immigrant members of our

1:52:27

communities,

1:52:28

without respect to their legal residency

1:52:31

status or any due process.

1:52:34

This weaponization of ICE by the Trump

1:52:36

administration is not happening in a

1:52:38

vacuum. It's fueled by the very

1:52:41

surveillance data and the lack of digital

1:52:43

boundaries that we have been fighting

1:52:45

against for years: laws which were enacted

1:52:48

within my lifetime, like the Patriot Act,

1:52:51

and loopholes like the continued lack of

1:52:53

regulations against commercial data

1:52:55

brokers, which allow the government to

1:52:57

bypass the Fourth Amendment entirely by

1:53:00

purchasing our own GPS and social media

1:53:03

data from tech companies to map out our

1:53:05

neighborhoods for raids.

1:53:07

Minneapolis has also become the testing

1:53:09

ground for invasive and inaccurate facial

1:53:12

recognition apps like Mobile Fortify,

1:53:14

where AI glitches,

1:53:15

just like we talked about in this episode,

1:53:17

can lead to unlawful detentions of

1:53:20

innocent people and the sort of

1:53:22

state-sponsored surveillance that took the

1:53:24

lives of Renee Nicole Goode and Alex

1:53:27

Peretti.

1:53:29

In times of overreach,

1:53:30

our greatest defense is our community and

1:53:33

our refusal to be intimidated into

1:53:35

silence.

1:53:35

And I've seen how powerful this can

1:53:37

be firsthand.

1:53:39

The reality is that how ICE is operating

1:53:42

within the borders of the US is

1:53:44

unjustifiable.

1:53:45

So we here recognize the significance of

1:53:49

this unprecedented situation,

1:53:51

and we stand alongside everyone who's

1:53:53

protesting in support of the protection of

1:53:55

our neighbors and for American rights,

1:53:57

which is something that I think all

1:54:00

Americans should support.

1:54:03

Thank you all for tuning in.

1:54:05

I hope you all have an excellent weekend.