The US Bans New Foreign Made Routers?!
Ep. 46

The US Bans New Foreign Made Routers?!

Episode description

The US’s Federal Communications Commission has banned all foreign-made routers, stating they are not secure for use. Walmart digital price labels are coming to every store shelf in the US, systemd has added an age verification field, and more. Join us for This Week In Privacy #46!

Download transcript (.vtt)
0:03

The U.S.

0:03

government has banned all foreign-made

0:05

consumer routers,

0:07

systemd's new age verification feature,

0:09

and the Meta and Google social media

0:11

addiction lawsuit.

0:13

All this and more coming up on This

0:14

Week in Privacy, number forty-six,

0:16

so stay tuned.

0:38

Welcome back to This Week in Privacy,

0:40

our weekly series where we discuss the

0:42

latest updates on what we're working on

0:44

within the Privacy Guides community,

0:46

and this week's top stories in data

0:49

privacy and cybersecurity.

0:51

I'm Jonah,

0:51

and with me this week is Nate.

0:54

How are you doing, Nate?

0:56

I'm doing very well.

0:58

Busy week behind the scenes here,

1:00

but very excited.

1:00

Good stuff.

1:01

How have you been?

1:03

I'm doing fantastic.

1:04

I'm excited to be back on the show.

1:08

Now we'll start off with the biggest news

1:10

that we've seen in privacy and security

1:12

over the past week.

1:14

Our first story today is reported by The

1:16

Verge.

1:17

The US government just banned consumer

1:19

routers made outside the US.

1:22

The US claims foreign-made routers pose

1:25

national security risks.

1:28

So, to give some context.

1:29

In December,

1:30

the Federal Communications Commission

1:31

banned all future drones made in foreign

1:34

countries from being imported into the

1:36

United States unless or until their maker

1:38

gets an exception.

1:40

Now the FCC has done the exact same

1:41

for consumer networking gear, citing,

1:43

quote,

1:44

an unacceptable risk to the national

1:47

security of the United States and to the

1:49

safety and security of U.S.

1:51

persons.

1:53

So as this article says,

1:54

we did see this happening with DJI,

1:57

who opted to just not sell new drones

2:00

in the United States rather than try to

2:01

comply with this.

2:04

And now a similar thing is happening here.

2:05

As The Verge points out,

2:07

the vast majority, if not all,

2:08

consumer routers are currently

2:10

manufactured outside the United States,

2:12

and the vast majority of future consumer

2:14

routers are now banned.

2:16

By adding all foreign made consumer

2:18

routers to its covered list,

2:19

the FCC is saying it will no longer

2:21

authorize their radios,

2:22

which de facto bans new devices from

2:24

import into the country.

2:29

So this is an interesting ban,

2:32

to say the least.

2:35

It doesn't seem to be like a lot

2:38

of other things that are banned,

2:39

because as this article points out,

2:42

domestic router manufacturing is pretty

2:44

much not a thing.

2:47

I have a couple of questions about this

2:49

and actually posted this on Mastodon.

2:51

But my biggest question is kind of like

2:53

how this relates to...

2:57

They're differentiating between consumer

3:01

routers and other routers.

3:03

I've seen the FCC,

3:05

their definition of residential routers,

3:07

which basically says that it's all routers

3:11

that are intended to be used in a

3:12

residential setting and can be installed

3:16

by the end user,

3:18

which seems to me like it would

3:22

not affect something like the router that

3:24

your ISP provides.

3:25

So I think that this ban could certainly

3:28

mean that we're all going to be stuck

3:30

with these probably far more insecure

3:34

trash routers that Verizon or Comcast or

3:37

whoever provides you rather than you being

3:38

able to replace it with your own.

3:44

But yeah, it's crazy stuff.

3:46

Did you see anything in this article that

3:48

you wanted to point out, Nate?

3:51

Oh, yes.

3:52

A couple of things.

3:53

Well, specifically,

3:54

I wanted to point out that according to

3:56

the article, well,

3:57

according to the FCC as well,

4:00

this is about national security, right?

4:02

And they specifically mentioned the Volt

4:05

Typhoon, the Salt Typhoon,

4:07

and the Flax Typhoon,

4:08

which that one I'm not familiar with.

4:10

But they cited those cyber attacks,

4:14

which targeted critical American

4:15

communications, energy, transportation,

4:17

and water infrastructure.

4:19

But the thing they don't mention that I

4:21

thought was interesting is that Salt

4:24

Typhoon happened because of a law

4:27

enforcement backdoor in our

4:29

telecommunications infrastructure.

4:31

And I think the article pointed out here

4:33

that Volt Typhoon happened because

4:35

American-made routers,

4:37

they specifically – yeah,

4:38

Cisco and Netgear mostly –

4:40

were just not kept up to date.

4:42

So it's like the flimsiest, um,

4:44

like I'm trying to think of an example.

4:46

It's,

4:47

I guess it's like that classic joke that,

4:49

you know, Oh, I don't drink water.

4:50

Do you know how many people that kills

4:51

every single year?

4:52

And it's like, that's not really related,

4:54

but okay, sure.

4:55

Go off, I guess.

4:56

Oh,

4:58

So one thing I thought I read here

5:00

that maybe you know more about is,

5:04

is this a ban on the routers or

5:05

is this a ban?

5:06

I think the article said something about

5:07

it being on like the radio chips,

5:09

which kind of makes it even worse because

5:10

I know there are,

5:12

there's a few in my non-expert opinion,

5:15

there's a few decent American

5:17

manufacturers like Netgear, for example.

5:19

But if the chip itself is the part

5:23

that's on the covered list,

5:24

then how are they supposed to produce

5:25

these?

5:26

without getting a chip? Did you read

5:28

anything about that? Am I misremembering

5:29

that? So, I mean, typically each individual

5:34

product is going to need to be approved

5:36

by the FCC. So they

5:39

would approve, like, the entire

5:41

product, and the only thing that I've seen

5:43

is that they're not going to approve

5:47

consumer residential router products.

5:51

So this is not going to affect business

5:53

routers.

5:54

As far as I know,

5:55

it wouldn't affect the chips if they're

5:58

used in a non-consumer router.

6:00

So like one of my questions is,

6:04

there's certainly an interesting line in

6:05

the router space once you get to the

6:07

higher end between like

6:09

residential routers from like NetGate or

6:12

Linksys or whoever,

6:13

and then like more prosumer routers like

6:15

Ubiquiti.

6:16

And then you get into like enterprise

6:18

routers, which as far as I know,

6:19

are not affected.

6:20

I don't know where something like in that

6:22

prosumer middle ground is going to fall.

6:25

But usually like even on the enterprise

6:27

side of things,

6:28

if you're looking at the the actual chips

6:30

involved there,

6:34

they're similar,

6:35

if not the same to what's in a

6:36

lot of routers,

6:37

just because there aren't like a ton of

6:40

options for chips.

6:41

So as far as I know,

6:44

individual components shouldn't be

6:46

impacted,

6:46

which makes this all the more

6:50

interesting because I don't know what

6:52

they're exactly trying to defend against

6:54

here.

6:56

I would imagine the bigger issue that they

6:59

would say that they have is more to

7:00

do with the router firmware and how it's

7:03

deployed.

7:06

But as you pointed out and as the

7:08

article said,

7:09

the most recent big attacks on routers

7:12

have been against major American ones and

7:18

enterprise ones that are typically more

7:20

powerful, enterprise firewalls.

7:22

So something from the likes of Cisco or

7:26

FortiGate or whoever are the most recent

7:30

major attacks lately.

7:32

Whereas consumer-grade routers,

7:36

certainly have security issues, like, don't

7:38

get me wrong, but I don't think they

7:40

warrant a total ban like this.

7:45

Kind of similar to the drone thing,

7:49

their goals aren't exactly clear

7:51

to me, but it seems like they

7:55

really want

7:57

these manufacturers to just cut some kind

7:59

of deal to get approved rather than just

8:03

being approved because they made a product

8:07

that people need.

8:08

So yeah,

8:11

it's kind of like all of the tariff

8:14

stuff lately and the other trade bans that

8:16

have been going on in the US.

8:18

I think it's going to be a big

8:19

challenge for American consumers right

8:21

now.

8:25

Manufacturing capacity for these routers

8:28

certainly cannot shift to the US at a

8:31

moment's notice.

8:32

I mean,

8:32

it would take years for this to even

8:34

be a possibility.

8:36

So in the meantime, it seems not great.

8:39

And I guess we'll see how these router

8:42

companies respond.

8:43

I haven't actually looked this week to see

8:45

if any of them have made a statement.

8:47

I would be interested to know how many

8:48

are going to take DJI's approach to just

8:51

exit the US market versus how many people

8:53

are going to try to comply with this.

8:56

But pretty much the entire router industry

8:58

is going to be impacted by this.

8:59

So it's crazy stuff.

9:03

Yeah, nothing has come across my feed.

9:05

I haven't specifically gone looking in

9:07

terms of if anybody's made a statement.

9:09

Another thought that occurs to me.

9:12

I'm assuming... okay, I have a really

9:15

stupid question here. I mean, there are

9:17

enterprise-level Wi-Fi routers, right? Most

9:19

of my work is done with, like, hardwired

9:21

switches. I haven't done a whole

9:22

lot with routers or, like, Wi-Fi. So yeah,

9:26

definitely. I mean, there's, like, Ubiquiti,

9:29

for example. You see that installed in

9:30

small and medium businesses.

9:33

A lot of the time,

9:33

these enterprise things are split up into

9:35

multiple components.

9:36

They'll have an access point and a router,

9:39

and those will be separate things,

9:41

which is the case for most of Ubiquiti's

9:44

products.

9:46

It's also the case for something like

9:48

Aruba or Cisco.

9:50

They both make access points.

9:52

There's other...

9:56

There are other manufacturers, like MikroTik.

9:58

I can't remember other big enterprise

10:00

ones, but there's certainly a lot of them,

10:02

which in theory should not be impacted,

10:06

but I guess it depends on how widely

10:08

the FCC decides to define all of these

10:12

products.

10:15

And then, okay,

10:16

so the other stupid question here is

10:19

there's kind of a big price gap,

10:20

isn't there,

10:21

between a consumer-level router and an

10:23

enterprise one?

10:24

How big of a price gap are we

10:26

talking?

10:27

Yeah, so that's where it really depends on

10:29

the product. I think most of these

10:31

enterprise routers, or, like,

10:35

the entire system, it's always going to be

10:36

more expensive because you have to buy the

10:37

router and the access points separately.

10:40

So there's that whole aspect. But of course,

10:42

on the router side of things, you can

10:44

set up, like, an old computer or something

10:46

and use some open source software like

10:48

OPNsense or pfSense. So you have that

10:51

option. And then the access points are

10:55

generally cheaper. You probably only need

10:56

one to cover a house, realistically. So it's

10:59

possible, especially with

11:04

some products, like either from Ubiquiti or

11:06

MikroTik. I know that they make access

11:09

points that are probably readily

11:10

accessible. Some of the more enterprise

11:12

stuff, like from Aruba or maybe

11:17

Cisco or other companies, they're gonna

11:20

require like

11:21

a whole subscription service for

11:23

management and all of this stuff.

11:24

So once you get into the real enterprise

11:26

side of things like that would be

11:27

extremely hard to do from your house,

11:29

but it really depends on the manufacturer.

11:31

But there are some lower end ones where

11:35

you could see that being possible.

11:37

But I don't know if the FCC is

11:40

going to extend that to pretty much

11:42

anything that normal residential consumers

11:45

will buy or whether it'll just be like

11:47

things that are marketed towards

11:50

consumers.

11:52

One of the questions that I had on

11:55

Mastodon was whether we're going to see an

11:57

uptick in

11:58

small business or home business routers

12:01

that say, like, "not for residential use" on

12:03

them. Because I know we've seen, in

12:06

other areas that the government regulates

12:08

like all sorts of crazy drugs and

12:11

peptides, for example. I was just thinking

12:13

about research drugs that are not approved

12:15

by the FDA. They're not for human

12:17

consumption, but of course they get sold

12:21

to random people anyways,

12:23

and you can find plenty of threads on

12:25

Reddit and other sites that indicate they

12:28

might not be following all of the labels

12:31

on these products.

12:33

So I don't know if that would be

12:34

the case here,

12:35

but I would be interested to see if

12:38

that's the case.

12:40

Yeah, I don't want to get too political,

12:44

but that is a thought with what you

12:45

were saying about,

12:47

I think regardless of whether you're

12:49

pro the current administration or not,

12:51

this whole idea of like bringing

12:53

manufacturing back to the U.S. again,

12:55

whether you think that's a good idea or

12:56

not, it can't happen overnight.

12:58

And so this like out of the blue,

13:00

like, okay, all these routers are banned.

13:01

It's like, dude,

13:03

it's going to take us five years at

13:05

best, probably even more than that.

13:06

That's probably like delusionally

13:08

optimistic to get the manufacturing done

13:10

back over here.

13:11

And then to get the supply chain

13:13

in place and the infrastructure,

13:16

it's like...

13:17

Yeah, it's crazy.

13:19

So I'm hoping and again,

13:21

not to be too political,

13:22

but we have seen I feel like we've

13:24

seen the current administration do things

13:26

like this where like they'll ban something

13:28

or they'll institute tariffs and then

13:30

they'll kind of start to like make

13:31

exceptions.

13:32

And OK, except for this guy.

13:33

And but here's a workaround.

13:34

And and I think it's because once they

13:36

do it, they realize like, oh,

13:37

wait a minute,

13:38

that can't happen that fast.

13:39

So I'm hoping we'll see something similar

13:42

here, to be honest.

13:44

Yeah,

13:44

I don't see any other way around it.

13:45

I think they're going to have to

13:46

personally.

13:47

The worst case scenario, I think,

13:49

for this whole thing from a privacy

13:50

perspective is that the available options

13:54

that will be left on the market,

13:55

I think it's going to be much easier

13:56

to track because...

14:00

The specific definition of residential

14:02

consumer routers that they're using here

14:04

does say it's routers that are intended to

14:07

be installed by the end user.

14:09

So presumably something that your ISP

14:11

installed would not be affected.

14:13

So like I said before,

14:14

I think a lot of ISPs will be

14:15

installing their own routers.

14:17

And those are pretty well known to track

14:21

a ton of information.

14:22

It's one of the main reasons,

14:24

aside from the poor performance and other

14:26

things.

14:26

There's a lot of reasons that people will

14:28

replace their ISP-provided router,

14:31

but tracking and privacy concerns are

14:33

certainly one of them.

14:36

And this could also, I think,

14:39

get more people to switch to cellular

14:42

connections or use their cell phones more

14:43

because you don't need a router at all.

14:45

But of course,

14:46

the whole cell network system is more

14:49

problematic for surveillance and privacy

14:51

as well compared to these hardline

14:54

internet connections.

14:57

Yeah,

14:57

I think we'll have to see how this

14:59

resolves,

15:00

but there's a lot of potential privacy

15:01

concerns here.

15:04

Jordan asked in the chat whether there's

15:06

any U.S.

15:06

manufactured routers.

15:08

Not as far as I know is the

15:10

answer to that.

15:11

I think that there are

15:14

There are certainly some routers.

15:16

Yeah, they're like U.S.

15:17

designed.

15:19

So there are American companies that are

15:21

making routers,

15:22

but the manufacturing in the U.S.,

15:26

it's non-existent.

15:28

And even, like, companies and end products

15:32

that are making wireless chips on the

15:33

other side.

15:35

Like we've seen Apple get into this with

15:38

their latest products where they're now...

15:41

creating their own modems.

15:43

But all of that is obviously not

15:44

manufactured in the US.

15:45

It's just designed here.

15:47

And I think that a part of the

15:50

concern that they would have is whether

15:52

the back doors could be inserted into

15:54

these chips during the manufacturing

15:55

process that the designers wouldn't know

15:57

about.

15:57

So I don't know how this is going

15:59

to impact American companies.

16:02

But yeah, it'll be interesting to see.

16:07

Possibly stupid question.

16:09

Are manufacturing and assembly the same

16:11

thing?

16:12

in this context?

16:14

Um, cause my thought process is, um,

16:18

in Texas, I think it's so funny.

16:20

I see a lot of Toyota trucks driving

16:21

around with a sticker.

16:22

It has like the Texas flag on it.

16:24

And it says, uh,

16:24

built here lives here because Toyota has

16:27

an assembly plant in Houston.

16:28

And every time I see that, I'm like,

16:29

yeah, but Toyota,

16:31

like this is not an American company,

16:33

but they assemble the trucks here in,

16:35

in Texas.

16:36

So good enough.

16:36

Right.

16:37

And that's kind of my thought is like,

16:38

Would that be a loophole maybe?

16:40

Like, OK, just take all the components,

16:42

ship them over here,

16:43

and we'll spin up a factory in Houston

16:45

or wherever, Indiana,

16:47

and just assemble them here.

16:48

And now it's not manufactured.

16:50

I wonder, I don't know,

16:51

that just kind of popped into my head

16:52

while you were answering Jordan's

16:53

question.

16:55

Yeah, that is a great question.

16:57

I don't know how they would apply this

16:59

to individual router components.

17:07

That's a great question.

17:09

It's hard to say what loopholes will be

17:11

available.

17:13

I was just curious if you knew anything

17:14

about that.

17:14

Like, no, those are...

17:16

Okay.

17:17

Yeah.

17:17

I know, like, for example,

17:19

I know this ban does extend to routers

17:22

that are designed in the US.

17:23

So like right now,

17:24

those American companies are impacted.

17:27

But whether they can do like this,

17:28

I've definitely seen that before to like

17:30

get the made in the USA stickers on

17:32

different products.

17:33

It's definitely a thing that happens.

17:35

And I don't know if that'll be enough

17:37

of a loophole to get these routers in

17:39

or not.

17:39

But I guess we'll see what companies come

17:41

up with.

17:42

Yeah, that's what I was about to say.

17:44

I guess we'll find out.

17:45

Yeah,

17:46

I think the biggest thing with this story

17:49

is just if there are security concerns,

17:52

we could certainly look at all of the

17:53

security concerns and issues that we've

17:55

seen with routers.

17:57

We've definitely reported on some or

17:59

talked about them on this show before

18:01

various routers being attacked with

18:03

malware or updates that you should

18:05

install.

18:06

But this kind of blaming that on it

18:10

being because they're foreign routers

18:12

doesn't make a lot of sense to

18:14

me. This is definitely a case of, like,

18:16

because all routers are foreign,

18:20

it's sort of a correlation situation, not a

18:24

causation, right? Like, that's not the reason;

18:27

it just happens to be

18:29

that all routers that have security issues are

18:31

foreign, because all routers are foreign,

18:35

right?

18:36

Yeah,

18:37

and something else that popped into my

18:39

head that I forgot until just now,

18:41

not to put on too much of a

18:42

tinfoil hat,

18:43

but it's weird to me that they're like,

18:44

oh, this is a national security risk,

18:46

so we have to ban consumer routers.

18:49

But...

18:50

Wouldn't we want to keep...

18:51

And China has a proven history of stealing

18:54

U.S.

18:54

intellectual property.

18:56

That's known.

18:57

That's a proven thing.

18:58

So why wouldn't we ban the business

19:00

routers instead?

19:02

It's very confusing.

19:03

It's not lining up with... Absolutely.

19:06

I mean,

19:07

the business routers definitely have a

19:09

more sensitive position in the networks.

19:13

So there should be more concern there.

19:16

I do think...

19:17

I'll play devil's advocate and...

19:21

share why I think banning like consumer

19:23

routers or taking them more seriously

19:25

makes sense because I have

19:28

been to talks and other things where people

19:32

talk about, not routers,

19:33

but like those fake Android TV boxes that

19:36

have a bunch of pirated content that you

19:37

can get on Amazon or whatever,

19:39

and sort of products like that,

19:41

or browser extensions that you can install

19:43

that give you a free VPN or something.

19:45

And all of these things are typically used

19:47

to create basically a botnet of all of

19:51

these residential routers or

19:53

like proxy services where you can get a

19:55

residential IP.

19:56

And I think just the sheer scale of

19:58

like how many internet connections are

20:00

residential ones versus like a business

20:02

one is typically going to have one

20:04

fairly large business connection,

20:06

but there are fewer of those.

20:07

I think that from a botnet perspective,

20:11

you could be concerned about consumer

20:13

routers being used to attack like just in

20:18

a DDoS scenario more than like a data

20:21

exfiltration scenario that you might want

20:23

to protect against on the business side of

20:25

things.

20:26

But again, like I said,

20:29

I don't think that just because they're

20:31

foreign means that that is going to

20:34

happen.

20:34

So the ban doesn't make a lot of

20:36

sense to me from that perspective.

20:38

But it is certainly a concern that you

20:42

could have.

20:45

That makes sense.

20:47

I think last note on this story,

20:48

Jordan said the US doesn't really have

20:50

semiconductor fabrication capabilities.

20:52

Well,

20:52

we have a Samsung factory that last time

20:54

I checked was supposed to start pumping

20:55

out chips in twenty twenty three.

20:59

I haven't checked recently,

21:00

but as of late last year,

21:02

they still are not making chips.

21:06

I know Tesla just announced their factory,

21:08

and I think there's another one that.

21:10

I heard about this supposed to be built

21:12

in New York, I think, but I like,

21:13

I heard about the initial funding a few

21:15

years ago and that's all I've heard.

21:16

So yeah, we definitely don't have.

21:20

TSMC has been building one in Arizona for

21:22

a while as well,

21:24

but none of these have launched as far

21:25

as I know.

21:27

It's not super easy to do.

21:28

Nate, I think you're muted.

21:35

Sorry about that.

21:37

Ah, okay.

21:38

Yeah, just to drive home the point,

21:40

the one in Texas is supposed to take

21:42

twenty years to build,

21:44

and that's if it's on schedule.

21:45

That's not including that it is now

21:47

running behind,

21:48

and it will probably just compound,

21:49

and we'll fully inhabit Mars by the time

21:53

that thing's done.

21:54

But you know what I mean?

21:55

It's like, yeah, they –

21:57

What we were saying earlier,

21:58

you can't just spin this stuff up

21:59

overnight.

22:00

It doesn't matter your political leanings

22:01

and whether you think manufacturing should

22:03

be here or not.

22:03

We can't do this overnight.

22:05

It's just not possible.

22:06

And it's not even a matter of getting

22:08

the equipment to do it.

22:09

You can't just build a building and fill

22:11

this up with semiconductor equipment to

22:14

start making these chips.

22:17

In Taiwan right now,

22:18

there's just such a massive centralization

22:20

of knowledge of how to make these chips

22:22

and how to use all of these machines

22:23

and how to design this stuff that you

22:25

can't

22:27

you can't just replicate this.

22:29

Like we see Intel, for example,

22:32

has some of the most advanced

22:36

manufacturing equipment in the world as

22:38

well, just like TSMC,

22:40

but they've had a lot of struggles.

22:43

I don't know what Intel is up to

22:45

these days,

22:45

but I know for a while they had

22:47

a lot of struggles with improving their

22:50

processes even more,

22:52

just because it's extremely challenging to

22:53

do.

22:56

I was going to say something about AI,

22:58

but then I realized I was thinking of

22:59

Nvidia.

22:59

Yeah,

22:59

I have no idea what Intel's up to

23:01

these days.

23:03

Okay.

23:04

Before we jump into the next story,

23:06

I do want to highlight earlier today,

23:08

we had a new member join.

23:10

And if you're watching, I apologize.

23:12

I'm not even going to try to pronounce

23:13

that name because it's in a foreign

23:14

language.

23:14

It looks maybe, I want to guess Korean,

23:18

but I'm not super familiar.

23:20

It's something from that part of the

23:21

world,

23:21

but thank you so much for becoming a

23:23

member.

23:24

So now on YouTube,

23:25

you will have early access to videos and

23:27

we will talk about membership a little bit

23:29

more later in the show.

23:31

But first,

23:32

we're going to talk about everyone's

23:33

favorite topic, age verification.

23:37

Systemd is not controversial enough,

23:40

so they decided to go all in and

23:42

build in an age verification feature.

23:45

So for those who don't know,

23:46

Systemd is a Linux...

23:50

I'm going to let Joan explain what it

23:51

is better a little bit later,

23:52

because I truly don't know how to explain

23:55

it.

23:55

I know it's deep, deep in the system.

23:58

I'll put it that way.

23:59

And it's used by a lot of major

24:01

Linux distros.

24:02

I know Ubuntu uses it.

24:04

And therefore,

24:04

all the Ubuntu derivatives like Mint, Pop!_OS,

24:10

Oh, man,

24:10

there's another one I'm forgetting off the

24:11

top of my head.

24:11

But another really, really popular,

24:13

Debian, I think, uses it.

24:14

I think that was the one I was

24:15

thinking of.

24:15

Technically, Ubuntu is based on Debian,

24:17

but either way.

24:18

So systemd is this really high privilege,

24:20

low in the system kind of thing.

24:22

And they have added a field to the

24:26

JSON user records that is simply

24:28

birthdate.

24:29

And it is what it says on the

24:30

tin.

24:30

Now, when you make a new user,

24:32

you can choose to enter their birthday.

24:34

So the good news is this is totally

24:36

optional.

24:37

This is not a mandatory field.

24:39

If you've ever made a user in Linux,

24:42

in most Linux distros,

24:44

you'll get all kinds of options like real

24:45

name, email address, location, and,

24:48

you know...

24:50

if you're like me,

24:50

you don't really need that kind of stuff.

24:52

So you can kind of skip through it

24:53

all.

24:53

But, uh, so this is the same thing.

24:55

This is just an extra field that is

24:57

four digit year, two digit month,

24:58

two digit day,

25:00

and you can skip it or,

25:02

or enter it in if you want to.

25:04

And they've specifically said,

25:06

this is not a policy engine.

25:07

This is not an API.

25:08

We just define the field so that it's

25:10

standardized if people want to store the

25:11

date, but it's entirely optional.
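As a rough illustration of what Nate describes: a user record carrying an optional birthdate in YYYY-MM-DD form, and the kind of age check a consumer of that field could derive. The record layout and field names here are assumptions for illustration, not taken from the systemd spec; only the four-digit-year, two-digit-month, two-digit-day format matches what's described above.

```python
import json
from datetime import date

# Hypothetical JSON user record with the optional birthdate field
# (four-digit year, two-digit month, two-digit day, as described).
# Field names are illustrative assumptions, not the systemd spec.
record = json.loads("""
{
  "userName": "alice",
  "realName": "Alice Example",
  "birthDate": "2010-04-17"
}
""")

def age_in_years(birthdate: str, today: date) -> int:
    """Full years elapsed since a YYYY-MM-DD birthdate."""
    born = date.fromisoformat(birthdate)
    # Subtract one year if this year's birthday hasn't happened yet.
    had_birthday = (today.month, today.day) >= (born.month, born.day)
    return today.year - born.year - (0 if had_birthday else 1)

# What an age-verification consumer might derive from the record:
age = age_in_years(record["birthDate"], date(2025, 12, 1))
print(age)         # 15
print(age >= 18)   # False
```

The point of standardizing just the field is visible here: systemd only defines where the date lives; any policy (like the `>= 18` check) is left to whatever software reads it.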

25:13

Um,

25:15

Yeah, I mean,

25:16

I feel like those are kind of all

25:17

the facts of the case.

25:20

Jonah, what do you think about this?

25:23

I think it's cool that it's optional,

25:24

but I also kind of see the argument

25:25

that they shouldn't have added it in the

25:27

first place.

25:27

What are your thoughts on that?

25:29

Systemd is a very interesting project.

25:33

So I'll go back to your original question

25:36

first before I get into that.

25:38

Systemd, at its core,

25:40

is an init system in Linux,

25:43

which is basically the process that starts

25:47

all of the other software on your

25:48

computer.

25:49

So if you think about booting up your

25:51

computer,

25:52

it gets ever more complex as you go.

25:55

So typically on Linux, you would have

25:58

like GRUB, the bootloader, which is

25:59

extremely lightweight. And all that does is,

26:02

you boot up, it goes into that, and

26:03

then it starts the Linux kernel, which is

26:06

much more complex than GRUB, but you need

26:08

something to start it. And then the kernel

26:10

starts your init system, which in this case

26:14

would be systemd, but it could be any

26:16

process. And then

26:17

the init system starts everything else, and

26:19

it's responsible for knowing what software

26:22

you want to start and doing it all

26:23

in order so that something doesn't get

26:26

started first that depends on something

26:28

else and then fails because it was started

26:30

too early and all of that stuff.

26:32

So the init system has to always be

26:34

running and it's just kind of the parent

26:35

of all of the other software that you

26:37

run on your computer, if that makes sense.
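That ordering job — never starting a service before the things it depends on — is essentially a topological sort over a dependency graph. A minimal sketch in Python's standard library; the unit names and dependencies are invented for illustration (real systemd derives them from unit files, not from a dict like this):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Invented example units, each mapped to the units it depends on.
# (Nothing here is a real systemd unit; it just shows the idea.)
deps = {
    "network.service":  [],
    "dns.service":      ["network.service"],
    "database.service": ["network.service"],
    "webapp.service":   ["dns.service", "database.service"],
}

# static_order() yields every unit after all of its dependencies --
# the guarantee an init system provides so nothing starts too early.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

If a dependency were missed, a unit could start before, say, the network is up and fail — which is exactly the "started too early" failure described above.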

26:40

But systemd is also

26:44

a whole project with a lot more ambitions

26:48

than just being an init system.

26:51

And they make a lot of different software

26:53

that basically tries to replace a lot of

26:57

basic operating system components with

26:59

systemd-developed ones.

27:01

So beyond the init system,

27:03

we see software like systemd-resolved,

27:07

which replaces the DNS resolver on your

27:11

system.

27:14

And other stuff like that that

27:15

isn't necessarily related to just service

27:19

management and starting processes.

27:21

So they really want to be like the

27:23

core software for all of your system,

27:28

which is why I think they are a pretty

27:31

divisive project in the Linux space.

27:37

But

27:40

Yeah,

27:40

I don't know what component this age

27:42

verification is actually being installed

27:43

in.

27:44

I didn't see this, but maybe I should.

27:46

It probably says in this article.

27:51

I don't know if it's specified.

27:52

It just said the JSON user records when

27:55

you make a new user.

27:59

The XDG Desktop Portal project is adding an age

28:02

verification portal that needs a date

28:04

source for the user's age in the user

28:07

DB.

28:08

Does that help any?
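For reference, systemd's JSON user records (the format behind userdbctl and homed accounts) already carry fields like userName, realName, and homeDirectory; a record with a self-declared age source might look roughly like this. The birthDate key is purely illustrative — the episode doesn't specify what field name the actual change uses:

```json
{
    "userName": "alice",
    "uid": 1000,
    "realName": "Alice Example",
    "homeDirectory": "/home/alice",
    "birthDate": "2000-01-15"
}
```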

28:10

It's an interesting change to make.

28:14

You can definitely see their reasoning

28:17

behind it,

28:17

because I think if age verification is

28:19

going to come to Linux, for example,

28:22

it would be quite annoying if there were

28:25

one million different implementations of

28:27

it that the rest of the system has

28:28

to integrate with.

28:30

But I'm not sure if this makes a

28:31

ton of sense for it to be here

28:34

instead of in your desktop environment.

28:37

And also,

28:38

I don't necessarily think that age

28:41

verification is a lost cause.

28:43

And I think it's unfortunate to see it

28:46

being adopted so readily in Linux

28:48

because...

28:52

I think in certain projects,

28:55

I wish that the open source community

28:57

would take a bit more of a stand.

29:00

But everything has just become very

29:02

corporate, especially in the Linux space,

29:05

and there's a lot of compliance in

29:07

advance.

29:08

So it's very tricky to kind of

29:12

keep with solid ethics, and this has been

29:15

an issue for a very long time. I

29:17

could go back to one of,

29:21

personally, the most annoying things that I

29:23

can think of in this space, which is,

29:25

back in, I want to say, Firefox

29:28

adding DRM as a first-party thing

29:31

in their browser rather than

29:33

relegating that to a third-party

29:35

extension,

29:36

because that was kind of the end of

29:38

Firefox having any sort of say in the

29:43

web browser space and how all of these

29:45

web standards were made.

29:46

I think once they gave into that,

29:48

that was

29:48

the slippery slope that made them lose a

29:50

lot of ground in the standards committee

29:53

to Chromium because they basically showed,

29:55

hey,

29:55

we will implement anything that Google

29:57

asked for.

29:59

And I think that this is a case

30:01

certainly of the main developers behind

30:05

Linux saying, hey,

30:06

we are going to implement whatever the

30:09

corporate side of Linux asks for,

30:11

regardless of what the rest of the

30:12

community wants, which is unfortunate.

30:18

So yeah, I'm not a fan of this

30:21

change. There's a lot of arguments for and

30:23

against this. I don't know

30:30

what the point of this is, really, because

30:32

if you can just set your age to

30:34

whatever you want yourself, I don't know

30:35

why this would be trusted by other

30:38

software.

30:41

But I guess we'll see how this is

30:42

used,

30:44

which is the main thing it'll come down

30:45

to.

30:46

I think it is like,

30:48

even if there isn't a huge issue with

30:51

how this specific thing is implemented

30:53

right now,

30:54

we are just kind of laying the foundation

30:55

for more anti-user systems and more

31:00

problematic systems to be implemented on

31:01

Linux in the future,

31:03

which is not a path that I think

31:04

we should be going down.

31:05

So that's my main concern with this whole

31:09

story pretty much.

31:12

Yeah, I agree with you.

31:14

I think, you know, I mean,

31:17

we always talk about how like companies

31:19

and services have to follow laws, right?

31:21

And somebody I spoke to recently in an

31:24

interview you guys will be seeing pretty

31:25

soon mentioned that.

31:27

It's like this whole idea of like

31:29

cyberspace as this nebulous thing that

31:32

doesn't matter.

31:33

It's like, no,

31:34

the person writing the code,

31:35

your feet are touching the ground

31:37

somewhere and therefore there's

31:39

jurisdiction or your server is located

31:41

somewhere

31:42

in a physical space somewhere.

31:43

And so I understand they have to follow

31:47

the law,

31:48

but I do agree that it was really

31:49

disappointing to see them just roll over

31:51

with no fight, no nothing.

31:55

And I think to kind of go back

31:57

to, you answered it a little bit here,

31:59

but this person asked us, Leonard asked,

32:03

or Leonardo asked,

32:04

do you think age verification is a lost

32:05

battle?

32:06

I wouldn't say lost.

32:08

I mean,

32:08

there's definitely that part of me that's

32:09

like,

32:11

If I thought it was a lost cause,

32:12

I wouldn't be here, right?

32:12

Like I would just go get another job

32:14

and give up privacy.

32:16

But I do think...

32:19

I think it's partially lost in the sense

32:20

that I think this is coming,

32:22

whether we like it or not.

32:23

This is just my personal opinion.

32:25

I think it's coming,

32:25

whether we like it or not,

32:26

because I think there's just too many

32:27

people that don't understand the downsides

32:30

of it.

32:31

And I think it's really important in light

32:34

of that for us to have a seat

32:36

at the table and have this conversation,

32:38

which systemd clearly did not, where we

32:40

say,

32:41

let's at least try to control it in

32:43

a way where it's less damaging.

32:45

So I think,

32:47

In that sense,

32:47

I almost like this because you can put

32:49

any age in there, right?

32:50

They're not going to verify it.

32:51

They're not going to ask you to upload

32:52

an ID.

32:53

And, you know, then there's the question.

32:55

I almost worry if like if everybody starts

32:57

doing that, like, OK, fine,

32:58

here's an age field.

32:59

Go ahead and lie.

32:59

We don't care.

33:00

Then the government's just going to be

33:01

like, fine, now you have to verify IDs.

33:03

And it's like, crap, now it's worse.

33:04

Yeah,

33:05

that's exactly my point about it being a

33:07

slippery slope.

33:08

Because if it can always be set

33:11

arbitrarily forever,

33:12

I don't understand what the point of this

33:14

would be in the first place, right?

33:15

To me,

33:16

the intent of this feature is clearly to

33:19

eventually have some sort of much more

33:21

verifiable way to set this field that

33:24

won't be as user-controlled.

33:26

And I think that that is...

33:28

dangerous to have because if they weren't

33:31

planning on doing that,

33:32

then they could just do what sites always

33:35

do,

33:36

which is like ask people to enter their

33:37

birth date or like confirm that they're

33:40

over a certain age or whatever without

33:42

this being built into the system.

33:44

I think that that works fine until you

33:47

want a much more verified way to confirm

33:51

people's ages, which,

33:53

as we talked about on the show a

33:54

lot,

33:55

is very problematic from a privacy and

33:58

censorship standpoint and that's really

34:00

the only reason to build this feature

34:02

And that's my main concern here.

34:04

I've seen a lot

34:06

of mixed reactions to this in our

34:08

community and elsewhere on the internet,

34:09

where people were kind of saying what

34:11

you were saying at the beginning of

34:13

your thought, which is, hey, this isn't

34:15

really doing anything right now, you can

34:17

set it to whatever you want. And I

34:19

would just say,

34:21

It doesn't make sense to me that that

34:23

will always be the case.

34:24

I think that the fact that they're doing

34:25

that is concerning.

34:27

And this is kind of similar to me

34:31

to the current discussions that are going

34:33

on

34:34

In the Android world right now with

34:36

developer verification,

34:38

I think we are seeing a lot of

34:40

app developers and a lot of custom Android

34:44

operating systems and other third party

34:47

open source app stores beginning to comply

34:51

in advance with that sort of thing or

34:52

make statements saying like, hey,

34:53

we are going to participate in the

34:55

developer verification system.

34:57

And I think that that is unfortunate,

35:00

because you could look at the Keep Android

35:02

Open campaign for a lot of explanations on

35:05

why you shouldn't be doing that.

35:07

You should be taking a hardline stance and

35:09

saying, hey,

35:09

we're not going to comply with this

35:12

system,

35:13

even if that means some restrictions on

35:17

where apps can be installed.

35:20

That is the best move to make our

35:23

voices heard and to potentially make a

35:26

difference.

35:28

So it's just a similar thing here.

35:32

Exactly like you said,

35:33

I think that we got to take a

35:36

stand and we can't lose our voice when

35:39

it comes to this.

35:39

And systemd is...

35:45

kind of giving that up,

35:47

which is a real shame.

35:52

Yeah, and I think,

35:54

just to kind of add to what you

35:55

said, I think, ironically...

35:58

I think if we took more of that

35:59

attitude of, like,

36:00

let's have a seat at the table and

36:01

try to steer this, I think, ironically,

36:04

it would become a self-fulfilling prophecy

36:06

in a good way,

36:07

a good kind of irony, where, like,

36:09

because we're participating,

36:10

we might have more attention to be able

36:14

to draw attention to these issues and

36:15

point out, like,

36:16

this is why age verification doesn't work.

36:18

These are all the problems with age

36:19

verification,

36:19

the knock-on effects that are going to

36:21

make things worse.

36:22

And we might end up actually being able

36:23

to do something about it.

36:25

But, yeah, I think...

36:27

I think definitely just I know we've

36:28

called this out in the past with other

36:29

stories,

36:30

but just this attitude of like of,

36:32

you know, oh, well,

36:33

this doesn't affect me.

36:33

So I don't care because I know how

36:35

to get around it.

36:35

Congratulations.

36:36

It's coming for Linux.

36:37

It's coming for the things that we use.

36:39

We can't have that attitude forever

36:41

because eventually we're going to run out

36:42

of places that are not touched by this.

36:44

Or at very least,

36:45

they're only going to apply to like a

36:47

handful of people that are

36:48

really tech savvy and know how to write

36:50

their own code.

36:51

And now we have privacy for one percent

36:52

of people instead of, you know,

36:54

right now where it's what, ten percent.

36:55

I don't know.

36:56

I'm just making up numbers.

36:57

But my point being is like,

36:58

I think standing up and trying to do

37:01

something gives us more power and it

37:03

builds momentum to the point where maybe I

37:05

will be wrong, which for the record,

37:06

these are the kind of things I'm happy

37:07

to be wrong about where it's like, hey,

37:09

we were able to roll back age verification

37:10

because we took such an active role that

37:13

we were able to spread awareness and

37:14

attention.

37:14

So, yeah.

37:16

And real quick, someone here pointed

37:18

it out.

37:18

Like, yeah, it's not age verification.

37:19

It's identity verification.

37:21

I'm trying to get more in the habit

37:22

of saying that, but I don't always.

37:24

So thank you for pointing that out because

37:26

you are right.

37:28

It's not just kids because how is it

37:31

going to know if you're a kid without

37:32

verifying everyone?

37:33

So yeah, thank you for noting that.

37:35

Yeah.

37:38

It's an interesting story.

37:40

I don't know if I have too much

37:41

to add.

37:42

I...

37:44

I can certainly imagine some reasons that

37:48

something like this could be useful in the

37:50

grand scheme of things,

37:51

regardless of age verification plans right

37:54

now.

37:54

But certainly the timing of this with all

37:58

of the age verification stuff going on

38:00

this year is extremely suspicious.

38:02

So that doesn't give me high hopes for

38:06

how this feature will be used.

38:10

For sure.

38:14

I think we can move on from

38:16

this story. Before we dive into the Meta

38:20

and Google social media addiction ruling,

38:23

I want to give some quick updates

38:24

about what we've been working on at

38:26

Privacy Guides this week. On the website

38:28

side of things, the big stuff has

38:30

been a lot of news articles. There

38:32

are a ton of stories that we aren't

38:36

able to discuss here on this show, but

38:38

Freya and others have been writing them in

38:40

our news brief section, which you can visit

38:42

at

38:42

privacyguides.org/news.

38:44

So some of the articles include the cadnet

38:47

botnet hijacking ASUS routers.

38:50

Good example of the kind of botnet issues

38:52

I was talking about earlier in the router

38:54

space.

38:55

FBI seeking info from gamers who installed

38:58

malware from Steam.

38:59

Big tech creating an accord against online

39:01

scams and fraud.

39:03

A severe meta cybersecurity incident

39:05

caused by an AI agent.

39:08

Graphene OS saying that they won't

39:10

implement age verification.

39:12

which is fantastic,

39:13

exactly what we do want to see,

39:16

unlike SystemD here.

39:18

A French aircraft carrier being located in

39:21

real time via a fitness app,

39:23

which we've definitely seen in the

39:25

military before.

39:28

Android 17 getting a post-quantum

39:30

cryptography upgrade, and Vizio TVs

39:33

now requiring a Walmart account. Crazy

39:37

stuff. Really annoying, the smart TV

39:39

industry. So if any of you think

39:42

any of those topics are interesting or

39:45

want to learn more about them, we have

39:46

those articles again at

39:49

privacyguides.org/news. All of those articles are also

39:52

automatically published to our forum when

39:54

we publish them, and there are some

39:56

discussions that go on there, or you can

39:58

ask questions and follow up, and we can

40:00

talk about it there.

40:02

I think that's kind of the main stuff.

40:04

I know that there have been more

40:05

discussions in our forum and on the

40:08

community side of things,

40:09

but we're going to get into some of

40:11

the biggest ones later on in the show,

40:13

so I will leave that there.

40:16

But I know Nate has some stuff to

40:18

share about our YouTube channel and some

40:21

videos we've published lately,

40:22

so I will pass it over to you,

40:23

Nate.

40:24

Yeah, so just a really quick one here.

40:30

Last, God, was it last week?

40:32

It's been a week already.

40:33

Jonah and I were invited to Austin, Texas,

40:38

EFF Austin,

40:38

which I am a board member of.

40:40

We threw an unofficial South by Southwest

40:42

party that we called EFF Austin

40:44

Interactive because- I will just say,

40:47

technically two weeks ago, Interactive.

40:49

Two weeks ago, okay.

40:50

We did that live show, if people remember.

40:52

Yes.

40:52

That was cool.

40:54

Sorry,

40:54

my sense of time is all messed up.

40:55

It's been a busy couple weeks ever since

40:57

I got back.

40:57

It's crazy.

40:59

But yeah,

40:59

South by Southwest Interactive has been

41:01

retired,

41:01

so therefore we decided to be sneaky and

41:04

use the name.

41:05

And Jonah and I got to film some

41:07

of the talks,

41:08

and we thought they were really

41:09

insightful,

41:10

so we've been publishing them on our

41:11

channel.

41:12

We have Hugh Forrest,

41:13

who was actually one of the co-founders of

41:15

South by Southwest Interactive, who gave,

41:18

I thought,

41:18

a really good talk about South by

41:20

Southwest's

41:22

and how it ruined the world. Good

41:24

talk, so check that out. Dr. Sharon

41:26

Strover, who is a professor at UT Austin,

41:29

talked about public opinions of

41:31

surveillance technology, which I thought

41:33

was very hopeful. You might be surprised, so

41:35

check that out. And then Jon

41:37

Lebkowsky,

41:39

who is also an EFF Austin board member

41:41

and very early pioneer of the internet.

41:45

Um,

41:45

he's been around since the early days and

41:48

I swear to God,

41:49

I feel like he has a story about

41:50

every,

41:50

like if you name somebody in the digital

41:52

space, um, like Phil Zimmerman or,

41:54

you know, um,

41:56

Pretty much anybody.

41:56

I feel like he knows them.

41:57

Cory Doctorow, like he knows them.

41:59

He's met them.

42:00

He's got a story.

42:01

But anyways,

42:01

he gave a very short talk that I

42:03

would loosely describe as like the state

42:05

of the internet and a call to action.

42:07

I think it's less than five minutes.

42:08

That was definitely the shortest one.

42:10

So if any of those sound interesting,

42:12

head over to our YouTube channel or it

42:13

is also over on PeerTube and check those

42:16

out because they were really good.

42:18

Yeah,

42:18

I will just say I might be a

42:20

little biased, but I did love Dr.

42:22

Sharon Strover's talk a lot,

42:24

just mainly because, well,

42:26

a lot of reasons, actually.

42:27

But she included a segment about

42:29

Minneapolis and what's going on here that

42:31

I thought was interesting as well.

42:33

So totally,

42:35

totally check that out because all of

42:37

these mass surveillance systems in cities

42:38

right now,

42:38

we've talked so much about Flock and other

42:41

systems here on this show.

42:44

And it's a really good,

42:46

really good take on all of that.

42:48

Yeah, for sure.

42:50

And not to beat a dead horse,

42:53

but I would say it's as fact-based as

42:57

you can get.

42:57

I mean,

42:57

it is a lot of surveys and self-reporting,

43:00

but it's not just like, oh,

43:00

we read some news articles or we looked

43:02

at Twitter.

43:03

It's like they went out and tried to

43:05

get the best numbers they possibly could.

43:07

So it's really good stuff.

43:09

All of this stuff, the articles,

43:11

the videos,

43:11

the upcoming videos that I've been teasing

43:13

at,

43:13

all of this is made possible by our

43:15

supporters.

43:16

So if you are not a supporter and

43:17

you would like to be,

43:18

you can sign up for a membership or

43:19

donate at privacyguides.org.

43:21

We also have a merch shop,

43:23

shop.privacyguides.org.

43:25

And I think we've added some new designs

43:27

ever since we launched our activism

43:28

section.

43:28

So be sure to check that out if

43:29

you're interested.

43:30

Privacy Guides is a nonprofit that

43:32

researches and shares privacy-related

43:34

information and facilitates a community on

43:36

our forum and Matrix where people can ask

43:38

questions and get advice about staying

43:39

private online and preserving their

43:41

digital rights.

43:42

And we'll talk a little bit more about

43:44

that later.

43:44

But for now,

43:45

we're going to talk about Hong Kong and

43:47

a new law regarding device passwords.

43:49

And I'm going to turn that one over

43:51

to Jonah.

43:53

All right.

43:53

This was reported by the BBC.

43:56

Hong Kong police can now demand phone

43:59

passwords under new national security

44:02

rules.

44:05

This article starts out,

44:06

Hong Kong police can now demand phone or

44:08

computer passwords from those who are

44:10

suspected of breaching the wide-ranging

44:12

national security law.

44:14

Those who refuse could face up to a

44:16

year in jail and a fine of up

44:19

to Hong Kong dollars,

44:22

which is about US dollars.

44:25

and individuals who provide false or

44:27

misleading information could face up to

44:30

three years in jail.

44:32

It comes as part of new amendments to

44:34

a bylaw under the national security law

44:36

that the government gazetted on Monday.

44:39

The NSL was introduced in Hong Kong in

44:41

twenty twenty in the wake of

44:43

massive pro-democracy protests the year

44:46

before.

44:46

Authorities say the laws which target acts

44:48

like terrorism and secession are necessary

44:51

for stability,

44:53

But critics say they are tools to quash

44:56

dissent.

44:58

Of course,

44:59

this is an issue that we have talked

45:01

about in other countries.

45:03

Certainly in the UK, for example,

45:08

this is a problem right now that we

45:10

know of.

45:11

It's also kind of a gray area in

45:14

U.S.

45:14

law where you technically don't have to

45:17

provide this information,

45:18

but what they can do

45:20

with you in the meantime is kind

45:22

of up in the air and not decided.

45:24

We've seen stories certainly of people

45:27

being held in temporary custody,

45:29

in temporary jails, for years or more because

45:32

they chose not to comply with sharing

45:36

their passwords with police or they simply

45:38

forgot their passwords and weren't able

45:39

to, which is always a possibility.

45:44

And in a lot of cases,

45:45

it is exactly used to quash dissent or

45:50

target people who otherwise haven't

45:52

committed crimes.

45:55

This is a big part of the issues

45:57

that we see with encryption in general and

46:00

end-to-end encryption,

46:01

where governments really want to make any

46:04

form of encryption or end-to-end

46:08

encryption illegal.

46:10

Simply the act of using it because it...

46:13

certainly makes it much easier to

46:15

investigate crimes if you don't actually

46:17

have to do any investigation of the crime

46:19

or any of the data involved.

46:20

If you can just say, hey,

46:21

the fact that this encrypted data exists

46:23

is a crime, then that

46:28

can be used to target a lot of

46:30

people who have otherwise done nothing

46:31

wrong.

46:32

And it wouldn't surprise me to see the

46:34

same thing happen here.

46:38

There's a couple stories mentioned in this

46:42

BBC article,

46:42

if you want to check it out later,

46:44

about some examples of activists and other

46:49

big names in the area being sentenced to

46:53

jail or being

46:56

being targeted by laws that expand on this

46:59

kind of national security law.

47:03

So it's definitely being used to target

47:07

protesters, activists,

47:09

former opposition lawmakers even in Hong

47:15

Kong.

47:15

So unfortunate stuff for sure to happen

47:19

here.

47:20

I think that that

47:22

would be would be kind of my main

47:23

point.

47:24

It's it's something that we could

47:25

certainly see expand to other countries.

47:29

And it's something that if other countries

47:31

aren't doing this,

47:34

there are at least plans to do something

47:36

like this,

47:36

which is is bad news for everyone around

47:39

the world.

47:42

Yeah, Nate,

47:43

do you have any other thoughts on this

47:45

story?

47:48

I don't think so.

47:48

I think you kind of covered it.

47:51

Jordan said they've already been doing

47:52

this for years in Australia.

47:54

That's wild because, yeah,

47:56

it's very – here in the U.S.,

47:59

I –

48:01

Oh my gosh.

48:02

I've taken in so much information the last

48:03

few days I'm forgetting.

48:05

Basically,

48:06

the way the courts are supposed to work

48:10

is they're supposed to take existing laws

48:12

– like when it comes to new technology,

48:14

they're supposed to take existing laws and

48:16

interpretation and figure out how to apply

48:19

them to the new technology in a way

48:20

– supposed to,

48:22

for the record –

48:22

in a way that protects Americans and

48:25

preserves their existing rights.

48:27

So for example, with privacy, right?

48:29

Here in the US,

48:30

we have the Fourth Amendment,

48:31

which says that cops need a warrant to

48:33

come in and search your home.

48:34

And so in theory,

48:36

the way the court is supposed to interpret

48:37

that when it comes to the electronic world

48:39

is the same way.

48:40

They're supposed to figure out,

48:42

electronically speaking,

48:43

what counts as your home,

48:45

and therefore the police would need a

48:47

warrant to come in and search that.

48:48

So-

48:50

Yeah,

48:51

that's not to say like this couldn't

48:52

happen here in the US because in the

48:54

US we have repeatedly refused to make a

48:57

final decision on whether or not you need

48:59

a warrant to search your phone.

49:00

But it's just really.

49:03

Yeah,

49:03

it's really unfortunate because I'm with

49:05

you.

49:06

This is something I think we could see

49:08

here in America, in the UK,

49:11

if it's not already.

49:12

I mean, anywhere, really.

49:14

And it's just... it's such a...

49:17

Things get really bad once we go downhill

49:18

like that.

49:20

And, you know,

49:21

they also I think they said,

49:22

what is it like you could face?

49:24

Yeah, here it is.

49:25

They could face a fine or jail for

49:26

providing false or misleading information.

49:29

So like my first thought is if you

49:30

have a graphene phone and they're like,

49:31

oh, what's your passcode?

49:32

And you give them the duress pin that

49:33

wipes the phone.

49:34

Congratulations.

49:35

You're still going to jail because that

49:36

wasn't the pin and you knew it.

49:37

You knew that wasn't what they meant.

49:38

So, yeah, it's absolutely.

49:42

It's bad.

49:43

I have got a ton of thoughts about

49:46

courts, uh, interpreting the laws,

49:49

but I think we can talk about that

49:51

in the next story here.

49:52

So, well,

49:52

maybe we should get into that one.

49:56

Okay.

49:57

Yeah.

49:57

So, on that note,

50:00

let's talk about the courts and meta and

50:03

Google and, uh,

50:06

Man,

50:06

so this isn't directly privacy related,

50:08

but it has a lot of knock-on effects.

50:10

And this headline, this comes from NPR.

50:13

I was going to quote Reuters,

50:15

but they do this annoying thing where it

50:16

doesn't do a link preview in Ghost.

50:20

But NPR also covered this story very well.

50:23

This is a very thorough article.

50:24

And the headline says,

50:25

jury finds Meta and Google negligent in

50:27

social media harms trial.

50:29

So the short story – the short version

50:32

is there's a woman.

50:33

I believe some other article said that

50:36

she's in her twenties.

50:37

And she was suing Meta and Google.

50:39

And she also sued Snap and TikTok,

50:41

but they settled before it went to trial.

50:43

So Meta and Google went all the way

50:45

to trial.

50:46

And –

50:47

This woman basically says she's been

50:49

addicted to social media since she was a

50:50

child because these companies purposely

50:54

make social media addictive.

50:56

And therefore,

50:57

they should be held accountable.

50:59

And the jury agreed.

51:01

And they awarded this woman six million

51:02

dollars in damages,

51:04

mostly coming from Meta.

51:05

And the article rightly points out for all

51:08

of you who are thinking like, oh,

51:09

six million dollars, who cares?

51:10

You're one hundred percent right.

51:12

Mark Zuckerberg probably – his breakfast

51:14

probably cost six million dollars.

51:15

He doesn't care.

51:16

But what matters is that this is now

51:19

on record,

51:20

and this is now set a precedent,

51:24

and that –

51:25

Oh, man,

51:26

this this just has so many knock on

51:27

effects.

51:28

And I think that's why we want to

51:29

talk about this is not even so much

51:30

for what the story itself is actually

51:33

about.

51:33

Although, for the record,

51:34

I think that is a very important thing

51:36

that I'll elaborate on in a second.

51:39

But the fact that it holds these companies

51:42

accountable and opens the door for so many

51:45

more legal actions in the future.

51:48

On behalf of everyone, I feel like,

51:50

because who hasn't had a Facebook account

51:51

or a YouTube account at some point?

51:54

Many of you are watching on YouTube.

51:56

And thank you for watching, by the way.

51:58

But yeah,

51:59

I do want to point out real quick,

52:01

again, personal opinion here.

52:03

I've been saying for a long time about

52:06

a variety of privacy topics that I think

52:09

it's extremely...

52:11

We'll take misinformation, for example.

52:13

I know a lot of people who are

52:14

like, oh, I don't fall for fake news.

52:17

That is extremely arrogant.

52:19

And I'm including myself in this.

52:20

I have definitely read stories that

52:22

somebody else is like, hey,

52:22

here's an opposing viewpoint and all the

52:24

things they left out.

52:24

And I'm like, oh,

52:25

I probably called that one wrong.

52:27

Because when there are companies whose

52:30

whole job, forty hours a week,

52:32

is to sit there and pump out fake

52:33

news,

52:34

They're going to get you at some point

52:36

or another because that's just how it

52:37

works.

52:38

Like,

52:38

think about your job and how good you

52:40

are at your job because you do it

52:41

all the time.

52:42

And now imagine some random person coming

52:44

in off the street and being like,

52:45

I could do that.

52:46

You know, whatever your job is,

52:47

it doesn't matter.

52:47

It's like, no, dude,

52:48

there's certain skills and flows and

52:50

processes that I've learned over the

52:51

years.

52:52

And it's the same thing with these.

52:55

I think...

52:56

we really underestimate how addictive

52:58

social media is.

52:58

And I'm not trying to let people off

52:59

the hook.

53:00

Agency comes with pros and cons.

53:01

If you're in charge of your own actions,

53:03

you're also responsible for the

53:04

consequences.

53:05

But at the same time,

53:07

we have to acknowledge these things are

53:09

made to be addictive by experts who are

53:11

paid to make this thing as addictive as

53:14

possible to keep you there one second

53:17

longer.

53:17

And I feel like when we discredit that,

53:19

it's like we're forgetting that

53:21

It would be like saying, oh,

53:22

cigarettes aren't that addictive.

53:23

Bro, they bake nicotine into it.

53:26

Yes, it is.

53:27

And it's the same thing here with this

53:29

kind of stuff is like this stuff is

53:30

made to be addictive.

53:33

And I think we're just really – yeah,

53:36

I know I'm kind of going in circles

53:37

now,

53:37

but I think we're just really being –

53:41

it's just really not good to ignore that

53:43

is what I'm trying to say.

53:44

So yeah.

53:46

I think that's kind of all I've got

53:48

for now on that one.

53:48

I know you said you have a ton

53:49

of thoughts on this.

53:50

So what, what was your takeaway from this?

53:53

So I've seen a ton of mixed responses

53:57

to this case on the internet,

53:59

and I have a lot of mixed feelings

54:01

on this myself, because I,

54:03

even on this show,

54:04

have said that all of the social media

54:07

stuff,

54:07

and especially the stuff that Meta is

54:09

doing, which we've,

54:10

I think it's even mentioned in this

54:12

article in a separate case from the six

54:15

million dollar one,

54:16

but

54:16

It's been found in discovery that they

54:18

have internal discussions about

54:20

specifically targeting kids who are

54:22

thirteen or even younger and making it as

54:25

addictive as possible.

54:26

And this is, in my opinion,

54:29

a public health concern for exactly the

54:31

same reasons that marketing cigarettes to

54:36

children was a massive health concern.

54:41

But on the other hand,

54:43

I think that this really closely relates

54:47

to Section two thirty issues that we've

54:50

seen here.

54:51

And the social media companies originally

54:53

tried to use Section two thirty as a

54:55

way to say, hey,

54:56

we shouldn't be responsible for any of

54:58

this.

54:59

And they tried to get the case dismissed.

55:01

Thankfully, in this specific case,

55:04

they chose not to address any section

55:07

two-thirty issues at all.

55:09

So this can't be used as a way

55:10

to like get around section two-thirty in a

55:14

court case.

55:16

That still applies. And this case

55:17

did specifically focus on the design of

55:20

these apps and kind of the algorithm that

55:22

they're using and not on the content of

55:24

these apps themselves.

55:26

Um,

55:27

but I have a lot of fears here

55:29

that this case will be used as a

55:33

gateway to attack some of this section two

55:35

thirty stuff.

55:37

Um,

55:38

because I think it is not a stretch

55:39

for a lot of these, uh,

55:43

concerned parent groups or these

55:45

conservative religious groups to say, hey,

55:48

you know,

55:49

the algorithm on this app turned my child

55:52

trans or gay or what have you,

55:55

and they just blame the algorithm instead

55:57

of the content,

55:58

and it's a new approach to attack these

56:00

companies.

56:01

And depending on how those go,

56:03

it could be a similar case to either

56:09

attack these companies without

56:13

Section two thirty being involved,

56:15

or this could be used as an excuse

56:16

to implement something like KOSA or

56:20

the repeal of Section two thirty in the

56:22

future.

56:23

And this is just kind of a way

56:25

in.

56:26

Even though the issue at play here really

56:29

has nothing to do with the content,

56:33

I think that the parallels here to the

56:36

tobacco industry and how they were

56:38

marketed

56:39

they were marketing to children,

56:41

for example,

56:42

are very apt here and they make a

56:44

lot of sense.

56:45

And that is a case where like regulation

56:47

was needed.

56:47

And I think in a similar way,

56:50

like the way that these apps are designed

56:53

and the way that the algorithms work,

56:55

which is to kind of find all of

56:57

the most

56:59

inflammatory and addictive content they

57:02

can find um and really highlight that

57:05

which is not the fault of the content

57:07

itself but it's the fault of the algorithm

57:09

that these apps designed i think that that

57:10

is a big problem and just like um

57:16

the the tobacco industry um which i mean

57:20

their their products weren't banned um

57:23

they're they're still around you can buy

57:24

them anywhere um but they really got hit

57:27

with um

57:29

huge restrictions on marketing and how

57:30

they can sell their product.

57:32

And I think that that is probably

57:33

something that should happen here.

57:39

And just like that,

57:40

I think the fact that cigarettes weren't

57:43

banned, for example,

57:43

that's kind of similar to how all of

57:46

the content on these apps shouldn't be

57:47

banned or restricted.

57:49

We can't be going after the content

57:50

itself.

57:52

We have to be going after the format

57:55

with which these companies are presenting

57:56

that to make money.

57:57

Because we know...

57:59

that some social media is not inherently

58:01

addictive.

58:01

We can see non-algorithmic social media

58:05

like Mastodon, for example,

58:06

which doesn't have these problems,

58:08

even though you can post the exact same

58:10

content there that you can post anywhere

58:11

else.

58:12

And we know that

58:16

Facebook and other social media platforms

58:18

like Twitter,

58:19

they didn't used to be so bad until

58:22

Twitter really started making very

58:24

algorithmic timelines instead of just

58:26

showing, you know,

58:27

posts in chronological order from people

58:29

that you follow.

58:30

Or, I mean,

58:31

I remember the days before Facebook had

58:34

the news feed, for example,

58:36

and they made that switch to kind of

58:40

tell people

58:42

that, like, hey, we're going to be

58:43

showing you the most relevant stuff

58:45

instead of just a way to keep

58:47

up with your friends or whatever. And there

58:49

was some controversy there, but Facebook

58:50

was really adamant that, hey, this is a

58:52

very good thing. Whereas in reality, we know

58:56

that while they were designing this, they

58:58

were intentionally trying to make their

59:00

platform more addictive and more,

59:04

I don't know, reactionary? I don't know the

59:06

right word, but it was a way to

59:08

get people to stick around on Facebook for

59:12

longer and to more effectively sell ads,

59:18

right? And I think that those motivations

59:20

that these social media companies have are

59:23

really at odds with how they sold these

59:25

things to consumers, and that is a

59:28

legitimate

59:30

problem of deceptive marketing.

59:33

There is really no place, I think,

59:35

in our society and from these companies to

59:38

be deceptive,

59:43

just completely deceptive to how they sell

59:45

their products to consumers.

59:47

And so some restrictions here do make

59:51

sense.

59:53

But I think that

59:59

I think that the big problem is that

1:00:00

courts and lawmakers do not really

1:00:05

understand technology or the Internet.

1:00:08

And lawmakers are consistently very

1:00:10

unwilling to make a decision about this

1:00:15

themselves.

1:00:15

And it's only gotten worse lately.

1:00:17

And now that the doors are open here

1:00:19

to this issue,

1:00:20

I think that the doors are open to

1:00:22

wider issues that may impact content or

1:00:25

Section two thirty and other courts,

1:00:26

because

1:00:27

All of these courts around the US are

1:00:29

going to have slightly different

1:00:31

interpretations.

1:00:31

This isn't something that the Supreme

1:00:33

Court decided on.

1:00:35

And I think that that's really

1:00:36

unfortunate.

1:00:37

My main thought is that

1:00:41

We should be focused on some of these

1:00:44

very specific problems with how social

1:00:46

media apps market themselves and how they

1:00:48

design very addictive platforms,

1:00:51

because it is a problem.

1:00:52

But we need lawmakers to say, hey,

1:00:55

we are only focusing on this specific

1:00:57

thing, right?

1:00:58

We're leaving all of the content and all

1:00:59

of the Section two thirty and all the

1:01:01

free speech stuff alone.

1:01:03

We don't want courts to think about it.

1:01:05

We want to have this law that just

1:01:07

focuses on this one specific issue so that

1:01:09

it doesn't expand into a ton of different

1:01:11

issues,

1:01:12

which is exactly the reason Section two

1:01:14

thirty was created in the first place.

1:01:17

We already have these protections under

1:01:19

the First Amendment,

1:01:20

but lawmakers had to step in because

1:01:22

courts were interpreting.

1:01:24

how the First Amendment applied to

1:01:26

technology companies slightly differently

1:01:29

or very differently,

1:01:31

depending on like what court you were in.

1:01:33

And lawmakers had to say, hey,

1:01:34

this is how the First Amendment works for

1:01:38

all of these tech companies.

1:01:39

And now tech companies can use Section two

1:01:41

thirty to easily get these cases

1:01:43

dismissed.

1:01:43

And I think that we need

1:01:46

another federal law like this that says,

1:01:49

like, hey, this specific design is bad, but

1:01:51

it doesn't mean that we have to regulate

1:01:55

or ban free speech on these platforms,

1:01:57

because that is an unnecessary problem. But

1:02:01

it seems to be the direction that

1:02:04

some future cases could go in

1:02:06

based on this. So I think

1:02:11

I think it's kind of unfortunate, just

1:02:12

because I can see what direction this is

1:02:14

going in, and I don't think we live

1:02:15

in an ideal

1:02:17

world. And I think this is more of

1:02:19

a failure of lawmakers than the courts, to

1:02:22

be honest. And I just wish we were

1:02:25

more effective about making these laws

1:02:29

that are more substantial and specific

1:02:32

than we currently are. We're just

1:02:34

leaving everything up to the courts, it

1:02:36

seems like, these days, and that is not

1:02:38

an effective way to govern a country. Not

1:02:40

at all.

1:02:43

Um, I don't necessarily disagree with you.

1:02:46

I definitely see how this could be a

1:02:48

slippery slope,

1:02:49

but did you by any chance happen to

1:02:50

see the last section of this NPR article

1:02:54

that says the LA case focused on design

1:02:55

of social media platforms to overcome

1:02:57

liability shield?

1:02:59

Uh, was there a specific point?

1:03:01

Yeah.

1:03:02

So they, they, um,

1:03:03

they specifically mentioned how the,

1:03:05

the prosecution,

1:03:06

I believe it was, stayed away from Section

1:03:08

two thirty.

1:03:09

Um, they said that, uh,

1:03:12

Where was it?

1:03:14

Yes.

1:03:15

By taking this approach,

1:03:16

the lawyers pursued a case alleging

1:03:17

defective design that was able to get

1:03:19

around the high bar set by section two

1:03:21

thirty.

1:03:21

It's not what the users post,

1:03:23

but the very architecture of the platform

1:03:25

itself.

1:03:26

So, I mean, again,

1:03:28

I don't disagree with you because I I

1:03:29

know there's probably there's a saying in

1:03:32

the legal world that you can indict a

1:03:33

ham sandwich.

1:03:35

So, I mean,

1:03:36

which I know an indictment is not the

1:03:37

same as what we're talking about here.

1:03:38

But the point being is, like –

1:03:39

look,

1:03:41

for better or worse,

1:03:42

our legal system in the U.S. is

1:03:43

basically who makes the better argument.

1:03:46

And, um, so it definitely could happen,

1:03:50

but I think it's just a,

1:03:51

maybe a little bit reassuring that they

1:03:53

purposely stayed away from talking about

1:03:55

section two thirty or even any of the

1:03:57

content itself.

1:03:58

And instead they focused on things like

1:04:00

infinite scroll, constant notifications,

1:04:02

autoplaying videos, and beauty filters.

1:04:04

And they mentioned how, um,

1:04:06

when she was young, the plaintiff,

1:04:10

what was it, she so

1:04:12

craved the validation of social media that

1:04:14

she would run off to the bathroom at

1:04:15

school to check the number of likes her

1:04:16

posts received. And where did it go?

1:04:21

There was another section where basically

1:04:22

they talked about the beauty filters.

1:04:24

Oh, here we go.

1:04:24

She developed depression and body

1:04:25

dysmorphia as she continuously compared

1:04:27

herself to others and used beauty filters

1:04:29

to enhance her appearance.

1:04:30

And that's the thing that I think applies

1:04:33

to everybody.

1:04:35

If not the beauty filters or the physical

1:04:37

thing, I will admit,

1:04:40

I fall prey to this where somebody's like,

1:04:43

oh, I'm going on vacation.

1:04:44

We're going here.

1:04:45

We're spending the weekend here.

1:04:47

My sister has been to Europe more times

1:04:49

than I can count.

1:04:51

because we have different

1:04:53

dads. I don't want to talk too much

1:04:54

about privacy. We have different dads, and

1:04:56

her dad has a lot more money, and

1:04:57

I don't know if she ever used that,

1:04:58

for the record. Maybe she's

1:05:01

just really damn good with money. But

1:05:03

either way, like, she's traveled quite a lot,

1:05:05

and I really haven't, and I won't

1:05:07

lie that I'm, like, jealous of that. But

1:05:09

also, like, being her brother, I have the

1:05:11

insight into – like, I know how hard she's

1:05:13

worked, I know that she's good with money.

1:05:15

It's not necessarily just that her dad

1:05:17

wrote her a check, like, yeah,

1:05:18

go to go to Germany or whatever.

1:05:19

Like she probably earned all that money

1:05:21

herself.

1:05:21

It's not like she was there every other

1:05:23

week.

1:05:23

But when you don't know that person,

1:05:25

when you're looking on social media and

1:05:26

you're like, God,

1:05:27

they're in Europe all the time.

1:05:28

Yeah,

1:05:29

maybe they're posting a picture from six

1:05:30

months ago.

1:05:31

And they've been back in the States this

1:05:33

whole time.

1:05:34

And you don't know that because you don't

1:05:35

know them or you don't know how many

1:05:36

overtime shifts they worked or overnight

1:05:38

shifts.

1:05:39

Like you don't know how many times their

1:05:40

friends were like, hey,

1:05:41

let's go get drinks.

1:05:41

They're like, oh, no, thanks.

1:05:43

I'll catch the next one because I'm trying

1:05:44

to save money.

1:05:44

Like, you know,

1:05:45

and it's what's the statement about like

1:05:48

you're seeing somebody's highlight reel.

1:05:50

And anyway, I mean,

1:05:51

I guess that's more about content.

1:05:52

But my point being like we can all

1:05:54

relate to the fact that

1:05:55

social media gives us this warped

1:05:57

interpretation of what's going on in other

1:05:59

people's lives that if you don't know them

1:06:01

personally, and you can't ask them, like, man,

1:06:03

how are you affording all these trips?

1:06:04

They're like, oh, you know, my dad's really

1:06:06

good with credit card points or something.

1:06:08

I forget who it was, but one

1:06:10

of the podcasts I listened to said the

1:06:11

same thing. I think they're

1:06:13

even a personal finance podcast. They're

1:06:14

like, yeah, we don't make that much money,

1:06:16

but my wife is really good with the

1:06:17

travel points, or else we would never

1:06:19

travel this much. So...

1:06:22

Yeah,

1:06:22

anyways, getting back to the topic.

1:06:24

Sorry, I kind of got distracted there.

1:06:26

It's I think it is heartening that they

1:06:28

purposely avoided the content and the,

1:06:31

you know, "people are posting this" angle.

1:06:33

And it's the beauty filters.

1:06:34

It's the infinite scroll.

1:06:35

It's the architecture of the platform

1:06:36

itself,

1:06:37

which I think is absolutely a huge part

1:06:40

of the problem personally,

1:06:41

but maybe not the whole problem,

1:06:43

but definitely a big part of it.

1:06:44

So yeah,

1:06:45

not to say that it couldn't happen because

1:06:46

I could totally see a world where this

1:06:48

does open some doors,

1:06:49

but hopefully that will at least make it

1:06:51

a little bit harder since that's not the

1:06:52

direct argument they took.

1:06:54

Yeah, I totally agree.

1:06:55

I think that them sidestepping the whole

1:06:58

content issue entirely was the correct

1:07:01

approach,

1:07:01

and I think that that is the main

1:07:03

issue with these social media platforms.

1:07:07

The challenge that I see here is that

1:07:09

I think the line between the content on

1:07:12

this platform and the algorithm serving

1:07:15

that content is...

1:07:19

it's a fine line, it's

1:07:21

not very clearly defined. And I think what

1:07:24

we could see is future court cases,

1:07:29

exactly like I said, not necessarily

1:07:31

focused on the content, but focused on the

1:07:34

content that these algorithms are

1:07:35

promoting, for, like, these

1:07:38

conservative or religious groups to say

1:07:40

like, hey,

1:07:40

it's the algorithm that turned my child

1:07:44

gay because it surfaced all of this

1:07:48

content related to that or whatever.

1:07:53

And I think that we could see...

1:07:58

a response to that from social media

1:08:00

companies that would be very similar to

1:08:03

the response that we would probably see if

1:08:06

Section two thirty was repealed.

1:08:08

I think that they could interpret that

1:08:11

like, if that's the case in the

1:08:12

future.

1:08:13

And there are already some cases where

1:08:15

this exact argument is being made.

1:08:17

I think in other countries,

1:08:19

I'm not aware of cases in the US

1:08:20

right now,

1:08:21

but we've seen this before and we know

1:08:22

it could happen where

1:08:27

Like if Facebook gets sued for that issue

1:08:30

and they lose that case,

1:08:31

I can see them potentially censoring or

1:08:35

moderating much more heavily like LGBTQ

1:08:41

information or education or not even that.

1:08:44

It could be any topics that...

1:08:47

certain groups find undesirable or they'd

1:08:50

rather ignore.

1:08:51

And I think that that could lead to

1:08:55

a free speech issue on these platforms if

1:08:58

the algorithm can be targeted like that.

1:09:02

even if Section two thirty remains just

1:09:06

for liability reasons.

1:09:08

So this is the main reason that I

1:09:10

would love to see a law that really

1:09:13

restricts app design and a lot of that

1:09:19

stuff,

1:09:19

because I think that, or like the

1:09:22

beauty filters that you

1:09:24

mentioned, for example,

1:09:25

or some other aspects of these apps do

1:09:28

need to be reined back in.

1:09:30

And I think

1:09:31

Just how heavily these apps are

1:09:34

marketed to kids in the first place, I

1:09:36

think that needs to be reined in as

1:09:39

well. But it would be nice to

1:09:43

have a law that delineates that even

1:09:47

from the algorithm, which is

1:09:49

problematic, but

1:09:53

I would be worried about it being impacted

1:09:55

in future cases because I do think even

1:09:57

if it doesn't mandate censorship or

1:10:01

moderation,

1:10:01

I think that censorship and moderation

1:10:03

could be a likely outcome if these

1:10:08

platforms become liable for the content

1:10:11

that they display with those algorithms,

1:10:14

basically.

1:10:16

That makes sense.

1:10:17

I hear you.

1:10:19

Yeah, I don't have an answer to that.

1:10:20

Hopefully that does not happen.

1:10:23

So yeah,

1:10:23

it's something to keep an eye on.

1:10:27

I don't know.

1:10:28

I'm never very optimistic about the things

1:10:30

that our government is doing,

1:10:32

but you can always hope for the best.

1:10:36

But yeah,

1:10:37

I've definitely seen a lot of people very

1:10:40

concerned about this case,

1:10:42

even if they agree with some of the

1:10:44

issues that are being addressed here,

1:10:46

because...

1:10:50

there are concerns with how this will

1:10:51

affect future cases.

1:10:55

Totally.

1:10:59

I think that's kind of it,

1:11:00

if you don't have anything else to add.

1:11:03

In a minute,

1:11:04

we'll start taking viewer questions.

1:11:07

So if you have questions or if you've

1:11:09

been holding on to any questions about the

1:11:11

stories we've talked about so far,

1:11:12

you can leave them in the forum thread

1:11:15

for this show,

1:11:17

or you can leave them in the chat

1:11:20

here.

1:11:20

We'll try to get through all of them.

1:11:24

For now,

1:11:25

I think we should check in on our

1:11:27

community forum.

1:11:28

There's always a lot of activity on our

1:11:30

forum every week.

1:11:30

We can't talk about it all,

1:11:32

but we wanted to highlight a couple of

1:11:35

this week's most interesting discussions,

1:11:38

in our opinion, that are happening there.

1:11:42

Our first forum thread that I wanted to

1:11:44

take a look at was called Remembering

1:11:46

Device and Master Passwords.

1:11:48

This was a question that was asked to

1:11:51

the community talking about

1:11:56

password managers and replacing all of

1:11:58

their reused passwords with randomly

1:12:00

generated passwords that are stored in the

1:12:02

password manager.

1:12:04

But there are some passwords that the

1:12:06

password manager can't remember for them

1:12:09

because they need them

1:12:11

like the master password,

1:12:12

to access the password manager in the

1:12:14

first place, which is, of course,

1:12:16

a problem.

1:12:18

So they mentioned some examples,

1:12:20

the master password,

1:12:21

the user account password for each of

1:12:23

their devices,

1:12:24

the disk encryption password for each of

1:12:26

their devices.

1:12:27

And then if they had five different

1:12:30

devices, they would have eleven

1:12:32

six-word passphrases to remember,

1:12:34

which is a challenge.

1:12:37

So they asked basically what strategies we

1:12:40

have for remembering so many passwords,

1:12:43

or what password should you reuse in those

1:12:46

situations?

1:12:48

So Nate,

1:12:49

I know you had some things to talk

1:12:51

about and you saw Fria's answer there.

1:12:53

Do you want to kind of cover what

1:12:54

Fria talked about here?

1:12:57

Yeah,

1:12:58

because I really appreciated Fria's

1:13:01

response.

1:13:03

So for example,

1:13:05

one of the questions that the original

1:13:07

question asker said,

1:13:09

is it safe to reuse the same password

1:13:10

for disk encryption and user account?

1:13:13

And Fria said,

1:13:13

it's probably best to make those

1:13:14

different.

1:13:15

But ideally,

1:13:16

your online account would use a passkey or

1:13:18

something like that instead of a password.

1:13:21

And they also noted that your device

1:13:24

passwords don't leave your device.

1:13:27

So, for example,

1:13:28

my computer in front of me here... Well,

1:13:31

this is a Mac,

1:13:31

so this is a good example.

1:13:32

But my Windows computer, you know,

1:13:33

I have VeraCrypt and I have the

1:13:35

login to my local account,

1:13:37

which is not a Microsoft account.

1:13:38

It's a local-only account.

1:13:40

So, in theory,

1:13:41

I can make both of those the same

1:13:42

password, right?

1:13:43

Because they're not going to leave the

1:13:45

device.

1:13:47

And they also mentioned that you can...

1:13:50

You can go ahead and enable biometrics.

1:13:54

I guess I should have said this to

1:13:54

start with.

1:13:55

It really depends a lot on your threat

1:13:57

model, right?

1:13:58

Because my threat model, for now,

1:14:02

is basically getting robbed.

1:14:05

Laptop, laptop, laptop, laptop.

1:14:08

several phones laying around.

1:14:10

Like my concern is not really the

1:14:12

government.

1:14:13

I personally am a strong believer in a

1:14:15

five dollar wrench attack and I do not

1:14:16

have a high pain tolerance.

1:14:17

So, you know,

1:14:19

the minute they threaten me with violence,

1:14:20

I'm going to fold like a souffle.

1:14:21

I'm just being honest.

1:14:23

But, you know,

1:14:24

if somebody comes in and breaks into my

1:14:26

house while my wife and I are out

1:14:28

and they steal all the laptops,

1:14:30

I want to make sure that they're not

1:14:31

going to be able to get into that

1:14:32

because that's when they're going to be

1:14:33

able to get into my password manager.

1:14:36

And I don't keep browser history,

1:14:37

but browser history and

1:14:38

any apps I have saved, any like,

1:14:41

like I use Thunderbird.

1:14:42

So all my emails are downloaded locally,

1:14:44

things like that.

1:14:45

And that's what I really want to protect

1:14:46

against.

1:14:46

So in that situation, yeah,

1:14:49

really just having one really strong

1:14:51

password

1:14:53

is probably sufficient because they're not

1:14:55

going to crack that as long as I

1:14:56

don't write it down and stick it on

1:14:57

the desk anywhere.

1:14:57

Right.

1:14:58

Um, you know,

1:14:59

like one randomly generated six-word

1:15:00

passphrase.
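As a rough sanity check on why one strong randomly generated passphrase holds up against cracking, the entropy math is simple. This sketch assumes words are drawn uniformly from a standard list like the EFF long wordlist (7,776 words); the episode doesn't specify a particular list:

```python
import math

# Entropy in bits of a passphrase whose words are drawn uniformly
# at random from a wordlist: n_words * log2(wordlist_size).
def passphrase_entropy_bits(n_words: int, wordlist_size: int = 7776) -> float:
    return n_words * math.log2(wordlist_size)

# A six-word passphrase from a 7,776-word list gives roughly 77.5 bits,
# well beyond practical offline brute-force range.
print(round(passphrase_entropy_bits(6), 1))
```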

1:15:01

If I really want to be safe,

1:15:02

I could give each device a different

1:15:04

passphrase.

1:15:04

So that way it's not, um, you know,

1:15:07

if they get one,

1:15:08

at least they don't get into all of

1:15:09

them, but at least that way, you know,

1:15:10

it's not, uh,

1:15:12

What's the word I'm looking for?

1:15:13

Like that's only three different passwords

1:15:14

instead of six, right?

1:15:16

Or something like that.
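For anyone wanting to try this, a randomly generated passphrase of the kind discussed here can be sketched in a few lines of Python. The tiny word list below is a placeholder for illustration only; a real setup should draw from a large published list such as the EFF long wordlist:

```python
import secrets

# Placeholder word list; swap in a full diceware-style list in practice.
WORDLIST = [
    "apple", "basalt", "canyon", "drift", "ember", "falcon",
    "glacier", "harbor", "ivory", "juniper", "kelp", "lantern",
]

def generate_passphrase(n_words: int = 6, sep: str = " ") -> str:
    # secrets.choice draws from the OS's cryptographically secure RNG,
    # unlike random.choice, which is predictable and unsuitable here.
    return sep.join(secrets.choice(WORDLIST) for _ in range(n_words))

print(generate_passphrase())
```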

1:15:17

So it really does depend on your threat

1:15:19

model.

1:15:20

But going back to the biometrics thing,

1:15:21

what a lot of people have,

1:15:22

I've seen several people notice this where

1:15:25

they've been in a public situation and for

1:15:27

whatever reason,

1:15:28

they have to pull out their phone and

1:15:29

unlock their phone.

1:15:30

And they realize as they're typing it in,

1:15:32

they're like, dude,

1:15:32

there's a camera right over the cash

1:15:33

register looking at me type my password

1:15:35

into the phone.

1:15:36

Hmm.

1:15:36

And they're like, man,

1:15:37

I kind of wish I had just used

1:15:38

biometrics because at least then they

1:15:40

wouldn't have my password, right?

1:15:41

Or, you know,

1:15:41

we've covered stories in the past about

1:15:44

there's a scam that I think is still

1:15:45

going around where somebody will basically

1:15:49

watch you unlock your phone for

1:15:50

whatever reason.

1:15:51

You know,

1:15:51

maybe they're flirting with you at a bar

1:15:52

or something and they see you type in

1:15:54

your passcode and then them or their

1:15:56

accomplice will steal your phone,

1:15:58

try to unlock it and send themselves a lot

1:15:59

of money.

1:16:00

And I know Apple and Google have rolled

1:16:01

out some defenses against that,

1:16:03

but that's a good example where if you

1:16:04

unlock it with biometrics,

1:16:06

They're going to have a harder time

1:16:07

unlocking the phone when they steal it

1:16:08

from you.

1:16:08

So it's really about what are you trying

1:16:13

to protect and who are you trying to

1:16:14

protect it from.

1:16:16

If you have a very high threat model,

1:16:17

then yeah,

1:16:17

you probably want a bunch of different

1:16:18

passphrases.

1:16:20

I think it's also worth noting that it

1:16:23

is –

1:16:24

I know this is really unpopular –

1:16:24

but it is okay to write down passwords

1:16:26

in some situations.

1:16:28

For example, do not label it "password."

1:16:30

Do not stick it on a sticky note

1:16:31

on your screen.

1:16:32

But if you have a little notebook that

1:16:34

you carry with you everywhere or something

1:16:37

like that.

1:16:38

So I think my thing, I'll be honest,

1:16:42

I basically have two main passwords I use,

1:16:44

one for the encryption and one for the

1:16:45

local account.

1:16:46

I don't know why I do it that

1:16:47

way.

1:16:47

I just do.

1:16:48

Because now that I think about it,

1:16:49

if they get past the encryption,

1:16:50

they can just pop out the disk, right?

1:16:53

Yeah, I don't know.

1:16:54

But I think that's the big thing is

1:16:57

the threat model.

1:16:58

If your threat model is not very high,

1:16:59

it's probably safe to reuse the local

1:17:01

passwords that don't leave your device.

1:17:04

Just be aware that that is a risk.

1:17:05

If somebody gets it,

1:17:07

they can get into all the devices,

1:17:08

I guess.

1:17:08

So I don't know if Jonah has a

1:17:10

different strategy that he would approach

1:17:12

it with.

1:17:13

No, that makes a lot of sense.

1:17:15

I mean,

1:17:16

the main thought that I would have is

1:17:18

I think for most people,

1:17:20

the information that you store on your

1:17:25

different devices,

1:17:26

if you have multiple devices,

1:17:28

is usually pretty much the same

1:17:29

information.

1:17:30

You're going to probably install your

1:17:31

password manager on all of them and the

1:17:34

same web browser that you have synced

1:17:35

across them.

1:17:36

And people have a desktop and a laptop

1:17:38

and a phone for convenience purposes

1:17:41

rather than just separating all of their

1:17:43

data.

1:17:43

And so I think using the same password

1:17:46

for encryption

1:17:51

and using the same password for your local

1:17:52

account probably makes sense in those

1:17:54

situations.

1:17:58

At the end of the day,

1:17:59

those passwords aren't exposed to the

1:18:03

internet.

1:18:04

You don't have everyone on earth trying to

1:18:06

hack you in the same way that...

1:18:08

you have, like, thousands of hackers trying

1:18:10

to attack your online accounts all of the

1:18:12

time, because they can and it's so

1:18:14

easy to attack. All of this local

1:18:16

stuff, it's not as much of

1:18:19

a deal, and I think memorizing one password

1:18:21

for all that is good. You definitely

1:18:24

do want to have a different password for

1:18:26

your local account versus your encryption

1:18:29

password, just because

1:18:31

you don't want to be entering your

1:18:33

encryption password all the time in case

1:18:34

of like shoulder surfing attacks.

1:18:36

So keeping that separate is nice.

1:18:39

And I wish that's a feature that could

1:18:40

be used on more smartphones,

1:18:43

but I digress.

1:18:46

But other than that,

1:18:48

Yeah,

1:18:48

there's a lot of good advice in this

1:18:50

thread,

1:18:51

and I think it really does depend on

1:18:53

your setup,

1:18:53

but usually I think that works.

1:18:55

If you do have very different information

1:18:58

on your devices,

1:18:58

it might make more sense to not reuse

1:19:00

those passwords,

1:19:01

but

1:19:02

I would also say if you're just trying

1:19:04

to remember these passwords,

1:19:06

even if you do use biometrics,

1:19:09

usually you could try disabling them for a

1:19:13

month or a couple months because I think

1:19:14

muscle memory is usually the way to

1:19:17

memorize these passwords quickly.

1:19:19

I remember back when I was in...

1:19:22

college, I would always have to log into

1:19:25

different computers, like in the computer

1:19:28

lab. We didn't have laptops or anything, we

1:19:30

just had these desktops, and I'd have

1:19:33

to use my account that I would normally

1:19:35

use a password manager for, like,

1:19:39

to access my email. But the password was

1:19:40

the same to log in locally to these

1:19:42

computers, and I didn't want to have to,

1:19:45

like, grab my phone or something to copy

1:19:47

my password over. I just had to have

1:19:48

a passphrase that I could

1:19:50

memorize to log in, and it only took

1:19:52

like a month, or maybe a month and

1:19:54

a half, to eventually have the pretty

1:19:58

long passphrase down. I think if you just

1:20:00

do it all the time, you'll probably

1:20:04

get it. So yeah, lots of good advice,

1:20:07

but I think typically...

1:20:11

There's only three main passwords that

1:20:13

most people have to remember,

1:20:15

which are for their local accounts or PINs,

1:20:18

their encryption key, and then their

1:20:22

password manager's master password.

1:20:24

I think that you should keep them all

1:20:26

separate,

1:20:27

but you probably don't need more than that

1:20:28

unless you have a good reason that you

1:20:31

know of to have more passwords than that.

1:20:33

So, yeah.

1:20:37

Oops.

1:20:37

There we go.

1:20:38

Yeah.

1:20:38

Last thing I want to add just to

1:20:39

echo what she said is, if this is

1:20:41

something you're struggling with,

1:20:42

definitely check out this thread.

1:20:43

Cause a lot of people gave really good

1:20:45

ideas from like different,

1:20:47

I wouldn't say different threat models,

1:20:48

but just different perspectives.

1:20:49

Like don't forget, you know,

1:20:51

hardware tokens and don't forget this.

1:20:52

And if this is your threat model, then

1:20:54

this, and it was, it's a really,

1:20:55

really good thread for sure.

1:20:56

So there's a lot of different,

1:20:57

we're kind of just giving, like, a rough,

1:20:59

you know, what,

1:20:59

what would probably work for most people,

1:21:01

but people gave really thoughtful

1:21:03

answers about, you know,

1:21:04

keep this in mind.

1:21:05

And if your threat model includes this,

1:21:07

so.

1:21:07

Really good thread for sure.

1:21:12

And then,

1:21:13

if that's all we had on that one,

1:21:15

there was one other thread that I thought

1:21:18

was interesting that I wanted to talk

1:21:20

about a little bit because I've been...

1:21:22

My router started giving me issues right

1:21:24

before we went to Austin,

1:21:25

and I just got it fixed this week,

1:21:27

finally.

1:21:29

And this forum post asks about apartment

1:21:33

Wi-Fi privacy.

1:21:35

And so they were specifically talking

1:21:36

about the...

1:21:38

the router issued by their ISP.

1:21:40

They said they moved into a new apartment

1:21:42

complex and there's fiber pre-installed

1:21:45

from a provider.

1:21:46

The complex states that that provider is

1:21:48

the preferred provider and they have a

1:21:50

partnership.

1:21:51

And then after moving,

1:21:52

I was contacted by representative to begin

1:21:53

the internet setup process.

1:21:55

And so they were basically wondering,

1:21:57

can the apartment complex get any insight

1:22:00

into what I'm doing on my Wi-Fi and

1:22:03

what would be the best way to get

1:22:05

more privacy?

1:22:06

They mentioned,

1:22:07

should I get a different provider

1:22:08

altogether or something like that?

1:22:11

So as a longtime apartment dweller,

1:22:15

for most of my life,

1:22:16

I've lived in apartments.

1:22:19

Jonah can correct me on this one.

1:22:20

I don't think the complex would

1:22:21

necessarily have any insight into your

1:22:22

traffic.

1:22:24

Um,

1:22:24

but I think they probably just have a

1:22:28

relationship with that.

1:22:29

It seems like every apartment I go to

1:22:30

has a preferred provider.

1:22:31

So that doesn't really surprise me,

1:22:33

but it can definitely go both ways.

1:22:36

Um,

1:22:36

there's definitely a lot of apartments and

1:22:38

especially probably older apartments where

1:22:40

they have like,

1:22:41

one provider like a cable provider or a

1:22:44

fiber provider that has just like

1:22:46

pre-installed cables to all of the rooms

1:22:48

and so it's much easier to get that

1:22:49

access. Um, and so that might be what

1:22:52

they mean when they're talking about, like,

1:22:53

"this is our preferred provider." And

1:22:55

especially, typically, if you have to set up

1:22:58

an account with the ISP yourself, usually

1:23:01

that's the case where the complex wouldn't

1:23:03

have access to that. I have seen, in

1:23:06

a ton of new developments around here, um,

1:23:10

apartments basically getting their own

1:23:12

connection and then running their own

1:23:16

Ethernet and access points to all of

1:23:18

these rooms. But they run a central router,

1:23:20

basically, that keeps all the rooms

1:23:21

separate, but it is all managed by the

1:23:24

apartment. And usually they have, like,

1:23:25

apartment-wide

1:23:27

Wi-Fi, for example. So all of their routers...

1:23:30

um, sometimes they give you

1:23:33

a, uh, like a personal connection, and

1:23:36

sometimes just this one Wi-Fi connection

1:23:38

for the whole building, um, that's in all

1:23:40

of the rooms. Um, and you could certainly

1:23:43

see that, and that would be managed by

1:23:46

the complex. So it could go either way,

1:23:48

for sure. Um, and it seems to

1:23:51

be even more common that it is apartment

1:23:53

run, so I wouldn't rule that out

1:23:55

necessarily.

1:23:57

That was how our last apartment was. They had,

1:23:59

um, I don't know if it was managed

1:24:01

by the apartment, but they were like, "Hey,

1:24:03

this is included with the rent," and they

1:24:04

did have the complex-wide

1:24:06

Wi-Fi like you mentioned. And there was one

1:24:09

Ethernet port that was in the study for

1:24:11

some reason, and that was the only one

1:24:12

that... no, it was in the living

1:24:13

room, and we had to run a cable

1:24:15

into the study because my wife has a

1:24:16

desktop that doesn't have Wi-Fi. I remember

1:24:18

that now. Yeah. So that was probably the

1:24:20

case. I think in my last apartment it

1:24:21

was a similar situation where

1:24:24

If I wanted my own network,

1:24:26

I would have to plug in my own

1:24:29

router to their one Ethernet port that was

1:24:32

provided on the access point,

1:24:33

but I couldn't remove their access point

1:24:34

from the room and make a direct

1:24:36

connection.

1:24:37

And it was basically like a double NAT

1:24:39

setup.

1:24:40

It was on their network,

1:24:42

and then I had my own network just

1:24:44

for security,

1:24:45

but it was not a direct connection to

1:24:47

the ISP.

1:24:47

Okay.

1:24:49

Which leads into what I was going to

1:24:51

say.

1:24:51

This is what I've been doing for years,

1:24:52

and this is what I would recommend.

1:24:55

For those of you who are really passionate

1:24:56

about your privacy,

1:24:57

which is probably most of you watching

1:24:58

this,

1:25:00

I would strongly recommend getting your

1:25:01

own router.

1:25:03

Our official recommendation, I believe,

1:25:05

at Privacy Guides is OpenWrt, correct?

1:25:08

I think so.

1:25:09

We do recommend that as one.

1:25:11

You could use OPNsense or pfSense as well

1:25:13

if you want something more...

1:25:15

robust,

1:25:16

but usually that has to run on a

1:25:17

computer,

1:25:18

whereas OpenWrt can install on consumer

1:25:21

router platforms.

1:25:23

Gotcha.

1:25:24

OK, yeah.

1:25:25

Historically,

1:25:26

I originally went with DD-WRT.

1:25:29

For years, it was not an issue.

1:25:30

And then recently, it became an issue.

1:25:33

I don't know what happened.

1:25:35

Now I'm using a different one that Jordan

1:25:38

actually recommended to me and so far has

1:25:40

been amazing.

1:25:41

Thank you for that recommendation, Jordan.

1:25:43

Um, but yeah, whenever

1:25:46

this router dies, assuming I can buy

1:25:48

routers again, um, given our headline story,

1:25:51

I'm probably gonna go ahead and get, uh...

1:25:54

what is it,

1:25:56

the one, I don't remember who makes it,

1:25:58

but it's endorsed by the Free Software

1:26:00

Foundation.

1:26:01

And it is specifically designed to run

1:26:04

OpenWrt.

1:26:06

And I did try OpenWrt on my

1:26:09

router, but it,

1:26:10

because it is so open source,

1:26:12

it couldn't get access to multiple Wi-Fi

1:26:15

networks,

1:26:16

which is something I really want.

1:26:17

That's, that's really important to me.

1:26:19

Um, so it was like, yeah,

1:26:20

you get this one Wi-Fi network.

1:26:21

And I'm like, no,

1:26:22

that ain't going to work.

1:26:22

Um,

1:26:24

But I'm sure if I bought something like

1:26:25

the one router,

1:26:26

it would be specifically designed for that

1:26:28

and it would probably be able to do

1:26:29

more, I'm hoping.

1:26:30

I'll definitely look into it more when the

1:26:31

time comes.

1:26:32

But yeah, anyways,

1:26:33

where I'm going with this is every

1:26:35

apartment I've ever been to,

1:26:36

they tell you like, oh,

1:26:37

you can't use your own router. And, okay,

1:26:40

some of them have not told me that,

1:26:41

but almost all of them are like, no,

1:26:43

you can't do that.

1:26:44

It's worked just fine for me, no problem.

1:26:45

Personally, your mileage may vary,

1:26:47

but for me, I've never had an issue.

1:26:50

There's only been like one or two that

1:26:51

are just like, yeah, whatever,

1:26:52

use your router, we don't care.

1:26:54

Um, I,

1:26:56

I know my current router and I think

1:26:58

a lot of ISP provided routers have this,

1:27:01

they have a bridge mode where it basically

1:27:03

shuts that router off and turns it into

1:27:05

just a pass through.

1:27:06

Um,

1:27:06

because a lot of the time it will

1:27:07

be, especially if it's fiber right now,

1:27:09

it'll be, um,

1:27:11

what's the word I'm looking for?

1:27:12

The fiber connection will have to go into

1:27:14

the ISP router.

1:27:15

And then from there it goes into your

1:27:16

router or there'll be like a coax cable.

1:27:19

So it's not like an Ethernet cable that you

1:27:20

can just plug straight into the wall

1:27:21

usually.

1:27:22

And that's why you'll need their router.

1:27:23

But a lot of them,

1:27:23

you can put it in bridge mode and

1:27:25

then it shuts their router off.

1:27:26

Mine is not in bridge mode and it

1:27:27

still works just fine.

1:27:29

So yeah, I mean,

1:27:31

there's a lot of ways to go about

1:27:32

it,

1:27:32

but I really do think I would highly

1:27:34

encourage,

1:27:35

and some of them can be expensive.

1:27:37

Like I think my router,

1:27:37

when I bought it was like,

1:27:39

almost five hundred bucks,

1:27:40

and I think that was used.

1:27:42

So it's a nice router.

1:27:45

I love that router.

1:27:45

I'm glad it's lasted me this long.

1:27:46

It's a good investment.

1:27:48

So I'm not saying you have to go

1:27:49

buy a five hundred dollar router.

1:27:50

There's a lot of options out there,

1:27:51

but definitely if you're passionate about

1:27:52

your privacy and you are a renter and

1:27:55

you don't have complete control over your

1:27:57

ISP and what router they give you and

1:27:59

stuff,

1:27:59

definitely, I would recommend doing your

1:28:01

research. Start on Privacy Guides, start

1:28:03

with OpenWrt, see if that'll meet

1:28:05

your needs, and, um, you know, see

1:28:08

about getting another router, because it pays

1:28:10

off. Now we've got an IoT network, we've

1:28:12

got a guest network, we've got our main

1:28:13

network, we've got a built-in ad blocker,

1:28:16

we've got ad-blocking DNS, um, all the

1:28:19

networks are segmented VLANs. It's just, wow.

1:28:23

Which is super overkill, but I'm just

1:28:24

saying, like, the possibilities are endless,

1:28:26

man. Yeah. Yeah.
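On OpenWrt, that kind of network segmentation is configured through UCI config files; below is a minimal sketch of a separate guest network. The interface name, addressing, SSID, and passphrase are all illustrative examples, and in a real setup you would also add a firewall zone that blocks guest-to-LAN traffic.

```
# /etc/config/network -- a dedicated interface for guests (sketch, names are examples)
config interface 'guest'
        option proto 'static'
        option ipaddr '192.168.2.1'
        option netmask '255.255.255.0'

# /etc/config/wireless -- an SSID bound to that guest interface
config wifi-iface 'guest_ap'
        option device 'radio0'
        option mode 'ap'
        option network 'guest'
        option ssid 'Guest'
        option encryption 'psk2'
        option key 'change-this-passphrase'
```

The same pattern repeats for an IoT network: one more interface, one more SSID, and firewall rules keeping the zones apart.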

1:28:28

Absolutely get your own router.

1:28:29

I mean,

1:28:29

that's the point of having a router.

1:28:31

It's to segment your network from your

1:28:33

ISP.

1:28:33

Even if your apartment complex is your ISP

1:28:36

in this case,

1:28:38

you want to keep your stuff separate from

1:28:39

that.

1:28:39

And you can always do more advanced stuff

1:28:42

with your own router.

1:28:43

Like if you don't trust your apartment,

1:28:44

you can run a VPN on that to

1:28:46

protect all of your devices and stuff like

1:28:48

that.

1:28:50

Or you just use the router to prevent

1:28:52

other people on your apartment Wi-Fi or

1:28:55

maybe just other people on the internet if

1:28:56

your apartment

1:28:58

doesn't have proper security,

1:28:59

which they certainly might not because

1:29:02

they're an apartment building that knows

1:29:03

nothing about the internet.

1:29:06

You want to have like a firewall in

1:29:07

between to keep your devices safe.

1:29:09

So yeah,

1:29:11

there's almost no reason you can't use

1:29:14

a router because generally they can't

1:29:18

really tell what device you connect to it.

1:29:20

So even if they tell you you can't

1:29:23

use a router,

1:29:23

I would definitely just use one anyways.

1:29:27

Yeah.

1:29:27

Like I said, they,

1:29:28

most of them tell me you can't.

1:29:29

And then I'm like, Oh,

1:29:30

let me try it.

1:29:30

Plug it in.

1:29:31

Works great.

1:29:32

Um,

1:29:32

one last thing I wanted to add real

1:29:33

quick actually is, um,

1:29:35

when we're listing benefits,

1:29:36

a lot of the time,

1:29:37

the benefits are not always privacy

1:29:38

related.

1:29:39

It makes setup in your new place

1:29:40

super easy because you move to a new

1:29:42

apartment,

1:29:42

you plug your router in and look at

1:29:44

that.

1:29:44

All your devices are ready to go.

1:29:46

You don't have to type in a new

1:29:47

network.

1:29:47

You don't have to set up a new

1:29:48

network, just plug it in and go.

1:29:51

So yeah.

1:29:52

Yeah, absolutely.

1:29:56

All right.

1:29:57

So I think that's all we had for

1:30:00

the forums for now.

1:30:01

And now we're going to take some viewer

1:30:03

questions.

1:30:04

So we're going to start with questions on

1:30:05

the forum,

1:30:06

specifically from our paying members,

1:30:08

but I don't believe we have any paying

1:30:09

member questions right now.

1:30:11

But if you would like to become a

1:30:11

paying member and get priority,

1:30:13

you can go to privacyguides.org,

1:30:14

click the little red heart icon in the

1:30:16

top right corner.

1:30:17

So we'll jump into the forum first and

1:30:20

then hop over to our live chat.

1:30:23

Um,

1:30:23

and we only had a couple of questions

1:30:25

in the forum.

1:30:26

The first one asked us to talk about

1:30:28

this article from wired,

1:30:29

which I'm not going to talk about too

1:30:31

much.

1:30:32

Cause honestly,

1:30:32

I don't think there's much to talk about.

1:30:34

Um,

1:30:36

but I think it is a good thing

1:30:36

to have on your radar because there may

1:30:38

be more to talk about in the future.

1:30:40

Uh,

1:30:41

the headline says using a VPN may subject

1:30:43

you to NSA spying.

1:30:44

I did see this article.

1:30:45

The issue is the keyword there is may,

1:30:48

and that's kind of why we didn't include

1:30:50

this as one of our main stories is

1:30:51

basically, uh,

1:30:54

Excuse me.

1:30:54

Lawmakers are – well,

1:30:57

I'll just read the first part.

1:30:58

Lawmakers are pressing the nation's top

1:31:00

intelligence official to publicly disclose

1:31:02

whether Americans who use commercial VPN

1:31:04

services risk being treated as foreigners

1:31:06

under U.S.

1:31:06

surveillance law,

1:31:08

a classification that would strip them of

1:31:10

constitutional protections against

1:31:12

warrantless government spying.

1:31:13

So I know I mentioned this before in

1:31:15

the past,

1:31:16

but really quick recap for those who don't

1:31:18

know.

1:31:18

The way that –

1:31:20

surveillance laws on paper are currently

1:31:22

structured is the government is not

1:31:25

supposed to spy on its own citizens,

1:31:27

but any signals that move in or out

1:31:29

of the country are subject to surveillance

1:31:31

because now you're including somebody

1:31:33

who's potentially not a citizen,

1:31:35

potentially.

1:31:37

And there's a whole lot of reasons for

1:31:39

this.

1:31:40

Edward Snowden has a classic interview

1:31:41

with John Oliver,

1:31:42

or I guess John Oliver has a classic

1:31:43

interview with Edward Snowden,

1:31:45

where Snowden explains how a lot of the

1:31:47

time the communications will route through

1:31:49

the fastest network,

1:31:50

which may temporarily take them out of the

1:31:51

country,

1:31:52

or companies may move the server to

1:31:54

another location digitally to do physical

1:31:56

maintenance on the server, whatever.

1:31:58

Point being,

1:31:59

it's really not a good way of doing

1:32:00

things.

1:32:01

And this article points out why,

1:32:02

because they point out that since VPNs are

1:32:05

so ubiquitous,

1:32:07

Even if a VPN server is located here

1:32:09

in the US,

1:32:10

like say you've got your VPN set to

1:32:11

New York or LA or Dallas or whatever,

1:32:15

there's a really good possibility,

1:32:17

especially in those examples I gave,

1:32:18

that there may be users from other

1:32:20

countries using that server.

1:32:22

And so they're basically saying,

1:32:25

I don't know if somebody tipped them off

1:32:26

to this.

1:32:27

I didn't have a chance to fully read

1:32:29

this article all the way,

1:32:31

but they're basically saying because these

1:32:33

VPN servers could potentially include

1:32:34

foreigners,

1:32:35

does the NSA treat them as foreign

1:32:38

traffic,

1:32:39

which would mean that even if I'm a

1:32:41

US citizen,

1:32:42

if I connect to a server in Dallas,

1:32:45

I'm still in the country,

1:32:46

but are you going to assume that I'm

1:32:49

not?

1:32:49

So, yeah,

1:32:51

we didn't cover that story because it's

1:32:52

very speculative,

1:32:53

but I do agree it's definitely a good

1:32:55

thing to have on your radar.

1:32:57

Yeah.

1:32:59

I don't know what the impact of this

1:33:00

would be.

1:33:00

I think the opposite is also true here,

1:33:04

which people are concerned about,

1:33:05

whereas you're talking about connecting to

1:33:08

a server in Dallas,

1:33:09

and that could be a concern,

1:33:10

but also if an American user

1:33:13

connects to a server in some other country

1:33:16

um, like, like France, will their data be

1:33:20

collected by the NSA, because they don't

1:33:23

know that it's an American and they're

1:33:24

spying on people out of the country? I

1:33:26

think that that's also a concern. I don't

1:33:27

think about that. That's a good... perhaps

1:33:29

even more of a concern than connecting to,

1:33:31

uh, to an American server. I would say

1:33:34

that this is probably likely to be the

1:33:37

case, knowing the NSA. We've seen similar

1:33:39

things. Um...

1:33:42

And it is one argument that I've seen

1:33:44

as to why maybe you should use a

1:33:47

U.S.

1:33:47

server,

1:33:47

because if you are concerned about U.S.

1:33:51

surveillance,

1:33:51

there are more restrictions on what the

1:33:54

government can do in the U.S., supposedly,

1:33:57

than what they can do outside the U.S.,

1:33:59

which is pretty much anything they want.

1:34:02

But, I mean, even in the U.S.,

1:34:06

A lot of the way that this government

1:34:08

intelligence works is like the U.S.

1:34:11

can get other countries to do this

1:34:14

intelligence for them and spy on American

1:34:15

citizens,

1:34:17

just like they can get big tech companies

1:34:19

to spy on American citizens.

1:34:20

This is a huge problem that we've talked

1:34:21

about

1:34:23

for many weeks now, especially with Flock,

1:34:25

for example,

1:34:26

where the government doesn't have to do

1:34:30

anything because they just pay someone

1:34:32

else to do it.

1:34:33

It's also an issue with data brokers.

1:34:34

We've talked about that as well.

1:34:36

The government doesn't have to collect all

1:34:37

of your location data,

1:34:39

but they can buy all of your location

1:34:40

data, and that, for some reason,

1:34:42

is perfectly legal.

1:34:45

So

1:34:49

I would say,

1:34:50

I think I already said this,

1:34:51

but I think that this is likely to

1:34:53

be the case.

1:34:54

I guess we don't know for sure,

1:34:55

but I wouldn't be surprised if this was.

1:34:57

But also, if it is,

1:34:59

I'm not exactly sure how to

1:35:02

protect yourself against it at the moment.

1:35:04

The best thing we could do is if

1:35:06

this is confirmed, you know,

1:35:07

we'd have to demand change and demand some

1:35:11

restrictions on how the NSA works,

1:35:13

which we've also been talking about for a

1:35:15

while.

1:35:16

And I'm not sure if that'll happen.

1:35:22

Yeah.

1:35:22

Before we move on to the next question,

1:35:24

this is kind of a follow-up here from

1:35:26

one of our viewers.

1:35:27

We advertise VPNs as a tool against

1:35:29

surveillance capitalism,

1:35:30

not government surveillance.

1:35:33

So would this change how we recommend

1:35:36

VPNs?

1:35:37

Yeah,

1:35:37

I guess that's kind of my main point.

1:35:39

It doesn't seem like for the purposes that

1:35:43

most people use a VPN for,

1:35:45

this is going to make a big difference.

1:35:46

It's obviously a concern.

1:35:48

I don't think that the NSA should be

1:35:50

allowed to spy on American citizens,

1:35:52

just like the CIA shouldn't be able to.

1:35:54

None of these...

1:35:56

None of these agencies really have

1:35:58

authority to do that,

1:36:00

and they probably are anyways through

1:36:02

loopholes like this.

1:36:04

But there's a lot of ways that they

1:36:06

can get around this,

1:36:07

and it's not really the intent of a

1:36:09

VPN in the first place.

1:36:13

They're much more useful for like sharing

1:36:15

an IP address with other people so that

1:36:17

you...

1:36:19

so that your data can't be, like, uniquely

1:36:21

identified in logs or by data brokers and

1:36:24

stuff like that.

1:36:26

So, yeah,

1:36:28

it's not something that I think is going

1:36:30

to protect from government surveillance in

1:36:32

the first place.

1:36:33

I think it's clear that

1:36:36

using an anonymity network like Tor

1:36:41

makes a lot more sense if this is

1:36:43

your threat model,

1:36:44

but also if you are extremely concerned

1:36:46

about being targeted by the government

1:36:47

rather than swept up in their mass

1:36:49

surveillance,

1:36:50

even something like Tor may not be

1:36:52

the best choice for you and you really

1:36:54

need very specific help.

1:36:57

Whatever Edward Snowden is doing,

1:36:59

he has experts working on the best way

1:37:02

for him.

1:37:02

You can't just do that on your own.

1:37:04

But yeah,

1:37:05

for this mass surveillance stuff,

1:37:08

I think commercial VPNs are still all

1:37:10

right, usually better than your ISP.

1:37:14

At least you have a choice between VPN

1:37:16

providers.

1:37:19

But

1:37:21

They're certainly not perfect.

1:37:22

And yeah,

1:37:23

none of this would surprise me if it's

1:37:24

true.

1:37:27

Yeah,

1:37:27

that was kind of my same thought about

1:37:29

using Tor instead for government stuff.

1:37:34

We had another question here about our

1:37:37

outgoing team member, M.

1:37:40

I think I'll leave that one to you,

1:37:42

because you would be more qualified to

1:37:43

know what's going on behind the scenes

1:37:44

with all that.

1:37:47

Um, really, there's a lot of questions here.

1:37:52

Um, this person... so Em, uh, recently posted

1:37:56

on our forum about how she's leaving

1:37:58

Privacy Guides. Uh, she's also posted some

1:38:01

stuff on Mastodon, um, earlier this month, so

1:38:05

this isn't, like, brand-new information, um,

1:38:08

but I don't think we've talked about it

1:38:09

on the forum or the show before. Um...

1:38:14

So,

1:38:16

I guess I'll go through these questions

1:38:18

one by one.

1:38:22

I guess I'll start with the end first,

1:38:24

because there's a couple questions here

1:38:26

that I probably can't get into.

1:38:27

So they asked,

1:38:29

what was the decision behind choosing

1:38:31

which staff to let go?

1:38:32

Are there plans for hiring Em again once

1:38:34

the financials allow for it,

1:38:36

as long as she's also available for hire?

1:38:38

Do we plan on hiring someone else?

1:38:39

Did you guys know we would have to

1:38:42

let someone go for some time,

1:38:44

or did it come as a shock?

1:38:46

A lot of questions about specific people

1:38:50

on our team and the contracts that we

1:38:52

have with them, I can't really get into.

1:38:55

It's not really my place to discuss a

1:38:57

lot of these contract questions and

1:38:59

there's kind of a lot going into it.

1:39:00

It's a very personal situation.

1:39:02

So unfortunately,

1:39:03

I can't really share a lot of information

1:39:08

on that front.

1:39:12

Because yeah,

1:39:12

that's not my place.

1:39:14

That's kind of a personal thing.

1:39:22

Yeah,

1:39:22

I just can't talk about any employer stuff

1:39:25

about specific employees, right?

1:39:27

That is more of a one-on-one thing that

1:39:29

we have with them.

1:39:31

So yeah,

1:39:32

I won't get into really any of that,

1:39:34

unfortunately, for you.

1:39:35

But I can talk about some of your

1:39:37

other questions.

1:39:38

So are there any goals or plans we

1:39:40

have for ensuring good financials to

1:39:42

maintain our current employees?

1:39:45

Yeah,

1:39:47

right now we have pretty solid financials

1:39:49

and there's definitely some plans to

1:39:51

improve fundraising and make more revenue

1:39:54

this year.

1:39:55

I'm not concerned at all about our ability

1:39:57

to keep our team hired.

1:40:01

And so that is not a concern for

1:40:03

me at all.

1:40:05

I think a big part of that will

1:40:07

be to focus more on videos.

1:40:11

We've seen a lot of good growth on

1:40:12

the video YouTube side of things and also

1:40:15

making more stuff for members.

1:40:19

You also asked,

1:40:19

do we plan to make more merchandise?

1:40:22

That is something that I,

1:40:25

we'd certainly like to do this year.

1:40:26

That's not,

1:40:27

the whole shop and merchandise thing isn't

1:40:29

a big, like,

1:40:31

source of money for us or anything.

1:40:34

We're really doing that more as like a

1:40:37

marketing expenditure.

1:40:38

It's good to get designs out there so

1:40:41

people are wearing them,

1:40:42

starting conversations about privacy,

1:40:45

all of that stuff.

1:40:47

Yeah, I hope to have more designs. Um,

1:40:50

we definitely want to sell shirts and

1:40:53

other, you know, other stuff to get

1:40:57

Privacy Guides out there, and I'd be happy

1:40:59

to see more people wearing that stuff and

1:41:01

talking about it and going to conferences

1:41:04

or whatever and talking about privacy. But

1:41:06

that's not a huge, um, like, revenue

1:41:09

generator for us, I will say. Um, so

1:41:11

that is definitely not our main plan to

1:41:14

make money. Um...

1:41:17

They asked about YouTube.

1:41:18

More views mean more revenue.

1:41:22

Yes, they certainly do.

1:41:23

The nice thing is that we've seen a

1:41:24

ton of growth on our YouTube channel so

1:41:27

far.

1:41:28

In the last twenty eight days,

1:41:30

we've got fifty thousand views,

1:41:32

which is thirty three thousand more than

1:41:35

our usual average for a month.

1:41:38

So things are definitely on an upswing and

1:41:42

we hope to continue that going forward.

1:41:48

Have we reached out for grants?

1:41:51

Yes,

1:41:51

we have some grant opportunities that

1:41:54

we've explored.

1:41:55

That's definitely a big thing that we'd

1:41:59

want to do.

1:42:00

We'd love to get grants for specific

1:42:02

projects in the future.

1:42:05

Yes. Are sponsorships on the table?

1:42:08

All of the sponsorship stuff and affiliate

1:42:11

links,

1:42:12

that's not something that we plan to do

1:42:13

on our main content,

1:42:15

like the website or the forum.

1:42:17

It's not totally ruled out for the videos

1:42:20

that we make,

1:42:22

especially because it's so common on

1:42:23

YouTube.

1:42:25

Um, that's still an idea that we're

1:42:27

exploring. We likely could at some point do

1:42:30

sponsorships from, like, companies, uh,

1:42:34

completely unrelated to privacy, if the

1:42:37

opportunity makes sense. I won't say that

1:42:40

we would never do a sponsorship with

1:42:41

this company, but just because they're so

1:42:43

common on YouTube, I would just say, like,

1:42:45

Raid Shadow Legends could be a good

1:42:47

example of something we might do, just

1:42:49

because

1:42:50

They're completely unrelated to privacy

1:42:51

and they,

1:42:53

I don't know if they still do,

1:42:53

but at some point they were sponsoring

1:42:55

like every single YouTube video I watched.

1:42:57

So I won't say that specifically,

1:42:58

but that's kind of an example of something

1:43:00

we could potentially consider.

1:43:02

Another thing we could consider on the

1:43:04

video side of things is like if a

1:43:06

company we do recommend wanted to sponsor

1:43:10

like a tutorial about their product or

1:43:13

something like that.

1:43:15

Like,

1:43:17

it could be any company we recommend,

1:43:19

like Proton,

1:43:20

if they wanted us to do a walkthrough

1:43:21

on ProtonMail.

1:43:22

It's something...

1:43:25

I can't say whether we would do that

1:43:27

or not.

1:43:27

We'll have to explore it if the

1:43:28

opportunity arises,

1:43:29

but it's not something we would totally

1:43:32

rule out at this time.

1:43:34

So on the video side of things,

1:43:36

we may do that in the future in

1:43:39

some cases, but...

1:43:42

Yeah, we'll have to think about that a

1:43:43

lot more. Um, and it's not something that

1:43:45

we have any immediate

1:43:49

plans for, anything like that, um, but it's

1:43:53

potentially on the table. Uh, do we intend

1:43:55

to make more members-only content to

1:43:57

incentivize membership? Uh, yeah, absolutely,

1:44:00

we do want to do more members-only

1:44:01

stuff, especially on the video, uh, side of

1:44:04

things. Um, there's a balance that

1:44:08

we

1:44:09

uh, want to draw, though, because a lot

1:44:11

of our members-only... a lot of the

1:44:14

content that we publish in general, um, we

1:44:16

feel that it's important to get out there

1:44:17

for everyone, so it's very tricky to do

1:44:20

members-only content in the first place. Um,

1:44:23

I'm glad we have so many members

1:44:26

that, um, are generous enough to kind of

1:44:29

keep up with their subscriptions even as

1:44:31

we, uh, haven't published a ton of, like,

1:44:34

members-exclusive content, only, like, early

1:44:37

access to videos and stuff, just because

1:44:40

Those memberships really help us get all

1:44:42

of this content out there and let us

1:44:44

make things that everyone,

1:44:48

hopefully everyone in the world,

1:44:49

anyone interested in privacy benefits

1:44:51

from.

1:44:53

So yeah,

1:44:55

I don't know where that line will be,

1:44:56

but we have some ideas and that is

1:44:58

something that we want to explore further

1:44:59

in twenty twenty six.

1:45:02

I think that's kind of all the questions

1:45:04

I will say.

1:45:08

A lot of the stuff that Em has

1:45:10

been working on recently,

1:45:15

we definitely plan to continue.

1:45:16

So like all of the activism stuff,

1:45:17

we want to keep up with that section,

1:45:20

expanding it.

1:45:21

It's going to be very challenging.

1:45:23

Of course,

1:45:23

it always is losing any team member for

1:45:27

us,

1:45:27

but

1:45:30

We will make do, and we,

1:45:32

we certainly don't want to give up on

1:45:34

any of those projects,

1:45:34

because we have received a lot of positive

1:45:38

responses to things like that.

1:45:39

And we're hoping to expand it.

1:45:40

So if you guys do like that stuff,

1:45:45

definitely let us know.

1:45:45

But with all of that, we still plan

1:45:48

on

1:45:49

sharing that with people,

1:45:51

trying to expand it as much as we

1:45:52

can and keep it up to date and

1:45:55

really promoting it to people in

1:45:58

organizations who can use those resources

1:46:01

because they're really fantastic

1:46:03

resources.

1:46:06

Yeah,

1:46:07

I think that's all I would have to

1:46:09

say on that topic.

1:46:11

I know there's a lot of questions,

1:46:13

and I just said a lot of stuff.

1:46:14

But hopefully,

1:46:15

that answers most of your questions.

1:46:18

And if you have any other questions for

1:46:20

us, let me know.

1:46:23

I did want to add a couple things

1:46:25

to the questions about YouTube

1:46:26

specifically,

1:46:27

since that's kind of what I do.

1:46:30

They asked about upping the production

1:46:32

quality.

1:46:35

I'm not under the delusion that we have

1:46:36

the best production quality.

1:46:37

I know in a perfect world,

1:46:38

I would have a studio with multiple

1:46:40

cameras and everything.

1:46:43

And I mean this genuinely,

1:46:44

like I'm not offended by this question.

1:46:46

Like, what do you mean?

1:46:48

Feel free to like offer suggestions.

1:46:50

I can't promise we'll do them because

1:46:51

again,

1:46:51

we are limited by financial constraints,

1:46:54

space constraints, equipment constraints,

1:46:56

editing constraints.

1:46:58

But I mean,

1:46:58

if you have any very specific, like,

1:47:00

you know, oh, other channels do,

1:47:01

because honestly, that's how, at least for me,

1:47:04

that's how I learn a lot of my

1:47:05

tricks is, you know,

1:47:06

watching other channels.

1:47:07

And I'm like, oh,

1:47:08

I really like the way they do their

1:47:09

titles or I really like the way they

1:47:10

do their transitions or whatever.

1:47:12

So yeah,

1:47:14

if you have any specific ideas,

1:47:16

I'm personally all ears. Again,

1:47:17

can't promise we'll do it.

1:47:18

Maybe it's just, you know. But yeah.

1:47:22

And then you said current videos get

1:47:24

around one to ten K views.

1:47:25

And yeah, I mean,

1:47:26

they're gonna fluctuate a lot,

1:47:28

especially because we do so many different

1:47:30

kinds of videos.

1:47:31

Like we do some interview videos,

1:47:34

we do some tutorial videos,

1:47:35

we do some

1:47:36

We're going to do some tutorial videos.

1:47:38

We do some videos that are more entry

1:47:40

level, like here's encrypted messaging.

1:47:42

And our next video is going to be

1:47:44

a little bit, not more advanced,

1:47:46

but it's not going to be quite so

1:47:47

entry level.

1:47:47

So, I mean,

1:47:50

it's kind of a wide range of topics.

1:47:52

So the views are going to fluctuate,

1:47:53

but I mean...

1:47:54

What I was taught when learning YouTube is that

1:47:58

anytime you get more views than you have

1:48:00

subscribers,

1:48:00

that technically counts as going viral.

1:48:02

So considering we don't quite have ten

1:48:04

thousand subscribers yet,

1:48:05

getting ten thousand views,

1:48:08

twenty thousand,

1:48:09

thirty thousand,

1:48:10

which I know is

1:48:13

those are the exception, but we have some

1:48:14

videos that really racked up quite a few

1:48:16

views. And, you know, our hope

1:48:18

is that eventually, of course, someday we

1:48:20

want to get a quarter of

1:48:22

a million subscribers, and we want our

1:48:23

videos to get a million views each.

1:48:25

We definitely want to get there. But

1:48:28

yeah, it's like Jordan said here: I

1:48:29

think we punch above our weight, for sure.

1:48:32

So just to kind of put it

1:48:34

in context, we're still very much

1:48:35

growing, I think.

1:48:38

Oh,

1:48:39

and I had another thought that got away

1:48:40

from me.

1:48:41

Oh, yeah.

1:48:42

Just the other thing to remember is that

1:48:45

we are a very small team.

1:48:46

I think with Em leaving,

1:48:47

I don't know if I can say how

1:48:49

many staff members there are,

1:48:50

but there's not many.

1:48:51

And we all wear a lot of hats.

1:48:53

So a lot of these bigger YouTube channels,

1:48:57

like Veritasium and Fern and stuff like

1:48:59

that, they typically have...

1:49:04

Like,

1:49:05

at least one person whose sole job is

1:49:07

to write and research and write the

1:49:09

script.

1:49:09

And one person whose sole job is to

1:49:11

film the script.

1:49:12

And one person whose sole job is to

1:49:13

edit the script.

1:49:14

And one person whose sole job is social

1:49:16

media management.

1:49:17

But, you know, here we've got...

1:49:20

I cut most of the clips for the

1:49:21

shorts, which, again,

1:49:22

I haven't been doing lately.

1:49:23

I'm sorry.

1:49:24

But I cut most of the vertical clips

1:49:26

and stuff.

1:49:27

And I think Jordan and Jonah mostly handle

1:49:29

the social media.

1:49:31

And Jordan does most of the editing

1:49:33

because they're just so much better at it

1:49:35

than me.

1:49:35

But I try to at least do the

1:49:36

basic cuts and stuff.

1:49:38

So just kind of keep that in mind,

1:49:42

I guess,

1:49:42

just that we are –

1:49:44

I'm not trying to make excuses.

1:49:45

I'm just saying that when I said feel

1:49:48

free to offer suggestions,

1:49:49

sometimes we just may not have the

1:49:50

manpower to do something a certain way,

1:49:52

but we definitely want to get there for

1:49:53

sure.

1:49:54

I mean,

1:49:55

we talk every week about growth and what

1:49:57

our strategies are and what we can do

1:49:58

next to get the message of privacy out

1:50:01

to more people.

1:50:02

Yeah.

1:50:03

Yeah.

1:50:03

Some of those YouTube channels you

1:50:06

mentioned, I think they're definitely...

1:50:09

misleading, not in a malicious way, but

1:50:11

just that I don't think a lot of

1:50:12

people understand there are huge teams

1:50:16

behind them. I mean, the ones that you

1:50:18

mentioned, Veritasium, Fern,

1:50:21

they've got more than one person working

1:50:23

on all of those things. They've got multiple

1:50:24

editors, they're doing

1:50:26

multiple animators. People don't think

1:50:29

about that, because I think a lot of

1:50:30

people just think about the person on the

1:50:33

screen on a YouTube channel doing

1:50:35

everything, and that is definitely not the

1:50:36

case for those larger channels.

1:50:38

But yeah, definitely open to suggestions

1:50:41

on what we can do. I

1:50:42

wouldn't rule anything out, so let me know

1:50:44

what you'd like to see.

1:50:48

I'm

1:50:54

not under any delusions that we're making

1:50:56

perfect videos, but I do think that

1:50:58

our videos are pretty good, and I think

1:50:59

that a lot of what we can do

1:51:01

maybe better comes down to

1:51:04

marketing those videos and somehow finding

1:51:06

out the best way to work within

1:51:10

the algorithms and stuff to get it out

1:51:12

to more people. And there's certainly

1:51:13

improvements we could make there, whether

1:51:15

it's with the script or whether it's just

1:51:17

with titles and thumbnails. But I don't

1:51:21

think the quality is that bad,

1:51:25

personally.

1:51:29

I think that we're set up pretty well

1:51:30

to get a lot more views in the

1:51:32

future, thankfully.

1:51:33

And we're also almost at ten thousand

1:51:36

subscribers.

1:51:37

Probably could be as early as tomorrow or

1:51:41

next week based on how these numbers are

1:51:44

going.

1:51:44

So that's exciting stuff.

1:51:46

Yeah, I'm super excited for that.

1:51:48

That's going to be a big milestone for

1:51:49

me.

1:51:50

Also,

1:51:52

Seas said that it's World of Warships

1:51:55

sponsoring everybody now.

1:51:56

So we need to look into that.

1:51:56

We'll see.

1:51:57

All right.

1:52:03

I'm going to scroll back up to the

1:52:04

top here.

1:52:05

It's been a little bit of a quieter

1:52:08

week.

1:52:10

But let's see here.

1:52:11

We talked about ships.

1:52:16

Somebody asked about age verification.

1:52:17

We talked about that.

1:52:21

I know there were some questions.

1:52:23

I just have to go find them.

1:52:24

Oh, yeah.

1:52:25

Jordan mentioned, not really a question,

1:52:27

but here in the US,

1:52:29

if emails are more than a hundred and

1:52:30

eighty days old,

1:52:31

they don't require a warrant.

1:52:34

Um,

1:52:34

so this is one of the reasons that

1:52:36

we encourage, uh,

1:52:37

encrypted email providers like Proton and

1:52:39

Tuta, and Mailbox.org.

1:52:41

If you turn on Mailbox.org Guard and

1:52:43

use that, is because if you've got Gmail

1:52:46

or Yahoo or whoever,

1:52:48

I forget what it's called,

1:52:49

but it's basically this legal doctrine

1:52:51

where the government treats your emails as

1:52:53

abandoned property,

1:52:54

which is completely insane because like,

1:52:57

I'm sure,

1:52:58

especially those of us who are older,

1:52:59

you probably have like, I don't know,

1:53:01

letters from, okay,

1:53:02

using me as an example,

1:53:03

I was in bootcamp and my family sent

1:53:06

me a lot of letters.

1:53:07

I used to be in the military.

1:53:08

And so if I had kept those letters,

1:53:10

I'm sure my mom did,

1:53:12

if I kept those letters and put them

1:53:13

in a shoebox in the closet,

1:53:14

now that I've been out for over a

1:53:15

decade,

1:53:16

I don't think anybody would be like, yeah,

1:53:18

those are abandoned.

1:53:18

Like, no,

1:53:19

those are my memories sitting there.

1:53:20

Like, it's completely insane.

1:53:21

But if you use an encrypted provider,

1:53:23

then...

1:53:25

The cops can't access it anyway,

1:53:26

so it's a moot point.

1:53:32

Jordan asked if we saw that the FBI

1:53:33

director's email got hacked.

1:53:35

I did see that on social media.

1:53:37

I didn't have a chance to read the

1:53:40

story.

1:53:40

I just saw it this morning,

1:53:41

but I saw that that had showed up,

1:53:43

and I saw a lot of jokes about

1:53:45

it.

1:53:45

Like,

1:53:46

his password is probably Kash with a

1:53:48

dollar sign and stuff like that.

1:53:53

Yeah,

1:53:54

I believe it's just his personal email,

1:53:57

which probably...

1:54:00

I mean,

1:54:00

I don't think it's like a threat to

1:54:02

the government,

1:54:02

but it is probably a pretty embarrassing

1:54:06

reason for him, because

1:54:13

I'd imagine he's using a major service

1:54:15

like Google or Apple iCloud that

1:54:21

doesn't have security issues,

1:54:22

so it was probably more of an OPSEC

1:54:25

failure from the director of the FBI that

1:54:27

caused this rather than any service issue.

1:54:36

Yeah,

1:54:36

I haven't read that article myself either,

1:54:40

but...

1:54:42

Yeah, it doesn't look promising.

1:54:45

I haven't heard a lot about what's

1:54:47

in the emails, anything yet.

1:54:50

So I don't know if they're going to

1:54:51

parse through it, but.

1:54:54

So that came in like last minute today.

1:54:56

I think, yeah, this article from The

1:54:58

Guardian says it was a personal Gmail

1:55:00

address, and the government has also said

1:55:06

there's no government information.

1:55:07

It's all just personal stuff.

1:55:08

It sounds like a lot of random pictures

1:55:12

of them, and historical emails.

1:55:16

In the case of Google,

1:55:20

there's

1:55:20

plenty of ways to protect your

1:55:21

account.

1:55:22

If you're going to be a Gmail user,

1:55:23

you can have a strong password.

1:55:25

You can have

1:55:27

the Advanced Protection Program.

1:55:30

To call this a hack is probably

1:55:35

not the most accurate because I would

1:55:38

imagine there were plenty of things the

1:55:41

director of the FBI could do to secure

1:55:45

his data.

1:55:47

It probably wasn't like getting into the

1:55:50

mainframe of Google servers or anything

1:55:52

like that.

1:55:52

It was probably pretty mundane.

1:55:55

But what can you expect from the current

1:55:58

government?

1:56:04

Yeah, right.

1:56:11

Let's see here.

1:56:13

Okay, I'm catching up to the front now.

1:56:23

Yeah, I think that was all for questions.

1:56:25

I thought there was another one.

1:56:27

Oh, yes, here it is.

1:56:28

Anonymous,

1:56:30

fourteen sixty five said that there should

1:56:32

be a section on the website for router

1:56:33

scenarios.

1:56:34

I have no idea how to set one

1:56:35

up or be private for it.

1:56:37

The problem with detailed tutorials about

1:56:39

like,

1:56:40

here's how to get started and here's how

1:56:41

to do it is they get really outdated

1:56:44

really fast.

1:56:45

And so it would almost turn into like

1:56:48

a full-time job just trying to keep

1:56:51

these tutorials current. And, yeah,

1:56:56

I mean,

1:56:56

just trying to keep them current and

1:56:57

keeping on top of like, oh,

1:56:59

the UI changed and this option's over here,

1:57:00

and they added this new option.

1:57:02

And so, yeah,

1:57:05

we do want to do some tutorials in

1:57:06

the future.

1:57:07

I think I mentioned that earlier,

1:57:08

but it's, uh,

1:57:09

it's definitely a bit of a challenge to

1:57:11

figure out.

1:57:13

It's,

1:57:13

it's a challenge to keep them as evergreen

1:57:15

as possible.

1:57:17

Yeah.

1:57:18

I wouldn't rule it out.

1:57:21

But yeah,

1:57:23

we've been talking about doing something

1:57:25

like that for a while.

1:57:29

Which, I mean, just among the volunteers,

1:57:31

actually one of them really wants to do

1:57:33

that eventually,

1:57:34

but hasn't been able to yet.

1:57:37

But hopefully we can do something like

1:57:39

that.

1:57:48

Jordan asked if we saw that iCloud Hide

1:57:53

My Email was traced back to somebody after

1:57:56

a warrant.

1:57:57

I would imagine that's probably the case

1:57:59

for any of these aliasing services,

1:58:01

because that is how email works.

1:58:05

They can typically tie it to your mailbox.

1:58:15

Yeah. Most interesting story, I

1:58:20

feel like, but it also involved FBI

1:58:24

Director Kash Patel, because it was

1:58:26

regarding an email sent to his girlfriend.

1:58:30

So a lot of Kash Patel stuff going

1:58:32

on in the email space this week. I

1:58:34

don't know what's up with that. It's been

1:58:41

a busy day for him, I guess.

1:58:48

I mean, at the end of the day,

1:58:49

like that,

1:58:50

the aliasing services and email in

1:58:52

general,

1:58:52

that's not going to apply to serious

1:58:57

threats.

1:58:58

Right.

1:58:59

So it's,

1:59:02

it's more like spam prevention. Or again,

1:59:05

kind of,

1:59:06

like the VPN thing.

1:59:08

It's good to protect yourself against

1:59:12

mass surveillance or data brokers,

1:59:14

because

1:59:17

using a different email for every website

1:59:19

is a good protection against your

1:59:22

accounts being correlated,

1:59:24

among other things.

1:59:25

But you know,

1:59:27

it's an email aliasing service.

1:59:29

It's not like a unique identity generator

1:59:32

for everything on the internet.

1:59:34

And they certainly can be linked together.

1:59:38

That's not really what these email

1:59:41

aliasing services are for.

1:59:48

Yeah, for sure.

1:59:53

I actually did remember one more question

1:59:55

that I skipped past.

1:59:56

I had another tab open.

1:59:58

Where did it go?

2:00:01

Oh yeah,

2:00:01

so earlier when we were talking about

2:00:03

social media,

2:00:05

somebody asked if there was a book on

2:00:06

the subject.

2:00:07

And I'm assuming you mean a book on

2:00:09

the,

2:00:09

because we were talking about how social

2:00:10

media is designed to be really addictive.

2:00:14

The Age of Surveillance Capitalism by Shoshana

2:00:16

Zuboff touches on this a little bit.

2:00:17

Not so much the design of social media

2:00:21

itself, but just like how big tech works,

2:00:23

what their playbook is for invading your

2:00:25

data.

2:00:27

Jaron Lanier has a book called Ten

2:00:30

Arguments for Deleting Your Social Media

2:00:31

Accounts Right Now.

2:00:32

I have not read it personally,

2:00:34

but I know that is one.

2:00:35

And then a couple others I haven't heard

2:00:37

of,

2:00:37

but I found mentioned when I went looking

2:00:39

for answers, Hooked by Nir Eyal,

2:00:43

Addiction by Design by Natasha Schüll,

2:00:46

and The Shallows,

2:00:47

What the Internet is Doing to Our Brains

2:00:48

by Nicholas Carr.

2:00:49

Again, have not read any of those,

2:00:51

but those did come up when I searched

2:00:53

that subject, so.

2:00:55

Because when I saw that you asked that

2:00:56

question, I was like,

2:00:58

I know there are some,

2:00:59

but I'm drawing a blank.

2:01:00

So those are what I found.

2:01:11

All righty.

2:01:13

Last chance for questions in the chat,

2:01:15

everyone.

2:01:15

Otherwise, we'll start wrapping this up,

2:01:21

I think.

2:01:22

Got a bit more to share here, but...

2:01:32

But yeah.

2:01:36

I think we can close off questions here

2:01:39

then, probably.

2:01:40

Thanks for tuning in, everyone.

2:01:43

All of the updates from This Week in

2:01:45

Privacy,

2:01:45

we share them on our blog every week.

2:01:47

So you can sign up for the newsletter,

2:01:49

or you can subscribe with your favorite

2:01:51

RSS reader if you want to stay tuned

2:01:54

on all of this and get all of

2:01:55

the sources.

2:01:56

For people who prefer audio,

2:01:58

we have a podcast version available on all

2:02:01

podcast platforms and RSS.

2:02:03

We also sync the recording of this video

2:02:05

to PeerTube.

2:02:07

Privacy Guides is an impartial nonprofit

2:02:10

organization that's focused on building a

2:02:12

strong privacy advocacy community and

2:02:15

delivering the best digital privacy and

2:02:17

consumer technology rights advice on the

2:02:20

internet.

2:02:20

If you want to support our mission,

2:02:22

You can make a donation on our website

2:02:24

at privacyguides.org slash donate.

2:02:28

You can make a donation by going to

2:02:31

any page on our website and clicking the

2:02:33

red heart icon located in the top right

2:02:35

corner of the page.

2:02:36

You can contribute using standard currency

2:02:39

via debit or credit card,

2:02:41

or you can opt to donate anonymously using

2:02:44

Monero or with your favorite

2:02:45

cryptocurrency.

2:02:47

Becoming a paid monthly member will unlock

2:02:49

exclusive perks like early access to video

2:02:52

content and priority during the This Week

2:02:55

in Privacy live stream Q&A.

2:02:57

You'll also get a cool badge on your

2:02:59

profile on the Privacy Guides forum, and the

2:03:02

warm,

2:03:02

fuzzy feeling of supporting independent

2:03:04

media.

2:03:05

Thank you all for watching.

2:03:07

We will see you next week.