Are Privacy Opt-Outs Useless?
Ep. 49

Episode description

An independent privacy audit of Microsoft, Google, and Meta has found that privacy opt-outs didn't have an effect; Cal.com has moved from open source to closed source; Netgear has been exempted from the FCC's non-US-made router ban; and more! Join us for This Week In Privacy #49.

0:04

All right.

0:06

A new study finds that big tech tracks

0:08

you even when you've opted out.

0:09

Cal.com will no longer be open source and

0:12

some big developments in US and EU privacy

0:16

and surveillance.

0:17

All this and more coming up on This

0:19

Week in Privacy number forty nine.

0:21

So stay tuned.

0:47

Welcome back to This Week in Privacy,

0:49

our weekly series where we discuss the

0:51

latest updates with what we've been

0:53

working on within the Privacy Guides

0:55

community and this week's top stories in

0:58

data privacy and cybersecurity.

1:01

I'm Jordan,

1:02

and with me this week is Nate.

1:04

How are you, Nate?

1:06

I'm good.

1:07

It's been a busy week,

1:08

but I guess I can't complain.

1:10

How are you?

1:11

Yes, also a busy week,

1:13

but now let's jump into the biggest news

1:16

in privacy and security from the past

1:18

week.

1:19

So this story here from 404

1:22

Media: Google, Microsoft, Meta,

1:24

all tracking you, even when you opt out,

1:27

according to an independent audit.

1:32

Uh,

1:32

an independent privacy audit of Microsoft

1:34

Meta and Google web traffic in California

1:37

found that the companies may be violating

1:39

state regulations and racking up billions

1:42

in fines.

1:43

According to the audit from privacy search

1:46

engine webXray.

1:48

Fifty-five percent of sites it checked set

1:51

ad cookies in a user's browser even if

1:53

they opted out of tracking.

1:55

Each company disputed or took issue with

1:58

the research,

1:59

with Google saying it was based on a

2:01

fundamental misunderstanding of how its

2:04

product works.

2:06

So this company itself, webXray,

2:09

they viewed web traffic on more than seven

2:11

thousand popular websites in California in

2:14

the month of March and found that most

2:16

tech companies ignore when a user asks to

2:19

opt out of cookie tracking.

2:22

And this is specifically concerning

2:25

because California has privacy

2:28

legislation,

2:28

thanks to its California Consumer Privacy

2:31

Act, which allows users to,

2:33

among other things,

2:34

opt out of the sale of their personal

2:38

information.

2:39

And there's basically a system called the

2:41

Global Privacy Control,

2:43

which is basically a...

2:46

In some browsers,

2:47

it's a switch and in some other browsers,

2:50

it's an extension that you have to

2:51

install.
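For context, GPC travels with each web request as an HTTP header, `Sec-GPC: 1`, and is also exposed to page scripts as `navigator.globalPrivacyControl`. A minimal sketch of how a site could check for the signal server-side (the function name and dict-of-headers shape are ours, not from any particular framework):

```python
def gpc_opted_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control
    opt-out signal (the `Sec-GPC: 1` request header defined by the
    GPC specification)."""
    # HTTP header names are case-insensitive, so normalize the keys.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


# A site honoring GPC under the CCPA would treat this the same as a
# "Do Not Sell or Share My Personal Information" request.
print(gpc_opted_out({"Sec-GPC": "1"}))       # True
print(gpc_opted_out({"User-Agent": "..."}))  # False
```

The "switch" and "extension" the hosts mention are just different ways of getting the browser to attach that one header.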

2:54

According to the webXray audit,

2:56

Google failed to let users opt out

2:58

eighty seven percent of the time.

3:01

Google's failure to honor the GPC opt out

3:03

signal is easy to find in network traffic.
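An audit like this boils down to a simple check: browse a site with GPC enabled, record which domains set cookies anyway, and flag known ad-tech domains. The sketch below is purely illustrative of that idea (the domain list, cookie data, and function name are made up, not webXray's actual code):

```python
# Hypothetical list of ad-tech domains an auditor might watch for.
AD_TECH_DOMAINS = {"doubleclick.net", "facebook.com", "bing.com"}

def flag_violations(observed_cookies: list[dict]) -> list[dict]:
    """Return cookies set by known ad-tech domains despite a GPC opt-out."""
    return [
        c for c in observed_cookies
        if any(c["domain"].endswith(d) for d in AD_TECH_DOMAINS)
    ]

# Example cookie observations after loading a page with Sec-GPC: 1 sent.
observed = [
    {"name": "IDE", "domain": "doubleclick.net"},   # third-party ad cookie
    {"name": "session", "domain": "example.org"},   # first-party cookie
]
print([c["name"] for c in flag_violations(observed)])  # ['IDE']
```

Any ad cookie that shows up in the flagged list after the opt-out signal was sent is the kind of evidence the audit describes as "easy to find in network traffic."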

3:06

Now,

3:06

I think this is kind of always been

3:09

a

3:10

concerning thing, right? There's these

3:12

opt-out signals, and we're not really sure

3:13

how effective they are, right? Because a lot

3:17

of these signals themselves, right, they are

3:21

often ignored. Like, we saw the Do Not

3:23

Track signal that used to be kind of

3:25

big, right? Um, that was also ignored by

3:28

a lot of websites, and now we're also

3:29

looking at this new thing, which is

3:32

GPC. So a lot of companies basically argue

3:37

that they're not really sure what this

3:39

means, and they're just gonna track you

3:41

anyway, which is kind of silly, right? So,

3:45

you know, it's not really that surprising

3:47

to see that so many websites don't comply

3:50

with this. Did you have any thoughts on

3:52

this, Nate? I feel like this is

3:56

unfortunately kind of like... I assumed this

3:59

was kind of going on, so...

4:03

Yeah, I mean it's um...

4:06

I don't know.

4:06

I do have a I mean,

4:07

I always have thoughts on things.

4:10

I mean, first of all,

4:11

it's it's I do want to point out,

4:13

well, OK,

4:14

to assume this is always going on.

4:15

I agree with you.

4:16

But I do want to note a couple

4:17

of things that GPC is supposed to be

4:20

an improvement over do not track because

4:23

GPC is actually legally recognized under

4:27

certain privacy laws like the California

4:29

Consumer Privacy Act, for example.

4:30

So websites are required to honor it.

4:33

So.

4:34

I'm with you.

4:34

When this was initially announced,

4:36

GPC specifically, I was kind of also like,

4:39

I don't know,

4:40

why would companies listen to this?

4:42

Like they already don't listen to things.

4:43

But I was also kind of hopeful because,

4:45

again, it is like legally required.

4:46

And we have seen in the past that

4:47

typically companies will – and I'll touch

4:51

on this in a second in the article.

4:52

But like companies do –

4:56

they kind of like to ignore things right

4:58

up until they get caught.

4:59

And then they're like, ah, okay,

5:00

you got me.

5:01

We'll play along.

5:01

Or, you know,

5:02

they'll at least start to play along.

5:03

Usually it's,

5:04

they kind of have to get caught a

5:05

few times,

5:06

but they get caught and they change what

5:07

they do.

5:07

And, um, so I don't know.

5:10

I,

5:10

I guess I was hoping that maybe this

5:11

would go somewhere and it still might if,

5:13

if that's what happens here.

5:14

But, um,

5:15

I think the real issue here,

5:16

and this person that they interviewed from

5:20

webXray kind of talked about this.

5:22

Oh, where did it go, actually?

5:24

Okay, yeah.

5:24

So this person who used to work at

5:26

Web, or excuse me, Timothy Libert,

5:28

who founded webXray,

5:29

he used to work at Google.

5:31

And

5:32

He said that he told 404

5:33

Media he felt his job at Google was

5:35

to protect its users,

5:36

but his bosses didn't agree.

5:37

And he left the company in twenty twenty

5:38

three to start webXray.

5:40

And this is a quote from him.

5:41

Shortly before I left,

5:42

my boss told me, direct quote:

5:43

My job is to protect the company.

5:45

There was another time I got into a

5:46

very serious ontological discussion with a

5:48

fairly senior engineer about what the

5:49

difference was between taxes and fines.

5:51

And they didn't understand there was a

5:53

difference.

5:54

And I think this is something that a

5:55

lot of people in the privacy space have

5:57

noticed is I think the issue here is

6:00

these companies,

6:01

the fines are not really fines.

6:03

They're just cost of doing business.

6:04

Like I remember,

6:06

I wish I could remember which story it

6:07

was,

6:08

but Meta got in trouble for something and

6:11

they got issued a fine.

6:13

And the article kind of openly said –

6:15

like they didn't make a point of saying

6:16

it.

6:16

It was kind of just like a real

6:17

quick sentence that if you weren't paying

6:19

attention, you wouldn't even notice it.

6:20

But the article basically said like,

6:21

oh yeah,

6:22

Meta is going to contest the fine because

6:24

basically it's bigger than they thought it

6:26

would be.

6:26

Like they don't even care that they got

6:27

fined.

6:27

They don't even care that they're wrong.

6:29

They're just like, no, no, no.

6:30

We set aside a certain amount of money

6:31

to pay these fines, quote-unquote fines,

6:34

which are really just a cost of doing

6:35

business and a tax like he said.

6:37

But it's more than we budgeted for,

6:40

and that's why we're going to fight it.

6:41

It would be like if you got up

6:42

to the register and you're going to buy

6:43

– like you go to the grocery store,

6:45

the corner store, whatever,

6:46

and you're going to buy a soda.

6:47

And they're like, oh, it's five dollars.

6:49

It's like, whoa, whoa, whoa.

6:49

Hold on.

6:50

I have five dollars,

6:51

but this should only be three dollars or

6:54

maybe it is five dollars now with

6:55

inflation.

6:55

But you know what I mean?

6:56

It's like it's not even that I don't

6:57

have the money.

6:58

That's not how much I set aside for

6:59

this thing,

7:00

and that's basically how these companies

7:01

treat it.

7:01

Yeah.

7:04

I think to give a little bit of

7:07

a benefit of the doubt,

7:07

I think it's tricky to fine these

7:10

companies sometimes in the sense that

7:13

like,

7:14

These big tech companies like Meta,

7:16

Microsoft, Google,

7:17

you want to be able to levy a

7:19

fine that's going to hurt them and going

7:21

to make them pay attention and stop doing

7:23

this crap.

7:24

But at the same time,

7:25

you need to write the laws in a

7:26

way where it's like Privacy Guides,

7:28

for example.

7:29

It doesn't wipe us out if we –

7:31

not that we do any of this stuff,

7:32

but if we make a mistake somehow and

7:34

we're accidentally collecting something we

7:36

didn't know, I don't know,

7:37

just throwing it out there.

7:39

A fine that is one percent of Meta's

7:42

hourly revenue would wipe us out.

7:44

And so you want to find that ground

7:46

where you're not destroying the small

7:49

guys, but at the same time,

7:50

you're still hurting the big guys.

7:52

And I do have some sympathy for that.

7:54

But at the same time,

7:55

I feel like as far as I know,

7:57

there's no laws being weighed right now or

8:00

suggested.

8:01

So, I mean,

8:01

it's not like they're really trying to fix

8:03

that.

8:04

But that's really the problem is just that

8:06

the penalties for doing this stuff are

8:08

just a cost of doing business.

8:10

And yeah, I don't know,

8:13

really unfortunate, but I guess.

8:16

I think, oh, sorry.

8:18

No, go ahead.

8:19

I think like I do think it's interesting

8:21

what you said is the the the stuff

8:24

about fining a company based, like,

8:27

you know,

8:27

if you don't want to destroy all the

8:29

small companies,

8:30

I feel like we could kind of get

8:31

around that.

8:32

Maybe if we had like, you know,

8:35

proportional of their earnings,

8:37

like based on their profits or revenue,

8:39

maybe.

8:41

But, you know,

8:42

obviously I feel like our governments are

8:44

too

8:45

captured by these big tech companies and

8:48

lobbying and all that sort of stuff. Um,

8:50

so it's probably not super likely, but I

8:53

think, you know, having a fine that is

8:55

actually proportionate to how much they

8:58

make. Because, I don't know, I guess that

9:00

might be hard to argue from a, from

9:03

a, um, a damage perspective, right? Like, if

9:07

they were, if they were collecting, let's

9:09

say,

9:10

all of Americans' data, that wouldn't even

9:13

be that much of the entire globe, right,

9:15

for Meta, because Meta has like billions of

9:17

users. So, um, it'd be hard to say,

9:20

like, just, uh, in California, you know, even

9:24

less. So, um, I feel like it might

9:27

be hard to argue the fine being so

9:29

large, I guess. Um, but it's still a

9:32

problem. Uh, I don't know what the

9:36

answer is, but I do think they need

9:37

to be fined more, especially for not, um,

9:41

complying with this stuff. But it's good,

9:43

though. I didn't realize the GPC stuff

9:45

actually was, um, related to like legal

9:48

stuff, like there was a legal precedent

9:50

behind it. So that's good. Um, but I

9:54

guess it's like you said, it's kind of

9:57

the cost of doing business for these

9:59

companies.

10:01

Yeah, it's very – like I said,

10:03

that was kind of what gave me hope

10:04

for it because when I first heard about

10:05

it too, I'm like, we already did this.

10:07

What's different this time?

10:08

But it's the legal enforcement.

10:10

But it's – I don't know.

10:13

The straight proportional thing,

10:15

I have mixed opinions on because we'll

10:16

take – if my project –

10:22

I'll just say it.

10:22

Like The New Oil,

10:23

I published transparency reports.

10:25

I made twenty thousand dollars last year,

10:26

which was by far the most I've ever

10:29

made.

10:30

And so let's say a ten percent penalty.

10:31

Right.

10:31

That's two thousand dollars.

10:33

I don't have that in the bank right

10:34

now.

10:34

Most of that money has been spent on

10:35

various things.

10:37

But, you know, ten percent,

10:39

two thousand dollars for me is a lot

10:40

of money that would wipe me out.

10:43

Whereas for Meta, you know,

10:44

ten percent of their what,

10:45

ten trillion dollars they made last year

10:47

or whatever.

10:47

I don't know.

10:48

I could look it up.

10:48

But you know what I mean?

10:49

Like ten percent for them,

10:50

they're already paying four percent.

10:51

They don't care.

10:52

Like it's just it it doesn't scale the

10:54

same.

10:54

You know,

10:55

a person who's making one hundred thousand

10:57

dollars a year and gets a ten percent

10:58

speeding ticket.

11:01

To them,

11:01

that's just a much smaller amount as

11:03

opposed to a person who's making forty

11:05

thousand dollars a year.
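The arithmetic behind this point is easy to make concrete. The $20,000 project income and the 10% rate come from the episode; the big-tech revenue figure is purely hypothetical:

```python
def flat_percentage_fine(annual_revenue: float, rate: float = 0.10) -> float:
    """A flat percentage fine: same rate, very different real-world impact."""
    return annual_revenue * rate

small_project = 20_000        # the project income figure from the episode
big_tech = 100_000_000_000    # hypothetical big-tech annual revenue

# Same 10% rate in both cases:
print(flat_percentage_fine(small_project))  # 2000.0 -> existential for a small project
print(flat_percentage_fine(big_tech))       # 10000000000.0 -> a budget line item
```

This is the scaling problem the hosts describe: a flat percentage is ruinous at one end and pre-budgeted "cost of doing business" at the other.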

11:06

So, I mean, I hear you like that.

11:08

I just feel like that's not really.

11:10

I feel like that's not really a

11:11

sustainable solution.

11:14

Personally, I could be wrong, but.

11:16

I don't know.

11:18

It's tricky.

11:18

But yeah,

11:19

I think that is the solution is until

11:20

we like get some kind of better legal

11:23

enforcement,

11:24

I don't think these companies are going to

11:25

stop doing what they do.

11:26

And I think the,

11:28

what was his name again?

11:28

This Libert, this Timothy Libert,

11:31

he even said that too in this article.

11:33

But I think,

11:37

I guess a question for you,

11:38

what would you recommend?

11:39

Because I mean,

11:40

I think the solution here is

11:42

In my opinion would be that we,

11:44

we kind of, you know,

11:44

a lot of people argue that laws don't

11:45

work and that we need to force companies

11:47

to respect our wishes.

11:49

And I think they kind of got a

11:50

point with this kind of stuff.

11:51

Like,

11:51

I don't know if I'd go so far

11:52

as to say laws don't work,

11:53

but I think we need to do what

11:55

we can to force companies to respect our

11:57

wishes regardless.

11:58

Right.

12:00

I mean, I think it's,

12:02

I think people get too caught up in

12:04

black and white thinking, right?

12:05

Like you can definitely use the laws as

12:08

well as doing things to protect yourself,

12:10

right?

12:10

Like I wouldn't like just use a browser,

12:13

not harden it at all,

12:14

share as much information as possible,

12:16

rely on this like somewhat nebulous global

12:20

privacy thing.

12:23

Like, you know, I think it'd be,

12:26

a thing that you could use that along

12:29

with like a fingerprint resistant browser

12:32

to protect yourself a bit more, um,

12:35

not share as much information,

12:36

use email aliases,

12:39

secondary phone numbers,

12:40

stuff like that to protect yourself.

12:42

Um, because yeah, a lot of cases,

12:44

I'm not really sure if this global privacy

12:46

control thing is

12:49

to be respected, like this company said. But

12:52

I also think it's kind of interesting that

12:53

this person, um, I think you read earlier

12:58

that they were, um,

13:00

part of the Google team working on like

13:04

the cookie compliance stuff.

13:06

I think it's like,

13:08

I'm not entirely sure what they're

13:10

expecting.

13:11

Like you go to work at the largest

13:15

data collector in the world and you expect

13:16

them to care about like respecting

13:18

people's privacy.

13:19

I'm not really sure.

13:21

I mean,

13:21

I guess maybe you could make the argument

13:23

that like you are trying to change it

13:25

from the inside, but like,

13:27

I'm not like friends don't let friends

13:29

work at like big tech corporations.

13:31

Like let's, let's, you know,

13:32

it's not a great idea.

13:36

Yeah.

13:36

I, um, first of all,

13:37

I think that was a great answer with

13:38

the black and white thinking.

13:39

I think you're right.

13:40

Um, I'm,

13:41

I'm a real big fan of using multiple

13:43

approaches.

13:43

Right.

13:44

So like, yeah,

13:44

you should use a hardened browser and,

13:46

and Tor VPN, all that kind of stuff,

13:49

but also.

13:50

we should push for better privacy laws and

13:52

stuff like that.

13:53

So fantastic answer.

13:54

Thank you for saying that.

13:55

But yeah, I mean,

13:56

as far as this guy specifically,

13:57

I don't know when he started at Google.

13:59

So it could be like, you know,

14:00

there was that book, Careless People,

14:02

that was written by the lady,

14:03

Sarah Wynn-Williams,

14:04

that used to work at Facebook.

14:06

And to be fair,

14:06

she got there back in like, what,

14:08

two thousand...

14:10

like eight or something like back when,

14:12

when Facebook was still like had the

14:14

potential to be good and she kind of

14:16

watched it become the cancer that it is

14:18

now.

14:18

So I don't know,

14:20

like if this dude had been there from

14:21

the start back when Google used to say,

14:23

don't be evil.

14:24

And back when you Google stood up to

14:25

China and all that kind of stuff,

14:27

then I could kind of see, but yeah,

14:28

I feel like, um,

14:30

I feel like with these big tech companies

14:31

these days,

14:31

you kind of have to hit a point

14:32

where you're just like,

14:33

they're not going to change.

14:34

Like, you know what you're getting into.

14:35

And I don't, I don't, I have, um,

14:39

I feel this way about a lot of

14:40

systems.

14:41

I want to be careful how I say

14:42

this,

14:43

but I feel like there's certain systems

14:45

around the world that just kind of like

14:48

I don't know.

14:48

I'm kind of cynical on if they can

14:50

be changed.

14:50

It's like you either get corrupted and

14:51

become part of the problem or you get

14:53

forced out because you refuse to fall in

14:55

line because you're trying to make things

14:57

better.

14:57

And unfortunately,

14:59

I think big tech is one of those

15:00

systems that nine out of ten times or

15:01

ninety nine out of a hundred times.

15:04

It just it is what it is.

15:05

And it's hard to change.

15:07

It's an uphill battle.

15:07

So.

15:09

But yeah.

15:10

Yeah,

15:11

but I think like to talk a little

15:14

bit more about like using those both

15:16

angles on this approach, like, you know,

15:18

trying to enforce these privacy laws.

15:20

We do have an activism section on our

15:22

website now.

15:22

So

15:25

You can check that out at

15:26

privacyguides.org slash activism.

15:29

There's some stuff in there about like how

15:31

to contact your, actually, I'm not sure,

15:34

it's not live yet,

15:35

but there was a section in the works

15:37

for contacting your data protection

15:41

authority.

15:42

And there's also a lot of bunch of

15:43

tips on there about, you know,

15:46

all sorts of things to basically build a

15:50

movement behind trying to get better laws

15:53

passed and stop people from

15:57

you know,

16:00

stop politicians from passing these

16:02

terrible laws.

16:02

So I think that's also important.

16:04

But like,

16:05

you can do multiple things at the same

16:06

time.

16:07

And I think that's kind of why people

16:08

get kind of confused.

16:10

They'll be like, oh,

16:11

these laws are like always getting

16:13

bypassed.

16:14

They're so useless.

16:15

It's like, well,

16:16

there are some laws that have done

16:17

something.

16:18

Like we can all argue that the GDPR

16:20

has

16:21

had an impact, right?

16:22

Like the right to delete has become a

16:25

lot more common since the GDPR came around

16:27

and that used to be such a pain

16:29

to like delete your information from

16:31

websites.

16:32

And it's had an effect even outside the

16:34

EU as well.

16:35

So I think, you know,

16:37

there's definitely examples of things that

16:41

have worked pretty well.

16:44

So it's just a matter of advocating for

16:51

better legislation.

16:52

I think it's definitely possible that

16:54

there's a lot of bad stuff right now,

16:56

especially with the age verification

16:57

stuff.

16:59

I think in large part it's just because

17:02

people aren't getting riled up enough

17:06

about it.

17:06

I'm sure the politicians would probably

17:08

change their mind if people were

17:12

protesting outside parliament or outside

17:14

your government buildings.

17:16

So I think there's certain things we can

17:19

do to sway people on it.

17:24

But yeah, that's sort of my thoughts.

17:29

Yeah.

17:30

Agreed.

17:30

It's,

17:31

it's, politicians, at the end of the

17:33

day,

17:33

will do whatever keeps them in power.

17:35

So if something proves to be wildly

17:37

unpopular,

17:38

they're going to find a way to walk

17:39

it back.

17:40

Nine times out of ten, if not a hundred.

17:42

So.

17:43

Um, real quick,

17:44

before we move on to the next story,

17:46

uh,

17:46

Jonah gifted five Privacy Guides

17:48

memberships on YouTube.

17:49

So if you're on YouTube and you're kind

17:51

of like listening to us in the background

17:52

or something, uh, check that out.

17:54

You could,

17:55

you could get a free membership trial and,

17:56

uh,

17:57

get some early access to some upcoming

17:58

videos.

17:59

So thank you, Jonah.

18:01

But, uh,

18:02

if that's all we've got on that story,

18:04

um,

18:05

we're going to move and next we're going

18:07

to talk about Mastodon and, um,

18:11

Mastodon,

18:12

I think most of our listeners probably

18:14

know.

18:14

I think most of you guys,

18:15

or not most of you guys,

18:16

but I think a lot of you are

18:19

probably currently Mastodon users or have

18:22

used Mastodon in the past.

18:24

Let us know if you are a Mastodon

18:26

user.

18:26

One in the chat for yes,

18:27

two for no.

18:28

But in the meantime,

18:29

we're going to talk about some upgrades.

18:30

Mastodon got a grant from the Sovereign

18:32

Tech Agency Fund.

18:34

And so the Sovereign Tech Agency is

18:37

something from Germany.

18:38

I pulled up the Wikipedia page here.

18:41

And basically,

18:41

it's a part of the German federal

18:43

government.

18:44

It's part of their budget that aims to

18:46

promote and secure open source

18:47

foundational technologies.

18:49

It tries to make the open source ecosystem

18:51

more resilient against external attacks,

18:53

thereby enhancing cybersecurity and

18:54

resilience across the German economy.

18:56

And so in the past,

18:57

they funded things like, let's see here,

19:00

Arch Linux with, oh my God,

19:02

over half a million euros.

19:03

That's crazy.

19:04

FFmpeg, FreeBSD, GNU, GNOME.

19:09

Oh, my gosh.

19:10

All kinds of stuff.

19:10

OpenStreetMap, OpenSSH, PHP,

19:14

so on and so forth.

19:15

WireGuard.

19:15

Yeah, really, really cool stuff there.

19:18

So now they have donated to Mastodon.

19:21

They've awarded six hundred and fourteen

19:23

thousand euros

19:25

And out of that total,

19:26

ninety thousand has been set aside to be

19:28

shared with other Fediverse projects that

19:29

choose to implement the protocols

19:30

developed during the work,

19:32

which we are about to talk about.

19:33

So we did write an article or Freya

19:36

wrote an article about this for Privacy

19:37

Guides earlier this week and focused

19:39

specifically on the on the end to end

19:43

encryption,

19:44

which I will get to that in a

19:45

moment.

19:45

But there's a lot more in here,

19:47

although that is certainly one of the more

19:48

exciting features.

19:49

So there's block list synchronization.

19:51

I know historically that's been

19:53

a bit of a problem on Mastodon is

19:56

moderation.

19:57

A lot of people, I'm told...

20:00

There's a struggle, right?

20:01

And I don't want to get too philosophical

20:03

right off the bat,

20:03

but there's a struggle between...

20:07

We want...

20:08

freedom of speech.

20:10

And we want people to have a space

20:11

where they can say whatever they want,

20:13

even if we don't agree with it.

20:14

But also some people maybe just don't want

20:16

to do that, right?

20:17

Like someday I have days where I know

20:19

I need to like not check the news,

20:21

because I'm just so tired and so mentally

20:23

exhausted.

20:24

And I'm like, dude,

20:24

I'll check it tomorrow.

20:25

Now's not the time.

20:26

And so I understand some people may want

20:28

like an account, for example,

20:29

where they can go and just not see

20:30

anything political or whatever.

20:31

And but the point is,

20:33

it's sometimes been a challenge,

20:34

especially for people who are new to like

20:35

open source technology,

20:36

like maybe back when

20:37

Elon bought Twitter and a lot of people

20:39

were checking out other alternatives.

20:41

You know, some people were like,

20:42

the moderation is difficult and I'm seeing

20:44

a lot of stuff I don't necessarily want

20:46

to see.

20:47

And that's been a thing.

20:48

And so now that's one of the things

20:49

they're working on is enabling Mastodon

20:52

server administrators to subscribe to

20:53

shared block lists,

20:55

which this is totally optional.

20:57

I think one of these days I floated

20:59

the idea of we do want to do

21:01

a Mastodon tutorial,

21:02

like how to self-host Mastodon,

21:03

your own instance.

21:05

And I definitely got the thumbs up from

21:07

Jonah.

21:07

We just haven't gotten around to that one

21:09

yet.

21:09

That one's in the works.

21:11

We have a lot of great ideas for

21:12

videos, but...

21:13

Anyway,

21:13

so that's one of them, block list

21:16

synchronization.

21:18

Remote media storage.

21:19

This is more behind the scenes stuff,

21:20

but it'll just make it easier for server

21:22

administrators.

21:24

They won't need to have quite so much

21:26

storage on hand.

21:28

Mastodon hasn't been too crazy for me,

21:29

but also my instance is a lot smaller.

21:31

So yeah.

21:33

In regards to the spam thing again,

21:34

they have automated content detection,

21:36

which is specifically for like spam or

21:38

illegal materials.

21:40

I...

21:42

I'll come back to that one actually.

21:44

End-to-end encryption, I mentioned that.

21:45

So they're going to use,

21:47

I believe it was Messaging Layer Security,

21:49

MLS.

21:51

I believe I read that in Freya's write-up,

21:53

but I apologize if I'm wrong about that.

21:55

But yeah,

21:56

they're going to add end-to-end encryption

21:58

to DMs,

21:58

which is great because that has

21:59

historically been

22:02

one of the negatives of Mastodon: the

22:03

DMs are not encrypted, and a, uh, an

22:06

administrator could still look at your

22:07

messages if they really wanted to. They're

22:09

going to improve the documentation, and I

22:11

believe they said they're trying to get

22:12

most of this stuff done by the end

22:14

of the year. And then again, there's that

22:15

ninety thousand that's earmarked to help

22:17

other instances or other projects that

22:19

want to take advantage of this. So maybe

22:21

someday we'll see end-to-end encryption

22:23

between, like, Mastodon and Pixelfed, for

22:25

example, or something like that. So

22:27

I think that's super cool.

22:29

The last thing I wanted to add is

22:31

this automated content detection.

22:32

I could see the argument,

22:34

and this is just me kind of thinking

22:35

out loud.

22:35

I could see the argument where like,

22:37

we're not usually fans of this, right?

22:38

Because how long does it take?

22:40

You know,

22:40

maybe illegal material for now means like,

22:43

Um,

22:44

child abuse material or like in Iceland,

22:46

I found out in Iceland,

22:47

technically adult material is illegal.

22:49

I don't think anybody actually enforces

22:50

it,

22:50

but let's say you wanted to err on

22:51

the soft side or err on the side

22:53

of caution and say like,

22:56

I just want to block anything that's

22:57

adult, right?

22:58

You, you could use this for that.

23:00

I could see how it could get a

23:01

little bit tricky if there starts to be

23:02

some kind of pressure to scan for, um,

23:05

protests or something more political,

23:07

but also at the same time,

23:08

that's one of the beauties of things like

23:09

Mastodon, right?

23:10

Is if you start to feel like this

23:11

instance is getting a little bit too

23:12

heavily moderated in a way I don't like,

23:15

you can move to another instance or you

23:16

can self-host your own instance.

23:17

So I think that's definitely one of our

23:19

favorite things about the Fediverse.

23:22

Jordan,

23:23

was there anything in this announcement

23:25

that you jumped out that caught your

23:27

attention or you thought was interesting

23:29

or wanted to talk about?

23:31

Um, yeah,

23:32

I think the block list synchronization

23:35

thing is definitely going to be somewhat

23:37

controversial.

23:38

Like you said, like there was,

23:41

I think I've talked to people about this

23:43

a decent amount, but like, you know,

23:45

people kind of get frustrated that there's

23:47

almost like censorship in quotations of

23:51

like, you know, certain people, um,

23:56

And I think that, you know,

23:58

a lot of times,

24:00

maybe sometimes that can be the case,

24:02

but I think the biggest thing here is

24:04

the.

24:06

um, you know, uh, the, the,

24:10

the small server operators who don't have

24:12

a lot of time.

24:13

So maybe, you know,

24:14

like I know you run your own Mastodon

24:16

instance,

24:16

I'm sure that can sometimes be kind of

24:18

frustrating to see, like, you know,

24:20

CSAM and like awful stuff popping up.

24:23

Um, because, you know,

24:25

a lot of administrators are basically

24:27

having to take care of that themselves.

24:29

Um, so, you know,

24:31

kind of offloading that a little bit to,

24:35

allow that to be a sort of community-based

24:37

effort is a decent way to go,

24:40

I think.

24:42

But I think, you know,

24:43

some people are still going to have a

24:44

problem with this because, you know,

24:46

it can kind of make things become a

24:48

bit like group-thinky, I guess,

24:51

where everyone is sort of

24:53

blocking people based on,

24:57

I know it's not very common,

24:58

but there are a couple of instances that

25:00

have just like de-federated with other

25:01

ones because of, you know,

25:03

beef that they have with each other,

25:05

which is, you know,

25:07

it happens on every platform.

25:08

I think people are like that.

25:10

So I don't think it's really,

25:12

an issue with Mastodon specifically,

25:15

but I do think it's still in a

25:17

better spot because even if you find that

25:19

to be an issue,

25:20

you can start your own instance or you

25:22

can just join one that doesn't have those

25:28

restrictions.

25:28

But I do think it could be better

25:30

to make it more obvious what information

25:34

is being blocked to users of your

25:35

instance,

25:36

because a lot of times it's not exactly

25:38

clear

25:40

what blocklist, I mean,

25:45

I hope it's clear once this gets

25:47

implemented,

25:47

but also just like being able to see

25:50

what is actually blocked by the server so

25:53

you can make a better choice if you

25:55

prefer to join a server that doesn't have

25:56

as many blocked things.

26:00

But yeah, all the other stuff seems

26:03

reasonably interesting, I think. I'm not

26:05

really a big fan of the automated

26:07

content detection, but I guess it's

26:13

kind of needed once the network gets to

26:15

a certain point. Do you have any thoughts?

26:18

Yeah,

26:18

I think the automated detection thing,

26:20

I think it's a blessing and a curse

26:21

because like I said,

26:22

there is the one argument of like,

26:24

this is the same thing that we would

26:25

criticize like Apple or Google for, right?

26:27

But at the same time,

26:28

I think historically,

26:30

Mastodon has had a huge problem with spam.

26:32

And...

26:34

a lot of that, I mean, this is,

26:35

there's,

26:35

there's pros and cons to decentralization.

26:38

Right.

26:38

And that's one of the cons is like

26:40

there are entire servers out there that are

26:42

just like abandoned.

26:44

Like,

26:44

I don't know why the owners are still

26:45

paying for server space,

26:46

but apparently they are.

26:47

And they've got open registration and,

26:49

And I've seen this happen a few times.

26:51

I've been on Mastodon long enough that

26:52

I've seen this happen more than once,

26:54

where for some reason,

26:56

a whole bunch of bots will just go

26:58

and join this one instance that's like six

27:00

versions out of date.

27:02

The admin clearly checked out five

27:04

years ago and registrations are still

27:06

open.

27:07

And so everybody,

27:09

they send their bots and the bots start

27:10

harassing everybody and posting spam.

27:12

And usually it's in another language and

27:13

it's like,

27:14

links to gambling sites or another common

27:17

one that goes around is like,

27:18

this is Mastodon support.

27:19

You need to verify your profile,

27:21

which I hope most Mastodon users are too

27:24

tech savvy to fall for that.

27:25

But at the same time,

27:27

I think there's a,

27:30

not to get too in the weeds here,

27:31

but I think any sort of a platform

27:33

needs to have a philosophical question of

27:36

what's their end goal.

27:37

Because I think if your goal is to

27:39

be like, oh,

27:39

all of our users are too tech savvy

27:41

for that.

27:41

They're not gonna fall for that.

27:43

Then you don't really need to worry about

27:45

weeding out the spam, right?

27:47

Like at this point, it's buyer beware.

27:48

You're expecting your users to have that

27:49

level of tech savviness.

27:52

But if you want something to be,

27:54

what's the word I'm looking for?

27:58

If you want something to be accessible to

28:00

everyone and to gain mainstream traction,

28:03

then these are the things you have to

28:04

think about.

28:05

And so I would certainly appreciate some

28:08

better moderation tools.

28:10

I have my instance set to approval.

28:12

I have to approve everyone.

28:13

I usually do,

28:14

unless I think it's an AI bot,

28:15

which they're usually pretty easy to spot.

28:17

But if you're a real user,

28:18

I don't care why you're here.

28:21

But I think...

28:23

I can't help it when other people are

28:25

spamming, right?

28:26

And, you know,

28:27

I can't be on Mastodon to manage that.

28:29

So it is kind of annoying.

28:33

Yeah, I don't know.

28:34

It's got pros and cons.

28:35

Although again, like I said,

28:35

with the whole,

28:36

we would criticize Apple and Google for

28:37

this, Mastodon's decentralized.

28:39

You know,

28:40

the US government could theoretically come

28:41

up to me and be like, hey,

28:42

you need to start blocking, I don't know,

28:45

anything in Arabic because

28:47

we're beefing with Iran right now, right?

28:49

But alternately, if you're,

28:53

if you're a German instance,

28:54

like the US government has no power

28:56

over you.

28:56

So it doesn't,

28:57

I wouldn't go so far as to say

28:58

it doesn't matter, but it's,

28:59

it's a lot harder for that kind of

29:00

censorship to like really take hold,

29:02

which I think is,

29:03

is an advantage for sure.

29:05

But,

29:06

Yeah.

29:06

And then I've,

29:07

I've also got thoughts on the free speech

29:08

thing, to be honest, but I'll just,

29:09

I'll leave that there.

29:10

Um, like you said, the advantages,

29:12

you can always just go start your own

29:14

instance,

29:14

which Mastodon is one of the more

29:15

user-friendly things that I've looked into

29:17

hosting.

29:17

It's certainly not,

29:18

I wouldn't describe it as like your first

29:19

project.

29:20

I think there's definitely easier things,

29:22

but it's easier than Nextcloud for sure.

29:24

Um, it's,

29:26

it's definitely easier than a lot of

29:27

other, um,

29:29

a lot of other projects in my opinion.

29:31

So

29:32

Yeah,

29:32

and I do think it is good with

29:34

Mastodon because if you disagree with any

29:36

of these things,

29:37

like if you don't agree with blocklist

29:38

synchronization, that's fine.

29:41

You can use like any other Fediverse

29:44

system, right?

29:45

There's loads of other ones you can use.

29:48

You don't have to use Mastodon.

29:50

I just think it's the most popular or

29:52

one of the most popular, I guess.

29:54

I think so, yeah.

29:56

So that's kind of why it has...

29:59

the most features. It's the most feature-rich, I

30:03

guess, and this is just kind of adding

30:05

to that. It is interesting here: I

30:07

did notice the timeline for the end-to-end

30:10

encryption for private messages is twenty

30:12

twenty-seven, and I just think, you know,

30:16

we're going to be on the side of "don't

30:18

use that," so we don't really want that.

30:21

But, I mean, I don't really think

30:22

that's that important. I think, you know,

30:24

I already see people

30:27

doing this on Mastodon, where they link their

30:28

Signal account.

30:30

We would suggest that much more than going

30:34

and using end-to-end encrypted private

30:36

messages.

30:38

Yeah,

30:39

I think actually somebody here did

30:41

mention, yeah, on YouTube,

30:44

Seismic said, "finally,

30:45

a chat besides Signal that I can use."

30:49

I mean,

30:49

we're going to have to wait and see

30:50

what this looks like in the final version.

30:52

I highly doubt it's going to be something

30:54

that we would recommend over Signal or

30:56

even alongside Signal.

30:58

But I always think it's great to have

31:00

more protection wherever possible.

31:01

And I think it is really good that,

31:03

because, you know, there may be,

31:05

times that I want to message somebody.

31:06

And especially in, you know,

31:08

one of the problems that a lot of

31:09

these, uh,

31:10

decentralized services have is there tend

31:12

to be like one or two or a

31:13

handful of servers that get like a massive

31:16

amount of users.

31:17

And so it's a lot of people criticize,

31:20

they're like, well,

31:20

it's not really decentralized because

31:21

everyone's using that server.

31:23

Um, but regardless, you know,

31:25

it's still like,

31:26

if you're talking to somebody,

31:27

like if I message somebody,

31:29

there's a good chance they're going to be

31:30

on, like, mastodon.social.

31:31

Right.

31:32

And so maybe I'm comfortable telling that

31:34

person like my date of birth.

31:37

but I don't want to tell everybody.

31:38

And I don't know who the admin is.

31:39

And I don't necessarily know if I trust

31:40

the admin and, and, you know,

31:42

some Mastodon instances even have like an

31:44

admin account where there may be more than

31:45

one person that has access to it.

31:47

So I think it is really good that

31:48

they're adding this level of privacy,

31:50

but yeah, I don't,

31:52

I doubt it's going to be implemented in

31:53

a way where we're like, well, shoot,

31:55

this is just as good as Signal.

31:56

Everybody just use that, you know,

31:57

but it's still nice to have that extra

31:59

layer of protection for sure.

32:01

So yeah, that is a long ways off.

32:03

So

32:04

Definitely.

32:04

I think, yeah, you're right.

32:06

I think it is important, like you said,

32:08

to have everything

32:12

have some level of protection rather than

32:14

nothing.

32:15

Right.

32:15

Definitely agree.

32:19

And I also saw real quick, somebody asked,

32:22

why are people chatting numbers?

32:23

We were running a poll.

32:24

I think it got moved off when I

32:26

started showing comments,

32:27

but we were running a poll about

32:30

if you were a Mastodon user or not.

32:32

And so you would comment in the chat,

32:33

one for yes and two for no.

32:34

But we'll try another poll in the future.

32:37

So I think for now,

32:39

that's all I've got on that story,

32:40

if we want to move on to the

32:42

next one, unless you have final thoughts.

32:44

Awesome.

32:45

Yeah, no,

32:46

I think we kind of talked about that

32:48

quite thoroughly here.

32:49

So let's move on to the next story

32:50

here.

32:51

And this one has been kind of a

32:53

hot story this week.

32:56

Cal.com is going closed source.

32:59

Here's why.

33:01

So I guess first, you know,

33:03

I think a lot of people in our

33:04

audience may not be familiar with this if

33:06

they're not really into like

33:08

uh, meeting scheduling,

33:10

self-hosting meeting scheduling sort of

33:13

stuff.

33:13

So basically cal.com was, uh, well,

33:16

it is still a thing, right?

33:18

Um,

33:19

there's basically a way to organize

33:21

meeting times with people.

33:22

So you could send someone a link and

33:25

it would have your availabilities.

33:27

And then the other person could select the

33:29

time that works best for them, which,

33:31

you know,

33:34

Personally,

33:34

I've had to do that because we communicate

33:38

across time zones now.

33:39

This is like a global economy.

33:41

So people have to sort of find the

33:45

best time.

33:46

And that is oftentimes across different

33:49

time zones.

33:52

So, um, the,

33:54

the thing here with cal.com is they have

33:57

decided to move to going closed source.

33:59

So originally there was a,

34:01

they had a self hosted version.

34:03

And I think the whole thing with that

34:05

was that it was a full, uh,

34:11

it was a full open source version of

34:13

their, uh,

34:15

service that you could self host yourself.

34:18

And basically they've announced that they

34:20

are diverging from that project.

34:24

And they have been for some time now

34:26

they've actually been working on a closed

34:28

source version.

34:29

Um,

34:30

and that's the version that runs on

34:31

cal.com and they have introduced a new

34:34

service, which is cal.diy, which is,

34:39

a self-hosted version, and I do want to talk

34:41

a little bit about that. But first, let's

34:43

kind of talk about the reasoning behind

34:45

going to this closed-source model. So they

34:50

posted a video here saying that AI

34:54

is killing open source, stating that, you

34:56

know, AI vulnerability scanners

34:59

are basically making it really hard to

35:03

keep up with patching vulnerabilities

35:06

because, you know,

35:07

they're able to scan the software and find

35:09

vulnerabilities much easier than spending

35:11

hours and hours as a, you know,

35:15

professional hacker or whatever, you know,

35:17

like a threat actor,

35:19

a proper threat actor.

35:20

They can kind of find these

35:22

vulnerabilities without

35:25

being that technical, is what I'm trying to

35:27

say.

35:28

So basically that's kind of their

35:30

reasoning behind this.

35:32

Their reasoning for moving to closed

35:33

source is security.

35:35

And I think that's kind of where we

35:37

kind of fundamentally disagree with this

35:39

because I think the source model of your

35:42

software doesn't actually have an impact

35:44

on security, right?

35:48

There's still ways to, you know,

35:51

analyze software that is closed source.

35:54

There's still ways to, you know,

35:57

test software, crash software,

35:58

find vulnerabilities.

36:01

So that's an interesting take.

36:04

I think

36:06

One thing that Jonah brought up, you know,

36:09

we have like a staff group chat,

36:11

he brought up that the cal.diy project

36:14

looks kind of sus.

36:17

If you go to the website cal.diy,

36:20

there is actually a lot of warnings all

36:24

over the page,

36:24

which kind of makes it seem like they

36:28

may not be following,

36:30

they may not really be updating this.

36:32

This seems like, you know,

36:34

something that is sort of

36:37

risky;

36:37

they're kind of making it seem like it's

36:39

extremely risky to use.

36:41

Um, so this is kind of strange.

36:44

I think, uh,

36:45

I don't really know why they have such

36:49

a large warning on like every single page

36:51

or like at the top of the introduction

36:54

page, um, saying: "Use it at

36:56

your own risk. It's the open-source community

36:58

edition and is intended for users who want

37:01

to self-host their own Cal.DIY instance.

37:04

It's strictly recommended for personal,

37:06

non-production use. Please review all

37:09

installation..." blah blah blah. Like, it's

37:11

quite strange. And it says, um,

37:16

below that, there's like an ad for their

37:18

commercial service, which is

37:20

closed source now. So, you know,

37:25

we've always kind of been

37:28

saying that, you know,

37:30

the source model, I don't think, has

37:32

an impact on the privacy or security.

37:34

And yeah,

37:36

like Jonah said in the chat here,

37:38

literally fearmongering about open source

37:41

actually.

37:42

Like this is like the silly arguments that

37:43

we hear from like people who don't really

37:45

know what they're talking about and who

37:46

say like,

37:48

"open source, that's so much worse,

37:50

because then everyone can see the

37:51

code and hack you." It's just

37:54

not really true; it just means there's

37:55

more scrutiny. And I think, you know,

37:59

it doesn't really make that much sense

38:01

to do this. We kind of talked

38:04

about this a little bit in our group

38:05

chat, but this company itself,

38:09

cal.com is venture capital backed and

38:12

basically what that means is there's

38:14

people who

38:16

invest money in the company to, you know,

38:19

gain a stake in the company, I guess.

38:21

And they want to be able to earn

38:24

a return on that money that they've

38:26

invested.

38:27

And in a lot of cases, you know,

38:29

open source software opens the company up

38:33

to having their ideas and direction

38:36

possibly copied by a competitor or to

38:38

allow insights into their company from a

38:40

competitor.

38:41

And I think,

38:44

This is kind of a little bit silly.

38:46

I think, you know,

38:47

if you're making a really good product,

38:49

which I think cal.com is making a really

38:52

good product,

38:53

then you shouldn't be concerned about

38:57

someone stealing your ideas.

38:58

Like I'm,

38:59

I'm kind of not familiar with that

39:01

ever being the case.

39:03

Um, I think it keeps the,

39:05

it keeps your company kind of, uh,

39:09

You don't even... like, you can have an

39:11

open source license that doesn't allow

39:13

people to use it for commercial use.

39:15

You can still have the software be open

39:17

source.

39:18

You can have source-available source

39:19

code.

39:20

So that's why I'm kind of confused by

39:22

this

39:24

move here. But I did want to

39:28

hand it over here to Nate because there

39:30

was actually a bit of a clapback

39:32

here from Discourse, which is basically the

39:35

forum software that we use for our forum.

39:37

But I'll just hand it over to Nate

39:39

here to kind of tackle that.

39:42

Sure.

39:43

Um, yeah.

39:44

So quick shout out to our forum,

39:46

discuss.privacyguides.net.

39:48

Uh, we are powered by discourse, which,

39:51

um, I,

39:53

it seems like a nice piece of software

39:54

as far as I can tell.

39:55

I haven't had to deal with it behind

39:56

the scenes.

39:57

Jonah does all our, our hosting,

39:59

but it seems to work pretty great.

40:01

And, um,

40:02

I mean, I'm not going to mince words.

40:03

This was absolutely,

40:06

like a clapback was a good way to

40:08

put it.

40:08

This was a response.

40:10

But I want to give a shout out

40:12

to Discourse because to me,

40:13

this felt like a very,

40:15

it was very direct.

40:16

It was not watered down.

40:21

but it was also not overly aggressive or

40:23

unprofessional.

40:24

And I feel like I don't see that

40:25

a lot of days,

40:25

a lot of the time these days,

40:26

and I just really appreciate that.

40:28

So thank you, Discourse.

40:29

This did not pull any punches,

40:31

but was also...

40:34

I don't know, just very professional,

40:35

in my opinion,

40:36

as professional as calling somebody out

40:38

can be.

40:38

But yeah,

40:39

so Discourse literally said, "Discourse is

40:40

not going closed source,"

40:41

which mirrors the cal.com title, "cal.com is

40:43

going closed source."

40:44

Yeah, that was a direct quote.

40:48

And basically,

40:48

they kind of said everything that Jordan

40:50

said, actually, which is, you know,

40:52

they said here that like,

40:54

their reasoning is that AI has made open

40:55

source too dangerous for software as a

40:57

service companies,

40:58

code gets scanned and exploited

40:59

by AI at near-zero cost.

41:02

Actually real quick before I dive into

41:03

that,

41:03

the cal.com one did have one statement

41:06

that I did want to kind of sympathize

41:07

with them a little bit.

41:09

So they talked about in recent months,

41:11

we've seen a wave of AI security startups

41:13

productizing this capability,

41:14

which they're talking about scanning the

41:17

source code.

41:18

Each platform surfaces different

41:19

vulnerabilities,

41:20

making it difficult to establish a single

41:22

reliable source of truth for what is

41:23

actually secure.

41:24

So the way I compared this,

41:27

I don't remember where I said this,

41:28

but the way I explained this to somebody,

41:31

or I kind of summarized it is like,

41:33

if you're at home,

41:33

like let's say you just moved to a

41:34

brand new country, right?

41:36

Like not even a state,

41:37

you're in a totally unfamiliar place.

41:39

And all of a sudden,

41:39

a bunch of like salespeople come knocking

41:41

on your door, insurance salespeople.

41:42

And this one guy is like, hey,

41:44

you need flood insurance.

41:45

And the next guy is like, no, no,

41:46

no, no, no.

41:46

There's a lot of wildfires.

41:47

You need wildfire insurance.

41:48

And the next guy is like, no,

41:49

you need tornado insurance.

41:50

And the next guy is like, no,

41:51

you need earthquake insurance.

41:52

And you're like,

41:53

I don't know what insurance I need.

41:55

And so Cal.com was basically like,

41:57

I'm just not going to get any insurance.

41:58

I'm just going to stop answering the door

42:00

is basically what they did.

42:01

Yeah.

42:02

So I want to give them a little

42:04

bit of credit because I understand how

42:05

that could be frustrating when you've got

42:07

so many different companies and they're

42:08

all giving you conflicting results.

42:10

And it's like, well,

42:10

now I've only got so many people.

42:12

I've only got so many hours in the

42:13

day.

42:14

We can only fix so many things.

42:15

However, you know, discourse here,

42:18

they said,

42:18

I understand where they're coming from.

42:19

The industry is changing fast.

42:20

New AIs with capabilities are being

42:22

released every few weeks.

42:23

It's a scary world.

42:24

And I completely agree that open source

42:25

companies need to adapt.

42:26

I do not agree with the decision that

42:27

closing source is the solution.

42:30

And, um, you know, they basically had

42:32

two main points. One of them was exactly

42:34

what Jordan said: going closed source

42:37

is, uh, for anybody who's new here,

42:39

what we like to call security through

42:40

obscurity, and that basically means

42:44

it's the code equivalent of hiding under

42:45

the bed, right? Like, if I hide under

42:47

the sheets, the monsters can't see me.

42:49

That's basically what it is. And they

42:50

point out here in this blog post,

42:53

they say that, um,

42:55

Closed source has always been a weaker

42:56

defense than people want to admit.

42:58

A web application is not something you

42:59

ship and want to keep hidden.

43:00

Large parts of it are delivered straight

43:02

into the user's browser on every request.

43:03

Things like JavaScript, API contracts,

43:05

client-side flows, validation logic,

43:07

and feature behavior.

43:08

Attackers can inspect all of that.

43:10

And then there was another spot.

43:13

Did I already pass it?

43:13

Oh,

43:14

those same AI systems don't actually need

43:16

your source code to find vulnerabilities.

43:18

They work against compiled binaries and

43:19

black box APIs.

43:21

I will admit that I do not know...

43:24

a lot about technical stuff and code, but

43:27

I do know that I see a lot

43:30

of, um,

43:32

I see a lot of people reverse engineering

43:33

apps, right?

43:34

Proprietary apps.

43:35

And they decompile it and they find ways

43:37

to get in there and go, oh,

43:39

look at what this app is doing.

43:40

Look at all the calls home it's making.

43:41

Look at the fact that the traffic is

43:43

not encrypted.

43:43

What's this server it's contacting?

43:45

So clearly this is not,

43:47

like the blog post says,

43:49

it doesn't need to be open source.

43:50

People can find ways into this stuff and

43:52

they've been doing it for years.

43:53

And so that doesn't actually stop

43:55

anything.

43:55

It's just security through obscurity.

43:57

And it's...

43:59

I think security through obscurity can be

44:01

part of a larger defense.

44:03

I don't know about in this case,

44:04

but in general,

44:05

I think there's times when it can be

44:07

like a data removal, right?

44:08

If you pay for a data removal service,

44:10

like Easy Opt-outs is one that we

44:11

recommend on the website.

44:15

That's a good start.

44:16

But also like...

44:17

using a PO box whenever you're able to,

44:19

like not putting your address in every

44:21

single form online.

44:22

Like, you know,

44:22

it's part of a larger defense.

44:24

I wouldn't rely on that by itself.

44:26

And so the other point they made,

44:29

this is a very, very long post.

44:33

They said that, yeah,

44:35

Basically, they think that this is a...

44:37

The security argument is a convenient

44:39

frame for decisions that are actually

44:40

about something else.

44:41

So one is, you know,

44:42

Jordan mentioned that competitors can read

44:43

your architecture and your product

44:44

thinking.

44:45

And then there's governance.

44:46

They said open source communities push

44:47

back.

44:48

They file issues about decisions they

44:49

don't like.

44:49

They fork.

44:50

It's exhausting to manage.

44:53

I mean, fair.

44:55

I will be the first to admit that

44:56

every once in a while,

44:57

I do get burned out on the community

44:59

and I need a break.

44:59

But I don't know if that's a good

45:00

enough reason to close source your code.

45:03

So yeah, it's...

45:06

And just to go back to the

45:08

competitors thing,

45:09

Jordan pointed this out too.

45:10

There are a lot of companies that are

45:12

open source and they're thriving.

45:14

Look at Bitwarden, for example.

45:15

I mean, granted, they do have investors,

45:17

but they're still open source.

45:19

You can self-host Bitwarden.

45:20

They have instructions on how to self-host

45:22

Bitwarden.

45:23

There clearly is a way to do both.

45:25

And I do wonder if...

45:28

cal.com explored any of those options.

45:31

It does sound kind of like there was

45:33

just a lot of investor pressure, and this

45:35

was just the easy button, right? Like, "if

45:37

we go closed source, that's going to make

45:38

it harder for people to self-host; they're

45:40

going to have to pay for us. We'll

45:41

slap a bunch of scary warnings on our

45:43

DIY page." Which, yeah, that's not cool.

45:46

And actually, to make that even worse, if

45:48

I can go back to their blog, they

45:50

did say that, um... where did it go,

45:52

here, um...

45:56

God dang it.

45:57

Okay, yes.

45:58

While our production code base has

45:59

significantly diverged,

46:00

including major rewrites of core systems

46:02

like authentication and data handling,

46:04

we want to ensure that there is still

46:05

a truly open version.

46:06

So basically,

46:08

the cal.diy version is completely

46:10

different from the cal.com version,

46:13

which raises a lot of questions for me.

46:14

And they also make it sound like,

46:17

I don't know if they're actually doing

46:19

this,

46:19

but they kind of almost made it sound

46:20

like, here's Cal.DIY.

46:22

We'll update it if we feel like it

46:23

every once in a blue moon.

46:24

But otherwise, like, we don't care.

46:26

This is just kind of shut up the

46:27

purists, which, again,

46:28

is a really crappy take from a community

46:30

you claim to have...

46:32

valued, and whatever. But yeah, so this

46:36

is a really long blog from Discourse, but

46:38

it is worth a read. And again,

46:40

I really applaud that they pulled no

46:41

punches, but it also wasn't, you know,

46:45

just like, "oh, it's a PR

46:47

opportunity." Like, here's our facts, here's

46:48

our experience, our reasoning. So I

46:51

really give them a lot of credit for

46:52

that one. But yeah, that was a

46:57

It's such a wild story, and it's so...

46:59

I hate to assume malice in a company,

47:01

but yeah, it's so... What turned me off,

47:03

I think, was just the fact that, again,

47:05

that's all it was,

47:06

was we're just going to go closed source,

47:09

and that's going to fix all our problems.

47:10

And almost immediately,

47:12

I saw everybody was just kind of like,

47:14

is it though?

47:15

Is that really what this is about?

47:16

And it just kind of...

47:19

I think that's going to do a lot

47:20

more damage than if they had just admitted

47:22

like, hey,

47:24

this business model isn't working for us

47:25

and we're going to try something else.

47:27

I think they might end up losing a

47:28

lot more customers because of the way they

47:30

handle this.

47:30

I don't know.

47:32

Do you have any additional thoughts to the

47:33

discourse response or anything?

47:36

Yeah,

47:37

I just think trying to pass this off

47:40

as solely for security reasons,

47:42

I think is

47:44

to people that actually follow and

47:46

understand security is just laughable.

47:50

And I think unfortunately those people are

47:53

in a lot of cases,

47:53

they're going to be the people that self

47:55

host this software.

47:56

So they're going to be the ones that

47:59

realize you're being kind of crappy about

48:01

it.

48:01

Right.

48:02

Um,

48:04

I think they should have been a bit

48:05

more clear about the reasoning because,

48:09

you know,

48:10

we don't know if there's another reason

48:13

why,

48:13

like we were talking about with the VC

48:15

investors.

48:15

But I think, you know,

48:18

especially when we have like, you know,

48:21

so many ways to analyze software um that's

48:26

closed source even um so you know people

48:29

can do like fuzzing they can feed programs

48:32

a bunch of random data to get it

48:33

to fail they can um do binary analysis

48:38

so you can inspect memory dumps of like

48:40

applications when they they run uh like

48:44

reverse engineering stuff so you know i

48:47

think it's

48:50

kind of, a little bit, uh, it's,

48:55

it feels a bit disingenuous.

48:58

That's the word I was looking for.

48:59

Thank you.

49:00

Um, but yeah,

49:01

like we see this a lot,

49:03

like even the opposite way around,

49:05

like there's,

49:06

there's malware that we see and, you know,

49:08

we're able to analyze that malware,

49:10

stuff like that, um,

49:12

to see what it's doing and to understand

49:16

what the code might be.

49:19

So anyway,

49:20

I don't think them switching to closed

49:22

source is going to make it any,

49:26

I mean,

49:26

surely maybe a little bit possibly to

49:29

these,

49:29

to these basic AI vulnerability things,

49:31

but I don't think it's a good enough

49:33

reason to switch this because yeah,

49:37

I think it's, yeah,

49:42

it just feels really not great when

49:43

they're trying to make up a reason that

49:45

doesn't really exist.

49:47

Yeah, and I mean,

49:48

something that just popped into my head

49:49

is, you know, one of,

49:50

there's a lot of reasons you might make

49:52

something open source or even source

49:53

available, like you mentioned.

49:55

But one of the reasons I think is

49:57

that

49:58

it increases the chance that somebody

49:59

could find a vulnerability, right?

50:01

I want to make it clear real quick

50:02

that open source does not automatically

50:04

mean that something is more secure or more

50:06

private.

50:07

It just means that the opportunity is

50:08

there.

50:09

And I think there is a certain critical

50:11

mass where when we're talking about these

50:12

bigger projects like Bitwarden or maybe

50:14

Proton or some of these,

50:16

because I know Proton,

50:17

some parts of them are open source,

50:18

some parts aren't.

50:19

But you know what I mean?

50:19

When we're talking about big projects like

50:21

that, then...

50:22

I think odds are it probably is more

50:24

secure just because they're a big project

50:26

and they've got a lot of eyes on

50:27

them.

50:28

But especially for some of these smaller,

50:30

like mid-level projects,

50:31

I don't know how true that necessarily is.

50:33

It's probably not true,

50:35

but the opportunity exists.

50:37

And where I'm going with that is I

50:39

think, especially in this community, um,

50:42

There's such a dislike for AI.

50:46

I think they're almost going to reverse.

50:50

They're almost shooting themselves in the

50:51

foot.

50:52

If this really were about security,

50:54

they're kind of shooting themselves in the

50:55

foot because the bad guys are still going

50:57

to use AI.

50:58

They don't care.

50:59

They're going to use any advantage they

51:00

can to get ahead.

51:01

They don't play by any rules.

51:02

The good guys,

51:04

not all of them will be using AI.

51:05

Right?

51:06

And they're playing by a different set of

51:08

rules.

51:08

So you almost need to like...

51:11

Like we've said a million times now,

51:12

the bad guys are going to find the

51:13

vulnerabilities no matter what,

51:14

whether it's open source or not.

51:15

By making it closed source,

51:16

the only people you're stopping are the

51:18

good guys who are not using AI.

51:21

So yeah, that's... I don't know.

51:23

That just kind of popped into my head.

51:26

Yeah, I think...

51:28

I'm not sure if I a hundred percent

51:30

agree on the privacy and security aspect.

51:32

I think it's more like a transparency

51:33

thing, which I mean...

51:36

is good, uh, for, like, trust and, like,

51:40

stuff like that. But, like, uh, I mean,

51:44

I think there's definitely, there could

51:45

be closed source software that's just as

51:47

private as some open source software, or

51:50

there could be open source software

51:52

that's just as private as

51:55

closed source software.

51:56

So, you know, it's, I don't know.

51:58

Uh,

52:00

it just seems like a really silly reason

52:02

to me, but I think we,

52:04

obviously we're going to,

52:05

we're going to push for transparency.

52:06

Like transparency is important, um,

52:10

rather than kind of a black box,

52:12

which we have to work out things

52:14

ourselves.

52:15

Um, so yeah.

52:20

Agreed.

52:21

Um,

52:22

Alrighty.

52:24

Well, before we dive into,

52:26

we have a story coming up about, uh,

52:28

Well,

52:29

some updates to age verification or

52:31

identity verification,

52:32

let's put it that way.

52:33

But first,

52:33

we're going to pause and talk about some

52:35

updates with what we've been working on at

52:37

Privacy Guides this week.

52:38

So in the video department,

52:40

we're really excited.

52:42

Bit of a soft announcement here.

52:43

On Sunday,

52:44

we're going to release an interview with

52:46

Carissa Véliz.

52:48

And if you guys don't know who that

52:49

is, you definitely should look her up.

52:50

You're missing out.

52:52

She wrote this awesome book called Privacy

52:54

is Power.

52:56

I'll grab it in a minute,

52:56

but I actually have it on my bookshelf

52:57

back there.

52:58

And it's honestly,

53:00

like I could gush about this book because

53:01

it is so accessible.

53:04

You know, it's so like,

53:06

I don't want to take drive-bys at other

53:08

authors,

53:08

but some other authors have written some

53:11

very seminal works in the space that were

53:13

very academic and kind of hard to read

53:15

and pretty dense.

53:17

And Carissa Véliz is, I mean,

53:19

she's a professor of ethics at Oxford

53:21

University.

53:22

So she is very academic as a person,

53:24

but her writing is so like plain English

53:26

and down to earth.

53:27

Like I could give this book to anybody

53:29

and maybe they wouldn't read it because

53:30

it's not their cup of tea,

53:32

but they absolutely could read it because

53:33

it is written in such plain English and it's

53:36

Um,

53:36

but it's still, like, full of useful

53:38

information.

53:38

So yeah, I, as you can tell,

53:40

I'm a huge fan, but, uh,

53:41

we got to interview her and we talked

53:43

about her focuses on AI and ethics and,

53:46

you know,

53:46

what is AI going to do for the

53:49

future of, of our society?

53:51

Uh,

53:51

we did talk about privacy a little bit.

53:54

Um, I mean, it was a,

53:55

it was a great conversation.

53:56

I, uh, again, not to like

53:58

fanboy too much,

53:59

but I was telling people like,

54:01

I felt like I was smarter just for

54:02

having been in the same figurative room as

54:03

her.

54:04

Um,

54:04

unfortunately this was a remote interview,

54:05

not an in-person one, but, um, yeah,

54:07

so that's going to be out on Sunday.

54:09

She's absolutely awesome.

54:10

Go read Privacy is Power.

54:11

If you haven't, uh,

54:12

I've already pre-ordered her new book and

54:14

you will get a taste of that on

54:15

Sunday.

54:16

So definitely subscribe on YouTube or

54:19

PeerTube.

54:19

And we'll be posting that when we come

54:21

out or when it comes out,

54:24

I can't talk tonight.

54:28

Yeah, no,

54:28

I'm really excited for the interview to

54:30

get released.

54:32

I've been working on like the editing side

54:34

of things.

54:35

Oh, there it is.

54:36

There's the book.

54:37

It's kind of a very recognizable cover as

54:40

well.

54:42

But yeah,

54:43

I definitely am a fan as well.

54:46

I think, yeah.

54:49

And Nate asked some,

54:50

some really good questions in the

54:51

interview about a lot of things that she

54:54

hasn't talked about publicly as much,

54:57

I would say.

54:59

And a lot of stuff that was in

55:00

the book itself.

55:01

So it's like a teaser,

55:04

like she's going to talk about some of

55:05

the stuff in the book and, you know,

55:09

I think it's interesting, yeah.

55:11

So she's got a new book coming out

55:12

called Prophecy,

55:13

which is about AI prediction stuff.

55:20

So, yeah, that's also pretty interesting.

55:22

So that could be interesting to check out

55:24

as well.

55:25

I think it's on pre-order until April 19th

55:28

or 21st, 21st, April 21st.

55:33

So it is looking quite interesting for

55:35

that.

55:38

But this week we also had some privacy

55:40

guides news posts.

55:42

So we had,

55:45

looks like we had a couple from Freya

55:47

and also a couple from Nate as well.

55:50

So Nate did one on HackerOne pausing its

55:53

Internet Bug Bounty.

55:56

So they also kind of were saying that

55:58

they were having an issue with AI bug

56:00

reports, which that's another problem,

56:04

I think.

56:05

There was a data breach roundup from Nate

56:07

as well.

56:09

which I think is important to keep on

56:10

top of, just scan the list.

56:11

Just check it out and scan the list

56:13

because you never know what you might be

56:15

caught up in.

56:16

And I think companies are getting to a

56:20

point where they are

56:22

being a bit more accountable where

56:23

they're, you know,

56:25

sending out notices to people,

56:26

but it's also good to keep on top

56:28

of that.

56:30

And there was also some articles here from

56:33

Freya, like Nate talked about earlier,

56:37

there's Mastodon getting end-to-end

56:38

encryption, private messages.

56:41

So Freya had an article about that.

56:44

Fiverr exposing information of its users

56:47

publicly on Google search results.

56:50

Oh my goodness.

56:51

It's horrible.

56:53

India dropping proposals to require

56:56

biometric ID app after strong opposition.

57:01

So yeah,

57:02

there's a lot of interesting things going

57:04

on in India regarding that.

57:06

And there was also some stuff about Google

57:09

Chrome adding protection against cookie

57:11

stealing malware.

57:13

But yeah, kind of interesting,

57:15

interesting week.

57:17

So definitely check out the

57:18

privacyguides.org slash news section.

57:23

I guess with that being said,

57:26

all this is made possible by our

57:28

supporters.

57:29

And you can sign up for a membership

57:31

or donate to privacyguides.org.

57:35

or you can pick up some swag at

57:37

shop.privacyguides.org.

57:39

Privacy Guides is a nonprofit which

57:42

researches and shares privacy-related

57:44

information and facilitates a community on

57:47

our forum and matrix where people can ask

57:49

questions and get advice about staying

57:51

private online and preserving their

57:54

digital rights.

57:55

And yep, if you want to do that,

57:57

you can visit privacyguides.org and press

57:59

the red heart icon in the top right-hand

58:02

corner.

58:03

of the website.

58:04

You'll also be able to sign up for

58:06

a membership and get sweet perks as well.

58:09

But now let's talk about the future of

58:11

warrantless surveillance in the U.S., Nate.

58:17

Yeah.

58:17

All right.

58:18

As the, uh, as the American,

58:20

I guess I get to talk about this

58:22

fun little topic and, uh,

58:24

that is Section 702, which, um,

58:26

many of you may not be super familiar

58:29

with.

58:29

I, for the record, um,

58:32

I follow many different news sources.

58:34

The other news source that came up in

58:36

my feed was TechCrunch that covered this

58:37

story.

58:39

I know I just want to throw it

58:40

out there.

58:41

I know this headline obviously has a

58:42

certain political leaning to it,

58:44

but it had a lot more detail in

58:45

it as well.

58:46

So that's why I went with this one.

58:48

Definitely a lot more detailed than

58:49

TechCrunch's, like, five paragraphs.

58:51

But anyways,

58:53

so for those of you who don't know,

58:54

here in the US,

58:56

we have the infamous NSA,

58:58

the National Security Agency.

59:01

I think for some reason,

59:02

my brain just blanked.

59:03

I know they used to jokingly call it

59:04

the No Such Agency because up until the

59:06

nineties,

59:06

they didn't even acknowledge it existed,

59:07

but it does exist.

59:09

And they have so many different things.

59:11

One of them is called the Foreign

59:12

Intelligence Surveillance Act,

59:14

which basically authorizes them to spy on,

59:18

um,

59:18

communications that go in and out of the

59:20

country.

59:21

And they play really fast and loose with

59:23

that, specifically at Section 702,

59:25

which if I remember correctly, um,

59:28

John Oliver did a piece way back in

59:29

twenty thirteen where he talked about this

59:31

and he went to Russia and interviewed

59:32

Edward Snowden.

59:34

Super funny.

59:34

I highly recommend it still holds up.

59:37

Um,

59:38

But the way he described or the way

59:39

he read Section 702 is it

59:42

allows for the collection of, quote,

59:43

any tangible thing, unquote,

59:46

related to like national security and like

59:50

communications,

59:51

which he points out is like so incredibly

59:54

broad,

59:54

like telling your teenager you can only

59:56

use the car for, like, car-related

59:58

activities.

59:58

So it's like, OK, hit and run,

1:00:00

drinking and driving like these are all

1:00:02

car, like street racing.

1:00:04

These are all car-related activities,

1:00:05

my dude.

1:00:06

So yeah.

1:00:07

Pretty broad stuff, and the government has

1:00:10

acted accordingly.

1:00:12

And so Section 702 has been

1:00:14

very controversial on both sides of the

1:00:16

aisle.

1:00:16

Uh,

1:00:16

there have been politicians from both

1:00:17

political parties who have said like, Hey,

1:00:19

we need to rein this in, at least

1:00:20

publicly have said we need to rein this

1:00:22

in because for well over a decade now

1:00:24

we have failed to do that,

1:00:25

but that might be changing, might be

1:00:28

because, um,

1:00:30

Section 702 is one of those

1:00:31

things that has to be renewed

1:00:33

periodically.

1:00:34

And around midnight,

1:00:37

I don't know why he did that,

1:00:38

but for whatever reason,

1:00:40

the speaker of the house,

1:00:41

which is basically the guy running the

1:00:42

house of representatives,

1:00:44

the head representative,

1:00:46

he convened a vote on,

1:00:48

I guess this was last Friday.

1:00:49

So this would have been after we recorded

1:00:51

the podcast last week and called in

1:00:53

lawmakers to vote on extending Section

1:00:55

702.

1:00:57

And it failed.

1:00:59

by, I believe, where did it go?

1:01:00

They said about a dozen votes.

1:01:03

And for those of you who are not

1:01:05

keeping up with the US right now,

1:01:06

first of all, I very much envy you.

1:01:08

But our government is incredibly divided,

1:01:13

potentially the most divided it's ever

1:01:15

been.

1:01:15

I don't know if that's actually true,

1:01:16

but everything is very partisan right now.

1:01:20

That is not me being snarky.

1:01:21

That is just true.

1:01:22

Everything is very partisan right now.

1:01:24

And on top of it, the...

1:01:28

what's the word I'm looking for?

1:01:30

The margin of control, like the ratio of,

1:01:32

because we have a two-party system in the

1:01:33

US, which is probably our first mistake.

1:01:36

Our ratio of like one party to the

1:01:38

other is like razor thin.

1:01:40

So everything is very contentious.

1:01:42

Right now, the Republicans,

1:01:44

which is our conservative party,

1:01:45

they have a slight majority,

1:01:47

but it would not take a lot of

1:01:48

votes to flip things.

1:01:50

And that matters because about a dozen

1:01:52

Republicans voted against renewing this

1:01:54

thing.

1:01:54

And that was enough to not pass it.

1:01:57

And they tried again anyways.

1:01:59

They were like, hey,

1:01:59

let's do another vote.

1:02:00

Like the same night, they were like,

1:02:01

let's do another vote.

1:02:03

And then the number went up to like

1:02:04

twenty.

1:02:04

And I think that's when the Speaker of

1:02:07

the House was like, oh,

1:02:07

we should probably stop because I'm losing

1:02:09

support.

1:02:11

So they stopped.

1:02:12

They did manage to pass.

1:02:14

Sorry, I did a control F here.

1:02:15

They did manage to pass a ten-day

1:02:17

extension.

1:02:18

So

1:02:20

Previously,

1:02:20

it would have run out on Tuesday.

1:02:22

Now it's going to go basically until the

1:02:23

end of the month.

1:02:26

But even then,

1:02:26

it's still – the US is so weird.

1:02:31

It says later on here that – yeah,

1:02:33

right here.

1:02:34

The Foreign Intelligence Surveillance

1:02:35

Court quietly recertified the program in a

1:02:37

classified ruling on March

1:02:44

I don't know how that works.

1:02:46

Jonah commented on Mastodon.

1:02:47

He doesn't know how that works either,

1:02:48

and we're both natural-born American

1:02:50

citizens as far as I know.

1:02:53

We're a very confusing country.

1:02:54

But I think this is exciting news because

1:03:00

it already failed to pass twice,

1:03:02

and I have to assume that if it

1:03:05

just full-on does not pass,

1:03:07

like if they cannot get this thing passed

1:03:09

through by the end of the month –

1:03:11

then it's got a deadline.

1:03:13

And I don't know what's going to happen

1:03:14

when March,

1:03:14

2027 rolls around, since

1:03:16

apparently FISA can just decide to keep

1:03:18

doing it.

1:03:19

But I don't know.

1:03:20

I think to me,

1:03:21

I'm hopeful because this represents the

1:03:24

first time in, like,

1:03:26

over twenty years, I think, that we might

1:03:28

actually have a shot of getting this thing

1:03:30

defeated. Um, but that's where we're at

1:03:32

right now. Those are kind of the facts:

1:03:33

uh, it failed the vote twice, it's

1:03:36

got an extension until the end of the

1:03:37

month. Um, it really needs to, I mean,

1:03:40

no matter where you are on the spectrum,

1:03:43

I know I'm probably mostly talking to

1:03:44

people who are like, good, this thing should

1:03:46

die, but I also recognize there's some

1:03:48

people who are like, well, you know,

1:03:50

there does need to be some stuff for

1:03:51

national security, right?

1:03:53

But this thing has been repeatedly abused

1:03:56

for warrantless surveillance.

1:03:57

Like again,

1:03:58

the government is not supposed to collect

1:03:59

data on American citizens and it finds all

1:04:02

kinds of loopholes to do it anyways.

1:04:03

This is actually the thing that like,

1:04:05

they use this to buy location data from

1:04:07

third parties.

1:04:09

And I think that's one of the,

1:04:11

it's funny is like the Democrats didn't

1:04:12

even want to completely kill this thing.

1:04:13

They just wanted to reform it.

1:04:14

They're like, require a warrant,

1:04:16

stop buying data.

1:04:18

And the Republicans were like no.

1:04:20

So I'm glad to see at least some

1:04:22

Republicans agreed with this.

1:04:24

So I think the one thing I wanted

1:04:26

to add is where did it go?

1:04:28

There was – basically they did – the

1:04:32

Republicans did try to introduce some

1:04:34

quote-unquote reforms,

1:04:35

which were already existing things.

1:04:38

Like where did it go here?

1:04:41

Yeah.

1:04:42

So the amendment contained a provision

1:04:43

that was in essence a fake warrant

1:04:45

requirement.

1:04:45

It would have prohibited government

1:04:46

officers from intentionally targeting

1:04:48

Americans' communication without a

1:04:49

warrant, which is already in the statute.

1:04:52

It also offered the government a warrant

1:04:53

path if agents had probable cause to

1:04:57

suspect the subject is an agent of a

1:04:58

foreign power,

1:04:59

an authority that already exists.

1:05:00

So basically they just wanted to reiterate

1:05:02

things that were already in there without

1:05:03

actually doing anything meaningful to rein

1:05:05

it in.

1:05:06

And just to drive home the point,

1:05:09

ready to go.

1:05:10

The FBI has used Section 702

1:05:12

to run warrantless queries on a U.S.

1:05:13

senator,

1:05:14

nineteen thousand donors to a

1:05:15

congressional campaign,

1:05:16

Black Lives Matter protesters and both

1:05:18

sides of the January 6 Capitol attack.

1:05:20

So.

1:05:22

I don't know what to tell you.

1:05:23

To me,

1:05:23

this is pretty obviously unconstitutional

1:05:25

and needs to be reined in, at the very least

1:05:27

needs to be reined in regardless of where

1:05:30

your political leaning is.

1:05:31

But it might just die altogether.

1:05:34

And I don't know.

1:05:35

I guess we'll see what happens if it

1:05:36

doesn't pass and March of next year rolls

1:05:38

around.

1:05:38

But yeah,

1:05:41

I think that's all of that story.

1:05:44

Did I did I miss anything, Jordan?

1:05:47

I guess I just have questions that maybe

1:05:50

people in the audience might also have.

1:05:52

Yeah, go for it.

1:05:53

I'm not a lawyer,

1:05:54

but I'll do my best.

1:05:56

Yeah.

1:05:56

So like,

1:05:57

I guess my question would be like,

1:05:59

I thought that you did need a warrant

1:06:00

to surveil people.

1:06:02

Is this like a specific special case that

1:06:05

people have to use like specifically or.

1:06:08

Yeah.

1:06:08

So Section 702 authorizes

1:06:11

warrantless surveillance on non-Americans.

1:06:13

And I know we've talked about this briefly

1:06:15

in the past in relation to other stories.

1:06:17

The loophole is that like the,

1:06:21

When I text you, for example, I mean,

1:06:23

we use Signal,

1:06:24

so they can't see it anyways.

1:06:25

But when I text you,

1:06:26

since you're Australian...

1:06:29

our communication crosses international

1:06:31

borders and that's the justification the

1:06:33

NSA uses to scoop up that surveillance or

1:06:35

to scoop up that communication and say,

1:06:37

we get to collect this.

1:06:39

And in theory,

1:06:40

they're probably supposed to throw away

1:06:41

like my side of the conversation or

1:06:43

something, but it, you know,

1:06:45

it doesn't stop them from basically spying

1:06:47

on me without a warrant just

1:06:49

because I'm talking to you,

1:06:50

even though they might not have a reason

1:06:52

to suspect anything.

1:06:54

They just,

1:06:54

it crosses international borders.

1:06:56

So yeah.

1:06:57

Yeah.

1:06:58

So wait, okay.

1:07:00

So I,

1:07:02

but they wouldn't be doing that to

1:07:03

everybody automatically, right?

1:07:05

Like it'd have to be like,

1:07:06

if I was on a watch list,

1:07:07

maybe they would consider doing that,

1:07:08

right?

1:07:09

Like, no.

1:07:10

As far as I know,

1:07:11

it's like carte blanche across the

1:07:13

board.

1:07:15

They do not need a warrant to spy

1:07:16

on any non-American citizen.

1:07:21

Wow, okay.

1:07:22

Yeah, which is kind of,

1:07:24

it is very horrifying.

1:07:25

And it's also kind of crazy to me

1:07:27

that, you know,

1:07:27

when I think about like the US landscape,

1:07:29

like conservatives are so like,

1:07:32

and I don't even mean this as a

1:07:33

ding,

1:07:34

like conservatives are so like

1:07:35

pro-American, like Americans rights,

1:07:38

like I'm a US citizen.

1:07:39

I get all these wonderful freedoms and

1:07:40

rights.

1:07:41

Then why can't we agree on the basics

1:07:42

of like,

1:07:43

stop spying on your own citizens without a

1:07:45

warrant?

1:07:46

But for some reason,

1:07:47

apparently we can't even get that far.

1:07:48

So I don't know.

1:07:51

I mean,

1:07:51

I feel like spying on people that aren't

1:07:55

American citizens is also kind of

1:07:57

problematic too.

1:07:58

I mean, I agree,

1:08:00

but I'm trying to think of like the

1:08:01

bare minimum base floor that we could all

1:08:03

get to agree on.

1:08:04

But apparently, you know,

1:08:06

I guess the bar is in hell.

1:08:07

It's so low.

1:08:08

So I don't know.

1:08:11

I'm very cynical about this stuff.

1:08:13

So I guess another question that I have

1:08:15

is, like,

1:08:16

it seems like in this case it was,

1:08:18

like,

1:08:19

a lot of Republicans who were voting

1:08:21

against this to block this.

1:08:24

Is that normal?

1:08:25

Is this, like,

1:08:25

sort of somewhat of a bipartisan thing,

1:08:28

like,

1:08:28

wanting the NSA to surveil everyone or...?

1:08:32

Yeah, so our – again,

1:08:34

for foreign listeners,

1:08:36

I know the US doesn't truly have like

1:08:37

a left-wing party,

1:08:38

but our Republicans are our conservative

1:08:41

party and the Democrats are our more

1:08:43

liberal party.

1:08:44

I'll put it that way.

1:08:47

And everybody – again,

1:08:51

I hate to say it,

1:08:52

but it is true.

1:08:52

Here in America,

1:08:53

things are so partisan that people

1:08:55

typically vote along party lines.

1:08:57

And so the Republicans,

1:08:58

because they are more conservative,

1:08:59

they tend to be a lot more like

1:09:01

you know, we, we need to give the,

1:09:04

you know, the, I feel bad saying this,

1:09:08

but this is their logic.

1:09:09

And I swear to God,

1:09:09

I'm not like trying to ding anybody.

1:09:11

Um, they're very pro troops.

1:09:13

They're very pro police.

1:09:13

They're very pro,

1:09:14

like our intelligence community is

1:09:16

protecting us.

1:09:17

And so we need to give them all

1:09:18

the tools they can to protect us.

1:09:20

And I've literally seen,

1:09:22

I am still mad about this to this

1:09:23

day.

1:09:24

I literally saw there was a, uh,

1:09:26

an opportunity,

1:09:27

I don't remember how it got there,

1:09:28

but basically there was a moment where

1:09:30

somebody actually got a law all the way

1:09:32

up to,

1:09:33

or a bill all the way up to

1:09:34

our Congress that basically said like,

1:09:38

require, yeah,

1:09:40

require police to get a warrant instead of

1:09:42

buying data.

1:09:44

It was literally that.

1:09:46

And one of the Republicans who voted

1:09:48

against it literally said, he's like,

1:09:50

well, our enemies like China, for example,

1:09:52

they can start up a shell company or

1:09:53

they don't even need to start up a shell

1:09:54

company.

1:09:55

They can buy this data from any data

1:09:57

broker, right?

1:09:57

Just like we can.

1:09:58

So if we require our people to get

1:10:00

a warrant,

1:10:00

that puts us on unequal footing.

1:10:02

And I have never wanted to scream at

1:10:03

my screen so hard because I remember

1:10:05

thinking, I'm like,

1:10:06

then the solution here is to pass an

1:10:07

actual data privacy law so that nobody can

1:10:10

buy the freaking data.

1:10:11

But apparently that's just, I don't know,

1:10:13

that requires too many IQ points, I guess.

1:10:15

But anyways, personal opinion aside, like,

1:10:17

yeah, that's, it's...

1:10:18

Republicans generally tend to be a lot

1:10:21

more lenient on military and intelligence

1:10:24

and law enforcement and argue that we need

1:10:27

to give them as much help as they

1:10:29

can to do their jobs,

1:10:31

which includes putting as few restrictions

1:10:33

on them as possible.

1:10:34

So, yeah.

1:10:36

I see.

1:10:37

Okay.

1:10:37

Yeah,

1:10:38

it did mention in here, like, the

1:10:40

House Freedom Caucus Republicans.

1:10:43

So I don't know if that sounds like

1:10:45

they might be, like, libertarian-type

1:10:47

people.

1:10:47

I'm not really sure.

1:10:49

Yeah,

1:10:49

I'm not super familiar with them either.

1:10:53

I saw that too.

1:10:54

It looks like I'd have to look more

1:10:57

into them.

1:11:00

Well,

1:11:01

it's good that they voted against it

1:11:02

anyway.

1:11:03

I think, you know,

1:11:04

if we can put aside all the other

1:11:07

partisan stuff and be like, you know,

1:11:09

privacy is an issue that's important.

1:11:11

Let's not surveil everybody and collect

1:11:15

all their information unnecessarily.

1:11:17

I think we should try and

1:11:22

against that, which is, uh, unfortunate. I'm

1:11:24

sorry, this, I'm sorry it's so partisan, yeah.

1:11:28

Um, because I think that definitely does

1:11:30

make things more difficult, you know, if

1:11:32

there's one party that's trying to get

1:11:33

something passed, it's like, we don't want

1:11:34

to do that because it's by those people.

1:11:36

It's like, uh, that's not really the point,

1:11:39

but it should be based on the merit

1:11:41

of what they're trying to pass, not, like,

1:11:43

you know, the party. Yeah, it's

1:11:47

It's extremely frustrating because that's

1:11:49

exactly what's happening is like somebody

1:11:50

will put like this and, you know, like,

1:11:51

hey,

1:11:52

spying on people without a warrant is bad.

1:11:54

Well, I don't like you,

1:11:55

so I don't like your bill.

1:11:56

And it's like, dude, come on.

1:11:57

But I do want to point out on

1:11:59

that note,

1:11:59

there are some pretty big names in here

1:12:01

that I think are really telling.

1:12:02

Yeah.

1:12:03

Not to get too deep into politics,

1:12:05

but like Thomas Massie of Kentucky,

1:12:07

I'm going to assume he's a Republican

1:12:08

because Kentucky is a very deeply red

1:12:10

state.

1:12:11

Chip Roy of Texas is a Republican.

1:12:14

Lauren Boebert,

1:12:14

who used to be like one of Trump's

1:12:16

biggest supporters.

1:12:17

I don't know if she still is.

1:12:18

He's kind of losing some of his key

1:12:20

supporters.

1:12:20

But I just point that out as like,

1:12:23

man,

1:12:23

these are big people that I would not

1:12:24

normally expect to, like,

1:12:26

vote against the party line.

1:12:28

So that's probably more indicative of like

1:12:31

larger US politics,

1:12:32

but it's good to see that.

1:12:33

Like you said,

1:12:34

like there are some people who are just

1:12:35

like, no, this,

1:12:36

this is not a partisan issue.

1:12:38

We need to fix this.

1:12:38

So hopefully it won't pass and then we'll

1:12:41

see what happens.

1:12:42

Yeah, I think it is kind of frustrating.

1:12:45

But you know, I hope it doesn't.

1:12:47

I guess we're looking at that on Tuesday.

1:12:50

Oh, no, sorry, not Tuesday.

1:12:51

Sorry, at the end of the month.

1:12:53

So hopefully we get an update for that

1:12:55

in the next This Week in Privacy episode.

1:13:00

Um, but yeah, I think, we're trying to,

1:13:02

stay tuned for updates. Definitely make

1:13:05

sure to subscribe and, uh, add this to

1:13:07

your podcast app. Um, but I mean, yeah,

1:13:10

I think it's, uh, it's important. We're

1:13:12

trying to stay, um, when we talk about

1:13:14

this sort of stuff, you know, we're just

1:13:15

talking about this from the privacy angle,

1:13:19

um, so, you know, we're not trying to

1:13:22

Because I know I personally don't talk

1:13:24

about US politics because I just feel like

1:13:27

I'm going to offend someone.

1:13:29

I'm going to always offend someone if I

1:13:31

say something.

1:13:31

So thanks for kind of explaining that

1:13:35

because...

1:13:36

I definitely have less experience.

1:13:39

I mean,

1:13:39

I know a bit about US politics because

1:13:43

it's kind of unavoidable.

1:13:45

So yeah,

1:13:47

but I think it's good to explain things.

1:13:49

But I guess moving on to this next

1:13:51

story here,

1:13:52

unless you have anything more to add.

1:13:55

Nope, that's all I got.

1:13:57

All right.

1:13:58

So this next story here is about the

1:14:02

EU age checking app.

1:14:04

So basically...

1:14:08

Yeah, we talk about this a lot,

1:14:09

you know, age verification stuff.

1:14:11

And now there's basically a movement in

1:14:15

the EU to keep kids safe online with

1:14:18

this new EU age checking app.

1:14:22

Quoting from the article here from

1:14:24

Politico,

1:14:25

the European Union's age verification app

1:14:28

is ready to be rolled out to protect

1:14:30

kids online.

1:14:32

The bloc's chief, Ursula von der Leyen, said

1:14:36

Wednesday, sorry if I messed up your name,

1:14:40

our European age verification app is

1:14:43

technically ready and will soon be

1:14:45

available for citizens to use,

1:14:47

the European Commission president said at

1:14:50

a press conference.

1:14:51

And basically, according to this article,

1:14:54

the app is a critical part of the

1:14:55

EU's plans

1:14:57

to keep children safe online.

1:14:58

The technology would allow people to prove

1:15:01

their age through the government-approved

1:15:04

verified systems.

1:15:05

The EU said it has ensured it would

1:15:07

also protect citizens' privacy rights and

1:15:10

personal data.

1:15:13

Now,

1:15:13

I think that last sentence right there,

1:15:15

that remains to be seen because basically

1:15:18

every single age verification system we've

1:15:20

seen so far has been not great from

1:15:24

a privacy perspective.

1:15:27

And quoting the article again,

1:15:28

we're holding online platforms accountable

1:15:30

that do not protect enough of our kids,

1:15:33

maybe.

1:15:34

Might have been a misquote there.

1:15:35

The new age verification solution and the

1:15:37

enforcement of our rules go hand in hand.

1:15:41

So basically, uh, this app is ready to

1:15:45

be downloaded. Um, and just kind of

1:15:48

highlighting a post here, um, someone on our

1:15:50

forum posted a link of their blog,

1:15:52

basically going through sort of the, uh, the

1:15:57

new EU age verification app. Um, so if

1:15:59

you haven't heard of them before, Privacy

1:16:01

Dad, they do sort of, like, parenting-related

1:16:03

privacy stuff, um, and they've been

1:16:08

you know, posting, uh,

1:16:10

an update here about, uh,

1:16:11

the EU age verification app.

1:16:13

So you can kind of see what the

1:16:14

flow will look like.

1:16:16

Um, and apparently according to them,

1:16:19

they were able to download the APK and,

1:16:22

you know, test out the app.

1:16:24

A lot of the features aren't a hundred

1:16:27

percent ready, and it, you know,

1:16:28

has a testing mode,

1:16:29

which you can basically see how it would

1:16:32

work.

1:16:33

Um,

1:16:35

It does seem like you need to scan

1:16:40

your ID into this app.

1:16:42

So I mean, that's fine, I guess,

1:16:46

if you're sending it directly to the EU

1:16:48

government and there's no third party

1:16:50

company involved here.

1:16:53

But I guess that would be like a

1:16:57

separate governmental body that's been

1:16:59

established for this.

1:17:00

I'm not entirely sure about the whole

1:17:02

process behind this.

1:17:05

but you can kind of see the age

1:17:07

verification credential stuff.

1:17:09

Um,

1:17:09

so basically how it's meant to work is

1:17:13

you visit a website or an app and

1:17:18

you can use this, uh,

1:17:20

use this app to basically prove that

1:17:23

you're over eighteen.

1:17:24

It doesn't share your age,

1:17:25

it just shares the proof basically.
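The "proof, not age" idea described here can be sketched roughly as follows. This is a hypothetical illustration only, not the EU app's actual protocol: the issuer key, function names, and the use of an HMAC tag are all stand-ins for whatever credential scheme the app really uses.

```python
# Hypothetical sketch of "share the proof, not the age": the issuer checks
# the ID document once and signs only a boolean attestation; the verifier
# never sees a birthdate, only a signed yes/no.
import hashlib
import hmac

ISSUER_KEY = b"demo-issuer-secret"  # stand-in for the issuer's signing key

def issue_over18_credential(user_id: str, is_over_18: bool) -> bytes:
    """Issuer signs only the over-18 claim, not the underlying birthdate."""
    claim = f"{user_id}:over18={is_over_18}".encode()
    return hmac.new(ISSUER_KEY, claim, hashlib.sha256).digest()

def verify(user_id: str, claimed_over_18: bool, tag: bytes) -> bool:
    """A website checks the attestation without ever handling ID data."""
    expected = issue_over18_credential(user_id, claimed_over_18)
    return hmac.compare_digest(expected, tag)

tag = issue_over18_credential("alice", True)
print(verify("alice", True, tag))   # True: proof accepted
print(verify("alice", False, tag))  # False: claim doesn't match attestation
```

A real deployment would presumably use asymmetric signatures, unlinkable presentations, and anti-replay measures rather than a shared HMAC key, but the privacy property being promised is the same: the verifier learns a yes/no answer, not your age or identity.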

1:17:29

Um,

1:17:31

But it does look like you need to

1:17:32

take a photo of your identity document and

1:17:34

record a video of yourself.

1:17:37

So that's not great from a biometric

1:17:41

standpoint.

1:17:45

So yeah, that kind of sucks.

1:17:48

There's a lot of different things here.

1:17:52

So basically it's just a move to basically

1:17:58

change the...

1:18:01

Oh,

1:18:01

there's someone in the chat who asked a

1:18:02

question here.

1:18:03

Probably outside the stream subjects,

1:18:05

but I noticed that hosts are using Apple

1:18:07

products.

1:18:08

Is there a privacy related reason or just

1:18:10

personal preference of hardware?

1:18:13

Personally,

1:18:14

I'm not going to talk about personally,

1:18:18

but I'm just going to say for work,

1:18:21

this is, you know,

1:18:24

I need to use DaVinci Resolve.

1:18:25

I need to use...

1:18:28

applications that aren't available on

1:18:30

Linux, which I would love to use Linux.

1:18:32

I think it'd be great if I could

1:18:34

use Linux.

1:18:35

But as far as I'm aware,

1:18:39

there's still a lot to go on DaVinci

1:18:41

Resolve.

1:18:42

It's quite annoying to use on Linux.

1:18:44

It's missing some stuff.

1:18:47

It is less stable.

1:18:48

It's less supported.

1:18:50

I use Affinity for all the graphic design

1:18:52

stuff we do here at Privacy Guides.

1:18:54

And

1:18:56

As far as I'm aware,

1:18:57

that is also quite finicky.

1:19:01

Generally,

1:19:01

I want to be focusing less on the

1:19:05

technical issues,

1:19:06

like having a bug happen in DaVinci

1:19:09

Resolve where like I can't render a video

1:19:12

or like something like that,

1:19:13

less if possible.

1:19:15

So, you know, I think

1:19:20

you kind of have to use what you

1:19:22

have to.

1:19:22

I mean,

1:19:23

I don't have any personal information on

1:19:25

this computer.

1:19:26

It's like a work computer.

1:19:27

So I'm not really that bothered by using

1:19:29

an Apple product to do this.

1:19:31

Um,

1:19:33

So I think you just have to

1:19:35

compartmentalize things.

1:19:37

But sorry, I kind of got off track.

1:19:39

I just wanted to quickly answer that

1:19:40

question because I guess Nate has got a

1:19:44

MacBook and I'm using an Apple avatar.

1:19:47

So I guess that was kind of a

1:19:48

question that needed to be answered.

1:19:50

But yeah,

1:19:51

do you have any thoughts on that or

1:19:52

on the EU age verification stuff?

1:19:56

Um, well,

1:19:57

I guess let me start with the question.

1:19:59

Um,

1:19:59

so I am fortunate enough to have one

1:20:01

of each computer and this is also a

1:20:04

work computer.

1:20:04

This was,

1:20:06

this was given to me by privacy guides

1:20:07

because my Windows computer is from,

1:20:11

I think.

1:20:12

So in tech years,

1:20:13

it's starting to get up there.

1:20:15

I've had a couple of close calls with

1:20:16

it already.

1:20:16

And so this was kind of like, Hey,

1:20:19

We should get me a MacBook just in

1:20:20

case the day comes when my Windows

1:20:22

computer doesn't boot and I'm not

1:20:25

completely up a creek.

1:20:28

So this is kind of my backup,

1:20:29

my computer.

1:20:30

But then also like my Windows computer is

1:20:32

like I've got all the cables dressed in

1:20:33

and it's really nice.

1:20:34

So it's like, cool,

1:20:35

the Windows computer can stay there.

1:20:36

And then I'll use this one when I

1:20:37

travel or for the podcast or something.

1:20:40

Um,

1:20:41

cause I probably should do more work at

1:20:42

a standing desk, but I don't cause my,

1:20:44

my actual desk has like three screens and

1:20:47

well,

1:20:47

two plus the laptop and I have studio

1:20:49

monitors and stuff.

1:20:50

So yeah.

1:20:52

Um,

1:20:53

I try to use Linux more for the

1:20:57

actual, um,

1:20:59

like basically anything that doesn't

1:21:01

involve editing or gaming.

1:21:03

I, um,

1:21:04

honestly,

1:21:05

I prefer Windows just out of habit just

1:21:07

because I'm so used to it.

1:21:08

And also, again,

1:21:09

I do some gaming and Windows generally

1:21:11

handles gaming better than Mac.

1:21:13

Um,

1:21:13

I've heard gaming's come a really long way

1:21:15

on Linux.

1:21:16

I know Nick from The Linux Experiment, uh,

1:21:18

edits on DaVinci, but I also,

1:21:21

I was an audio guy for like,

1:21:23

I was a professional audio guy for years

1:21:24

before I, I took this job.

1:21:26

So, um, I, uh,

1:21:30

I have amassed a collection of plugins and

1:21:33

workflow that are very specific to

1:21:35

Windows.

1:21:36

So even if I moved over to Linux

1:21:37

for DaVinci,

1:21:39

there's a really good chance that a lot

1:21:40

of the plugins I rely on would not

1:21:42

move with me.

1:21:44

Yeah, I don't know.

1:21:45

But I...

1:21:48

Yeah.

1:21:49

I mean, that's kind of my workflow.

1:21:50

I, I use it largely for, um, production.

1:21:54

I use Windows for production and gaming.

1:21:56

I use the plugins,

1:21:57

which is why I'm still on Windows.

1:21:59

And also I use Qubes,

1:22:00

which is that's never going to do

1:22:01

production or gaming to begin with.

1:22:03

Um,

1:22:03

not unless somebody wants to donate like

1:22:05

a, a thousand dollar computer.

1:22:06

That's just super souped up and I can

1:22:08

make GPU passthrough work reliably.

1:22:10

which I've heard doesn't always, so yeah.

1:22:12

And also I hate to say it,

1:22:14

but like, so when I got this computer,

1:22:16

I used it as my main computer for

1:22:18

like a week or two just to,

1:22:19

and I edited like three or four videos

1:22:21

just to make sure like this will do

1:22:22

what we need it to.

1:22:23

This is an acceptable backup.

1:22:25

And I, it's weird.

1:22:27

Cause in college I had a Mac and

1:22:29

it was fine.

1:22:29

You know,

1:22:29

like I remember when I switched back to

1:22:31

windows, I was like,

1:22:32

which I did mostly because it was cheaper.

1:22:34

Right.

1:22:34

Like when my Mac died, I was like,

1:22:35

yeah, I'll just go back to windows.

1:22:37

And I remember thinking like,

1:22:38

I don't understand why people are so mad.

1:22:39

Like you can switch between them.

1:22:40

They're fine.

1:22:40

They're easy.

1:22:41

But for some reason,

1:22:43

when I was using this recently,

1:22:44

I was just like,

1:22:45

these keys are driving me crazy and I

1:22:47

hate it.

1:22:47

And like, even now I'd like,

1:22:49

occasionally I put things in the wrong

1:22:50

place or I like,

1:22:52

apparently there's this thing where if you

1:22:53

tap too hard, it like,

1:22:55

does something different, and I don't know

1:22:57

if I'm making sense, but I tap

1:22:59

things really hard and it does not work

1:23:01

well. So yeah, the workflow is,

1:23:04

I could get used to it if

1:23:05

I had to, but I was

1:23:06

just kind of like, you know what, it

1:23:07

works, I'm going back to Windows, to be

1:23:08

honest. So I don't know. Macs are

1:23:12

definitely much more private and secure, I

1:23:14

would argue, and certainly a lot less

1:23:16

annoying with the AI. This thing did

1:23:18

not come with Apple Intelligence enabled,

1:23:20

and, you know, Apple Intelligence is also

1:23:24

probably more useful than Copilot, I would

1:23:25

imagine. Haven't used either, but I would

1:23:26

imagine so. I don't know. Yeah,

1:23:30

it's not... Okay, well, either way, yeah,

1:23:34

I mean, it's not

1:23:37

my daily driver, and I don't even

1:23:39

mind using Linux. It's just, it's a

1:23:41

work computer mostly, and it just happens

1:23:42

to fit my workflow. So anyways, yeah,

1:23:47

going back to the EU story, so...

1:23:50

Um, yeah, I mean,

1:23:51

I think we just wanted to share this

1:23:52

because it's a bit of an update to

1:23:54

all this age verification stuff.

1:23:56

The way I understand it is that, um,

1:23:59

this is an app that can be used

1:24:02

as is,

1:24:03

but it's also designed to function as a

1:24:07

framework for other companies to build on

1:24:09

top of.

1:24:09

I could be wrong,

1:24:10

but this is how I understand it is

1:24:11

basically it's like, it's almost like, um,

1:24:14

a lot of you guys might remember during

1:24:16

COVID, um,

1:24:17

Apple and Google released like a built-in

1:24:19

contact tracing thing.

1:24:22

And that way other states could build on

1:24:26

top of that.

1:24:26

And it was kind of like, look,

1:24:28

here's a relatively private and secure,

1:24:31

certainly more so than whatever crap your

1:24:33

underpaid IT guys are going to cook up

1:24:35

in the ten minutes you give them.

1:24:37

like, it's kind of like,

1:24:38

here's a framework to start with.

1:24:40

So you can at least start off on

1:24:41

a good foot and build from there.

1:24:43

And I feel like that's kind of what

1:24:44

this is,

1:24:45

is the same thing is where it's like,

1:24:46

you could use this as is,

1:24:48

but you could also like roll your own

1:24:50

local version.

1:24:51

Um, if I understand it correctly,

1:24:53

I could be wrong,

1:24:53

but I feel like I saw some people

1:24:54

saying that.

1:24:55

Um,

1:24:57

the last thing I do want to note

1:24:58

real quick,

1:24:59

I want to pull this up is, uh,

1:25:03

For the record,

1:25:03

I don't know who this person is.

1:25:04

I don't know their credentials,

1:25:07

and I haven't seen a whole lot of

1:25:08

people verifying this,

1:25:09

but I also haven't seen a whole lot

1:25:10

of people contesting this.

1:25:12

But this claims to be a security

1:25:15

consultant who said that they found

1:25:17

potential vulnerabilities in the EU's age

1:25:19

verification app in under two minutes.

1:25:21

So one of them is that I guess

1:25:24

you can delete the PIN.

1:25:27

Like, there's a way, yeah,

1:25:28

the attacker can simply remove the PIN

1:25:29

values from the file and restart the app.

1:25:32

After choosing a different PIN,

1:25:33

the app presents the credentials created

1:25:34

under the old profile and lets the

1:25:35

attacker present them as valid.
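If the report is accurate, the flaw amounts to the app gating access with a locally stored PIN while never cryptographically binding the issued credentials to it. A minimal sketch of that flawed pattern follows; the field names, values, and fallback logic are invented for illustration, not taken from the actual app.

```python
# Hypothetical sketch of the reported flaw: credentials are gated by a
# locally stored PIN but not bound to it, so deleting the PIN entry and
# picking a new one exposes the previously issued credentials.
profile = {"pin": "123456", "credentials": ["over18-proof"]}  # on-disk state

def unlock(profile, entered_pin, new_pin="999999"):
    if "pin" not in profile:           # PIN values were removed from the file
        profile["pin"] = new_pin       # app falls back to "choose a new PIN"
    if entered_pin == profile["pin"]:
        return profile["credentials"]  # old credentials handed over anyway
    return None

del profile["pin"]                     # the attack: strip the PIN, restart
print(unlock(profile, "999999"))       # ['over18-proof']
```

The obvious fix, presumably, is to derive an encryption key from the PIN and encrypt the credentials under it, so that wiping the stored PIN also wipes access to the credentials rather than silently re-enrolling.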

1:25:37

And I think they said there were some

1:25:38

others.

1:25:39

But I guess all that to say is,

1:25:41

like, if you don't have to use it,

1:25:43

I mean, obviously,

1:25:43

we don't think you should use this kind

1:25:45

of stuff in the first place, right?

1:25:46

Like,

1:25:46

we are very anti-age verification people.

1:25:51

Jordan made an excellent point earlier

1:25:52

when they said that, like,

1:25:54

this may have been before we were live,

1:25:55

but I think we were live.

1:25:57

But Jordan pointed out that, like,

1:25:59

you know, parental controls exist.

1:26:01

They're already there.

1:26:02

They're already fine.

1:26:04

So, but...

1:26:06

Yeah,

1:26:07

if you're – I mean we're not telling

1:26:09

you to break the law,

1:26:10

but if you're in an area where this

1:26:11

is not required yet,

1:26:12

definitely I would not advocate for

1:26:14

downloading it because it seems like there

1:26:17

might potentially be vulnerabilities.

1:26:18

So I would wait for more people to

1:26:22

do some research and kind of look into

1:26:24

this and –

1:26:26

Hopefully they'll fix these

1:26:27

vulnerabilities.

1:26:27

Cause I mean, it's,

1:26:28

it's like the very least they can do,

1:26:29

right?

1:26:29

If they're going to be like,

1:26:30

everybody has to give us our ID.

1:26:31

Like the very least they could do is

1:26:33

actually secure it in a way where you

1:26:34

can't just like delete the PIN and restart

1:26:36

the app.

1:26:37

Like that's completely insane if true.

1:26:38

So I don't know.

1:26:42

I think actually we're, yeah, we're,

1:26:44

we're going to talk a little bit more

1:26:45

about age verification here,

1:26:48

here in the US. So I want

1:26:50

to point out,

1:26:51

this is a brand new hot off the

1:26:53

presses story.

1:26:53

And as a result,

1:26:55

it's probably sitting in my RSS feed as

1:26:56

I say this,

1:26:57

but I have not seen any of our

1:26:59

usual,

1:26:59

more reputable outlets cover the story.

1:27:03

So unfortunately I had to go with a

1:27:04

press release from a Congressman from New

1:27:08

Jersey, Josh Gottheimer. Gottheimer?

1:27:10

I don't know, but yeah,

1:27:14

no offense to him,

1:27:14

but I'm just saying like,

1:27:15

this is a press release.

1:27:16

It's going to be a little bit,

1:27:19

what's the word I'm looking for, polished and

1:27:21

overly optimistic and maybe not the most

1:27:24

balanced piece out there.

1:27:25

So take this with a grain of salt,

1:27:27

but apparently the US has introduced a

1:27:29

bipartisan Parents Decide Act to protect

1:27:32

kids online.

1:27:33

And, uh, this is basically the, uh,

1:27:35

the operating system level age

1:27:37

verification, which again,

1:27:39

I keep calling it age verification.

1:27:40

I should be calling it identity

1:27:41

verification because it will require

1:27:43

everyone to do it.

1:27:44

Not just kids.

1:27:46

Um,

1:27:47

But yeah,

1:27:47

it will require operating system

1:27:49

developers such as Apple and Google to

1:27:50

verify users' ages when setting up a new

1:27:52

device rather than relying on

1:27:53

self-reported ages,

1:27:55

allows parents to set appropriate content

1:27:56

controls from the start,

1:27:58

ensure that age and parental settings

1:28:00

securely flow to apps and AI platforms,

1:28:03

and prevent children from accessing

1:28:04

harmful or explicit content by creating

1:28:06

consistent,

1:28:07

trusted standards across platforms.

1:28:09

I...

1:28:12

I feel especially cynical about age

1:28:15

verification in the US.

1:28:20

I will admit I got this from another

1:28:21

video.

1:28:21

This is not an original thought,

1:28:22

but it's a good thought.

1:28:24

First of all,

1:28:24

we don't even have a national privacy law.

1:28:27

Nothing.

1:28:28

Nothing at all.

1:28:29

So do whatever you want with this data.

1:28:31

I think just last week we covered a

1:28:33

story.

1:28:34

It was either last week or the week

1:28:35

before.

1:28:35

We covered how the governor of...

1:28:40

Wisconsin, I think it was, vetoed a

1:28:42

state-level identity verification law because

1:28:45

he's like, we don't have, um, he's like,

1:28:49

this thing doesn't have any

1:28:51

protections against, like, selling the data

1:28:53

or securing the data. Like, there's none of

1:28:55

that. And that's true at a national

1:28:57

level. So first of all, there's that. Um,

1:29:00

I will forever remain cynical that schools

1:29:04

have data breaches left and right,

1:29:05

and nobody seems to care,

1:29:07

but somehow encryption and you know,

1:29:10

all this is what's putting the kids at

1:29:11

risk.

1:29:11

Not the fact that the LAP or not

1:29:14

LAPD,

1:29:14

but the LA school district just leaked the

1:29:17

date of birth,

1:29:17

email address and home address of every

1:29:19

child in the city.

1:29:21

No, it's, this is the problem here.

1:29:23

I'm being very sarcastic in case you can't

1:29:25

tell.

1:29:26

And, um,

1:29:27

There's also the lawsuit just the other

1:29:29

week where Meta and Google got legally

1:29:33

found to have addictive algorithms.

1:29:34

And I think-

1:29:37

again, not an original thought,

1:29:38

but I like this thought that I've been

1:29:39

attached to lately is the idea of like,

1:29:41

it's so ridiculous that we're saying that

1:29:44

this is only bad for kids.

1:29:46

But once, once you're an adult, it's fine.

1:29:48

Like you,

1:29:48

you can go ahead and let these companies

1:29:49

abuse you and just mistreat you and use

1:29:53

your data,

1:29:54

but you have to be a certain age.

1:29:55

It's just, I don't know.

1:29:55

It's,

1:29:57

I think we're regulating the wrong thing.

1:29:58

And I think, um,

1:30:00

Jordan and I had this discussion recently

1:30:01

too,

1:30:02

where in a lot of other countries and

1:30:04

maybe here in the US,

1:30:05

sometimes here in the US,

1:30:07

when you set up a new device,

1:30:08

it prompts you like,

1:30:09

is this for a child?

1:30:10

And if you click yes,

1:30:12

it will tell you about all of the

1:30:14

potential parental controls that exist.

1:30:17

And I really think that's a much better

1:30:18

way to go.

1:30:19

Like, I like parts of this, right?

1:30:20

Like,

1:30:21

allow parents to set age-appropriate

1:30:22

content controls.

1:30:23

I don't know who's not allowing parents to

1:30:25

do that, but let's pretend.

1:30:27

Ensure the age and parental control

1:30:28

settings securely flow to the apps and AI

1:30:30

platforms.

1:30:31

You know, like,

1:30:32

I think those are good things, of course.

1:30:34

But I don't understand why we can't start

1:30:35

there.

1:30:36

Like,

1:30:36

why don't we start by empowering the

1:30:37

parents to know that these controls exist?

1:30:40

Because, again, you know, like...

1:30:43

I'm so tired of talking about age

1:30:44

verification, but you know,

1:30:45

this whole like prevent children from

1:30:47

accessing harmful or explicit content.

1:30:49

Okay.

1:30:49

What about classical artwork?

1:30:50

Right?

1:30:51

Like that's a class, a common example,

1:30:53

like ninety percent of these classical era

1:30:55

Da Vinci's and whatever,

1:30:56

like they're both men and women are

1:30:58

partially or fully naked.

1:31:00

So like it,

1:31:02

does that count as explicit content or is

1:31:04

that like valid because the artwork,

1:31:05

you know, I just watched them.

1:31:07

Obviously this is not a one-to-one,

1:31:09

but I just watched the sci-fi movie the

1:31:10

other week called Aniara that is like

1:31:12

soul crushingly depressing,

1:31:15

but it had artistic value.

1:31:16

Like, yeah, it was a really sad,

1:31:17

depressing movie,

1:31:19

but it had an artistic merit to it.

1:31:20

It wasn't just like,

1:31:21

I'm going to go watch something depressing

1:31:22

for the sake of it.

1:31:24

So it's, it's very like

1:31:27

I use that as an example.

1:31:28

It's just very like, I don't know.

1:31:29

I don't know what I'm trying to say.

1:31:32

It's getting late.

1:31:32

But it's so depressing that we have no

1:31:36

protection for data in the first place.

1:31:38

And now we want to pass this national

1:31:40

law that says turn over your ID when

1:31:42

states can't even agree what counts as

1:31:45

harmful content.

1:31:46

And, um, yeah, you know, Swiss,

1:31:48

Swiss kill said here,

1:31:49

like whose responsibility is it to raise

1:31:50

their child?

1:31:51

To be honest,

1:31:51

it's not even responsibilities for me.

1:31:53

It's like, right.

1:31:54

Like to me,

1:31:54

it feels like it's taking away the agency from

1:31:57

the parents to say like, okay,

1:31:59

the government's going to tell you what

1:32:00

your kids can look at now.

1:32:02

Like, that's really what we need to start,

1:32:03

how we need to start wording this because

1:32:05

I guarantee you parents on both sides of

1:32:06

the aisle are not going to be cool

1:32:08

with that.

1:32:09

And it's just, it's, uh, yeah,

1:32:11

it's so frustrating to me.

1:32:12

I don't like

1:32:14

I don't – nobody is saying the internet

1:32:17

is perfect,

1:32:17

but I think most of us can agree

1:32:19

that this is not the way to solve

1:32:20

it.

1:32:21

So I don't know.

1:32:21

I feel like I'm just going to keep

1:32:22

going in circles if I keep talking,

1:32:23

but yeah.

1:32:24

Yeah.

1:32:26

Yeah,

1:32:26

I think they should rename it to

1:32:29

Corporations Decide Act because a lot of

1:32:33

times, you know,

1:32:34

like a lot of these things that this

1:32:38

Gottheimer,

1:32:40

Josh Gottheimer guy is announcing in this

1:32:43

press release are like, you know,

1:32:45

require operating system developers like

1:32:47

Apple and Google to verify users' ages

1:32:50

when setting up a new device rather than

1:32:52

relying on self-reported ages.

1:32:55

Um, that's fine, I guess.

1:32:58

I mean,

1:32:58

but that's also all that information is

1:33:00

going to be flowing through

1:33:02

Google and Apple.

1:33:03

Is that really what we want?

1:33:04

All of this personal information,

1:33:06

like flowing through big tech corporations

1:33:08

who, you know,

1:33:10

we know Apple and Google are not,

1:33:13

they don't,

1:33:13

they don't have a respect for our

1:33:15

information.

1:33:15

So, um, you know,

1:33:18

I don't think that's a great idea,

1:33:20

but it's also just, you know,

1:33:22

These app stores, like it says in here,

1:33:25

allow parents to set age-appropriate

1:33:27

content controls from the start,

1:33:29

including limiting access to social media

1:33:32

apps and AI platforms.

1:33:34

So a lot of times that's going to

1:33:36

be done through an app store.

1:33:38

And like we saw with the app store,

1:33:41

I believe it's called the App Store

1:33:42

Accountability Act.

1:33:44

Am I correct in that?

1:33:45

Okay.

1:33:46

I think so.

1:33:47

If we're thinking of the same one, yeah.

1:33:49

Right.

1:33:49

Yeah.

1:33:50

And that one was also trying to be

1:33:52

passed in the US, and I think

1:33:54

it's, this is like almost a similar thing.

1:33:56

Like it's,

1:33:56

it's kind of pushing this onto the app

1:33:58

store, which we've talked about before,

1:34:00

but like,

1:34:02

and Nate mentioned a little bit there, um,

1:34:04

like, you know,

1:34:05

how do we know what they consider is

1:34:09

mature or like,

1:34:10

how do we know what they're choosing to

1:34:12

take down and not allow people to access

1:34:14

is, um,

1:34:17

age-appropriate? Like, who decides that? Um,

1:34:19

so that's another, another slope of things.

1:34:22

Um, I think, you know, there's definitely

1:34:27

easier ways to do this than having to

1:34:30

do such aggressive measures. Um, but I think

1:34:37

it kind of does take the agency away

1:34:40

from parents a little bit. Because, like, I

1:34:42

think, you know, it's definitely a thing

1:34:45

where

1:34:46

parents have very different ways of

1:34:48

raising their children, right?

1:34:49

Like some people will do something a

1:34:52

certain way and some people will be the

1:34:55

complete opposite of that.

1:34:57

So I think, you know,

1:34:58

forcing people to do things a specific way

1:35:03

and to have access to certain stuff is

1:35:06

interesting.

1:35:07

I think there's different ways of doing

1:35:10

that from a parenting perspective.

1:35:14

Um, so I dunno,

1:35:15

I think a lot of times though,

1:35:17

you know, maybe we shouldn't be giving,

1:35:19

I mean,

1:35:19

this is completely a personal opinion,

1:35:21

but maybe we shouldn't be giving children,

1:35:25

you know,

1:35:26

devices that can just access the entire

1:35:28

internet.

1:35:28

Because I know when I was like younger,

1:35:32

uh, having access to the, to the internet,

1:35:35

unrestricted access to the internet was

1:35:37

probably not the greatest thing for my

1:35:40

development.

1:35:41

Right.

1:35:41

And I'm sure many people who are like,

1:35:43

you know,

1:35:45

iPad kids or like Gen Z type people

1:35:50

might also like share the same thing.

1:35:52

Like basically having answers to any

1:35:55

question and, you know,

1:35:58

access to anything at any point is not

1:36:03

a great thing in some cases.

1:36:06

So, you know, I think that's,

1:36:08

that might be something that needs to be

1:36:10

tackled from a different angle from like

1:36:13

parents or,

1:36:15

parental controls. Um, but I don't think,

1:36:19

I don't know, I don't think this

1:36:21

should be up to the government to decide.

1:36:24

Um, and it doesn't really seem like

1:36:27

it respects people's privacy anyway. So,

1:36:31

yeah, I don't really have any more to

1:36:33

add here.

1:36:36

Yeah.

1:36:36

I don't, I don't think I do either.

1:36:38

It's just, um,

1:36:39

I think the thing I'll end with is

1:36:41

if you're in the US, uh,

1:36:42

definitely contact your representatives.

1:36:44

I certainly will be, um,

1:36:46

this coming week and, uh, you know,

1:36:48

try to outline, I would argue,

1:36:51

try to outline why you're against this.

1:36:53

Um,

1:36:54

I don't know if that will increase your

1:36:55

odds,

1:36:55

but I feel like it would be a

1:36:56

lot more effective instead of just be

1:36:58

like, Hey, I'm against this thing.

1:36:59

Be like,

1:37:00

I'm against this thing because it takes

1:37:01

away agency from the parents.

1:37:02

It,

1:37:03

there's no meaningful protection of the

1:37:04

data, uh, you know,

1:37:06

all these kinds of like,

1:37:07

maybe we'll get lucky.

1:37:08

And maybe some of these politicians will

1:37:10

read, I mean, obviously they won't,

1:37:11

their aides will read this,

1:37:13

but maybe some of their,

1:37:14

their assistants will read some of these

1:37:15

responses and just be like, Oh,

1:37:17

you know what?

1:37:17

These are like legitimate concerns.

1:37:19

And,

1:37:20

and I think also spreading awareness

1:37:21

around us.

1:37:22

Like,

1:37:22

I know I'm always the first one to

1:37:23

be like, Hey, contact your politicians,

1:37:25

but, um,

1:37:26

I really think telling the parents around

1:37:29

you,

1:37:29

this takes away your agency as a parent.

1:37:32

What happens when there's a data breach

1:37:33

and your ID gets leaked?

1:37:35

I think those are things that will get

1:37:37

their attention and get them to sit up

1:37:39

and realize, oh, yeah,

1:37:41

maybe this isn't the best way to go

1:37:44

about this.

1:37:44

Because a lot of people really don't see

1:37:47

what the issue is, right?

1:37:48

There's all these false equivalencies,

1:37:50

like, oh,

1:37:51

you have to show ID to go into

1:37:52

a bar.

1:37:54

But it's just...

1:37:56

Yeah.

1:37:57

So and also real quick,

1:38:02

Swiss Kill said here is, you know,

1:38:03

it's more effective than stop that.

1:38:05

I also want to point out, like.

1:38:07

Be nice to people,

1:38:08

because if you just send them an angry

1:38:09

message about like you're an idiot and

1:38:11

this is the dumbest law ever,

1:38:12

like they're just going to put you on

1:38:13

the block list.

1:38:13

Well,

1:38:13

I don't think legally they can block you,

1:38:15

but they're just going to ignore you.

1:38:16

So, yeah,

1:38:17

my mom used to say you catch more

1:38:18

flies with honey than vinegar.

1:38:19

So, yeah, I don't know.

1:38:25

Could you maybe offer some...

1:38:27

How exactly can you get in contact with

1:38:30

this person in particular?

1:38:33

That's a good question.

1:38:36

Hold on.

1:38:36

Let me look it up here because I

1:38:37

did...

1:38:38

I'm not going to show my own blog,

1:38:39

but I did write this really long opinion

1:38:41

piece on my own blog.

1:38:43

And I did include...

1:38:46

Um, so congress.gov, house.gov,

1:38:48

senate.gov are all websites you can use to

1:38:51

find your state-level politicians in the

1:38:53

US. You probably want those people

1:38:55

right now because, um,

1:38:57

this is a national law.

1:38:59

There's also commoncause.org and usa.gov

1:39:03

are some additional websites to help you

1:39:04

figure out who are your representatives,

1:39:07

which honestly, if you just web search,

1:39:08

like who are my political representatives,

1:39:10

usually several websites will pop up and

1:39:12

you will, um,

1:39:14

In case anyone is not aware,

1:39:15

you will have to put in your address

1:39:17

because that's what determines what

1:39:19

districts you fall in and stuff.

1:39:20

But yeah, I don't know.

1:39:23

To me, it's worth it.

1:39:26

Yeah.

1:39:27

And, and real quick, Canada said,

1:39:29

I was under the impression that Google and

1:39:30

Apple oppose age verification.

1:39:32

They all do,

1:39:33

which I think should be extremely telling

1:39:34

that, like, Meta doesn't want to do this.

1:39:36

OpenAI is one of the companies that's

1:39:37

been lobbying these groups behind the

1:39:39

scenes.

1:39:39

I forget where that came from recently,

1:39:41

but yeah,

1:39:42

like Google and Apple have openly pushed

1:39:44

back against the App Store Accountability

1:39:47

Act.

1:39:47

Like nobody wants to be responsible for

1:39:51

this data.

1:39:52

which to me is extremely telling.

1:39:54

Like the one time that all these companies

1:39:57

that are just built on violating your

1:39:58

privacy, monetizing your data,

1:40:00

collecting every... Like Meta...

1:40:04

built an app that purposely opened up

1:40:08

ports that it doesn't normally open up to

1:40:10

get around the sandboxing built into the

1:40:12

phone.

1:40:12

I think this was on Android,

1:40:14

but it may have been iPhone.

1:40:15

It may have been both.

1:40:16

I can't remember.

1:40:17

But either way,

1:40:18

like I forget the exact details of the

1:40:20

story,

1:40:20

but they purposely found ways to get out

1:40:23

of the sandbox and bypass the protections

1:40:25

built into the device to spy on the

1:40:27

other apps on your phone.

1:40:29

This is the same company who said,

1:40:31

we don't want to be responsible for this.

1:40:32

And I think that should be extremely,

1:40:34

extremely telling.

1:40:36

Thank you for coming to my TED Talk.

1:40:38

Tip your servers.

1:40:41

Yeah, that's all I got.

1:40:43

Nice.

1:40:46

Yeah.

1:40:46

So I guess with that being said...

1:40:51

I guess in a minute we can start

1:40:53

taking viewer questions.

1:40:54

We've already kind of had a couple here.

1:40:57

So if you've been holding on to any

1:40:59

questions about any of the stories we've

1:41:01

been talking about so far,

1:41:02

go ahead and start leaving them.

1:41:04

You can either leave them in the chat

1:41:05

or you can also leave them in the

1:41:07

respective forum thread for this live

1:41:09

stream.

1:41:11

And for now,

1:41:12

let's check in on our community forum.

1:41:16

So there's always a lot of activity over

1:41:18

there.

1:41:19

And this week was no,

1:41:26

You know what I'm trying to say?

1:41:26

No exception.

1:41:28

No exception.

1:41:29

Yeah, this was a very, very busy week.

1:41:31

So I guess this first one here is

1:41:34

there was a Visa card vulnerability.

1:41:36

So if you haven't seen this already,

1:41:37

there was a video from Veritasium,

1:41:39

which I'll just quickly...

1:41:43

recap what the video was about. Basically,

1:41:45

they did a collab with MKBHD, where

1:41:48

they basically had his uh phone and they

1:41:52

were able to extract ten thousand dollars

1:41:55

from his credit card without any input

1:41:59

from him. Like, they just had his phone

1:42:00

and they were able to extract ten thousand

1:42:02

dollars

1:42:04

Um, so that was kind of concerning, uh,

1:42:07

definitely a very interesting video.

1:42:09

I haven't had time to watch the

1:42:10

entire thing yet.

1:42:12

Um,

1:42:12

cause this week has been incredibly busy,

1:42:15

but, um,

1:42:18

Definitely worth checking that out.

1:42:20

And I'll just highlight Jonah's comment

1:42:22

here because he did watch the video.

1:42:25

I just finished watching this video a

1:42:26

minute ago.

1:42:27

I knew this would be express transit

1:42:29

related,

1:42:30

but this interplay between that and Visa

1:42:32

is interesting.

1:42:33

So basically the way that this kind of

1:42:35

exploited it is it uses this thing called

1:42:37

Express Transit mode,

1:42:39

which

1:42:41

I mean,

1:42:42

I can't comment on whether this is

1:42:45

common in the US, but in Australia,

1:42:47

like it's kind of very common.

1:42:49

So basically when you tap onto public

1:42:53

transport,

1:42:53

you can basically use your phone to tap

1:42:59

on,

1:43:00

but you'll have to also authenticate

1:43:02

yourself.

1:43:02

Usually that's how it normally works.

1:43:05

But if you enable Express Transit mode,

1:43:08

it actually just allows you to tap your

1:43:10

phone without authenticating at all.

1:43:13

So that's why this becomes a bit of

1:43:16

a problem, right?

1:43:16

Because it would be fine if you had

1:43:19

to authenticate and then it goes through.

1:43:22

But basically this exploit was able to

1:43:24

basically extract ten thousand dollars

1:43:26

from MKBHD's phone and

1:43:29

without him verifying anything.

1:43:33

And apparently this was due to a

1:43:34

vulnerability in Visa.

1:43:38

They didn't have any,

1:43:40

they didn't like cryptographically check

1:43:41

the transaction or something.

1:43:43

I'm not entirely sure of what the

1:43:44

specifics are behind that,

1:43:46

but there were some more people saying,

1:43:50

there was another post here from Jonah

1:43:52

asking whether Express Transit mode was

1:43:54

enabled by default with a credit card on

1:43:56

their device.

1:43:59

I mean, I can comment on that,

1:44:01

that it is a specific thing you need

1:44:02

to enable.

1:44:03

And sometimes it does.

1:44:05

We don't have public transit in the US.

1:44:08

I thought you did.

1:44:08

I thought you did.

1:44:10

Oh, is it private?

1:44:12

We do, but it's pretty garbage.

1:44:15

So for all intents and purposes, we don't.

1:44:18

Years ago when we were still dating and

1:44:19

we first started living together,

1:44:20

my wife had a job that was maybe

1:44:24

about a twenty minute drive by car.

1:44:27

And long story short, I had had...

1:44:31

The particular place we lived at,

1:44:32

it was really easy for me to get

1:44:34

downtown to go to work.

1:44:35

And so I was like,

1:44:35

you should try the bus one of these

1:44:36

days.

1:44:37

Just try it.

1:44:38

You don't have to drive.

1:44:38

You don't have to park.

1:44:39

It's really handy.

1:44:40

It took her three hours by bus.

1:44:42

And she never did that again.

1:44:43

So our public transit in the US is

1:44:46

absolute garbage.

1:44:47

But continue.

1:44:48

Oh, I see.

1:44:51

OK.

1:44:51

I definitely have seen some public transit

1:44:54

stuff.

1:44:55

I mean,

1:44:55

I think there's definitely some places

1:44:57

where it's a little bit better,

1:44:58

from what I've heard.

1:45:00

Yeah, like New York is okay.

1:45:02

I had good experiences in San Francisco,

1:45:04

although I know everybody who's from those

1:45:06

places are just like, really?

1:45:07

But yeah,

1:45:10

it's definitely not great in most places.

1:45:12

Okay, right.

1:45:14

I don't know.

1:45:14

We've basically had this massive blitz

1:45:18

from Apple in Sydney where they were

1:45:21

basically saying, like,

1:45:23

use Apple Wallet to use transit.

1:45:26

Use Express Transit mode to speed up your

1:45:29

commute, like all these ads from Apple,

1:45:32

which is kind of funny because...

1:45:34

Now we're learning that there's a

1:45:36

vulnerability with visa cards and express

1:45:39

transit mode, um, which, yeah, um,

1:45:44

I personally enabled it once and I

1:45:47

accidentally tapped onto some transport

1:45:53

twice and then I disabled it because,

1:45:55

yeah,

1:45:55

you probably don't want it to

1:45:56

automatically activate like that.

1:46:01

But it is something you have to opt

1:46:02

into and it is part of the flow

1:46:04

when you set up a credit card in

1:46:06

Apple Wallet.

1:46:07

I do wonder if this...

1:46:09

could also be exploited on Google.

1:46:13

Um, it does say in this video,

1:46:14

there's a picture of a Google Pixel in

1:46:16

the thumbnail and, um, it says safe,

1:46:19

but I believe express transit mode is also

1:46:22

available on Google Wallet as well.

1:46:25

Um,

1:46:25

but maybe there's more checks going on

1:46:27

there that secures that better.

1:46:29

Um,

1:46:31

But yeah,

1:46:32

there were some comments responding to

1:46:33

Jonah's thread there saying that their

1:46:36

credit card wasn't enabled by default with

1:46:38

this feature.

1:46:39

So unless you accidentally enabled it or

1:46:43

did something in the setup process,

1:46:47

then it's probably not enabled.

1:46:50

I think this was kind of unfortunate for

1:46:53

MKBHD because he just got ten thousand

1:46:55

dollars removed from his credit card.

1:46:58

Obviously, they gave it back.

1:46:59

But, you know,

1:47:00

it's like, imagine if he wasn't...

1:47:06

Imagine if that was an attacker,

1:47:07

that would be ten thousand dollars stolen

1:47:10

and all you'd have to do is steal

1:47:11

someone's phone.

1:47:12

So, you know, I think.

1:47:16

trying to reduce the things that thieves

1:47:20

can do with a mobile device is good

1:47:24

because it makes it less likely to be

1:47:25

stolen.

1:47:26

Um, I don't think stealing an

1:47:29

iPhone or a Google Pixel or any

1:47:31

of these other devices is a very good

1:47:33

idea.

1:47:33

You're basically stealing a tracking

1:47:35

device at that point.

1:47:38

so yeah.

1:47:39

Um,

1:47:41

Definitely an interesting thread there

1:47:45

with some discussion.

1:47:46

Do you have any thoughts on this one,

1:47:47

Nate?

1:47:49

Yeah,

1:47:49

I think it was really the video that

1:47:52

everybody found interesting.

1:47:53

But it seems that this is primarily

1:47:56

limited to Visa cards.

1:47:58

Like, again,

1:47:58

that comment you were looking at from

1:48:00

Jonah, he said...

1:48:02

Again,

1:48:03

I didn't watch the video either because

1:48:04

it's been a busy week.

1:48:05

But he said,

1:48:06

I'd have to agree with Apple that this

1:48:08

is primarily a Visa issue.

1:48:09

But Visa's point that it is not worth

1:48:10

fixing is probably accurate too.

1:48:12

So I definitely want to try to watch

1:48:14

the video this weekend.

1:48:15

But yeah,

1:48:17

I was kind of asking Jonah a little

1:48:18

bit more about this before we started

1:48:20

streaming.

1:48:21

And there's not really any defenses at

1:48:23

this time other than just to disable the

1:48:26

the automatic transit or whatever it's

1:48:28

called, the express transit.

1:48:31

And so it's just kind of a reminder,

1:48:32

I guess that like privacy and well, yeah,

1:48:36

privacy and security and convenience are

1:48:40

almost always... I want to push back

1:48:42

on "always," because I think there's actually

1:48:43

been a few times that privacy and security

1:48:45

have actually made my life more convenient.

1:48:46

But definitely ninety-plus percent of the

1:48:49

time they are on opposite ends of the

1:48:51

spectrum with each other. And that's kind

1:48:53

of part of a threat model, right? It's:

1:48:54

you have to ask, like, what am I

1:48:56

trying to protect? Who am I trying to

1:48:57

protect it from? How much trouble am I

1:48:59

willing to go through to protect this

1:49:00

thing? And I think

1:49:04

I don't really use a lot of tap-to-pay

1:49:07

stuff myself,

1:49:08

mostly just because my phone doesn't

1:49:10

support it.

1:49:11

So I can't say for certain,

1:49:13

but I would have to imagine that for

1:49:15

most people, it's pretty like...

1:49:19

It's probably not the end of the world

1:49:21

to disable this express transit.

1:49:23

Sure, it'll slow you down a little bit.

1:49:25

And I mean, I also have to ask,

1:49:28

again, I didn't watch this video,

1:49:30

but genuinely asking,

1:49:32

how easy would this be to pull off?

1:49:33

Because just because it can be done,

1:49:35

I mean,

1:49:36

we can put people on the moon.

1:49:39

Kind of hard.

1:49:40

We haven't done it a whole lot.

1:49:42

So, you know, it's the same thing here.

1:49:43

Like,

1:49:44

just because this can be done doesn't

1:49:45

necessarily mean that it's something that

1:49:46

you have to worry about every random

1:49:48

person on the street doing this.

1:49:50

So,

1:49:52

if it's something that's very unlikely and

1:49:54

you're in a really,

1:49:55

really busy area where it's like, no,

1:49:57

dude, that extra, like,

1:49:58

two seconds it would take me to do

1:50:00

this would actually kind of add up over

1:50:02

time and get really annoying.

1:50:03

Like, okay,

1:50:04

maybe it's worth leaving it on.

1:50:05

But if it...

1:50:08

if it's not really going to impact your

1:50:09

life,

1:50:09

it's probably better to err on the side

1:50:11

of caution.

1:50:13

And somebody also said here that

1:50:14

MasterCard has resolved this issue and

1:50:16

Visa's stance is that the possibility of

1:50:19

this happening is so small.

1:50:20

I agree with you.

1:50:21

If it's one of those things where it's

1:50:22

like,

1:50:22

we know there's a solution and there's

1:50:24

really no reason not to do it.

1:50:25

I mean, that's what I'm basically saying,

1:50:26

right?

1:50:27

Like if you have no reason not to

1:50:29

turn the setting off,

1:50:30

then just turn it off.

1:50:32

And I agree with you a hundred percent.

1:50:33

Like if Visa could easily fix this,

1:50:35

then they really should.

1:50:36

But yeah,

1:50:38

It doesn't sound like they're going to do

1:50:39

that anytime soon.

1:50:40

So unfortunately, it's on us.

1:50:42

As usual,

1:50:45

it's on us to care about our own

1:50:46

privacy because these companies do not,

1:50:47

or our own security in this case,

1:50:49

because these companies do not.

1:50:50

So I think that's kind of my takeaway

1:50:52

from that one.

1:50:54

Yeah,

1:50:54

I just want to highlight Pineapple

1:50:56

Express's comment here.

1:50:58

Transit...

1:51:00

That's a good comment.

1:51:01

Yeah.

1:51:02

Thanks for commenting.

1:51:04

Transit.

1:51:05

Thanks for adding to the discussion.

1:51:10

But I think, yeah.

1:51:10

Wasn't Pineapple Express a type of weed in

1:51:12

a movie?

1:51:14

Sorry.

1:51:16

Possibly.

1:51:17

It's an old movie.

1:51:19

Yeah, it's definitely,

1:51:21

I feel like it's definitely some

1:51:22

references in the chat usually.

1:51:25

But yeah, I think, yeah, I mean,

1:51:28

I think it's like,

1:51:29

so the process between like,

1:51:34

I feel like it's the...

1:51:36

the process between authenticating and

1:51:38

tapping is so short that it's like,

1:51:41

really, like, are we really doing,

1:51:42

is this really necessary?

1:51:44

Um,

1:51:45

so I feel like it's not really that

1:51:47

much of a concern.

1:51:48

Just disable it.

1:51:49

Just don't use this feature.

1:51:50

Like it's, I don't know.

1:51:53

You kind of know when you're going to

1:51:55

get off transit, you know,

1:51:56

when you're going to get off a train,

1:51:58

you know,

1:51:59

when you're going to get off a bus,

1:52:01

a ferry, whatever.

1:52:02

Um, so, you know, just

1:52:05

time it with how you're doing it,

1:52:06

just authenticate and then tap.

1:52:09

I think that's the easiest way to get

1:52:10

out of

1:52:12

falling into this issue.
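To make the tradeoff discussed above concrete, here is a toy sketch of the two tap-to-pay flows. This is purely illustrative pseudologic, not real EMV/NFC protocol behavior, and all names are made up:

```python
# Toy model: why skipping user authentication widens the attack surface.
# NOT real wallet or EMV logic -- just the shape of the flow described above.

class Wallet:
    def __init__(self, express_transit_enabled: bool):
        self.express_transit_enabled = express_transit_enabled

    def tap(self, terminal_claims_transit: bool, user_authenticated: bool) -> bool:
        """Return True if the tap is approved."""
        # Normal flow: every payment requires Face ID / PIN first.
        if user_authenticated:
            return True
        # Express Transit flow: a terminal that merely *claims* to be a
        # transit reader is approved with no authentication at all --
        # which is exactly what the demonstrated attack abuses.
        return self.express_transit_enabled and terminal_claims_transit


wallet = Wallet(express_transit_enabled=True)
# A malicious reader posing as a transit gate gets approved unattended:
print(wallet.tap(terminal_claims_transit=True, user_authenticated=False))   # True
# With Express Transit disabled, the same unattended tap is rejected:
print(Wallet(False).tap(terminal_claims_transit=True, user_authenticated=False))  # False
```

The takeaway matches the advice in the episode: disabling the feature removes the only approval path that works without the owner's involvement.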

1:52:14

But I guess there was also another thread

1:52:16

here from someone talking about airplane

1:52:20

mode on GrapheneOS.

1:52:23

I'm just going to read their comment.

1:52:24

I'm not going to mention their name for

1:52:26

privacy reasons.

1:52:27

I think people should be aware that

1:52:29

airplane mode on GrapheneOS doesn't

1:52:31

completely turn off the SIM as you can

1:52:33

still receive and make calls over Wi-Fi,

1:52:36

a technology known as VoWiFi.

1:52:41

I am not certain about it,

1:52:42

but I think this means your ISP can

1:52:44

know your location,

1:52:45

at least when you stay home.

1:52:47

VoWiFi might only work on a router from

1:52:50

the same ISP as your mobile.

1:52:52

You can disable it in the SIM settings.

1:52:56

This is interesting.

1:52:57

Do you have any thoughts on this?

1:52:59

I don't even know if this is a

1:53:00

thing in Australia.

1:53:02

So you guys don't have airplane mode in

1:53:04

Australia?

1:53:06

I mean, the Vo... VoWiFi.

1:53:08

Oh, voice over Wi-Fi.

1:53:09

Yeah.

1:53:10

I don't know if we have that here.

1:53:11

Like I know there is, um,

1:53:13

in a lot of phones,

1:53:14

there's a setting to enable Wi-Fi calling,

1:53:16

uh, which maybe that's the same thing,

1:53:18

but, uh,

1:53:19

maybe voice over Wi-Fi is like the protocol

1:53:22

that enables that.

1:53:22

And that's just what the setting's called:

1:53:24

enable Wi-Fi calling.

1:53:25

But yeah.

1:53:26

Um, no, I think I,

1:53:27

I wanted to highlight this because, uh,

1:53:28

we do recommend on privacy guides to use

1:53:32

airplane mode whenever possible.

1:53:34

And, um,

1:53:36

I think I just really wanted to point

1:53:37

out that this is one of those things

1:53:41

where it's like it's kind of a very

1:53:42

niche, like a more advanced thing,

1:53:45

but it's still something that's good.

1:53:46

Like it's always good to have things on

1:53:47

your radar, right?

1:53:48

It's always good to have that information

1:53:50

and make decisions accordingly.

1:53:51

So here's actually one of the comments

1:53:56

that we wanted to highlight to kind of

1:53:57

explain this.

1:54:00

Disabling your SIM does not...

1:54:02

Where does it go?

1:54:04

Airplane mode is intended to disable

1:54:06

cellular radios, not your SIM,

1:54:08

and is well documented on how it works

1:54:09

on every mobile OS.

1:54:11

I think they said that Graphene documented

1:54:13

that.

1:54:14

Graphene has really good documentation.

1:54:16

They said, likewise,

1:54:16

disabling your SIM does not disable your

1:54:18

cellular radios,

1:54:19

and your device will still ping cell

1:54:20

towers unless you enable airplane mode.

1:54:22

It's in the name, really. Airplane mode

1:54:23

exists solely to comply with regulations

1:54:25

requiring cellular radios to be completely

1:54:26

turned off.

1:54:27

The privacy factors are a side effect.

1:54:29

So basically, I think what

1:54:32

they're saying is that if you enable

1:54:36

airplane mode,

1:54:37

you are turning off the radios,

1:54:40

but not necessarily the SIM card itself.

1:54:42

So if you do have other things turned

1:54:43

on like voice over Wi-Fi, then that is a,

1:54:49

potentially, if you're not using a VPN,

1:54:51

for example,

1:54:51

I'm assuming a VPN would beat that because

1:54:53

it's voice over Wi-Fi.

1:54:55

I mean,

1:54:55

it's a really good thread because a lot

1:54:56

of people talked about,

1:54:58

apparently on some phones,

1:54:59

the voice over Wi-Fi still goes outside

1:55:01

the VPN,

1:55:01

but Graphene tries to send everything

1:55:04

through the VPN as much as possible.

1:55:06

So it's one of those things where, again,

1:55:08

I think this is probably...

1:55:11

a more extreme privacy thing.

1:55:13

I think it's probably not going to make

1:55:15

or break most people,

1:55:16

but it's still definitely something that

1:55:18

you should know of and you should be

1:55:20

aware of.

1:55:21

And if that is part of your threat

1:55:22

model, you should factor that in.
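A toy model of the distinction the thread draws (airplane mode cuts the cellular radio, not the SIM, and Wi-Fi calling can still carry calls once Wi-Fi is re-enabled) might look like this. It is purely illustrative, not how any real baseband or OS works:

```python
# Illustrative sketch of the airplane-mode behavior discussed above.
# Field names are hypothetical; this is a mental model, not real firmware.

class Phone:
    def __init__(self):
        self.airplane_mode = False
        self.wifi_on = False
        self.wifi_calling = False  # the "Wi-Fi calling" / VoWiFi setting

    def cellular_radio_active(self) -> bool:
        # Airplane mode disables the cellular radio...
        return not self.airplane_mode

    def can_receive_calls(self) -> bool:
        # ...but calls can still arrive over Wi-Fi if VoWiFi is enabled.
        return self.cellular_radio_active() or (self.wifi_on and self.wifi_calling)


p = Phone()
p.airplane_mode = True   # cellular radio off...
p.wifi_on = True         # ...but Wi-Fi turned back on, as many people do
p.wifi_calling = True
print(p.can_receive_calls())  # True -- calls still reach you over Wi-Fi
p.wifi_calling = False
print(p.can_receive_calls())  # False
```

Which is the point the forum commenter was making: airplane mode alone doesn't silence the SIM's services if VoWiFi stays on.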

1:55:24

It's good information to have because I

1:55:27

was also kind of under the impression that

1:55:31

I don't know.

1:55:31

I think I was kind of under the

1:55:32

impression that turning on airplane mode

1:55:34

would kind of turn off the SIM,

1:55:36

or at least,

1:55:38

I don't know what impression I was

1:55:38

under, to be honest.

1:55:39

But it's definitely something interesting

1:55:42

to keep in mind, for sure.

1:55:44

And yeah, I see you highlighted,

1:55:45

Jonah said that voice over Wi-Fi and

1:55:48

enable Wi-Fi calling are the same thing.

1:55:50

So good to know.

1:55:51

OK.

1:55:52

Yeah,

1:55:52

I've never heard it called VoWiFi

1:55:54

before.

1:55:54

I thought it might be a different thing.

1:55:56

But yeah, Wi-Fi calling, we do have that.

1:55:59

Yeah, same here.

1:56:00

I think one thing as well is airplane

1:56:03

mode.

1:56:05

As far as I'm aware,

1:56:07

like if you make a call to emergency

1:56:10

services,

1:56:11

it still connects to the tower as well.

1:56:15

So yeah,

1:56:17

I think I'm not entirely sure if it's,

1:56:20

I'm pretty sure the whole point of

1:56:21

airplane mode was to stop signals coming

1:56:24

out of the device when you're in an

1:56:26

airplane.

1:56:26

So yeah,

1:56:27

I guess that's fine,

1:56:30

except if you launch an emergency call,

1:56:32

I guess.

1:56:32

I guess there's maybe laws that have to,

1:56:36

that say it has to be bypassed for

1:56:37

emergency situations.

1:56:39

I'm not sure.

1:56:43

Yeah, and to be honest, I didn't,

1:56:46

I don't know how it works in terms

1:56:48

of bypass.

1:56:49

Like, I don't know if, I'm assuming not,

1:56:53

just based on the true crime stories I've

1:56:55

heard.

1:56:55

I don't know if cops can like,

1:56:58

still continue to track you. Like, okay,

1:57:00

obviously what I'm saying is, if I have

1:57:02

airplane mode on and I call nine one

1:57:04

one, yes, it's gonna go through. What

1:57:06

I don't know is: can they reverse that?

1:57:09

Could the cops just surreptitiously decide

1:57:11

to figure out where I am when I

1:57:13

have airplane mode on? My money says no,

1:57:15

but I could be wrong on that one.

1:57:17

Yeah, I don't know. It's interesting

1:57:20

stuff. I think one thing... nah, that's

1:57:25

not really relevant. I was going to talk

1:57:26

about the different

1:57:29

triangulation with cell versus Wi-Fi,

1:57:31

but I don't think that's really relevant

1:57:33

to this.

1:57:33

So it's, it's interesting stuff though.

1:57:36

Like I said,

1:57:36

I think it's one of those things that

1:57:37

if you guys have some time,

1:57:38

definitely go check out that thread and

1:57:40

just kind of give it a quick browse.

1:57:42

Cause it's, it's,

1:57:43

it's not a very long thread.

1:57:44

I think there were only what,

1:57:45

like not even ten replies or something.

1:57:47

And so it's just one of those,

1:57:49

like the more, you know, kind of things.

1:57:56

On that note,

1:57:57

we're going to take viewer questions and

1:57:59

we're going to start with the questions on

1:58:01

our forum from our paying members.

1:58:03

You can become a member by going to

1:58:05

privacyguides.org and clicking the red

1:58:07

heart icon in the top right corner of

1:58:09

the page.

1:58:10

Or I keep forgetting,

1:58:11

we also have privacyguides.org slash

1:58:13

donate, which will take you right there.

1:58:16

Um,

1:58:17

so we only had one question this week

1:58:20

and somebody said that privacy guides

1:58:22

currently does not recommend to enable the

1:58:24

"Tell websites not to sell or share my

1:58:25

data" feature in Firefox.

1:58:27

Should we enable this?

1:58:28

If so, is it still worth enabling

1:58:30

even if you don't reside in a jurisdiction

1:58:31

that makes GPC opt-out functional,

1:58:33

but more of a statement of preference?

1:58:34

So, um,

1:58:37

I have a lot of beef with Mozilla,

1:58:39

but one thing I will give them,

1:58:41

it's both a pro and a con:

1:58:42

yes.

1:58:42

If you click the button that says tell

1:58:44

websites not to sell or share my data,

1:58:47

on Firefox,

1:58:50

that does not enable Do Not Track.

1:58:51

That enables GPC.

1:58:53

And on the one hand,

1:58:54

I wish they would make that a little

1:58:55

bit more obvious.

1:58:56

I did have to dig into the documentation

1:58:58

to learn that.

1:58:59

But on the other hand,

1:59:00

the average person probably doesn't know

1:59:02

the difference anyways.

1:59:02

So what does it matter?

1:59:05

Crap, I'm out of water.

1:59:07

So as I understand it,

1:59:10

and someone please correct me if I'm

1:59:11

wrong,

1:59:13

I don't think there's a drawback to

1:59:15

enabling GPC.

1:59:17

In the past,

1:59:18

Do Not Track had this thing where when

1:59:22

you enable Do Not Track,

1:59:23

it basically did something in the headers

1:59:25

that ironically made you stand out more.

1:59:28

It created a header that wasn't there,

1:59:30

and that was one more data point they

1:59:31

could use to track you.

1:59:32

And since there was no legal enforcement

1:59:34

behind it,

1:59:36

a lot of websites straight up say in

1:59:38

their privacy policy, they're like,

1:59:39

we do not respect Do Not Track requests.

1:59:42

which is crappy,

1:59:43

but at least they say it.

1:59:44

So...

1:59:46

I don't know.

1:59:48

What I was told is that the way

1:59:51

that GPC works is somehow more privacy

1:59:53

respecting.

1:59:54

And I don't,

1:59:55

the technical stuff goes over my head.

1:59:56

I don't understand how,

1:59:57

but it's one of those things where they're

1:59:59

not supposed to be able to track you.

2:00:00

Like that was a lesson learned from Do

2:00:01

Not Track is now we've implemented this in

2:00:03

a way where it cannot be used as

2:00:05

another fingerprint data point.

2:00:07

So even if you're not in an area

2:00:09

where GPC is required,

2:00:11

as far as I know,

2:00:12

it still doesn't hurt to turn it on.

2:00:15

And, you know, if they don't,

2:00:17

it's one of those things where, you know,

2:00:18

a lot of people say like,

2:00:21

there's no point.

2:00:23

Sorry, let me back up.

2:00:24

So I was told by a lawyer one

2:00:25

time that if you do not interact with

2:00:28

a cookie banner,

2:00:29

companies are supposed to treat that as

2:00:31

the same as saying, don't track me.

2:00:35

And they're not supposed to track you.

2:00:37

They're not supposed to put the cookie

2:00:38

there.

2:00:40

a lot of people will argue that like

2:00:43

the cookie banner doesn't really matter.

2:00:44

And they're just going to track you

2:00:45

anyways.

2:00:45

It's one of those things where like,

2:00:46

in my opinion,

2:00:47

it doesn't hurt to say no,

2:00:48

because it just, I don't know.

2:00:52

I'm having a hard time with words tonight.

2:00:53

It just doesn't hurt is what I'm getting

2:00:54

at.

2:00:55

As far as I know, if,

2:00:56

if it doesn't,

2:00:57

If the company's not going to respect it,

2:00:59

they're not going to respect it

2:00:59

regardless.

2:01:00

But if they do respect it,

2:01:02

it's not going to make you any more

2:01:03

fingerprintable.

2:01:03

I know I remember,

2:01:05

I wish I could remember what it was,

2:01:06

but there was a period where I was

2:01:07

like going to websites and I would keep

2:01:10

seeing a little pop-up just for a second,

2:01:13

a very non-intrusive pop-up.

2:01:14

Imagine that, crazy.

2:01:15

That just said like, hey,

2:01:16

we saw your browser has GPC.

2:01:18

We respect that and we're not tracking

2:01:19

you.

2:01:19

And I was like, holy crap, that's awesome.

2:01:21

I haven't seen it a lot lately,

2:01:22

but yeah.

2:01:23

So as far as I know,

2:01:24

in my opinion,

2:01:25

I think it's totally worth enabling.
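For reference, the GPC signal discussed here is concretely just an HTTP request header, `Sec-GPC: 1` (the legacy Do Not Track signal was `DNT: 1`, and GPC is also exposed to pages as `navigator.globalPrivacyControl`). A minimal sketch of how a site's server code might honor it; the function name is hypothetical:

```python
# Minimal server-side sketch of honoring the GPC opt-out signal.
# GPC is the "Sec-GPC: 1" request header; "DNT: 1" was the older signal.

def should_sell_or_share(request_headers: dict) -> bool:
    """Return False if the visitor has opted out via GPC (or legacy DNT)."""
    headers = {k.lower(): v.strip() for k, v in request_headers.items()}
    if headers.get("sec-gpc") == "1":  # Global Privacy Control opt-out
        return False
    if headers.get("dnt") == "1":      # legacy Do Not Track, best-effort
        return False
    return True


print(should_sell_or_share({"Sec-GPC": "1"}))  # False -- opted out
print(should_sell_or_share({}))                # True  -- no signal present
```

Because the header is the same single bit for everyone who enables it, it adds essentially no fingerprinting surface, which is the "lesson learned from Do Not Track" mentioned above.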

2:01:27

Um,

2:01:27

Jonah said we'll have to make a video

2:01:28

or something explaining it more.

2:01:30

So, uh, he didn't say I was wrong,

2:01:32

so that's good news.

2:01:33

I think,

2:01:33

I think I was right about that.

2:01:37

Yeah.

2:01:38

I mean, I agree with all points.

2:01:40

Well said.

2:01:41

Um,

2:01:41

I didn't really have anything to add to

2:01:43

that.

2:01:43

Um, yeah.

2:01:48

Cool.

2:01:49

That was our only question in the forum.

2:01:51

The only other one: someone said that Chrome

2:01:53

is planning to add the GPC toggle this

2:01:56

year.

2:01:58

We still don't recommend Chrome.

2:02:00

Someone else said that it is enabled by

2:02:02

default in LibreWolf, which makes sense.

2:02:04

And we do have one question in the

2:02:07

comments so far.

2:02:09

Swisskill is asking about any router

2:02:13

recommendations in the EU after the US

2:02:15

banned foreign-manufactured devices.

2:02:20

I don't have any reason to believe that

2:02:22

there's any backdoors.

2:02:23

I don't... Okay,

2:02:26

I'm going to be a little political here.

2:02:28

A lot of what the administration is doing

2:02:29

does not make sense,

2:02:30

even to a lot of us Americans.

2:02:32

Some of it does, I will say.

2:02:34

That doesn't mean I agree with it,

2:02:35

but some of it does have a logic.

2:02:37

Some of it very much looks like somebody

2:02:39

just woke up and decided something one

2:02:40

day.

2:02:41

And this is one of them where there's

2:02:44

no...

2:02:46

as far as we know,

2:02:47

at least there's absolutely no evidence to

2:02:49

suggest that any of these routers,

2:02:51

cause they're all like,

2:02:53

if you go back and watch, we,

2:02:54

we made this our headline story when this

2:02:56

happened on the podcast.

2:02:58

Um, so go out,

2:02:59

go back and check that one out.

2:03:00

I don't know what episode that is,

2:03:06

but we don't know of any existing backdoors.

2:03:08

All routers are currently foreign

2:03:09

manufactured anyways.

2:03:11

So this whole idea of, like, "the

2:03:12

US is banning foreign-manufactured routers."

2:03:14

The US is banning all routers,

2:03:15

basically. A quick update,

2:03:17

actually: Netgear finally got their first

2:03:18

exemption.

2:03:19

That was almost one of the stories we

2:03:21

covered, but a pretty crowded week.

2:03:22

So we decided that one was the weakest

2:03:24

one, but I don't know, personally,

2:03:27

I wouldn't worry about it.

2:03:28

What I would focus on instead is looking

2:03:29

for a router that's,

2:03:31

um,

2:03:32

compatible with open-source firmware like

2:03:34

OpenWrt.

2:03:35

Um,

2:03:36

I've had good experiences so far;

2:03:37

FreshTomato is still working great for me.

2:03:39

DD-WRT used to work really great up

2:03:41

until about a month or two ago.

2:03:43

Um, so yeah, I would,

2:03:45

I would focus more on like looking for

2:03:47

an OpenWrt router or something similar

2:03:49

personally.

2:03:51

That'd be my recommendation.

2:03:54

Yeah,

2:03:56

I feel like the big one that I

2:03:58

see a lot of people using is the

2:03:59

GL.iNet routers,

2:04:02

which I believe they all come with.

2:04:04

Well, not all of them,

2:04:05

but the majority of their more reasonably

2:04:09

priced ones support OpenWRT,

2:04:14

and they also have their own spin of

2:04:16

OpenWRT, which is what it comes with,

2:04:19

which is a bit more user-friendly because

2:04:22

OpenWRT is...

2:04:27

It allows you to do a lot,

2:04:28

but its interface is not the greatest,

2:04:35

let's just say.

2:04:37

I'm not like any networking expert,

2:04:41

but I have had issues with configuring

2:04:46

stuff properly because I'm not really...

2:04:50

super network savvy where like, you know,

2:04:52

it's so easy on, like, GL.iNet or

2:04:55

DD-WRT or FreshTomato to

2:04:59

basically,

2:05:03

you know, set up separate networks,

2:05:05

set up VPN connections,

2:05:06

all that stuff is a lot easier on

2:05:08

those.

2:05:09

Um,

2:05:09

so GL.iNet is one that I see

2:05:11

recommended a lot.

2:05:12

Um, I don't know if this...

2:05:14

this is probably a pretty regional thing,

2:05:16

but we have DrayTek.

2:05:17

Uh, I think they're a Taiwanese company,

2:05:20

but a lot of their routers also support

2:05:21

OpenWrt.

2:05:23

Um, yeah,

2:05:25

I can't really think of too many, uh,

2:05:28

other companies that I would, I mean,

2:05:31

I guess there's,

2:05:34

Yeah, I mean,

2:05:35

I can't really think of any European

2:05:37

companies that make routers, really.

2:05:39

Can you?

2:05:42

I think there's one.

2:05:43

Oh, my God.

2:05:44

Jonah and I talked about it because I

2:05:46

remember the subtitles got it right,

2:05:47

and I was like,

2:05:48

I've never heard of this company.

2:05:49

And so I had to ask him if

2:05:50

that's Microtech or something.

2:05:53

I think they're like a Finnish company.

2:05:56

Everybody's going to be so offended that I

2:05:57

can't keep my European country straight.

2:06:01

MicroTek is a Taiwanese company.

2:06:03

No, no, no.

2:06:05

There's another one.

2:06:08

There is.

2:06:09

God, what a...

2:06:12

Yeah, MikroTik.

2:06:14

Yeah, Pineapple Express got it.

2:06:16

It's not like that.

2:06:17

Latvian.

2:06:18

They're Latvian.

2:06:18

That's who they are.

2:06:19

Okay.

2:06:20

I knew they were European.

2:06:21

Apologies to Latvians.

2:06:24

But yeah,

2:06:24

so they're a Latvian network equipment

2:06:28

manufacturing company.

2:06:29

I don't know much about them,

2:06:31

but I remember Jonah mentioned them when

2:06:32

we were talking about this story in the

2:06:33

first place.

2:06:34

Yeah.

2:06:36

And I also just wanted to say,

2:06:37

I checked,

2:06:38

because I know we have a page about

2:06:39

routers.

2:06:40

OpenWrt and OPNsense are currently our

2:06:43

two top recommendations.

2:06:45

So if you can find something that's

2:06:46

compatible with those,

2:06:47

that would probably be your best bet.

2:06:51

Yeah.

2:06:51

I mean, you can also buy the...

2:06:53

I got the OpenWrt One, which is like...

2:06:58

It supports the OpenWrt project.

2:07:03

But again, that's...

2:07:05

as far as I'm aware that was coming

2:07:07

from China.

2:07:08

So, you know, Oh no, I guess,

2:07:11

but I feel like everything's made in

2:07:12

China.

2:07:13

So I feel like that's,

2:07:14

I haven't heard of an EU-made router

2:07:17

or anything.

2:07:18

So,

2:07:19

Yeah, I was going to say,

2:07:20

that was kind of the point that Jonah

2:07:22

and I kept harping on when we talked

2:07:24

about this story,

2:07:24

is that there are no American-made

2:07:27

routers.

2:07:28

They're all made in China,

2:07:29

except for apparently there's one from

2:07:30

Starlink,

2:07:31

which I'm sure is a total coincidence.

2:07:32

But anyways, so I mean,

2:07:34

this whole idea of like, yeah,

2:07:37

I don't know.

2:07:38

And I don't know if Europe's any

2:07:40

different, but here in America, for sure,

2:07:43

there are no

2:07:45

made-in-America routers.

2:07:46

Like there's some of them are designed

2:07:47

here from American companies like Netgear

2:07:49

and Cisco,

2:07:49

but they're all manufactured and assembled

2:07:53

in China or overseas.

2:07:55

So, yeah.

2:07:58

I am seeing some interesting stuff about

2:08:00

MikroTik.

2:08:02

Um,

2:08:02

apparently a lot of their stuff isn't made

2:08:05

in China now.

2:08:06

It's made in other countries.

2:08:07

So that is interesting.

2:08:10

Um,

2:08:12

so I guess we're seeing a lot of

2:08:13

companies kind of divesting from,

2:08:15

or at least trying to, uh, I guess,

2:08:19

uh, what do you call that word?

2:08:20

Like have multiple bases of manufacturing

2:08:25

Diversifying.

2:08:26

Thank you.

2:08:26

I don't know what it is today.

2:08:27

I can't find any words that I'm going

2:08:29

to say.

2:08:29

Me either.

2:08:30

Words are hard tonight.

2:08:33

But yeah, so yeah, I mean,

2:08:37

it's good to see that there's more stuff.

2:08:39

I mean,

2:08:39

I think it's still like the national

2:08:42

security concern is probably still the

2:08:44

same, right?

2:08:45

Like Vietnam or like Malaysia.

2:08:48

I mean,

2:08:48

there's still the possibility of them

2:08:50

being...

2:08:52

doing something sus,

2:08:53

but I think it's probably not that likely.

2:08:58

I mean,

2:08:59

I haven't seen any evidence that there's

2:09:01

been any routers that have been tampered

2:09:03

like that from, like,

2:09:04

any of these big American companies.

2:09:06

So I'm not sure how much of a

2:09:08

risk that is.

2:09:10

And just to point that out, yeah,

2:09:12

it's like we – first of all,

2:09:13

we don't have any evidence that there's

2:09:14

been any issues.

2:09:15

This is all stuff we went over in

2:09:17

the show.

2:09:18

And I think the bigger concern would be

2:09:19

like the cheap off-brand stuff or like the

2:09:22

knockoff stuff because we have seen –

2:09:25

I don't know about routers specifically,

2:09:26

but we have seen like Android TVs.

2:09:29

Like if you buy the really cheap Android

2:09:30

TVs on Amazon,

2:09:32

we've seen articles that talk about how

2:09:33

like, yeah,

2:09:34

a lot of them come preloaded with malware

2:09:36

and they run botnets and stuff like that.

2:09:38

So I think if you're getting a good

2:09:40

reputable name brand router from a

2:09:42

reputable source,

2:09:44

I don't think there's really that much to

2:09:45

worry about. And then I think if you

2:09:47

want to go the extra mile and be

2:09:48

extra safe, which of course we always

2:09:50

recommend, then you should put something

2:09:51

like OPNsense on there. I

2:09:54

definitely want to get the OPNsense one

2:09:55

next time I buy a router. I have

2:09:57

been very excited about that project. I

2:09:58

think it's really cool. It's just, my

2:10:01

current router still has a lot of life

2:10:02

left in it, so I'm not ready to

2:10:03

do that yet. But yeah, I don't

2:10:06

think it's a huge...

2:10:09

I really disagree with the government on

2:10:12

this whole like it's a risk thing because

2:10:14

it literally is just trust me, bro,

2:10:15

I said so.

2:10:16

And not to get too far off topic,

2:10:19

but that's an issue I've always had.

2:10:21

Like I've literally met people that when I

2:10:22

talk about privacy, they're like, well,

2:10:25

I have a buddy who works in national

2:10:26

security and he says like they've stopped

2:10:27

so many bad things.

2:10:28

And I'm like,

2:10:29

then your buddy needs to come forward and

2:10:31

tell us about that.

2:10:32

Because right now,

2:10:33

every study we have says that mass

2:10:35

surveillance has literally never done

2:10:37

anything

2:10:38

and always makes things worse rather than better.

2:10:41

And so if it is actually making the

2:10:43

world a better place,

2:10:44

we need to have that information so that

2:10:45

we can have this debate in good,

2:10:47

honest faith.

2:10:48

Because right now it doesn't seem like

2:10:49

that's the case.

2:10:50

And so that's how I feel about this

2:10:52

whole like router ban.

2:10:53

It's like, oh,

2:10:54

these things are national security risk.

2:10:55

Where's your evidence?

2:10:57

Because right now there is no evidence and

2:10:58

you sound like an idiot.

2:11:00

So yeah, I don't know.

2:11:03

That's my opinion.

2:11:04

Yeah, I think, yeah, I don't know.

2:11:08

I don't know what it's like in the

2:11:10

US really that much.

2:11:11

But in Australia, there's a lot of, yeah,

2:11:14

fear mongering about that sort of stuff.

2:11:17

How we need to have more laws to

2:11:24

see criminal stuff.

2:11:25

I mean,

2:11:25

we have the Assistance and Access laws,

2:11:29

which basically means that police get

2:11:32

access to stuff without a warrant and

2:11:34

stuff. You know, I think there's plenty

2:11:36

of countries that are doing a similar

2:11:37

thing. I just want to quickly

2:11:42

circle back to GL.iNet. Apparently, I mean,

2:11:46

I don't really research this

2:11:48

because I don't own a GL.iNet one. I

2:11:49

just see that that's what a lot of

2:11:51

people use. It does look like they

2:11:53

are

2:11:55

based in, at least according to their

2:11:57

website, one of their offices is in

2:12:01

Hong Kong and the other one is in

2:12:03

Shenzhen. So I guess just be aware

2:12:07

of that if that's a concern. I mean,

2:12:09

I think basically all these router

2:12:12

companies are. Even the OpenWrt One is

2:12:16

manufactured and, like, done in China, so I'm

2:12:20

not really sure what the risk is there

2:12:22

um

2:12:24

against another company. I think GL.iNet is

2:12:26

very reputable. So, what's someone saying?

2:12:32

Uh, Sino... Sinobu. Sinobu says it's actually worse

2:12:37

in China and North Korea, they're

2:12:38

constantly tracked. Yeah, yeah. So, like, in a

2:12:40

lot of these countries there is... I'm not

2:12:42

sure about North Korea, but I know I've

2:12:44

definitely seen stuff in China with, like,

2:12:46

you know, the mass surveillance they have.

2:12:48

They have

2:12:48

like more cameras than people, right? Like,

2:12:51

well, not more, but, like, they have a

2:12:53

lot of cameras. If you've ever been,

2:12:56

there's, like, cameras literally everywhere.

2:12:58

It'll be kind of a striking thing to

2:13:00

see. So I think, yeah, obviously

2:13:04

we don't want to have cameras literally

2:13:06

everywhere tracking everybody, or at least

2:13:09

recording what everyone's doing.

2:13:13

So yeah, I don't know.

2:13:16

It's kind of been a thing where I

2:13:20

feel like a couple of years ago,

2:13:21

people were kind of making a big thing about

2:13:25

how China had a digital ID system,

2:13:27

and it was super dystopian.

2:13:30

But now we're like, oh, no,

2:13:31

let's introduce a digital ID bill.

2:13:33

It's like, guys,

2:13:34

what about what you were saying a few

2:13:36

years ago?

2:13:37

What's happening?

2:13:39

I've literally seen some politicians here

2:13:41

in the US point out,

2:13:42

or maybe not the politicians,

2:13:43

but I've seen people point out,

2:13:44

they're like,

2:13:45

this is literally the stuff we criticize

2:13:47

Russia and China for.

2:13:48

Why are we doing this?

2:13:49

So yeah, it's not cool.

2:13:53

Yeah, it's kind of frustrating.

2:13:56

But yeah, I mean,

2:13:57

is there any other comments you can see

2:13:59

here that we haven't already got to?

2:14:02

No, I haven't seen anything.

2:14:05

Looks like everybody's been a little bit

2:14:07

quiet this week,

2:14:07

but we still appreciate you guys tuning in

2:14:10

and watching, even if you're lurking.

2:14:12

Thank you for listening.

2:14:14

All the updates from This Week in Privacy

2:14:16

will be shared on the blog every week,

2:14:18

so sign up for the newsletter or subscribe

2:14:20

with your favorite RSS reader if you want

2:14:22

to stay tuned.

2:14:22

I want to remind you guys,

2:14:23

we send the newsletter at the same time

2:14:26

that we go live,

2:14:26

so it also works as a really good

2:14:28

reminder that we're going live.

2:14:29

Little notification there.

2:14:31

If you prefer to listen on audio,

2:14:32

we also offer a podcast available on all

2:14:35

platforms and again on RSS.

2:14:37

And this video will be synced to PeerTube.

2:14:39

Privacy Guides is an impartial nonprofit

2:14:41

organization that is focused on building a

2:14:43

strong privacy advocacy community and

2:14:46

delivering the best digital privacy and

2:14:48

consumer technology rights advice on the

2:14:49

internet.

2:14:50

If you want to support our mission,

2:14:51

then you can make a donation on our

2:14:53

website, privacyguides.org.

2:14:55

To make a donation,

2:14:56

click the red heart icon located in the

2:14:58

top right corner of the page.

2:14:59

You can contribute using standard fiat

2:15:01

currency via debit or credit card,

2:15:03

or you can donate anonymously using Monero

2:15:05

or your favorite cryptocurrency.

2:15:07

Becoming a paid member unlocks exclusive

2:15:09

perks like early access to video content

2:15:11

and priority during our Q&A.

2:15:12

You'll also get a cool badge on your

2:15:14

profile in the forum and the warm,

2:15:15

fuzzy feeling of supporting independent

2:15:17

media.

2:15:18

So thank you again for watching and we'll

2:15:20

be back next week.
