Why Does Microsoft Hate Security?
Ep. 48

Episode description

Microsoft is suspending developer accounts for key privacy tools like VeraCrypt and WireGuard, employers are using your personal data to figure out the lowest salary you'll accept, and more! Welcome to This Week In Privacy #48!

0:04

All right. Microsoft abruptly terminated the accounts of two (multiple, actually) FOSS projects. Employers are now using personal data to figure out the lowest salary that you'll accept. And the FBI was able to recover deleted Signal messages from a device's local notification database. Busy week this week coming up on This Week in Privacy #48, so stay tuned.

0:46

Welcome back to This Week in Privacy, our weekly series where we discuss the latest updates on what we've been working on within the Privacy Guides community, and this week's top stories in data privacy and cybersecurity. I'm Jordan, and with me this week is Nate. How are you doing, Nate?

1:07

I'm doing pretty good. It's been a good week. How are you?

1:10

I'm doing good, getting ready to dive into this top story here. So this week, kind of a huge story that's been going around: Microsoft abruptly terminates VeraCrypt account, halting Windows updates.

1:24

So basically, quoting from the article: Microsoft has terminated an account associated with VeraCrypt, a popular and long-running piece of encryption software, throwing future Windows updates of the tool into doubt, VeraCrypt's developer told 404 Media. The move highlights the sometimes delicate supply chain involved in the publication of open source software, especially software that relies on big companies, even tangentially. According to the VeraCrypt developer: "I didn't receive any emails from Microsoft nor any prior warnings."

2:01

VeraCrypt is an open source tool for encrypting data at rest. Users can create encrypted partitions on their drives or make individual encrypted volumes to store their files in. Like its predecessor, TrueCrypt, which VeraCrypt is based on, it also lets users create a second, innocuous-looking volume if they are compelled to hand over their credentials.

2:21

And I'd also like to add that you can actually do full disk encryption with VeraCrypt. This is why this is quite a concerning move: if you lose access to your full disk encryption, then obviously you're not going to be able to access your files, so that is kind of a big concern here as well. And moving on to the second thing here: WireGuard VPN developer also can't ship software updates after Microsoft locks their account. If you didn't know already, WireGuard is a protocol that most VPN providers use to facilitate their VPN connections, and they also have an official WireGuard app that you can use with your WireGuard profiles. And this app on Windows currently can't be updated, because the WireGuard developer has also been locked out of their Microsoft developer account. That means they can't ship software updates to Windows users.
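For context, the WireGuard profiles mentioned here are just small plain-text configuration files that the official app imports. A minimal illustrative sketch; every key, address, and endpoint below is a made-up placeholder, not any real provider's values:

```ini
[Interface]
; this device's private key and tunnel address (placeholders)
PrivateKey = <client-private-key>
Address = 10.0.0.2/32
DNS = 10.0.0.1

[Peer]
; the server's public key and reachable endpoint (placeholders)
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
; 0.0.0.0/0 and ::/0 route all IPv4/IPv6 traffic through the tunnel
AllowedIPs = 0.0.0.0/0, ::/0
```

VPN providers typically generate these files for you; that the whole client configuration fits in something this small is part of why the protocol is considered so lean.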

3:18

Jason Donenfeld, the creator of the open source WireGuard VPN software, told TechCrunch that he has been locked out of his Microsoft developer account and as a result cannot sign drivers or ship updates for WireGuard for Windows users, which are critical for its software to run.

3:36

And I think probably the most concerning thing here is that if there was a critical vulnerability found in WireGuard VPN on Windows, they wouldn't be able to ship an update and get it fixed. So this is kind of a concerning thing to be happening right now.

3:53

But we did see that there was a sort of update to this story, so we did want to mention this immediately off the bat. Zack Whittaker from TechCrunch did a Mastodon post saying that he had heard from both VeraCrypt and WireGuard, telling him that they had regained access following their Microsoft account lockouts and can now release updates again. That was on the eleventh of April, and a lot of these articles were coming out on the eighth of April, so it took a couple of days for this to happen, and a lot of backlash, actually.

4:29

So there's this article here from Bleeping Computer that also goes a little bit more in depth about how this worked. According to the VeraCrypt developer, his account was actually terminated, and he basically needs a Microsoft developer account to sign Windows drivers and the bootloader. I believe this is because of the Secure Boot process: if the drivers aren't signed, they won't be able to load properly, or something like that.

5:01

"I've tried to contact Microsoft through various channels, but I've only received automated replies and bots. I was unable to reach a human. I cannot publish Windows updates."

5:11

So I think this story kind of highlights the concern of trusting a centralized entity with the update process. In this case, Microsoft has quite a lot of control over which developers can actually create apps and updates on their platform, which we obviously oppose, because people should be able to run whatever software they want on their computer. They shouldn't be held hostage by a corporation.

5:40

And I think in this case it also ended up being a kind of security risk, because they weren't able to release updates and there could have been a critical vulnerability found. But I do want to mention that WireGuard is notoriously very stable: the number of updates published for it has been quite minimal, and there haven't been that many critical vulnerabilities found. So that is a benefit. It's also because the WireGuard protocol is much leaner than other protocols like OpenVPN, so that does benefit it in that circumstance.

6:16

This article here from Bleeping Computer also stated that dev teams from Windscribe and MemTest86 have been locked out of their accounts too, so big issues for Windscribe and MemTest86 as well. And it is kind of concerning that this happened without any warning, no notification, just these developers trying to access their accounts and finding they weren't able to.

6:46

And like Jonah said here in the chat: "Almost like having a triopoly on app stores isn't a good idea." Yeah, I think we should be trying to focus more on having more app stores, having alternative ways to install software.

7:04

I mean,

7:04

one of the benefits with Windows is you

7:06

don't have to use the app store to

7:07

install apps.

7:08

You can actually install them

7:09

independently, which is a benefit.

7:11

But it also,

7:13

like we said in this story,

7:14

there is a...

7:16

element of control that Microsoft has

7:18

because you need their permission to sign

7:20

drivers and write to the bootloader.

7:23

So if you don't have that developer

7:26

account,

7:26

then you're not going to be able to

7:27

do that properly.

7:29

In that case,

7:29

your users would have to basically bypass

7:32

that,

7:32

which is a much more technical process.

7:35

Although it is possible,

7:37

it's not really recommended and a lot of

7:38

users are probably going to feel unsafe

7:40

about doing that.

7:42

So I feel like I've talked for quite a while here. Nate, do you have any thoughts on this story so far?

7:52

Oh man, I mean, I think you covered most of it for sure. I'm glad you mentioned the full disk encryption nature of VeraCrypt. I'm a VeraCrypt user myself, I will admit that. VeraCrypt can be used either to encrypt specific things, like you can create a container or encrypt an external hard drive, or, like you said, you can encrypt your entire Windows computer. My wife is primarily a Windows user; I kind of dual boot between Windows and Linux for the most part. I really just use this Mac for this and that, when I travel and stuff.

8:33

So where I'm going with that is, one of the first things we do is encrypt our computers with VeraCrypt. And the developer, I forget which article he said it in, he may have said it in all of them, but the developer pointed out that if they hadn't gotten this fixed in time before the certificate ran out, which I think would have been early June or end of June, sometime in June, then that could have potentially meant the computers wouldn't boot. And on top of it, since he can't push out an update, how are you supposed to make sure people know, like, hey, decrypt your computers because they might not boot? So it's really, really troubling for sure.

9:10

The thing I like about the Bleeping Computer article is that it gives more of Microsoft's side of the story. And I don't like that because I care what Microsoft's opinion is necessarily, but just to have the full, complete information, right? And Microsoft claims that ever since, I think it was April of last year, they've had this program where developers have to verify, which we're seeing now on the Android side of things, right? And on Android, I don't want to say it's not going well, because it's not been rolled out yet, but this just shows things can go wrong, even well-meaning things. There are problems with this model.

9:48

And so it's really confusing. Did these people just somehow miss the notifications? Was there some kind of glitch where they didn't get notified? Because the Microsoft spokesperson, I think it was a VP, was saying that they've been sending out emails, they've been sending out notifications. He said they've been emailing everyone since October. I think the guy from VeraCrypt said that he noticed this account was shut off in January; we're just now finding out about it for some reason. Not like he was hiding it; he was busy trying to figure it out. And for some reason it's just now that he's coming forward and being like, hey, here's where I've been for the past couple of months.

10:21

So, I mean, yeah, there are so many questions here, you know? Did something just get missed? I don't know. Why all of these? I have a hard time believing that, like, Windscribe, they have a whole team. How did a whole team miss this? And for the record, I'm not blaming Windscribe with that; I'm blaming Microsoft. I don't think Microsoft did a good job of notifying everybody, if they did at all. It's really weird. I don't like to assume malice. I don't like to look at this and be like, oh, Microsoft's trying to crush open source. But I certainly have a lot of questions, and I don't understand how so much fell through the cracks and so much got missed. And it really shouldn't take all this negative media coverage for Microsoft to be able to do something.

11:07

That's a big concern that I've had for years, that I noticed years and years ago. It was with Facebook specifically. I used to manage bands, and one of the bands I managed wanted to change their name because there was another band with the same name that got super, super popular. So they're like, we need to rebrand because we keep getting confused with them. And trying to get Facebook to rename this band was literally a multiple-months-long deal. And it's like, why can't I just talk to a person? It doesn't even have to be real time; I'm not asking to call somebody or chat with somebody. Why can't I email a human being? Everybody's trying to cut down on the cost of having a support staff, they're trying to save money, and it's all about shareholders making more money and stuff. But the result is things don't get fixed. Like, what if this had never taken off the way it had? This would have been a really big deal. This would have been really bad. Sorry, I'm having trouble putting my thoughts in order, but yeah, this was so, so not cool.

12:14

And I guess it makes me wonder. So, first of all, what do you think, if you have any opinions, what could developers do against this? Because it really does seem like Microsoft holds all the cards here. Because these are privileged pieces of software, right? I think that's the difference. We were talking about this in another chat earlier today: when I install things on Windows that don't come from the Microsoft Store, even when they're unsigned, there's a little pop-up that's like, hey, this is unsigned. And I can tell it, yeah, I know, install it anyway. But for these really high-level, very privileged things, I don't even know if there's a way around that. So, I mean, yes, in a perfect world, switch to Linux, right? But I don't know if there are any other alternatives there. Do you have any thoughts on that?

13:04

Yeah, I think, you know, when there's a gatekeeper and that gatekeeper is Microsoft, there is not a whole lot we can do in this case, unfortunately, which kind of sucks. Like you said, switching to operating systems that respect you and don't hold your computer hostage and do this sort of silliness. I mean, I'm sure there is a security benefit to doing this, but I think we have to look at things from different angles as well, right? Like, this power that Microsoft holds, what can it be used for? If they can lock people out of their accounts "accidentally", and I'm saying that in very large quotation marks, then what could they do if, say, there was a foreign government saying, we don't like that you're allowing people to use WireGuard VPN to bypass our firewall, or something like that, right? I think people should be able to use their computer in whatever way they want. We shouldn't be bowing to Microsoft's whims on what we do on our computers.

14:15

So ideally people are using Linux here. I mean, the ability to use different repositories is great. It's kind of unfortunate, though, that whoever controls the platform kind of gets to decide these things. And as far as I'm aware, Linux is basically the only system where there isn't a big tech corporation with ultimate control over everything, because with Android, like we've kind of seen, Google owns the Android source code, so a lot of times they can exert things onto the operating system that are against the users' interests. For instance, we've seen where they're stopping people from installing apps on their devices and stuff like that; we're still kind of monitoring that situation as well. But I think when we operate on platforms that are not controlled by the community, that's when we run into issues like this. So yeah, I mean, I don't really have too much more to add unless you have something.

15:28

No, yeah, I guess the last thing I wanted to mention is: do you happen to know off the top of your head what Privacy Guides' official encryption software recommendations are?

15:43

Off the top of my head, we do recommend VeraCrypt, I believe. Yeah.

15:49

Okay. Yeah, I've got it pulled up here. Let's see. Can I share? Yes, sharing this tab. Okay.

15:56

So it looks like, if you're trying to upload to the cloud, we do recommend Cryptomator. And we do recommend VeraCrypt for disk encryption. Interesting, I did not know we recommended that for full disk encryption, but yeah, like I said, it can also be used standalone, to encrypt something like an external disk. Oh, there it is: operating system encryption. We typically recommend whatever the OS comes built in with. So Linux distros will come with LUKS, which is the Linux Unified Key Setup. The only thing that kind of sucks is I'm told it can only happen at the start, like when you install Linux; I don't think you can go in and add it afterwards, at least not very easily. It's kind of clunky.
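One caveat to that: while encrypting the root filesystem really is easiest at install time, a secondary or external drive can be LUKS-encrypted at any point with cryptsetup. A hedged sketch, where /dev/sdX1 is a placeholder for a spare, non-system partition (and note this wipes it):

```shell
# WARNING: destroys any existing data on the target partition
sudo cryptsetup luksFormat /dev/sdX1        # write the LUKS header; prompts for a passphrase
sudo cryptsetup open /dev/sdX1 secretdata   # unlock; plaintext view appears at /dev/mapper/secretdata
sudo mkfs.ext4 /dev/mapper/secretdata       # put a filesystem inside the encrypted container
sudo mount /dev/mapper/secretdata /mnt      # use it like any other disk
# when finished:
sudo umount /mnt && sudo cryptsetup close secretdata
```

Retrofitting encryption onto an existing root filesystem is the genuinely clunky part; for a data drive like the above, it's a few commands.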

16:38

Macs come with FileVault, and Windows has BitLocker. It's kind of tricky because originally it was only in the upper-level versions, like Pro and Enterprise, which typically cost a lot of money. The Home version now has it if you use an online account, which we definitely do not recommend for a variety of reasons. But there are ways around that: you can upgrade to Pro, and you can usually find resellers online, make sure they're reputable, who will sell you a Pro license a lot cheaper, and stuff like that.

17:15

So, BitLocker. I will be honest, me personally, I'm not too crazy about BitLocker, because I've seen a lot of cases in the past where there's a vulnerability found in full disk encryption and BitLocker is vulnerable to it, but VeraCrypt is not. Or I've also seen stories where there's a bug and now BitLocker won't decrypt, and you can't boot your operating system; you have to recover it. So always make sure you save the recovery methods, because that's really important.
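On that point, Windows users can check BitLocker's state and pull up the recovery password from an elevated Command Prompt with the built-in manage-bde tool. A hedged sketch, assuming the system drive is C::

```bat
:: show whether C: is encrypted and with what method
manage-bde -status C:
:: list key protectors, including the 48-digit numerical recovery password
manage-bde -protectors -get C:
```

Copy that recovery password somewhere safe and offline; it's what gets you back in if BitLocker ever refuses to unlock on boot.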

17:41

But at the same time, to play devil's advocate: we know that BitLocker is secure. We did cover a story about this a while back where law enforcement requested some BitLocker keys from Microsoft, and Microsoft only had them because of the whole online account thing; they were not able to get them just from the device. As far as we know, BitLocker has not been broken by law enforcement or anything. But also, VeraCrypt users would have been in that same can't-boot situation had this account lockout not been resolved, right? And there's plenty of Windows software that breaks Windows even without encryption enabled. So I guess my point is, I think there are pros and cons, but BitLocker does have a lot going for it. Just something to be aware of.

18:25

Yeah. I guess that's all I got on that one. Also, just a reminder to get off Windows if possible, even if you dual boot. I mentioned I have Windows and Linux; I don't fully run Windows all the time. I try to use Linux whenever I can, and then I use Windows for the more CPU-intensive stuff, like video editing and stuff like that, that my Linux machine just can't really handle. I think that is all I've got.

18:55

All right. And if that's all we've got on that story, we are going to talk next about, what is it, surveillance wages. Because, you know, just when you think privacy can't get any worse. So this comes from MarketWatch, and it says employers are using your personal data to figure out the lowest salary that you'll accept. And, man, honestly, the headline kind of says it all.

19:23

So there's been a lot of talk lately about surveillance pricing, which is where, and as far as we know this is happening more online than in person, companies will use the data about you to try and figure out, like, oh, maybe you'll pay a little bit extra for this plane ticket, you'll pay a little extra for this. I think there was a story a while back that found that if you were in the parking lot of a Target and you opened the app or the website, they would actually try to charge you more, because they figured you probably weren't going to go to another store. Which, I don't know who's going to sit in the parking lot and then order something; that's kind of weird.

19:59

But there have also been a lot of allegations, like with Uber, for example: that if they can tell your battery is low, they'll charge you more, because they know you can't afford to wait for the price to go down. I don't think that was proven, but that's definitely an allegation, and I wouldn't put it past them, personally.

20:19

So yeah, now the new thing is using your personal data to figure out how much to pay you, which historically has been based on things like how long you've been in the industry, how long you've been at this job, what the actual job title you're applying for is, certifications, things like that, which I'm sure will probably still play a role. But of course, now they've got to factor in things like, where did it go... things like if you've taken out a payday loan, or if you have a high credit card balance. I just made a big move, and I'm not going to lie: we moved from a very high cost of living area to a lower cost of living area. We didn't have savings because the cost of living was so high, so I've got a pretty high credit card balance right now, because we used that to fund a lot of the move.

21:01

So, things like that. I think further down they did talk about... actually, I really feel the need to point out that this is not hypothetical, because I've been hearing for a long time now that this is how it works with gig work, things like DoorDash. There are certain people who start out making more. Like, okay, if I put in an order to DoorDash and it goes to two different dashers, one of them will see a higher price than the other one, based on things like how picky they are: do they just accept every single job that comes along, or do they wait for the better ones? And the ones that typically pay more will usually go to those people first, right? Which is so predatory. I feel like I'm getting ahead of myself, but...

21:53

I don't know. Yeah. Okay, we'll just jump into that part, because, again, this is the story. Well, okay, hold on. There is one more thing I want to point out before I jump into the analysis portion, which is what they say about the vendors that provide the tools that make this possible... oh, no, it's a little bit further up.

22:12

Yeah: a first-of-its-kind audit of five hundred labor-management artificial intelligence companies found that employers in healthcare, customer service, logistics, and retail are customers of vendors whose tools are designed to enable this practice. The report does not claim that all employers using these systems engage in wage surveillance. Instead, it warns that the growing use of algorithmic tools to analyze workers' personal data can enable such pay practices. And I skipped over it earlier, but when they talk about things like a high credit card balance, they can also scrape your social media to see if you are more likely to join a union or could become pregnant.

22:42

a union or could become pregnant.

22:44

And...

22:45

I guess the last thing I'll say is

22:46

I love down here.

22:47

They were talking about how there are laws

22:49

now that are trying to outlaw surveillance

22:51

pricing,

22:52

but a lot of them have not caught

22:53

up to wage surveillance wages,

22:55

except for one,

22:56

which is Colorado is trying to pass the

22:59

prohibit surveillance data to set prices

23:01

and wages act,

23:02

which would ban companies from using

23:03

intimate personal data.

23:05

Um, it carves out performance-based wages,

23:08

uh,

23:08

which I mean on the surface sounds fine.

23:12

Uh, I like that.

23:13

He says, uh,

23:15

Here it is.

23:15

The bill would prohibit companies from

23:17

using workers' personal data without their

23:18

consent to determine what they're paid.

23:21

So, I mean,

23:24

I read that without their consent.

23:25

I'm like, yeah, of course,

23:26

that's going to be buried on page fifty

23:27

of the contract, right?

23:28

That's ridiculous.

23:29

So, yeah, I don't know. Anyway, getting back to the analysis portion of this: this is really frustrating. I had the privilege of interviewing someone recently, who we'll talk about a little bit more later, coming up here, and she spoke about how a lot of this surveillance being used to set prices and set wages like this becomes deterministic, right? Because here's what I can see happening: you go up to your boss and you say, I think I deserve more. And your boss says... or even negotiating a new job, right? Because I've done that, where I get a job and they say, we'll pay you this much, and I say, I think I deserve more. And I've successfully done that.

24:09

And what happens when they come back and say, well, we can't? You guys may or may not know this: if you rent, sometimes you can negotiate your rent. I've done that. I've gone to the leasing office and been like, I don't want to pay more, let's see if we can come to an agreement, and they'll walk it down a little bit. And then I've been to other places where they're like, no, that's out of our control; we can't do anything.

24:29

And my fear is, as this continues to grow, we're going to see more of that second thing, where people are like, no, we can't do anything; it's set at what it is because that's what the algorithm says, and I don't have the authority to push back on the algorithm. Which is demeaning to your employees, but it's also deterministic. And it sets us in this, like, almost a lack of free will; I just don't want to use the word deterministic again.

24:53

But it sets us in this environment where we have no freedom, really. We have no growth, because now it doesn't matter how hard you work, it doesn't matter how good you do; you'll forever hit a cap, right? And sure, those things will matter, those things will help. But if you've got the high credit card debt, if you've got pro-union views on social media, now you can either choose not to talk about that, or you can choose to just forever not reach your full earning potential, which then keeps you trapped in this cycle. And it's just, oh my God, this is so predatory. And yeah, I don't know. I feel like I kind of rambled a little bit on that one, but hopefully I said something coherent.

25:37

Yeah.

25:37

I mean,

25:38

I think one important thing to add to

25:39

this is we already see like when it

25:43

comes to employment and like how much

25:46

people are paid,

25:46

like this is already an issue without like

25:49

the surveillance stuff, right?

25:50

Like,

25:52

we have studies that are done where,

25:53

you know,

25:53

there's people that apply with the same

25:55

resume,

25:56

but they change the name from a female

25:58

name to a male name.

25:59

And then the person gets employed more

26:02

under the male name,

26:03

and they get more interviews under the

26:05

male name than the female name.

26:07

I think this is just like increasing the

26:09

level of discrimination.

26:10

People are going to be finding themselves

26:12

in, right?

26:13

Like, Oh, your name appears like this.

26:15

And we already kind of know that these

26:17

AI systems are incredibly biased against,

26:21

um,

26:22

people of color, you know, uh,

26:25

marginalized groups that are less

26:26

represented.

26:27

The thing is,

26:29

these systems are trained on data

26:33

that doesn't have them as the majority.

26:36

Right.

26:36

So it's going to kind of deprioritize

26:39

their, uh,

26:42

their skills and their experience, right?

26:44

So I think this is sort of an

26:47

additional layer to that discrimination.

26:50

I think someone said here,

26:51

Plants McGee said,

26:53

giggles in European where this is illegal.

26:55

Yeah,

26:55

this is illegal in a lot of the

26:57

world, actually.

26:58

I'm kind of surprised that this isn't

26:59

illegal in the US,

27:01

but I guess that is the state of

27:03

things.

27:05

You know what, though?

27:06

I don't mean to cut you off,

27:07

but I'm glad you mentioned that because I

27:08

did want to mention that.

27:10

I don't want to get after anybody here,

27:12

but I just want to point out,

27:15

in my personal opinion,

27:16

I think that's a dangerous attitude to

27:17

have.

27:17

We're like, haha,

27:18

that wouldn't happen here.

27:21

Just, what was it, late last year,

27:22

early this year?

27:24

The EU is talking about rolling back parts

27:26

of GDPR to be more competitive in the

27:28

AI industry.

27:30

So like,

27:31

Yes.

27:31

Like laws help.

27:33

Laws are good.

27:33

That's I'm glad you guys have that,

27:35

but I just really feel the need to

27:36

point that out.

27:36

Like still keep an eye on this stuff

27:38

because laws can change.

27:40

And I am under no delusion that European

27:43

politicians care more about their citizens

27:45

than the US ones.

27:46

They're just, you know,

27:47

they put on a better facade about it

27:48

in my opinion, but yeah,

27:49

just to just keep that in mind,

27:51

laws can change and that stuff can go

27:52

away.

27:52

We gotta,

27:53

we gotta make sure that we're constantly

27:55

fighting for our privacy rights,

27:56

not taking it for granted.

27:58

So yeah.

27:59

Yeah, exactly.

28:00

I mean, I don't know.

28:02

This story is kind of, uh,

28:05

I don't know.

28:06

I think, yeah,

28:07

there's laws in countries like the

28:08

European union where like, you know,

28:11

access to personal information for

28:15

employment purposes is protected and not

28:19

able to be run through an algorithm or

28:20

whatever.

28:21

Um,

28:22

and there's laws against that in the EU

28:24

and, um,

28:25

I know in Australia, technically,

28:26

that's classified as discrimination.

28:28

So it depends on the country.

28:33

But like Nate said,

28:34

I think it's important.

28:35

We can't just say, oh,

28:39

it's just the US being the US.

28:41

I think we should be constantly vigilant

28:44

of governments that are trying to do this

28:47

stuff.

28:49

Yeah,

28:49

so it is a good point that AI

28:51

will just be fancy autocorrect and picking

28:53

the most likely responses will inherently

28:55

lead to a tyranny of the majority rather

28:57

than a fair system.

28:59

Exactly.

29:00

So it kind of has that effect, right?

29:05

I think, you know, it's more likely to,

29:09

it basically just mirrors the reality,

29:11

right?

29:11

In a lot of cases,

29:12

which the reality is people get

29:14

discriminated against and people are paid

29:16

less depending on their,

29:20

their identity, which is,

29:21

we've done studies on this.

29:23

We know this is the case.

29:25

It's kind of something that we're trying

29:26

to fight against to stop,

29:28

but it's not something we've completely

29:33

solved.

29:33

And yeah, I think, you know, if we,

29:38

if we try and make sure people are

29:40

aware of this,

29:40

I think some of the stupid stuff that

29:42

I've seen is a lot of companies are

29:43

using like

29:45

AI to scan people's resumes when they

29:47

apply and it will like check the keywords

29:49

and stuff and people were just like

29:51

putting invisible keywords on their

29:53

resumes to make it detect them. This

29:56

is incredibly silly stuff. I can't

29:59

believe I have to say this but like

30:01

we need to go back to when humans

30:03

were reading resumes and interviewing

30:06

people and not feeding their information

30:08

through an AI system and

30:10

That's also just terrible for the person's

30:12

privacy.

30:12

Like,

30:13

I don't want everyone to know my

30:15

employment history or where I worked or

30:18

what schools I've been to.

30:22

And, you know,

30:23

that's just another thing that we're

30:25

feeding the AI systems.

30:26

Like really we're putting all this

30:28

information through these massive AI

30:31

companies with, like, no corporate

30:33

control, especially in the US.

30:35

Like there's, I feel like there's,

30:36

it's very lax at the moment because the

30:38

entire economy is basically propped up by

30:40

the AI data center industry.

30:43

It does seem like that is starting to

30:44

fall down a little bit now,

30:46

like with a lot of data centers being

30:47

canceled, but.

30:50

I definitely think the AI hype stuff is

30:53

propping up a lot of stuff.

30:54

And it kind of means that in a lot

30:56

of cases this behavior is allowed when

31:02

it shouldn't be.

31:04

So, yeah.

31:06

That's kind of my thoughts on that.

31:07

Did you have anything more you wanted to

31:09

add, Nate?

31:11

No.

31:13

I think just, yeah, it's such a, like...

31:16

it doesn't matter where you are in terms

31:18

of your economic beliefs.

31:19

Like even if you're a free market person,

31:21

there's always going to be someone who can

31:23

do the work for less.

31:25

And when it's a race to the bottom

31:26

like this, everyone loses.

31:27

I mean, just look at airplanes, right?

31:29

It's, you know,

31:31

even Southwest now is

31:33

doing away with their, like, first come

31:35

first serve seating and like free check

31:37

bag because it's becoming such a

31:39

competitive market.

31:40

And they, I don't know,

31:42

it's such a race to the bottom.

31:44

One of the,

31:46

the headers here that I think I may

31:47

have scrolled past said judging our

31:48

desperation rate, which is again,

31:50

it's just so predatory.

31:52

It's one thing like surge pricing is one

31:55

thing, right?

31:55

Because surge pricing looks at the entire

31:57

market and says using Uber as an example,

31:59

a concert just ended.

32:01

There's a, you know, ten thousand people.

32:03

That's probably too many.

32:04

Five thousand people in this one spot

32:05

trying to get home.

32:06

We're going to charge more.

32:08

Right.

32:08

But this is looking at an individual

32:10

person.

32:11

This is looking at you specifically and

32:13

saying, I know that.

32:15

that you've sent out five hundred resumes

32:17

this week.

32:18

I don't know how you did that.

32:19

You probably used a bot,

32:20

which I wouldn't blame you.

32:21

You sent out five hundred resumes this

32:22

week.

32:23

You've gotten two callbacks and you've got

32:25

a thousand dollars left in your savings

32:27

account.

32:27

You will literally do anything.

32:30

And I'm going to give you the bare

32:31

minimum that I can to make you say

32:32

yes.

32:33

but also not pay you.

32:35

Your other coworkers are making more than

32:37

you do.

32:37

Personal opinion,

32:38

I've always thought it was ridiculous that

32:40

you're not supposed to talk about pay at

32:41

work.

32:42

No,

32:42

I was always happy at my last job

32:44

to tell people how much I was making

32:46

because I wanted everyone else to know.

32:48

I think I've mentioned this in my last

32:49

job.

32:50

I was...

32:51

the highest paid person in our job

32:53

title, just by sheer coincidence and luck. I

32:55

don't know how I got that, and I

32:57

was very open about that, not because I

32:58

wanted to brag to everybody else, but

33:00

because I was like, you guys, I

33:01

don't think I'm better than everyone else.

33:02

You guys deserve to be getting paid more

33:04

too. And I would tell people that kind

33:05

of stuff all the time. So, like, yeah.

33:08

Um, Jonah says five hundred resumes in a

33:11

week sounds like fighting AI with AI. Hey,

33:12

man, you know, that's the

33:14

situation we're in, right? It's AI writing

33:16

emails, AI reading emails, AI responding to

33:18

emails. It's... yeah.

33:20

But I don't know.

33:21

It's yeah,

33:21

I'm kind of going off on a rant,

33:22

but this just makes me so mad because

33:25

it's so, I keep using the word predatory,

33:27

but it removes, it removes everything.

33:30

It removes the hard work.

33:32

It removes the whole like, and again,

33:34

it doesn't matter what your beliefs are.

33:35

Even if you're like a pull yourself up

33:36

by your bootstraps person,

33:37

you can't anymore because it removes that

33:40

possibility because they know exactly how

33:42

much you need and they will never give

33:43

you a penny more.

33:45

It's just, it just frustrates me.

33:47

So sorry,

33:47

I feel really passionately about this

33:49

subject.

33:51

Yeah,

33:51

I think it goes without saying people

33:53

should be paid a dignified amount.

33:57

And yeah, I agree totally.

33:59

Like, I think it is important to me.

34:01

There's a weird culture around not sharing

34:04

how much money you make.

34:05

I'm not really sure what the reasoning is

34:07

behind that,

34:08

but I think it is important to be

34:10

open about that, especially because,

34:12

you know, your employer, well, I mean,

34:14

not every employer,

34:15

but this

34:17

the people that are using this software,

34:19

they are not holding back.

34:20

They are doing everything in their power

34:22

to pay you the least amount.

34:23

So the least you can do is discuss

34:26

this with your coworkers, um, unionize,

34:30

do all those sorts of things.

34:31

Right.

34:31

Like, I dunno, maybe that's, uh,

34:35

that's too much, but I think it is,

34:36

uh, it is important, like another thing.

34:41

Right.

34:41

But I think, you know,

34:42

if your employer is doing this sort of

34:44

stuff to employ people,

34:47

Name and shame, name and shame,

34:49

like seriously,

34:50

like that is the sort of thing that

34:53

people go on strike for.

34:54

So I think if there's companies that are

34:58

doing this,

34:58

definitely try and get them to stop

35:02

because like Nate said,

35:05

it's discriminatory.

35:06

It's like, you know,

35:07

it's removing people's power to control

35:09

things.

35:11

And yeah,

35:12

if you think it's not already happening,

35:13

definitely go read this article because

35:15

they lay out several scenarios they looked

35:17

at where it's like, this is happening,

35:18

like not could happen.

35:19

Like this is happening in like,

35:22

they mentioned, what is it?

35:24

What is it?

35:25

Staffing, gig nurses, again, gig workers,

35:27

DoorDash, Uber,

35:28

like it's already happening there.

35:31

There's no reason it's going to stop.

35:32

And God,

35:34

twice now I've had something pop into my

35:35

head and then I lost it.

35:36

I hate when that happens, but.

35:41

Yeah, it's...

35:43

This is one of those moments where laws...

35:46

I mean...

35:47

I know laws are controversial.

35:48

Like people are always afraid of

35:50

over-regulation and afraid of like,

35:51

you know, Oh,

35:52

companies break laws all the time,

35:53

but like,

35:54

what else can we do about this?

35:55

There is no, I mean, yes,

35:56

we can all take our privacy seriously and

35:57

we should,

35:58

regardless of whether or not this is

35:59

happening, but there's really like,

36:00

I don't see any way to fix this

36:02

other than just straight up outlawing it.

36:03

Like we were talking about earlier,

36:04

this is illegal in a lot of countries.

36:06

It should be illegal here in the

36:07

US. It should be very illegal.

36:08

Everyone should be mad and sending this to

36:10

their politicians and being like,

36:11

we need to outlaw this before it becomes

36:12

a regular practice.

36:13

Cause yeah,

36:15

Oh,

36:15

I remember what I was going to say.

36:16

Because yeah, on that note,

36:17

you can't tell me this is a problem

36:19

that the free market is just going to

36:20

fix.

36:20

Because look at Amazon.

36:22

Everyone knows Amazon,

36:24

especially the Amazon brand,

36:25

is usually cheap garbage.

36:27

And you can't tell me that Amazon became

36:28

the behemoth they are today by putting out

36:31

the best product.

36:31

They did it by undercutting everyone else,

36:33

by knocking off everyone else,

36:35

by using manipulative algorithms to

36:36

prioritize their crap first.

36:38

I guarantee you,

36:38

Amazon doesn't pay for that ad slot at

36:40

the beginning.

36:41

It's just, this is not...

36:43

Yeah, this is a complicated thing to fix.

36:47

And it's not just going to fix itself.

36:49

That's what I'm getting at.

36:50

But anyways,

36:52

I think we beat that to death unless

36:53

you have something else to add there.

36:57

Okay, so in a minute,

37:00

we're gonna talk about how the FBI was

37:02

able to recover Signal messages, sort of,

37:05

from a locally stored database on a phone.

37:08

But first,

37:08

we're gonna talk about some updates about

37:10

what we've been working on at Privacy

37:11

Guides this week.

37:12

And the first thing is,

37:13

we have a new interview coming out on

37:15

Sunday.

37:18

If you guys have not seen that yet,

37:19

it's in the newsletter.

37:20

Go check out privacyguides.org slash

37:22

livestreams.

37:22

It should be there right now.

37:24

That's live streams with an S on the

37:25

end, just by the way.

37:26

And we have an interview with the one

37:28

and only executive director of the EFF,

37:30

Cindy Cohn, will be coming out on Sunday.

37:34

I'm super excited for it.

37:35

I was the one who got to do

37:36

the interview and I'm still excited about

37:38

it.

37:39

It was really cool.

37:40

She was awesome.

37:41

And I think it was a really good

37:42

interview.

37:42

I tried to make sure it was applicable

37:44

to everyone.

37:45

So we talked about how to stay motivated

37:47

in the fight for privacy.

37:49

We talked about how to build a good

37:50

community.

37:51

We talked about what she learned in her

37:53

time fighting with the government and her

37:54

kind of insights on that.

37:56

So really excited for that.

37:58

Make sure you're subscribed on YouTube,

38:00

on PeerTube,

38:01

because we'll be posting it there as well.

38:03

PeerTube, of course,

38:04

does not have the little premiere feature,

38:05

but obviously we do post everything on

38:07

PeerTube as well.

38:07

So

38:09

Make sure to check that out.

38:10

And just to hype you guys up for

38:12

it a little bit, on the nineteenth,

38:13

we're also going to be interviewing

38:15

Carissa Véliz about her upcoming book,

38:18

which is coincidentally about AI and all

38:21

this stuff we just talked about.

38:22

And a lot of the stuff I got,

38:25

you know,

38:25

a lot of the stuff I was saying

38:26

about how this makes it deterministic and

38:27

it removes meritocracy.

38:29

Like,

38:30

these are all things that she talks about

38:32

in her book and in her interview.

38:33

So...

38:34

Yeah.

38:35

And then my last thought is that we

38:36

are working on a video coming up soon

38:38

that many of you have requested.

38:40

And that's all I'm going to say to

38:41

kind of build a little bit of hype

38:42

for that.

38:43

So that is what is going on on

38:44

the video front here.

38:46

And I'm going to turn it over now

38:47

to Jordan to let me know what I

38:49

may have missed.

38:51

No, didn't miss anything.

38:53

But we do have some other things that

38:55

we've been working on more on the site

38:56

update section.

38:58

So there weren't any site updates this

38:59

week, but there was.

39:01

Freya has put out another article here.

39:03

It's about OKCupid settling after selling

39:06

three million photos to a facial

39:08

recognition company.

39:10

Oh,

39:11

now that's probably not what you want to

39:13

hear about your dating app.

39:15

But yeah, if that sounds interesting,

39:16

you can check that out.

39:17

Go to privacyguides.org slash news to

39:20

check it out.

39:22

And we've also been what,

39:24

so our activism lead,

39:27

Em has been working on a section for

39:29

the website,

39:29

which, if you haven't caught it already,

39:32

the Privacy Activist Toolbox has

39:36

been released.

39:37

So you can visit that by going to

39:39

privacyguides.org slash activism.

39:41

There's the Privacy Activist Toolbox,

39:44

which came out a couple of weeks ago.

39:47

And that has a lot of tips about

39:49

how to

39:50

be an effective privacy activist.

39:53

There's a lot of great tips in there,

39:54

but she's also opened this new pull

39:58

request here on GitHub and

40:01

Basically, it's for a DPA directory.

40:04

So there's data protection authorities and

40:06

that's basically the organizations that

40:08

you need to contact to lodge a complaint

40:13

with.

40:14

So this pull request has got basically all

40:17

the regions in the world.

40:18

Well, I mean, I'm not sure if...

40:20

Yeah,

40:20

basically every region that you would

40:22

think.

40:24

I'm sure there could be some that we're

40:26

missing,

40:26

but I think Em has done a really

40:28

good job here and has covered, I think,

40:31

basically all of them.

40:31

But there could be small ones that we

40:33

didn't find.

40:36

So if there is any of those,

40:37

I guess you could take a look at

40:38

the pull request and suggest adding those.

40:41

But so far, what we've got is Africa,

40:44

Asia, Europe, North America, Oceania,

40:47

and South America.

40:49

So basically it will list the privacy law

40:52

in particular, the abbreviation,

40:54

the data protection authority.

40:56

So you can click on that.

40:58

And there's also a contact page link and

41:00

a complaint link.

41:01

So you can basically get directly to the

41:03

page that has the complaint form.

41:06

So basically what we're trying to do with

41:07

this is make it as easy as possible

41:09

for people to make a complaint against a

41:11

company, against the government, because,

41:14

you know,

41:15

this is kind of important to

41:18

utilize.

41:18

Because if you don't use your privacy

41:20

rights, well,

41:24

you're not going to have privacy.

41:25

So if there's companies that are misusing

41:28

your data,

41:29

or if you want to get something deleted,

41:31

I think using this DPA directory is going

41:33

to be really helpful.

41:35

So definitely stay tuned for that.

41:38

I know Jonah said he was taking a

41:39

look at the pull request.

41:41

So I'm sure it'll be released in the

41:42

next couple of

41:44

weeks. So it does look really nice, uh,

41:46

definitely check out the pull request on

41:48

GitHub. There's a preview there, um, but it's

41:51

really well put together, um, so I

41:54

definitely recommend checking that out. Um,

41:57

it has, you know, all the regions that

41:58

you would expect, but if there's any

42:00

regions that we missed, or, you know,

42:02

any that we need to add still, um,

42:04

that you think we might have missed,

42:06

there's just so many countries on earth, um,

42:09

I'm sure we might have missed one

42:10

or two. So if there's anything that you

42:12

would recommend

42:14

adding to that, um,

42:15

if you're from one of those countries,

42:16

definitely do reach out and let us know.

42:18

Um,

42:18

it's kind of going to be a community

42:20

project, I guess.

42:21

Um, if there's, there's a couple of, um,

42:24

countries now that are sort of in the

42:28

process of putting together a data

42:30

protection authority,

42:31

which is really good,

42:32

like Egypt and Mexico.

42:34

So definitely, uh,

42:35

keep an eye on that as well.

42:37

Um,

42:39

There's definitely a lot of important

42:41

information there,

42:42

but also it's sort of a project here

42:45

where we are trying to get community input

42:49

as well,

42:50

because we try and represent every country

42:53

here,

42:53

but I'm sure there's things that change or

42:56

if there's countries that are establishing

42:58

data protection authorities,

43:00

then that's a really positive step for

43:02

people in those countries as well.

43:06

Yep,

43:06

so that's basically everything that I've

43:08

got to talk about here.

43:11

There weren't any articles this week,

43:12

so kind of a light week on that

43:14

side.

43:15

But yeah,

43:16

I guess we can hop right into our

43:18

next story here.

43:21

Oh, actually, before we do that,

43:23

all of this is made possible by our

43:24

supporters,

43:25

and you can sign up for a membership

43:27

or donate at privacyguides.org or pick up

43:29

some swag at shop.privacyguides.org.

43:33

I recently made another purchase on the

43:35

shop.

43:36

There's some really cool new merch that we

43:38

released for the activism section,

43:39

so definitely check that out.

43:42

and have a look if you didn't visit

43:44

it in a while.

43:44

There's some new stuff on there now.

43:46

Privacy Guides is a nonprofit which

43:48

researches and shares privacy-related

43:50

information and facilitates a community on

43:53

our forum and Matrix where people can ask

43:55

questions and get advice about staying

43:57

private online and preserving their

44:00

digital rights.

44:02

Now,

44:02

let's talk about how Little Snitch is

44:06

coming to Linux.

44:09

So kind of a big announcement from Little

44:11

Snitch,

44:11

which has historically been a macOS-only

44:15

app.

44:16

They've now announced that there is a

44:19

version available for Linux.

44:23

So this is kind of reading a little

44:24

bit from their blog post here announcing

44:26

it.

44:27

I guess press release.

44:28

Recent political events have pushed

44:30

governments and organizations to seriously

44:32

question their dependence on foreign

44:34

controlled software.

44:35

The core issue is simple and

44:37

uncomfortable.

44:38

Through automatic updates,

44:39

a vendor can run any code with any

44:41

privileges on your machine at any time.

44:44

Most people know this,

44:45

but prefer not to think about it.

44:47

Linux is the obvious candidate for

44:49

reducing that dependency.

44:51

No single company controls it.

44:53

No single country owns it.

44:54

So I decided to explore it myself.

44:58

And basically the article goes on to say

45:00

that this person was trying to find an

45:02

alternative to little snitch.

45:04

They tried OpenSnitch,

45:06

which has several command line tools and

45:09

stuff like that.

45:10

But basically,

45:12

it doesn't have the same ability to show

45:15

which process is making connections,

45:18

which is basically the way that it works

45:20

on macOS.

45:21

Like any process on macOS,

45:23

you're able to see the connections and

45:25

block them if you don't want them to

45:26

be made.

45:28

As far as I'm aware,

45:31

OpenSnitch is somewhat more

45:32

limited.

45:33

It kind of only does like application

45:35

level.

45:36

I'm not a hundred percent sure because

45:37

it's been quite a few years since I've

45:40

used Open Snitch.

45:42

And it did seem like it was a

45:45

bit more complex to use.

45:47

Like, the interface wasn't

45:50

particularly easy.

45:52

So this person has developed,

45:54

this person at Objective Development has

45:57

created a

46:00

Linux version of Little Snitch. Now, let's

46:03

kind of clear the air

46:06

on how this works. So basically,

46:09

there's another app like Little Snitch on

46:10

macOS, but I can't remember what it's

46:12

called.

46:14

It's called Lulu, and it's by

46:17

another company.

46:18

It's a nonprofit company.

46:21

So definitely, yeah,

46:22

Lulu is the other one you're thinking

46:23

about.

46:24

But yeah,

46:24

this one here is Little Snitch has

46:26

previously been paid software.

46:28

So it's actually kind of surprising that

46:29

this is a free and open source.

46:32

It's licensed under the GPL.

46:34

So it could be quite cool to see

46:36

package managers just adding this by

46:38

default,

46:38

like just adding this as a package.

46:42

But basically, the way that this works is

46:47

it is a browser app, kind of.

46:50

It's like a web app, basically.

46:52

And the reason why they decided to go

46:54

with this is that it can work on

46:58

a server, right?

46:59

Because this is something that you just

47:01

run as a system process.

47:03

For instance, basically,

47:05

that allows you to access the connections

47:07

that the server is making through that

47:09

nice interface.
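To make the "seeing your connections" idea concrete, here is a minimal stdlib-only sketch. This is an illustration, not how Little Snitch for Linux actually works (it uses an eBPF kernel component); the parser below just decodes the kernel's own TCP socket table in /proc/net/tcp, which is the same underlying data that connection monitors surface in a nicer UI.

```python
# Decode the Linux TCP socket table at /proc/net/tcp.
# Addresses there are hex-encoded, little-endian IPv4 plus a hex port,
# e.g. "0100007F:0016" means 127.0.0.1:22.
import socket
import struct

def decode_addr(hex_addr: str) -> str:
    """Turn '0100007F:0016' into '127.0.0.1:22'."""
    ip_hex, port_hex = hex_addr.split(":")
    ip = socket.inet_ntoa(struct.pack("<I", int(ip_hex, 16)))
    return f"{ip}:{int(port_hex, 16)}"

def list_tcp_connections(path: str = "/proc/net/tcp"):
    """Yield (local, remote) address pairs for every IPv4 TCP socket."""
    with open(path) as f:
        next(f)  # skip the header line
        for line in f:
            fields = line.split()
            # fields[1] = local address, fields[2] = remote address
            yield decode_addr(fields[1]), decode_addr(fields[2])
```

Each row in that file also carries a socket inode, which tools like ss and netstat match against processes' open file descriptors to show which process owns a connection; doing that reliably and efficiently is where eBPF-based tools come in.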

47:12

So that's a benefit of it being in

47:14

the browser, right?

47:15

You can access that for a remote computer,

47:18

which is extremely useful.

47:20

I'm not really sure of many solutions that

47:23

do this sort of thing,

47:24

especially that easily.

47:25

You just install a package and it's

47:27

instantly monitoring.

47:28

But kind of scrolling down here,

47:33

this is basically based on a kernel

47:35

component implemented with eBPF.

47:38

And that's an open source component

47:41

which is available.

47:43

So just to be clear,

47:44

the UI is open source and the eBPF

47:47

filtering component is free and open

47:50

source,

47:51

but the

47:52

Basically, the backend,

47:54

which manages rules, block lists,

47:56

and a hierarchical connection view is free

47:59

to use, but not open source.

48:00

So that's basically the reasoning behind

48:02

that is because that part is kind of

48:05

proprietary to Objective Development.

48:07

They've been working on that for like,

48:09

twenty years to perfect it.

48:10

So they argue that that should be

48:14

kept closed.

48:17

And they kind of did an important note

48:20

here.

48:20

Unlike the macOS version,

48:21

Little Snitch for Linux is not a security

48:24

tool.

48:24

eBPF provides limited resources,

48:27

so it's always possible to get around the

48:30

firewall, for instance,

48:31

by flooding tables.

48:32

Its focus is privacy,

48:35

showing you what's going on and,

48:36

where needed,

48:37

blocking connections from legitimate

48:38

software that isn't actively trying to

48:41

evade it.

48:42

So if you install malware,

48:45

little snitch is not going to protect you

48:47

from the connections getting out, right?

48:52

And at least right now,

48:53

there are some limitations.

48:55

I did see some people having issues with

48:57

Fedora Workstation working correctly.

48:59

They do note that on the page,

49:01

on the download page.

49:03

I tested it on Debian and it was

49:06

working perfectly fine for me.

49:08

You just install the package and basically

49:11

it only works on Linux kernel six

49:14

point one two and above.

49:16

So basically the reasoning behind this is

49:19

that older kernels currently have an eBPF

49:23

verifier maximum instruction limit.

49:26

So they kind of have to backport this

49:31

fix.

49:32

Hopefully they can kind of get in contact

49:34

with the Linux kernel developers and do

49:37

that.

49:37

So that is an interesting thing too.

49:40

But I think this is kind of a

49:43

pretty, it's a pretty basic app so far.

49:45

Like it allows you to enable block lists

49:47

and see the connections that your computer

49:49

is making.

49:49

But I think that's really all you really

49:51

need at this point.

49:53

I think just being able to see the

49:54

connections itself,

49:55

because a lot of times you'll be using

49:58

software and you won't realize that it's

49:59

making connections to like ads and stuff

50:03

like that.

50:05

especially if you're using software that

50:07

is genuine, normal software,

50:10

but it's a proprietary app that, you know,

50:12

might have some data tracking built in,

50:14

like Discord or any other of

50:17

those types of apps. Um, I think this

50:19

is an important tool to have on Linux,

50:21

uh, especially because people on Linux

50:25

still need to be able to see the

50:28

connections that are being made. And you

50:30

know, there's still privacy-invasive stuff

50:33

on Linux. Uh, you can install

50:35

Facebook Messenger on Linux,

50:37

you can install Discord on Linux,

50:39

you can install Steam,

50:40

like all these apps are not great for

50:42

your privacy,

50:42

but being able to see some of those

50:44

connections I think is pretty important.

50:49

But yeah,

50:49

we kind of got this little poll up

50:50

on the screen about using Little Snitch;

50:52

you can type one,

50:53

two or three in chat to respond and

50:54

it'll pop up on the screen.

50:56

But Nate,

50:57

did you have any thoughts on this one?

51:00

No,

51:00

I think you kind of you kind of

51:02

answered the question I was going to ask,

51:04

which is, you know,

51:05

Linux is known for being more private.

51:07

So my first thought is kind of like,

51:09

is there a use case for this?

51:11

Why or why would people want to use

51:13

this?

51:13

And you made a really good point.

51:14

You know, one of the this is

51:17

tangentially related, but, you know, a common

51:19

question is like, how do I get people

51:20

to switch to XYZ, Signal, Linux, whatever. And

51:23

one of the things that we, you

51:25

know, myself and Jonah and, um, you know,

51:27

some of us always say is, like, you

51:29

have a lot more luck by focusing

51:31

on the features. And so one thing I

51:33

like to point out: I'm trying to get

51:34

my sister to switch to Linux because she

51:36

was on Windows 10 and it's, you know,

51:37

not getting updates anymore. And, um,

51:40

I'm going to do that with her next

51:41

time I see her in person.

51:43

And one of the things I'm going to

51:43

try to convince her is, you know,

51:44

like everything you do on a windows

51:45

computer,

51:46

you can more or less do on Linux,

51:47

especially for her.

51:48

If, if, you know,

51:49

assuming she's not using any special

51:51

software for her job.

51:53

Um, you know, you can browse the internet,

51:54

you can download discord.

51:55

You can, uh,

51:56

a lot of games are now gaming on

51:57

Linux is doing really well from what I

51:59

hear actually.

52:00

So, um, for the most part, even,

52:02

even some video editing, you know, uh,

52:05

the Linux computer I use runs Qubes.

52:06

So that's a lost cause,

52:08

but for like Fedora,

52:09

you can run DaVinci Resolve on Fedora.

52:10

And I think.

52:11

Um, you know, yeah,

52:13

the point being is,

52:15

Linux itself is relatively

52:17

private,

52:18

but especially once you start adding on a

52:19

lot of these

52:20

features that people might use,

52:21

even at first,

52:22

like if somebody makes a switch to Linux

52:24

and at first they start using, uh,

52:26

Microsoft Office, God forbid,

52:28

or something.

52:28

I don't even know if that's Linux

52:29

compatible, but you know what I mean?

52:30

Like they start off and then after a

52:32

while, they're just like, yeah, you know,

52:34

maybe I'll check out LibreOffice or

52:35

something.

52:35

And, and, you know, it's,

52:37

it's just helpful to kind of have that

52:39

ability to control things.

52:41

It may also let you know,

52:42

like if you fire it up and you're

52:43

like, oh my God,

52:44

this thing is pinging like twenty

52:45

different servers ten times a day.

52:47

Like, hold on,

52:47

let me take a closer look at this

52:48

thing.

52:49

So, yeah, it's pretty cool.

52:51

And I... I don't know.

52:53

I think that's really cool that he made

52:55

this...

52:59

I guess the selfish side of me would

53:00

like to see this come to something like

53:02

Windows,

53:02

even though I know we already have things

53:03

like Portmaster and, what is it,

53:06

simplewall.

53:08

But it's not quite one-to-one.

53:09

So I think anytime we have more,

53:12

more options is always good in my book.

53:13

So I think that's pretty cool.

53:16

Yeah,

53:16

I think I'm also kind of biased on

53:18

this because I use their software.

53:20

I use Little Snitch on Mac.

53:22

I really think it is exceptional software.

53:25

People say to use Lulu because it's free,

53:28

but it really does not do the same

53:30

thing as Little Snitch.

53:31

Little Snitch has a lot of benefits over

53:36

that.

53:37

So it's great to see them kind of

53:39

expanding because...

53:41

people have kind of been complaining about

53:43

little snitch.

53:44

They're like, Oh,

53:45

I wish it was on Windows.

53:46

Like you, like, uh,

53:47

I wish it was on Linux, you know,

53:49

different platforms I think is good.

53:51

I mean,

53:51

I'd like to see it be on Windows

53:53

too, but it may just be, you know,

53:55

I feel like when we talk about these

53:57

filtering software,

54:02

it's extremely specific to the platform.

54:06

Like in this case, it was using eBPF,

54:09

but, you know, on Windows,

54:10

I'm sure they've got some whole other

54:11

system, right?

54:12

So, you know, making that basically,

54:17

I guess,

54:18

compatible with Little Snitch is probably

54:20

a lot of work and, you know,

54:21

they kind of have to port the entire

54:23

thing over.

54:24

which is kind of a pain,

54:26

but I think it's good to see that

54:28

Little Snitch is expanding to other

54:32

platforms, and it's free,

54:33

which I think is very generous.
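[Editor's note: the kind of per-connection visibility discussed here can be approximated on Linux even without eBPF, just by reading the kernel's own connection table. A minimal sketch; the decoding logic below is illustrative and is not taken from Little Snitch itself:]

```python
# Decode one address field from Linux's /proc/net/tcp, where addresses
# appear as little-endian hex, e.g. "0100007F:1F90" for 127.0.0.1:8080.
def decode_proc_net_addr(field: str) -> tuple[str, int]:
    ip_hex, port_hex = field.split(":")
    # Bytes are stored little-endian, so reverse them for dotted-quad order.
    octets = [str(int(ip_hex[i:i + 2], 16)) for i in range(0, 8, 2)][::-1]
    return ".".join(octets), int(port_hex, 16)


def list_tcp_connections(path: str = "/proc/net/tcp"):
    """Yield (local, remote) address pairs for every current TCP socket."""
    with open(path) as f:
        next(f)  # skip the header line
        for line in f:
            fields = line.split()
            yield decode_proc_net_addr(fields[1]), decode_proc_net_addr(fields[2])
```

Mapping each socket back to a process name, which is where an application firewall really adds value, takes extra work (e.g. matching the table's inode column against each process's open file descriptors), which is part of why tools like this are so platform-specific.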

54:37

Yeah, I agree.

54:38

I fully recognize that it's not an easy

54:40

thing to go from platform to platform.

54:41

And I mean, even Linux, right?

54:42

You were saying some people on Fedora are

54:44

having some issues getting it up and

54:45

running.

54:46

And hopefully since it is open source like

54:48

that,

54:48

hopefully people can do what they need to

54:50

do to get it up and running.

54:51

But yeah, it's certainly no small task.

54:53

And I think that is really cool.

54:55

And actually, yeah,

54:56

I could never remember because I'm not a

54:57

Mac user.

54:58

I could never remember if Little Snitch or

54:59

Lulu was the one that was free.

55:01

And I didn't realize Little Snitch was the

55:03

paid one.

55:03

And I think that's really cool that

55:05

this is free.

55:06

So bummer that it's not fully open source,

55:09

but I understand the logic of like,

55:11

you know,

55:11

I've been doing this for years and I

55:13

don't want somebody to... twenty years,

55:15

more than twenty years,

55:16

and the algorithms and concepts are

55:18

something we'd like to keep closed for the

55:19

time being.

55:19

Like, I get it.

55:20

So.

55:21

Yeah, I think also here,

55:23

let's quickly cover this question we got

55:25

from Cass K. So they asked,

55:27

any tips you guys have for people who

55:28

want to start making YouTube content

55:30

related to privacy?

55:31

All right, Nate, what do you got here?

55:35

Yeah,

55:35

so I just want to mention that I

55:38

have my own website called The New Oil,

55:39

which is...

55:43

It's supposed to be a very, very,

55:44

very beginner-level intro to privacy stuff.

55:46

And my hope is that when people finish

55:48

reading that,

55:49

they'll move on to other resources like

55:51

privacy guides.

55:52

But over there,

55:54

I do actually have a quick start guide

55:55

on the front page for content creators.

55:58

And the reason I'm referring you over

55:59

there is because there is a lot of

56:00

different stuff there.

56:01

But it kind of goes over things like...

56:04

it's really, I mean,

56:06

as with everything in privacy, right?

56:07

It's like, what, what do you want?

56:10

What are your, your priorities and stuff?

56:12

So for example, um,

56:13

if you're going to be a Twitch streamer,

56:15

you could totally just use your handle,

56:17

right?

56:17

You know, I mean, like Markiplier,

56:19

that's obviously not his real name.

56:21

It's derived from his real name, I think.

56:23

But, um,

56:25

A lot of YouTubers and stuff,

56:26

they're known by their handles.

56:28

But then if you're going to be a

56:30

public figure, like a politician,

56:32

a lot of them go by nicknames.

56:33

Like Ted Cruz, his first name is Rafael.

56:36

So things like that.

56:37

It's just to keep in mind,

56:40

what are you going for will determine a

56:41

lot of those things.

56:42

But I'm a big fan of things like

56:43

using pseudonyms wherever possible.

56:45

So again, handles, fake names.

56:48

being mindful of your online presence.

56:50

You don't have to plant your flag on

56:51

every single website, but it may not hurt.

56:53

And sometimes you may not need certain

56:54

websites.

56:54

Like I remember way,

56:56

way back in the day,

56:57

there's a band I follow, uh,

56:58

Oh Sleeper actually.

57:00

Um, they only ever had a Twitter account.

57:01

They never signed up for Facebook.

57:03

They never signed up for Instagram.

57:04

Like you could only follow them on

57:05

Twitter,

57:05

which was slightly annoying as someone who

57:07

didn't use Twitter at the time,

57:08

but you know,

57:09

like that's what they wanted to stick to.

57:10

And

57:12

Again, likewise,

57:13

most bands probably don't need Twitch

57:14

unless they're going to do live streaming

57:15

nowadays.

57:15

And maybe a lot of Twitch streamers don't

57:18

need Twitter.

57:18

So just kind of asking those questions,

57:21

but

57:22

Also,

57:22

a lot of the technical tools that we

57:25

recommend to privacy guides as well,

57:26

things like email aliasing,

57:27

password managers,

57:29

basically securing your online accounts.

57:30

Because I've also, again, I've seen bands,

57:32

their Facebook gets hacked and all of a

57:34

sudden they're spamming out like,

57:35

twenty percent off Ray Bans or whatever at

57:37

this sketchy link.

57:39

And again,

57:39

we just talked about this earlier,

57:40

Facebook does not care.

57:42

Unless you're Taylor Swift,

57:43

they don't care.

57:44

Sucks to suck, scrub.

57:45

We're not going to help you get your

57:46

account back.

57:48

I actually read an article

57:49

earlier today about how

57:51

Discord's support system is still the

57:55

dumbest thing that has ever been designed

57:57

by a human being.

57:59

I'm not sure a human being designed it.

58:00

It's so dumb.

58:00

But anyways, yeah.

58:01

So like just, I don't know.

58:04

I think I would check that out and

58:06

definitely bounce any of my

58:07

recommendations off privacy guides because

58:08

I will admit privacy guides has much

58:09

stricter,

58:10

I don't want to say vetting process,

58:14

criteria for a lot of their

58:15

recommendations.

58:16

So I mean,

58:18

I don't know.

58:18

Yeah, I would check both of those out.

58:20

I think they're really good resources

58:22

that'll get you started, hopefully.

58:24

Yeah, I just want to add as well,

58:26

I think there's some parts that kind of

58:28

go... I haven't read your streamer guide,

58:31

so maybe I'm just repeating what's already

58:32

on the page.

58:33

But I think stuff that's kind of important

58:36

to establish early on,

58:38

are you going to show your face?

58:40

I mean, that is kind of important, right?

58:43

I think...

58:45

You can kind of see what I'm doing

58:46

here.

58:47

Like,

58:47

I don't really want to show my face.

58:49

I'm sure someone could find what I look

58:51

like,

58:51

but it's just a layer of privacy on

58:56

that aspect.

58:57

People aren't going to recognize you in

58:59

the street or whatever,

59:00

or people aren't going to be able to

59:02

immediately know who you are.

59:04

So that is a benefit too.

59:06

And I think also thinking about things

59:09

that you share,

59:09

being very mindful of things that you

59:11

share, because, you know,

59:14

you take a picture of the room that

59:17

you're in,

59:17

someone could analyze the texture of the

59:21

roof or something,

59:22

find rental property listings, and

59:26

work out where you're living, or something

59:27

like that.

59:27

You know, people are pretty creepy.

59:30

So I think being aware of

59:32

you know, what you're sharing,

59:34

how it can be used to find you

59:38

being very deliberate about posting stuff.

59:42

But I think it's definitely a personal

59:43

preference whether you want to show your

59:44

face or even show anything about you.

59:48

You can certainly be

59:50

faceless.

59:51

There's plenty of channels that do that

59:53

and pretty successfully, I'd say.

59:55

So, you know,

59:56

I think it's definitely worth thinking

59:58

about at least.

59:58

But I hope those tips were somewhat

1:00:01

helpful.

1:00:03

We try and answer people's questions in

1:00:05

the chat.

1:00:06

There was another question here from

1:00:08

Plants McGee,

1:00:09

which I'm not really sure about what it

1:00:12

means.

1:00:13

So what's the deal with Twitter, Sydney?

1:00:15

They just left.

1:00:16

So what is this referring to exactly?

1:00:19

They're referring to how the EFF just left

1:00:22

Twitter.

1:00:23

Real quick, I do want to say,

1:00:24

I didn't mention the face thing and I

1:00:25

really should add that because that's a

1:00:26

really good point.

1:00:28

I've been just watching random YouTube

1:00:29

videos lately about dinosaurs and space

1:00:32

because I will forever be five years old

1:00:33

at heart.

1:00:34

And yeah, a lot of those...

1:00:35

So it's not just a privacy thing.

1:00:36

A lot of those channels are faceless too.

1:00:39

And I'm actually sitting here thinking,

1:00:40

I'm like, man,

1:00:41

maybe I should do more faceless videos

1:00:42

because I bet they can pump those out

1:00:43

real quick.

1:00:44

But yeah, going back to the question,

1:00:46

the EFF left Twitter.

1:00:48

I...

1:00:50

I have personal opinions,

1:00:51

but all I'm going to say is go

1:00:52

check their blog post.

1:00:53

They laid it out in very plain numbers

1:00:55

where basically they said – you can tell

1:00:57

I agree with their decision.

1:00:58

But basically they said that they're just

1:01:00

– they're not reaching people,

1:01:01

and they only have so many resources,

1:01:04

and they've decided their resources are

1:01:05

better spent elsewhere.

1:01:06

So you can disagree with them.

1:01:08

That's fine.

1:01:09

Free country for now.

1:01:09

You're welcome to do that.

1:01:11

But that's their logic.

1:01:12

So –

1:01:14

I think I'm just going to share my

1:01:15

thoughts on this.

1:01:17

Obviously these, these are my thoughts,

1:01:18

not,

1:01:19

not related to privacy guides as an

1:01:21

organization,

1:01:21

but I think the platform itself has kind

1:01:24

of become pretty toxic.

1:01:27

I think a lot of people are complaining

1:01:28

about it kind of becoming a bit of

1:01:29

a,

1:01:31

an echo chamber for like conservative

1:01:33

voices and stuff.

1:01:35

I think that's not great.

1:01:37

It's definitely,

1:01:37

the platform's definitely changed and

1:01:40

it's,

1:01:41

in a lot of ways became worse.

1:01:43

I think we're seeing more and more people

1:01:44

leaving because, you know,

1:01:46

it is kind of a platform that allows

1:01:51

in a lot of countries,

1:01:52

what I would say is hate speech,

1:01:54

maybe not in the US because the laws

1:01:56

there are a little bit looser, but yeah,

1:01:59

I can kind of understand not wanting to

1:02:02

be on a platform like that.

1:02:03

And I think, you know,

1:02:05

In our case,

1:02:06

Privacy Guides is still on Twitter posting

1:02:07

stuff.

1:02:08

I think we are getting some traction.

1:02:11

So maybe our strategy is different to that

1:02:14

of the EFF,

1:02:16

but we're still getting quite a lot of

1:02:18

traction with that.

1:02:19

I think it's important to reach people no

1:02:21

matter what platform they're on.

1:02:23

So, you know, in a lot of cases,

1:02:26

we're going to be on all these crappy

1:02:27

platforms.

1:02:28

Doesn't mean we support the platform, or we

1:02:31

want to...

1:02:35

It doesn't mean we don't want people to move

1:02:40

to better platforms like Mastodon.

1:02:42

We recommend different ones that people

1:02:45

should move to instead.

1:02:46

But I think you have to meet people

1:02:49

where they are.

1:02:50

And if we just stopped posting on all

1:02:52

these platforms,

1:02:53

we wouldn't be reaching as many people and

1:02:56

converting them to believing different

1:02:59

things,

1:02:59

like that X is a bad platform and

1:03:01

invades your privacy.

1:03:03

Same thing Jonah said here.

1:03:04

It doesn't make a ton of sense to

1:03:06

me to leave X,

1:03:07

but not Facebook or TikTok,

1:03:08

but shrugging emoji.

1:03:11

I think it's definitely like a personal

1:03:13

choice.

1:03:15

If the analytics said that they weren't reaching people,

1:03:17

I mean,

1:03:18

it doesn't really make that much sense in

1:03:19

my opinion,

1:03:20

because

1:03:23

we have all these multi-posting tools.

1:03:25

Like, for instance, our team here, Nate,

1:03:28

me, and Jonah, basically,

1:03:30

a lot of our posting is through Buffer.

1:03:32

And all it is is just ticking another

1:03:34

box to send it to another platform.

1:03:37

Like,

1:03:37

we're not specifically creating anything

1:03:39

for a specific platform.

1:03:41

So, I mean, if those platforms...

1:03:45

I mean, we can debate all day,

1:03:47

like how bad X is as a platform.

1:03:51

We can debate all day how bad Facebook

1:03:53

is and TikTok, but I still think,

1:03:56

you know,

1:03:57

we post on all those platforms because we

1:03:59

wanna be able to reach

1:04:01

these people. Because, you know, everyone

1:04:03

deserves privacy, not just everyone on

1:04:05

Mastodon. And especially because

1:04:08

these people are probably less aware of

1:04:11

the issue; that's why they're on those

1:04:13

platforms in the first place. So I can

1:04:16

kind of understand, from an

1:04:17

ideological perspective, if you really

1:04:19

don't like being on a platform that kind

1:04:21

of amplifies conservative voices. I can

1:04:23

kind of understand why you may not want

1:04:25

to be on a platform like that where

1:04:26

you get harassed. But

1:04:29

I think from an organizational

1:04:31

perspective,

1:04:31

I'm not quite sure if I

1:04:34

agree with this because it is kind of

1:04:37

easy to cross post and I can respect

1:04:42

the decision,

1:04:44

but I'm not sure if I agree with

1:04:45

it particularly.

1:04:47

But yeah, that's kind of my thoughts.

1:04:52

Okay.

1:04:54

I don't really have much to add.

1:04:55

I think that was a good point about

1:04:56

buffer, but I don't know.

1:04:57

I don't, I don't have a,

1:04:59

what do they say?

1:05:00

I don't have a dog in this fight.

1:05:01

So it's kind of a messed up saying

1:05:03

now that I think about it,

1:05:04

don't fight dogs.

1:05:07

On that note,

1:05:07

I think we're going to move into a

1:05:09

story about the FBI extracting a suspect's

1:05:13

deleted Signal messages saved in the

1:05:15

iPhone notification database.

1:05:16

So I'm not going to scroll on this

1:05:17

one too much because this is actually a

1:05:18

paid post.

1:05:20

But this comes from 404 Media.

1:05:22

Highly recommend.

1:05:23

It's totally worth it, in my opinion.

1:05:25

They do great reporting.

1:05:26

But Jonah did say in the chat that

1:05:28

he felt like this is a little bit

1:05:30

of a nothing burger here.

1:05:31

And I...

1:05:34

I halfway agree.

1:05:35

Um, because, you know,

1:05:36

the headline kind of says it all,

1:05:38

but I think it's worth talking about

1:05:40

because it kind of points out,

1:05:42

they said further down in the article that

1:05:43

this, like,

1:05:45

this just kind of amplifies how difficult

1:05:47

it can be.

1:05:48

Actually,

1:05:48

let me see if I can find it

1:05:50

here,

1:05:50

but they basically said it just points out

1:05:53

how difficult it can be to, um,

1:05:55

to think about every possible angle of

1:05:57

your OPSEC,

1:05:59

especially when it really matters this

1:06:00

much, you know?

1:06:01

So like last year in June,

1:06:02

we found out that, well, I mean,

1:06:05

the more technical people already knew this

1:06:06

kind of stuff. But remember,

1:06:07

not everybody is super technical,

1:06:09

but yeah,

1:06:12

We found out last year that because push

1:06:15

notifications are usually not encrypted,

1:06:17

Apple and Google can see them,

1:06:19

which probably was not a shocker to most

1:06:21

people.

1:06:21

But also, basically,

1:06:23

the way that they're registered to make

1:06:24

sure they get to the right device and

1:06:26

stuff...

1:06:32

It's another way that police can get your

1:06:33

data, right?

1:06:34

Police can go to Apple and Google and

1:06:35

they can subpoena you for your data.

1:06:37

And so we're really big fans of services

1:06:40

like Tuta, for example, which

1:06:41

does not rely on Google for push

1:06:43

notifications on Android.

1:06:45

Signal, I think,

1:06:45

also has their own implementation.

1:06:47

Proton does rely on Google,

1:06:49

but they encrypt it.

1:06:50

So there's not really anything useful

1:06:51

there, although there is still metadata,

1:06:53

which is worth noting.
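[Editor's note: the point about encrypted push payloads still leaking metadata can be sketched like this. A toy model only, not the real FCM or APNs message format; the function and field names are made up:]

```python
import base64


def make_push_envelope(device_token: str, ciphertext: bytes, sent_at: str) -> dict:
    # A toy model of what a push relay can observe: routing metadata travels
    # in the clear, while the message body is an opaque blob the relay cannot
    # read (assuming the app encrypted it end-to-end before handing it over).
    return {
        "token": device_token,   # identifies the destination device (visible)
        "sent_at": sent_at,      # timing metadata (visible)
        "payload": base64.b64encode(ciphertext).decode("ascii"),  # opaque
    }
```

Even with an unreadable payload, who received a message and when remains observable by the relay, which is the metadata caveat mentioned above.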

1:06:55

But...

1:06:56

And now we're learning it's still even

1:06:58

more complicated than that, right?

1:07:00

Because basically what happened is they

1:07:03

arrested somebody and they were able to,

1:07:04

you know,

1:07:05

they ran the Cellebrite or whatever,

1:07:06

whichever device it was.

1:07:07

They ran the forensic tools on the

1:07:08

person's phone,

1:07:09

which was an iPhone in this case.

1:07:11

And this person had already deleted

1:07:12

Signal.

1:07:13

I believe they even had disappearing

1:07:16

messages enabled,

1:07:16

but don't quote me on that.

1:07:17

I feel like I read that in this

1:07:18

post somewhere.

1:07:20

And the police were still able to extract

1:07:23

some of the messages because the

1:07:25

notification history was stored on the

1:07:27

device.

1:07:29

And, um,

1:07:31

Yeah,

1:07:33

I did not see that coming personally.

1:07:35

So it is worth noting that because these

1:07:37

are device notifications that we only got,

1:07:41

or they only got, I should say,

1:07:42

half the,

1:07:46

I can't word today and I apologize.

1:07:47

They only got half of the conversation,

1:07:49

right?

1:07:49

They got the incoming stuff.

1:07:51

They did not get everything, of course.

1:07:53

And I personally am a little bit unclear

1:07:55

on exactly how this would work.

1:07:56

Like for example,

1:07:57

how long do these notifications stay

1:07:58

there?

1:08:00

It sounds like this is a

1:08:01

volatile-memory kind of thing, like RAM.

1:08:03

So would rebooting the phone get rid of

1:08:05

them?

1:08:05

I'm assuming not because I think if these

1:08:07

people were smart enough to use Signal and

1:08:08

disappearing messages and to delete

1:08:10

signal,

1:08:10

then they probably rebooted their phone as

1:08:12

well.

1:08:13

Or maybe not since they were able to

1:08:14

forensically examine the phone.

1:08:15

I really can't say for sure.

1:08:16

But

1:08:17

I just, personally,

1:08:20

I have some technical questions like that.

1:08:22

But I think the big reminder here that

1:08:25

I thought was interesting I wanted to talk

1:08:26

about was just the reminder to be mindful

1:08:32

of your notifications.

1:08:33

One thing I really appreciate about

1:08:34

Signal,

1:08:35

and I know other apps do this too

1:08:37

to various extents.

1:08:39

Privacy apps tend to be a lot better

1:08:40

about this, of course,

1:08:41

as opposed to something like Discord.

1:08:42

But Signal lets you get pretty granular in

1:08:45

terms of like,

1:08:46

I can mute this chat.

1:08:47

I can mute this chat for an hour.

1:08:49

I can mute it for a day.

1:08:50

I can mute it indefinitely.

1:08:52

I can select notification content.

1:08:54

It says here that includes name and content,

1:08:57

name only, or no name or content.
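[Editor's note: those detail levels determine what actually gets handed to the OS notification system, and anything handed over may be stored on-device. A hypothetical illustration of the idea; this is not Signal's code, and the names are made up:]

```python
# Illustrative only: how a messenger might redact what it passes to the OS
# notification system, since anything passed along may persist on-device.
NAME_AND_CONTENT = "name_and_content"
NAME_ONLY = "name_only"
NO_NAME_OR_CONTENT = "no_name_or_content"


def render_notification(level: str, sender: str, message: str) -> str:
    if level == NAME_AND_CONTENT:
        return f"{sender}: {message}"      # full preview; all of it may persist
    if level == NAME_ONLY:
        return f"New message from {sender}"  # sender leaks, content does not
    return "New message"                   # nothing sensitive leaves the app
```

Under the most restrictive level, a forensic dump of the notification database would only show that *some* message arrived, not who sent it or what it said.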

1:08:59

And so this is actually what I used

1:09:01

to do ever since I found out about

1:09:02

that story from last year.

1:09:04

I have Signal set to send me just

1:09:05

a notification that says Signal.

1:09:07

And then from there,

1:09:08

I use notification profiles

1:09:11

to kind of manage things.

1:09:12

So like when I was at work in

1:09:14

my last job,

1:09:15

which was a more traditional nine to five

1:09:17

sort of job, I, you know,

1:09:19

in the sense of like,

1:09:19

I can't be on my phone during the

1:09:20

day and stuff like that.

1:09:22

I had a notification profile that would

1:09:24

start at working hours and at end of

1:09:26

day, which was never really end of day.

1:09:27

But at that point I'm like, whatever,

1:09:28

we're staying late.

1:09:29

I don't care.

1:09:30

And during that time,

1:09:31

pretty much the only notification that

1:09:32

would come through would be my wife in

1:09:34

case there was an emergency.

1:09:35

And so at that point,

1:09:37

I don't need the notification content,

1:09:38

right?

1:09:38

Because I know exactly who it is.

1:09:39

It's the only person that the notification

1:09:41

will get through.

1:09:42

And likewise, when I go to sleep,

1:09:44

my wife in case I'm traveling and my

1:09:47

sister in case of emergency.

1:09:48

And I think that's it.

1:09:49

Now, I have one local friend, too,

1:09:50

in case of emergency.

1:09:52

And those are the only people,

1:09:53

even though I try,

1:09:54

lately I've been doing a good job,

1:09:55

but usually I try not to sleep with

1:09:55

the phone in the bedroom.

1:09:56

But you know what I mean?

1:09:57

It's like,

1:09:59

I'm able to craft these very specific

1:10:00

notification profiles.

1:10:01

And even now here at Privacy Guides,

1:10:03

I do have working hours where I'm like

1:10:05

at work and I try to focus.

1:10:06

So I've got everything that isn't, again,

1:10:08

like my wife and my sister and then

1:10:11

all the Privacy Guides people,

1:10:12

they're the only ones that the

1:10:13

notifications go through so that I don't

1:10:15

get distracted by other people.

1:10:16

And yeah.

1:10:18

I think trying to figure out how to

1:10:21

make a device work for you in that

1:10:22

sense, you know, somebody, I think I saw,

1:10:24

I think it was in the privacy guides

1:10:25

forum when people were talking about this

1:10:26

story,

1:10:27

or it may have been in the comments

1:10:29

of this actual article,

1:10:30

but somebody mentioned like changing the

1:10:32

ringtones, you know,

1:10:33

before I started using these notification

1:10:35

profiles,

1:10:36

that was something I did is my wife

1:10:37

had a different ringtone than everyone

1:10:38

else.

1:10:39

So that if I got a, you know,

1:10:41

a notification, I would know, is it her,

1:10:44

do I need to check this or can

1:10:45

I just ignore it?

1:10:46

And so, yeah,

1:10:49

it's definitely something to be mindful

1:10:51

of.

1:10:51

And something else that kind of came up

1:10:53

here,

1:10:53

I'm looking at the comments now on this,

1:10:54

is on iPhones,

1:10:56

if calls are set to show in Recents,

1:10:58

they will show up in the actual phone

1:11:00

app log.

1:11:02

On both iPhone and Android,

1:11:04

there's an Always Relay Calls setting,

1:11:05

so your IP address isn't exposed.

1:11:07

Slight quality trade-off,

1:11:09

but if you're a high-risk person or if

1:11:11

you live in an area where you've always

1:11:12

got a good signal,

1:11:12

it's probably not a big deal.

1:11:14

So just things like that to keep in

1:11:15

mind.

1:11:16

But one last thing I do want to

1:11:18

circle back to when Jonah said, he's like,

1:11:19

this is kind of a nothing burger because

1:11:21

Signal doesn't encrypt their own local

1:11:23

database in the first place.

1:11:24

And I think that's a valid point.

1:11:26

I mean, I love Signal.

1:11:27

I recommend Signal.

1:11:29

It's very user-friendly.

1:11:30

It's very easy.

1:11:31

It's cross-platform.

1:11:32

It's got all kinds of shiny little

1:11:33

features that people enjoy and makes them

1:11:35

more likely to use it.

1:11:36

But I think it is important to note

1:11:39

that there is no perfect tool,

1:11:40

whether that's Signal,

1:11:41

whether that's SimpleX,

1:11:42

whether that's

1:11:43

Um, you know, whatever messenger,

1:11:45

whether that's email,

1:11:46

email itself is incredibly imperfect,

1:11:48

which we explicitly mentioned in

1:11:49

our latest video about email.

1:11:51

So, you know, it's,

1:11:52

it's looking at your threat model.

1:11:54

It's looking at what you need from a

1:11:55

tool and it's understanding what the

1:11:57

limitations are and how to either

1:11:59

eliminate them, mitigate them,

1:12:01

work around them.

1:12:03

Because, I'm not gonna lie,

1:12:03

um,

1:12:05

One of my thoughts I had today when

1:12:06

I was thinking about this story is like,

1:12:08

I can't control other people's phones.

1:12:09

You know,

1:12:09

I have my notifications set not to do

1:12:11

things.

1:12:11

But now when I text people,

1:12:16

my messages become notifications on their device.

1:12:16

And that's just something to be mindful

1:12:17

of.

1:12:17

So yeah, kind of a shorter story,

1:12:22

I think today, but you know,

1:12:23

it's a pretty quick takeaway.

1:12:25

So I don't know if you had any

1:12:26

thoughts on that one that you wanted to

1:12:27

share, Jordan.

1:12:30

Yeah, I mean,

1:12:30

I think you brought up some great points

1:12:32

there.

1:12:32

Like, you know,

1:12:33

there's ways to at least somewhat protect

1:12:36

this on your end.

1:12:37

Like you said,

1:12:38

the settings in Signal itself.

1:12:40

I almost wonder though,

1:12:43

you could almost disable notifications

1:12:46

just in general, you know,

1:12:47

no notifications.

1:12:49

And then, you know,

1:12:49

there wouldn't be a database where

1:12:51

anything would be stored in this case.

1:12:54

But, you know,

1:12:55

maybe that's kind of problematic for

1:12:57

people to do.

1:12:58

But there's also the case where, you

1:13:01

know, at least on Android, there is

1:13:03

the ability to enable notification history,

1:13:06

so it can save notifications. That's off by

1:13:10

default, but it's another thing to check,

1:13:12

to see if you have that enabled. Definitely

1:13:14

disable that. I don't think that's

1:13:16

necessary to enable, because the

1:13:20

whole point of notifications is they're

1:13:22

kind of ephemeral; they're there on the

1:13:23

screen. I kind of assumed, though,

1:13:26

that's when you dismiss the notification,

1:13:28

it's gone, right?

1:13:30

I didn't think that they would be saved

1:13:33

on your device in a database.

1:13:35

So this might be something for Apple to

1:13:36

actually fix because I feel like that is

1:13:40

a bit of a concern.

1:13:43

And anonymous three-four-four

1:13:44

just put in the chat: threat model,

1:13:46

threat model, threat model.

1:13:47

Exactly.

1:13:47

Like if,

1:13:48

if your concern is your device being

1:13:50

seized and you want to protect the data

1:13:52

on it, um, you know, use Molly,

1:13:55

have an encrypted database in Signal.

1:13:57

Um,

1:13:59

don't use notifications cause they can be

1:14:01

accessed.

1:14:02

Right.

1:14:02

Um, I think it's kind of hard though,

1:14:05

especially because, uh,

1:14:06

in this story in particular,

1:14:07

I think it was,

1:14:10

They said the case was the first time

1:14:12

authorities charged people for alleged

1:14:14

Antifa activities after President Trump

1:14:17

designated the umbrella term a terrorist

1:14:20

organization.

1:14:20

So I don't know,

1:14:22

this is kind of a very nebulous thing

1:14:23

going on in the US.

1:14:24

Like what is Antifa?

1:14:26

Like it's not really an organization.

1:14:28

It's kind of a bit silly that they're

1:14:30

calling it that.

1:14:31

But I think people should be, I

1:14:36

guess, more vigilant than usual, because you

1:14:38

know, you never know if your

1:14:40

activities are going to be considered

1:14:42

Antifa and then your devices might be

1:14:45

seized. So it could be worth thinking, if

1:14:46

you know you might be a target of

1:14:48

this sort of thing, a little bit more

1:14:50

thoroughly, because it does seem like the

1:14:53

government is cracking down on political

1:14:56

behavior. I can't really read the full

1:14:58

article here, so I'm not really sure. It

1:15:02

says here that it was, uh, people present...

1:15:02

It said the case involved a group of

1:15:03

people setting off fireworks and

1:15:05

vandalizing property at a ICE detention

1:15:07

facility.

1:15:09

Yeah, in Texas,

1:15:11

and one person shooting a police officer

1:15:13

in the neck.

1:15:13

So I'm not going to comment on whether

1:15:16

they were guilty or not,

1:15:18

but that's what the FBI is finding.

1:15:19

Yeah,

1:15:19

I think that's up to the courts to

1:15:20

decide.

1:15:21

But I think the thing is, right,

1:15:23

you know,

1:15:24

if you're doing any sort of

1:15:27

political action.

1:15:28

I'm not saying, you know,

1:15:29

you should go out and vandalize an ICE

1:15:31

facility, but, you know,

1:15:32

I'm saying like any sort of political

1:15:33

action,

1:15:33

whether that's peaceful protest or,

1:15:36

you know, marching through the streets,

1:15:38

you know,

1:15:39

I think it's important to think about ways

1:15:41

that your data could be extracted and used

1:15:45

against you.

1:15:46

I think in this case, you know,

1:15:47

it's definitely,

1:15:49

everyone deserves privacy,

1:15:51

even if these people were

1:15:53

vandalizing something. Um, so it's kind of

1:15:56

unfortunate that, uh, the iPhone was

1:16:01

kind of a bit, uh, vulnerable to this

1:16:05

extraction method. I also do wonder if

1:16:08

Lockdown Mode could have prevented this, um,

1:16:11

because I think Lockdown Mode does prevent

1:16:14

a lot of these, uh,

1:16:16

like, tools that the FBI uses, like

1:16:20

forensic extraction tools.

1:16:22

Um,

1:16:22

I think it would have been

1:16:23

interesting to hear, um,

1:16:26

if that was the case or not,

1:16:27

I'm going to assume no, but you know,

1:16:29

I feel like anyone who's going to any,

1:16:32

uh, political action,

1:16:34

like just enable lockdown mode.

1:16:35

It's like the least you can do.

1:16:36

It's, it's, it's a basic thing.

1:16:39

It's just a switch you turn on.

1:16:40

Um,

1:16:42

but I do think it is, uh,

1:16:45

calling out Antifa is big

1:16:47

"the hacker known as 4chan" energy.

1:16:50

Yeah.

1:16:50

It's just like, it's, it's just,

1:16:52

it just shows that the government doesn't

1:16:54

really know about like any sort of,

1:16:58

any sort of political organizations.

1:17:00

I think it's a lot easier to, uh,

1:17:04

it's a lot easier to lump a whole

1:17:07

bunch of people together and sort of say

1:17:09

this nebulous thing is bad.

1:17:12

Um, than, you know,

1:17:14

having any sort of specifics like, yeah,

1:17:17

these,

1:17:17

these protesters at this specific ICE

1:17:19

facility, um, who... That's why, I mean,

1:17:24

we don't really know if they're part of

1:17:25

a group or anything, but, um,

1:17:27

they could have just been acting

1:17:28

independently.

1:17:29

Um, I think it's just,

1:17:31

it's kind of ridiculous that we're,

1:17:33

grouping it all together like that.

1:17:35

But yeah,

1:17:36

I don't really have more to add than

1:17:37

that.

1:17:41

Yeah.

1:17:43

The only thing I wanted to comment on

1:17:44

is you said that it would be best

1:17:45

to turn off notifications altogether,

1:17:47

which yeah, I mean, if it's really,

1:17:49

really, it's all about threat model,

1:17:50

right?

1:17:51

Like that wouldn't be feasible for me in

1:17:52

my day to day for sure.

1:17:53

But yeah, if you're,

1:17:55

especially if you're doing something

1:17:58

sensitive,

1:17:58

whether that's simply protesting or

1:18:00

whether you're taking it further,

1:18:01

which I'm not advocating for violence or

1:18:02

breaking any laws, but I'm just saying,

1:18:05

you know, threat model.

1:18:06

Yeah.

1:18:06

You maybe should bring a separate device.

1:18:08

You maybe should turn off notifications.

1:18:09

It's, it's tricky, but it's also just,

1:18:15

yeah, I don't know.

1:18:16

I wonder how

1:18:18

I personally just wonder how well-known

1:18:20

this kind of vulnerability is.

1:18:21

Is this the kind of thing that technical

1:18:23

people would look at and be like, yeah,

1:18:25

obviously, iPhones are keeping a...

1:18:27

Because it sounded like Jonah...

1:18:29

Full disclosure,

1:18:30

Jonah's the person I usually go to with

1:18:32

deep technical questions because he's

1:18:33

really smart about this kind of stuff.

1:18:35

And when I was asking him questions about

1:18:36

this earlier this week, I was like...

1:18:38

And I asked him those questions about

1:18:40

would the phone restart?

1:18:41

Would this, that, and the other...

1:18:42

Or excuse me,

1:18:43

would a restart clear out the...

1:18:45

the cache or the database.

1:18:47

And he didn't really know either.

1:18:49

And he said the same thing that you

1:18:50

said, where he's like,

1:18:51

I always thought that when you swiped a

1:18:53

notification, it was gone.

1:18:54

Is this like, does you,

1:18:56

do you have to handle it differently

1:18:58

somehow?

1:18:58

And so I,

1:19:00

where I'm going with that is like, he's,

1:19:02

he's really smart in my opinion,

1:19:03

not to like, you know, but, uh,

1:19:06

and if he doesn't even know this stuff,

1:19:07

it's like, how many people do know this?

1:19:08

Like, this is not common knowledge.

1:19:10

And, um, I think we did find a,

1:19:12

uh,

1:19:14

some court cases. I mean, this was

1:19:15

like a real quick web search, we're

1:19:17

not lawyers obviously, but we did find

1:19:19

some cases where the judge kind of threw

1:19:21

out certain evidence, because they're like,

1:19:23

how would a person be expected to even

1:19:25

know that they were leaving evidence? Which,

1:19:27

I mean, that's not exactly what his

1:19:29

argument was, but you know, it's like,

1:19:30

there's a certain level of just, like,

1:19:32

it's insane that you found this, and I'm

1:19:34

not going to allow this in court. And

1:19:36

I kind of wonder how this

1:19:39

would qualify as well. How well known

1:19:41

is this kind of a thing? And

1:19:44

Yeah, super crazy.

1:19:45

But I think this also brings up like

1:19:47

the other concern with iOS, right?

1:19:49

Because iOS is a closed system, right?

1:19:52

We don't have access to the source code.

1:19:54

We don't know that there's a database

1:19:56

storing people's notifications.

1:19:58

Like we're up to iOS now.

1:20:00

Like there's been versions of

1:20:02

Well, not twenty six,

1:20:03

but I guess nineteen.

1:20:05

Nineteen updates.

1:20:07

And we still haven't seen this be an

1:20:09

issue before.

1:20:11

So I think that's the benefit of open

1:20:12

source operating systems like, you know,

1:20:14

GrapheneOS.

1:20:15

We know there's no notification database.

1:20:17

We know that those messages aren't being

1:20:19

stored after they're dismissed.

1:20:21

Whereas with iOS,

1:20:22

it's sort of a black box.

1:20:24

We can protect as much as we can

1:20:27

from this sort of thing. Like, we can

1:20:29

make assumptions that things are done a

1:20:32

certain way, but I feel like making

1:20:33

assumptions is kind of risky, because we

1:20:35

don't have clear evidence of what these

1:20:39

systems actually do. So I think that's

1:20:42

where open source operating systems are

1:20:44

definitely going to beat out this sort of

1:20:46

thing. Jonah says he was about to type what

1:20:48

Jordan is saying. Yeah, um, so I guess,

1:20:51

uh, we kind of had the same thought

1:20:52

on this. Um, but

1:20:54

I think it's definitely, you know,

1:20:58

I think one thing to think about as

1:20:59

well is, you know,

1:21:01

there's certain apps that need

1:21:02

notifications and there's certain apps

1:21:04

that don't.

1:21:05

Like we can definitely try and reduce

1:21:07

notifications

1:21:08

the amount of stuff sending notifications,

1:21:10

because as far as I'm aware on like

1:21:12

Google and Apple devices,

1:21:14

there's the stuff that's sent,

1:21:16

like the way that notifications work is it

1:21:18

sent through, you know,

1:21:19

Google Firebase or it's sent through Apple's

1:21:22

service.

1:21:23

And we've already had a story previously

1:21:25

where, you know, someone's,

1:21:26

notifications were able to be subpoenaed

1:21:29

from Apple and Google and get access to

1:21:33

the notification content.

1:21:34

And there's possibly sensitive information

1:21:37

there.

1:21:38

We did end up finding out, though,

1:21:39

that in a lot of cases,

1:21:41

a lot of these apps that are privacy

1:21:42

focused,

1:21:43

they actually encrypt the notification

1:21:45

content.

1:21:46

So Apple and Google will get notified when

1:21:51

a notification is arriving, but they won't have

1:21:52

any insight into what it actually is. So,

1:21:56

you know, I think notifications are kind of

1:21:58

one of those sort of, uh, they have

1:22:02

a lot of, like, footguns, I guess.

1:22:04

They kind of are, like, a bit of

1:22:06

a, uh, dangerous thing, uh, that we need

1:22:10

to consider. I think bringing the story up,

1:22:12

like, you know, brought this to the

1:22:14

forefront again. Um, I think

1:22:17

Obviously,

1:22:18

there's people that need to disable

1:22:20

notifications,

1:22:21

but I think it's good to at least

1:22:24

consider this as a threat,

1:22:26

because I guess people haven't really been

1:22:28

doing that.

1:22:31

Yeah, and not to speculate too much,

1:22:34

because you literally just said, like,

1:22:35

you know,

1:22:35

all we can do is speculate sometimes,

1:22:36

but I wonder if, because you pointed out,

1:22:39

it's like

1:22:41

there's been so many versions of iPhone

1:22:43

and we're just now learning this.

1:22:45

And part of me just wonders,

1:22:47

has it always,

1:22:47

and this is just me thinking out loud,

1:22:49

obviously,

1:22:49

I know none of us have answers to

1:22:51

this, but like,

1:22:51

has it always been doing this or is

1:22:53

this a new feature?

1:22:54

Because my thought process is,

1:22:56

if it's always been doing this,

1:22:58

I think that kind of highlights the arms

1:23:03

race nature of privacy and security where,

1:23:05

you know, before it was like,

1:23:09

I don't know,

1:23:09

just to pull random examples out of thin

1:23:12

air that may not fit because I'm making

1:23:13

this up as I go along.

1:23:15

But before,

1:23:15

the police would walk by a building,

1:23:17

look in the window, and go, oh,

1:23:18

there's my evidence.

1:23:20

And now they have to go deep into

1:23:22

the building, into the bank vault.

1:23:24

So it just kind of makes me wonder,

1:23:26

is this some new thing?

1:23:27

Or have they just never used it before

1:23:29

because they've never needed to try so

1:23:31

hard to find evidence before,

1:23:32

which would...

1:23:34

If that is true,

1:23:34

then that would just show how much more

1:23:36

secure everything is getting.

1:23:37

But again, we don't know.

1:23:38

We'll never know, probably.

1:23:40

Just a random thought that I had.

1:23:45

I think that's all I've got on that

1:23:46

story, personally.

1:23:50

Alrighty then,

1:23:51

I guess that's kind of a little bit

1:23:54

of time now to move into the forum

1:23:57

updates this week.

1:23:58

So in a minute,

1:23:58

we'll start taking viewer questions.

1:24:00

So if you've been holding onto any

1:24:02

questions about any of the stories we've

1:24:04

talked about so far,

1:24:05

go ahead and start leaving them on our

1:24:07

forum thread or in the chat on the

1:24:11

live stream. And just so you know, if

1:24:12

you're watching this and you

1:24:14

don't have an account on one of those

1:24:15

platforms, we do stream on StreamYard, so

1:24:18

check out the forum post, and there's a

1:24:19

link there. You can join without an email,

1:24:22

just a name, and you can ask a

1:24:24

question. But for now, let's stick

1:24:29

to our community forum.

1:24:30

And there's always a lot of activity

1:24:31

there,

1:24:32

but here's a few of this week's most

1:24:34

interesting discussions happening there.

1:24:37

So there was this thread that Nate linked

1:24:39

here, and this one is about Wisconsin.

1:24:42

So Wisconsinites can keep watching porn

1:24:46

after governor vetoes age verification

1:24:50

bill.

1:24:52

So I guess I'm going to kind of

1:24:53

throw this to you, Nate,

1:24:53

because I feel like you have quite a

1:24:55

lot of thoughts on this one.

1:24:58

Yeah.

1:24:59

Um,

1:25:00

so I originally thought this was good

1:25:02

news, not just from the porn angle.

1:25:03

I think that's just 404 Media being

1:25:04

clickbaity.

1:25:06

Um, which I say that with love.

1:25:07

I mean, it's to me,

1:25:08

it's only clickbait if you don't deliver

1:25:10

on the promise and you know,

1:25:12

it's everybody's trying to stand out.

1:25:13

Right.

1:25:13

But anyways, um, yeah, so we've been, uh,

1:25:19

you know, privacy is an uphill battle.

1:25:22

I think we all know that.

1:25:23

And I think that, um,

1:25:25

It can be really depressing because I

1:25:26

think if we're being honest,

1:25:27

we generally tend to lose more than we

1:25:29

win,

1:25:30

which I don't think it's a lost cause.

1:25:32

I think...

1:25:33

Especially, I think, as things get worse,

1:25:35

people are going to start realizing the

1:25:36

value of their privacy,

1:25:37

and hopefully we can start to reverse that

1:25:39

trend a little bit.

1:25:40

But a lot of the time,

1:25:41

we do take some pretty severe losses,

1:25:43

and so it's important to celebrate the

1:25:45

wins,

1:25:46

which I'll get to why this is a

1:25:48

bit of a mixed bag in a minute.

1:25:49

But for now,

1:25:50

I do want to celebrate the good sides,

1:25:52

which is that the governor rejected this.

1:25:54

This was an age verification bill.

1:25:56

Uh, Assembly Bill 105,

1:25:58

which would have, for sites with more than

1:25:59

one third of material harmful to minors,

1:26:02

uh,

1:26:02

defined as depictions of actual or

1:26:04

simulated sexual acts or body parts

1:26:06

included, including blah, blah, blah,

1:26:07

blah, blah.

1:26:08

Um, female nipples,

1:26:10

not male nipples as always, but whatever,

1:26:11

that's a rant for a different time.

1:26:13

Anyways.

1:26:13

Um,

1:26:14

It would have required using any

1:26:16

commercially reasonable method that uses

1:26:17

public or private transactional data

1:26:19

gathered about the individual.

1:26:20

And the article says this means uploading

1:26:22

an ID,

1:26:22

showing their face for a biometric scan,

1:26:23

uploading credit card information,

1:26:25

or a combination of these.

1:26:26

And the governor vetoed this bill and

1:26:28

said,

1:26:29

I am vetoing this bill in its entirety

1:26:30

because I object to this bill's intrusion

1:26:32

into the personal privacy of Wisconsin

1:26:33

residents.

1:26:34

While I agree that we should protect

1:26:35

children from harmful material,

1:26:36

this bill imposes an intrusive burden on

1:26:38

adults who are trying to access

1:26:39

constitutionally protected materials.

1:26:42

Um,

1:26:43

Evers wrote that the bill doesn't prevent

1:26:45

platforms from giving collected personal

1:26:47

data to third parties,

1:26:48

such as the government or data brokers.

1:26:49

And he wrote,

1:26:50

this is a violation of personal privacy.

1:26:52

Additionally,

1:26:52

I'm concerned about data security and the

1:26:54

potential for misuse of personally

1:26:56

identifiable information that could be

1:26:58

intercepted by or transmitted to a third

1:26:59

party used for the basis of blackmail or

1:27:01

identity theft.

1:27:03

Further,

1:27:04

although the bill includes penalties for a

1:27:05

business entity who violates the

1:27:06

prohibition of retention of personal data,

1:27:08

those penalties cannot undo the harm.

1:27:09

So all really,

1:27:11

really good stuff that I was super stoked

1:27:14

to see.

1:27:15

Unfortunately,

1:27:17

I think here in the comment section is

1:27:22

kind of where it went wrong is some

1:27:24

people pointed out,

1:27:25

and I don't know if I missed this

1:27:27

in the original article because I'm still

1:27:29

not seeing it here either.

1:27:31

Um,

1:27:32

maybe it's like in a different statement

1:27:33

that he gave, or,

1:27:34

like the rest of the statement,

1:27:35

but some people quoted that, uh,

1:27:37

the governor wants device level age

1:27:38

verification.

1:27:40

And apparently this is a quote from him.

1:27:41

Uh,

1:27:41

we can and should work to prevent minors

1:27:43

from accessing adult content.

1:27:44

Um,

1:27:45

but there are better solutions than the

1:27:46

one offered by this bill.

1:27:47

For example,

1:27:48

we can work with tech companies to

1:27:49

implement device-based age

1:27:51

verification that takes place on a user's

1:27:52

phone or computer,

1:27:53

which can be a more secure and effective

1:27:55

method.

1:27:55

Other States have been moving toward

1:27:56

device-based solutions and major tech

1:27:57

companies are adopting these options as

1:27:59

well.

1:27:59

So yeah, it's, I don't know.

1:28:03

I don't want to get into the age

1:28:04

verification debate.

1:28:05

Cause we've, I still am fresh from like,

1:28:08

there was,

1:28:08

there was like a three or four week

1:28:09

run where we talked about it every single

1:28:11

week.

1:28:12

And I still don't feel like we have

1:28:13

anything new to add to that,

1:28:14

or at least I certainly don't.

1:28:16

Um, you can feel free to,

1:28:18

to chime in if you have more to

1:28:19

add to that.

1:28:20

But, um,

1:28:22

I don't know.

1:28:23

I have mixed feelings about device based

1:28:24

stuff, but I certainly see the drawback.

1:28:27

And anyways,

1:28:28

I guess this one's a mixed bag,

1:28:29

but I want to celebrate the win

1:28:32

that he did veto it.

1:28:33

And I certainly agree that this would have

1:28:35

been way worse than device based.

1:28:37

I'm definitely I definitely know the

1:28:39

problems with device based.

1:28:40

I'm not saying I'm in favor of it.

1:28:42

But there's a difference between getting a

1:28:44

paper cut and getting your finger chopped

1:28:46

off.

1:28:46

And I think this would have been getting

1:28:47

your finger chopped off.

1:28:48

So I think that's good that we avoided

1:28:50

a much worse fate.

1:28:51

I hope he doesn't go for the device-based

1:28:53

stuff.

1:28:54

That's kind of the drawback.

1:28:54

But yeah, anyways,

1:28:56

I'm going in circles now.

1:28:57

I just wanted to celebrate a small,

1:28:59

even if it's a mixed bag,

1:29:00

we did have a small win this week

1:29:01

that I thought was pretty cool.

1:29:04

Yeah, that's good to hear.

1:29:06

I mean, one thing that I kind of,

1:29:08

it kind of bounces off the issue that

1:29:10

we talked about in the highlight story is,

1:29:12

you know,

1:29:13

I think we should be against all these

1:29:15

sort of centralized things, right?

1:29:18

Like this is centralization.

1:29:21

Again, like we talked about app stores,

1:29:22

right?

1:29:23

Three app stores, or I guess really two,

1:29:26

there's like two big ones,

1:29:27

Google Play and Apple's app store, right?

1:29:30

And

1:29:32

I think this is just like sort of

1:29:33

reinforcing why these platforms are bad.

1:29:36

Like we talked about at the start,

1:29:37

like having these platforms decide what is

1:29:40

allowed, what is not allowed.

1:29:42

This is just a bad idea.

1:29:46

This bill in particular,

1:29:47

it sounds to most people,

1:29:50

I think it would sound reasonable,

1:29:52

but the issue is where, you know,

1:29:55

there's more, there's things where like,

1:29:57

you know,

1:29:58

what's classified as adult content, right?

1:30:00

Like that's the stuff that they mentioned

1:30:02

sounded reasonable, I guess if you're,

1:30:06

I mean, I guess, uh, but you know,

1:30:09

there's all sorts of issues when it starts

1:30:11

covering more stuff that isn't technically

1:30:14

adult content.

1:30:14

That's restricting people from accessing

1:30:17

those, uh,

1:30:19

applications unless they verify their

1:30:21

identity.

1:30:23

Um, so I dunno,

1:30:24

this is definitely sort of,

1:30:26

I guess it's somewhat of a win.

1:30:27

It's not like a.

1:30:32

The bill got taken down, I guess,

1:30:34

but there's still the chance that

1:30:35

device-based age verification might make

1:30:38

its way through,

1:30:38

like we've been seeing with the app store

1:30:42

transparency stuff.

1:30:45

I think that would definitely be

1:30:48

probably in a lot of cases worse, because,

1:30:51

you know, doing this on a device level

1:30:53

is a lot more invasive, and it gives

1:30:56

a lot more control to these tech companies.

1:30:58

So I'm certainly against both. I think... I

1:31:03

can't believe I keep having to say this,

1:31:04

but, like, you know, we have parental

1:31:07

controls. Like, we have all these amazing

1:31:09

tools that people have access to. I don't

1:31:11

think the government needs to get, uh,

1:31:15

involved on people's devices so much,

1:31:18

right?

1:31:19

Maybe people have different opinions.

1:31:20

There's a way to do this privately.

1:31:22

I just think there's always leaks, right?

1:31:26

Like this article in particular, actually,

1:31:29

it mentions, like if you scroll down,

1:31:33

there's a section in there where they talk

1:31:34

about the Discord age verification stuff

1:31:37

where the ID data and selfies people

1:31:39

were having to send were exposed in

1:31:43

a security breach, right?

1:31:44

Like

1:31:46

Even though we think all of these services

1:31:48

are done properly,

1:31:50

the age verification systems,

1:31:52

a lot of times security is just not

1:31:54

the priority.

1:31:55

So that's sort of my thoughts on this

1:31:57

story, I guess.

1:32:00

And even if they were,

1:32:00

just to add on to that last part

1:32:02

you said, this week alone,

1:32:04

I've covered three or four stories about

1:32:05

insider threats and people...

1:32:07

abusing their access to a system to get

1:32:10

into...

1:32:11

I covered one.

1:32:12

I need to add these to my website,

1:32:13

actually.

1:32:13

I covered one about a police officer who

1:32:16

was using DMV photos to make AI nudes

1:32:18

of women, not even making that up.

1:32:20

And then I found another one.

1:32:22

I wasn't even looking for this one.

1:32:23

This one didn't even come across my

1:32:25

newsfeed.

1:32:25

I was web searching for something else and

1:32:28

it magically showed up in the search

1:32:29

results.

1:32:30

There was a dude at Facebook who was

1:32:32

giving himself access to over thirty

1:32:34

thousand private photos.

1:32:36

So, yeah,

1:32:37

even if these systems are made correctly,

1:32:39

quote-unquote correctly,

1:32:40

I'm not going to count Facebook as

1:32:41

correctly because they've had more data

1:32:43

breaches than there are grains of sand on

1:32:45

Earth.

1:32:46

But even if these things are made

1:32:48

correctly, they're still insider threats,

1:32:50

and it's just –

1:32:52

Yeah,

1:32:52

I think I'm kind of coming around to

1:32:53

you because there's so many ways to solve

1:32:55

a problem, right?

1:32:56

And there's the technical solutions,

1:32:58

there's legal solutions,

1:32:59

but then there's like educational

1:33:00

solutions,

1:33:01

which I think is kind of like where

1:33:02

things like privacy guides come in and

1:33:03

this this podcast and stuff.

1:33:06

And I think

1:33:08

From what I'm seeing,

1:33:09

I think this is probably largely an

1:33:11

educational problem because I feel like I

1:33:13

mentioned the example of, again,

1:33:14

my sister didn't even know that iPhones

1:33:16

have parental controls.

1:33:18

And granted, her kid is really young.

1:33:19

She doesn't have to worry about that yet.

1:33:20

He never has.

1:33:22

He doesn't have his own phone or tablet

1:33:23

or anything.

1:33:25

Um,

1:33:25

so she's not at that point where she

1:33:27

has to worry about it,

1:33:27

but like how many parents know these

1:33:29

things exist?

1:33:30

How many parents know what they're capable

1:33:31

of?

1:33:31

I've heard,

1:33:33

I never used them cause I don't have

1:33:33

kids.

1:33:34

I've heard that some of these parental

1:33:35

controls are actually really good,

1:33:37

but how many of them, you know,

1:33:38

just don't know they're there or they'll

1:33:40

pay for some garbage third party thing.

1:33:42

That's going to be selling their kids

1:33:43

data.

1:33:44

Cough cough, Life360.

1:33:46

Because, you know, again,

1:33:47

like especially my my generation,

1:33:48

we came up in an era where like

1:33:50

Windows security was garbage.

1:33:51

Nobody trusted Windows firewall.

1:33:53

Of course,

1:33:53

you had to pay for a third party

1:33:55

antivirus.

1:33:55

And that's just not true anymore.

1:33:57

And I just I wonder how many people

1:33:58

even know that.

1:33:59

So, yeah,

1:33:59

I think I am kind of starting to

1:34:01

lean more towards the side of like I

1:34:02

think this is largely an educational problem, or at

1:34:04

very least we need to start with the

1:34:06

educational aspect.

1:34:08

And then if we get everybody up to

1:34:09

speed and find out that there's cracks,

1:34:10

then maybe we start talking about how do

1:34:11

we fix this?

1:34:12

But

1:34:13

Yeah,

1:34:13

this definitely feels like an

1:34:14

oversimplified... I mean,

1:34:17

I've known that from the start.

1:34:18

But yeah,

1:34:19

age verification is just an overly simple

1:34:21

solution.

1:34:23

I think I want to add a little

1:34:24

bit extra onto what you're saying there

1:34:25

about how we should...

1:34:29

teach adults about these features.

1:34:32

We have gotten to a point, right,

1:34:34

where it is like,

1:34:36

I almost feel like people don't have an

1:34:37

excuse because if you buy a new Google

1:34:40

device, if you buy a new Apple device,

1:34:42

if you buy a new Windows device,

1:34:44

in the setup process,

1:34:46

it literally asks you,

1:34:48

is this device for a child?

1:34:51

Um, like it is kind of like,

1:34:54

I feel like we've gotten to a point,

1:34:56

right?

1:34:56

Where like, maybe,

1:34:58

maybe the device has to come with like

1:35:00

a red sheet of paper or something that

1:35:01

says, if this device is for a child,

1:35:04

please set it up during the setup process.

1:35:06

Like how much more obvious can we get?

1:35:08

Like, you know what I mean?

1:35:09

But actually I do want to push back

1:35:11

on that a little bit.

1:35:12

'Cause it doesn't do that here in the

1:35:13

US. Um,

1:35:15

but I think that would be a good

1:35:16

idea if it did that in the

1:35:18

US, because I don't see any reason it

1:35:19

shouldn't.

1:35:20

No.

1:35:21

It doesn't ask?

1:35:23

I haven't seen a single one in the

1:35:24

US.

1:35:24

And I mean, granted,

1:35:25

it's been a while since I've set up

1:35:26

anything that wasn't... No,

1:35:28

even because the most recent device...

1:35:31

Well, I mean, okay,

1:35:31

this computer is a company computer,

1:35:33

so it was already set up when I

1:35:34

got it.

1:35:36

My Windows computer I got in...

1:35:44

When did I get my iPhone?

1:35:46

I don't know,

1:35:46

but they're all within the last five years

1:35:47

for sure.

1:35:48

And not a single one of them has

1:35:49

done it to me.

1:35:49

I don't know.

1:35:51

I don't think I know anybody who set

1:35:53

up a device from scratch.

1:35:54

I think most people I know just like

1:35:55

transfer their Apple ID or their Google

1:35:57

account or whatever.

1:35:58

So I don't know.

1:36:00

I could try to do some digging and

1:36:01

look into it.

1:36:01

But yeah,

1:36:02

I've never had a device ask me that

1:36:04

ever here in the US.

1:36:05

But like I said,

1:36:05

I don't think that would be a bad

1:36:06

idea because...

1:36:08

Yeah.

1:36:08

How cool would that be if, you know,

1:36:09

my sister goes out and buys her kid

1:36:11

his first iPhone and it says,

1:36:12

is this device for a child?

1:36:13

And she goes, why?

1:36:14

Yes.

1:36:15

Yes, it is.

1:36:15

And then it just walks her through the

1:36:16

parental controls.

1:36:17

So yeah, I think that, I mean,

1:36:20

I can only comment on like,

1:36:22

I don't have particularly new devices.

1:36:24

Like I have an older phone, right.

1:36:26

I reset it recently and through the setup

1:36:29

process, at least on iOS, it did ask,

1:36:33

um, you know, it said

1:36:36

during the setup process, like,

1:36:38

is this device for a child?

1:36:39

Same with this Android phone on the Google

1:36:41

operating system.

1:36:42

And same with Windows, actually.

1:36:43

I installed Windows recently and they did

1:36:45

ask.

1:36:46

But it does say on Apple's website.

1:36:49

I did look into this before because I

1:36:51

was having this conversation with someone.

1:36:53

Like, it does offer this section.

1:37:01

It does say on their website, you know,

1:37:03

this is a –

1:37:07

before you can set parental controls on a

1:37:09

child's device.

1:37:10

So it doesn't say specifically on here.

1:37:17

I mean, I don't know.

1:37:18

I don't think it would be different in

1:37:19

the US,

1:37:19

but I guess you can trial and error

1:37:22

this.

1:37:22

But it does say,

1:37:23

I'm seeing articles here where it says it

1:37:26

is...

1:37:27

implementing new features when you set up

1:37:29

a device.

1:37:30

So I'm not sure,

1:37:30

maybe we'll have to look more into how

1:37:32

that affects things globally.

1:37:35

Because I know in Australia,

1:37:37

we do have like age verification laws and

1:37:39

stuff.

1:37:40

So it could be applied differently here.

1:37:42

But I do think companies are making things

1:37:46

incredibly easy now to do this.

1:37:49

And that's why I kind of get a

1:37:50

bit frustrated when there's government

1:37:52

officials who are pushing for these really

1:37:56

aggressive methods to do this right like i

1:38:00

don't know i feel like generally it's not

1:38:03

up to the government to decide this sort

1:38:05

of stuff like it should be up to

1:38:07

like a parenting decision um from the

1:38:09

parents like if they want to have a

1:38:10

child using an adult device they can but

1:38:14

um

1:38:16

Yeah, I definitely think the process,

1:38:18

at least even if it doesn't display it

1:38:21

on setting up a device,

1:38:22

I think it's good that the integration is

1:38:25

already there.

1:38:26

The options are there.

1:38:29

Maybe we could do a better job showing

1:38:31

people that this is even an option.

1:38:34

But I feel like it's, I don't know,

1:38:36

maybe Jonah can comment on, like, a US

1:38:39

perspective on this.

1:38:40

But every device I've set up so far

1:38:43

has asked me if it's a child's one.

1:38:45

So I don't know.

1:38:48

Yeah,

1:38:49

maybe I need to go reset my iPhone

1:38:50

and see what happens.

1:38:51

And I agree with you.

1:38:53

That's what I'm saying.

1:38:54

I've heard the parental controls are

1:38:55

really good.

1:38:56

It's just, at least here in the US,

1:38:58

I feel like it's an issue of how

1:38:59

do we let people know those are out

1:39:00

there.

1:39:01

I didn't even know Windows had parental

1:39:02

controls, to be totally honest with you.

1:39:04

But I don't know.

1:39:05

I've just...

1:39:07

Maybe that's something that's only rolled

1:39:08

out in the past couple years,

1:39:09

and I just barely missed it.

1:39:11

So...

1:39:12

I don't know,

1:39:12

but I certainly would not be opposed to

1:39:14

it.

1:39:14

To like, yeah,

1:39:15

these controls are already built in.

1:39:16

Say that this device is for a child

1:39:18

and we'll walk you through how to set

1:39:19

them up and how to use them.

1:39:20

I think that would be awesome.

1:39:23

Yeah, I've definitely seen it on Windows.

1:39:24

Can't recall on Android and iOS.

1:39:26

So it does,

1:39:27

I think it might've been because you

1:39:29

might've got the laptop as a Windows Ten

1:37:32

laptop and then you upgraded it to Windows

1:37:34

Eleven.

1:39:34

It might've bypassed the screen.

1:39:35

That could be it, yeah.

1:39:37

Because it, wait, was this one?

1:39:42

I'm not sure, actually.

1:39:43

I think it was more a Windows Eleven

1:39:45

feature.

1:39:46

So it could have been before they fully

1:39:48

released it all.

1:39:50

But the way it works on Windows is

1:39:51

quite good as well.

1:39:52

It works really well on iOS and Google

1:39:54

as well.

1:39:56

I think it's currently on Mac.

1:39:59

Yeah, Mac has it too.

1:40:01

So all the major platforms do have it.

1:40:03

So I kind of become a little bit

1:40:06

frustrated when we're

1:40:08

Trying to push these aggressive laws.

1:40:11

I've seen it for Apple Watches,

1:40:13

says Jonah.

1:40:14

Yeah, I've seen it too.

1:40:15

Quite good if you use their online

1:40:17

accounts, as far as I know.

1:40:18

I mean, yeah.

1:40:21

I mean, this is kind of a drawback,

1:40:24

right?

1:40:25

I don't think Graphene OS has parental

1:40:27

controls built in.

1:40:29

Don't think that's really a priority for

1:40:30

them.

1:40:31

And definitely not Linux, so...

1:40:34

I mean, yeah,

1:40:35

that is kind of an issue with these

1:40:36

more open platforms.

1:40:37

They tend to not include these sort of

1:40:39

features.

1:40:40

So yeah,

1:40:41

it's definitely a good discussion to have

1:40:43

though.

1:40:45

Yeah, for sure.

1:40:45

And I'm definitely going to keep an eye

1:40:46

out for it next time I buy a

1:40:47

new device now,

1:40:48

because now I'm really curious.

1:40:50

I think that would be great if it

1:40:51

was a default prompt for sure.

1:40:53

So.

1:40:57

On that note, I think it's, I mean,

1:41:00

that's all I had on that forum thread.

1:41:02

So I think it's time to head over

1:41:04

to some viewer questions.

1:41:07

So we'll start with questions on our forum

1:41:09

from paying members.

1:41:11

If you are interested in becoming a

1:41:12

member,

1:41:13

you can go to privacyguides.org and click

1:41:14

the red heart icon in the top right

1:41:16

corner.

1:41:17

of the page. But we only had one

1:41:20

question this week on our

1:41:23

forum post about this topic.

1:41:27

And this is from expert forty forty eight

1:41:30

seventy, who says: what are the privacy

1:41:31

implications of using an alternative front

1:41:33

end that fetches content directly from the

1:41:35

original service rather than proxying it,

1:41:37

something like

1:41:39

Invidious instances do?

1:41:43

And while using a popular VPN,

1:41:45

how does that compare to just using the

1:41:47

original website with a content blocker

1:41:48

like uBlock Origin?

1:41:49

My experience has been that browser

1:41:50

fingerprinting techniques can still track

1:41:52

users easily,

1:41:53

even with content blockers enabled,

1:41:54

so I'm wondering whether non-proxying

1:41:55

frontends offer different protections.

1:42:01

I'll be honest.

1:42:02

I'm not super familiar with the technical

1:42:03

aspects of frontends,

1:42:04

so I'm not sure which ones proxy and

1:42:06

which ones don't.

1:42:07

I can talk about it if you want.

1:42:12

Um, I'll let you go first then.

1:42:13

Cause you probably know more about this

1:42:14

than I do.

1:42:16

Yeah.

1:42:16

So basically there's, uh,

1:42:18

I guess let's talk about the main two

1:42:20

ones here,

1:42:20

but like we're talking about Piped and

1:42:22

Invidious, at least the web-based ones.

1:42:25

So, um, by default,

1:42:27

as far as I'm aware, like a lot,

1:42:30

it depends on the instance, right?

1:42:31

Because these are like decentralized

1:42:33

services.

1:42:33

So it depends on how the instance is

1:42:35

configured.

1:42:36

So the piped, um,

1:42:40

Piped uses a piped proxy.

1:42:42

So it's actually your requests to YouTube

1:42:44

are going through a separate,

1:42:46

the server that you're connecting to for

1:42:48

the websites,

1:42:49

basically they're proxying the requests on

1:42:51

your behalf.

1:42:52

And the issue with this sort of method,

1:42:54

right,

1:42:55

is

1:42:56

it's a lot easier to be blocked, right?

1:42:58

Because if there's ten thousand people

1:43:00

accessing a piped instance, it's going to,

1:43:03

YouTube's going to block that.

1:43:04

They're going to think you're a bot,

1:43:05

you're spamming.

1:43:07

So that's the issue that we kind of

1:43:09

have with piped.

1:43:10

They're kind of getting blocked a lot and

1:43:13

the access is not as good.

1:43:16

Invidious, in this instance,

1:43:18

it actually plays

1:43:20

basically what it does is it strips out

1:43:22

all the ad and tracking technology from

1:43:25

the YouTube website and it will actually

1:43:27

play the video directly from Google.

1:43:29

So you're still making a connection to

1:43:31

Google itself.

1:43:34

Again, though,

1:43:34

there's a setting in the settings called

1:43:36

proxy videos,

1:43:37

and that will proxy it through the Invidious

1:43:39

instance.

1:43:40

But by default,

1:43:42

it should play it directly from Google.

1:43:45

So the reason this is kind of also

1:43:48

becoming a problem is because a lot of

1:43:50

VPN servers are getting restricted and

1:43:52

they require you to sign in to play

1:43:55

videos.

1:43:57

Yeah,

1:43:58

there's like more restrictions being made.

1:43:59

So with Invidious,

1:44:01

you can actually check this yourself,

1:44:02

right?

1:44:02

You can use uBlock Origin and you can

1:44:04

see the connections that the website is

1:44:07

making.

1:44:07

You'll see it's connecting to Google

1:44:08

Video.

1:44:10

So your IP address is visible to YouTube

1:44:13

itself, right?

1:44:14

But there's significantly less tracking

1:44:16

happening because

1:44:18

the JavaScript on YouTube's website isn't

1:44:20

actually loading,

1:44:21

which is the usual concern, right?

1:44:25

So YouTube will know that your IP address

1:44:28

is pulling a video from their servers,

1:44:30

but

1:44:32

Yeah,

1:44:32

then we can also talk about like FreeTube,

1:44:35

and that does give you the option when

1:44:36

you're setting it up if you want to

1:44:38

do fully local playback.

1:44:40

So the same thing as Invidious, pulling

1:44:41

the video directly from Google,

1:44:42

or you can also use a piped proxy,

1:44:45

which like we talked about,

1:44:47

it can have issues,

1:44:48

but it does offer more privacy because

1:44:51

your IP address isn't being directly

1:44:53

exposed to Google itself.

1:44:56

So

1:44:57

Another thing here is using a VPN and

1:44:59

then using Invidious or like a local,

1:45:02

locally fetching these videos,

1:45:03

it's going to be a lot less

1:45:05

easy to track it because you're using an

1:45:07

IP address that a bunch of other users

1:45:08

have.

1:45:10

So I think that's basically kind of the

1:45:12

rundown.

1:45:13

I wouldn't use Invidious if you are on

1:45:16

like a residential connection because

1:45:18

it'll just be linked to your IP address.

1:45:20

They'll just see your residential IP

1:45:22

address accessing all the videos.

1:45:23

It'll be easier for them to track it.

1:45:25

So I'd say try using a VPN and

1:45:28

try using Invidious and

1:45:31

directly fetching the videos over Piped,

1:45:35

because Piped is usually blocked a lot more

1:45:38

commonly.

1:45:39

So hopefully that answers the privacy

1:45:42

question about this topic.

1:45:44

It's kind of confusing,

1:45:45

but if I didn't answer it well,

1:45:47

just let me know.
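To make the proxied-versus-direct distinction above a bit more concrete, here is a rough Python sketch. The URLs are made up for illustration; the only real assumption is that YouTube's video CDN serves streams from googlevideo.com hosts, which is what you would see in uBlock Origin's network log as described above.

```python
from urllib.parse import urlparse

def fetch_mode(stream_url: str) -> str:
    """Classify who sees your IP address when a frontend plays a video."""
    host = urlparse(stream_url).hostname or ""
    if host.endswith("googlevideo.com"):
        # The player connects straight to Google's CDN: your IP is visible to Google.
        return "direct"
    # The player connects to the instance, which relays the video bytes for you.
    return "proxied"

# Hypothetical example URLs, purely to illustrate the two modes:
direct = "https://rr3---sn-abc123.googlevideo.com/videoplayback?id=dQw4w9WgXcQ"
proxied = "https://invidious.example.org/videoplayback?id=dQw4w9WgXcQ"

print(fetch_mode(direct))   # direct
print(fetch_mode(proxied))  # proxied
```

The trade-off the hosts describe falls out of this: the proxied mode hides your IP from Google but concentrates traffic on one instance IP, which is easier for YouTube to block.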

1:45:49

Cool.

1:45:50

Thank you.

1:45:51

I think it's also just one thing I

1:45:53

want to throw out is we don't really

1:45:54

know much about browser fingerprinting.

1:45:58

Like I made like a year ago,

1:46:00

I made a video about that over on

1:46:03

my YouTube.

1:46:03

And the thing I learned is that a

1:46:05

lot of it

1:46:07

I mean, there's like two categories.

1:46:08

There's the people who are like marketing

1:46:10

companies who are like, yeah,

1:46:11

we can fingerprint anybody anywhere.

1:46:12

And it's like, okay,

1:46:13

and I'm going to take you with a

1:46:14

grain of salt because you'll say anything

1:46:16

to make a sale.

1:46:17

And then there's the technical people who

1:46:19

are just like literally everything can be

1:46:20

fingerprinted.

1:46:21

And like the privacy people who say this.

1:46:24

And I think the issue is we don't

1:46:26

actually know for sure

1:46:28

how prevalent it is,

1:46:29

which techniques they're actually using.

1:46:31

I've seen all kinds of proofs of concept

1:46:32

about CSS can be fingerprinted if you do

1:46:35

it right.

1:46:36

A lot of extensions can be fingerprinted.

1:46:39

There's so many different ways to do it,

1:46:41

but we don't know for sure what ways

1:46:43

they're doing and what ways they're using.

1:46:46

I'm not saying not to worry about it.

1:46:47

I just want to point that out.

1:46:49

It's really...

1:46:50

It's not like uBlock Origin does

1:46:52

nothing. I know it does block a lot

1:46:54

of stuff. And then, you know, Brave obviously

1:46:56

has a lot of built-in protections. Firefox,

1:46:58

especially with the setting changes that

1:47:00

we recommend, offers really good protection.

1:47:02

It's obviously not perfect. If you need

1:47:04

perfection, or as close to perfection as

1:47:06

you can get, you need something like Tor

1:47:08

and Whonix. But at that point you're

1:47:09

probably not streaming YouTube. So just

1:47:12

something to keep in mind. There's still

1:47:13

definitely a place for those.

1:47:16

I mean, I think there's definitely,

1:47:18

we do have some research that has been

1:47:20

done specifically on browser

1:47:22

fingerprinting.

1:47:23

Like when I was looking into,

1:47:25

like we also did a video here at

1:47:27

Privacy Guides.

1:47:28

We interviewed someone about it as well.

1:47:31

There are at least some hard facts about

1:47:33

it.

1:47:34

So, I mean,

1:47:35

definitely go maybe check out that video.

1:47:37

We did talk a little bit with...

1:47:39

We got information from someone at the

1:47:41

Tor Project who works on a lot of

1:47:43

the fingerprinting stuff for that. So

1:47:46

definitely look at that. I think

1:47:49

there's definitely papers that have been

1:47:50

done. When I was researching for that video,

1:47:52

there were quite a lot of papers

1:47:57

specifically talking about stuff like

1:47:59

entropy and, you know, how that affects the

1:48:02

fingerprint, I guess. I think Nate's right,

1:48:05

though, just like we don't really have the

1:48:07

specifics of what companies

1:48:10

are doing, because it's kind of hard to

1:48:12

know, right? But I think going off the

1:48:16

research that we do have, you know,

1:48:20

reducing entropy, you know, like with Tor

1:48:22

Browser, I think there's

1:48:24

there's pretty much,

1:48:25

there is basically proof at this point

1:48:27

that like, you know,

1:48:28

if you use Tor browser,

1:48:29

if you take all the precautions as

1:48:30

possible, you're not going to be,

1:48:35

you're not going to be able to identify

1:48:36

someone specifically with their

1:48:38

fingerprint if they're using something

1:48:39

like Tor browser.
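As a rough illustration of the entropy idea mentioned above: the fewer bits of entropy an attribute's value distribution carries across users, the less that attribute helps single you out. A small sketch, with made-up counts purely for illustration:

```python
import math

def fingerprint_bits(counts: dict) -> float:
    """Shannon entropy, in bits, of an attribute's value distribution.
    More bits = the attribute splits users into more distinct groups,
    so it contributes more to a unique fingerprint."""
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h + 0.0  # normalizes IEEE -0.0 to 0.0 for the all-one-value case

# Hypothetical survey counts of a "timezone" attribute:
timezone = {"UTC-5": 50, "UTC+0": 30, "UTC+10": 20}
uniform = {"UTC+0": 100}  # everyone reports the same value, Tor Browser style

print(round(fingerprint_bits(timezone), 2))  # 1.49 bits of identifying info
print(round(fingerprint_bits(uniform), 2))   # 0.0: a shared value reveals nothing
```

This is the intuition behind Tor Browser's approach the hosts mention: making every user report the same values drives each attribute's entropy toward zero.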

1:48:41

But I think we have gotten to a

1:48:44

point where so many tools just have all

1:48:46

this built in,

1:48:46

like Firefox and Brave both have

1:48:48

fingerprint protection built in by default

1:48:50

now.

1:48:51

So it's like,

1:48:54

Basically,

1:48:55

we're getting to a point where these

1:48:57

protections are becoming pretty

1:49:01

mainstream.

1:49:01

But I think if you need something a

1:49:03

bit more extreme,

1:49:04

then something like Mullvad or Tor is

1:49:06

definitely going to offer better

1:49:09

protection.

1:49:11

Yeah, just to be clear, like you said,

1:49:12

we have a lot of research into how

1:49:14

good the browsers are at resisting it.

1:49:16

We just don't have a lot of research

1:49:17

into how many companies are doing it,

1:49:20

what exact techniques they're using,

1:49:21

how common it is.

1:49:23

I assume it's pretty common.

1:49:24

I assume that a lot of companies are

1:49:27

doing it.

1:49:28

They don't tell us because it's kind of

1:49:30

like their secret sauce for marketing,

1:49:32

and this is why we're so effective.

1:49:34

But yeah, it's just...

1:49:37

I guess what I'm getting at is I

1:49:38

think uBlock Origin and a good privacy

1:49:40

browser is probably a lot more effective

1:49:42

than we give it credit for.

1:49:43

But I mean,

1:49:45

I'll definitely never complain about

1:49:46

somebody going the extra mile if they feel

1:49:49

the need to.

1:49:50

So it doesn't hurt.

1:49:52

Yeah, I think it's definitely...

1:49:55

I'll just push people towards...

1:49:57

I know Nate did a video about it

1:49:59

as well.

1:49:59

I thought that was quite good.

1:50:00

We also did a video.

1:50:01

Definitely check out,

1:50:02

get different perspectives on it

1:50:03

because...

1:50:05

There's definitely a lot of different

1:50:07

opinions, right? Because we've got

1:50:09

the Tor people, we've got the Brave people,

1:50:11

we've got the Firefox people. They've all

1:50:13

got different... we've got the Arkenfox

1:50:16

people, they've got a different

1:50:18

perspective than the Tor Browser people. So,

1:50:21

you know, I guess go to different places for

1:50:23

information, and

1:50:26

try to understand the topic as best as

1:50:28

you can.

1:50:28

Hopefully the resources that we've put out

1:50:30

there are good enough to kind of make

1:50:32

a good judgment on it.

1:50:33

But like, like Nate said, like,

1:50:35

I feel like when we talk about this

1:50:37

stuff, it is kind of an extreme topic.

1:50:39

Like, you know,

1:50:41

having a privacy browser and uBlock

1:50:43

Origin, like Nate said,

1:50:44

is like going to be better than ninety

1:50:45

nine percent of people.

1:50:47

So just put it in perspective.

1:50:53

Which on that note,

1:50:54

I've been perusing the live chat here.

1:50:57

And I think there's only one question we

1:50:58

haven't addressed so far.

1:50:59

But it actually kind of touches on this

1:51:02

a little bit.

1:51:04

And it says,

1:51:05

this comes from anonymous three four four

1:51:06

here in the stream yard chat.

1:51:08

Threat modeling should be deferred to

1:51:09

experts that you personally consult on

1:51:11

over and over again.

1:51:12

It's unfeasible for an individual to know

1:51:14

every single vulnerability and scenario

1:51:15

that they have to protect against.

1:51:17

Do you guys plan to provide privacy

1:51:18

consulting in the near or far future?

1:51:23

I mean,

1:51:23

I don't speak for everybody around here.

1:51:25

I don't think we're planning anything like

1:51:27

that as far as I know.

1:51:29

I certainly haven't heard anything about

1:51:30

it.

1:51:31

Probably not would be my guess.

1:51:33

Not anytime soon, at least.

1:51:35

I don't know if we make far.

1:51:37

I personally do not make far,

1:51:38

far future plans because you never know

1:51:41

what will happen.

1:51:42

Right.

1:51:42

I've had my long term plans thrown into

1:51:44

chaos multiple times throughout the course

1:51:46

of my life.

1:51:46

So I've given up on long term plans.

1:51:48

I just worry about the next five years

1:51:49

or so and go from there.

1:51:51

But I do want to say that threat

1:51:55

modeling I don't think has to be an

1:51:57

expert thing because there's multiple

1:52:00

steps to threat modeling, right?

1:52:01

And one of those steps is basically

1:52:05

figuring out –

1:52:08

how bad are the risks if I fail?

1:52:11

Like, that's one of the steps, right?

1:52:14

And so, for a lot of

1:52:15

people, I think that's kind of

1:52:16

where we come up with the idea of

1:52:17

like a low threat model. You know, if

1:52:21

somebody's, like... I'm gonna pick on people

1:52:22

here, but back in the day I used

1:52:25

to see people having like really, really

1:52:27

extreme meltdowns where they were like, oh

1:52:29

my god, I connected to YouTube once and

1:52:31

I didn't have my VPN on, like I'm

1:52:33

so screwed. And it's like,

1:52:33

Calm down.

1:52:35

It's not that big a deal.

1:52:36

Google has your one IP address.

1:52:39

It probably rotates anyways in a lot of

1:52:40

parts of the country or a lot of

1:52:41

countries around the world.

1:52:43

The results for most people are not that

1:52:45

big a deal.

1:52:46

And again, going to what I said earlier,

1:52:48

if your threat model is so high that

1:52:50

Google can't have your one IP address,

1:52:51

you probably shouldn't be going to YouTube

1:52:52

in the first place.

1:52:53

But anyways,

1:52:53

my point is I don't think it's something

1:52:55

that everybody necessarily has to go see a

1:52:57

professional for.

1:52:58

And I say this as somebody who has

1:53:00

done consulting in the past.

1:53:02

I think, yes,

1:53:03

if you have a very high threat model,

1:53:04

then yeah,

1:53:05

you probably shouldn't just be winging it

1:53:06

and trying to piece together a bunch of

1:53:08

random websites and YouTube videos.

1:53:10

But at the same time,

1:53:11

like if you're just like, dude,

1:53:12

I'm not an activist, I'm not,

1:53:14

a political figure, I'm not super... I just

1:53:16

want my privacy. I just want

1:53:17

to not get targeted ads. I just want

1:53:19

people to not be stalking me. But it's

1:53:22

not that big a deal, and I'm not

1:53:23

willing to bend over backwards. I mean,

1:53:24

that's part of a threat model too, right?

1:53:25

How much effort are you willing to go

1:53:27

through? Because not everybody is willing

1:53:28

to go through the same amount of effort.

1:53:30

And I'm going to say that again, because

1:53:31

I feel like a lot of people in

1:53:32

the privacy community forget that.

1:53:33

Not everybody is willing to go through the

1:53:34

same amount of effort and that's fine.

1:53:37

As long as like their threat model is

1:53:38

being met.
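One lightweight way to write down the "how bad are the risks if I fail" step described above is a simple likelihood-times-impact ranking. This is purely a toy sketch with made-up threats and scores, not anything the hosts prescribe:

```python
# Toy risk-assessment step of threat modeling: score each threat by
# likelihood and impact (1-3 each), then rank by the product.
# The threats and numbers below are invented examples only.
threats = [
    {"threat": "targeted ads profiling", "likelihood": 3, "impact": 1},
    {"threat": "data broker selling my address", "likelihood": 2, "impact": 2},
    {"threat": "nation-state surveillance", "likelihood": 1, "impact": 3},
]

ranked = sorted(threats, key=lambda t: t["likelihood"] * t["impact"], reverse=True)
for t in ranked:
    print(f'{t["threat"]}: risk score {t["likelihood"] * t["impact"]}')
```

Ranking this way makes the point in the discussion tangible: for a low threat model, the highest-scoring items are mundane ones, so mundane mitigations (and a proportionate amount of effort) are enough.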

1:53:39

So yeah, it's an, I don't know.

1:53:42

I think if you want to get consulting,

1:53:45

I mean,

1:53:46

if that's something you want to do,

1:53:47

that'll help you sleep at night,

1:53:47

go for it.

1:53:48

But I don't think it's something that

1:53:50

should only be deferred to experts.

1:53:52

Cause I mean, we're human too.

1:53:53

There's no like,

1:53:55

governing board that certifies privacy

1:53:57

experts or anything. So, I don't know, yeah,

1:54:02

that's just my thoughts.

1:54:04

Yeah,

1:54:05

it's like one of these things where like

1:54:07

I feel like, you know,

1:54:08

there's certain things where you can just

1:54:09

throw money at something and kind of

1:54:11

remove a bunch of the effort here.

1:54:13

Like I feel like going through and trying

1:54:14

to understand what are the best tools,

1:54:17

what do I need to do?

1:54:18

Like it is kind of time consuming.

1:54:19

Like Nate was saying,

1:54:20

like not everyone has hours every day to

1:54:24

go through and,

1:54:25

you know,

1:54:26

work out the tools and update things.

1:54:29

So, I mean,

1:54:29

it can make sense to throw money at

1:54:31

something.

1:54:31

I don't think you need to.

1:54:33

I think everything is available for free.

1:54:35

Like,

1:54:35

we try really hard to make things

1:54:38

accessible to everybody.

1:54:40

Like, we don't want to paywall stuff.

1:54:42

You know,

1:54:43

we offer benefits to members who give us

1:54:45

donations because, you know,

1:54:46

it's the least we can do for supporting

1:54:49

us.

1:54:49

But I think, you know,

1:54:51

if there's something that you want to kind

1:54:54

of

1:54:55

easy mode, you can talk to an expert.

1:54:58

I mean, I'm not going to recommend it.

1:54:59

I think all the content is available for

1:55:01

free. And we've talked about this before:

1:55:04

you can take things slowly. You

1:55:08

can just do one thing every month,

1:55:10

when you have a bit of spare time.

1:55:12

You don't have to go at...

1:55:14

I think it was Michael Bazzell who

1:55:16

said this: it's like, uh, privacy is a

1:55:20

marathon, not a sprint.

1:55:23

And I really like that quote because I

1:55:25

think it's, you know,

1:55:27

we think about things that we need to

1:55:30

do,

1:55:30

but I don't think we need to do

1:55:33

things immediately and we don't need to

1:55:37

try and blitz through everything in like

1:55:38

two days.

1:55:40

You certainly can if you want,

1:55:41

but it's definitely not required.

1:55:45

So, you know,

1:55:45

definitely put that into perspective for

1:55:49

you.

1:55:51

I always love telling people how I was

1:55:52

that lunatic that sat down one weekend and

1:55:54

went,

1:55:54

I'm going to move all my passwords to

1:55:56

a password manager and I do not recommend

1:55:57

it.

1:55:58

But,

1:56:00

Yeah,

1:56:01

not to get into a big back and

1:56:02

forth, but you said like, yeah,

1:56:03

for the average person,

1:56:04

threat modeling is not too high.

1:56:05

I mean,

1:56:05

everybody should threat model because

1:56:06

that's how you know if you're doing

1:56:07

enough,

1:56:08

but you're like going back to the story

1:56:09

of the notification leak,

1:56:10

surely it would be better to consult an

1:56:11

expert for blue team defenses.

1:56:13

Yeah, again,

1:56:14

if you're working on a professional blue

1:56:15

team,

1:56:15

if you are an activist who's facing jail

1:56:17

time or could potentially,

1:56:18

like sure at that point, but yeah,

1:56:22

we don't offer consulting at this time.

1:56:24

I don't know if there's any plans to,

1:56:25

but I mean, if we do,

1:56:27

I'm sure we'll announce it.

1:56:29

I think one thing also to add to

1:56:31

this comment, right, is they're saying,

1:56:33

like, threat modeling, like,

1:56:34

they're saying it would surely be better

1:56:37

to consult an expert for blue team

1:56:40

defenses.

1:56:40

I think, you know,

1:56:42

let's be a little bit honest here.

1:56:43

This is, like,

1:56:44

a very privileged position to be in.

1:56:46

Like,

1:56:46

not everyone has the money to just throw

1:56:49

at this.

1:56:49

Like, we're talking about, like,

1:56:50

decentralized groups of activists here.

1:56:52

Like,

1:56:53

we're not – I don't mean any shade

1:56:55

when I say this,

1:56:56

but a lot of organizations are not exactly

1:56:58

1:57:00

uh, they're cash strapped, right?

1:57:02

Like they don't have money to do this

1:57:04

sort of thing.

1:57:05

Uh,

1:57:05

it's not really that high on their list

1:57:07

of priorities.

1:57:08

Um, so if it's like a,

1:57:11

a business where they have a certain

1:57:13

budget to,

1:57:14

to spend on this sort of thing,

1:57:15

obviously makes sense,

1:57:16

but that's why I think it's so important

1:57:18

to offer this stuff free because,

1:57:20

you know,

1:57:21

there are people who are in less, uh,

1:57:25

less privileged positions that also need

1:57:27

this information.

1:57:28

Um,

1:57:29

And it should be accessible, right?

1:57:31

So obviously, in the best case scenario,

1:57:34

this person should have consulted an

1:57:36

expert for blue team defenses.

1:57:38

But I'm pretty sure that this person was

1:57:40

probably not someone who had the money or

1:57:42

the time to be investing in this sort

1:57:45

of protection, I guess.

1:57:47

Yeah, for sure.

1:57:54

Taking one more look at the thread here.

1:57:56

Doesn't look like anybody's added

1:57:57

anything.

1:58:00

Anything else you wanted to mention or

1:58:01

call out?

1:58:04

Not particularly.

1:58:05

I guess if no one's got any extra

1:58:08

questions,

1:58:08

I guess we can move into the outro

1:58:11

here.

1:58:12

So all the updates from this week in

1:58:14

privacy will be shared on the blog every

1:58:16

week.

1:58:16

So you can sign up for the newsletter

1:58:18

or subscribe with your favorite RSS reader

1:58:21

if you want to stay tuned.

1:58:22

For people who prefer audio,

1:58:24

we also offer a podcast available on all

1:58:27

podcast platforms and RSS.

1:58:29

And this video will also be synced to

1:58:32

PeerTube.

1:58:33

Privacy Guides is an impartial nonprofit

1:58:36

organization that is focused on building a

1:58:39

strong privacy advocacy community and

1:58:41

delivering the best digital privacy and

1:58:44

consumer technology rights advice on the

1:58:46

internet.

1:58:47

If you want to support our mission,

1:58:49

then you can make a donation on our

1:58:51

website at privacyguides.org.

1:58:55

To make a donation,

1:58:57

click the red heart icon located in the

1:59:00

top right corner of the page,

1:59:02

and you can contribute using standard fiat

1:59:05

currency via debit or credit card,

1:59:08

or opt to donate anonymously using Monero

1:59:11

or with your favorite cryptocurrency.

1:59:14

And becoming a paid member unlocks

1:59:16

exclusive perks like early access to video

1:59:19

content and priority during the This Week

1:59:23

in Privacy livestream Q&A.

1:59:26

And you'll also get a cool badge on

1:59:28

your profile in the Privacy Guides forum

1:59:30

and the warm,

1:59:31

fuzzy feeling of supporting independent

1:59:34

media.

1:59:35

Thanks for watching,

1:59:36

and we'll see you next week.