Apple's Lockdown Mode Blocked the FBI?
Ep. 39


Episode description

The FBI couldn't get into a reporter's iPhone because it had Lockdown Mode enabled, RAM shortages have hit Raspberry Pi prices, more countries are rolling out Australia-style under-16 social media bans, and much more! Join us for This Week in Privacy #39.

0:22

Welcome back to This Week in Privacy,

0:25

our weekly series where we discuss the

0:26

latest updates with what we're working on

0:28

within the Privacy Guides community and

0:30

this week's top stories in the data

0:32

privacy and cybersecurity space,

0:34

including how lockdown mode thwarted

0:35

forensics tools,

0:37

Windows reconsidering their AI tools,

0:39

and more.

0:40

I am Nate,

0:41

and joining me this week is Jordan.

0:43

How are you doing, Jordan?

0:45

I'm good, thanks.

0:45

Happy to be here again,

0:47

returning on the podcast.

0:48

It's good to be back.

0:50

Yeah, it's been a minute.

0:51

Glad to have you back.

0:52

It's exciting.

0:54

Yes, a very exciting week this week.

0:57

Indeed.

0:58

Real quick, for those who don't know,

0:59

Privacy Guides is a nonprofit which

1:01

researches and shares privacy-related

1:03

information and facilitates a community on

1:05

our forum and matrix where people can ask

1:07

questions and get advice about staying

1:09

private online and preserving their

1:10

digital rights.

1:12

This week,

1:13

we are jumping straight into the news and

1:15

we're jumping straight in with a story

1:18

about how the FBI could not get into

1:21

a Washington Post reporter's phone because

1:24

of lockdown mode.

1:26

Um, so this story is admittedly not super,

1:30

super detailed.

1:31

Um, I mean,

1:32

I guess it's about as detailed as one

1:33

would expect, but, uh, let's see, where's,

1:37

oh, my screen popped off the stage.

1:39

Sorry about that.

1:40

Y'all let me go ahead and add that

1:41

back real quick.

1:43

Uh, shoot.

1:46

Hold on.

1:47

I think I'm hitting the wrong button

1:48

twice.

1:48

Yeah.

1:51

Okay.

1:54

There we go.

1:55

Got it.

1:55

Okay.

1:56

So this story comes from 404

1:58

Media.

1:59

Let me scroll up to the top here.

2:01

And, um,

2:02

You know, the title, like I said,

2:04

the title really says it all.

2:05

So the FBI did have a warrant.

2:07

That's a nice change of pace here.

2:09

The FBI had a warrant and they went

2:11

to a Washington Post reporter's home and

2:15

seized a bunch of devices.

2:17

And specifically,

2:19

they say that when they plugged in the

2:20

iPhone to their forensics tool to try and

2:23

extract information, they couldn't.

2:26

There's some interesting stuff here that

2:28

we talked about behind the scenes this

2:29

week and we were trying to figure out.

2:31

They said there was a notification on the

2:34

device's home screen that specifically

2:35

said it's in lockdown mode.

2:37

And those of us here at Privacy Guides

2:39

that have iPhones,

2:41

we were kind of playing around with it

2:42

and turning on lockdown mode and plugging

2:43

it in and trying to like,

2:45

can we recreate this?

2:46

Because I've never heard of that.

2:47

And as far as I know,

2:48

none of us were able to recreate that.

2:50

So we think this may have been a...

2:53

This may be or I think this may

2:56

be like something specific to that tool.

2:58

Like maybe when you plug this thing into

3:00

the forensics tool, it pops up and like,

3:01

oh, lockdown mode is enabled or something.

3:04

But yeah.

3:06

And then here's like a little screenshot

3:08

for video viewers.

3:09

Here's a little screenshot of the warrant

3:11

that or the court record that says that

3:13

they were unable to get it.

3:15

So.

3:17

I guess just to kind of round out

3:18

this story,

3:19

there is kind of some bad news,

3:20

which is that they were still able to

3:22

get the information because they had the

3:24

reporter access her MacBook.

3:28

And specifically...

3:31

This is interesting.

3:33

They told her to use Touch ID.

3:36

They asked her for a Touch ID or

3:37

a password, which, for the record,

3:39

I have it on good authority from a

3:40

lawyer.

3:41

You don't ever have to hand over your

3:43

password.

3:43

That is a violation of, I believe,

3:45

the First and Fifth Amendment here in the

3:47

U.S.,

3:47

But Touch ID, they can do.

3:50

They can make you unlock the device or

3:52

enter biometrics.

3:53

You just never have to specifically hand

3:55

over your password.

3:56

She said that she does not use biometrics,

3:58

but investigators told her to try anyways.

4:00

And they say when she applied her index

4:01

finger to the fingerprint reader,

4:02

the laptop unlocked.

4:04

And then they were able to, excuse me,

4:07

it says they have not yet obtained a

4:08

full image.

4:10

But they did take photos and audio

4:12

recordings of conversations stored in

4:13

Signal.

4:15

So a lot of lessons to learn here.

4:17

And I do wonder,

4:19

not to be a judge or anything,

4:20

but that's really not going to go over

4:21

well that she's like, oh,

4:22

I don't use biometrics.

4:23

They're like, we'll do it anyways.

4:24

And then it worked.

4:25

So, you know,

4:25

is that going to be like,

4:26

who's that New York mayor that was like,

4:29

oh, I lost my phone.

4:30

And it's like, yeah, sure you did, buddy.

4:32

Anyways.

4:34

So, this is... man.

4:36

There's so many things to take away from

4:38

this story.

4:40

First of all, I guess to address the

4:43

I don't want to say elephant in the

4:44

room, but to address the headline,

4:45

these stories are useful because iPhone is

4:49

closed source, right?

4:51

And I think it's really cool to get

4:53

this kind of insight and confirmation that

4:55

things like lockdown mode do work and they

4:57

do provide that benefit that we're

5:00

trusting them to provide.

5:02

And

5:04

I know that in a perfect world we

5:05

would use something open source,

5:07

but I don't know.

5:08

The point I'm getting at is regardless,

5:09

it's still good to know that these things

5:10

work,

5:11

and it's good to have this kind of

5:12

third-party confirmation,

5:14

even if it's kind of unfortunate that it

5:15

has to come that way.

5:16

And the other thing that really jumped out

5:18

at me is,

5:19

like I said at the end there,

5:20

they were still able to access Signal

5:22

because her device was unlocked.

5:23

And I think that's a really important

5:25

thing to talk about because a lot of

5:26

the time we see this –

5:28

abused in the news or twisted in the

5:30

news where they're like, oh, my God,

5:31

they accessed this person's Signal.

5:32

And it's like, well, yeah,

5:33

if the device is unlocked,

5:34

of course they did.

5:35

So these tools,

5:38

all tools have limitations, right?

5:39

There is no perfect tool out there.

5:42

But it's important that we take it.

5:46

First of all,

5:46

it's important that we know that.

5:47

So if you send anything, you know,

5:50

I've said in the past on other recordings

5:52

that anything you put in a digital format,

5:55

you should be prepared for the possibility

5:57

that

5:58

It may end up in a data breach

6:02

or something like that.

6:03

So keep that in mind.

6:04

But also,

6:04

it's important to enable things like

6:06

disappearing messages to keep in mind that

6:09

there's only so much we can do from

6:14

the endpoint itself.

6:16

So I think those were kind of my

6:18

takeaways from this story.

6:20

Jordan,

6:21

did you have an opportunity to read this

6:22

story?

6:22

And did you have any thoughts on it?

6:25

Yes, definitely.

6:26

Uh, it's good coverage there from, uh,

6:29

from you just there.

6:29

But I think that, uh,

6:31

one thing that was quite interesting about

6:33

this story is, um, the, uh,

6:37

404 Media article kind of goes

6:39

into, uh, some,

6:41

some more detail about the reason why,

6:44

like,

6:45

I think we should talk a little bit

6:46

about why lockdown mode is kind of able

6:49

to thwart these sorts of, uh,

6:51

police forensic tools, um,

6:55

And one of the most important things that

6:57

the lockdown mode actually does is it

7:00

disables the device connections.

7:03

So if you connect your iPhone or iPad

7:06

to a computer,

7:08

the device needs to be unlocked.

7:10

So without lockdown mode enabled,

7:12

sometimes it can actually automatically

7:16

connect to a computer without explicit

7:18

approval.

7:19

So...

7:22

that can allow mobile forensic tools like

7:24

GrayKey and Cellebrite,

7:26

which is like the common police forensic

7:28

tools to basically be able to extract

7:31

information from the device.

7:33

And we've seen this as well with Graphene

7:35

OS,

7:35

they have a similar setting in the

7:38

settings where you can control the USB

7:41

connection and allow, you know,

7:43

change the options that are required if

7:47

you connect the device to another

7:48

computer, right?

7:50

And as you can see on the screen,

7:52

Nate's brought this up here.

7:53

It's the Apple support page about like the

7:55

specifics of lockdown mode.

7:57

And another thing that these forensic

8:01

tools do is they often utilize

8:03

vulnerabilities in, you know,

8:06

software and hardware. And with lockdown

8:09

mode enabled, it kind of protects you

8:12

on both fronts, right? Because it stops the

8:14

device connections, which can sometimes be,

8:16

uh, exploited with vulnerabilities, and also

8:19

it disables a lot of these, you know,

8:22

quality-of-life features, but ones that are

8:25

often used in exploits and vulnerabilities,

8:28

such as, like, you know, complex web browsing,

8:31

uh...

8:32

um, like, tools are disabled. In this case,

8:36

it says certain complex web technologies

8:37

are blocked, which might cause some

8:39

websites to load more slowly or not operate

8:42

correctly. So, you know, as well as, you

8:45

know, some... it disables some of the Apple

8:47

services as well. So I think this is

8:50

quite an interesting story, to see that, uh,

8:55

lockdown mode actually did provide quite a

8:58

benefit to someone,

9:00

especially in like quite a vulnerable

9:01

position.

9:03

It's good to always see that this tool

9:06

that Apple has put out is actually helping

9:08

journalists in the field.

9:10

But it is another thing that's kind of

9:12

a bit of an unfortunate thing with this

9:14

case that

9:15

Nate kind of talked about was that they

9:18

were kind of able to bypass the entire

9:20

process because she had her MacBook with a

9:23

Touch ID.

9:24

And I think it'd be really good if

9:26

you could, Nate,

9:27

if you could elaborate a little bit.

9:29

You said there was laws around the First

9:34

and Fifth Amendment.

9:35

Could you maybe go into what exactly that

9:37

means and why Touch ID can kind of

9:41

bypass that?

9:43

Yeah.

9:45

Yeah.

9:46

Real quick.

9:46

Thank you for mentioning GrapheneOS, because

9:48

officially, at Privacy Guides, our top

9:51

recommendation for a phone will always be

9:53

a GrapheneOS phone, because it really does

9:54

have the best privacy, the best security.

9:58

But there's lots of reasons people may

9:59

want to use an iPhone.

10:00

Like there are some countries where pixels

10:02

are not available and things like that.

10:04

But yeah,

10:05

GrapheneOS has some very granular controls

10:07

over the USB port.

10:08

Uh, but yeah, the...

10:10

So the First and Fifth Amendment here in

10:12

the U.S., um, the first...

10:14

the first ten, I believe it is.

10:16

It's been a while since I was in

10:17

school.

10:18

Um,

10:18

the first ten amendments in the U.S.

10:20

Constitution are called the Bill of

10:21

Rights.

10:22

And they were,

10:23

I think the first ten that like existed

10:25

when the country,

10:26

when the U S was founded.

10:27

And, uh,

10:29

The first specifically is – I mean a

10:32

lot of them have multiple parts,

10:33

but the first mostly pertains to freedom

10:35

of speech.

10:36

The government is not allowed to infringe

10:38

on your freedom of speech or your freedom

10:40

of expression,

10:41

your freedom to lobby the government for

10:43

complaints, things like that.

10:45

And then the Fifth Amendment is the one

10:46

that protects against self-incrimination.

10:48

So for example,

10:50

if any of you have ever watched a

10:51

cop show that's set here in the US,

10:53

which I'm sure is most of them,

10:54

the Fifth Amendment is the one that when

10:56

people say, I plead the Fifth,

10:57

I'm not going to talk.

10:59

What's the word?

10:59

I invoke my right against

11:01

self-incrimination, whatever.

11:02

There's a million ways to put it.

11:04

But

11:06

basically the reason that this passwords

11:08

and stuff like that fall under this is

11:10

because first of all,

11:13

they can't force you to give up your

11:14

password because that one,

11:17

and that's the one I'm not sure of,

11:18

but I feel like I read this somewhere

11:20

that would fall under free speech.

11:21

Like they're kind of forcing you what to

11:23

say,

11:24

but it definitely falls under the fifth

11:26

amendment because you have a right,

11:28

whether you're guilty or innocent,

11:30

you have a right to not give,

11:32

incriminate yourself. Um, that's kind of the

11:33

way the U.S. justice system

11:36

is supposed to work. It's

11:39

supposed to be that, and I don't want

11:41

to get too far off topic, but on

11:42

paper, the way it's supposed to work is

11:44

that the defense is supposed to prove that

11:46

you are not guilty no matter what, and

11:48

the prosecution is supposed to find the

11:50

truth, whether that's you're guilty or

11:52

you're not. Um, which is a subtle but

11:55

distinct thing. Anyways, so yeah, there's

11:57

this protection against

11:59

Admitting and confessing if you don't want

12:01

to.

12:01

I mean, obviously, if you want to confess,

12:02

you can confess.

12:03

Nobody's going to stop you.

12:03

But yeah,

12:05

so it falls under that because giving up

12:07

your passwords could incriminate you.

12:09

And therefore,

12:10

you are not required to do it.

12:12

But like you said, there's workarounds.

12:15

Biometrics...

12:18

I want to say is a gray area.

12:20

I could be wrong about that because, well,

12:23

real quick,

12:23

let me say I know for sure that

12:25

you can be ordered to unlock a device,

12:27

whether that's putting in the password and

12:29

then handing it over,

12:30

scanning your fingerprint, whatever.

12:31

That is something the court can force you

12:33

to do.

12:36

Biometrics,

12:37

I've heard conflicting reports,

12:38

and a lot of these – we talked

12:39

about this last year with the person who

12:41

was charged for wiping their phone,

12:44

which was also a 404 Media story.

12:47

Phones and biometrics and things are

12:49

really in a gray area right now legally

12:51

in the US because some courts have ruled

12:54

that you need a warrant to search phones,

12:56

and other courts have ruled, no,

12:57

you don't.

12:58

And they're kind of at odds,

13:01

and we're kind of waiting for the Supreme

13:02

Court to weigh in and settle the matter

13:05

once and for all.

13:06

And yeah,

13:08

biometrics kind of fall under that

13:09

category of even if you do need a

13:13

warrant or whatever,

13:14

it would still be something that they can

13:15

force you to unlock the device because it

13:17

doesn't really fall under those those same

13:19

purviews of like against

13:21

self-incrimination and stuff.

13:22

It's.

13:23

Yeah,

13:25

it's a fun gray area right now in

13:26

the US.

13:26

Right, yeah.

13:30

It is kind of, it's definitely tricky,

13:32

especially for me as someone who's outside

13:35

the US seeing what's going on.

13:38

It is kind of hard to follow some

13:40

discussions around it.

13:42

So it's good that you kind of explained

13:44

the situation.

13:47

But I think another important thing about

13:48

this story that you kind of touched on

13:50

a little bit when you were doing the

13:52

initial coverage here was that when they

13:55

were able to unlock the device,

13:57

the MacBook in this case,

13:59

there was actually a Signal application on

14:03

the MacBook and they were able to take

14:05

evidence from that Signal application on

14:08

that MacBook.

14:09

And I think that's like a really important

14:11

thing that we need to discuss because I

14:14

think people need to be careful about

14:18

using Signal desktop applications because

14:21

of this instance, right?

14:24

Desktop computers have less security

14:28

protections than a mobile device.

14:31

And in this case, it's...

14:34

less of a security issue,

14:35

more of just like an unfortunate

14:37

circumstance where someone had biometrics

14:40

enabled, but it does kind of show that,

14:43

you know,

14:43

your security is only as secure as your

14:46

weakest link, right?

14:47

As soon as you have one link that

14:49

breaks, your entire Signal history is going

14:52

to be available to whoever's trying to

14:56

attack the device.

14:58

So I thought that was quite an interesting

15:00

part of the story actually.

15:03

Yeah, definitely.

15:05

It's... Yeah, and it's...

15:09

You're definitely a hundred percent right.

15:10

Like I heard...

15:13

I forget who it was,

15:14

but somebody on Firewalls Don't Stop

15:16

Dragons was talking about...

15:18

Phones are almost like,

15:19

from a security perspective,

15:20

it's almost like we took all the lessons

15:22

that we learned from computers and

15:24

integrated them into smartphones.

15:25

So smartphones are, for example,

15:27

smartphones rarely have their updates mess

15:29

up.

15:30

I'm going to take a fun little pot

15:32

shot at Windows here.

15:33

When I bought my first Windows 8

15:34

computer...

15:36

brand new, fresh out of the box, I

15:37

opened it, I booted it up, and it

15:39

failed to install and had to reboot, and

15:41

it got it on the second try. But,

15:43

uh, I've never seen anything like that

15:45

happen with a phone, right? Like, phones

15:47

almost never don't update, and they have

15:50

significantly better security all around.

15:53

Um, you know, they have better

15:55

sandboxing and all kinds of stuff. I'm not

15:57

really an expert on phones, but I know

15:58

they're significantly more secure. But even

16:02

so, yeah, I think... I think the

16:05

I think it just kind of goes back

16:06

to not just the security,

16:09

because there's so many aspects of it.

16:11

That certainly is a good point,

16:12

that desktop devices are generally less

16:15

secure,

16:16

but also just being mindful of

16:19

you know, backups, right?

16:20

Like backups are a good thing to have,

16:21

especially in the privacy community.

16:22

We're really big fans of not using the

16:24

cloud, generally speaking.

16:25

And it's like, okay, that's great.

16:27

But what happens if, you know,

16:29

there's a house fire or you're, you know,

16:31

there's a flood or anything.

16:33

What happens if you get robbed?

16:33

You know,

16:34

especially like I need to buy a new

16:36

one.

16:36

I used to have a little external portable

16:39

hard drive that was my backup drive.

16:41

And the first thing I did was encrypt

16:42

it because I knew I'm like,

16:43

if somebody breaks in,

16:44

they can grab that right off the desk

16:46

and go.

16:46

Like there's no friction to stealing that

16:49

at all.
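
A quick aside on that point: the principle is simply that anything written to a drive somebody can walk away with should already be ciphertext. Below is a minimal Python sketch of the idea, assuming the `cryptography` package is installed; in practice you would more likely use full-volume encryption (FileVault, BitLocker, LUKS, VeraCrypt) rather than rolling your own, so treat this as an illustration only.

```python
# Toy illustration of "encrypt before it touches the external drive" --
# not a recommendation of a specific tool.

import os
import base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    # Derive a 32-byte key from the passphrase, encoded the way Fernet expects.
    kdf = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

def encrypt_backup(src: str, dst: str, passphrase: bytes) -> None:
    salt = os.urandom(16)
    with open(src, "rb") as f:
        token = Fernet(key_from_passphrase(passphrase, salt)).encrypt(f.read())
    with open(dst, "wb") as f:
        f.write(salt + token)   # keep the salt alongside the ciphertext

def decrypt_backup(src: str, passphrase: bytes) -> bytes:
    with open(src, "rb") as f:
        blob = f.read()
    salt, token = blob[:16], blob[16:]
    return Fernet(key_from_passphrase(passphrase, salt)).decrypt(token)
```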

16:49

So yeah, just keeping in mind every,

16:54

again, all the end points.

16:55

And we talk about that,

16:57

I think in the privacy community,

16:58

we think about that a lot in terms

16:59

of,

17:01

the far end,

17:03

like I see a lot of people who

17:04

criticize encrypted email because they're

17:05

like, oh,

17:06

most people don't use encrypted email.

17:08

So, you know,

17:09

it might be encrypted while it's in your

17:10

inbox,

17:10

but it's still unencrypted in their inbox.

17:13

And which is a valid but different

17:15

argument.

17:15

But the point is, like,

17:16

we don't always think about it in the

17:18

context of, OK,

17:20

but now there's a second copy of your

17:21

data on your end of things sitting on

17:24

that external hard drive on the desk or

17:26

sitting on that computer desk.

17:28

you know,

17:28

synced up through Signal or whatever

17:30

that's less secure.

17:30

Or, you know,

17:31

some people run a NAS and they use,

17:32

what is it,

17:32

like SyncThing to sync their data.

17:34

And it's like, okay,

17:35

now you've got this constantly running,

17:37

twenty-four-seven device sitting in the

17:38

corner that's syncing all your calendar,

17:41

your contacts, your data.

17:42

So, yeah, it's really important to,

17:44

when you do that threat modeling and you

17:45

start building your

17:48

your security posture,

17:49

I guess we'll call it.

17:50

It's important to have that moment where

17:51

you stop and think about every step of

17:54

the chain.

17:54

You know,

17:55

what happens if this gets compromised?

17:57

If this threat happens, break-in, flood,

18:00

you know, it's, yeah.

18:01

I mean,

18:02

it's really important to think about that

18:04

whole picture and not just think like,

18:05

okay, I'm using Signal, I'm good.

18:09

Yeah, definitely.

18:11

Your security needs to be like...

18:14

very,

18:15

your plan needs to be very clear and

18:18

multi-layered.

18:19

But I think another thing that's also kind

18:20

of important in this story is the person

18:24

that had the iPhone seized,

18:26

it was actually only an iPhone 13,

18:29

which isn't, you know,

18:30

one of the latest generations of iPhones

18:32

that has more of those hardware security

18:34

features like the iPhone 17,

18:35

which has

18:37

MTE, memory tagging enforcement. So it's

18:41

interesting that even an iPhone with

18:43

lockdown mode enabled was able to block

18:47

these sorts of forensic tools, um, which I

18:50

think kind of shows the power of this,

18:52

of lockdown mode itself, actually. Um,

18:56

because you would expect an older device

18:59

is probably more likely to be exploited

19:02

because of its age, right?

19:04

Through hardware vulnerabilities and stuff

19:07

like that.

19:07

And because it doesn't have that updated

19:10

hardware, it's kind of surprising.

19:12

So that was another thing that was quite

19:14

surprising to me in this story.

19:18

Yeah, I think I missed that part.

19:19

Thank you for bringing that up.

19:21

I did see people mention that.

19:23

Now that you mentioned it,

19:23

I saw people talking about an iPhone 13

19:26

with lockdown mode, but for some reason,

19:28

my brain didn't make the connection.

19:34

Obviously,

19:35

we're always going to encourage people to

19:36

do the best they can and use the

19:38

latest devices and the best security

19:39

measures,

19:39

but I think that also I would like

19:44

to use that as an opportunity to remind

19:45

people that

19:47

there's a lot of factors that go into

19:48

security.

19:49

Cause I know there's some people that

19:50

maybe can't afford like a brand new device

19:53

or, uh, you know,

19:54

for a lot of reasons,

19:55

like environmentalism,

19:56

like I don't want to throw out this

19:57

perfectly good iPhone.

19:59

Um,

19:59

I don't know if the iPhone is still

20:00

getting updates, but I assume so if,

20:01

if lockdown would work, but, uh, you know,

20:04

my point being like,

20:04

I'm one of those people,

20:05

like, I have a Pixel 6a,

20:06

and I plan on using it probably until

20:08

next year when it stops getting updates.

20:10

And you know, it's,

20:11

when that happens,

20:12

I want to get the latest Pixel so

20:13

I can get all these nice hardware

20:15

features.

20:15

But I think that's just what I'm getting

20:17

at is that's,

20:18

that's part of threat modeling as well is

20:19

like, will this meet my needs?

20:20

Will it is, is this,

20:23

am I doing enough to protect myself?

20:25

And I think sometimes we can feel like

20:27

we always have to do more because privacy,

20:29

there's always more to do in privacy.

20:31

There's, there's always new, new tools,

20:33

new ways to improve your privacy and,

20:35

unfortunately, new threats popping up.

20:37

But you know, that's,

20:39

that's one of the reasons that we do

20:41

threat modeling is to try and make sure

20:44

that we are hitting those goals,

20:45

we are doing what we need to to

20:46

protect our data.

20:47

And

20:49

Yeah,

20:49

I guess what I'm getting at is as

20:51

long as you're fulfilling your threat

20:53

model,

20:53

it's okay to give yourself some grace if

20:55

you're not going as hardcore as you would

20:57

like to.

20:58

Keep going, keep trying to get there,

21:00

but as long as you are meeting your

21:01

threat model.

21:02

And that doesn't necessarily mean you have

21:04

to run out and buy the latest thing

21:05

because now it's got this new hardware

21:07

feature, but try to if you can.

21:10

Again, I want to reiterate that,

21:11

but this is why threat modeling works.

21:14

Exactly, yeah.

21:15

And on the topic of protecting your data,

21:17

let's head into this new story here.

21:19

And this is Notepad++ update feature

21:22

hijacked by Chinese state hackers for

21:24

months.

21:25

So Chinese state-sponsored threat actors

21:28

were likely behind the hijacking of

21:29

Notepad++ update traffic last year that

21:33

lasted for almost half a year.

21:35

The developer states in an official

21:37

announcement today.

21:39

So that's quoting from the Bleeping

22:40

Computer article here.

21:43

A statement from the hosting provider for

21:45

the update feature explains that the logs

21:47

indicate that the hacker compromised the

21:49

server with Notepad++ update application.

21:54

External security experts helping with the

21:57

investigation found that the attack

21:58

started in June of 2025.

22:01

So I guess for anyone who's not on

22:03

the

22:04

who doesn't use Windows,

22:07

Notepad++ is basically an advanced version

22:11

of Notepad that has a lot of the

22:13

features that you would expect in,

22:15

you know,

22:16

advanced note-taking applications like

22:18

syntax highlighting and such,

22:20

so it definitely builds upon the Notepad

22:25

like piece of software that's available in

22:27

Windows.

22:28

It's basically like a fully featured

22:31

alternative to that.

22:32

And it's actually extremely popular.

22:35

So that's why it's extremely concerning

22:37

that the update feature has been hijacked

22:39

by Chinese state hackers.

22:43

And just quoting from the article again

22:45

here,

22:45

the attackers specifically targeted

22:47

Notepad++ domains with the goal of

22:49

exploiting insufficient update

22:51

verification controls that existed in

22:53

older versions of Notepad++.

22:57

So basically in December,

22:59

Notepad++ released version

23:01

8.8.9,

23:02

which addressed a security weakness in its

23:04

WinGUp update tool after multiple

23:08

researchers reported the updater would

23:10

receive malicious packages instead of

23:12

legitimate ones.
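
For context on what "insufficient update verification" means here: an updater should only install a package whose signature verifies against a publisher key pinned inside the application, so that a hijacked download server or redirected update traffic cannot substitute a malicious build. Here is a minimal, hypothetical sketch of that check in Python, assuming the `cryptography` package; it is not the actual WinGUp/Notepad++ code, just the general pattern.

```python
# Hypothetical sketch of signed-update verification -- not the actual WinGUp code.
# The point: refuse any package whose signature does not verify against a public
# key pinned inside the app, so a hijacked download path cannot swap in malware.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- publisher side (done once, at release time) ---
publisher_key = Ed25519PrivateKey.generate()            # publisher's signing key
package = b"...bytes of the legitimate update installer..."
signature = publisher_key.sign(package)                 # shipped next to the package

# --- client side (inside the updater) ---
pinned_public_key = publisher_key.public_key()          # pinned at build time in real life

def safe_to_install(pkg: bytes, sig: bytes) -> bool:
    """Return True only if the package is exactly what the publisher signed."""
    try:
        pinned_public_key.verify(sig, pkg)
        return True
    except InvalidSignature:
        return False

print(safe_to_install(package, signature))                # True: untampered package
print(safe_to_install(package + b"tampered", signature))  # False: modified in transit
```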

23:14

And there was a great post on Mastodon

23:17

that I saw by security researcher Kevin

23:20

Beaumont,

23:21

where he warned that at least three

23:23

organizations were affected by these

23:25

update hijacks.

23:27

which were followed by hands-on

23:29

reconnaissance activity on the network.

23:33

So yeah,

23:34

this is kind of a pretty concerning story,

23:36

especially because it's such a popular

23:38

piece of software.

23:39

Like a lot of people on Windows that

23:41

are editing text files will be using this

23:43

software.

23:45

So it is kind of concerning,

23:47

but just handing it off to you here,

23:48

Nate, what are your thoughts so far?

23:52

Yeah, I will admit,

23:55

I use Notepad++ on Windows.

23:57

So that was a fun headline to read.

24:00

I think I say this partially to make

24:03

myself feel better.

24:04

One thing to note in the article is

24:05

it does mention that this attack was very

24:07

targeted in that the attackers very likely

24:12

only targeted specific users of Notepad++.

24:16

So probably, hopefully, not me. But they

24:19

could have just as easily targeted me,

24:21

right? Um, and actually, real quick on that

24:22

note, I think here at the bottom of

24:25

this article, yeah, um, there's some links.

24:28

They talk about this group called Rapid7

24:29

that did research on this, and they

24:31

released an analysis, a technical analysis,

24:33

of the malware, and they do include, um,

24:37

how to check and see if you were

24:38

compromised. I believe they included, uh...

24:41

what is it?

24:42

Hashes of specific files, basically,

24:44

that you can check and see if you

24:46

have the right file or a compromised file.

24:49

So if you are worried that you may

24:51

be impacted by this, first of all,

24:53

go ahead and update to the latest one

24:54

because hopefully that should fix it and

24:55

close off access.

24:56

But also you can double check that to

24:58

see.
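
If you want to run that kind of check yourself, the idea is simply to hash the updater-related files on disk and compare them against the indicators of compromise published in the analysis. A rough Python sketch follows; the path shown is the typical Notepad++ updater location and the hash set is an empty placeholder, so fill both in from the actual write-up rather than treating this as authoritative.

```python
# Hypothetical sketch: hash local files and compare against published
# indicators of compromise (IOCs). Path and hash set are placeholders.

import hashlib
from pathlib import Path

KNOWN_BAD_SHA256: set[str] = set()  # fill in with hashes from the published analysis

FILES_TO_CHECK = [
    Path(r"C:\Program Files\Notepad++\updater\GUP.exe"),  # typical updater location
]

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

for path in FILES_TO_CHECK:
    if not path.exists():
        print(f"{path}: not present")
        continue
    digest = sha256_of(path)
    verdict = "MATCHES a known-bad hash" if digest in KNOWN_BAD_SHA256 else "no match"
    print(f"{path}: {digest} -> {verdict}")
```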

24:59

But yeah, it's really not a good feeling.

25:05

And it's really unfortunate because...

25:08

Notepad++ kind of checks all the boxes

25:10

that we would normally advertise for a

25:14

good privacy tool, right?

25:15

I mean, it's open source.

25:17

As far as I know, it's offline.

25:19

I'm sure it pings for updates and stuff,

25:20

but it's not a cloud-based notepad.

25:23

It's...

25:24

It's very actively maintained.

25:26

The developer is very active.

25:27

But it just... This happened.

25:31

And it's really frustrating because I was

25:34

sitting here when I was reading this story

25:36

and I was taking notes for the show.

25:39

I was thinking to myself,

25:40

what's the lesson here?

25:41

Because normally the lesson is try to

25:44

stick to open source or try to make

25:46

sure you only get it from official

25:47

sources.

25:48

But when there's a supply chain attack

25:50

like this, there's really...

25:53

there's just not much you can do, I

25:54

guess. So yeah, I don't know, this

25:57

is a really unfortunate story. I think my

25:58

takeaway is, and I know I already said

26:00

this in the last story, but just to

26:02

remember that, um,

26:06

I mean,

26:06

nothing is unhackable and it's important

26:08

to be aware that anything you put in

26:11

a digital format.

26:12

And this is actually one of the things

26:13

I like about notepad is you don't even

26:15

have to save it.

26:16

Like I have literally like seven open

26:17

notes that I haven't saved for weeks and

26:20

I should really do something with them.

26:22

But you know,

26:22

that's one of the things is it'll like

26:23

hold onto those notes, um,

26:25

without having to save them.

26:26

And so even if you don't save it,

26:27

if it's any,

26:28

any kind of a digital format, it's,

26:30

it's at risk.

26:31

And I hate to say that cause that

26:33

feels like paranoia, but, um,

26:35

Yeah, that's kind of, I don't know,

26:37

is there any better lesson that you took

26:38

away from this one or?

26:39

Yeah,

26:41

I think this is kind of an unfortunate

26:43

circumstance, right?

26:44

But I think it does need to be

26:47

said that, you know,

26:48

open source tools are just as vulnerable

26:51

to being exploited by these sort of things

26:53

as closed source ones.

26:56

You know,

26:56

just because a tool is open source doesn't

26:58

technically mean that this wouldn't have

26:59

happened or this would have happened,

27:02

right?

27:04

So I think that's one thing to consider,

27:06

but I think when it comes to note

27:08

taking tools and such,

27:11

I think it's important to use tools that

27:14

have been independently verified, right?

27:16

So if a security researcher had done an

27:18

audit on this piece of software,

27:20

for example,

27:21

I think it would have been pretty obvious

27:23

that this was a mistake in the update

27:26

process, right?

27:28

Any security researcher that would have

27:30

had an eye over this,

27:33

would have noticed this.

27:34

So, you know, I think it's

27:38

unfortunate,

27:38

but I think there's things you can do

27:40

to make sure that the software you're

27:41

using isn't, uh,

27:43

doesn't contain easily exploitable stuff

27:46

like this.

27:46

So, you know, with Privacy Guides, we,

27:49

a lot of times we require independent

27:51

verification by security researchers, um,

27:53

or like a security firm,

27:55

such as, like, Cure53 or other ones.

27:58

Um, so that is,

27:59

that is something you can do to try

28:01

and mitigate some of these concerns.

28:03

But in that case as well, it is,

28:06

you know,

28:07

Things get missed.

28:08

Not every single piece of software is

28:10

infallible.

28:13

And I think, yeah,

28:14

someone in the chat said,

28:15

JS said a secure OS could help a

28:18

lot.

28:19

Yeah, exactly.

28:19

So if you have an operating system that

28:22

has more control over the sandboxing of

28:25

applications, for example,

28:27

I'm sure Nate would like to talk about

28:29

Qubes,

28:30

but there's all sorts of things you can

28:34

do on Mac OS and Windows even has

28:37

some sandboxing capabilities.

28:40

So I think that is another important thing

28:42

that people need to think about, right?

28:45

Having adequate security protections and

28:49

sandboxing and separating tasks into

28:52

separate areas can definitely reduce the

28:56

damage of an attack because in all

28:58

circumstances, it's possible.

29:02

Every piece of software can be exploited.

29:03

So trying to reduce the damage that can

29:07

happen is definitely a good step in the

29:10

right direction.

29:14

Yeah, I'm glad you mentioned audits,

29:15

because that did completely slip my mind,

29:17

because audits are really good.

29:19

And I do want to point out that

29:21

audits, and you did kind of mention this,

29:24

things can still slip through the cracks,

29:25

because audits are a snapshot of code at

29:30

a specific point in time.

29:31

So it's entirely possible that Notepad++

29:35

could have been audited in June,

29:37

or let's say May,

29:39

And then this happened like the following

29:41

month and there's really nothing to

29:42

protect against that.

29:44

But another thing,

29:45

another advantage of audits is that they

29:47

tend to show that an organization is

29:49

really serious about their security and

29:52

securing their supply chain and stuff like

29:54

that.

29:55

And, um,

29:57

Again,

29:57

I want to point out like audits are

29:58

also really expensive.

30:00

So just because a company doesn't get an

30:02

audit doesn't mean that they're not safe

30:03

or they're not taking it seriously,

30:05

especially if it's like a really hobbyist

30:06

project.

30:07

Like, I think Notepad++,

30:08

if I remember correctly, is donation-only.

30:10

So they probably don't have thousands of

30:12

dollars to invite Cure53 to come

30:14

in and look at their code.

30:15

But for those organizations that do have

30:17

that kind of money, like Proton, Mullvad,

30:20

IVPN, I know there's plenty of others,

30:22

but

30:24

Signal.

30:25

When they bring in auditors and have them

30:27

look at their code and they have that

30:29

kind of resource, it shows that, again,

30:32

like you said,

30:32

things can slip through the cracks,

30:34

but it's a really good sign because it

30:35

means that they probably have a culture of

30:39

security and it's less likely for bad

30:42

things to happen.

30:43

So yeah, I'm glad you brought that up.

30:45

Um, and yeah, I mean, if you want,

30:48

I will say Qubes or, you know, there's,

30:50

there's plenty of other ways to do this,

30:51

but, uh, yeah, I,

30:52

I am using Qubes right now in front

30:54

of me for those who didn't know.

30:55

And, um,

30:57

that is one of the things I like

30:58

about it is the compartmentalization of,

31:00

I literally have a qube for Privacy

31:02

Guides.

31:04

I have a qube for my personal stuff.

31:06

I have a qube for banking.

31:06

And, uh, if,

31:08

if one of my software was to become

31:12

in, um,

31:14

I don't know why my brain refused to

31:16

pick a word there.

31:17

If one software I use became compromised

31:19

and impacted,

31:21

it would limit the damage to that specific

31:23

qube, which would still suck,

31:25

but at least it would limit the damage,

31:27

right?

31:28

So yeah,

31:29

compartmentalization and secure OSs and

31:31

things like that really do go a long

31:34

way.

31:35

All righty.

31:41

Before we move on to the next story,

31:43

we're going to pause here briefly to give

31:45

an update of what's been going on behind

31:49

the scenes at Privacy Guides.

31:52

Um, so yeah,

31:53

we're going to talk about RAM shortages

31:54

affecting the Raspberry Pi,

31:56

but first in case you guys didn't hear

31:58

our smartphone security course is now open

32:01

to the public or the first part of

32:03

it is.

32:03

So there's a,

32:05

there's three parts technically kind of

32:06

four,

32:07

cause we have like a little intro video.

32:09

And then from there,

32:09

it branches off into Android and iPhone,

32:12

and there's a beginner and intermediate

32:13

and an advanced level.

32:15

The beginner level is now open to the

32:16

public.

32:18

I will let Jordan talk about the

32:20

intermediate iOS video in a second.

32:23

But yeah, the beginner video focuses,

32:24

if you guys haven't seen it yet,

32:26

I do think it's on PeerTube now.

32:27

I could be wrong about that,

32:28

but I'm pretty sure it is.

32:30

The beginner video focuses on things that

32:35

are really subtractive instead of additive

32:37

to make your phone more secure.

32:39

So things like removing apps you don't

32:40

use,

32:41

changing the settings and things like

32:42

that.

32:43

which I think should be doable for a

32:45

lot of people because even simple things

32:47

like downloading Signal or switching to

32:49

Brave or Firefox,

32:50

those are things that require people to

32:54

take action, right?

32:55

They require them to go download an app,

32:57

sign up for an account,

32:58

no matter how easy.

33:00

And I think some people...

33:03

I think it's a mental block that some

33:04

people have a hard time doing that.

33:05

And so the beginner level is things that

33:09

anybody can do anytime, right?

33:10

Like you're laying on the couch,

33:11

instead of doom scrolling,

33:13

start deleting apps and stuff like that.

33:14

So yeah,

33:16

I'm really excited that that's out to the

33:18

public now.

33:20

And I will turn it over to Jordan

33:22

real quick to give us some updates on

33:24

any upcoming videos if you'd like to,

33:27

or if you want to hold off on

33:28

that.

33:29

Yeah, thanks.

33:31

So I guess, yeah, like Nate said,

33:32

we've got the intermediate smartphone

33:35

security guides coming out soon,

33:37

and we've already got the Android version

33:40

done,

33:40

and the iOS one is pretty close as

33:42

well.

33:42

So I'm hoping possibly early next week we

33:45

can get that published for members.

33:48

And if you're a member,

33:49

you can get early access to videos,

33:51

just a reminder.

33:54

And that was sort of covering,

33:56

I guess I should say,

33:58

this whole smartphone security course has

34:00

been written by Nate and overall direction

34:02

by Nate here,

34:03

which I think he did a great job

34:05

with.

34:06

And I think it'll be really,

34:08

really useful to people.

34:09

We've already had the beginner ones

34:12

released to the public,

34:13

and we already had some really good

34:14

feedback on those videos.

34:16

And that's just the beginning.

34:18

So the intermediate ones dive more into

34:22

things that, you know,

34:24

are less to do with your device and

34:26

more to do with the services that you

34:29

use and the applications that you're

34:31

utilizing.

34:32

So for instance, you know,

34:35

moving to encrypted email providers

34:38

instead of using a plain text one like

34:40

Gmail.

34:42

I'm not going to give too much away

34:44

just because I don't want to spoil the

34:45

video.

34:45

But we also do go into some of

34:48

the things on Android,

34:49

such as alternative app stores.

34:51

So if you're interested in that sort of

34:53

thing,

34:53

you're not all the way into the advanced

34:56

area,

34:57

definitely look out for those videos next

34:59

week.

35:00

And we should have the advanced

35:03

series coming out shortly after that as

35:06

well, because that's, you know, quite a bit

35:08

shorter than the intermediate ones. And

35:10

Nate's kind of been on fire here. He's

35:12

been, like, writing so many videos up and

35:14

sending the footage over to me. So we've

35:16

also got another video in the works, which

35:19

is the private messaging video, which we're

35:24

working on. There's also, I guess, we haven't

35:27

released... have we released this yet?

35:29

We're also working on a video about

35:32

private browsing.

35:33

I'm not sure if that's actually released

35:35

yet.

35:35

Is it, Nate?

35:39

That's a good question.

35:41

The wheels are turning.

35:42

I know because you've sent me the preview

35:46

versions,

35:47

but I don't know if we've published those

35:49

to members yet.

35:50

We're going to have to follow up on

35:52

that one.

35:52

I don't think we have.

35:54

All right, then.

35:55

I guess members, be ready for that.

35:57

That'll be coming out hopefully after the

35:59

stream if we can line everything up.

36:01

But that's also done.

36:03

So that was another video that Nate's

36:05

scripted and put together the footage and

36:08

recorded.

36:08

And that was another really interesting

36:10

video, I think,

36:11

which is going to be useful for people

36:14

to send to people that are stuck using

36:17

Chrome or that have misconceptions about

36:20

the private mode on Chrome because

36:23

it's called incognito mode,

36:25

but you're not really very incognito.

36:27

So I think that'll be an interesting video

36:30

for some people to send to their

36:32

relatives, family, friends, you know,

36:34

get people moving off Chrome.

36:36

I think it's one of the easiest things

36:38

you can get people to switch, you know,

36:40

instead of Chrome, use Firefox, you know,

36:43

instead of

36:46

using incognito mode,

36:47

you could use Mullvad Browser instead.

36:49

And that's really a private way to browse

36:51

the internet.

36:52

So that's kind of what it's looking like.

36:55

We've got that look out for the private

36:58

messaging video.

36:58

That's what we'll be working on next week.

37:01

And yeah,

37:02

I guess I can throw it back to

37:03

you, Nate.

37:03

Is there any other things that we've been

37:05

working on this week?

37:08

I know we've got a few more news

37:09

articles coming out this weekend.

37:12

I, I tried really hard to,

37:15

Throughout the week,

37:15

I will collect news stories that I'm like,

37:17

oh,

37:17

this is important enough that I think we

37:19

should write a brief about it,

37:20

but it's not necessarily important enough

37:22

to put in the podcast.

37:23

And I keep telling myself every day,

37:25

I'm like, all right, just take, you know,

37:27

ten minutes, twenty minutes,

37:28

write up a little quick,

37:29

short thing about this.

37:30

And then for some reason,

37:31

by the time I get done with my

37:32

work for the day, I'm so tired.

37:35

So my point being, I end up, like,

37:36

stacking them all towards the end of the

37:37

week, and I really shouldn't do that.

37:39

But...

37:40

Yeah,

37:41

we do have a bunch of those coming

37:42

out, maybe half a dozen or so.

37:45

And some of those are from me,

37:46

some are from Freya,

37:48

who's one of our regular writers.

37:49

So I think that's about it.

37:53

Yeah, and just to real quick,

37:54

just to kind of address it,

37:56

some of these videos we're putting out

37:57

right now, I think are very entry level.

38:01

And I think they're really good for

38:03

sharing with your friends and family who

38:04

maybe are not as excited about privacy as

38:07

you are, or maybe...

38:10

I don't know.

38:10

I think most of you guys probably do

38:12

a really good job of like sharing the

38:13

stuff with your friends and family,

38:14

but I guess it's just something you can

38:15

share with people.

38:16

But the point I'm getting at is don't

38:18

worry.

38:18

We will still be doing plenty of like

38:20

really advanced, uh,

38:21

stuff in the future for, you know,

38:23

whatever your tech level is.

38:24

So if, if you're thinking like, man,

38:26

these are really entry level stuff,

38:27

it's honestly,

38:28

cause we kind of looked at our,

38:29

our library of content and went, Oh,

38:31

we haven't covered a lot of the really

38:32

basic stuff.

38:33

So that's kind of what we're doing.

38:34

And, um,

38:35

we're definitely going to get back into

38:36

other stuff as we go.

38:38

I think, um,

38:39

probably sooner than later here, actually,

38:41

probably after this next video,

38:42

I would imagine we'll maybe looking to do

38:43

in something a little bit more high level.

38:45

So

38:47

Yeah,

38:47

I think one other thing that I guess

38:50

we can touch on now,

38:51

which should be coming out at least I

38:55

would say in the next month or so,

38:57

we're working on another section of the

38:59

website.

39:00

So if anyone's...

39:01

I think people should kind of get hyped

39:03

for that.

39:03

There's been a lot of work behind the

39:05

scenes from Em who's been working a lot

39:07

on a separate section.

39:09

I don't think I'll go into too much

39:11

detail about it here because...

39:13

There's, you know,

39:14

it's still all being finalized,

39:16

but it's definitely being worked on.

39:19

And I think this is going to be

39:20

kind of a pretty big year for privacy

39:22

guides because we're expanding in so many

39:24

exciting ways.

39:25

So I'm really excited to see how that

39:28

pans out.

39:31

Yeah, definitely.

39:32

Em has been super,

39:33

super busy working on that.

39:35

And I...

39:37

I've seen the outlines of how that is

39:40

going to be structured,

39:41

but I haven't seen the...

39:44

What I'm trying to get at is I'm

39:45

excited to see it.

39:46

I think it's going to be awesome.

39:47

I know she's been working really hard on

39:48

it, and yeah, it's going to be great.

39:52

One more thing real quick before we jump

39:54

into the next story is JS left another

39:56

comment here about private browsing.

39:58

Said Brave has a pretty good strategy with

39:59

having ad block included.

40:02

Yeah, it's really...

40:04

Jonah and I touched on this in last

40:05

week's episode, but...

40:08

I think in my experience,

40:09

one of the best ways to try and

40:11

nudge people towards privacy tools is

40:13

rather than focusing on the privacy and

40:15

the security benefit,

40:16

treat that as like a bonus and focus

40:18

on how it makes their life easier.

40:20

Like I've had really good success with

40:21

getting people to try password managers

40:23

because I happen to be in the room

40:24

when they're trying to log into something

40:25

and they're like, oh my God,

40:27

what was my password for this website

40:28

again?

40:28

And I'm just like,

40:30

hey,

40:31

you want to know how to never forget

40:32

your password again?

40:34

And I run into those people six months

40:35

later and they're like, oh my God,

40:36

this is amazing.

40:37

How did I live without this?

40:38

So yeah, definitely.

40:40

I've gotten people to switch to Brave

40:41

doing that exact same thing.

40:43

I saw two people in a Discord room

40:45

I was in having a conversation where

40:46

they're like, oh my God,

40:46

the ads on this website are obnoxious.

40:48

I'm like, have you tried it in Brave?

40:50

And like, oh my God, Brave fixed it.

40:53

This is awesome.

40:54

So yeah,

40:57

that is one thing Brave has going for

40:58

it.

40:58

But I know there's probably ways to

40:59

recreate that in something like Firefox as

41:01

well.

41:02

So yeah, I just wanted to mention that.

41:05

Yeah,

41:05

it's kind of surprising to see what people

41:08

will put up with on the modern web,

41:10

like full-page ads,

41:11

auto-playing video ads.

41:13

Like, oh my goodness,

41:15

it's a wasteland out there.

41:18

Dude,

41:18

I had to turn it off on YouTube

41:19

the other day for something.

41:21

I can't remember what it was.

41:23

Oh, it was our intro video.

41:27

I wanted to make sure that the end

41:28

cards were, you know,

41:29

click here to go to Android,

41:30

click here to go to iOS.

41:32

Quick note for other Brave users.

41:34

And I wasn't seeing the end cards.

41:36

And I'm like,

41:36

did we not put end cards on there?

41:38

And so I turned off my Brave Shields.

41:40

I guess I had some optional thing turned

41:41

on that got rid of the end cards.

41:43

But while the Brave Shield was off,

41:45

there was literally like a one minute

41:46

pre-roll ad to start the video.

41:48

There was an ad on the side.

41:49

There was an ad below that.

41:50

There was an ad at the bottom of

41:51

the video.

41:51

I'm just like, oh my God,

41:53

how do people put up with this?

41:55

It's so bad.

41:58

Oh yeah.

41:59

So definitely,

42:00

I don't know how people live like that.

42:03

Speaking of bad things,

42:04

there is being an ongoing RAM crisis.

42:08

And I don't know if you've seen that,

42:10

but one of our previous team members,

42:12

Kevin,

42:13

he wrote an article about the whole RAM

42:15

crisis that's going on.

42:17

And unfortunately,

42:18

that has now started to affect Raspberry

42:22

Pi.

42:22

So Raspberry Pis have received another

42:25

price hike in the last two months.

42:28

And basically, the more RAM the board has,

42:30

the more its price is increasing.

42:32

So just quoting from this article here

42:34

from Ars Technica,

42:36

the ongoing AI-fueled shortages of memory

42:39

and storage chips has hit RAM kits and

42:42

SSDs for PC builders the fastest and

42:45

hardest,

42:46

meaning it's likely that for other

42:47

products that use these chips,

42:49

we'll be seeing price hikes for the entire

42:51

rest of the year, if not longer.

42:54

And the latest price hike news comes

42:56

courtesy of Raspberry Pi CEO

42:58

Eben Upton, sorry if I said that wrong,

43:01

who announced today that the company would

43:03

be raising prices on most of its single

43:05

board computers for the second time in two

43:07

months.

43:08

Prices are going up for all Raspberry Pi

43:10

4 and Raspberry Pi 5 boards with two

43:12

gigabytes or more of LPDDR4 RAM,

43:17

including the Compute Module 4 and 5

43:20

and the Raspberry Pi 500 computer

43:22

inside a keyboard.

43:24

And the two gigabyte boards pricing will

43:27

go up by ten dollars.

43:28

Four gigabyte boards will go up by fifteen

43:30

dollars.

43:31

Eight gigabyte boards will go up by thirty

43:33

and sixteen gigabyte boards will increase

43:35

by a whopping sixty dollars.

43:38

So I'm kind of happy about this in

43:41

a very selfish way.

43:42

I actually bought a Raspberry Pi like two

43:45

months ago before the price hike.

43:46

So I'm kind of happy that I decided

43:49

to finally replace mine.

43:51

But what do you what do you think

43:52

about this?

43:56

Yeah,

43:57

I happen to have a couple of Raspberry

43:59

Pis sitting around at home that I'm

44:01

actually not using,

44:02

and I'm trying to figure out what to

44:03

do with them.

44:06

And pardon my technical difficulties while

44:08

I try to get my screen share back

44:10

up here.

44:10

But yeah, it's... I don't know.

44:13

It's really a bummer because the whole

44:14

selling point of the Raspberry Pi is that

44:17

they're so cheap, right?

44:18

And personal opinion, I think that's a...

44:23

I think that's slightly misleading to say

44:25

that they're so cheap because they do...

44:29

They're very cheap if you just want the

44:30

board.

44:31

But if you want the case that goes

44:33

over it and stuff like that,

44:34

I think it becomes a very different story

44:37

at that point.

44:37

I think the price does start to go

44:39

up significantly.

44:41

And maybe it's me because I have cats.

44:43

And I'm like,

44:43

the cat hair will get everywhere.

44:44

It will absolutely get all over the board

44:46

and everything.

44:48

But either way, they're very inexpensive.

44:50

And there's certainly...

44:52

In my opinion,

44:53

I don't think they make for good full

44:55

computers.

44:56

It's not like, oh,

44:56

I got this thirty dollar computer.

44:59

They're still definitely.

45:02

They're still definitely.

45:06

They're made for very specific tasks,

45:08

is what I'm trying to say.

45:10

They're really popular,

45:11

and this is why we're covering them.

45:11

They're really popular in the privacy

45:12

community for running, like, a

45:14

single-member Mastodon instance or like a

45:17

simple next cloud instance or DNS,

45:20

you know,

45:20

to block ads and stuff at home.

45:22

Um,

45:22

ads and trackers and all kinds of stuff.

45:24

And so this to me.

45:30

Sorry, I know I'm rambling a little bit,

45:31

but this to me touches on a larger

45:32

issue that I've been trying to figure out

45:35

how to put into words and address for

45:38

several years now,

45:39

which is the idea of privacy as a

45:41

privilege.

45:43

Because it is really unfortunate when you

45:46

have to pay for privacy.

45:48

Like kind of going back to phones, right?

45:49

We talk about something like Graphene OS.

45:51

And Graphene OS is only available on a

45:53

Pixel,

45:53

which is a several hundred dollar phone.

45:56

which is still cheaper than an iPhone.

45:57

But it's not a cheap phone.

45:59

And again, certain areas can't get pixels.

46:02

And so it's almost like you have to

46:04

be privileged enough to have the money and

46:06

live in the right area to get a

46:07

GrapheneOS phone.

46:09

And that's really unfortunate.

46:10

But at the same time,

46:11

things like Raspberry Pis,

46:13

things like VPN servers,

46:15

those things cost money.

46:17

And you can't

46:21

You can't pay the rent for your office

46:24

by, I don't know, telling the landlord,

46:26

how do you signal, right?

46:28

That's just not how the world works.

46:29

So the point I'm getting at is it's

46:31

really unfortunate to see this price go up

46:33

on these very reasonably priced tools that

46:37

are designed to help,

46:38

or maybe they're not designed to,

46:40

but they're very good for helping people

46:42

reclaim control of their data.

46:44

And it's really unfortunate to see that

46:46

that barrier to entry go up,

46:48

even if it's only a little bit.

46:50

But at the same time, it's like,

46:51

I don't know, some things just cost money.

46:52

And that's an unfortunate reality,

46:54

especially in this situation right now.

46:56

So, yeah.

46:58

I think this is just another good reason

47:00

to really dislike AI.

47:02

This is the reason why all of this

47:04

stuff is becoming more expensive.

47:06

There's all these AI CEOs, tech companies,

47:09

they're buying up all the RAM,

47:11

they're buying up all the SSDs for running

47:13

AI models because they need all that

47:16

memory to train and to run all these

47:19

models, right?

47:21

It's not...

47:24

And I think we've kind of seen this

47:25

for a while with Raspberry Pi.

47:26

It's kind of been a continuing drama with

47:29

Raspberry Pi.

47:30

They keep kind of,

47:31

I don't know if you've noticed,

47:33

but over the last couple of generations of

47:35

Raspberry Pi,

47:35

it's like they've been pushing the price

47:39

up and up and up.

47:40

And it's got to a point where,

47:42

you know, I was looking at,

47:45

like I had an original Raspberry Pi from

47:48

like, like a Model B.

47:51

And it finally died this year after

47:55

fourteen years of dedicated service.

47:58

So I had to replace it.

47:59

Right.

48:00

And there was nothing that was really the

48:02

same price as what I'd paid for that

48:04

original Raspberry Pi.

48:05

I think I only paid like twenty five

48:07

dollars for that original Raspberry Pi.

48:09

So I guess I could have bought like

48:11

a Raspberry Pi like mini one.

48:14

But, you know, I think it's.

48:17

Still,

48:18

I think it's a problem with affordability,

48:20

right?

48:20

And I think Raspberry Pi has kind of

48:22

been becoming more and more unaffordable

48:25

as time goes on.

48:26

And I'm not sure if the pricing is

48:28

justifiable, in my opinion.

48:30

I think there's plenty of alternative

48:33

options to Raspberry Pi.

48:35

Unfortunately, it does come with,

48:36

you know,

48:37

downsides such as software support not

48:39

being as good and, you know,

48:42

just general community support not being

48:44

as good.

48:45

But I think, you know,

48:47

There's definitely, you know,

48:49

bargains to be found like JS in the

48:50

chat said, I bought a mini PC,

48:52

sixteen gigabytes of RAM and one hundred

48:54

Intel CPU for one hundred and sixty USD.

48:57

Exactly.

48:58

You know, I think that's oh, yeah.

49:02

So they increased the price of that

49:05

because of the RAM shortage.

49:07

I think also, another thing with

49:09

these is, buying new is kind of not

49:14

always the best way to go,

49:15

right?

49:18

there's plenty of used options for,

49:20

you know,

49:20

you don't need a Raspberry Pi to run

49:22

a couple of small services.

49:24

You could always get like a refurbished

49:27

business computer,

49:28

like a Dell Optiplex or, you know,

49:31

these old retired business computers that

49:34

they're selling for like absolutely

49:37

nothing.

49:37

Like you can basically get them for free.

49:40

They may not be as power efficient,

49:42

but you're still getting the

49:44

ability to run those low-powered services.

49:48

So I think there is always an option.

49:49

But I do think when it comes to

49:51

the thing you said about privacy being a

49:54

privilege,

49:55

I think it's definitely a matter of

49:58

perspective.

49:59

I think

50:01

people can still do quite a lot for

50:03

their privacy.

50:04

Like a lot of privacy tools are free,

50:06

like ProtonMail, Tuta, that's all free.

50:09

Like you don't actually need to pay.

50:11

Like you can still get away with having

50:13

a free account.

50:15

And I think it's important to have these

50:17

free options, right?

50:18

Where, you know,

50:21

freemium model is really good in that

50:23

circumstance, right?

50:24

Like with Bitwarden,

50:25

you can have a syncing vault with security

50:28

and it's free for the most part.

50:30

You miss out on some small features,

50:32

but you're getting a lot of those features

50:34

that are the most important for free.

50:37

And I think, you know,

50:40

it is kind of unfortunate that there's

50:41

things like hardware where it's a little

50:43

bit more tricky with like, you know,

50:45

like Nate said, there's Google pixels.

50:47

They cost quite a lot of money depending

50:49

on what country you're in.

50:51

And they can also kind of, uh,

50:54

have, you know, problems with international

50:58

shipping. You know, not every country

51:01

has Pixels available. Um, it's not a

51:03

globally available device. But, you know,

51:09

I think a lot of people can also

51:10

get a really good level of privacy just

51:13

by, you know, de-bloating their phone,

51:15

switching the usage on their device, or,

51:17

you know,

51:18

instead maybe going for an iPhone,

51:20

which is available in a lot of locations.

51:22

So I think people don't have, you know,

51:25

zero options.

51:27

But I do think if you do want

51:28

that highest level of security and

51:30

privacy,

51:31

you definitely are going to need to shell

51:34

out some money for the hardware.

51:36

But I think when it comes to software,

51:38

we have a lot of good free options.

51:42

There's a bunch of examples like the Tor

51:43

browser.

51:44

You can browse anonymously for free.

51:47

It doesn't cost money.

51:48

But a lot of that is supported by

51:50

Tor network operators who do this for the

51:54

love of the game.

51:55

They're not...

51:56

in this to make money.

51:57

They're doing it to promote the free

52:00

internet,

52:01

to allow people to access information.

52:03

So I think we do have a lot

52:06

of privacy tools that are accessible to a

52:08

lot of people.

52:09

But there's also that gap that I think

52:11

needs to be filled when it comes to

52:13

hardware,

52:13

because we don't really have very many

52:16

affordable hardware options.

52:18

Because Google Pixels,

52:19

they're starting at like,

52:20

five hundred bucks.

52:23

And it's kind of hard to shell out

52:27

that much money.

52:28

A lot of people don't have that much

52:31

expendable income to just drop on a phone.

52:35

That's why a lot of people,

52:36

they go with carrier plans,

52:39

like they pay their phone off every month.

52:41

So I think we do need to be

52:44

thinking about the accessibility of things

52:46

and

52:47

It'd be good if there was another phone

52:49

option that had similar security to a

52:52

Google Pixel that, you know,

52:54

would allow people to do the same thing,

52:56

but with a much more affordable price.

52:58

But right now we don't really seem to

52:59

have that.

53:01

So, yeah, I don't really have,

53:04

I feel like I've been talking about this

53:05

for a while,

53:06

but do you have anything you want to

53:07

add to that, Nate?

53:10

No, that's fine.

53:10

I mean, I ramble plenty.

53:14

No, and yeah,

53:15

just to back up what you said,

53:16

you're absolutely right.

53:17

There is so much in privacy that can

53:19

be done for free or cheap,

53:21

like Signal and encrypted email.

53:24

But it's just, yeah,

53:26

it's unfortunate that once you start

53:27

getting to those higher,

53:28

and those things do so much.

53:31

How am I trying to word this?

53:40

It's the whole like, what is it?

53:41

The Pareto principle, like, that,

53:43

you know, I think, like,

53:54

switching browsers,

53:55

switching communication methods,

53:57

blocking ads, all that kind of stuff,

53:59

probably gets you, like,

54:04

There are a handful of, I would argue,

54:07

very reputable free VPNs,

54:09

but they're also horribly slow and very

54:11

limited on their capabilities.

54:13

Like you were saying,

54:16

it's unfortunate that once you start

54:18

getting to those higher levels of

54:20

really...

54:22

I guess perfecting for lack of a better

54:24

word, like perfecting your privacy.

54:26

Cause obviously privacy is a journey.

54:27

We'll never really get there.

54:29

There's no such thing as perfect privacy,

54:31

but once you start really doing those

54:32

advanced things,

54:33

like you mentioned switching to a graphene

54:34

phone,

54:35

like there's significant barriers and it's

54:37

just, it's very unfortunate I feel like.

54:40

But yeah.

54:41

I don't know.

54:44

Yeah.

54:44

I think,

54:45

I think you kind of summed up all

54:46

my, my thoughts on this one.

54:47

It is,

54:50

It's unfortunate, and yeah,

54:51

it's another reason to be mad at AI,

54:52

because yeah, this is... Oh my god, yeah.

54:57

I'm so mad.

54:58

Everything's so expensive right now,

55:00

and...

55:00

You know,

55:01

you and I were talking before the

55:02

recording about maybe, yes,

55:05

the rain cloud, perfect.

55:06

We were talking before the recording that

55:08

maybe I might need to get a new

55:09

computer and my wife,

55:11

she's been having some issues with her

55:12

Pop!

55:12

OS laptop, which, it's not a System76.

55:14

It's literally like a different ThinkPad

55:16

that has Pop!

55:16

OS on it.

55:17

And thankfully we found some stuff online

55:21

that we could install and it helps with

55:22

the power management.

55:23

And so that can has been kicked down

55:25

the road, but yeah,

55:25

now is not a good time to need

55:27

new computers.

55:30

So frustrating.

55:31

Yeah.

55:32

I'm truly sorry to anyone who needs to

55:34

buy an SSD or RAM right now.

55:37

I still need to buy a backup drive.

55:41

Yeah.

55:42

I do wonder if hard drives are affected.

55:44

That would be interesting.

55:46

I was going to say,

55:46

I think hard drives are significantly less

55:48

affected, if at all.

55:51

So it's probably not that big a deal.

55:52

But yeah, that's frustrating.

55:55

Okay.

55:59

We do have one more story real quick,

56:01

and I think I got my screen share

56:03

fixed here.

56:04

And this is about Microsoft and their use

56:09

of AI.

56:11

Is it going to work?

56:13

Oh, hold on one moment.

56:16

Yeah,

56:16

so this is actually potentially some good

56:18

news,

56:19

which is that Microsoft is apparently,

56:22

allegedly reconsidering how much AI they

56:26

are cramming into Windows.

56:29

So for those who are fortunate enough not

56:32

to have to deal with Windows Eleven,

56:34

Microsoft,

56:36

on the topic of AI and raising prices

56:38

and everything,

56:39

Microsoft is part of the problem because

56:40

they have been cramming AI aggressively

56:43

into everything you could possibly

56:45

imagine.

56:45

Like,

56:46

I'm going to get really niche with my

56:48

references here,

56:49

but if you guys have ever seen the

56:49

show The IT Crowd, which if you haven't,

56:51

you should because it's hilarious,

56:53

but

56:54

In the first season of The IT Crowd,

56:56

and I didn't notice this until I rewatched

56:58

it years later,

56:59

there's EFF stickers literally everywhere.

57:02

And I found out later that's because Cory

57:03

Doctorow was one of the advisors on that

57:05

show.

57:05

And he was, I believe,

57:06

a board member at EFF at the time.

57:08

So he just stuck stickers everywhere.

57:10

And then in season two,

57:11

there's significantly less stickers

57:12

because I think even the EFF told him

57:14

to chill out.

57:15

So that's basically what Microsoft has

57:17

been doing with AI is anything you can

57:20

think of, they're slapping AI on it.

57:22

And...

57:24

This has not gone over well at all,

57:26

at all, at all, at all,

57:28

because numerous studies are coming out

57:30

showing that most people at best do not

57:33

care about AI.

57:34

Most people either hate it or they're

57:36

completely indifferent.

57:37

There are very few people that are excited

57:39

about AI.

57:40

And Microsoft is finally getting that

57:43

memo.

57:45

Oh, man.

57:46

Where do we even begin?

57:48

Apparently,

57:49

Copilot is integrated into Notepad for

57:51

some reason on the topic of why people

57:53

use Notepad++ earlier.

57:55

It's in Paint,

57:56

which I guess that one I kind of

57:58

understand a little bit better for

58:00

generative AI, but...

58:03

I don't know.

58:04

If I want a powerful image editor,

58:06

I'll probably go with Photoshop or

58:08

something.

58:08

I don't know.

58:08

I'm getting off topic.

58:09

But anyways,

58:10

I'm just pointing out how deeply they've

58:12

shoved AI into everything.

58:13

Most notably,

58:15

many of the privacy veterans will remember

58:18

Recall,

58:19

which is one of the most horrifying

58:21

privacy invasions of the last several

58:23

years, probably.

58:24

I said one of, for the record.

58:26

It's definitely not the worst,

58:27

but it's up there.

58:27

Which,

58:28

real quick for those who don't know,

58:30

it's literally an AI that takes a shot

58:32

of your screen, a screenshot,

58:33

every couple of minutes or couple of

58:35

seconds.

58:36

And the idea is that you're supposed to

58:38

be able to type into your computer, like,

58:40

oh,

58:40

what was that website that had the green

58:43

shoes that I was looking at or whatever?

58:45

And it would go through and it would

58:46

be like, oh,

58:47

that was this link on Amazon.

58:48

The problem is it was so poorly thought

58:50

out that it didn't redact social security

58:52

numbers, passwords.

58:53

It did redact Netflix though.

58:55

So we know where Microsoft's priorities

58:56

were.
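
To make the concept concrete, here is a toy sketch of what a Recall-style pipeline boils down to: periodic screenshots, OCR, and a searchable local index. This is not Microsoft's implementation, just an illustration of the idea and of why it is risky, since whatever is on screen, passwords included, ends up stored as searchable plain text. It assumes Pillow and pytesseract (plus the Tesseract binary) are installed; ImageGrab works on Windows and macOS.

```python
# Toy sketch of the concept behind a Recall-style tool, not Microsoft's
# actual implementation: periodically screenshot the desktop, OCR the text,
# and keep a searchable local index.
# The privacy problem is visible in the code itself: whatever is on screen,
# passwords and all, ends up stored as plain text.

import time
from datetime import datetime

import pytesseract
from PIL import ImageGrab

index = []  # list of (timestamp, extracted_text) pairs


def capture_once():
    shot = ImageGrab.grab()                   # full-screen screenshot
    text = pytesseract.image_to_string(shot)  # OCR everything visible
    index.append((datetime.now(), text))


def search(query):
    q = query.lower()
    return [(ts, txt) for ts, txt in index if q in txt.lower()]


if __name__ == "__main__":
    for _ in range(3):       # a few captures for demonstration
        capture_once()
        time.sleep(120)      # "every couple of minutes"
    for ts, _ in search("green shoes"):
        print("seen on screen at", ts)
```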

58:57

Anyways, anyways, all this hatred aside,

59:00

real quick,

59:01

just to round off that saga for anyone

59:02

who wasn't there,

59:03

there was so much pushback that Microsoft

59:05

did actually delay it a whole year to

59:06

try and fix it.

59:07

And by fix,

59:09

I'm going to put that in heavy quotations

59:10

because it...

59:12

They made it less obviously terrible,

59:14

but it was still pretty bad.

59:15

But anyways, according to...

59:18

So this article is pretty light on

59:19

details.

59:20

I do want to acknowledge that upfront.

59:22

But this comes from Windows Central and

59:24

they have unnamed sources that work at

59:26

Microsoft who basically said Microsoft is

59:29

reconsidering a lot of things.

59:31

Unfortunately,

59:32

I don't think most of it's going away

59:33

from the way this article's worded.

59:35

They said they have paused work on any

59:37

additional Copilot buttons for inbox apps

59:39

for now.

59:41

I do believe there are a few things

59:43

that they're still going to go ahead with,

59:44

like semantic search, agentic workspace,

59:47

Windows machine learning,

59:48

and Windows AI APIs.

59:50

Microsoft believes that these

59:51

under-the-hood AI efforts are still

59:53

important for app developers and users,

59:55

even though nobody wants an agentic AI

59:58

workstation, but whatever.

1:00:02

There is, I can't find it now,

1:00:03

but I think they did say there are

1:00:05

a few things that they're just dropping

1:00:06

entirely.

1:00:07

Oh, sorry.

1:00:08

Okay, so they didn't, here it is.

1:00:09

They didn't commit to it,

1:00:10

but Notepad and Paint,

1:00:11

the ones I mentioned earlier,

1:00:12

they said that those are under review.

1:00:13

So those might get pulled entirely.

1:00:16

What's also very likely is, they said that

1:00:19

the company is basically trying to

1:00:22

rethink things. Uh, they used a

1:00:25

phrasing about Copilot that I thought was...

1:00:27

or, not Copilot, Recall. Uh: "Recall in its

1:00:29

current implementation has failed, though I

1:00:31

understand the company is exploring ways

1:00:33

to evolve the concept rather than scrap it

1:00:35

entirely, possibly dropping the Recall name

1:00:38

in the process, though this is unconfirmed."

1:00:40

Uh, personal opinion, they can evolve it

1:00:42

right into the recycle bin where it

1:00:43

belongs. But anyways, yeah...

1:00:47

So it's,

1:00:47

it's really unclear what direction this is

1:00:49

going to take,

1:00:50

but it is good that they're finally

1:00:51

listening.

1:00:52

And, uh, this is coming,

1:00:53

the article notes,

1:00:54

this is coming on the heels of Microsoft

1:00:56

admitting that Windows is horribly broken

1:00:59

right now.

1:00:59

Uh,

1:01:00

it's been in my headlines in my newsfeed

1:01:02

a lot lately and usually not for good

1:01:03

reasons.

1:01:04

And they did commit to trying to quote

1:01:06

unquote, fix Windows this year.

1:01:08

Although again,

1:01:08

they didn't really say what that entails.

1:01:10

This is probably part of that.

1:01:12

Um,

1:01:13

I will say that in my opinion,

1:01:15

I think their AI implementation is so bad

1:01:17

that really anything is fixing it at this

1:01:19

point.

1:01:21

Anything they can do to make it less

1:01:22

terrible,

1:01:23

whether that's making it less obnoxious,

1:01:27

whether that's getting rid of it

1:01:28

completely,

1:01:28

whether that's hopefully making recall

1:01:30

less of a privacy nightmare.

1:01:32

I think the very least they should do

1:01:33

is make it optional.

1:01:35

They're probably not going to do that

1:01:36

because this is Microsoft.

1:01:38

I guess I'll just say that.

1:01:39

I don't really have high hopes that this

1:01:40

is going to be revolutionary,

1:01:42

but I am hoping that they can make

1:01:46

it less terrible.

1:01:47

I really think...

1:01:49

I don't want to say that's the only

1:01:50

direction they can go in because this is

1:01:51

Microsoft.

1:01:52

They can definitely find ways to surprise

1:01:53

us,

1:01:53

but I definitely don't have hopes for them

1:01:58

to actually make this good in any way.

1:02:03

I just hope they'll find a way to

1:02:04

make it

1:02:06

Not so bad.

1:02:08

I think those are kind of my thoughts

1:02:10

on this one.

1:02:10

Do you have anything you want to add

1:02:12

to this, Jordan?

1:02:13

I think you're lucky enough not to be

1:02:14

subjected to the nightmare that is

1:02:16

Windows,

1:02:17

but I don't know if you still have

1:02:18

any thoughts.

1:02:19

Yeah,

1:02:20

I've kind of avoided using Windows almost

1:02:23

completely.

1:02:24

I mean,

1:02:24

Mac OS isn't really that much better,

1:02:26

but Linux is definitely a good place to

1:02:29

be right now.

1:02:31

So I think that's, yeah,

1:02:32

I agree with everything you were saying.

1:02:34

You know,

1:02:34

it is kind of frustrating that like

1:02:36

Microsoft is working on stuff that

1:02:38

seemingly not that many people are

1:02:40

actually interested in.

1:02:41

Like, you know,

1:02:42

adding all these AI recall features and

1:02:44

co-pilot things.

1:02:47

I think, you know...

1:02:49

Microsoft is kind of realizing, yeah,

1:02:50

we're going to have to actually add

1:02:52

features to the operating system that our

1:02:54

users actually want.

1:02:55

You know,

1:02:55

like we don't just have to add silly

1:02:58

like AI integrations.

1:03:01

I really hope that, you know,

1:03:03

this is a sign that the AI bubble

1:03:05

is finally going to, you know,

1:03:07

explode, because, you know, these companies

1:03:09

aren't getting the returns on their

1:03:11

investments they were looking for, right?

1:03:13

They're probably dumping a huge amount of

1:03:14

money into developing these features

1:03:17

and kind of going all in on AI.

1:03:19

Because I know, like, I've seen the CEO

1:03:20

of Microsoft, he was saying, you know,

1:03:22

they're going all in on agentic operating

1:03:25

systems and utilizing all these new, you

1:03:28

know, developing all these new AI

1:03:30

integrations. And I think

1:03:32

it's finally good to see them realize

1:03:35

that, you know,

1:03:35

the majority of people are not really

1:03:37

interested in this and that they're going

1:03:40

to start scaling it back because yeah,

1:03:43

it's,

1:03:45

I guess we should also talk about this

1:03:47

from a privacy perspective.

1:03:49

You know, I think it goes without saying,

1:03:50

you know,

1:03:50

an agentic operating system, an operating

1:03:53

system that is basically sharing a lot of

1:03:57

information about your system with,

1:03:59

you know,

1:04:00

a third party because in a lot of

1:04:02

cases, this processing isn't done locally.

1:04:06

You know,

1:04:06

a lot of the processing that is done

1:04:08

for these agentic systems is actually

1:04:11

being sent to a third party service.

1:04:13

And this is kind of terrible for a

1:04:14

lot of reasons because, you know,

1:04:17

information that you thought was private

1:04:19

and was local on your computer is then

1:04:23

basically being broadcast onto the

1:04:25

internet.

1:04:26

Like I know a really good, uh,

1:04:29

a really good thing to think about this,

1:04:30

right,

1:04:31

is let's say you're talking to someone on

1:04:33

Signal on a Windows computer,

1:04:36

and they've got some agentic operating

1:04:39

system features running at the same time,

1:04:41

that could be recording you,

1:04:43

that could be monitoring your call,

1:04:46

and it could be saving that or sending

1:04:48

it to a third party server,

1:04:50

which is breaking the privacy of Signal.

1:04:54

You're just adding a listener on the other

1:04:55

end.

1:04:56

You're just sending this information to a

1:04:58

third party server.

1:05:00

So I think at least some people that

1:05:04

care about their privacy have been saying

1:05:06

this,

1:05:07

but I think it's good that in general

1:05:09

people aren't that interested in this

1:05:11

because if this was to become more popular

1:05:15

and available on operating systems,

1:05:16

it would kind of be a nightmare.

1:05:18

It would kind of be a privacy nightmare

1:05:19

because then you don't really know whether what

1:05:23

you're sending to another person is

1:05:25

actually private or if it's being sent to

1:05:27

a third party server.

1:05:30

I'm just happy that Microsoft is realizing

1:05:33

that this is a bad idea.

1:05:35

Maybe it's not for the same reasons that

1:05:36

we care about, which is, you know,

1:05:38

it being a privacy nightmare and

1:05:41

destroying any sense of privacy that

1:05:43

people have.

1:05:45

But it's good either way.

1:05:47

I think we can take this win.

1:05:51

Yeah, for sure.

1:05:52

And that's...

1:05:54

That's something that's in the back of my

1:05:56

head, too, when I talk about, you know,

1:05:57

I said, like,

1:05:58

hopefully this will be optional and it'll

1:05:59

be, like, less terrible.

1:06:02

But even optionally, Microsoft is,

1:06:04

or Windows, I mean,

1:06:06

is such a notoriously leaky operating

1:06:08

system from a privacy perspective.

1:06:11

And I don't...

1:06:15

I think I posted it in a group

1:06:16

chat with my friends.

1:06:16

I don't think I posted it in the

1:06:17

Privacy Guides chat.

1:06:19

But a few weeks ago,

1:06:21

I may have mentioned it on here, actually.

1:06:22

A few weeks ago,

1:06:23

I realized that for some reason,

1:06:25

my Windows computer hadn't updated since

1:06:28

version, like...

1:06:31

It was whatever version just stopped

1:06:33

getting support in November.

1:06:34

And so I was like, okay.

1:06:36

I need to sit down.

1:06:37

I need to figure this out because I

1:06:38

want to make sure I'm getting those

1:06:39

security updates.

1:06:40

And I had to jump through many,

1:06:42

many hoops.

1:06:43

I had to chase down all kinds of

1:06:44

issues.

1:06:45

Eventually, I did get it to update.

1:06:46

Everything went smooth.

1:06:47

But then because it was an update,

1:06:49

it's introduced all these new features.

1:06:51

And it was basically like a whole new

1:06:52

computer.

1:06:53

So I had to go through and I

1:06:55

had to use...

1:06:57

And for the record,

1:06:58

not all of these are equal.

1:06:59

So this is not like a broad endorsement,

1:07:03

but I have very specific third-party

1:07:04

scripts and tools that I trust to like

1:07:06

de-bloat some of this Microsoft stuff.

1:07:08

And I swear to God,

1:07:09

it was like an hour long process to

1:07:11

go through.

1:07:11

And I only have like three of them

1:07:12

that I use,

1:07:13

but to go through all three of them

1:07:15

and run them to get rid of the

1:07:17

telemetry, get rid of the AI,

1:07:19

get rid of this, get rid of that.

1:07:21

And it was such, and that was,

1:07:23

I'm not even talking about the whole

1:07:24

updating process.

1:07:25

I'm talking about just that part of going

1:07:27

through,

1:07:28

making sure all the settings that I want

1:07:29

turned off are off,

1:07:30

making sure that these scripts run,

1:07:32

checking the scripts and making sure I

1:07:34

know what they do and I'm okay with

1:07:35

it.

1:07:35

And just, it was so obnoxious.

1:07:37

And it's like,

1:07:38

it shouldn't be this ridiculous to use a

1:07:40

computer

1:07:41

without giving up all my privacy,

1:07:43

plus the kitchen sink,

1:07:44

plus the neighbor's kitchen sink,

1:07:46

like it's insane.
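
The de-bloat scripts mentioned here aren't named, but a lot of what that kind of tool does comes down to flipping documented Windows switches. Below is a minimal sketch of one such step, assuming Windows and an elevated prompt: it sets the documented diagnostic-data policy to its lowest level and disables the DiagTrack telemetry service. It is only an illustrative slice, not a substitute for a vetted script, and how fully Windows honors the lowest setting varies by edition.

```python
# Minimal sketch of one step a Windows de-bloat script typically performs:
# set the documented "Allow Diagnostic Data" policy to 0 and disable the
# Connected User Experiences and Telemetry (DiagTrack) service.
# Assumes Windows and an elevated (administrator) prompt. Illustrative only.

import subprocess
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"


def set_telemetry_policy(level: int = 0) -> None:
    # Create the policy key if needed and set AllowTelemetry (REG_DWORD).
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, level)


def disable_diagtrack() -> None:
    # Stop the telemetry service and prevent it from starting again.
    subprocess.run(["sc", "stop", "DiagTrack"], check=False)
    subprocess.run(["sc", "config", "DiagTrack", "start=", "disabled"],
                   check=False)


if __name__ == "__main__":
    set_telemetry_policy(0)
    disable_diagtrack()
    print("Diagnostic data policy set to 0; DiagTrack disabled.")
```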

1:07:47

And so going back to what you were

1:07:48

saying about AI is that's my concern is

1:07:50

that even if they roll this out in

1:07:52

a form where it's optional,

1:07:53

does that only mean optional to the,

1:07:56

like on the end user facing, like what's,

1:07:59

what's the word I'm looking for?

1:08:00

Like,

1:08:01

where it seems optional.

1:08:02

But in the background,

1:08:03

it's still submitting data.

1:08:04

It's still collecting data.

1:08:06

And I'm sure it's less than if it

1:08:07

was running in agentic mode or if Recall

1:08:09

was taking those screenshots every three

1:08:11

minutes or whatever it is.

1:08:13

But it still just worries me that it's

1:08:14

like, yeah,

1:08:14

but this thing could still be running in

1:08:16

the background,

1:08:16

potentially introducing vulnerabilities,

1:08:18

potentially sending more data than it

1:08:20

should be back to Microsoft.

1:08:24

That's what worries me.

1:08:25

And it's such a shame

1:08:27

as somebody who grew up on Windows,

1:08:29

like I'm not a Windows fanboy by

1:08:30

any measure, but I mean, let's be real.

1:08:32

Like Macs are really expensive.

1:08:35

And I mean,

1:08:37

I guess now they're about the same because

1:08:38

Mac is the only one that can afford

1:08:39

to eat the price hike on RAM,

1:08:41

but you know,

1:08:41

historically Macs are really expensive and

1:08:43

they work great and they've got great

1:08:44

security,

1:08:44

but you are absolutely unarguably paying

1:08:46

for a brand name.

1:08:47

That is a fact.

1:08:49

And then Linux is free, but

1:08:53

historically doesn't always do the things

1:08:54

I need it to do.

1:08:55

I really do want to test some of

1:08:56

my production stuff and see how well it

1:08:57

works.

1:08:57

But I guess my point being is like,

1:09:00

Windows has always been such a relatively

1:09:02

affordable, customizable system.

1:09:05

And it's really frustrating to just see it

1:09:07

become worse and worse and worse in every

1:09:08

sense of the word,

1:09:09

from the UI to the privacy,

1:09:11

to the bloat, to the just,

1:09:13

it makes me sad.

1:09:14

Like I said,

1:09:15

I was never a Microsoft fanboy,

1:09:16

but it's just, it used to be better.

1:09:19

We used to be a society.

1:09:23

I digress.

1:09:25

I've ranted about that plenty.

1:09:29

If that's all we've got for now,

1:09:30

I think there are a couple of quick

1:09:32

stories that we wanted to highlight.

1:09:35

We're not going to talk about these

1:09:37

extensively because admittedly we had this

1:09:40

conversation earlier in the week.

1:09:42

Jonah and I have been covering all the

1:09:44

age verification stuff for literal weeks

1:09:48

and we have nothing new to add.

1:09:52

It's bad.

1:09:53

We're not in favor of it.

1:09:54

We think that there's

1:09:56

better ways to protect children online.

1:09:58

But we did want to let some of

1:10:00

our listeners know,

1:10:01

specifically in Spain and Greece,

1:10:03

if you are Spanish or I guess it's

1:10:06

Greek, isn't it?

1:10:07

I almost said Grecian for some reason.

1:10:08

But if you are Spanish or Greek,

1:10:10

these are

1:10:12

the latest countries who are now planning

1:10:14

to ban social media for children under

1:10:17

fifteen.

1:10:18

So yeah, like I said,

1:10:21

I don't think we really have anything to

1:10:22

add to that,

1:10:22

but if you do live in any of

1:10:24

those countries,

1:10:25

you should be aware of that.

1:10:27

Do you have anything you want to add

1:10:28

to that, Jordan?

1:10:32

Not particularly.

1:10:33

I mean,

1:10:34

I think both you and Jonah have talked

1:10:37

about this pretty extensively,

1:10:38

so I don't think we need to drag

1:10:39

this out,

1:10:40

but

1:10:41

It's frustrating that the start of this,

1:10:44

you know,

1:10:45

was with Australia's social media ban and

1:10:47

now other countries are following.

1:10:49

I think this is basically what we were

1:10:50

saying from the start, you know,

1:10:52

as soon as you normalize this in one

1:10:53

country,

1:10:54

every other country is going to start

1:10:56

following suit and

1:11:00

yeah,

1:11:00

it's kind of unfortunate that that is

1:11:02

exactly what is happening right now.

1:11:04

We warned you,

1:11:05

like we warned them that this is going

1:11:07

to happen and no one was taking it

1:11:10

seriously, but here we are.

1:11:13

So it is kind of frustrating.

1:11:17

Yeah.

1:11:17

It is kind of frustrating that, you know,

1:11:20

we've, we've been here, we've been, uh,

1:11:23

saying it from the start, but, uh,

1:11:26

now it's actually coming true.

1:11:27

So yeah, I think, you know,

1:11:29

do what you can in these countries,

1:11:31

make your contact, your representatives,

1:11:34

you know,

1:11:35

try and educate people in your life about

1:11:37

why this is bad.

1:11:39

Hopefully there can be some

1:11:42

positives from this. Like, you know, it

1:11:44

sounds like these are both announcing a

1:11:46

ban; they're not actually implemented yet,

1:11:48

so there's still a chance for you to

1:11:49

have your voice heard. So definitely try

1:11:51

and at least make some noise about it,

1:11:53

because in a lot of cases, uh, if

1:11:57

a lot of people are against this, then,

1:11:59

you know, it will end up being

1:12:04

uh, a lot harder for them to pass

1:12:05

this with, you know, as many restrictions.

1:12:11

But I guess the thing to remember

1:12:13

about this is, these are both, you know...

1:12:18

In Spain, um,

1:12:20

I feel like they might have much stronger

1:12:22

data protection laws than that of like

1:12:25

Australia or the UK.

1:12:27

So it's interesting how they're going to

1:12:28

actually be able to implement this

1:12:30

without, you know,

1:12:31

because they're going to have to require

1:12:32

people to provide their ID or do some

1:12:34

sort of facial scanning.

1:12:35

It'll be interesting to see how they're

1:12:37

going to navigate that,

1:12:38

like regarding the data protection laws.

1:12:42

So that'll be something interesting to

1:12:44

watch, I guess,

1:12:45

but definitely try and make your voice

1:12:47

heard about this issue.

1:12:51

Yeah,

1:12:51

I just want to add, the slippery slope...

1:12:54

I kind of hate using the slippery

1:12:57

slope argument because it is not always

1:13:00

applied in good faith and it doesn't

1:13:02

always turn out to be true.

1:13:04

But in the case of tech,

1:13:06

I feel like it is true more often

1:13:08

than it's not.

1:13:10

Actually,

1:13:10

a really good example is facial

1:13:11

recognition.

1:13:12

There have been multiple articles and

1:13:14

stories written about how Facebook

1:13:17

invented their little Ray-Ban facial

1:13:20

recognition glasses years ago.

1:13:23

And even at Facebook, they were just like,

1:13:24

no, this is too much.

1:13:26

This is a line too far.

1:13:27

We're not going to do this.

1:13:28

This is creepy.

1:13:30

Until Clearview AI came along.

1:13:32

And once Clearview AI came along and made

1:13:34

facial recognition totally cool,

1:13:36

now they couldn't wait to jump on the

1:13:38

bandwagon.

1:13:38

So yeah,

1:13:40

I feel like with tech in particular,

1:13:42

the slippery slope is true more often than

1:13:44

it's not, which is so, so frustrating.

1:13:47

Absolutely.

1:13:54

With that, in a moment,

1:13:57

we're actually going to start taking

1:13:59

viewer questions.

1:14:01

So if you have been holding on to

1:14:03

any questions about the stories that we've

1:14:05

talked about,

1:14:06

go ahead and start leaving them on our

1:14:07

forum thread or in the comment section of

1:14:10

the livestream.

1:14:11

But first,

1:14:12

we're going to check in on our community

1:14:13

forum.

1:14:14

There's always a lot of activity,

1:14:16

and this week has been no exception.

1:14:18

But we're going to highlight just a couple

1:14:20

of interesting stories that we wanted to

1:14:23

discuss here.

1:14:24

One of them is DuckDuckGo did a poll

1:14:27

that shows that people are against AI.

1:14:31

I feel like I read this story when

1:14:33

it was first published.

1:14:34

But yeah, I mean,

1:14:39

the author of this post sums it up

1:14:40

pretty well here.

1:14:40

DuckDuckGo made a public poll to see what

1:14:42

people think about AI,

1:14:43

and ninety percent voted against.

1:14:46

And I believe this thread has largely just

1:14:48

been people

1:14:50

discussing their opinions about AI.

1:14:53

And I don't know,

1:14:55

we just got done talking about AI and

1:14:58

the RAM shortage and how that's affecting

1:15:00

everything.

1:15:03

Do you have any thoughts about AI, Jordan,

1:15:05

or do you want me to go first?

1:15:09

I mean, I think we can kind of,

1:15:11

you know,

1:15:12

I think we talked about a little bit

1:15:13

before,

1:15:13

but I think for a lot of people,

1:15:16

these...

1:15:18

you know,

1:15:18

AI companies are kind of having a pretty

1:15:20

negative impact on people,

1:15:22

like just at like a personal level, right?

1:15:25

Like they're building massive data

1:15:28

centers.

1:15:28

They're using a bunch of electricity,

1:15:30

which is driving up electricity prices.

1:15:33

It's driving up RAM prices.

1:15:34

Like this is nothing that's good for the

1:15:36

average person.

1:15:37

And all for, you know,

1:15:39

my cool little chat bot I can talk

1:15:41

to whenever I want.

1:15:42

Like, is that really,

1:15:44

is the benefit really worth the cost?

1:15:47

And I think,

1:15:48

for a lot of times people are saying,

1:15:50

you know, maybe not, you know,

1:15:53

this is not really that useful.

1:15:56

So I don't know,

1:15:57

it could just be a sort of

1:15:59

There could be a bias in the sample

1:16:02

here.

1:16:02

Like, for instance,

1:16:03

this could have been posted.

1:16:05

I did see it being posted on Mastodon,

1:16:07

which means, you know,

1:16:09

it's kind of people on Mastodon don't like

1:16:11

AI.

1:16:12

So that definitely could have skewed the

1:16:14

results a little bit.

1:16:15

But I think it's still an interesting

1:16:19

idea, right,

1:16:20

to see and to have a poll go

1:16:22

out like that.

1:16:23

It would have been interesting to see if,

1:16:25

you know,

1:16:26

where a lot of these votes were coming

1:16:27

from and how this was published,

1:16:31

because I think that could have had a

1:16:32

pretty big impact on the results of this

1:16:37

poll.

1:16:38

But I think, you know,

1:16:39

ninety percent is kind of conclusive,

1:16:42

I guess.

1:16:43

So, yeah,

1:16:45

I'm not really not really that surprised

1:16:47

by ninety percent being against it.

1:16:49

What about you?

1:16:51

Yeah, no,

1:16:52

I think that's a really good point,

1:16:53

the selection bias.

1:16:55

I will say,

1:16:56

ninety percent surprises me a little bit

1:16:58

because I know DuckDuckGo is one of those

1:17:00

companies that has integrated AI

1:17:04

a little bit.

1:17:05

I think they even have their own, like,

1:17:07

AI

1:17:07

proxy and...

1:17:11

I think it's.

1:17:14

Yeah, I mean,

1:17:15

I think that's a really interesting point

1:17:16

for sure, but.

1:17:20

I mean, my personal opinions,

1:17:22

I definitely like...

1:17:24

We mentioned this on a previous episode.

1:17:25

I like the point that Em made a

1:17:29

few weeks ago about how AI in its current

1:17:32

form cannot be private because it scrapes

1:17:35

up so much user data from people who

1:17:38

probably didn't consent.

1:17:40

And I think... This is one of the...

1:17:50

This is one of the... Sorry,

1:17:54

I'm trying to put my thoughts in order.

1:17:55

This is a thing that I've said before

1:17:57

is...

1:17:58

I think there's certain things about AI

1:18:00

that are technical problems in the sense

1:18:02

that they can be solved.

1:18:04

Can be.

1:18:05

Will they?

1:18:05

I don't know, but they can be.

1:18:07

Things like the energy usage.

1:18:09

I think that in time,

1:18:11

I think AI will become more energy

1:18:13

efficient simply because the financial

1:18:14

incentive is there to make it cost less

1:18:17

money or maybe find more sustainable ways

1:18:21

to power that energy.

1:18:23

Um,

1:18:23

I wish they would open up more solar

1:18:25

farms instead of nuclear plants,

1:18:27

but I don't know.

1:18:28

I digress.

1:18:28

Um, but then I think there's,

1:18:31

there's the much harder problems that I'm

1:18:33

not sure if we'll ever be able to

1:18:34

tackle,

1:18:34

which are things like compensating the

1:18:36

people for the data collected,

1:18:38

the training sets, things of that nature.

1:18:40

And I think those are, yeah,

1:18:43

I think those are the things that are,

1:18:45

like I said,

1:18:45

going to be harder to solve if solvable

1:18:47

at all.

1:18:48

And, um, yeah, I think, um,

1:18:52

Not to be pedantic,

1:18:52

but I know there's also a really good

1:18:54

discussion to be made about the difference

1:18:55

between,

1:18:56

because AI is just such a blanket term

1:18:58

that we're applying to everything

1:18:59

nowadays.

1:18:59

And I think there's useful types of AI,

1:19:02

like machine learning that's being used

1:19:03

for medical research and stuff like that,

1:19:05

versus the generative AI that's drying up

1:19:12

an entire lake just to make everybody on

1:19:14

Facebook look like Studio Ghibli.

1:19:16

And yeah,

1:19:18

I think it's really unfortunate that,

1:19:23

I don't know.

1:19:23

It's just really unfortunate.

1:19:25

And also, as a creative type myself,

1:19:28

I'm really annoyed that AI is taking away

1:19:31

all the fun jobs,

1:19:32

like making music and writing stories and

1:19:35

making videos,

1:19:36

and instead of taking away the crap jobs

1:19:38

that nobody wants to do.

1:19:41

Yeah, it's...

1:19:43

It's really unfortunate.

1:19:44

I think it's one of those things that

1:19:45

could be good, but probably,

1:19:48

it definitely is not right now.

1:19:50

And I don't know if we'll ever address

1:19:52

those difficult questions for sure,

1:19:54

but that's really unfortunate.

1:19:56

Yeah,

1:19:56

I think one thing to think about when

1:19:58

it comes to this AI stuff is

1:20:00

I'm not really sure if it could be

1:20:01

done in an ethical way, right?

1:20:03

Because basically the whole, like,

1:20:05

I don't know if you saw,

1:20:06

there was like an article that I saw

1:20:08

about Sam Altman and he was saying if

1:20:12

there was a stop to the wholesale scraping

1:20:15

of the internet for AI training,

1:20:16

then these AI companies literally wouldn't

1:20:19

be able to exist because the data that's

1:20:21

required to train these models is

1:20:24

basically done by scraping the entire

1:20:26

network

1:20:28

entire internet, right? Um, so, you know, I

1:20:32

don't think that, like... I think people think

1:20:35

about, like, their personal privacy. Like, I'm

1:20:37

sure I could use, like, an AI model

1:20:39

locally on my computer that wouldn't be

1:20:41

sending information to a third-party

1:20:42

company, but the model itself was trained

1:20:45

off non-consensual, like,

1:20:49

scraping of people's information and,

1:20:51

and data.

1:20:52

And when you train that model,

1:20:54

you're basically, you know,

1:20:55

encapsulating an entire, you know,

1:20:57

section of the web into a model.

1:21:00

Right.

1:21:01

And that's kind of the antithesis of

1:21:03

privacy, right?

1:21:04

Because if you, let's say you,

1:21:06

you deleted an article that you wrote

1:21:08

about something that could have been

1:21:10

scraped and put inside this model and,

1:21:13

you know, you're basically, uh,

1:21:17

storing this information forever.

1:21:19

And, you know,

1:21:20

it's also storing and scraping a lot of

1:21:22

personal information as well.

1:21:24

So, you know, it's,

1:21:27

it's kind of problematic.

1:21:28

I think there's not really any good way

1:21:33

to do this.

1:21:34

Like I'm sure maybe this,

1:21:35

there's a possibility that someone's made

1:21:37

an AI model based on only

1:21:40

publicly available and consensual data,

1:21:42

but, um,

1:21:44

I'm sure it's not very good and as

1:21:45

useful as the ones that have scraped the

1:21:47

entire internet.

1:21:48

Right.

1:21:48

Um, so, you know,

1:21:50

I think I'm not even certain that,

1:21:53

you know,

1:21:53

if we were able to use completely

1:21:56

consensual data and also have, you know,

1:21:59

use renewable energy,

1:22:01

I think it's just kind of a waste

1:22:03

of electricity as well.

1:22:04

Like electricity is not infinite.

1:22:06

Like it has an impact on the grid,

1:22:09

um,

1:22:10

just delivering electricity to people is

1:22:12

producing carbon.

1:22:14

So I don't know.

1:22:15

I don't think that it's,

1:22:19

I guess maybe our opinions differ slightly

1:22:22

on this, but I think, yeah,

1:22:24

there's definitely, in my opinion,

1:22:26

not really any ethical way to do it

1:22:29

that respects everybody in the process.

1:22:35

That's just my opinion.

1:22:37

No,

1:22:37

and I kind of agree with you because

1:22:39

that was something that I said when I

1:22:41

first mentioned this is like maybe,

1:22:44

and that's why I say those like less

1:22:46

technical problems I think are harder to

1:22:47

solve because like you were saying,

1:22:49

maybe we could,

1:22:50

like I know Creative Commons is working on

1:22:53

a license that basically says, yes,

1:22:55

I'm okay with AI training on this data.

1:22:58

But what if so few people opt into

1:22:59

that,

1:23:00

that

1:23:01

it can't create a good AI model.

1:23:03

And so there is no ethical way to,

1:23:06

to do that.

1:23:07

And, you know, you mentioned the,

1:23:10

the idea of like,

1:23:13

or maybe I just heard you,

1:23:15

you say this, but like,

1:23:17

we can't even like really remove training

1:23:19

data.

1:23:19

You know, we can't,

1:23:22

like we could remove it from the next

1:23:23

iteration when they run the AI and update

1:23:25

it,

1:23:26

but we can't really reliably say that

1:23:28

like, oh,

1:23:29

I want this data removed from the model.

1:23:31

And because of that,

1:23:33

it kind of doesn't respect like the right

1:23:34

to be forgotten,

1:23:35

which to me is personally is a really

1:23:37

big deal to me because I think one

1:23:40

of the most harmful things about the

1:23:41

permanent digital record that we have

1:23:43

nowadays is that

1:23:44

people have almost lost their ability to

1:23:46

grow.

1:23:51

All the older listeners will be with me

1:23:53

on this one.

1:23:53

I grew up pre-internet,

1:23:56

not super pre-internet.

1:23:57

I think we got internet when I was

1:23:59

in my teens,

1:23:59

but it definitely wasn't like it is now.

1:24:03

Social media was not a thing until I

1:24:04

was in high school.

1:24:06

I think Facebook came out when I was

1:24:07

in college.

1:24:08

So I grew up in a world where

1:24:12

You say dumb things.

1:24:13

You do dumb things.

1:24:16

You get in trouble for that,

1:24:17

but then you grow,

1:24:18

and you learn not to do those dumb

1:24:19

things.

1:24:20

And I don't want to sound too political,

1:24:23

but I feel like we're in a world

1:24:24

now where you say something dumb,

1:24:26

and no matter how long ago it was,

1:24:30

it lives there.

1:24:31

And so somebody will go like,

1:24:33

and I guess I'm kind of sort of

1:24:34

talking about cancel culture,

1:24:35

but I don't mean it in that context.

1:24:37

It's, you know,

1:24:38

you say something and somebody will go dig

1:24:39

up a tweet from twenty, you know,

1:24:40

ten years ago.

1:24:42

And it's like, oh,

1:24:42

here's something really bad you said.

1:24:44

And it's like, OK,

1:24:45

but I don't believe that anymore.

1:24:46

I've grown.

1:24:47

I've changed.

1:24:48

Or, you know, it was a.

1:24:49

stupid edgy joke that didn't land well or

1:24:51

whatever it was.

1:24:53

And it just,

1:24:53

it doesn't give us that freedom to grow

1:24:55

and like move on.

1:24:55

And I feel like that's a concern with

1:24:57

AI,

1:24:58

even though it may not necessarily like

1:25:00

directly trace back to you.

1:25:01

It's just,

1:25:01

I don't like that idea of something that

1:25:03

you don't want out there anymore because

1:25:05

you don't believe that anymore.

1:25:06

And you've moved on.

1:25:07

That's still stuck in the training data.

1:25:08

So I don't know.

1:25:10

That's why I said like, maybe,

1:25:12

maybe those,

1:25:13

those harder issues don't have a solution.

1:25:15

Maybe they don't, I don't know, but yeah.

1:25:18

Yeah,

1:25:19

I think those are the harder issues to

1:25:21

tackle if they can be tackled at all.

1:25:23

All right.

1:25:29

There was another thread that we had

1:25:32

written down to take a look at.

1:25:35

This one, the title says,

1:25:37

Android recommendations should reflect

1:25:38

real life,

1:25:39

not just worst-case threat models.

1:25:41

And this author... This author...

1:25:49

basically how would I summarize this?

1:25:51

Cause this is a, um, very,

1:25:53

not an overly long post,

1:25:54

but it's a very detailed post.

1:25:56

And, uh,

1:25:58

Basically, they're disappointed that,

1:26:00

for example, we only recommend graphene.

1:26:02

We don't really recommend other custom

1:26:05

ROMs.

1:26:05

We don't recommend iPhones or anything.

1:26:08

And he mentions some of the stuff that

1:26:11

we talked about earlier,

1:26:12

like some people don't want to buy a

1:26:14

Pixel because it's from Google.

1:26:15

And they don't want to buy a Google

1:26:17

device,

1:26:17

even if they could buy it secondhand.

1:26:19

Or Pixels aren't sold in their country.

1:26:21

Pixels are out of their price range.

1:26:22

And graphene doesn't offer parental

1:26:24

control options.

1:26:27

I definitely have a lot of thoughts on

1:26:28

this one.

1:26:30

Do you need a minute to formulate some

1:26:32

thoughts or do you want to go first?

1:26:35

No, definitely take it from here.

1:26:39

Okay.

1:26:39

So, yeah, I think,

1:26:45

and I'm going to try to paraphrase

1:26:48

something I saw Jonah say,

1:26:49

and I hope I paraphrase this right.

1:26:50

So I apologize, Jonah,

1:26:51

if I get this wrong.

1:26:54

I think with websites like Privacy Guides,

1:26:57

what we try to do is we're trying

1:26:59

to give people the ideal answer.

1:27:03

And I think,

1:27:06

I will say this is my personal opinion,

1:27:07

but hopefully I'm getting it kind of close

1:27:09

here.

1:27:10

I think there's always going to be

1:27:11

exceptions.

1:27:13

So when we say, you know,

1:27:15

like this person pointed out,

1:27:16

when we say like, yeah, the best,

1:27:18

and I went on a little bit of

1:27:19

a rant about this this morning actually on

1:27:21

the forum,

1:27:23

two things can be real,

1:27:24

even when they're contradictory.

1:27:27

Graphene can be the best option for

1:27:29

privacy and security.

1:27:31

But I think there's also situations where

1:27:33

it's perfectly valid that you can't or

1:27:35

won't do that.

1:27:35

Because again,

1:27:37

not sold in your country out of your

1:27:38

price range.

1:27:39

And I think in those situations,

1:27:41

it's really important to

1:27:46

or maybe not in those situations.

1:27:47

I think,

1:27:48

I think it's just important to realize

1:27:49

that, uh, you know, we, we can't possibly,

1:27:51

especially when we make something that is

1:27:54

being mass broadcast,

1:27:55

like a website or like a, uh,

1:27:57

you know, a podcast like this one,

1:28:00

there's always going to be exceptions.

1:28:01

There's always going to be people who have

1:28:03

perfectly legitimate reasons that they

1:28:04

can't do something.

1:28:05

And I think the reason we suggest these

1:28:07

perfect tools is kind of like the idea

1:28:09

of like,

1:28:09

try to get as close to this as

1:28:10

you can.

1:28:12

Um,

1:28:14

But yeah, I mean,

1:28:16

I think obviously I'm going to defend our

1:28:19

choices.

1:28:20

I think there's a reason for the

1:28:22

suggestions that we make.

1:28:23

But yeah,

1:28:26

I guess what I'm trying to get at

1:28:27

is I understand where this person's coming

1:28:30

from,

1:28:30

that there's always going to be exceptions

1:28:32

to the rule and reasons that somebody

1:28:33

can't do something specific like that.

1:28:35

But I don't think that necessarily makes

1:28:38

the advice we give wrong, personally.

1:28:43

But yeah,

1:28:44

we can't possibly cater to every single

1:28:46

exception, every single threat model.

1:28:49

We're trying to give advice that we're

1:28:50

hoping will work for the vast majority of

1:28:52

people in one way or another.

1:28:54

And if nothing else,

1:28:57

I think hopefully it will kind of give

1:28:58

you a direction to aim towards.

1:29:00

And so when you can see,

1:29:01

especially this is something I really like

1:29:02

about the privacy guides website,

1:29:04

it tells you why we recommend these tools.

1:29:07

So when you can see like, oh,

1:29:08

we recommend pixels because they have very

1:29:10

good hardware security.

1:29:12

You know, they have this, they have that.

1:29:14

And if you say, okay, well,

1:29:15

I can't get a pixel,

1:29:16

but what else can I find that checks

1:29:17

most of these boxes or which of these

1:29:20

boxes can I look at that are important

1:29:22

to me?

1:29:22

And I can go find something that checks

1:29:24

those boxes.

1:29:26

So.

1:29:28

Yeah, I think that's where I land.

1:29:30

This was a very popular post.

1:29:31

It's got a hundred and fifty seven

1:29:32

replies.

1:29:33

Holy cow.

1:29:36

Was there anything you wanted to add to

1:29:37

that?

1:29:38

I think this kind of goes back to

1:29:40

some of the discussion that we had about

1:29:42

like, you know,

1:29:42

you were saying like privacy is a

1:29:44

privilege,

1:29:44

like people who have money can afford

1:29:46

these things.

1:29:49

I think, yeah,

1:29:50

it is kind of interesting and important to

1:29:52

at least discuss.

1:29:54

I think that's like a good purpose of

1:29:56

the forum in this case, right?

1:29:58

Like people can discuss these issues.

1:30:00

less perfect tools in a place where,

1:30:03

you know,

1:30:03

people can be critical and talk about it.

1:30:06

But I think, you know, the

1:30:12

The recommendations on the website are

1:30:14

kind of like, you know,

1:30:14

meant to be like the most ideal ones.

1:30:17

I know I've certainly used the forum like

1:30:19

I've been interested in another tool and

1:30:21

I've been like, huh,

1:30:22

I wonder what the people in the privacy

1:30:23

guides forum said about it because,

1:30:25

you know,

1:30:25

they probably looked into it pretty hard.

1:30:28

You know,

1:30:28

people are scrutinizing things a lot.

1:30:32

So I think just because something isn't

1:30:34

listed on the recommendations doesn't mean

1:30:36

that it's something you can't use.

1:30:39

For instance,

1:30:40

I use a bunch of stuff that's not

1:30:41

recommended by the website.

1:30:43

Doesn't mean that it's the wrong option.

1:30:45

It just depends on your specific use case,

1:30:48

right?

1:30:50

So yeah,

1:30:51

I agree with what you were saying there.

1:30:54

But yeah,

1:30:55

I think this is definitely an interesting

1:30:57

forum to have a read.

1:31:00

I'm not sure if I agree entirely with

1:31:02

what this person is saying.

1:31:05

I think stuff like headphone jacks and

1:31:09

physical SIM slots and SD card slots are

1:31:14

somewhat novelties.

1:31:16

I find that they're not super important.

1:31:19

I think the most important thing should be

1:31:24

the security of the device if you can.

1:31:28

So even if there was a cheaper device

1:31:30

that

1:31:32

could run something that is more private

1:31:35

than stock Android.

1:31:37

I think that would be better.

1:31:40

But yeah,

1:31:41

it's kind of unfortunate that like

1:31:43

Fairphone has,

1:31:44

has got devices that they sell,

1:31:47

but they're also pretty expensive and

1:31:49

they're only available in Europe.

1:31:50

So it kind of limits the,

1:31:53

the availability of that.

1:31:54

And they don't seem to take security as

1:31:56

seriously as the Graphene people.

1:31:58

So, yeah, I don't know.

1:32:00

It's kind of an unfortunate circumstance,

1:32:02

I guess.

1:32:05

I mean, I...

1:32:07

I don't use an Android device daily.

1:32:11

I do have one,

1:32:12

but I just prefer an iPhone just because,

1:32:15

you know, it just works best for me.

1:32:16

It's not recommended on the website,

1:32:18

but it's something you can do instead.

1:32:21

And I don't know,

1:32:23

I think people need to make decisions for

1:32:25

their circumstances and you don't have to

1:32:28

a hundred percent follow everything that

1:32:32

we recommend.

1:32:34

It's kind of up to you to make

1:32:35

up your own mind on things.

1:32:39

Yeah, for sure. Yeah, I don't think I

1:32:44

have anything to add to that, so I

1:32:49

guess we'll hop into answering questions

1:32:51

now. Um, unfortunately it looks like we

1:32:53

didn't get any questions on the forum

1:32:56

thread. Uh, I know we kind of...

1:33:00

We've, uh,

1:33:00

some weeks we kind of wait a little

1:33:01

bit longer than,

1:33:02

than others to pick headline stories,

1:33:03

just to see if there's anything

1:33:05

particularly, um,

1:33:06

obvious that jumps out at us as like,

1:33:08

oh,

1:33:08

this should definitely be the headline

1:33:09

story.

1:33:10

Um, and I don't know if, uh,

1:33:12

this week people just didn't have enough

1:33:13

time or, or what, but yeah,

1:33:17

it looks like there's not too much there,

1:33:18

but let's go ahead and address some of

1:33:22

the questions in the chat here.

1:33:26

And let's see.

1:33:28

Excuse me.

1:33:30

I guess I'll take this one.

1:33:32

This person I don't think has a display

1:33:34

name.

1:33:34

That or my computer is not showing it.

1:33:37

But I guess we'll say anonymous.

1:33:41

So this is actually in regard to a

1:33:42

question from last week.

1:33:43

They asked if I had a chance to

1:33:45

review the OPSEC Bible.

1:33:46

So last week,

1:33:47

somebody asked about this website called

1:33:49

the OPSEC Bible.

1:33:52

I took a look at it.

1:33:54

I don't know if Jonah did.

1:33:55

Jonah's really busy.

1:33:57

He may or may not have.

1:34:02

I don't really feel like the website was

1:34:04

for me.

1:34:05

I kind of want to reiterate what I

1:34:06

said when you first asked about this

1:34:08

website,

1:34:09

which is I think it's great that there's

1:34:16

different levels of things that cater to

1:34:18

different people.

1:34:20

Because I think there's some people that

1:34:23

when you go up to them and say,

1:34:26

you have to use Graphene,

1:34:27

you have to use SimpleX,

1:34:28

you have to use Qubes,

1:34:30

they're just going to stop listening.

1:34:32

And that may be unfortunate.

1:34:33

That may not be in their best interest.

1:34:36

That may be a mistake.

1:34:38

But that is what they're going to do,

1:34:39

is they're going to stop listening and

1:34:40

walk away.

1:34:41

And I think that's...

1:34:46

That's kind of where some other projects

1:34:48

come in to try and try to

1:34:51

kind of make it a little bit easier

1:34:52

for people to get started on privacy.

1:34:54

And I think some people, most people,

1:34:56

I hope, but I think some people will,

1:34:58

once they start getting into privacy,

1:34:59

they'll realize like, Oh,

1:35:00

this is actually a lot more achievable

1:35:02

than it sounds.

1:35:03

And I can go, you know,

1:35:04

I talked about threat models earlier.

1:35:05

Like I'm a big fan of going above

1:35:07

and beyond, if you can.

1:35:07

A threat model,

1:35:09

in my opinion, is kind of more of

1:35:11

a minimum,

1:35:11

like do this minimum to keep yourself

1:35:13

safe,

1:35:14

but feel free to go past that if

1:35:16

you want to.

1:35:17

And, um,

1:35:19

I think it's great for people who want

1:35:20

to go past that and want to learn

1:35:22

how to do some of the more advanced

1:35:24

stuff and use some of the more secure

1:35:26

options out there for whatever reason.

1:35:29

And so I don't really have an issue

1:35:31

with sites that push the more hardcore

1:35:35

things, but I, I don't know.

1:35:40

I just,

1:35:40

I don't think it was for me.

1:35:42

So thanks for bringing it to our

1:35:44

attention, I guess.

1:35:45

But yeah, that was,

1:35:47

that was my thought on that one.

1:35:49

I don't know.

1:35:50

I think I may have shared that one

1:35:51

in the chat.

1:35:51

Did you check that one out, Jordan,

1:35:52

or no thoughts?

1:35:55

Yeah.

1:35:56

I don't really have any thoughts on it.

1:35:57

I think you covered it pretty well.

1:36:00

I don't think we need to go super in-depth on

1:36:04

that website, but I think it's definitely

1:36:06

good to use multiple sources, uh, for

1:36:10

finding your information, right? Like, if we

1:36:12

recommend something, don't just take that

1:36:15

as the truth, right? You should also

1:36:17

investigate other people who are

1:36:18

recommending different things, see what

1:36:21

they say as well. Um, so, you know,

1:36:24

maybe this website could be useful as

1:36:26

another resource to check what

1:36:29

they say about certain things.

1:36:30

And then you can come up with your

1:36:32

own opinion, I guess.

1:36:34

Um, but you know,

1:36:36

Privacy Guides has a specific way of doing

1:36:38

things.

1:36:39

We're going to choose specific tools based

1:36:40

on specific criteria.

1:36:42

So it kind of is going to differ

1:36:46

from what other people recommend, and that's

1:36:49

fine.

1:36:49

Everyone has their own criteria for

1:36:51

what they recommend.

1:36:53

So I think it's good.

1:36:54

More diverse, uh, information is better.

1:37:00

For sure.

1:37:02

All right, our next question came from JS,

1:37:05

who asked about DuckDuckGo's AI.

1:37:09

You said,

1:37:10

I hear people complain about DuckDuckAI as

1:37:12

a proxy because it just queries the

1:37:13

respective companies,

1:37:16

which in this context,

1:37:17

I think he means like ChatGPT,

1:37:19

Anthropic, whoever.

1:37:21

And you say,

1:37:22

isn't this the same thing that private

1:37:23

search engines do?

1:37:24

From Privacy Guides, quote,

1:37:25

DuckDuckGo does not log your searches for

1:37:27

product improvement purposes,

1:37:28

or does log your searches,

1:37:31

but not your IP address or any other

1:37:32

PII.

1:37:33

Yes, this is actually...

1:37:39

I don't know.

1:37:39

Personally,

1:37:40

I use Brave Search because I feel the same

1:37:42

way.

1:37:43

DuckDuckGo itself is just a proxy.

1:37:46

I think most of these meta search engines

1:37:48

do technically draw from multiple sources,

1:37:50

but they usually primarily heavily draw

1:37:52

from one source.

1:37:53

So DuckDuckGo mostly draws from Bing.

1:37:56

StartPage is mostly Google.

1:37:58

And...

1:38:00

I think there's a couple others that I'm

1:38:01

forgetting.

1:38:01

But there are a handful,

1:38:03

like I think Kaji or Kagi,

1:38:04

however you pronounce it,

1:38:05

I think they mostly have their own index

1:38:09

these days.

1:38:11

Brave mostly has their own index.

1:38:13

And I think Mojeek is the other one;

1:38:16

they all kind of have their own

1:38:18

index.

1:38:18

So yeah, personally,

1:38:20

I like to use those or I like

1:38:22

to use Brave just because it has its

1:38:24

own index.

1:38:24

I'm already using the Brave browser.

1:38:26

I think they do a pretty good job.

1:38:28

I feel like I'm also just,

1:38:32

I feel like maybe I just,

1:38:34

maybe it's something about me and knowing

1:38:36

how to use search engines the right way,

1:38:37

quote unquote, the right way.

1:38:38

But because I hear some people will get

1:38:40

off of Google and they'll switch to like

1:38:42

DuckDuckGo or StartPage and they'll be

1:38:43

like, oh,

1:38:44

the results just aren't the same. To which,

1:38:46

number one, yeah,

1:38:47

Google is personalizing your results

1:38:49

because they're stalking you.

1:38:50

But number two, I mean, I just,

1:38:51

I personally have never had that issue.

1:38:53

And I don't mean to like invalidate people

1:38:54

who do have that issue.

1:38:55

I'm sure that must be really frustrating,

1:38:57

but yeah.

1:38:57

I don't know. For me, it's, um, Brave does

1:38:59

the job really well, and that's what I

1:39:01

use. But yeah, that's, um...

1:39:04

Real quick,

1:39:04

that is another thing that I think almost

1:39:06

all of these AIs, like Jordan was saying,

1:39:12

they're all,

1:39:14

or going back to what Jordan was saying,

1:39:15

they're all kind of just proxies.

1:39:18

Brave's Leo, Proton's Lumo,

1:39:21

and they're done in such a way that

1:39:23

it's a little bit more private for the

1:39:24

end user,

1:39:25

but at the end of the day,

1:39:25

they're still using ChatGPT or Claude or

1:39:30

whoever else,

1:39:30

and so

1:39:31

they still kind of do suffer from

1:39:33

the same privacy problems of scraping data

1:39:37

that they morally probably shouldn't have.
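To make the "proxy" idea the hosts keep coming back to a bit more concrete, here is a minimal, purely illustrative sketch of the pattern: a frontend accepts the user's query, forwards only the query text to an upstream provider, and relays the answer back, so the upstream sees the proxy's IP rather than the user's and gets none of the user's headers or cookies. This is not DuckDuckGo's, Brave's, or Proton's actual implementation; the upstream URL and JSON field names below are hypothetical placeholders.

```python
# Minimal sketch of the "privacy proxy" pattern discussed above: forward ONLY
# the query text to an upstream provider, dropping client-identifying details.
# UPSTREAM and the field names are hypothetical placeholders, not a real API.
import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "https://upstream.example/api/answer"  # hypothetical provider endpoint


class ProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        query = json.loads(self.rfile.read(length)).get("query", "")

        # Forward only the query text; the user's IP, cookies, and browser
        # headers never reach the upstream provider.
        upstream_req = urllib.request.Request(
            UPSTREAM,
            data=json.dumps({"query": query}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(upstream_req) as resp:
            answer = resp.read()

        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(answer)


if __name__ == "__main__":
    # Listens locally for demonstration; a real deployment runs on the
    # proxy operator's servers.
    HTTPServer(("127.0.0.1", 8080), ProxyHandler).serve_forever()
```

Even with a proxy like this in front, the upstream model or index still receives the content of the query itself, which is exactly the limitation being described here: it hides who is asking, not what is being asked.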

1:39:40

So yeah,

1:39:43

that's all I got on that one.

1:39:44

I think, yeah, there's definitely,

1:39:48

I think there's some valid issues with

1:39:51

these alternative search engines, right?

1:39:53

Like I think one area that is sometimes

1:39:58

not talked about is a lot of these

1:40:02

alternative search engines,

1:40:05

they aren't going to be as good.

1:40:07

I think your location certainly does

1:40:09

matter, right?

1:40:11

Some people complaining about the search

1:40:12

results might be in different countries.

1:40:16

If I search

1:40:19

something that's super relevant in my

1:40:20

country and it doesn't show up,

1:40:22

that's obviously going to be a problem,

1:40:23

right?

1:40:25

And I think it gets compounded even more,

1:40:29

for instance, in different languages.

1:40:31

So if you were searching in a different

1:40:33

language,

1:40:33

I'm sure that is going to be even

1:40:35

worse.

1:40:36

The results are going to be even worse

1:40:37

than in Google.

1:40:38

So it is kind of definitely a bit

1:40:43

of a, I don't know,

1:40:46

your use case might vary situation.

1:40:49

Personally, I found that, you know,

1:40:52

I've tried a lot of these different search

1:40:55

engines and none of them have been

1:40:57

particularly terrible.

1:40:59

But, you know,

1:41:00

I find that DuckDuckGo is generally fine.

1:41:04

But I think, you know,

1:41:05

people need to try and see which one

1:41:08

works the best and which one you're most

1:41:10

comfortable with.

1:41:12

I've tried a lot of them and a

1:41:13

lot of them I just wasn't super happy

1:41:15

with them.

1:41:16

And, you know,

1:41:18

it just depends on your use case of

1:41:20

the search engine.

1:41:21

Like not everyone is going to use the

1:41:22

search engine the same way.

1:41:24

So I don't think it's strange to have

1:41:28

an issue with the search results.

1:41:33

Yeah,

1:41:33

I wonder if his point was just kind

1:41:35

of,

1:41:37

maybe what he was saying is it's just

1:41:38

kind of weird that people complain that

1:41:39

like, oh, this is just a proxy.

1:41:41

And it's like, well, yeah,

1:41:41

but isn't the search engine also a proxy?

1:41:43

But I don't know.

1:41:44

I think that might be more indicative of

1:41:45

the,

1:41:46

general attitudes that people have toward

1:41:48

AI in general.

1:41:50

Yeah, which I guess at that point,

1:41:53

my question would be,

1:41:54

is DuckDuckGo at least keeping...

1:41:57

in line with their ethos.

1:41:58

I hate to take shots at Mozilla here,

1:42:00

but Mozilla's AI integration is absolutely

1:42:04

abysmal.

1:42:06

I don't feel bad saying that,

1:42:07

but their AI integration,

1:42:09

they straight up tell you,

1:42:10

even in their blog post,

1:42:11

I'll give them credit,

1:42:11

they're very open about this,

1:42:13

but they're basically just like, oh yeah,

1:42:14

we've integrated ChatGPT and you can just

1:42:17

click the little button in the sidebar and

1:42:18

it's super easy.

1:42:19

And by the way, once you use it,

1:42:21

you're totally at OpenAI's mercy as far as

1:42:24

privacy goes.

1:42:26

But it's like, what's even the point?

1:42:29

I can just bookmark ChatGPT.

1:42:31

I don't need a little pop-up window to

1:42:33

use that.

1:42:34

Why would I use this at all?

1:42:36

So I do have to wonder if,

1:42:38

and I think it does,

1:42:39

but don't quote me.

1:42:39

I do have to wonder if DuckDuckAI is

1:42:41

like, yes, it's a proxy,

1:42:42

but at least they have legal agreements

1:42:45

with the companies not to train on your

1:42:47

prompts or they proxy it so it doesn't

1:42:49

get your

1:42:51

device fingerprint. Like, I have to wonder

1:42:52

if they've done anything to make it more

1:42:54

privacy-respecting, in which case I would

1:42:56

argue, like, yes, it is still a proxy,

1:42:59

but at least they're in keeping with their

1:43:01

ethos of like trying to make it a

1:43:02

more private experience again for the end

1:43:05

user. I really want to stress that.

1:43:06

Um, but yeah, I don't know. Um, I

1:43:11

will say real quick in response to what

1:43:12

you said, I think my favorite search engine

1:43:14

in terms of actual effectiveness was, uh,

1:43:18

I think SearX is how you pronounce

1:43:20

it.

1:43:20

SearX, I think it's SearXNG now.

1:43:22

"Searching," I think, is how it's pronounced.

1:43:25

I used to use the searx.be instance,

1:43:29

which I think is still around.

1:43:31

And I will admit the results were

1:43:33

impressive.

1:43:33

I could always find the weirdest niche

1:43:36

stuff that I couldn't find anywhere else.

1:43:39

Because it drew from so many different

1:43:41

search engines.

1:43:43

But yeah, at the end of the day,

1:43:44

personally,

1:43:45

I decided that what I really wanted was

1:43:46

like an independent index that wasn't just

1:43:49

a meta search engine.

1:43:50

So that's one of the reasons that I

1:43:51

went with Brave.

1:43:52

But yeah, I'm with you.

1:43:55

They're all different.

1:43:56

I think it's totally valid to just try

1:43:58

out different ones and see which one works

1:43:59

best.

1:44:03

That's kind of really it for questions,

1:44:05

I think.

1:44:05

We did have another comment here.

1:44:06

Somebody said, happy birthday, privacy.

1:44:09

Is that in reference to Data Privacy Day

1:44:12

recently?

1:44:12

Or is this when Privacy Guides was

1:44:14

founded?

1:44:15

I'm not sure.

1:44:17

I feel like I should know that.

1:44:20

Yeah, I don't know.

1:44:21

It seems an interesting comment to make.

1:44:24

Thanks, I think.

1:44:27

You know,

1:44:27

I will take this as an excuse to

1:44:28

go get some cake after this.

1:44:30

I'm down.

1:44:33

We'll put it on the company card.

1:44:35

I don't have a company card.

1:44:36

Jonah, I want a company card.

1:44:37

I'm kidding.

1:44:40

I just like saying random stuff.

1:44:42

And then, yeah,

1:44:43

one last person left a comment when we

1:44:44

were talking about AI.

1:44:45

They said use local AI instead.

1:44:48

Yeah,

1:44:48

I think local AI is probably going to

1:44:50

be best for, again, end user privacy,

1:44:54

especially because a lot of them you can

1:44:55

completely firewall on your computer

1:44:57

or your device and they never have to

1:44:59

touch the internet.
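As an illustration of that local-AI point, here is a minimal sketch of fully offline inference, assuming the llama-cpp-python bindings and a GGUF model file already downloaded to disk (the path below is a hypothetical placeholder, not a specific recommended model).

```python
# Minimal sketch: fully local text generation with llama-cpp-python.
# Assumes the model weights are already saved at the (hypothetical) path
# below; after that, nothing in this flow needs network access.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.gguf",  # hypothetical local weights file
    n_ctx=2048,                              # context window size
    verbose=False,
)

result = llm(
    "Explain in one sentence why local inference keeps prompts private.",
    max_tokens=64,
)

# The prompt and the completion never leave this machine.
print(result["choices"][0]["text"].strip())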

1:45:00

But again, we were also talking,

1:45:03

I know I'm like really beating it over

1:45:04

the head lately,

1:45:05

but just the idea of like,

1:45:06

it's still trained on user data that may

1:45:10

not have been consensually collected in a

1:45:12

lot of cases.

1:45:12

There's several ongoing lawsuits about

1:45:14

this very issue right now of copyrighted

1:45:17

works that, and actually,

1:45:20

you mentioned it earlier and I completely

1:45:21

forgot to bring it up,

1:45:21

but I think that's so funny that Sam

1:45:24

Altman is like, oh, well,

1:45:25

if we can't steal copyrighted material,

1:45:27

then we don't have a business.

1:45:28

And it's like, okay,

1:45:30

Like,

1:45:30

could you imagine a drug dealer using that

1:45:32

in court?

1:45:32

Like, well, your honor,

1:45:33

if I can't cook meth,

1:45:34

then I don't have a business.

1:45:35

And it's like, well, meth is illegal.

1:45:37

So that sounds like a you problem.

1:45:39

Like,

1:45:40

that's insane that they're even trying to

1:45:41

use that as a defense.

1:45:42

And it's like, well,

1:45:43

if I can't steal everybody's property,

1:45:45

then I don't have a business.

1:45:46

It's like, then you don't have a business.

1:45:47

That's how it works.

1:45:51

Sorry.

1:45:52

I know I'm ranting a little bit,

1:45:53

but the nerve to use that.

1:45:58

All right.

1:46:00

I think that's all we got this week.

1:46:04

So let me...

1:46:07

pull up my notes here.

1:46:08

All right,

1:46:10

so all the updates from this week in

1:46:11

privacy are already shared on the blog,

1:46:14

actually.

1:46:15

We have actually,

1:46:16

in case you guys didn't know,

1:46:17

we now are sending the blog post out

1:46:20

at the same time that we start streaming.

1:46:22

So if you want,

1:46:22

you can go sign up for the newsletter

1:46:24

and subscribe on your favorite RSS reader

1:46:26

and you will get a reminder.

1:46:28

There's a link to the StreamYard stream

1:46:30

right there in the newsletter,

1:46:31

so it's a really easy way to get

1:46:32

a reminder and start watching.

1:46:34

For people who prefer audio,

1:46:36

we offer an audio podcast available on all

1:46:38

podcast platforms and RSS,

1:46:40

and the video will also be synced to

1:46:42

PeerTube.

1:46:43

These will be available after the fact.

1:46:45

Privacy Guides is an impartial nonprofit

1:46:48

organization that is focused on building a

1:46:50

strong privacy advocacy community and

1:46:52

delivering the best digital privacy and

1:46:54

consumer technology rights advice on the

1:46:55

internet.

1:46:56

If you want to support our mission,

1:46:57

you can make a donation on our website,

1:46:59

privacyguides.org.

1:47:01

To make a donation,

1:47:02

click the red heart icon located in the

1:47:04

top right corner of the page.

1:47:05

You can contribute using standard fiat

1:47:07

currency via debit or credit card,

1:47:09

or opt to donate anonymously using Monero

1:47:11

or your favorite cryptocurrency.

1:47:13

Becoming a paid member unlocks exclusive

1:47:15

perks like early access to video content

1:47:18

and priority during the This Week in

1:47:19

Privacy livestream Q&A.

1:47:21

You'll also get a cool badge on your

1:47:22

profile in the forum and the warm fuzzy

1:47:24

feeling of supporting independent media.

1:47:26

Thank you guys so much for tuning in

1:47:27

this week,

1:47:27

and we will be back next week with

1:47:29

more news.

1:47:30

Bye, everybody.