New Exploit Affects 220 Million iPhones
Ep. 45

Episode description

The 'Dark Sword' exploit affects over 220 million iPhones running outdated iOS versions, the FBI is buying location data to track US citizens, a Tennessee grandmother was jailed after an AI facial recognition error linked her to fraud, and much more. Join us for This Week In Privacy #45!

0:05

A brand new exploit impacting iPhones.

0:08

The FBI has resumed buying location data

0:11

and Google's update to installing third

0:13

party apps.

0:15

All this and more coming up on This

0:17

Week in Privacy #45.

0:20

So stay tuned.


0:54

Welcome back to This Week in Privacy,

0:56

our weekly series where we discuss the

0:57

latest updates on what we're working on

0:59

within the Privacy Guides community and

1:01

this week's top stories in the data

1:02

privacy and cybersecurity space.

1:05

I am Nate,

1:05

and with me this week is Jordan.

1:07

Jordan, it's been a while.

1:08

How are you?

1:10

I'm good.

1:10

Just excited to be here and cover the

1:12

latest news.

1:14

Yeah, it's good to have you back.

1:16

Privacy Guides, for those who don't know,

1:17

is a nonprofit which researches and shares

1:20

privacy-related information and

1:21

facilitates a community on our forum and

1:24

Matrix where people can ask questions and

1:25

get advice about staying private online

1:27

and preserving their digital rights.

1:29

With that,

1:30

we will launch into the biggest news in

1:32

the privacy and security space from the

1:33

past week.

1:35

And Jordan is going to tell us all

1:36

about hundreds of millions of iPhones that

1:38

can be hacked with a new tool found

1:40

in the wild.

1:43

Yes, that's right.

1:44

So basically there's a story here from

1:46

Wired.

1:47

A powerful iPhone hacking technique known

1:49

as Dark Sword, one word,

1:52

has been discovered in use by Russian

1:54

hackers.

1:55

It can take over devices running iOS that

1:58

simply visit infected websites.

2:01

So, uh, reading into this story here: iPhone

2:05

hacking techniques have sometimes been

2:06

described almost like rare and elusive

2:10

animals. Hackers have used them so

2:12

stealthily and carefully against such a

2:13

small number of hand-picked targets,

2:15

they're only rarely seen in the wild. Now,

2:18

a recent spate of espionage and cyber-

2:20

criminal campaigns have deployed those

2:23

same phone-takeover tools,

2:26

embedded in infected websites, to

2:28

indiscriminately hack phones by the

2:31

thousands.

2:32

You might have to take over here, Nate,

2:33

because the article is paywalled for me.

2:37

Oh no, that's unfortunate.

2:39

Okay.

2:39

Um, yeah.

2:41

So, uh,

2:43

basically this article came or this

2:45

disclosure, I should say,

2:46

came from Google as well as iVerify

2:49

and another firm called Lookout.

2:52

They revealed this on Wednesday and they

2:55

said that this isn't really a, well,

2:58

I guess it kind of is.

2:59

Um, this isn't an exploit.

3:02

How do I word this?

3:03

This is an exploit on iPhones,

3:05

but also not,

3:07

because they're actually infecting

3:08

websites.

3:09

And then the websites are the ones who

3:11

are delivering this, again,

3:13

not even a payload.

3:14

Further down on the article,

3:16

it says that this is actually one of

3:18

those

3:19

those malwares that can be defeated with a

3:21

reboot. When your device becomes infected,

3:27

it's able to grab as much data as

3:28

it possibly can.

3:29

And because it's not persistent,

3:30

it's actually pretty hard to...

3:35

for these cybersecurity companies to trace

3:38

evidence of it.

3:39

It's not like the typical Pegasus or those

3:42

kind of more advanced malwares that we see

3:44

where there's things that they can look

3:47

for.

3:47

I think it's actually right here.

3:49

It uses fileless malware.

3:51

Hold on.

3:52

Okay, yeah.

3:52

Rather than install spyware that persists

3:54

on users' phones,

3:54

Dark Sword uses stealthier techniques that

3:56

are more often seen in fileless malware

3:58

that typically target Windows devices.

4:00

They hijack the legitimate process on an

4:02

iPhone's operating system to steal data.

4:04

And then this is a quote from one

4:05

of iVerify's people.

4:06

It says,

4:06

instead of a spyware payload to brute

4:08

force your way through the file system,

4:10

which leaves tons of artifacts of

4:11

exploitation that are pretty easy to

4:13

detect,

4:13

this just uses system processes the way

4:15

they're meant to be used,

4:17

and it leaves far fewer traces.

4:20

Um, so yeah,

4:20

the upside there is that it does not

4:22

persist after reboot.

4:23

Uh,

4:24

but instead it steals data from the phone

4:26

within the first few minutes after it's

4:27

hacked,

4:27

which is called a smash-and-grab approach,

4:29

or at least that's what this guy calls

4:31

it.

4:31

So it's very, um,

4:34

it does the damage very quickly,

4:35

I should say.

4:37

And, uh, yeah, so I guess

4:41

the pro and con here,

4:42

and just in case anyone's wondering,

4:43

because earlier this week or late last

4:45

week,

4:45

we also saw there was a malware called

4:47

Karuna, which appears to be an iPhone

4:53

exploit, not state-sponsored.

4:54

How do I explain it?

4:57

So for those who don't know,

4:59

a lot of the time we see...

5:02

We see companies,

5:04

big companies that will spend millions of

5:05

dollars to either find zero days or they

5:08

will go to places like DEF CON and

5:11

Black Hat, and they will pay

5:15

big money if people there say, you know,

5:17

they do a presentation.

5:17

They're like, hey,

5:18

I found this this exploit.

5:20

And it's interesting,

5:20

and it's never been seen before.

5:22

They'll go up to those people and be

5:23

like, hey, next time, give us a call,

5:25

and we'll pay you to kind of keep

5:26

it quiet.

5:28

I believe Nicole Perlroth has a great

5:30

book called This Is How They Tell

5:32

Me the World Ends, which is all about the

5:34

zero-day market and everything.

5:35

So if you want to know more,

5:36

definitely check that out.

5:39

there was an employee of one of those

5:42

firms who was accused of selling access to

5:44

these tools to Russia.

5:45

Uh, I believe he was convicted recently.

5:49

And then around the same time we saw

5:50

this other malware or this other exploit

5:53

called Karuna,

5:54

which was making the rounds.

5:55

This does not appear to be Karuna,

5:57

but they do have evidence to believe that

5:59

this came from one of those zero-day

6:02

reseller firms.

6:04

Um,

6:05

Which, you know what, yeah,

6:06

I'll go ahead and touch on that now.

6:08

So they talk about – and I know

6:10

I've said this in the past.

6:11

Like when Pegasus first came to light and

6:13

everything, a lot of people were like, oh,

6:14

no, how do I know I'm infected?

6:16

And we used to say like you're probably

6:17

not because this is not something they're

6:19

going to burn on any random person.

6:22

They're going to use this on like lawyers

6:24

and activists and political figures,

6:26

journalists, dissidents.

6:28

Um, the thing is, with this Dark Sword one,

6:32

iVerify's Cole argues that the fact

6:34

that it was used so brazenly with no

6:35

real attempt to prevent its discovery

6:37

suggests that hacking techniques are now

6:39

attainable on the black market.

6:41

Uh,

6:41

attainable enough that hackers are willing

6:43

to use them indiscriminately,

6:44

even if the result is their exposure.

6:45

He says, if one gets burned,

6:47

I'll just go buy another one.

6:48

Uh,

6:48

they know that there's more where this

6:49

came from.

6:50

So, um,

6:53

I still think the risk of falling to

6:56

some of these malware is pretty low,

6:57

but it does seem to be increasing,

6:59

which...

7:01

unfortunately is something we see

7:02

historically.

7:03

I mean, we see this all across technology,

7:05

right?

7:05

Like when computers first came out,

7:08

they were really expensive,

7:09

and only rich

7:11

people had them.

7:12

And now you can buy a Chromebook for

7:14

a couple hundred bucks,

7:15

which I understand is still relatively

7:16

expensive for some people.

7:17

But the point is the price came down

7:19

and now it's something that's much more

7:21

attainable to the average person.

7:22

So that does appear to be what's happening

7:25

with malware here.

7:27

Now,

7:28

the last thing I want to touch on

7:29

that was in the story,

7:31

that many of you may have noticed.

7:33

This only works on iOS,

7:35

which is because Apple changed their

7:37

naming scheme with the latest iOS.

7:38

So this current iOS is iOS because it

7:42

would be iOS if they hadn't

7:45

renamed it.

7:46

So this is the previous major version of

7:49

iOS.

7:51

However,

7:53

Apple confirms that about a quarter of all

7:56

their devices are still running iOS.

7:58

That could be for any number of reasons.

8:01

Liquid Glass was really, really unpopular,

8:03

so a lot of people did not like

8:04

iOS.

8:06

A lot of people choose not to update

8:07

because they don't want the AI features,

8:10

which I think might actually be in iOS

8:13

I could be wrong there, but yeah.

8:16

As another explanation,

8:18

for some reason,

8:19

Apple is really bad at automatic updates.

8:21

We were talking about this in a group

8:22

chat the other day, actually.

8:23

It's like every time I go to the

8:24

App Store on my iPhone,

8:25

it's got like a bunch of apps that

8:27

haven't updated,

8:28

even though the update came out like three

8:29

or four days ago.

8:30

And you have to update those.

8:33

There was a...

8:34

Actually, if you're an Apple user,

8:35

there was a background security update

8:37

that just came out earlier this week that,

8:40

for most people,

8:41

did not automatically install.

8:43

So go check for that.

8:44

It's just, yeah, Apple's,

8:46

so that could be part of it.

8:50

And just for context, there are,

8:52

I checked, according to one source,

8:53

there are 1.5 billion iOS devices

8:55

out there right now in active use.

8:58

So a quarter of those is still like

9:00

300 million,

9:02

which is like the entire population of the

9:03

U.S.

9:03

So even though this is an older iOS

9:08

device,

9:08

it still affects hundreds of millions of

9:10

people potentially.

9:11

And if this has fallen into the hands

9:14

of the average...

9:16

What's the word I'm looking for?

9:19

The average cybercriminal, then...

9:24

What I was saying earlier, about them

9:26

only using this on dissidents and

9:28

journalists,

9:29

and unfortunately that does not seem to be

9:31

the case.

9:32

So it is really important to keep your

9:34

stuff updated.

9:37

It is...

9:39

Yeah, I don't know.

9:41

I think that's all I got to say

9:41

is it is really important to keep your

9:44

stuff updated.

9:45

And I see some people in the comments,

9:47

personal pet peeve,

9:48

I see some people in the comments

9:49

sometimes that are like, well,

9:50

I'm still on Android 12 because I

9:51

don't want the AI stuff.

9:52

And it's like,

9:53

I respect that you don't want the AI

9:55

stuff and I'm not telling you you should

9:56

just embrace it.

9:57

But at that point,

9:57

maybe you should be looking into like

9:59

alternative ROMs or moving to a more

10:01

trusted OS because, yeah,

10:02

sometimes these security updates really

10:04

are important.

10:06

Um, I think that's kind of the

10:08

bare bones of the story, and that's all

10:10

I got. Did you have anything to add

10:11

that I missed? Um, I think it is

10:14

important that we talk about specifically

10:18

what this attack actually looked like.

10:20

So if you don't know, this is,

10:22

sort of, I guess, what iVerify was

10:24

saying is like a watering hole attack.

10:27

So basically, that means it's an attack

10:29

strategy where an attacker will

10:33

find websites that users commonly visit

10:37

and then use those websites to distribute

10:40

malware.

10:41

So in this case,

10:45

the attack was against users running iOS

10:47

18.4 to

10:49

18.6.2.

10:51

So just to be clear,

10:53

if you're fully up to date on

10:55

iOS 18, you should be on, I think,

10:58

iOS 18.7.something.

11:01

So even if you're

11:04

running iOS 18,

11:05

it may not affect you.

11:07

So just be aware of that.

11:08

And the attack itself was basically

11:13

as far as iVerify is stating here, it

11:15

was, you know, an attack from Russia,

11:20

and it was specifically used on

11:26

government websites, so Ukrainian

11:28

government websites. Um, so that was any

11:31

website ending with gov.ua. So basically

11:35

they were able to

11:39

compromise Ukrainian government servers

11:42

and put this malware out there

11:48

onto these devices.

11:49

And especially because it was a government

11:51

website, it was very, you know,

11:55

no one from another country is going to

11:57

be visiting that website.

11:58

So it's a pretty effective way to infect

12:01

a lot of people's devices.
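For context on how a watering-hole page can hit only its intended victims, here is a hypothetical Python sketch of gating a payload by User-Agent. This is not DarkSword's actual code; the regex, the function names, and the inclusive 18.4 to 18.6.2 bounds are all assumptions based on the version range discussed in this episode.

```python
import re

# Affected range discussed above (assumed inclusive at both ends).
AFFECTED_MIN = (18, 4, 0)
AFFECTED_MAX = (18, 6, 2)

def ios_version(user_agent):
    """Pull the iOS version out of a Safari User-Agent string, e.g. 'OS 18_5' -> (18, 5, 0)."""
    m = re.search(r"OS (\d+)_(\d+)(?:_(\d+))?", user_agent)
    if not m:
        return None  # not an iOS browser
    return (int(m.group(1)), int(m.group(2)), int(m.group(3) or 0))

def in_affected_range(user_agent):
    """True when the visitor reports an iOS build inside the vulnerable window."""
    v = ios_version(user_agent)
    return v is not None and AFFECTED_MIN <= v <= AFFECTED_MAX

vulnerable = ("Mozilla/5.0 (iPhone; CPU iPhone OS 18_5 like Mac OS X) "
              "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.5 Mobile/15E148 Safari/604.1")
patched = vulnerable.replace("18_5", "18_7_2").replace("18.5", "18.7")

print(in_affected_range(vulnerable))  # True
print(in_affected_range(patched))     # False
```

A compromised page that only serves its exploit to matching visitors is part of why staying on the latest patch level matters: it moves your device out of the window the attacker is filtering for.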

12:03

And I think, you know,

12:04

staying up to date is important as well.

12:07

But I think, you know,

12:10

I think a lot of people probably wouldn't

12:13

have been affected by this if they were

12:14

running Lockdown Mode,

12:15

because it does sound like that

12:19

would probably block the

12:23

exploit chain, because in a lot of

12:24

cases, this exploit itself was written

12:28

in JavaScript, and the exploit, according to

12:31

iVerify, used six

12:34

vulnerabilities across two exploit chains.

12:37

So, um, I think, you know,

12:42

Staying up to date is important,

12:43

but also minimizing your attack surface.

12:46

So in this case,

12:47

not using all these third-party, um,

12:49

you know, JavaScript libraries,

12:52

locking that down with Lockdown Mode,

12:54

that's gonna definitely protect you in

12:55

that case.

12:56

Same thing with Android, right?

12:58

You can, I know on GrapheneOS,

13:01

they use, like, MTE in the browser and

13:03

a bunch of other protections.

13:05

So I think reducing the attack surface,

13:07

and just-in-time JavaScript compilation

13:09

is also commonly exploited.

13:10

Um,

13:12

So I think disabling a lot of those

13:14

things can help, but obviously, you know,

13:17

updating your device is important,

13:20

but I think, you know, it's usually the,

13:24

uh,

13:25

these things that like are there for like

13:29

web convenience and are actually there to

13:31

protect you.

13:32

Like the.

13:34

they use for rendering WebGL stuff,

13:36

that can be exploited.

13:38

I think it's important to be aware of

13:40

that and not to just trust every single

13:44

website just because it's a government

13:46

website, right?

13:47

Um,

13:47

so I think there was another thing that

13:49

they also said, um,

13:50

basically because they didn't obfuscate

13:53

the JavaScript, um,

13:55

it basically was sitting on the website

13:57

and a bunch of other groups were stealing

13:59

the code to use as well.

14:01

So, uh, apparently according to iVerify,

14:04

um,

14:04

a Chinese criminal group was also using

14:07

this, um, Dark Sword and Karuna exploit,

14:10

um,

14:12

So yeah,

14:14

just be on the lookout because people are

14:16

definitely using this.

14:17

So make sure you're updated.

14:18

Make sure you're using lockdown mode if

14:20

you're thinking you might be a target of

14:22

this.

14:22

But it does seem like this is

14:24

a very large campaign;

14:29

they're trying to target a lot of

14:30

people with this.

14:31

It's not like a specifically, um,

14:34

it's not specifically targeted towards a

14:36

single individual.

14:37

Um,

14:38

so I'm sure that there's people in the

14:40

military,

14:41

in the Ukrainian military who probably

14:42

visit those websites who unfortunately

14:44

have been, um, compromised.

14:46

So it's,

14:49

it's a wide,

14:50

they're casting a wide net to

14:52

get access to people's, um, devices.

14:54

But I think, uh, the

14:58

estimate that they've given on here

15:00

was

15:01

on the high side.

15:02

I think I saw a couple of other

15:04

websites saying it was closer to

15:05

200 million devices affected.

15:07

So I don't know.

15:08

I think it's, yeah,

15:13

just be on the lookout for that.

15:14

I don't really have too much more to

15:15

add.

15:16

Do you have anything else you want to

15:17

add here, Nate?

15:19

No.

15:20

Yeah,

15:20

that, like, 300 million number was

15:22

just an estimate I came up with by

15:23

doing the math of, like, 1.5

15:25

billion devices or whatever.

15:26

So it may not be exact.

15:27

That may be on the high side.

15:30

But yeah,

15:31

it's thank you for mentioning lockdown

15:33

mode because I definitely forgot to

15:34

mention that.

15:35

They did say that Lockdown Mode would have

15:37

defended against this.

15:39

So Apple did.

15:40

Like you said,

15:41

they did push out an update to devices

15:43

that are not able to update to iOS

15:45

26.

15:46

So if you're sitting here and you're just

15:48

like, "I can't update," dude,

15:49

well, make sure you get that update at

15:50

least, because that would be helpful.

15:52

But yeah, also Lockdown Mode is helpful.

15:56

Yeah, like you said,

15:57

that's an important piece of context is

15:59

whoever got a hold of this,

16:01

which I think was Russia, like you said,

16:03

they kind of left it out in the

16:05

open.

16:05

So originally they were using it

16:07

specifically on like Ukrainian news sites,

16:09

Ukrainian government sites,

16:10

like they were clearly targeting

16:11

Ukrainians.

16:12

But now that they just left it out

16:14

there unsecured and anybody can go grab it

16:16

and there's like comments in the code

16:18

about what each module does and how to

16:20

use it.

16:20

So it's kind of like,

16:24

They made it so easy now,

16:25

and now it's out there in the wild,

16:27

and who knows where it'll pop up.

16:28

So yeah, that's unfortunate for sure.

16:34

Definitely.

16:35

I guess we can move on to the

16:36

next story here,

16:38

if you want to take that.

16:41

Sure.

16:41

So this next story is about the U.S.

16:45

government buying location data.

16:47

And I know this probably...

16:53

This is and isn't a surprise.

16:54

So back in 2023,

16:57

the government put a pause on buying

16:59

location data.

17:01

I cannot remember if that was something

17:03

that they were ordered to do by the

17:05

White House or if they just stopped doing

17:07

it for one reason or another.

17:08

But they stopped doing it.

17:10

And now, I think "confirm" is a strong

17:12

word here.

17:13

Here in the title, I believe, it says,

17:14

yeah, "director confirms," but he didn't really confirm.

17:18

So the question was:

17:21

Basically, Ron Wyden,

17:24

who, I think, not a controversial take,

17:27

we like him for privacy at least.

17:28

I'll be honest.

17:29

I don't know any of his other policies,

17:30

but he does really good work for privacy,

17:33

and he's really on the ball for that.

17:35

He asked the FBI – or he asked

17:37

Kash Patel if the FBI would commit to

17:39

not buying Americans' location data,

17:41

and Kash Patel said that the agency,

17:43

quote,

17:43

uses all tools available to do our

17:45

mission.

17:47

So he didn't directly say it,

17:49

but, I mean, you know,

17:51

come on, when he refuses to deny it,

17:53

you know for sure.

17:54

So this,

17:58

this is... Outside of the privacy space,

18:01

because I think in the privacy space,

18:02

we all universally recognize that this is

18:04

an awful thing that needs to stop.

18:07

But even in mainstream circles,

18:09

this is a very controversial thing that

18:11

the U.S.

18:11

government and law enforcement do, because

18:14

Law enforcement is supposed to get a

18:16

warrant whenever they want to search your

18:18

data.

18:19

And by going to these third-party vendors,

18:27

they don't have to get a warrant.

18:28

But the article notes that, interestingly,

18:31

this is –

18:34

Well, okay, maybe this isn't a one-to-one,

18:36

but the FBI claims it does not need

18:37

a warrant to use this information for

18:39

federal investigations,

18:40

though the theory has not yet been tested

18:41

in court.

18:42

So the way that I read that is

18:44

like maybe this whole –

18:46

going to third party brokers,

18:48

if that went to court and a judge

18:50

said, no, you can't do that,

18:51

then maybe that would become illegal,

18:52

but it has not yet been to court.

18:55

Or maybe it is just totally legal.

18:57

I know,

18:57

I believe Wyden has in the past tried

19:00

to introduce a bill.

19:01

It was called, like, the Fourth Amendment Is

19:02

Not For Sale Act or something,

19:04

which would have outlawed this

19:05

specifically,

19:06

but of course it did not pass.

19:08

And now I know the,

19:12

I think Section 702,

19:13

if I remember correctly, which is,

19:16

what allows the NSA to like bulk collect

19:18

data, um,

19:19

that I believe is up for renewal and,

19:22

um,

19:23

hopefully will not get renewed.

19:24

But then it says here at the end

19:25

that Wyden and several other lawmakers

19:26

have introduced a bipartisan act called

19:28

the Government Surveillance Reform Act,

19:30

which among other things would require a

19:31

court authorized warrant before federal

19:33

agencies can buy Americans information

19:34

from data brokers,

19:36

which personal opinion does not seem

19:38

unreasonable.

19:39

Like I don't think anybody's telling them

19:41

not to do their job.

19:42

I think we're just telling you to go

19:43

through the proper channels where there's

19:44

oversight and there's accountability.

19:47

But I don't know.

19:49

They seem to disagree for some reason.

19:51

The last thing I want to mention here

19:53

– this has become a little bit of

19:55

a personal crusade of mine.

19:56

It says here for audio listeners,

19:59

it says that U.S.

20:00

Customs and Border Patrol control – Border

20:02

Protection, excuse me.

20:03

U.S.

20:03

Customs and Border Protection purchased a

20:05

bunch of data sourced from real-time

20:07

bidding, or RTB, services, according to a

20:08

document obtained by 404 Media.

20:10

So –

20:11

For those who do not know about this,

20:13

there's a lot of really good resources out

20:15

there.

20:15

EFF has an article.

20:16

I mentioned before that Byron Tau has a

20:18

book called Means of Control that dives in

20:20

deep into this.

20:21

But the way that ads on the internet

20:23

work is you go to a website that

20:24

has ads.

20:25

Let's say Reuters because as much as I

20:27

like Reuters,

20:27

their website is littered with ads.

20:30

Most news websites are.

20:31

So you go to a news website and

20:34

20:35

When there's that ad space,

20:37

they basically open it up for bidding,

20:39

just like any given auction.

20:40

They're like, who wants this ad space?

20:41

Who's willing to pay the most for it?

20:43

And in order for those advertisers to

20:44

decide how much they want to pay,

20:47

they get your data.

20:49

They get a copy of your data so

20:51

that they can decide, oh,

20:52

this person is worth this much to me.

20:53

And then they submit their bid and whoever

20:56

wins, you see that ad.

20:58

The thing is they don't have to bid

21:01

to get your data,

21:03

which in theory makes sense, right?

21:05

Because if they get your data and they're

21:06

like, "oh, never mind, I don't want to bid,"

21:07

they still have that copy of your

21:09

data.

21:09

And so this is a proven thing.

21:11

There are companies out there who will

21:13

enter the advertising ecosystem just to

21:16

get a copy of your data and then

21:18

turn around and sell it to people like

21:19

the FBI.
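To make the auction mechanics described above concrete, here is a toy Python sketch of the real-time bidding flow. Names like BidRequest, data_harvester, and run_auction are invented for illustration, and real OpenRTB requests carry far richer user and device fields. The point it demonstrates: every participant receives a copy of the bid request, including a "bidder" that never intends to win.

```python
from dataclasses import dataclass

@dataclass
class BidRequest:
    # Illustrative fields; real bid requests carry much more.
    user_id: str
    lat: float
    lon: float
    page_url: str

def run_auction(request, bidders):
    """Every registered bidder receives a copy of the request, win or lose."""
    bids = {}
    for name, bidder in bidders.items():
        bids[name] = bidder(request)  # the data has already been shared at this point
    winner = max(bids, key=bids.get)  # highest bid takes the ad slot
    return winner, bids

seen = []  # what a participant that never intends to win can still collect

def data_harvester(req):
    seen.append((req.user_id, req.lat, req.lon))  # stockpile location data to resell later
    return 0.0  # bid nothing

def real_advertiser(req):
    return 1.25  # actually wants the ad slot

winner, _ = run_auction(
    BidRequest("abc123", 40.71, -74.00, "news.example/article"),
    {"harvester": data_harvester, "advertiser": real_advertiser},
)
print(winner)     # the advertiser wins the slot...
print(len(seen))  # ...but the harvester kept the location data anyway
```

This is why blocking the ad request entirely, rather than just ignoring the ads, is what actually keeps your data out of the auction.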

21:20

So where I'm going with this is if

21:23

you are not using an ad blocker,

21:26

That is, in my opinion,

21:27

one of the currently most overlooked ways

21:29

to protect your privacy.

21:30

And obviously there's a million other

21:32

ways, right?

21:32

You need to switch to a secure messenger.

21:34

You need to switch to a...

21:37

private email.

21:39

Ideally,

21:39

we should get off Windows and switch to

21:42

Linux and stuff.

21:43

And I know I have a Mac in

21:44

front of me.

21:44

It's specifically for streaming and

21:46

editing, for the record.

21:47

This is not my daily computer.

21:48

But we should make all those steps.

21:52

But to me,

21:53

the lowest hanging fruit to start with is

21:55

installing an ad blocker because that

21:58

real-time bidding is happening everywhere,

22:00

all the time, constantly.

22:02

And like I said,

22:02

they don't even need to bid.

22:03

They can just sit there and collect your

22:04

data and then resell it to

22:07

whoever they want.

22:08

Um, so yeah, that is that story.

22:11

That is my takeaway from that story.

22:13

Um, Jordan, did you have any,

22:17

any additional thoughts on that one?

22:18

I mean,

22:19

I guess like this is kind of surprising,

22:22

but I guess not like with,

22:24

with the prevalence of data brokers and

22:26

stuff like that, it's not that surprising.

22:28

Like they said in the, Oh,

22:31

they said in the article, um,

22:33

there was like, you know,

22:37

the FBI is going to use all tools

22:39

at their disposal to do their job.

22:42

So, um, it's kind of, you know,

22:45

it makes sense that they would do that,

22:48

but I guess it needs to be like

22:50

Senator Ron Wyden was saying, like,

22:53

it's not really consistent with the

22:55

Constitution.

22:56

Like, it's a little bit,

22:57

they're kind of bypassing a lot of the

23:00

protections that people had with,

23:02

you know,

23:02

places having to require a warrant

23:04

instead.

23:06

Um,

23:08

It is, yeah,

23:08

I don't really have too much to add

23:09

here, really.

23:11

This is sort of a very American story,

23:14

so I can't really comment too much about

23:16

it.

23:20

Yeah, that's fair.

23:21

Yeah, I mean,

23:24

it's a pretty straightforward story,

23:26

really.

23:26

I don't have too much to add other

23:28

than what I already said.

23:31

But just before we continue,

23:32

I just did see a couple of comments

23:34

we should probably quickly mention here.

23:39

So there was someone who said,

23:41

how do you find out what security updates

23:43

have been loaded if you can't update to

23:45

iOS 26?

23:47

So I wouldn't update to iOS 26.

23:49

If you're on iOS 18,

23:50

just make sure you're on the latest

23:52

version of iOS 18.

23:54

Um,

23:54

I would also check the background security

23:56

improvements tab as well.

23:59

Um, that will also have like,

24:00

there was a,

24:01

there was a background security

24:02

improvement that was released.

24:03

I don't know if that's for iOS,

24:05

you have to look at that.

24:06

Um,

24:07

but I would make sure you're on the

24:08

latest version of iOS.

24:09

You don't have to update to iOS.

24:10

Um,

24:14

I think the latest versions fix a lot

24:16

of these issues.

24:17

So.

24:19

Yeah,

24:20

I wouldn't be too worried as long as

24:23

you're on the latest version.

24:25

Someone also said,

24:25

what is the timeline for the disclosure of

24:27

these sorts of things?

24:28

Is the idea it's better to announce it

24:32

to help make people update?

24:34

I think they've already released.

24:36

Usually, they notify the company,

24:39

in this case, Apple.

24:41

Apple releases a fix,

24:42

the fix gets released,

24:43

and then they disclose it to the public.

24:45

And then that's basically where we're at

24:47

right now.

24:47

You need to update.

24:49

to make sure you're not,

24:52

there's no background improvements option

24:54

to check.

24:56

Uh, maybe that's an iOS 26 thing.

24:58

I don't think so though.

24:59

Um, I believe it's in...

25:01

I don't have an iOS 18 device to

25:02

check exactly where it is.

25:04

Um, but there is,

25:07

there should be a setting there.

25:08

Um, but yeah, we're in the,

25:10

we're in the point right now where we

25:12

need to be updating.

25:13

That's why iVerify came out with like this

25:15

whole, um, press release, I guess,

25:17

about the Dark Sword attacks and...

25:20

Karuna stuff.

25:21

Um, so yeah, it's kind of, uh,

25:25

unfortunate, but I think people should be,

25:29

that's why we're trying to share it as

25:30

like the main story here,

25:32

because if you're running an older version

25:33

of iOS,

25:34

I kind of do wonder as well,

25:36

if this would affect older devices,

25:38

for instance,

25:39

like I was and I was because I

25:41

know there's some devices that are limited

25:43

to like iOS or .

25:45

So it'd be interesting to see if they're

25:48

also affected,

25:49

but

25:51

Yeah,

25:52

I think this is one of those things

25:54

where you need to be using, I think,

25:56

I think Lockdown Mode doesn't really

25:58

introduce that many problems now.

26:00

Like, a lot of websites have already figured it

26:02

out.

26:03

Um, oh, it's only iOS,

26:05

that has background improvements.

26:07

Okay.

26:08

I thought it was a,

26:09

I think they called it something else.

26:11

They call it, like, Rapid Security Responses

26:13

or something.

26:18

Um, so maybe, I don't know, that is

26:20

a good point, I guess. Um, looks like

26:23

Nate is back here. Hello. Hopefully I'm back.

26:28

Um, this is the first really warm day

26:30

we've had of the year, and I think

26:31

my camera was overheating. So, um, if it

26:34

goes out again, I apologize, y'all, but I

26:36

think I found a solution for now that

26:38

can get us through the episode. Awesome,

26:40

then, uh, why don't you take us into

26:42

the next section of the show here?

26:44

Yeah, so, um, in a little bit we're

26:47

going to talk about Google's updates to

26:49

their third-party app installation

26:51

procedures, but first we're going to give

26:54

some updates about what we've been working

26:55

on at Privacy Guides this week. So we'll

26:58

start by talking about the videos. Our

27:00

private messaging video is now available

27:02

to the public, so if you are not

27:04

a paying member, you can

27:06

access that now. Paying members do get

27:07

early access to these things, but, uh, that

27:10

is up. Our next video will be about

27:12

encrypted email, um, which is fully

27:14

recorded, and the first round of editing is

27:16

done, so that is off to Jordan to

27:18

work their magic. And they, uh, they do

27:21

all the graphics, the zooms, and they

27:23

basically just make it look a thousand

27:25

times more awesome, which we are super

27:26

grateful for.

27:28

And we, a lot of you guys,

27:31

if you tuned in last week,

27:32

you saw that Jonah and I were at

27:34

an event for South by Southwest,

27:35

an unofficial event.

27:37

And we had the awesome opportunity to

27:40

record some of those talks.

27:42

And those should be out hopefully in the

27:45

next coming days.

27:46

They should be trickling out.

27:47

There's only a few of them,

27:48

but they were really insightful and really

27:50

good.

27:51

And we wanted to share those with you

27:52

guys.

27:52

So expect those in the near future.

27:58

Awesome, yeah.

28:00

I think there's also, yeah,

28:02

Nate's kind of been piling on the videos

28:05

for me to work on.

28:06

So I've got quite a big backlog now,

28:07

which is great.

28:09

So definitely be on the lookout.

28:11

I think we're trying to have something

28:12

come out next week

28:14

For our members,

28:15

hopefully that encrypted email video.

28:17

That's the plan at least So definitely

28:20

look out for that.

28:21

And there was also a couple of extra

28:23

things we should have mentioned We had

28:26

privacy guides news articles coming out So

28:28

Freya is working on that every week and

28:31

we have a couple of new articles that

28:33

came out this week one was about Instagram

28:36

ending end-to-end encryption on their DMS,

28:39

which is kind of a

28:41

very surprising, I guess, but also, like,

28:44

Facebook being Facebook, I guess. Um, they

28:48

just end up making their product worse. Uh,

28:50

I don't know. Instagram has notified its

28:53

users that it will no longer support

28:55

end-to-end encryption after May eighth. So

28:57

if you use Instagram, well, I feel like not

28:59

many people in our community are using

29:00

Instagram, but it's good to know, uh, good

29:03

to put the info out there, but

29:05

And there was also another one about,

29:08

we were debating on talking about this

29:10

one,

29:10

but Pokemon Go players data was used to

29:13

train visual positioning AI.

29:17

So there was a parent or spin-off company

29:20

from Niantic which basically runs Pokemon

29:23

Go and they used images from Pokemon Go

29:27

to train its visual positioning system.

29:30

So that is kind of scary too.

29:32

Freya did a great write-up on that so

29:34

definitely check that out as well.

29:37

And there was also another thing,

29:39

it's kind of another thing where we're

29:41

keeping on the lookout,

29:42

which is like the homomorphic encryption.

29:46

Intel made an advance in that area.

29:50

I think it's a lot to do with

29:51

these, you know,

29:54

the ability to do server-side processing

29:57

end-to-end encrypted.

29:58

So the server processes the data,

30:00

but it doesn't,

30:01

isn't available for the server to access,

30:03

which is kind of a problem we have

30:04

with like AI at the moment.

30:07

because it's kind of hard to run that

30:09

on your device as well.

30:11

Like, you know,

30:12

you need a lot of RAM,

30:12

you need a lot of CPU processing power.

30:14

Mobile devices can't really do that.

30:17

So yeah,

30:19

this is basically a trusted execution

30:21

environment,

30:22

which segregates the CPU using encryption.

30:27

Definitely read into it.

30:28

Freya did a great write up of that

30:30

as well,

30:30

explaining the whole system there.

30:32

So if you're interested in that,

30:33

check that out too.
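To make the homomorphic idea described above a bit more concrete, here is a deliberately toy sketch (not a real scheme, and not Intel's implementation): the server adds two ciphertexts together without ever seeing the plaintexts, and only the key holder can decrypt the result.

```python
import secrets

MOD = 2**32  # work modulo a fixed modulus so values wrap cleanly


def encrypt(m, key):
    # Toy additive scheme: ciphertext is the plaintext shifted by a secret key.
    return (m + key) % MOD


def decrypt(c, key):
    return (c - key) % MOD


def server_add(c1, c2):
    # The server adds ciphertexts "blind": it never learns the plaintexts.
    return (c1 + c2) % MOD


k1, k2 = secrets.randbelow(MOD), secrets.randbelow(MOD)
c1, c2 = encrypt(20, k1), encrypt(22, k2)
c_sum = server_add(c1, c2)               # computed entirely server-side
print(decrypt(c_sum, (k1 + k2) % MOD))   # 42
```

Real homomorphic encryption schemes are vastly more involved, but the shape is the same: computation happens on ciphertexts, and only the client holds the keys to recover the answer.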

30:36

But yeah,

30:37

if you want to stay up to date

30:38

with that stuff,

30:39

you can go to privacyguides.org forward

30:44

slash news if you want to check out

30:46

that.

30:47

Nate's also doing every week,

30:49

he does a Data Breach Roundup,

30:51

which is

30:52

really useful if you want to make sure

30:54

you stay on top of things and you

30:56

aren't missing if you're in a breach.

30:58

A lot of tools that detect if your

31:01

credentials are in a data breach are

31:04

usually pretty slow to determine that

31:06

because they have to add the data set

31:08

to scan it.
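For context on how these credential-checking tools typically work: many are built on the Have I Been Pwned "Pwned Passwords" range API, which uses k-anonymity so your full password hash never leaves your machine. A minimal sketch (the pwnedpasswords.com endpoint is the real public one; the function names are just illustrative):

```python
import hashlib
import urllib.request


def sha1_prefix_suffix(password):
    # Hash locally; only the 5-character prefix is ever sent to the server.
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]


def pwned_count(password):
    # Requires network access: fetches every suffix sharing our 5-char prefix,
    # then matches our own suffix locally.
    prefix, suffix = sha1_prefix_suffix(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode()
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0


prefix, suffix = sha1_prefix_suffix("password")
print(prefix)  # 5BAA6
```

The server only ever sees the 5-character prefix, which matches hundreds of unrelated hashes, so it cannot tell which password you were checking.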

31:10

So if you're wanting to keep on top

31:12

of data breach stuff,

31:13

definitely check that out.

31:15

Nate does a great job on that.

31:16

It's very comprehensive.

31:18

Let's see how many here.

31:20

One, two, three, four, five, six, seven,

31:24

eight.

31:24

Yes.

31:24

So eight of them this week.

31:27

I basically write about any data breaches

31:29

that come through my RSS feed that affect

31:32

individuals.

31:33

If it's like company had their source code

31:34

stolen, I don't usually cover that stuff.

31:36

But yeah, so it varies week to week.

31:38

Sometimes there's like three,

31:39

sometimes there's like twelve.

31:42

So kind of a medium week,

31:45

which is,

31:46

I guess less data breaches is better.

31:48

Let's normalize less data breaches.

31:52

But yeah,

31:52

that's kind of what we've been working on

31:53

this week.

31:55

I guess we can head into the next

31:57

article here.

31:58

Nate kind of mentioned it before.

32:02

basically Google is making changes to,

32:06

if you haven't heard already,

32:07

there was this whole project with Keep

32:10

Android Open.

32:12

And basically Google was trying to

32:15

combat malware by basically restricting

32:19

application installation on your device

32:22

but it was usually apps outside the Google

32:24

Play Store, so it would stop you from

32:26

installing those. And there's been a huge

32:28

amount of backlash to this as well. Like,

32:30

we've, uh, we signed the open letter to

32:32

Google, um, with Keep Android Open, and if

32:35

you noticed that on our socials, um, you

32:38

can share that with your friends and

32:39

family, get people talking about this,

32:41

because I think it's important that,

32:45

you know,

32:46

people are pushing against this because

32:49

it's basically Google using their power as

32:52

a monopoly here.

32:53

Like they do have control over the Google

32:56

Android ecosystem.

32:57

It allows them to make these sort of

32:59

wide reaching changes with really no one

33:01

to stop them.

33:03

Well, I guess we are,

33:04

we are trying to stop them,

33:05

but

33:07

Clearly,

33:08

we do have some power because this week

33:11

there was a change.

33:13

Actually, it was, I believe,

33:14

today or yesterday.

33:17

There was a change.

33:18

Google detailed a new twenty four hour

33:20

process to, well, we're not going to say

33:22

"sideload" here.

33:23

We're going to say "install

33:25

unverified Android apps,"

33:28

because that's what it is.

33:29

You're not sideloading, you're installing.

33:32

So Google is planning big changes for

33:34

Android in twenty twenty six aimed at

33:37

combating malware across the entire device

33:39

ecosystem.

33:40

Starting in September,

33:41

Google will begin restricting application

33:44

installation with its developer

33:47

verification program.

33:48

But not everyone is on board.

33:50

Android ecosystem president Sameer Samat

33:53

tells us that the company has been

33:55

listening to feedback.

33:57

And the result is the newly unveiled

34:00

advanced flow,

34:02

which will allow power users to skip app

34:05

verification.

34:07

So I think one thing to mention,

34:09

like right off the bat here,

34:10

people will probably have, uh,

34:12

they're probably thinking like, oh,

34:13

does this affect my Graphene OS device?

34:16

Oh no,

34:16

I'm not going to be able to install

34:17

apps without going through the

34:18

sideloading, uh,

34:19

installation process that warns me I'm

34:22

installing something.

34:24

No,

34:24

this is affecting Google Android devices.

34:27

Um, so just to put that preface here,

34:30

um,

34:32

So basically, uh,

34:34

as Nate's showing on the screen,

34:35

there's now this new advanced flow for

34:39

power users to install apps from

34:42

unverified developers.

34:44

So basically Google wants developers to

34:46

register centrally with them,

34:47

which often requires payment and

34:50

identification.

34:51

Not many people who create these,

34:54

you know,

34:54

independent free and open source apps want

34:58

to verify through Google.

34:59

It's the whole point, right?

35:01

Um,

35:03

Yeah.

35:03

So there's a twenty five dollar fee.

35:04

These independent developers, you know,

35:06

I think a lot of independent developers

35:08

aren't really up to paying the twenty five

35:11

dollar fee.

35:11

Like I've seen people who were kind of

35:13

like, oh,

35:14

I don't want to pay Apple's one hundred

35:17

dollar a year thing to publish on the

35:18

App Store.

35:19

Same thing in this point,

35:20

like twenty five dollars.

35:22

for someone in India might be a

35:25

significant amount of money or in Turkey

35:27

or, you know,

35:27

a country where the currency is worth a

35:29

lot less.

35:31

So I think that also puts another barrier

35:34

on people where, you know,

35:37

they would be able to release apps without

35:38

having to worry about that.

35:39

But it does seem like Google has folded

35:42

a little bit here.

35:44

Basically,

35:46

The whole flow is that it makes sure

35:48

no one is telling you to turn off,

35:51

to allow you to install from unverified

35:54

sources.

35:54

Basically it'll say, yes,

35:56

someone is guiding me.

35:57

No one is instructing me.

35:58

And then it starts a security delay of

36:00

twenty four hours.

36:02

And once that delay has been passed,

36:05

then it allows you to select which option

36:09

you want to do,

36:10

which is turn on temporarily,

36:11

which will allow installing unregistered

36:13

apps for seven days or turn on

36:16

indefinitely,

36:16

which will allow unregistered apps to be

36:19

installed indefinitely.

36:21

And it does give you a confirmation tick

36:23

mark.

36:23

You can select install anyway.

36:26

I think this is.

36:28

really just, uh, we can kind of read,

36:31

uh, Keep Android Open did actually put a

36:33

response to this, so let's just have a

36:35

look at what they said. Um, but I

36:37

think you can take that, um, Nate, if

36:39

you want. Sure, uh, give me one second,

36:44

I'm pulling that up right now. I had

36:46

that tab open and then I closed it.

36:47

Okay, um, so Keep Android Open, yeah, they

36:50

did, they said this is not a solution,

36:53

um, and they kind of highlighted some of

36:57

the

36:58

issues in it. This is the actual workflow.

37:00

I think this is actually copy and pasted

37:01

from that article we were just showing you

37:02

guys.

37:03

But they say you have to enable developer

37:05

mode,

37:05

which, I think, is to kind

37:07

of illustrate to people why this is a

37:08

little ridiculous.

37:09

And to me,

37:10

this also was like the first thing that

37:11

I was like, oh, but why?

37:13

For those of you who've never enabled

37:14

developer mode,

37:15

you have to go into your settings.

37:17

You have to go to about phone and

37:18

then you have to find software build

37:20

number and you tap that seven times.

37:24

Which, I mean,

37:24

obviously it's tapping a screen.

37:26

It's not that hard.

37:27

But just the fact that you...

37:27

Because once you enable developer mode,

37:29

then you unlock a whole new menu of

37:31

settings.

37:32

And it's just kind of like,

37:33

but why do we have to go in

37:34

there to enable this?

37:36

That is very onerous.

37:38

And then they point out that they call

37:40

these scare screens,

37:42

confirming that you are not being coerced.

37:45

You know,

37:45

there's another scare screen warning.

37:47

And then, of course,

37:49

the twenty-four hour waiting period,

37:50

which...

37:51

As Jordan noted,

37:53

Google's argument for the twenty four hour

37:56

waiting period is this:

37:58

So I'm not trying to defend Google here,

38:01

so follow me on this one.

38:02

From what I understand.

38:05

Sideloading malicious apps is a much

38:08

bigger problem in other parts of the world

38:10

outside America and Europe.

38:11

Like I think they said it's going to

38:12

roll out in,

38:14

like, Brazil.

38:15

Yeah, here it is.

38:15

Brazil, Singapore, Indonesia,

38:17

and Thailand.

38:18

And that's because those are the areas

38:20

where these types of scams are extremely

38:22

common.

38:22

And the way those scams will work is

38:24

they'll call you with some kind of

38:26

pretense about like, oh,

38:27

your bank account's under attack or

38:28

whatever.

38:29

We need you to update to the latest

38:31

app,

38:31

but it's not in the app store yet.

38:34

So we're going to have you sideload it

38:35

and they walk you through the process.

38:37

So the idea is that, well,

38:42

most scams, um, as you guys probably know,

38:44

rely on urgency.

38:47

They want you to just do it now

38:48

so that your brain doesn't have time to

38:50

kick in and go, wait a minute.

38:51

Like, I don't know if, uh, well,

38:53

some of the older members of the crowd

38:54

might remember.

38:55

And I'm counting myself when I say that

38:56

back in like,

38:57

there was a scam going around that was

38:58

like, Oh, I was on vacation in like,

39:01

you know, um,

39:02

was it like somewhere in Southeast Asia,

39:04

not India,

39:04

but like

39:05

Not Thailand.

39:06

I can't remember where it was.

39:07

But anyways,

39:07

I was on vacation in this part of

39:08

the world and I lost my passport and

39:11

I got arrested and I need you to

39:12

wire me like two thousand dollars to buy

39:14

a new passport.

39:15

And I remember I got that one from

39:16

my mom and I just laughed and deleted

39:17

it because I'm like,

39:18

we don't have the money to be traveling

39:19

like that.

39:20

Like, I know this is a scam.

39:21

There's no way.

39:22

Um, and so,

39:24

but a lot of the time,

39:25

like when you get those,

39:26

the idea is like, Oh,

39:27

I need it quick.

39:27

Or they're going to, you know, my,

39:29

my hearing is tomorrow.

39:30

Like the embassy is going to be closed

39:31

this weekend.

39:31

Like they want you to not think and

39:33

to just do it because once you start

39:36

to think you're going to be like,

39:36

wait a minute,

39:37

why didn't they tell me they were going

39:38

to Thailand?

39:39

That seems like really big news.

39:40

They would tell me.

39:42

Um, and again,

39:43

there's like a million other variations,

39:45

but the point is that's,

39:46

that's the point of the

39:52

waiting period, to force that pause where it

39:54

stops and slows you down. But, um, yeah, that,

39:58

that is still really onerous, because now,

40:00

like, let's say I get a brand new

40:02

phone, right, and I'm trying to set up

40:03

this phone, and, you know, being in privacy,

40:06

I do a lot of sideload... excuse me,

40:07

installing. I do a lot of app installing

40:09

from, uh, outside the Play Store. And

40:12

so now when I get that phone, I

40:15

have to enable this and then wait a

40:16

whole day before I can actually start

40:18

setting up my phone, which is

40:20

Really not cool, especially if,

40:22

I don't know,

40:23

your phone blows up or something.

40:24

Wouldn't know anything about that.

40:25

But anyways, yeah, all that to say, like,

40:28

I agree with them.

40:29

I think this is really...

40:32

On the one hand,

40:33

I feel a little bit of sympathy for

40:35

Google.

40:36

Just a little bit.

40:37

Because they do want to... You know,

40:38

they pointed out here in the actual

40:40

article, they said that...

40:43

Where was it?

40:44

Yeah, in a lot of countries,

40:45

there's chatter about if this isn't safer,

40:47

then there may need to be regulatory

40:48

action to lock down more of this stuff.

40:51

And I don't think that it's well

40:53

understood.

40:54

This is a real security concern in a

40:56

number of countries.

40:57

And that came from Google's spokesperson,

40:59

which, yeah, I mean,

41:03

obviously he's trying to push his

41:05

narrative, but I think he's right.

41:06

I think this is a real security concern

41:08

that Google's trying to solve.

41:10

However,

41:10

I also don't have a lot of sympathy

41:12

for Google because I feel like every

41:13

month, at least once a month,

41:15

usually more than once a month,

41:16

I read an article from Bleeping Computer

41:19

or Ars Technica that's like, oh,

41:21

Google just removed an app from the Play

41:22

Store that was malicious and it had like

41:24

a million downloads or a couple million

41:28

downloads.

41:29

And it's like,

41:30

I never read those stories from Apple.

41:32

And again, Apple's got problems.

41:34

I'm not trying to put them up on

41:34

a pedestal.

41:35

But

41:36

My point being is like,

41:37

and they do happen with Apple,

41:38

for the record.

41:39

I've seen them.

41:40

But those happen a couple times a year,

41:42

tops.

41:42

Whereas with Google, again,

41:43

it's like almost every month,

41:45

sometimes even more than that.

41:46

So I find it kind of hypocritical that

41:48

Google's like, oh,

41:48

we need to fix this problem.

41:51

But you're not necessarily guiding people

41:52

towards a safer alternative.

41:54

You haven't really made the Play Store

41:56

safer, in my opinion.

41:57

So it kind of weakens their argument.

41:59

But yeah, yeah.

42:03

I don't know.

42:03

I think if we want to give Google

42:05

the benefit of the doubt,

42:06

which I know a lot of people don't,

42:08

I think they are trying to strike...

42:11

I think now because there's pushback,

42:12

they are trying to strike a balance.

42:14

But I definitely understand that this does

42:16

feel very heavy handed.

42:17

I'm not looking forward to the idea of

42:18

like I'm getting a new phone and now

42:20

I have to wait twenty four hours,

42:22

which I mean,

42:22

I guess I use custom operating systems

42:24

that, you know,

42:24

the classic like that won't affect me.

42:26

But, you know,

42:26

my wife still uses stock Android and she's

42:28

not ready to make the jump to custom

42:30

operating systems yet.

42:31

But she does use like Neo store and

42:33

some of those alternative systems.

42:35

um, sideloading... excuse me, uh, those

42:37

third-party app installation features. And

42:39

so, again, it sucks that it's like,

42:41

she's gonna get a new phone, and it's

42:43

like, hey, first thing you do, go in

42:44

and turn this on, because we gotta wait

42:46

a whole freaking day for it to get

42:49

out of the waiting period. So, yeah, I

42:51

don't, I don't think this was the best

42:53

solution they could have come up with, and

42:55

I, I think they, um,

42:57

I don't know.

42:57

I don't know what is the best solution,

42:58

but yeah,

42:59

I think this is really heavy handed and

43:00

I would like to see something even less

43:03

obnoxious than this personally.

43:05

My big thing is the developer settings,

43:06

but I know there's other issues as well.

43:11

Yeah, I agree.

43:12

I think this is...

43:14

Obviously, it's not ideal,

43:16

but I think it's important to remember

43:18

here a thing that the Keep Android Open

43:21

team was saying here is this entire like

43:24

the thing that we showed before that Nate

43:25

had on the screen,

43:26

it was the entire flow is delivered

43:28

through Google Play services.

43:31

So it's not actually part of

43:34

the Android operating system.

43:36

So the thing with Google Play services is

43:40

that it kind of just automatically updates

43:42

and applies changes to the operating system

43:44

without your consent.

43:48

This is useful for Google because they

43:50

need to roll out fixes or introduce new

43:53

features.

43:53

But when it starts being about whether you

43:56

can actually install apps from third party

44:01

sources,

44:02

I don't think we want Google to be

44:04

the

44:05

the arbiter. Um, I think, you know, uh,

44:11

they state here the advanced flow has

44:13

still not appeared in Android beta, dev

44:15

preview, or canary releases. So basically

44:18

this entire flow that they're displaying

44:20

is

44:22

basically just a blog post and some UI

44:25

mockups.

44:27

So I think we should wait until we

44:29

see how exactly this works until we

44:32

actually get our hands on it.

44:33

I don't think anyone should be accepting

44:36

this.

44:37

And I think there could be a better

44:40

way to do this.

44:42

I don't know what that solution would be,

44:45

but I think, you know,

44:46

as soon as you start placing

44:49

restrictions on third-party developers,

44:51

I think it's getting to the point of

44:53

like, it's slightly anti-competitive.

44:56

I mean,

44:56

a lot of these apps aren't trying to

44:57

make money,

44:58

but

45:00

I think everyone should get a fair chance

45:02

of being installed on someone's device.

45:06

People should be allowed to choose what

45:08

they want on their device.

45:10

They shouldn't have to go through a

45:12

twenty-four hour waiting period to install

45:14

something on their device.

45:17

We should be able to choose what we

45:18

want on our device.

45:20

So, I don't know,

45:22

I think just from a freedom perspective,

45:25

Everyone should be in favor of people

45:28

being allowed to install software on the

45:30

device that they've paid for.

45:32

Like Google is basically just becoming the

45:35

arbiter of app installs on your device.

45:38

It's like a very, I don't know,

45:41

like people definitely wouldn't have

45:42

accepted this like ten years ago,

45:45

but I feel like we've gotten to a

45:46

point now where like everything is so

45:48

locked down.

45:50

like restrictions on apps are becoming

45:53

worse and worse.

45:55

So people are more likely to accept this

45:57

slight compromise that Google's made here.

46:00

But I think it's still not time

46:03

personally.

46:04

I mean,

46:05

I know Nate said like he was he

46:07

felt like it was a decent middle ground.

46:09

I think it's okay,

46:11

but I think we can definitely push Google

46:13

for something a bit better.

46:16

And hopefully we'll actually see an

46:18

implementation of this before it actually

46:20

gets released.

46:25

Because I think right now we've only got

46:27

like a hundred and sixty three days until

46:31

it's locked down.

46:32

So we need to see a working prototype.

46:34

We need to see at least something from

46:36

Google to know that this is not

46:39

kind of just a sham to make everyone

46:41

stop talking about this and be like,

46:43

Google has announced that they're going to

46:45

fix it.

46:47

You don't need to worry about it anymore,

46:48

everybody.

46:49

And then, you know,

46:51

Google rolls out the original

46:52

implementation.

46:55

But yeah,

46:55

someone says Play Store sucks and

46:58

Yeah,

46:59

I think Nate just said there was so

47:02

much malware on the Play Store.

47:05

I don't think it's particularly useful

47:07

that they're saying... Obviously,

47:09

there's a larger percentage of malware

47:11

used through these unverified apps, right?

47:13

But I don't think the Play Store is

47:16

also very safe because I've got

47:20

grandparents,

47:21

I've got older people in my life, and...

47:25

They absolutely will install a torch app

47:28

that requires your GPS location and your

47:31

camera and your

47:33

messaging history and your contacts and,

47:36

you know, they won't,

47:37

they won't think twice about that.

47:39

And that's clearly like a data harvesting

47:41

app, but Google play has no problem, uh,

47:44

allowing that app to be, uh,

47:45

installed on people's devices.

47:47

You know,

47:47

there's apps that like spam your phone

47:49

with notifications and like ads that's

47:53

perfectly fine to exist.

47:55

Um, I think, yeah, every,

47:58

every store is going to have, uh,

48:01

Every store is gonna have malware and

48:04

issues.

48:05

I think even really curated ones are gonna

48:08

have apps that have vulnerabilities as

48:10

well.

48:11

And I'm sure something might sneak through

48:14

eventually.

48:15

There's definitely not a

48:18

zero percent chance.

48:20

And I think, yeah,

48:23

most people would prefer,

48:24

most people probably don't even know that

48:26

there's another way to install apps.

48:28

Like most people would just assume that

48:29

Google Play is like where you get your

48:31

apps from.

48:32

Like it's kind of a problem that Google's

48:34

created because they want to be the number

48:36

one place to get apps, right?

48:38

So yeah, anyway, sorry,

48:41

I feel like I've been rambling a little

48:43

bit,

48:43

but hopefully that helped add some points

48:46

to discuss here.

48:49

I mean, I ramble plenty,

48:50

so it's totally fair.

48:52

Yeah.

48:53

And I mean,

48:54

just to kind of back up what you

48:55

were saying, like, yeah,

48:55

there's never going to be a perfectly

48:59

vulnerability free store.

49:01

I mean, like I said,

49:01

it happens to iOS every now and then.

49:03

It's just it happens a lot less on

49:05

iOS.

49:06

And I feel like to Google's defense to

49:08

what you were saying is they will remove

49:09

apps as far as I know once they

49:11

get found.

49:12

But it's the fact that they got there

49:14

in the first place.

49:14

Like,

49:15

why does this happen so much less on

49:17

iPhone?

49:18

And I have to assume it's a vetting

49:19

thing because, you know, I mean, sure,

49:21

there's a higher barrier to entry to put

49:23

your apps on an iPhone in the first

49:24

place.

49:24

But at the same time, it's like,

49:26

They still try to submit malicious apps

49:28

there too.

49:28

Like there was a study a few years

49:29

ago about how Apple has stopped like a

49:32

quarter of a million malicious apps from

49:34

ending up in the app store.

49:35

To be fair, maybe Google's got like,

49:37

we've stopped one million.

49:38

Like I don't know what their stats are.

49:39

But my point being is like,

49:40

clearly Google could put more effort into

49:42

this.

49:42

And I just feel like it's really

49:43

disingenuous to be like,

49:45

we want to keep people safe.

49:46

So we're going to push them into our

49:47

store, which is only marginally safer,

49:50

arguably.

49:51

And it's also like a...

49:53

um, what do you call that?

49:55

Like a survivor bias or a confirmation

49:57

bias where it's like, okay, sure.

49:59

We hear about all the maliciously

50:01

third-party unverified apps that get

50:03

installed, but at the same time,

50:05

what about the, you know, I,

50:07

at least fifty percent,

50:08

probably more than that,

50:09

of the apps on my phone are third-party

50:12

unverified apps.

50:13

They're, you know, NextCloud, they're, um,

50:15

Trying to think what else I have on

50:16

there.

50:17

I don't know.

50:17

My brain's drawn a blank, but they're,

50:18

they're all things that like signal,

50:20

you know, they're,

50:20

they're all things that I can obtain from

50:23

outside the play store.

50:24

So I prefer to do that because I

50:25

don't want the Google analytics there.

50:27

And it's like, those are never malicious.

50:29

So those never get reported, but you know,

50:32

it's yeah, I don't know, but it's crazy.

50:35

And just to be clear,

50:36

I didn't necessarily say I like this

50:37

solution.

50:38

It's just not as crappy as what it

50:40

was.

50:41

but yeah, it's still not great.

50:45

And one thing I think you mentioned

50:46

earlier,

50:47

but I kind of forgot to touch on

50:48

as well is this whole,

50:50

Google is not really giving satisfying

50:52

answers to a lot of this stuff.

50:53

Like,

50:54

so you mentioned the twenty five dollar

50:55

fee and how twenty five dollars is like

50:57

a lot more to somebody in like India,

50:58

for example.

50:59

And he did say that like, oh,

51:01

we're going to account for that.

51:02

We're going to kind of balance it out.

51:04

But it's like this.

51:07

Who was it?

51:07

The Samat person.

51:09

I forget what the role is,

51:10

but they didn't really answer any

51:12

questions.

51:15

At least this Ars Technica article,

51:16

they mentioned like,

51:16

I don't know if they actually asked Google

51:18

directly,

51:19

but they did mention things like one of

51:20

the concerns is that Google is now

51:22

building this list of app developers if

51:24

the developers choose to get verified,

51:26

which already presents a host of privacy

51:28

and security concerns.

51:29

Like here in America,

51:30

we had that whole like a, like that,

51:31

what was that?

51:32

Ice spotter, ICEBlock or something.

51:34

We had that app where you could report

51:36

sightings of people

51:38

Immigration agents.

51:39

And in some countries,

51:41

that is super illegal.

51:43

Even just peacefully putting some kind of

51:45

protest app, super illegal.

51:47

And so if that person chooses to verify,

51:50

now Google has their information.

51:51

They have their payment.

51:52

They have their government ID.

51:53

They know exactly who they are.

51:55

And so Google...

51:57

Um, like actually right here,

51:58

Google swears is not interested in the

51:59

content of the apps and it won't be

52:01

checking proactively when registers

52:02

developer, uh, when developers register,

52:05

excuse me, I can't talk tonight.

52:06

Um,

52:07

this is only about identity verification

52:09

so that basically if they become a,

52:11

if the developer distributes malware,

52:12

they're unlikely to remain verified and

52:13

they can get booted from the program.

52:15

Um,

52:16

but then like, you know, when he's a,

52:19

when this Samat person, he's like, Oh,

52:21

but this, uh, you know,

52:22

we're not keeping a list of developers.

52:24

Well then how are you going to verify

52:25

if somebody is a repeat offender?

52:27

Like your,

52:27

your answers don't make sense here.

52:29

And yeah.

52:32

Um, I also just need to,

52:34

to be snarky and point out,

52:35

he says that, uh,

52:37

Not everything is malware.

52:38

It depends on the context.

52:39

So like a rootkit is malware,

52:40

but a rootkit you download intentionally

52:42

because you want to root access to your

52:43

phone is not malware.

52:44

Likewise,

52:45

an alternative YouTube client that

52:46

bypasses Google's ads and feature limits

52:48

isn't causing the kind of harm that would

52:49

lead to issues with verification.

52:52

Anybody who uses things like NewPipe or

52:55

FreeTube knows that those things break

52:57

about once a month because Google does

52:58

something on their end to block it.

53:00

And then they have to update and do

53:01

the cat and mouse.

53:02

So yeah,

53:03

that was just kind of funny to hear

53:04

them cite that.

53:05

I don't think that came from Google,

53:06

for the record.

53:07

I think Ars wrote that.

53:08

But it's still kind of funny to hear

53:09

them cite that as an example.

53:10

And it's like, yeah,

53:11

but Google still treats that stuff in a

53:13

very hostile manner.

53:15

Yeah.

53:16

Yeah.

53:16

I think the other thing that you kind

53:18

of mentioned it a little bit,

53:19

like with the ICEBlock thing,

53:21

but I think there's plenty of countries

53:22

that we've already seen this happen with

53:24

where like, you know,

53:25

having a centralized app store kind of

53:28

allows governments to basically get apps

53:31

removed or to like have apps not be

53:34

allowed to be installed on devices.

53:36

Like I'm pretty sure that's

53:37

the way it works, like in China:

53:39

a lot of apps don't want to comply

53:41

with a lot of the like legislation that

53:43

they have.

53:44

Like, Oh, you've got to,

53:46

share a certain amount of data or you

53:47

have to meet these like economic

53:50

requirements or whatever.

53:51

Um,

53:52

and so they're actually just removed from

53:54

the play store or for like censorship

53:56

reasons.

53:56

Like there's stuff that's being shared on

53:58

those platforms that they don't want, um,

54:01

people to access basically.

54:03

Um, so having the centralized store,

54:05

like we already saw this with like iOS

54:07

devices,

54:08

basically turns your phone into a brick.

54:09

Like you can't install the software that

54:11

you want on it.

54:12

Right.

54:13

Um,

54:14

But yeah,

54:15

I think it's like it's like with Linux,

54:17

right?

54:17

You can kind of install software from a

54:21

trusted repository and you can also add

54:25

additional repositories.

54:26

I think it should work similar on Android.

54:28

Like you should have the option to

54:31

install stuff from additional places.

54:35

Maybe there's more of a warning about it,

54:39

but I don't think having to go through

54:41

developer settings and all this stuff is

54:45

particularly great.

54:47

I think it definitely also puts up a

54:48

bit of a barrier,

54:49

especially when you're like,

54:51

showing all these warnings like,

54:53

this is very much not recommended.

54:55

What you're about to do could compromise

54:57

your device.

54:58

It's not gonna sit well with someone who

55:01

doesn't understand the technical reasons

55:04

why they're showing that.

55:07

Yes,

55:07

in a lot of cases it could be

55:09

useful for someone to see that warning.

55:12

But if it's someone who's like,

55:14

oh,

55:14

I want to be able to watch YouTube

55:16

without the ads,

55:17

I'm going to download NewPipe.

55:19

And it's like this app could compromise

55:21

your device.

55:21

This is a highly

55:25

suspicious action you're about to take.

55:27

Are you sure you want to do this?

55:28

People are going to be like, oh,

55:29

this is just malware.

55:30

I'm just installing malware.

55:32

It's not malware.

55:34

So

55:36

I don't know,

55:37

not really happy with this situation.

55:40

I think we're going to keep pushing Google

55:42

here to make a better decision.

55:44

I just think they should go back on

55:46

all of this and just go back to

55:48

what it was before.

55:49

Like you have to enable it, right?

55:52

But

55:54

it should still be an option for people.

55:56

Um,

55:57

I don't know if there's a better way.

55:58

Maybe they have a way of scanning like

56:02

the device to see if the permissions are

56:04

suspicious or I don't know.

56:06

I can't really think of a better way

56:07

that doesn't involve Google just like

56:10

doing more invasive stuff on your device.

56:12

But

56:14

I mean,

56:14

I think it should go back to how,

56:16

like how it is on Linux.

56:17

Like you can install additional

56:18

repositories,

56:19

you can install the applications you want

56:21

on your device.

56:22

And there is an increased risk of using

56:25

a third party platform to install packages

56:28

or like using a third party repository.

56:30

But that should be up to the user

56:32

to determine if they want to take that

56:34

risk, not Google,

56:35

who's like making that decision for you.
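The Linux comparison maps to something concrete. On Debian-based systems, trusting an extra software source is just a small file the user adds, with the repository's signing key pinned. The repository name, URL, and key path below are made up for illustration:

```
# /etc/apt/sources.list.d/example.sources -- hypothetical third-party repo
Types: deb
URIs: https://apt.example.org/debian
Suites: stable
Components: main
Signed-By: /usr/share/keyrings/example-archive-keyring.gpg
```

The user opts in explicitly, packages are cryptographically verified against that pinned key, and deleting the file revokes the trust. That is roughly the model being argued for on Android: extra sources are possible, clearly user-initiated, and revocable, without one vendor approving every app.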

56:39

But yeah,

56:41

that's pretty much all my thoughts on this

56:43

one.

56:44

Do you want to take the next story

56:45

here, Nate?

56:47

Yeah, sure.

56:49

So this will probably be a pretty quick

56:52

one.

56:52

I just thought it was really interesting

56:53

because I am a nerd who really likes

56:56

thought experiments.

56:57

And the headline from this one,

56:59

this comes from Slashdot.

57:00

It says, should Banksy remain anonymous?

57:02

And the original article comes from

57:04

Reuters.

57:05

And Reuters did this really deep dive.

57:10

Uh, I'll be honest.

57:10

I didn't read it all cause it's so

57:12

long, but I skimmed it.

57:13

And, uh, they tried to identify Banksy,

57:15

which for anyone who doesn't know Banksy

57:17

is a very, very famous, um,

57:20

graffiti artist, I guess you would say.

57:21

Uh, well, I mean,

57:22

I would say artist in general,

57:24

he's done a lot of like legit artwork

57:25

as well,

57:26

but he's also well known for doing

57:27

graffiti work, um, all around the world,

57:29

actually, not just the UK.

57:31

He's from the UK, I believe, but, uh,

57:33

well, we're assuming he's from the UK.

57:34

I believe that's where he's most active.

57:36

But Reuters did this deep,

57:38

deep dive to try and figure out who

57:40

Banksy was because there's been a lot of

57:43

I mean, of course,

57:44

there's been a lot of speculation over the

57:45

years.

57:45

And there's also been a couple of,

57:48

like,

57:49

we're pretty sure it's this guy.

57:50

It might be this guy.

57:51

But they set out to like for sure

57:53

figure out who he is.

57:54

And spoiler alert, I think they did.

57:58

And I kind of don't like that personally.

57:59

I think it took some of the magic

58:00

out of it.

58:01

But I liked this headline of should Banksy

58:05

remain anonymous?

58:06

And I thought that was something

58:07

interesting to think about because there's

58:09

a few different angles here.

58:11

One of them is legal liability.

58:13

This dude is technically a graffiti

58:15

artist, although...

58:17

I don't think it's here in the Slashdot

58:18

summary,

58:18

but in the actual Reuters article,

58:20

they mentioned how he does kind of seem

58:22

to get a pass because he is so

58:24

well-known.

58:25

And to be fair,

58:26

his art probably brings a lot of tourists

58:28

and stuff.

58:28

So even though he's technically doing

58:30

illegal things,

58:31

other graffiti artists have noticed.

58:32

It's like, if I did that,

58:33

I would absolutely go to jail.

58:36

But the police don't even seem interested

58:37

in figuring out who he is anymore.

58:38

They're just like, yeah, whatever.

58:40

He made some art.

58:40

Let's clean it up and move on.

58:42

But...

58:43

They also talked about his lawyer when

58:46

Reuters reached out to them and said like,

58:49

hey, we want a statement for this piece.

58:50

He urged us not to publish this report,

58:52

saying doing so would violate the artist's

58:54

privacy, interfere with his art,

58:55

and put him in danger.

58:57

And they pointed out again that what he's

59:00

doing is technically illegal and the

59:03

police could come after him and it could

59:04

stifle free speech.

59:06

So yeah,

59:07

it was just – it was really interesting

59:09

to –

59:10

Um, I mean, I have a feeling

59:12

our whole audience is going to say,

59:13

like, yes, he should remain anonymous. Or

59:15

maybe not. Maybe you're one of those, like,

59:16

hardline lawful-good people that's like,

59:18

yeah, it doesn't matter if he's not doing

59:19

any real damage. I mean, he's costing some

59:21

people some paint on their building, but

59:22

other than that, he's not doing any real

59:24

damage. Let him do his thing. But it

59:26

was just really interesting to see this, um,

59:28

this, again, huge deep investigation on the

59:31

front page of Reuters that, uh, kind of

59:33

challenged, like...

59:36

I don't know.

59:36

It was just really interesting.

59:38

I don't think I have much more to

59:40

add than that, to be totally honest.

59:41

But to...

59:43

To kind of like, where is that line?

59:44

I think that's kind of where my mind

59:46

went.

59:46

It's like, where is that line of like,

59:47

again, yes, he's doing something illegal.

59:48

He should not be allowed to do that,

59:51

but also like free speech and free

59:53

expression.

59:53

And it's not just the UK.

59:56

He protests things all over the world.

59:57

Like he's drawn on the walls in Palestine,

1:00:01

separating Israel from Palestine.

1:00:03

Most recently he was in Ukraine,

1:00:05

which is what sparked this investigation.

1:00:07

So it's not all like him saying,

1:00:11

living in a repressive regime,

1:00:13

criticizing his government.

1:00:14

Well, I don't know.

1:00:15

Some of the stuff I've seen come out

1:00:16

of the UK lately has me really worried.

1:00:17

Maybe he is living in a repressive regime,

1:00:19

but it's not all that.

1:00:20

It's also him going to other places around

1:00:22

the world just to make all kinds of

1:00:24

statements,

1:00:25

all kinds of political statements,

1:00:26

I guess.

1:00:27

But, you know,

1:00:28

just kind of the hippie stuff, you know,

1:00:29

like,

1:00:29

why can't we all just get along kind

1:00:31

of political statements?

1:00:32

But it's yeah.

1:00:36

Like I said,

1:00:36

I really don't have too much to add

1:00:37

to that one.

1:00:38

It's just it's just an interesting story

1:00:40

that

1:00:42

I thought was a good discussion about,

1:00:46

I guess about public interest, right?

1:00:48

Because we think about that a lot too,

1:00:49

about famous people and how much privacy

1:00:52

versus transparency do they deserve

1:00:55

depending on their roles.

1:00:56

And I don't know.

1:00:57

I like thought experiments.

1:00:58

I think that's what prompted me to want

1:01:00

to talk about this one.

1:01:00

I don't know if you have any thoughts

1:01:04

on this.

1:01:05

I think when it comes to art,

1:01:06

I think

1:01:07

this is, you know,

1:01:09

there's plenty of artists that do this

1:01:11

sort of stuff, like not just Banksy.

1:01:13

Um, and you know, obviously people,

1:01:17

the government is not going to be super

1:01:18

happy if you're like defacing a public

1:01:20

building or like there's a, there's,

1:01:23

I think defacing is,

1:01:24

is certainly up in the air, right?

1:01:27

Like I think in a lot of cases,

1:01:29

uh,

1:01:31

it's very much like, you know,

1:01:34

trying to make a message,

1:01:35

trying to make people

1:01:37

publicly aware

1:01:39

of an issue,

1:01:41

for instance, like, uh,

1:01:43

I don't know if it's,

1:01:45

we've had a lot of like street art

1:01:46

just like pop, pop up in Sydney,

1:01:48

Australia, like, and it was never,

1:01:50

you know,

1:01:51

it was never publicly sanctioned.

1:01:53

It's just, a lot of it has to do

1:01:54

with, like, you

1:01:57

know, street art, um, criticizing the

1:02:00

government, criticizing, like, social justice

1:02:03

issues. Um, I think, you know, it's not

1:02:08

really hurting anybody. So I think, you know,

1:02:11

maybe I'm... it's showing an

1:02:14

artist's vision. I think in a lot of

1:02:17

cases when there's, like, graffiti, uh,

1:02:21

it actually brings people, like Nate said,

1:02:23

it's like a tourism thing,

1:02:25

especially when it's like a famous artist.

1:02:27

There's plenty of places where there's

1:02:28

like graffiti in places.

1:02:30

Um,

1:02:31

and people come there just cause they want

1:02:33

to take pictures.

1:02:34

Um,

1:02:34

it doesn't have to be any famous artists.

1:02:36

Right.

1:02:37

Um, but I think, you know,

1:02:40

it's part of the community.

1:02:42

It's part of like,

1:02:45

it's just,

1:02:46

it's kind of an expression of people in

1:02:48

that who live in that place.

1:02:49

Um, so I dunno, I think it's,

1:02:54

I don't think Banksy's identity should be

1:02:57

like revealed obviously.

1:02:59

Cause I think, you know,

1:03:00

people should be able to choose whether

1:03:01

they,

1:03:04

share that information or not.

1:03:06

Um, I think it just applies.

1:03:08

It doesn't really,

1:03:09

I think someone could definitely make the

1:03:11

argument that because he was technically

1:03:13

committing crimes, or, like, not crimes,

1:03:16

I guess. Maybe, like,

1:03:19

I don't know what you would classify

1:03:21

graffiti as. Vandalism, I guess, maybe,

1:03:25

but yeah, some kind of misdemeanor,

1:03:27

I think.

1:03:28

Yeah.

1:03:29

So I think, you know,

1:03:31

It's up to the community to determine

1:03:34

whether it's acceptable or not, I guess.

1:03:41

I think you know there's definitely a

1:03:42

difference between like a lot of people

1:03:44

just do like tagging stuff or they like

1:03:46

put their name on something that's not

1:03:47

really art that's just like vandalism but

1:03:50

I think if it's actually something that's

1:03:52

trying to display a message I think it's

1:03:54

a little bit different um like social

1:03:56

commentary and stuff I think is definitely

1:03:58

more acceptable but I think you know

1:04:01

legitimate actual street art is definitely

1:04:04

on a different

1:04:06

different level,

1:04:06

but I think it's definitely,

1:04:08

I think one of these things where it's

1:04:10

down, it's down to someone's beliefs, um,

1:04:12

as a person,

1:04:13

like it's not really a very clear cut

1:04:16

thing.

1:04:16

I don't think, um,

1:04:18

whether it's a clear cut, obvious answer,

1:04:20

but I think in this community, it's like,

1:04:23

you know,

1:04:24

I think people should be for protecting

1:04:26

artists, privacy,

1:04:28

protecting anyone's privacy if they don't

1:04:30

want to have their identity revealed.

1:04:31

But,

1:04:34

Yeah, I think... yeah, I don't really have

1:04:36

too much more to add. Do you have

1:04:37

any thoughts? No? Yeah. Um, I mean, yeah,

1:04:42

I was really disappointed to see that they

1:04:43

went ahead and published his name anyway,

1:04:45

or who they believe it is. Um...

1:04:47

And it's,

1:04:48

I'm with you on the one hand,

1:04:49

because like, to me, it's like,

1:04:50

I don't think his message is

1:04:51

controversial.

1:04:52

You know,

1:04:52

I could see the argument of like, well,

1:04:54

let's say I own a business and he

1:04:55

graffitis the side of that business with a

1:04:57

message that I don't agree with.

1:04:59

Like, okay, I hear that,

1:05:00

but he's not in my opinion.

1:05:02

I mean,

1:05:03

I don't see anything controversial about

1:05:05

any of the stuff he's posted.

1:05:06

I mean, for the record,

1:05:07

I don't follow him super closely.

1:05:09

So I don't know if somebody is going

1:05:09

to go dig up and be like, oh,

1:05:10

go look up this painting.

1:05:11

This was like super political and somebody

1:05:13

may not agree with that one.

1:05:17

The one on the wall in Palestine was, like,

1:05:20

it was forced perspective,

1:05:23

but you know, it

1:05:23

was a lifelike painting of, like, a hole

1:05:26

in the wall, and it was, like, this

1:05:27

beautiful beach on the other side. And you

1:05:29

know, it's art, so it's open to

1:05:31

interpretation, but what I took away

1:05:32

from that was, like,

1:05:34

this could be paradise if we could find

1:05:36

a solution here.

1:05:37

And he wasn't trying to say what the

1:05:38

solution is.

1:05:39

He was just trying to say like,

1:05:40

be human,

1:05:41

be kind to each other and figure out

1:05:42

a solution.

1:05:43

And it's like,

1:05:44

I don't think that's a particularly

1:05:45

controversial take personally, but yeah.

1:05:48

So, I mean, it's... I don't know.

1:05:50

I think there's much worse crimes in the

1:05:51

world, but yeah, it was just,

1:05:54

I don't know.

1:05:54

He's, he's so, yeah.

1:05:58

I was disappointed to see that Reuters went

1:05:59

ahead and published it, but yeah.

1:06:02

It's interesting to think about because I

1:06:04

think about that a lot as a quote

1:06:06

unquote semi-public figure is like,

1:06:08

how much transparency do I owe people

1:06:10

versus how much privacy do I get to

1:06:12

have as an individual?

1:06:13

And it's, I don't know.

1:06:16

Yeah, life is full of nuance.

1:06:17

Definitely.

1:06:22

All right, so in a moment,

1:06:23

we're going to start taking viewer

1:06:25

questions.

1:06:25

So I know there have already been some

1:06:27

questions,

1:06:28

but if you guys are holding on to

1:06:29

any more,

1:06:30

definitely go ahead and start leaving

1:06:32

those in the chat or in the forum

1:06:35

thread.

1:06:36

But for now, speaking of the forums,

1:06:39

we're going to check in on our community

1:06:41

forum because there's always a lot of

1:06:43

activity.

1:06:44

This week has been no exception,

1:06:45

it's been a very busy week.

1:06:46

So here's a few of the most interesting

1:06:48

discussions happening.

1:06:50

And the first one we're going to talk

1:06:51

about is there's a community discussion

1:06:54

about Firefox's new features.

1:06:58

So for those who don't know, Firefox,

1:06:59

I believe it's 149, coming

1:07:01

out here pretty soon.

1:07:03

And it's got a few pretty big changes.

1:07:06

Some of them are very

1:07:08

cosmetic. Welcome cosmetic, for the record,

1:07:11

like I just found out.

1:07:13

I feel dumb.

1:07:14

But I just found out two or three

1:07:16

weeks ago that in Brave,

1:07:18

you can do split tabs.

1:07:19

So it's kind of like tiling a window,

1:07:22

which I just realized I should totally be

1:07:23

doing here, but I'm not.

1:07:25

The split tab thing, I mean.

1:07:26

It's kind of like tiling a window,

1:07:27

except it's the same window,

1:07:30

and it's just the tabs are side by

1:07:31

side.

1:07:32

which is probably a little bit of a

1:07:33

niche use case,

1:07:35

but it's really cool for me.

1:07:37

It's really nifty and I like it.

1:07:39

Firefox is going to be adding that,

1:07:41

but then there's also some more serious

1:07:42

things.

1:07:42

Like there's a sanitizer API, which...

1:07:46

I'm forgetting off the top of my head

1:07:47

exactly what that does.

1:07:48

I think that's supposed to help protect

1:07:50

against cross-site scripting attacks,

1:07:51

but don't quote me.

1:07:53

It's definitely a security update.
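For context on what a sanitizer does here: it strips script-capable markup out of untrusted HTML before that HTML reaches the page. The real Sanitizer API is a native browser feature; the sketch below is only a toy illustration of the idea (the function name is made up, and a few regexes are nowhere near safe enough for production use):

```typescript
// Toy illustration of HTML sanitization: remove the common script-execution
// vectors from untrusted markup while keeping the harmless markup intact.
// The browser's Sanitizer API does this natively and far more robustly;
// this regex version only demonstrates the concept.
function toySanitize(html: string): string {
  return html
    // drop <script>...</script> blocks entirely
    .replace(/<script\b[^>]*>[\s\S]*?<\/script>/gi, "")
    // drop inline event handlers like onclick="..."
    .replace(/\son\w+\s*=\s*("[^"]*"|'[^']*'|[^\s>]+)/gi, "")
    // drop javascript: URLs in href/src attributes
    .replace(/(href|src)\s*=\s*(["']?)\s*javascript:[^"'>\s]*\2/gi, "");
}

const untrusted = '<p onclick="steal()">Hi<script>steal()</script></p>';
console.log(toySanitize(untrusted)); // the <p> survives, the script vectors do not
```

The point is that sanitization happens before insertion into the DOM, so even attacker-controlled input can't smuggle executable code onto the page, which is why this lands as a security feature rather than a cosmetic one.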

1:07:55

And noticeably, this one isn't new.

1:07:56

Apparently,

1:07:56

they've announced the sanitizer API

1:07:58

before.

1:07:59

But Firefox is going to include a VPN.

1:08:03

I believe from what I've heard,

1:08:05

they did not really say for sure in

1:08:07

their blog post,

1:08:08

but it will be free for up to

1:08:10

fifty gigs a month.

1:08:12

And to start with,

1:08:14

it's going to roll out in France, Germany,

1:08:15

the UK and the US.

1:08:17

We'll see about the UK if they start

1:08:19

requiring ID for VPNs.

1:08:20

But that's a different discussion.

1:08:24

And yeah,

1:08:25

I think I've heard rumors that it's going

1:08:26

to be in-house.

1:08:27

I know last time they did this,

1:08:29

it was a white label of Mullvad.

1:08:32

And I actually stand corrected because

1:08:34

I've always said that like,

1:08:34

I don't see the point of the in-browser

1:08:37

VPN because I want more than just my

1:08:39

browser to be protected.

1:08:41

And from what I'm told,

1:08:42

that is not how this is going to

1:08:43

work.

1:08:43

It is actually going to like protect your

1:08:45

whole device.

1:08:46

It's just going to give you a lot

1:08:47

more granularity in the browser.

1:08:48

That's what I've heard.

1:08:49

But yeah,

1:08:51

Yeah, what do we think about this?

1:08:54

I think I'll go ahead and say that

1:08:56

I'm notoriously critical of Mozilla,

1:08:58

but I'm happy to see them putting good

1:09:00

features into Firefox.

1:09:01

I mean,

1:09:03

at least it's not an AI feature that

1:09:05

nobody asked for, right?

1:09:07

So yeah,

1:09:09

I think this is potentially a good step

1:09:12

forward.

1:09:12

I will be interested to see how that

1:09:14

VPN works potentially, but yeah.

1:09:18

Uh, I think you did, unfortunately,

1:09:21

unfortunately, Nate, to,

1:09:22

to ruin your anti-AI parade.

1:09:24

They unfortunately did include, uh,

1:09:27

in this update,

1:09:29

they're including smart window,

1:09:31

which was previously called AI window,

1:09:34

which is basically.

1:09:36

Oh yeah.

1:09:37

That, okay.

1:09:37

I missed that.

1:09:38

I was just reading the summary here in

1:09:40

the thread.

1:09:41

Yeah.

1:09:41

So unfortunately that is coming in this

1:09:43

update.

1:09:44

I think they realized calling it AI window

1:09:46

was probably a bit too on the nose.

1:09:48

So they've changed the name to smart

1:09:50

window this time, I think.

1:09:53

We did talk about this a little bit

1:09:55

internally about this,

1:09:58

this free in-built VPN.

1:10:00

I think the thing I was specifically

1:10:03

talking about was Mozilla VPN.

1:10:06

So this is a different thing.

1:10:08

This is, I guess, Firefox VPN.

1:10:12

which is different to Mozilla VPN.

1:10:14

Mozilla VPN,

1:10:15

one of the cool things about Mozilla VPN,

1:10:17

like Nate kind of talked about,

1:10:19

was it would cover your whole device and

1:10:21

then when you use Firefox,

1:10:24

it would integrate with the desktop client

1:10:26

and it would allow you to select different

1:10:29

locations for where your browser would

1:10:31

exit based on the website.

1:10:33

So, you know, obviously you wouldn't want to,

1:10:36

like, access your bank's website and also

1:10:39

be coming from, like, Turkey, because that

1:10:42

would, like, cause your bank to, like, you

1:10:44

know, lock down. They're not gonna, they're

1:10:47

not gonna like that. Um, so that allowed

1:10:51

you to have different endpoints coming

1:10:53

out there. Um, I think that is also

1:10:55

very useful because, you know,

1:10:58

A lot of times VPNs are blocked.

1:11:00

Like on Reddit,

1:11:01

you'll frequently find it's blocked.

1:11:03

On YouTube, it'll ask you to sign in.

1:11:06

I think that's an interesting thing with

1:11:08

Mozilla VPN.

1:11:09

But I think like Nate said,

1:11:11

this is a separate thing.

1:11:13

This isn't the same thing.

1:11:14

It's kind of confusing.

1:11:16

They've got two products.

1:11:17

This is only for your browser.

1:11:20

As far as we're aware,

1:11:21

they haven't said that it's going to be

1:11:23

your entire device because they say this

1:11:25

is a proxy.

1:11:28

So as far as we are aware,

1:11:29

that is only going to be through the

1:11:31

browser itself, as far as we know.

1:11:34

So I would say that's what we should

1:11:36

assume this is, for now.

1:11:38

Um, I don't think this is, you know,

1:11:43

an amazing offering,

1:11:47

because I think we have such good free,

1:11:49

privacy-friendly full VPNs you can use now.

1:11:51

Like you can use ProtonVPN free.

1:11:53

Like they have quite good speeds.

1:11:57

It's free.

1:11:58

I think Proton's doing a great job by

1:12:00

offering that for free to people.

1:12:03

I think people should use that if they

1:12:05

don't have another way to protect their

1:12:08

privacy.

1:12:08

But I think especially with the low cost

1:12:11

of VPNs at this point,

1:12:12

like Mullvad is five euros a month,

1:12:15

like that is a pretty cheap price for

1:12:18

a lot of people.

1:12:19

But I think, you know,

1:12:21

price-wise, it's also a trying time.

1:12:25

You know, people are trying to save money.

1:12:27

So I think, you know,

1:12:29

fifty gigabytes of data is definitely

1:12:32

pretty, uh, pretty generous, I would say. It's,

1:12:37

like, that's gonna take you quite a long

1:12:39

way, um, especially monthly. I feel like I

1:12:42

don't even use... I use, like, only a

1:12:44

couple of gigabytes a month on my phone.

1:12:46

So, I mean... I

1:12:49

mean, I know there's people that use, like,

1:12:50

hundreds of gigabytes on their phone every

1:12:52

month. I don't know how you do that,

1:12:53

exactly, but, um...

1:12:56

I think fifty gigabytes is a lot. Maybe,

1:12:59

I'm like... I think it might just be

1:13:01

because our internet is really slow here,

1:13:03

but it's kind of hard to download that

1:13:06

much stuff. But fifty gigabytes... And it's,

1:13:10

it's kind of frustrating. Firefox and

1:13:12

Mozilla in general do this all the time,

1:13:14

like, they only release their products in

1:13:17

specific regions.

1:13:20

Like in this case, they're saying the US,

1:13:22

France, Germany, and the UK to start.

1:13:25

That's where they're releasing this free

1:13:27

Firefox VPN.

1:13:29

And it's the same thing with Mozilla

1:13:31

Monitor, which I think is defunct now,

1:13:34

and Mozilla VPN and Mozilla Relay.

1:13:38

It's like their email aliasing thing.

1:13:40

It was only available in certain

1:13:42

countries.

1:13:43

I was always kind of like interested in

1:13:45

trying it.

1:13:47

It never was available in Australia. So I

1:13:50

think they should probably look at, you

1:13:53

know... I don't really understand the reason

1:13:55

why they're only releasing this in certain

1:13:58

locations, but, um, I think especially in

1:14:01

locations where, I feel like, they don't

1:14:05

need the privacy as much. Like, what about

1:14:07

countries that are, like, you know, under

1:14:09

siege by, like, authoritarian governments?

1:14:12

Maybe we should focus on those first to

1:14:13

get this technology to. But, um...

1:14:18

It's still an interesting thing.

1:14:19

I didn't really read any of the comments.

1:14:21

Was there anything you were thinking that

1:14:23

people mentioned that we haven't really

1:14:24

talked about yet?

1:14:26

I don't think so.

1:14:27

There was kind of a discussion right off

1:14:29

the bat about whether they meant, um,

1:14:31

like there was a confusion of, um,

1:14:36

When they said to start,

1:14:37

did they mean to start?

1:14:39

And that might change?

1:14:40

Or did they mean the countries might

1:14:41

change?

1:14:42

But I think everybody kind of agreed that

1:14:44

it's like, no, it's probably the country.

1:14:45

But yeah,

1:14:47

there was a lot of discussion about is

1:14:49

it

1:14:51

like what I was saying,

1:14:52

is this going to be an in-house thing

1:14:54

or is this going to be a,

1:14:57

like a white label of Mullvad. Some

1:15:01

people here are saying this might be, like,

1:15:03

competition against Opera,

1:15:04

which, I'm with you.

1:15:05

Like personally, I don't,

1:15:08

I do think, the last time I

1:15:10

tried one of them,

1:15:10

the Proton free servers tended to be a

1:15:12

little bit slow,

1:15:13

but I also know since then they've kind

1:15:14

of added a few more.

1:15:16

So hopefully that's helped.

1:15:19

But that said, I do think,

1:15:22

I'm not opposed to them adding this as

1:15:24

like a compete with opera thing,

1:15:25

especially if they can keep the cost low

1:15:27

for them.

1:15:28

And this isn't going to be one of

1:15:29

those things that,

1:15:30

you know, in a year, they're just like,

1:15:32

oh,

1:15:33

we killed this off because it's really

1:15:34

expensive.

1:15:35

But I don't know.

1:15:37

I mean,

1:15:37

I know there's the whole smart window

1:15:38

thing, which I don't know.

1:15:40

To me,

1:15:40

that reminds me of like Brave's Leo.

1:15:42

Like Brave has like a little pop out

1:15:45

mode where you can just talk to Leo

1:15:46

directly and have a conversation with it,

1:15:49

have a conversation in the sense of like,

1:15:50

I'm not asking it to paraphrase this page

1:15:53

or whatever.

1:15:55

But they also have like a little sidebar

1:15:57

where you can ask questions about the page

1:15:59

you're on.

1:16:00

And they say that this will be completely

1:16:02

optional.

1:16:02

So I don't know, to me,

1:16:04

that's just competing with Brave, which,

1:16:08

again, I don't know, it's just,

1:16:10

it's good to see them mostly focusing on

1:16:12

the browser again,

1:16:13

and not buying like ad companies or fake

1:16:17

review plugins or Yeah,

1:16:20

so

1:16:22

Yeah,

1:16:23

I think one interesting thing you said,

1:16:24

oh, this is like,

1:16:25

I feel like Brave also has like a

1:16:27

VPN built in. And Vivaldi.

1:16:30

Oh, they do.

1:16:31

I forgot about that.

1:16:31

So I think it's more of a,

1:16:33

I think they're going more to try and

1:16:35

challenge Vivaldi here and Opera.

1:16:38

But Brave also has, it's a paid thing,

1:16:40

but it's still technically built in,

1:16:42

I guess.

1:16:44

I guess they're just trying to be like

1:16:45

feature-complete,

1:16:49

competing against, you know, this stuff.

1:16:54

So yeah, I don't know. Uh, I think...

1:16:57

It's also, one other thing that, uh, Firefox

1:17:01

has actually rolled out in, like, the latest

1:17:03

release: they do have the AI block switch

1:17:06

now. So, like, if you've got that enabled,

1:17:10

you're not going to get any of this

1:17:11

AI stuff, so I wouldn't worry about that.

1:17:14

I would make sure you have that ticked

1:17:15

if you use Firefox, because you don't want

1:17:17

to get this in the next update. Um...

1:17:23

So yeah, I don't know, this is,

1:17:26

it's good to see Firefox actually doing

1:17:27

something this time.

1:17:28

Like I feel like we were sitting at

1:17:30

like no changes being made every year.

1:17:32

There was like absolutely barely any

1:17:34

changes to Firefox for, I feel like, a long time.

1:17:53

I don't think it's going in the direction

1:17:56

I would like.

1:17:57

I don't think many people agree that it's

1:17:59

going in the direction they want.

1:18:02

And I guess with all this AI stuff,

1:18:04

I think it's pretty tricky to avoid at

1:18:16

this point.

1:18:16

Every company is basically rolling this

1:18:18

stuff out.

1:18:20

At least Firefox is making it easy to

1:18:23

opt out, but I just,

1:18:25

it kind of frustrates me that all the,

1:18:27

all the donation money and all this money

1:18:29

from Google to be the main search engine

1:18:31

is just being dumped into like AI and

1:18:33

like privacy-preserving analytics.

1:18:37

Um, it's not really stuff that is gonna,

1:18:39

I don't think it's gonna bring people into

1:18:41

the browser,

1:18:42

but I think if they actually made some

1:18:44

big changes and listened to what the

1:18:48

community actually wanted from the browser,

1:18:50

I think they could.

1:18:52

you know,

1:18:52

there's plenty of projects that are doing

1:18:54

interesting things.

1:18:54

Like I think one of the most interesting

1:18:56

ones was arc browser.

1:18:58

Like they were doing quite a lot of

1:19:00

interesting, you know,

1:19:01

different things that no other browser was

1:19:03

doing.

1:19:04

Like,

1:19:04

I think it'd be interesting to see Mozilla

1:19:06

just actually try something new,

1:19:09

like not just like copy what other people

1:19:11

are doing,

1:19:11

like actually try and make something, uh,

1:19:16

little bit revolutionary a little bit

1:19:18

different um to actually give people a

1:19:20

reason to use it because right now it's

1:19:22

like Firefox just kind of is bad,

1:19:26

especially on some websites. Like, you're

1:19:28

just gonna have a worse experience. Like,

1:19:30

people don't test for Firefox now. Um, like,

1:19:34

even this website, we're using StreamYard

1:19:36

to do this right now, I can't use

1:19:38

Firefox

1:19:39

to do this.

1:19:41

So, you know, it's,

1:19:44

if you can't do basic stuff with your

1:19:47

browser,

1:19:47

I think that's going to push people away

1:19:49

from doing, from using it as well.

1:19:52

But yeah,

1:19:53

I think that's kind of my thoughts on

1:19:56

this.

1:19:57

Somewhat positive, I guess, but yeah.

1:20:01

Yeah, I agree.

1:20:02

I mean, for me,

1:20:03

it's unfortunate that Mozilla is

1:20:05

constantly playing catch-up to everyone

1:20:07

else.

1:20:08

Like, again, the split view.

1:20:10

Brave has that.

1:20:11

I don't know how long they've had it,

1:20:12

because I just discovered it,

1:20:12

but Brave has that.

1:20:14

And even their AI stuff, it's like...

1:20:16

Like, everyone else... The AI ship...

1:20:20

I mean, I feel comfortable saying this,

1:20:21

because this isn't like a, you know,

1:20:22

hustle podcast or whatever, but, like,

1:20:25

I feel like at this point,

1:20:26

if you're just now jumping on the AI

1:20:28

bandwagon, it's gone.

1:20:29

Like...

1:20:30

It's gone.

1:20:31

Why are you there?

1:20:32

And so it's, you know, it's like,

1:20:35

I don't understand why they're,

1:20:36

and they're doing it in such a poor

1:20:37

way too.

1:20:38

Like I remember being really disappointed

1:20:39

when I looked into their AI features,

1:20:42

not because I wanted to use them,

1:20:43

but just because I wanted to understand

1:20:44

them and they don't even do anything.

1:20:45

It's like, oh,

1:20:46

here's a tab where you can talk with

1:20:47

ChatGPT.

1:20:49

Your privacy policy,

1:20:50

like their privacy policy is literally

1:20:52

like, go see OpenAI's privacy policy.

1:20:54

And it's like,

1:20:54

so what's the difference with this?

1:20:56

and just going to chatgpt.com.

1:21:01

What use is this?

1:21:02

And it's like, oh, well,

1:21:02

it's integrated in there.

1:21:04

I don't care about that.

1:21:05

If I cared about that,

1:21:06

I'd be using ChatGPT's browser.

1:21:08

I don't understand why it needs to...

1:21:11

to do that.

1:21:12

I don't know.

1:21:12

It's just,

1:21:13

it's weird to me that like they're

1:21:14

constantly playing catch up and yeah,

1:21:16

it would be nice to see them because

1:21:18

they have such a passionate,

1:21:19

active community.

1:21:19

I know they do.

1:21:21

And I'm sure people have plenty of ideas

1:21:23

about how they can improve it, but it's,

1:21:26

it's, it's, yeah,

1:21:29

it is nice to see them investing in

1:21:30

something that isn't AI for,

1:21:32

even if they have the little smart window

1:21:33

thing, but yeah,

1:21:34

The split view, the tab notes,

1:21:36

which I don't know how that's going to

1:21:37

help, but the sanitizer API, the VPN.

1:21:41

I agree with you.

1:21:42

It's not enough,

1:21:43

but it's nice to see them starting to

1:21:45

get back into it.

1:21:46

And hopefully,

1:21:47

I'm hoping the momentum will pick up for

1:21:49

sure.

1:21:51

Yeah, I think we had a question here.

1:21:52

We have, well, not a question.

1:21:53

I guess someone was just saying, uh,

1:21:55

without manifest V two extensions,

1:21:58

I find the internet to be pretty bad.

1:22:01

Um, I agree.

1:22:02

I think, you know, uBlock Origin,

1:22:04

I think is kind of a needed tool

1:22:08

at this point.

1:22:09

Uh,

1:22:09

uBlock Origin Lite is, it doesn't work

1:22:15

as well and it doesn't block a lot

1:22:17

of things that you need, right?

1:22:19

Like, you know,

1:22:22

you would hope that, uh, you know,

1:22:26

websites don't have a million pop-ups and

1:22:28

like cookie banners and paywalls and all

1:22:31

this sort of stuff.

1:22:32

But it's kind of the modern internet at

1:22:33

this point.

1:22:34

Um, you need to, you need to,

1:22:35

you need to use an ad blocker unless

1:22:36

you want to go completely, you know,

1:22:39

off the rails, I think.

1:22:41

So if,

1:22:42

if Mozilla is like the last bastion of

1:22:44

MV two extensions, then, uh,

1:22:47

I think that is definitely a thing that

1:22:48

separates them from Chrome, but,

1:22:52

You know,

1:22:52

that's not going to be enough to keep

1:22:54

people there because plenty of people are

1:22:56

still using Chrome and they're still using

1:22:58

uBlock Origin Lite.

1:22:59

Um, it's good enough for them.

1:23:02

It's not perfect,

1:23:03

but it's definitely good enough.

1:23:05

Um, so.

1:23:07

people kept saying that we're going to

1:23:09

leave Chrome.

1:23:09

If, if Chrome doesn't, uh,

1:23:12

if Chrome doesn't use, um,

1:23:14

doesn't allow MV two,

1:23:15

I'm going to leave Chrome.

1:23:16

And then everyone just stayed on Chrome.

1:23:17

Like, like,

1:23:19

I think people might not realize that a

1:23:25

lot of people don't actually use

1:23:27

extensions.

1:23:28

They don't even know what they are.

1:23:29

They just use their web browser like

1:23:32

normally.

1:23:33

Um, so yeah, I dunno.

1:23:36

Um,

1:23:38

It's, yeah,

1:23:39

I don't think Firefox is in a very

1:23:41

good position at the moment,

1:23:44

unfortunately.

1:23:46

I do got to point out,

1:23:47

I disagree that most people don't use

1:23:48

extensions because I feel like every time

1:23:50

I look at somebody's Chrome browser,

1:23:51

they've got like ten extensions and it's

1:23:54

always like Grammarly.

1:23:55

And then like what's funny is it's always

1:23:57

like six different ad blockers.

1:23:58

It's always like AdBlock Plus,

1:24:00

plus Ghostery, plus Privacy Badger.

1:24:03

It's more...

1:24:05

I almost get the impression that like

1:24:06

people don't understand extensions and

1:24:08

they don't understand which ones,

1:24:09

like what they do and how they work.

1:24:11

And they're just like, Oh, you know,

1:24:13

the more I throw on there,

1:24:14

the better it gets.

1:24:15

Right.

1:24:15

And it's like, no,

1:24:16

you need to be intentional with which ones

1:24:18

you use because you're giving them a lot

1:24:19

of permission, but yeah.

1:24:21

Yeah.

1:24:22

Which,

1:24:22

which just kind of goes back to what

1:24:23

you're saying though, is like,

1:24:24

people don't understand like manifest V

1:24:25

two versus V three and they don't really

1:24:28

like, they don't understand like, okay,

1:24:30

now I've got AdBlock Plus or whichever

1:24:32

one,

1:24:32

but it doesn't work as well as it

1:24:35

used to because Google has hindered it and

1:24:38

they don't understand why.

1:24:39

And which is still unfortunate, but yeah.

1:24:43

So, I mean, if we,

1:24:44

if we like take into account the amount

1:24:46

of people that use, uh, Chrome, right.

1:24:50

And we look at like, you know,

1:24:52

AdBlock Plus or uBlock Origin.

1:24:55

Um,

1:24:56

there's not that many people using them.

1:24:58

If you, if you consider the actual,

1:24:59

like amount of people using Google Chrome.

1:25:02

Um, sure.

1:25:02

The percentage.

1:25:03

Yeah.

1:25:04

uBlock Origin Lite is like,

1:25:06

sixteen million.

1:25:07

That's pretty small,

1:25:08

like if you compare it to the amount

1:25:09

of people.

1:25:10

I mean,

1:25:10

it could be like a sample thing,

1:25:12

like I've personally seen people that use

1:25:15

Chrome and they didn't have any

1:25:17

extensions,

1:25:19

and I've also seen people with a bunch

1:25:20

of them,

1:25:20

so

1:25:21

It's kind of hard to determine what this

1:25:24

is through like anecdotal things.

1:25:26

But I think if we look at the

1:25:27

numbers, we can get some idea,

1:25:29

at least at least like these ad blocking

1:25:33

ones.

1:25:33

I mean,

1:25:33

we could look at like other extensions

1:25:36

that people are using and installing,

1:25:37

probably, you know,

1:25:39

some really weird stuff.

1:25:42

But it doesn't seem like it's super

1:25:46

common.

1:25:48

But that's just going off the numbers,

1:25:51

I guess.

1:25:52

It's not really... No,

1:25:55

to back up what you're saying,

1:25:56

one source says that Chrome has almost

1:25:58

four billion users,

1:25:59

three point nine eight billion users

1:26:01

worldwide based on an estimate.

1:26:04

So, yeah,

1:26:04

like sixteen million people is not much.

1:26:08

I don't know what the math is on

1:26:09

that one.

1:26:09

I'm not even going to try,

1:26:10

but it's not much.
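For context, the rough arithmetic being skipped over can be sketched like this (a quick estimate using the two figures quoted on the stream; note that install counts are not the same as unique users):

```python
# Rough share of Chrome users running uBlock Origin Lite, using the
# figures quoted on the stream (both are estimates; installs != users).
chrome_users = 3_980_000_000   # ~3.98 billion Chrome users worldwide
ubol_installs = 16_000_000     # ~16 million uBlock Origin Lite installs

share = ubol_installs / chrome_users * 100
print(f"roughly {share:.2f}% of Chrome users")  # roughly 0.40%
```

So even taking the install number at face value, well under one percent of the Chrome user base has it.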

1:26:12

Yeah, I mean,

1:26:14

it's probably not the greatest way to

1:26:16

determine it, right?

1:26:18

People,

1:26:19

it could be multiple installs by one

1:26:22

person.

1:26:22

It could be counted by like, you know,

1:26:25

you've installed uBlock Origin a couple of

1:26:27

times on a couple of your devices.

1:26:28

It could be even less than sixteen million

1:26:30

people, unfortunately.

1:26:32

It doesn't exactly paint a very good

1:26:34

picture because, yeah,

1:26:37

it sounds like most people don't care.

1:26:39

MV two to MV three gives people more

1:26:42

security protections, I guess.

1:26:45

But it does come

1:26:48

at a cost.

1:26:51

Yeah, exactly.

1:26:52

Um, okay.

1:26:53

So yeah,

1:26:55

we could move on to the next, uh,

1:26:57

forum thread here.

1:26:58

Um,

1:26:58

cause we have talked about Firefox and

1:27:00

Mozilla quite a bit.

1:27:01

I feel like it's an easy topic to

1:27:03

just kind of talk about for a long

1:27:05

time because there's just so many issues

1:27:07

for sure.

1:27:07

Um, but this next one was.

1:27:10

Someone started a thread.

1:27:12

It was actually a very recent thread,

1:27:13

only sixteen hours ago.

1:27:15

So favorite underrated hobby for staying

1:27:18

productive.

1:27:20

I'm looking for hobbies that aren't just

1:27:22

fun,

1:27:22

but also help clear your mind or improve

1:27:25

skills in subtle ways.

1:27:27

Anything offbeat that people swear by?

1:27:31

I feel like this is definitely an off

1:27:33

topic section of the forum.

1:27:36

I think this could be interesting to read

1:27:38

some of these things here.

1:27:39

I feel like Nate added this.

1:27:41

So I feel like you have something you

1:27:43

want to say about this.

1:27:44

Do you?

1:27:46

Yeah, I do.

1:27:50

I wanted to add this one because

1:27:54

I don't know about you guys.

1:27:54

Okay, so a quick tangent off topic.

1:27:58

When I used to work with Henry in

1:28:00

Surveillance Report,

1:28:01

he was very open about the fact that

1:28:02

he's like, I do privacy all day.

1:28:05

So when I'm not working,

1:28:06

I don't really listen to privacy podcasts

1:28:08

or read privacy books.

1:28:09

I need to detox from it.

1:28:12

And now that I am also doing privacy

1:28:14

full time, I...

1:28:17

I haven't gone quite to that extent,

1:28:18

but I get where he's coming from.

1:28:19

I mean, I understood it before,

1:28:20

but now I'm living it.

1:28:21

And, um, so I,

1:28:22

I think it's just really important to,

1:28:25

I don't want to say touch grass.

1:28:26

Cause that's a very like disparaging term,

1:28:28

but it's, it's just really important.

1:28:30

I think for all of us to like

1:28:31

take a breather, especially privacy,

1:28:32

like it can be so depressing sometimes.

1:28:34

Cause unfortunately I feel like we do take

1:28:36

more, more losses than wins.

1:28:38

A lot of the time, you know,

1:28:39

we don't, um, we don't get to, uh,

1:28:44

I wouldn't say we don't get to.

1:28:45

We see a lot more bad news regularly

1:28:47

about Instagram rolling back encrypted

1:28:49

DMs and Android trying to crack down on

1:28:52

third-party installations and this, that,

1:28:53

and the other.

1:28:54

And so it's very...

1:28:57

it can be a little depressing sometimes

1:28:59

because we only get the good news like

1:29:01

chat control was defeated.

1:29:03

We only get that stuff every so often.

1:29:04

So I really like this idea of what

1:29:06

are your hobbies just in general?

1:29:10

I like these people talking about things

1:29:12

they do.

1:29:12

One person here said they read,

1:29:14

which is pretty...

1:29:17

not really offbeat, but you know, reading

1:29:19

is a really good thing. And they said

1:29:20

they read a lot of fiction too,

1:29:21

like it's not all tech and privacy stuff,

1:29:23

they read a lot of non-fiction and fiction. Um,

1:29:25

one person did mention self-hosting, which

1:29:27

is a good way to learn more about

1:29:28

tech and privacy. Uh, to your comment, one

1:29:31

person did say, I didn't realize we had

1:29:32

an off-topic section of the forum where

1:29:34

we're allowed to talk about things

1:29:35

unrelated. Um, so yeah, definitely we do have

1:29:38

that. And then, um,

1:29:40

Somebody said they do chess.

1:29:43

One of my favorites, they said,

1:29:45

not sure it would qualify as offbeat,

1:29:46

but I enjoy dribbling watercolors on

1:29:48

potato slices,

1:29:50

letting them dry out and then taking

1:29:51

photos of them.

1:29:52

You blow up the images and they kind

1:29:53

of resemble an aged artsy fartsy painting.

1:29:56

one day I'll print and put these up

1:29:57

for sale.

1:29:59

And somebody replied, they're like,

1:30:00

I'm going to go out on a limb and

1:30:01

say it qualifies as offbeat.

1:30:03

But, you know, it's a...

1:30:04

And for the record,

1:30:05

I thought that one was super cool.

1:30:06

I want to see those too.

1:30:07

Those sound awesome.

1:30:08

But yeah, it's just, I guess,

1:30:10

kind of a reminder for all of us

1:30:12

to find something enjoyable that helps you

1:30:15

unwind because this stuff can be a lot

1:30:18

sometimes for all of us.

1:30:20

I don't think you have to work in

1:30:21

it full time,

1:30:22

but it is really good to remember that

1:30:24

there's...

1:30:25

Privacy should be a means to an end,

1:30:27

in my opinion.

1:30:28

Privacy should be what enables you to take

1:30:30

control of your online life and your data

1:30:33

and build the life that you want.

1:30:35

And that includes going out and doing

1:30:38

other stuff sometimes.

1:30:39

So, yeah.

1:30:42

I don't know if I have any underrated

1:30:44

or productive...

1:30:46

Also that,

1:30:46

I just want to throw that out there.

1:30:47

Personally,

1:30:47

I'm a really big fan of like being

1:30:49

productive and self-improvement and stuff

1:30:50

like that.

1:30:51

So obviously not everything has to be,

1:30:52

like when I'm playing video games,

1:30:53

that's not always productive, right?

1:30:55

But it's fun and it relaxes you.

1:30:57

So, yeah.

1:31:01

Yeah, I think it's good to remember,

1:31:02

you know,

1:31:03

not everything you do

1:31:05

has to be productive i think being

1:31:07

unproductive a little bit you know and

1:31:10

doing things that aren't actually you're

1:31:11

not going anywhere like you're just doing

1:31:14

something for the sake of it it's like

1:31:15

kind of the point of being human right

1:31:17

like we're not here just to produce

1:31:20

and make things and, uh, make money

1:31:23

and work, you know. I think people need

1:31:25

to also take time and be, and do

1:31:28

things like you know nature like gaming

1:31:30

and all these other hobbies that people

1:31:32

have put here um

1:31:35

But I think, you know,

1:31:37

taking time to be unproductive can help

1:31:40

you be more productive.

1:31:42

I think taking a break,

1:31:43

taking rest is kind of important.

1:31:46

And, you know,

1:31:47

I guess I'll throw in a couple of

1:31:49

extra ones.

1:31:52

I do think exercise is pretty important.

1:31:55

It's pretty good for your health as well.

1:31:58

It's productive, I guess,

1:31:59

because you are becoming healthier.

1:32:02

I think people should...

1:32:04

if you're able, uh, exercise regularly,

1:32:07

you know, it's an important thing.

1:32:08

I think it doesn't really achieve any

1:32:13

particular goal.

1:32:13

It just is, you know,

1:32:15

it can be any sort. Exercise is

1:32:16

important.

1:32:17

Um, yeah, I mean, I think it's,

1:32:22

there's plenty of different, uh,

1:32:23

things you can do.

1:32:24

I enjoy photography, like in my free time,

1:32:27

stuff like that.

1:32:28

I think art stuff is also important,

1:32:29

gets your brain going.

1:32:31

Um,

1:32:33

but I do think it is important to

1:32:36

not make everything in your life about

1:32:40

securing your privacy and like about this

1:32:43

one topic.

1:32:44

Cause that's, uh,

1:32:46

that's one way you're going to get burnt

1:32:48

out.

1:32:48

That's actually covered in the activism

1:32:50

section we recently launched.

1:32:52

Um, so definitely check that out.

1:32:54

Um, but I think, you know, it's,

1:33:00

Yeah, it's an interesting thread.

1:33:01

Maybe go over there and drop your favorite

1:33:04

thing you like doing.

1:33:05

I think it's nice to have these off

1:33:06

topic forum threads sometimes because I

1:33:09

feel like every thread is just like so

1:33:12

draining.

1:33:13

Like there's just every day,

1:33:14

there's just a new story of like,

1:33:17

the absolute worst thing happening.

1:33:20

Um,

1:33:21

and sometimes it's good to disconnect a

1:33:24

little bit.

1:33:24

Maybe that means not actually going on the

1:33:26

privacy guides forum for a day, you know,

1:33:29

taking a break.

1:33:30

Um, it's definitely helpful.

1:33:33

Um, and yeah,

1:33:34

I think it would be more productive if

1:33:36

you take more breaks.

1:33:37

Um, everyone needs days off.

1:33:39

Yeah.

1:33:41

A hundred percent.

1:33:42

Um, yeah, I mean,

1:33:44

I don't really have much more to add.

1:33:45

Do you have anything?

1:33:48

I don't think so.

1:33:49

I was going to say we could probably

1:33:50

move into viewer questions now,

1:33:55

which I think we've kind of been answering

1:33:57

them as we went in the live chat,

1:33:59

right?

1:34:00

Have we missed any that we haven't covered

1:34:01

yet?

1:34:04

Um,

1:34:04

I think there was just people kind of

1:34:07

sort of making comments here.

1:34:09

Not really any questions per se.

1:34:12

We did kind of talk about a lot

1:34:13

of stuff, uh,

1:34:15

that was already covered in a lot of

1:34:16

these points.

1:34:17

Like someone mentioned.

1:34:19

Ninety-nine percent of what these browser

1:34:22

AI things do can be replicated in a browser.

1:34:26

Um,

1:34:27

and browsers are less permission heavy.

1:34:29

So like using an AI app is kind

1:34:31

of useless.

1:34:33

Yeah, I agree.

1:34:36

They need a light version of the iPhone.

1:34:41

Maybe.

1:34:41

Oh, no, no, no.

1:34:41

They're talking about a uBlock Origin

1:34:43

light.

1:34:43

There's the uBlock Origin light for

1:34:45

iPhone,

1:34:45

which I think actually we did add back

1:34:47

to the website.

1:34:51

I think we talk about it on our

1:34:53

iOS section, if I remember correctly.

1:34:55

I think you're right.

1:34:56

Yeah.

1:34:56

Um, and AdGuard,

1:34:57

I think those are the two recommendations

1:34:59

still, cuz AdGuard does, uh,

1:35:01

it does still protect web apps and things.

1:35:04

So that is a good point.

1:35:05

Sorry.

1:35:05

I missed that.

1:35:06

I kind of misunderstood that comment.

1:35:09

Um, but yeah,

1:35:11

was there any comments from members on our

1:35:13

forum thread this week?

1:35:16

So, yeah, I, I think I passed it.

1:35:21

Um,

1:35:22

Yeah, we did have not too many.

1:35:25

I know one question we got was about...

1:35:28

I don't know if this person's watching

1:35:30

right now, but somebody asked us,

1:35:32

is it possible to provide a list of

1:35:33

news articles that the stream will go over

1:35:35

in advance?

1:35:36

Just to give you guys a little peek

1:35:37

behind the scene, the short answer is no.

1:35:39

Because what happens is,

1:35:42

and I think I may have said this

1:35:43

before, is throughout the week,

1:35:45

we kind of collect articles that we may

1:35:47

want to talk about.

1:35:48

And we try to keep it to four

1:35:50

to six articles on average.

1:35:53

And so...

1:35:54

we kind of wait until Friday, and that's

1:35:57

when we go over, like, okay, what are

1:35:59

the main things we really, really want to

1:36:01

talk about, and what are the things that

1:36:03

we can, um, drop off to, you know,

1:36:06

uh, like the news feed or the news

1:36:08

section. Um, thankfully we do have the news

1:36:10

section where even if we don't cover an

1:36:11

article here, we might still write about it

1:36:13

there. So, uh, and sometimes we do both,

1:36:15

but

1:36:16

Yeah.

1:36:17

Uh, a lot of the time,

1:36:17

like we're not, uh, we're,

1:36:20

we're still like Friday afternoon.

1:36:23

Um, us time we're,

1:36:24

we're still like putting this stuff

1:36:25

together.

1:36:25

So unfortunately that's not really doable

1:36:28

in advance.

1:36:29

And to also add to that as well,

1:36:30

sometimes we're like, you know,

1:36:32

it's Thursday morning and we're like still

1:36:34

trying to work out what the highlight

1:36:36

story is because sometimes there's just

1:36:38

not that much going on.

1:36:39

Like, you know,

1:36:40

we can't really release the newsletter if

1:36:42

we don't even know what the highlight

1:36:43

story is going to be.

1:36:44

Um, so we're sorry that that's,

1:36:47

that it's kind of frustrating, I guess.

1:36:49

Um, but you know, we've been like,

1:36:51

Nate's been doing a great job with like,

1:36:53

we published the newsletter as soon as the

1:36:56

live stream starts.

1:36:57

Like if you check your inbox,

1:36:58

like it'll be there.

1:37:00

Um, so I would, if you're worried,

1:37:03

if you want to know what we're talking

1:37:04

about on the live stream,

1:37:05

then that'll be the best place to see

1:37:07

that.

1:37:08

Um,

1:37:09

I did drop a link in the forum

1:37:10

thread there.

1:37:11

Um,

1:37:12

but if you do want to sign up

1:37:13

it's just privacyguides.org forward slash

1:37:16

live stream and if you press the donate

1:37:17

button in the bottom right and you select

1:37:20

free on that so you don't have to

1:37:22

pay money to join the newsletter or

1:37:24

anything you'll get the update

1:37:27

notifications for the live stream and that

1:37:29

includes all the links and also like some

1:37:32

small summaries of the stories as well so

1:37:35

if you want to follow along while we're

1:37:37

talking on the live stream you can

1:37:39

get that to your inbox.

1:37:41

It also goes live onto the website

1:37:43

eventually, but let's see,

1:37:45

is it on there right now?

1:37:47

Yeah, it looks like it is.

1:37:48

It should be, yeah,

1:37:49

because when I publish it,

1:37:51

I choose publish and email,

1:37:52

so it should go to both the website

1:37:53

and the... So yeah,

1:37:55

if you prefer to use RSS for some

1:37:56

reason,

1:37:56

you can subscribe to that section and

1:37:58

that'll pop up in your RSS feed as

1:37:59

soon as we publish it.

1:38:02

Looks like we got a comment from

1:38:04

Cannabida.

1:38:05

Do you recommend any books that are not

1:38:07

explicitly about privacy,

1:38:09

but privacy adjacent?

1:38:12

That is a very good question.

1:38:14

I know the answer is yes,

1:38:15

but I'm struggling to remember what they

1:38:17

are because I know there's been a few

1:38:18

books that I've read and I'm like,

1:38:19

I kind of want to add this over

1:38:21

on The New Oil as a recommended book,

1:38:23

but it's not really privacy related per

1:38:25

se.

1:38:26

And now I'm trying to remember what they

1:38:27

were.

1:38:28

I feel like Enshittification, uh,

1:38:31

by Cory Doctorow is a good one.

1:38:32

Like that's,

1:38:33

I just bought that one the other day.

1:38:34

I'm waiting for it to ship.

1:38:36

Nice.

1:38:37

Yeah.

1:38:37

That's a, that's a definite,

1:38:38

that's like one that's it's not

1:38:40

technically about privacy.

1:38:41

It's just like, you know,

1:38:43

adjacent big tech being awful kind of

1:38:46

explaining that whole process.

1:38:48

Um, Hmm.

1:38:50

Ooh, Andy Greenberg,

1:38:52

who I think actually wrote one of the

1:38:53

articles we covered today,

1:38:55

or maybe one of the ones we were

1:38:56

considering.

1:38:57

But he's a writer for Wired,

1:38:59

and he's written quite a few.

1:39:00

Like, Sandworm is really good,

1:39:02

and that's about Russia's state hacking

1:39:05

group.

1:39:07

He's written Tracers in the Dark,

1:39:10

which is...

1:39:11

Um,

1:39:12

it's divided into four sections and the

1:39:13

last section is about finding people who

1:39:17

host CSAM websites on the dark web.

1:39:18

So just fair warning.

1:39:20

That was a tough read.

1:39:22

Um, the first three parts are great.

1:39:24

That last part was a little rough to

1:39:25

get through.

1:39:26

Um, yeah,

1:39:28

he's written a couple of books that I

1:39:29

wouldn't say are like directly privacy

1:39:30

related.

1:39:31

Cause again,

1:39:31

they're about like cyber crime and state

1:39:34

hackers,

1:39:34

but they're very interesting and they're,

1:39:36

they're adjacent for sure.

1:39:40

Yeah, I mean, this, I feel like you

1:39:42

have quite a few different options to pick.

1:39:46

Uh, maybe, you might have to, I reckon,

1:39:48

if you go to, like, Cory Doctorow's stuff,

1:39:51

he probably has, like, a bunch of books

1:39:52

that are semi-related to this whole thing,

1:39:55

right? I think

1:39:56

He's a good person to look at.

1:39:58

But I don't know.

1:39:59

Yeah, I can't really think of too many.

1:40:00

I know there's like quite a few books

1:40:02

about like sort of the AI stuff that's

1:40:04

going on now.

1:40:05

I saw those like one on my timeline

1:40:07

the other day, The AI Con.

1:40:10

That's also an interesting one.

1:40:11

I can't really think of too many other

1:40:13

non privacy related books.

1:40:16

I can think of a lot of privacy

1:40:17

related books,

1:40:18

but just not like somewhat outside that.

1:40:23

I haven't read it,

1:40:24

but on the topic of AI,

1:40:25

I've heard a lot of good things about

1:40:28

If Anyone Builds It, Everyone Dies,

1:40:32

which is about the quest to build AGI,

1:40:34

artificial general intelligence.

1:40:36

So I haven't read it,

1:40:37

but I've heard a lot of good things.

1:40:41

Yeah.

1:40:42

I can't really think of too much here,

1:40:44

too much more.

1:40:45

But yeah,

1:40:46

was there any other things you were

1:40:48

thinking on the forum thread here?

1:40:51

The last thing I wanted to mention that

1:40:52

you did,

1:40:53

we mentioned it in the site updates,

1:40:55

but somebody asked us to go over the

1:40:59

homomorphic encryption story from Freya and

1:41:02

just kind of explain it.

1:41:04

Please go over the story.

1:41:06

For more people to understand it simply

1:41:08

put,

1:41:08

I think it's important to know and follow.

1:41:10

So homomorphic encryption,

1:41:12

and this is grossly oversimplified,

1:41:14

but it's basically a way,

1:41:16

and it's a real thing.

1:41:17

It's not just theoretical.

1:41:18

It's a way to process data on a

1:41:21

remote server in a way where it's still

1:41:25

encrypted and the server can't see your

1:41:27

data.

1:41:28

So hypothetically, like right now,

1:41:31

let's use Google and Proton as an example,

1:41:33

right?

1:41:34

Google...

1:41:36

and I might have this wrong, but correct.

1:41:38

Well, this part, I know I'm right.

1:41:39

Google,

1:41:39

you put your stuff on their server,

1:41:41

you interact with it,

1:41:41

but Google can see it.

1:41:43

Proton,

1:41:44

a lot of it has to be decrypted

1:41:46

in your browser.

1:41:48

So it tends to be a little bit

1:41:49

slower because of that delay.

1:41:52

Homomorphic encryption would be a way

1:41:54

where it can still stay on the server

1:41:57

and you can work with it in real

1:41:58

time, but it would still be private.
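To make that "compute on data while it stays encrypted" idea concrete, here is a tiny sketch of an additively homomorphic scheme, a textbook Paillier setup with deliberately tiny, insecure parameters (real systems like the ones discussed here use lattice-based fully homomorphic schemes, but the ciphertext-side computation property is the same):

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Illustrative only -- tiny primes, no padding, not secure.
import math
import random

p, q = 17, 19                      # toy primes (real keys use ~1024-bit primes)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)       # Carmichael function of n
g = n + 1                          # standard simplified generator choice
mu = pow(lam, -1, n)               # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2 for a random r coprime to n."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """D(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

def add_encrypted(c1: int, c2: int) -> int:
    """Multiplying ciphertexts adds the plaintexts -- without decrypting."""
    return (c1 * c2) % n2

c1, c2 = encrypt(5), encrypt(7)
total = add_encrypted(c1, c2)      # server side: sees only ciphertexts
print(decrypt(total))              # 12
```

With this property a server could, say, sum encrypted values without ever seeing them; the performance problem mentioned on the stream comes from doing general computation, not just addition, under encryption.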

1:42:01

And I think it's designed more for...

1:42:04

Oh, man,

1:42:05

I can't think off the top of my

1:42:06

head.

1:42:06

But I know it's not designed for things

1:42:07

like Proton, where it's like, oh,

1:42:08

you can take that little performance hit.

1:42:11

It's got very specific use cases.

1:42:15

But the big problem is,

1:42:17

and I don't know if this is an

1:42:20

exaggeration or not, but...

1:42:23

Freya wrote here that it's thousands of

1:42:25

times slower than processing the data

1:42:27

normally.

1:42:27

And I don't think that is an exaggeration.

1:42:31

So literally,

1:42:33

just to give a tiny bit more context,

1:42:36

Proton mentioned this when they talked

1:42:37

about Lumo.

1:42:37

And they were trying to figure out how

1:42:39

they wanted to make Lumo private.

1:42:41

And they mentioned that they had

1:42:42

entertained the idea of homomorphic

1:42:44

encryption,

1:42:45

except it would literally take about ten

1:42:46

minutes to get an answer back from your

1:42:48

prompt.

1:42:49

So like you type in your prompt,

1:42:51

you go make coffee.

1:42:52

Don't even just get a new cup,

1:42:53

just make a whole new pot of coffee.

1:42:55

And then you come back and hopefully your

1:42:57

prompt will be ready for you.

1:42:58

So it's not really feasible.

1:43:00

It's not practical for most applications,

1:43:03

but Intel released this new chip that

1:43:06

they're calling Hercules.

1:43:08

And across seven key operations,

1:43:12

Hercules was one thousand to five thousand

1:43:14

times as fast.

1:43:16

So it's still not quite there.

1:43:21

Freya does talk about some of the

1:43:22

challenges that are still facing

1:43:24

homomorphic encryption here.

1:43:25

But it is definitely really cool that

1:43:27

we've seen such a major jump on this

1:43:31

technology.

1:43:32

Because if they can get it up to

1:43:34

a more usable speed,

1:43:35

that really would be a game changer.

1:43:40

I don't want to compare it to nuclear

1:43:41

fusion or cold fusion or whatever it is

1:43:43

because that's one of those things that

1:43:44

it's like, oh, at this point,

1:43:46

some people aren't even sure it's possible

1:43:48

because it's so far away.

1:43:50

But it is one of those holy grail

1:43:53

kind of things that it's like, man,

1:43:54

if we could do this,

1:43:55

it would solve a lot of

1:43:58

potential privacy problems.

1:44:00

Although I do feel compelled to point out

1:44:01

that at that point,

1:44:02

the challenge would be getting companies

1:44:04

to use it, as we're seeing Meta roll back

1:44:06

end-to-end encryption.

1:44:07

So there's already a lot of solutions that

1:44:09

people just don't feel like using,

1:44:10

but it would be nice to have this

1:44:11

in our toolkit too, because again,

1:44:12

there are specific use cases for it where

1:44:15

I think people would readily use it.

1:44:16

It's just not where we need it to

1:44:18

be right now.

1:44:19

So yeah.

1:44:21

I think, you know,

1:44:24

I've got to be the AI hater on

1:44:26

the podcast.

1:44:27

So I'm going to say, you know,

1:44:29

if you do read the link,

1:44:31

if you look at the link that Freya

1:44:33

linked with this chip that they're working

1:44:37

on, it does still mention, like,

1:44:39

when they use this homomorphic encryption,

1:44:42

it basically...

1:44:43

significantly increases the amount of

1:44:45

memory that's used.

1:44:47

And I don't know if you're aware of

1:44:49

the global RAM shortage,

1:44:52

the global computer component shortage.

1:44:55

I feel like we don't need to make

1:44:57

it any worse by doing this,

1:44:59

by doing this homomorphic encryption

1:45:01

thing.

1:45:02

I think, you know,

1:45:06

I would push, you know,

1:45:07

I don't recommend that you use these AI

1:45:10

tools.

1:45:11

I mean, if you have to though,

1:45:12

if you absolutely have to,

1:45:13

there's local options,

1:45:15

but I think one interesting thing is

1:45:17

this sort of homomorphic encryption thing,

1:45:20

or,

1:45:21

I guess it's like trusted computing,

1:45:23

I guess.

1:45:24

It's sort of,

1:45:25

I feel like this is a similar thing.

1:45:27

Um, but yeah,

1:45:28

Freya mentions that in the article.

1:45:30

Okay.

1:45:30

Right.

1:45:31

Yeah.

1:45:31

So basically the,

1:45:33

there was a VPN service that was doing

1:45:35

this through Intel's SGX system to

1:45:38

protect,

1:45:38

basically it would be an additional layer

1:45:41

because when you trust a VPN service,

1:45:44

you basically have to trust that they're

1:45:47

not gonna log your traffic or they're

1:45:49

gonna, you know,

1:45:51

because there has to be

1:45:52

processing that's done to actually

1:45:56

facilitate the connection between you and

1:45:58

the VPN server.

1:46:00

So that can't be encrypted.

1:46:01

But there was this VPN company that was

1:46:04

saying that's what they were doing.

1:46:06

They were using like an Intel SGX like

1:46:09

secure enclave system.

1:46:10

So like basically no one would be able

1:46:12

to get access to it.

1:46:12

It would be in like a trusted platform

1:46:15

thing.

1:46:18

It's also interesting because I feel like

1:46:20

Apple was also pushing this sort of thing.

1:46:23

They're like doing their Private Cloud

1:46:25

Compute system.

1:46:27

Hello, Apple, where is it?

1:46:29

It's like,

1:46:31

this seems like a similar technology

1:46:33

thing.

1:46:34

Like it seems like a very similar thing,

1:46:36

except, you know, they're not using Intel,

1:46:37

they're using Apple Silicon instead,

1:46:39

which I think gives them an edge really,

1:46:42

because they're not relying on a third

1:46:44

party company like Intel.

1:46:48

Like, you know, if you,

1:46:50

they can do everything in house,

1:46:51

like firmware's in house,

1:46:53

the silicon's made in house.

1:46:55

I still think that they use fabricators,

1:46:58

but they're like a,

1:47:00

what do you call that?

1:47:03

I don't know.

1:47:03

They don't fabricate the silicon

1:47:05

themselves.

1:47:06

They outsource it, I believe.

1:47:07

But yeah,

1:47:09

it kind of puts them in a better

1:47:10

position to do that.

1:47:12

But that still hasn't really appeared.

1:47:16

I don't know what's going on with the

1:47:18

Private Cloud Compute thing.

1:47:19

I think it's an interesting topic to keep

1:47:22

an eye on.

1:47:22

But I think, you know,

1:47:23

like Freya was saying,

1:47:26

the constraints of this are too...

1:47:29

are too high. It can't do enough.

1:47:31

But maybe this could be

1:47:33

used, you know, this technology could be

1:47:37

used in a specific application like a VPN

1:47:39

where it doesn't need as much processing

1:47:41

power. I'm not sure, but I think it's

1:47:44

definitely an area that, you know, privacy

1:47:47

advocates should keep an eye on because

1:47:49

this is technology that could be used

1:47:53

in a positive way, hopefully not for AI,

1:47:55

but if it's used for AI, I mean,

1:47:58

I hope it offers some sort of extra

1:48:00

privacy protection.

1:48:02

Um,

1:48:03

I think one concern a lot of people

1:48:04

have is their prompts being used for

1:48:06

training data.

1:48:07

If it wasn't in a secure SGX, like,

1:48:10

or I guess,

1:48:11

what are they calling this one?

1:48:12

They're calling it the,

1:48:13

the trusted execution environment.

1:48:17

fully homomorphic encryption chip in a

1:48:20

trusted security environment or whatever

1:48:22

Nate said.

1:48:24

Yeah, like, yeah, I don't know.

1:48:26

That would be better than people just

1:48:29

giving their data straight to OpenAI.

1:48:30

But I feel like the interest of these

1:48:34

big companies is not in protecting

1:48:36

people's privacy.

1:48:37

They like to slurp up your data for

1:48:39

training.

1:48:41

Um,

1:48:41

so I'm not sure this can maybe become

1:48:44

more popular on like a niche product like

1:48:46

Proton,

1:48:46

but I don't think OpenAI or Google

1:48:49

Gemini is gonna sacrifice their speed,

1:48:53

their processing power just for, you know,

1:48:56

protecting their

1:48:57

users' privacy.

1:48:59

I don't think.

1:49:02

Yeah, I mean, not to be overly optimistic,

1:49:06

but I think the thing that makes me

1:49:07

excited about this kind of stuff is that

1:49:09

it's another step forward, right?

1:49:11

Like, yeah,

1:49:11

it's still not ready in this state.

1:49:14

It's still too slow, and there's...

1:49:16

What did they say at the end here?

1:49:18

Uh...

1:49:19

For FHE to take off,

1:49:20

there needs to be support at all levels.

1:49:22

And then there's a company that focuses

1:49:24

more on the software side of things.

1:49:26

There's another company that's looking to

1:49:28

move away from the limits of traditional

1:49:29

computers and utilize photonics,

1:49:31

computing with light to speed up FHE even

1:49:33

more.

1:49:34

So there's still a lot to be done

1:49:38

and different people trying to tackle it.

1:49:40

I think...

1:49:41

What I like about it is just the

1:49:43

fact that it is a step forward because

1:49:45

FHE, yeah, I mean, it's,

1:49:48

I think we both kind of said the

1:49:48

same thing that like,

1:49:49

there's no guarantee that companies will

1:49:50

use this.

1:49:51

And Freya did even specifically mention

1:49:53

like AI, you know,

1:49:54

they said it could maybe be the case

1:49:56

that in a few years,

1:49:57

it'll be the norm to make a fully

1:49:58

end-to-end encrypted query to Google or

1:50:00

ask ChatGPT for dinner ideas in a

1:50:02

fully end-to-end encrypted manner.

1:50:04

But even if we get to a point

1:50:06

where it's like, yeah,

1:50:07

the resource usage is minimal,

1:50:08

the slowdown is minimal,

1:50:10

this is totally economically feasible,

1:50:13

will it still be economically feasible for

1:50:16

the company who collects all your data?

1:50:19

Which at that point, I think,

1:50:21

this is kind of a different discussion,

1:50:22

but I think some...

1:50:25

I think there has been a rise in

1:50:28

people caring about privacy.

1:50:30

You can tell in the marketing.

1:50:32

Everybody's always saying, like, oh,

1:50:34

we care about your privacy with this

1:50:36

product, even if they don't.

1:50:37

They say they do.

1:50:39

We give you the option to opt out.

1:50:41

We don't train on your prompts.

1:50:43

Companies say that stuff,

1:50:44

which to me tells me that there are

1:50:46

people who

1:50:47

are concerned about this stuff and maybe

1:50:48

don't know as much as they should.

1:50:49

Maybe don't understand when the company's

1:50:52

lying when they say that or how to

1:50:53

tell if the company's lying.

1:50:54

But the point is,

1:50:55

I think there will be some people who

1:50:58

like, you know,

1:50:59

for all the crap we give Apple,

1:51:00

I could totally see Apple if this became,

1:51:03

again, economically feasible,

1:51:04

Apple being like, yeah, let's do this.

1:51:06

And it's like,

1:51:08

now that Apple's doing it,

1:51:10

Google's got to keep up or somebody's got

1:51:11

to keep up.

1:51:12

So they'll always try to find a way,

1:51:15

just to be clear,

1:51:16

they'll always try to find a way to

1:51:17

do the bare minimum.

1:51:18

So even if Apple or anybody,

1:51:21

if anybody were to roll this out,

1:51:23

there will be other companies who are

1:51:24

like, yeah,

1:51:25

we encrypt your stuff at rest and we

1:51:27

say that it's encrypted.

1:51:29

We already see that right now, right?

1:51:30

We see that with Apple.

1:51:32

Like companies saying, oh,

1:51:33

we secure your stuff with military-grade

1:51:34

encryption, which means nothing.

1:51:36

And it's just a marketing thing while

1:51:38

they're doing the bare minimum.

1:51:39

It's like, yeah,

1:51:39

you use passwords and TLS.

1:51:41

Nobody's impressed.

1:51:42

But I don't know.

1:51:43

My point being, it's definitely a

1:51:45

different set of obstacles to get over,

1:51:47

but it's still nice to see

1:51:49

that this is taking steps forward, um,

1:51:52

and even becoming an option in the first

1:51:53

place,

1:51:53

because that's really the first step,

1:51:55

right?

1:51:56

It has to be usable so that

1:51:58

people can use it.

1:51:58

And then hopefully from there it'll become

1:52:00

adopted.

1:52:00

But at that point we're speculating and my

1:52:03

crystal ball is currently in the shop.

1:52:04

So I cannot predict the future.

1:52:08

Yeah, but yeah,

1:52:09

that's pretty much all I had to comment

1:52:11

on that one.

1:52:12

I mean,

1:52:12

hopefully that is a useful discussion for

1:52:15

you to understand it a bit better.

1:52:16

I hope we explained it well enough and

1:52:20

at least cut through some of the hype

1:52:22

because definitely is a little bit hyped,

1:52:24

I think.

1:52:25

But yeah.

1:52:29

Yeah, definitely.

1:52:30

It's a complicated topic.

1:52:32

So we like severely dumbed it down,

1:52:34

but hopefully that did help.

1:52:36

But I think that's everything we had for

1:52:39

this week.

1:52:40

So thank you guys for watching.

1:52:42

All the updates from This Week in Privacy

1:52:44

that we just talked about will be shared

1:52:45

on the blog every week.

1:52:46

So go ahead and sign up for the

1:52:48

newsletter or subscribe with your favorite

1:52:50

RSS reader if you want to stay tuned.

1:52:52

If you are an audio listener,

1:52:53

we have this podcast available on

1:52:55

all podcasting

1:52:56

platforms and RSS as well.

1:52:59

And the video itself will be synced to

1:53:00

PeerTube, so stay tuned for that.

1:53:02

Privacy Guides is an impartial nonprofit

1:53:05

organization that is focused on building a

1:53:06

strong privacy advocacy community and

1:53:09

delivering the best digital privacy and

1:53:11

consumer technology rights advice on the

1:53:12

internet.

1:53:13

If you want to support our mission,

1:53:14

then you can make a donation on our

1:53:15

website, privacyguides.org.

1:53:18

To make a donation,

1:53:18

click the red heart icon located in the

1:53:20

top right corner of the page.

1:53:22

You can contribute using standard fiat

1:53:23

currency via debit or credit card,

1:53:25

or you can donate anonymously using Monero

1:53:27

or your favorite cryptocurrency.

1:53:29

Becoming a paid member unlocks exclusive

1:53:31

perks like early access to video content

1:53:33

and priority during the This Week in

1:53:35

Privacy livestream Q&A.

1:53:36

You'll also get a cool badge on your

1:53:37

profile in the Privacy Guides forum and

1:53:39

the warm,

1:53:39

fuzzy feeling of supporting independent

1:53:41

media.

1:53:42

Thank you all so much for watching,

1:53:44

and we will be back next week.

1:53:47

See you next week.