Claude Code Leaked Its Own Source Code
Ep. 47

Claude Code Leaked Its Own Source Code

Episode description

Anthropic’s Claude Code just leaked its own source code, ONLYOFFICE suspends Nextcloud partnership, LinkedIn scans your browser extensions, and more. Join us for This Week In Privacy #47!

Download transcript (.vtt)
0:02

Hi, folks,

0:03

we got a lot to talk about.

0:05

Claude's source code was leaked.

0:07

LinkedIn scrapes your browser extensions.

0:10

There's a horribly insecure messenger app

0:13

going around and more.

0:15

All this and more coming up this week

0:17

in on this week in privacy number forty

0:19

seven.

0:19

So stay tuned.

0:43

Welcome back to This Week in Privacy,

0:45

our weekly series where we discuss the

0:47

latest updates with what we're working on

0:48

in the PrivacyGuides community and this

0:50

week's top stories in data privacy and

0:52

cybersecurity.

0:54

I am Nate,

0:54

and joining me again this week is Jonah.

0:57

How was your week, Jonah?

0:58

You know, my week has been pretty good,

1:00

thanks for asking.

1:02

Besides misspeaking during the intro

1:04

there.

1:05

Can't complain.

1:06

A lot of these things happen, right?

1:09

Yes, yes.

1:12

All righty.

1:13

Yeah.

1:14

I guess with that,

1:14

we'll jump right into our headline story

1:17

this week.

1:18

And you guys have probably heard about

1:21

this one.

1:22

So there's an AI called Claude,

1:25

Claude Code specifically,

1:26

because there's a few different kinds of

1:28

Claude.

1:31

I'm not a heavy AI user myself,

1:33

so I've heard that Claude is one of

1:36

the better ones in terms of the results

1:38

it puts out are mostly accurate.

1:40

It puts out mostly good code.

1:42

That's just what I've heard.

1:44

You could do a lot worse than that

1:45

one,

1:45

but we're not going to talk about that.

1:47

We're here to talk about the fact that

1:48

Claude code had its source code leaked

1:52

thanks to some human error.

1:55

To clarify,

1:57

this is the source code for the app

1:59

itself, the Claude Code CLI,

2:03

not like the models or anything like that.

2:05

But it still gives us a little bit

2:06

of insight into what's going on under the

2:10

hood.

2:11

And...

2:13

I guess I'll go over it a little

2:14

bit, but I'm also mostly going to,

2:16

I mean, we're a privacy podcast, right?

2:17

So we're going to focus mostly on the

2:18

privacy and security stuff.

2:20

But just to kind of give you a

2:21

little bit of a recap.

2:22

So this happened because when they

2:25

published the newest version of the NPM

2:27

package, there was a source map file,

2:30

which I'll be honest,

2:31

that's technical stuff that goes over my

2:33

head.

2:33

But basically it allowed clever people who

2:36

noticed it

2:37

to access the source code.
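
(A quick aside for readers who want the mechanics: a source map is a JSON file that bundlers emit alongside minified JavaScript so debuggers can map it back to the original files. Crucially, its "sourcesContent" field can embed every original source file verbatim. Below is a minimal TypeScript sketch, not Anthropic's actual tooling, of how anyone holding a published .map file can dump those originals; the file name cli.js.map and the "recovered" output folder are hypothetical.)

    // Recover the original sources embedded in a source map's sourcesContent.
    import { mkdirSync, readFileSync, writeFileSync } from "node:fs";
    import { dirname, join } from "node:path";

    interface SourceMap {
      sources: string[];          // original file paths
      sourcesContent?: string[];  // original file contents, if embedded
    }

    // "cli.js.map" is a hypothetical file name for illustration only.
    const map: SourceMap = JSON.parse(readFileSync("cli.js.map", "utf8"));

    map.sources.forEach((src, i) => {
      const content = map.sourcesContent?.[i];
      if (content == null) return; // nothing embedded for this entry
      const out = join("recovered", src.replace(/^(\.\.\/)+/, ""));
      mkdirSync(dirname(out), { recursive: true });
      writeFileSync(out, content);
    });

(Which is why the standard advice is to strip .map files, or at least their sourcesContent, before publishing a production npm package.)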

2:38

Like we said,

2:38

it was almost two thousand TypeScript

2:41

files and more than five hundred and

2:42

twelve thousand lines of code.

2:44

I saw somebody else round up to five

2:45

hundred and thirteen thousand.

2:46

So, yeah.

2:49

I mean,

2:50

it's one of those once the cat's out

2:51

of the bag things, right?

2:52

Or once the horse has left the barn.

2:54

Because everybody quickly went and

2:58

downloaded this,

2:58

and other repos are springing up,

3:02

which we'll talk about that in a second.

3:04

Anthropic tried to get some of them taken

3:05

down with a DMCA takedown,

3:07

a copyright thing, basically.

3:09

Unrelated, we're not going to talk about it,

3:11

but you know, there was a whole thing where

3:13

GitHub interpreted that DMCA,

3:16

according to the official story,

3:18

a little harshly, and

3:20

took down even things that were not

3:21

supposed to be taken down. But yeah, it's

3:23

been a whole thing.

3:25

So I've also seen some pretty polarizing

3:29

takes here because I think it was this

3:30

article.

3:30

Yeah,

3:31

this article said that its sophistication

3:33

is, quote, both inspiring and humbling,

3:35

according to some people who looked at the

3:36

code.

3:37

I saw some people on Mastodon look at

3:38

the code and say that it was pretty

3:39

sloppy and kind of shocking that it was

3:41

so bad.

3:42

But I mean, to be fair,

3:43

Mastodon tends to be a pretty anti-AI

3:45

crowd.

3:45

So I don't know who's telling the truth

3:46

there, but.

3:47

Yeah.

3:49

So, and then real quick,

3:51

before I jump into an analysis part,

3:54

we have like a follow-up to this story

3:56

that's related that says,

3:56

Claude Code leak used to push InfoStealer

3:58

malware on GitHub.

4:00

And this one comes from Bleeping Computer.

4:02

Basically, once the leak was out there,

4:05

a lot of people started...

4:07

putting up their own GitHub repos where

4:09

they would advertise that this was Claude

4:11

Code with all the paywalled stuff removed,

4:14

basically.

4:15

So, free premium Claude Code.

4:17

And they would game the SEO to make

4:19

sure that it would show up in the

4:20

front.

4:20

If y'all are watching the video,

4:22

you can see here,

4:23

this one outlined in red is like the

4:25

third result from the top on Google.

4:26

And this is one of the malicious ones

4:28

that the article focused on.

4:30

And yeah, turns out, shocker,

4:33

it includes an InfoStealer malware.

4:36

I'm going to go out on a limb.

4:36

The article didn't say this,

4:37

but I'm going to go out on a

4:38

limb and say that it did work once

4:40

you fired it up.

4:41

Because usually that's how it is, right?

4:43

It works,

4:43

so you don't think anything's wrong.

4:45

But when you install it,

4:46

it's actually got that InfoStealer in

4:47

there.

4:48

And they said that there were multiple

4:49

repos like this.

4:50

So...

4:52

Yeah. So, cybersecurity takeaways from

4:54

this. We're covering this as a headline

4:57

story partially because this is a really

4:58

big story going around, right? But there's

5:00

a couple reminders here. One of

5:02

them is, as far as the

5:05

repo thing goes, you know, we always

5:06

talk about making sure you get things from

5:08

an official source. And not to

5:10

go too far out of my way to

5:11

pick on Google here, not like they don't

5:13

deserve it, but you know, we've been

5:15

covering a lot this whole Google

5:16

sideloading story. And you know, Google

5:19

is trying to act like, oh,

5:21

this is all for security, right?

5:23

It's dangerous to get apps from a

5:25

third-party store,

5:26

even though the Play Store has plenty of

5:27

malware on its own.

5:28

But even so,

5:31

the point is, get things from a

5:32

trusted source.

5:35

Signal, for example, does have an APK,

5:37

but it's kind of hard to find.

5:38

But it is okay to get it from

5:39

the Play Store because that's a trusted

5:40

source.

5:41

Um,

5:41

there's also other places that you could

5:42

get the APK directly.

5:44

There are third-party app stores like

5:46

F-Droid,

5:46

which I know have some concerns about

5:48

them.

5:48

But the point is, like,

5:49

this is when I go to download something,

5:51

typically what I do is I go straight

5:53

to the developer's website and I go, okay,

5:54

what are their official channels?

5:56

And then they'll say, you know, it's,

5:57

it's on the Play Store.

5:58

It's on F-Droid.

5:59

It's on GitHub directly.

6:01

And then I'll look at the list and

6:02

decide which one I want to use.

6:03

It's not so much the channel, it's making

6:05

sure it's official.

6:07

Um,

6:08

So maybe don't try to get free Claude

6:12

doing that.

6:13

And then, yeah,

6:14

just the other takeaway I had was the

6:17

whole source code leak thing.

6:19

Anthropic was really quick to own up that

6:21

it was a human error.

6:22

They said here, what was it?

6:25

Yeah,

6:25

this was a release packaging issue caused

6:27

by human error, not a security breach.

6:31

Yeah,

6:31

not like AI doesn't do this kind of

6:32

stuff all the time.

6:33

But, you know,

6:34

it's just remembering that there is...

6:38

remembering the human element in

6:39

everything.

6:39

You know,

6:40

if you listen to any social engineering

6:42

people,

6:42

they're always quick to point out that

6:44

humans tend to be the weakest link in

6:45

any system.

6:46

You know,

6:46

I could spend a lot of time trying

6:48

to,

6:48

if I'm trying to get into a building,

6:49

right?

6:50

I could spend a lot of time trying

6:51

to hack the door code or the card

6:52

readers or whatever,

6:53

Or I could come up with a really

6:55

convincing story for why I need to be

6:56

there,

6:57

usually involving a high-vis vest and a

6:58

clipboard, in my opinion.

7:00

But yeah,

7:01

so I think those are kind of the

7:04

more technical things that I took away.

7:07

Jonah,

7:07

was there anything specific about this

7:09

story that jumped out to you from your

7:12

expertise?

7:13

Yeah, there were a couple things that I

7:14

noticed, and I was trying to find a

7:17

tweet that I saw from somebody else, but

7:19

I couldn't pull it up here. But

7:22

I'll talk about some stuff. Going back to

7:24

what you said about Mastodon, I do think

7:26

it's interesting,

7:28

like the supposed quality or sloppiness of

7:31

the code,

7:32

because I believe Anthropic has said for a

7:34

while that all of their code base is

7:36

now AI-generated by all of their

7:38

developers.

7:39

That does, I think,

7:43

at least bring into question whether you

7:45

can DMCA or copyright any of this code

7:48

at all.

7:49

Maybe you can't because it's all

7:50

AI-generated,

7:51

which AI companies have been pretty firm

7:53

about saying, you know, this is not...

7:56

like a copyright concern at all.

7:59

So it's kind of a taste of their

8:00

own medicine there that all of this is

8:02

out, I think.

8:05

The main thing that I think we see

8:07

in this source code,

8:10

because like you said,

8:11

the models aren't leaked,

8:12

but there is a lot of information about

8:14

the

8:16

system prompts that Claude uses for a lot

8:19

of different tasks,

8:21

which definitely gives a lot of insight

8:23

into how Claude works. And,

8:29

to their competitors,

8:31

I think it gives a lot of insight,

8:33

like how you could make a similar product

8:34

and also

8:35

to people who are trying to do prompt

8:37

injections to bypass some of the

8:38

restrictions in place in Claude Code,

8:41

you can more easily see how they're

8:43

implemented and get around them.

8:45

So I don't know how people are going

8:46

to end up using that,

8:47

but I think that there is a lot

8:48

of opportunity for people to do something

8:54

with it.

8:55

All of the AI stuff, I mean,

8:57

we've talked about it on the show before,

8:59

not...

9:01

the most interesting to me from a security

9:04

or privacy standpoint,

9:05

because, like, Claude Code and all of these

9:08

AI models,

9:09

they're going to run fully in the cloud.

9:11

So they get all of this information.

9:13

I think it is sort of dangerous to

9:15

be using and relying on,

9:16

especially for sensitive information.

9:18

And that hasn't changed from any of this.

9:23

But yeah, it's interesting stuff.

9:24

The tweet that I was trying to pull

9:26

up talked about how

9:31

Anthropic is using their AI to contribute

9:36

security patches to a lot of different

9:38

open source projects.

9:40

And they've been doing that out in the

9:42

open.

9:43

Certainly,

9:44

I've seen a lot of security

9:45

vulnerabilities submitted to

9:47

GitHub from Anthropic.

9:49

I think one of the latest Mastodon

9:51

security vulnerabilities and patches was

9:55

submitted by Anthropic.

9:56

So I believe I've seen contributions to

9:59

that and to Firefox and a lot of

10:01

other open source projects from them.

10:05

Unfortunately,

10:06

I just cannot find this source,

10:07

but maybe I'll be able to pull it

10:08

up later.

10:09

But I saw some information about internal

10:13

tools that Anthropic is using where the

10:15

system prompt is like,

10:16

create these security vulnerability

10:18

patches without giving any indication that

10:22

AI or Claude Code is used at all.

10:25

So it's very specifically told not to

10:29

attribute anything to Claude or Anthropic.

10:31

It's told...

10:34

you know,

10:34

not to include comments that might

10:36

indicate it's AI, et cetera.

10:38

So I think that that's really interesting

10:41

that they are,

10:42

I don't know what cases they're using

10:44

those tools in.

10:46

I would have to find out more information

10:47

about that,

10:48

but I think it's interesting that they are

10:51

doing that.

10:53

Yeah,

10:53

it looks like you pulled up on the

10:54

screen some of the instructions that I

10:56

saw.

10:56

Yeah,

10:56

I found it on another article from Ars

10:58

Technica.

10:58

Yeah,

10:59

I don't know where the original thing is.

11:01

But yeah,

11:01

basically they were saying there's an

11:04

undercover mode.

11:05

So as you can see there...

11:09

they're basically telling Claude that

11:10

they're operating undercover in a public

11:12

open source repository.

11:14

So they can't contain any Anthropic-related

11:17

information.

11:20

I can imagine that's probably used because

11:22

a lot of open source projects are very

11:26

anti-AI contributions and anti-AI pull

11:29

requests and just automatically close

11:31

anything that's AI generated.

11:34

So this is probably a way for them

11:35

to

11:37

try and get around those restrictions.

11:40

Whether that's a good idea for them to

11:43

be doing or not, I guess that's a

11:45

debate, but it seems to be what

11:48

they are doing, and that's kind

11:50

of confirmed with this. So I thought that

11:51

that was fascinating. Yeah, I agree. I

11:56

feel very torn on that, because on

11:58

the one hand,

12:03

there's probably an angle I'm missing

12:04

here.

12:05

On the one hand,

12:06

I understand the idea of like,

12:07

let's just assume they're doing that

12:08

altruistically, right?

12:09

Like we want to make these open source

12:10

projects better.

12:11

We want to make them more secure.

12:13

You know,

12:13

like I don't think at Privacy Guides,

12:16

for example, correct me if I'm wrong,

12:18

we don't typically go out and solicit

12:20

people to like, hey,

12:21

come check out our website and make sure

12:22

all this information is accurate.

12:24

But we totally welcome it if somebody does

12:26

come up and they're like, hey,

12:27

I found an inaccuracy and they report it.

12:30

And I feel like that's kind of what

12:31

they're doing is, you know,

12:32

on the one hand,

12:33

it's like it's still creating a more

12:34

secure project, right?

12:35

Assuming that the bug report is good.

12:37

I know that's historically been a problem

12:40

is a lot of AI slop bug reports

12:42

that aren't really valid and they're not

12:44

really bugs or whatever the case.

12:47

And semi-related,

12:48

but I did see an article earlier this

12:50

week

12:50

that said that actually there's been a

12:52

noticeable increase in quality on AI bug

12:54

reports.

12:55

So maybe they're starting to make some

12:57

progress on that.

12:57

But either way, point being,

12:59

I understand the idea of the end result

13:01

is the same and either way it makes

13:02

the project more secure.

13:03

But it also feels very disrespectful of

13:05

like, if I don't want AI reporting it,

13:09

why would you go out of your way

13:10

to hide that?

13:12

And I don't know,

13:13

it's a really weird thing and I don't

13:14

know how to feel about it.

13:15

But I did see that too.

13:16

That's really strange.

13:18

Yeah, I would be really interested to see

13:20

data on, like, all of the security-related

13:23

pull requests or vulnerability reports

13:25

that Anthropic specifically has reported,

13:28

because I feel like there's two different

13:29

types of AI contributions to these

13:32

projects. I think a lot of them are

13:33

kind of slop contributions, because a lot

13:37

of

13:38

a lot of people in the open source

13:39

space or some students, for example,

13:43

they want to pad their GitHub profiles

13:46

because it looks more attractive to

13:47

developers.

13:48

I see that quite a bit where if

13:50

you can get, like, a PR merged into

13:52

a major project,

13:52

it just kind of

13:54

looks good for you.

13:55

And so I think a lot of people

13:57

are just casting a wide net and just

14:00

submitting a ton of AI slop pull requests

14:02

and hoping that some of them get accepted,

14:03

which is very annoying for open source

14:06

maintainers.

14:07

But on the other hand,

14:08

if Anthropic themselves,

14:10

if they have a legitimate interest in

14:11

improving open source tools,

14:14

which they probably do because a lot of

14:16

these big companies do use these open

14:18

source tools themselves for a lot of

14:20

different reasons,

14:21

I can imagine that

14:24

like, somebody being an engineer at

14:27

Anthropic, being paid to use AI and submit

14:29

these pull requests, might be doing a

14:33

better job, in not just completely

14:35

submitting slop, but using AI to find

14:38

these vulnerabilities and write this code,

14:39

but checking it themselves before

14:41

submitting it and writing, like, explainers,

14:43

because they're getting paid to do this,

14:44

unlike the people who are just, you know,

14:47

rapid-fire submitting

14:49

vulnerability reports and PRs, right?

14:51

I don't know if that's true or not,

14:52

but I would imagine Anthropic would

14:55

probably argue that that's true and would

14:57

probably use that as the reason that

15:00

they're doing this.

15:02

And like I said,

15:03

I have definitely seen AI companies report

15:06

security vulnerabilities that were patched

15:08

to open source projects,

15:09

and some of them were major

15:10

vulnerabilities.

15:10

So there is some merit to the idea

15:14

that AI can find these vulnerabilities

15:18

more easily than, I mean,

15:19

I don't know if it's more easily than

15:20

people who are auditing the code,

15:22

but it certainly is happening.

15:25

So yeah, I mean,

15:27

if all of the reports that Anthropic

15:31

themselves are submitting are accurate and

15:34

worthwhile to fix,

15:38

I don't know if that's necessarily a

15:40

problem.

15:40

But of course, people are

15:42

all along the spectrum of AI and AI

15:45

contributions and AI code specifically.

15:47

So yeah,

15:49

I think that's going to be quite a

15:51

debate in the open source community for a

15:54

while,

15:55

and I don't know how people are going

15:56

to handle that.

16:00

Yeah, I don't know either.

16:03

It seems like one of the better uses

16:04

of AI, in my opinion,

16:05

as opposed to writing songs or putting out

16:10

blog posts.

16:12

It's still just, yeah.

16:14

Like you said, what would be the,

16:17

I wonder what the success ratio is,

16:19

especially from Claude.

16:20

And is there a human review?

16:21

It doesn't sound like it from that,

16:23

that snippet that I shared,

16:24

but that's personally,

16:25

that's where I fall.

16:26

Like,

16:26

I don't mind and I'm not a developer,

16:28

so maybe I just don't understand how,

16:30

how bad the problem is, but like,

16:32

I would imagine,

16:32

I don't mind if AI helps you find

16:34

the vulnerability,

16:35

as long as a human looks it over.

16:37

But yeah, I'm sure there's a lot

16:40

of people that are not doing that,

16:41

unfortunately. So yeah, I mean, we've

16:43

definitely talked about this in the

16:45

Privacy Guides community. When we're

16:47

talking about all these different tools

16:49

that we recommend, people really want to

16:52

see audits, but they're extremely expensive.

16:55

And if AI is not being used

16:58

to, like, write new code, but it's being

17:00

used as, like, a second pair of eyes

17:02

to take a look at all of this

17:03

code, that could be a good thing.

17:08

You know,

17:08

it is not going to be perfectly accurate,

17:12

but if we're being honest,

17:14

all of these security audits that projects

17:15

are paying for are not completely accurate

17:17

or totally thorough either.

17:20

And they're certainly going to be cheaper

17:22

to run AI than have a whole team

17:26

of people auditing this code.

17:28

So while I would imagine it's probably

17:31

going to...

17:34

be worse quality and probably have more

17:37

false positives if you're using AI.

17:39

I do think that doing it and revealing

17:43

some of these vulnerabilities is probably

17:45

better for a lot of open source code

17:47

bases than not doing any sort of audits

17:49

at all and just hoping that the maintainer

17:52

catches all of these bugs.

17:53

So I can definitely see a use case

17:57

here.

17:57

It's a tricky situation.

18:00

Yeah, for sure.

18:01

I know it's not really AI per se,

18:03

but I know, and you probably do too.

18:05

I get the emails from GitHub every once

18:07

in a while.

18:07

That's like, Hey,

18:08

there's a thing that you use NPM or

18:11

whatever,

18:11

and there's like a vulnerability go ahead

18:13

and upgrade.

18:14

So yeah, I would be,

18:16

I don't know.

18:18

I mean,

18:18

mine's just a static website,

18:20

so I can't imagine the damage would be

18:21

too terrible,

18:22

but still, it's nice to get that proactively

18:25

without having to go out and get a

18:26

whole code audit thing.

18:28

So useful stuff.

18:32

But I don't have anything to add to

18:36

that story unless you did.

18:38

Did you want to tell us about this

18:40

next story out of California?

18:42

Yeah.

18:43

So this one was reported by the Los

18:45

Angeles Times.

18:47

Their headline is California bill would

18:49

require parent bloggers to delete content

18:51

of minors on social media.

18:54

Yeah.

18:57

So they have a quote here from somebody

19:00

directly impacted.

19:02

It says,

19:02

as the daughter of a social media

19:04

influencer,

19:04

Kami Barrett says she navigates life

19:06

within a digital footprint she wished

19:08

never existed.

19:09

Everything my mom posted is still on

19:11

social media, she said.

19:13

Photos I wish never saw the light of

19:15

day, private details about my health,

19:17

even when I started my first menstrual

19:19

cycle.

19:20

She was saying this at a Wednesday news

19:22

conference to advocate for Senate Bill,

19:25

which would require social media platforms

19:28

to offer a process for adults to request

19:31

the removal of content that features

19:32

themselves as minors and was created by a

19:35

family member who received compensation

19:37

for sharing material online.

19:42

So yeah, this is an interesting story,

19:44

and I guess it specifically relates to all

19:46

of these family influencers that we see,

19:49

which has definitely become more of a

19:51

problem lately.

19:54

Especially, I would imagine,

19:55

in California.

19:59

So it's interesting,

20:02

but probably makes sense that this is only

20:04

going to apply to...

20:07

kind of public influencers,

20:09

ones who are receiving money or

20:11

sponsorships in exchange for all of this

20:13

stuff.

20:14

But it is only going to be available

20:16

for adults.

20:17

So there isn't really a process that

20:19

prevents any of this stuff from being

20:21

posted in the first place or anything like

20:27

that.

20:27

It's only a retroactive thing that

20:32

adults can do about their childhood if

20:34

they were a part of like a family

20:37

influencer situation. I would say,

20:43

I don't know if that makes a lot

20:44

of sense from my perspective, because I

20:46

think, as we always say,

20:49

anything that you post on the internet is

20:51

sort of permanent. All of this stuff is

20:53

going to be archived, and it could be

20:55

potentially years before you're able to

20:56

take any of this stuff down. So

20:59

children who are, like, uncomfortable with

21:01

all of this going on, at the

21:04

moment, I don't think have a lot of

21:05

protections. And I don't know how

21:10

that should be handled, to be honest. I

21:12

know that that's been a debate that's been

21:13

going on for quite a while: how children

21:16

should be

21:18

compensated for that. Is that

21:20

considered child labor? There's all

21:23

sorts of laws, especially in the

21:26

entertainment industry and in Hollywood

21:29

and on the internet, that come

21:32

into play here. So

21:35

I don't know if this is going to

21:36

really impact a lot of the people that

21:39

we see in the privacy guides community who

21:41

are trying to clean up their digital

21:43

footprint,

21:44

because I think a lot of people are

21:45

more concerned about smaller scale

21:49

situations than some of these commercial

21:51

ventures that this bill is going to

21:53

attack.

21:53

But I do think it's a good idea

21:56

for more privacy protections and some sort

22:00

of

22:01

process to get that data removed if you

22:03

are an adult and you don't want that

22:04

information out there.

22:05

So it seems to be a good thing.

22:07

I'm not sure how effective it'll be or

22:10

if it goes far enough,

22:12

but I think any protections and processes

22:14

to protect your privacy are good at the

22:17

end of the day.

22:19

Was there anything you wanted to note in

22:21

this article, Nate?

22:23

No, I agree with you.

22:25

It's funny.

22:28

I think most people would agree I'm a

22:29

lot more

22:31

lenient with some privacy stuff than a lot

22:33

of other privacy people are.

22:34

But like kids are kind of one of

22:36

the few things where I'm actually kind of

22:37

like,

22:39

like in a perfect world,

22:40

I think it should be illegal to post

22:41

pictures of your kids online at all.

22:42

Um, or at very least publicly, like,

22:45

you know,

22:45

if you're going to post pictures of your

22:46

kids,

22:46

it has to be in like a closed

22:49

group chat or like a,

22:50

a friends only Facebook post again in a

22:53

perfect world, there wouldn't be Facebook,

22:54

but that's beside the point.

22:56

So, like, yeah, this,

22:58

and I agree with you.

22:58

It's really sad.

22:59

Because, like, even in this article,

23:01

one of the

23:02

people they talked to said

23:04

that,

23:07

I think it was that first girl, Barrett.

23:09

Yeah, Kami Barrett.

23:10

Further down,

23:11

she says that she recalled being a target

23:13

for predators and online bullying,

23:15

and said her mother was aware of the problems

23:17

it created,

23:17

but continued to share her daughter's life

23:18

on social media.

23:19

So, like, cool, thanks.

23:22

Now that I'm twenty, twenty-five, thirty,

23:25

I can ask you to take it down,

23:26

but that doesn't help me when I'm ten,

23:28

fifteen, sixteen, seventeen.

23:31

You know, like you said,

23:32

the damage is already done in so many

23:33

ways, and...

23:36

I mean, I guess, yeah, I don't know.

23:39

It's just, it's crazy.

23:40

And it's one thing I thought was

23:42

interesting is it says the legislation

23:44

requires that social media platforms offer

23:46

a process for adults to request the

23:48

removal of content.

23:50

And then basically from there,

23:52

they pass it on to the parent and

23:54

the parent has ten days to take it

23:55

down.

23:56

After ten days,

23:56

they get a three thousand dollar a day

23:58

fine.

23:59

So I don't know.

24:00

It's just, really, I'm with you.

24:02

I feel like it doesn't go far enough

24:03

and

24:05

it's not proactive enough,

24:07

but at the same time, I mean,

24:08

I guess it's better than nothing.

24:10

And I think that's, I don't know.

24:13

It's.

24:14

It frustrates me.

24:15

I wish it would do more,

24:17

but it's a story for sure.

24:19

A couple of things to note about this

24:20

story.

24:21

This bill hasn't passed yet.

24:23

It's just a proposal.

24:24

But the person in question in this

24:26

article was talking about their support

24:28

for it.

24:31

The other thing I would note is similar

24:35

laws do exist in a couple of other

24:36

states, including here in Minnesota.

24:38

There are some laws

24:42

here that more strictly restrict how children

24:46

can participate in, like, commercial content

24:49

in the first place. So I think

24:51

if you're under thirteen, you can't

24:53

actively participate in any of this

24:56

content creation at all. You can maybe be

24:59

featured in it, but you can't be

25:02

Like an active part in it,

25:03

so I think that in Minnesota,

25:04

at least a lot of those toy unboxing

25:06

channels where people have their children

25:08

unbox a bunch of toys and that kind

25:09

of thing that's not allowed.

25:13

Teenagers here in Minnesota are allowed to

25:16

participate,

25:16

but there are laws in both of these

25:18

situations around how that revenue is

25:20

split between everyone involved,

25:23

so there are some protections I think for

25:25

people participating in these commercial

25:27

ventures.

25:27

But from a privacy perspective,

25:30

I think they probably don't go far enough

25:34

in any case.

25:36

But it is interesting to see how this

25:38

is being handled.

25:39

It is a very, I think,

25:41

new issue with the internet and everything

25:44

that none of the existing laws were really

25:49

equipped to handle around child labor and

25:53

stuff like that.

25:54

So it's good that this is at least

25:56

getting attention,

25:57

and we'll see how this plays out.

26:00

Yeah,

26:01

it does say in California they have a

26:02

law that was signed two years ago that

26:04

content creators that feature minors in

26:06

at least thirty percent of the material

26:07

have to place some of their earnings into

26:08

a trust that children can access when they

26:10

turn eighteen.

26:11

So, yeah, like you said,

26:12

there's there's some it's an issue that's

26:14

starting to get attention for sure.

26:16

But.

26:17

Also, just on a personal note,

26:19

they interviewed Alyson Stoner,

26:20

who they said was a former child actor

26:22

who appeared in films like Step Up and

26:23

Cheaper by the Dozen.

26:25

They were also Isabella in Phineas and

26:27

Ferb, and no mention of that.

26:28

And I feel so offended because I love

26:30

that show.

26:31

I just had to call that out.

26:33

Interesting.

26:36

I had to.

26:39

So in a little bit,

26:40

we are going to talk about LinkedIn's

26:41

browser scanning.

26:43

So that should be fun.

26:44

But first,

26:45

we're going to go ahead and jump into

26:46

site updates and talk a little bit about

26:49

what's been going on at Privacy Guides

26:50

this week.

26:52

Just this afternoon,

26:54

we dropped a new video.

26:55

It is currently members only.

26:58

So we usually leave those members only for

27:00

about a week.

27:01

This one is about encrypted email.

27:04

This is another one of those like really

27:05

beginner friendly videos that if you're a

27:07

bit of a privacy veteran,

27:08

you probably know this stuff,

27:09

but hopefully it's something that you can

27:10

share with your friends and family.

27:11

It talks about why mainstream providers

27:13

like Gmail and Yahoo aren't quite good

27:16

enough and how encrypted email works and

27:19

some of the different ones we recommend,

27:20

pros and cons of each.

27:22

So yeah,

27:23

if you are not a member yet and

27:24

you want to check that out,

27:25

you can join on YouTube or you can

27:27

go to privacyguides.org slash donate and

27:30

that will take you to a link where

27:31

you can sign up for a membership.

27:33

But that's what we did this week in

27:37

the video department.

27:39

And I will turn it over to Jonah.

27:41

Very cool.

27:43

Yeah, another thing we did recently,

27:46

Nate and I recorded this a few weeks

27:48

ago, but it's finally live.

27:50

We did a panel discussion on the Firewalls

27:54

Don't Stop Dragons podcast.

27:57

So episode four seventy four of that

27:59

podcast is now out.

28:00

It's called Privacy Guides Panel.

28:02

Nate and I are on it and we

28:03

talked about a ton of interesting stuff.

28:07

So I would definitely recommend checking

28:09

that episode out if you want to listen

28:15

to those discussions.

28:16

You can look at the table of contents

28:18

here.

28:18

It looks like Nate's showing that on the

28:19

screen,

28:20

but you can find the Firewalls Don't Stop

28:22

Dragons website for more information.

28:24

And if any of those topics sound

28:25

interesting to you,

28:27

Definitely check it out because it was a

28:28

ton of fun for us to record.

28:30

I think we talked about a lot of

28:33

cool, interesting, informative stuff.

28:36

So hopefully somebody finds it useful or

28:39

at least finds it entertaining.

28:44

In other news,

28:45

we again published a bunch of news briefs

28:47

that we're not covering here on this show,

28:49

but you can find our articles at

28:51

privacyguides.org slash news about them.

28:56

We have stories on macOS

29:01

improving security in the Terminal app,

29:04

a grandmother who was wrongfully arrested

29:06

because of facial recognition, iOS,

29:09

twenty six point five beta,

29:11

including end-to-end encryption for RCS

29:13

messages, Walmart

29:15

digital price labels, and more.

29:17

So definitely check that out.

29:18

Again,

29:18

it's privacyguides.org slash news if you

29:21

want to read those stories and let us

29:24

know if you have any questions about them

29:26

on the forum or anything else,

29:27

because there's always a lot of

29:28

discussions about these stories.

29:30

over there as well. Everything that we do

29:32

at Privacy Guides is made possible by our

29:35

supporters. You can sign up for a

29:37

membership or donate at privacyguides.org

29:40

slash donate, or you can support us by picking

29:42

up some swag, like this water bottle for

29:45

example, at shop.privacyguides.org.

29:48

Privacy Guides is a nonprofit which

29:50

researches and shares privacy-related

29:52

information,

29:53

and we facilitate a community on our forum

29:55

in Matrix where people can ask questions

29:58

and get advice about staying private

29:59

online and preserving their digital

30:02

rights.

30:03

Now let's move on to our next story.

30:05

This is about Nextcloud and OnlyOffice.

30:10

That is right.

30:12

So, full disclosure,

30:15

I am a Nextcloud user and a

30:16

little bit of a Nextcloud fanboy.

30:18

So I'm bummed to hear this story,

30:21

but OnlyOffice has suspended their

30:23

partnership with Nextcloud for forking

30:25

its project without permission.

30:28

And this comes on the heels of another

30:30

announcement.

30:30

So earlier this week,

30:32

Nextcloud, IONOS, and several other

30:34

European tech companies

30:36

came together and announced this new open

30:38

source project called Euro Office,

30:40

which they describe as, quote,

30:41

a sovereign replacement for Microsoft with

30:43

intuitive interface and strong

30:45

compatibility backed by European open

30:46

source community.

30:49

Only Office has basically claimed that

30:52

this is a fork of their code.

30:55

And they say that this violates license

30:57

agreements, because OnlyOffice is offered

31:01

as source available, or open source,

31:04

and they use the AGPL version three.

31:06

So specifically towards the end here,

31:09

if you're watching on screen,

31:10

you can see this, but towards the end,

31:11

it says we require compliance with

31:14

applicable licensing conditions,

31:15

including,

31:15

but not limited to the preservation of

31:17

OnlyOffice branding, logo, and all required

31:19

attribution elements as defined in our

31:21

licensing terms. Which,

31:23

if this is a brand new project,

31:24

it would, of course,

31:25

have none of those things.

31:27

So for those who do not use Nextcloud,

31:30

you may or may not know that Nextcloud,

31:33

one of the things that it comes with

31:34

by default is an online document editor or

31:37

Office editor.

31:38

And there's a couple different ways to

31:40

make this work.

31:40

You can use Collabora Online,

31:43

or you can use OnlyOffice.

31:45

And this has been...

31:47

I think they said for eight years,

31:49

OnlyOffice has partnered with Nextcloud,

31:51

and now they are terminating that.

31:53

They do say...

31:55

I think they said that no, yeah,

31:57

no existing partners or clients will be

31:58

affected.

31:59

So basically if you've already got it

32:00

installed, you're good to go.

32:02

I don't know what that means for updates

32:03

and stuff, but yeah,

32:05

I guess we'll find out.

32:06

They also, interestingly,

32:08

just to throw it out there,

32:09

OnlyOffice said that in the past,

32:10

and I'm quoting the article here,

32:11

Nextcloud has behaved in a manner not

32:13

expected from a partner,

32:14

including trying to poach its employees

32:16

and influencing customers against the

32:17

company,

32:18

but directly forking the project and

32:19

repackaging it was the straw that broke the

32:21

camel's back.

32:22

Um,

32:23

Yeah, and then kind of a statement here.

32:25

They said partnership is built on trust

32:26

and trust requires shared principles where

32:28

those principles are no longer upheld.

32:29

Continuing operation is no longer

32:31

sustainable.

32:31

For this reason,

32:32

we made the decision to suspend our

32:33

partnership cooperation.

32:37

And then just kind of, I guess,

32:38

a little bit more background towards the

32:40

end.

32:41

LibreOffice has criticized OnlyOffice for

32:43

being, quote unquote, fake open source.

32:45

They say, for one reason,

32:46

OnlyOffice defaults to Microsoft Office

32:48

formats like DocX, XLSX, and PPTX,

32:52

which is Word, Excel, and PowerPoint,

32:55

rather than open standards like Open

32:57

Document Format or ODF.

33:00

And there's also, apparently,

33:02

Nextcloud says they didn't just

33:04

collaborate directly with OnlyOffice.

33:07

Let me rephrase that.

33:08

When asked why they didn't just

33:09

collaborate with OnlyOffice,

33:10

they said that there were a number of

33:11

reasons,

33:12

including that OnlyOffice is a Russian

33:13

company that tends to obscure its origins.

33:17

Developers often leave code comments in

33:18

Russian and many users are hesitant to use

33:20

software potentially linked to the Russian

33:21

government.

33:22

They also claim that OnlyOffice

33:23

discourages contributions,

33:24

ignores pull requests and lacks

33:25

transparency since commit messages

33:27

frequently reference internal issue

33:29

trackers only.

33:30

So yeah,

33:34

I don't know that I have a lot

33:35

of thoughts on this one.

33:38

Jonah, did you have any,

33:39

like what do you know about this AGPL

33:42

V three, for example?

33:44

Yeah, so what's in question here is, well,

33:51

OnlyOffice says that they've added

33:52

provisions to AGPL requiring certain

33:59

attribution in forks of the project.

34:03

So we could,

34:08

if I could share my screen here,

34:09

let's see.

34:11

Huh.

34:13

So they're talking about in their license

34:16

two things.

34:17

You have to retain the original product

34:18

logo when you distribute the program.

34:21

And they do not grant any rights under

34:23

trademark law for the use of any only

34:28

office trademarks.

34:29

And the Euro Office Project Initiative

34:35

basically removed these provisions saying

34:39

that

34:41

Basically, if I can find it here,

34:42

section seven of the AGPL says that you...

34:50

Which line is this?

34:52

Says that you can remove any additional

34:56

restrictions or any of these terms from

34:58

that license on your own.

35:00

And this is kind of the basis of

35:02

Euro Office's claim that they can kind of

35:06

change this license.

35:07

And they say that they don't have to

35:08

use their logo to...

35:11

give attribution to OnlyOffice.

35:13

The AGPL is still going to require that

35:15

they provide some attribution somehow,

35:18

but according to the Euro Office project,

35:20

they don't have to use the OnlyOffice

35:22

trademark.

35:23

I think this is kind of interesting

35:24

because usually open source projects like

35:29

OnlyOffice in this position,

35:32

fight tooth and nail for forks to not

35:34

use their branding at all.

35:35

So the fact that they want them to

35:37

use their logo is kind of strange because

35:39

we've seen like Mozilla, for example,

35:41

when there's any Firefox forks,

35:43

they want to make sure that there's no

35:44

Firefox branding whatsoever associated

35:46

with that because they don't want it

35:47

associated with their

35:49

project. And related to this, there's

35:52

actually another case around it. A few

35:56

years ago this started, and then

35:58

I think the latest update on this

36:00

was in... But a company called

36:05

Neo4j started a lawsuit against

36:10

another company, PureThink,

36:14

about a very similar issue.

36:16

Basically, Neo4j added a lot of clauses

36:19

to their AGPL license,

36:22

and PureThink said that because the AGPL

36:27

says that you can remove certain passages

36:32

or restrictions that were added onto the

36:34

AGPL, that they were able to do that.

36:36

And PureThink actually lost this case.

36:39

This twenty twenty five article is

36:41

basically announcing an appeal that's

36:43

taking place.

36:44

I don't know if that's actually gone to

36:45

court yet,

36:48

but they

36:52

So yeah,

36:53

this article says that the AGPL allows

36:55

added-on terms like the Commons Clause

36:57

that Neo4j was using to be stripped

36:59

from the license.

37:03

And Neo4j said that because they added

37:05

it,

37:06

you have to comply with all of the

37:07

terms of the license.

37:08

And the court basically agreed that any

37:10

terms in the license have to be followed

37:13

regardless of what the AGPL says.

37:16

And then the Free Software Foundation and

37:18

other organizations in the open source

37:20

space

37:22

said that that's not the case and that

37:24

they did intend this tenet, or this

37:29

provision in the AGPL to work and for

37:33

these restrictions to be removed because

37:34

they believe that you can't really have

37:37

restrictions on free and open source

37:39

software,

37:39

which is kind of the point of the

37:42

AGPL.

37:44

So

37:46

It's a strange case.

37:48

It's definitely in a gray area.

37:50

And it really depends on how much

37:52

OnlyOffice wants to fight this.

37:55

But I think you could certainly argue in

37:58

the Neo4j case and probably in this

38:00

OnlyOffice case that any of these

38:02

restrictions that are being added onto the

38:04

AGPL that have very...

38:07

specific restrictions on how the software

38:08

can be used probably make the software not

38:11

open source. So at the end of

38:14

the day, you shouldn't be calling it an

38:15

AGPL-licensed project. If you really want

38:18

these terms to be followed, I think that

38:20

they would have to call it something else,

38:22

and it

38:23

wouldn't be, I mean,

38:25

it would be at odds with the open

38:27

source in the same way that a lot

38:28

of these source available licenses that we

38:30

see are.

38:32

It's definitely a hot debate in the

38:36

community in general.

38:38

We've seen a lot of talk about like

38:39

the FUTO license, for example.

38:43

not being open source and they went with

38:45

a different name because of that.

38:47

But there's certainly other licenses that

38:49

projects are trying to use and they still

38:51

continue to claim to be open source when

38:53

in reality they're source available.

38:55

So I think that if OnlyOffice really

38:58

wants to follow through on having these

39:01

restrictions in place,

39:02

I think that would be very at odds

39:04

with their claims that they are an open

39:06

source

39:07

project, which would be a bit concerning,

39:11

because the entire idea of open source is

39:13

that these forks should be able to exist,

39:15

and, like, you should be able to completely

39:17

fork and create this Euro Office that

39:22

Nextcloud is making without any

39:26

restrictions or preservation of OnlyOffice

39:30

branding.

39:31

That doesn't make a lot of sense for

39:33

a fork to be doing.

39:35

And so OnlyOffice is in a bit of

39:37

a strange situation here.

39:41

It's always the business

39:43

case against open source software in

39:46

general.

39:46

They don't want people taking their work.

39:48

And OnlyOffice clearly believes that

39:50

because they say that they've spent years

39:52

building a fully functional production

39:54

ready Office document editor.

39:56

But at the same time,

39:57

they marketed that as an open source

39:59

project.

39:59

And that is what people kind of expect

40:02

from that.

40:04

I would also note,

40:07

Nextcloud kind of has a history of

40:10

forking open source projects in not a

40:14

very collaborative way. I mean, Nextcloud

40:17

itself was forked from ownCloud, of

40:19

course, and that division was,

40:23

I don't think super well received by

40:25

ownCloud themselves.

40:26

So it's kind of a situation that they're

40:30

used to.

40:30

But I think a lot of people side

40:32

with Nextcloud in that case.

40:34

And I think that a lot of people

40:36

are going to side with Nextcloud here as

40:39

well.

40:40

So it might just kind of be what

40:42

it is.

40:47

Yeah, and kind of going back to

40:49

what you were saying, you

40:52

mentioned that, like, a lot of companies,

40:56

they put work into it and then

40:58

they don't want people stealing that work.

41:00

It's one of those things where, like, in

41:01

that case, and I say this kind of

41:04

spitefully, but then just don't be

41:05

open source. Because, I mean, obviously in a

41:07

perfect world I would prefer everything

41:09

was, or at very least, like you

41:10

said, be transparent about being

41:12

source available. Because

41:15

in a perfect world,

41:15

I would love for everything to be at

41:17

very least source available,

41:18

because that's how we're able to verify

41:20

that the code is doing what it's doing.

41:22

And it helps build that trust at very

41:24

least, I think, um,

41:25

especially things that deal with security,

41:27

like password managers should at very

41:29

least have their cryptographic bits be

41:31

source available, um, bare minimum,

41:33

but cause security is something where like

41:35

everyone benefits.

41:36

Right.

41:36

But

41:37

To me,

41:38

it's just such a crappy thing because

41:39

that's the risk you take.

41:40

And I've talked to a lot of projects

41:43

that are not open source and I've asked

41:44

them that.

41:45

I'm like,

41:45

why don't you guys have any open source

41:46

clients or anything?

41:47

And that's usually the number one reason

41:49

they give is they're like,

41:49

we're worried that people are going to

41:50

take our stuff and steal it.

41:52

We have no real way to control that.

41:55

And then there's...

41:56

to counter their argument.

41:57

There are plenty of companies who seem to

41:58

be doing just fine despite that.

42:00

But yeah, so it's, I don't know.

42:02

It's just me.

42:03

I say this with a little bit of

42:03

bitterness in my voice towards

42:05

OnlyOffice.

42:05

It's like, then just don't be open source.

42:07

Like, it feels almost,

42:10

it's going to be a really niche reference.

42:12

And I don't think this is as much

42:13

of an issue anymore,

42:14

but back in like the early two thousands,

42:16

um, there was a real,

42:17

I don't know if you'd call it an

42:19

issue or not.

42:19

I guess it depends on how you feel

42:20

about it, but a lot of bands would,

42:22

um,

42:24

would market themselves as Christian bands

42:25

because the Christian market would be a

42:27

lot easier to break into.

42:29

And then once they hit a certain level

42:30

of success,

42:30

they would quote unquote go mainstream.

42:33

And some of them would even like

42:34

vehemently deny, like, no,

42:35

we were never a Christian band.

42:36

And it's like, well,

42:37

we have interviews with you where you said

42:39

that you were, so whatever, dude.

42:41

But to me,

42:42

it just feels the same way.

42:43

It's like,

42:43

you don't actually believe this stuff.

42:44

It's just some kind of marketing gimmick.

42:46

And in this case, the open source,

42:49

I just feel like that, I mean,

42:50

I guess in the Christian thing too,

42:51

just feels kind of crappy.

42:52

It's like, you know,

42:54

don't say that you believe in this stuff

42:55

just to get ahead in the competition,

42:58

like commit to it or don't.

43:00

So I don't know.

43:01

That just, that really frustrates me.

43:06

Quick thanks to at Sod This All for

43:09

gifting a Privacy Guides membership.

43:10

Thank you for your support.

43:15

Oh, nice.

43:15

I just saw that.

43:16

Yeah.

43:16

I had comments closed down so I could

43:19

see the screen a little better.

43:20

But yeah, thank you.

43:21

That's super cool.

43:21

All right.

43:24

I think that'll take us into a LinkedIn

43:26

story that Jonah actually alerted me to

43:28

right before we started

43:30

recording.

43:30

This one's like hot off the presses.

43:33

Yeah,

43:33

I think this was reported just yesterday

43:36

or today, if I remember correctly.

43:38

I definitely saw it only yesterday,

43:42

but maybe it's been talked about a bit

43:44

for a while.

43:46

But there's this report that LinkedIn is

43:49

illegally or allegedly illegally searching

43:53

your computer.

43:53

They are scanning installed browser

43:55

extensions without user permission.

44:00

So this is reported by Apple Insider.

44:05

And something is wrong with my computer.

44:09

There we go.

44:10

They say researchers have determined that

44:13

Microsoft's LinkedIn is scanning browser

44:15

plugins and other information without

44:17

permission and building user profiles

44:19

using data that the company did not get

44:21

permission to take.

44:23

A European advocacy group claims LinkedIn

44:26

is probing browser extensions through its

44:29

website code.

44:30

Fairlinked e.V. published a BrowserGate

44:34

report alleging LinkedIn detects installed

44:37

browser extensions by probing for known

44:38

identifiers through JavaScript.

44:40

The group says the technique reveals

44:42

personally identifiable information.
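
(For the technically curious: the classic way a web page detects extensions is by probing "web-accessible resources," files that some extensions expose at a fixed chrome-extension:// URL. Below is a minimal TypeScript sketch of that general technique, not LinkedIn's actual code; the extension ID and resource path are hypothetical. It only works against extensions that expose such resources at stable URLs.)

    // Probe for a known extension by trying to load one of its
    // web-accessible resources; a successful load means the file,
    // and therefore the extension, is present.
    function hasExtension(id: string, resource: string): Promise<boolean> {
      return new Promise((resolve) => {
        const img = new Image();
        img.onload = () => resolve(true);   // resource loaded: extension present
        img.onerror = () => resolve(false); // missing, blocked, or not installed
        img.src = `chrome-extension://${id}/${resource}`;
      });
    }

    // A site can sweep thousands of known (id, resource) pairs this way.
    const probes: Array<[string, string]> = [
      ["abcdefghijklmnopabcdefghijklmnop", "icon.png"], // hypothetical entry
    ];

    for (const [id, resource] of probes) {
      hasExtension(id, resource).then((found) => {
        if (found) console.log(`extension ${id} detected`);
      });
    }

(Chrome extension IDs are the same for every user, which is what makes a precomputed probe list like this feasible. Firefox, as discussed later in the episode, assigns each install a random UUID for its moz-extension:// origin, which blunts this particular trick.)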

44:46

And so this is a threat that we've

44:47

talked about before,

44:49

I think in a previous episode of this

44:51

show,

44:51

but definitely on the forum where the

44:54

browser extensions that you install can

44:56

definitely add to your browser fingerprint

44:59

and can specifically identify you based on

45:03

what extensions you have installed.

45:05

And that's been a known threat for quite

45:08

a while,

45:08

but I think this is one of the

45:09

first and maybe the largest examples of a

45:13

real world situation where this is

45:16

happening.

45:18

And so if we look at this Fairlinked

45:23

BrowserGate website,

45:25

they point out a lot of different problems

45:27

with these tools,

45:31

namely that

45:34

Microsoft is designated as a gatekeeper

45:38

under the Digital Markets Act in the EU.

45:41

So Microsoft Windows and Microsoft

45:44

LinkedIn are both regulated products under

45:47

the DMA, and they need to allow,

45:50

as a result, free, effective,

45:52

high-quality, continuous,

45:53

real-time access to all data,

45:55

including personal data that's generated

45:57

through the use of these products,

46:00

which LinkedIn is not doing because

46:02

they're doing this

46:03

in the background.

46:04

They also point out that this search of

46:07

all of your browser extensions can reveal

46:09

a lot of different personal information,

46:13

and they give some examples of extensions

46:17

that could potentially reveal that.

46:19

It could reveal your political opinions,

46:22

for example,

46:23

because there are extensions like

46:26

anti-woke, anti-Zionist tag,

46:28

no more Musk that you can install.

46:31

I don't know what those extensions do,

46:32

but obviously having them installed

46:34

definitely shares a bit about what you

46:39

believe.

46:39

It could share some...

46:41

Could reveal your religious beliefs

46:43

because there are extensions like Porta AI

46:45

which blur haram content or Dean Shield

46:49

which blocks haram sites.

46:52

It could reveal potential disabilities or

46:55

neurodivergence through extensions you

46:57

have installed like Simplify which aids

47:01

neurodivergent users in browsing the

47:02

internet.

47:05

Certainly,

47:06

LinkedIn could be getting your employment

47:09

information.

47:10

There's a lot of obvious ways to do

47:11

that,

47:12

but there are job search extensions that

47:15

people use on LinkedIn where that could

47:21

reveal information to LinkedIn or your

47:23

current employer.

47:25

And then it just reveals a lot of

47:28

potential trade secrets because LinkedIn

47:30

is this network where so many

47:32

professionals are located and they share a

47:35

ton of information about where they work

47:38

and Microsoft would have access to all of

47:39

that data and they would also have access

47:41

to all of the extensions that these people

47:43

have installed,

47:44

some of which would be mandated by their

47:48

companies.

47:49

So like whether you use

47:54

The examples that they give are Apollo,

47:56

ZoomInfo.

47:56

You could imagine other browser extensions

48:01

of professional tools that would be

48:03

installed by these companies.

48:08

I don't know what tools companies use,

48:10

to be honest.

48:11

I know in the education space,

48:13

we would use tools like GoGuardian,

48:14

for example.

48:15

And so...

48:16

In that example,

48:17

they could find out what we're using.

48:20

But a similar case would apply to all

48:22

of these organizations and their employees

48:24

who use LinkedIn.

48:28

They say,

48:29

Fairlink says in their BrowserGate site

48:31

that LinkedIn has not disclosed this

48:33

practice in its privacy policy.

48:35

There's no mention of extension scanning

48:37

in any public-facing document that

48:40

LinkedIn has published.

48:41

And so on this BrowserGate website,

48:43

which you can find at browsergate.eu,

48:47

they list six thousand two hundred twenty

48:49

two extensions that a hidden JavaScript

48:52

program on LinkedIn will scan your browser

48:56

for.
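
To give a rough idea of how this kind of scan typically works in Chromium browsers, here is a minimal sketch — the extension ID and resource name are placeholders for illustration, not LinkedIn's actual code:

```ts
// Sketch of Chromium extension probing (general technique, placeholder ID).
// A page can only detect resources an extension exposes via
// "web_accessible_resources" in its manifest.
async function hasExtension(id: string, resource: string): Promise<boolean> {
  try {
    await fetch(`chrome-extension://${id}/${resource}`);
    return true; // resource loaded -> extension is installed
  } catch {
    return false; // blocked or missing -> not installed (or not exposed)
  }
}

// A scanner repeats this over a long list of known IDs to build a profile.
hasExtension("abcdefghijklmnopabcdefghijklmnop", "icon.png").then((found) =>
  console.log(found ? "installed" : "not installed")
);
```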

48:56

I believe this only applies to Chrome

48:59

browsers,

49:00

but that's probably most people visiting

49:03

LinkedIn, I would imagine.

49:06

And you can't opt in or opt out

49:07

of that, and there's, again, no mention of

49:09

any of this happening in any of their

49:11

privacy policies, which is definitely

49:14

very concerning. So...

49:18

It's kind of a mass breach of your

49:21

personal data.

49:23

They say that this is deceiving

49:25

EU regulators, which is probably true.

49:28

And so I think it's just interesting to

49:30

note for sure that this definitely lends a

49:33

lot of credence to the idea that your

49:35

browser fingerprints are going to identify

49:38

you and reveal a lot of information about

49:42

you and what you do,

49:42

especially when it's being done by a

49:44

company like

49:45

LinkedIn that has probably a lot of

49:48

information about you if you use it.

49:49

It has your real name.

49:51

Some people ID verify on LinkedIn.

49:54

They have your whole resume and being able

49:56

to tie all of this digital data to

49:58

those profiles.

50:01

creates a very unique and very

50:03

comprehensive profile of you when you use

50:05

the service.

50:06

So I think it is very concerning for

50:09

sure.

50:09

And it shows that the threats that we

50:11

talk about when it comes to your privacy

50:12

are in fact a real issue.

50:15

And these companies are trying to get all

50:18

of this data wherever they can.

50:24

Yeah, for the record,

50:25

I tried to show the browsergate.eu

50:27

website.

50:28

For some reason,

50:28

it's not loading on the device.

50:31

It worked fine earlier,

50:32

but I guarantee it's DNS.

50:35

It's always a DNS issue.

50:38

But yeah.

50:39

OK, so my first thought,

50:41

because I was recently educated,

50:44

for general browser fingerprinting,

50:46

like the day-to-day,

50:49

Some browsers like Firefox, for example,

50:52

they do actually try to obfuscate what

50:55

extensions you have installed.

50:57

And I guess just to back up a

51:00

little further, I know that for, again,

51:03

for general fingerprinting,

51:04

it's not always a guarantee that having

51:05

more extensions will make you more

51:07

fingerprintable because it generally

51:08

depends on what does the extension do and

51:10

whether or not it modifies the page.

51:11

But obviously this one is going out of

51:13

its way to scan your extensions, right?

51:15

So that's a little bit of a different

51:16

story.

51:17

though I would argue that general

51:18

fingerprinting probably does that too.

51:21

But going back to what I was saying

51:23

about Firefox,

51:23

I know Firefox basically tries to,

51:28

and I'm probably going to get the fine

51:29

details wrong on this, so I apologize,

51:30

but they basically try to like randomize

51:32

the ID that your extensions have to make

51:34

it a little bit harder for you to

51:36

be fingerprinted.

51:37

Do you think that would,

51:39

do you think that would stop something

51:40

like this or slow it down?

51:41

Or is it just going to be able

51:43

to get past that anyways?

51:45

It could potentially. But, I mean, yeah,

51:49

it depends on... you know, I'm not sure

51:52

how these programs work. Randomizing it

51:54

could work if you can't find the files

51:56

in the first place, and that probably

51:58

is a strong protection against it. But

52:04

If those extensions modify the page

52:06

itself, which a lot of extensions do,

52:08

then that probably is still going to be

52:10

detectable.

52:10

And so that's only going to protect your

52:13

privacy against certain extensions you

52:15

have installed that make public resources

52:17

available, but don't modify the page,

52:22

which I don't think would be a ton

52:25

of extensions,

52:28

especially like password managers.

52:30

I can imagine where...

52:33

like if they edit the page itself to

52:34

add like a pop up or like a

52:36

drop down menu to logins,

52:37

that's going to be impacted.

52:39

So if you disabled all of that autofill

52:42

stuff, and you kept the extension,

52:45

and only like manually copied from it on

52:47

certain pages, you know,

52:49

it could potentially protect you in that

52:51

situation.

52:51

But I don't think most people are doing

52:56

that.

52:56

So I don't know how extensive that

52:58

protection would really be.
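
For the page-modification route Jonah describes, a site doesn't need the extension's files at all — it can just watch the page. A minimal sketch, where the marker attribute is a hypothetical artifact an autofill extension might inject:

```ts
// Sketch: detect extensions by the DOM changes they make, which works even
// when resource URLs are randomized. The selector below is hypothetical.
const observer = new MutationObserver((mutations) => {
  for (const m of mutations) {
    for (const node of Array.from(m.addedNodes)) {
      if (
        node instanceof HTMLElement &&
        node.matches("[data-example-pw-dropdown]") // made-up marker attribute
      ) {
        console.log("page-modifying extension detected");
      }
    }
  }
});
observer.observe(document.documentElement, { childList: true, subtree: true });
```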

53:01

Which even then,

53:02

my thought process is that kind of defeats

53:04

one of the advantages of a password

53:06

manager, which is if it doesn't autofill,

53:08

that could be an indicator that you're on

53:09

a phishing page.

53:10

So if it never autofills,

53:12

then you never have that moment of like,

53:13

wait, am I on the right page?

53:14

Yeah.

53:15

Yeah.

53:16

And then I guess my other thought, too,

53:17

is just not really a question,

53:19

but just a thought.

53:21

You pointed out that this was tested on

53:23

Chromium browsers,

53:24

which is probably what most people are

53:26

going to use anyways.

53:29

I, at my last job,

53:31

they gave us work computers that came with

53:33

Microsoft Edge.

53:35

And I mean,

53:36

ninety nine percent of what I did was

53:38

logging into the company stuff anyway.

53:39

So I just use Edge because that's what

53:40

it came with.

53:41

And at one point,

53:42

I think at one point I did get

53:43

Brave installed on it and then I was

53:45

never able to do it again.

53:47

But I think I did try Firefox because

53:48

I was like, well, you know,

53:49

it'll it's not Edge, right?

53:51

It'll be a way bigger improvement in

53:52

privacy.

53:54

But I got really annoyed because

53:55

everything in a corporate environment is

53:57

optimized to work

53:59

with Edge.

54:00

And so it was just so much extra

54:02

friction to use Firefox.

54:03

So where I'm going with this is, yeah,

54:05

like most corporate environments are

54:06

probably going to be using either Chrome

54:09

or Edge because everybody's familiar with

54:10

Chrome.

54:11

And where I was going with that is

54:12

at my job, they said like, yeah,

54:14

if you go to our little app store,

54:15

you can download Chrome or whatever.

54:16

We don't care.

54:17

Use whatever browser you want.

54:18

So most people are probably going to be

54:20

using Chrome or Edge.

54:20

Maybe some will be using Safari,

54:22

which I think the article did say that

54:25

Yeah,

54:25

Safari users are less likely to be

54:26

affected by the specific mechanism based

54:28

on how extension detection typically works

54:30

across browsers.

54:31

Apple's browser model limits

54:33

fingerprinting surfaces.

54:35

But it kind of goes back to...

54:41

It's unfortunate because not everything...

54:44

Where am I?

54:44

How am I trying to word this?

54:47

It's important to try to compartmentalize

54:48

your professional life and your personal

54:50

life, right?

54:50

Like never...

54:52

Never do personal stuff on a work

54:54

computer.

54:56

For some reason, people do anyways,

54:57

and I don't know why.

54:58

But even then,

54:58

LinkedIn is something that...

55:00

You wouldn't get in trouble for doing that

55:02

on a work computer, I would imagine,

55:04

but it's also something you would do at

55:05

home, right?

55:06

LinkedIn is supposed to be something that

55:08

follows you from job to job to job.

55:12

It's not necessarily specific to that job.

55:14

So it is something that I could see

55:15

people checking on a home device,

55:17

which is so frustrating because it's like

55:20

you're trying to...

55:23

I don't know.

55:23

It's like,

55:23

it's one of those things where like,

55:24

you're not really necessarily doing

55:26

anything wrong and you're still getting

55:27

punished.

55:27

And that's, that's super frustrating,

55:29

but yeah, I guess I,

55:31

I just wanted to point that out.

55:33

It's, it's, yeah, I don't know for sure.

55:35

I mean, a ton of people, I think,

55:37

um,

55:40

Their work laptop is their only computer

55:43

in a lot of cases besides their phone.

55:45

I know a lot of people

55:46

in that situation.

55:48

And yeah,

55:49

definitely do not recommend doing that.

55:52

You should get your own personal laptop

55:54

and use that instead.

55:55

But I know a lot of people do

55:56

that anyways.

55:58

Another thing that I wanted to share,

56:01

not in the notes,

56:02

but kind of related to this is browser

56:04

extensions aren't the only ways that

56:07

websites can potentially fingerprint you

56:09

or like software you have installed on

56:10

your computer.

56:11

Sometimes the software itself on your

56:14

computer can work against you.

56:16

And so kind of recently,

56:18

I think this has been going on for

56:19

a while,

56:20

but it's been picked up by some news

56:22

sources.

56:26

Basically, Adobe Creative Cloud,

56:28

of course it's Adobe,

56:30

is changing the hosts file on your

56:32

computer,

56:35

which allows websites to detect whether

56:37

you have Adobe Creative Cloud installed.

56:40

So this is posted to Reddit.

56:42

Basically,

56:42

Adobe is adding this line to your hosts

56:45

file.

56:45

And then when you visit the Adobe website,

56:49

it tries loading an image from that exact

56:52

domain.

56:52

And if the image loads because of this

56:56

line that they've added that points that

56:58

domain to a specific IP,

57:00

then they know that you have Creative

57:03

Cloud installed.

57:04

And that could, of course,

57:04

be checked by any number of different

57:07

websites to detect whether you have Adobe

57:08

Creative Cloud installed.
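
A sketch of what that check can look like from the website's side — the domain, filename, and timeout here are placeholders for illustration, not Adobe's actual values:

```ts
// Sketch of hosts-file-based detection: the domain below only resolves if
// locally installed software added a hosts entry pointing it somewhere real.
function probeForInstalledApp(): Promise<boolean> {
  return new Promise((resolve) => {
    const img = new Image();
    img.onload = () => resolve(true); // image loaded -> hosts entry present
    img.onerror = () => resolve(false); // normal DNS failure -> no entry
    setTimeout(() => resolve(false), 3000); // assume absent on timeout
    img.src = "https://detect.example.invalid/pixel.png?cb=" + Date.now();
  });
}

probeForInstalledApp().then((installed) =>
  console.log(installed ? "app detected via hosts file" : "not detected")
);
```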

57:10

And so even if you don't have any

57:13

browser extensions,

57:14

there are other ways that software

57:16

on your device itself can increase your

57:20

browser fingerprinting profile,

57:24

regardless of what you do with the browser.

57:25

So that is something to definitely keep an

57:28

eye out for, because the only thing that

57:30

would really protect you against this is

57:32

either not letting Creative Cloud do this,

57:34

which I don't know if there's a mechanism

57:35

to do that, but it might be

57:37

worth looking into, or using a browser

57:39

like Tor Browser, which is going to bypass

57:42

all of your local network stuff

57:44

specifically. But that's challenging to

57:47

do, and not a lot of people are

57:48

doing that for day-to-day use. And so

57:51

software that does something like this

57:54

is a problem. I don't know of any

57:56

other software that's going to do this

57:57

besides Adobe, but...

58:02

Of course, again,

58:03

of course it's Adobe doing that, but yeah,

58:07

that is another attack vector unrelated to

58:09

extensions that websites could be using

58:11

that you'd also have to look out for.

58:14

That's insane though.

58:14

Editing the hosts file.

58:18

I don't even like screwing with that.

58:19

That's some deep level stuff.

58:22

Oh my God.

58:23

Wow.

58:24

These companies are out of control, man.

58:26

Yeah.

58:30

My brain hurts.

58:31

Just related to that, it's always DNS,

58:33

right?

58:34

DNS can be used against you.

58:35

DNS is used for evil.

58:37

Anyways,

58:39

I think that's all I have to say.

58:40

Do you want to talk about our next

58:41

story here?

58:43

Yeah.

58:44

My brain is still hurting from the hosts

58:45

file thing, so we'll just move on.

58:49

So this next story,

58:51

it helps if I share the actual screen.

58:53

Here we go.

58:54

So this next story comes from 404

58:56

Media.

58:56

It says,

58:57

a secure chat app's encryption is so bad,

59:00

it's, quote-unquote, "meaningless."

59:04

I mean, okay,

59:04

we'll go through it a little bit.

59:05

So the app is called TeleGuard, and I've

59:08

heard of it a little bit.

59:09

It actually rang a bell when I read

59:10

this.

59:12

I really, I'm not going to lie.

59:13

I really wanted a moment where I went

59:15

and checked the DMs on the forum because

59:19

we get a lot of projects at Privacy

59:21

Guides that message us directly.

59:22

And they're like, hey,

59:23

you should recommend our product.

59:24

And we always tell them like,

59:24

go post on the forum.

59:25

This is a community project.

59:26

Let the community vet it.

59:28

Um, so I,

59:29

I went and checked and I thought like

59:30

maybe I,

59:31

I knew their name cause they messaged us,

59:32

but, um, nothing like that, I guess.

59:34

So I don't know where I've heard it

59:36

from, but, um,

59:37

it has been mentioned on the forum once

59:38

or twice,

59:38

but never really like heavily recommended

59:40

or anything.

59:41

Just, I don't know.

59:42

But either way.

59:42

Yeah.

59:43

So this is an app that markets itself

59:44

as a secure end to end encrypted messaging

59:47

platform.

59:48

It's been downloaded at least a million

59:49

times.

59:50

Um, but apparently this researcher, uh,

59:53

found... It says there's no storage,

59:55

highly encrypted —

59:59

kind of like military-grade encryption,

1:00:00

right?

1:00:01

Anyways, um, Swiss made,

1:00:02

and there's an anonymous researcher in

1:00:04

March who contacted 404 Media.

1:00:06

They said that the private encryption keys

1:00:08

are sent to the company server upon

1:00:10

account registration.

1:00:11

And, um,

1:00:13

Jonah can correct me if I'm wrong about

1:00:14

any of this, cause I'm,

1:00:15

I'm speaking a little bit outside my

1:00:16

element here, but I think I'm,

1:00:18

I'm right about this.

1:00:19

Um, there are services like proton,

1:00:21

for example, that, um,

1:00:24

I don't know if I'd say the private

1:00:25

key gets sent to the server,

1:00:26

but they do have a way where like

1:00:28

you can log in from any device and

1:00:29

your email is decrypted.

1:00:30

Right.

1:00:31

But they also store that in such a

1:00:33

way where they don't really get the key

1:00:35

itself.

1:00:35

Um, I, again,

1:00:37

I could have the details wrong here,

1:00:38

but my point being like,

1:00:39

I think there is a way to store

1:00:42

private keys,

1:00:43

but they weren't doing it this way.

1:00:45

They weren't doing it in a way where

1:00:46

it's like,

1:00:46

we don't have access to your private key.

1:00:48

Like, no, they just had your private keys.
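
For contrast, here is a minimal sketch of the password-wrapped key pattern that zero-knowledge services are generally understood to use — a general illustration, not Proton's exact scheme:

```ts
// Sketch: the client derives a key-encryption key (KEK) from the password
// and only uploads the encrypted blob; the server never sees the raw key.
import {
  scryptSync,
  randomBytes,
  createCipheriv,
  createDecipheriv,
} from "node:crypto";

function wrapPrivateKey(privateKey: Buffer, password: string) {
  const salt = randomBytes(16);
  const kek = scryptSync(password, salt, 32); // derived client-side
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", kek, iv);
  const wrapped = Buffer.concat([cipher.update(privateKey), cipher.final()]);
  return { salt, iv, tag: cipher.getAuthTag(), wrapped }; // safe to store server-side
}

function unwrapPrivateKey(
  blob: ReturnType<typeof wrapPrivateKey>,
  password: string
): Buffer {
  const kek = scryptSync(password, blob.salt, 32); // re-derived at login
  const decipher = createDecipheriv("aes-256-gcm", kek, blob.iv);
  decipher.setAuthTag(blob.tag);
  return Buffer.concat([decipher.update(blob.wrapped), decipher.final()]);
}
```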

1:00:50

Um,

1:00:52

So yeah, they also...

1:00:54

I think it's further down.

1:00:56

They go through every single issue they

1:00:58

found,

1:00:58

which is basically like your private key

1:01:00

was derived from your user ID.

1:01:02

So anybody who had your user ID could

1:01:04

plug it into this API and decrypt your

1:01:06

messages, which is anybody you message.

1:01:09

Or a lot of people will post their...

1:01:12

Well, Signal, for example,

1:01:13

but a lot of people will post their

1:01:15

username publicly because they're like,

1:01:16

hey, anybody who wants to contact me,

1:01:17

go ahead.

1:01:19

They said further down that metadata was

1:01:20

stored in plain text.

1:01:21

So basically every single mistake you

1:01:25

could possibly imagine a company doing or

1:01:28

a messenger doing,

1:01:29

it seems like they were doing.
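
A toy illustration of why the derived-key flaw is fatal — the derivation function here is made up, since the exact construction TeleGuard used isn't spelled out, but any deterministic derivation from a public ID has the same problem:

```ts
// Toy example: if a private key is derived only from a public user ID,
// anyone who knows the ID can re-derive the key. (Hypothetical derivation.)
import { createHash } from "node:crypto";

function derivePrivateKey(userId: string): Buffer {
  return createHash("sha256").update(userId).digest(); // no secret input
}

const owner = derivePrivateKey("alice-public-id");
const attacker = derivePrivateKey("alice-public-id"); // ID is public
console.log(owner.equals(attacker)); // true — the "encryption" adds nothing
```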

1:01:30

And oh, man, hold on.

1:01:32

I do have to find...

1:01:36

Yeah.

1:01:36

So the CEO, after publication,

1:01:38

the CEO contacted 404 Media via

1:01:40

LinkedIn,

1:01:42

hopefully from a company computer in a

1:01:44

direct message and said, quote,

1:01:46

the information is incorrect.

1:01:47

Exclamation point.

1:01:48

The person who gave you the technical

1:01:49

information has completely misled

1:01:51

you.

1:01:51

That person is not competent.

1:01:52

Exclamation point.

1:01:54

Uh,

1:01:54

the CEO did not provide any evidence for

1:01:55

this or point to any specifics.

1:01:57

Um, very.

1:01:59

Yeah.

1:02:00

I don't know.

1:02:00

I always like when people do that kind

1:02:01

of stuff.

1:02:02

Very professional.

1:02:04

Um,

1:02:04

So is my making fun of them,

1:02:05

but whatever.

1:02:07

So yeah, I personally,

1:02:09

I wanted to share this story because I

1:02:10

feel like in the privacy community in

1:02:12

general,

1:02:13

I see a lot of people who I

1:02:17

think...

1:02:18

we get excited about new projects.

1:02:21

I think there's two kinds of privacy

1:02:22

people.

1:02:22

I think there's the people who get excited

1:02:23

about new projects and the people who are

1:02:24

suspicious of anything new.

1:02:27

But I see a lot of people who

1:02:28

get excited about new projects and they're

1:02:30

constantly like, oh,

1:02:31

there's this new messenger I just heard

1:02:32

about.

1:02:32

I'm excited to try it.

1:02:33

What does everybody think?

1:02:34

And first of all,

1:02:35

I think that's really awesome when you go

1:02:37

to other members of the community.

1:02:38

What do people think?

1:02:39

And because I have seen one of the

1:02:41

messages that I mentioned when I was

1:02:44

trying to figure out where I've heard of

1:02:45

this app before.

1:02:46

I went to the privacy guides forum and

1:02:47

one person was asking like, hey,

1:02:49

what does everybody think of this?

1:02:50

And a lot of people were like, oh,

1:02:51

it's proprietary.

1:02:52

Like this seems weird.

1:02:53

This seems weird.

1:02:54

There's a lot of red flags here.

1:02:57

I don't think anybody did like an actual

1:02:58

technical analysis like this person did.

1:03:00

But, you know,

1:03:01

it's good to get that kind of feedback

1:03:02

from other people.

1:03:03

Like I'm very open about the fact that

1:03:05

I don't really know a lot of code.

1:03:07

I did take a...

1:03:09

There's a little app that kind of gamifies

1:03:10

learning code, kind of like Duolingo does.

1:03:12

And allegedly it taught me Python,

1:03:13

but I wouldn't trust me to code anything

1:03:15

in Python if I were you.

1:03:16

I can now look at Python and recognize

1:03:17

it as Python, basically.

1:03:19

So that said, like,

1:03:21

I think it's really good to,

1:03:23

in my case, you know, like, hey,

1:03:24

I don't know enough about code to

1:03:25

understand this.

1:03:26

Can anybody else weigh in on this?

1:03:29

That's a really good thing.

1:03:30

But I think it's just this...

1:03:34

be a little bit cautious, right?

1:03:36

There's a fine line because on the one

1:03:37

hand, if we never trust anything new,

1:03:39

we would never have any mass adoption of

1:03:42

all these great tools like Proton, Tuta,

1:03:44

Signal, SimpleX.

1:03:46

All these really good tools would never

1:03:47

get out of the small phase because nobody

1:03:50

would ever trust them.

1:03:51

But at the same time,

1:03:52

we have seen so many apps that shut

1:03:54

down or got sold.

1:03:56

Every once in a while,

1:03:56

it does turn out to be a honeypot.

1:03:58

And so there's a very fine line between

1:04:01

these things.

1:04:02

And-

1:04:03

Yeah, I would also ask,

1:04:04

especially with chat messengers,

1:04:07

one of my personal beefs is I feel

1:04:08

like there's

1:04:09

an obnoxious amount of messengers.

1:04:11

And one of the questions I always ask

1:04:13

with any new product, not just messengers,

1:04:15

but any new product is what are you

1:04:16

solving?

1:04:17

Like people send me links all the time

1:04:19

and they're like, this looks really cool.

1:04:20

And I'm like, okay, what is it doing?

1:04:22

What is it solving?

1:04:22

What problem is this solving that,

1:04:25

you know, whether it's a search engine,

1:04:26

an email provider, whatever,

1:04:28

like what is it doing that this existing

1:04:30

tool doesn't already do?

1:04:31

And I'd say about half the time people

1:04:32

are like, oh, I don't know.

1:04:33

I just,

1:04:34

I saw it and thought it was cool.

1:04:37

It's gotta be solving a problem for me

1:04:38

personally, but yeah.

1:04:40

So, um, I don't know.

1:04:42

Did you have any thoughts on this?

1:04:43

I know,

1:04:44

I think this one may have gone below

1:04:46

your radar a little bit,

1:04:47

but did you have any thoughts about it?

1:04:49

Yeah.

1:04:50

Um, yeah,

1:04:51

I think that's all a good takeaway.

1:04:53

Um,

1:04:53

thankfully the only thing I would say is

1:04:55

in the case of TeleGuard specifically,

1:04:57

um,

1:04:59

thankfully we've known about some of the

1:05:00

issues with it for a while.

1:05:01

I know that they note at the end

1:05:02

of this article, um,

1:05:05

TeleGuard handed over information to the

1:05:08

FBI,

1:05:11

according to the Washington Post.

1:05:14

That article was shared on our forum and

1:05:17

in all of the posts where TeleGuard is

1:05:19

brought up or in the thread about

1:05:21

TeleGuard itself.

1:05:24

People have known for a while that they

1:05:25

can provide information like the push

1:05:27

notification tokens and other information

1:05:31

and hopefully are avoiding that.

1:05:32

But yeah,

1:05:34

it's definitely a good thing to keep in

1:05:38

mind because there is a balance,

1:05:39

like you said.

1:05:39

We do need to have more products in

1:05:43

this space,

1:05:44

but knowing whether they work well is...

1:05:49

kind of tricky.

1:05:50

And it's always good to keep an eye

1:05:52

on this stuff because they certainly do

1:05:54

not always work the way that they

1:05:56

market themselves for sure.

1:05:59

I wasn't aware until I just read this

1:06:01

article that it was made by Swisscows.

1:06:03

I've heard a lot of, well,

1:06:04

not a lot,

1:06:05

but I've heard their search engine brought

1:06:08

up a few times.

1:06:09

I know that they also have a file

1:06:10

storage service,

1:06:11

which is as far as I know,

1:06:12

just based on Nextcloud and uses, like,

1:06:15

the Nextcloud end-to-end encryption,

1:06:16

which isn't the best.

1:06:19

So they kind of just seem to be

1:06:20

one of the,

1:06:21

one of those companies where they're

1:06:26

just putting stuff out there, probably with

1:06:28

open-source tools, without really adding

1:06:32

too much or changing it. I don't know

1:06:34

if TeleGuard is its own homebrew product. I

1:06:37

would imagine it is, because I don't know

1:06:39

of any, like, open-source stuff that would

1:06:41

be

1:06:43

that would have this poor encryption,

1:06:46

at least the people who are like forking

1:06:48

Element, for example,

1:06:51

are getting a reasonably decent encryption

1:06:54

implementation,

1:06:55

whereas I don't know what's going on with

1:06:57

TeleGuard.

1:07:00

But yeah,

1:07:02

I think, in this specific case,

1:07:03

people already know not to use it.

1:07:05

And otherwise,

1:07:09

with stuff not like this,

1:07:10

it's everything you said, for sure.

1:07:15

Yeah, I looked into Swisscows briefly,

1:07:18

I think the only thing it has going

1:07:19

for it is it says,

1:07:21

like the search engine.

1:07:22

It says that it will censor adult content,

1:07:26

which I think could be useful if you

1:07:28

have really young kids,

1:07:30

just as like one of those layers of

1:07:31

defense, you know,

1:07:32

maybe set that as the default search

1:07:34

engine on the family computer and

1:07:36

But then we get into the whole topic

1:07:38

of like,

1:07:38

at what point is it appropriate to kind

1:07:40

of transition your kids off that?

1:07:42

But I don't know.

1:07:43

I remember when I looked into it,

1:07:44

that was kind of the only advantage I

1:07:45

saw was like, okay,

1:07:47

I could see this if I had young

1:07:48

kids and I just wanted it as one

1:07:49

more layer of defense of like,

1:07:51

I don't want them to accidentally find

1:07:52

their way onto something bad.

1:07:53

But yeah, of course,

1:07:55

404 Media does note that TeleGuard

1:07:57

has a reputation of being linked to cam

1:08:01

models and child abusers at the end of

1:08:05

this article.

1:08:06

So how much I would trust their approach

1:08:08

to child safety,

1:08:09

it probably would not be that far.

1:08:11

But yeah, in general,

1:08:14

it's probably a good idea for companies to

1:08:16

be a bit more thoughtful about all of

1:08:18

that stuff.

1:08:20

Yeah, that's fair.

1:08:21

I, I don't know.

1:08:22

I trust 404 Media,

1:08:23

but I'm not going to lie.

1:08:24

When I read that part,

1:08:24

my brain kind of went to like,

1:08:27

I wonder how much Teleguard does get used

1:08:29

for that stuff.

1:08:30

I don't know.

1:08:32

Yeah,

1:08:32

maybe if you turn a blind eye to

1:08:34

it.

1:08:34

I don't know how much they market it,

1:08:37

but I know that Kik had this reputation.

1:08:42

Maybe it still does, I don't know.

1:08:45

It does with me, that's for sure.

1:08:46

Yeah,

1:08:47

I've definitely heard this about various

1:08:49

messaging apps to the point where it seems

1:08:52

to be...

1:08:53

If you have that reputation and it

1:08:55

remains,

1:08:55

it seems to be kind of intentional,

1:08:57

and if it's on the radar...

1:08:59

Of these officers saying that they're

1:09:01

notorious for it,

1:09:03

that is a bit of a red flag.

1:09:05

Of course, with law enforcement,

1:09:07

it can always go either way because a

1:09:08

lot of law enforcement officers will say

1:09:11

GrapheneOS, for example,

1:09:12

is notorious for being used by criminals

1:09:15

when in reality it's just a security tool.

1:09:19

But seeing as how this chat app doesn't

1:09:21

seem to provide adequate security,

1:09:26

I don't think it's the same sort of

1:09:28

situation.

1:09:30

Yeah.

1:09:30

Which not to get off topic,

1:09:31

but I know I've said in the past,

1:09:33

like,

1:09:33

cause there was that story about a year

1:09:34

or two ago about, um, apparently in Spain,

1:09:37

just having a pixel phone automatically

1:09:40

makes you suspicious,

1:09:40

like maybe not legally, but in practice,

1:09:42

it makes you suspicious because the only

1:09:45

people in Spain that have pixels are drug

1:09:48

dealers using GrapheneOS.

1:09:50

Um, and so it's the,

1:09:52

my argument when we covered that story

1:09:54

back then was like,

1:09:55

this is why we need to normalize tools,

1:09:57

uh, privacy tools,

1:09:58

because

1:09:59

if the only people using Signal are,

1:10:02

not that they're doing anything wrong,

1:10:03

of course,

1:10:04

but like dissidents and drug dealers,

1:10:06

then like it becomes like, oh,

1:10:08

you're on Signal,

1:10:09

you have something to hide, which,

1:10:10

you know,

1:10:10

in some countries being a dissident is

1:10:12

illegal.

1:10:12

So my point is I'm not trying to

1:10:14

morally group them into the same thing,

1:10:15

but my point,

1:10:16

it becomes something suspicious.

1:10:18

Whereas like if my stepdad is using

1:10:20

Signal, probably does not know what it is.

1:10:22

I had to download it and put it

1:10:23

on his phone and set it up for

1:10:24

him and get him in the family chat.

1:10:26

He didn't even know it could do video

1:10:27

or voice calls.

1:10:29

I tried to call him on it one

1:10:29

time and he didn't pick up.

1:10:31

So I called him on the regular phone

1:10:32

and he's like,

1:10:33

did you just try to call me on

1:10:34

signal?

1:10:34

I'm like, yeah, it does voice calls.

1:10:35

He's like, oh, I didn't know that.

1:10:37

So,

1:10:38

but my point being like when everybody's

1:10:39

using it,

1:10:40

then it takes away from that stigma

1:10:41

because they can't point to it and be

1:10:42

like, oh,

1:10:43

only bad people are using signal.

1:10:45

Really?

1:10:46

Really?

1:10:47

My seventy-year-old stepdad,

1:10:48

you think is running drugs from the

1:10:49

border?

1:10:50

Come on.

1:10:51

So anyways, yeah,

1:10:53

I just I know that's a little off

1:10:54

topic,

1:10:55

but I always feel the need to say

1:10:56

that.

1:10:59

So I think that'll take us into forum

1:11:03

updates if I remember correctly.

1:11:06

Yeah, well, in a minute, everyone,

1:11:08

we're going to start taking viewer

1:11:09

questions.

1:11:10

Of course,

1:11:10

you can always leave them in the chat

1:11:11

anytime.

1:11:12

But if you've been holding on to any

1:11:13

questions about any of these stories that

1:11:15

we've talked about so far, go ahead,

1:11:18

start leaving them now here in the chat

1:11:20

or in the forum thread for this live

1:11:24

stream.

1:11:25

Otherwise, yeah,

1:11:26

let's check it on the community forum.

1:11:28

There's always a lot of activity on the

1:11:29

forum every week,

1:11:30

so you should always check it out.

1:11:31

But here's a couple discussions that we

1:11:34

had

1:11:34

wanted to highlight from this week.

1:11:37

The first one is here about Russia's

1:11:41

internet blocks.

1:11:42

Let me get this pulled up.

1:11:48

This was just a discussion on a New

1:11:50

York Times piece which talked about

1:11:54

Russian internet restrictions and how

1:11:57

Russians are evading them.

1:11:58

So it's a bit of a cat and

1:11:59

mouse game there.

1:12:05

So the person who posted this said,

1:12:09

as some background, since early March,

1:12:11

Moscow and St.

1:12:12

Petersburg have experienced widespread

1:12:13

mobile internet blackouts,

1:12:15

not just blocked apps,

1:12:16

but full mobile data shutdowns.

1:12:18

Telegram is reportedly being blocked

1:12:20

entirely starting in April.

1:12:21

The government regulators now have the

1:12:24

authority to disconnect Russia from the

1:12:26

global internet entirely.

1:12:28

And some regions of Russia are on

1:12:29

lockdown

1:12:30

whitelist mode, meaning everything on the

1:12:32

internet is blocked except state-approved

1:12:34

services like Yandex and government

1:12:37

portals. So, yeah, this person was

1:12:42

interested in whether or not there's a way

1:12:44

around this government censorship, which

1:12:47

could expand to Europe and North America.

1:12:50

The whitelisting situation,

1:12:54

that is pretty tricky because that's going

1:12:56

to block even the ability to use Tor

1:12:59

bridges, for example.

1:13:02

I know that Tor bridges are probably the

1:13:06

best way to get around censorship,

1:13:08

but if you're in a full whitelist

1:13:10

situation, that may not work.

1:13:12

At the end of the day,

1:13:13

if your internet service provider isn't

1:13:15

going to allow you to make any sort

1:13:18

of connections,

1:13:22

There isn't much you can do about that

1:13:24

besides find an entirely alternative

1:13:26

network.

1:13:26

So people in this thread note that Russian

1:13:30

citizens have started using Meshtastic to

1:13:33

communicate,

1:13:34

which is a decentralized network that

1:13:36

doesn't use the internet at all.

1:13:37

It uses LoRa radios,

1:13:42

which are small devices that you can

1:13:43

connect to your phone to communicate,

1:13:45

but they have very limited range,

1:13:48

although you can set up a mesh with

1:13:50

them.

1:13:54

There's probably other solutions,

1:13:56

but I think, yeah,

1:13:59

there's probably not too much you could do

1:14:01

from a technical perspective here that I

1:14:05

can think of.

1:14:06

Was there anything in this forum post that you

1:14:08

wanted to highlight specifically?

1:14:11

No,

1:14:13

it was really just kind of the Russian

1:14:16

internet blocks in general.

1:14:17

I know when those, well,

1:14:20

when the war in Ukraine first started,

1:14:22

I know Russia started cracking down on

1:14:25

VPNs.

1:14:26

And at the time I was with Surveillance

1:14:28

Report and Henry really made a good point

1:14:30

about how this is one of the drawbacks

1:14:32

of a centralized app store.

1:14:34

And at the time we were talking about

1:14:35

Apple,

1:14:35

but now it seems like we're starting to

1:14:37

talk about Android too.

1:14:39

Um, because, you know,

1:14:40

with Android and sideloading, uh,

1:14:42

which I know people don't like that term,

1:14:43

but you know,

1:14:44

Android and installing third-party

1:14:45

apps, whatever you want to call it.

1:14:47

It's, um,

1:14:49

it's kind of hard for Android to be

1:14:50

like, well,

1:14:50

we blocked VPN installs because they can't

1:14:52

block VPN installs and Tor installs.

1:14:55

Whereas Apple, you know, when,

1:14:56

when Russia came to Apple and was like,

1:14:58

Hey,

1:14:58

remove Proton VPN and NordVPN and all

1:15:00

these VPNs, they had no choice,

1:15:02

but to be like, all right, will do.

1:15:04

Cause you know that everything's so

1:15:06

centralized and locked down, but.

1:15:08

Yeah, with a total internet blackout,

1:15:11

the thing that comes to mind is years

1:15:13

ago, again, back on surveillance report.

1:15:17

So I interviewed John Todd,

1:15:22

who was the president of Quad Nine,

1:15:24

I think.

1:15:26

He's from Quad Nine.

1:15:27

I think he was the president at the

1:15:28

time.

1:15:28

I'm not sure if he's still there.

1:15:29

But it was interesting because we talked

1:15:31

about,

1:15:31

or it briefly came up about censorship

1:15:35

resistance.

1:15:35

And something he said that always stuck

1:15:36

with me is he's not a fan of,

1:15:40

DNS over HTTPS specifically for stuff like

1:15:43

this, because if the government starts

1:15:48

doing mass blocking at a DNS level and

1:15:51

you use something like DoH, it makes your

1:15:54

traffic just blend in and eventually it

1:15:57

kind of like

1:16:01

I'm not trying to use language that is

1:16:03

sympathetic to a government for the

1:16:04

record,

1:16:04

but it kind of backs them into a

1:16:06

corner where they just decide to shut off

1:16:08

the internet entirely because they can't

1:16:09

figure out what traffic is going around

1:16:11

the censorship.
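
For reference on why DoH blends in: the lookup is just an ordinary HTTPS request to a web server. A minimal sketch using Cloudflare's public JSON resolver API:

```ts
// A DNS-over-HTTPS lookup looks like any other HTTPS request on the wire,
// which is exactly why censors struggle to single it out.
async function dohLookup(name: string): Promise<string[]> {
  const res = await fetch(
    `https://cloudflare-dns.com/dns-query?name=${encodeURIComponent(name)}&type=A`,
    { headers: { accept: "application/dns-json" } }
  );
  const body = await res.json();
  return (body.Answer ?? []).map((a: { data: string }) => a.data);
}

dohLookup("privacyguides.org").then((ips) => console.log(ips));
```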

1:16:12

And I guess he was really ahead of

1:16:14

his time with that prediction because

1:16:16

that's basically what we're looking at

1:16:18

right now.

1:16:18

So yes,

1:16:18

it's a really tricky thing because how

1:16:21

would you, you know, and especially,

1:16:22

I don't know,

1:16:24

I feel like completely disconnecting from

1:16:25

the global internet is a completely

1:16:26

different beast that I don't even know how

1:16:28

we would handle that.

1:16:29

And I guess at that point it's,

1:16:32

I mean,

1:16:32

it's what are you trying to do?

1:16:33

If you're just trying to talk to people

1:16:34

locally, then yeah.

1:16:36

Things like Meshtastic, I think,

1:16:39

I really want to get into that,

1:16:40

but it looks like it would require a

1:16:43

little bit of skill just to kind of

1:16:45

first time dive in, you know,

1:16:46

to figure out the hardware and figure out

1:16:48

the install and the apps and the,

1:16:51

it feels like a bit of a commitment,

1:16:53

but if there's maybe a way to make

1:16:54

those things a little bit more

1:16:55

user-friendly or...

1:16:59

I don't know.

1:16:59

Yeah,

1:17:00

it's a good question because there's

1:17:02

different things you would need in that

1:17:04

situation, right?

1:17:05

I would need to be able to communicate

1:17:06

with my family here in the country,

1:17:09

hypothetically.

1:17:10

But then I would also need to be

1:17:13

able to communicate with the wider

1:17:15

internet and get information,

1:17:16

which here in the US, unfortunately,

1:17:18

we are kind of the wider internet.

1:17:19

But in another country,

1:17:20

that wouldn't be the case.

1:17:21

Or I mean, Proton even.

1:17:22

I wouldn't be able to check my ProtonMail,

1:17:24

so...

1:17:24

Yeah.

1:17:25

I don't know.

1:17:26

It's crazy.

1:17:26

And even Meshtastic,

1:17:27

that puts people in a dangerous situation.

1:17:30

And there's always the possibility that

1:17:33

Russia could, I mean,

1:17:35

both ban the use of it,

1:17:36

but also ban the import of Meshtastic

1:17:38

hardware.

1:17:40

I doubt any of it is being made

1:17:41

domestically in Russia.

1:17:44

And if anything is or could be,

1:17:47

the Russian government could stop that.

1:17:49

Yeah.

1:17:50

and also just using it,

1:17:52

or any sort of radio service,

1:17:53

you can be trivially tracked.

1:17:56

It does have a short range,

1:17:58

so it depends where you are,

1:17:59

but if people go around from the

1:18:01

government and try to track people down

1:18:03

who are using Meshtastic in the future,

1:18:06

they would pretty much be able to find

1:18:07

out who's

1:18:08

using it. So there are

1:18:10

concerns there. I mean, we even talked about,

1:18:13

in a previous episode — I think it

1:18:15

was in Belarus, if I remember correctly —

1:18:18

ham radio enthusiasts were being accused

1:18:20

of, like, being espionage agents,

1:18:23

basically,

1:18:26

for using their own, like, radio waves to

1:18:28

communicate rather than these

1:18:30

government-sanctioned things.

1:18:31

So, like, any of this amateur

1:18:34

radio stuff does put you in a dangerous

1:18:36

position in a country like this.

1:18:38

And especially if it becomes too

1:18:40

widespread,

1:18:41

it's very easy to imagine that Russia

1:18:43

would take a similar position to the

1:18:45

Internet in general and just blanket ban

1:18:47

it because they don't really need it.

1:18:51

The other reason this can't really be

1:18:53

solved from like a technical perspective

1:18:55

is

1:18:57

Um, like,

1:18:57

I don't think it's something that another

1:18:59

country like the United States or someone

1:19:01

else could kind of reach in and try

1:19:04

to solve for Russian citizens.

1:19:06

Like immediately what might come to mind

1:19:08

is something like Starlink, for example,

1:19:10

providing direct access to the internet,

1:19:12

bypassing, you know,

1:19:14

anything going on in Russia.

1:19:17

But Starlink,

1:19:18

like when that technology is in place,

1:19:21

we see it used for, um,

1:19:25

a lot of different things that the United

1:19:27

States and companies like SpaceX

1:19:29

definitely do not want to promote or

1:19:31

support.

1:19:31

We saw in the war with Ukraine,

1:19:35

for example,

1:19:36

Russian frontline troops were using

1:19:38

Starlink extensively to communicate on the

1:19:41

battlefield.

1:19:41

That's actually the reason SpaceX

1:19:45

does not operate in that region at all

1:19:47

and hasn't for many years.

1:19:49

And bringing it back for Russian citizens

1:19:51

to get around something like this would

1:19:54

just enable that usage of it again,

1:19:58

which they definitely don't want to do.

1:20:00

So it puts Russians in not a great

1:20:04

situation and really the only solution.

1:20:08

like we say for a lot of these

1:20:09

very widespread privacy issues,

1:20:12

whether it's age verification in Western

1:20:15

countries or mass censorship in other

1:20:19

countries, like in this case,

1:20:21

it's more of a social issue that you

1:20:24

have to resolve within your own country.

1:20:26

And hopefully people can fight back

1:20:28

against this there.

1:20:32

Because, I mean,

1:20:33

this should not be unacceptable.

1:20:34

I mean, this should not be acceptable,

1:20:37

if you know what I mean.

1:20:38

So, yeah.

1:20:41

Yeah,

1:20:41

it's definitely something tricky that I

1:20:44

don't know if we're qualified to solve.

1:20:46

But I guess if there's any takeaways on

1:20:49

this one,

1:20:49

it would just be kind of a...

1:20:54

I'm pretty open about having a mild

1:20:55

interest in disaster prep.

1:20:57

And sometimes that gets categorized as

1:21:00

wrongly — it gets characterized as, like, you

1:21:02

know, worrying about the end of the world,

1:21:04

which I don't care about. But,

1:21:06

you know, just little things like floods,

1:21:08

hurricanes, tornadoes, earthquakes. And

1:21:11

unfortunately, we are in an incredibly

1:21:12

digital world, so you have to think about

1:21:14

outages and cyber attacks. And so, I

1:21:17

guess, yeah, if nothing else, this is just

1:21:19

kind of a thought experiment of:

1:21:21

if you're listening and you're in a

1:21:23

situation where you don't have to worry

1:21:24

about this yet,

1:21:26

just think about it a little bit.

1:21:27

Like don't lose any sleep over it,

1:21:28

but you know, what,

1:21:29

what would I do in that situation?

1:21:30

And just kind of give that some thought,

1:21:32

I guess.

1:21:36

The other forum post we were going to

1:21:39

look at, this one,

1:21:40

there's probably not too much to say on

1:21:41

this one, but there's a new video,

1:21:43

a YouTuber, this got shared on our forum,

1:21:46

that said, if you ran this debloater,

1:21:48

reinstall your system immediately.

1:21:49

And this is specifically,

1:21:52

so for the Windows users out there,

1:21:54

you know that there's a lot of scripts

1:21:58

that promise to do all kinds of different

1:22:00

things to your system.

1:22:02

Um,

1:22:03

there's a lot of ones that are popular

1:22:04

in the privacy community that promise to

1:22:06

remove a lot of telemetry and stuff.

1:22:08

There's also some that claim to optimize

1:22:10

the graphics and the performance and this,

1:22:12

that, and the other,

1:22:13

there's even entire Windows ISOs that, um,

1:22:18

I want to say it was called

1:22:19

AtlasOS.

1:22:19

And if I've got that wrong,

1:22:20

I apologize to those guys.

1:22:21

But there was one that advertised itself

1:22:23

as like a gaming distro.

1:22:25

And it's basically like you install

1:22:26

Windows from scratch using this customized

1:22:28

ISO and it comes pre-optimized for gaming.

1:22:32

But the downside is it turns off Windows

1:22:35

Defender.

1:22:37

which I don't know why you would do

1:22:38

that.

1:22:39

So yeah, I'll be honest,

1:22:41

I didn't watch this specific video,

1:22:42

but basically it was not trustworthy.

1:22:46

I think it may have even come with

1:22:47

actual malware,

1:22:48

but don't quote me on that.

1:22:50

And so this whole thread is basically

1:22:54

talking about these debloaters and

1:22:55

stuff.

1:22:56

And I think the official position of

1:22:58

Privacy Guides is that we don't recommend

1:22:59

them because they are...

1:23:02

They're tricky.

1:23:03

I know there's some that are open source

1:23:04

or source available, I should say.

1:23:07

And if you know code and you're

1:23:08

comfortable doing it, then sure,

1:23:10

you could look through it and make sure

1:23:11

that you verify what it's doing.

1:23:13

But it's definitely very...

1:23:15

A lot of these debloaters,

1:23:16

you're giving them a lot of power over

1:23:17

your Windows system.

1:23:18

And...

1:23:20

If you're going to use one,

1:23:20

you have to be, like, "came down off

1:23:22

a mountain and founded a religion" positive

1:23:23

that this thing is trustworthy because it

1:23:27

would not take much for it to do

1:23:28

something malicious,

1:23:29

whether that's planting malware,

1:23:31

crypto mining, stealing data, whatever.

1:23:33

So, yeah.

1:23:34

Absolutely.

1:23:36

I think I just wanted to take a

1:23:37

moment to mention that.

1:23:39

And it's worth noting this tool in

1:23:41

question is open source as well.

1:23:44

As far as I know,

1:23:44

it doesn't come with malware,

1:23:45

but it basically acts...

1:23:48

the way that malware would.

1:23:50

There's a pinned comment on this video

1:23:52

saying there's a few inaccurate

1:23:53

statements, but overall,

1:23:57

the conclusion of the video is that this

1:24:00

is all implemented poorly.

1:24:01

I do think this is a classic case

1:24:02

of like,

1:24:05

somebody putting something out there

1:24:06

without really fully understanding what it

1:24:08

does.

1:24:08

I don't want to say whether it is

1:24:11

or not,

1:24:11

but I think we're just going to see

1:24:14

this happening more often as more people

1:24:16

try to use AI coding tools to solve all

1:24:18

of their problems without really

1:24:19

understanding what they are.

1:24:22

Whether or not that's the case in this

1:24:24

situation,

1:24:24

it's definitely something to look out for

1:24:26

when you're running any sort of scripts

1:24:29

that you don't fully understand.

1:24:31

I think that is absolutely the most

1:24:32

important takeaway here,

1:24:34

that you cannot run any of these

1:24:37

debloating scripts unless you know exactly

1:24:39

what they do and you see how they're

1:24:40

doing it,

1:24:40

because at which point you could probably

1:24:43

do it yourself, by the way.

1:24:44

But yeah, all of these scripts like this,

1:24:49

they affect the system so substantially.

1:24:55

And Windows is already such a not secure

1:24:58

and not private platform in the first

1:25:00

place that it doesn't really make a lot

1:25:01

of sense to me to try and improve

1:25:03

it, especially to this degree,

1:25:06

unfortunately.

1:25:07

There isn't really a ton...

1:25:11

that you can do at the end of

1:25:12

the day to improve your privacy on Windows

1:25:14

because the operating system itself is

1:25:16

going to be constantly fighting against

1:25:18

you.

1:25:21

So that's unfortunate.

1:25:24

It's cool to see more videos from this

1:25:26

YouTuber.

1:25:28

I first...

1:25:31

heard about this person who made this

1:25:33

video calling the other YouTuber who made

1:25:35

the script out because they made a video

1:25:37

about Freely around the same time that I

1:25:40

published a video about Freely.

1:25:41

So they came up in my feed.

1:25:44

And I think we had some overlapping

1:25:46

complaints.

1:25:48

I haven't watched the rest of their

1:25:49

videos,

1:25:49

but I think anybody who is creating

1:25:51

content in the privacy space —

1:25:54

that's always a good thing.

1:25:56

And so if they continue to be brought

1:25:59

up and they continue to post more useful

1:26:03

content like this,

1:26:05

I think that's fantastic.

1:26:09

Yeah, I think about that a lot.

1:26:10

There's...

1:26:13

I think there's still plenty of room in

1:26:14

the privacy space for more voices,

1:26:15

for sure.

1:26:17

Yeah, real quick,

1:26:17

looking through his description on his

1:26:19

video, you're right.

1:26:21

It doesn't look like it installs malware,

1:26:23

but it disables crucial security

1:26:24

components and makes your system severely

1:26:26

vulnerable to malware.

1:26:27

So it also makes your system much more

1:26:29

unstable and prone to corruption and

1:26:31

breaking.

1:26:31

I recommend that anyone who ran this tool

1:26:32

immediately reinstall a fresh copy of

1:26:34

Windows.

1:26:34

So yeah, they're dangerous things.

1:26:39

You've got to make sure they're trusted if

1:26:40

you're going to use them at all.

1:26:42

I also totally hear your argument of like,

1:26:44

it's already so like such a lost cause

1:26:47

that it may be safer for a lot

1:26:48

of people to just not even try and

1:26:49

just stick to like the,

1:26:51

the toggles and the settings and stuff.

1:26:53

So, yeah.

1:26:53

Yeah.

1:26:56

Well,

1:26:57

we've been going for about an hour and

1:27:00

a half here.

1:27:02

We'll probably give a last call for any

1:27:04

questions or comments that people want to

1:27:06

leave on the forum.

1:27:07

I don't think we had any on our

1:27:08

forum post today,

1:27:11

and I'm not sure if I've seen any

1:27:13

in the chat here.

1:27:15

I know there's a bit of a delay

1:27:17

on this live stream between when I'm

1:27:19

saying this and when you'll hear it,

1:27:21

so we'll give people a couple of minutes

1:27:23

if you want to add anything else.

1:27:24

Otherwise,

1:27:26

we'll probably begin to wrap things up.

1:27:28

So this is your final warning,

1:27:31

anyone who's watching and wants to chime

1:27:34

in on any of these stories.

1:27:36

It's always my favorite part of a live

1:27:38

stream, knowing that delay's there.

1:27:39

So you say stuff like that, like, Hey,

1:27:41

we're going to open the floor.

1:27:42

And now I have to fill time for

1:27:44

a couple minutes and give people time to

1:27:45

hear it and write their questions.

1:27:48

Uh, it just feels so awkward, but, um,

1:27:52

Yeah,

1:27:52

apparently we don't have any questions in

1:27:55

the forum here.

1:27:57

Got one question from Hogan in the chat

1:27:59

here.

1:28:00

There's been a couple of supply chain

1:28:01

attacks recently.

1:28:02

The current best practice is to always

1:28:04

update your apps,

1:28:04

but this opens you up to those attacks.

1:28:07

Does it still make sense to keep apps

1:28:09

updated as recent as possible?

1:28:12

Definitely depends on the app.

1:28:14

I know you have definitely seen this more

1:28:17

prominently in apps that are built with

1:28:20

NPM, probably a lot of web-based apps.

1:28:23

But in general,

1:28:24

I do think it's probably the safer option

1:28:28

to keep your apps up to date versus

1:28:34

not updating them because

1:28:38

Typically,

1:28:40

all of these updates are going to... Well,

1:28:42

not all of them,

1:28:43

because some of them just add features.

1:28:44

But most updates that you see are going

1:28:46

to be patching known vulnerabilities or

1:28:48

vulnerabilities that you already see in

1:28:50

the wild.

1:28:52

And so the potential of a new update

1:28:54

having a zero-day vulnerability that

1:28:55

hasn't been discovered yet is probably a

1:28:58

lot lower than the potential of using code

1:29:02

that

1:29:03

almost certainly has known vulnerabilities

1:29:05

that can be exploited.

1:29:08

So yeah,

1:29:09

I would definitely recommend keeping apps

1:29:12

up to date.

1:29:13

And especially the lower level you go,

1:29:15

the more important it is.

1:29:16

Keeping your operating system up to date

1:29:17

is super important.

1:29:18

As we saw, I don't know if...

1:29:22

when we talked about this on the show,

1:29:24

but recently iOS had a bunch of updates

1:29:28

for zero-day vulnerabilities in Safari and

1:29:32

some other security vulnerabilities which

1:29:33

were not patched at all in the previous

1:29:37

version of iOS.

1:29:39

You had to be on the latest iOS to receive

1:29:41

some of these security updates.

1:29:43

And so it's examples like that where even

1:29:46

a company like Apple,

1:29:47

which is relatively well known to provide

1:29:52

security patches to older versions of

1:29:54

their operating system,

1:29:55

they almost are never doing that super

1:29:59

consistently.

1:30:00

And in that case, they weren't.

1:30:02

And so it's always, I think,

1:30:03

a danger to not be fully up to

1:30:05

date.

1:30:06

It's interesting, kind of related to this,

1:30:08

I just installed an app on my phone

1:30:11

and during the setup process it said you

1:30:13

should disable automatic OS updates

1:30:15

because they don't validate how it works.

1:30:17

And I was like, that's terrible advice.

1:30:21

That was a medical device related app.

1:30:23

So I think they were saying that because

1:30:27

they have to validate how OS updates work,

1:30:30

but it's like that kind of puts all

1:30:32

of the users of this device in danger.

1:30:34

So that's...

1:30:36

Yeah, sometimes you will definitely see

1:30:39

software and advice that are at odds with

1:30:41

security advice. But generally, yeah, keep

1:30:46

your stuff up to date. Yeah, I agree.

1:30:50

I think — I hope... I would

1:30:54

be interested to see an actual study on

1:30:57

how many, like, supply chain attacks versus

1:31:00

how many, like, known vulnerabilities or

1:31:02

zero-days are being patched. I'd be

1:31:05

willing to bet that

1:31:07

the supply chain attacks are more rare

1:31:10

just by raw numbers.

1:31:11

And, you know, something,

1:31:13

something I struggle with a lot in all

1:31:15

areas of life is I forget what the

1:31:18

name of it is, but, um,

1:31:20

It's a logical fallacy because news is

1:31:22

news because it's out of the ordinary,

1:31:25

right?

1:31:26

Even the example I like to use is

1:31:28

traffic accidents.

1:31:28

Nobody ever goes on... Tonight at five,

1:31:31

man gets home from office safely without

1:31:33

incident.

1:31:34

Nobody talks about that.

1:31:35

And even traffic,

1:31:37

accidents are so common that we don't even

1:31:39

really talk about the accidents that much.

1:31:40

It's usually just like, hey,

1:31:41

traffic's bad because there's an accident.

1:31:43

It's more about the traffic.

1:31:44

But

1:31:45

News is news because it's unusual.

1:31:48

So when we see all these supply chain

1:31:51

attacks, it's because, I'm guessing,

1:31:53

they're still the exception instead of the

1:31:56

norm.

1:31:57

But that said, I hope...

1:32:01

kind of a dark way to look at

1:32:02

it, but I,

1:32:02

I hope we're seeing enough of them that,

1:32:05

um,

1:32:05

companies are starting to wake up and

1:32:07

realize the importance of securing their

1:32:08

supply chain, whatever that may look like.

1:32:11

Um,

1:32:11

and hopefully we will start to see those

1:32:13

go down because if they do become too

1:32:15

common,

1:32:16

it becomes a problem for the companies

1:32:17

too.

1:32:17

Cause think about it.

1:32:18

That's money

1:32:18

they have to spend to regain control,

1:32:21

kick out the person,

1:32:22

try to push out the good code to

1:32:24

fix the bad code.

1:32:25

Um, the reputational damage,

1:32:27

like all of that stuff.

1:32:28

So,

1:32:29

It affects them too.

1:32:30

I don't know if we're at that point

1:32:31

yet, but... Absolutely.

1:32:33

I mean,

1:32:33

to take it to the most extreme example,

1:32:36

right?

1:32:36

Like,

1:32:37

you never would ever see a news article

1:32:41

today about a new vulnerability in Windows

1:32:44

XP or something like that.

1:32:45

But everyone knows you can't be using

1:32:46

Windows XP on the open internet because

1:32:51

it's just so insanely vulnerable to all of

1:32:54

these attacks.

1:32:54

But, like...

1:32:57

We already know you shouldn't be using it.

1:32:59

So if a new attack is discovered,

1:33:00

that's not going to make the news.

1:33:01

And that's going to be the case for,

1:33:03

I think,

1:33:03

a lot of apps that you don't keep

1:33:05

up to date, which is why, in general,

1:33:07

I would still say the updates are super

1:33:09

important.

1:33:15

Yep.

1:33:16

All right.

1:33:18

I guess that's all we got this week.

1:33:27

Okay.

1:33:28

All right.

1:33:30

Well, I think, yeah, we can, I'll just,

1:33:33

I'm going to give the forum thread one

1:33:35

more check unless you just did,

1:33:37

but it looks like there's nothing else.

1:33:38

Yeah,

1:33:38

I've got it open on another window here.

1:33:39

Cool.

1:33:40

Okay, well, thanks everyone for tuning in.

1:33:44

Like usual,

1:33:45

all of the updates from This Week in

1:33:47

Privacy,

1:33:48

we share them on the blog and in

1:33:50

our email newsletter every week,

1:33:51

so you can sign up for that newsletter

1:33:53

or subscribe with your favorite RSS reader

1:33:55

if you want to stay

1:33:57

tuned about new episodes,

1:33:59

and also all of the sources for this

1:34:01

episode.

1:34:02

That's where we post them all,

1:34:03

so if you want links to all the

1:34:04

articles we talked about, check that out.

1:34:07

For people who prefer audio,

1:34:09

we have a podcast available on all podcast

1:34:11

platforms in RSS,

1:34:12

so you can use your own podcast reader.

1:34:15

The recording for this video is also going

1:34:18

to be synced to PeerTube like usual,

1:34:20

so you can watch it outside of YouTube.

1:34:24

Here at Privacy Guides,

1:34:25

we are an impartial nonprofit organization

1:34:28

that is focused on building a strong

1:34:30

privacy advocacy community and delivering

1:34:33

the best digital privacy and consumer

1:34:35

technology rights advice on the internet.

1:34:37

If you want to support our mission,

1:34:39

you can make a donation on our website

1:34:41

at privacyguides.org.

1:34:43

To make a donation,

1:34:44

you can click the red heart icon located

1:34:46

in the top right corner of our website,

1:34:49

or go to privacyguides.org slash donate.

1:34:52

You can contribute using standard fiat

1:34:54

currency via debit or credit card,

1:34:56

or opt to donate anonymously using Monero

1:34:59

or with your favorite cryptocurrency.

1:35:01

Becoming a paid member of Privacy Guides

1:35:03

will unlock exclusive perks like early

1:35:05

access to video content,

1:35:07

early access to the show notes for this

1:35:10

show,

1:35:11

and priority during the This Week in

1:35:13

Privacy livestream Q&A.

1:35:16

You'll also get a cool badge on your

1:35:17

profile on the forum and the warm,

1:35:20

fuzzy feeling of supporting independent

1:35:23

media.

1:35:24

Thank you all again for watching,

1:35:26

and we will see you next week.