Can You Get Arrested For Wiping Your Phone?
Ep. 31

Can You Get Arrested For Wiping Your Phone?

Episode description

A man is being charged for wiping his phone before being arrested, India is considering enabling GPS tracking on all mobile devices, a remote code execution vulnerability is exploiting over 30 organisations and 70k IP addresses, RAM shortages are causing massive increases in RAM costs, and much more, join us for This Week In Privacy #31!

Download transcript (.srt)
0:06

[MUSIC PLAYING]

0:22

Welcome back to This Week in Privacy, our weekly series

0:25

where we cover the latest updates with what we're working on

0:27

within the Privacy Guides community.

0:29

in this week's top stories in the data privacy

0:31

and cybersecurity space.

0:34

Privacy Guides is a nonprofit which researches

0:36

and shares privacy-related information

0:38

and facilitates a community on our forum

0:40

and matrix where people can ask questions,

0:42

get advice about staying private online

0:44

and preserving their digital rights.

0:46

Before we dive into this week's show,

0:48

here's a rundown of how the show is laid out.

0:50

We'll start by covering site updates,

0:52

then we will discuss the top stories

0:53

in data privacy and cybersecurity.

0:55

Next, we'll explore some trending posts

0:57

on the Privacy Guides forum.

0:58

And finally, we will answer questions from viewers.

1:01

So if you have a question for us, please leave a comment

1:04

either on the forum thread over at discuss.privacyguides.net

1:08

or in the YouTube live chat.

1:11

I am Nathan and I am joined today by Jordan.

1:15

Hello, Jordan.

1:16

- Hey, how's it going?

1:17

This is, I'm excited to be here this week.

1:21

- Excited to be co-hosting with you.

1:23

I think we co-hosted once together so far, right?

1:27

- Yes, a couple of weeks back, but yeah.

1:29

So just to let everyone know,

1:31

this is kind of a last minute swap in here.

1:34

Jordan had some technical issues,

1:35

so we'll be trying our best this week,

1:38

but just letting you know ahead of time.

1:42

- Yeah, no worries.

1:44

All right, we're gonna go ahead and kick things off

1:47

with the site updates.

1:48

It's been a little bit of a quieter week.

1:51

We got a lot of stuff cooking behind the scenes,

1:54

But to start with, we do have a new video out for members.

1:58

We have released the first,

2:01

so we've been talking about doing

2:02

this smartphone security course.

2:04

There's technically four videos, there's an intro,

2:06

and then there's like beginner, intermediate, and advanced.

2:08

The intro and the beginner are now out for members.

2:11

So the intro's real short, I think it's less than two minutes.

2:14

So I would say the first of three videos is out,

2:16

but it's technically the first two of four.

2:19

So if you are a member over on the Privacy Guides Forum,

2:21

or I believe even here on YouTube,

2:23

you could go ahead and check that out already.

2:26

And we've got another video in the works

2:29

that I actually sent off to Jordan yesterday, I believe,

2:32

that is ready for the next phase of editing.

2:34

So that is shot and that is edited.

2:36

We're doing some, or I'm trying some interesting things

2:41

with like moving around and just,

2:43

I'm excited for you guys to see it.

2:44

I think obviously Jordan does amazing work on the editing

2:47

and I think it's really gonna be exciting.

2:50

So definitely let us know what you guys think about that

2:52

when it comes out.

2:54

And we do have one change to the forum

2:58

and we wanted to let you guys know about it.

3:00

We have added experience levels.

3:03

So when you join the forum now,

3:05

we will prompt you to go ahead and enter,

3:08

this totally self-reported,

3:10

what do you think your experience level is?

3:11

And it's a beginner, intermediate and advanced.

3:14

And it'll add a little tag to your profile.

3:18

Actually, if you look at the announcement,

3:21

I don't know if we can put that on screen or not,

3:22

but it is right there on the forum.

3:24

Jonah has a screenshot where he shows an example

3:27

of what it'll look like on your profile.

3:30

And there seems to be some confusion about this.

3:33

So we're gonna go ahead and talk about it real quick.

3:36

This is, again, this is entirely self-reported

3:38

and it's designed to make the forum a little bit more,

3:43

how would I describe it?

3:44

I guess user-friendly, not really user-friendly,

3:46

but basically what it is, is it's designed for,

3:49

if you are more of a beginner,

3:52

then when you ask a question,

3:54

it'll automatically add a tag that says ELI5,

3:58

which is explained like I'm five.

3:59

And it basically lets other people know

4:02

not to withhold anything.

4:04

Jonah actually specifically says that in the announcement here,

4:06

like we don't expect you to withhold information

4:08

for it being too difficult.

4:09

But, you know, just to kind of explain it in simpler terms

4:11

and to be patient with the person asking the question,

4:13

because they may not necessarily know all the same things.

4:16

They may be kind of new to this privacy stuff.

4:18

And it does not change the experience at all for anyone else.

4:22

If you're an advanced user or an intermediate user,

4:24

you can still add that tag.

4:25

If it's a concept that you're not really familiar with

4:28

and you want that kind of explanation,

4:30

there's no like sections of the forum

4:33

that are not available to beginners

4:35

or vice versa or anything like that.

4:36

So just to be clear, and again, it's self-reported.

4:40

So, you know, it's kind of up to you.

4:43

Do you feel like an expert or expert is the wrong word?

4:45

Do you feel like a more advanced user?

4:46

Do you feel like an intermediate user?

4:48

And I believe you can actually change it too

4:50

at a later date.

4:51

So maybe after a little while, you're like,

4:52

yeah, I feel like I'm getting the hang of this.

4:53

You can bump it up to intermediate.

4:55

So yeah, hopefully that is something

4:58

that will make the site a little bit more welcoming

5:00

for people who are maybe just getting started

5:04

in the privacy journey.

5:05

'Cause I have seen from a lot of other people,

5:09

it can be really overwhelming when you first join.

5:11

And the goal is to try and make it a little bit easier

5:12

to navigate for people who are new

5:14

to PrivacyGuides forums.

5:16

So, and sorry, I'm navigating back to my notes here.

5:22

Unless Jordan has anything to add,

5:24

I think we'll start getting into the news.

5:27

You got anything I missed or?

5:28

- No, great overview on everything.

5:30

I guess we can dive in straight into this first story here.

5:34

And it's a write up from Kevin.

5:38

So here's our community and news intern.

5:41

And it's about an Atlanta activist being charged

5:45

wiping their phone before CBP search. So just reading from Kevin's summary here.

5:56

Samuel Tunik, a Atlanta-based activist, was arrested and charged with destroying evidence

6:02

after a US Customs and Border Protection unit searched his Google Pixel smartphone

6:07

404 Media reports. The search was conducted by the Tactical Terrorism Response Team,

6:14

an elite unit of the CBP on January 25th, 2025. Tunik supposedly erased the data of his pixel

6:22

smartphone before the CBP officer searched his device. Although the circumstances of his arrest

6:28

was unclear, an official indictment states that Tunik allegedly did knowingly destroy

6:35

damage, waste, dispose of, and otherwise take any action to delete the digital contents of

6:41

of a Google Pixel cellular phone.

6:44

There is no evidence that Tunik committed any crimes

6:47

beyond this charge.

6:50

As of today, Tunik has been released

6:52

and is awaiting further trial proceedings.

6:55

He's not allowed to leave the Northern District of Georgia

6:57

in the meantime.

6:59

Privacy Guides cannot confirm if the device had GrapheneOS

7:02

or a similar operating system installed.

7:05

GrapheneOS is exclusively to Pixel devices

7:09

providing features such as a dress pin that allows users to quickly wipe data off their

7:14

phone in high risk situations.

7:18

And as you might know, evidence tampering is a crime in most countries.

7:23

If you're caught erasing data prior to a law enforcement search, you may experience similar

7:28

charges faced by tunic.

7:31

Nevertheless, journalists and activists alike may feel unnerved by this recent development.

7:39

So I feel like this is kind of a unprecedented thing, I suppose.

7:45

I was under the, I'm completely like, I'm not super into like how US law enforcement

7:51

works, but I was under the impression that you would get this sort of arrested if you

7:59

wiped the device after they collected it

8:02

or while they're in the process of collecting it.

8:05

But am I understanding correctly that this happened

8:08

before the Border Patrol people collected the device?

8:16

- I believe so.

8:18

So it's really hard to know for sure.

8:22

But yeah, according to 404 Media here,

8:25

It says that he wiped the device before they were able

8:30

to search it and I believe it was before he was arrested

8:34

or anything and yeah, it's,

8:40

this is a really confusing, not confusing story.

8:45

So I have a friend who is a lawyer

8:48

and is very passionate about this kind of stuff.

8:50

And she sent me like eight pages of thoughts about this

8:55

phones. And this is specific to the US. Yeah. To be fair, a lot of those pages were a paper

9:02

she wrote in college that she shared with me that was basically about the state of cell

9:09

phones search rights here in the US. And it's very unclear area. So within 100 miles of

9:21

the US border and that does include the ocean.

9:24

Customs and Border Patrol has jurisdiction

9:26

and they can choose to search your phone.

9:29

And there's a very,

9:34

everyone, I should also mention,

9:36

this includes any port of entry.

9:37

So that's why this would have happened in Atlanta.

9:39

Atlanta is not within that 100 mile zone,

9:41

but because they have the busiest airport

9:44

that is an international airport in the US,

9:47

Customs and Border Patrol would have some jurisdiction there.

9:49

So it's a little unclear why they wanted to search his phone,

9:53

but he wiped his phone.

9:54

And like you said, we don't know if it had a graphing on it,

9:57

although it would suggest that he did

9:59

because graphing does have that feature,

10:01

but it could have been, I think,

10:03

I think Alex has that feature too.

10:06

And the article simply says that he put in a code

10:08

and wiped his device.

10:09

So I don't, you know, it could have been any number of things,

10:12

but it could have been a third party app

10:13

that did it automatically on a stock pixel.

10:15

So we can't say for sure, but for one reason or another,

10:19

he wiped the phone and now they're basically trying to charge him with destruction of evidence.

10:24

And this is where the gray area is, is the Supreme Court has really, actually, let me

10:31

go ahead and open one of the documents that she sent me, because the Supreme Court has

10:37

really declined to weigh in on this.

10:43

So like when does your phone become evidence that you can be charged for destroying?

10:46

Can you protect your data before they physically take it?

10:48

the legal difference between wiping and refusing to unlock.

10:52

The only case that we really have that like clearly says anything about cell phone privacy

10:57

is Riley v. California.

11:00

And what happened is this dude was pulled over because his car registration or his tags,

11:04

his license plate basically was expired.

11:06

And then when they pulled him over, they found out that his license was also suspended.

11:10

So the standard practice is to impound the car and search the car.

11:13

And then when they searched the car, they found hidden guns.

11:16

So obviously that escalated the situation.

11:21

And just to clarify, when they search the car, when they impound it, it's just to make

11:24

an inventory.

11:25

Like here's everything we found in the car, you'll get it back.

11:27

When they found the guns, that changed the math.

11:30

So now they had, they arrested him for, I believe, possessing guns he wasn't supposed

11:35

to have or maybe he didn't disclose them.

11:37

But sorry, there's a lot of context that goes into this.

11:40

When they arrested him, when cops arrest somebody, they are allowed to do a basic search of that

11:45

person and the immediate area for the officer's safety. So the idea is they want to make sure

11:50

that you don't have a knife or you're hiding a gun under the seat, which he kind of was,

11:53

that you're going to try to attack the cop with. And when they arrested him and they did that basic

11:58

search, they also searched his phone. And that's what this case, this Riley v California, he said

12:02

they had no business searching my phone because that did not present a risk to the police. And I

12:07

believe if I remember correctly, he won that. So the Supreme Court did say you need a warrant to

12:12

to search a phone in that case.

12:14

But I mentioned this 100 mile border zone

12:17

where Border Patrol has jurisdiction,

12:19

they don't need a warrant to do a routine search.

12:23

And the Supreme Court has refused to say

12:25

what does a routine search look like?

12:27

We know what it doesn't look like

12:28

if they hook it up to Celebriot or anything like that,

12:30

that counts as a more advanced search

12:31

and they do need a warrant for that.

12:33

If you're a US citizen,

12:34

I'm not sure about if you're not a citizen.

12:37

But a routine like hand me your phone

12:39

as long as they're not plugging it into anything

12:42

might be a routine search that might not.

12:44

So cell phones exist in this really gray area

12:47

where we have conflicting court rulings

12:49

where some courts say you need a warrant,

12:52

some courts say you don't,

12:53

some courts say you need it in this situation,

12:55

some courts say you don't.

12:56

And it's really unfortunate

12:58

because the Supreme Court has continually refused

13:01

to lay down a legal precedent on this.

13:03

According to my friend,

13:04

they declined three cases in 2021,

13:06

the Fifth Circuit declined a case in 2023.

13:09

So her speculation is that the court

13:14

is kind of trying to figure out what the public consensus is.

13:18

And that's really what they want to base their ruling on.

13:21

And right now they can't kind of figure out

13:22

where the public opinion is on this matter.

13:26

And that's why they don't really want

13:27

to put down a ruling yet.

13:31

So not to get too much into like analysis or anything,

13:34

but one thing she said is in her opinion,

13:36

that's why we need to build consensus.

13:38

We don't want to wait for them to make that ruling based on scattered litigation, kind of quoting her a little bit here.

13:44

But the idea is that we want to try and form public opinions so that they rule in the favor of like, no, you need a warrant to search a phone.

13:51

But yeah, that's kind of the all the background of this.

13:55

So it's unfortunately, there's like really no like legally speaking, we can't really say for sure what exactly.

14:04

There's no precedent here. There's no laws.

14:05

Everything is purposely vague and blurry.

14:07

And yeah, it sounds like the article goes on to say,

14:12

"Tunik has been released. It is awaiting further trial.

14:14

He's not allowed to leave the state of the Northern District."

14:17

I think you may have mentioned that.

14:20

So yeah, it is something to be aware of.

14:23

I realize that a lot of people,

14:26

or some people may have really sensitive things on their phones.

14:29

Like I believe one story I read, I can't remember where,

14:32

But it talked about an immigration attorney who had his phone

14:36

searched, like plugged in and searched the advanced search.

14:39

And he had to go through like multiple court cases.

14:42

But at that point, the damage is kind of done, right?

14:44

And that's supposed to be like confidential information, client attorney

14:48

and stuff like that.

14:51

But it's, it's tricky because we now see this and I've, I've thought

14:54

about this for a long time is like, if you delete that phone, if you wipe it,

14:58

could the cops charge you with obstruction of justice or, you know,

15:00

tampering with evidence or anything like that. And again, according to my friend, there's

15:04

really no clear like, Oh, well, he wasn't arrested yet. So it's not technically evidence yet.

15:09

And there's also a lot of, um, what's the word I'm looking for? The, uh, gosh, I just

15:19

lost my trade of thought. I'm so sorry. Um, yeah, there, there's somebody was making the

15:24

point about, uh, there, there's kind of the law weirdly does and doesn't, from what I

15:29

I can tell put a lot of emphasis on motive.

15:32

So like if he had a script that every single day

15:35

reset his phone, it would be really easy for him

15:37

to be like, no, they just happened to grab my phone

15:39

right before that script ran or whatever.

15:41

But, you know, if he manually triggered it,

15:44

it becomes a much trickier gray area, I think.

15:47

So, yeah.

15:49

- Yeah, I think the, the write up from Kevin,

15:52

like he suggests like if you're a US citizen or resident,

15:56

the best bet is not actually to wipe your device.

15:59

If you expect an imminent search or arrest,

16:04

something that you could think about

16:05

is make sure your device is fully patched

16:07

and setting a strong password or passphrase on the device

16:10

instead, which would make it a lot more resistance

16:14

to forensics tools like Celebriot.

16:17

But I think in a lot of cases when we see these cases,

16:21

it's like a case that goes on for like a year, two years,

16:26

more like it's a long time to have a device in isolation. And I think by the time, you know,

16:35

they actually need to extract the information from the device. It might already be, there might be

16:40

exploits available for Celebrate to access the information. So it's a bit of a problem. But I

16:47

think, yeah, it's, I think in this case, I heard there was, there was actual like video evidence

16:54

of him actually wiping the device. That's why this has been a concern, I think. So,

17:00

I mean, I don't know how someone could prove that you've wiped your device. I mean,

17:06

that's another whole thing, right? Like, I guess there might be some metadata on the device that

17:12

could potentially notify law enforcement that it's been wiped recently. But yeah, before first

17:19

unlock state is the most secure state you can have your device in. So if you keep your device in

17:24

that state when it's taken by the police,

17:26

then that will give you the most security.

17:31

- Yeah, just to back that up,

17:32

I think for most people that is probably your best bet.

17:36

I, like I said, there are definitely some people

17:38

who have really sensitive stuff on your phone

17:41

where it's worth it just to go ahead and wipe it

17:43

and roll the legal dice like that.

17:45

But I think for the vast majority of people,

17:48

what would be better is to just restart your phone.

17:50

And well, there's two things specifically.

17:53

Number one, be mindful what data is on your phone, right?

17:58

This is a really, and I know,

18:01

I think we're planning to talk a little bit more

18:03

about digital minimalism in the future,

18:04

in like a future video,

18:06

but there is a strong argument here

18:09

for the idea of digital minimalism

18:11

of try to keep things off your phone

18:13

if you don't actually need them.

18:14

And sometimes you do,

18:15

but sometimes things like email or banking apps

18:19

or setting up the disappearing messages

18:21

apps like signal or session, um, you know, or simple X, simple X also has those two setting

18:27

those disappearing messages.

18:28

So that way, even if the phone is taken and unlocked, there's very little data on it.

18:32

It's kind of a harm reduction, uh, attitude.

18:35

But also, yeah, for those who don't know, when you restart your phone before you unlock

18:38

it for the first time, that is called before unlocked, uh, first state or before unlocked

18:42

state.

18:43

And that is basically the most secure your phone can possibly be.

18:47

Um, because once you unlock it, your phone starts to store things in the temporary memory

18:51

And it becomes a little bit easier to compromise.

18:54

I'll admit, I don't really know the technical details of that.

18:58

But it's, yeah, so for most people, your best bet is probably just going to be to go ahead

19:02

and reboot your phone before handing it over.

19:05

And like Jordan was saying, as long as it's like up to date and it's the most recent operating system,

19:11

that's usually going to be, that's going to provide you a pretty decent amount of protection,

19:16

I think in a lot of cases.

19:17

There's also not to go on too long, but there's a, you know, there's a lot of the apps that

19:23

we recommend in the privacy community have the option to add an additional layer of like

19:27

passcode locking or authentication.

19:29

So like you can lock, I think proton is one of them.

19:31

I believe signals one of them, Molly's one of them, things like that, that you can add

19:35

that additional lock where even if the phone itself is unlocked, the app itself may not

19:38

necessarily be unlocked.

19:40

I don't know how well that works against forensic tools, but it could slow down people

19:43

a little bit.

19:44

And then just one last thing to kind of alert people to is biometric unlock is kind of a

19:51

pro and a con.

19:53

Face unlock is not usually something we recommend because there have been stories about people

19:57

like pointing the phone at you and unlocking it.

19:59

But things like fingerprint unlock, generally speaking, again, phones are a really great

20:05

legal area.

20:06

But generally speaking, you cannot be ordered to turn over the password because that is a

20:10

violation of your Fifth Amendment right against self-incrimination and your First Amendment

20:13

right to free speech here in the US, but you can be ordered to unlock the device.

20:17

So, um, yeah, things like a fingerprint may not be, they may provide a little bit of

20:24

protection.

20:25

They may provide a good balance between longer passwords and convenience.

20:29

But also if you're going to a protest, hopefully you're going to harden your

20:31

phone a little bit more than usual.

20:33

So yeah, I don't know.

20:36

Oh yeah.

20:36

And real quick, Kevin pointed out in the article here that graphing specifically

20:39

does have an option to automatically reboot your phone after a certain amount of

20:42

time of inactivity. So yeah, if you're going to, if you're

20:45

bringing it to a protest, you could lower it dramatically to

20:48

like a couple hours. So yeah, I mean, I think in a lot of cases,

20:54

you should weigh up whether you actually need to take your phone

20:57

to any event, you know, especially if you suspect that

21:01

there will be police presence or law enforcement, I guess.

21:07

Because yeah, you can eliminate the risk entirely. And I think

21:11

It's important to be mindful and use tools

21:14

that don't leave trails behind,

21:18

especially if it's something sensitive, right?

21:20

Like journalism or any sort of thing like whistleblowing.

21:24

Yeah, it's better to utilize tools

21:27

that don't open you up to having that data

21:30

searched in the first place.

21:31

So, but yeah, I think that's a pretty good summary

21:35

of that article there.

21:38

I think we're pretty good to move on to the next one here.

21:43

Alrighty, so our next article, we're gonna head all the way on the other side of the

21:47

world and talk about Australia where this world first teen social media ban that we've

21:53

been discussing for quite some time has now taken effect.

21:57

So I believe that took effect on the 10th, if I remember correctly, so that's two days

22:02

ago at the time of this recording.

22:05

And for those who are just joining us, basically Australia has banned children from under 16

22:11

building social media accounts.

22:13

And yeah, I do.

22:15

So the article we cited here is the register,

22:18

but I mean, you can find this being talked about

22:19

in any number of news outlets.

22:21

And I know it's probably not great for editorial reasons

22:27

for them to be injecting their own bias,

22:28

but I do really appreciate their cynical take on this,

22:31

which actually does kind of reflect,

22:34

experts have been saying this isn't really gonna do anything.

22:36

And the prime minister also said that too.

22:38

He said, "From the beginning,

22:39

we've acknowledged this process won't be 100% perfect,

22:42

but he compares it to underage drinking.

22:44

He says the fact that teenagers occasionally find a way

22:46

to have a drink doesn't diminish the value

22:47

of having a clear national standard.

22:51

So yeah, I mean, that's really,

22:54

that's kind of the nuts and bolts of the story

22:55

as far as the actual facts go is if you're under 16,

22:58

you can't have a social media account in Australia anymore.

23:02

I don't think the accounts are getting deleted.

23:06

I could be wrong about that part.

23:09

Yeah, that's definitely a good question, right?

23:11

I think it's we should probably like set some some work, some pre pre information

23:19

here, but it's basically it's the way that this ban works is that it will

23:25

deactivate those accounts and the information will be deleted.

23:29

So yeah, oh, losing their accounts.

23:32

It depends on the platform, what they decide to do in particular.

23:37

But like we've already seen people that have their account.

23:41

I mean, the main issue that I've seen with this is that the social media platforms that have asked for people's day to birth,

23:50

a lot of kids have already lied and said they're like 42.

23:55

So it's not coming up and saying, oh, you're underage.

23:59

You can't use this.

24:00

I think that the government is kind of a bit confused about what children do.

24:07

They don't usually tell the truth.

24:08

They're not going to get on platforms before they really should.

24:12

That's just how it works.

24:13

It's just how children are.

24:15

So, I mean, it's not really a surprise that this isn't working too well, I guess.

24:21

I think there's at least currently I've seen a couple of articles from the ABC.

24:27

That's like the national Australian national like news agency.

24:32

Australian broadcasting company or something like that.

24:35

Yes, something like that.

24:36

And it's like the state media, I guess.

24:39

And they've been kind of talking to children and talking to parents and things.

24:44

And there's quite a lot of instances, at least currently, of people who have,

24:50

who are under 16, having multiple accounts on the same device.

24:55

Some of their accounts not being banned.

24:56

some of them are being banned because the age is correct.

24:59

So it's funny.

25:00

It's it's not working that great.

25:02

I also saw some comments because the way that this law works is they're not allowed to.

25:09

I believe they're not allowed to actually ask for your ID.

25:12

They have to use like age assurance techniques.

25:14

So they're using like facial recognition, age facial recognition,

25:20

which is like you point the device at your face and it guesses your age,

25:25

which I think is a little bit of a problem, right? Like what if someone who's like 18 looks

25:31

younger than they are and then now the racket's banned? Like that doesn't really make that much

25:37

sense to me. And there's also people who just point their device at other people. So I mean,

25:42

I'm not sure that that is really that great at stopping this. The only way that I see it actually

25:48

being effective is if they introduce some sort of digital ID system or they actually force people

25:53

would upload their ID documents. Like everyone to upload their ID documents. So the thing

26:02

that I've noticed though is that a lot of people who are adults haven't really had any

26:06

effect on their accounts yet. So it seems like they might be going for the most obvious

26:12

accounts first that I've put the correct birth date and stuff and disabling those. I think

26:18

according to the government, they said they shut down like 300,000 children's accounts

26:22

on Snapchat or something. So it's definitely working for people who are under 16, their

26:32

accounts where they have put their real date of birth are being disabled. So yeah, I don't

26:39

think this is, we've talked about this a bit, but I think when you restrict children's ability

26:46

to access information, it becomes a little bit of an issue. It's also kind of confusing

26:51

which platforms they're banning like Pinterest and Discord.

26:56

I don't know about Discord, but I don't know if any of you have been on Discord,

26:59

but there's a lot of stuff that's bad on there.

27:03

And I believe that even the app on the app store is rated R 18.

27:08

So it's very confusing.

27:09

It doesn't make much sense why they're choosing certain apps over others.

27:14

I think in a lot of cases, we'll just see a lot of people flocking to different apps instead,

27:19

possibly even shadier apps using VPNs to bypass this.

27:25

And I think in a lot of cases I've seen there's people who use this for like niche

27:30

interests. They use social media to interact with other people who have niche interests, people who are like, you know,

27:35

marginalized, like queer people. It's it's a bit of a problem that we're restricting the access of

27:43

children to the internet basically. I

27:50

They have a right to access the internet and there needs to be better systems to do this, right?

27:56

Like we already have on device, like parental controls, like Apple has some of the best in the world.

28:03

You can set that up straight on people's devices. It limits what people can access.

28:08

I think when we start like adding blanket bands to platforms, it becomes more of an issue.

28:16

So I'm not really, I think a lot of people are very much not a fan of this in Australia.

28:21

And I don't think people are going to like Albo very much for too much longer because of this.

28:27

I don't think it's going to be very good.

28:31

Yeah, you really hit the nail on the head with everything.

28:35

A lot of the stuff you said actually is here in this article.

28:38

And let's see, reports have noted a sudden sign up in a spike for apps like Lemonade

28:43

and Yope that currently aren't covered by the ban.

28:46

That surge has made prophecy out of warnings that banning social media will just see kids

28:49

shift to services operated by outfits even less interested in child safety than the likes

28:53

of Meta and Google.

28:55

Hard to believe there's companies that care less than Meta about child safety, but apparently

28:58

there are.

29:00

They also said that, you know, the law, I believe, said they have to take like quote

29:05

unquote "reasonable measures" or something.

29:07

So basically companies have to figure out on their own how they're going to verify users,

29:12

which means a lot of them are outsourcing it, which ironically the article points out here.

29:16

Some have decided to work with age verification companies based outside of Australia.

29:20

So now you're putting people at risk with companies that aren't even in the country.

29:25

The article says these companies may request a photo of government ID or a selfie.

29:29

So I guess some of them may actually require ID.

29:33

But yeah, some of them may take selfie.

29:34

Like you said, I, when I was in college, I sat behind a girl in my English class that I swear looked like 20, you know, 19, 20, 21.

29:43

And she was like 42 or something like that.

29:46

And she just looked amazing.

29:47

And I'm sure when she was 20, she probably looked like she was 16.

29:49

So, you know, this, this like determining a person's age isn't always accurate, just based off the photo alone.

29:56

And then, um, yeah, again, like you said, you know, some people are using makeup masks lying about their ages.

30:03

I know in the UK with their online safety act,

30:05

we saw people uploading video game characters,

30:08

all kinds of crazy stuff, VPNs like you said.

30:11

So yeah, this is very symbolic I think.

30:17

And actually on that note,

30:19

we did include in the show notes,

30:22

EFF has launched an age verification hub

30:24

as a resource against misguided laws.

30:27

So I think this is a good time to mention that real quick.

30:30

So the EFF has launched this website.

30:33

First of all, they're launching, they're doing an AMA,

30:36

I believe next week.

30:38

Yeah, December 17th at 5 p.m. Pacific time.

30:41

And then, excuse me, 12 p.m. through 5 p.m. Pacific time

30:46

with EFF attorneys, technologists and activists

30:48

answering questions about age verification.

30:51

Oh, this is going on for three days,

30:52

December 15th through 17th.

30:54

So you'll have plenty of time to ask your questions.

30:56

And if you're not a Reddit user

30:58

or you're unable to make that for any reason

31:00

They're going to have a live stream panel discussion.

31:02

That's January 15th from 12 p.m. Pacific time again.

31:07

So there's that worth checking out, but they also have an actual hub, EFF.org/age.

31:14

And this is a really, really simple hub, but it has, over on the left side, it says, like,

31:20

why join the fight?

31:22

And it tells you, you know, what's at stake, who is harmed, is this legal, does the tech

31:26

even work, which we were just spent all this time talking about usually not, and it usually

31:31

backfires a lot. And yeah, I actually like where did it go? In the original article where

31:40

they talked about this new hub, EFF, they had a really good line. It says, "We all want

31:44

young people to be safe online. However, age verification is not the panacea that regulators

31:48

and corporations claim it to be. In fact, it could undermine the safety of many." So,

31:53

Yeah, it's like you were saying, we've talked about this quite a lot.

31:57

And it's one of those things where politicians want easy, simple solutions.

32:02

Everybody does.

32:02

It's human nature.

32:03

We want a good guy and a bad guy, and we want very clean, simple, you know, all good

32:10

or all bad.

32:10

Like it makes for a good movie, but in reality, people don't really like it when there's kind

32:15

of a gray area and like, oh, this person has redeeming qualities.

32:18

So like politicians just want an easy answer that they can point to and say, I did a thing.

32:23

I made everybody safe vote for me again next year.

32:26

And unfortunately, this is not one of those issues that you can just easily do that with

32:30

even I've mentioned in the past, like going after the companies and making social media

32:35

like Facebook themselves has pumped out so many studies that say this is bad for mental

32:41

health just across the board, not even for kids, you know, doom scrolling.

32:45

I know I'm kind of rambling or getting on a soapbox a little bit here, but like doom

32:47

Doomscrolling has become a common term in today's day and age.

32:51

And honestly, that kind of blows my mind that we all just accepted that as a term.

32:56

But like, that's how bad it is.

32:57

And I don't, that's not limited to kids.

33:00

We all know what doomscrolling, even if you don't know it by name, if somebody describes

33:03

it to you, it's like, oh yeah, I do that sometimes.

33:05

So like, it's crazy that we will do anything other than crack down on these companies and

33:10

be like, stop intentionally putting people at risk.

33:13

And it just, there's so many arguments against this.

33:17

It makes kids unprepared for how to deal with this stuff once they turn old enough to start

33:21

using social media.

33:22

And it, I've mentioned before, I think it takes away from the parents right to like

33:26

decide when is right for their kids to join.

33:28

And it's just, yeah, I don't know.

33:30

It's at best, it's well-meaning.

33:33

And I say at best because I know there's a lot of people who don't believe that.

33:35

And I don't blame you for not believing that.

33:37

I'm not saying I believe it entirely, but at best this is well-meaning, but it's an oversimplified.

33:43

What's somebody, I forget who it was.

33:44

It may have been EFF.

33:45

said, it's like, it's like taking a machete where you need a scalpel.

33:50

And I don't know if they were talking about age verification, but yeah,

33:52

it's, it's just one of those things where it's an oversimplified solution,

33:56

quote unquote solution that, uh, yeah, we're going to have to keep an eye on

34:01

cause that's, uh, yeah.

34:04

Yeah.

34:05

I think there's, we've made a lot of, we've made a lot of like points towards

34:09

why this is, uh, why this is bad.

34:12

Um, someone said here, if you're under 16, you're probably unable

34:15

watch this live stream. Um, actually you can still watch the live stream.

34:19

You just wouldn't gotta be logged in. So that doesn't,

34:22

that's another thing that doesn't make sense about this, um,

34:25

this whole thing is it doesn't affect being logged out.

34:28

So what people are just going to start watching YouTube videos logged out,

34:32

people are going to start watching things through frontends and all these other

34:37

things. Um, I think stopping people from posting as well is kind of,

34:42

you know, it's, it's,

34:43

I heard someone say it's like you're silencing an entire generation, basically, because they

34:48

can't comment on anything, they can't express their thoughts, they can't do anything.

34:53

So I think that is another important thing about this is that there's people that are

35:01

being silenced and their voices need to be heard as well on this.

35:05

And it's not like these are people that can vote, these are not people like that are of

35:11

that age, but their thoughts on this still matter. So I think we should definitely listen

35:18

to what they're saying. And in a lot of cases, it sounds like a lot of people, especially

35:22

under 16, are against this because that's where they find their communities. That's

35:28

where they access information. So I don't know, I think this is... I don't want to use

35:37

a commonly used anecdote here, but it does feel a bit like a slippery slope.

35:45

This could be like the push towards an actual ban or towards requiring ID for any platform, for instance.

35:55

I think it's very problematic, but like Nate said, there's some really good information here on the EFF's website.

36:03

There was also, I'm also following a fight for the future one as well.

36:08

They're also going against this sort of stuff.

36:10

So it's worth checking out that.

36:12

It was, it was a page specifically for Australians as well.

36:18

And yeah, I think we should definitely try and because I think a lot of people

36:24

that I've talked to about this, they're actually in favor of it because

36:28

they don't really understand the repercussions behind something like this.

36:34

because it's actually kind of like, it's shrouded in language that a lot of people would want to agree with, right?

36:42

Like this is to protect children, this is, you know, this sort of stuff.

36:46

And I think that's important to kind of come out against if you can, to people in your life, to people that you talk to.

36:54

Because, yeah, there's definitely some people who are...

37:00

There was an interesting post I saw, and it was from one of the MPs in Australia,

37:07

and he spelt Snapchat with a capital C.

37:12

Like, there's no capital C in Snapchat.

37:14

Like, these people don't even know what social media platforms are,

37:17

and they're enacting laws to ban them.

37:19

Like, I think there should be more experts consulted on this than just politicians who

37:28

don't really understand technology. They don't understand the repercussions of this sort of ban.

37:34

But I'm sure they did the calculations and they found that this would be

37:41

received positively by the public. So that's why they went ahead with it.

37:47

But yeah, it's kind of all I have to say about that one.

37:52

Yeah, the slippery slope argument is like, it's really tough because sometimes it makes you

37:56

sound kind of paranoid. And sometimes it doesn't come to true, you know, or it come to pass.

38:01

It's one of those things where everybody always makes that argument on both sides of an issue.

38:08

But yeah, I don't know. I mean, I'm with you. I think it is a slippery slope. I think it puts

38:16

It puts power in the hands of the government to be able to say, "This requires age-gaining,

38:22

but this doesn't." Then they can update that list at any time to say, "Well, now this thing

38:28

is not safe for children." You made a really good point that, again, this is a little bit

38:32

conspiratorial, but any government will usually dress things up in a way that makes it very hard

38:40

to argue with them. "Oh, this is for the safety of the children." Then when you come in and you're

38:46

I don't think this is a good idea.

38:47

The default response is, oh, so you hate kids.

38:50

And it's designed to like paint you in that negative light

38:53

when you go against the grain like that.

38:55

And it's kind of nefarious, but it's really intentional.

38:57

They know what they're doing.

38:58

But yeah, the last thing I want to touch on is

39:01

what you said there about they don't even really know

39:04

how a lot of this stuff works.

39:06

And I don't know about Australian politics,

39:08

but in the US, that's certainly how it works.

39:10

It's a lot of the time somebody will come in

39:12

and say, we think there's a problem.

39:13

Here's our solution.

39:14

Here's this bill.

39:16

And a lot of the time the politician,

39:17

'cause you know, to their defense, they're human.

39:19

Nobody knows everything, right?

39:21

They probably have no idea about how any of this stuff works,

39:24

but they're like, well, you made a good argument,

39:25

you've got this bill, it's ready to go, sure,

39:27

I'll get behind it.

39:30

And that's one of the reasons that it's really important

39:32

to try to get involved politically if you can,

39:34

is because a lot of the time, they may not know any better.

39:38

Like I really, I don't wanna let them off the hook too much

39:40

'cause I'm very critical of politicians myself,

39:44

But I think a lot of the time we envision them

39:46

as like these evil like, you know,

39:48

like in a movie or a political thriller,

39:50

they're like purposely just trying to make people suffer.

39:53

And it's like, no, a lot of them really think

39:54

they're making the world a better place.

39:55

And they have no idea that this is a genuine risk

39:58

that has all these knock on effects

39:59

or there's so many easy ways to get around it.

40:02

And a lot of the time that's simply

40:04

because there's nobody on the other side of the aisle

40:07

as citizens, there's nobody on the other side of this issue

40:10

coming up and saying, hey, this is a really bad idea.

40:12

And here's why I think you shouldn't do this.

40:14

Not to shill it all the time, but I did mention

40:16

on a previous live stream, Louis Rossman did a talk

40:20

for EFF Austin, which full disclosure,

40:22

I'm a board member of EFF Austin.

40:23

But if you look up Louis Rossman, EFF Austin,

40:25

he tells the story of that's exactly how he got involved

40:28

in right to repair because he found out there was a law

40:32

that was going through and he went and talked

40:34

to the politician who was sponsoring the law.

40:36

I might be getting the finer details wrong,

40:38

but basically he went and talked to him

40:40

and they told him it's like, well, you know,

40:41

that other person came in and here's what they told me.

40:44

And Rossman was like, what are you talking about?

40:46

That's not how any of this works.

40:48

And the guy was like, the politician he was talking to,

40:50

he's like, son, I'm like 80 years old.

40:51

I don't know how any of this stuff works.

40:53

They came in, they made what sounded to me

40:55

like a coherent argument.

40:56

I believed them, you're the first person

40:58

telling me otherwise.

40:59

And that was the moment,

41:00

and ultimately Lewis was able to win him over to his side.

41:03

But that was the moment that Lewis realized.

41:05

He's like, a lot of the time there is simply nobody there

41:08

telling them why the other side is mistaken

41:10

in this thing and is, you know, yeah, it's just the point is like, if you can, and, oh,

41:19

here's what I was going to say is like, you know, don't go in there and like attack them

41:22

and like, you're evil, but if you can get in there and show them like, hey, this is

41:25

why that person's wrong and this isn't a good idea.

41:28

I'm not saying it's 100%, but a lot of the time it can make a difference, especially

41:31

at the local, the local levels.

41:33

So yeah.

41:38

All right.

41:39

On that note.

41:39

What?

41:42

Yeah, do you have anything to add or?

41:44

- No, yeah, we can move on to the next story here.

41:49

And this one is from Brave.

41:52

So Brave, this is published on December 10th,

41:56

AI browsing in Brave Nightly,

41:58

now available for early testing.

42:02

So this is kind of a surprise, I guess,

42:04

because Brave browser has kind of built itself

42:08

like a privacy and security focused browser. And generally, when we think of AI browsing,

42:15

we think of insecurity and privacy invasion. So this is certainly an interesting direction for

42:22

Brave to be putting their resources into. But we've kind of seen a similar thing happening

42:27

with Firefox. So I'm just going to quote from the article here. "Today we're announcing the

42:31

the availability of Brave's new AI browsing feature in Brave Nile.

42:36

Our testing and development build channel for early experimentation and user feedback.

42:42

When ready for general release, this "agentic experience" aims to turn the browser into

42:47

a truly smart partner, automating tasks and helping people accomplish more.

42:52

I feel like that's the buzzword of this year's "agentic".

42:56

It's "agentic".

42:57

We added an "agentic agent".

43:00

However, agentic browsing is also inherently dangerous, giving an AI control of your browsing

43:06

experience could expose personal data or allow agentic AI to take unintended actions.

43:13

Security measures are tricky to get right and disastrous when they fail.

43:18

As we have shown through numerous vulnerabilities we found and responsibly disclosed over the

43:24

last few months.

43:27

to prompt injections, basically where a website can have a prompt on the page that can basically

43:35

redirect the agentic AI to perform like a malicious attack against someone, are a systemic

43:44

challenge facing the entire category of AI-powered browsers. For this reason, we've chosen

43:49

a careful approach to releasing AI browsing in the Brave browser, soliciting input from

43:55

security researchers. We are offering AI browsing behind an opt-in feature flag in the nightly

44:01

version of Brave, which is the browser build we use for testing and development.

44:08

We'll continue to build upon our defenses over time with feedback from the public.

44:12

At present, these safeguards include AI browsing is currently only in the nightly channel behind

44:18

an opt-in feature flag via Brave's integrated AI assistant, Leo. AI browsing happens only in an

44:25

using a dedicated browsing profile,

44:27

keeping your regular browsing data safe,

44:30

and AI browsing has restrictions and controls

44:33

built into the browser,

44:35

AI browsing uses reasoning-based defenses

44:38

as an additional guardrail against malicious websites.

44:41

The AI browsing experience has to be manually invoked,

44:45

ensuring that users retain complete control

44:47

over their browsing experience

44:48

and likable AI features in Brave,

44:51

AI browsing is completely optional,

44:53

by default.

44:57

So what are your thoughts, Nate?

44:59

What are your initial thoughts here?

45:04

- Yeah.

45:07

Well, my initial thought kind of goes back

45:08

to what you said at the very beginning

45:09

where you said you're surprised

45:13

because AI is not typically something we think of

45:15

in the context of privacy and security.

45:18

And I agree with you for the record,

45:20

but I have noticed that Brave is trying

45:22

to really, they're trying to have their cake and eat it too,

45:27

where they wanna be a privacy browser.

45:30

And there's a lot of websites out there that show

45:33

that they do have actually really good privacy out of the box.

45:37

And I do believe that they're really innovative,

45:39

but I also question exactly how private AI can really be,

45:44

even with all of the stuff they've put into it.

45:47

I don't know, it's interesting for sure.

45:49

I agree with you, like a gentic AI

45:51

is definitely the buzzword of the year for sure.

45:53

And, you know, I really appreciate

45:56

that a common criticism with Brave is it's got all the,

45:59

the, you know, people call it blow.

46:01

It's got the AI stuff, it's got the crypto stuff.

46:02

It's got all this stuff built in

46:04

that people don't necessarily want.

46:06

And I understand, I guess it's one of those things

46:10

I think both people have really good points

46:12

'cause some people will say like, well, just don't use it.

46:14

You know, you don't have to use any of it,

46:16

which is totally true.

46:17

Ironically, Brave, actually, I found this out recently,

46:21

Brave Shields has one to disable Brave's own AI summary

46:24

and Brave Search.

46:25

So you can use Brave with Brave Search

46:28

and still disable the AI summary using their own shields,

46:32

which is pretty cool.

46:33

You can opt out of all this stuff,

46:34

but at the same time, I understand the argument of like,

46:36

oh, you know, but the code is still there

46:38

and it's kind of adding potential attack surface

46:40

and I don't know.

46:43

In regards to this specifically, I'm still,

46:47

I'm not entirely convinced what use agentic browsing has,

46:50

especially this early in.

46:53

It's one of those things where I personally wouldn't

46:57

trust it to do, I don't know, even simple things.

47:02

Like, you know, okay, I need a new coffee maker, you know,

47:05

go compare the top three brands or whatever.

47:08

And it's like, but how do I know those are still the best?

47:10

How do I know that's not paid advertising?

47:12

How do I know you didn't summarize

47:13

something that wasn't there?

47:14

I've, I don't know if I've shared this story here before,

47:17

a long time ago, I was testing out LLMs to see like, Oh, let me see if these can improve

47:22

my workflow in any way. And I was doing a review of Threema, I think. So I, um, I told

47:28

an LLM, I can't even remember which one. I'm like, Hey, give me the pros and cons of the

47:31

messenger Threema. And one of the pros it said was it has a password manager built in.

47:37

And I was like, this is news to me. So I asked it. I'm like, Hey, what's your source for that?

47:41

And of course it couldn't tell me because this was one of those LLMs that doesn't cite

47:44

sources, but it went, oh, you're right. There is no password manager. And it's like, what? Like,

47:48

you know, there's so many stories out there about like developers don't like using AI because they

47:52

have to go back and fact check it. It's just twice as much work. And so I don't know, but it could

47:56

also be one of those things where it gets better over time. And when Proton rolled out their LLM,

48:00

they made the argument that like AI is not going anywhere. And we may as well give people private

48:06

options, which I feel like maybe is what Brave is going for here is like, everybody's trying to jump

48:10

on the agentic AI bandwagon.

48:12

So maybe it's best to give them an option

48:13

that's slightly less crappy than all the other ones.

48:16

But yeah, I don't know.

48:21

Yeah, I will say this.

48:22

I feel like Brave has been, again,

48:26

has done a lot of innovative things in the privacy space

48:29

and they have been one of the biggest,

48:32

I don't know if critic is the right word,

48:33

but like they said, we have shown through

48:35

numerous vulnerabilities that we found

48:36

and responsibly disclosed over the last few months

48:38

talking about the agentic browsing and other,

48:40

the security and other agentic browsers.

48:43

So I feel like I wouldn't expect them to be perfect.

48:46

I would definitely expect there to be mistakes in this,

48:49

for sure, some of which are just inherently

48:51

baked into AI itself.

48:53

But I gotta be honest,

48:54

if I was gonna trust anybody with it, I would trust Brave.

48:56

I'm not saying you should use it,

48:57

I'm just saying if I had to,

49:00

I would probably trust them first.

49:03

- Yeah, they seem like at least so far,

49:05

like I am very much in the camp of not really wanting any of these features in my browser.

49:11

But this definitely seems like the best possible people to be doing this.

49:18

And it seems like they've thought about the security aspects.

49:21

But I guess before we talk about that a bit more,

49:24

sod this all, just sent a gift membership in the chat.

49:28

So thank you very much.

49:29

And Supreme is 21, just receive that.

49:33

So definitely say thank you to that person.

49:35

And Jonah also sent five Privacy Guides gift to memberships in the chat.

49:40

So if you didn't get one of those, make sure to say thank you.

49:42

And we also got another donation from Sod this all for two pounds.

49:48

Enjoying the stream.

49:49

Keep up the good work.

49:51

Thank you so much.

49:52

That's so appreciated.

49:54

Oh, we're glad you're enjoying it so far.

49:59

Yeah.

50:00

Um, getting back to the article real quick, it does say, you know, they are open source.

50:04

They say we welcome bug reports and feature requests on GitHub.

50:07

Uh, we encourage anyone who discovers a security issue to report that issue to

50:11

our bug bounty program and they say they're actually doubling their usual reward

50:14

for now on hacker one.

50:16

So, um, yeah, it seems like they've, uh, they've at least put some thought into

50:22

this.

50:22

They talk about, cause I read this article earlier, they, they said, um, where was it?

50:27

It uses reasoning based defenses as an additional guardrail.

50:30

Um, maybe they're, they're talking about something else, but further down in the

50:33

article, they talk about how there's basically like two LLMs working here.

50:37

And so when you tell Leo to go, you know, compare coffee pots or whatever, Leo

50:45

goes and gets the answer and then actually double checks with a second LLM and

50:49

is like, Hey, here's the answer I came up with.

50:52

And then that LLM compares the prompt and the answer to kind of determine if

50:56

it's in the ballpark or if something's really wrong.

50:59

Um, this is very high level.

51:01

So they didn't go into like super, super detail, but, um,

51:05

and I really appreciate personally in this article, they keep saying over and

51:09

over, like right here, despite its risks, it shows great promise.

51:14

Um, whereas, oh no, here we go.

51:16

While these mitigations help you significantly, users should know that

51:18

safeguards do not eliminate risks such as prompt injection.

51:22

Um, further down, they say, uh, where did it go?

51:28

Basically they say throughout this article, like, look, nothing is

51:31

perfect, this has risks.

51:33

And they're really trying to hammer it home.

51:34

Like you need to understand this has risks

51:37

if you're gonna use it.

51:38

And I don't think they ever say they're gonna get rid of it.

51:41

I think it's just something they're acknowledging,

51:44

but yeah, it's important to be aware of that,

51:49

which unlike a lot of other browsers,

51:51

a lot of other companies are just trying to get the next dollar.

51:54

They're trying to just roll this out as quick as possible,

51:56

be first one to market.

51:57

And it seems like they're trying to make sure

51:58

that people know the risks,

51:59

which I personally really appreciate.

52:03

- Yeah, I think it's, I'm not in shock.

52:06

Like I genuinely am wondering what these sort of features,

52:10

like we've seen Firefox is also doing a similar thing,

52:12

like they're doing like a AI window.

52:16

I think it's very similar to what Brave is doing.

52:19

And it's, I guess my question would be,

52:22

what, who uses this and why would I want to use this?

52:26

So I prefer personally, like I prefer to like,

52:29

use the browser myself and would this really be saving me a lot of time? I'm just kind of questioning the

52:36

the need for this, I guess. I definitely am not quite sure what

52:43

is this just because Brave wants to be on the same level as other browsers and also get that market share from people who like AI?

52:51

I'm not really sure.

52:53

It certainly isn't particularly appealing to me, but I'm not really, maybe I'm not the target audience for this sort of feature,

53:00

but I feel like a lot of the extra features in Brave, I literally turn off every single feature that they've added.

53:06

Like, I just don't use anything that they've added. I just want the browser to have a search bar and, yeah, I don't need like a weather widget.

53:15

I don't need like a news tab. I don't need like a Brave VPN. I don't need like a Leo AI thing.

53:21

So I don't know, I think if I was going to use Brave again, I would probably disable all of this.

53:29

But yeah, it's not, I don't entirely understand it myself.

53:37

Yeah, I'm with you. I disable all the crypto stuff. I disable the Brave news.

53:42

I mentioned I disabled the AI summaries and Brave search even.

53:46

I do, I will admit I do use Leo sometimes,

53:49

but it's mostly for looking up like server errors.

53:54

I do a lot of self-hosting and I find it's,

53:58

I've had good luck with that, good success with that.

54:00

But I'm with you in the sense that I like it to be separated.

54:03

Like that's why I shut off the AI summaries in search

54:06

because I found myself relying on them more than I wanted to.

54:09

And I realized I was doing it.

54:11

And I'm like, no, I don't want to do that.

54:12

I want to go to the actual source.

54:13

I want to go to the website and get the full context

54:16

us this source or not.

54:18

And, uh, you know, it's one of those things where like, if I want the AI,

54:21

then I want to specifically open a Leo tab and go to that.

54:24

I don't want it just integrated in that way.

54:27

So the only thing I could personally think as a real use case for sure is maybe,

54:32

um, and I could be wrong here.

54:34

I could be projecting, but like, um, maybe if somebody's like differently

54:37

abled and it's like, no, it's awesome that I can now send the browser out to,

54:40

like I said, compare like six different coffee pots.

54:43

And that's easier for me because now I don't have to navigate

54:46

to every single website and that's a physical challenge.

54:49

But I don't know, maybe there are other use cases,

54:52

but yeah, I don't see myself really using it anytime soon.

54:56

I might play around with it whenever it comes into stable,

54:58

but I just, like I said, I don't trust it

55:02

to not make mistakes and just make double the work

55:04

where I have to go back and double check it.

55:05

So I don't know how much I'm gonna end up using it.

55:09

- Yeah, no, that's actually a really good point

55:11

about the differently abled people.

55:13

I guess I don't think about that

55:15

because I'm in a position where that doesn't affect me,

55:18

but that is a really good point actually.

55:22

I do think that is kind of an important thing

55:24

that should be available, but also, yeah, I'm not sure if...

55:29

Yeah, that might be tricky because I'm not sure what sort of...

55:35

It is gonna be sending basically everything to an AI server,

55:38

So that could also be a privacy risk for those people too.

55:42

I think I would hope that this sort of feature

55:45

could work on device, but it's pretty unlikely, right?

55:48

It's a lot of processing power to like analyze screens

55:52

and like do that sort of stuff.

55:55

So it seems like that probably might not be the case.

55:59

It's probably being offloaded to some open AI

56:02

or Gemini or some other third party AI companies.

56:07

So it's kind of unfortunate that there isn't really anyone who's taking a privacy first approach, even Proton.

56:15

They've got their AI thing, but it's not a local model or anything.

56:19

It's running on their servers.

56:21

And it really is just a pinky promise at that point.

56:27

So yeah, I don't know.

56:28

I'm not really the biggest fan of these features, but I think that the Brave team has done at least the due

56:36

diligence to try and make it, uh, not complete crap, like a lot of these AI

56:40

companies, um, like, uh, open AI or I don't really know the, all the other ones.

56:46

Um, but yeah, it's, it's interesting.

56:48

Uh, comment whoever that one's from.

56:50

Yeah.

56:51

Yeah.

56:51

That one.

56:52

Yeah.

56:52

Uh, Claude.

56:53

Yeah.

56:54

Can't remember all of them off the top of my head, but yeah.

56:56

Um, yeah.

56:58

It's definitely interesting to see.

57:02

Agreed.

57:05

All right, on that note, we're going to go back to some political stuff.

57:09

And we're going to talk about the U.S. planning to scrutinize foreign

57:12

tourist social media history.

57:14

And actually I've seen this article kind of make the rounds.

57:18

And I think they kind of buried the lead on this one because that is alarming,

57:23

but it's not just that.

57:24

So yeah, well, so I'm going to kind of start towards the beginning of this

57:29

article here, this comes from the Guardian.

57:30

And they say the mandatory new disclosures would apply to the 42 countries whose

57:34

currently permitted to enter the US without a visa,

57:37

such as longtime US allies,

57:39

Britain, France, Australia, Germany and Japan.

57:42

So in a notice published on Tuesday,

57:44

Customs and Border Patrol said that it would require

57:46

any telephone numbers used by visitors over the same period.

57:49

Any email addresses used in the last decade,

57:53

they would really love me if I had to do this,

57:55

as well as face fingerprint DNA and iris biometrics.

57:59

So that's why I said,

58:00

I feel like a lot of people are bearing the lead

58:01

because I haven't heard anybody mention that

58:04

like any newspapers yet or any news outlets. I think privacy international mentioned that in

58:11

their headline. But yeah, I know DNA fingerprint biometrics, which I mean, you can make the argument

58:18

that fingerprint and stuff like that, you or face at least you kind of give when you go through

58:22

the airport anyways, because a lot of them use facial recognition nowadays, which you can opt

58:26

out of as an American citizen. TSA will get a little bit short with you sometimes if you do it,

58:31

and they'll make you stand in the long line.

58:33

But other than that, I haven't had any issues.

58:36

But DNA, I think, is a new one.

58:38

I can't remember that one.

58:40

The article goes on to say it would also

58:41

ask for the names, addresses, birth dates, and birth places

58:44

of family members, including children.

58:46

So now your loved ones are being put at risk just

58:50

because you wanted to come visit.

58:51

So yeah, this goes back to an executive order

58:54

that Trump signed on the first day of office.

58:57

Here on the second term, it says that it calls for restrictions

59:00

to ensure visitors to the US quote,

59:01

do not bear hostile attitudes towards its citizens,

59:04

culture, government institutions, or founding principles.

59:08

And then the article just kind of goes on to talk about

59:10

how the US is supposed to host the World Cup this year.

59:13

We're co-hosting it with Canada and Mexico.

59:15

So there are concerns that this might cause

59:18

like a dip in visitors for the US part of that.

59:24

FIFA said they were expecting more than 5 million fans

59:27

across the whole thing.

59:29

And they say that the US has already seen a dramatic drop

59:33

in tourism on Trump's second term.

59:37

So, yeah, there's just kind of a lot of statistics here.

59:41

California tourism authorities are predicting

59:43

a 9% decline in foreign visits to the state.

59:46

Hollywood Boulevard reports a 50% fall in foot traffic.

59:51

Las Vegas has seen a rapid decline in visits.

59:53

Canadian residents who made a return trip to the US by car

59:55

dropped almost 37% in July.

59:59

Air travel from Canada 25%.

1:00:01

So anyways, yeah, a point being like it goes on to say like,

1:00:04

tourism is already kind of low

1:00:05

and this is probably not gonna do any better.

1:00:08

And Jordan and I were talking about this a little bit

1:00:12

before recording actually.

1:00:13

And it's such a tricky area

1:00:15

when you talk about this kind of political stuff

1:00:16

because on the one hand, every country of course,

1:00:19

wants to make sure they're keeping the bad guys out, right?

1:00:21

And they wanna make sure they're keeping out the people

1:00:23

who have ill intentions when they come to the country.

1:00:26

And especially in the wake of 9/11,

1:00:29

if you were old enough to remember that,

1:00:30

I remember all this information coming out that like,

1:00:33

yeah, these guys, anybody who spent five minutes

1:00:37

would have realized that these guys had bad intentions

1:00:39

and there were all kinds of red flags

1:00:41

that didn't go off or nobody followed up on.

1:00:44

But it's just, I don't know, man,

1:00:46

there's, I feel like this is especially the DNA stuff.

1:00:48

Like the social media is already kind of,

1:00:50

but now you're talking about, you know,

1:00:52

family members, including children,

1:00:54

all their information, DNA biometrics,

1:00:57

I really have to wonder at what point

1:01:01

is this going just a little bit too far?

1:01:03

And I don't wanna get too far off topic,

1:01:07

but I've heard Snowden talked about

1:01:10

one of the big problems they had at the NSA

1:01:12

was an overwhelming amount of data

1:01:14

and the inability to parse it.

1:01:16

And I have to wonder if that's kind of a point

1:01:18

they're gonna hit too,

1:01:19

where they're taking in so much data, it's like, cool.

1:01:21

Now that we've got 500 data points

1:01:23

on every single person coming into the US,

1:01:25

we can't get through all the noise

1:01:26

to find the actual problem people

1:01:28

because it's just too much data.

1:01:33

- I don't know if you have any thoughts on this story.

1:01:36

- I mean, I think I would say,

1:01:40

people I've talked to,

1:01:41

just like the general sentiment that I've had was,

1:01:45

people are a little bit cautious

1:01:47

about entering the United States

1:01:48

for a large amount of reasons.

1:01:52

And I think this sort of thing is not really helping things.

1:01:57

I think people, once they learn,

1:02:01

you've got to unprivate your social media accounts,

1:02:04

let them search through everything, give DNA swabs,

1:02:07

fingerprint scans, biometric scans.

1:02:11

This is sort of the stuff that,

1:02:14

I think a lot of countries where there's more

1:02:19

repression of citizens.

1:02:22

That's a lot of people would say is, you know, that's you're infringing on people's freedoms.

1:02:30

And I think America, people assume that it's a free country, but there's this sort of stuff where if you want to enter the country, you have to give DNA swabs face scans.

1:02:39

Social media history, like this is not really looking like a country that cares about the freedom of the people that want to visit.

1:02:49

And I don't know, it's it's definitely kind of concerning.

1:02:54

But the way I understand it, right, is you don't have to be like an American citizen to be protected

1:03:00

by, you know, the Constitution and all of those things, right? The laws apply to tourists as well,

1:03:06

correct? I would have to check with my aforementioned lawyer friend, but I actually don't. I think

1:03:14

certain things do and certain things don't. Okay, yeah, I think that's certainly something that I

1:03:20

was thinking of. Yeah, it's a bit of a, this is for the visa process, is that correct? You need to

1:03:29

provide this information for the visa process. It's not even to like enter the country. I don't think

1:03:34

so because no, because it says the new disclosures would apply to the 42 countries whose nationals

1:03:39

are currently permitted without a visa. I see. So this is basically applying to everyone then,

1:03:47

in that case, right? I think so, because yeah, I think the visa, the process to get a visa,

1:03:52

I think is even more strict. Wow. And is this just for, this is just for tourism. Like, you have to

1:03:58

give all this information for tourism. Yeah, just to come visit, which I mean, obviously, I am an

1:04:06

American citizen, so I haven't been on that side of things, but like I went to Europe a couple months ago and

1:04:11

you know, it was just for a couple or not even a couple weeks. It was like 10 days. I didn't have to get a visa or anything. So like

1:04:18

I kind of have been a little bit on that other side of this fence where I get to

1:04:23

you know, not in the turning over social media and stuff, but yeah, like it's you know,

1:04:29

I don't know. It's just it's weird to think about if I were had I gone to Europe and when I get to Amsterdam, you know

1:04:34

to go through the port of entry and they're like, okay, unlock your Facebook.

1:04:39

And especially someone like me, how do I explain that?

1:04:41

Like, okay, what's your Facebook?

1:04:42

I don't have one.

1:04:43

What about Instagram?

1:04:44

I don't have one of them.

1:04:46

What's your, you know, like people already give me weird looks when I'm like,

1:04:48

I don't have any of this stuff.

1:04:49

But and then, you know, I don't know if I get caught not telling them about my

1:04:54

mastodon account.

1:04:56

That's probably not a good look.

1:04:58

So yeah.

1:04:59

Yeah.

1:04:59

I think if you've made any comments, criticizing the US government or

1:05:04

the US in general, that would probably, I don't know, that seems like that could be a reason to deny you entry into the country.

1:05:12

And I don't know, just to put a different perspective on this as well, I don't know if anyone's ever, people always talk about, you know, China is quite strict on this sort of stuff.

1:05:22

But even China doesn't require this sort of information to enter the country, right?

1:05:27

So this is kind of, I feel like this is almost like global precedent at this point.

1:05:33

Like I actually have never heard of a country requiring this much information just to go

1:05:37

there as a tourist, not even to live there, just as a tourist.

1:05:42

So yeah, this is definitely kind of concerning.

1:05:47

And it's just, for me at least, it's just another reason why I would never go there.

1:05:52

No offense to any Americans.

1:05:54

I think you have an amazing country, but um, that's it's just yeah, it's a tricky. It's a tricky situation

1:06:02

To the uh

1:06:04

To the defense of the china critics, I think china still does it they just don't tell you you just show up there

1:06:08

And then find out you have to turn around and go home because they saw your twitter

1:06:13

Um, I could be wrong, but yeah

1:06:16

I don't know. I don't we we were talking about this a little bit before earlier

1:06:19

I don't want to be too political, but I do you know, I wasn't the military and i'm

1:06:24

I've never been an overly patriotic person, but I'm really worried about the direction

1:06:28

that we're headed in America.

1:06:29

This kind of stuff is a big part of that.

1:06:33

Like you said, this is ridiculous for a democratic, open country.

1:06:39

We've already seen people turned away at the border.

1:06:42

There was a guy, a French person, a scientist of some kind, who flew to the US for a conference

1:06:48

earlier this year flew into Houston and they said that he had a like anti-American extremist

1:06:54

content and to be fair, they never disclosed what it was.

1:06:57

So I mean, maybe it was like really bad stuff, although I find it a little bit hard to believe.

1:07:02

Maybe it was just, I don't like that guy, you know, there's, there's unfortunately, there's

1:07:05

a lot of stories out there and we're living in a very difficult time where the veracity

1:07:12

of a lot of claims is up for debate.

1:07:14

So it's really hard to parse through this stuff, but I think this is setting very dangerous

1:07:19

precedents when you make these overly broad, like, you know, hostile attitudes towards

1:07:25

its citizen culture, government or institutions.

1:07:27

Like, I don't know, that's kind of one of the freedoms I enjoy as an American is to criticize

1:07:31

my government, which I've been doing regardless of party for the last several years.

1:07:35

So I don't know, I'm probably getting a little too political, but yeah, it's just, we're

1:07:39

not heading in a good direction.

1:07:40

Um, it's scary from a, from a privacy perspective, for sure.

1:07:46

Yeah.

1:07:46

Also just from a general like freedom perspective, this is kind of concerning.

1:07:52

Um, I think everyone can sort of regardless of your political alignments say that this

1:08:00

is not really a great, uh, aspect in terms of people's privacy.

1:08:04

This is kind of invasive.

1:08:05

it's collecting too much information just to enter the country.

1:08:10

Um, I feel like that's a reasonable position to take on this, um,

1:08:14

regardless of what you think about whether this is a good idea or not.

1:08:19

Um, but yeah, I don't, I don't really have too much more to add on this

1:08:22

unless there's something you could add from a more American perspective.

1:08:29

No.

1:08:31

Um, yeah, I just not, not to give it a pass, but going back to what I said,

1:08:35

I see the argument of having to, trying to weed out the bad guys, but I just wonder about

1:08:40

the efficacy of this.

1:08:41

And you know, the question that I've always asked in the past is like, what are we giving

1:08:45

up in exchange for something?

1:08:47

You know, we want to keep the children safe, but if we're taking away all their rights,

1:08:50

is it really a worthwhile trade off?

1:08:51

Like how much are we improving their safety versus how much we're taking away?

1:08:55

And I feel like that really applies here.

1:08:58

How much are we weeding out the bad guys and keeping the country safe in exchange for violating

1:09:03

so many people's privacy. That's very concerning to me.

1:09:08

Yeah.

1:09:10

All right. I believe our next story will take us to India.

1:09:16

Yes. So India considers enforcing a GPS on mobile devices.

1:09:22

So this is sort of a quick little story here.

1:09:25

India's government is considering a proposal to force

1:09:28

smartphone manufacturers to enable GPS tracking at all times. This is just a short write up on

1:09:34

the privacyguides.org/news category by Fria. So the proposal comes courtesy of the Cellular

1:09:42

Operators Association of India, a non-governmental trade association representing reliance. Okay,

1:09:50

I'm not going to try. Just you can see on the screen. Some of the biggest Indian telecom companies,

1:09:56

The proposal is a response to the Modi administration's

1:10:01

rostrations that they often don't get precise locations when making legal

1:10:05

requests to telecom firms, since they rely on cellular tower

1:10:09

triangulation, which isn't always very precise.

1:10:14

So AGPS is Assisted Global Positioning System is a form of GPS that combines

1:10:20

a satellite-based GPS with some other technology like Celia Network and

1:10:25

nearby Wi-Fi access points to achieve a much more precise location than either technology

1:10:31

could achieve alone. The proposal would see to it that smartphone users can't disable

1:10:38

the technology. This would mean that the location services would always have to be enabled without

1:10:43

an option to turn them off. Apple, Google and Samsung all oppose the move on the grounds

1:10:50

that a GPS network services is not deployed or supported for location surveillance, and

1:10:58

that the move would be a regulatory overreach, according to a letter from the Indian Cellular

1:11:04

and Electronics Association, a different non-governmental organization representing Apple and Google.

1:11:11

India's government has scheduled to meet with the top smartphone manufacturers to discuss

1:11:15

the matter, but the meeting was postponed for unknown reasons.

1:11:19

at this point, no decision has been made.

1:11:23

So I think one interesting thing we've seen with India

1:11:27

is they sort of have been pretty ready to sort of implement

1:11:33

these surveillance things.

1:11:35

Like I know they've got quite a comprehensive digital ID system

1:11:39

and that's certainly a bit of an issue

1:11:42

with a lot of surveillance things.

1:11:45

But I think the ability to get even more precise location from this AGPS thing is, I don't know, it just seems a bit, it's like you're treating every single person as a suspect at that point.

1:12:02

Like every single person has got to be tracked, their location is to be recorded by the government.

1:12:07

That seems like, I feel like that's like 1984, you know, like that's like, that's like kind of like dystopian, you know.

1:12:15

But it seems that the Indian government doesn't appear to think so.

1:12:21

It says here, Apple plan to outright refuse the order and the Indian

1:12:25

government quickly backed down and reversed the requirement.

1:12:29

So it was because the manufacturers would have to install a government app by default

1:12:34

and it wouldn't be able to be removed by users.

1:12:38

Basically, that app would be able to provide the location tracking at all times.

1:12:44

So yeah, it's a bit of an issue, right?

1:12:48

Especially when there's these governments trying to strong arm

1:12:51

like smartphone manufacturers into like implementing these tracking measures.

1:12:57

I think it's quite a bit of a problem, especially when this is so wide

1:13:03

reaching in society, like every single person is going to have to have their

1:13:06

device location tracked at all times.

1:13:10

It's it's it's problematic in a lot of ways.

1:13:14

But yeah.

1:13:18

Yeah.

1:13:19

I don't have a lot of thoughts on this that you didn't already state.

1:13:22

It's India's going a little bit hard in the paint right now with, uh,

1:13:26

trying to really force things on their citizens.

1:13:28

And like you said, originally it was this, this app, this, uh,

1:13:31

Sanchar Sathi, which I'd probably mispronounced that.

1:13:33

And I apologized to all the Indian listeners, but they were like, oh,

1:13:36

it's this cybersecurity app.

1:13:37

And then they backed down from that.

1:13:39

And they were like, oh, well, you know, everybody downloaded it.

1:13:41

So it's cool.

1:13:42

And then they were like, Oh,

1:13:43

But now encrypted messengers need to link to a SIM.

1:13:46

And that also, I believe, got repealed, at least informally,

1:13:52

got repealed.

1:13:53

And now they're like, oh, but now this assisted GPS tracking.

1:13:59

And it's just--

1:14:01

I don't know.

1:14:01

It's really weird.

1:14:02

And the fact that they keep--

1:14:04

it's the fact that it keeps happening.

1:14:06

The cynical part of me wants to make the joke.

1:14:08

It's like, don't you know you're supposed to put a few months

1:14:10

of space in between all these attempts?

1:14:12

Like the fact that it just keeps happening like every other week and it's like something different every time it's like, what is going on there?

1:14:19

It's you know, earlier I talked about giving politicians the benefit of the doubt, but I'm having a hard time doing it here for sure.

1:14:26

So yeah.

1:14:27

Yeah, I don't I feel like we don't have to explain why this is a problem, right?

1:14:31

Like having everyone's location monitored by the government at all times.

1:14:36

I feel like most reasonable people would be against this, but apparently the Indian government thinks that this is completely reasonable.

1:14:42

So I don't know, but I think it's, I think if you live in India, you need to really make

1:14:52

a stink about this because this is like, this is mass surveillance.

1:14:55

Like that is by the letter mass surveillance.

1:14:59

So definitely, definitely, I'm not too familiar with Indian politics.

1:15:04

So I'm not really sure what avenues you might have to, to fight against this.

1:15:10

But I've definitely do some research on that.

1:15:13

I can't really provide any info on that, unfortunately.

1:15:17

- Yeah, they do, I know they do elections

1:15:23

because I know their elections last for like a month.

1:15:25

Like it's a pretty intense process.

1:15:28

So they definitely have elected officials.

1:15:30

So I would imagine there is a way to contact them, I hope.

1:15:36

All right.

1:15:38

And that'll take us to our next story, which is,

1:15:45

this is a big deal.

1:15:46

I'm a little bit surprised

1:15:47

'cause when I first heard about this, I was like,

1:15:49

"Oh no, this is gonna be like the next just never ending,

1:15:54

never ending vulnerability that we keep hearing about."

1:15:57

But so far, I feel like it's been a little bit quieter

1:16:00

than I expected.

1:16:01

Not to say it's not bad,

1:16:02

it's just been quieter than I expected.

1:16:04

So anyways, there's a new vulnerability called React to Shell

1:16:09

and I do apologize that I'm kind of reading this story

1:16:14

in real time.

1:16:15

This was kind of a last minute story for me,

1:16:19

but there's this new remote code execution

1:16:21

that has exposed over 77,000 IP addresses

1:16:25

and has breached over 30 organizations.

1:16:27

So again, it is a big deal.

1:16:29

I was just expecting to see it in my head

1:16:30

lacks a lot more.

1:16:32

So this is an unauthenticated remote code execution vulnerability

1:16:35

that can be exploited via a single HTTP request

1:16:38

and affects all frameworks that implement React server components,

1:16:41

which includes Next.js, which is very popular.

1:16:46

So yeah, it...

1:16:48

Bleeping Computer likes to get really, really deep in the weeds

1:16:51

on a lot of these kind of vulnerabilities.

1:16:53

So I'm probably not going to read the whole article

1:16:55

because they get very technical.

1:16:59

But I do know that next.js is a very big vulnerability,

1:17:04

or not vulnerability, excuse me, a library, I believe.

1:17:06

It's something that I know gets used quite a lot.

1:17:09

So React is close, I'm sorry.

1:17:14

Yeah, yeah, it's a very popular one.

1:17:17

So React disclosed the vulnerability on December 3rd

1:17:19

and explained that unsafe deserialization

1:17:22

of client-controlled data inside React

1:17:24

enables attackers to trigger remote code execution

1:17:27

without authentication.

1:17:28

So basically, for those who don't understand remote code

1:17:31

execution, it's remote.

1:17:34

They don't have to have physical access to your device

1:17:36

or anything like that, as long as they can get access

1:17:39

to the server.

1:17:40

I have seen a few of the projects that I follow

1:17:43

have been pushing out updates that specifically

1:17:45

in the patch notes say like, this is to fix the React,

1:17:49

what is it called?

1:17:50

React to Shell Flaw.

1:17:51

So I'm glad to see some people are responding to this.

1:17:54

So on December 4th, a security researcher

1:17:58

published a working proof of concept

1:18:00

demonstrating the command execution

1:18:01

against unpatched servers.

1:18:02

And soon after scanning for the flaw accelerated

1:18:05

as attackers and researchers began.

1:18:06

So this is one of the reasons that responsible disclosure,

1:18:11

and I'm not saying this person didn't responsibly disclose,

1:18:13

I don't know the details on that,

1:18:14

but this is one of the reasons that responsible disclosure

1:18:17

is such a big deal in the security community

1:18:19

because once you go public with the details,

1:18:21

especially like detail detail of how this works

1:18:23

and approve a concept.

1:18:25

They like literally the next day,

1:18:27

I start seeing articles in my newsfeed

1:18:28

about like all of a sudden scans have gone up

1:18:31

and attackers are trying to find people

1:18:33

who are using this outdated server or server software

1:18:37

or whatever, we see that kind of stuff a lot.

1:18:41

So yeah, the researchers determined

1:18:43

that IP addresses were vulnerable

1:18:44

using a detection technique

1:18:46

where an HTTP request was sent to servers

1:18:49

to exploit the flaw and a specific response

1:18:51

was checked to confirm whether the device was vulnerable.

1:18:54

So again, they could do this automated scanning

1:18:55

to see who was vulnerable.

1:18:58

Gray noise recorded 181 distinct IP addresses

1:19:01

attempting to exploit the PLAW over the past 24 hours

1:19:04

at the time this article was written.

1:19:05

Scans are primarily originating from the Netherlands, China,

1:19:08

the US, Hong Kong, and a small number of other countries.

1:19:11

So that's interesting, Netherlands.

1:19:12

Did not expect to see them pop up on this.

1:19:15

Palo Alto Networks reports more than 30 organizations

1:19:17

have already been compromised.

1:19:19

Attackers exploiting the vulnerability to run commands,

1:19:21

conduct reconnaissance and attempt to steal configuration and credential files.

1:19:25

And some of these compromises are linked to known state associated threat

1:19:30

actors. So yeah, let's see here.

1:19:37

Let's see, Cloudflare has rolled out.

1:19:40

No, you're good.

1:19:41

I just noticed in the United States, there was 24,000.

1:19:45

That's like the highest out of every other country.

1:19:46

And just thought that was interesting.

1:19:49

Yeah, that surprises me too.

1:19:51

I wonder if they've got to be using VPNs or something, I think.

1:19:57

So yeah, it says here further down in the article, because again, I'm skipping over

1:20:02

some of the more technical details.

1:20:05

But if you're a tech savvy person who wants those details, definitely check those out.

1:20:09

Because again, bleeping computer usually digs deep into the step by step.

1:20:14

They say that Cloudflare has already rolled out emergency detections and mitigations.

1:20:18

However, it inadvertently caused an outage.

1:20:21

So if some of the websites you use earlier this week kind of went down for a little

1:20:24

bit, that could be why CISA has added the CVE to their known exploited

1:20:30

vulnerabilities catalog and is requiring federal agencies to patch by December 26th,

1:20:35

which is a very generous timeline in my opinion, but maybe I'm wrong.

1:20:41

Um, let's see here.

1:20:42

Yeah.

1:20:43

So we, uh, we see these kind of vulnerabilities happen from time to time.

1:20:48

It's, you know, if, if, uh, if one organization uses a certain

1:20:53

library or a certain, uh, framework in this case, you know, then.

1:20:57

Unfortunately, everybody who uses it could be exposed.

1:21:00

So thankfully, like I said, I haven't seen too many.

1:21:05

What I've seen a lot in the past is one thing happens.

1:21:08

And then every single week for six months, there's like six new companies

1:21:11

are like, Oh, we had a data breach because of this third party thing.

1:21:15

I personally have not seen that yet, but also I guess this is still pretty early and could still end up happening.

1:21:23

So yeah, hopefully a lot of the projects you use are patching for listeners out there.

1:21:29

They definitely should be because I believe this vulnerability was rated like a maximum severity rating.

1:21:35

Like this is like a severity 10 vulnerability.

1:21:38

So like they 100% should be.

1:21:41

And I would hope that everyone is rolling out fixes sooner rather than later,

1:21:46

like not the 26th of December.

1:21:48

That's kind of a long time.

1:21:50

Yeah, that's like three weeks away.

1:21:51

That's crazy.

1:21:53

Yeah, especially for like a remote code execution vulnerability.

1:21:56

That's quite a long time.

1:21:58

And I think there are versions rolling out now that have got the

1:22:02

that have it fixed.

1:22:03

So I think all it requires is just to make sure you've updated.

1:22:09

So that might be more difficult in some organizations or might be more easy in some organizations.

1:22:16

So it kind of makes sense.

1:22:22

But yeah, this also affected like other things that like include or depend on the React server, basically.

1:22:32

So I think the ones that I listed here is the React Router, Waku, RSC, Plugin RSC, RW, SDK, and Next.js.

1:22:45

So I guess if there's anyone who's like a developer or anything like that, you'd probably know by now, unfortunately.

1:22:53

But yeah, it's definitely an unfortunate vulnerability

1:22:58

and definitely hope that people are updating

1:23:03

their backend stuff right away.

1:23:08

- I just wanted to say I went and checked,

1:23:11

followed several links and yeah,

1:23:13

that's a 10.0 vulnerability.

1:23:16

That's crazy.

1:23:16

I don't know that I've seen one of those before personally.

1:23:20

I've seen a lot of like nine point somethings,

1:23:22

But wow, 10 is I think as high as the scale goes.

1:23:27

- Yeah, definitely a pretty bad one.

1:23:32

Oh, we just got another five privacy guides,

1:23:35

memberships gifted by Jonah.

1:23:38

So if you got one of those, make sure to say thank you.

1:23:41

Thanks, Jonah, for that.

1:23:45

- Jonah's so generous.

1:23:46

- 10 gifted memberships, oh my goodness.

1:23:50

- I know.

1:23:53

Yeah.

1:23:55

All right.

1:23:56

I think it's time to head over to the forum and look for questions unless you have anything

1:24:02

else to add.

1:24:03

No, yeah, definitely.

1:24:04

That's everything there.

1:24:05

Let's move on over to the forum here.

1:24:08

Was there any updates that you thought were very important to mention this week?

1:24:13

Oh, yes.

1:24:15

So we do have some bad news that we're going to cover real quick that has been getting

1:24:21

a lot of discussion, which is that Techlor has shut down their forums, actually.

1:24:27

So let me, we have a discussion going on about that in our forums that has been very active.

1:24:35

But I real quick, I do want to read Techlor's own statement about it over on his site.

1:24:42

Basically he says, "Oop, there's a pop-up."

1:24:46

Basically says, "TechLore has a new home.

1:24:48

Everything is now consolidated at TechLore.Tech and powered by Ghost, but there's difficult

1:24:52

news too.

1:24:52

We're closing our public forum."

1:24:54

So as of today, which is a couple days ago, the forum is read only.

1:24:59

It will remain accessible until June 1st of 2026 so you can export your data and reference

1:25:03

valuable threads.

1:25:04

After that, all user data will be automatically deleted.

1:25:07

Users can still DM each other to exchange contact information if they wish.

1:25:11

He says this was a difficult decision.

1:25:13

We went back and forth on for months.

1:25:14

The forum has been a space where many of you connected

1:25:16

and shared knowledge.

1:25:18

Losing that hurts, not just for the community, but for us too.

1:25:20

He says, but the reality is we were a two-person team.

1:25:23

We haven't been able to give the forum the attention it deserves.

1:25:27

And forums require sustained focus, energy, and care.

1:25:30

And that's energy we want to direct at our core mission.

1:25:34

So yeah.

1:25:35

And then he goes on to talk about moving into focus

1:25:40

on their other content.

1:25:42

So yeah, thanks to the privacy dad

1:25:45

who went ahead and shared this on our forums

1:25:47

'cause that's definitely where I first heard about this.

1:25:50

And then thanks also to JG who went ahead

1:25:53

and posted the actual link to Techlor's statement.

1:25:56

So yeah, it's been a very polarizing thing

1:26:03

among our own community.

1:26:08

You know, I think nobody's really happy about it

1:26:10

from what I've seen for sure.

1:26:12

And it's very unfortunate, but it's,

1:26:17

yeah, I don't know.

1:26:18

I think, I guess the reason it's been especially unfortunate

1:26:21

is a lot of people, I don't know,

1:26:24

there's a, how do I want to word this?

1:26:28

I've seen a lot of people really like the idea of diversity

1:26:32

and just having more options in general,

1:26:35

which is something I'm a big believer in too.

1:26:37

And so I think that's a major reason

1:26:41

that everybody's been really kind of bummed about this.

1:26:45

But I don't know, it's a,

1:26:48

and I know it's, I don't know,

1:26:51

I don't have a lot of thoughts other than what Henry said.

1:26:53

You know, it's two people over there at TechLore right now

1:26:55

and that's, forums require a lot of work.

1:26:58

You know, we're really fortunate to have

1:26:59

a really good team here of volunteer moderators

1:27:03

that help keep an eye on things

1:27:05

and our forums are super, super active.

1:27:08

I know it's not easy to keep up with a forum

1:27:10

or any kind of community really.

1:27:12

So that's a lot for two people to manage.

1:27:16

Did you have any thoughts on this story,

1:27:18

on this development, Jordan?

1:27:20

- No, it's just an unfortunate outcome,

1:27:23

but yeah, I think it's,

1:27:25

there was definitely some interesting information there.

1:27:27

I would definitely look at exporting your account's data.

1:27:30

That's a thing you can do in discourse.

1:27:33

And I think, yeah, it's interesting to see that they're taking a direction towards more

1:27:41

digital rights activism, which is really exciting.

1:27:44

So I think I'm excited to see where that leads.

1:27:46

But I think that the forum will be sorely missed.

1:27:50

I definitely was a fan of that forum and I was pretty active over there.

1:27:55

So it is unfortunate it's going away, but as you can say, all things come to an end eventually.

1:28:01

So that's unfortunate, but yeah, definitely looking, definitely very excited to see what

1:28:06

future stuff their team is working on.

1:28:11

Agreed.

1:28:13

Yeah.

1:28:14

Yeah, Jonah says over here in the chat, end of an era, we really need more places to constructively

1:28:19

discuss privacy advice, but we're coming at things stronger than ever at PrivacyGuide,

1:28:23

so hopefully we can pick up some slack.

1:28:25

So yeah, I mean, of course, if you were a TechLore forum user, we would love to have

1:28:29

you over at the privacy guides forum. We now have the, what did I talk about at the top?

1:28:35

The experience level tags. So, you know, if you feel like you're still pretty new to this

1:28:41

stuff, you can go ahead and tag yourself with a beginner tag if you want to. And yeah, you're

1:28:46

definitely welcome here. And I don't know. Yeah, there's, you know, they're talking about

1:28:52

some other stuff too, because again, it's the diversity that people really appreciated.

1:28:56

So people are talking about different Lemmy communities.

1:28:58

And it's a good discussion to check out

1:29:01

and see what else is out there.

1:29:02

People mentioned a few specific project forums,

1:29:04

but those are like, you know, Graphene Cubes.

1:29:06

Those are dedicated to those projects specifically

1:29:08

and not always privacy in general.

1:29:12

So yeah, but I'm with you.

1:29:15

I'm excited to see what TechLore

1:29:17

is gonna be putting out in the future.

1:29:19

They've been putting out some really good videos lately.

1:29:21

So wishing them the best on that for sure.

1:29:24

- Yeah, some really good stuff,

1:29:26

especially on the, I saw the video about the digital omnibus and I thought that was great.

1:29:31

So it's really good stuff to see coming out.

1:29:34

And yeah, we're definitely also working on some activism stuff here as well,

1:29:39

which I think will be really exciting.

1:29:41

But it's, it's definitely, definitely a change.

1:29:46

I'll definitely miss it.

1:29:47

But yeah, all the best to them in the future endeavors.

1:29:53

Yeah.

1:29:53

I'm catching up on old content.

1:29:55

He's got a lot of interviews with people

1:29:57

that have been really interesting.

1:30:01

All right.

1:30:02

I think that'll bring us into the questions.

1:30:05

So we're going to start by taking questions from our forum,

1:30:09

specifically from our paying members.

1:30:11

And if you want to become a member,

1:30:13

you can go to privacyguides.org,

1:30:14

click the little red heart icon in the top right corner

1:30:17

of the page, and you will be presented with options there.

1:30:20

And then once we have checked the forum,

1:30:22

we will go ahead and bounce over to the live chat,

1:30:24

where I know we have been getting some questions.

1:30:27

So thank you guys who have been active.

1:30:31

I actually, it doesn't look like there have been

1:30:33

a whole lot of questions this week,

1:30:34

but there have been a few posts I wanted to mention.

1:30:38

One of them was, JG said it would be really cool

1:30:40

if privacy guides talking about the highlight story

1:30:43

this week about the man who got arrested in Atlanta.

1:30:45

He said it'd be really cool if we could get in touch

1:30:47

with EFF to answer questions as far as legal stuff and issues.

1:30:52

So it wasn't EFF per se,

1:30:54

but I did reach out to a lawyer and I hope that was helpful.

1:30:57

You said, this would help us better understand how to

1:31:00

and how not to enter the country if you're an activist.

1:31:03

So yeah, I know we didn't answer that specifically,

1:31:06

but I did talk about how there's,

1:31:08

it's really a lot of gray area when it comes to cell phones,

1:31:11

searches at the border right now

1:31:12

and definitely need to just err on the side of caution,

1:31:16

I think.

1:31:18

And then JG also asked if Jordan could weigh in

1:31:22

about the social media ban, which you did.

1:31:24

So, trying to see what other questions here.

1:31:29

Feel free to chime in if I missed one, Jordan.

1:31:33

- Yeah, just seeing a couple of speculations here about,

1:31:36

you know, was the activist that was arrested,

1:31:39

were they using Graphene OS?

1:31:40

We don't know.

1:31:41

So, I mean, I hope they were, you know,

1:31:45

but yeah, I don't, we don't know,

1:31:48

but it would be interesting to hear.

1:31:52

More details about this.

1:31:53

We're kind of waiting on the media to pick it up or like for more information to get released basically at this point.

1:32:03

I wouldn't be surprised on that note if 404 does release because I think that's where we originally picked up the story.

1:32:09

And 404 in my opinion does a really good job of keeping up with stories because one of the drawbacks of the news cycle is a lot of the time once something falls out, people just move on.

1:32:20

So I really appreciate organizations of any kind that do kind of like stick to a story and update whenever there are updates.

1:32:27

So hopefully they will.

1:32:29

But somebody asked a few questions about how the forum works specifically, which it looks like Jonah went in and answered those.

1:32:39

Yeah, so not a lot of questions from the forum specifically this week.

1:32:45

But, and Jonah just gifted a whole new round of memberships.

1:32:50

Oh, no, no, no, that was solid this all.

1:32:52

Sent five privacy guides, gift memberships.

1:32:54

Hey, thank you for doing that.

1:32:56

How nice of you.

1:33:00

Man, this is like Oprah, you get a membership,

1:33:02

you get a membership, you get a membership.

1:33:04

I love it.

1:33:06

Thank you to all the members, the new members.

1:33:09

Okay, so scrolling back up to the top here,

1:33:16

this is kind of a difficult thing.

1:33:17

Can a bidder, you said,

1:33:19

shouldn't the Supreme Court's job be to interpret the law

1:33:21

and not the public opinion?

1:33:23

That would be nice, but according to my lawyer friend,

1:33:26

there is unfortunately politics

1:33:28

that goes into this stuff sometimes.

1:33:29

So I hate to say that,

1:33:33

and that's not me trying to cast any kind of aspersions

1:33:37

one way or the other.

1:33:38

That's always just unfortunately been part of the system here in the US.

1:33:42

And I don't like it, but yeah.

1:33:46

That's also why a lot of the time you will see a case that will go like, it'll work its

1:33:53

way up through the courts.

1:33:54

And then right when it gets to the Supreme Court, the, the, um, somebody involved in

1:34:01

the case, usually the person pushing the case, the, what do they call that, the plaintiff,

1:34:04

I think they'll drop the case because their fear is that if they take it up to the Supreme

1:34:09

Court, the Supreme Court will rule in a way against them and then it will become legal

1:34:14

precedent across the US.

1:34:17

That's not exactly a one-to-one of what you're saying, but the point is it's a lot of the

1:34:20

time there is a little bit of political game theory that goes into our political system.

1:34:27

It's unfortunate.

1:34:33

So Saad, this all says love the guitars in the background.

1:34:34

Thank you.

1:34:36

I appreciate it.

1:34:37

There's a few I'm proud of.

1:34:41

Let's see.

1:34:42

You said you're not sure about disappearing messages

1:34:44

for most people.

1:34:45

As a recommendation, I'm not super sentimental about old messages,

1:34:51

to be honest.

1:34:52

But I recognize that not everyone shares that opinion.

1:34:57

So--

1:34:58

Especially because it's so easy at this point.

1:35:00

Like, you know, signal, you can just swap a setting in the settings and then every single

1:35:04

message you send in the future is going to expire.

1:35:07

I know I just put it on four weeks.

1:35:10

I'm not too sentimental about things, but I think it's an easy change to make and you

1:35:16

never know what you've sent in the past that you might not want to be existing forever

1:35:21

on someone else's device, which is what would happen if you didn't have it.

1:35:24

So definitely worth thinking about a bit more if you need that sort of protection, but I

1:35:29

I can completely understand if you are a little bit sentimental about your messages.

1:35:41

Somebody, Abdelhaq Kour, apologies if I pronounced that wrong.

1:35:44

You said, why isn't Session Messenger on the PrivacyGuides website?

1:35:47

I believe the biggest reason, and we actually talked about this the other week,

1:35:50

as I believe a headline story, is the lack of perfect forward secrecy,

1:35:55

which is basically, if you didn't know, it's when every so often the encryption

1:36:00

keys rotate. So that way if somehow, if the encryption were to be cracked by like

1:36:06

a government computer or something, they would only get access to a certain window

1:36:10

of messages instead of your entire message history. However, we did announce last week

1:36:15

that coming in 2026 session will be adding PFS. So obviously I cannot say anything for

1:36:21

sure. We don't know the future. There may be other factors that I'm not aware of,

1:36:24

but I wouldn't be surprised if there is at very least a discussion revisiting session

1:36:28

and whether or not we should add it. So if you have opinions on that, definitely join the forum

1:36:33

and be part of that discussion when it comes up. So there's comments about doom scrolling.

1:36:44

I will address this one actually. You also asked if you care about privacy, why is your face on

1:36:48

on YouTube, I chose to make that decision

1:36:54

just to help spread the message.

1:36:55

And obviously I could have gone like the route

1:36:57

like Jordan is using the,

1:37:00

I don't know what Apple calls those, the emoji thing.

1:37:04

But Jordan is using that like the hated one, for example,

1:37:07

just does a voiceover.

1:37:08

I could have gone that route.

1:37:09

And honestly, if I were doing it today,

1:37:11

I maybe would consider doing like the VTuber thing

1:37:13

potentially, but at the time that I started doing

1:37:17

YouTube videos, VTubers weren't really, at very least they weren't a super popular

1:37:21

thing like they are now.

1:37:22

And, um, I don't know.

1:37:25

I, I felt like for me, it was worth giving up a little bit of that privacy to

1:37:29

help spread that message.

1:37:30

And, um, I don't know.

1:37:33

I, I, it doesn't really bother me so much anymore.

1:37:36

There's other steps that I take to try and protect my privacy.

1:37:39

Like, um, you know, for, I'm actually pretty open about this.

1:37:42

Nate is not my real name.

1:37:44

It's, it's a stage name, if you will.

1:37:45

And, you know, I take a lot of steps to compartmentalize my life and remove data broker data and stuff like that.

1:37:54

So, you know, privacy is a sliding scale.

1:37:56

It's not like an all or nothing.

1:37:57

Like, could I have better privacy by not showing my face?

1:37:59

Absolutely.

1:38:00

I could also have better privacy by just not having a YouTube account in general.

1:38:04

So, it's kind of up to everybody to judge their own threat model and what's important to them.

1:38:09

And I was willing to not to like try to make myself sound grand or anything, but I was willing to make that

1:38:15

personally to help spread that message.

1:38:17

So, and then I think this is when they started handing out

1:38:24

memberships.

1:38:26

[laughter]

1:38:28

Yeah, I definitely think there's risks with showing your face.

1:38:34

And I think especially for people that are of a certain group of people,

1:38:41

it can be kind of problematic.

1:38:42

So yeah, I personally prefer not to because yeah, just not something I really want to

1:38:49

deal with, but I think it's good to also make that sacrifice as well.

1:38:55

But I think it's certainly up to the individual, you know, like whatever you're comfortable

1:39:00

with I think is important to think about and choose what's the best option for you.

1:39:13

Nate looks like Harry to me.

1:39:17

What I miss?

1:39:18

Oh, Harry who?

1:39:21

Just because you said your name wasn't actually named but yeah.

1:39:25

Oh, like Harry.

1:39:27

Okay.

1:39:29

I don't know if I have an opinion on that.

1:39:31

Actually, I will say when I was younger, like middle school younger, I got, this is going to be a hard thing for younger people to think about.

1:39:41

to wrap their heads around, but once upon a time,

1:39:42

Harry Potter was not cool.

1:39:45

And I got bullied, picked on a lot for looking like Harry

1:39:50

Potter.

1:39:51

So yeah.

1:39:53

Interesting.

1:39:55

Yeah, I know nowadays that would get you cool points.

1:39:57

But back then it was an insult.

1:40:02

Somebody asked if we mentioned Noster.

1:40:04

We didn't bring up Noster.

1:40:06

I don't believe we recommended on privacy guides either.

1:40:08

If you think we should, again, head over to the forum,

1:40:10

started discussion. Do we still allow discussions on GitHub? I know that used to be a thing.

1:40:19

No. Well, I mean, unless it's an issue, but there's no GitHub discussion. Gotcha. Gotcha.

1:40:27

Okay. Then yeah, if you think we should head over there. And then yeah, somebody else asked,

1:40:33

what are the benefits of being a member of PrivacyGuides? For now, I think the biggest thing is

1:40:37

probably the early access to videos by, I think it's like, man, how long was the,

1:40:45

oh no, it's still member only the threat modeling video is still member only.

1:40:50

But yeah, now we've got the smartphone course as well. And yeah, so early access to videos,

1:40:57

but I do know we're always having discussions and taking suggestions on

1:41:02

additional perks that we can offer to members. So yeah.

1:41:07

Definitely worth checking out the videos that are like sitting there because they're not quite done yet.

1:41:12

Like we want to release the entire course to the public in one go.

1:41:16

We don't want to just like have it sitting there and not be complete.

1:41:20

So that is why it's sitting as members only for the foreseeable future.

1:41:25

We're hoping to have the stuff published by the end of the year.

1:41:29

So that'll be public then.

1:41:30

But it is kind of important to make sure we get all of the course done.

1:41:34

is kind of a lot of work to get that all completed,

1:41:38

especially because there's a lot of information in the course.

1:41:44

But yeah, it's looking pretty good so far,

1:41:46

but there should be a regular upload soon.

1:41:50

We're just been working a little bit more on the courses.

1:41:52

So yeah.

1:41:55

- Yeah, which again, I really wanna thank you for that

1:41:57

'cause I feel like in my personal opinion,

1:42:00

as somebody who has edited all my own videos

1:42:03

start to finish over at the new oil.

1:42:04

Like I feel like the, the most involved part is the part that you're doing, like just a

1:42:12

quick peek behind the curtain.

1:42:13

What's been happening so far with the videos I'm on is I film them, I do the initial rough

1:42:18

cut just to kind of cut out all the dead space and the mistakes and, and then I send it over

1:42:22

to Jordan and Jordan does all the color correction and adds all the really cool animations and

1:42:26

graphics and stuff that I could never figure out how to do in a million years.

1:42:30

And so I feel like Jordan's doing the majority of the work by far.

1:42:32

And I really, I think they're crushing it too.

1:42:35

And I really appreciate all the hard work they do.

1:42:37

So.

1:42:38

Yeah, no, I think it's definitely, uh, it's going to be really great once people

1:42:41

get to see, because I feel like nothing's public yet.

1:42:44

But, um, I think once everyone gets to see, um, the smartphone security course,

1:42:50

I think they're going to be very impressed.

1:42:53

Cause yeah, I do, I do think you did a great job with explaining stuff.

1:42:57

And just to be clear, it was Nate who wrote that.

1:43:02

So yeah, that's going to be coming soon.

1:43:05

Definitely look out for it.

1:43:07

But just one small thing is YouTube members are different to our forum members.

1:43:14

You only get access to the early videos on YouTube, if you're a YouTube member.

1:43:19

So it's kind of an unfortunate thing, but just another thing to cover here.

1:43:25

And also, Nate, can you either neither confirm nor deny the Harry allegations?

1:43:34

You know, I think my thing lately is I'm just going to deny everything.

1:43:38

And yeah.

1:43:41

Yeah, I was in an online chat room one time a few a while back where somebody was like,

1:43:47

hey, aren't you, Nate?

1:43:48

And I was like, I don't know who you're talking about.

1:43:49

So I think that's just going to be my thing now.

1:43:53

Oh yeah, I guess people could recognize you I suppose.

1:43:56

Um, that is, that is something to think about.

1:43:59

Um, hello there, Dev, Dev Josh.

1:44:02

Hello.

1:44:04

Welcome.

1:44:05

Yeah.

1:44:08

All right.

1:44:08

Well, that is all the updates from this week in privacy.

1:44:11

They will be shared here on the blog every single week.

1:44:15

Um, so stay subscribed with your favorite RSS reader.

1:44:18

If you want to stay tuned, um, there are, I believe I'm

1:44:23

missing some parts to this sign off.

1:44:26

Oh no.

1:44:28

Well, I know I do wanna remind you guys that,

1:44:30

like I said at the top, Privacy Guides is a nonprofit

1:44:33

which researches and shares privacy related information.

1:44:36

And my cat is making a last minute cameo yet again.

1:44:39

And we facilitate a community on our forum and on matrix

1:44:43

where people can ask questions and get advice

1:44:44

about staying private online and preserving

1:44:46

their digital rights.

1:44:48

If you would like to support our mission,

1:44:50

you can go to privacyguides.org

1:44:52

And there is a little heart in the corner, the top right of the website.

1:44:56

You can click on that and you could become a member or you can make a one time donation.

1:45:01

And all that information is available there.

1:45:04

So thank you guys for watching.

1:45:05

Again, updates will be in the blog.

1:45:07

So subscribe with your RSS reader.

1:45:09

If you prefer audio, we do have a podcast style recording of updates, um, which you

1:45:15

can go find on our website as well, or I believe we'll also be in the description here.

1:45:20

There's a newsletter that contains a summary of all these stories if you want access to that as well.

1:45:26

So thank you guys for watching. Jordan, did I miss anything?

1:45:30

No, I'm not sure why that changes in the document, but yes, we will be good next week. Don't worry.

1:45:36

But yeah, thanks so much for watching.

1:45:38

Just making sure I covered it all.

1:45:40

Yeah.

1:45:41

Okay. All right. Thank you guys so much for watching and we will see you same time next week with the latest news from next week.

1:45:49

Bye-bye.

1:46:19

[MUSIC]