What Will the EU Do Next?
Ep. 34

Episode description

Happy New Year! Join us for This Week in Privacy #34!

0:00

[Music]

0:18

Welcome back to This Week in Privacy, our weekly series where we cover the latest updates

0:23

with what we're working on within the Privacy Guides community and this week's top stories in the

0:27

data privacy and cybersecurity space.

0:29

Privacy Guides is a nonprofit which researches

0:32

and shares privacy-related information

0:33

and facilitates a community on our forum and Matrix

0:36

where people can ask questions and get advice

0:38

about staying private online

0:39

and preserving their digital rights.

0:41

Before we dive into this week's show,

0:42

here's a rundown of how the show will be laid out.

0:45

Normally we would start by covering site updates,

0:47

but with the holidays and everything,

0:49

it's been a little bit of a slow week,

0:50

so we have no updates.

0:52

We will jump right into discussing top stories

0:55

in the data privacy and cybersecurity space.

0:58

After that, we'll explore some of the trending posts

1:00

in the Privacy Guides forum,

1:01

and then we'll answer viewer questions.

1:03

So if you have a question for us,

1:04

please leave a comment either in the forum thread

1:06

for this episode or in the YouTube chat.

1:10

My name is Nate and I am joined by Kevin.

1:12

Kevin, how are you doing?

1:14

- Having a wonderful year so far.

1:15

How are you, Nate?

1:16

Enjoying the year?

1:17

(laughs)

1:18

- Yeah, I mean, two days in, but so far, so good.

1:21

Nothing too crazy yet.

1:24

Yeah, I just think, you know, with 2026, like, here finally, like, I've been trying to make some new

1:31

privacy and security like resolutions, right?

1:33

And I was thinking, oh, wait, I wonder what Ireland's doing, you know?

1:37

And it turns out that, unfortunately, the Irish government is trying to

1:42

implement a few new resolutions, but this time, everyone in Europe must follow them.

1:48

Unfortunately, Simon Harris, the deputy prime minister or Tánaiste of Ireland

1:53

and the leader of the governing Fine Gael party, has given an interview with the outlet

2:00

known as Extra.ie, saying that they want Ireland to leverage its upcoming presidency of the

2:06

EU Council to lead a drive for ID verified social media.

2:11

Now for context here, I just want to like emphasize that Ireland has an interesting background.

2:18

They host the headquarters of almost every single big tech company out there in Europe.

2:24

Just for compliance reasons, of course, you will want some sort of headquarters in the

2:28

EU market.

2:29

And Ireland, because of its tax benefits, tends to host a lot of these big tech companies.

2:33

And because of that, we may observe quite a lot of lobbying, quite a lot of influence

2:40

by these big tech companies on the government.

2:41

And I'm not particularly surprised by this fact that, yes, Ireland is trying to lead the

2:47

push for ID-verified social media.

2:49

Now, mind you, this may seem a bit suspicious.

2:53

Why would a government which has a history of appointing

2:57

ex-Meta lobbyists to head

2:59

its Data Protection Commission, for example,

3:02

A government that's historically been so friendly

3:05

to these companies want to push

3:07

for ID verified social media.

3:10

Now, it isn't exactly done out of the goodness of their heart.

3:13

This person right here, Simon Harris,

3:16

has described that a lot of his motivation was meant to block "keyboard warriors" from spreading

3:21

hate and disinformation online. And of course a lot of the ID verification is still up in the air so far,

3:28

but based on what we know about Ireland, it's quite clear they'll try to push for more

3:33

big tech-friendly solutions to ID verification in ways that may comply with their potential

3:39

concessions that otherwise other countries may not particularly offer. In an interview with Extra.ie,

3:45

Tánaiste Simon Harris had said previously that the government will, of course, lead these calls for ID-verified social media accounts,

3:52

and that they hope to implement an Australian-style ban on children accessing social media during their presidency in the upcoming year.

4:02

This will obviously trigger some sort of showdown with the social media giants in Ireland,

4:09

but based on what we know, of course, there will be some sort of compromise, I believe,

4:15

from the EU and Ireland and these big tech companies. And the fact that there has been

4:20

potential support not only from the powerful EU leaders such as the French Prime Minister

4:26

and French President Emmanuel Macron and UK Prime Minister Keir Starmer, but we already have been

4:31

seeing a lot of levies and fines against people posting their personal opinions or

4:38

harassing government employees or government leaders in Ireland. In this example right here,

4:43

we see someone by the name of Sandra Berry, who was already given a six-month sentence for sending

4:48

vile social media messages to a party leader, calling him a murderer, and of course sending a

4:53

very unfortunate death threat, which is quite in bad taste. Now, of course, I think we're

4:59

seeing a bit of an unfortunate trend here in which a lot of government officials are reacting

5:04

to all these, like, unfortunate attacks on their family and threats to their personal safety.

5:09

And you know, I don't blame them sometimes. It's a very human approach to try to protect,

5:14

you know, colleagues and yourself. But oftentimes that should not mean clamping down on the free internet.

5:20

I think I remember a past story in which, like, the hiking app, or the running app, Strava,

5:27

it was used to actually identify politicians. And funny enough, these politicians proceeded to take

5:34

active steps, I guess, you know, to protect their own physical safety.

5:39

Well, also not realizing, oh, wait, maybe we should also care about the common

5:43

folks as well.

5:44

Maybe we shouldn't clamp down on free expression online.

5:47

Maybe some form of criticism is acceptable and you don't need

5:49

to de-anonymize the entire internet just to do so.

5:52

So right now I feel like these politicians are being blunt.

5:56

They're being quite candid in their attempts.

5:59

Um, it's very clear that they do not appreciate anonymous communication.

6:02

They believe that any sort of criticism against them

6:05

is a personal attack, regardless of whether it actually

6:08

does constitute it.

6:09

And right now, there are attempts to actually de-anonymize

6:12

you on social media, to implement an Australian-style age

6:15

verification system across Europe.

6:18

Makes it really clear that, yes, these governments only care

6:23

about themselves.

6:24

They only care about their own safety, their own well-being.

6:28

while not necessarily, like, accounting

6:30

for what you believe, which is the right to freely express

6:34

your thoughts, the right for you to whistleblow,

6:37

right for you to disclose responsibly to a press outlet,

6:41

and the right for you to actually access

6:44

information you want in a manner that you believe is accurate for you.

6:49

And Nate, I was just actually just wondering, though,

6:51

like based on what you know about the situation going on,

6:54

though, do you feel like Ireland would actually

6:59

succeed in promoting age verification and these de-anonymization mandates? Do you feel like

7:05

there'll still be a big pushback from these big tech companies, or do you feel like there'll

7:08

be some sort of compromise involved here?

7:09

Because it seems to me like they're asking for a lot, but obviously a complete de-anonymization

7:14

of the internet seems very unlikely.

7:17

So I'm feeling there has to be something like a compromise proposal that they will put up.

7:20

But I'm like, what would that look like?

7:23

I don't know. I never like to be a, what's the word I'm looking for? I never like to be a defeatist or a naysayer, but I mean, we have to point out that they did successfully pull off age gating social media in Australia, which the article, I think even the people you mentioned, the politicians specifically cited that as like this is the playbook we're following and, you know, we've covered several times in the past how everybody's just kind of, it's an unfortunate thing in

7:53

politics in general, but also in tech where one person does something and it opens the floodgate.

7:58

For example, we saw, I don't know if you remember, but there's a story a couple, I think like a year

8:05

or two ago about how Meta invented basically their Ray-Ban smart glasses. They invented that years

8:12

ago, but even Meta was just like, ooh, this is kind of creepy. This is kind of like, ooh, I don't know

8:17

if we're comfortable releasing this. And maybe to be fair, maybe that was them projecting and saying

8:21

like, I think people might be too creeped out by this. But then Clearview AI came along and scraped

8:26

everybody's photos and did the whole facial recognition thing. And so Meta felt kind of

8:32

emboldened after that. And now we do have the Meta Ray-Ban glasses that are freely available

8:37

over the counter. And apparently some people are actually buying them. And there's actually a whole

8:41

bunch of smart glasses. I don't know if they all have facial recognition. But the point being,

8:44

Like once one person breaks that levee, then all of a sudden it's like everybody else feels

8:52

empowered.

8:53

And unfortunately, I think Australia kind of did that with their social media ban.

8:58

And unfortunately, it hasn't been, as far as I know, it hasn't been enough of a failure,

9:05

certainly not on the level that the UK has.

9:07

But you know, so the UK has the online safety act, Australia has their thing.

9:11

And now we've got, I don't have the list in front of me, but there's so many European

9:15

countries who have openly said like, yeah, we want to age gate the internet and the US,

9:19

you know, we've talked at length about all the laws they want to pass here in the US

9:22

to do the same thing.

9:24

So it's a really unfortunate, um, yeah, it's, I guess what I'm trying to say is I feel like

9:31

the deck is stacked against us.

9:33

And I think we certainly can get them to back down because at the end of the day, politicians

9:38

will ultimately do what their voters want, you know, and if the voters raise enough,

9:45

I mean, just to say it bluntly, if the voters raise enough hell and they, you know, they

9:50

protest and they call their politicians peacefully protest, of course, and call their politicians,

9:54

eventually the politicians will listen because they know if they do something too unpopular,

9:57

they'll get voted out of office.

9:58

But I do have to admit that it's a bit of an uphill battle because they, you know, there's

10:05

already an existing precedent and unfortunately they're picking really good things from their

10:13

perspective to hinge their argument on.

10:19

Somebody sent in death threats, right, to the prime, or maybe not the prime minister,

10:22

but to one of these politicians and they can look at that and be like, "See, we need to

10:26

keep people safe from this kind of stuff," which that is reprehensible.

10:29

That's not cool.

10:30

No matter how much you hate somebody, violence is not the answer, but it's just unfortunately

10:35

like it's hard to argue with those kinds of things.

10:37

And it makes you sound very,

10:42

it makes us sound very callous when we say like,

10:45

"Oh, these things are just gonna happen no matter what."

10:48

But I don't know,

10:49

it's, we are very big fans, I think of

10:54

acknowledging that there are other ways

10:56

to accomplish things that don't require us

10:57

to age gate the entire internet.

10:59

'Cause sorry, I know I'm kind of jumping

11:01

from topic to topic, but that's a big thing

11:03

that they always say,

11:03

"Oh, we're gonna keep miners off social media."

11:06

Well, how do you know who's a minor?

11:07

You're gonna have to have everybody submit ID.

11:09

We say this every time.

11:10

So, it's really unfortunate.

11:14

And I think there has to be better ways

11:15

to clamp down on these legitimate issues

11:17

of harassment and bullying.

11:20

And he talks about fake,

11:23

they talk about bots and fake news and disinformation.

11:25

Like there has to be a better way to crack down on that

11:26

than just age gating everyone.

11:28

And I think real quick, I did,

11:30

I forget who I heard it from,

11:32

But somebody, it may have been Taylor Lorenz,

11:34

made a really good point that the idea that if we just,

11:38

you know, identify the entire internet,

11:40

that all these problems are gonna be solved is ridiculous,

11:41

because if you go on Facebook,

11:43

there's plenty of people using their real names

11:44

and saying horrible things and bullying each other.

11:46

So clearly that's not just gonna magically fix everything.

11:50

It's very unfortunate.

11:52

- Yes, it appears that enforcement is the prime issue

11:56

that the Tánaiste right here, like, really wanted to fix.

12:00

Just for your information, they do have a digital age of

12:04

consent in Ireland, which is around 16.

12:07

I'll be honest with you, I really hate all these weird terms.

12:09

There's a social media age requirement of 16.

12:12

There's a digital age of consent.

12:14

What does that mean?

12:15

But at the end of the day, I feel like based on what I know

12:18

about Irish law and their digital age of consent law,

12:21

the idea is if you're below 16, you cannot consent

12:25

to be on the internet.

12:26

You cannot consent to do anything that may involve

12:29

targeted advertising, for example, on social media platforms.

12:32

But the issue is like for them, of course, is that their domestic law is not being enforced.

12:36

It's unenforceable and there's really no real mechanism for them to do so.

12:40

And for some reason, rather than just keep it within the country,

12:45

they really want to implement a European wide enforcement mechanism to deal with

12:50

these "anonymous bots."

12:55

I also wanted to emphasize something here.

12:58

Like when someone mentions anonymous bots, troll farms,

13:03

hate speech, disinformation,

13:05

oftentimes it's just another way to kind of market

13:09

what you're trying to do,

13:10

which is to eliminate free speech online.

13:13

as a more palatable thing for the wider

13:16

Irish or European population.

13:18

I mean, no one likes those online Russian bots,

13:22

for example, spreading disinformation online.

13:24

It's annoying, I'm sure.

13:26

But the difference between actually implementing targeted,

13:29

sustained mechanisms to ensure that these bots don't occur,

13:33

working with platforms to ensure that it could be done

13:36

so in a targeted way.

13:38

And of course, just quote unquote,

13:40

"scrolling over everyone," just for the sake

13:43

of eliminating a few trolls here and there.

13:46

And I don't wanna downplay anything

13:48

that these politicians are worried about.

13:50

I'm sure they go through a lot.

13:52

Like in the article itself,

13:53

Like I think he mentioned that, you know,

13:55

that the deputy prime minister had, like, a bunch of people

13:57

protesting and chanting at his home.

14:00

I'm sure they get doxxed.

14:01

It's tough being a public figure.

14:03

I mean, anywhere, you know, be it online

14:05

or as a politician.

14:08

However, I think the fact that they're trying

14:10

to actually amend the EU Digital Services Act,

14:13

which was introduced in 2022,

14:17

it could really like lead to a really big showdown

14:20

between the EU under Irish presidency

14:24

and of course the current United States administration

14:28

which is really heavy-handed on freedom of speech issues.

14:33

So I think one thing that I do wanna point out is like,

14:39

it probably won't necessarily lead

14:41

to a complete de-anonymization of the internet

14:43

if you live in the European Union.

14:45

But what will happen of course

14:47

is that if it does end up broadening these age-gating aims

14:51

across such a huge market, the United States may step in.

14:56

And whether you think it's a bad thing or a good thing,

14:58

I leave entirely up to the audience, you.

15:00

But for me personally, I would say that regardless

15:03

of what politicians think is right or wrong,

15:06

it's usually always been spearheaded and influenced

15:09

by big tech corporations.

15:11

And you see this from time to time again.

15:13

In America, of course, there's actually a boxing match

15:15

between Google and Apple versus every single tech platform

15:20

out there trying to put liability either in the app store

15:25

or on individual websites.

15:26

And both are equally bad for their own reasons.

15:29

I guess here in Europe, it's kind of like, all right,

15:32

should we implement an age verification

15:35

requirement and require tech companies to comply,

15:39

maybe negotiate with them a bit, or face

15:42

a wider backlash from the Trump administration, who

15:47

personally, I'm trying to be unbiased here,

15:49

is very involved with free speech issues,

15:52

and it has been seen like targeting the UK's

15:56

Online Safety Act as well.

15:58

So it's like a bit of a mix,

15:59

but we can't really predict what's gonna happen.

16:01

Like in the European Union,

16:02

once Ireland does achieve the presidency of the EU council,

16:06

but what we can say though is that naturally,

16:09

just because they're saying something as a goal here,

16:11

doesn't mean it will necessarily happen,

16:13

we defeated Chat Control before

16:16

under the Danish presidency.

16:18

And while Denmark has failed to attain a lot of its goals,

16:21

they have actually reached past agreements

16:24

in which they were able to reach

16:25

some sort of compromise proposal

16:27

that will be tabled in 2026.

16:29

And I guess now it's up to Ireland to lead that push as well

16:33

once they do have the presidency.

16:36

So once again, I do want to emphasize

16:39

that it is not the end of the world.

16:41

Maybe we shouldn't be defeatist.

16:42

We have defeated Chat Control before.

16:44

And I'm sure like just because like the new conversation

16:47

maybe about age verification doesn't mean that, you know,

16:50

the Europeans and people across the world can like

16:52

band together, try to fight it.

16:54

'Cause ultimately like your voice does matter.

16:57

And I'm sure the same grassroots campaigns that motivated

16:59

the push against Chat Control will rally

17:01

against Ireland's push for these age-gating measures.

17:04

And I'm sure Nate, like we will make sure to like inform you

17:08

and keep you safe from these proposals.

17:12

- Yeah, for sure.

17:13

Well said.

17:15

I also, real quick, I need to point out something

17:17

you said early on when you started talking is you talked

17:19

about like Russian bots and Russian troll farms.

17:21

You make a good point that like,

17:23

how is this gonna stop those people?

17:27

If Ireland has a law that all Irish users,

17:29

'cause that's how it's gonna go, right?

17:31

It's like, okay, let's say they adopt this EU wide.

17:34

Now the entire EU, if you're an EU user,

17:37

you have to submit ID and verify.

17:39

But if you're not in the EU, you don't have to do that.

17:42

So how is this gonna stop?

17:44

I mean, sure, it'll stop like the local people

17:46

from sending you death threats,

17:47

and again, allegedly, because people will still post

17:49

on Facebook under their real name,

17:51

but it's not gonna stop.

17:54

I think I already cleared my note,

17:55

but one of the politicians said that like,

17:57

"Oh, this isn't just about the threats that I got,

17:59

this is about threats to democracy."

18:01

Like this is part of a wider issue,

18:03

which I think is a fair discussion to have for the record

18:06

that that is a thing, but like,

18:07

"Okay, if we're talking about threats to democracy,

18:09

there is literally a dorm.

18:11

Y'all can fact check this, this is a true story.

18:14

There is a dorm in Russia, in Moscow,

18:16

where we can trace a significant number

18:18

of social media accounts,

18:19

because that is part of Russia's,

18:22

what do they call it, the GRU?

18:23

It's like, it's their like military intelligence unit,

18:27

that their whole job is to make memes

18:29

and make fake accounts and make bots

18:31

and create disinformation and create confusion.

18:34

Like, again, this is a real thing.

18:36

So how's that gonna stop them?

18:38

They're not in the EU, they're not even trying.

18:40

We can trace their IP back to that building.

18:43

We know who they are.

18:44

They're not gonna have to submit ID.

18:46

So yeah, it's kind of a flawed premise from the get go

18:49

because it's only penalizing people in Europe.

18:52

And it's not gonna impact other people abroad.

18:55

So it's at best, it's solving part of the issue

18:58

and not the whole thing.
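(For the curious: the "we can trace their IP back to that building" point above is usually done with a GeoIP lookup. Here's a minimal sketch assuming MaxMind's free GeoLite2-City database and their official geoip2 Python client; the IP below is a documentation placeholder, not a real target. Keep in mind GeoIP resolves to city level at best, so pinning a specific building, as in the Moscow example, takes far more than a database lookup.)

```python
# Minimal sketch: map an IP address to a rough location using a local
# GeoIP database. Assumes GeoLite2-City.mmdb has been downloaded from
# MaxMind and the official client installed: pip install geoip2
import geoip2.database
import geoip2.errors

with geoip2.database.Reader("GeoLite2-City.mmdb") as reader:
    try:
        resp = reader.city("203.0.113.7")  # placeholder from TEST-NET-3
        print(resp.country.name, resp.city.name)
        print(resp.location.latitude, resp.location.longitude)
    except geoip2.errors.AddressNotFoundError:
        # placeholder/documentation addresses won't be in the database
        print("IP not found in the GeoIP database")
```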

18:59

- Yeah, and that's what I'm saying.

19:00

They're using this as an excuse,

19:02

this whole threat to democracy, to silence critics

19:04

who are actually in the EU, mind you.

19:07

But based on what we have discussed before

19:08

and with the whole ProtectEU initiative

19:10

and the potential requirement to mandate VPN data

19:14

disclosure, they will definitely target VPN companies

19:17

next year.

19:18

Maybe-- I don't know.

19:19

Maybe Ireland will realize that and maybe try

19:22

to push forward a measure to actually mandate

19:25

VPN data retention and use that to target those pesky Russian

19:30

bots.

19:31

But in reality, we all know what they're trying to do.

19:33

Focus on the critics, focus on people who disagree with them.

19:38

It's very clear what they're trying to do here.

19:40

And like, I find it really funny how they're not even hiding it at this point.

19:43

Um, maybe they're not even framing it to be more, like, you know, palatable, but, um,

19:48

it's very clear that they just want to deal with this specific issue,

19:52

which is: I'm receiving death threats.

19:54

My family's receiving death threats.

19:56

Let's just put up a band-aid solution and just ban the entire internet, you know,

20:01

ban the anonymous internet and make sure that if you do decide to criticize me,

20:06

say it with your real name, you know, and.

20:10

I don't know.

20:13

I don't know.

20:14

I think for now, though, what I can say, after covering news from last year,

20:20

when Denmark held the presidency, and now this year, when Ireland will probably hold

20:23

the presidency, is it just feels to me like politicians tend to love these broad solutions.

20:30

But in reality, they just don't really understand that, oh wait, it's actually going to destroy

20:34

what it means to be on the internet.

20:36

The ability to be anonymous, the ability to have a pseudonym.

20:40

I know that, firstly, for me, like, I have been using a pseudonym

20:42

for most of my online life before coming to Privacy Guides.

20:45

And I'm sure for you, like, keeping a pseudonym

20:46

is very important as well.

20:48

And as well as so many people on the forum,

20:50

our own staff members, who don't even show their faces

20:53

but, like, work incredibly hard to deliver you

20:55

the informational content that you are watching

20:58

and listening to today.

21:01

So yeah, like I just--

21:03

I really hope that by the end of 2026, until 2027,

21:08

we'll still have at least people in Europe having the ability

21:12

to create pseudonyms to protect their identity,

21:16

to actually access information that they otherwise

21:18

wouldn't have access to.

21:20

That's my personal wish for folks out here.

21:22

And if you're living in the EU right now,

21:25

there's so much beauty in operating a pseudonym

21:26

and to be able to say whatever you want.

21:28

I'm not saying you should say death threats.

21:30

Please don't do that.

21:31

That's illegal.

21:32

But just being able to be honest and criticize those

21:38

who otherwise would punish you is a huge power to us.

21:40

And there's so much power in anonymity.

21:42

And I heavily advise folks right here

21:45

to protect your digital identity

21:46

and to use pseudonyms if possible.

21:51

- Yeah, for sure.

21:52

It's a, it goes well beyond, you know,

21:55

just it touches on like whistleblowers and, you know,

21:59

people trying to escape sensitive situations at home.

22:03

Like, yeah, it's just, they're just putting,

22:08

there's a phrase that's escaping me right now,

22:10

but it's, you know, they're putting everybody at risk

22:12

because of the sins of a few people.

22:14

And it's just, yeah.

22:18

But yeah, actually on that note,

22:20

let's head over to France,

22:22

where France is kind of very similar.

22:24

They're considering an Australia-style

22:26

social media ban for children next year.

22:28

So this is what I mean.

22:32

It's like, and you know, the Ireland story even mentioned

22:36

the president Macron, but you know, okay, let's start by going

22:40

ahead and covering the facts here.

22:41

So there is a draft bill in the French parliament

22:44

that will impose a social media style ban,

22:47

or excuse me, an Australia style social media ban

22:49

for children under the age of 15.

22:51

And I believe I read that they wanna have this pushed out

22:54

as early as September of 2026, which, man,

22:58

I don't like this bill, but I really got to give France a lot of credit.

23:01

You guys don't mess around because in America, that would be like,

23:04

Hey, you know, we'll implement it sometime between now and the heat

23:06

death of the universe.

23:07

But they're, they're trying to get this going real quick, which is unfortunate.

23:12

But I got to, I got to respect the timeline.

23:14

So the French are being efficient.

23:16

Wow.

23:17

No kidding.

23:18

Well, we live in a cursed timeline.

23:22

Normally you got to be German to be that efficient.

23:25

So now that I've upset everybody.

23:27

So the article says that people, children between the ages of 15 and 18, are already not allowed to use their phones in high school.

23:33

So just kind of setting the stage for how France handles this kind of stuff.

23:39

And yeah, again, they cited Australia as something that they want to model this after.

23:45

And it will come under legal review in the coming days.

23:48

Education unions will have a look at the proposed high school ban on phones.

23:53

and let's see.

23:55

I thought this was funny.

23:56

The President Macron, I believe it was Macron who said this,

23:59

he used the analogy of a teenager getting into a Formula One

24:02

racing car before they had learned how to drive.

24:05

Quote, "If a child is in a Formula One car

24:07

and they turn on the engine, I don't want them

24:08

to win the race, I just want them to get out of the car.

24:11

I want them to learn the highway code first

24:12

and ensure the car works and to teach them

24:14

to drive in a different car."

24:17

Yeah.

24:18

But I, it also says that in addition to children under 15 being banned entirely, those between

24:27

15 and 18 should face a nighttime digital curfew, which would go from 10pm to 8am.

24:34

I have so many thoughts on this story.

24:36

So first of all, one of the things that's really problematic about these kind of proposals

24:42

is that I liked his analogy of like learning how to drive a car, right?

24:46

Like you're absolutely right.

24:47

You don't just put a 16-year-old in a race car, because they're very powerful cars and they could really hurt

24:53

themselves. But you have to teach them how to drive. And that's what bugs me about these social

24:58

media bans is, you know, I don't want to be too political, but we have this criticism in the US

25:02

about the same thing, right? Like the minute you turn 18, suddenly you can take out these massive

25:07

college loans, you can get married, you can buy a house, you can join the military and go to war

25:11

and die. And you're, you're 18, you just turned 18 yesterday, and you can go do all these things.

25:16

And there's no, especially in the case of social media, like we're not ramping people up. Ideally,

25:22

you know, when you turn 18 the whole time you've been learning, right? Like your parents have been

25:25

teaching you, you've been taking classes. So ideally in a perfect world, when you turn 18,

25:30

you have some education and you understand like, maybe I don't want to join the military or maybe

25:34

I should, you know, not buy a house yet, these kind of things, maybe I should look for scholarships,

25:38

whatever. But the problem with those, these social media bans is they, they kind of don't give kids

25:44

that chance to ramp up. And to be fair, I will say this French one does a little bit, right? Like

25:49

if you're 15 to 18, you can use it, but not at night, which also seems a little unfair because

25:53

like, you know, kids are supposed to be in school during the day. When else are they supposed to

25:56

use social media? I mean, granted, I don't know. I remember when I was in school, I had friends who

26:01

would like wear that as a badge of honor, like, oh, I didn't go to sleep until two AM. And I'm like,

26:05

dude, why are you people staying up so late? I hate being tired all the time. But

26:09

and that, you know, actually, I'm glad I brought that up because that was before we had this

26:13

this discussion before recording.

26:15

I'm in my mid 30s.

26:16

When I was in high school, Facebook did not exist.

26:19

MySpace was kind of a thing, but it wasn't like,

26:22

it wasn't like it is now where like,

26:23

if you didn't have a MySpace, you're basically dead.

26:26

Like you don't exist.

26:27

And I actually didn't have a MySpace

26:29

until like my senior year, I think.

26:31

And the only reason I signed up is because I had to do

26:33

some kind of class assignment.

26:34

And I remember I was researching Ralph Nader.

26:37

And I don't know if anybody knows who that is.

26:39

- Oh my God.

26:40

- Yeah, he's...

26:41

(laughs)

26:41

- Okay, you do.

26:42

- Thank you, I don't feel as old,

26:44

but I couldn't find his position on some issue

26:46

that I needed for a school assignment.

26:48

So I literally made a MySpace account

26:50

because he had a MySpace,

26:52

I mean, I'm sure it wasn't him, but he had a page.

26:54

So I made an account just to message his page

26:56

and be like, "Hey, I'm in high school,

26:58

"I'm doing an assignment.

26:59

"I don't know what his position on this issue.

27:01

"I can't find it anywhere."

27:02

And they basically wrote me back and said

27:04

he hasn't taken a position yet, so he has none.

27:06

But, you know, like that's the world I grew up in.

27:10

And I still had friends that would brag

27:12

about staying up until one or two in the morning.

27:14

So like, what is a social media ban gonna fix here?

27:16

Kids have been doing this before the internet.

27:18

They're gonna keep doing it after the internet.

27:19

You're not gonna fix it.

27:20

You're just, you're not.

27:21

It's, I know we say this every time,

27:24

but it's like they're taking the wrong approach

27:26

to solve any existing issues here.

27:28

So yeah, I mean, this story has a lot of overlap

27:31

with the one we just covered.

27:32

The, I guess the only saving grace is that it seems

27:34

to be a little bit more, well, no,

27:37

'cause the Ireland one was specifically focused

27:38

on social media too.

27:39

I was gonna say it seems to be a little bit more targeted,

27:41

But that's not true.

27:42

It's just, yeah, it's like I was saying at the beginning,

27:44

it's this big focus on all across the EU

27:48

is everybody's just like,

27:49

"Oh, maybe this isn't such a bad idea."

27:50

'Cause apparently they can't look literally across the channel

27:53

and see how well it's going for England

27:55

to just age gate the internet.

27:56

But yeah, what are your thoughts on this one, Kevin?

28:01

- I have a lot of thoughts.

28:02

As someone who was raised in a generation where

28:05

we were kind of raised on the internet.

28:07

I think I wasn't necessarily an iPad kid when I was like,

28:10

like only a few years old, my parents didn't have a lot of technology at the time, but this was

28:15

during the early 2000s or so. So for me personally, like, I was, when I was around the internet,

28:21

you know, like using it, I had phones for quite a bit. I didn't have social media at the time. I just

28:26

joined it voluntarily because it was cool, all my friends were doing it. But these days, though,

28:31

I just want to emphasize that while there are a lot of harms of giving your children an iPad,

28:36

for example, and just leaving them alone with it when they're one or two years old. I think there is

28:41

a balance here between responsible parenting, of course, and also recognizing that yes, kids are

28:47

kids. They will circumvent things no matter what you do. And maybe there is a reason why we're able

28:53

to see that, okay, social media use is on the rise, mental health issues are on the rise.

28:59

Maybe kids aren't doing as well in school nowadays, relying on AI and technology for everything.

29:02

But maybe that's not necessarily the result of the technology or social media platform usage at hand.

29:07

Maybe the fact they're used so much is a symptom of wider societal concerns that maybe the government should focus on instead.

29:14

Not to shift gears back to the US, but I remember that kids used to go outside back in the day.

29:21

They used to play, they used to ride bikes around. But guess what? Decades of, you know, hostile architecture,

29:27

decades of you know, a lot of fear, genuine fear about crime for example,

29:31

has really, like, made kids go inside, and at this point it's like, do you really want

29:35

children to like be inside all day and be prisoners?

29:38

Well, of course they will like use social media more often because they have nothing

29:40

better to do.

29:42

So I think right here it's like, maybe we should focus less on: okay, there's a technology problem,

29:49

let's solve it with another technology problem,

29:50

let's implement age verification to impose privacy-restricting actions on everyone, and

29:55

maybe focus on okay, how can we make it safe for kids to go outside more often?

30:00

you can invest in sustainable solutions to ensure that, hey,

30:04

you can allow children to go outside and play,

30:07

like they used to do during the '80s, for example.

30:11

So right now, I do have to say that as much as I do agree

30:14

generally, that, yes, social media is bad.

30:17

If we at Privacy Guides were to take a stance,

30:19

it's that you should voluntarily choose technological

30:22

minimalism.

30:23

You shouldn't be forced to do so,

30:25

'cause people will circumvent it anyways,

30:27

and it'll lead to a lot of horrible

30:28

draconian concerns out there.

30:30

You are free to open that Facebook account as much as we tell you not to register that Facebook account.

30:37

We want you to have the free choice to say, you know, I'm going to delete that Facebook account.

30:40

I'm going to talk to my friends, for example; I am not going to engage in a system of targeted advertising, the advertising, like, industrial complex that collects all of your information.

30:50

It should be a voluntary choice for you.

30:53

It should not be decided by corporations or governments or whatever, um, that you use things in a certain way.

30:59

Because ultimately these mandates, these verification mandates,

31:02

are just a way to simply collect your IDs.

31:06

It's just another data breach waiting to happen.

31:10

But yeah, I think something I found really interesting

31:12

is I thought it's like,

31:13

I kind of have two perspectives here.

31:15

You obviously lived in a time without social media,

31:18

I was wondering, did you feel like,

31:21

how were kids back in the day, would you say?

31:23

Do you think that, what I'm saying is true,

31:24

was it actually generally better

31:26

for folks to not have social media?

31:28

Or do you feel like kids still have,

31:29

ultimately had their own problems though,

31:31

like back then?

31:33

- Well, you know what's funny is I was thinking about that

31:34

while you were talking because I went to

31:38

what was called a magnet school.

31:39

If anybody hasn't heard of that, it's basically,

31:42

the school had special programs like ROTC for example.

31:45

And so even if I was outside the district,

31:47

as long as I was enrolled in one of those programs,

31:49

I could go and I lived at the edge of the county.

31:52

So like, I didn't have any friends around.

31:53

My nearest friend, I think I found this out years later,

31:57

actually lived like the next housing development over. But even

32:01

then that was still a couple miles. Like I really lived not in

32:04

rural area, but definitely very suburban. And like I couldn't

32:07

just, you know, I couldn't just go to my friend's house. It was

32:10

miles away. And it was, you know, on busy highways

32:14

too, for the record, it's not even like we'll hop on your bike,

32:16

like it just it wasn't feasible for me to go without a car. And

32:19

as a freshman in high school, I didn't have a car. I was like

32:21

14, you know, I couldn't just drive over and my mom worked

32:25

full time. So by the time she got home, like, you know, she just

32:28

wanted to cook dinner and relax. So it's, that's actually when I

32:32

started to become a more heavy internet user, because back in

32:34

my day, we had, and I'm going to really give some people flashbacks

32:37

here, we had like AIM, you know, the instant messenger, we had,

32:41

I'm trying to remember what were some of the other things I used. I don't

32:43

even remember. I think, no, that was pre-Tumblr. I don't

32:46

remember what I used, but I remember like, there were forums,

32:49

there were, you know, instant messengers, like, and that was how

32:51

I stayed in touch with my friends who lived closer to the

32:54

school that I wasn't able to see anymore.

32:56

But I think you're right is because I remember, you know, when I was young, granted, it wasn't social media, but I grew up as a gamer.

33:02

I've always been a gamer.

33:03

I've always played video games.

33:05

But I remember putting down the game and voluntarily going outside to play.

33:09

Like my mom didn't usually have to be like, Hey, get out of the house, go do something.

33:13

It was just something I wanted to do.

33:14

Like, okay, we'll come inside, we'll play games for a little bit, and then we'll put the games down and go play outside.

33:18

And, and I see that even with, um, with young children these days, you know, uh, my, my niece.

33:23

would come over and we'd put on Bluey or something just, you know,

33:27

cause kids have unlimited energy for the record.

33:30

It's not one of those like raised by TV things, but it's like,

33:32

just watch an episode of Bluey and give me 20 minutes, 10 minutes in.

33:36

If that five minutes in, she doesn't want to do it anymore.

33:39

You know, her mom would give her a tablet with YouTube on it, five minutes in,

33:41

she puts it down, she's coming to bug us.

33:43

She wants to hang out with us.

33:45

And I think, yeah, like you said, like it's, it's, I think we've created a culture,

33:49

whether it's through the design of cities, whether it's through, you know,

33:52

the culture we've raised, because I definitely grew up in the, the, again,

33:56

I don't want to get too political here, but I grew up in the stranger danger

33:58

days where, you know, people would teach you like, you know,

34:01

watch out for strangers and don't get in the van, even if they offer a puppy

34:03

or whatever, which for the record, great advice.

34:05

But it's, you know, I grew up in that, that era of fear where it's like,

34:09

Oh, everyone's out to get you.

34:10

And now that I'm older, I realize I'm like, not usually, I mean, sure,

34:14

like teach kids to be smart and be safe, but don't like scare them into

34:17

being afraid of strangers.

34:18

And, and, you know, I don't know if it's coming out, right?

34:21

But you know what I'm trying to say?

34:22

Like there's a balance and I think we've taken that too far a lot today where we're not setting

34:28

kids up for success where they feel safe to go to the playground and play.

34:31

They feel safe to go make friends with their neighbors and you know, that's just something

34:36

that's really been lost.

34:37

And I think like you were saying in absence of those in-person relationships, what else

34:43

are you going to do?

34:43

You're going to have to turn to the internet because that's the only place you're going

34:45

to find humans are social creatures.

34:47

We need friends and we need community.

34:49

And yeah, I think if we create a culture that is more friendly and has more community built

34:55

in at all ages, not just kids, adults need that too, then we won't be as addicted to

35:00

these platforms because we won't need them.

35:03

I've heard this said from multiple sources and for the record, I haven't fact checked

35:05

this and I'm not encouraging drug use.

35:09

But I've heard it said that the overwhelming amount of troops in Vietnam used drugs, like

35:16

hardcore drugs.

35:17

But the overwhelming amount of them came home and stopped using.

35:21

Because it turns out the reason they were using drugs was because they were in a very

35:24

isolated stressful situation and that was how they were coping.

35:27

And then once they got home and they had their families and their jobs and their loved ones

35:31

and their church and their community, they didn't need the drugs anymore and they just

35:34

stopped using it.

35:35

And I feel like it's the same thing.

35:37

If we would just create a culture where we have those in-person relationships again,

35:41

we wouldn't need to go to the dopamine casino that is social media and we wouldn't be nearly

35:46

so addicted to it, I think. So yeah, I think you really, really hit on something big with

35:51

what you were saying there.

35:53

Yeah, and like, I agree, like 100% Nate. It's like, almost, once again, I'm going to say

35:57

this over and over again: banning solutions by people who generally agree with us that,

36:02

yeah, social media is bad. But it was just ironic. These cause their own problems to come

36:06

up, I'd say. Like, just from my perspective, as, you know, someone from Gen Z, right? Growing

36:16

up like most, a lot of people, maybe in these countries, like, you know, like maybe in France

36:21

or in the Netherlands maybe it's more like accessible to like ride your bike down the

36:24

street and talk to your friends, maybe buy something.

36:27

But of course like we can't just assume that yes like you know your parents are obviously

36:33

working all the time, kids stay at home all day, let's ban it. With these ban solutions, we need to say

36:37

that, you know, kids will find other ways to do things.

36:40

I think there's so many stories I heard like not just from personal experience but also

36:43

some other friends of my generation,

36:45

who have figured out ways to circumvent parental controls

36:49

if their parents even implemented them.

36:51

But some parents just didn't have any like,

36:54

thing at all in the first place.

36:57

So it's like kids are very smart.

36:59

They'll figure out how to use VPNs,

37:00

usually very unsafe VPNs.

37:03

And they'll ultimately just be forced to use

37:06

and sell their information indirectly

37:08

to these companies who use like this age verification bill.

37:13

That's a way to just, you know, harvest your data and all that.

37:18

But yeah, like honestly Nate, like I do like sympathize a lot

37:22

with like the whole like the culture that we have,

37:24

at least in the U.S.

37:25

Maybe it's very prominent across, you know, the sea as well,

37:27

like in Europe and maybe in other countries

37:30

where there is a growing sense of isolationism

37:32

in society and sometimes to prevent anti-social behavior,

37:37

we must directly address the root of the problems.

37:39

Oftentimes that involves building more third spaces,

37:43

maybe trying to make it safe for kids to go outside

37:46

and play ball again.

37:48

I certainly wish I had that growing up

37:50

and I'm sure you wish you had more of those spaces

37:52

growing up as well,

37:54

as these decades pass into the future,

37:56

where, you know, will there be any spaces at all

37:59

in the first place, either online or in person?

38:02

We don't know, we just want to like,

38:03

take everything away from, like, you know,

38:05

our kids one day and then have them be left alone

38:08

without anything to communicate with.

38:12

But yeah, I think it's a very

38:14

distressing situation overall.

38:17

But I think right now we can't really figure out

38:21

what will be the ultimate solution to this problem

38:24

unless of course there's future studies to see,

38:27

okay, what is the long-term impact of social media

38:31

after it's age-gated, for example?

38:33

Do people actually circumvent it

38:36

through these bills? Do they actually do anything for children?

38:39

I say personally, no, kids will figure out a way

38:43

to circumvent anything nowadays,

38:46

but I anticipate there's gonna be a lot more conversation

38:50

about VPNs or usage or even Tor in the near future.

38:54

And yeah, it's like so far,

38:57

I'm not so excited to see what's gonna go on with this bill.

39:00

Hopefully the French parliament will be very slow

39:03

in discussing this bill.

39:05

Maybe it won't be as efficient as we thought it would be.

39:08

We don't know.

39:11

Yeah, for sure.

39:12

I do think it's kind of funny real quick how you kept mentioning that kids find

39:16

ways around things and my whole life growing up, um, you know, just again, to

39:20

date myself here, like I remember growing up and listening to older folks talk

39:24

about like, man, you know, my kids, my grandkids, they're so smart.

39:27

I can't even work the VCR and they, you know, they can program it and this, that

39:30

and the other.

39:31

And, but for some reason it's, you know, I don't know if maybe that

39:35

mentality has gone away nowadays.

39:36

Like maybe if people are feeling a little bit more tech savvy than they

39:39

used to back in the day, but yeah, you're, uh, I've heard so many people talk

39:44

about, you know, oh, my kid figured out how to get around the school's firewalls.

39:48

My kid figured out how to do this, that, and you know, um, I'm thinking of a,

39:53

when I used to podcast with Henry from Techlore, he told a story on the show

39:58

one time about how his parents had the location tracking on just to keep an eye

40:03

on him and he got so sick of it one time that he spoofed it and put it in the middle of the ocean.

40:09

Which I don't think his parents ever saw that, but he was just so like,

40:12

just had that moment where he's just like, I'm so tired of being tracked and like put it,

40:15

put the location in the middle of the ocean.

40:17

And, and, you know, it's like he was a teenager when that happened.

40:20

It's, it's how I don't know.

40:23

It's I'm sure the vast majority of kids won't find their way around this, but also it's, you know,

40:29

All it takes is one kid to find it and then that spreads like wildfire, man.

40:33

You know, one kid finds a way around the school firewall and all of a sudden

40:36

everybody is just like, Oh, did you hear if you type in this and this and go here

40:40

and now you can surf any website you want?

40:42

Like we used to do it in the military because when I joined the military, they

40:45

blocked at first they blocked like YouTube and Facebook and everything.

40:49

And then halfway through my contract, they loosened it up.

40:51

So it didn't matter anymore.

40:52

But yeah, it's just people find ways around things.

40:55

And I don't know.

40:57

It just, it seems.

40:59

Do they really want to play that cat and mouse constantly? It seems like there's better things

41:02

they could be doing to solve, to be doing with their time to solve these issues.

41:07

Yeah, I feel like political power is so sparse nowadays and to focus on these specific issues

41:14

means that there is, of course, a greater agenda at play here, Nate. But I do think this is a lot

41:21

of time we spent on these two age verification mandates in Ireland and France. I was just wondering,

41:27

maybe it's time to move on to an area of your expertise, actually.

41:31

Data breaches. And don't worry, I'll cover this story this time.

41:35

It is unfortunately targeting one of, like, you know, my favorite sources across technology

41:41

policy, specifically, Condé Nast, which is known to host a lot of technology-enthusiast,

41:48

you know, news outlets and newsrooms, specifically, Ars Technica and Wired.

41:54

Specifically, a supposedly white hat security researcher turned

42:00

hacker has leaked and published the records of 2.3 million

42:04

Wired subscribers in a devastating database

42:06

breach targeting Condé Nast, which is of course

42:08

the parent company of Wired and Ars Technica. Ironically enough,

42:12

it only leaked Wired subscribers, not Ars Technica readers.

42:17

We don't really know why that's the case.

42:18

Ars Technica came out with a statement

42:20

saying that they have not been targeted with the hack,

42:22

despite the fact that apparently it targeted a Condé Nast database.

42:27

But specifically, if you're a Wired subscriber,

42:30

you should be really concerned here.

42:33

Apparently, this leaked data revealed a substantial array

42:35

of personally identifiable information,

42:38

where, according to a BleepingComputer analysis,

42:41

which they themselves actually analyzed the data set

42:44

and not an official statement by the company itself,

42:48

apparently 2.3 million total records

42:50

and 2.3 million unique email addresses, with timestamps ranging from all the way back in 1996 to September 9, 2025.

42:56

so September 9, 2025. Of course, there's always the usual unique internal ID,

43:01

but also email address and optional data, such as first and last name, phone number,

43:06

physical address, gender and birthday. Of course, many of these fields are empty,

43:12

but there are some incredibly personal details that a lot of people actually did provide optionally.

43:17

12% of the records included both a first and last name, 8.21% included a physical address,

43:23

2.84% included a birthday, and 1.37% included a phone number. And a much smaller subset

43:30

included everything from their name, birthday, phone number, to address, and gender.
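(To put those percentages in perspective, here's a quick back-of-the-envelope conversion of the reported figures into approximate record counts; a rough sketch using only the numbers cited above.)

```python
# Convert BleepingComputer's reported field-coverage percentages into
# approximate record counts out of the 2.3 million leaked records.
TOTAL_RECORDS = 2_300_000
coverage = {
    "first and last name": 12.0,
    "physical address": 8.21,
    "birthday": 2.84,
    "phone number": 1.37,
}
for field, pct in coverage.items():
    print(f"{field}: ~{round(TOTAL_RECORDS * pct / 100):,} records")
# e.g. 12% of 2.3 million is roughly 276,000 names.
```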

43:35

And honestly though, like, something that I found really funny in this situation, is that

43:39

it was, like, clear that the hacker tried communicating with Wired for quite some time.

43:46

For a month, I believe, and apparently they got tired of, like, you know, trying to get them to acknowledge the

43:52

suspected leak, and ended up disclosing it on a dark web forum.

43:57

But I don't know if we can actually confirm right now whether what they're saying is true or not.

44:01

It seems to me a little fishy, you know

44:03

It's like, supposedly you want to actually do something very good,

44:07

and when a company doesn't respond, you just release it on the dark web anyways? Like, what?

44:13

Yeah, that was the impression I got from this story is that apparently this attacker approached

44:19

it was either Wired or Condé Nast or somebody, and

44:23

allegedly tried to alert them and be like hey you have this vulnerability and

44:28

like you said, just got tired of the runaround and decided they were just gonna leak all this data online. And

44:34

Condé Nast did confirm this story, actually. They said like, yeah,

44:37

We were in communication and now we realize that they just weren't acting in good faith

44:41

like they were lying to us and misrepresenting that they were a white hat, which I mean,

44:46

I got to take Condé Nast's side on that one. Like I've seen, I've seen people do that in the

44:53

security community where they're just like, oh, I don't feel like I have to responsibly disclose

44:57

this because of whatever reason they want to make up, you know, I don't, I don't like that company.

45:02

I feel like they took too long to respond. Whatever, I've seen it happen several times

45:07

where somebody's like, oh, I found this vulnerability, but I'm going straight to the public

45:11

because I don't believe in responsible disclosure.

45:13

And I don't know, that's, yeah, that's a hot take.

45:19

And it certainly opens up allegations

45:21

that maybe you're not acting in good faith,

45:23

but I'll just leave that there

45:24

'cause that's my personal opinion.

45:26

- Hmm.

45:27

- Yeah, it's definitely, I don't know.

45:30

It's 'cause I do see, as much as we love Signal,

45:33

I'm gonna throw them under the bus a little bit.

45:35

Signal had that vulnerability,

45:38

which admittedly I think was a very niche vulnerability,

45:40

but they had a vulnerability where,

45:43

correct me if I'm wrong,

45:44

'cause I'm going off memory here.

45:46

I think it was like secret keys were being stored

45:48

in plain text on your device.

45:50

So in theory, if you had a compromised device,

45:53

like say a malicious version of Discord, for example,

45:56

it could very easily access those keys

45:59

and have your private keys now.

46:02

And it was a really bad, not so much a bad vulnerability

46:07

as it was just a lot of back and forth

46:08

because a lot of people would argue like,

46:09

Well, if your device is compromised,

46:11

nobody can save you from that anyways,

46:12

which was kind of Signal's argument.

46:14

But then there were other people who were like,

46:15

yeah, but it's such an easy fix.

46:17

Like the researchers even wrote the fix for you.

46:20

Like all you have to do is roll it into your system.

46:22

And what really kind of to me kind of upset me is,

46:27

apparently this had been flagged before,

46:29

like back in like 2019 or something.

46:31

And so for years, Signal was just like,

46:33

it's out of scope, we're not gonna do anything about it.

46:35

But then when this went viral,

46:38

I think like last year, or the year before, it was a huge deal and it hit all the headlines

46:42

and finally Signal was like, okay, fine, we'll fix this.

46:44

Even though we still say it's not a really big deal.

46:47

So I kind of see the argument of like, sometimes you do have to shame a company into, you know,

46:51

doing something.

46:51

Sometimes they just don't do it, you know, when they should, when they're first alerted

46:56

to something.

46:56

So I don't know.

46:58

It's a really messy thing.

47:00

One quick thing I do want to mention here that was in the article, they said that the

47:03

database has since been added to Have I Been Pwned.

47:05

So if you are a Wired subscriber, or ever were, and you think you may have been impacted by

47:10

this, definitely go check that out.

47:12

You can see if you were in it or not.
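(If you'd rather check programmatically, here's a minimal sketch against Have I Been Pwned's v3 API. It assumes you have an HIBP API key, since the breached-account endpoint is paid; the email and user-agent below are placeholders.)

```python
# Minimal sketch: ask Have I Been Pwned which breaches include an address.
import requests

def breaches_for(email: str, api_key: str) -> list[dict]:
    resp = requests.get(
        f"https://haveibeenpwned.com/api/v3/breachedaccount/{email}",
        headers={"hibp-api-key": api_key, "user-agent": "breach-check-demo"},
        params={"truncateResponse": "false"},
    )
    if resp.status_code == 404:
        return []  # 404 means the address wasn't found in any breach
    resp.raise_for_status()
    return resp.json()

# e.g. breaches_for("you@example.com", "YOUR-API-KEY")
```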

47:15

Yeah, I don't know.

47:17

Do you have any other takeaways from this one?

47:19

Oh, yes, I do.

47:20

Just to comment quickly on the Signal story, I believe it was Signal Desktop that stored keys

47:26

in plain text, which is, I guess, maybe a side point.

47:31

Who's using Signal Desktop?

47:32

Well, a lot of us are.

47:34

And I think it's very natural to be concerned about that in the first place.
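For technical readers, the class of fix being described, moving a secret out of a plain-text file and behind OS-level encryption, is what Electron exposes as its safeStorage API. This is a minimal sketch of that general pattern, not Signal's actual code; the file name and helper functions here are our own illustration:

```typescript
// Sketch: protect a database key at rest with Electron's safeStorage,
// which wraps OS-level encryption (Keychain on macOS, DPAPI on Windows, etc.).
import { app, safeStorage } from "electron";
import { readFileSync, writeFileSync } from "node:fs";

const KEY_FILE = "db-key.bin"; // hypothetical path; contents encrypted at rest

function saveDbKey(key: string): void {
  if (!safeStorage.isEncryptionAvailable()) {
    throw new Error("OS-level encryption is not available");
  }
  // encryptString returns a Buffer ciphertext tied to this OS user context.
  writeFileSync(KEY_FILE, safeStorage.encryptString(key));
}

function loadDbKey(): string {
  return safeStorage.decryptString(readFileSync(KEY_FILE));
}

app.whenReady().then(() => {
  saveDbKey("example-database-key");
  console.log("recovered:", loadDbKey());
});
```

The hosts' caveat still applies: on some platforms another program running as the same OS user may be able to invoke the same decryption, so this raises the bar against casual exfiltration rather than defeating a fully compromised device.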

47:38

Just moving on to the Wired story, I do have some brief thoughts on it as well.

47:42

There definitely could be a lot of miscommunication involved here

47:45

when you're doing responsible disclosure.

47:47

Maybe they just simply wanted to like make money off of it, for example,

47:52

and they just simply didn't get the funds that they wanted.

47:55

Maybe they genuinely were disgusted by Wired's conduct,

47:59

for not actually responding to concerns, and released it out of spite.

48:02

We don't know what really went wrong, of course.

48:05

But I do have to say, though, that if you are a Wired subscriber,

48:09

thankfully, I don't think anyone I know

48:12

has been subscribed to Wired.

48:13

But definitely do be careful if your data is in this breach

48:16

and make sure to change any information necessary,

48:19

or definitely change any passwords associated

48:22

with that account if possible.

48:23

But be on the lookout for potential spam messages

48:26

and targeted phishing attacks in the near future

48:29

if your phone number or email was included.

48:32

But yeah, I think that's all I have for the Wired story.

48:35

I think it's a great time to pivot to our next article.

48:37

You seem to be really passionate about this one.

48:42

Yeah.

48:43

Yeah, so our next story comes out of Texas, actually.

48:48

And this is a really--

48:50

I mean, this is a wild story.

48:52

Thankfully, it does seem to have a pretty good ending.

48:54

But basically, a 15-year-old girl was kidnapped

48:59

and her father was able to rescue her

49:01

because of phone location tracking.

49:04

So the 15 year old girl was reportedly kidnapped

49:07

in the Houston suburb of Porter.

49:09

Her parents said that she took the dog for a walk

49:11

and had not returned by the time she was supposed to.

49:14

Her father subsequently

49:17

located her phone through the device's parental controls.

49:20

The phone was about two miles away

49:21

or 3.2 kilometers for our international listeners

49:25

in a secluded partly wooded area

49:27

in the neighboring Harris County.

49:29

Deputies said the father headed to that spot

49:30

and found his daughter as well as the dog

49:32

inside a pickup truck with a man.

49:35

She managed to escape with help from her father,

49:37

who called law enforcement officials.

49:39

And the article doesn't go into a lot of detail.

49:41

That's kind of the crux of it.

49:44

But yeah, it's, I mean, first of all,

49:47

I'm not gonna say he made the wrong call,

49:49

but it is dicey going to handle the situation yourself,

49:51

'cause that could have escalated very quickly.

49:54

I'm glad that the father and the daughter

49:57

got away safely.

50:00

That's scary.

50:01

But yeah, this is a really interesting story

50:04

because kind of the takeaway that I took from this story

50:09

is that we are really big fans of turning off all tracking.

50:15

I'm even a big fan of like,

50:18

I think I recommend leaving Find My on now

50:20

because, like, Stolen Device Protection

50:23

depends on it or something like that.

50:24

But in the past, I've recommended not using

50:26

the Find My network at all on your Apple devices.

50:28

And, you know, we, we say to like turn off location on anything that doesn't need it.

50:32

Um, things that do need it, like map apps, just set it to 'While Using the App,'

50:35

stuff like that.

50:37

But, you know, being safe and your threat model are always really important.

50:41

Like if you're, I think the example that I used when we were talking about this

50:46

before the show was if I were a woman going to a date from somebody I met online,

50:53

right?

50:53

Like a dating site.

50:54

I would definitely like there's, there's analog ways of doing this too.

50:58

If you feel comfortable, like tell your friends where you're going to be,

50:59

maybe schedule a check-in call or something, but also I wouldn't blame anybody.

51:03

If they're like, I'm going to turn on location and I'm going to give like two

51:06

or three of my closest friends or even one of my closest friends access.

51:10

And that way they can track my phone.

51:12

And obviously nothing's perfect.

51:14

That doesn't stop like somebody from turning off your phone or something.

51:16

Right.

51:17

And, but at least then we'll have like last known location and stuff like that.

51:19

So I think there's, um, I know a lot of people aren't going to like

51:24

this, but I mean, privacy is a gray area sometimes. And I think these are conscious

51:29

decisions that you should make. For the record, I'm not saying that everybody, like, I'm not

51:32

trying to defend mass surveillance, but I'm saying there are situations where it makes

51:35

sense to of your own volition in a very controlled way, give up a little bit of that privacy

51:42

temporarily to, you know, do something safe. And, you know, that's that's why we preach

51:47

so much about threat models. Jordan actually just put that in the chat. Thank you. Like

51:54

I don't know. I think I covered that pretty well.

51:57

But the other thing I really wanted to note is,

52:01

from what I understand,

52:02

Apple's find my, especially in the context

52:05

of like parental controls is actually

52:07

fairly privacy respecting.

52:08

They claim, and you know, it's proprietary,

52:11

so we have to take them at their word,

52:12

but they claim that it's very, you know,

52:14

it's end-to-end encrypted; Apple doesn't see your location.

52:17

It's just the people you're sharing with.

52:19

And I don't know if that's true across the board,

52:20

if it's only for the parental controls,

52:22

But it kind of shows that like we've been sold a lie.

52:25

You know, a lot of the time we've been sold this narrative

52:28

that we're giving up our privacy

52:30

because we're getting these cool features

52:31

or this cool technology or it can't work any other way.

52:34

But Apple, love them or hate them,

52:35

Apple has repeatedly proven

52:36

that that doesn't always have to be true.

52:39

You know, there are ways to have privacy

52:43

and still have this cool technology.

52:45

Maybe not every time.

52:45

I'm sure there are certain technologies

52:47

that just cannot be made private

52:49

no matter how hard you try.

52:50

But I think a lot of the time,

52:51

the narrative that we've been sold is just, you know, an excuse for

52:55

companies to steal our data and make us okay with it.

52:57

They're like, oh, I can't give you this product unless you give up your privacy.

53:01

And it's like, no, you can, you just don't want to.

53:03

So, yeah, I don't know if you had any different takeaways from that story,

53:07

but that is a, that is a wild-ass story.

53:09

Slightly different takeaways, of course.

53:11

Obviously it depends on whether you do trust your loved ones.

53:15

And sure enough, like, yeah, so nothing wrong with sharing your location

53:18

with a few people, of course; maybe make it temporary

53:21

if it's just your friends, for example,

53:22

and going on a trip together,

53:23

or maybe you want to only share your location

53:25

to your partner or your parents occasionally

53:28

from time to time, or keep it on, whatever.

53:31

My only concern, of course,

53:33

is when these relationships aren't trustworthy.

53:36

When there is, well,

53:42

a complete lack of physical security protections

53:46

against someone who knows your passwords.

53:48

From my previous line of work,

53:49

volunteering with a certain university clinic,

53:53

I dealt with a lot of technology abuse victims

53:55

in which their threat model was actually based

53:58

on the UI itself.

54:00

Someone having shared access to an account,

54:03

who then has access to their location at all times.

54:06

A form of indirect spyware where an abusive partner,

54:09

for example, could access someone's location

54:11

at all times and use it to track them.

54:13

Maybe they place an AirTag on their car, for example.

54:16

And all of that makes me like still like incredibly concerned

54:20

because even these safe methods, as you mentioned with FindMy,

54:23

well, yes, it is end-to-end encrypted,

54:25

and Apple can't necessarily track your location.

54:28

Someone with access to your account can.

54:31

And if you do decide to take the route of, yes,

54:34

I will want to share my location with my partner,

54:37

my colleagues, my friends, my family members,

54:41

if you believe you have a high enough threat model

54:43

that involves trusting them, for example,

54:46

You trust the people you're sharing your location with.

54:49

You know that they won't harm you.

54:50

Maybe consider like starting a conversation on,

54:53

OK, I want to secure your devices.

54:55

Can you set a strong password, for example?

54:59

If someone does ask for information,

55:00

please do not give it to them, no matter what.

55:03

Just these small conversations like that

55:04

can really help a lot because the more people

55:08

you give your location to, the higher your risk is.

55:11

So definitely try to keep that list very minimal.

55:13

And start a conversation with them.

55:15

maybe think about, okay, maybe you can focus more

55:19

on securing your own personal devices

55:21

in addition to mine as well.

55:24

'Cause often, I don't know what will happen, of course.

55:28

Maybe one day the person that you trust

55:30

may not actually be, you know,

55:33

might not be the same person in the future.

55:35

So just keep that in mind as well

55:37

and make sure to really control the list

55:40

of people you share your location data with.

55:41

'Cause from my personal experience,

55:43

a lot of the women I talked to,

55:46

they've definitely regretted their choice

55:48

of sharing their iCloud passwords

55:50

or their location data with a partner all the time.

55:52

So even though you could revoke it

55:53

or, like, change your password

55:55

and kick them out, a lot of times people

55:57

don't really have the technical know-how to do so.

56:00

So, well, I'm sure the vast majority of people

56:02

in this like call probably won't have to deal

56:04

with a similarly messy and advanced situation.

56:08

Keep in mind that there is still power

56:10

in restricting access to your location data.

56:13

And we should definitely set that boundary straight up

56:15

from the beginning.

56:16

Listen, I will share my location data with you,

56:19

but keep in mind that I'm also concerned about this and that.

56:22

Please don't weaponize this against me.

56:24

And if this is the right person, the right friend,

56:26

right partner, the right family member for you,

56:29

they will respect that boundary.

56:31

There's nothing wrong with setting that boundary here

56:33

and there to ensure that it can't be weaponized

56:35

against you in the future.

56:36

But yeah, Nate, I think that's all I really have

56:38

from my perspective.

56:40

I don't think it's a universal thing

56:41

everyone should apply, but if you feel like this could be the case for you, keep that in mind though.

56:49

No, yeah, really good caveats.

56:50

I'm glad you said that because it's one of those things where like, it's a very nuanced thing.

56:56

You know, like you said, it's giving someone a lot of trust and I'm not necessarily saying that everybody should do that, of course.

57:03

I'm just saying that there are situations where it makes sense. But also, in a perfect world,

57:08

these would be things where everybody is

57:13

acting above board, right?

57:14

Like, you know, maybe these parents have that location

57:18

on just for this situation.

57:19

And they're like, we're never gonna check this

57:21

unless we think your safety is at risk.

57:23

And you know, like, and maybe they've held to that

57:25

this whole time.

57:26

Like maybe they've never checked it before.

57:28

And this was the one time that they're like,

57:30

she should have been home by now, let's check the phone.

57:32

Oh no, she's two miles away.

57:33

How did that happen?

57:34

Like, yeah, it's really unfortunate.

57:39

We see, or at least I've seen a lot of forum posts around the privacy space of

57:43

people talking about, um, estate planning, you know, like if I pass away and

57:48

they're really concerned about how can I make it, how can I make it so that, um,

57:54

in the event that I'm gone, people can still like get into my accounts and

57:57

stuff, but not until I'm gone.

58:00

And you know, some people come up with these really elaborate plans, and I don't mean

58:03

that in a disparaging way, but some people come up with really elaborate like, "Oh, my lawyer will

58:06

have like half of my passphrase and my spouse will have the other half or something." And, you know,

58:11

it's perfectly valid if that's the way you want to structure things and that's your threat model,

58:15

because you are putting a lot of trust in somebody. I've got, I think I mentioned, I actually have two

58:19

friends, I think it's two, maybe it's just one, who have trusted me with their Bitwarden password

58:25

in the event that something, you know, happens to them.

58:31

and take care of things.
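As a side note, the "lawyer gets half, spouse gets half" idea above has a sharper cryptographic version: cutting a passphrase in two leaks half the secret to each holder, whereas XOR-ing it with a random pad produces two shares that reveal nothing on their own. A minimal 2-of-2 sketch, purely our own illustration:

```typescript
// 2-of-2 secret split via one-time-pad XOR (illustrative sketch).
// Each share alone is indistinguishable from random bytes;
// XOR-ing both shares together recovers the original secret.
import { randomBytes } from "node:crypto";

function split(secret: string): [Buffer, Buffer] {
  const plain = Buffer.from(secret, "utf8");
  const shareA = randomBytes(plain.length); // random pad = share A
  const shareB = Buffer.from(plain.map((byte, i) => byte ^ shareA[i]));
  return [shareA, shareB];
}

function recover(shareA: Buffer, shareB: Buffer): string {
  return Buffer.from(shareA.map((byte, i) => byte ^ shareB[i])).toString("utf8");
}

const [lawyerShare, spouseShare] = split("correct horse battery staple");
console.log(recover(lawyerShare, spouseShare)); // prints the passphrase
```

For policies like "any two of three trusted people can recover it," Shamir's Secret Sharing generalizes the same idea; several password managers also offer built-in emergency access, which sidesteps the manual splitting entirely.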

58:32

And yeah, it's really, um, it's a lot of trust you're, you're putting in somebody for sure.

58:37

So that's something to think about if you find yourself in a situation where you're like,

58:40

I want to share this data for whatever reason, you know, share my location.

58:44

It's, can you do it temporarily?

58:45

Can you put any sort of safeguards in place?

58:47

Do you know how to revoke that access if things go wrong?

58:49

So yeah, really, really good stuff.

58:52

I'm glad you mentioned that.

58:54

Of course, like, and trust goes a very, very long way.

58:57

And Nate, I'm really glad that, like, you know, we've started to normalize the idea that yes, you can still be a privacy advocate.

59:03

You still care about yourself and care for others. But at the end of the day, you're still allowed to trust people.

59:09

You're still allowed to like build continuity plans and you're still allowed to, you know,

59:14

share a part of you that you don't really share with other people, which is, you know,

59:18

something as simple as location data, for crying out loud. Like, the fact that, you know, I've seen a lot of it

59:23

you know, be weaponized makes me feel

59:24

reluctant to share my location as I had in the past,

59:27

where back in the day, when everyone was on Snapchat,

59:31

everyone shared their location data, even I did.

59:34

And I had so much fun just stalking everyone's location.

59:36

Oh, you're in the Bahamas right now, and that's so cool.

59:39

But I think a lot about growing up and maturing

59:43

is realizing that not everyone deserves

59:47

to access that information.

59:49

And so retaining that trust, and by doing so

59:54

sacrificing and compromising a little bit, is such an important part of growing up.

59:57

So, sure, try to advocate greater privacy practices for yourself and your family.

1:00:02

Because at the end of the day if you have a family or partner

1:00:05

you're gonna trust them a lot, you know, to hold your secrets, to have access to that information,

1:00:10

you know, if something does happen. And the same goes the other way around: you will have to, like, know where they are

1:00:15

sometimes, right? And I think just, like, recognizing there is a balance here

1:00:21

makes your privacy journey a lot more feasible.

1:00:24

And I hope it can be replicated by everyone else who's watching us right now.

1:00:31

Yeah, definitely.

1:00:34

All right.

1:00:36

If those are all our thoughts on that story, we're going to move on to our next story.

1:00:42

So this one, I want to make it very clear upfront.

1:00:45

If any viewers are familiar with my time on surveillance report,

1:00:51

You know that one of my hobbies is attacking politicians regardless of what side of the aisle they're on.

1:00:56

So this story concerns the current administration, but I just want to assure you guys that regardless of who is in power, I would still call this out as a bad move.

1:01:07

So the headline, this comes from Reuters, it says the Trump administration removes three spyware-linked executives from the sanctions list.

1:01:13

So apparently during the Biden administration,

1:01:17

the US sanctioned three executives of Intellexa,

1:01:21

which is a spyware maker.

1:01:23

I believe they, yes, they make Predator,

1:01:26

which is malware.

1:01:28

A lot of you guys have heard of Pegasus, right?

1:01:30

I think that's the one, right?

1:01:31

That's the one from NSO Group, I believe.

1:01:33

- Yeah, NSO Group, Pegasus, yes.

1:01:35

And if you've heard of Paragon Group,

1:01:37

they also have their own like special spyware as well.

1:01:40

But yes, I think for this company specifically,

1:01:42

it's Predator.

1:01:44

Yeah, it's Predator, which we have seen used on Greek journalists and even members of the

1:01:49

US Congress.

1:01:50

I actually did not know that last part.

1:01:51

Holy crap.

1:01:53

Yeah, so, right.

1:01:56

So for those who don't know, this is like the state level, like the highest level of

1:02:01

malware.

1:02:01

This is the stuff where companies like Intellexa will go to places

1:02:05

like Defcon or other hacker conventions around the world and they will find really good hackers.

1:02:12

I'm not making this up. This happens. They will listen in on a talk, for example, and

1:02:16

somebody will be like, "Oh, here's how I found this zero day in iOS that allowed me to read

1:02:21

all the messages." They'll go up to that person and be like, "Hey, if you find any more zero

1:02:25

days, give us a call first before you call Apple." These companies will literally pay

1:02:30

millions of dollars for vulnerabilities, usually in phones, but I'm sure in all kinds of systems.

1:02:37

What they offer is, and I'm sure it's like a string of vulnerabilities. I'm sure it's

1:02:40

not like one thing, they will offer malware that's usually zero click.

1:02:46

Like you don't have to do anything.

1:02:47

They just, I think Pegasus for a while, they just had to send you like an

1:02:51

iMessage and it would just automatically install itself.

1:02:53

Again, no, you didn't need to download something.

1:02:55

You didn't need to click a link.

1:02:56

You didn't even need to accept the message.

1:02:58

It just worked.

1:03:00

And I, I hate to say 'historically,' because I think this is still like 90% true.

1:03:07

When Pegasus first became very mainstream, we saw a lot of people really panicking and they're like,

1:03:12

"Oh, how do I know if my phone's infected?" 99% of the time, if you're asking that question,

1:03:17

they're probably not using it on you. Because again, these are exploits that cost millions of

1:03:21

dollars. And the more they use it, the more that companies like Apple and Google are going to notice

1:03:25

this malware is creeping into their phones, and the more work they're going to put into figuring

1:03:28

it out and fixing it and reverse engineering it. So they're going to be very selective with who they

1:03:32

use it on. Usually it's like journalists, activists, like really hardcore, high-

1:03:37

profile activists, politicians, things like that. So lately though, we have seen, or at

1:03:45

least I feel like I've seen a lot more stories popping up about Apple letting people know

1:03:49

that like, hey, you've been targeted, you might have malware on your phone. Apparently

1:03:53

Apple has a feature now that does that. I don't know if Google does, I don't think they

1:03:56

do. But it could be, like, what do they call that?

1:04:01

The confirmation bias where like, I see the story a lot,

1:04:04

so it feels like it's happening more often,

1:04:06

but I don't know if that's actually true.

1:04:07

I don't have any statistics on me.

1:04:09

But anyways, yeah, that's kind of the main story here

1:04:11

is we're not sure why, but for some reason,

1:04:13

the US Treasury Department has decided

1:04:15

that they don't wanna sanction these three anymore.

1:04:18

They claim that, what did they say?

1:04:21

This was done as part of the normal administrative process

1:04:23

in response to a petition request for reconsideration

1:04:26

And they said that each of the individuals had demonstrated measures to separate themselves

1:04:30

from the Intellexa Consortium.

1:04:31

So basically they're saying, oh, we received an appeal and these guys don't work for Intellexa

1:04:36

anymore, which is, I guess, I don't know, that's a take.

1:04:42

But yeah, I don't know. I think the reason we wanted to cover this

1:04:47

story again is regardless of political affiliation, it shows that the current administration is

1:04:53

kind of cozying up a little bit to spyware makers.

1:04:56

Um, I'm going to fact check myself here, but I think there was even a story recently where,

1:05:02

um, yes, I do remember this now.

1:05:05

They, I don't know if this was Biden era or first Trump era, but the government did actually

1:05:09

at one point, I think it was Biden era, but don't quote me.

1:05:12

They did actually, um, forbid the US to buy spyware like this.

1:05:17

And I'm sure for the record, I'm sure there's a lot of loopholes and a lot of under the

1:05:21

table and like, you know, that kind of stuff. But at least on paper, they said, Hey, you

1:05:25

can't do this anymore. And the current administration has rolled that back too and said, no, you

1:05:28

can totally buy spyware and deploy it. So, um, yeah, it's just really unsettling. And

1:05:34

again, I don't care who's in office. I don't care if it's a Republican, a Democrat, a Martian,

1:05:38

I don't care. The point is like any time that any government shows a willingness to use

1:05:43

this kind of stuff is really concerning. You know, Kevin was just talking about spyware

1:05:48

And that's kind of the over-the-counter commercial stuff that abusive partners might put on there, but it's still

1:05:57

Yeah, not good

1:05:59

Yeah, like, I think we were a little scared to cover this story at first because, you know, the headline seems a little too partisan.

1:06:05

Of course, but I want to emphasize that Privacy Guides does not endorse anyone or any political affiliation.

1:06:11

We're not legally allowed to do so according to US law

1:06:15

but one of the reasons why I wanted to include this story is because

1:06:21

different governments will have different responses.

1:06:23

We can't just label it, saying the US is friendly to spyware

1:06:26

or the US is opposed to spyware;

1:06:27

this country sponsors it, this country won't.

1:06:30

It is ultimately a tool to be used.

1:06:32

All right?

1:06:32

And just because one administration

1:06:36

takes a more heavy-handed approach to targeted spyware

1:06:38

doesn't mean that in the future it won't be utilized.

1:06:41

Or even while that administration is still in power,

1:06:43

you don't know if individual law

1:06:44

enforcement agencies are trying to sneakily acquire targeted spyware

1:06:49

tool sets to deploy without the knowledge

1:06:51

of the federal government itself.

1:06:55

So, like I'm saying, though, regardless

1:06:58

of the current administration, regardless of the government

1:07:00

in power, we will try our best to fact-check and dispute

1:07:06

any potential wrongdoing.

1:07:08

And what's clear right here is that while they may--

1:07:11

these three ex-executives may have potentially

1:07:14

separated themselves from Intellexa.

1:07:17

The fact that their software is usually

1:07:19

targeted at human rights activists, journalists,

1:07:22

even members of the US Congress, clearly shows a moral bankruptcy.

1:07:27

The fact that they'll really just sell these tools to anyone

1:07:29

without vetting what they're used for is frankly insane.

1:07:33

And if there is anything worth fighting for,

1:07:35

it is responsible regulation

1:07:37

against these tools, regardless of where you are.

1:07:40

Because these tools, if they won't be used against you now,

1:07:43

they will be used to get you in the future if you do something

1:07:45

that pisses the wrong people off.

1:07:46

It doesn't matter where you are in the world.

1:07:48

This stuff should not be normalized.

1:07:50

I'm telling you.

1:07:51

This is the one time where maybe we

1:07:53

shouldn't be selling hacker tools to everyone

1:07:55

on the free market, mind you, because regardless of who we are,

1:07:57

if you say something loud enough on the internet

1:07:59

and it gets enough views, who knows,

1:08:00

maybe some random government really

1:08:01

doesn't like that and hacks your phone.

1:08:04

This is something that should not be normalized.

1:08:06

And any sort of sanctions lifting,

1:08:10

even for folks who do pinky promise they

1:08:13

won't do these types of things anymore,

1:08:15

goes to show that there is this slow and steady normalization

1:08:19

of targeted spyware.

1:08:21

And it will be more common throughout the world, mind you.

1:08:25

This stuff may be becoming more common across US law

1:08:28

enforcement agencies, but also has been incredibly common

1:08:32

in Spain, of course, where Catalonia, Barcelona specifically,

1:08:35

is now the capital of targeted spyware firms.

1:08:39

Ever since a lot of former NSO Group alumni

1:08:41

actually migrated to Barcelona, you know,

1:08:44

after the whole fallout from Pegasus,

1:08:47

there's been a lot of like, you know,

1:08:50

companies founded in Dubai and the UAE, for example,

1:08:53

Israel, of course, but also like Russia and China as well.

1:08:58

Like throughout the world, we're seeing a lot more

1:09:00

like greater instances of targeted spyware firms,

1:09:02

and they've gone through numerous legal challenges, lawsuits.

1:09:07

But eventually these people will just, you know,

1:09:09

you know, just quit the company, join a new company,

1:09:13

and then start services again, rebadging what we thought had been, you know, sued to death.

1:09:20

You may have known that NSO Group has experienced a lot of difficulties recently.

1:09:23

A WhatsApp lawsuit against them has actually succeeded, in the US at least.

1:09:29

I believe it was, like, jointly pursued by both Apple and WhatsApp.

1:09:34

And essentially they're forced to pay out millions of dollars of damages.

1:09:38

Right now, the lawsuit is still ongoing, and they've still managed to survive, disputing every single way of actually paying that judgment in court.

1:09:47

While, you know, its current team members dispersed to other companies, to create new companies in Barcelona or in the UAE, or to even join governments themselves and leverage their expertise in more legally sanctioned practices.

1:10:01

So I think the takeaway here isn't like, oh,

1:10:05

this administration is bad or this administration is good, whatever.

1:10:07

The takeaway here is that these people will evade punishment.

1:10:12

Like, no matter how many companies you ban or sue to death,

1:10:16

they'll just disperse and create another company in another country and

1:10:18

try to find some other way to make money off their talents.

1:10:21

And quite frankly, it's disgusting.

1:10:27

Yeah, I don't really have much to add on to that.

1:10:29

Just like you said at the start, you know, this is something that I've definitely

1:10:36

said a lot: it's dangerous to give governments any sort of authority, because

1:10:44

you never know when the winds will shift. Like even if you're, if you're very conservative

1:10:48

and you're pro-Trump and you're like, yeah, he's going to use it to find the bad guys.

1:10:51

Okay. What happens when a Democrat gets into power and now has that same authorization

1:10:57

to use the spyware. It's really unsettling stuff and it's really hard to, I don't know,

1:11:04

I say it's hard to justify especially because as far as I know, we haven't really seen any

1:11:08

cases where it was used successfully to catch actual bad guys. Again, maybe it's like a

1:11:14

confirmation bias or just because these are the ones that make headlines, but it seems

1:11:18

like every time I see spyware talked about it's like, "Oh, deploy it on journalists,

1:11:21

deploy it on politicians, deploy it on activists in repressive countries." I've yet to read

1:11:27

a story. And for real, people, if there are any, please send them my way, because I actually genuinely want to read them.

1:11:32

I've yet to see a single story where it's like

1:11:34

Oh, they use Pegasus to stop a terrorist to stop a plot to you know find this person who was you know kidnapped and held ransom like

1:11:42

I don't know, it seems to me like these only ever get abused for bad purposes and

1:11:48

Yeah, not a fan

1:11:50

Yeah, I think, like, maybe in the future, um, I would love for us to figure out:

1:11:55

okay, what are actually some legitimate use cases

1:11:57

for targeted spyware?

1:11:58

And are these cases even real?

1:11:59

I'm sure, like, if you pretend to want to purchase their software,

1:12:02

maybe they will be willing to share a few case studies

1:12:04

that they used in the past.

1:12:05

But if the spyware, as it turns out, has use cases

1:12:08

where they actually like targeted journalists

1:12:09

or dissidents, then well, that's a little ironic, right?

1:12:14

But I think it's good to actually objectively

1:12:16

find out the facts here.

1:12:17

How many criminals are actually being caught?

1:12:18

How many terrorists are being apprehended

1:12:22

because of these targeted spyware actions?

1:12:24

Because it seems to me that there is very limited information

1:12:27

where it's been used in that case.

1:12:30

I mean, I'm sure it has been in the past.

1:12:31

I'm not denying that.

1:12:33

But it seems to me that the only people

1:12:35

who use their phones in ways in which they could be vulnerable

1:12:38

to a zero-day attack, are not terrorists or criminals

1:12:41

who probably threw away their phones a long time ago.

1:12:45

They're usually people with lives

1:12:47

who also have a voice in things.

1:12:49

So yeah, I think that paints a picture for us,

1:12:52

discussing the upcoming future, Nate.

1:12:56

But anyways, I think it's actually

1:12:58

a great time to discuss any trending forum updates.

1:13:03

So right now, since it is a new year,

1:13:06

I actually posted on the forum asking

1:13:12

about potential security or privacy resolutions

1:13:15

for 2026.

1:13:16

And while, yes, both of us have already

1:13:18

discussed our own resolutions, there's actually

1:13:21

been quite a lot of activity on the forum,

1:13:22

with people actually trying to figure out, like, hey, what are some things that I want to do?

1:13:26

And of course, I can't really describe who is on the thread itself.

1:13:30

But I can give a very brief summary of what people have been discussing here.

1:13:36

So there's quite a few here that want to leave their current ecosystem, for example,

1:13:41

switching away from things like Mint to one of our recommended distributions.

1:13:46

One user really wants to fully move to GrapheneOS, and maybe potentially even

1:13:51

embrace using Proton even more often.

1:13:54

Maybe self-host some things, like I have mentioned on the forum.

1:13:57

Some people want to delete as many older accounts as possible

1:14:01

because I'm sure all of us have so many accounts

1:14:03

just waiting to get hacked and have their passwords be leaked.

1:14:06

And it's gonna take, I feel like the whole year

1:14:08

to get through all the accounts we made

1:14:10

over the past few decades of our lives, you know?

1:14:14

- That's definitely something I need to do

1:14:15

is delete some old accounts.

1:14:17

- We have so many.

1:14:19

It's gonna take forever, man.

1:14:20

I'm telling you.

1:14:22

We got quite a few gamers here, of course, actually.

1:14:25

Someone switched from Windows 11 to Bazzite,

1:14:27

and actually wants to switch away from Bazzite

1:14:29

to another Fedora Atomic distribution, which is great.

1:14:35

And yeah, self-hosting.

1:14:39

Someone is trying to look at more VoIP providers.

1:14:44

Someone also wants to try out Monero for the first time.

1:14:47

Yeah, honestly, kudos to everyone here.

1:14:49

I'm really proud of you trying to set those goals for yourself.

1:14:52

And I'm really happy to see that 2025 was a sign

1:14:55

that you have done a lot of good work already

1:14:58

to focus on your privacy.

1:15:00

And that's not to say you should stop doing it.

1:15:02

And I really hope that you keep each other accountable.

1:15:05

And keep us updated on any potential privacy wins

1:15:08

throughout the year as it progresses.

1:15:12

But yeah, Nate, do you think you could cover the next forum

1:15:16

update?

1:15:16

Then we can get to any questions that may have been posted.

1:15:20

- No, we can go ahead and cover this one real quick.

1:15:23

We, yeah, well, it was me mostly.

1:15:25

I was on the fence about this one earlier,

1:15:27

but I think it is important we talk about it.

1:15:30

So there is a thread that I believe,

1:15:33

this is another one that you posted, right?

1:15:35

I'm open to it right now. - Yeah.

1:15:37

- Okay, so there was an article that said,

1:15:40

iPad kids are more anxious, less resilient

1:15:43

and slower decision makers.

1:15:45

And that came from The Register.

1:15:48

Oh, excuse me.

1:15:48

The actual headline says infant screen time,

1:15:50

linked to anxiety and slower cognition.

1:15:52

And I originally- - I think they changed it.

1:15:54

- It was yours.

1:15:55

Oh, did they change it?

1:15:56

- Yeah, sometimes it does.

1:15:57

- Yeah, it's weird.

1:16:00

Or like, yeah, sometimes I've noticed

1:16:01

you'll go into like reader mode

1:16:03

and the headline will change or something.

1:16:05

It is really weird.

1:16:07

But yeah, it's, the reason I wanted to talk about this one

1:16:11

is I feel like in the privacy community,

1:16:16

This is my personal observation, I could be wrong.

1:16:18

I feel like we're really too hard on parents sometimes

1:16:21

or especially I feel like we overestimate the average person.

1:16:26

And I say this not to put down any non-technical listeners.

1:16:30

I say this as, and I'm saying this in good faith.

1:16:35

I know a lot of people are gonna be really upset

1:16:37

with me when I say this, but I think we kinda need

1:16:39

to take a step back as a community,

1:16:41

as a technical community, a tech community.

1:16:43

We need to kind of take a step back and I mean, for lack of a better way to put it,

1:16:49

just kind of check ourselves.

1:16:51

I hate to say that because, you know, I know, right?

1:16:53

That whole like, check yourself before you wreck yourself.

1:16:55

Like that's not where I'm going, but you know, it's because I see one of the first

1:16:59

comments in this thread kind of started to imply that like, well, you know, parents

1:17:04

are just like, they're just not trying hard enough.

1:17:08

You know, that's basically what it is.

1:17:09

They're just being irresponsible and lazy.

1:17:10

And I think, you know, like anything, there's nuance, you know,

1:17:15

being a parent is hard, especially nowadays.

1:17:17

I, I don't have kids.

1:17:19

I want to throw that out there.

1:17:20

I don't have kids.

1:17:20

Um, I never planned to have kids.

1:17:24

And even so, I'm already at a point in my life where I look back on it.

1:17:29

And my mom cooked dinner almost every single night when I was growing up.

1:17:33

And I don't understand how she did that.

1:17:34

She worked a nine-to-five job with a commute during rush hour, 40 hours a week,

1:17:40

and somehow made a fresh meal every single night almost.

1:17:44

And I just, I will never be able to wrap my head around that.

1:17:47

And, you know, every parent I've ever spoken to,

1:17:51

like I'm at the point in my life now where,

1:17:53

even though I'm not a parent,

1:17:54

there's a lot of parents around me, you know?

1:17:57

A lot of my friends are either having kids

1:17:59

or already have kids of various ages.

1:18:01

I had a coworker at my last job that I swear to God,

1:18:04

I don't know how he found time for anything

1:18:05

because of his kids. Every single day:

1:18:07

yeah, they had baseball practice last night,

1:18:09

They had basketball last night.

1:18:11

They have a game this weekend.

1:18:12

They have a game next weekend.

1:18:14

We have two games this weekend.

1:18:15

We have to go out of town to, you know, some other city halfway across the state.

1:18:19

And reminder, Texas is a really big state.

1:18:21

So like, I just truly do not understand it.

1:18:23

And he showed up to work every single day and I'm like, dude, how do you find the time?

1:18:27

And he would still find time to like, you know, play an hour of sky room here

1:18:31

and there a couple of times a week.

1:18:32

And I'm just like, wow, I'd be dead by now.

1:18:36

And he was a good parent.

1:18:37

He was, he was very engaged.

1:18:38

He was very, you know, with his kids.

1:18:41

And it's just the point I'm trying to make is like,

1:18:43

it's a lot of work to be a parent.

1:18:44

If you speak to any brand new parent, they will tell you,

1:18:47

like, I remember my sister told me after she had her kid,

1:18:49

she's like, I didn't know it was possible to be this tired.

1:18:52

Like, it's just hard.

1:18:54

And I think rather than shaming parents,

1:18:59

and for the record, there are some parents

1:19:00

who are just totally checked out and they suck.

1:19:02

I do want to acknowledge that.

1:19:03

But I think for a lot of parents,

1:19:05

it's just really hard to keep up with all this stuff.

1:19:07

I think to kind of flip it a little bit,

1:19:09

I think we as technical people kind of underestimate

1:19:13

the skills we have.

1:19:14

You know, for us, it's like things like,

1:19:18

I don't know, things like installing Linux

1:19:20

or flashing a phone, you know, those are things

1:19:22

that we can do very easily, very quickly

1:19:24

because we have experience and we know what we're doing.

1:19:27

But to somebody who's never done this stuff before,

1:19:29

like I've known people who pay for Norton Antivirus.

1:19:34

This is again, a whole conversation we've had offline.

1:19:37

I know people who have been using iPhone for years

1:19:39

and did not know that it has built-in parental controls.

1:19:43

And to be fair, they're very, very buried apparently.

1:19:47

I once asked, I think I've told this story before,

1:19:50

I've asked in the past, I asked my sister to look at

1:19:53

my website, The New Oil, and give me feedback.

1:19:55

And she was like, well, there's some words I don't understand.

1:19:58

And I was like, okay, give me a list of words.

1:20:00

And I was expecting her to be like, you know,

1:20:02

like zero knowledge or ROM or Linux maybe.

1:20:06

She literally said, like, 'firewall,' 'operating system.'

1:20:10

And I think web browser was one of them.

1:20:12

Like people don't, I've had way too many conversations

1:20:15

with people where they're like,

1:20:17

"Hey, I need help with this tablet."

1:20:18

And I'm like, "Okay, is it Android or iOS?"

1:20:21

And they're like, "I don't know."

1:20:24

What do you mean you don't know?

1:20:25

And then it turned out it was actually like Windows

1:20:27

or something.

1:20:27

So it's the point I'm trying to make is

1:20:29

we are a lot more technical

1:20:31

than we give ourselves credit for sometimes.

1:20:33

And I think the goal rather than attacking parents,

1:20:36

especially if you're not a parent.

1:20:37

If you are a parent,

1:20:38

you're probably a little bit more qualified,

1:20:39

but rather than attacking parents and saying like,

1:20:42

"Oh, you're lazy, you're disengaged,"

1:20:44

I think the better solution is to show them like,

1:20:45

"Hey, did you know your phone has parental controls?

1:20:48

Did you know that there are these alternative front ends

1:20:51

where you can put on YouTube for your kid

1:20:53

and it's not gonna show them ads,

1:20:55

it's not gonna track them?

1:20:56

Did you know that there's,

1:20:58

'cause I know a really common thing nowadays is like

1:21:00

getting a Gmail account for your kid while they're still young.

1:21:03

So that way someone doesn't claim it

1:21:04

by the time they grow up.

1:21:06

Did you know there's proton?

1:21:06

You can get them that instead, or Tuta, whatever.

1:21:09

Like, I don't know.

1:21:10

I think it's just a little empathy, I think,

1:21:12

is what I kind of wanted to talk about

1:21:14

and trying to focus more on the education side of things

1:21:17

'cause I don't know, me personally,

1:21:20

I think it was hard enough to be a parent.

1:21:23

It's always hard to be a parent.

1:21:24

And now we've got all this technology

1:21:25

and it's so confusing and companies are so full of crap

1:21:27

and they're always lying about this thing is hyper secure

1:21:30

and perfectly encrypted.

1:21:31

And those of us who are technical know that's crap

1:21:33

and it's not true.

1:21:34

We know how to read a privacy policy, and parents don't; they don't have time to read a privacy policy when they're trying to

1:21:39

cook dinner and, again, commute and do all this. It's just, I don't know. I know I'm going in circles now, but yeah,

1:21:46

It's it's hard to be a parent nowadays

1:21:48

And I think we would do better to try and help support parents and give them the tools

1:21:52

They need to navigate this stuff rather than putting all the onus on them

1:21:57

Yeah

1:22:03

I sympathize with

1:22:04

parents, of course. It's another thing to also recognize that some things that may be questionable,

1:22:08

like just giving an iPad to a toddler, of course, is oftentimes, like, a result of

1:22:14

ignorance; they may not know certain things, because they weren't raised with an iPad in mind,

1:22:18

you know? But honestly, for me, like, I'm grateful that I wasn't an iPad kid until I was like seven,

1:22:24

right? Not at one or two, like some kids nowadays are. So for me personally,

1:22:30

it was a matter of, yes, an iPad was very instrumental for me to get interested in technology. I

1:22:36

learned how to use it and it was really great for educational purposes. It also led to a lot of

1:22:41

unfortunate things with screen time and I think it's okay to recognize that yes, you should not

1:22:45

be giving a child an iPad when they're only like a few months old or two years old, but maybe later,

1:22:52

like potentially like recognize that yes, you will eventually need to give your child a phone

1:22:57

one day, or a tablet one day. And there isn't really, like, a solid understanding of what I

1:23:04

would call, like, a unified curriculum. You know, like, parents can easily

1:23:08

teach their children how to say no to strangers when they're offering candy, right? Like that's

1:23:12

a given, you know, that's how like, I feel like most parents like raise their children,

1:23:16

don't talk to strangers, don't open the door to random people. Maybe it's time to instead like

1:23:22

focus on things like, well, maybe you can't expect that parents have the

1:23:28

technical know-how to implement parental controls, but how about, like, maybe teaching a child: okay,

1:23:34

honey, please don't respond to random strangers online; or, hey, listen, um, let me know if you're

1:23:40

making a social media account, okay? Um, you've got to let me know if you're doing that,

1:23:45

because I don't believe you should have this social media account yet. I think things like that

1:23:50

are a lot more doable once parents come from my generation, for example,

1:23:59

which was raised on social media. We were those social media kids.

1:24:02

I think all of us recognize the harms that social media could have on our own children.

1:24:07

I don't know whether we'll go the route of actually implementing parental controls on

1:24:12

all of our devices. For me personally, I will. That's what I want to do if I were to become a father.

1:24:19

but we don't know if most will as well, because nowadays being a parent is incredibly tough.

1:24:23

I recognize the struggles that my parents had, you know, just trying to put food on the table,

1:24:27

as well as deal with a language barrier. And for me personally, as a potential parent,

1:24:34

I hope to one day like teach my kids how to responsibly use technology without necessarily

1:24:38

banning them from it. But that's gonna take a lot of effort from my end as well, because I

1:24:41

will probably be working a full-time job as well. And I'll need to cook for them, drive them to all these

1:24:46

places, but maybe this technical divide will be bridged as like more people from the younger

1:24:52

generations become parents themselves and are more fluent with how social media works.

1:24:57

But then again, you'll never know.

1:24:59

Maybe we'll start shoving iPads on our kids and that's a normal thing to do.

1:25:03

But I hope that's not the case.

1:25:05

But, like, based on what I know, from my family, friends, and I,

1:25:09

it tends to be the default option nowadays.

1:25:11

And we should still try to advocate against giving iPads to kids and whatnot.

1:25:17

Not just because of personal security concerns, but also based on the development of the child

1:25:22

itself.

1:25:24

But yeah, Nate, I think you really made a lot of great points.

1:25:26

So I feel like we all could reflect and learn from someone who actually has

1:25:31

interacted with parents and dealt with their concerns as well.

1:25:37

Yeah.

1:25:37

And just to be clear, you made a really good point.

1:25:40

I'm not defending the kind of parents who just like,

1:25:44

I'm completely checked out.

1:25:45

I just let YouTube raise my kid.

1:25:46

I think there's a distinct difference between like,

1:25:48

of course I'm going to let my kid watch YouTube

1:25:50

while I'm like cooking dinner or something versus,

1:25:52

I'm just never there.

1:25:53

And, but yeah, I mean, it's just, it's so,

1:25:57

yeah, that was another thing that you brought up

1:26:00

that we were talking about before recording

1:26:02

or before the stream is,

1:26:06

I come from a generation where, you know,

1:26:08

I mentioned earlier, like when I was in high school, it was my space.

1:26:11

Facebook didn't exist yet.

1:26:13

Like I don't even think targeted advertising existed or it was still like based on cookies.

1:26:16

And you know, it was very primitive compared to what it is now.

1:26:19

And it's just tech changes so fast.

1:26:21

And like the world that kids are being raised in nowadays is completely unrecognizable from

1:26:28

the world that I grew up in.

1:26:30

And so it's just, and you know, phones or devices are constantly pushing new features

1:26:35

every time they update.

1:26:36

It's so hard to stay on top of all of them.

1:26:38

And it's, you know, it would be great if we all lived in a world where we could switch

1:26:42

to Linux and switch to Graphene.

1:26:44

And, but, you know, sometimes people for whatever reason, don't do that or can't do that.

1:26:48

And I think it's, it's unfair to expect them to read all the privacy policies, spend all

1:26:54

their time studying.

1:26:55

Like that's one of the reasons I got into podcasting.

1:26:58

I think we answered this in a question a couple of weeks ago.

1:27:00

Like that's one of the reasons I started podcasting this privacy stuff is because I realized I'm

1:27:03

I'm like, man, keeping up with these headlines is like a full time job.

1:27:06

I could not expect anybody to keep up with this stuff if they weren't as

1:27:11

passionate as I am about this.

1:27:13

And, you know, so wanting to be able to kind of bring like, Hey,

1:27:16

these are the important stories you guys should be paying attention to.

1:27:18

And there's just, uh, it's just such a crazy landscape.

1:27:22

And, um, yeah, just, uh, I guess just trying to be more solution oriented.

1:27:27

Like that's a big thing for me personally that I'm a really big fan of is, you

1:27:30

know, it's, sometimes there is no solution.

1:27:32

Sometimes you look at something and you say like,

1:27:34

oh, this just sucks.

1:27:34

And I don't know what to do about it.

1:27:36

But I think sometimes having that attitude of like,

1:27:38

how can we help?

1:27:38

How can we make parents aware that, again, these devices

1:27:42

have protection features and child safety features?

1:27:47

And I don't know.

1:27:49

Yeah.

1:27:50

You said a lot really well.

1:27:53

Yeah, you too as well.

1:27:54

I think it's nice to have different perspectives

1:27:57

on the same problems,

1:27:59

and try to figure out what's the best solution for everyone involved.

1:28:03

Anyways, Nate, I think it's time to do--

1:28:06

now discuss the two questions we have on the forum thread itself.

1:28:11

Chat is-- I don't see anything on the chat so far.

1:28:14

So I think it's good to cover the thread first.

1:28:16

Maybe cover them as they come up.

1:28:18

I'll start with the first one.

1:28:19

Yeah, if you guys have any questions,

1:28:20

be sure to drop them in the live chat

1:28:21

while we're covering these.

1:28:23

Yeah, of course.

1:28:24

Of course.

1:28:25

Let's start with JD, our resident privacy wizard.

1:28:30

who asked: "Does Privacy Guides have a roadmap resource which it would be okay to release to all members of the forum?

1:28:37

A glimpse of what to expect this year, or at least in quarter one?

1:28:40

Or are there no set plans to begin with, for even tentative roadmaps to exist?

1:28:43

I only ask out of curiosity, to learn how you all are upgrading and graduating from the excellent work of the past year, or whether

1:28:48

the goal is to maintain the status quo.

1:28:50

As always, looking forward to your live stream."

1:28:53

Thank you so much for joining us.

1:28:55

To answer your question as of right now:

1:28:57

We do have a roadmap of sorts.

1:29:01

It's not public yet.

1:29:03

But keep in mind that what we do have right now,

1:29:06

if you're more interested, at least, in the general, more corporate, nonprofit side of things,

1:29:11

is that we do release our meeting minutes publicly, especially like--

1:29:16

I believe it's mostly among the executive board and with our folks at Magic Grants as well.

1:29:21

But those are more so on the more organizational side of things.

1:29:24

They're not necessarily related to Privacy Guides as a project as a whole.

1:29:26

Well, we do post those minutes live.

1:29:29

As for a more specific roadmap,

1:29:34

we do have an internal roadmap,

1:29:35

and I'm sure Nate has a lot more to discuss here,

1:29:39

but they've been working on a lot of exciting things.

1:29:41

And they're going to pop out a lot of new content for Q1 2026.

1:29:46

Do you care to elaborate?

1:29:49

- Well, I can only speak for the video stuff.

1:29:52

To be honest, do we have like an actual,

1:29:56

like a GitHub repo or something with a roadmap in it?

1:29:59

'Cause I haven't seen that if we do.

1:30:01

- We do have a GitHub repo, yes, yes.

1:30:03

Well, that's more of an internal thing, I believe.

1:30:06

- I should poke around a little more.

1:30:08

- Yeah, yeah, no, it's like,

1:30:09

it's where we've got a tracker and everything.

1:30:10

So just for transparency's sake,

1:30:12

we do have a private GitHub tracker

1:30:14

where we kind of list all of our videos

1:30:16

and our content and when we'll post it.

1:30:19

But of course, if you do want more of, like, a

1:30:21

social media facing roadmap,

1:30:24

Let's just say, like, a simple flow table.

1:30:27

I think, like, one of us would be happy to create that for you,

1:30:29

and maybe make it more of a regular thing

1:30:31

just for transparency's sake.

1:30:33

But keep in mind of course that things do change.

1:30:35

We don't know if certain video ideas

1:30:37

will actually come to exist in the meantime.

1:30:40

But maybe for example,

1:30:42

when I think of more like general things as well,

1:30:43

just for example,

1:30:45

like when we're releasing the next phase of merch, for example, or

1:30:47

when we're planning to release,

1:30:50

let's say a certain section of the website,

1:30:53

All of that, of course,

1:30:55

I envision could exist in a graphic.

1:30:58

We could publish it on the forum.

1:30:59

But I'm more than happy to discuss this

1:31:01

with the rest of the team to see,

1:31:03

hey, what kind of roadmap exists out here right now?

1:31:06

Since a lot of it's currently very internalized

1:31:08

and very private,

1:31:10

but we could definitely publicize it a bit

1:31:11

and turn it into an image that anyone can view

1:31:13

and more easily digest, of course.

1:31:16

So thank you for asking us.

1:31:18

Just know that we're not maintaining the status quo.

1:31:22

We're planning to grow and release new content.

1:31:24

At least from what I know, they're definitely pushing

1:31:26

Jordan and Nate very hard.

1:31:28

They're sure to publish a ton of amazing video content

1:31:30

in Q1 2026.

1:31:32

We can't really just describe the specifics now,

1:31:35

but trust me, there's a lot of content going out.

1:31:37

So keep that in mind as the months progress.

1:31:41

Kisu, who says: "Happy New Year to everyone.

1:31:46

"I'll start the year with a personal question on minds.

1:31:49

It is a bit lengthy, so feel free to answer it asynchronously.

1:31:52

It basically concerns potential tips while live streaming

1:31:54

and how not to dox yourself."

1:31:56

Yeah, I think like, oh, it's a question

1:31:58

they posted on the forum like a few hours ago, actually.

1:32:02

If you mind clicking that as well, Nate,

1:32:04

it's a very detailed question.

1:32:06

I think Nate and I were more happy to answer

1:32:08

as we read through it.

1:32:11

So just to kind of summarize here,

1:32:15

So Kisu is considering methods to live stream

1:32:20

without doxing themselves, right?

1:32:22

And while they're okay with being a public figure,

1:32:24

they do not want someone to come to their door,

1:32:27

like being swatted or being stalked, or

1:32:30

having their family members stalked by someone.

1:32:32

So that's their threat model.

1:32:34

And the idea is, like, they wanna imitate

1:32:37

what Jordan would do, and other live streamers

1:32:40

who have a floating emoji face,

1:32:43

and they may consider imitating that.

1:32:46

But of course, there's a lot of common mistakes

1:32:47

that could happen.

1:32:48

They just want, you know, our perspective on this.

1:32:51

And a lot of their current bullet points

1:32:53

include streaming on YouTube, Twitch,

1:32:56

or something self-hosted, like Owncast or maybe PeerTube;

1:32:59

protecting themselves against targeted attacks

1:33:00

from a personal point of view.

1:33:02

And then considering that like a random employee

1:33:04

from YouTube or Twitch could also just look them up

1:33:06

in the database and figure out where they're streaming from

1:33:07

via a static IP.

1:33:11

And that's kind of what they've been concerned

1:33:13

about. That's more of an account-based perspective, but from a computer-based perspective,

1:33:19

they don't plan on streaming from their personal computer. And that's why they probably want to

1:33:23

figure out how to, for example, have a dedicated gaming PC, which will supposedly

1:33:28

run various Linux distributions alongside a Mac Studio, connected through what I assume is

1:33:33

a KVM switch. They also want to limit the amount of software as much as possible,

1:33:38

so that everything's running on simple hardware, with as few failure points as possible.

1:33:42

And all of that while multi-streaming from OBS Studio.

1:33:46

So it seems to me that there's quite a lot of

1:33:48

compartmentalization here, would you say?

1:33:50

Like they're trying to have two different PCs for this purpose.

1:33:55

Yeah, so I kind of skimmed through this a little bit earlier.

1:33:59

And I mean, there's a lot to go through here.

1:34:02

Like you said, this is a really advanced question.

1:34:05

Or maybe not advanced, but really detailed.

1:34:07

I think the first thing I want to say up front is,

1:34:11

I think it's good you're thinking about this kind of stuff.

1:34:14

Like for example, further down, I don't think you've gotten this far, Kevin, but further

1:34:19

down you make a comment about, "I want to be careful regarding any reflective surface,

1:34:23

especially since I do wear glasses while working."

1:34:26

For audio listeners, I just touched my own glasses.

1:34:30

So that is a good thing, because I've had that thought.

1:34:32

I've had, I think it was a band meeting actually one time where I had an email open or something,

1:34:39

And, I don't remember exactly, I opened something,

1:34:45

and my singer made a comment about being able to see it in my glasses and that was over

1:34:48

a Discord meeting.

1:34:49

So and I don't even think that was with my good camera here.

1:34:52

I think that was just with my crappy little webcam.

1:34:54

So I mean, yeah, that's a good thing to consider.

1:34:58

I think one thing, if I may be honest: I think you're focused

1:35:07

a little too much on the technical side and really overthinking this.

1:35:10

Like you're saying I want a completely separate computer, which if you have the money for

1:35:13

that, great, by all means feel free.

1:35:16

But also you could just maybe have like a separate user account.

1:35:19

'Cause when you say you don't want to leak anything that's on your personal computer,

1:35:22

Well, what do you mean?

1:35:23

Leak it in the sense of, like I just mentioned, where it's visible in your glasses

1:35:26

or it's accidentally visible on screen for a minute? Create a separate user account where

1:35:29

you only install your gaming stuff and your streaming stuff.

1:35:32

And now you don't really have to worry about it too much.

1:35:33

Like what's the worst case scenario?

1:35:35

They're going to see your desktop that has all the other games on it, or they're going to see, you know, your OBS for a second.

1:35:40

Like that's really not a big deal.

1:35:43

You also talk about using a VPN.

1:35:45

I mean, I definitely, I definitely encourage that.

1:35:50

My only thing is you talk about like a kill switch and private DNS filtering.

1:35:54

I mean, that could kill your stream mid-broadcast, if you do it wrong.

1:35:57

But I will also say you might experience some latency, some loss of quality.

1:36:02

I'm not really qualified to comment on that kind of stuff, but just something to be aware

1:36:06

of.
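
A minimal sketch of the kill-switch watchdog idea being discussed here, assuming a Linux machine where the VPN shows up as a network interface; the interface name "wg0" and the process name "obs" are illustrative placeholders, not a tested or recommended setup:

```python
#!/usr/bin/env python3
# Sketch: if the VPN drops, stop the stream instead of leaking the real IP.
# Assumptions: Linux, VPN visible as an interface (e.g. "wg0" for WireGuard,
# "tun0" for OpenVPN), and the stream running in a process matching "obs".
import subprocess
import time
from pathlib import Path

VPN_IFACE = "wg0"    # placeholder interface name; adjust for your VPN
CHECK_INTERVAL = 2   # seconds between checks

def vpn_is_up(iface: str) -> bool:
    # Tunnel interfaces often report "unknown" even when healthy, so accept it.
    state = Path(f"/sys/class/net/{iface}/operstate")
    return state.exists() and state.read_text().strip() in ("up", "unknown")

def main() -> None:
    while True:
        if not vpn_is_up(VPN_IFACE):
            # Kill the streaming process: a dead stream beats a doxed IP.
            subprocess.run(["pkill", "-f", "obs"], check=False)
            print("VPN is down; stopped the stream to avoid exposing the real IP.")
            break
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    main()
```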

1:36:07

You say the room is closed off from the outside world: no laser through a window, no external sounds

1:36:11

or things like that that will make my location identifiable.

1:36:14

I mean, laser through a window, I think is a little extreme, but yeah, if there's like

1:36:17

a train that goes by at the same time every day, that could definitely be a thing.

1:36:19

I think that's a little bit more advanced, but again, it's good you're thinking about

1:36:22

this stuff.

1:36:24

The main thing that kind of stuck out to me is what are you thinking about from the non-technical

1:36:31

side?

1:36:32

For example, are you staying on top of your data removal stuff from people search websites?

1:36:37

If that's a thing, I don't know what country you live in.

1:36:38

Maybe you live in Europe where that's not really much of an issue.

1:36:40

But are you staying on top of that?

1:36:42

Have you thought about using a pseudonym?

1:36:45

Have you thought about disinformation?

1:36:48

You could lie and say that you live in a different place than you really do.

1:36:51

You could get a, I don't know, I could get a New York Yankees hat and put it behind me

1:36:56

and say that I'm in New York.

1:36:57

I guess that's kind of common.

1:36:58

But I could get a Cubs hat, and they're Chicago, right?

1:37:01

I could say I'm from Chicago.

1:37:02

I don't even know.

1:37:02

I'm not a sports person, but you know, it's...

1:37:06

Yeah.

1:37:06

Right.

1:37:06

We can tell.

1:37:08

Um, but just stuff like that, like, have you given thought to those kind of things?

1:37:12

Have you given thought to the chat?

1:37:13

Like what happens if somebody tries to dox you in the chat?

1:37:16

And this one I'm kind of conflicted on because like, I used to block my real name

1:37:23

in YouTube comments, uh, over on my channel, The New Oil, but then I realized

1:37:28

that's almost confirmation.

1:37:29

If somebody types it in and the comment doesn't appear,

1:37:31

now they know they got my name right.

1:37:32

So instead I just started blocking everything

1:37:34

and now I have to manually approve every single comment,

1:37:36

which may not be feasible in a live chat,

1:37:38

but you know, same thing.

1:37:39

Like what are you gonna do

1:37:40

if somebody says that in a live chat?

1:37:42

I've heard some people who are like bigger

1:37:44

Twitch streamers and YouTubers,

1:37:46

the protocol for their moderators

1:37:47

is if somebody doxes you, flood the chat

1:37:49

so it gets off screen as quick as possible.

1:37:51

And you know, just kind of having those thoughts

1:37:54

and conversations in advance.
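
Because an exact-name blocklist confirms a correct guess the moment a message disappears, one alternative, sketched here purely as an illustration, is to filter on broad personal-data patterns (street addresses, coordinates, phone numbers) instead of the name itself. The delete_message hook below is hypothetical, standing in for whatever moderation API or bot framework the platform actually provides:

```python
import re

# Broad patterns that catch doxing attempts without a name-specific blocklist,
# so filtering a message never confirms that a guessed name was correct.
DOX_PATTERNS = [
    re.compile(r"\b\d{1,5}\s+\w+\s+(street|st|ave|avenue|road|rd|blvd)\b", re.I),
    re.compile(r"\b-?\d{1,3}\.\d+\s*,\s*-?\d{1,3}\.\d+\b"),  # lat/long pairs
    re.compile(r"\+?\d[\d\s().-]{7,}\d"),                    # phone-like numbers
]

def should_hide(message: str) -> bool:
    """Hide anything that looks like personal data, not just an exact name."""
    return any(p.search(message) for p in DOX_PATTERNS)

def moderate(message: str, delete_message) -> None:
    # delete_message is a hypothetical stand-in for the platform's real API.
    if should_hide(message):
        delete_message(message)

if __name__ == "__main__":
    # Simulated chat with a dummy delete hook.
    for msg in ["gg nice play", "he lives at 1234 Maple Street lol"]:
        moderate(msg, lambda m: print(f"[hidden] {m!r}"))
```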

1:37:56

So yeah, I think I'd be more interested in that side of it.

1:37:59

It sounds like you've put a lot of thought

1:38:00

into the technical side of things.

1:38:02

Are you putting as much thought into the peripheral things

1:38:05

that maybe are not as exciting as computers and software

1:38:09

and VPNs, but are equally important?

1:38:13

- And personally, I'm not like a prominent

1:38:16

live streamer on a regular basis.

1:38:18

So I can definitely say that like,

1:38:21

from what I know of a lot of people who do

1:38:23

get into this hobby and make it big,

1:38:24

is like, there's always like a chance

1:38:26

that they may not use their pseudonym anymore.

1:38:29

They may be like, okay, I am larger scale.

1:38:31

I'm going to make the sacrifice and say who I am.

1:38:34

My real name, show my face.

1:38:36

It's a very common journey

1:38:38

amongst all kinds of creators.

1:38:39

Oh, I play this video game under a pseudonym.

1:38:43

You know, before, it was just my voice.

1:38:44

And then I slowly revealed so much of myself,

1:38:46

like where I live, where I'm from,

1:38:48

because I want to build this rapport with my audience.

1:38:50

And then I show my face, my real name, where I live.

1:38:53

And up to a point where,

1:38:55

For some of the content creators,

1:38:56

like I used to know back in the day,

1:38:58

when they go through some inevitable legal drama,

1:39:01

I can see their jail records, for example,

1:39:03

which is insane to me, having grown up watching them.

1:39:07

But it makes sense 'cause you're voluntarily sharing

1:39:09

your information online.

1:39:10

So perhaps the next step, beyond the technical steps

1:39:13

you've already taken,

1:39:14

and you probably do want advice on those technical steps too,

1:39:18

is maybe just, generally, to

1:39:20

try to reduce the amount of information you're actually

1:39:22

discussing in these live streams.

1:39:24

I'm not saying that you can't go report with your audience,

1:39:28

but there's a lot of things to talk about that isn't just

1:39:31

describing where you live or where you are.

1:39:34

or very specific niche interests that may tell

1:39:36

people where you are.

1:39:39

Of course, it's very difficult because, as with any content

1:39:42

creator playing video games, you want to ensure that

1:39:45

relationship with the audience is there,

1:39:46

and to build that closeness, you also need to share

1:39:49

some parts of yourself, which is why a lot of people do tend to show their face or their real name.

1:39:53

But it comes with a lot of sacrifices.

1:39:55

And for a lot of people like me or Nate here, Jordan, Jonah, we made that sacrifice.

1:40:04

We believe that it's important for people to actually relate to us as individuals and

1:40:08

also potentially learn a lot from what we talk about.

1:40:13

But that doesn't necessarily mean you

1:40:14

can't protect your privacy in other ways,

1:40:17

by not describing everything about yourself.

1:40:20

I won't go into detail of where I'm from,

1:40:23

my internet password or like the random place

1:40:26

I went to the other day.

1:40:27

Like that's not important, that's not relevant.

1:40:30

And by just simply minimizing the information,

1:40:33

you can still be a public figure

1:40:34

without necessarily like doxing yourself.

1:40:37

The way that people will dox you, oftentimes, is very easy.

1:40:40

I know people who get doxed

1:40:42

because they revealed a specific street sign or a specific road that only exists

1:40:46

in some specific area, in this part of town, in this country.

1:40:51

Almost everything you say can eventually be used to, you know, locate you.

1:40:57

So maybe it's good to reduce the amount of information you talk about in the first place.

1:41:01

It's gonna be really difficult of course because I know there is a balance

1:41:06

and I'm sure you'll make it, because there's a lot of people out there who are public figures

1:41:09

who are able to use pseudonyms without necessarily showing their face.

1:41:14

It is a little difficult, of course, as you start interacting with other people,

1:41:17

but I'm sure you'll get it. Just try not to focus too much on the technical details, and focus on

1:41:22

the actual things you're talking about on a day-to-day basis.

1:41:26

All right, real quick. You reminded me when you said, "Oh, I've seen people get doxed over,

1:41:30

like, you know, a street sign or something." When I was in Europe, I don't usually use

1:41:36

Signal stories, but I did in Europe. And at one point I was on the train from Poland to Germany.

1:41:43

And I was just passing through a town. I don't even, I couldn't tell you what town it was,

1:41:48

but I was passing through a town and so I took a picture out the window and posted it on my

1:41:52

Signal story and I said something like, "On my way to Germany," or something. I don't remember. I

1:41:55

don't even know if I said anything. I think I just posted the picture.

1:41:59

And you know, for the record, I set up who could see it, and I made sure it was only people

1:42:02

that I trusted.

1:42:04

And one of my friends messaged me like almost immediately and was like, Oh,

1:42:10

you're in this town in Poland.

1:42:12

And I was like, hold on a minute.

1:42:13

So I like opened the GPS.

1:42:14

I figured out where I was and I'm like, how the hell did you know that?

1:42:19

And it turns out the first giveaway was, um, he's like, well, I know you're in Europe.

1:42:24

Uh, or I think Poland specifically, because like the way the crosswalks go,

1:42:28

he's like, they're like the only place that does this.

1:42:30

And then he's like, but also specifically there's a skyscraper,

1:42:34

which are not very common in Europe.

1:42:36

I found that out. That was very shocking.

1:42:38

But he's like, there's a skyscraper in the background that has a very specific shape.

1:42:42

And that's like the tallest or the second tallest building in Poland,

1:42:46

or something like that.

1:42:47

And this dude plays a lot of one of those GeoGuessr-type games.

1:42:51

Oh, Lord. Yeah.

1:42:51

Jordan said that in the chat, like GeoGuessr pros are scary.

1:42:54

Yeah, he was fantastic.

1:42:55

And for the record, he was someone I trusted.

1:42:57

Like I clicked his name when I was like, yes, share this story with these people.

1:43:00

So I wasn't worried about it, but it was just so shocking that he immediately like was,

1:43:05

oh, you're in this city in Poland.

1:43:06

And I'm like, I just snapped a random picture of a cool town outside of the train.

1:43:11

What the hell?

1:43:12

So yeah, it's crazy what people can pick up on sometimes.

1:43:17

Um, and I don't say that to scare you, but just to kind of

1:43:21

back up another thing you said earlier: there's so much you can

1:43:24

connect with your audience on besides personal information.

1:43:29

But for example, I don't reveal my birthday even in real life.

1:43:38

Like it's a thing for me, it predates me getting into privacy.

1:43:41

I just don't like it.

1:43:42

I don't celebrate my birthday.

1:43:43

I don't talk about it.

1:43:44

I don't say when it is.

1:43:46

And so I do that in real life.

1:43:47

If people are like, "Oh, when's your birthday?"

1:43:48

I'm like, "I'd prefer not to say."

1:43:50

But a lot of the things you guys know about me, like I am really, really, really into

1:43:56

sci-fi.

1:43:58

Super.

1:43:58

I love sci-fi.

1:43:59

You know, I'm, I play video games.

1:44:01

I've got two cats that you guys have seen on some of the streams in the past.

1:44:05

Like a lot of the things that I say about myself are not necessarily revealing things,

1:44:10

but they're, they're real.

1:44:11

You know, it's not me faking it.

1:44:13

Like I don't go on Mastodon and say, oh, I can't wait for the next season

1:44:18

of Silo when I've never seen the show.

1:44:20

You know, they're real things about me, which, I don't know, just kind of backs up

1:44:24

what you were saying.

1:44:25

Like there's a lot of things you can bond over without revealing, like, here's where I'm from,

1:44:28

here's exactly how old I am.

1:44:30

Here's, you know, this, that and the other.

1:44:31

So yeah.

1:44:34

Just wanted to pick that one up and throw that out there.

1:44:37

- Yeah, of course.

1:44:38

No need to get into the details of, like,

1:44:39

okay, like how's my furniture arranged?

1:44:42

Is this from this specific town?

1:44:43

Like it's okay.

1:44:45

Like as long as you understand your threat model,

1:44:48

which is I do not want people to call the cops on me

1:44:51

and have them swat me and knock down my door.

1:44:54

If that's your threat model, you should be fine personally

1:44:56

with just having a very minimalist live room setup,

1:45:00

you'll be A-OK.

1:45:02

But what will kill you is what you say.

1:45:04

So maybe think about, potentially,

1:45:07

ahead of time, when you create your streaming persona.

1:45:08

Think about some things that you do

1:45:10

want to discuss with your audience that you don't mind sharing,

1:45:11

and things that you should definitely lie about or misconstrue

1:45:15

to ensure that you won't get doxed

1:45:16

by inadvertently revealing your location.

1:45:20

But yeah, I think that's all I had to say.

1:45:21

I'm going to stop there, Nate.

1:45:23

Anything else from your expertise?

1:45:26

Just real quick to add onto what you just said is a lot of the time,

1:45:30

if you're not prepared in terms of like giving disinformation,

1:45:34

that's when you panic and you default to giving out real information.

1:45:37

Like we talk about that a lot with, um, you know, like when you go to the

1:45:40

counter and they're like, oh, what's your phone number for the receipt, when you're

1:45:42

buying something?

1:45:43

And if you have not already had that conversation with yourself about,

1:45:47

I'm going to politely refuse or I'm going to give them a fake phone number.

1:45:50

That's when you just kind of panic and you're just like, Oh, uh, crap.

1:45:54

I was going to do the phone number from Jenny, but I just forgot it.

1:45:56

What is it?

1:45:57

Eight, six, seven, five, three, oh nine.

1:45:58

I think. Anyways, point being, like you were just saying,

1:46:03

like think in advance, what information am I willing to share?

1:46:06

What information am I not willing to share?

1:46:07

And yeah, I think just being prepared will get you a long way.

1:46:11

And again, I know we hate them, but use all of these tools to your

1:46:16

advantage in data removal, use the facial recognition stuff,

1:46:22

use the Google searches, use the "Results about you" tool.

1:46:25

I'm not necessarily saying you have to pay for one;

1:46:27

like, we do recommend EasyOptOuts

1:46:28

if you're gonna pay for a service,

1:46:29

but just use every tool at your disposal

1:46:32

'cause some people really have nothing better

1:46:33

to do with their time.

1:46:34

And it would actually be kind of impressive if it weren't so sad,

1:46:37

but I digress.

1:46:39

I think that's all I got.

1:46:40

(laughs)

1:46:41

- Yeah, yeah, same.

1:46:42

Like I said, I usually default to you

1:46:44

when I get any questions about videos

1:46:46

or live streaming, 'cause I feel like

1:46:47

you've been in that for quite a long time yourself, right?

1:46:51

- Yeah, I think I started showing my face

1:46:54

when I joined Surveillance Report in,

1:46:57

I think that was at the end of 2020,

1:47:00

or the beginning of 2021.

1:47:02

And then shortly after that,

1:47:04

Henry encouraged me to start making my own videos as well.

1:47:06

So yeah.

1:47:08

- Yeah, it's been a few years.

1:47:09

- Ooh.

1:47:11

- Time flies, right?

1:47:12

- Yeah.

1:47:12

- But hey, listen. - About two.

1:47:14

- At the end of the day, at the end of the day,

1:47:15

I think it's like, ultimately,

1:47:17

I do applaud people who do want to have that public

1:47:21

facing personality and try to do things with the platform,

1:47:25

be it focusing on privacy education or playing a video game.

1:47:27

I think like it takes a lot of energy to do.

1:47:30

And the fact that you want to do it as well is like, uh, very inspiring.

1:47:34

So regardless of, you know, whether or not you actually

1:47:37

choose to remain pseudonymous or not.

1:47:38

That said, I'm sure that you will be an amazing live streamer if you

1:47:42

decide to pursue this path.

1:47:44

Um, if you need further advice, feel free to hit us up at

1:47:47

any time, either on the forum or through email, DMs, whatever,

1:47:51

especially if you have any more specific questions that you may not want to discuss publicly.

1:47:54

Um, but yeah, like thank you so much for the question.

1:47:58

Yeah.

1:48:00

Um, I did not see any more questions come in on the live chat.

1:48:04

Uh, real quick, I did see, um, Ilyan9999 said that, uh, one

1:48:10

thing I like to do with some of my accounts is I quote unquote, borrow someone

1:48:13

else's username.

1:48:14

So when people try to find me, they get info about that person instead.

1:48:17

I mean, that could work for a personal account, but usually when you're trying to be like a streamer or something, you're trying to build a brand.

1:48:23

So, I mean, that's definitely, yeah, your personal and your professional identities should be kept a little bit separate, ideally.

1:48:28

But I would not recommend using a different username everywhere for your professional account.

1:48:33

That's why, like, you know, when I do work, it's The New Oil.

1:48:39

I try to be The New Oil everywhere online or, you know, like now with Privacy Guides, I try to be Nate B.

1:48:44

so that anywhere I go, people know it's me, ideally.
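
As a small illustration of that "same handle everywhere" point: a stdlib Python sketch that probes common public profile URLs to see whether a handle is already taken. The handle and URL patterns are assumptions for illustration only, and some sites serve single-page apps that return 200 for any path, so status codes are just a rough signal:

```python
#!/usr/bin/env python3
# Sketch: rough check of whether a handle is taken across a few platforms,
# useful when picking one consistent brand name. URL patterns are illustrative.
import urllib.error
import urllib.request

HANDLE = "examplehandle"  # placeholder handle
PROFILE_URLS = [
    f"https://github.com/{HANDLE}",
    f"https://www.reddit.com/user/{HANDLE}",
]

for url in PROFILE_URLS:
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{url}: HTTP {resp.status} (probably taken)")
    except urllib.error.HTTPError as e:
        # A 404 usually means the handle is free; other codes are ambiguous.
        verdict = "maybe free" if e.code == 404 else "unclear"
        print(f"{url}: HTTP {e.code} ({verdict})")
    except urllib.error.URLError as e:
        print(f"{url}: request failed ({e.reason})")
```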

1:48:47

So, oh, hold on.

1:48:50

We got one more last-minute question from Unredacted.

1:48:54

They asked, "Where do you think the bad internet bills stand,

1:48:56

specifically in regard to age verification and rumors

1:48:58

of section 230 reform in the US?"

1:49:02

I have not heard any details,

1:49:03

but I have actually been thinking I could be wrong,

1:49:06

but I think politicians usually, correct me if I'm wrong,

1:49:09

'cause I think you might know more about this

1:49:11

than I would, Kevin, but I think politicians

1:49:13

and Congress usually take a break over the holidays.

1:49:17

So they probably paused for the last few weeks,

1:49:21

but I wouldn't be surprised if they're getting back

1:49:23

in session soon.

1:49:24

So I think we do need to try to check and see

1:49:28

if there's an update on that.

1:49:30

- Yeah, for sure.

1:49:30

I think having some sort of legislation tracker

1:49:32

could be very important for us.

1:49:33

And yeah, like I think following this is very important.

1:49:37

And what I do have to say though is that,

1:49:40

well, currently a lot of these bills

1:49:41

are stalled despite,

1:49:43

well, what was originally bipartisan support, right?

1:49:45

You know, both Democrats and Republicans

1:49:47

seemed to be very, very

1:49:50

interested in protecting children,

1:49:51

and you know, like supporting age verification, whatever.

1:49:54

But there's always minor details

1:49:55

that cause some kind of friction, of course,

1:49:57

'cause you know, what is free speech?

1:49:58

What is censorship?

1:49:59

You know, there's always a partisan answer to this.

1:50:01

When really, it should be, you know, very

1:50:03

nonpartisan, but hey, it is how it is.

1:50:05

And I anticipate that, you know,

1:50:07

in Congress things will slow down,

1:50:09

but that doesn't mean you should not keep an eye on things.

1:50:11

And do lobby, not like a professional lobbyist,

1:50:13

but call your local representatives.

1:50:15

And try to look at what the grassroots organizations

1:50:17

are organizing against these horrible bills out there.

1:50:21

But yes, no substantial updates so far because of the holiday break,

1:50:25

but we will keep you updated as the weeks pass.

1:50:27

So keep in touch.

1:50:31

All right.

1:50:32

And I think since that's everything,

1:50:33

I'll go ahead and take us out.

1:50:36

All the updates from This Week in Privacy

1:50:38

will be shared on our blog as well as this video.

1:50:41

So subscribe with your favorite RSS reader if you want to stay tuned on that.

1:50:45

For people who prefer audio, we have a podcast style recording of these updates every week

1:50:49

and the video will also be available on PeerTube at a later date.

1:50:53

Privacy Guides is an impartial, nonprofit organization that is focused on building a strong privacy

1:50:57

advocacy community and delivering the best digital privacy and consumer technology rights

1:51:02

advice on the internet.

1:51:03

If you want to support our mission, then you can make a donation on our website, privacyguides.org.

1:51:08

To make a donation, click on the red heart icon located in the top right corner of the

1:51:12

page.

1:51:12

You can contribute using standard fiat currency via a debit or credit card, or you can opt

1:51:17

to donate anonymously using Monero or with your favorite cryptocurrency.

1:51:21

Becoming a paid member unlocks exclusive perks like early access to video content and priority

1:51:25

during the livestream Q&A.

1:51:27

You also get a cool badge on your profile in the Privacy Guides forum and the warm fuzzy

1:51:31

feeling of supporting independent media.

1:51:32

So thank you all for watching and for your great questions and support, and we will see

1:51:37

you next week.

1:51:37

[MUSIC PLAYING]

1:52:07

(gentle music)