Discord Wants Your ID!
Ep. 40


Episode description

Discord is facing backlash after requiring users to provide ID to access age-restricted servers and channels. This comes after a data breach exposed 70,000 Discord users' government ID documents. Also: Apple is moving to remove anonymous chat apps from the App Store, and much more! Join us for This Week In Privacy #40.

0:02

Do I need to hit go live or

0:03

you can...

0:24

Welcome back to This Week in Privacy,

0:27

our weekly series where we discuss the

0:29

latest updates with what we've been

0:31

working on within the Privacy Guides

0:33

community and this week's top stories in

0:36

the data privacy and cybersecurity space,

0:39

including Discord's new age verification

0:42

push.

0:43

Both Google and Amazon doorbells are in

0:45

the mainstream headlines and a reminder

0:48

about DHS social media surveillance.

0:51

I'm Jordan and this week I'm joined by

0:54

Nate.

0:56

Hello.

0:57

Privacy Guides is a nonprofit which

0:59

researches and shares privacy related

1:01

information and facilitates a community on

1:04

our forum and Matrix where people can ask

1:06

questions and get advice about staying

1:08

private online and preserving their

1:11

digital rights.

1:13

Now let's get straight into the biggest

1:15

news in privacy and security from the past

1:19

week.

1:22

Alrighty.

1:23

Thank you so much, Jordan.

1:25

Our first story this week,

1:27

we're going to talk about Discord's new

1:29

age verification push.

1:32

A lot of you guys may have already

1:33

seen this.

1:34

This has really been making waves online,

1:36

or at least in a lot of the

1:37

spaces I'm in, which includes Discord.

1:39

So maybe this is a bit of a

1:40

sampling bias,

1:41

but this has definitely been in the

1:43

headlines.

1:44

So earlier this week,

1:45

Discord announced — um, I mean, really, that's

1:49

that's it. They announced that moving

1:50

forward all accounts are by default going

1:54

to be treated as teen accounts which means

1:57

that you will be severely limited in

1:59

certain functionality, excuse me.

2:02

Um, I have read so, so many articles

2:06

this week so let me try and see

2:08

if I can

2:10

see if this one mentions, um,

2:12

what the restrictions are on a teenage

2:13

account.

2:14

I mean,

2:14

obviously there's things like if an,

2:15

if a server is marked as, like, age-restricted.

2:17

Um,

2:17

which I don't know how obvious this is

2:21

to some of you guys, but like,

2:22

I'm in like a true crime server,

2:23

for example.

2:24

So that would be eighteen-plus,

2:25

so it's not just porn.

2:27

It could be anything.

2:28

And there's also like certain messages

2:32

won't go through by default and they get

2:33

filtered and stuff like that.

2:35

I'm not seeing it here.

2:38

But anyways, yeah,

2:39

it basically just restricts your account

2:41

by default.

2:42

However, you can age verify.

2:45

There's kind of a lot to this story

2:46

and I'm having a hard time putting things

2:47

in the proper order here.

2:48

But

2:49

So it's,

2:52

let's just go ahead and say this.

2:55

Some people have accused Discord of

2:57

backpedaling because they started,

3:00

there was another article that came out

3:02

about midweek where Discord basically

3:04

emphasized that

3:05

You don't necessarily have to submit ID

3:08

because they're going to try and verify

3:10

your age automatically.

3:12

That was part of the initial announcement.

3:14

So personally,

3:15

I don't know if I would call that

3:16

backpedaling,

3:16

but I get where you're coming from that

3:18

like they definitely put more emphasis on

3:19

that in the middle of the week when

3:21

this got met with a lot of pushback.

3:22

They were like, no, guys, relax.

3:23

They said most people won't have to verify

3:25

their IDs.

3:27

Um, so let me,

3:28

let me talk about that part real quick.

3:30

Um, by default,

3:32

Discord is going to try to guess your

3:34

age using account metadata.

3:36

So that would be things like how long

3:38

you've had the account,

3:39

what sort of games you play,

3:40

what time of day you're generally active

3:42

and stuff like that.

3:43

If they get it wrong, that's when you

3:45

will have to verify ID.

3:48

or if they can't determine your age.

3:51

Or I guess more accurately,

3:52

you won't have to,

3:53

you'll just get reverted to a teen

3:54

account.

3:54

And if you're fine with that,

3:55

then you're fine with that.

3:56

But depending on what kind of servers

3:58

you're in, again, that may get you booted.

4:02

So this is where things start to get

4:04

dicey, as you can imagine.

4:05

So this article we've chosen here from Ars

4:08

Technica,

4:09

I chose this one specifically not because

4:11

of the headline,

4:12

but because this one also has a really

4:14

deep dive into Discord's verification

4:18

system and how it's supposed to work.

4:21

Apparently...

4:23

There's two ways of doing it.

4:24

One of them is a biometric scan,

4:26

which they claim will be totally on

4:29

device.

4:29

It'll never leave your device.

4:30

It's like a video selfie.

4:32

And then if your phone determines that you

4:34

are eighteen,

4:35

it just sends a yes or no back

4:36

to the server.
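To make that "only a yes or no leaves the device" idea concrete, here is a minimal Python sketch. It is purely illustrative: the local age-estimation function, the signing key, and the field names are hypothetical stand-ins, not Discord's or k-ID's actual implementation.

```python
# Hypothetical sketch of an on-device age attestation. Nothing here reflects
# Discord's or k-ID's real API; it only shows the shape of the idea:
# the selfie is analyzed locally, and only a signed yes/no leaves the device.
import hashlib
import hmac
import json

DEVICE_KEY = b"stand-in-for-a-hardware-backed-key"  # assumption, not a real key


def estimate_age_locally(selfie_frames) -> int:
    """Placeholder for an age-estimation model that runs entirely on the phone."""
    return 27


def build_attestation(selfie_frames, threshold: int = 18) -> dict:
    over_threshold = estimate_age_locally(selfie_frames) >= threshold
    payload = json.dumps({"over_threshold": over_threshold, "threshold": threshold})
    signature = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    # Only this boolean plus a signature would be sent back;
    # the selfie itself never leaves the device.
    return {"payload": payload, "signature": signature}


print(build_attestation(selfie_frames=[]))
```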

4:38

And as you can imagine,

4:39

this is going to affect younger users most

4:42

and people who look young, because I mean,

4:43

I obviously look way above eighteen,

4:45

but I feel like I've told this story

4:48

somewhere before.

4:49

When I was in college in my English

4:50

class,

4:50

I sat behind a girl that I swear

4:52

to God,

4:52

I thought she was like twenty and

4:54

apparently she was forty.

4:55

So some people just don't look their age

4:57

and especially when you get younger and

4:59

it's like, OK,

5:00

the line between seventeen and eighteen is

5:01

literally a day.

5:03

So how's Discord supposed to guess that

5:05

accurately?

5:07

I guess I'm getting ahead of myself.

5:08

But anyways,

5:09

so you can do the face scan,

5:10

which if they are to be believed,

5:11

stays on your device.

5:13

It just sends a yes or no back

5:14

to the server.

5:15

If the face scan doesn't work,

5:16

then you have to submit ID,

5:19

which they say is deleted as soon as

5:20

possible.

5:20

But they don't really give guarantees on

5:23

what that is.

5:24

They just say like,

5:25

we promise we delete it as soon as

5:26

we're done.

5:28

That might be immediately.

5:29

That might be a couple days.

5:31

It's really hard to know.

5:32

And for those of you who missed it,

5:34

I just do really quick want to address

5:35

the headline for audio listeners.

5:37

The headline of this article we chose is

5:38

Discord faces backlash over age checks

5:41

after data breach exposed seventy

5:42

thousand IDs.

5:43

That happened last year.

5:45

I want to say late last year,

5:46

but it may have been early last year.

5:50

Oh, no, no, no.

5:51

In October, our senior security editor,

5:54

Dan Goodin, joined others,

5:55

warning the best advice is to assume they

5:57

have had their data stolen.

5:58

So I think that was late last year.

5:59

Anyways,

6:01

I did a second look into that story

6:03

in preparation for this one,

6:04

and I think that was also the result

6:08

of –

6:10

What's the word I'm looking for?

6:11

The UK's Online Safety Act.

6:14

Discord already did this once over in the

6:16

UK.

6:17

Those seventy thousand IDs were part of

6:18

the appeals process.

6:19

That's what I'm looking for.

6:20

That's that's where these came from is

6:22

people who were submitting appeals,

6:24

submitted their ID.

6:26

And I'm assuming because they had so many

6:27

IDs to go through and there was such

6:29

a backlog that required them to, you know,

6:32

they were stuck with a backlog of IDs,

6:36

a stack of IDs.

6:37

That's what I was looking for.

6:40

Yeah,

6:41

that was also with a different provider.

6:43

I don't know that that really matters.

6:44

This whole breakdown here from Ars

6:46

Technica is really interesting because

6:48

it's almost like a Matryoshka doll of

6:50

different companies.

6:51

Like they say that they're going through

6:54

k-ID,

6:55

but then k-ID passes it on to somebody

6:59

else.

6:59

Where did it go?

7:00

Privately — that's the name of the company.

7:01

k-ID does not receive personal data from

7:03

Discord when performing age assurance.

7:07

And then it's – yeah, it was weird.

7:08

It's like every – again,

7:10

Ars has this really good write-up here

7:11

where like when you look at one company's

7:13

privacy policy,

7:14

they say that they work with this other

7:16

company who works with another company.

7:18

And it's – I don't know.

7:20

It's really weird.

7:21

But eventually it does lead back to this

7:22

company in the EU who says that they

7:23

employed a double-blind implementation

7:25

where basically they never know what your

7:27

account is.

7:28

They're just verifying your age.

7:29

Yeah.

7:31

And there's something wrong with the CSS

7:32

here that some of this text is black

7:34

on my screen for some reason,

7:35

so it blends in with the background.

7:39

But yeah, it's a whole thing.

7:43

I think that kind of sums up the

7:44

facts of the story.

7:46

This has faced immense backlash.

7:49

I've already seen one script floating

7:52

around that I don't know if it works

7:54

or not,

7:54

but it claims that it basically sends the

7:56

yes check back to k-ID and verifies you

7:58

now.

8:00

We don't know if Discord is going to

8:02

invalidate that by the time March rolls

8:05

around,

8:05

because I think this is supposed to take

8:06

place in March.

8:08

Some of the servers I'm in,

8:09

we've definitely been having discussions

8:11

about other platforms.

8:15

And, oh, man.

8:18

I think, unfortunately,

8:19

the best platform – and I'll say why

8:21

I say unfortunately in a second.

8:23

I think, unfortunately,

8:23

the best platform we have right now is

8:24

Matrix.

8:26

And I mean nothing against the Matrix

8:28

people,

8:28

but it's – and they even admit it.

8:29

They have their own blog post.

8:31

I'll see if I can pull it up

8:32

while I'm talking here.

8:33

But they have their own blog post where

8:35

they're kind of welcoming people who are

8:36

joining Matrix and Element for the first

8:39

time.

8:39

And they admit that they're like,

8:41

we're not really a drop-in replacement

8:43

right now because they've been so busy

8:46

prioritizing.

8:47

They have a lot of public service

8:48

contracts in –

8:50

Europe, and that's what's been paying the

8:52

bills to hire more developers, which I

8:53

totally get but that also means that they

8:56

haven't really had the time to um to

9:00

dedicate towards — man, I'm really not

9:01

finding this blog right now. I tried

9:03

Element and Matrix. I'll find it later, but

9:05

they haven't had the time to really

9:06

prioritize uh some of the things that you

9:09

would expect if you're coming from Discord

9:11

for the first time: things like custom

9:13

emojis, things like, uh, streaming games,

9:15

voice chat — I think some servers support

9:17

voice chat, but not all of them.

9:20

And they also admit in their blog post

9:22

that they are also, as a UK entity,

9:25

trying to figure out the best way to

9:28

implement age verification to comply with

9:30

UK law.

9:32

They do mention that in their blog post

9:33

that some servers may still have to do

9:35

that depending on where they're located.

9:37

Yeah,

9:38

I guess I think that's kind of all

9:40

I've got at the moment regarding this.

9:44

Matrix is not perfect,

9:45

but the thing it has going for it,

9:46

I guess,

9:46

is to kind of finish that thought is

9:48

it is fully open source.

9:50

There is end to end encryption available,

9:52

especially in DMs.

9:53

They're on by default in one-on-one chats.

9:55

They can be turned on in rooms if

9:57

you feel you need that for some reason.

9:59

It can be decentralized.

10:00

You can host your own server.

10:02

So, I mean,

10:02

it's definitely a step up from a privacy

10:04

perspective.

10:04

It's just,

10:06

I think it would be a tough sell

10:07

to get a lot of longtime Discord users

10:10

onto Matrix because of the feature set

10:15

that it's missing.

10:16

But I've, again,

10:17

I'm in some servers where we've been

10:18

looking into some other stuff and like all

10:21

of the other things that I see or

10:22

I have seen personally that are like,

10:26

that look really good,

10:26

that look like they might actually be a

10:28

good replacement from an end user

10:29

perspective,

10:31

it's just repeating the cycle.

10:33

They're all like,

10:34

they just raised nine million dollars in

10:35

VC funding, or they're closed source,

10:38

they're this, they're that, and it's like,

10:39

cool, so in five years,

10:41

we're gonna be right back here where we

10:42

started.

10:45

I think that's kind of all I've got

10:46

for now.

10:48

Jordan,

10:48

did you have any thoughts on this story?

10:51

Yeah.

10:52

So I think the first thing I want

10:54

to talk about here is, I guess,

10:58

the privacy concerns with this,

11:00

because I've seen some people saying that,

11:03

you know, it's not,

11:05

they're not requiring ID from everybody.

11:09

So that's okay.

11:10

Right.

11:11

But I think one issue with that idea,

11:15

right, is the way these like age,

11:17

basically they use some sort of age

11:19

estimation technology,

11:21

which is like

11:22

based on a data set of what people

11:26

look like at a certain age, right?

11:28

And it's kind of been like a long

11:32

studied thing where we've been able to

11:34

find out that these age estimation tools

11:38

are not very good at estimating different

11:42

types of people.

11:42

Like for example, it's people of color

11:48

who are women,

11:49

like there's been like studies that have

11:52

been done on that.

11:53

And it doesn't,

11:53

it doesn't estimate age correctly for

11:55

those people.

11:57

And, you know,

11:59

the more of a minority you are,

12:01

the less likely it is to be correct.

12:03

Right.

12:04

So it's,

12:06

it's kind of problematic in that way.

12:07

And also these platforms have to

12:09

basically, you know,

12:10

you have to show your face and submit

12:12

a biometric scan, which,

12:16

These platforms say that that scan is not

12:20

saved or used to...

12:25

you know, improve their services.

12:27

But I think at a certain point we're

12:29

seeing, you know, all these people,

12:30

like I remember last year when people were

12:33

facing the Discord age estimation

12:35

technology,

12:36

a lot of people were using Death Stranding's

12:38

photo mode to bypass it.

12:41

But I think that when,

12:42

when they have a lot of these,

12:44

you know,

12:44

people using video games and stuff like

12:46

that,

12:47

we're going to find that a lot of

12:48

these people,

12:50

companies are going to have to start you

12:52

know, actually saving some of that, because

12:55

there'll be people bypassing it with that

12:57

technology so they need to be able to

12:59

control that, right? Um, so I think the

13:03

problem is still there even if you say

13:06

you know, not everyone will have to. Um,

13:09

I think

13:10

It's also important to mention that this

13:12

was only to access, like,

13:14

not safe for work channels and servers.

13:17

So I don't think your access to the

13:20

entirety of Discord would be restricted.

13:22

You just wouldn't be able to access not

13:24

safe for work channels,

13:25

which I think that's another debate.

13:27

Like,

13:28

what exactly is classified as not safe for

13:30

work?

13:31

Because, you know,

13:32

I think that could be...

13:34

deemed different for a lot of people.

13:37

And, you know,

13:38

Discord could decide to set a server to

13:40

not safe for work,

13:41

which could kind of

13:44

force people to verify their ID.

13:47

So Discord is in sort of like a

13:48

very centralized position, right?

13:50

Where

13:53

like Nate talked about,

13:54

Matrix is a decentralized alternative

13:57

where if the default matrix.org home

14:01

server decided to implement age

14:04

verification,

14:05

then you would actually have the choice to

14:08

switch to a different home server which

14:11

wasn't applying the same restrictions,

14:13

right?
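As a concrete illustration of that point, here is a minimal sketch using the matrix-nio Python library: the homeserver is just a URL you hand to the client, so a community can move to, or self-host, a different server without changing apps. The homeserver URL, account, password, and room ID below are placeholders, not real services.

```python
# Minimal matrix-nio sketch: the homeserver is just a parameter.
# All names below (URL, user, password, room ID) are placeholders.
import asyncio
from nio import AsyncClient


async def main():
    # Point this at matrix.org, a community-run homeserver, or one you self-host.
    client = AsyncClient("https://example-homeserver.org", "@alice:example-homeserver.org")
    await client.login("placeholder-password")
    await client.room_send(
        room_id="!someroom:example-homeserver.org",
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": "Hello from a homeserver I chose myself"},
    )
    await client.close()


asyncio.run(main())
```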

14:14

So

14:16

I think Matrix is definitely a step in

14:19

the right direction,

14:20

but I think it's very much missing a

14:23

lot of the key features that Discord has.

14:26

And, you know,

14:26

a lot of people who use Discord rely

14:29

on all the time.

14:31

Like, you know, people do...

14:34

like they watch movies together on Discord

14:37

or they want to screen share to their

14:39

friends or be in a massive group call.

14:42

And as far as I know,

14:45

a lot of those features in Matrix are

14:49

not as mature or very patchy or not

14:52

really applied in the same way that

14:56

Discord does it.

14:57

So

14:59

I think a lot of people are going

15:00

to be kind of unhappy with Matrix as

15:03

a replacement for Discord, um,

15:05

which I think is just highlighting another

15:07

issue where I think we need more diversity

15:12

between platforms.

15:14

Like I think it would be better if

15:15

we had, uh,

15:18

you know,

15:18

more alternatives to Discord that actually

15:20

took privacy concerns seriously,

15:23

because Discord, like, has never really

15:28

been a platform that has cared about data

15:30

privacy.

15:32

And we can kind of tell now that

15:34

that's never really been a priority

15:35

because they're fine with pushing forward

15:37

with all these age checks, right?

15:40

So I don't know, it's kind of,

15:43

an unfortunate situation to be in for a

15:45

lot of people because, you know,

15:47

some people rely on these channels and

15:49

servers and it's going to basically mean

15:51

either give up your ID or

15:55

miss out on that entire community,

15:57

which for some people is just not

15:58

possible.

15:59

So I do think this is sort of

16:01

very coercive.

16:02

It's sort of forcing people into giving up

16:05

biometric scans of their faces.

16:08

And even then,

16:10

your biometric scan of your face may not

16:11

actually approve you.

16:12

So you might actually have to upload ID

16:14

as well.

16:16

So overall,

16:18

I think this is a very bad move

16:21

from a lot of companies that are trying

16:23

to push this now.

16:25

It's only going to be a matter of

16:27

time.

16:27

Like last year,

16:28

we saw the breach with seventy thousand

16:29

IDs from Discord.

16:31

And I think it's only a matter of

16:32

time before these ID verification companies get

16:36

compromised in some way.

16:37

It's a treasure trove of data for hackers

16:40

to go after because they are dealing with

16:43

such sensitive information.

16:45

Um, so I think it's, I dunno,

16:49

I'm really annoyed by this.

16:51

I think more people should be talking

16:52

about it.

16:53

Um,

16:53

I did put together some posts on social

16:55

media,

16:55

kind of pushing people towards

16:57

alternatives for their communities.

16:59

Um,

17:00

so I don't know if anyone saw that,

17:02

but, um, I think it's,

17:04

it's also a problem, uh,

17:05

with a lot of these, you know,

17:07

community setups where you need to

17:10

basically

17:11

advocate for these platforms in the

17:13

entirety of the community,

17:14

because if not everyone moves,

17:17

then you basically,

17:20

you're not going to be able to use

17:21

the platform.

17:24

So, you know,

17:25

it's kind of a frustrating space to be

17:27

in right now.

17:28

But yeah, what do you think, Nate?

17:33

Yeah, for sure.

17:35

In regards to your post, actually,

17:36

that was included in the newsletter this

17:39

week,

17:40

So if you all are subscribed to the

17:41

newsletter,

17:43

you definitely got a direct link to that

17:44

on Mastodon.

17:45

And I also included a link to our

17:50

social media tab on the website for other

17:52

platforms that people could share it at.

17:53

But I just had a couple of quick

17:57

thoughts.

17:57

So you're right in terms of what

18:01

servers are defined as eighteen and up.

18:03

In some cases, users can pick that.

18:07

When you make the server,

18:08

you can check if it's eighteen and up

18:10

or not.

18:11

But Discord does also have a mechanism to

18:14

automatically determine that the server is

18:16

eighteen and up.

18:17

So on the one hand, it's like, well,

18:20

it's only eighteen and up if people choose to

18:22

make it that.

18:22

But on the other hand,

18:23

Discord does have like an automated way.

18:25

So you're definitely...

18:26

You make a really good point there where

18:28

that could turn into a problem later.

18:30

This is the same issue we've cited with

18:31

things like on-device scanning of photos,

18:33

right?

18:34

Sure,

18:35

right now it could be used for totally

18:37

legitimate things like detecting CSAM and

18:39

abuse,

18:40

but what happens when an authoritarian

18:41

government gets their hands on it and

18:43

says, oh,

18:43

now we want to identify people who were

18:44

at a protest and stuff like that.

18:46

So it could potentially be a slippery

18:49

slope for sure.

18:52

Um, real quick,

18:53

I did actually go and find the

18:55

restrictions for teen accounts.

18:57

Where did that go?

19:00

Um, so there's content filters.

19:02

We talked about that with the servers, um,

19:04

or no, no, no, uh,

19:05

sensitive content will be blurred by

19:06

default, and you will need to be,

19:09

quote unquote, age assured, in order to

19:11

unblur sensitive content or turn off the

19:13

setting.

19:14

Age-gated spaces,

19:15

we already talked about that.

19:16

Channels, servers, and app commands.

19:18

Direct messages from people a user may not

19:20

know are routed to a separate inbox by

19:22

default,

19:22

and access to modify this setting is

19:24

limited to age-assured users.

19:28

People will receive warning prompts for

19:30

friend requests from users they may not

19:31

know,

19:31

and only age-assured adults may speak

19:33

onstage in servers.

19:36

I don't even know what onstage is,

19:37

but yeah.

19:38

And then, yeah, real quick,

19:41

I also found you sent me that Matrix

19:43

blog post, thank you.

19:44

And some of the things they say that

19:45

they're missing is they're missing things

19:47

like game streaming, push to talk,

19:49

voice channels, custom emojis,

19:51

extensible presence,

19:52

richer hierarchical moderation, et cetera.

19:54

And not to make this like a Matrix

19:57

hate session,

19:57

but I think to me personally,

20:00

I ran my own community a couple of

20:01

years ago and that was by a wide

20:04

margin,

20:06

the biggest thing that I struggled with

20:07

was that

20:09

the moderation tools in Matrix are —

20:13

honestly, to call them bare-bones is being

20:15

generous — because it's only the things you would

20:16

expect on any given platform.

20:17

Like you can ban a user from the

20:19

room,

20:20

you can create moderators and you can set

20:22

moderator permissions.

20:23

And that's about it.

20:25

And then, and I mean, to be fair,

20:27

I would argue Discord's built-in

20:28

moderation tools are also absolute trash,

20:31

but Discord also has a lot of third

20:32

party bots that are really good.

20:35

At least some of them are really good.

20:37

And again,

20:38

I'm not trying to crap on the Matrix

20:40

people, but like the Matrix bots,

20:42

and I'm told,

20:42

I don't know if this has changed because

20:44

I haven't really been active on Matrix in

20:45

a long time,

20:46

but at the time that I was struggling

20:48

with Matrix,

20:49

I was told that the bot makers were

20:52

really hindered by the APIs they could get

20:54

from Matrix.

20:55

So for example,

20:56

On Discord,

20:58

there are bots that will automatically

21:00

kick somebody out if their account is

21:01

under a certain age.

21:02

Like if the account was made less than

21:04

a week ago or two weeks ago,

21:05

automatically boot it,

21:06

which is a great way to get rid

21:07

of spammers because most of them just make

21:08

a new account and join.
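As a rough illustration of the kind of bot Nate is describing, here is a minimal sketch using the discord.py library. The two-week cutoff and the bot token are placeholder assumptions, and this is not any particular bot's actual code.

```python
# Sketch of a moderation bot that kicks accounts younger than a cutoff.
# The cutoff and token are placeholders for illustration only.
import datetime

import discord

MIN_ACCOUNT_AGE = datetime.timedelta(weeks=2)

intents = discord.Intents.default()
intents.members = True  # required to receive member-join events
client = discord.Client(intents=intents)


@client.event
async def on_member_join(member: discord.Member):
    # created_at is derived from the account's snowflake ID, so the bot knows
    # how old the account is as soon as the member joins.
    account_age = discord.utils.utcnow() - member.created_at
    if account_age < MIN_ACCOUNT_AGE:
        await member.kick(reason="Account is younger than this server's minimum age")


client.run("YOUR_BOT_TOKEN")  # placeholder token
```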

21:10

At the time that I was active on

21:11

Matrix,

21:12

the bot makers couldn't access that

21:13

information.

21:14

So they're like,

21:14

we don't know how old an account is.

21:15

We can't tell the bot to boot anybody

21:18

who's under a certain age, which is...

21:20

I don't know.

21:22

And that's another concern of mine

21:23

because...

21:25

that was the big complaint with Mastodon,

21:26

at least as I remember it, when Elon

21:28

bought Twitter and everybody was leaving

21:30

Twitter, and I remember a lot of people

21:32

saying, like, oh, there's not any good

21:33

moderation tools on Mastodon, a lot of

21:36

minority people don't feel safe there.

21:39

I don't know how Twitter was not worse

21:41

than that.

21:42

No, it was Bluesky.

21:43

Everybody was going to Bluesky because

21:44

apparently Bluesky had moderation tools.

21:46

I don't know.

21:47

But anyways, it's like,

21:48

I feel like it's the same thing with

21:49

Matrix.

21:49

Like if there are not these tools to

21:51

keep your users and your community safe,

21:53

nobody's going to stick around there.

21:55

Like I,

21:55

I got my brother on Matrix for a

21:56

while and he left after a while because

21:58

he's like, yeah,

21:58

I got tired of seeing people spam rooms

22:00

with CSAM.

22:01

Like the — yeah, all right. Um, anyways, it's

22:06

just not a good landscape, unfortunately, I

22:09

think. Um, oh, and then there was — sorry,

22:13

real quick, there's one last thing I do

22:14

want to mention. Jonah raised a

22:18

point in the chat earlier this week that

22:21

some people think this is about Discord

22:24

getting ready for IPO, because Discord is

22:26

getting ready to go public and

22:30

I mean, this totally tracks in my book.

22:32

Some people think that this is Discord

22:33

trying to verify how many actual humans

22:35

they have on their platform.

22:37

Because if they can show like, hey,

22:38

ninety percent of our users are real

22:40

people and we verified them and we know

22:42

that that makes their company that much

22:44

more valuable.

22:45

So I just want to make sure we

22:46

we presented that argument.

22:49

Or that point.

22:51

Yeah, I don't know.

22:52

I think a lot of people that have

22:53

been using like I was one of the

22:55

original people who was like using Discord

22:57

back in, I think, twenty fifteen.

23:01

And I feel like back then it was

23:03

a much better platform than it is now,

23:05

even though it had a lot less features,

23:08

which is kind of unfortunate because I

23:09

think since about, you know,

23:11

twenty eighteen, twenty nineteen,

23:13

they've started adding all these like

23:15

obviously they have to try and make money.

23:17

This isn't like they've been dumping money

23:20

into keeping this chat service alive,

23:22

keeping it freemium.

23:24

I think it's kind of

23:26

inevitable that they're going to

23:28

make it like really crap and add a

23:30

bunch of annoying, uh, paid stuff. Um, so

23:33

I mean, it definitely makes sense that

23:36

they're trying to verify everyone's a real

23:37

person. Um, but I think one thing

23:40

we should definitely, uh, check in here is

23:44

we did get quite a few comments

23:45

just while we were discussing stuff here

23:48

so, um, I think I want to cover

23:50

this one here from Blind Goose on Twitter

23:53

um

23:55

I think Apple and Google need to work

23:56

on a digital ID where they verify your

23:59

ID card, store it on your phone.

24:01

They can with permission share just your

24:03

age or more identifiable information if

24:05

needed.

24:06

When applying for a credit card,

24:07

the storage can be a hundred percent on

24:09

device,

24:09

meaning not even Apple or Google servers

24:11

have that data.

24:13

I mean, I think that would be.

24:18

I mean,

24:18

that would be the ideal situation for a

24:20

digital ID,

24:21

but I think there's other issues with

24:24

digital ID systems.

24:27

Like, for instance,

24:28

we've already seen this a little bit,

24:30

but a lot of websites that are being

24:33

restricted aren't actually technically not

24:38

safe for work or not safe for children.

24:42

So it's

24:45

it's,

24:46

it's gotta be done in a way that

24:48

I mean,

24:49

I would prefer if it didn't exist because

24:51

you know,

24:51

there's also that fact that there'll be

24:54

Apple and Google are kind of becoming the,

24:57

uh, what do you call it?

24:59

Um,

25:02

the people who decide — gatekeepers, yeah, the

25:04

people who decide if you're actually able

25:07

to access something. Um, and I think that's

25:10

probably not the greatest outcome. Um, we

25:13

should be trying to push for things to

25:16

be freely available. Um, same thing with the

25:20

Discord communities, you know. Um, so

25:23

I think it would be good if it

25:26

was the only option,

25:27

but it isn't the only option.

25:28

There's also the option of not having age

25:30

verification.

25:31

So maybe we should do that instead.

25:35

But that's my thought on it.

25:37

And did you see any other questions here

25:39

that we should probably quickly cover?

25:42

Yeah,

25:42

there's one other I want to – or

25:45

two, but one of them is really quick.

25:46

And real quick,

25:47

I just want to say I agree with

25:48

you.

25:49

Like this is one of the things that

25:50

frustrates me so much about age

25:51

verification is we could have – we

25:54

absolutely have the technical ability to

25:56

have solutions.

25:58

that are much more privacy respecting,

26:00

like what discord is proposing here,

26:01

where like everything stays on device and

26:03

all they get is a yes or no

26:04

token.

26:05

And I'm not saying face scan.

26:06

And like you said,

26:07

there's definitely other problems with age

26:09

verification, but it's like,

26:10

this is just one more piece of the

26:11

puzzle.

26:12

It's like,

26:12

why do we always have to go for

26:13

the worst possible solution?

26:15

But, um,

26:17

Real quick,

26:17

I wanted to point out Anonymous Thirty

26:20

Five pointed out, he said,

26:21

so they have a mechanism for when an

26:23

adult is put in an underage category.

26:25

But what happens when Discord accidentally

26:27

makes a child overage? Which is,

26:30

I think a really good point that I

26:32

haven't heard before,

26:33

and I don't know if it's necessarily like

26:35

a big part of this argument,

26:36

but I certainly think it's a really good

26:37

point.

26:37

Like, what happens when they mess up and

26:38

a

26:39

sixteen-year-old gets access to porn

26:41

servers now.

26:42

And yeah, that's, that's not great.

26:45

And then the last thing I wanted to

26:47

highlight is one of our YouTube users

26:49

says,

26:49

are there any alternatives to discord?

26:51

I mean,

26:51

unfortunately I do still think that matrix

26:54

is the best alternative we have.

26:55

And I know I feel a little bad

26:57

because we've sat here and talked about

26:59

all the things that matrix is missing,

27:01

but you know,

27:01

you mentioned that you used to use Discord

27:03

back when it was still a lot more

27:05

bare bones.

27:06

And yeah,

27:07

I think I saw a similar statement from

27:12

somebody else.

27:12

I can't find it now.

27:13

But I think the issue is that there's

27:17

this concept in psychology called the

27:19

hedonic treadmill where basically if you

27:23

make any amount of money,

27:25

like say you make decent money and then

27:28

you get a raise and now you can

27:29

afford a nicer car or a nicer home

27:32

or whatever.

27:33

it's really hard to take that pay cut

27:35

and downsize.

27:36

Like you could do it,

27:37

obviously it's possible,

27:39

but it's very difficult.

27:41

And especially for people to do it

27:42

willingly.

27:42

And I think that's what's so hard is

27:44

it's really hard for people to,

27:47

now that they've had all these really nice

27:49

features, like these custom emojis, these,

27:52

you know, fancy profiles that they can deck

27:54

out with Nitro and stuff like that, I

27:56

think it's just gonna make it that much

27:58

harder for people to scale back to

28:01

something a little bit more bare bones

28:02

purely in the name of, like — and especially

28:04

in some servers where it's like, well, I

28:06

still have to turn over my ID anyways.

28:07

And, you know, when they're dealing with

28:09

potential encryption key issues which

28:12

I am still having to this day.

28:13

I opened Matrix today for the first

28:15

time in a long time and half the

28:16

messages are stuck encrypted because of key

28:17

issues and, you know, things like that.

28:19

It's just, it's,

28:21

I'm not saying that there aren't people

28:22

who wouldn't be willing to do it.

28:23

I'm just saying,

28:23

I think it's a really tough sell and

28:25

I think that's unfortunate.

28:28

Yeah, I mean, personally,

28:30

I think no one's really talked about this

28:33

so far,

28:34

but I find Signal works perfectly fine as

28:38

like a replacement for Discord.

28:41

You can do group voice chats,

28:44

group video calls.

28:46

I guess it is a benefit of a

28:48

centralized service because, you know,

28:51

There's only one server.

28:53

There's not like a bunch of

28:54

interoperability issues.

28:56

So it could be worth trying Signal out,

28:59

especially now because you don't have to

29:03

share your phone number with people.

29:05

So you can kind of do it anonymously.

29:08

So I think that would be also a

29:09

good alternative to try.

29:11

But I think...

29:14

Yeah,

29:17

there's not a lot of good alternatives

29:19

that do everything because Discord is sort

29:21

of one of those applications where

29:23

everything is just...

29:26

it just has all the features,

29:27

but I would also, uh,

29:29

issue a plea to people,

29:31

please stop putting everything behind a

29:34

discord server.

29:35

Please use a website, use something.

29:37

I don't want to have to sign up

29:38

for discord.

29:39

Like there's so many communities where

29:41

they decide for whatever reason,

29:44

that's their only community is going to be

29:46

on discord.

29:47

They're going to store all the information

29:49

in a discord server.

29:51

Um,

29:52

I think that's ridiculous.

29:53

It's not accessible for everybody because

29:56

you need a Discord account.

29:58

And it's also just a bad way of

30:00

displaying information.

30:01

Like who thought that putting things in

30:04

chat channels and then just like,

30:05

you know,

30:07

having to scroll around and search for

30:09

messages is a good idea.

30:10

Like we have forums for that.

30:14

We have websites for that, like wikis.

30:19

So it's really frustrating that there's a

30:21

whole bunch of information that

30:23

is kind of inaccessible to people that

30:25

don't have discord.

30:26

And I really hope that after this whole

30:29

saga,

30:30

a lot of people are going to start,

30:31

you know,

30:32

moving things to platforms where there's

30:35

no central authority that just decides,

30:39

Oh no, your server's age-restricted now,

30:40

so no children allowed.

30:43

Um,

30:43

and it could be for something completely

30:46

benign.

30:47

So, um,

30:48

I don't know.

30:49

I just,

30:50

I'm just really frustrated with how many

30:52

communities rely so heavily on Discord and

30:55

they don't seem to want to move to

30:57

any other platform.

30:59

And yeah,

31:02

I've kind of just been trying to avoid

31:03

getting Discord,

31:04

but there's so many communities that I

31:06

want to access because there's information

31:08

in there that I need that I can't

31:10

access otherwise.

31:11

So anyway,

31:13

that's just me ranting a little bit.

31:18

I just need to say that regarding your

31:19

first point,

31:20

I totally agree with you because I'm...

31:24

I'll cut it short.

31:25

But anyways,

31:25

I've been in big servers where...

31:28

they just move too fast and I can't

31:30

keep up with them.

31:31

And like,

31:32

I'm literally in one server just for the

31:33

little name tag.

31:34

I'm not,

31:35

I never even go in there because there's

31:37

tens of thousands of users.

31:39

And every time I check,

31:41

there's a conversation going on and I

31:42

don't know what's going on and I don't

31:44

know anybody in there.

31:45

I know that makes it a feedback loop,

31:46

but my point being like, yeah, if you're,

31:48

it drives me insane when they're like, Oh,

31:50

our support channel, our,

31:51

our primary support channel is Discord.

31:53

And it's like,

31:55

Why?

31:55

Like,

31:55

what if I join and I can't get

31:57

help because there's ten thousand other

32:00

people chatting and nobody sees my

32:02

question?

32:02

And I know some of them have like

32:03

a little ticket system that you can open

32:05

things, but it's just yeah,

32:07

I'm not I'm not a fan of that

32:08

either.

32:09

To me, it feels very sloppy.

32:10

It's almost like Reddit,

32:11

like no offense to Reddit,

32:13

some offense to Reddit.

32:14

But when when somebody like a company is

32:16

like, oh, our, you know,

32:17

Reddit is where you follow us or open

32:19

support ticket.

32:20

It's just like that's not a support

32:21

channel, man.

32:21

That's just lazy.

32:22

So I'm with you.

32:23

I don't like that either.

32:26

Yeah,

32:26

there's been so many times and I've been

32:28

like, oh, I'll join this.

32:30

I guess I have to make a Discord

32:32

account.

32:32

I know,

32:33

I'll just use like a burner phone number

32:35

and a burner email and then I sign

32:37

into the server and then I get booted

32:39

instantly because there's an age policy.

32:42

You can't have an account that was just

32:43

created.

32:44

It's just really annoying to deal with

32:47

anyone that has their only community on

32:49

Discord.

32:51

And I think now is the best time

32:53

to diversify.

32:54

Just have another community.

32:56

You don't have to get rid of the

32:57

old one.

32:57

You can open another one somewhere else.

32:59

You can start a wiki

33:01

as well as a Discord.

33:02

You can start a Discourse forum as well

33:04

as a Discord.

33:05

Just have another option because if

33:08

Discord does go to crap,

33:10

then at least you have another option for

33:12

communicating.

33:14

So, anyway.

33:16

Totally.

33:17

It is what it is.

33:18

All right.

33:21

I believe you've got the next story here

33:23

about Google.

33:26

Yes.

33:27

So this next story comes from Ars

33:30

Technica.

33:31

Upgraded Google safety tools can now find

33:34

and remove more of your personal info.

33:37

The Results About You tool is getting an

33:39

upgrade.

33:40

So this is actually a tool that we

33:44

suggest using at Privacy Guides because

33:48

it does enable you to basically enter your

33:52

personal information and have Google

33:54

periodically check Google search results

33:57

for information that appears about you.

34:00

And this is kind of useful because if

34:02

there's things like your address,

34:04

your phone number,

34:05

your email address showing up in Google

34:08

searches, that could be a safety issue.

34:11

So

34:12

That is why we do recommend, you know,

34:16

if you don't have any other option,

34:17

if you don't want to do it like

34:18

manually all the time,

34:20

it is something that we do recommend to

34:22

do because, you know,

34:25

it's going to be able to find that

34:27

information and alert you if it's actually

34:31

found.

34:31

So this is basically an upgrade to that.

34:34

So Google's had that feature for a while,

34:36

but now it's actually getting an extra

34:39

layer of functionality

34:42

So with today's upgrade,

34:43

Results About You gains the ability to

34:45

find and remove pages that include ID

34:48

numbers, like your passport,

34:50

driver's license, and social security.

34:53

And you can access the option to add

34:55

these to Google's ongoing scans from the

34:57

settings in Results About You.

34:59

Just click the ID numbers section to

35:02

enable detection.

35:04

Naturally,

35:05

Google has to know what it's looking for

35:07

to remove it.

35:07

So you need to provide at least part

35:10

of those numbers.

35:11

Google asks for the full driver's license

35:13

number,

35:14

which is fine as it's not as sensitive.

35:16

And for your passport and SSN,

35:18

you only need the last four digits,

35:20

which is good enough for Google to find

35:23

the full numbers on web pages.

35:27

So this is kind of good to see

35:32

that there's more safety tools being built

35:34

into Google search because a lot of times,

35:37

you know,

35:37

things can get indexed that we don't want

35:39

indexed.

35:40

It's good to have control over having

35:41

those listings removed because like I said

35:44

before,

35:44

it can be a safety issue for some

35:46

people.

35:48

But I do think in this case,

35:50

you are trusting Google with that

35:52

information.

35:53

And I don't think there's,

35:56

I think Google is not exactly the most

35:58

trustworthy company and it will definitely

36:01

depend on the level of safety that you

36:03

care about, right?

36:04

So if you're being targeted constantly and

36:08

this is like quite a big threat to

36:09

you,

36:10

then maybe you would be more likely to

36:13

enroll in this program.

36:15

So for instance,

36:15

you're like a public figure or something

36:17

and you probably do want to get notified

36:18

every time something pops up because you

36:21

don't want that listed.

36:22

That could definitely make sense.

36:24

If you're just an everyday person,

36:26

I think just periodically searching your

36:28

name, your address, your phone number,

36:30

that sort of thing is probably enough for

36:32

most people.

36:33

But I think this is definitely an

36:35

interesting thing here.

36:38

I guess having a look at the rest

36:40

of this here,

36:42

it looks like there's a tool that

36:48

identifies explicit images as well as

36:52

deepfakes.

36:53

Um, so that is also another thing that

36:56

you know, a lot of people, uh, you

37:02

know, deal with. Um, I think this is

37:04

also kind of, uh, aimed towards people who

37:08

work in the sex work industry. You know,

37:10

they probably don't want information, uh,

37:13

associated with their work associated with

37:15

their real name. That's definitely

37:17

something that some people would prefer

37:19

not to have. Um, so that is

37:22

something that can remove some of that

37:25

content or, you know,

37:28

at least blur the content.

37:30

So that's another thing that they added as

37:33

well.

37:34

So it's definitely an interesting update.

37:38

And I think in general, this is,

37:42

a good thing,

37:43

even if it's coming from like the worst

37:44

company ever.

37:46

I think this is going to protect a

37:47

lot of people from a lot of issues

37:50

that, you know, certain public figures,

37:53

you know,

37:53

people work in the sex work industry.

37:56

I think that's definitely going to be

37:58

beneficial as well as just public figures

38:00

as well.

38:01

So yeah,

38:02

that's kind of my initial thoughts on this

38:03

story.

38:07

Yeah,

38:08

I will say I don't know if it's

38:10

just aimed at sex workers.

38:11

I don't know how actually prevalent it is.

38:14

I haven't seen any specific numbers,

38:16

but I know 404 Media has

38:18

covered several stories about just normal

38:22

women who have been targeted by deep fakes

38:25

and they've covered.

38:27

I've lost track of how many,

38:28

and I say just normal women in the

38:30

sense of like, they're not famous,

38:31

they're not sex workers.

38:32

They just have the quote unquote crime of

38:34

being attractive.

38:35

And, you know,

38:37

they've been covering stories about tons

38:39

of,

38:39

they call them nudify apps where you can

38:42

feed it a picture and the AI will

38:44

generate what she might look like naked.

38:46

Um, they've covered tons of like, uh,

38:50

Oh man, I just,

38:51

it just slipped out of my head.

38:52

But, um,

38:54

Yeah,

38:54

they've covered tons of stories like that.

38:56

And so unfortunately – and again,

38:59

I haven't seen any statistics,

39:00

and I'm not trying to sound like I'm

39:02

downplaying it.

39:02

I'm just trying to give an honest level

39:05

read.

39:06

I don't know if this is like a

39:07

huge epidemic or if those are just a

39:10

handful of stories that are really

39:12

unfortunate.

39:13

But either way,

39:15

I think this is an actual problem that

39:16

doesn't just affect sex workers.

39:18

It could affect anybody, but –

39:20

I think you still do make a really

39:21

good point of, you know,

39:23

that's trusting Google with a lot of data.

39:25

And I do want to point out that

39:25

Google has really good security.

39:27

Google has never, to my knowledge,

39:29

had a major data breach.

39:32

But you're still trusting them not to use

39:36

that data for advertising, for tracking,

39:39

for all the other things that Google's

39:43

multi-billion dollar empire is built on.

39:45

So yeah, it's definitely a...

39:49

It's definitely one of those cost benefit

39:51

analysis things where you have to ask

39:53

yourself,

39:55

do I think this is enough of a

39:56

problem that I should go ahead and sign

39:58

up for this?

39:59

Or like you said,

40:00

would it be better if I handled this

40:01

myself, if I did an occasional search,

40:05

set myself reminders in my calendar or

40:06

whatever?

40:07

Because yeah,

40:08

that is a lot of sensitive information to

40:10

be handing over to Google for sure.

40:14

No,

40:14

that's a good point you brought up about

40:15

some of those apps like the nudify apps.

40:18

I think that's very much a problem,

40:20

especially with minors.

40:22

A lot of this stuff is, you know,

40:23

happening in high schools.

40:25

It's, you know,

40:26

using these apps on people who are under

40:28

the age of eighteen,

40:30

publishing this information online to kind

40:32

of embarrass people.

40:34

It's...

40:35

It's pretty bad.

40:36

And I think that is another good use

40:38

case here.

40:38

I guess I kind of missed that when

40:40

I was looking over it the first time.

40:42

So I think that's also kind of another

40:45

benefit of this as well.

40:47

But I think, you know, it's...

40:51

At the end of the day,

40:52

this is not a tool for everybody.

40:54

It's a tool for a very specific group

40:58

of people.

41:00

And I think if you...

41:03

are in that group of people,

41:04

then this would make sense.

41:05

Um, but if it's, you know,

41:07

I think a lot of people in our

41:08

community are just not going to trust any

41:12

big tech company, but, uh,

41:16

and especially now, because, you know,

41:18

we've seen a lot of this, uh,

41:20

for instance,

41:22

might talk about this next week,

41:24

but Google handing over, you know,

41:26

data to law enforcement without, you know,

41:29

proper oversight,

41:30

just handing it over and giving them all

41:32

this information doesn't seem like a

41:35

particularly good idea in that case.

41:38

So it's kind of hard to, you know,

41:44

justify

41:46

pushing this unless you're in a very

41:48

specific situation.

41:49

Um, I think like Nate said,

41:51

just doing a Google search of your

41:53

information every now and then is

41:55

definitely gonna be a better way to,

41:59

you know, protect your information.

42:01

But I think this is also a tool

42:04

that is applicable for some people.

42:07

Um,

42:07

so I think it's important that we covered

42:09

it.

42:09

Um,

42:10

but I guess moving on to our next

42:12

story here also from Google,

42:15

Nate, what are we talking about next?

42:21

All right.

42:21

Our next story is about how Google

42:26

recovered deleted footage from a doorbell

42:29

camera, which raises a lot of questions.

42:31

So for anyone who doesn't really follow

42:34

the news super closely,

42:35

there is a woman named Nancy Guthrie.

42:40

I don't know if I'm pronouncing that name

42:41

right.

42:41

I've only read it in articles.

42:42

I haven't seen any videos.

42:45

And she is a mother of three,

42:48

one of whom is a journalist for –

42:50

I believe it's NBC.

42:52

And she was reported missing,

42:55

she being Nancy, not the journalist.

42:57

She was reported missing on February

42:58

first,

42:59

and her family called in a welfare check.

43:02

She's from Arizona if I remember

43:03

correctly.

43:04

And the police showed up and said that

43:07

they had reason to believe that she was

43:09

taken.

43:09

She didn't just wander off or anything.

43:13

And –

43:14

Initially, the police said – okay,

43:16

so a real quick piece of context here.

43:19

Google Nest cameras,

43:20

which are kind of like Google's competitor

43:22

to Ring,

43:23

which we will also talk about here in

43:24

a little bit.

43:26

Google's Nest cameras, by default,

43:28

if you just buy the camera itself and

43:30

you don't buy the service,

43:31

according to this article,

43:32

they save three hours of quote-unquote

43:34

event history, which for the record,

43:37

this is coming from Ars Technica,

43:38

not like Google's actual documentation,

43:40

so I apologize if –

43:41

This is wrong,

43:42

but this is what Ars Technica says.

43:44

Events are anything that, like,

43:46

triggers the doorbell.

43:46

So it wouldn't be three continuous hours

43:48

of history.

43:49

It would be any time in the last

43:50

three hours that, say, you know,

43:52

somebody dropped something off at your

43:53

door, delivery was made,

43:54

somebody knocked on your door, whatever.

43:57

If you pay ten dollars a month,

43:58

you get thirty days of events.

44:00

Twenty dollars gets sixty days of events

44:02

plus ten days of full video.

44:05

Which, I gotta be honest,

44:06

is a pretty good deal.

44:07

But, anyways.

44:09

So they, uh...

44:11

The reason this matters is because Nancy

44:13

Guthrie had a ring doorbell, or excuse me,

44:17

a Nest doorbell.

44:18

And initially investigators, to be fair,

44:22

not Google,

44:23

but investigators said there's no footage

44:25

because she wasn't paying for the service.

44:28

And then suddenly, what is this,

44:30

the thirteenth?

44:30

Three days ago on February tenth,

44:33

Suddenly,

44:33

the police came forward and said,

44:35

actually, here's some footage.

44:37

Does anybody know this man?

44:38

And again, I've only seen screenshots,

44:42

even though it's here in the footage.

44:44

I was very busy all day.

44:46

I couldn't watch videos.

44:47

But excuse me.

44:49

They say the first video shows the person

44:51

approaching the door and noticing the

44:52

doorbell camera.

44:52

They place their hand over the lens and

44:54

appear to pull on the mounting bracket.

44:56

but the cameras have a small security

44:57

screw that makes it difficult to remove

44:58

them without causing damage.

45:00

I don't know why he cared about causing

45:01

damage if he's kidnapping somebody,

45:03

but whatever.

45:03

In the second video,

45:04

the individual seems to try to drape a

45:06

plant over the camera to block its view.

45:07

Both videos are short,

45:08

which is what you would expect from an

45:09

event as identified by the Google Home

45:11

system.

45:12

So they say the video was apparently,

45:15

quote unquote,

45:16

recovered from residual data located in

45:18

backend systems.

45:20

And Ars Technica says it's unclear how

45:21

long such data is retained or how easy

45:23

it is for Google to access it.

45:24

Some reports claim that it took several

45:26

days for Google to recover the data.

45:28

So this story raises a lot of questions.

45:35

I think there's a lot of ways you

45:36

could look at it.

45:36

And I'm not necessarily saying one is

45:39

right or wrong, because honestly,

45:41

I don't really know how I feel about

45:42

this.

45:43

I think on the one hand, there is,

45:46

I think most of our more technical viewers

45:47

know this, but in case you guys didn't,

45:49

when you click delete on something on a

45:52

computer, it typically doesn't delete it.

45:56

It basically tells the computer,

45:59

You can write over this if you want.

46:01

It's kind of like if you guys have

46:03

ever worked at a job where...

46:06

It's like a dumpster.

46:07

Let's put it that way.

46:08

It's like if you're driving through a

46:09

neighborhood and somebody has put a couch

46:11

or a table on the corner...

46:13

It's not necessarily gone yet,

46:15

but they don't want it.

46:15

You can take it if you want.

46:17

It's kind of like that.

46:18

And so just because something is deleted

46:20

in a system doesn't necessarily mean it's

46:22

fully gone.

46:22

Now, the longer it's been,

46:24

the more likely that it has been written

46:26

over by something else and it is actually

46:28

gone.

46:28

But usually,

46:29

especially if it's right away,

46:30

usually you can pull back most or all

46:33

of the data.

46:34

So it's really hard to tell exactly what's

46:37

going on here.

46:38

And I do want to acknowledge we could

46:42

do the conspiracies...

46:44

I don't mean to be rude,

46:44

but we could do the conspiracy theory

46:45

thing where they say, no,

46:46

they had it the whole time.

46:47

They're just pretending this is a story.

46:49

You might be right.

46:50

I'm going to acknowledge that.

46:51

You might be right.

46:51

We don't know.

46:53

It's also equally possible that they just

46:55

have literal warehouses around the world

46:57

full of servers,

46:58

and they got lucky and dug through and

47:01

found some stuff.

47:04

I don't know.

47:04

I really don't know.

47:05

I mean,

47:05

I don't know how many nests there are.

47:07

I don't know how much server storage there

47:09

is.

47:09

It seems like a bit of a stretch

47:10

to me that within a week it wouldn't

47:13

all be overwritten if she wasn't paying.

47:16

It is weird.

47:16

I will admit that.

47:17

But I guess I'm just saying this isn't

47:19

necessarily a smoking gun,

47:20

but it certainly is weird.

47:21

And I think one of the reasons we're

47:24

talking about this is because this is a

47:25

big story that everybody's been talking

47:26

about.

47:27

But also,

47:27

I think this is a good reminder that

47:30

everything I just said,

47:31

when you hit delete,

47:32

something isn't necessarily gone right

47:33

away.

47:34

And that's why it's really important to

47:35

think about things like these days,

47:37

everybody's mostly switched over to solid

47:39

state drives (SSDs).

47:40

And with solid-state drives... well, it used to be that

47:43

When you would delete something,

47:44

the common advice was to use a file

47:46

shredder, which would basically,

47:48

like I talked about,

47:49

it would mark it for deletion,

47:50

and then it would overwrite it a couple

47:51

times to make sure it was really gone.

47:53

But with solid states,

47:55

I guess you could do that,

47:56

but you shouldn't because it really

47:58

reduces the lifespan of the device.

48:00

And instead,

48:00

we rely on things like full disk

48:02

encryption.

48:02

If somebody stormed in right now and took

48:06

my computer and it died,

48:07

they wouldn't be able to get back into

48:09

it without decrypting it.

48:12

And that functionally serves the same

48:14

purpose.
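
As a rough illustration of what an HDD-era file shredder does, and why the advice above is to lean on full-disk encryption instead on solid-state drives, here is a minimal Python sketch. Treat it as a teaching example under those assumptions, not a recommended or guaranteed-secure tool.

```python
import os

def shred(path, passes=3):
    """Naive HDD-era "file shredder": overwrite the file's bytes a few
    times, then unlink it. On SSDs, wear leveling means the old blocks
    may survive anyway (and the extra writes shorten the drive's life),
    which is why full-disk encryption is the usual answer there."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace the contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # push the overwrite out to the device
    os.remove(path)                    # finally unlink the (overwritten) file
```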

48:15

But I guess to kind of bring it

48:19

back around, when you upload anything,

48:21

anything that's cloud connected like this,

48:23

even if they say like,

48:24

we're not gonna keep this stuff,

48:27

let's just go ahead and assume Google was

48:29

being honest here.

48:30

It was still there.

48:31

You know,

48:32

them saying we're not going to keep it

48:34

is not the same as them saying we're

48:35

going to delete it.

48:36

It's them saying like,

48:37

if it gets overwritten,

48:37

it gets overwritten.

48:38

We don't care that you're not paying for

48:40

that service and it could still be there.

48:42

And in this case,

48:43

hopefully it will help find this woman

48:44

safe and sound, you know,

48:45

but there's a lot of other cases where

48:47

it could be a bad thing and it

48:48

could recover something that you wanted to

48:50

stay gone.

48:51

So yeah,

48:52

it's just really important to keep that

48:54

kind of stuff in mind, I think.

48:56

I think those were my thoughts.

48:58

Did you have any other takeaways from the

49:00

story, Jordan?

49:01

Yeah,

49:02

so I think this is sort of a

49:06

confirmation of something that we've been

49:10

assuming but not knowing for quite a while

49:14

Google doesn't really delete things; they

49:17

have data for a long time. They don't

49:21

actually abide by a lot of the policies

49:24

they have. I mean, I just think that

49:26

when it's a company as large as Google,

49:28

just think about the absolute amount of

49:30

data that they have.

49:32

I think it's almost impossible to,

49:34

you know,

49:35

they probably got data centers in

49:36

basically every city in a lot of these

49:39

major countries.

49:40

And it's this whole interconnected network

49:42

where all this information is just like

49:45

zooming around the internet.

49:47

I think it's really hard for them to

49:49

delete a lot of things.

49:51

And I think the same thing goes for

49:52

this, right?

49:53

You know,

49:53

maybe that footage was saved and it was

49:57

saved and backed up across, you know,

49:59

six continents or something,

50:01

because obviously Google doesn't want to

50:02

lose that data because people rely on the

50:05

security camera footage.

50:07

So I think in this case,

50:09

it could have been that the storage of

50:11

this video clip could have been in a

50:14

backup or, you know,

50:16

across a lot of

50:18

data centers.

50:19

And I think all it took really was

50:22

for a high profile case.

50:23

They're obviously not going to do this

50:24

for, you know, your average person.

50:26

But I think because this was such a

50:29

high profile case,

50:30

I'm not entirely sure who this person is.

50:33

So maybe they're not that high profile,

50:35

but they seem to be.

50:37

And

50:38

I think it just shows that Google does

50:41

have the ability to bring back data that

50:43

is allegedly deleted.

50:46

So it brings up more questions about data

50:51

retention,

50:51

like how long is Google really keeping

50:54

things for?

50:55

When they say that your information is

50:56

deleted, how true are they being?

51:01

And I would argue probably not very true.

51:03

It's probably stuck in backups.

51:04

It's probably stuck in

51:06

you know,

51:07

whole systems like training LLMs,

51:10

all sorts of things that, you know,

51:12

we don't have control over.

51:14

So I'm not really surprised that they were

51:16

able to recover this footage.

51:17

I think this is kind of pretty standard

51:19

stuff.

51:20

Um, I'm not sure why people were, uh,

51:23

I guess so outraged.

51:24

I think this is pretty standard for a

51:25

company to have, you know,

51:26

backups lasting quite a long time.

51:30

Um, and you know,

51:31

they're not going to immediately delete

51:33

the footage.

51:34

Um,

51:36

So I think it is possible for them

51:38

to dig up footage in very extreme

51:41

circumstances like this.

51:42

So I don't know.

51:44

I am not particularly surprised by this

51:49

story in particular.

51:54

Yeah, that's fair.

51:55

I don't know.

51:56

I'm...

51:58

Yeah.

51:59

I don't,

51:59

I don't know if I'm surprised or not.

52:00

I think,

52:01

I think when I first heard this story,

52:02

I was kind of like, Oh, that's crazy.

52:04

But I wasn't just like, what?

52:05

Oh my God, that's crazy.

52:07

I was just like, Whoa.

52:08

So yeah, it's yeah.

52:13

I don't think I have much to add.

52:14

I just, I,

52:15

I really agree with your point about once

52:18

they have that data, you,

52:19

you lose control over it functionally one

52:20

way or another, you know,

52:22

like

52:23

for better or worse again even if like

52:25

let's assume just for the sake of argument

52:27

let's assume total good faith on google's

52:29

end what if they do have a data

52:30

breach someday they finally get got and

52:33

you know who knows what'll get leaked

52:35

that's yeah they could use it for ai

52:37

they could use it for anything so it's

52:39

just really important to keep that in mind

52:40

with anything you do online i think yeah

52:43

and i think this brings forward actually

52:45

another thing that we should probably talk

52:47

about is

52:48

The reason why Google had this footage in

52:51

the first place was because the camera was

52:54

not end to end encrypted.

52:56

So I think this is kind of an

52:58

important note.

52:59

Google is kind of known for

53:03

not applying a lot of

53:06

these practices, where, you know,

53:08

companies like Ring, I mean, Ring has other

53:10

problems, we're gonna talk

53:12

about that later, but companies like

53:14

Ring and Apple, you know, they allow you

53:17

to at least enable that encryption. So, you

53:19

know, if it was incriminating information

53:22

or just private, you know, you don't really

53:24

want recordings of yourself

53:26

doing who knows what outside your house or

53:29

inside your house, you know,

53:30

available to a massive corporation,

53:32

which could get breached.

53:34

So I think it's important to,

53:36

if you do have, you know,

53:39

a camera system,

53:40

maybe think about something local,

53:42

maybe think about something that employs

53:44

end-to-end encryption.

53:45

I know a pretty easy option for a

53:47

lot of people who have Apple devices is

53:49

Apple's HomeKit Secure Video, which

53:54

has end-to-end encryption.

53:56

And there's other local alternatives as

53:59

well, like setting up an NVR (network video recorder).
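
To illustrate the end-to-end encryption property being described (the provider only ever stores ciphertext it cannot read), here is a minimal Python sketch using the third-party cryptography package, which is an assumption about your environment. It is not how HomeKit Secure Video or Ring actually implement their encryption; it just shows why a non-E2EE camera leaves viewable footage on the server while an E2EE one does not.

```python
# Minimal sketch of the end-to-end idea: the camera encrypts footage with
# a key that never leaves the home, and the cloud stores only ciphertext.
# Illustrative only -- not HomeKit's or Ring's actual design.
from cryptography.fernet import Fernet  # assumes `pip install cryptography`

home_key = Fernet.generate_key()        # stays on your own devices
camera = Fernet(home_key)

clip = b"raw video bytes from the doorbell"
ciphertext = camera.encrypt(clip)       # this is all that gets uploaded

# The provider can store, back up, and replicate the ciphertext forever,
# but without home_key it cannot view the footage or hand it to anyone.
assert camera.decrypt(ciphertext) == clip
```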

54:01

A lot of people in the comments of

54:02

this article were talking about,

54:04

this is why I only use local surveillance

54:07

systems.

54:07

But I think there's also issues with that

54:10

as well,

54:10

because

54:12

You know, a lot of people don't own

54:15

the house or apartment that they live in.

54:17

They can't just go around drilling holes

54:19

and things and, like, putting cables

54:21

everywhere. That's just unrealistic. So,

54:24

you know, I think it's not always going

54:28

to be a perfect solution. There's no

54:31

perfect solution for everybody, so that's

54:33

why I think you should think about this.

54:35

But I think, you know, you're putting Google...

54:39

you're trusting Google with, like, the

54:41

privacy of your home.

54:42

I think we need to question why you're

54:44

doing that.

54:45

You should probably consider moving to

54:47

something that offers a high level of

54:49

security,

54:50

such as Apple HomeKit secure video or some

54:54

other local alternative.

54:55

I know maybe Jonah would have something to

54:59

add on this.

54:59

I feel like he's kind of into that

55:01

whole like

55:01

home assistant, self-hosted stuff,

55:03

but unfortunately he's not here this week,

55:06

but that's okay.

55:08

I guess I would throw it over to

55:09

you, Nate.

55:10

Do you have any experience with these sort

55:11

of like security systems?

55:15

I don't because I am in that boat

55:18

you talked about where I'm one of those

55:19

people who rents.

55:20

And so I can't really,

55:22

and thankfully some places I've lived

55:24

actually do have policies against outdoor

55:26

cameras like Ring,

55:28

which I think is super awesome.

55:29

But

55:30

Not all of them do.

55:31

And yeah, it's actually my last apartment.

55:34

I remember they had like a ring or

55:37

a nest up in the corner and it

55:38

was facing the only in and out for

55:40

the building and it pissed me off so

55:42

much.

55:43

But yeah, it's really...

55:48

I wish I had more... Hold on,

55:49

let me check.

55:50

I swear I saw somebody post talking about

55:54

one the other day.

55:55

I'm checking a group chat I'm in, but...

55:57

All right.

55:58

While you're doing that,

55:59

I think it's also important to note that

56:01

when you're installing all these security

56:04

systems, like Nate was saying,

56:06

you can invade other people's privacy too.

56:08

It's important that I think some people

56:11

get caught up on

56:13

I'm just protecting my privacy.

56:14

You know, I don't care about other people.

56:16

It's like,

56:17

no other people recording other people

56:19

without their consent is wrong.

56:21

It's, it's not right.

56:22

It's invading their privacy.

56:23

So, you know,

56:24

especially with these camera systems,

56:26

I think it's important to remember that if

56:29

you install a doorbell like this,

56:30

that's pointing out towards the street and

56:33

recording everybody as they enter their

56:35

houses and walk past,

56:36

it's kind of creepy and

56:38

And it's definitely something we're going

56:40

to talk about later.

56:41

So definitely stay tuned for that.

56:44

But Nate,

56:44

did you end up finding that post you're

56:46

looking for?

56:48

I did.

56:48

I'm not going to say who it came

56:50

from because I did not ask permission.

56:51

This is off the cuff,

56:52

but it is somebody who is very

56:54

knowledgeable and knows what they're

56:55

talking about.

56:56

They said that if you need,

56:58

what was it?

56:59

Wyze, W-Y-Z-E.

57:01

They say they recommend it.

57:03

He did specify for mainstream people.

57:05

Um, so this is not, like, the most

57:07

private solution, but he said that it does

57:09

accept SD cards, and you do need to

57:12

download their app to, like, set it up

57:13

initially, but once you set it up, you

57:15

can tell it, like, not to send anything

57:16

to the cloud and only record to an SD

57:18

card. Um, so make sure you mount it

57:20

somewhere where they can't just, like, tear

57:21

it off the wall and take it. And

57:23

then he said that in the past he's

57:26

recommended Zosi for, um...

57:30

Oh,

57:30

he said that's a bigger setup with

57:31

multiple cameras that sends wirelessly to

57:35

a local DVR or hard drive,

57:37

like you see in convenience stores or gas

57:38

stations.

57:39

So that might not be realistic for some

57:42

people, but yeah.

57:44

One more thing that I remembered while you

57:45

were talking,

57:46

you mentioned that ring does have an end

57:49

to end encryption.

57:50

I think you have to enable it.

57:52

I don't think it's enabled by default,

57:53

but they do offer it.

57:55

You mentioned Apple usually does better.

57:56

I actually learned recently that Google,

57:59

um,

58:00

Google's in browser password manager is

58:02

like the only one that's not encrypted by

58:04

default, Google Chrome.

58:06

So like Safari is, Firefox is,

58:09

I should hope Brave is,

58:10

but like Google Chrome is the only browser

58:13

where like you can save passwords in the

58:14

browser,

58:15

but you still have to take that extra

58:17

step to go in and encrypt the password

58:19

manager

58:20

which is completely insane and just backs

58:22

up what you were saying about the fact

58:23

that like Google is a little bit sketchy,

58:26

which is so weird because just a minute

58:27

ago I was like,

58:28

they have some of the best security except

58:30

when they don't.

58:31

It's just so insane.

58:33

So yeah, be mindful of that.

58:35

Google is not always as trustworthy as

58:37

they should be for a company of their

58:39

size.

58:40

Yeah,

58:41

I think it's definitely an unfortunate

58:44

thing, but I think they do have,

58:46

just like I was saying before,

58:47

their infrastructure is quite modern.

58:50

It's quite a large-scale

58:53

operation, so obviously they do have

58:56

some security policies to protect things.

58:59

Um, so it's kind of stupid that they

59:01

don't have end-to-end encryption on

59:03

passwords by default. That's, like, a thing

59:05

you have to enable. Like, they can access

59:08

all your browsing data, like your

59:09

bookmarks and all that; you have to enable

59:11

encryption on that. Um...

59:14

But with that being said,

59:16

let's dive into the site updates this

59:18

week.

59:19

And before we dive into a story about

59:20

how Reddit is being monitored by the DHS,

59:25

let's give some quick updates about what

59:27

we've been working on at Privacy Guides

59:30

this week.

59:32

So throwing it over to you, Nate,

59:34

what have you been working on this week?

59:38

Yeah,

59:39

it's been a lot of behind the scenes

59:41

stuff.

59:42

I'm working on a script for,

59:46

oh my gosh, I should know this,

59:47

private email.

59:49

And yeah,

59:51

so that's what's next in the pipeline for

59:52

me.

59:53

That's been my main focus.

59:54

And then we have private browsing and the

59:58

intermediate smartphone security are both

1:00:01

ready for members.

1:00:01

We're just coordinating with

1:00:03

infrastructure.

1:00:04

I know we keep saying this every week,

1:00:05

but

1:00:06

We're really having some serious issues

1:00:08

with PeerTube.

1:00:09

And I think Jonah is doing his best

1:00:12

to take care of that.

1:00:15

But yeah,

1:00:15

as soon as we get that worked out,

1:00:16

we're going to push those out to members

1:00:20

and then shortly thereafter to the public.

1:00:24

So that is what I have been up

1:00:25

to.

1:00:26

And I believe you have been up to

1:00:27

some stuff as well.

1:00:29

Yes,

1:00:30

so this week I've been working on a

1:00:32

video that Nate put together.

1:00:34

It was a video about private...

1:00:38

messaging so that's another one to look

1:00:40

out for we're kind of covering off these

1:00:41

base topics just to have you know

1:00:44

resources for everyone to access and i was

1:00:47

also working on doing some social posts

1:00:49

this week because i feel like we've been

1:00:50

kind of uh slouching a bit on that

1:00:53

and we haven't been doing it as much

1:00:54

as we should and there's a whole bunch

1:00:56

of important issues going on that we need

1:00:57

to talk about so um i decided to

1:01:00

put together something uh about discord

1:01:03

And, you know,

1:01:03

alternatives to Discord and such.

1:01:05

So if you want to check that out,

1:01:07

that's available on most of our social

1:01:09

feeds that are text-based as well as

1:01:12

photo-based.

1:01:13

Some of them might need to wait a

1:01:15

little bit to go out.

1:01:16

But, yeah,

1:01:18

Nate's showing it here on Mastodon.

1:01:19

So that's kind of what I've been working

1:01:21

on.

1:01:21

I also put together another post for

1:01:23

Valentine's Day and...

1:01:26

It was kind of a funny one.

1:01:27

Look out for it tomorrow.

1:01:30

Hopefully that does well and people think

1:01:33

it's a good idea.

1:01:35

But I just want to remind people that

1:01:40

All of this is made possible by our

1:01:42

supporters and you can sign up for a

1:01:44

membership or donate at privacyguides.org

1:01:48

or you can also pick up some cool

1:01:50

swag at shop.privacyguides.org and you can

1:01:54

see in the background,

1:01:55

Nate's got a poster there and a bottle

1:01:57

and we've got all these products you can

1:01:59

get if you wanna support us and also

1:02:01

get something in return.

1:02:02

And now finally,

1:02:04

let's talk about Ring's new search party

1:02:09

tool.

1:02:13

"Dystopian Ring Search Party feature sparks

1:02:18

public backlash." So this is an article here

1:02:21

from 9to5Mac, and I should

1:02:25

preface this by saying there was

1:02:29

an ad at the Super Bowl, which I

1:02:31

think is like an American football event,

1:02:33

for anyone not from America. Um,

1:02:37

And basically it's the ad focused around,

1:02:42

you know,

1:02:43

utilizing everybody's ring doorbells and

1:02:46

ring cameras to basically find and help

1:02:50

lost dogs.

1:02:51

And it was basically a thirty second ad

1:02:54

promoting this new feature.

1:02:56

And of course,

1:02:58

this is kind of a massive concern from

1:03:00

a privacy perspective, obviously, because,

1:03:03

you know,

1:03:05

a company that's basically has all these

1:03:10

cameras everywhere and they're using it

1:03:12

to, you know,

1:03:13

identify something and track that thing.

1:03:16

That could be used for some really

1:03:19

dystopian stuff, obviously.

1:03:21

So that is kind of concerning.

1:03:24

So,

1:03:24

here's just kind of explaining how this

1:03:27

works.

1:03:28

The search party feature for dogs works by

1:03:31

allowing owners of lost dogs to send a

1:03:33

photo and description to other nearby ring

1:03:36

doorbell users when the camera thinks it

1:03:38

has spotted a dog matching the

1:03:41

description.

1:03:41

It alerts the homeowner.

1:03:43

If they confirm that it looks like the

1:03:45

right dog,

1:03:45

it puts them in touch with the owner

1:03:47

of the pet.

1:03:48

The company has now rolled out the feature

1:03:51

to non-Ring camera owners via the Ring

1:03:54

app, going all in on promoting it.

1:03:57

including with a whole Super Bowl ad,

1:03:59

which as far as I understand,

1:04:01

Super Bowl is kind of like a really

1:04:03

high profile event that would have

1:04:04

probably cost like tens of millions of

1:04:06

dollars.

1:04:07

So they're definitely kind of going all in

1:04:09

on this.

1:04:11

And this is also another concern because

1:04:13

in the US there's been a lot of

1:04:16

nationwide protests against ICE

1:04:17

operations,

1:04:18

which is

1:04:21

quite concerning and there's also people

1:04:24

who are kind of concerned that this could

1:04:27

be used to coordinate ICE operations to

1:04:31

deport and arrest people so

1:04:36

Just quoting from, uh,

1:04:38

404 Media here: at Sunday's

1:04:40

Super Bowl, Ring advertised Search Party,

1:04:42

a, quote, horrifyingly dystopian feature

1:04:45

nominally designed to turn all of the Ring

1:04:47

cameras into a neighborhood dragnet that

1:04:50

uses AI to look for a lost dog.

1:04:53

It does not take an imagination of any

1:04:55

sort to envision this being tweaked to work

1:04:58

against suspected criminals,

1:05:00

undocumented migrants,

1:05:04

or others deemed suspicious by people in

1:05:07

the neighborhood.

1:05:08

Many of these use cases are how ring

1:05:10

has been used by people on its dystopian

1:05:13

neighbors app for years.

1:05:16

The neighbors app itself.

1:05:17

I haven't heard of this before.

1:05:18

So just quoting what the article says,

1:05:20

the Neighbors app quickly got a reputation

1:05:22

for racism, sharing reports of supposedly

1:05:25

suspicious-looking people whose skin color

1:05:28

was the only thing they had in common.

1:05:31

So yeah, this is kind of concerning. Uh,

1:05:33

it goes through, you know, a lot of

1:05:35

the other, uh, social media backlash and

1:05:39

such, um, in this article. And I think

1:05:42

this is kind of highlighting, uh, an issue

1:05:45

that I talked about before, where, you know,

1:05:48

your privacy is important, but also the

1:05:50

privacy of other people. You know, creating

1:05:52

a dragnet surveillance network

1:05:56

just to find dogs seems like a pretty

1:05:59

unequal exchange,

1:06:01

especially because this technology can be

1:06:04

used for kind of nefarious purposes,

1:06:07

like by ICE to, you know,

1:06:10

round up people, follow protesters,

1:06:12

all sorts of stuff like that.

1:06:14

I think it kind of goes against a

1:06:16

lot of things that...

1:06:19

you would expect in public.

1:06:20

You would expect in public that your

1:06:22

location isn't going to be tracked.

1:06:25

And in this case,

1:06:26

it's basically using cameras to use facial

1:06:30

recognition to identify dogs.

1:06:32

But you can see how that could be

1:06:34

used against people, right?

1:06:38

I think this is sort of a thing

1:06:39

that we've talked about a little bit in

1:06:42

other countries.

1:06:43

They have a lot of these facial

1:06:44

recognition

1:06:45

networks that are run by governments,

1:06:48

that identify people and track their

1:06:51

movements across the country.

1:06:53

And I think this is basically just doing

1:06:56

the same thing, but with a, oh,

1:06:58

it's cute.

1:06:59

Oh, it's a dog searching tool.

1:07:02

It's the same thing.

1:07:04

It's going to be a slippery slope where they

1:07:06

do this feature now,

1:07:08

and then in a year's time,

1:07:10

they're using it to track protesters.

1:07:12

They're using it to track

1:07:14

immigrants.

1:07:15

They're using it to track, I don't know,

1:07:17

the next choice of who they're wanting to

1:07:20

follow.

1:07:21

So yeah,

1:07:22

I think this is extremely dystopian.

1:07:25

And I think most people need to kind

1:07:29

of look past the advertising and marketing

1:07:31

and look at what the real issue with

1:07:33

this technology is,

1:07:35

because it is quite concerning.

1:07:42

Yeah,

1:07:42

just to touch on a few things you

1:07:44

said.

1:07:45

Yes, the Super Bowl is,

1:07:48

you would think a religion here in the

1:07:49

US.

1:07:49

People lose their minds for it.

1:07:51

I say that as somebody who,

1:07:53

if sports vanished tomorrow,

1:07:55

the only reason I would notice is because

1:07:56

everybody else around me would be having a

1:07:57

meltdown.

1:07:59

But putting aside my hipsterism, yeah,

1:08:01

it's a huge deal.

1:08:02

And so for them to run this ad

1:08:03

was probably, I don't know,

1:08:04

about tens of millions.

1:08:05

I mean, maybe tens of millions,

1:08:07

but definitely millions for sure.

1:08:09

To run that thirty second ad is pretty

1:08:11

wild.

1:08:12

Um, I think I'm really happy.

1:08:15

What I,

1:08:15

what I picked up on in this story

1:08:17

is this has been like a nationwide outcry.

1:08:20

Like the, the article says like,

1:08:21

in addition to the 404 article

1:08:23

that you wrote or you, you read about,

1:08:26

um,

1:08:27

Senator Ed Markey tweeted and said,

1:08:30

what this ad doesn't show,

1:08:31

Ring also rolled out facial recognition

1:08:32

for humans.

1:08:33

I wrote to them months ago about this.

1:08:35

Their answer,

1:08:35

they won't ask for your consent.

1:08:37

This definitely isn't about dogs.

1:08:38

It's about mass surveillance.

1:08:39

And then they said a quick search on

1:08:41

X shows this to be the prevailing view.

1:08:43

So I know that's like a very specific

1:08:46

slice of the internet,

1:08:47

but I'm very happy to see because so

1:08:50

many of the times I'm a little bit,

1:08:54

I'm a little bit annoyed and I'm going

1:08:55

to try not to be like,

1:08:57

I don't know.

1:08:58

It's not old man yells at clouds,

1:08:59

but something hipster, I guess.

1:09:00

I don't know.

1:09:02

I'm a little bit annoyed that I feel

1:09:03

like we as privacy advocates are

1:09:06

constantly trying to raise an alarm about

1:09:08

like, hey, guys, this is bad.

1:09:10

And people are like,

1:09:11

you need to calm down.

1:09:12

You're overreacting.

1:09:12

This isn't a big deal.

1:09:14

And then like three years later,

1:09:15

something happens and everybody is like,

1:09:17

oh, this is bad.

1:09:18

And it's like, yeah,

1:09:18

I've been saying that for years and you

1:09:20

told me I was overreacting.

1:09:22

And, you know, um, we didn't include it,

1:09:24

but I've,

1:09:24

I've been sharing around tonight, uh,

1:09:26

a story from meta where they're

1:09:28

reintroducing their, um,

1:09:30

their Ray-Ban glasses,

1:09:32

but they're including facial recognition.

1:09:34

And there's a quote here.

1:09:35

I can dig it up real quick.

1:09:37

Not to go too far off topic,

1:09:39

but this is from Meta's Reality Labs.

1:09:42

This is from the actual company that made

1:09:43

the glasses.

1:09:44

They said,

1:09:44

we will launch during a dynamic political

1:09:46

environment where many civil society

1:09:48

groups that we would expect to attack us

1:09:50

would have their resources focused on

1:09:51

other concerns.

1:09:53

And that is like the most cartoon villain

1:09:55

thing I've ever heard.

1:09:57

It's like I almost think it sounds like

1:09:59

it should be in the onion for them

1:10:00

to just say the quiet part out loud

1:10:02

where it's like, oh,

1:10:03

this is the perfect time to do something

1:10:04

downright evil because all of our enemies

1:10:07

are busy with other things.

1:10:09

And it's just it's you know,

1:10:10

and I bring that up because like I've

1:10:12

been sharing it around and everybody I'm

1:10:14

showing it to is just like,

1:10:15

that's ridiculous.

1:10:15

That's crazy.

1:10:16

I can't believe that.

1:10:17

And it's like, yeah, they're.

1:10:19

Like,

1:10:19

I've been trying to tell you guys that.

1:10:22

And I'm not trying to do the, like,

1:10:23

I told you so thing.

1:10:23

It's just like, I wish people had cared,

1:10:26

you know,

1:10:26

five years ago or ten years ago with

1:10:29

Cambridge Analytica when all this stuff

1:10:30

started coming out.

1:10:32

And, like, you know, I don't know.

1:10:34

Maybe we couldn't have stopped it because,

1:10:35

like, I don't want to sound defeatist,

1:10:37

but, you know,

1:10:38

Shoshana Zuboff in Age of Surveillance

1:10:40

Capitalism, she talks about how, like,

1:10:42

this is their playbook is they'll do

1:10:43

something.

1:10:43

And when people get mad,

1:10:44

they pull back a little bit.

1:10:46

But then they kind of just, like –

1:10:48

take a different direction to get around

1:10:49

the obstacle.

1:10:50

And they still end up doing it anyways,

1:10:51

which is why we can't really trust a

1:10:53

lot of big tech companies.

1:10:54

But I don't know.

1:10:55

It's just, it's,

1:10:56

it's frustrating to see things that you're

1:10:59

trying to raise the alarm about.

1:11:00

And sometimes it feels like people don't

1:11:01

take it seriously.

1:11:03

And I'm glad.

1:11:04

where I'm going with this is I'm really

1:11:05

glad to see that people are taking this

1:11:07

seriously,

1:11:08

that everybody is sitting up and it's not

1:11:10

just us for a change.

1:11:11

That's like, you know, this could be bad.

1:11:12

It's like, everybody is just kind of like,

1:11:14

okay, nope, this is too far.

1:11:15

You crossed a line.

1:11:16

This is creepy.

1:11:17

I don't want it.

1:11:18

And I guess I'm just going to say,

1:11:21

I hope that,

1:11:23

I hope that we're finally entering an age

1:11:24

where things will be different.

1:11:25

Because again,

1:11:26

I mentioned that typically tech will pull

1:11:29

back just enough until people calm down

1:11:32

and then they'll go do it anyways.

1:11:33

And I really hope that we're entering a

1:11:37

world where the internet has been around

1:11:38

long enough and the tech companies have

1:11:39

been around long enough that we're kind of

1:11:42

wising up to it and we're not going

1:11:44

to let them do that.

1:11:45

I hope.

1:11:46

I don't know.

1:11:47

We'll see.

1:11:48

But I really hope that's where we're

1:11:49

headed.

1:11:50

So.

1:11:51

Yeah,

1:11:52

those were my thoughts on that story.

1:11:53

Yeah,

1:11:56

I think I largely agree with what you're

1:11:58

saying.

1:11:59

I think it's also I'm seeing some comments

1:12:01

here in the chat.

1:12:02

So just kind of going over some of

1:12:04

these.

1:12:04

Someone said it's fail X on tour.

1:12:08

This is a bit of a political question,

1:12:10

but given its exploitative nature:

1:12:12

Do you think that other camera systems

1:12:14

could adopt Ring's infrastructure in the

1:12:15

future?

1:12:17

And I think it definitely could be.

1:12:22

I think if it's successful, and, you know,

1:12:25

unfortunately, a lot of times these

1:12:29

awful things end up being pushed through.

1:12:31

Like, it's like boiling

1:12:33

the frog, you're

1:12:36

just slowly boiling them. People just give

1:12:38

up on trying to push back

1:12:40

against things because they're just

1:12:42

constantly trying to push things through.

1:12:44

So I think if it's if it's a

1:12:46

success for Ring, I think, you know,

1:12:50

this feature could be rolled out to other

1:12:51

security cameras.

1:12:52

But I still think, you know,

1:12:56

it would be

1:12:57

I mean,

1:12:58

I'm not really sure what the percentage

1:12:59

is,

1:12:59

but I bet that there is a very

1:13:02

large percent of people,

1:13:05

at least in America,

1:13:06

I haven't seen too many ring doorbells and

1:13:08

things in Australia.

1:13:10

I think they're less popular,

1:13:12

like Amazon in general is less popular.

1:13:14

So I think in the US that there's

1:13:18

just a...

1:13:20

a lot of those cameras everywhere.

1:13:22

So I think it is pretty powerful being

1:13:25

able to have so many cameras everywhere

1:13:27

and to always have the ability to, like,

1:13:29

hook into them.

1:13:31

So, you know, I think...

1:13:35

If you have a Ring camera, I mean,

1:13:37

the best time to get rid of it

1:13:38

was yesterday.

1:13:39

And the second best time is today.

1:13:42

So try and work something out because I

1:13:45

think it's not great for the privacy of

1:13:47

other people.

1:13:48

And it seems like this is just going

1:13:49

to get pushed through.

1:13:50

They've already dumped so much money into

1:13:52

it.

1:13:52

They're not going to just start changing

1:13:55

direction on it.

1:13:56

doing this facial recognition or dog

1:13:59

recognition.

1:14:02

But well, that's how they're marketing it.

1:14:05

But we all know it's going to eventually

1:14:07

get rolled out to humans.

1:14:12

Do you see any other questions here you

1:14:13

think we should touch on?

1:14:15

I actually do want to touch on,

1:14:16

because you said that,

1:14:18

this next question from, or not question,

1:14:19

but a statement from Anonymous.

1:14:21

You said, basically,

1:14:22

if the market dislikes this,

1:14:23

no one will follow suit, which is true.

1:14:25

I don't want to get into free market

1:14:28

and all that,

1:14:29

but I just want to point out,

1:14:30

kind of related to what you said,

1:14:31

I think so much, and it's,

1:14:37

no single snowflake believes it is

1:14:39

responsible for the avalanche.

1:14:41

I don't know who said that, but like,

1:14:44

It's so easy to look at these things.

1:14:45

Like you were saying,

1:14:46

like ring doorbells are so ubiquitous here

1:14:48

in the US and so many people have

1:14:50

them and they bring a benefit.

1:14:52

I'm not going to lie.

1:14:53

Like, yes,

1:14:53

obviously they bring a benefit or else

1:14:55

people wouldn't be buying them.

1:14:56

And I get it.

1:14:57

But it's so easy for people to sit

1:14:59

here and think like, well,

1:15:00

what can I do?

1:15:01

You know,

1:15:01

like I get something out of it.

1:15:03

If I throw away my doorbell,

1:15:04

like you said,

1:15:05

they're just going to do it anyways.

1:15:06

It doesn't matter.

1:15:07

And it's like, yes, I hear you.

1:15:08

And I know it sounds hard, but

1:15:10

But if enough people do it,

1:15:12

it makes a difference.

1:15:13

Something like, what is it?

1:15:15

Like,

1:15:15

sixty percent of the people in the U.S.

1:15:17

who are eligible to vote actually vote.

1:15:19

So almost half of the U.S.

1:15:20

doesn't vote.

1:15:21

And that's high turnout, by the way.

1:15:22

Really bad.

1:15:23

And so just think about if those other

1:15:25

forty percent of people voted.

1:15:27

What would – like maybe the political

1:15:29

situation wouldn't change.

1:15:30

Maybe it would.

1:15:31

Who knows?

1:15:32

But that's forty percent of the people

1:15:33

that could potentially change the vote

1:15:34

because every single one of them is like,

1:15:36

oh, my vote doesn't matter.

1:15:37

I'm sorry.

1:15:37

I know I'm being a little political,

1:15:39

kind of, sort of, not really.

1:15:40

But my point being is like we're at

1:15:45

a point – and I hate to say

1:15:47

this because it sounds defeatist.

1:15:49

We're at a point where the privacy

1:15:50

situation in the world is so bad.

1:15:52

that doing almost anything moves the

1:15:55

needle.

1:15:56

And I try to tell people that when

1:15:57

I talk to like mainstream users and they

1:16:00

just feel like, oh, well,

1:16:00

my data's already out there.

1:16:01

What does it matter?

1:16:02

They already know everything about me.

1:16:03

And it's like, yeah, that's the point.

1:16:05

The bar is so freaking low that doing

1:16:07

literally anything,

1:16:09

switching to Brave or Firefox, shoot,

1:16:10

putting uBlock Origin in Chrome,

1:16:12

as bad as Chrome is,

1:16:13

just putting uBlock Origin in there,

1:16:15

tweaking a few settings,

1:16:16

taking some apps off your phone.

1:16:17

Like most people are literally doing

1:16:19

nothing

1:16:20

Doing literally anything will move the

1:16:23

needle.

1:16:23

And so where I'm going with that is

1:16:24

like,

1:16:25

if you're one of those people or you

1:16:26

know one of those people who's like, well,

1:16:28

I genuinely get value out of my Ring

1:16:30

doorbell and I feel like throwing it away

1:16:31

won't make a difference.

1:16:32

Turn on the end-to-end encryption because

1:16:35

at least then Ring can't access your

1:16:36

footage without your permission.

1:16:38

And at least then you're like cutting off

1:16:41

this whole search party feature.

1:16:43

And it sends, like Anonymous was saying,

1:16:46

if enough people do it,

1:16:47

it sends that message of like,

1:16:49

I want control over my data.

1:16:51

And if enough people start sending that

1:16:52

message,

1:16:53

the market will start to pivot that way.

1:16:54

I mean, look at Apple.

1:16:55

Apple ran a whole campaign advertising

1:16:57

privacy.

1:16:58

And you could argue that they're not

1:16:59

private enough.

1:16:59

That's fine.

1:17:00

I accept that.

1:17:01

But the point is like they knew that

1:17:03

that's something that matters to people

1:17:04

and they want to start pivoting in that

1:17:05

direction.

1:17:06

And so like, yeah,

1:17:07

it's every little bit helps.

1:17:09

That's what I'm getting at is like every

1:17:11

little thing,

1:17:11

whether it's throwing the camera away

1:17:12

entirely,

1:17:13

whether it's canceling the subscription,

1:17:15

whether it's turning on the encryption,

1:17:16

like anything to send the message of,

1:17:18

I don't like this.

1:17:19

And, and I want privacy respected.

1:17:22

I think on a mass scale,

1:17:23

if enough people do that,

1:17:24

it will add up.

1:17:25

It will create an avalanche.

1:17:26

So yeah, that's,

1:17:29

that's my opinion on that one.

1:17:31

Yeah, that's definitely a good, uh,

1:17:35

A good thing to promote here.

1:17:37

I think, yeah,

1:17:38

more people will need to make changes.

1:17:41

Like, yeah,

1:17:42

it's not only going to be one person

1:17:44

that changes things,

1:17:45

but I guess we can also move on

1:17:48

here to kind of a spooky story here

1:17:53

about Reddit.

1:17:56

So here we have the Homeland Security

1:18:00

spying on Reddit users.

1:18:01

So do you want to take this one,

1:18:02

Nate?

1:18:04

uh yeah it is my turn isn't it

1:18:06

all right so um i think this will

1:18:09

be potentially a quick story because um i

1:18:13

think we've seen stories like this in the

1:18:15

past or maybe it's just me in my

1:18:16

past work i know we've definitely i've

1:18:18

definitely covered stories like this but

1:18:20

um this story is a really interesting

1:18:22

write-up and it focuses on you know common

1:18:25

journalistic technique is like you tell

1:18:27

one person's story and you use that to

1:18:28

kind of extrapolate to a larger trend

1:18:31

larger trend.

1:18:32

And so, uh,

1:18:33

that's what this person did here.

1:18:35

They talk about Homeland security is

1:18:36

spying on Reddit users.

1:18:38

And this one is really interesting.

1:18:40

So specifically there's a Reddit user

1:18:42

called budget chicken, two, four, two,

1:18:44

five.

1:18:45

I love randomly generated names.

1:18:46

They're so fun.

1:18:47

Um,

1:18:47

at least I hope that's a randomly

1:18:48

generated name.

1:18:49

Uh, if not,

1:18:50

I really want to hear the story behind

1:18:51

it, but anyways, uh,

1:18:52

so this actually happened in January.

1:18:53

This is very recent.

1:18:55

And this is an internal report that was

1:18:57

leaked to this reporter.

1:18:59

And it talks about how Homeland Security

1:19:01

officials in Texas were monitoring this

1:19:04

user,

1:19:05

who as far as we know is not

1:19:06

a narco-trafficker, a gang member,

1:19:07

or a terrorist.

1:19:09

The report centers on Budget Chicken's

1:19:10

call for a protest near a Border Patrol

1:19:12

facility in Edinburg, Texas,

1:19:14

which I lived in Texas for ten years

1:19:15

and I don't even know where that is.

1:19:16

I just want to point that out.

1:19:17

Maybe I'm the idiot,

1:19:18

but it's not like it's Dallas.

1:19:20

It sounds to me like at best it's

1:19:21

a suburb of a major city.

1:19:23

So...

1:19:25

The report acknowledges that anti-ICE

1:19:27

protests throughout Texas have been quote

1:19:28

unquote generally lawful,

1:19:30

and there's no evidence of any threat

1:19:31

posed by this user's call.

1:19:32

Any protest whatsoever near the border

1:19:34

facility is said to warrant continuous or

1:19:36

continued monitoring.

1:19:38

And then there's a screenshot of the

1:19:40

actual report here.

1:19:42

Uh,

1:19:42

to quote directly from the bulletin at

1:19:44

this time,

1:19:44

there's no specific reporting of planned

1:19:46

violence,

1:19:46

targeting DHS personnel or facilities

1:19:48

linked to this protest call.

1:19:49

However,

1:19:49

any demonstration in proximity to a U S

1:19:52

border patrol Rio Grande Valley facility

1:19:54

may present operational safety and

1:19:55

reputational risks that warrant continued

1:19:56

monitoring. The, um...

1:19:59

I'll actually just read it cause it's real

1:20:00

short.

1:20:00

So there's a screenshot here for audio

1:20:02

listeners of a,

1:20:04

a Reddit post from this budget chicken guy

1:20:06

in r/RioGrandeValley,

1:20:08

Rio Grande.

1:20:09

I,

1:20:10

Again, lived in Texas for ten years.

1:20:12

They pronounce everything wrong,

1:20:13

so sorry if I pronounce it wrong.

1:20:15

He says, join me in protest against ICE.

1:20:17

He says, in light of today's events,

1:20:19

I'm rallying people to support our rights

1:20:20

and freedoms, not just for ourselves,

1:20:21

but for our neighbors, family,

1:20:22

and community.

1:20:23

We need volunteers to be witnesses and to

1:20:24

spread awareness.

1:20:25

The more we are silent,

1:20:26

the faster it will come to us.

1:20:28

I will be at the intersection of the

1:20:30

border patrol station around six.

1:20:31

Please come support.

1:20:32

That's the whole post.

1:20:36

So...

1:20:38

Where exactly is it?

1:20:40

Well, okay,

1:20:41

here's where he starts tying it into a

1:20:43

bigger thing.

1:20:44

He said,

1:20:46

there's a section that gives a sense of

1:20:48

the sheer volume of data Homeland Security

1:20:49

collects to generate a big picture view of

1:20:51

what's going on in the country.

1:20:52

One specific priority asks what group or

1:20:54

individuals are responsible for or are

1:20:56

associated with border violence and what

1:20:57

are the intended impact to Customs and

1:20:59

Border Patrol personnel and operations.

1:21:01

They say that they are tracking three

1:21:03

particular social media trends,

1:21:04

which is social media-driven mobilization,

1:21:06

symbolic targeting of government

1:21:07

facilities,

1:21:08

and a statewide baseline of mobilization

1:21:10

potential.

1:21:12

At the risk of being a little biased,

1:21:14

I want to point out none of those

1:21:15

said anything about violence.

1:21:16

Mobilization is literally just like

1:21:18

getting people out.

1:21:20

So like social media-driven mobilization,

1:21:21

so like posting online like that guy just

1:21:23

did, like, hey, I'm going to go protest.

1:21:25

Everybody come with me.

1:21:26

Like didn't say anything about violence,

1:21:29

just people coming to protest.

1:21:32

In other words,

1:21:32

the government is building a sociological

1:21:34

profile of political discontent.

1:21:36

The bulletin notes that these protests are

1:21:37

perception-driven,

1:21:38

meaning they are motivated by general

1:21:39

concerns about rights rather than specific

1:21:41

incidents.

1:21:41

Excuse me.

1:21:44

I haven't had enough water today.

1:21:45

So…

1:21:47

To determine the threat posed by

1:21:49

budget chicken two, four, two,

1:21:50

five, analysts

1:21:50

didn't just look at the protest call.

1:21:52

They scored the user's entire digital

1:21:53

footprint.

1:21:54

The bulletin notes that chicken quote

1:21:56

frequently participates in various

1:21:57

community discussions.

1:21:58

For example, in r/Texans,

1:22:01

they compared the team to the Cleveland

1:22:02

Browns; in r/movies,

1:22:03

they discussed the film Almost Famous;

1:22:05

never seen it personally.

1:22:06

In r/StephenKing, they share their book

1:22:08

collection.

1:22:09

And I don't know if I can curse

1:22:11

and not get us demonetized.

1:22:12

I've cursed before.

1:22:13

So, r/FuckImOld, reminiscing about

1:22:15

the nineteen seventies television

1:22:16

production logos.

1:22:18

And here's.

1:22:23

Okay,

1:22:23

I'll go ahead and say right here that

1:22:24

up until this point, I was like,

1:22:26

this is really invasive and I'm not

1:22:28

condoning it.

1:22:29

But I completely understand they need to

1:22:31

like know what's going on.

1:22:32

They need to keep an eye on threats.

1:22:33

And just because this dude hasn't said

1:22:34

anything doesn't mean he's not going to do

1:22:35

anything.

1:22:36

So I understand the like, hey,

1:22:37

let's just keep an eye out on this.

1:22:39

This is the part where they lost me.

1:22:41

It is recommended that all agents wear

1:22:42

their ballistic armor, utilize long arms,

1:22:44

and if possible, work in groups.

1:22:47

Personal opinion,

1:22:48

based on military history and training,

1:22:50

that sounds a little excessive in response

1:22:52

to a dude who said, hey,

1:22:53

let's go stand on the corner and chant

1:22:54

some slogans.

1:22:56

Again,

1:22:56

is it possible he might do something?

1:22:58

Absolutely.

1:22:59

Should you keep an eye on it?

1:23:01

Absolutely.

1:23:02

Should you escalate straight to rifles and

1:23:04

body armor?

1:23:04

Absolutely.

1:23:05

I don't know about that.

1:23:06

Maybe I'm wrong.

1:23:08

Maybe I'm a little bit too much of

1:23:09

a hippie these days.

1:23:10

But yeah, to me, that was pretty wild.

1:23:15

But anyways, personal opinions aside,

1:23:17

the point here is the social media

1:23:19

monitoring.

1:23:20

I mentioned that...

1:23:21

This story is not really unique.

1:23:23

It's just recent.

1:23:25

In the past,

1:23:26

I have talked about stories of government

1:23:29

monitoring,

1:23:30

all kinds of social media networks,

1:23:32

and not just the big ones,

1:23:33

not just Facebook and TikTok and Reddit.

1:23:37

I mean,

1:23:37

there's small niche ones that are built

1:23:39

for very specific communities.

1:23:41

Discord servers, the government...

1:23:45

I can go dig up this source if

1:23:46

I have to,

1:23:46

but I swear I've read stories about how

1:23:49

on some of the bigger Discord servers,

1:23:50

they will literally just throw sock puppet

1:23:52

accounts in there to scrape up the

1:23:53

messages so that they can search them

1:23:55

later.

1:23:56

And it's just...

1:23:59

That's actually – I found this out

1:24:01

recently.

1:24:01

That's actually what Palantir does.

1:24:03

That's their whole claim to fame is

1:24:04

Palantir doesn't actually collect any

1:24:06

data.

1:24:06

They build the databases that link all the

1:24:09

data together and that they can sell it

1:24:13

to governments and law enforcement,

1:24:14

and then they can search that database.

1:24:16

So this is all powering the surveillance

1:24:19

state,

1:24:20

and –

1:24:21

Yeah,

1:24:21

it's just – it's a reminder to be

1:24:23

careful what you post online.

1:24:25

It's a reminder that anything you put in

1:24:26

a digital format gets swept up.

1:24:28

It's a reminder that, unfortunately,

1:24:31

there is – yeah, this is happening.

1:24:35

It's just a reminder to remember that on

1:24:36

any platform.

1:24:37

Again,

1:24:37

it's not just the big stuff like Reddit

1:24:39

and Discord.

1:24:40

It's the small niche stuff.

1:24:41

Anywhere they think something might be

1:24:43

happening,

1:24:43

they're trying to get a foothold in there.

1:24:45

I guarantee it.

1:24:47

So that is what I took away from

1:24:50

that.

1:24:51

I don't know if you had anything to

1:24:53

add.

1:24:55

Yeah, I guess some basic thoughts here.

1:24:59

Generally,

1:25:01

I was under the impression that law

1:25:05

enforcement,

1:25:06

I assume Homeland Security is law

1:25:10

enforcement, I guess.

1:25:11

I'm not really too privy about that.

1:25:13

Homeland Security is...

1:25:17

It's hard to explain what Homeland

1:25:18

Security is.

1:25:19

It's a federal – it's technically a type

1:25:22

of law enforcement.

1:25:23

It's the federal arm of the government.

1:25:25

They generally handle border security and

1:25:29

that kind of stuff.

1:25:30

I'll let you go ahead and talk.

1:25:31

I'm going to look it up exactly what

1:25:32

it is.

1:25:32

But yeah,

1:25:33

they're kind of like a federal law

1:25:34

enforcement for the border.

1:25:37

Right.

1:25:37

So I think let's just go ahead and

1:25:39

assume that they are sort of law

1:25:41

enforcement-esque, I guess.

1:25:43

I'm not really familiar.

1:25:44

I'm not American.

1:25:45

So I'm just assuming.

1:25:46

But I think that generally I was under

1:25:51

the impression that, you know,

1:25:52

in most places that were, you know,

1:25:54

a free country, a country where, you know,

1:25:57

People's privacy is respected that,

1:26:00

you know,

1:26:00

police officers or law enforcement in

1:26:02

general wouldn't go poking around and

1:26:05

start building a case against someone

1:26:07

unless there was a suspicion of a crime.

1:26:11

And I don't see how

1:26:14

organizing a protest is possibly

1:26:17

any sort of suspicion of a crime;

1:26:20

everyone is perfectly within their rights

1:26:23

to do so.

1:26:24

So I think that's the number one

1:26:25

concerning thing.

1:26:28

This is the sort of thing we talk

1:26:29

about when we're talking about Russia.

1:26:34

They have Russian agents on social media

1:26:36

just trawling social media,

1:26:39

looking for posts about people organizing

1:26:41

protests and stuff to crack down on this

1:26:44

sort of thing, right?

1:26:45

Like, I don't think this is really,

1:26:48

this is not normal.

1:26:49

This is law enforcement and, like,

1:26:54

you know,

1:26:55

federal government agents intermingling in

1:26:58

people's right to organize.

1:27:01

Like, this is, like,

1:27:03

kind of concerning stuff.

1:27:05

And I think the other thing that I

1:27:07

thought about this story was it was kind

1:27:10

of surprising that they even have

1:27:16

you know,

1:27:18

people who are actually trolling Reddit

1:27:20

looking at things like this,

1:27:22

especially because, you know,

1:27:24

it's within people's right to protest what

1:27:27

is going on.

1:27:27

It's not against the law.

1:27:29

It's completely legal.

1:27:31

So it's kind of surprising that there's a

1:27:36

whole... like, they didn't just, like, save the

1:27:39

post in, like, a document; they, like,

1:27:41

literally went through this

1:27:43

person's entire posting history, and they

1:27:47

didn't even have a suspicion that a crime

1:27:49

was going to be committed. So I think

1:27:51

that's kind of concerning. Um, this sort of

1:27:55

thing shouldn't be happening in a country

1:27:57

where, you know, things should be...

1:28:01

you should be free to organize a protest

1:28:03

is what I'm trying to say.

1:28:04

This is not like controversial.

1:28:05

I don't think, um,

1:28:07

without having federal agents,

1:28:09

just like monitoring your social media

1:28:12

presence.

1:28:13

Um,

1:28:14

so that's kind of the main concern that

1:28:16

I got from this.

1:28:18

And, you know, people were saying,

1:28:21

you know, Oh,

1:28:22

what were they going to do?

1:28:23

Like,

1:28:23

they don't even know what they're doing.

1:28:25

Like, you know,

1:28:26

I think this is just concerning from the

1:28:27

fact that it was happening in the first

1:28:29

place.

1:28:30

Um,

1:28:31

But yeah,

1:28:34

it's kind of raising alarm bells in my

1:28:37

head that this is sort of not something

1:28:40

that is generally good, not something

1:28:43

that you would be happy about happening in

1:28:46

your country.

1:28:47

But I guess throwing it back to you,

1:28:49

Nate,

1:28:49

do you have anything more to add on

1:28:51

that?

1:28:53

No, um, I totally agree with you. Uh,

1:28:55

real quick, according to Wikipedia, the

1:28:57

Department of Homeland Security is a

1:28:59

federal executive department

1:29:00

responsible for public security. Uh, its

1:29:03

mission involves anti-terrorism, civil

1:29:05

defense, immigration and customs,

1:29:07

border control, cybersecurity,

1:29:09

transportation security, maritime security

1:29:11

and sea rescue, and the mitigation of

1:29:13

weapons of mass destruction. Wow, that last

1:29:15

one's new to me. Okay, Coast Guard's gonna

1:29:18

go stop nukes. Um...

1:29:20

No, I one hundred percent agree with you.

1:29:22

And it sucks because, again,

1:29:25

with with my military background,

1:29:27

I understand the idea of like like you

1:29:29

said,

1:29:30

we don't know what this guy is going

1:29:31

to do.

1:29:32

But at the same time,

1:29:33

like it just feels so heavy handed.

1:29:36

Like it's one thing to say like, OK,

1:29:38

somebody made a post.

1:29:39

They're going to be protesting tonight at

1:29:40

six.

1:29:41

noted.

1:29:42

It's another thing to like go through this

1:29:43

dude's history and compile a whole

1:29:46

dossier.

1:29:48

And especially to like, put it on record.

1:29:50

It's, I don't know.

1:29:50

I'm not trying to excuse it for the

1:29:52

record.

1:29:52

I'm not trying to do that at all,

1:29:53

but I understand.

1:29:55

I understand the need to make sure he's

1:29:57

not a threat.

1:29:58

Like maybe you go through his history and

1:29:59

you find all this like violent

1:30:00

anti-government stuff.

1:30:01

And it's like, Oh,

1:30:02

this dude might do something,

1:30:04

but like you,

1:30:05

you scroll through it for five seconds and

1:30:06

you're like, he talks about sports.

1:30:08

He watches movies.

1:30:09

He talks about logos from TV shows.

1:30:11

Like,

1:30:12

this dude is probably not going to do

1:30:13

anything violent.

1:30:14

And again, just the over-response of...

1:30:18

I don't know it's just it's it's

1:30:20

everything about it I don't like I don't

1:30:22

like the overreaction I don't like the

1:30:23

fact that they went looking in the first

1:30:25

place and it like you're saying it has

1:30:27

such a chilling effect on protesting

1:30:30

especially when again in the context of

1:30:31

the article said that even the government

1:30:34

admits that generally speaking the

1:30:36

protests have been peaceful they have been

1:30:38

non-violent and to just have this

1:30:40

overreaction of like treating everyone as

1:30:42

a threat um

1:30:45

Oh,

1:30:46

I'm not going to dig into it too

1:30:47

much because this is a really political

1:30:48

take, but I think America has long had,

1:30:50

and actually this is probably not even a

1:30:51

hot take, even though it's political.

1:30:52

I think America has long had a problem

1:30:54

of the over-militarization of police.

1:30:58

I think everyone across the board can

1:30:59

generally agree with that.

1:31:01

And this is just kind of part of

1:31:02

that trend of like, oh,

1:31:03

somebody said anything,

1:31:05

we instantly have to assume the worst

1:31:06

intent and gear up for the worst.

1:31:08

And it's like,

1:31:09

Again,

1:31:10

was body armor and rifles really

1:31:11

necessary?

1:31:12

Was it necessary to go digging through all

1:31:13

his history?

1:31:16

Yeah, it's really crazy.

1:31:17

I don't know if you had anything more

1:31:21

to add to that,

1:31:21

but there were a couple of comments in

1:31:23

the chat here that I thought were pretty

1:31:25

on point.

1:31:27

If I could show those.

1:31:28

Yeah, sure.

1:31:29

Let's see what people are saying.

1:31:33

Yeah, so anonymous-thirty-five said,

1:31:34

this is why people should

1:31:35

compartmentalize.

1:31:36

I agree.

1:31:37

Like, unfortunately,

1:31:38

Reddit's making that really hard.

1:31:39

I actually logged into Reddit today and

1:31:42

for the first time in, like, two years.

1:31:44

And I was going to leave a post

1:31:47

in r slash privacy,

1:31:48

and it got instantly deleted.

1:31:49

Or I was going to leave a comment

1:31:50

because while I was there,

1:31:51

I saw something that I was like, yeah,

1:31:52

I can weigh in on this.

1:31:53

Instantly deleted because I haven't been

1:31:55

on Reddit in so long that now my,

1:31:56

like, user quality score, whatever crap,

1:31:58

is, like, zero.

1:32:00

And so I can't post anywhere.

1:32:03

And it's like, okay,

1:32:05

so what am I just supposed to randomly

1:32:07

comment on posts I don't care about so

1:32:09

that Reddit thinks I'm not a bot?

1:32:10

It's a hard system to game,

1:32:11

at least for me,

1:32:12

because I don't care enough to figure it

1:32:13

out.

1:32:15

I guess what I'm getting at is some

1:32:17

sites make it really hard to

1:32:18

compartmentalize like that.

1:32:20

But yeah,

1:32:22

especially nowadays,

1:32:23

if you're planning to be more politically

1:32:24

active where that's clearly going to put

1:32:25

you under a microscope,

1:32:27

it's probably not a bad idea to have

1:32:29

your normal Reddit account where you only

1:32:31

talk about sports and TV shows and have

1:32:33

your other Reddit account where you talk

1:32:36

more about politics and stuff.

1:32:38

Yeah.

1:32:39

It's sad that we're moving into a world

1:32:41

where just normal,

1:32:41

peaceful protesters need to

1:32:42

compartmentalize.

1:32:43

That's not a good trend.

1:32:46

Yeah.

1:32:47

And then this is just real quick.

1:32:49

This is something that Jonah touched on

1:32:50

recently.

1:32:51

You said Facebook and Google are

1:32:53

collecting your data so they can

1:32:54

personalize ads.

1:32:55

Government collects your data so they can

1:32:56

ruin your life if they decide to.

1:32:58

A couple weeks ago when Jonah and I

1:33:00

talked about the UK trying to ban VPNs,

1:33:03

it sounds so –

1:33:06

I hate that I'm saying this because I

1:33:07

know that I sound crazy,

1:33:08

but unfortunately,

1:33:09

this is the direction we're headed in.

1:33:11

He pointed out that a lot of the

1:33:13

time,

1:33:13

authoritarian countries will outlaw very,

1:33:15

very tiny things for the sole purpose that

1:33:18

they can come after you anytime they want.

1:33:20

And it's almost like in the military,

1:33:24

they had what was called a – because

1:33:26

this article has got me thinking about the

1:33:27

military now.

1:33:28

They had what was called a – I

1:33:29

think it was – I want to say

1:33:31

it was Article 34,

1:33:32

but I think I might be confusing that

1:33:33

with Rule 34.

1:33:35

Um, but anyways,

1:33:36

they have this article that's literally a

1:33:38

catch-all.

1:33:39

Um, and it's like,

1:33:40

that's literally what it is.

1:33:41

It has no purpose other than to be

1:33:43

like,

1:33:44

we want to nail you against the wall.

1:33:45

So we're going to throw this charge on

1:33:46

there too.

1:33:47

That is the whole point of it.

1:33:48

It's just like a general,

1:33:49

like you've done something generally not

1:33:51

okay.

1:33:52

And we disapprove of it,

1:33:53

but it's not covered by anything else.

1:33:55

And sometimes it's a standalone thing to

1:33:57

like get you in trouble when you finally

1:33:58

screw up too much and somebody doesn't

1:34:00

like you.

1:34:00

Sometimes they tack it onto like six other

1:34:02

charges just to like pour salt in the

1:34:03

wound.

1:34:04

But yeah, it's like the same thing.

1:34:06

When all this data is collected,

1:34:09

nobody ever goes through any significant

1:34:12

period of their life without doing

1:34:13

something illegal, without jaywalking,

1:34:15

without accidentally littering.

1:34:17

And yeah,

1:34:19

this stuff can be weaponized against you,

1:34:21

unfortunately.

1:34:22

Again, I know I sound crazy,

1:34:25

but that's kind of the direction we're

1:34:26

heading in,

1:34:27

where every single thing is now making

1:34:29

people suspect.

1:34:30

And going back to what I was saying

1:34:31

earlier about saying this for years...

1:34:36

I hate being right.

1:34:37

Yeah,

1:34:37

I think it's justified to feel frustrated,

1:34:41

especially when a lot of times I think

1:34:43

a lot of people in a lot of

1:34:45

countries,

1:34:46

they don't feel like they have any say

1:34:48

over the authority that's imposed on them.

1:34:51

Like by the government,

1:34:52

they can kind of just say,

1:34:54

do what they can, protest,

1:34:57

vote every once in a while,

1:34:59

and the government still passes ridiculous

1:35:01

laws that allow data brokers to collect

1:35:03

all your information,

1:35:04

even though most people would be against

1:35:06

it.

1:35:07

It's kind of the problem with, you know,

1:35:12

centralization like that.

1:35:14

But I think it's, yeah,

1:35:17

you're right to feel frustrated with that.

1:35:20

And I think it's also a problem

1:35:24

pretty tied to privacy as well

1:35:28

because a lot of times these companies get

1:35:30

away with doing all this stuff with your

1:35:32

information because you know there's

1:35:34

people in the government who benefit from

1:35:39

allowing this to continue so they don't

1:35:42

have any reason to change the things that

1:35:44

affect a lot of people.

1:35:48

So that's my take on it.

1:35:49

I guess on your comment, Cannabida,

1:35:52

I think it's a good post.

1:35:55

But yeah, I mean,

1:35:56

I don't really have anything more to add

1:35:58

here.

1:35:59

I guess we can kind of move into

1:36:00

the forum updates now.

1:36:03

So in a minute, though,

1:36:04

we'll start by taking some viewer

1:36:07

questions.

1:36:07

There was quite a lot of activity on

1:36:09

the forum thread this time.

1:36:10

So you can leave them there or you

1:36:11

can leave them in the chat and we'll

1:36:13

just pop them up on screen as well.

1:36:16

But if you've been holding out

1:36:18

any questions on the stories that we've

1:36:20

been talking about so far,

1:36:22

definitely go ahead and start leaving them

1:36:25

in the chat or in the forum thread.

1:36:28

But for now,

1:36:29

let's check in on our community forums.

1:36:32

So there's always a lot of activity going

1:36:34

on there.

1:36:35

And especially in the last week or two,

1:36:37

there's been quite a lot of controversial

1:36:38

stories being shared and a lot of

1:36:40

discussion around different topics.

1:36:42

If you're not already a member,

1:36:44

definitely consider joining at

1:36:47

discuss.privacyguides.net.

1:36:51

So the first post that we want to

1:36:53

touch on is...

1:36:56

Basically, it was a post,

1:36:58

it was a news post on the forum,

1:37:01

and there was a discussion around Google

1:37:03

fulfilling an ICE subpoena demanding

1:37:06

student journalists' bank and credit

1:37:09

details.

1:37:11

So this is kind of the issue that

1:37:12

we were talking about

1:37:14

at the start,

1:37:16

when you give all this information to

1:37:18

Google,

1:37:18

then when there is a law enforcement

1:37:20

request,

1:37:21

they can actually hand it over depending

1:37:23

on, you know, what...

1:37:26

law enforcement requests they receive.

1:37:28

So it is kind of unfortunate that this

1:37:31

student had their details handed over

1:37:34

against their will

1:37:36

and against their consent, I guess,

1:37:39

because Google had it in plain text.

1:37:42

So that is a problem with these

1:37:44

tools, if it's not encrypted,

1:37:45

then Google is free to share that

1:37:49

information.

1:37:50

So there was some discussion there in that

1:37:53

thread.

1:37:53

But yeah,

1:37:54

do you have any thoughts on this one,

1:37:55

Nate?

1:37:59

Um,

1:38:00

thoughts on the story itself or thoughts

1:38:02

on the thread?

1:38:02

Cause I'm looking through the thread and

1:38:03

the thread kind of turned into a

1:38:05

discussion about de-Googling,

1:38:06

which I think is a really cool topic

1:38:09

that I'm always down to talk about, but.

1:38:13

Yeah, the story,

1:38:14

it looks like this person went to a

1:38:16

protest.

1:38:17

We were just talking about that.

1:38:19

Attended a protest targeting companies

1:38:20

that supplied weapons to Israel at a

1:38:22

Cornell University job fair in

1:38:23

2024.

1:38:25

According to this article,

1:38:26

they were there for about five minutes,

1:38:28

but that was enough to get them banned

1:38:29

from campus.

1:38:31

And then.

1:38:33

Yeah, DHS requested,

1:38:34

I'm assuming this was a large request and

1:38:37

not just this person specifically,

1:38:39

but probably as part of a request.

1:38:42

DHS requested data about this

1:38:44

person from Google and Meta.

1:38:47

This included usernames, addresses,

1:38:49

an itemized list of services,

1:38:50

including any IP masking services.

1:38:52

So like VPNs,

1:38:53

telephone or instrument numbers,

1:38:55

subscriber numbers or identities,

1:38:56

credit card, bank account.

1:38:59

Excuse me. And yeah, apparently this

1:39:01

person just found this out, so... whoo, that's

1:39:06

crazy. The letter asks the companies to

1:39:10

provide users with as much notice as

1:39:11

possible before complying. Ooh, so maybe

1:39:16

Google was in the wrong here. Oh,

1:39:17

no, no. Oh, sorry, I'm skipping around. That

1:39:20

was a letter from the EFF to big

1:39:24

tech companies telling them to change how

1:39:25

they

1:39:26

comply with information requests.

1:39:28

So, excuse me.

1:39:31

Yeah.

1:39:31

It's pretty interesting.

1:39:33

And I think it's, like you said,

1:39:36

I mean,

1:39:40

it's two sides of the same coin, right?

1:39:41

Like that's why people started talking

1:39:42

about de-Googling.

1:39:43

If information isn't encrypted,

1:39:45

isn't

1:39:46

zero-knowledge.

1:39:48

If the company's whole point is to know

1:39:49

who you are and know every single thing

1:39:51

about you.

1:39:52

And as Google famously said, you know,

1:39:54

to read your mind,

1:39:55

then that's data that they can turn

1:39:58

over and they can hand over.

1:39:59

And so trying to move away from these

1:40:00

kinds of bigger companies into things that

1:40:03

are more privacy-respecting,

1:40:05

self-hosting,

1:40:05

certainly if you have that technical

1:40:07

expertise, but not everybody does.

1:40:09

Um, you know,

1:40:10

just things that are all done on device.

1:40:12

Like a lot of the discussion here was

1:40:14

about, like, Google Maps and alternatives to

1:40:15

Google Maps.

1:40:16

So, um, anything that's done on device,

1:40:19

anything that stores your data in an

1:40:20

encrypted fashion where they can't access

1:40:22

it or tries not to store your data

1:40:23

at all.

1:40:24

Uh, I mean,

1:40:25

that's kind of what we're all about at

1:40:26

Privacy Guides, big fans of that stuff.

1:40:30

Yeah, I think, you know,

1:40:31

moving away from as many Google services

1:40:34

is

1:40:35

always going to be a benefit for your

1:40:37

privacy because just the nature of Google,

1:40:39

they're just one of the largest data

1:40:42

collectors in the world.

1:40:43

Um,

1:40:43

any data that you can avoid giving to

1:40:45

them is, I would say, a good thing.

1:40:48

But I guess in this case,

1:40:52

I guess this is sort of, uh,

1:40:53

I can't actually read the article because

1:40:55

it requires an email.

1:40:57

Um, but I'm going to assume

1:40:59

there was, uh,

1:41:00

a geofencing situation.

1:41:05

Is that how they were able to identify?

1:41:08

Let me see.

1:41:08

Let me go ahead and read because I

1:41:10

have it pulled up here and I'm skimming

1:41:13

it, but let's see.

1:41:19

Full extent of the information the agency

1:41:21

sought.

1:41:24

The subpoena provides no justification for

1:41:26

why ICE is asking for this information.

1:41:30

It just requests that Google not

1:41:32

disclose the existence of the summons for

1:41:34

an indefinite period of time. Good God

1:41:37

almighty. Yeah, this dude doesn't

1:41:42

even live in the country anymore. That's

1:41:43

crazy. I don't know, it doesn't

1:41:46

specify how they got this person's

1:41:49

information or why they were part of this,

1:41:53

and I mean, I could be wrong,

1:41:54

maybe this was specifically

1:41:56

targeted at that person, but

1:42:00

Yeah,

1:42:00

it's kind of light on details on that,

1:42:02

actually.

1:42:04

Yeah, that is kind of unfortunate.

1:42:06

I would have hoped there was more

1:42:07

information specifically about that.

1:42:09

But I think, you know,

1:42:11

avoiding using Google services,

1:42:14

like some people were in this thread,

1:42:15

they were talking about, you know,

1:42:16

avoiding using Google Maps and stuff.

1:42:18

I think, you know...

1:42:21

We could maybe touch a little bit on

1:42:23

protest OPSEC.

1:42:24

But generally,

1:42:25

you don't really want to be using

1:42:27

navigation things that are sending

1:42:30

information to the cloud,

1:42:31

because your location at

1:42:34

a protest could be recorded.

1:42:37

And generally,

1:42:39

it's a good idea to be on the

1:42:41

safe side,

1:42:42

even though you're completely within your

1:42:44

rights to do so.

1:42:45

you know,

1:42:46

it's better to be safe than sorry and

1:42:48

not have your location tied back to you

1:42:50

because this sort of thing can happen.

1:42:52

Like ICE agents can subpoena your

1:42:54

information from Google,

1:42:56

like what the heck.

1:42:58

But yeah, I think it goes without saying,

1:43:02

if you're going to a protest,

1:43:03

I would avoid bringing a phone entirely if

1:43:06

you can.

1:43:06

But I do realize some people might need

1:43:09

communication methods and also mobile

1:43:12

phones are quite good for recording any

1:43:13

activity at the protest because there's

1:43:16

always things that happen at protests

1:43:19

which need to be recorded for

1:43:23

transparency.

1:43:24

So I think that's also another thing but

1:43:28

definitely using, like, a de-Googled device

1:43:30

is going to reduce the amount of location

1:43:33

information.

1:43:34

But there's always concerns with,

1:43:36

you know, cell tower triangulation,

1:43:38

that sort of thing,

1:43:39

especially at these large events.

1:43:41

So there's always risks, I guess.

1:43:45

But I personally would just avoid taking a

1:43:47

mobile phone entirely because,

1:43:50

you know,

1:43:51

there are separate devices you can use.

1:43:52

Like you can bring a camera to record.

1:43:56

I know some people use walkie-talkies and

1:43:58

stuff.

1:43:59

So I don't know.

1:44:00

It's definitely...

1:44:02

a tricky, uh, situation,

1:44:05

especially when there's a lot of these, uh,

1:44:10

subpoenas coming.

1:44:11

And I do wonder, this says ICE subpoena,

1:44:14

like I've been seeing information about,

1:44:16

um,

1:44:17

the validity of some of ICE's subpoenas.

1:44:19

Like,

1:44:19

are they as valid as a judicial like

1:44:22

warrant or like a, like a, you know,

1:44:24

I'm not really sure what the term is

1:44:25

in the US, like a,

1:44:28

Do you know what that is?

1:44:31

So this article,

1:44:31

I know what you're talking about.

1:44:33

You're talking about, real quick,

1:44:37

this article doesn't specify,

1:44:38

but it says that they identified this guy

1:44:40

through his Gmail account.

1:44:41

So I think you're onto something with the

1:44:42

geofence warrant thing.

1:44:44

And this was – I think recently geofence

1:44:46

warrants have been outlawed or at least

1:44:48

reined in a little bit.

1:44:49

But this was like last – or two

1:44:51

years ago now,

1:44:52

so that was probably before that.

1:44:54

But yeah,

1:44:55

so basically there's two kinds of

1:44:57

warrants.

1:44:58

There's an administrative warrant,

1:45:00

and I think the one you're talking about

1:45:01

is called like a judge's warrant or

1:45:02

something.

1:45:03

And basically an administrative warrant

1:45:06

never goes in front of a judge.

1:45:08

It's basically –

1:45:10

I mean, I'm not a lawyer, obviously,

1:45:12

but from what I understand,

1:45:13

it's basically just like a fancy

1:45:16

letterhead.

1:45:17

Like,

1:45:17

it's really just ICE asking nicely with a

1:45:19

fancy letterhead, like,

1:45:20

please give us this data.

1:45:22

And companies absolutely do not have to

1:45:24

comply with that because a judge hasn't

1:45:26

signed it.

1:45:26

There's no actual, like,

1:45:28

legal enforcement behind it.

1:45:30

But –

1:45:32

We have seen this in all the big

1:45:34

tech companies.

1:45:35

And to be fair,

1:45:36

this is not unique to right now.

1:45:38

All the big tech companies will always

1:45:40

bend over backwards to suck up to

1:45:42

whoever's in office because not to be too

1:45:44

political,

1:45:44

but like they're going to outlast them,

1:45:46

right?

1:45:47

In three years,

1:45:48

Donald Trump's going to be gone.

1:45:50

But Apple's still going to be here.

1:45:52

Google's still going to be here.

1:45:53

Meta's still going to be here.

1:45:54

So to them, this is just a game.

1:45:55

It's like, okay, whatever.

1:45:56

Like,

1:45:57

let's make this guy feel good about

1:45:58

himself.

1:45:58

Again, both parties.

1:45:59

I'm not picking on Trump here.

1:46:00

Let's make this guy feel good about

1:46:02

himself for four years and then he'll shut

1:46:03

up and go away and he won't be

1:46:04

our problem anymore in four to eight years.

1:46:06

And we'll just keep doing what we've been

1:46:08

doing.

1:46:08

This is business as usual for them.

1:46:09

So yeah,

1:46:10

they're not at all incentivized to protect

1:46:12

your data.

1:46:13

They're incentivized to not cause problems

1:46:15

so that the government doesn't cause

1:46:16

problems for them.

1:46:17

It's a quid pro quo.

1:46:19

That's a tongue twister, but yeah.

1:46:23

But yeah,

1:46:23

I've been seeing those articles you're

1:46:24

talking about too.

1:46:25

A lot of these subpoenas

1:46:27

don't have actual enforcement power behind

1:46:29

them.

1:46:29

It's just the government asking nicely and

1:46:31

they're totally going with it because it's

1:46:36

just easier for them.

1:46:37

So yeah.

1:46:44

Did you have anything else you wanted to

1:46:45

add or should we move on to questions

1:46:48

in the forum?

1:46:50

Yeah,

1:46:50

I think we should move on here to

1:46:52

the questions from viewers.

1:46:55

So we'll start with questions on our forum

1:46:57

thread first.

1:46:58

And firstly,

1:46:59

we'll look at any comments that are left

1:47:01

by some of our paying members.

1:47:03

And you can become a member by visiting

1:47:05

privacyguides.org and clicking the red

1:47:07

heart icon in the top right hand corner

1:47:10

of the page.

1:47:11

And yeah,

1:47:12

so we'll dive right into that forum

1:47:13

thread.

1:47:14

Is there anything you can see there, Nate,

1:47:15

that pops out to you right now?

1:47:19

So I checked in a little bit throughout

1:47:21

the week.

1:47:22

I haven't really been logging into the

1:47:23

forum lately, to be honest,

1:47:24

but I do occasionally check to see what

1:47:27

kind of threads are pretty popular and

1:47:30

what people are talking about.

1:47:32

And somebody actually turned this into a

1:47:35

question,

1:47:36

so I guess we'll go ahead and discuss

1:47:38

it.

1:47:38

But there was a big discussion about...

1:47:43

The term normie and whether or not that

1:47:46

is a label that should be used.

1:47:53

So, yeah,

1:47:54

I don't know how Jordan feels about

1:47:56

this one,

1:47:56

but I'm trying to move away from that

1:47:57

term personally,

1:47:58

because I do think it can, intentionally or

1:48:00

not,

1:48:01

come off as very

1:48:02

demeaning.

1:48:05

And even if you're not talking to people

1:48:07

who would fall under that category,

1:48:09

I think it's just very demeaning.

1:48:12

I don't know.

1:48:12

It's kind of, you know,

1:48:13

I always say like,

1:48:14

don't put anything in a format you

1:48:15

wouldn't want to be made public.

1:48:16

Right.

1:48:16

And so I wouldn't want to be caught

1:48:17

with somebody like, oh,

1:48:18

you had this private chat where you called

1:48:20

me a normie.

1:48:20

Like, that's really rude.

1:48:21

That's really messed up.

1:48:22

And, you know,

1:48:23

I think it's just I talk about this

1:48:26

a lot, too.

1:48:26

I think people just have different

1:48:28

interests.

1:48:28

You know,

1:48:28

some people are like super into cars and

1:48:30

they can tell you everything about how a

1:48:31

car works.

1:48:31

And some people are super into sports and

1:48:34

the Super Bowl after I just got done

1:48:35

trashing that.

1:48:36

But, you know,

1:48:37

I think just because we're super,

1:48:38

super into tech doesn't

1:48:40

make everybody else a normie or make

1:48:42

anybody better than anyone else. But,

1:48:45

um, yeah, it's a term I'm trying

1:48:46

to move away from personally. Yeah, I

1:48:51

don't know if you have thoughts on that.

1:48:53

I think this sort of falls into

1:48:55

this interesting thing here where we say,

1:48:57

like, you know, what's normal? I

1:49:02

don't know, I think this is kind of

1:49:03

a little bit

1:49:06

boxing people into a certain thing.

1:49:08

Um,

1:49:09

I'm not really a fan of that idea.

1:49:10

I think, you know,

1:49:12

normal is kind of... what even is that?

1:49:14

Like

1:49:15

that's pretty broad.

1:49:16

Right.

1:49:17

And I think calling people

1:49:19

a normie.

1:49:21

I don't know.

1:49:22

I don't think you should say that to

1:49:23

someone's face.

1:49:23

Like I wouldn't like to be called a

1:49:25

normie.

1:49:25

Like that's not very nice.

1:49:27

Um,

1:49:27

so I always go for like less privacy

1:49:31

conscious people or something like that.

1:49:33

Um,

1:49:34

so I always go for something that's a

1:49:35

little bit more neutral.

1:49:36

It

1:49:37

doesn't have a negative connotation, um,

1:49:40

or can be perceived in a negative way.

1:49:41

It's true.

1:49:42

Like if someone isn't as

1:49:45

concerned about their privacy,

1:49:46

then

1:49:47

they're less privacy-conscious.

1:49:50

So I think something like that, or just,

1:49:52

you know, I don't know.

1:49:56

I can't really think of any other way

1:49:58

to address this, I guess,

1:49:59

or address someone like that.

1:50:00

But I think

1:50:02

it's sort of

1:50:04

an othering thing.

1:50:05

It's like trying to other somebody.

1:50:08

And I'm not really a fan of that

1:50:09

language personally.

1:50:13

Yeah, I don't know if it's any better,

1:50:14

but that's why I've started using the term

1:50:15

mainstream users because it's not – I

1:50:18

don't know.

1:50:19

Maybe that's still not the best term,

1:50:20

but I think I like yours less privacy

1:50:23

conscious because it's more – when

1:50:25

something goes mainstream and it catches

1:50:27

on in the masses,

1:50:27

there's a certain way that people use

1:50:29

things,

1:50:29

whether that's music or tech or something,

1:50:32

and there's just a certain way that people

1:50:34

interact with it where it's more –

1:50:37

I guess it's more casual to them.

1:50:39

These are all band tattoos on my arms,

1:50:41

and I have met plenty of people that

1:50:42

are into these bands,

1:50:44

but not enough to get tattoos.

1:50:46

So it doesn't make me any better than

1:50:49

them.

1:50:49

I'm not more of a super fan,

1:50:51

especially some of these bands.

1:50:52

I've met people that are like,

1:50:53

I don't even know all the words,

1:50:54

and you do.

1:50:54

So it's very, I don't know.

1:50:57

It's just trying to acknowledge, I guess,

1:50:59

that

1:51:00

Yeah, it's just trying to remove that.

1:51:02

I don't really like that term either,

1:51:03

especially also because I also just don't

1:51:05

like anything that comes out of deep

1:51:07

internet culture, to be honest.

1:51:08

And I feel like that term does.

1:51:09

But yeah,

1:51:12

trying to find something more neutral so

1:51:13

that people don't feel like I'm talking

1:51:14

down to them would be really nice.

1:51:18

Yeah,

1:51:18

I'd say that's probably a good direction

1:51:22

to go generally.

1:51:23

So there was quite a lot of discussion

1:51:25

in that thread around, you know,

1:51:27

is this a good idea,

1:51:27

is this a bad idea?

1:51:28

I guess we've sort of shared our thoughts

1:51:30

on this.

1:51:32

And there was another person who mentioned

1:51:34

there was an age verification bypass tool.

1:51:39

I mean,

1:51:41

I'm not sure if we can really comment

1:51:42

on that.

1:51:43

That might be a slightly legally grey area,

1:51:46

I would say.

1:51:48

It's there.

1:51:50

You can use it if you want.

1:51:52

We're not going to promote that,

1:51:54

but that's a thing that you can do.

1:51:56

I think it's only going to cause more

1:52:00

harsher restrictions in the future because

1:52:03

as soon as they figure out people using

1:52:04

these tools,

1:52:05

they're going to require ID documents for

1:52:08

everybody.

1:52:08

They're not going to do these age

1:52:10

estimation techniques anymore.

1:52:12

So I think it's, you know,

1:52:15

get in while you can,

1:52:17

but I think this is only going to

1:52:19

get worse if there's people bypassing it.

1:52:22

And, of course,

1:52:23

there's always going to be people

1:52:24

bypassing it.

1:52:25

So it's kind of inevitable that it becomes

1:52:27

ID documents, like it or not.

1:52:31

So I think that's the direction things are

1:52:33

going in.

1:52:33

But, I mean,

1:52:35

it certainly is a work in progress.

1:52:40

Yeah.

1:52:43

I was just going to say,

1:52:43

I only saw it in one place,

1:52:45

so I don't know how true it is,

1:52:47

but apparently Discord is talking about

1:52:49

switching their ID verification service to

1:52:53

a different provider,

1:52:54

specifically in response to this script

1:52:56

that you're talking about.

1:52:57

And like I said, I don't even know,

1:52:59

would Discord have a way of knowing

1:53:01

potentially who used it and who used the

1:53:04

script versus who genuinely used the

1:53:06

service that it's tricking and reversing

1:53:08

that?

1:53:09

Or would that make them more likely to

1:53:13

flag you as somebody who needs to age

1:53:15

verify because you tried to use this

1:53:16

script?

1:53:17

Maybe you're a minor.

1:53:19

Yeah,

1:53:19

I feel like it could backfire for sure.

1:53:22

So...

1:53:24

Yeah, um, real quick before we move on

1:53:28

to other questions. When they were

1:53:30

having this normie discussion, one

1:53:33

person said... well,

1:53:36

somebody said, you know, this is the reason

1:53:37

that Privacy Guides and the community exist,

1:53:39

to spread tech awareness to those who

1:53:41

don't yet know or care. And somebody

1:53:44

else said, you know, maybe someday we'll be

1:53:45

able to reach some people, but most of

1:53:46

the normies won't

1:53:48

even hear about this community,

1:53:49

unfortunately, too busy with their lives. I

1:53:52

mean,

1:53:54

I want to make it clear,

1:53:55

I don't think it's everybody's job to

1:53:56

teach everybody,

1:53:58

especially in all kinds of subjects.

1:53:59

But this is why we ask you guys

1:54:02

to share videos, share social media posts.

1:54:05

Look for those opportunities when somebody

1:54:07

is like, oh,

1:54:09

I'm having a hard time remembering

1:54:10

passwords.

1:54:10

There's too many passwords.

1:54:12

Send them the Privacy Guides page about

1:54:13

password managers.

1:54:14

Somebody asks you about VPNs,

1:54:16

send them the page about VPNs.

1:54:18

Just look for those little opportunities

1:54:19

to kind of spread the word, I think,

1:54:21

because you don't want to be too heavy

1:54:23

handed with it.

1:54:24

But yeah, you're right.

1:54:25

If we just depend on people to magically

1:54:27

find their way to the forum or to

1:54:28

Privacy Guides, like some people will,

1:54:30

but a lot of people won't without a

1:54:33

little bit of a nudge and a little

1:54:33

bit of help.

1:54:34

So yeah,

1:54:36

I just wanted to address that one.

1:54:38

Exactly.

1:54:39

Yeah,

1:54:39

there was one person at the end of

1:54:41

this forum thread under the username "me",

1:54:46

and they had some questions around what

1:54:48

we're talking about,

1:54:49

about the Discord stuff.

1:54:51

So first question was,

1:54:52

do you think Discord is testing the waters

1:54:54

for ID verification?

1:54:55

We kind of talked about that before.

1:54:59

Yes,

1:55:00

I think they're going to move towards that

1:55:03

eventually,

1:55:04

especially because people keep bypassing

1:55:05

it.

1:55:07

And I don't think like age estimation

1:55:09

technology is very good because like we

1:55:11

talked about before, it's kind of racist.

1:55:13

It's kind of sexist.

1:55:14

It doesn't really...

1:55:15

equally verify people based on their

1:55:18

appearance. It's kind of problematic

1:55:20

like that. Also, you're just scanning

1:55:22

people's faces, which, you know, biometrics

1:55:25

are kind of like fingerprints. It's hard to

1:55:27

change your face, like you only have one

1:55:30

face, and

1:55:32

It's like your fingerprint, right?

1:55:33

Like it's identifiable,

1:55:35

extremely identifiable.

1:55:37

You can't change it.

1:55:38

That's a problem.

1:55:40

Especially because, you know,

1:55:41

these companies are saying,

1:55:43

we'll delete it.

1:55:44

Don't worry.

1:55:44

We'll delete it straight after.

1:55:45

And it's like, all right, well,

1:55:48

I guess we'll see.

1:55:53

Yeah.

1:55:53

Pinky promise.

1:55:54

Right.

1:55:55

I actually I want to point out, though,

1:55:57

when you were saying like you can't change

1:55:59

your face.

1:56:00

I read that that's actually how some

1:56:01

people are getting around this in like

1:56:04

other places where age verification has

1:56:05

already been enforced,

1:56:06

like in the UK. It's, I'm assuming, mostly

1:56:09

women because, you know,

1:56:10

women generally tend to be better with

1:56:12

makeup.

1:56:13

They're putting on makeup to make

1:56:14

themselves look older than they really

1:56:16

are.

1:56:17

And I know I've definitely seen, I

1:56:19

don't know about y'all, but my wife has

1:56:20

shown me videos on TikTok of somebody

1:56:22

who's, like, a very masculine guy and then

1:56:25

puts on makeup and it's like the most

1:56:27

beautiful woman you've ever seen, or vice

1:56:28

versa. And so, like, yeah, just adding to

1:56:31

your point about how this isn't gonna

1:56:32

work. And I understand that not everybody

1:56:34

can do that. I sure can't do that.

1:56:35

I don't know the first thing. I know what

1:56:37

mascara is, I know what eyeliner is, I

1:56:38

know what lipstick is. That's the extent of

1:56:40

my makeup knowledge.

1:56:41

So obviously, not everybody can do that.

1:56:44

But it just goes to show how

1:56:46

trying to guess it with biometrics is

1:56:48

horribly flawed.

1:56:50

And yeah,

1:56:51

they're probably going to have to

1:56:53

tighten it up, which I'm not happy about.

1:56:55

But yeah.

1:56:57

Yeah, they'll be releasing the, uh...

1:57:00

you'll have to do a, uh, a blood

1:57:02

donation, you'll have

1:57:04

to test your cells to

1:57:06

see how old you are. We've partnered with

1:57:08

23andMe. I don't know, hopefully not. That's

1:57:12

like a Black Mirror episode or something.

1:57:15

Um, but, uh, so many headlines now were

1:57:18

once Black Mirror episodes, and I'm not

1:57:20

even being sarcastic.

1:57:22

It makes me so mad.

1:57:23

It is kind of unfortunate.

1:57:25

And I guess the next question that "me"

1:57:26

had was,

1:57:27

do you think this will affect Discord's user

1:57:29

base in any significant way?

1:57:32

I'm a, I guess,

1:57:33

I don't know if this is the right

1:57:34

word, nihilist.

1:57:35

I don't think this is going to basically,

1:57:38

it is going to put a little bit

1:57:39

of a dent, I think.

1:57:41

Like at the start,

1:57:42

it's going to cause a little bit of

1:57:43

a dent.

1:57:44

Like right now,

1:57:44

there's definitely people leaving.

1:57:46

But I think it's the problem with these

1:57:48

community things, right?

1:57:49

Because it's fine if you move.

1:57:51

I don't know if anyone here has done

1:57:52

this before, but people have been like,

1:57:55

that's it.

1:57:55

I'm moving to Signal.

1:57:56

I'm ditching WhatsApp forever.

1:57:58

And if nobody else follows you,

1:58:01

you're going to go back to WhatsApp.

1:58:03

It's kind of the problem.

1:58:04

You need everybody to be up and want

1:58:07

to do that as well.

1:58:08

And I just don't think that it's

1:58:11

particularly easy to just up and move your

1:58:15

entire community to a different platform.

1:58:17

And people are very resistant to change,

1:58:20

especially when everyone's been enjoying

1:58:21

Discord since 2016.

1:58:23

They're enjoying it apart from the age

1:58:27

verification stuff.

1:58:28

So

1:58:30

I don't know.

1:58:30

There's always people who are very

1:58:32

critical of Discord,

1:58:33

but I think they are very much a

1:58:35

vocal minority on Reddit,

1:58:37

on internet platforms.

1:58:38

I think a lot of people just use

1:58:40

the platform and don't really care.

1:58:41

So that's my nihilistic opinion, I guess.

1:58:47

Sadly, I agree with you.

1:58:48

And I have seen,

1:58:49

I think it was even in that Ars

1:58:50

Technica article we showed at the

1:58:51

beginning,

1:58:53

Discord explicitly said that they expect

1:58:56

that some people are going to be upset

1:58:57

and leave.

1:58:59

And real quick,

1:59:00

I kind of want to go back to

1:59:01

something you said, I think,

1:59:02

when we were having that discussion.

1:59:04

If you're a creator of any kind,

1:59:05

which I know is probably not most people

1:59:07

watching,

1:59:08

but if you are some kind of a

1:59:09

creator,

1:59:09

and most of our viewers probably have

1:59:11

already thought about this,

1:59:12

but if you are a content creator of

1:59:14

any kind,

1:59:15

You, in my opinion,

1:59:16

you desperately need to be thinking about

1:59:18

diversifying your community because this

1:59:21

could happen anywhere.

1:59:23

Reddit has already done this once.

1:59:24

Discord is now doing this.

1:59:26

Facebook has done this like five hundred

1:59:27

times.

1:59:28

Twitter could do this. Like, any platform

1:59:30

you're using could change their terms of

1:59:32

service tomorrow.

1:59:33

And it just sucks to suck.

1:59:34

So like having parallel communities,

1:59:37

having Discord and Matrix,

1:59:38

having Facebook,

1:59:39

or Twitter and Mastodon.

1:59:42

You know, like at Privacy Guides,

1:59:43

we have Ghost as one way to support

1:59:46

us.

1:59:46

We have YouTube subscriptions.

1:59:48

We have cryptocurrency.

1:59:49

Putting all your eggs in one basket,

1:59:52

we often talk about that in terms of

1:59:54

our data.

1:59:54

Like some people, rightfully so,

1:59:57

I'll keep this a short rant.

1:59:58

Some people don't want to put everything

2:00:00

in Proton,

2:00:01

not because they don't trust Proton

2:00:02

necessarily,

2:00:03

but because that's all your eggs in one

2:00:04

basket, your email, your VPN,

2:00:05

your passwords, your cloud storage,

2:00:07

totally valid.

2:00:08

So same thing if you are in any

2:00:10

kind of a situation where you have control

2:00:12

over your community,

2:00:13

whether you're a creator,

2:00:14

whether you're an advisor to somebody,

2:00:16

definitely recommend like, hey,

2:00:18

we don't necessarily have to leave

2:00:19

Discord,

2:00:20

but what if we did spring up a

2:00:21

Matrix server and start building over

2:00:22

there too?

2:00:23

And then when something like this happens,

2:00:25

you're not rebuilding from scratch and

2:00:26

you're not trying to convince everybody.

2:00:28

And it'll be a whole lot easier too

2:00:29

when people find out.

2:00:29

It's like, you know, hey,

2:00:31

if you're pissed off at Discord,

2:00:32

we have a parallel community.


2:00:41

The last thing I wanted to add is

2:00:42

I don't think a lot of people are

2:00:43

going to leave,

2:00:43

but one thing I think might be effective,

2:00:45

I've seen a lot of people canceling their

2:00:46

Nitro.

2:00:47

And I think that might be a great

2:00:50

compromise for a lot of people who

2:00:52

maybe...

2:00:54

maybe don't feel like Matrix is a good

2:00:55

alternative.

2:00:56

Um, canceling your Nitro,

2:00:58

if enough people do it, will absolutely,

2:01:01

uh, scare Discord.

2:01:02

And I mean, I get it.

2:01:05

I don't pay for Nitro,

2:01:06

but there have been times I'm like, man,

2:01:07

I kind of wish I did.

2:01:08

Cause that would be a really nice feature,

2:01:10

but you can still use it without Nitro.

2:01:12

And again, if enough people do it,

2:01:15

you can kind of eat your cake and

2:01:16

have it to where you can send a

2:01:18

message without having to fully leave the

2:01:19

platform.

2:01:20

I don't think it would be as extreme

2:01:21

as everybody leaving, but yeah,

2:01:23

Yeah, I don't know.

2:01:24

I just wanted to throw that out there

2:01:25

personally.

2:01:27

I think, yeah,

2:01:28

it's good if you are a Nitro subscriber.

2:01:30

The issue is that they get you with that,

2:01:33

because there's a badge for being

2:01:35

subscribed to Nitro for a certain amount

2:01:37

of time.

2:01:38

And if you cancel your subscription,

2:01:40

it starts over again.

2:01:41

So I think they've kind of built in

2:01:43

all these little...

2:01:44

uh, ways to keep people on the platform,

2:01:47

uh, to keep paying for that subscription.

2:01:50

Personally, I don't think I could use

2:01:52

Discord without Nitro. I can't go without

2:01:54

my stickers, I don't know what I'm gonna

2:01:55

do. Um, but yeah, I'm lucky I'm not

2:01:58

on that platform anymore, and I'm on other

2:02:00

ones where stickers aren't a paid feature,

2:02:02

because why should they be? That's silly. Um,

2:02:06

but yeah, I think

2:02:09

you know, like Nate said,

2:02:10

setting up alternatives for your

2:02:12

community.

2:02:13

I think a lot of people don't really,

2:02:15

you know,

2:02:16

a lot of people just use Discord for

2:02:17

like chatting to their friends and like

2:02:19

gaming stuff.

2:02:20

And I think move that group chat over

2:02:23

to Signal.

2:02:24

Works fine.

2:02:25

Move that group chat,

2:02:27

move it over to Signal or Matrix or

2:02:30

any of the other recommendations we have,

2:02:33

I don't think it will be a terrible

2:02:35

experience.

2:02:36

And I think you'll avoid a lot of

2:02:38

the awful stuff with Discord.

2:02:40

I think a lot of times Discord is

2:02:43

in a position where they can kind of

2:02:45

leverage things and do crappy stuff

2:02:47

because everyone's there and they don't

2:02:49

want to move.

2:02:50

So this is another instance of them being

2:02:52

like, well, too bad.

2:02:55

This is our platform.

2:02:56

We're going to enforce this.

2:02:59

And people have been going on about how

2:03:03

crap Discord has been getting.

2:03:05

They moved all their apps to web apps,

2:03:08

even on mobile.

2:03:09

And people were really unhappy with the

2:03:11

performance.

2:03:12

People were saying there's loads of bugs.

2:03:15

And I think it's...

2:03:18

Discord doesn't really care about their

2:03:23

users because,

2:03:25

let's all admit,

2:03:27

it's much easier to develop a web app

2:03:29

than doing native apps.

2:03:30

So I think they care about the money.

2:03:33

They care about,

2:03:34

like Nate said at the start,

2:03:35

this is about them getting their public

2:03:39

IPO completed for the most amount of

2:03:43

money.

2:03:43

Yeah.

2:03:44

Yeah.

2:03:47

Yeah, and real quick, just to...

2:03:49

You keep mentioning small groups and

2:03:51

chats.

2:03:52

I agree a hundred percent.

2:03:53

If the only thing you really use Discord

2:03:55

for is to keep up with a handful

2:03:57

of friends, or all the servers you're in

2:03:58

are small servers with you and ten of

2:04:01

your friends,

2:04:03

Signal will work perfectly for that.

2:04:05

Even Matrix will work perfectly for that.

2:04:07

The challenges really start to come in

2:04:09

when you have these big,

2:04:10

large public communities if you're a

2:04:12

content creator or something.

2:04:14

I think Matrix will be a bigger ask

2:04:16

and even Signal, I think,

2:04:17

would be a really tall order there.

2:04:19

But yeah,

2:04:19

if it's just like you and a few

2:04:20

of your friends in some group chats or

2:04:21

some one-to-one chats, like, yeah.

2:04:24

Signal works great.

2:04:25

Matrix works great.

2:04:27

SimpleX works great.

2:04:28

Like,

2:04:29

a lot of the stuff we talk about and

2:04:30

promote on the website will work just

2:04:32

fine.

2:04:34

One more quick thing, not to, like,

2:04:36

give everybody all the advice,

2:04:37

but one thing that occurred to me while

2:04:39

you were talking is also, like,

2:04:40

you could set up on your own as

2:04:42

a fan, you could set up, like,

2:04:43

a fan community.

2:04:43

Like,

2:04:44

I've seen or I've known of Discords that,

2:04:47

like,

2:04:48

were set up as a fan-run community.

2:04:50

And then once that content creator got to

2:04:52

a certain size,

2:04:52

they kind of discovered it and they were

2:04:53

like, oh,

2:04:54

I already have a community on Discord.

2:04:56

I didn't even know that.

2:04:57

And of course,

2:04:57

because all the moderators were fans,

2:04:59

they were like, yeah, come on in.

2:04:59

We'll make you an admin.

2:05:00

We'll like treat you like royalty.

2:05:02

And so it's, I mean, it's a stretch.

2:05:04

I will admit that.

2:05:05

But if you go start your own parallel

2:05:07

fan group on Matrix and it gets big

2:05:09

enough,

2:05:09

maybe whoever you're starting the fan

2:05:11

group about, one day, will just be like,

2:05:14

I mean,

2:05:14

I have like five hundred people over

2:05:15

there.

2:05:15

Maybe I should go ahead and just make

2:05:16

an account and check in every once in

2:05:18

a while.

2:05:18

So, yeah, I don't know.

2:05:20

It's a thought.

2:05:22

Yeah, um, but yeah, we are sort of

2:05:24

getting to the end of the

2:05:26

live stream here. I just want to cover

2:05:27

one quick question here, um, because we

2:05:30

just passed two hours for the

2:05:32

live stream, um, and we try to keep

2:05:33

it within two hours. So one last question

2:05:36

here: the good thing about it is

2:05:38

that it will eventually happen to Discord

2:05:40

and other privacy-invasive platforms,

2:05:42

resulting in people leaving it someday.

2:05:44

Sadly, that day may be very far in

2:05:47

the future.

2:05:48

Um, this is a comment by anonymous.

2:05:51

I think uh,

2:05:52

I'm not sure if I agree because, um,

2:05:55

for instance,

2:05:56

I can think of so many platforms that

2:05:58

are just absolutely terrible. Like, there

2:06:00

are so many people who just keep using

2:06:02

Facebook. I don't know what the

2:06:05

issue is,

2:06:06

but basically any event here in Australia,

2:06:08

it's always on

2:06:10

Facebook, Instagram,

2:06:13

Facebook Messenger.

2:06:15

It's a terrible platform.

2:06:17

If you talk to anyone that uses Facebook

2:06:19

or Facebook Messenger,

2:06:21

they'll tell you that it is absolutely

2:06:23

buggy.

2:06:24

It's a terrible platform.

2:06:25

It's not fun to use.

2:06:27

It's constantly breaking.

2:06:30

But it's still the platform that everyone

2:06:32

is on because of this network effect that

2:06:35

we're talking about.

2:06:36

So I don't know if people are actually

2:06:39

going to leave.

2:06:40

Um, people always say, oh,

2:06:42

I'm going to leave, but it's like,

2:06:44

will you though?

2:06:45

Um, I guess we'll see how it goes,

2:06:47

but I definitely

2:06:50

wouldn't be surprised if things go back to

2:06:52

normal in a month from now.

2:06:55

I sadly agree with you.

2:06:57

I have not heard a single good thing

2:07:00

about Facebook in probably close to a year

2:07:02

now, maybe even more.

2:07:04

I literally, not one person has been like,

2:07:07

you know,

2:07:08

people hate on Facebook too much.

2:07:10

I kind of like it.

2:07:11

It's like,

2:07:11

I've heard people defend certain features,

2:07:13

but even then they're like, yeah,

2:07:14

it's crap except for these two things.

2:07:15

Like,

2:07:16

and yet people are still using Facebook

2:07:19

like crazy.

2:07:20

They haven't had a dip in revenue yet.

2:07:22

I think they did one time,

2:07:23

but not since then.

2:07:23

So-

2:07:24

Unfortunately,

2:07:25

I share your nihilism on that one.

2:07:27

Yes.

2:07:28

And with that being said,

2:07:29

let's move into the outro here.

2:07:31

All the updates from This Week in Privacy

2:07:33

will be shared on the blog,

2:07:34

which has already gone up.

2:07:35

Thank you, Nate.

2:07:36

And so sign up for the newsletter and

2:07:38

subscribe with your favorite RSS reader if

2:07:40

you want to stay tuned.

2:07:42

For people who prefer audio,

2:07:43

we also offer a podcast,

2:07:45

which is available on all podcast

2:07:47

platforms and also RSS.

2:07:50

And this video will also be synced to

2:07:52

PeerTube as well.

2:07:54

Privacy Guides is an impartial nonprofit

2:07:56

organization that is focused on building a

2:07:58

strong privacy advocacy community and

2:08:01

delivering the best digital privacy and

2:08:04

consumer technology rights advice

2:08:06

on the internet. If you want to support

2:08:09

our mission, then you can make a donation

2:08:11

on our website at privacyguides.org. To

2:08:14

make a donation, you can click on the

2:08:16

red heart icon located in the top

2:08:18

right-hand corner of the page, and you can

2:08:21

contribute using standard fiat currency, or

2:08:25

you can use

2:08:26

cryptocurrency to donate anonymously using

2:08:29

Monero or your favorite cryptocurrency.

2:08:32

And becoming a paid member unlocks

2:08:34

exclusive perks like early access to video

2:08:37

content and priority during the This Week

2:08:40

in Privacy livestream Q&A.

2:08:42

And you'll also get a cool badge on

2:08:44

your profile on the Privacy Guides forum

2:08:46

and also the warm,

2:08:48

fuzzy feeling of supporting independent

2:08:51

media.

2:08:52

Thanks for watching and we'll see you next

2:08:55

week.

2:08:56

Bye-bye.