Bad Internet Bills w/ Taylor Lorenz: KOSA, SCREEN Act, & Repealing Section 230

Episode description

Privacy Guides sits down with technology journalist Taylor Lorenz to decipher a slate of bills being fast-tracked through Congress which threaten free speech, privacy, and your right to freely access information on the internet.

0:00

Every top expert has said this is a complete moral panic.

0:02

This is nonsense.

0:03

Sites like Reddit couldn't exist.

0:05

Platforms like Twitter couldn't exist.

0:07

A major Biden, you know, White House advisor got up on stage and said,

0:11

"We are going to remove anonymity from the internet."

0:13

This is a bipartisan effort.

0:15

So there's not just one group that wants to do this.

0:17

RIP Privacy Guides.

0:20

We need to talk about what happened in Washington last week,

0:23

because while a lot of people are distracted by the holidays,

0:26

a House subcommittee is trying to take a sledgehammer to the open internet.

0:30

On Thursday, the Energy and Commerce subcommittee on Commerce, Manufacturing and Trade officially

0:36

advanced a bundle of nearly 20 bills, many of which have names which sound harmless or

0:41

even helpful, like the Kids Online Safety Act, or KOSA, and the SCREEN Act.

0:46

But when you actually take a look at these bills, the reality is much darker.

0:51

These bills are not about safety at all.

0:53

They are about implementing mass surveillance on an unprecedented scale, and dismantling

0:58

privacy and free speech on the internet as we know it.

1:01

Among these bills being advanced is the SCREEN Act, which will effectively mandate identity

1:06

verification via your government ID to access completely legal content across most of the

1:12

internet.

1:12

Then there's KOSA, a bill which will give state attorneys general and the government the power

1:17

to decide what content will require identity verification just to access, which will be

1:23

used against educational resources and life-saving healthcare information.

1:27

Looming over all of this is an ongoing push to repeal or gut Section 230, a law

1:33

which was specifically created to prevent the courts from misinterpreting the

1:37

First Amendment and it protects technology platforms from being sued

1:41

into oblivion just because they provide the tools which enable user-generated

1:45

content on the internet. These bills are moving fast and they're being fast-tracked

1:50

by lawmakers who are counting on the idea that people won't notice until it's too late.

1:55

If these pass, the internet will become a place where privacy and anonymity are illegal,

2:00

where encryption makes you a suspect, and where Americans' ability to speak freely is determined

2:05

by the most restrictive state government in the country.

2:09

To break down these bills, and more importantly, what we need to know to stop them, my colleague

2:15

Nate sat down with technology journalist Taylor Lorenz.

2:18

Taylor has been covering the bad internet bills beat very extensively recently, and

2:24

I'm also going to link to badinternetbills.com in the description, and I highly recommend

2:29

you open that website up while you listen to this interview.

2:32

It lists the bills, it explains the threats, and it gives you the tools you need to contact

2:36

your representatives, which is super important.

2:39

This is one of the most critical fights for digital rights that we've seen in a decade.

2:44

Here's Nate's conversation with Taylor.

2:46

Hello, everyone. I am Nate from Privacy Guides, and today we are talking to Taylor Lorenz.

2:52

Hi, Taylor.

2:53

Hi. Thanks for having me.

2:54

Thank you so much for being here today. Taylor's going to talk to us a little bit about some

2:59

upcoming laws in the US that are impacting the privacy and security of everyone. But

3:05

first things first, for our listeners who may not be familiar with your work, could

3:08

you tell us a little bit about yourself? Who are you and what's your background on these

3:12

kinds of issues?

3:13

I'm Taylor Lorenz. I've been a technology journalist for over 15 years now, crazy.

3:19

I cover mostly tech from the user side, so how people use technology, how people are

3:24

affected by technology, and I cover a lot related to tech policy and privacy issues.

3:30

So a lot about sort of free speech issues on the internet and also just surveillance,

3:33

government surveillance, corporate surveillance, the data broker industry, things like that.

3:38

And I have a podcast and a YouTube channel and a newsletter and yeah, I write about all this stuff.

3:43

Also, I worked in the mainstream media, so I covered a lot of this stuff for the Washington

3:47

Post, New York Times, The Atlantic, The Guardian, and a bunch of other mainstream

3:51

outlets.

3:52

So, like I said, we're here today to discuss a few laws that are currently in front of

3:56

Congress and they are very worrying laws.

3:59

There's actually a bunch of laws right now.

4:01

I think there's like 18 last time I checked.

4:03

But specifically, we're going to highlight three that we at Privacy Guides feel are particularly

4:09

concerning.

4:10

And those are KOSA, the Kids Online Safety Act, the SCREEN Act, and we'll

4:15

say some proposed changes to Section 230.

4:18

So I'm going to let you decide how we should kick this off.

4:21

Which one of those would you say is either most worrying or at very least you think we

4:26

should start talking about first?

4:27

Repealing Section 230 is insane.

4:30

And just to be clear, that's what the Democrats tweeted yesterday.

4:34

It wasn't just like we want to reform it, it's we're working on repealing it, which

4:37

is ultimately what these reform conversations are about.

4:41

They're about decimating it.

4:43

That's to me so terrifying because if we repeal Section 230, like all these other conversations

4:49

will be irrelevant because we will no longer have free expression or user-generated content

4:53

online.

4:54

I agree.

4:55

That's a good one to start with.

4:56

So for listeners, in a nutshell, and please correct me if I get this wrong, Section 230

5:02

basically says that platforms are not legally liable for the content of their users as long

5:06

as they make a good faith effort to moderate the content.

5:09

For example, at Privacy Guides we have our own forum.

5:12

And as long as we make a good faith effort

5:15

to remove any sort of illegal content,

5:17

that means that we're not responsible for anything

5:19

that slips through the cracks.

5:21

So from our perspective, this seems like a good move.

5:23

We don't wanna be hosting harmful content.

5:26

So what is the argument to get rid of 230?

5:28

I'm trying to wrap my head around that one.

5:30

- There is no argument other than mass censorship.

5:32

I mean, so let's be clear sort of how this all began.

5:34

Section 230 is a law that's part of the Communications Decency Act,

5:37

which passed back in the '90s when the internet was just starting out.

5:40

And when the internet was starting out, there was this big debate over what the internet should be.

5:45

You know, was it going to be like movies?

5:47

Like, was it going to be like, you know, where there's a rating system on every single piece of content

5:52

and content has to go through review before it's posted on the internet?

5:56

You know, should there be some sort of body that regulates what sort of information you can upload and access?

6:02

And back then, rightfully so, people who were proponents

6:06

of the internet were like, no, that's not

6:07

what the internet's about.

6:08

The internet is this free place.

6:09

We want it to be this global--

6:11

I hate the phrase town square, but we

6:13

want it to be this global information resource, right?

6:16

Where anybody anywhere around the world

6:17

can access information, connect with each other.

6:20

This is when email was just getting started.

6:23

Social media was barely kicking off back then.

6:26

So this is years before even Facebook launched.

6:30

When social media started to get traction, obviously they rely so heavily on Section

6:34

230. As you mentioned, sites like Reddit couldn't exist. Platforms like Twitter couldn't

6:39

exist because they rely on hosting user-generated content, which means that they, the platform,

6:44

they don't have to review every single piece of content before it gets published.

6:48

This is also how every single forum exists online, and how even, sort of, some email and

6:53

messaging services function. You know, they rely on this landmark internet law that truly protects

6:58

the free internet. Now there's been all this nonsense about how the internet

7:03

is harmful for children. Now let me just be clear. Actually, every single expert that studies

7:07

this topic is pretty consistent and has come out. Every top expert has said, this is a complete

7:12

moral panic. This is nonsense. Like actually there's no evidence at all that social media

7:16

is harming children. In fact, it helps a lot of their mental health. Like this is all just

7:20

sort of media-driven moral panic. It's very similar to what we saw with video games,

7:24

television, comic books, we see these exact same arguments

7:27

come up over and over and over again.

7:29

That has given a lot of people that have wanted

7:31

to censor the internet, people in the government

7:33

that don't like what the public is saying,

7:35

an opportunity to say, ooh, we can sort of use this thing

7:39

of like protecting children to censor the internet,

7:40

'cause what the government ultimately wants

7:42

is to control speech online.

7:43

They don't want you criticizing the government.

7:45

They don't want journalists out there challenging power,

7:47

et cetera, especially not the government these days,

7:50

which is, I would argue, more authoritarian

7:51

than even a decade ago.

7:53

So they've decided that Section 230 has become this boogeyman,

7:57

where basically they're like, let's destroy Section 230.

8:00

That will destroy user-generated content.

8:01

We'll eliminate free speech.

8:03

And they claim, oh, we just want to reform it.

8:04

We want to make-- we want to hold big tech accountable.

8:07

You know, why isn't Facebook more accountable for the things

8:10

that are being said on the platform?

8:11

First of all, we have accountability.

8:13

Like, you can sue these platforms

8:15

if they're doing illegal things.

8:16

You can prosecute these platforms for certain types of harm.

8:19

Like, it's not like it doesn't exist.

8:21

If Facebook played an active role in doing something harmful,

8:25

like Facebook has been subject to countless lawsuits.

8:29

It's not like you cannot sue Facebook.

8:31

But as you mentioned, we want to hold the people

8:33

responsible for the speech.

8:34

So if somebody is using a messaging app, for instance,

8:39

to mass message people information

8:41

that is considered harmful or is defamatory or something,

8:44

that person who says the speech should

8:46

be liable for the speech.

8:49

For instance, we don't go around, you know, making it possible to sue AT&T because you

8:54

said something over a phone call that made somebody else mad, which by the way, they

8:57

wanted to do that.

8:58

Let's be clear.

8:58

That's what the media was arguing for.

9:00

Back with landline telephones.

9:02

They were arguing that, you know, landline telephones, people were getting so heated on

9:05

conversations on telephones.

9:06

It was making people angry and driving violence in society and harming people's mental health

9:11

because they'd get really upset on the phone.

9:13

One guy actually tried to use it as a defense in his murder case, that he was so angered

9:17

by this phone call. And the media called on telephone companies to be regulated.

9:22

It's so crazy because the exact same arguments have been used against technology every single

9:27

time.

9:27

They argued this about newspapers, they argued this about novels.

9:30

All of that's to say is none of this is about protecting kids.

9:34

It's about the government seizing control over online speech.

9:37

Section 230, like you said, it relates to a wide range of political issues.

9:41

And I think some listeners can probably already hear, you know, you mentioned like, you can

9:45

sue Facebook and Facebook is still around, but if you sue some of these smaller

9:48

platforms like Signal or Mastodon, they don't have the same kind of money to withstand these

9:53

kinds of lawsuits.

9:54

No, no, no, let me be clear.

9:56

Currently they don't have to have that money because they are not liable.

10:00

The people that post on them are liable for their speech.

10:03

You could sue those companies now if they were complicit.

10:06

If they break the law, if Mastodon breaks the law or something, right, or does something,

10:11

you could sue them for doing something illegal.

10:15

Or you could sue-- again, there are these lawsuits that come up.

10:18

There are more protections for any platform

10:21

that hosts user-generated content.

10:23

You mentioned, if you revoke Section 230, first of all,

10:26

yes, it would consolidate the power of Big Tech,

10:28

but Big Tech would have a lot more moderation capabilities,

10:32

so they would have the ability to host more content.

10:34

But pretty much every user-generated--

10:36

every small site that relies on user-generated content,

10:39

it's not that they just don't have money

10:40

to withstand the lawsuits, it's that they don't have the money to moderate every single piece of content.

10:44

They don't have the money. A small forum doesn't have the resources to read every single comment.

10:49

And then of course, they also want these platforms to verify the identity, the offline legal identity

10:53

of every person as well, which is a whole other issue. That's the problem with the other two

10:58

laws that you mentioned. Yeah, thank you for clarifying that. That's probably a better way

11:03

to get at that. How does Section 230 relate to privacy specifically?

11:10

Our forum isn't even that big, relatively speaking.

11:13

And I couldn't imagine having our volunteer moderators

11:15

or even our staff, like that would be multiple full-time jobs

11:19

just for our forum to sift through that.

11:21

What are some other ways this might impact

11:23

privacy tools and the privacy community?

11:25

- Yeah, privacy specifically.

11:26

I mean, you're just not gonna have a lot of privacy.

11:28

Like for instance, a lot of people choose not to use

11:31

the major social networks because of privacy reasons,

11:33

because these are massive data harvesting operations

11:36

and they don't wanna give their data to Meta

11:38

or Google or whatever.

11:39

They want to use some of these smaller sites,

11:41

or they want to engage in discussions around things

11:44

like abortion or their gender identity,

11:47

or things that they want to keep private.

11:48

And they don't necessarily want to log into a Facebook

11:51

account to have those conversations,

11:52

or log into Instagram and have to verify their information

11:56

to do that.

11:57

So they want to use some of these smaller forums.

11:59

Those forums will all be wiped out.

12:00

Disabled people rely on this as well.

12:01

They want to discuss medical conditions

12:03

without their insurance premiums going up,

12:04

because a data broker bought their info

12:08

from a social platform.

12:09

So even just the ability to participate in some of these smaller, more niche forums that

12:13

have really good privacy protections will go away.

12:17

You will have to use these major social platforms, really just Google and Meta, which

12:23

have a horrible privacy record, of course, as we know.

12:26

Yeah, especially Meta, really bad.

12:28

Oh yeah.

12:29

I think it's really important for people to note that this is a bipartisan effort.

12:33

So there's not just one group that wants to do this, but the Democrats are really leading

12:37

the charge. And Democrats like have been so aggressive in pushing these surveillance laws

12:44

and censorship laws. It's scary. I think a lot of people associate Trump with these things.

12:49

And Trump has done a lot to attack the traditional media, but the Democrats are enacting policies

12:55

that are further to the right of Trump on the internet.

12:58

So we've been mentioning age verification and identity verification. And based on my

13:03

research, that one is largely, it is both KOSA and the SCREEN Act,

13:07

but I'm mostly seeing it with the SCREEN Act,

13:10

which I have not been hearing about very much.

13:12

I had to really dig to find information about this one.

13:15

So again, correct me if I'm wrong,

13:17

but if I understand it correctly,

13:18

the SCREEN Act is an age verification bill.

13:21

Like there's kind of not much else to it.

13:22

- It's an identity verification bill.

13:24

- Identity, yes, thank you.

13:26

I should really start saying that instead of age verification.

13:29

So it would require platforms to verify users.

13:32

Is this again, because it's kind of hard to find information about this one.

13:35

Is this like just across the board, all platforms, or are they trying to sell

13:39

this as one of those, oh, only larger platforms things? Or how is that one supposed to work?

13:44

Yeah.

13:44

So, um, this came out of anti-porn groups and these extreme far-right, sort of

13:49

Project 2025, you know, Heritage Foundation-affiliated groups, and so did

13:53

KOSA, to be clear.

13:54

These both originated in these far-right, anti-LGBTQ circles where they

14:00

thought, okay, how can we get LGBTQ content

14:02

off the internet, designated as profane (they also

14:05

designate any sort of sex ed as profane),

14:08

and then we'll force it through Congress

14:09

by saying, oh, we'll protect the kids from porn.

14:11

What they mean by porn is sex ed, abortion information,

14:15

information about feminism, like LGBTQ content.

14:18

And by the way, I'm not making this up--

14:19

Mike Masnick at Techdirt has done a great job

14:21

of documenting the Heritage Foundation coming out and saying this.

14:23

Like, they're saying it publicly.

14:25

It's not even a secret.

14:26

They're like, yes, we can't wait to get this content

14:28

off the internet.

14:29

And so they want to do that through identity verification.

14:32

As you mentioned, the SCREEN Act is more overt than KOSA,

14:35

but both of them have the exact same effect.

14:37

The SCREEN Act is just like anything

14:38

that has adult content, which is literally anything.

14:43

Because again, adult content is completely subjective.

14:46

And if you look at, for instance,

14:49

there was some sort of Trump terror memo called NSPM 7,

14:52

where they designate who's hostile to America,

14:55

who's considered a terrorist.

14:56

And it's just like anybody that is

14:57

against quote unquote American values,

14:59

anybody that's against quote unquote Christian values.

15:01

These are all highly subjective things.

15:03

So it's similar with the SCREEN Act,

15:05

where adult content is just any content

15:08

that the government doesn't like.

15:09

So we saw this happen with the Online Safety Act.

15:12

We have here the Kids Online Safety Act.

15:13

In the UK, back in 2023, they passed the Online Safety Act,

15:16

which just went into effect a couple months ago.

15:18

And that's why you saw, for instance,

15:21

the subreddit for war crimes was removed

15:24

under these anti-porn laws or whatever,

15:26

because any sort of police violence video,

15:29

people speaking out against sexual assault, rape victims,

15:33

Alcoholics Anonymous forums have been shut down.

15:36

Actually, all these places for kids to report

15:39

incidents of molestation and things like that

15:41

have been shut down.

15:42

So that is the type of content.

15:44

When you see adult content, content that's

15:46

quote unquote not safe for kids,

15:47

that's the type of content that they're targeting directly.

15:49

And yes, it would remove anonymity

15:51

from the internet basically completely.

15:54

- So this would require again, like all platforms

15:57

because you know, the common argument is like,

15:59

When I go buy alcohol, I have to show ID, right?

16:02

- No, you don't.

16:02

Identity verification is nothing like showing your ID

16:05

in the real world, the IRL world.

16:07

First of all, you don't show your identification

16:11

to every single person that you interact with,

16:14

to every single store that you walk into, where,

16:17

before you can even walk in, you have to show ID.

16:20

Maybe at a bar, but certainly not to buy things

16:23

or to engage, to access information.

16:26

And then your ID, your government ID,

16:29

is not tied to every single piece of content that you read.

16:33

It's not stored forever when you go to the library

16:36

or when you go to a bookstore to buy a book.

16:37

It's not like your government ID is stored

16:39

and they can monitor every single word that you've read.

16:41

And then that can be used in a court case against you.

16:43

There's actually nothing like that.

16:45

Not to mention, it's a quick ID check.

16:47

A human being looks at it quickly,

16:49

doesn't memorize the info, checks it, and then moves along.

16:53

Here, it's stored in a database forever

16:55

that we know has absolutely no privacy.

16:57

These databases leak constantly.

16:59

This is a massive data privacy concern.

17:01

And that data is then used forever,

17:03

can be used to target you, exploit you.

17:04

And often they're harvesting not just your offline physical address,

17:09

they are harvesting biometric data

17:11

that is tied to it as well.

17:13

- One thing I noticed with a lot of these laws

17:14

is that they're vague specifically

17:16

when they talk about identity verification.

17:19

And they leave it up to the platforms

17:21

to decide how to implement this.

17:23

Do you think that's a better way?

17:25

Actually, maybe this isn't even a good question

17:26

in light of this conversation.

17:28

I think I know what you're getting at.

17:29

A lot of times lawmakers will leave

17:32

sort of specifics around identity verification in vague terms

17:36

to argue that they're not technically

17:40

mandating identity verification.

17:42

That's such a lie.

17:44

They know exactly what they're doing,

17:45

and they know what they're doing because the identity

17:46

verification lobby has basically written

17:49

half of these bills.

17:50

This massive-- they're about to get billions of dollars

17:53

if these laws pass.

17:54

But it's also just like a farce.

17:55

There is no way to verify anybody's offline identity

18:00

in any sort of platform, whether you're

18:01

a third party platform or a major platform,

18:05

without violating their privacy.

18:06

It just doesn't exist.

18:07

You have to either harvest their biometric data,

18:12

but even if you harvest the biometric data,

18:14

that biometric data and then your behavior patterns

18:15

can easily be tied to your offline identity,

18:17

or you have to verify things with your offline identity

18:20

and often confirm things in other ways.

18:22

There's no privacy-forward way to verify your identity

18:25

because they have to harvest a lot of information

18:27

to verify your identity.

18:28

And any time you're trying to cordon off parts of the internet

18:31

for children or make things safe for children

18:33

and try to age-gate any part of the internet,

18:36

a lot of people hear, oh, age verification.

18:38

OK, so kids will have to verify their ages.

18:41

No, in order to know who's a child,

18:43

everybody has to give up their information.

18:46

And in case you weren't thinking, oh, maybe you're still

18:49

giving them the benefit of the doubt, let me tell you.

18:51

I was at the Biden White House last August at this big day

18:55

that they had for content creators

18:56

to push their agenda, whatever.

18:58

Neera Tanden, a major Biden White House advisor,

19:02

got up on stage and said, we are going

19:04

to remove anonymity from the internet.

19:05

Don't you wish you could unmask every troll

19:07

is how they were selling it.

19:08

Because they also try to sell these things

19:09

as some sort of answer to online bullying,

19:11

even though we know actually removing anonymity

19:13

doesn't help bullying at all.

19:14

Like literally go to the Facebook comment section

19:17

of any post.

19:17

You can see people are happy to bully under their government

19:20

names.

19:20

But they're explicitly saying it.

19:22

So this has been a goal of the Democrats for a while,

19:25

and the Republicans as well, is to completely remove

19:28

anonymity from the internet, in part to prosecute speech.

19:30

Like you're saying already this happened

19:32

with people getting fired or in legal trouble

19:35

for comments about Charlie Kirk or comments

19:37

that the government doesn't like about Israel

19:39

or foreign policy, things like that.

19:41

I think this SCREEN Act is so insidious.

19:43

It's just as bad, if not worse, than KOSA.

19:45

But KOSA, because KOSA has been this thing for so many years,

19:48

and there's been a little bit more activism around it,

19:51

people are just more aware of it, and they're not as aware of the SCREEN Act.

19:54

The SCREEN Act is just as bad, to be clear.

19:56

All 18 of these laws in this child online safety package are very bad,

20:01

very bad. There's the App Store Accountability Act, which is also really bad.

20:05

That puts age verification at the app store level,

20:07

which actually is worse in a lot of ways. Like, trust me, there is not a single good one;

20:11

all of these laws are very bad, but they have so many different names.

20:14

It's hard to keep up.

20:15

I pointed out recently actually on the topic of the app store one,

20:18

I said that I think it's really telling that even Apple and Google, or specifically a lot

20:23

of these companies are like, yeah, we're all in favor of this age stuff, but we don't want

20:26

it.

20:27

We don't want the IDs.

20:27

We don't want the data.

20:29

Make Apple and Google do it.

20:30

And they're like, no, we don't want it either.

20:31

And I think that's really telling that all these companies are like, yeah, it's a great

20:34

idea, but not in my backyard.

20:36

Well, what's really scary is, they say that, but then they pre-comply.

20:40

Meta and YouTube are already harvesting a huge amount of data.

20:44

They announced this publicly over the summer, where they said, we're going to start age-gating

20:48

things, where we're going to start harvesting biometric data, we're going to start monitoring

20:52

more about how people use Meta and YouTube products.

20:55

So if you watch Skibidi Toilet videos, uh-oh, now you're classified as a teen and

21:01

you have to verify your identity.

21:02

I spoke to an undocumented woman who this happened to, where she was using her main computer,

21:08

her child was watching YouTube.

21:09

It flagged her as a child.

21:11

Obviously, you can understand why someone who's undocumented is extremely concerned

21:14

about that.

21:15

That's already happening now.

21:18

And I think it's really scary because this

21:20

is what we hauled Mark Zuckerberg in front of Congress

21:22

for in 2017 and 2018.

21:24

Remember, he was like, sir, I sell ads.

21:25

They're like, Cambridge Analytica,

21:27

you're collecting all this data.

21:28

Now, they're mandating that they collect even more data,

21:32

and they're giving them complete cover

21:34

to start collecting huge amounts more data.

21:37

This is like six years or eight years later.

21:39

I'm like, what year is it even?

21:41

And they went from, hey, you're not

21:43

doing enough to protect users' data to, hey, yeah,

21:45

go ahead and harvest some, like monitor everything,

21:48

'cause we're gonna pass these laws anyway,

21:49

like as long as you give that info to the government

21:52

when we ask, collect all you want.

21:54

And the TikTok ban is part of this as well, of course,

21:56

because of course our data is actually less safe now

21:59

under this new ownership structure

22:01

than it was previously.

22:02

- That will take us to the Kids Online Safety Act,

22:05

which you just mentioned a minute ago.

22:06

Like you said, our listeners are probably

22:07

a little bit more familiar with,

22:08

'cause this one has been in the public eye a little bit.

22:10

It was originally introduced in 2022.

22:13

And the original idea, again, on paper is that platforms like Facebook and YouTube should be

22:19

responsible for mitigating "potential harms," you mentioned that a minute ago too, to children

22:23

who are using the service. I don't even want to say that language because that's not what it does.

22:27

That's not what it says and that's not what it does. What it says is that platforms need to

22:33

censor more content and they need to censor more content in line with what the government perceives

22:38

as harmful. I just want people to understand that because I think some people read it and they're

22:43

like, well, the goal is, or you see these headlines, right?

22:45

Congress passes law to protect children.

22:47

That's not what these laws do.

22:48

These laws harm children.

22:49

And we have research, actual scientific-based evidence

22:53

of researchers that have studied these things,

22:55

and we know that they actually harm children,

22:58

especially LGBTQ and marginalized youth.

23:01

But that's the guise under which it was passed.

23:02

- That is one of the questions I had written down.

23:04

You mentioned how these acts will do a lot of harm

23:07

to minorities, LGBTQ folks, and people like that.

23:10

And so I think it's really easy for conservative folks to kind of

23:14

say like, oh, that's fearmongering, or they may even agree with it

23:16

because they don't agree with those viewpoints.

23:19

But you've, you've pointed out, like this is a bipartisan thing.

23:22

What would you say are some of the risks that would get people on the more

23:25

conservative side of the aisle to realize like, no, this is bad for you too.

23:28

This is bad for everyone.

23:30

Part of Project 2025 was about, like, censoring trans people off the internet.

23:33

Again, the Heritage Foundation has come out and said, here is how we plan to use

23:37

KOSA to remove abortion content online.

23:39

Here's how we plan to use KOSA to censor, you know, LGBTQ trans content off the internet,

23:45

all, all LGBTQ and women's rights content.

23:47

They don't, they want to block that.

23:49

So they're open about it.

23:50

They want that.

23:51

These Republicans, I've tried to explain to them like, well, what if the Democrats were

23:55

in power? You guys had all this drama with Joe Biden, saying that he was jawboning over

23:59

COVID stuff and vaccines.

24:01

Like, don't you feel, like, I personally believe in vaccines, but I try to make this case to

24:06

them of like, well, would you want the government censoring that?

24:08

I would argue that we don't want the government determining speech.

24:11

We want to hear from actual experts online.

24:13

We don't want government propaganda.

24:16

This is, again, what we would always accuse China and Russia

24:18

and authoritarian states of doing.

24:20

So now we're trying to replicate that exact system.

24:24

That's what you guys were supposedly against.

24:26

They're not actually against it.

24:28

Once they seize power, their whole thing is like, oh, well,

24:31

OK, we're going to pass it under us.

24:32

So they're going to bake it into law, where it's written in a way

24:37

that will be used to censor all the content

24:39

that they don't like, and then the left will never

24:40

get power again, which they're probably

24:42

right actually about that.

24:44

If they can effectively control the internet

24:47

and skew it so effectively to the right wing, which they have

24:49

already done in some ways, the left won't ever get power again.

24:53

So, when you talk to these staffers,

24:55

they're like, well, there's not going

24:56

to be another Democratic presidency.

24:58

We won't have to worry about it, which is bleak.

25:01

The Democrats are just happily going along with it

25:03

because they also want to censor people.

25:06

And they're like, yeah, we'll align with the Heritage

25:08

Foundation, because we hate when people criticize us online.

25:11

We hate when people say things that are anti-Israel.

25:14

I'm Chuck Schumer, and I hate that somebody says

25:16

I shouldn't give $500 billion to whatever Israeli defense

25:21

fund thing or whatever.

25:23

It's a lot of foreign policy criticism,

25:24

also a lot of criticism that the Democrats are not

25:27

fighting Trump hard enough.

25:28

They just want to shut-- they just also

25:30

want control over speech.

25:31

So actually, KOSA has been very much led by the Democrats.

25:34

And the Democrats, one other thing I'll say,

25:37

that's a little history for people to understand.

25:40

Obama was very pro-tech.

25:42

So Obama was fully in bed with the tech industry.

25:45

One of the last events that he had before leaving office

25:47

was called South by South Lawn, where

25:49

he brought Uber, Facebook, all just the worst,

25:53

like, biggest tech companies ever to come out

25:55

and have this celebration of what they've done

25:58

and essentially get everyone in the White House jobs

26:01

at these major companies.

26:02

Like, Amazon, Microsoft, all these big companies were there.

26:05

I reported on it.

26:06

When Trump won, the tech lash started.

26:09

And that's when liberals started to realize, oh, wait,

26:12

maybe these platforms are not just all rainbows and sunshine.

26:16

They're actually being used for fascism and bad things.

26:20

So that's when, again, we're going

26:21

to haul Mark Zuckerberg in front of Congress

26:23

and really crack down on him, whatever.

26:25

It's also when you saw the rise of the Black Lives Matter

26:27

movement, Me Too movement, more and more social justice

26:30

movements that were challenging Democratic politicians

26:32

for not being progressive enough.

26:34

And they hate that.

26:35

They don't want that.

26:36

They don't want any, they don't want anyone speaking out.

26:38

They want to do their corporate bullshit, whatever in peace.

26:41

And they don't want any backlash.

26:42

And they want to seem like they're tough on big tech because they feel like

26:46

they weren't tough enough on big tech originally.

26:48

And these laws are actually a massive reward to big tech.

26:51

And if you look at who backs this stuff, like, big tech funds it. I

26:55

mean, Meta was one of the biggest lobbyists in DC last year and the year before.

26:59

They want to have this air of cracking down on big tech.

27:01

So that's why a lot of them sign onto it.

27:03

You kind of covered a lot of this, so my question here was, I saw your recent

27:07

interview with Ari Cohen, which was great, by the way, he mentioned that the

27:10

latest incarnation has no carve-outs for smaller platforms.

27:12

So again, my goal is, I'm kind of trying to bring this home to viewers.

27:15

You know, again, Privacy Guides, we have a forum, a Mastodon instance, a

27:18

PeerTube instance, as a US organization.

27:21

Exactly.

27:21

We, we would be subject to KOSA compliance.

27:24

So I'll just say something too.

27:25

Like I think people, again, because there's this framing of cracking down on

27:30

big tech, people think like, oh, we're going to get them.

27:34

We're going to really stick it to Meta.

27:35

We're going to whatever.

27:36

And they don't actually realize how many smaller internet

27:38

services they do rely on.

27:40

Because when you think of social media,

27:41

you think of Google and YouTube and stuff.

27:44

And I understand not everyone's on Mastodon and PeerTube

27:47

and things like that.

27:48

Those would go away under these new laws.

27:50

But they might turn to a subreddit for information

27:55

when they're looking into something.

27:56

They might end up on a forum.

27:58

They might just be on a website that has a contributor model that doesn't necessarily

28:04

moderate so heavily, or they might want to participate in a campaign.

28:08

You know, hey, let's say they're going to, I don't know, build a giant cell phone

28:13

tower in my backyard.

28:14

Let's all come together and do this social media campaign to prevent that.

28:18

You won't have that ability anymore, because all of your stuff will have to be approved

28:23

by some intermediary that is willing to take on the liability of your activism and your speech.

28:30

So there won't be any internet activism online. There won't be a way to engage in that because

28:37

nobody's going to like, I mean, any organization that would take on any sort of mass liability for

28:41

that stuff would just be sued out of existence. So it's just, it really, there's such a mass

28:46

chilling effect. And I think a lot of people don't realize actually how much, I mean, even platforms

28:49

like Nextdoor, right?

28:51

Like you might not think of these when you're thinking of main social media, but

28:55

like there's value that people get out of even, like, these marketplaces.

28:59

Like there's just a lot of user-generated content online that people don't

29:03

think of as user-generated content and that they engage in.

29:07

And messaging apps might be subject to this as well.

29:10

So it's just like, what about your WhatsApp group?

29:13

You know, like what is the threshold for that?

29:15

Is that counting as social media?

29:17

Probably. Like, you won't be able to mass-distribute information really in the same way

29:25

anymore.

29:26

One thing I want to add on to that: I think the key there was mass distribute

29:30

information, because it frustrates me, but I see this a lot in the privacy community where

29:35

you know, for example, Google says they're going to stop allowing sideloading on their phones,

29:39

and there's so many people that are like, oh, well, I'm super tech savvy, I know how to get

29:43

around that. And it's like, that's really great. Most people don't. And so, you know,

29:48

there are always the people that are like, oh, well, I'll know how to roll my own messenger

29:52

app and still be able to use it. And it's like, cool, you and like six other people.

29:55

And that's just not enough for the kind of mass communication that that you're talking

29:59

about there. Yeah. Also, just like, you have to think of:

30:03

who do we want to protect? We want to protect the most marginalized users. Like you don't

30:08

want to write regulation that censors or harms the people that rely on these platforms the

30:14

most: activists, journalists, academics, like, people speaking out, challenging power.

30:20

Like, again, this is what America always criticized other countries for. We criticized Russia and

30:27

China, saying you guys don't have a free and open internet.

30:29

Oh, Iran doesn't have a free and open internet.

30:31

You know, the government approves everything and, you know, LGBT people, such and

30:36

such groups, marginalized people can't speak out.

30:39

Okay, we're about to do that here in America.

30:41

Isn't that like what they literally spent like 20 years fear

30:43

mongering about these other countries, which by the way, I don't support those

30:47

other countries.

30:47

I think those are authoritarian versions of the internet that I don't want.

30:50

Personally, I think to have the people that we currently have in power enacting

30:54

that authoritarian version of the internet is extra scary.

30:57

Cause I think there's a lot of people in power right now that don't value

31:00

civil liberties and don't value free expression.

31:02

I do agree.

31:02

And I really appreciate you pointing out that this is a bipartisan thing

31:05

because the privacy community is very diverse.

31:08

Like we have people on the left, we have people on the right.

31:10

And that's why I asked that question about like, you know,

31:13

realizing that this opens the door that even if there is a power shift in the future,

31:17

if the Democrats come into power in the future, now they're wielding this.

31:20

And, you know, this really is a bipartisan, like everyone is impacted by this,

31:24

regardless of whether you buy into the narrative that's being sold to us or not.

31:28

Which the far right doesn't think will happen.

31:30

And I think they're just delusional.

31:31

Like I think some of them, and I talk to these people for work.

31:34

And I just-- the Democrats are also delusional.

31:37

I talked to Democrats last summer

31:39

that were saying a lot of this out there, like, well,

31:41

there's no way Trump would win again.

31:42

And I would just say to both of them,

31:43

you guys are both delusional.

31:45

You know, we do have this system.

31:46

At some point, someone you don't like will be in power.

31:49

And you should write laws so that when--

31:52

no matter who's in power, your rights are protected.

31:56

Because when you write these laws--

31:57

like, and I think the left and the right both

31:59

have become very anti-free speech.

32:01

Like, leftists just want to censor people on the right.

32:04

People on the right want to just censor people on the left.

32:06

And you see very few organizations--

32:08

this was a criticism of the ACLU and some other orgs--

32:10

you see very few organizations, except FIRE,

32:12

and I think some others, that have

32:13

done a good job of really being bipartisan and being like,

32:17

we're standing up for free speech, even speech

32:18

that we don't like, even speech that we find morally

32:21

reprehensible.

32:22

Like, we will defend speech.

32:24

Again, not if it's criminal, not if it's directly--

32:26

but you know what I mean.

32:27

Just protecting people's right to expression is very important.

32:30

And their right to privacy is very important.

32:32

I think that's the other thing: we deserve anonymity on the internet.

32:36

I think it's very dystopian to have everything that we say online and read and consume and

32:42

watch online tracked by the government.

32:45

Imagine the worst person that you can imagine, the worst person that you hate on

32:49

the other political team, coming into power.

32:51

You know, is that what you want?

32:52

No, you don't want them to have control over your information ecosystem.

32:56

We've covered all the questions that I had written down, but I always like to kind of

33:00

open the floor.

33:01

Is there anything that didn't come up that you're like, no, I really want to make sure we talk about

33:04

this before I go.

33:05

Well, I just would tell people to get involved.

33:07

Like I see a lot of nihilism online these days where people think, oh, it doesn't matter.

33:12

Oh, what can I do?

33:13

Truly, I promise you, I've been talking to people in these offices, these congressional offices,

33:18

it does make a difference.

33:20

Fight with them.

33:21

I've had people call and say, oh, you know, the staffer dismissed me,

33:24

and they say, oh, they're not doing identity verification.

33:26

Yes they are.

33:27

Like, don't let them gaslight you.

33:29

Call them, make your voice heard.

33:30

There's a really great website called badinternetbills.com, which Fight for the Future,

33:35

a digital rights organization, put together.

33:37

It's so great.

33:38

It makes it so easy.

33:40

You can sign their form.

33:42

You can send a letter to Congress.

33:43

They tell you exactly what to say.

33:44

They have overviews of all of these bad laws.

33:47

I really encourage people to go to badinternetbills.com and just make their voices heard.

33:51

As you mentioned, we actually were able to stave off KOSA before.

33:54

We were able to stave it off,

33:55

to make it clear that we, the public, do not want this gross invasion of privacy.

34:00

Yeah, we've been pushing that website a lot on our weekly live streams for sure.

34:04

It's great.

34:05

And I've been seeing it pop up a lot while I was doing research for this.

34:07

I'm seeing it pop up in a lot of other places too, which makes me really happy.

34:10

It's so great.

34:11

I'm so glad that they put it together because it's just very easy.

34:13

Yeah, Fight for the Future rocks.

34:15

They're doing great work.

34:16

Taylor, thank you so much for your time today.

34:17

You are very active in continuing to discuss these kinds of issues and talk to

34:21

experts about it.

34:22

So where can viewers and listeners continue to follow your work on this stuff?

34:25

Yeah, I'm on YouTube just at Taylor Lorenz. I have a series called Free Speech Friday,

34:30

where every Friday I talk about these issues, and I'm starting live streaming soon too,

34:35

because there's just so many issues; one video a week is barely enough. But yeah,

34:39

you can find me on YouTube or my newsletter, which is just usermag.co.

34:44

I think that's all we got. Thank you so much again for your time. We really appreciate it.

34:48

We want to thank Taylor again for making time in her busy schedule to come and talk to us.

34:52

This interview came together on very short notice due to the pace of current events,

34:55

so we really appreciate her being flexible and lending her expertise.

34:59

Privacy Guides is an impartial, nonprofit organization that is focused on building a

35:02

strong privacy advocacy community and delivering the best digital privacy and consumer technology

35:06

rights advice on the internet. If you want to support our mission, then you can make a direct

35:10

donation on our website, privacyguides.org. To make a donation, click the red heart icon

35:14

located in the top right corner of the page. You can contribute using standard fiat currency via

35:18

debit or credit card, or opt to donate anonymously

35:21

using Monero or with your favorite cryptocurrency.

35:24

Becoming a paid member unlocks exclusive perks

35:26

like early access to video content and priority

35:28

during the This Week in Privacy livestream Q&A.

35:31

You'll also get a cool badge on your profile

35:32

in the Privacy Guides forum

35:33

and the warm fuzzy feeling of supporting independent media.

35:36

Thank you for watching and we will see you in the next video.

35:39

(gentle music)