Did Signal's Founder Create the Most Private AI?
Ep. 36


Episode description

Moxie Marlinspike has created a private AI alternative called Confer, Windscribe has helped form a privacy alliance, Threema got acquired by Comitis Capital, E2EE comes to RCS messaging on iOS 26.3, and much more! Join us for This Week in Privacy #36!

0:21

Hello, everybody.

0:23

Welcome back to This Week in Privacy.

0:25

This is our weekly series where we discuss

0:27

the latest updates with what we're working

0:29

on in the Privacy Guides community and

0:31

this week's top stories in the data

0:33

privacy and cybersecurity space.

0:35

The stories this week include Moxie

0:38

Marlinspike's new AI chatbot,

0:40

a new privacy services alliance,

0:42

a new advanced Linux malware, and more.

0:45

I'm Jonah,

0:45

and with me this week is Nate.

0:48

How are you doing, Nate?

0:50

I'm good.

0:50

How are you?

0:51

I'm doing great.

0:52

Thank you.

0:53

For those of you who are just tuning

0:55

in for the first time or might not

0:56

know,

0:56

Privacy Guides is a nonprofit which

0:59

researches and shares privacy-related

1:01

information,

1:02

and we facilitate a community on our forum

1:04

and Matrix where people can ask questions

1:06

and discuss

1:07

and get advice about staying private online

1:09

and preserving your digital rights.

1:11

So we always kick off the show with

1:13

what we've been working on at Privacy

1:15

Guides this week,

1:16

and I'll hand it over to Nate to

1:17

share a bit about what he's been doing

1:19

on the video side of things.

1:22

Awesome.

1:22

Yeah,

1:23

so there's been some exciting

1:24

developments.

1:25

For those of you who didn't know,

1:26

we have a new video now out to

1:28

the public.

1:28

It's loosely based on Em's written article

1:33

that privacy is like broccoli.

1:35

So if you haven't checked that out yet,

1:36

please go check it out.

1:39

Very proud of everyone at the team.

1:40

It's obviously, it's not just me,

1:41

Jordan edited and everybody helped double

1:44

check the writing and everything.

1:45

So yeah,

1:47

we're continuing to work on our video

1:49

courses, the smartphone security guide,

1:52

and I believe there's still some work

1:54

going on.

1:54

Well,

1:54

there's definitely some work going on on

1:56

the threat modeling course.

1:58

And I also,

2:02

Just came in.

2:03

I yesterday recorded a new video about,

2:07

I should know what this was.

2:08

I just recorded it, and I'm drawing a

2:09

blank.

2:12

Private browsing.

2:13

That was it.

2:15

Yeah.

2:15

Yesterday I just recorded a new video

2:16

about private browsing and that will be

2:19

coming out very soon as well.

2:23

We have, yeah, we've got a lot

2:24

going on behind the scenes. It's hard to

2:26

keep track of all of this. I

2:28

can share some other updates on our end.

2:30

Most notably, if you missed it last

2:34

week, we had some personnel changes.

2:37

Our wonderful intern Kevin is no longer

2:40

working with us; his internship ended, so

2:44

he's moving on to other projects. But

2:46

it was a great experience working with him.

2:48

I'm super glad that we got the

2:50

chance to do that, and I hope to

2:51

see Kevin

2:53

hopefully stick around on the forum and do

2:54

some volunteer work,

2:55

maybe some one-off projects with us in the

2:57

future.

2:57

But of course, that's up to him.

2:59

We'll see how that goes.

3:00

But I just wanted to, again,

3:02

thank Kevin for all of his work and

3:05

explain a bit about what the show is

3:07

going to look like going forward.

3:08

Of course,

3:09

it's going to be mainly Nate and I

3:12

or...

3:13

hosting this show.

3:14

Jordan may be hosting some episodes as

3:16

well, like they used to.

3:18

But that'll be kind of the roundup for

3:21

this week in privacy going forward.

3:24

Besides that,

3:25

we published several news articles this

3:27

week at privacyguides.org/news.

3:30

There were Instagram password resets in

3:32

the news, end-to-end encryption potentially coming to RCS on iOS,

3:37

which I believe we're going to talk about

3:38

later in the show.

3:40

Threema,

3:41

which is a pretty popular end-to-end

3:42

encrypted messaging app,

3:43

was recently acquired by another private

3:46

capital firm.

3:47

There's some vulnerabilities in AI agentic

3:50

browsers.

3:51

There's news about Windscribe and their

3:55

partnership with some other privacy

3:57

services,

3:57

which is another topic that we're going to

3:59

cover in more detail here in the show.

4:01

And an article on the WhisperPair

4:04

Bluetooth vulnerability.

4:06

So definitely a lot of cool stuff coming

4:08

out there.

4:09

Freya has been working really hard getting

4:12

timely news briefs and news updates out on

4:14

our site for people who are interested in

4:16

that sort of thing.

4:17

So if you want to keep up to

4:20

date with that,

4:21

definitely give the privacyguides.org/news

4:24

site a follow, or follow it

4:27

on various social media platforms or the

4:29

news category on our forum,

4:30

which is where all of these articles are

4:33

posted.

4:35

I also want to highlight an article that

4:37

I wrote a year ago.

4:38

I know with a lot of political news

4:40

going on in the world,

4:42

I think it's just always a good time

4:44

to maybe give a reminder of this.

4:46

We do have an article on smartphone

4:48

security and keeping your smartphone safe

4:52

if you are attending any events

4:55

in person, like protests, or activities like

4:58

that. If you're an activist or you're a

5:00

journalist who needs to be sure that your

5:03

baseline security for your smartphone is

5:06

up to speed, definitely give this article

5:09

a read.

5:11

I'll have a link to this in the

5:13

show notes when we send this out later.

5:14

But yeah,

5:17

just since the article is a bit old,

5:18

I wanted to resurface it here.

5:21

I think off the top of my head,

5:23

that's all of the major updates we've been

5:25

working on at Privacy Guides.

5:26

It's been a pretty busy few weeks for

5:28

me.

5:28

We've been working on a lot of things

5:30

behind the scenes.

5:31

And if you're a regular member on our

5:33

forum,

5:34

you may have seen some of the notes

5:36

from meetings that we've had,

5:37

and we have some future stuff that we

5:39

have to work out.

5:41

still in the works.

5:43

But I think that's kind of everything for

5:45

now.

5:47

Nate,

5:47

if you don't have anything else to add

5:49

in terms of Privacy Guides updates,

5:50

I think I'll hand this over to you

5:53

to share some of our news stories,

5:55

maybe our first one.

5:58

Yeah, all right. So our

6:03

headline story this week is about Moxie

6:05

Marlinspike's new AI chatbot. This has

6:09

been kind of making waves. For those of

6:10

you who don't know who Moxie Marlinspike

6:12

is, you probably at least know his

6:14

work, which is the Signal messenger.

6:17

Moxie created Signal. He passed off the

6:20

reins to the current president.

6:22

Well,

6:23

I don't know if he picked her necessarily,

6:24

but he passed off the reins.

6:25

And now who has taken over is Meredith

6:27

Whittaker, who, yeah.

6:32

Anyways,

6:33

she's doing a great job with Signal.

6:35

But he moved on, what was that,

6:39

about a year?

6:40

No, more than a year ago.

6:40

My sense of time is all messed up.

6:42

But he's been kind of laying low,

6:43

actually.

6:44

At least I feel like.

6:45

I certainly haven't heard his name in

6:46

quite a while.

6:47

And now it's popped up with this new

6:49

chatbot called Confer.

6:51

And I believe the link for that is

6:52

confer.to if any of you want to check

6:55

it out.

6:55

But it's very minimal right now.

6:57

And what really sets this one apart is

6:59

this might be, to my knowledge,

7:02

the first really private AI chatbot on the

7:06

user side.

7:07

And we're going to dig into that in

7:09

a little bit.

7:09

But...

7:12

You know, things like,

7:13

which this article here does unfortunately

7:15

incorrectly say at the end,

7:17

things like Lumo, things,

7:18

as far as I know,

7:18

things even like Brave's Leo,

7:20

I could be wrong about that one,

7:21

but they don't really protect the user.

7:25

They're basically like a no log VPN for

7:27

LLMs.

7:28

They promise they won't keep your logs,

7:30

but technically they can access them.

7:33

If there was a situation like,

7:35

you know, famously years ago,

7:37

Proton was forced by law to monitor a

7:40

certain account and record the IP

7:42

addresses,

7:42

because they were trying to figure out who

7:43

was behind that account.

7:45

Not Proton;

7:46

the authorities were trying to figure it

7:47

out.

7:48

And, you know,

7:49

eventually they were able to do that.

7:51

So this,

7:54

this is kind of,

7:57

this is the first time that they're trying

7:58

to create something more like Signal,

8:00

where

8:01

that's not even possible.

8:02

It's all encrypted from start to finish

8:06

for real.

8:07

And yeah, it's, I mean, it's, okay.

8:12

So if you go to the confer.to website

8:14

and you click on the blog,

8:15

you can dig in there and there is

8:17

some like really

8:20

There's some really technical details.

8:21

I don't think there's any code or

8:22

anything.

8:22

I mean, the thing's open source,

8:23

so you can go look at the code

8:24

if you're that technical.

8:25

This Ars Technica article, I think,

8:27

does a pretty good job of dumbing it

8:28

down.

8:29

And basically,

8:30

it really relies on what are called

8:31

trusted execution environments.

8:34

This goes way over my head,

8:35

and I'm sure Jonah can probably break it

8:38

down a little bit simpler.

8:39

But basically, the way they describe it,

8:41

they say it prevents even server

8:42

administrators from peeking at or

8:44

tampering with conversations.

8:46

And they talk about how it's designed

8:47

where you can access it from different

8:50

devices,

8:50

and it can synchronize just like Signal.

8:52

And I don't know.

8:54

It's really cool stuff.

8:58

Before I jump into the next part of

8:59

this thought, Jonah,

9:01

is there anything you want to add?

9:03

Yeah, so I don't know.

9:06

I'm probably less excited about this

9:08

Confer stuff than other people I've seen

9:10

in the community.

9:12

All of this private AI stuff has been

9:15

trending quite a bit lately,

9:16

and they especially rely on these trusted

9:18

execution environments.

9:20

TEEs. Confer is no different,

9:23

but we've seen

9:24

For example,

9:25

if anyone wants to look into this later,

9:26

we've talked a lot about Maple AI on

9:28

the forum,

9:29

which uses a similar technology to kind of

9:32

validate the security of how these AI

9:34

models are being run in theory.

9:38

And basically what these trusted execution

9:41

environments are

9:45

features that are built into the CPUs of

9:48

servers that these models are running on,

9:50

where they can run a certain set of

9:53

code and it can be validated by the

9:56

hardware that the code hasn't been changed

9:58

or modified in theory.

10:00

So if an AI service, for example,

10:02

like Confer is releasing an open source

10:05

model and they're saying this code is

10:06

what's running on here,

10:07

the hardware should, in theory,

10:09

ensure that they're not swapping out that

10:11

code behind the scenes and it might

10:13

protect you.
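
To make that idea concrete, here is a minimal sketch of the check a client could perform, with hypothetical names and formats rather than Confer's or any CPU vendor's actual protocol: the hardware signs a "measurement" of the code it booted, and the client compares that measurement against the hash of the published open-source build.

```python
# Minimal sketch of the remote-attestation idea behind TEEs. All names and
# the document format here are hypothetical; real schemes (Intel SGX/TDX,
# AMD SEV-SNP) use certificate chains rooted in the CPU vendor's keys.
import hashlib
import json

# Hash ("measurement") of the open-source server build the operator claims
# to run. With reproducible builds, anyone can recompute this from source.
EXPECTED_MEASUREMENT = hashlib.sha256(b"open-source server build v1.0").hexdigest()

def verify_attestation(attestation_json: str, hardware_signature_ok: bool) -> bool:
    """Accept the server only if the hardware-signed measurement matches."""
    doc = json.loads(attestation_json)
    # 1. In a real scheme, the quote is signed by a key fused into the CPU;
    #    here that signature check is reduced to a boolean for brevity.
    if not hardware_signature_ok:
        return False
    # 2. The measurement inside the signed quote must equal the expected
    #    hash, showing the operator hasn't swapped the code out.
    return doc["measurement"] == EXPECTED_MEASUREMENT

# A server presenting the right measurement passes; anything else fails.
quote = json.dumps({"measurement": EXPECTED_MEASUREMENT})
assert verify_attestation(quote, hardware_signature_ok=True)
```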

10:15

The problem with TEE is that there are

10:19

quite a few limitations,

10:20

and it wasn't really designed to protect

10:23

against the operator of the server

10:27

potentially being a physical attacker

10:28

here.

10:29

And so we've seen security experts like

10:33

Matthew Green, for example,

10:34

on Twitter caution against the use,

10:38

or not against the use,

10:40

but against relying too much on TEEs for

10:43

the security of AI models because they

10:46

can't provide the same type of guarantees

10:48

that full end-to-end encryption can

10:51

provide.

10:52

That's simple math, whereas this is

10:54

a bit more policy based.

10:56

It's certainly a step better than,

10:58

you know,

10:59

them just saying they're not going to look

11:00

at it,

11:00

but not really doing anything about it.

11:02

But it's nowhere near the guarantees that

11:04

you're going to get from end to end

11:05

encryption.

11:07

And so it's tricky to say that any

11:12

of these are truly private because

11:15

While some AI models like Lumo,

11:17

for example, or this one or other ones,

11:19

they will use end-to-end encryption for

11:21

the storage of your chat logs.

11:24

They don't necessarily, well,

11:28

they don't at all use end-to-end

11:29

encryption on the chat as you're chatting

11:32

with it.

11:33

And they can theoretically read that chat

11:35

or log it

11:36

at that time.

11:38

And there isn't really any way around this

11:40

because these AI servers,

11:41

they need access to your data in order

11:44

to run the AI query on it.
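
A small sketch may make that asymmetry clearer. Assume a hypothetical service that end-to-end encrypts stored history under a client-held key, which is roughly what Lumo and Confer describe, but that, like any hosted model, must see the live prompt in plaintext to run inference:

```python
# Sketch of the storage/inference asymmetry (hypothetical API, not any
# specific service). Chat *storage* can be end-to-end encrypted under a key
# only the client holds, but the *live prompt* must be plaintext wherever
# the model runs. Requires `pip install cryptography`.
from cryptography.fernet import Fernet

def run_model(text: str) -> str:
    # Stand-in for the hosted LLM; the point is that it receives plaintext.
    return f"reply to: {text}"

client_key = Fernet.generate_key()   # never leaves the user's device
vault = Fernet(client_key)

prompt = "symptoms I looked up last night"

# Storage path: the server only ever holds ciphertext it cannot read.
stored_blob = vault.encrypt(prompt.encode())

# Inference path: the model needs the raw tokens to compute a reply, so
# whatever runs it (inside a TEE or not) handles the unencrypted prompt.
reply = run_model(prompt)
```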

11:47

And so that's why

11:50

I think all of these cloud AI models,

11:52

they don't probably reach the level of

11:54

privacy that a lot of people are going

11:56

to want from AI,

11:58

especially for some of the things that

11:59

people are using this for.

12:02

I think we're going to maybe talk about

12:04

this in a bit,

12:04

but I know there's a ton of stories

12:06

about how AI is starting to be used

12:07

for health stuff and personal,

12:10

very sensitive information.

12:12

And these protections, in my opinion,

12:14

don't really go

12:16

far enough to protect that

12:19

data, unfortunately.

12:22

That's kind of how I would describe the

12:23

technical side of things.

12:25

I know there's a lot that we want

12:26

to talk about that isn't necessarily

12:28

technical,

12:28

but more just privacy concerns with AI in

12:32

general,

12:32

beyond the pure security of the system.

12:37

But yeah,

12:39

I definitely want to draw the

12:41

distinction between TEEs

12:45

being used

12:48

in this case versus end-to-end encryption,

12:50

because end-to-end encryption,

12:51

it's a whole different ballgame,

12:53

and it's much better security,

12:54

and AI is not providing that in any

12:57

of these cases.

13:01

Cool, yeah,

13:02

and thank you for clarifying what I meant

13:03

when I said that most other, like Lumo,

13:06

is more like a no-logs VPN in the

13:08

sense that they can read your chats,

13:11

they just promise not to store them.

13:12

And they do store them in a way

13:13

where they can't read them,

13:14

but while you're having that chat,

13:16

they totally could.

13:17

And that's what this is supposed to do

13:18

differently.

13:19

But yeah, let's,

13:21

because I'm excited to get into this,

13:23

let's go ahead and talk about,

13:24

towards the end there,

13:25

you mentioned that this is all fine and

13:28

nice from a technical perspective,

13:29

potentially,

13:31

Um,

13:31

but this doesn't solve a lot of the

13:36

privacy concerns with AI and, um,

13:40

Man, this is really good.

13:41

So just to kind of give listeners a

13:43

tiny peek behind the curtain,

13:45

we have a weekly meeting,

13:46

a staff meeting at Privacy Guides.

13:48

And part of that discussion is always what

13:50

story do we think would make the best

13:52

headline story?

13:53

And when we talked about this,

13:56

our staff member, Em, had

14:00

a lot to say, really.

14:02

She didn't just make a point; she

14:04

gave a really good talk.

14:05

I guess you could call it that.

14:06

But anyways,

14:07

she pointed out that something that we

14:09

never really talk about,

14:10

and I'm sure some of you guys have

14:11

thought about this,

14:12

but I personally have never thought about

14:14

this before.

14:15

And AI really,

14:19

at least in its current form,

14:20

can't be made private.

14:21

Like maybe it can from the end user,

14:23

but that doesn't count the data and how

14:27

the data is scraped.

14:28

And it's...

14:32

I think a lot of us know that

14:34

anything you put online you should treat

14:36

as,

14:37

you know, being public.

14:39

At least that's something I've always been

14:40

able to say,

14:41

or that's something I've always said in

14:42

the past: anything you post online,

14:44

be prepared for it to be breached or

14:46

anything like that.

14:47

But even so,

14:48

that doesn't make it right for these

14:49

companies to go around scraping all this

14:50

data without consent.

14:51

And I do want to acknowledge there are,

14:53

I know there's at least one AI that

14:57

they're trying to, or not, well,

14:58

I guess you could call it an AI,

14:59

a training set they're trying to create

15:01

that is drawn from

15:04

consenting users, but at this time there's

15:06

really no mechanism like that. I think

15:08

Creative Commons is currently in

15:10

talks to create a flag where basically you

15:13

can say, oh yes, I consent to let

15:15

AI train on this data. But certainly

15:19

ChatGPT, Anthropic, whatever some of the

15:22

other ones are out there, I don't think

15:23

any of them have ever respected that,

15:26

because they existed before that was a

15:27

thing. And we already know that there's so

15:29

many stories about, like,

15:32

copyrighted material,

15:33

the Harry Potter books,

15:33

the New York Times is currently in a

15:35

lawsuit. There are so many stories

15:37

where we know that this is what's

15:38

happening.

15:38

They're training on data that is not

15:41

licensed for free use,

15:42

that is not consented to.
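
For reference, the closest thing to an opt-out mechanism today is the voluntary robots.txt convention. In the sketch below, the crawler names are real (OpenAI's GPTBot, Common Crawl's CCBot), but whether any given scraper honors them is entirely up to the scraper, which is exactly the consent gap being described:

```python
# Sketch of crawler opt-out via robots.txt directives that some AI scrapers
# say they honor. Honoring is voluntary; nothing enforces it.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved AI crawler checks this before fetching; a badly behaved
# one can simply ignore it.
print(parser.can_fetch("GPTBot", "https://example.com/post/1"))       # False
print(parser.can_fetch("SomeBrowser", "https://example.com/post/1"))  # True
```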

15:44

And yeah,

15:48

that's a trickier thing that I don't think

15:50

can be solved in a technical way.

15:55

Any thoughts on that one, Jonah?

15:58

Yeah,

15:59

there's so many concerns with the privacy,

16:01

and I totally agree with what Em had

16:03

brought up in our meeting and on Mastodon.

16:05

A lot of these things that are being

16:08

built right now,

16:09

they're just being used in very un-private

16:10

ways, and it requires harming privacy,

16:15

exactly like you said,

16:16

in order to build these things in the

16:17

first place.

16:18

And so it's very concerning what's going

16:21

on with AI here.

16:28

Yeah,

16:29

I think she summed it up pretty well.

16:31

So I'm not sure if I have anything

16:33

else to add beyond that off the top

16:35

of my head.

16:39

Sorry,

16:39

I was just looking at the chat here,

16:41

seeing if something was coming up.

16:42

So I was a bit distracted.

16:44

But yeah,

16:46

I think that's kind of where I'm at

16:47

with that.

16:47

Yeah,

16:47

we're trying something new on the back end

16:49

here, and there's a lot of moving parts.

16:52

Yeah, let us know.

16:54

Definitely, if you're watching,

16:55

leave a comment on how the stream is going

16:58

in terms of quality.

16:59

Or if you're chatting on a different

17:00

platform,

17:02

let us know and we can see if

17:03

that shows up.

17:04

Because, yeah,

17:06

we have a bit of a different setup

17:07

than usual.

17:10

But, yeah, going back to AI, I mean,

17:12

it's...

17:13

I don't know.

17:14

I feel like there's not much to add

17:15

to that statement,

17:15

to what Em said.

17:18

But it's such a – to me, again,

17:20

I thought it was really enlightening

17:20

because I've never thought of that before.

17:22

And it feels so – because I've seen

17:24

some people say,

17:26

particularly in the Mastodon post that Em

17:30

sent us where she kind of put all

17:32

these thoughts into words.

17:33

And someone argued,

17:35

like I said at the top, it's like,

17:36

well,

17:36

you post anything online and you know it's

17:37

going to be public.

17:38

And I think this kind of falls under...

17:39

17:40

A long time ago,

17:41

I wrote a blog post where I made

17:42

the argument that there's a difference

17:44

between an expectation of privacy and an

17:46

expectation to be stalked.

17:48

And, you know,

17:49

I've been in situations where because I

17:50

have, you know,

17:51

very visible arm tattoos for those who've

17:53

never seen my arms before.

17:55

And I usually wear short sleeve shirts,

17:57

especially in the warmer months.

17:58

And I've had many times where, you know,

18:00

I've had like there was one time I

18:02

went to the store and as I was

18:04

leaving the store,

18:04

a friend texted me and was like, hey,

18:06

I just saw you at the store.

18:06

And I'm just like,

18:08

How did they know it was me?

18:09

And this was like during COVID.

18:10

So like everybody had a mask on.

18:11

I'm like,

18:12

how did they know it was me?

18:13

And then I'm like, oh, right.

18:15

Duh.

18:16

But and, you know,

18:17

I'm not mad about that.

18:17

Right.

18:18

But the difference is that friend didn't

18:19

then immediately like get in the car and

18:21

follow me around town as I ran errands

18:23

or went back home or whatever I did.

18:24

And it feels different when, you know,

18:27

you post something publicly and sure,

18:29

somebody might see it.

18:30

Somebody might get upset.

18:32

versus that might get scraped up.

18:34

That might get added to a data training

18:36

set.

18:38

I personally have, and many people do,

18:40

I have disappearing messages enabled on

18:41

Mastodon because I want them to go away.

18:44

And if they get caught up in a

18:46

data training set,

18:48

(I'm having a hard time talking tonight)

18:49

then they never go away.

18:51

They're there forever.

18:52

And it's just such an interesting

18:55

perspective that I never considered.

18:56

Because when we look at these new

18:57

technologies,

18:59

I think there's two kinds of problems.

19:01

There's technical problems and then

19:03

there's,

19:04

I don't know what I would call them,

19:05

but there's all the other problems.

19:07

And technical problems to me are things

19:09

like the energy usage.

19:12

And for the record,

19:13

I am not trying to downplay this,

19:15

but the energy usage of AI is really

19:17

bad for the environment, right?

19:20

That's a technical problem.

19:21

As we go, I think,

19:23

not to sound overly optimistic,

19:25

but I think we'll learn how to make

19:27

more energy-efficient data centers.

19:28

We'll learn how to maybe switch to

19:30

renewable energies, maybe someday,

19:31

eventually.

19:33

And all these things that can reduce the

19:36

environmental impact of AI.

19:38

But then there's the other things like...

19:42

what we were just talking about,

19:43

the fact that all this privacy data was

19:44

taken,

19:45

the fact that this is potentially putting

19:46

people out of work, you know,

19:48

these are much harder things to solve.

19:51

And maybe there is a solution if you

19:52

want to be an optimist,

19:53

but I think starting this discussion is

19:56

certainly part of that, you know,

19:58

finding that solution if nobody's talking

19:59

about that side of things, for sure,

20:01

because

20:02

I don't know if I said this,

20:03

but we usually think of privacy as the

20:05

end user, my prompts and that stuff,

20:07

the responses.

20:07

But I feel like we don't talk enough

20:10

about the training data in the sense of

20:12

the privacy invasion from it.

20:14

So yeah, it's really interesting stuff.

20:19

Absolutely.

20:20

And beyond just the training aspect of it,

20:24

I think...

20:27

like AI has definitely shown that privacy

20:29

has never really been more important in

20:32

protecting your data online because we

20:35

could talk about this other story that we

20:36

have.

20:36

This was reported by 404 Media,

20:39

talking about Grok's AI sexual abuse

20:41

material that it's generating on Twitter

20:43

right now or X.

20:45

And I think it also shows that like,

20:50

You have to be careful with the data

20:53

and images that you're posting online,

20:55

especially personal stuff,

20:56

because now people can basically use these

21:01

models.

21:01

They're getting better and better at

21:02

creating very photorealistic things,

21:04

and they can take innocuous images or just

21:06

selfies or any images of people and turn

21:09

it into stuff that you probably don't want

21:12

to see or don't want on the internet.

21:15

And certain people are going to be more

21:16

affected by this than others.

21:18

But I think that...

21:20

You know,

21:21

AI is creating a very dangerous

21:23

environment right now where any data that

21:30

you put out can potentially be misused or

21:34

blown far out of proportion,

21:36

far beyond what you originally intended

21:38

when you were maybe making a post.

21:41

And I think it's just, I don't know.

21:44

AI, I think,

21:45

has created a lot of terrible situations

21:46

all around in terms of privacy, for sure,

21:52

and in terms of safety on the internet.

21:54

And I don't really see a way that

21:57

it's ever going to be rectified.

21:59

There aren't a lot of great solutions

22:01

here.

22:02

And so it makes me very hesitant to

22:05

recommend or use AI in any

22:07

capacity, because it's

22:10

really creating a monster that we don't

22:12

really know how to control, and

22:14

it's not a good idea, I think, for

22:16

people to become reliant on tools like

22:20

this and use it in their everyday

22:22

lives for all sorts of things. Because,

22:26

you know, eventually, you know,

22:27

it's either going to create these

22:29

dangerous situations or it's going to have

22:31

to be reined in by these tech companies.

22:34

And then you're in a situation where,

22:35

you know,

22:36

tech companies are kind of censoring the

22:39

stuff that you can create online.

22:43

There's always a censorship problem,

22:44

I think,

22:44

with a lot of these centralized services

22:46

and big tech services where outsourcing

22:50

all of your control to these centralized

22:52

cloud providers instead of trying to do

22:54

everything yourself puts you in a very bad

22:57

situation.

23:00

We see it all the time in other

23:02

industries,

23:03

and it's something that I think we can

23:06

catch right away and try to avoid going

23:08

forward.

23:09

I don't know what the general vibe for

23:12

AI is among the general population right

23:16

now.

23:16

I don't think AI is a huge thing

23:20

for most people outside of the tech

23:22

sphere.

23:24

And I think most people are rejecting AI

23:26

right now, which is probably...

23:28

probably a good thing, because it just

23:31

seems to be creating a lot more harm

23:33

than good right now in so many different

23:35

ways. Yeah, that's fair. And to be clear,

23:42

when I kind of take a potentially

23:48

optimistic approach,

23:49

I'm not necessarily trying to be pro AI.

23:51

I'm just, I'm trying to be fair.

23:52

I'm trying to point out, cause you know,

23:54

one,

23:54

and this is definitely not a one-to-one,

23:55

this may even be a disingenuous

23:57

comparison, but,

23:58

One comparison I keep hearing people make

24:00

is newspapers.

24:01

When newspapers first went to mass print,

24:05

we had a huge problem with misinformation

24:07

and disinformation and what's now called

24:09

yellow journalism,

24:10

which is – we still see to an

24:12

extent the super sensational just making

24:14

things up out of thin air because it's

24:16

scandalous and it sells.

24:17

I think now we call it clickbait.

24:19

But we –

24:22

That was eventually something that we were

24:23

able to mostly figure out by creating

24:25

legislation and having these very strict

24:29

slander laws and things like that.

24:31

And to be fair, laws are retroactive,

24:33

right?

24:33

Like laws work after the fact,

24:35

after somebody has already been harmed.

24:37

So I'm not saying that's a perfect

24:38

solution,

24:39

but I think we can agree that in

24:40

the end,

24:40

newspapers ended up being better than they

24:42

were.

24:42

So that's kind of the lens I'm trying

24:44

to look at this through is like,

24:46

what are we looking at in terms of

24:48

some of these problems are technical

24:49

problems that can be solved,

24:51

but then others like you're right.

24:52

It's, it's,

24:53

I think going back to the whole training

24:57

data thing.

24:57

Okay.

24:57

We create,

24:58

let's say we created a system where people

25:00

have to opt in to have their training

25:02

data scraped up by AI and whatever that

25:04

looks like, whether it comes with a, like,

25:07

what's the word I'm looking for?

25:08

Compensation or whatever.

25:09

Would that be enough?

25:10

Because that's what makes AI so quote

25:13

unquote good or effective is that it just

25:14

has obnoxious amounts of training data.

25:17

And I would be surprised if enough people

25:20

opted in to really make AI as effective

25:22

as it is now.

25:23

And again, to be clear,

25:24

I'm not saying like, oh,

25:25

people should opt in.

25:26

Like, no,

25:27

it's your content that you're creating.

25:28

Do whatever you want with it.

25:30

But it's just,

25:31

it's kind of backing up what you were

25:32

saying, Jonah,

25:33

about if

25:34

we may be facing something that there may

25:36

be no way to use it privately.

25:37

And I think that is,

25:40

I agree with you.

25:40

I think that's really good that people are

25:42

opting not to use this,

25:43

but I do worry for people.

25:45

We had at least one person in the

25:46

forum who said that they're in a field

25:48

where it's becoming increasingly difficult

25:50

to navigate that field without having AI

25:54

on your resume because it's,

25:56

I don't even know what businesses are

25:57

using it for,

25:58

but apparently everybody wants you to be

25:59

able to learn AI,

26:00

whatever that even means.

26:02

I don't know.

26:03

And so I guess know how the prompts

26:06

work.

26:06

I really don't know.

26:07

But if you're in a field where you're

26:09

like, I don't like to use AI,

26:11

even if you just don't like it,

26:12

I don't see a use for it.

26:13

I don't see a value.

26:14

I've tinkered with AI in the early days

26:15

and-

26:16

There's a few things it does really well,

26:18

but overall,

26:19

I don't see how it became this trillion

26:21

dollar industry that's propping up the

26:22

entire US economy.

26:24

It's just not that, for me at least,

26:25

it doesn't do that much.

26:26

So if I were in one of those

26:28

fields where they're like, well,

26:29

how often do you use AI?

26:31

Hardly ever because it just doesn't – I

26:33

don't have a use for it.

26:34

It doesn't do anything for me.

26:35

And anyways, yeah,

26:36

my point being what I'm trying to get

26:38

to is it's unfortunate that some people

26:40

are in a position where they're now stuck

26:42

where they have to show that they know

26:44

how to use AI.

26:46

And I don't know.

26:47

It's like –

26:49

It's like phones,

26:50

like phones are not really private, right?

26:52

And it's hard to make them private.

26:53

And some people can afford to not have

26:55

a phone.

26:56

They can work in a field or they

26:57

can be self-employed where they don't have

26:58

a phone,

26:59

but not everybody has that luxury.

27:00

And it's really unfortunate people are

27:02

being put in that situation.

27:03

Yeah,

27:04

I think you bring up a really good

27:05

point,

27:05

especially with the whole like having to

27:09

maybe opt in or like consensually sharing

27:12

your data with AI.

27:13

This is something that really bothers me

27:14

about the current AI landscape, actually,

27:16

because I think these tech companies have

27:18

kind of created this situation where now

27:21

that they've created the problem and all

27:23

of these problems that AI causes.

27:25

now they're trying to sell you on various

27:27

solutions to try and take control of it

27:29

after the fact. And it's like, these

27:30

problems wouldn't exist without all of the

27:32

AI that's being pushed on consumers from

27:35

these tech companies fairly irresponsibly.

27:37

I was just looking, I mean, we

27:39

were just taking a look at YouTube Studio

27:41

the other day and looking at some

27:43

of their likeness detection features,

27:44

and how that's going to require me to,

27:47

you know, scan my face and send them

27:48

my ID if I want to,

27:51

you know,

27:51

monitor YouTube for people who are

27:53

potentially creating AI generated videos

27:57

of me, for example.

27:59

And this is not a position that I

28:00

think people should be put in, in the

28:02

first place because it's just yet another

28:05

thing in a long string of events where

28:07

tech companies create these problems as an

28:09

excuse to try and get more and more

28:11

of your data.

28:12

And now I have to share even more

28:13

data with Google that they didn't

28:16

necessarily have before because of the AI

28:18

problems that they've created.

28:20

And I'm sure that this is going to

28:20

be commonplace

28:22

On other platforms, if it isn't already,

28:24

it'll be coming soon.

28:25

And I'm sure there's not going to be

28:27

a single way to opt out of it

28:29

everywhere across the internet because

28:32

there's just no coordination like that.

28:34

And there isn't really a great way to

28:35

do it privately.

28:37

And so by just accepting AI and kind

28:41

of normalizing all of this,

28:42

that's just kind of the society that we're

28:45

creating here.

28:47

We're just losing the ability to control

28:51

who has access to our data and who

28:55

benefits from it.

28:56

And unfortunately, at this point,

28:59

it seems pretty clear that the only people

29:00

who are really benefiting from having all

29:02

of our data is

29:05

these big tech companies.

29:07

So I don't know.

29:08

It's ridiculous, I think.

29:13

Yeah, for sure.

29:14

The normalization,

29:15

that's a huge problem with privacy, right?

29:19

As privacy advocates,

29:22

it's so normal to use these tools that

29:24

it sometimes can, I don't know, it's...

29:28

I'm sure a lot of us have been

29:29

in that situation where it's like, oh,

29:30

I don't have Facebook.

29:31

What do you mean you don't have Facebook?

29:33

And they like, I don't know.

29:34

Usually when I say that,

29:35

people are just like, whoa,

29:36

that sounds awesome.

29:37

And I'm like, yeah, just delete it.

29:37

It's not that hard.

29:38

But I also know some people have just

29:40

been met with like, you know,

29:43

they're isolated now because it's like,

29:44

oh, well, you're not on Facebook.

29:45

So I didn't send you an event invite

29:47

because apparently you don't exist

29:48

anymore.

29:49

So yeah,

29:51

when you use that word normalization,

29:52

that really jumped out at me.

29:54

That's such a problem with a lot of

29:55

these privacy-invasive technologies: they

29:57

become normalized.

30:01

I don't have any more thoughts to add

30:04

to that one.

30:05

Do you have anything to add before we

30:06

move on to our next story?

30:09

I think that kind of covers all the

30:11

stuff I was thinking about with AI.

30:14

So we can take a look at our

30:15

next post here.

30:17

This comes from the Windscribe blog,

30:20

actually.

30:20

The headline is,

30:22

Windscribe partners with Kagi, Notesnook,

31:25

Addy.io, and Ente to create a privacy-

31:27

focused alliance.

30:29

And so basically,

30:30

what Windscribe has done is they've

30:33

partnered with all the services I've just

30:35

named to give people kind of exclusive

30:39

discounts or deals on all those services

30:41

if you're a Windscribe user.

30:43

And I know this is I don't remember

30:45

if we've talked about this in a previous

30:46

episode,

30:47

you can remind me, but I know Ente

30:49

has done a similar thing with other

30:51

services in the past.

30:52

And it looks like Windscribe is kind of

30:54

joining in

30:55

on that initiative.

30:57

So I think it's pretty cool what they're

31:00

doing.

31:04

I guess the question that we would

31:06

probably want to talk about is how do

31:09

we feel about these privacy alliances?

31:11

Do you have any opinions?

31:12

I have a couple of things to say

31:14

for sure,

31:14

but I can let you go first.

31:19

I got to be honest,

31:19

I think this is the first one I've

31:21

seen,

31:21

or at least the first one on this

31:22

scale, for sure.

31:27

I don't really have too much of an

31:28

issue with it personally.

31:30

I think the thing that disappoints me is

31:31

that a lot of these are like Addy.io,

31:34

Addy.io,

31:34

twenty five percent off the first year.

31:37

Same thing with Ente,

31:37

twenty five percent off for the first

31:38

twelve months.

31:40

I think.

31:43

I don't know.

31:43

Maybe it's just me being cheap,

31:44

but I'm one of those people that if

31:45

I'm going to sign up for a discount,

31:47

I would like to continue to have that

31:48

discount.

31:49

But I mean,

31:49

at least they're being upfront about it.

31:51

But I don't know.

31:52

I think...

31:54

I don't have too many issues with it

31:55

because I think Windscribe's

31:58

logic makes sense.

31:59

You know, if,

31:59

if you read the blog posts that they

32:01

put out, they said, uh,

32:03

like why a privacy focused partnership

32:05

instead of just like building a suite in

32:07

house.

32:08

And their answer is basically

32:10

compartmentalization.

32:11

You know, if, if you compartmentalize,

32:13

then.

32:13

If any one of these services goes away

32:16

or becomes compromised or what have you,

32:18

then it's just that one service.

32:20

It's not across the board.

32:21

It's not your entire account.

32:23

And I think they make a really good

32:24

point there.

32:25

And also one thing they didn't say,

32:27

but one thing I've historically said that

32:28

I really believe that is that when you

32:30

try to do everything,

32:32

usually you end up doing everything kind

32:33

of poorly.

32:35

So I would definitely prefer like

32:37

Windscribe.

32:37

We're going to focus on our VPN and

32:39

we're going to make a really good VPN

32:41

and we're going to let Ente handle

32:43

the photo storage.

32:45

We're going to let Kagi handle

32:47

the search and, you know,

32:49

which I've heard really good things about

32:50

Kagi.

32:50

I still haven't used it myself,

32:51

but I've used Ente.

32:53

I'm very happy with it.

32:54

So yeah, it's,

32:55

it's kind of nice to see that, um,

32:58

that specialization there. The only thing I

33:00

can think that might kind of not be

33:03

great is, a lot of quote-unquote normie

33:06

users are really big fans of, they want,

33:09

the ecosystem, right? That's

33:11

one of the amazing things about Google,

33:12

right? You get an email and it

33:14

says, hey, let's have lunch, and

33:16

it automatically asks, do you want to add

33:17

this to your calendar? Or it used to

33:19

ask; I think now it just does it,

33:20

but I don't use Google anymore, so I

33:21

have no idea.

33:23

And it adds it to your calendar.

33:25

And then when you send an email and

33:26

you add an attachment that's too big,

33:28

it's like, oh,

33:29

do you want to just one click,

33:30

add this to Google Drive and send it

33:31

that way?

33:32

And they make it so seamless and

33:34

everything works together.

33:36

And so that is kind of the argument

33:37

for things like Proton, for example.

33:39

If you don't use them, that's fine.

33:41

You don't have to.

33:41

But it's a really compelling alternative

33:43

for people who want that ecosystem.

33:45

And that's kind of the only thing I

33:46

could see getting in the way from my

33:47

perspective is some people may say like,

33:49

well,

33:49

why would I sign up for six different

33:51

services when I could

33:52

just go somewhere else and get it all

33:54

at once? But yeah, I think those are

33:56

kind of all my thoughts. Absolutely, I

33:59

totally agree. Well, I'll go through a

34:02

couple of your points. I think the

34:03

first thing that you mentioned, how some of

34:04

these discounts aren't lifetime plans, I

34:06

think is really unfortunate, because I do

34:08

think that the big draw for

34:12

this, for a lot of people,

34:14

would be to escape an ecosystem like

34:16

Proton. I understand all your points

34:19

about the ecosystem, and definitely a lot

34:20

of people are

34:22

into that sort of thing.

34:23

And I definitely use a lot of Proton

34:25

services myself personally,

34:27

but I also know a lot of people

34:28

who don't want to put all of their

34:30

eggs in one basket and they don't want

34:31

to use ProtonMail and ProtonDrive and

34:33

ProtonVPN, right?

34:34

And they would rather like trust

34:36

individually vetted individual services.

34:39

And I think there's also something to be

34:41

said about companies that really just

34:44

specialize on doing one thing and one

34:46

thing

34:46

really well,

34:47

like Kagi with search or Ente with photos,

34:51

for example.

34:53

I think all of these privacy services and

34:59

companies still exist in a pretty niche

35:01

market,

35:02

and I'm glad that more people are

35:04

becoming concerned about the security and

35:06

privacy of their data,

35:07

and they're switching to these services.

35:09

But there's, you know,

35:12

there's still a lot of growth to be

35:13

had in this sector.

35:15

And I think that prevents a lot of

35:18

companies from growing super big at the

35:20

moment.

35:21

Proton, I think,

35:22

is a good exception.

35:23

But I know Proton

35:25

people have a lot of complaints about how

35:28

Proton is slow to add new features or

35:30

they're not integrating all of their

35:31

products properly or that sort of thing.

35:34

And it's true.

35:35

And I think it's just really hard to

35:37

build like a full ecosystem right off the

35:40

bat.

35:41

And if you could have all of these

35:42

separate teams that are much more

35:43

streamlined,

35:44

they don't have to worry about integration

35:45

as much.

35:46

They can just focus on their own features.

35:49

Like Windscribe can just focus on trying

35:51

to be the best VPN they can be,

35:53

for example,

35:53

and they can leave

35:54

like cloud storage and search and photo

35:57

storage to these other companies.

36:01

You know,

36:01

I think that's really beneficial and would

36:03

help a lot of companies.

36:04

But if you're only giving away like trials

36:08

or limited time discounts,

36:10


36:12

It's not going to be very compelling just

36:14

from a cost perspective, unfortunately.

36:16

I'm not sure if there's a super great

36:18

way for a coalition of these companies to

36:23

work together on something like that,

36:24

or if there's an opportunity for someone

36:26

to sell bundles.

36:28

Because at the end of the day,

36:29

especially with most modern payment

36:32

systems right now, you have to...

36:35

have, like, one central company that's in

36:38

charge of billing, and that's obviously

36:39

going to give that company, whoever it is,

36:41

a lot of power over the other companies

36:44

in this coalition, right? And so

36:47

I'm sure there's probably some

36:48

cryptocurrency solution to this where

36:50

everything could be decentralized and

36:51

split up, but not everyone is going to

36:52

pay in cryptocurrency.

36:55

But I would maybe want to see a

36:58

solution that's more integrated than this,

37:00

where it's not just like exclusive

37:02

discount codes,

37:03

but maybe it's a bundle where you could

37:05

sign up with any of these services and

37:07

get billed through the service of your

37:09

choice.

37:10

And that might decentralize it a bit,

37:11

but then you get access to all of

37:12

these other services for the lifetime of

37:14

the bundle.

37:15

And I think that that would be a

37:16

lot more compelling for a lot of people

37:19

who are switching from something like

37:20

Proton,

37:20

that kind of

37:22

includes many of the things that are being

37:24

sold here.

37:26

How that would work exactly, again,

37:28

I don't know.

37:29

But I think that that would be the

37:30

biggest draw for a bundle of privacy

37:33

products like this.

37:34

And it's kind of a shame that

37:36

They're not doing that right now,

37:37

but maybe they'll go in that direction.

37:39

For now at least,

37:40

I guess if you're a Windscribe user,

37:41

this is a pretty good opportunity to use

37:43

some of the services that we recommend.

37:45

We haven't recommended or evaluated all of

37:48

the services,

37:48

including Windscribe itself on privacy

37:51

guides.

37:52

I know there's a lot of discussions on

37:54

our forum if people are interested in

37:57

learning about pretty much all of these.

38:00

But yeah, I think...

38:04

that would be the direction that I would

38:06

want to see something like this go in.

38:08

And we'll see if that happens.

38:13

Yeah.

38:13

And just to back up what you were

38:14

saying, I totally get the appeal of not,

38:18

like the compartmentalization is the

38:20

appeal for some people, right?

38:22

Like you were saying, like, sure,

38:23

there are a lot of people who want

38:25

the ecosystem,

38:26

but there's also a lot of people who

38:27

want whatever the best thing is.

38:28

You know,

38:29

if you think Mullvad is better than Proton

38:32

and you would rather use Mullvad and you

38:33

kind of,

38:34

mix and match.

38:35

I think that's great.

38:36

And I do agree with Windscribe that that

38:38

is certainly more secure from a

38:41

compartmentalization perspective.

38:43

But yeah, I'm with you.

38:45

I think really,

38:46

even if you want these disparate services,

38:48

it would be really cool if there was

38:50

some kind of... If it was...

38:51

I don't know.

38:53

This feels to me like...

38:55

this feels to me a lot like the

38:56

concept of sister cities,

38:58

which I've never really understood.

39:00

It's like, Oh, we're gonna like,

39:03

I swear to God,

39:04

I've lived in places that like the sister

39:05

city is in like Russia.

39:07

And I'm like, how?

39:09

Like I'm in Texas.

39:10

How, what, what, what's going on?

39:14

And you know, it's,

39:14

it really means nothing.

39:15

It's just some kind of like cooperation.

39:17

They're like, Oh,

39:17

maybe you should check this place out.

39:19

We've, you know, shook hands or whatever.

39:20

And

39:21

And, you know,

39:22

I don't mean to downplay this to that

39:23

extent,

39:23

but it feels very similar that it's kind

39:25

of like, well,

39:25

we just got together and agreed we all

39:26

like each other and we're really cool.

39:27

And they are really cool services.

39:28

Again, I want to stress that, but.

39:30

It would be nice to see something that's

39:31

a little bit more cohesive, I guess,

39:34

or benefits the user a little more other

39:36

than just some kind of like temporary,

39:38

which again,

39:39

I think some of them are like permanent,

39:41

right?

39:41

I think at least one of them was,

39:42

I'm trying to pull the page back up

39:43

here.

39:44

Notes Nook was a permanent discount.

39:46

As far as I'm seeing.

39:48

And I think there was one other one.

39:51

There's not any, well, Control D,

39:53

but that's also run by Windscribe, so.

39:55

Oh, okay, okay, that's the one I was

39:57

thinking of. Yeah, fifty percent off Control

39:59

D. So yeah, it would be, yeah,

40:02

a lifetime discount, that's dope. But yeah, so

40:05

it would be cool to see something a

40:06

little bit more cohesive like that. But I

40:09

don't know, at the same time, it's like,

40:10

that could be a really cool, I'm

40:12

trying so hard to get a lot of

40:13

my family members to try Ente, because you

40:16

know, most of them are just in Google

40:17

Photos, Apple Photos, whatever phone they're

40:19

using, and

40:21

Yeah.

40:21

So maybe, I mean, if nothing else,

40:22

maybe this could be a nice like, hey,

40:24

here's twenty five percent off.

40:25

Give it a shot.

40:25

So totally.

40:26

I really like your sister cities analogy,

40:29

actually,

40:29

because that is kind of what what this

40:31

is.

40:31

I know.

40:32

I think all of those sister city things

40:34

are like with international cities and

40:37

there isn't much like true connection

40:40

between them.

40:41

And that is sort of what this feels

40:42

like at the moment.

40:42

Like it's a lot of disparate services

40:45

where you can get like, you know,

40:47

that there's cross promotional stuff going

40:49

on.

40:49

There's

40:50

limited time discounts,

40:51

but there isn't a true partnership or

40:54

working together on something extremely

40:57

cohesive.

40:57

It's just awareness.

41:00

Windscribe is probably for the most part

41:02

just making people aware of these services

41:04

more than providing an actual long-term

41:07

value for their users.

41:09

But I think awareness of other privacy

41:10

companies is certainly a good thing.

41:12

So I'm not going to knock it for

41:15

that,

41:15

but I don't think

41:16

in its current iteration,

41:17

this is going to create a great

41:18

alternative for someone coming from

41:20

something like Proton, for example.

41:22

But yeah,

41:23

if you were going to try out some

41:25

or all of these services anyways,

41:27

especially because you can pick and

41:29

choose,

41:29

it's not like some bundles where you have

41:31

to register for everything and then you

41:34

might not even want to use some of

41:36

these things.

41:37

You know, it's not that serious.

41:39

So yeah,

41:41

I think it's a cool opportunity for them.

41:44

I always like to see privacy companies

41:46

work together on this sort of thing rather

41:49

than, you know, constantly compete with each

41:52

other, especially at times when it

41:54

doesn't make any sense to be competing

41:56

with each other at all. So,

41:59

yeah, I think it's cool.

42:03

Yeah, for sure. Man, you said one

42:08

last thing I wanted to touch on, but it

42:10

got away from me. So, all right, yeah.

42:15

I think I would just ask you,

42:17

maybe we didn't cover this.

42:18

Do you have any thoughts about like any

42:21

of these companies in particular?

42:22

How many of these have you used?

42:25

Because I know not all of these are

42:26

even recommended on our site, for example.

42:30

But I know they've been talked about a

42:31

lot on the forum.

42:33

Well, actually, real quick,

42:35

my thought just came back to me.

42:36

I was going to say,

42:37

you said this is kind of Windscribe just

42:39

like kind of bringing awareness of these

42:42

companies.

42:43

To their defense,

42:43

that can be used because, you know,

42:45

back when I was on Surveillance Reporter,

42:47

we took a sponsor and our first sponsor

42:50

was JMP Chat, the voice-over-IP service.

42:54

And I thought for sure that I was

42:55

like, oh, everybody knows JMP Chat.

42:57

And I was floored how many people left

42:59

comments like, oh,

43:00

I've never heard of this before.

43:01

This is really cool.

43:01

And I'm like,

43:02

Really?

43:04

And I don't mean that in a bad

43:05

way.

43:05

Like, really?

43:06

But I was like, really?

43:07

That many people have never heard of this.

43:08

So I'm sure even Windscribe probably has

43:11

tons of people that are like, oh,

43:12

I've never heard of Kagi.

43:12

I've never heard of Addy.

43:14

So...

43:16

But in answer to your question,

43:18

I use Ente.

43:20

I'm kind of in this weird space where

43:21

I'm like halfway between Ente and

43:24

Nextcloud,

43:24

and I'm not sure which one I want

43:26

to commit to, to be totally honest.

43:28

I like Nextcloud, but I'm debating.

43:31

The encryption in Nextcloud is still not

43:32

great,

43:33

so it's like I could have all my

43:34

photos end-to-end encrypted,

43:35

but then they don't integrate,

43:36

but then how much do I use the

43:37

integration?

43:38

So anyways...

43:40

I've used Addy IO in the past.

43:43

I've tinkered with Kaji a little bit.

43:47

I haven't really like used it personally.

43:49

I've used it to test it out,

43:50

but I've never used it in like my

43:51

day-to-day use to see how it would

43:53

integrate my workflow.

43:55

And I've looked into Notesnook.

43:56

One of these days,

43:57

I actually want to do a video about

43:59

privacy respecting alternatives to things

44:01

like Notion,

44:02

which Notion is already not terrible,

44:04

but there's so many open source,

44:06

like Obsidian, Notesnook.

44:08

There's one called AnyType, I think it is.

44:12

So yeah, Notesnook,

44:13

I looked into it a little bit as

44:14

a potential note alternative,

44:15

but I haven't actually used it myself.

44:19

How about you?

44:19

Do you have any experience with any of

44:20

these?

44:21

Yeah, I'm,

44:24

I'm also on the boat of maybe switching

44:26

to Ente.

44:27

Um,

44:27

but I haven't really like fully committed

44:30

to any of these photo backup platforms

44:32

myself yet.

44:33

Um,

44:34

otherwise I don't really use a lot of

44:35

these.

44:35

I do need to get better at, um,

44:37

note-taking, and maybe Notesnook would be

44:39

a good solution, but maybe that'll be my,

44:41

my new year's resolution this year and

44:44

I'll have to report back on what I

44:46

ended up doing.

44:49

Makes sense.

44:50

Yeah.

44:50

I've, I've been pretty happy with, um,

44:53

Addy, like all the ones I've used.

44:55

It's not like I stopped using them

44:56

for some, like, Oh,

44:57

they had this big problem, but, uh,

45:00

I don't know how many of them would

45:01

qualify to be listed on,

45:02

on privacy guides.

45:03

I know we have some really strict

45:04

standards, but for me, it was just,

45:05

I found other things that integrated with

45:07

my needs better or, you know,

45:09

my workflow or they were a little bit

45:10

cheaper or something, but.

45:12

Yeah, like I said before,

45:13

for better or for worse,

45:14

I'm kind of in the Proton ecosystem right

45:17

now.

45:17

And I'm thinking about changing it,

45:18

but I haven't yet.

45:19

So that's kind of where I'm at.

45:22

Fair enough.

45:24

I will admit I'm one of those people

45:25

that's constantly like,

45:26

I'll have a workflow that works.

45:28

Like let's say Nextcloud, right?

45:29

Let's say I'm all in on Nextcloud.

45:31

And then I'll have that moment where I'm

45:32

like,

45:33

but it's not really end-to-end encrypted.

45:36

So what if I did replace the notes

45:38

and then I go back to this system

45:39

where everything's like,

45:41

I've got my notes here and I've got

45:42

my photos here and I've got this here

45:43

and this here.

45:44

And then I'm like, yeah,

45:46

but I really miss Nextcloud.

45:50

I'm constantly trying different things and

45:52

going back and forth and it's awful.

45:55

I don't know why I'm like this.

45:56

All right.

45:59

Okay.

46:01

If that's all we have on that topic,

46:03

first of all,

46:04

I've been asked to let you guys know,

46:05

as a reminder,

46:06

that you can get this bottle on

46:07

shop.privacyguides.org.

46:11

Our next story,

46:12

we are going to talk about messaging a

46:14

little bit.

46:15

We're going to talk a little bit about

46:17

RCS and iMessage later,

46:18

but first we're going to talk about

46:20

Threema, which is an encrypted messenger.

46:23

It is not recommended by Privacy Guides.

46:25

I think historically,

46:27

I think they've added forward secrecy now,

46:29

but in the past they were missing it.

46:30

And I think there's a few,

46:31

maybe a few other shortcomings,

46:32

but it's not the worst messenger in my

46:35

opinion.

46:36

And yeah,

46:37

Well, for now,

46:38

it's not the worst messenger because they

46:40

have just been acquired by a venture

46:42

capital firm.

46:43

And this is called what is it?

46:45

I'm probably going to mispronounce this

46:46

Comitis Capital.

46:48

I'm not sure,

46:49

but I believe they are a German company.

46:54

Somebody mentioned in the group chat

46:58

that this is actually not the first time

47:00

Threema's been acquired by a company

47:02

like this.

47:03

So

47:07

I think it was about five years ago.

47:08

They were acquired by another private

47:10

equity company.

47:10

So this is just a second private equity

47:13

acquisition,

47:14

but it has been kind of the case

47:15

for a while that they were owned by

47:16

this.

47:17

They weren't their own company.

47:20

Which on the one hand,

47:21

I could see that as an argument for

47:23

maybe this won't really affect the quality

47:25

of the product at all because they've

47:27

already kind of...

47:29

Although I don't want to take shots at

47:31

Threema because I think anybody who's

47:32

trying to make privacy and security...

47:35

is doing a good thing,

47:36

but I do have to be honest that

47:38

they are severely lacking on a lot of

47:40

basic features that other messengers

47:41

already have.

47:42

Um,

47:43

So yeah,

47:44

it's not the most feature rich platform

47:46

and it costs money for those who didn't

47:47

know.

47:48

It's five dollars one time for the

47:50

individual.

47:51

Like a lot of companies,

47:52

they have like an individual arm and they

47:53

have like a business to business arm.

47:55

I think the B2B one is

47:57

like a subscription.

47:57

But if you're just an individual user,

47:59

it's five bucks one time.

48:00

And that's a hard sell when I could

48:02

go download Signal, SimpleX, Session,

48:07

pretty much any of them.

48:08

So yeah, that's awesome.

48:11

It's already a hard sell to get people

48:12

to use it.

48:13

And like I said,

48:14

it is missing a few of the more

48:15

advanced security features that we've

48:16

come to expect out of things like Signal,

48:18

like perfect forward secrecy.

48:19

But yeah, I mean...

48:23

I don't know.

48:24

Do you want to talk about why?

48:27

The price was always the thing that was

48:28

holding back Threema, I think,

48:30

from gaining widespread recognition or

48:33

recommendations in our community and on

48:35

our site.

48:36

I think even one of our criteria right

48:38

now, which are, of course,

48:41

always subject to change if the

48:42

community feels otherwise,

48:44

but I think we settled on we only

48:45

recommend free messengers because I think

48:48

while a lot of people...

48:50

in our community are willing to pay for

48:53

more private and more secure services with

48:55

something like a messenger or social

48:57

network or something along those lines,

49:00

there really is a network effect, and

49:03

the reality is you are going to want

49:06

to communicate with people who don't care

49:08

about privacy and security and aren't

49:10

going to pay for a messenger like

49:12

this. And so it was a very niche

49:14

use case where Threema would make

49:17

sense compared to something like Signal.

49:20

Or especially SimpleX,

49:21

which doesn't even require a phone number.

49:22

But even in Signal's case,

49:24

I think most people have phone numbers and

49:26

most people expect that's a way to text

49:28

people on your phone.

49:29

And so slotting in Signal to replace those

49:32

messages makes a lot of sense for people.

49:35

It was definitely argued to me in the

49:37

past that Threema makes sense for people

49:39

who don't have phone numbers.

49:41

To acquire a phone number to use Signal,

49:43

for example,

49:44

probably costs more than the five dollars

49:46

that Threema costs.

49:48

You could argue that Threema is actually

49:50

cheaper than Signal from that perspective.

49:53

But I think the reality is most people

49:54

do have phone numbers and most people are

49:55

looking for free messengers.

49:57

And especially with the introduction of

50:00

completely free ones like SimpleX,

50:02

it was just challenging to recommend.

50:06

With a messenger specifically,

50:08

if we want to improve privacy in the

50:10

space overall,

50:10

we need to be promoting services that...

50:15

Everyone can use,

50:16

and you can get your entire network on

50:18

because that improves the baseline

50:19

security and privacy for everyone.

50:21

Whereas with Threema,

50:24

if it's only people who are willing to

50:25

pay for it,

50:26

you're only going to get people who

50:27

already care about privacy.

50:28

It's a bit like preaching to the choir,

50:30

I think.

50:33

I've never used Threema for this reason.

50:35

I think there's easier ways to reach me.

50:38

I think you've mentioned that you use

50:41

Threema in the past.

50:41

I don't know if you want to share

50:42

a bit about your experience with that.

50:47

Yeah, I mean,

50:51

I don't have much to share because I

50:52

haven't used it a lot.

50:53

I think I've only run into like one

50:54

or two other people who use it.

50:55

And honestly, even those people,

50:57

like after a couple months of chatting on

50:59

and off,

50:59

and they're not people I chat with every

51:01

day.

51:01

It's, you know,

51:01

people who are in the privacy community

51:03

who are basically like, oh, I have Threema,

51:05

I'll help you test it out.

51:07

And, you know, it was semi-regular, like, a

51:09

couple of times a month or something.

51:10

And,

51:10

and then after like four to six months,

51:12

they're just like, yeah,

51:13

I'm just gonna like go all in on

51:16

Signal because that's where most of the

51:17

people I talk to are.

51:18

So I'm going to stop using this.

51:19

And, um, and like I said,

51:22

it's already missing so many things that,

51:24

you know, Signal has.

51:25

Just take emoji reactions:

51:28

you're very limited in the emoji

51:29

reactions you can use on Threema.

51:32

I think they do have polls now,

51:34

but I think they're very limited polls.

51:36

It's just, I don't know.

51:38

It's just,

51:38

it's not as good of an experience.

51:40

And again, I hate to say that.

51:41

Cause I think, you know,

51:42

anybody who's trying to further privacy

51:43

and security, I think that's great,

51:45

but it's just,

51:46

it hasn't really been the best experience.

51:50

And going back to the payment thing,

51:51

I agree with you.

51:51

Like people are just so, and again,

51:54

I see the argument from both sides,

51:55

because on the one hand,

51:56

we shouldn't be conditioned to expect

51:58

things for free, right?

52:00

If it's free, you are the product,

52:01

which isn't totally true,

52:02

but it's a great shorthand.

52:05

And when we have so many free services,

52:07

nine out of ten times,

52:08

they're selling our data or they're doing

52:09

something like that,

52:10

something shady to monetize.

52:12

But on the other hand,

52:14

And it's good also that Threema has like

52:16

a business model, right?

52:16

Like you pay for the product the same

52:18

you would as anything else.

52:20

But at the same time,

52:20

that is such a hard sell to like

52:22

try and get, you know,

52:24

I always use my family as an example,

52:25

but to try and get my sister to

52:26

like, hey,

52:27

you should switch to Threema when,

52:29

you know, you have to pay for it.

52:31

It's missing a lot of features.

52:32

It's,

52:33

not the prettiest UI.

52:34

And this is coming from me.

52:35

I'm the kind of person that normally

52:36

doesn't even care what the UI looks like.

52:38

I have Qubes right in front of me

52:39

right now, which is true.

52:42

So it's just a really hard sell,

52:46

unfortunately.

52:46

I agree with you though.

52:47

I think if I were to call the

52:51

shots,

52:51

I would tell Threema that they should make

52:55

their individual facing arm totally free

52:58

and they should just focus on monetizing

53:00

their business to business side

53:02

And that's how you should do their

53:03

business model.

53:04

You know,

53:05

there's like we see Telegram and now we

53:07

see Signal kind of venturing into

53:10

monetizing certain features of these

53:12

platforms where you can provide a very

53:14

good base service for free,

53:15

but then optional stuff,

53:16

especially for power users,

53:17

which are probably the core demographic of

53:20

what Threema is serving right now with

53:23

their five dollar pricing.

53:26

I think people will pay for those

53:28

features,

53:28

but they're not necessary for everyone.

53:30

And I think

53:31

for any messenger to take off your

53:33

experience,

53:33

I think really

53:35

validates the point I was making.

53:38

I think the background behind our criteria

53:41

that the messengers that we recommend on

53:42

our site have to be free basically comes

53:45

down to any of these paid messengers,

53:48

I don't really see them taking off as

53:51

more than a neat tool for a hobbyist

53:55

who's into security to mess around with.

53:57

But it's not going to get the kind

53:58

of mass appeal that you need from certain

54:00

products.

54:00

It's fine if

54:02

if, you know,

54:03

Ente charges more than Google Photos,

54:05

for example,

54:05

because these aren't social platforms.

54:08

I can protect all of my data.

54:10

If other people aren't protecting their

54:11

data,

54:12

I think that's unfortunate and that should

54:13

be fixed.

54:13

So there's that,

54:14

but it's not going to affect me, right?

54:16

But with a messenger,

54:18

like the only thing I'm doing is

54:19

communicating with other people and they

54:20

might not care about security and privacy

54:22

as much as myself.

54:24

But I want to...

54:27

It benefits me when those types of people

54:29

can use these platforms and they simply

54:32

won't find pretty much any price worth it

54:34

for something that other companies can

54:36

provide for free.

54:37

So I would agree.

54:38

I would just hope for a different

54:40

monetization model.

54:41

I think there's room here.

54:43

I don't know if they'll do that.

54:46

Threema, like at this point,

54:48

they kind of seem a lot like Wire,

54:51

which used to be pretty widely recommended

54:53

in the privacy community, as you know.

54:56

But then they really pivoted after they

54:59

were acquired to be very business to

55:02

business focused.

55:02

And I can imagine Threema kind of

55:04

following that same direction where they

55:06

just focus on their business product and

55:08

kind of drop the consumer side of things.

55:12

Which would be a bit unfortunate,

55:13

but also I don't know how much Threema

55:15

is currently adding to the landscape

55:18

at the moment,

55:18

so it is what it is,

55:20

I think there's a couple different

55:21

directions that they could go in and we'll

55:24

see if they do any of that or

55:26

if Comitis Capital

55:28

is the type of private equity firm that

55:31

strips their acquisitions for parts and

55:33

completely shuts everything down.

55:35

You never know with these private equity

55:36

things.

55:36

That's usually what they do, yeah.

55:40

So yeah, if you're a Threema user,

55:43

I would be concerned by this acquisition.

55:47

But if you're not a Threema user,

55:48

which I would imagine a lot of people

55:50

are not,

55:51

I don't think there's going to be a

55:52

lot of impact in the privacy space from

55:54

this news.

55:57

Oh, gosh.

55:57

They own Petco.

55:59

Petco GmbH.

56:02

Okay.

56:03

Well,

56:03

that is not a good sign for Threema.

56:07

I would say, yeah.

56:08

GmbH, that's Germany, isn't it?

56:10

Or is that Switzerland?

56:12

I think GmbH is Germany.

56:14

I think there's a couple of different

56:16

countries in the EU that use that one.

56:20

Use that, yeah.

56:21

I couldn't tell you off the top of

56:23

my head.

56:24

Oh, nevermind.

56:24

They don't own them anymore.

56:25

They sold their majority stake in it.

56:27

Okay.

56:27

Sorry.

56:30

I'm just poking around their website now.

56:34

Yeah.

56:34

Yeah.

56:35

Um,

56:36

I think I kind of came into the

56:36

privacy scene on the tail end of Wire,

56:38

but I remember that too.

56:40

Wire used to be...

56:42

it used to be pretty solid.

56:43

It was,

56:44

it was much more polished than Threema

56:45

and it was free and it didn't

56:46

require any, any private information,

56:48

but yeah,

56:50

they went all in on business to business.

56:51

I think maybe you can still download Wire,

56:53

but they certainly don't make it easy.

56:55

And, um,

56:56

Yeah, it's unfortunate.

56:58

That was kind of the other big thing

56:59

I wanted to mention was, like you said,

57:00

venture capital firms – or private equity

57:03

firms, sorry.

57:04

Their whole – there's a podcast called

57:07

Stuff You Should Know that I love,

57:08

and late last year they did an episode

57:10

about private equity,

57:11

and it –

57:12

covered all of that.

57:13

Like that's usually,

57:14

that's their whole job basically is they

57:15

buy a company,

57:16

they make it run super efficient and by

57:18

efficient, we mean we fire everybody,

57:20

we triple the workload.

57:21

We, you know, it's, it's honestly,

57:23

it's like a corporate pump and dump scam.

57:25

I don't even know how they get away

57:26

with it,

57:26

but that's what a lot of them do.

57:28

So hopefully Threema can survive this.

57:30

Um, but I, I, I will say they've,

57:33

they've done a few really interesting

57:34

marketing stunts in the past that I think

57:36

have, uh,

57:37

done good things to raise awareness of

57:38

privacy.

57:39

Like

57:40

I still see sometimes they have a,

57:43

you can still access it actually.

57:44

They have a website where you can upload

57:46

a picture and it'll blur it and then

57:47

it'll put a banner on it that says

57:48

hashtag normalize privacy or regain

57:52

privacy.

57:52

That's what it is.

57:53

And that was part of an awareness campaign

57:55

they did a couple years ago.

57:56

And I think they also did something in

57:58

Europe where they rented an ice cream

58:00

truck.

58:01

I could be remembering the details of this

58:02

wrong,

58:02

but they rented an ice cream truck and

58:04

they were giving people free ice cream.

58:06

But in return,

58:07

you had to hand over personal data.

58:08

They would ask people for their phone

58:10

number or whatever and their date of

58:11

birth.

58:12

And it was funny watching people with the

58:13

ice cream cone and they're like, why?

58:16

No, no, here, have it back.

58:17

I don't know.

58:18

And they're like, yeah, exactly.

58:19

It's insane.

58:19

So why are we doing this with other

58:21

services?

58:22

So yeah, I do agree.

58:24

Overall,

58:24

they haven't really made a huge dent,

58:25

but I really appreciate the innovative

58:27

marketing stunts like that they used to

58:29

do.

58:29

I think those were super fun.

58:30

That is funny.

58:32

I didn't hear that story,

58:33

but I think we don't have to get

58:35

too much into this,

58:36

but it's really interesting how people

58:39

definitely treat the online space

58:41

differently than real life.

58:42

If people were asked for that on a

58:43

website,

58:44

no problem entering that information.

58:45

Your browser would probably autofill it

58:47

for you.

58:48

But when you ask this in real life,

58:50

people suddenly realize what's happening

58:53

here.

58:53

I don't know.

58:55

why people make that distinction in their

58:56

mind.

58:56

But that's a really funny way to kind

59:00

of realize that.

59:02

Yeah, really, really true.

59:04

Really quick before we move on to the

59:06

next story,

59:06

I'll address one comment that we got in

59:08

the YouTube chat here.

59:13

That was about this story before we move

59:14

on.

59:14

They asked about Jami and if we've used

59:17

it.

59:17

That's not something that we've really

59:19

looked into too much on our website.

59:22

And I think...

59:25

Whenever I've looked into Jami in

59:27

the past, that's more of like a video

59:29

conferencing service. I know it has instant

59:31

messaging built in, but I don't know how

59:33

usable it is. In my mind it's sort

59:35

of like a free-software Skype

59:38

alternative. I think a lot of people

59:41

will probably use something like Signal,

59:43

and either Signal video calls or Jitsi

59:46

video calls, instead of Jami. That's

59:48

usually what I see recommended. But if

59:51

Uh, if you have any additional questions,

59:53

do you want to share more about like

59:54

what you would use Jami for?

59:55

And if it makes sense for you,

59:56

I would encourage, um,

59:58

the user who asked this to, uh,

1:00:03

post on the forum about it and maybe

1:00:04

get some more, more opinions.

1:00:09

Yeah, I agree.

1:00:10

Not to spend too long on it,

1:00:11

but Jami is a name that I've seen

1:00:13

pop up from time to time repeatedly.

1:00:16

And I feel like it's hard because it's

1:00:18

not super popular.

1:00:19

I feel like for me as a not

1:00:20

very technical,

1:00:21

like I don't know any code,

1:00:22

I feel like it's really hard for me

1:00:23

to kind of get a good...

1:00:28

What's the word I'm looking for?

1:00:30

Get really good insight into how it

1:00:32

measures up to things like Signal or some

1:00:35

of these other alternatives.

1:00:36

So I would definitely like to know more

1:00:38

about it.

1:00:38

I'd like to know what is going on

1:00:40

under the hood that makes it better or

1:00:43

worse or what use case it's for.

1:00:46

I haven't found a lot of people using

1:00:47

it,

1:00:47

so I've never had a chance to really

1:00:48

test it myself.

1:00:49

But yeah, like I said,

1:00:52

it pops up from time to time.

1:00:53

So I would love to learn more about

1:00:55

it.

1:00:55

I just feel like I have a hard

1:00:56

time finding that information myself.

1:01:01

All right.

1:01:03

I believe...

1:01:07

Is it my turn to take the next

1:01:08

story or is it yours?

1:01:09

I can look at this.

1:01:10

Our next story is encrypted RCS.

1:01:14

Signs of that were spotted in the iOS

1:01:16

26.3 beta.

1:01:18

So the article that we have was actually

1:01:20

posted by Freya on our site as a

1:01:22

news brief reporting on a few different

1:01:24

sources.

1:01:26

Basically,

1:01:26

people have discovered in the iOS 26.3

1:01:28

beta some settings that

1:01:31

indicate carriers will be able to enable

1:01:35

end-to-end encryption for RCS messaging

1:01:37

and indicate that in iMessage.

1:01:40

So that's pretty exciting news for people

1:01:42

who have been following RCS support on iOS

1:01:45

for...

1:01:46

a while. Because of course right now

1:01:48

it is all not encrypted, and Apple said

1:01:51

that they weren't committing to using the

1:01:53

same sort of encryption standard that

1:01:55

Google is using right now in Google

1:01:57

Messages on Android, because it was

1:01:59

something that Google developed on their

1:02:01

own instead of working with the GSMA to create

1:02:03

like a standardized encryption protocol

1:02:07

for all these platforms and services to

1:02:08

use. But now there is a new

1:02:11

standard. It's called Messaging Layer

1:02:13

Security, or MLS, and that's

1:02:16

what RCS is going to be using in

1:02:17

the future.

1:02:18

And I guess that's what's being added to

1:02:20

iOS.
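
To give a rough feel for what MLS buys you, here is a minimal toy sketch in Python. It is not the real RFC 9420 protocol, just the core idea: the group shares a secret that is ratcheted forward one-way at every membership change, so each "epoch" gets fresh keys.

```python
import hashlib
import os

def next_epoch(secret: bytes, commit: bytes) -> bytes:
    """One-way ratchet: derive the new epoch secret from the old one.
    Old secrets cannot be recovered from new ones (forward secrecy idea)."""
    return hashlib.sha256(secret + commit).digest()

# Epoch 0: the group starts from a random shared secret.
secret = os.urandom(32)

# Every membership change commits the group to a new epoch with new keys.
secret = next_epoch(secret, b"add: alice")
secret = next_epoch(secret, b"remove: mallory")

# NOTE: real MLS also mixes in fresh key material that is delivered, via a
# ratchet tree, only to the members who remain -- that is what actually
# locks a removed member out. This toy skips that part entirely.
print("epoch secret:", secret.hex())
```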

1:02:21

So the appearance of this stuff in the

1:02:24

iOS beta doesn't indicate that it's coming

1:02:27

in the iOS 26.3 release

1:02:29

necessarily; it could come with this code

1:02:33

still disabled, for example.

1:02:34

So we might not see encrypted RCS right

1:02:36

away,

1:02:37

but it is a sign that it is

1:02:39

actually coming at some point.

1:02:42

They're actively working on support for

1:02:44

it.

1:02:44

And I think that people who

1:02:47

are on iOS right now or who are

1:02:50

on Android and use Google Messages and

1:02:52

chat with people on iOS,

1:02:55

are going to be excited about this because

1:02:56

there's definitely a lot of benefit to

1:02:58

encrypting all of your messages.

1:03:01

RCS definitely is not the ideal messaging

1:03:06

platform for a lot of reasons.

1:03:07

There isn't a lot of protection of your

1:03:09

metadata,

1:03:10

like who you're chatting with and that

1:03:12

sort of thing compared to something like

1:03:14

Signal, for example.

1:03:15

So we're still going to recommend a lot

1:03:17

of different,

1:03:18

more secure and more private messengers on

1:03:21

our site that we would recommend

1:03:23

you use.

1:03:24

But especially in the United States,

1:03:27

I don't know how much

1:03:29

this is the case in other places.

1:03:32

I know a lot of other countries just

1:03:33

standardized on various messengers like

1:03:35

WhatsApp or whatever.

1:03:37

But texting is extremely common here.

1:03:39

And it's definitely used around the world.

1:03:42

And

1:03:44

It's easy for a lot of people,

1:03:45

and people just default to it.

1:03:46

And so improving, again,

1:03:49

kind of what I was talking about with

1:03:50

Threema earlier,

1:03:51

I think anything that improves the

1:03:52

baseline security of all of these people

1:03:54

who don't care about privacy and security

1:03:57

and who aren't seeking out private and

1:03:59

secure alternatives like Signal,

1:04:01

it's still a good thing.

1:04:02

It's a step in the right direction,

1:04:04

even though it's not the best you could

1:04:06

be doing.

1:04:07

A lot of people rely on this,

1:04:08

and it's going to benefit a lot of

1:04:09

people.

1:04:09

So hopefully...

1:04:11

This is a sign that this will come

1:04:13

in the final release sooner rather than

1:04:15

later.

1:04:16

But I guess time will tell when this

1:04:18

will actually come out.

1:04:21

So I don't have too much to add

1:04:22

to this, but just to clarify,

1:04:24

and you may or may not know this,

1:04:26

you said, and the article says here too,

1:04:29

that it's a carrier setting.

1:04:30

So does that mean the carriers would have

1:04:33

to choose to opt into this,

1:04:34

like Verizon and T-Mobile?

1:04:36

Most likely.

1:04:38

This is definitely the biggest bummer with

1:04:39

RCS.

1:04:41

RCS...

1:04:43

Right now,

1:04:43

it can be implemented in two ways,

1:04:45

you can kind of do everything yourself as

1:04:47

a carrier and add support for it and

1:04:49

interoperate with other carriers that are

1:04:52

using the universal profile,

1:04:53

but it is, much like texting, a

1:04:55

carrier-based platform. Or...

1:04:59

The other thing you can do with

1:05:00

RCS, which a lot of carriers do. I

1:05:02

don't remember which ones in the US do

1:05:04

this,

1:05:04

but I think there's a list on Wikipedia

1:05:06

or somewhere that I could find,

1:05:07

but a lot of carriers

1:05:09

don't run RCS and they purchase a service

1:05:12

from Google that does it for them.

1:05:15

And so the reality behind RCS right now

1:05:17

is that Google is actually running all of

1:05:19

the service behind it for, I think, the

1:05:22

majority of people,

1:05:23

but if not the majority,

1:05:24

definitely a lot of them.

1:05:26

And so it's basically just a centralized

1:05:28

Google Messenger right now that your

1:05:31

carrier is kind of promoting on your

1:05:33

phones.

1:05:34

So obviously, from a metadata perspective,

1:05:37

that gives Google a lot of data,

1:05:40

but also it is, yeah,

1:05:43

there's a lot of middlemen involved here.

1:05:46

It's not just like an over-the-top service

1:05:48

that these tech companies are working on

1:05:51

together.

1:05:51

It's integrating with the traditional

1:05:53

carrier platform.

1:05:54

And whether you're going to Google servers

1:05:57

or whether you're going to the carrier

1:05:58

servers,

1:05:58

that is something that the carrier has to

1:06:01

set up on their end, which is...

1:06:04

Yeah, I agree with your reaction.

1:06:06

A bummer.

1:06:09

Yeah.

1:06:09

Cause I,

1:06:10

I feel like it's going to be a

1:06:12

challenge to get carriers to go ahead and

1:06:14

roll this out.

1:06:15

Like,

1:06:15

I feel like if Apple did it or

1:06:16

even Google,

1:06:17

if they did it at the phone level,

1:06:18

it would just, they would do it,

1:06:20

but I feel like carriers have very little

1:06:22

incentive.

1:06:23

Kind of going back to what you said

1:06:24

earlier at the beginning when you were

1:06:27

covering this,

1:06:28

I agree with you that we look at

1:06:30

something like iMessage and people who do

1:06:33

not care about privacy or security,

1:06:35

who use the "I have nothing to hide"

1:06:37

argument liberally,

1:06:39

these are the same people who are using

1:06:41

iMessage.

1:06:42

And they're getting end-to-end encryption

1:06:44

just talking to each other without even

1:06:47

knowing what it is or knowing that it's

1:06:49

enabled.

1:06:50

And so, yeah,

1:06:50

it would be really cool if RCS could

1:06:52

roll out

1:06:53

to the general public and be available for

1:06:56

everyone cross-platform.

1:06:59

But I guess the only thing I could

1:07:01

see maybe as incentive for the carriers is

1:07:04

I know that RCS also comes with a

1:07:07

lot of those quality of life features that

1:07:08

iMessage is known for,

1:07:09

like bigger attachments and reacting to

1:07:11

messages and stuff.

1:07:12

So maybe we'll get lucky and maybe

1:07:15

carriers will roll it out because of the

1:07:17

features and the privacy and the security

1:07:18

will just be an added bonus.

1:07:19

But

1:07:21

yeah. At the moment I do

1:07:23

think there's a lot of pressure, and I

1:07:24

think Apple adding better RCS support

1:07:27

is going to add even more pressure for

1:07:28

these carriers to support it. Because, like,

1:07:30

I'm in one group chat with some

1:07:33

family members on RCS right now, and it's

1:07:35

very nice to be able to see like

1:07:36

read receipts and, like,

1:07:39

typing indicators and all the normal group

1:07:41

stuff that you don't get on SMS because

1:07:42

SMS is a terrible platform.

1:07:44

And so there have been some improvements

1:07:48

there.

1:07:48

And I think people will realize that,

1:07:51

especially if they're in chats with RCS

1:07:53

users,

1:07:54

and they will eventually demand carriers

1:07:56

do it,

1:07:56

especially in the US where like texting is

1:07:59

so common.

1:08:00

I don't know how it'll be

1:08:02

in places where people don't use SMS in

1:08:03

the first place.

1:08:05

Maybe there's less incentive to use RCS,

1:08:07

but I think at least for a good

1:08:11

amount of people,

1:08:12

there is pressure to support it,

1:08:14

which is all right.

1:08:14

Like I said, it's not my favorite,

1:08:16

but it increases the baseline,

1:08:18

especially if this gets included.

1:08:20

So I'm hopeful that we'll see wide

1:08:24

adoption.

1:08:26

Yeah, same here.

1:08:29

Um, I think actually I'm going to,

1:08:33

I'm going to keep on the Google and

1:08:35

Apple thing and I'm going to go to,

1:08:37

we have a story here about Google Gemini

1:08:41

is going to power Apple's AI features such

1:08:43

as Siri.

1:08:45

And yeah, so, oh man,

1:08:49

this is kind of a confusing story for

1:08:51

me because,

1:08:51

so Apple's been trying to roll out their

1:08:53

Apple Intelligence,

1:08:55

clever little bit of marketing there,

1:08:56

which is just on-device AI.

1:08:58

And-

1:09:00

Man,

1:09:00

I know we've talked about AI so much

1:09:02

already tonight, but again,

1:09:06

on the user end,

1:09:07

from what I've been understanding,

1:09:08

it seems relatively private.

1:09:09

A lot of it is going to be

1:09:10

done on device.

1:09:12

And I think Apple actually has a very

1:09:15

similar architecture to Confer.

1:09:18

Jonah can definitely correct me if I'm

1:09:19

wrong,

1:09:20

but I think they have a very similar

1:09:21

architecture where they try to run

1:09:22

everything in these trusted modules and

1:09:24

they try really hard to make it as

1:09:25

private as possible.

1:09:27

And Apple has been running into a lot

1:09:30

of delays rolling out their Apple

1:09:32

Intelligence thing.

1:09:33

And,

1:09:36

One of the few things I don't like

1:09:37

about TechCrunch is they're very sparse on

1:09:39

technical details here.

1:09:40

But basically,

1:09:40

they say that Apple and Google have signed

1:09:44

a deal where Google's Gemini is going to

1:09:47

power at least some of the AI features.

1:09:50

And the headline specifically says,

1:09:51

like Siri, for example.

1:09:53

This is not an exclusive deal,

1:09:55

according to this article.

1:09:56

So Apple may...

1:09:59

Um,

1:10:00

potentially this is me speculating Apple

1:10:01

might tap, uh, you know,

1:10:03

like Claude for something else or,

1:10:05

you know, whatever.

1:10:07

I think originally they did contract with

1:10:08

ChatGPT.

1:10:09

So yeah, it's again,

1:10:12

it's very sparse on technical details as

1:10:13

far as privacy stuff goes.

1:10:15

It just says here in the article that,

1:10:16

uh,

1:10:16

Apple has focused on privacy with its AI

1:10:18

rollout, with much of the

1:10:20

processing happening on device or

1:10:21

through tightly controlled infrastructure.

1:10:23

Apple says it will maintain these privacy

1:10:25

standards through its partnership with

1:10:26

Google.

1:10:26

That's kind of all they said.

1:10:28

So, yeah.

1:10:32

I don't know.

1:10:34

What do you think about this?

1:10:36

There's a lot going on here.

1:10:37

Okay.

1:10:40

So, yeah,

1:10:41

I know Apple says that they are going

1:10:43

to maintain their privacy standards.

1:10:46

Which, to their credit, the things that

1:10:48

Apple has been working on lately, to

1:10:51

my knowledge, they were kind of the first

1:10:52

to go in this private compute direction

1:10:54

that Confer, the AI company we just

1:10:57

talked about earlier in the show, and that

1:10:58

Maple AI and other people are doing.

1:11:00

I mean, I think

1:11:02

Apple was kind of the first to create

1:11:04

this, and they kind of have a

1:11:05

big advantage compared to their

1:11:07

competitors in the sense that they can

1:11:08

build their own hardware and CPUs to make

1:11:15

the security more robust rather than just

1:11:17

relying on these off the shelf solutions

1:11:19

from Intel and AMD or Nvidia, for example.

1:11:26

Something I want more clarity about when

1:11:29

it comes to how this will work is

1:11:32

like what exactly Google's involvement is.

1:11:35

I saw a lot of rumors that this

1:11:37

would happen leading up to this

1:11:39

announcement where people were basically

1:11:41

saying that Apple and Google came to

1:11:44

an agreement where like Apple would get

1:11:49

access to Gemini's models basically and

1:11:52

they could create their own models based

1:11:53

on that or add additional training data or

1:11:55

whatever and they could run everything

1:11:57

themselves on these private compute

1:12:00

servers that they have.

1:12:01

So it isn't like the current

1:12:04

implementation that Apple has right now

1:12:06

with ChatGPT where Siri will sometimes

1:12:09

offload your request to ChatGPT and just

1:12:12

send it over.

1:12:13

In theory,

1:12:14

if Apple is running all of this and

1:12:15

keeping it on their cloud and they're just

1:12:17

using these models that Google has created

1:12:19

and that's what their partnership is,

1:12:22

Keeping everything in one ecosystem and

1:12:24

not giving more data to Google, I think,

1:12:26

is an improvement for sure.

1:12:29

But all of this private AI stuff,

1:12:32

no matter how it's implemented,

1:12:33

has all of the problems that we talked

1:12:36

about earlier on in the show and the

1:12:38

problems with AI in general.

1:12:40

And I don't think that Apple's private

1:12:43

compute is going to be at a level

1:12:45

of privacy...

1:12:46

and security that I would be comfortable

1:12:48

with using for anything serious if I was

1:12:51

going to use AI at all.

1:12:56

And that's unfortunate because I think

1:12:57

that Apple is kind of doing the best

1:13:00

you really technically can do from the

1:13:02

security perspective if you want to get

1:13:04

back into the technical specifics of how

1:13:05

AI works.

1:13:07

But the best that's possible right now

1:13:09

with our current technology isn't

1:13:11

good enough in my opinion for people who

1:13:13

are serious about their privacy and

1:13:14

security and, um, any of this cloud stuff,

1:13:18

like I think it sets a very dangerous,

1:13:22

um,

1:13:24

path that we are going on with technology.

1:13:27

Because it seems like all of these tech

1:13:29

companies like Apple and especially like

1:13:33

ChatGPT or Google,

1:13:34

what they're trying to do is offload as

1:13:37

much as possible to the cloud.

1:13:39

And in doing so,

1:13:40

they're making normal hardware for people

1:13:44

more expensive.

1:13:44

We talked about this a few episodes ago,

1:13:46

and I know there was a news brief

1:13:47

about the RAM pricing,

1:13:48

which is

1:13:49

crazy right now because all of these data

1:13:51

centers are buying it up.

1:13:52

And what's really happening is people are

1:13:55

being priced out of the market where you

1:13:59

can own your local compute.

1:14:00

And that's a trend.

1:14:01

I was sharing this, I think,

1:14:04

in one of the group chats we have,

1:14:07

but it's a trend that we really see

1:14:11

in society at large for many years.

1:14:13

People were

1:14:14

priced out of the housing market.

1:14:15

I know that's a hot topic,

1:14:16

especially all around the world right now.

1:14:18

People are being priced out of even the

1:14:21

car market.

1:14:22

So many more people are leasing than before, or

1:14:26

people are just relying on things like

1:14:27

Uber or Lyft.

1:14:28

I know Tesla really wants to do this

1:14:30

with their robo taxis where people won't

1:14:33

own cars generally.

1:14:34

They will just rely on other people who

1:14:35

own cars to taxi them around.

1:14:38

I think that that is the direction that

1:14:40

tech companies want

1:14:41

everything to go in because they can

1:14:43

control all of it and they can create

1:14:45

this subscription model that you have to

1:14:47

pay for and local compute is kind of

1:14:50

going away and I think that's very scary

1:14:52

and dangerous, because they're

1:14:56

really forcing everyone in this position

1:14:59

where everyone's going to be locked in as

1:15:01

renters on all of these platforms and

1:15:05

won't have any agency over

1:15:08

anything, anything that's done on

1:15:09

computers. All of these computers are just

1:15:11

going to be thin clients for the cloud,

1:15:13

which is extremely unfortunate. It's not a

1:15:16

direction that I think people should

1:15:18

tolerate.

1:15:19

And I guess I'm having a camera issue,

1:15:23

but whatever, hopefully you can see me. But

1:15:26

yeah, that's my

1:15:29

concern with all of this.

1:15:34

Yeah, no, I, I agree with you.

1:15:35

Cause even it's, it's this whole,

1:15:37

like everything on the cloud is even from

1:15:40

a practical perspective,

1:15:41

like I swear I'd have to go find

1:15:43

it,

1:15:43

but I swear I read a story several

1:15:44

years ago about somebody who rented a car

1:15:48

and they were in like Arizona.

1:15:49

And they couldn't start the car because

1:15:51

the car couldn't get cell signal to call

1:15:54

home and do whatever stupid checks it had

1:15:56

to do to like verify that they could

1:15:58

start the car.

1:15:58

And just things like that are just so

1:16:01

it's, it's a practical perspective.

1:16:03

You know,

1:16:03

what happens when the power goes out?

1:16:06

You know,

1:16:07

that's a very common scenario we've all

1:16:09

been in. What happens... I mean, I guess

1:16:10

when the power's out you're not really

1:16:11

using computers, but you know what I mean,

1:16:13

or even your phone. Yeah, what happens when

1:16:14

the power goes out and now the grid

1:16:17

is overloaded with everybody texting and

1:16:18

everybody checking Twitter to be like, oh,

1:16:20

what's happening? Does anybody know why the

1:16:21

power is out? And you can't do anything

1:16:23

because you can't make that

1:16:26

connection to the server for whatever

1:16:28

license you're supposed to have. And it's

1:16:30

just it seems like such a

1:16:32

like I get it on the one hand,

1:16:33

right?

1:16:34

Like I love the cloud in the sense

1:16:36

of like, I, you know,

1:16:38

if my computer crashes,

1:16:39

I have a copy of my data or,

1:16:40

you know,

1:16:40

to not have to destroy my own CPU

1:16:44

doing this.

1:16:44

God,

1:16:44

I wish I could render videos in the

1:16:46

cloud and not have to destroy my GPU

1:16:47

to do it.

1:16:48

But, you know,

1:16:49

it comes with practical drawbacks of just

1:16:52

that, that,

1:16:53

resilience, you know,

1:16:54

what happens when AWS knocks out a third

1:16:58

of the internet traffic, or Cloudflare,

1:16:59

whoever.

1:17:00

And it's just, yeah.

1:17:01

I mean,

1:17:02

I feel like I've seen that multiple times

1:17:04

just in the last several months of like

1:17:05

some major outage and all my friends in

1:17:07

discord are just like, well,

1:17:09

I guess I'm just gonna like, you know,

1:17:11

take an extended lunch today or something.

1:17:12

Cause I can't do anything.

1:17:13

Cause the cloud's out.

1:17:15

My whole job is on the cloud and

1:17:17

Yeah,

1:17:18

it seems so very short-sighted in the name

1:17:20

of profits,

1:17:21

which I know is so hard to believe

1:17:22

that tech companies would do that.

1:17:24

But yeah, I don't like it either.

1:17:27

It's horrible.

1:17:36

I don't have much else to add to

1:17:37

that one.

1:17:38

Yeah, I think...

1:17:42

That's kind of all I have to say.

1:17:43

We did have one more discussion question

1:17:45

for that topic about did Google win kind

1:17:49

of this AI thing despite being ruled

1:17:51

against making anti-competitive deals in

1:17:52

court?

1:17:53

It's a really interesting case, I think,

1:17:54

this one.

1:17:55

And again,

1:17:56

I want to see more about this because

1:17:58

you know,

1:17:58

if Google is actually like controlling all

1:18:00

of the stuff that Apple is doing behind

1:18:02

the scenes, that would be very concerning,

1:18:05

especially from an antitrust standpoint.

1:18:06

But if it is a deal where Apple

1:18:08

is just kind of building on their work,

1:18:10

but they're doing it themselves,

1:18:12

that's pretty typical of Apple across

1:18:16

their software and their hardware.

1:18:17

I mean,

1:18:18

most of like Apple's advances in hardware

1:18:20

come from like Samsung making better

1:18:22

screens and that sort of thing.

1:18:23

And if it's a situation like that with

1:18:24

Google,

1:18:26

It's probably not a huge anti-competitive

1:18:28

concern,

1:18:28

but if Gemini branding is going to be

1:18:30

prominently featured and stuff,

1:18:31

and Gemini is kind of buying their way

1:18:34

into being the AI company that people

1:18:37

think about, yeah,

1:18:39

it is a weird situation for Google to

1:18:41

be in.

1:18:41

So definitely something I hope antitrust

1:18:44

people keep an eye on,

1:18:46

but I don't think they have much teeth

1:18:50

at the moment against these big tech

1:18:52

companies.

1:18:55

Sadly true.

1:18:57

Just to add on to that,

1:18:58

I don't know much about a Google AI

1:19:00

antitrust lawsuit,

1:19:01

but it says here in the TechCrunch article

1:19:03

that Google and Apple specifically have

1:19:06

faced lawsuits.

1:19:08

In August,

1:19:09

a federal judge ruled that Google acted

1:19:11

illegally to maintain a monopoly in online

1:19:12

search by paying companies like Apple to

1:19:15

present its search engine as the default.

1:19:17

So I don't know.

1:19:19

Yeah, this could...

1:19:22

Hmm.

1:19:24

I don't know.

1:19:24

Yeah.

1:19:25

I guess now that I think about it,

1:19:26

I could see a scenario like Europe saying,

1:19:29

Hey,

1:19:29

you have to offer other models or

1:19:30

something.

1:19:31

But like you said,

1:19:32

that may only be the case if Google

1:19:35

is maintaining everything.

1:19:36

If like you said, if Google's just like,

1:19:37

okay, here's a copy of our model,

1:19:39

go host it on your server and do

1:19:40

whatever, then I don't know.

1:19:42

I mean,

1:19:42

I would still argue that's Apple being

1:19:44

a monopoly,

1:19:44

but governments seem to be a little bit

1:19:46

easier on that.

1:19:48

Yeah.

1:19:49

I mean, and it still has, um,

1:19:53

troubling implications, I think,

1:19:54

for the AI industry.

1:19:55

Because whoever trains these models,

1:19:57

they have a lot of control over what

1:19:58

the AI does.

1:19:59

And so they can definitely shift things to

1:20:03

show up in certain ways or prioritize

1:20:05

certain responses.

1:20:06

I don't know what these AI companies could

1:20:08

do,

1:20:08

but it does give Google a lot of

1:20:10

power either way.

1:20:16

Agreed.

1:20:22

All right.

1:20:22

Let's see.

1:20:22

I think this is our...

1:20:25

We have another news story before we move

1:20:26

on to forum updates here.

1:20:30

But this is from Ars Technica.

1:20:31

Never before seen Linux malware is far

1:20:34

more advanced than typical.

1:20:36

VoidLink includes an unusually broad and

1:20:39

advanced array of capabilities.

1:20:42

So basically this article from Ars

1:20:44

Technica,

1:20:44

it kind of dives into this new Linux

1:20:50

malware that can infect Linux machines and

1:20:53

it has a lot of advanced capabilities that

1:20:55

attackers can use to perform various

1:20:57

things on your computer.

1:21:02

I feel like we've talked a bit about

1:21:03

malware on Linux in the past.

1:21:04

I think this is a trend that's only

1:21:06

going to continue,

1:21:08

especially as more people adopt Linux.

1:21:11

The reality is all of these malware

1:21:13

targets or malware developers are going to

1:21:15

target the platforms that most people use.

1:21:19

And so if we see more people adopt

1:21:21

something like the Steam Deck or more

1:21:23

people adopt Linux on desktop because

1:21:24

gaming is getting better or because they

1:21:26

want to escape...

1:21:27

all of the Copilot nonsense in Microsoft

1:21:30

Windows or for whatever reason,

1:21:33

we will see Linux become more and more

1:21:35

of a target

1:21:38

just inherently, I think.

1:21:40

And so we're going to probably see more

1:21:41

articles like this.

1:21:42

But it does demonstrate what I think a

1:21:44

lot of people in the privacy and

1:21:46

especially the security community have

1:21:48

been saying about Linux and desktop Linux,

1:21:50

especially for a while,

1:21:52

which is that I think Linux does have

1:21:55

a good ways to go as far as

1:21:59

defending itself against

1:22:01

malware like this. I think Linux has for

1:22:03

a very long time greatly benefited from

1:22:06

not having a very big market share on

1:22:08

desktop. People will always say, you know,

1:22:12

Linux has very high usage on the server,

1:22:14

and so therefore there should be more

1:22:16

malware for it based on that. But that

1:22:18

isn't true because the desktop ecosystem

1:22:20

is just a very different

1:22:23

threat landscape.

1:22:25

You're running so many different

1:22:26

applications,

1:22:27

you're running like web browsers,

1:22:28

especially, which are downloading arbitrary

1:22:30

code from the internet,

1:22:31

anything that you're just doing random,

1:22:34

whatever desktop things on is going to

1:22:35

have a much

1:22:36

larger attack surface than something like

1:22:38

a Linux server.

1:22:40

If I set up a Linux server,

1:22:42

it's only going to do whatever I install.

1:22:44

And so the attack surface is very small,

1:22:46

and that's why you haven't seen a lot

1:22:48

of malware targeting these Linux servers.

1:22:52

But yeah, this is...

1:22:56

I think that's basically my only point.

1:22:58

I think this is a trend that we'll

1:22:59

continue to see.

1:23:00

So I hope that Linux distro developers and

1:23:02

the Linux kernel project take security a

1:23:05

bit more seriously because there are

1:23:08

security features that we see on

1:23:11

mainstream big tech platforms like macOS

1:23:13

and Windows that Linux still could benefit

1:23:17

from.

1:23:18

And it hasn't seen much of a focus

1:23:19

yet, unfortunately.

1:23:21

Did you have any takeaways from this

1:23:22

article when you read through it that I

1:23:24

didn't cover, Nate?

1:23:26

Well, kind of to add to that,

1:23:27

because I think your takeaway is spot on,

1:23:29

like Linux needs better security.

1:23:32

I don't think there's a lot of people

1:23:34

that would argue with that,

1:23:34

that know what they're talking about.

1:23:37

But no, it's interestingly,

1:23:40

this kind of plays into what we were

1:23:42

talking about right before we transitioned

1:23:43

to this story, because it says that...

1:23:46

This particular malware is actually aimed

1:23:48

at servers.

1:23:49

It's specifically aimed at virtual

1:23:50

machines and stuff,

1:23:52

and it can detect popular hosting

1:23:54

providers like AWS, Azure, Tencent,

1:23:57

and they say that there's indications the

1:23:59

developers are gonna add detection for

1:24:00

Huawei, DigitalOcean,

1:24:02

And it's very modular.
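
As an aside on the "detects popular hosting providers" part: one mundane way software, benign or otherwise, figures out which cloud a Linux VM is running on is simply reading the DMI vendor string the hypervisor exposes. A minimal illustrative sketch follows; the vendor strings here are approximate, and this is not code from the sample Ars analyzed.

```python
from pathlib import Path

# On Linux VMs, the virtualization/cloud vendor usually shows up in DMI data.
DMI_VENDOR = Path("/sys/class/dmi/id/sys_vendor")

KNOWN_VENDORS = {
    "Amazon EC2": "AWS",
    "Microsoft Corporation": "Azure (or Hyper-V)",
    "Google": "Google Cloud",
    "DigitalOcean": "DigitalOcean",
}

def detect_cloud() -> str:
    try:
        vendor = DMI_VENDOR.read_text().strip()
    except OSError:
        return "unknown (no DMI data; maybe bare metal or a container)"
    for needle, name in KNOWN_VENDORS.items():
        if needle in vendor:
            return name
    return f"unrecognized vendor: {vendor!r}"

if __name__ == "__main__":
    print(detect_cloud())
```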

1:24:04

So that's kind of been one of Linux's

1:24:07

saving graces, I guess you could say,

1:24:09

is because, you know,

1:24:10

when you buy a Mac,

1:24:11

you're buying the entire device, right?

1:24:13

And like when you're buying Windows,

1:24:14

you're generally buying,

1:24:15

it's a little bit mix and match,

1:24:16

but generally there's, you know,

1:24:18

a handful of people make the chips and

1:24:19

a handful of people make, you know,

1:24:21

the RAM and all that, even less now.

1:24:24

But it's, you know,

1:24:25

Linux machines are so varied in their

1:24:28

hardware and their capabilities and what

1:24:29

they're designed for,

1:24:30

like you were saying.

1:24:31

And

1:24:32

So this one is very modular,

1:24:34

and it can do all kinds of different

1:24:35

things depending on what type of machine

1:24:38

it's on and what,

1:24:39

if I'm reading this right,

1:24:40

and what the attacker needs it to do,

1:24:42

which is really interesting.

1:24:43

And kind of just to back up what

1:24:45

you were saying about we're going to see

1:24:46

more of this, the article says, like,

1:24:48

similar things have targeted Windows

1:24:49

servers for years,

1:24:51

but they're less common on Linux.

1:24:53

And, you know, like I said,

1:24:54

this goes back to everything.

1:24:55

This goes back to Linux needs better

1:24:57

security.

1:24:57

This goes back to...

1:25:00

Was I saying resilience?

1:25:01

Because if the VM that's hosting my app

1:25:04

goes down, can I use the app?

1:25:07

Oh, man.

1:25:08

This was a good story to end on,

1:25:09

I think,

1:25:10

because so many things come together.

1:25:12

And yeah,

1:25:14

is it going to take down AI data

1:25:15

centers now?

1:25:16

How are people going to live without their

1:25:18

AI chatbot telling them what kind of

1:25:19

coffee they want?

1:25:21

So yeah, I think they'll survive.

1:25:24

Oh, I don't know, man.

1:25:25

But, but I, I can't pick between the,

1:25:29

the, the hot coffee and the cold brew.

1:25:30

I don't know.

1:25:32

I got nothing, but yeah, it's,

1:25:35

it's really, and I agree.

1:25:36

I agree with you a hundred percent on

1:25:37

your takeaway that we do.

1:25:38

I,

1:25:39

I always hate telling developers what to

1:25:41

do because I'm not a code person and

1:25:42

I would love to know code.

1:25:43

I'm trying to learn code for the record.

1:25:45

That's one of my goals this year is

1:25:46

to learn at least Python.

1:25:48

I feel like that's a good foundation to

1:25:49

start with.

1:25:50

And I feel bad saying like, oh,

1:25:53

you should go do this thing when I

1:25:54

can't really contribute to that.

1:25:55

But it's, you know,

1:25:57

privacy and security are, you know,

1:25:59

Carey Parker from Firewalls Don't Stop

1:26:01

Dragons,

1:26:01

he likes to say that privacy is a

1:26:02

team sport.

1:26:03

And like you were saying with the

1:26:03

messengers,

1:26:04

and when we do things that raise the

1:26:07

default level of privacy and security,

1:26:09

it, it raises everyone with it.

1:26:12

What's the phrase?

1:26:12

Like a rising tide lifts all boats.

1:26:14

And so it's,

1:26:16

it's not me trying to sound entitled and

1:26:18

be like, well,

1:26:19

these developers need to do what I want.

1:26:20

It's like, no,

1:26:21

like if we put more emphasis into

1:26:23

security, everyone benefits by default.

1:26:27

And,

1:26:27

Yeah,

1:26:28

clearly the article backs up what you were

1:26:29

saying about we're just going to see more

1:26:31

of this,

1:26:31

whether it's on the server side or the

1:26:32

desktop side.

1:26:34

Yeah,

1:26:34

my wife now has two Linux devices because

1:26:37

she now has a Steam Deck and a

1:26:38

Pop!

1:26:38

OS machine, which, quick side note,

1:26:41

when she got the Steam Deck,

1:26:42

I took great joy in telling her,

1:26:43

now you can tell people I use Arch,

1:26:45

by the way.

1:26:45

So I had to make the joke.

1:26:50

Yay!

1:26:53

I feel accomplished tonight.

1:26:55

Yeah, that's all I got.

1:26:57

Just kind of backing up what you were

1:26:58

saying.

1:27:02

Let's move on to some forum threads that

1:27:04

have been popular this week.

1:27:07

I think one that has gotten a lot

1:27:09

of discussion over the past few days was

1:27:12

about Mailbox,

1:27:13

which is recommended on our site as an

1:27:15

email provider.

1:27:18

There's this thread on the Mailbox.org forum,

1:27:22

basically,

1:27:22

that was also linked on our forum for

1:27:24

further discussion,

1:27:25

talking about

1:27:27

some issues that people are having with

1:27:30

Guard and Mailbox. If you're unfamiliar

1:27:33

or haven't checked their website in a

1:27:35

while, Mailbox recently

1:27:39

went through a whole refresh.

1:27:41

It seems like they redesigned their whole

1:27:43

website.

1:27:43

They refreshed all of their apps.

1:27:45

And it seems like they might have changed

1:27:47

development.

1:27:48

And so unfortunately,

1:27:50

this isn't something I would say that

1:27:52

we've gotten a great chance to take a

1:27:55

longer look at.

1:27:56

But I would ask anyone who

1:28:00

uses Mailbox right now: if you have any

1:28:02

experiences with things changing or

1:28:05

potential problems with this new version,

1:28:07

definitely let us know in this forum

1:28:08

thread, because it's something that we

1:28:10

want to keep an eye on. I know

1:28:11

we have a couple team members who

1:28:13

do use Mailbox. I don't personally, but

1:28:17

we

1:28:17

we're gonna have them look into some of

1:28:19

these things and hopefully find out more

1:28:22

about what's going on. But yeah,

1:28:24

according to the users on these

1:28:26

forums, there's potential security

1:28:30

issues, it seems like, with Guard,

1:28:34

which is their tool that basically

1:28:38

encrypts all of your messages with PGP. So it's

1:28:40

important. And I guess it's

1:28:45

leaving behind traces,

1:28:46

even after you log out on a machine.

1:28:48

That seems to be the main gist of

1:28:50

the issue.
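
For context, a feature like Guard is essentially a managed PGP layer over your mail. Here is a minimal sketch of the underlying operation using the python-gnupg wrapper, assuming GnuPG is installed and the recipient's public key is already in your keyring; the address below is a placeholder.

```python
import gnupg  # pip install python-gnupg; wraps the local gpg binary

gpg = gnupg.GPG()  # uses your default GnuPG home directory

message = "Meet at the usual place."

# Encrypt to the recipient's public key; only their private key can decrypt.
encrypted = gpg.encrypt(message, recipients=["friend@example.com"])
assert encrypted.ok, encrypted.status

# An ASCII-armored "BEGIN PGP MESSAGE" block, safe to send over plain email.
print(str(encrypted))
```

The complaint in the thread is about what happens around a step like this, i.e., traces reportedly persisting on the machine after logout, not about PGP itself.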

1:28:50

So again,

1:28:51

that's something that we'll want to

1:28:52

validate,

1:28:53

but it's something you might want to be

1:28:55

aware of if you are a Mailbox user.

1:28:59

And hopefully we'll have more information

1:29:00

to share on that soon.

1:29:04

I think that kind of covers it.

1:29:08

Did you see any comments in this thread

1:29:09

that you wanted to cover?

1:29:13

No,

1:29:13

I just wanted to back up what you

1:29:14

said about if anybody is a Mailbox user

1:29:16

who can kind of shed some light.

1:29:17

Because it very quickly devolved.

1:29:19

I don't want to say devolved.

1:29:21

That's not the right word.

1:29:22

It very quickly turned into people

1:29:26

discussing Proton and Tuta and some of

1:29:29

the features they offer.

1:29:30

Because a lot of people were like, oh,

1:29:31

I use Mailbox because it has this feature.

1:29:33

And people were like, well,

1:29:34

Proton offers that.

1:29:35

And they say, oh,

1:29:36

but I don't like that it doesn't do

1:29:37

this.

1:29:38

And they say, well, Tuta does that.

1:29:40

You know, which is fine.

1:29:41

It's totally fine.

1:29:42

But it kind of...

1:29:44

As an outsider coming in who's never used

1:29:46

Mailbox,

1:29:47

I'm kind of looking at this and I'm

1:29:48

like, so what is the issue?

1:29:49

Like,

1:29:50

I actually asked Jonah that before we

1:29:51

started recording.

1:29:52

I was like, what is Guard?

1:29:53

What is going on?

1:29:54

But yeah, and it's also interesting.

1:29:58

It's not necessarily related,

1:29:59

but we did kind of note that Mailbox.org

1:30:03

appears to have really facelifted their

1:30:06

website.

1:30:07

So...

1:30:09

Which it looks great, for the record.

1:30:10

Looks super modern, super slick,

1:30:12

very awesome.

1:30:13

But it does seem that they've done a

1:30:15

lot of things,

1:30:16

both on the user end and behind the

1:30:18

scenes.

1:30:19

And yeah,

1:30:20

I think we're just trying to get a

1:30:21

better idea of exactly what's going on

1:30:23

here.

1:30:24

And if there is anything to be concerned

1:30:26

about,

1:30:26

we definitely want to make sure Mailbox

1:30:27

users know.

1:30:28

We want to make sure that we know,

1:30:29

so we can see if there's any concerns

1:30:33

that affect our recommendations,

1:30:34

all that kind of stuff.

1:30:37

Yeah,

1:30:37

all of those Mailbox changes are

1:30:39

relatively recent,

1:30:39

so it sounds like they're more extensive

1:30:43

than I had thought when they first

1:30:46

announced that.

1:30:48

For sure.

1:30:51

All right.

1:30:52

Did we have any other forum threads you

1:30:55

wanted to discuss,

1:30:56

or should we turn it over to questions?

1:30:59

I don't think so.

1:31:00

I think we can look through the chat

1:31:02

and see if anyone had any questions for

1:31:04

us this week and the forum thread as

1:31:08

well.

1:31:08

Let me get it pulled up.

1:31:10

But if you have any,

1:31:11

you want to highlight right off the bat,

1:31:13

definitely get started.

1:31:17

Let's see.

1:31:18

I'm looking through the forum thread right

1:31:20

now.

1:31:21

You know,

1:31:22

I don't know if we do have any

1:31:24

answers to this.

1:31:24

One of our members,

1:31:27

Bits on Data Dev, says,

1:31:29

referring to the headline story,

1:31:31

the Confer AI, he says,

1:31:33

where do the trusted execution

1:31:34

environments run?

1:31:36

They'd be more trusted if I knew where

1:31:37

these were and how they're insulated from

1:31:39

the surrounding environment.

1:31:40

For now,

1:31:40

I can't seem to find much info on

1:31:42

it.

1:31:42

It's Marlinspike,

1:31:43

so I feel pretty sure I can play

1:31:44

with this for fun,

1:31:45

but I'm not about to have a therapy

1:31:46

session with it anytime soon,

1:31:47

though that should never be the use case

1:31:48

for AI in an ideal world.

1:31:50

Thank you for getting one step ahead of

1:31:51

me there.

1:31:53

Yeah, I don't.

1:31:55

Unfortunately, correct me if I'm wrong,

1:31:57

but I think a lot of what we

1:31:58

know either comes from like that article.

1:32:00

I don't know if Moxie's done any direct

1:32:02

interviews yet,

1:32:03

but there's reputable sources like Ars

1:32:06

Technica, for example.

1:32:07

But there's also,

1:32:09

I think I mentioned it at the top,

1:32:10

but I may have accidentally stumbled over

1:32:12

myself and rushed through it.

1:32:13

If you go to confer.to,

1:32:15

which is their website,

1:32:16

and I think right at the beginning,

1:32:18

it prompts you to log in.

1:32:19

There is no like free tier for this

1:32:20

thing.

1:32:21

But it says on that login page,

1:32:23

it's like, oh, click here to learn more.

1:32:25

And he has three blog posts.

1:32:27

I do remember now, I did say this,

1:32:29

where he digs in a little bit more

1:32:30

to the details.

1:32:31

I don't know if he gives you that

1:32:33

level of detail that you're looking for,

1:32:35

but if you have any more technical

1:32:36

questions, I would start there for sure,

1:32:37

because that blog post gets very

1:32:39

technical.

1:32:39

Yeah,

1:32:41

that's probably a good place to look.

1:32:42

I haven't seen where Confer is hosted

1:32:45

specifically,

1:32:45

if you're talking about the hosting

1:32:47

provider side of things.

1:32:48

Typically...

1:32:51

Like I know with Maple AI, for example,

1:32:54

they run on Amazon Web Services and Amazon

1:32:56

is selling a service right now to people

1:33:00

who run this sort of product where you

1:33:03

can rent access to these trusted execution

1:33:06

environments that are hardware validated.

1:33:08

And I know Intel and AMD have this

1:33:11

Intel...

1:33:14

Notably,

1:33:15

Signal has been using Trusted Execution

1:33:18

Environments for things like contact

1:33:20

discovery and other features if you've

1:33:22

seen any of their integrations with

1:33:23

Intel's platform on their blog.

1:33:26

So that's the sort of thing that this

1:33:29

is like.

1:33:31

The trusted execution environments, yeah,

1:33:36

they're running on various providers,

1:33:38

and it's mainly relying on the hardware to

1:33:44

isolate that environment from the rest of

1:33:47

the stack.

1:33:48

However,

1:33:49

and I know that we talked about this,

1:33:52

because I remember talking about this in a

1:33:53

previous episode.

1:33:54

Maybe we can find that and link it

1:33:56

after, but I...

1:33:58

The problem with all of these things is that

1:34:01

they aren't fully validated yet, and

1:34:04

protecting against physical access, like I

1:34:05

said earlier, is not something that they

1:34:07

were designed to protect against in the

1:34:09

first place. Now they're kind of being used

1:34:11

for that purpose. Maybe we'll see

1:34:12

improvements

1:34:13

to that end, I don't know. But

1:34:16

at the end of the day, vulnerabilities

1:34:21

in these platforms have been found before.

1:34:23

One was found very recently, because we

1:34:24

talked about it on this show, and other

1:34:27

ones have been found in the past. And

1:34:28

very often, not always, but very

1:34:32

often,

1:34:33

they do rely on some sort of physical

1:34:34

access,

1:34:35

but it just kind of shows that

1:34:36

these aren't the best protections against

1:34:39

people who have physical access to the

1:34:40

machines.

1:34:41

You can't fully rely on them for that.

1:34:44

And the only thing that they can really

1:34:48

do is kind of isolate the code that's

1:34:51

running and also

1:34:53

in theory,

1:34:54

validate the code that's being run.

1:34:56

But the code that's being run could be

1:34:59

anything.

1:34:59

It could be code that's spying on you,

1:35:01

for example.

1:35:02

And if that's running in the TEE and

1:35:04

you don't know it,

1:35:05

what protection is it really giving you?

1:35:07

None at all.

1:35:07

And so I think in this case,

1:35:09

like with Confer,

1:35:10

I believe everything should be open source

1:35:12

so people can audit this.

1:35:13

But are people auditing the code that's

1:35:15

being run in these trusted execution

1:35:17

environments across the board?

1:35:19

I don't really know if that's the case.

1:35:21

I don't know.

1:35:22

what Confer has done about that,

1:35:23

but I don't think that's happening with a

1:35:25

lot of these other companies who are doing

1:35:26

similar things to Confer.

1:35:31

Totally agree.

1:35:31

But a quick note,

1:35:32

I don't know if you remember,

1:35:33

or if you saw in the original article

1:35:36

about Confer,

1:35:37

the Ars Technica one we spoke about at

1:35:38

the top.

1:35:40

Towards the bottom,

1:35:41

it talks about how he's actually offering

1:35:43

remote attestation.

1:35:44

or I don't know how to pronounce that

1:35:46

word,

1:35:46

but how you can remotely verify that the

1:35:49

code is running on the server,

1:35:51

that it has not been tampered with and

1:35:53

that there is no additional code.

1:35:55

Um, I,

1:35:56

I don't know to my non-technical brain,

1:35:58

that sounds like a big claim.

1:36:00

So I don't know anything about that,

1:36:01

but I'm just saying they did say that's

1:36:03

a thing, at least with Confer.
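
For what it's worth, here's a minimal sketch in Python of the general shape of that kind of check. This is not Confer's actual protocol, and every key and name in it is hypothetical: the idea is that the enclave returns a "measurement" (a hash of the code it booted) signed by the hardware vendor, and the client has to verify both the signature and that the measurement matches a build someone has independently audited.

```python
# Conceptual remote-attestation check (hypothetical, not Confer's protocol).
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey


def verify_attestation(vendor_pubkey_bytes: bytes,
                       measurement: bytes,
                       signature: bytes,
                       audited_build: bytes) -> bool:
    """Return True only if the measurement is genuinely signed AND it
    matches the hash of a build we have independently audited."""
    vendor_key = Ed25519PublicKey.from_public_bytes(vendor_pubkey_bytes)
    try:
        # Step 1: the hardware vendor vouches that this measurement
        # really came from a genuine trusted execution environment.
        vendor_key.verify(signature, measurement)
    except InvalidSignature:
        return False
    # Step 2: the part that's easy to skip. A valid signature over
    # *malicious* code still verifies; you also need to know that the
    # measured code is code you trust.
    return measurement == hashlib.sha256(audited_build).digest()
```

Note that the second step is exactly the caveat being discussed here: attestation can tell you which code is running, not whether that code deserves your trust.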

1:36:06

Yeah.

1:36:06

And we see this with other,

1:36:10

similar platforms as well.

1:36:13

I just keep bringing up Maple because it's

1:36:14

the only one that we've...

1:36:18

talked about on the forum a bit,

1:36:19

but they have cryptographic proofs that

1:36:22

they publish on their website as well,

1:36:25

where you can ensure that you're talking

1:36:27

with their secure servers.

1:36:29

But what does that tell you about the

1:36:31

code that's actually being run on it

1:36:33

itself?

1:36:33

That is unclear.

1:36:35

And that's the main thing that I would

1:36:37

caution people against.

1:36:39

Just because you know that you're

1:36:40

interacting with a trusted piece of code

1:36:44

doesn't mean that the code is trustworthy.

1:36:46

It really depends.

1:36:48

Like, they don't even have to be...

1:36:51

Confer doesn't have to be malicious to

1:36:52

have bugs in their code.

1:36:53

There's buggy code all the time.

1:36:55

And so it's not a guarantee by any

1:36:58

means that there's no way to get your

1:36:59

data out of this.

1:37:01

And yeah,

1:37:03

that's the kind of thing that I would

1:37:05

be very concerned about.

1:37:06

I don't really know how what you're

1:37:09

describing would

1:37:13

work in a web-based client like

1:37:15

Confer because then we get into another

1:37:17

issue where we talk about this on our

1:37:19

website with end-to-end encrypted web

1:37:21

applications like Proton where Proton

1:37:24

could in theory send you a totally

1:37:26

different version of the website that like

1:37:31

runs JavaScript that decrypts your data,

1:37:33

for example,

1:37:33

and they could do it surreptitiously where

1:37:35

it would be very difficult to detect that

1:37:37

they're doing it.

1:37:38

I mean,

1:37:38

unless you're like going through the

1:37:40

inspect element source code and you're

1:37:42

looking at all the JavaScript and you're

1:37:43

seeing if it's different and then you just

1:37:45

have to assume

1:37:46

they're targeting you and they didn't just

1:37:47

push out an update.

1:37:48

Like that's the kind of thing that's very

1:37:49

hard to detect.

1:37:51

And I don't think Confer can really do

1:37:52

anything about that without a native

1:37:54

client.
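
To illustrate why that's so hard to catch, here's a rough sketch of about the best an outside user can do against a web client: trust-on-first-use monitoring that re-downloads the served JavaScript and flags when its hash changes. The URL is a placeholder, and note the fundamental limitation: a changed hash could be an attack or just a routine update, which is exactly the ambiguity described above.

```python
# Trust-on-first-use monitoring of a web app's JavaScript bundle.
# Hypothetical URL; a hash change may be an update OR a targeted swap.
import hashlib
import json
import pathlib
import urllib.request

SCRIPT_URL = "https://example.com/app.js"  # placeholder bundle URL
STATE = pathlib.Path("seen_hashes.json")   # where we remember past hashes


def check_script() -> None:
    # Download the bundle exactly as a browser would receive it.
    with urllib.request.urlopen(SCRIPT_URL) as resp:
        digest = hashlib.sha256(resp.read()).hexdigest()

    seen = json.loads(STATE.read_text()) if STATE.exists() else {}
    previous = seen.get(SCRIPT_URL)
    if previous is not None and previous != digest:
        print(f"WARNING: served JavaScript changed: {previous} -> {digest}")

    seen[SCRIPT_URL] = digest
    STATE.write_text(json.dumps(seen, indent=2))


if __name__ == "__main__":
    check_script()
```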

1:37:57

I think going back to Apple's AI

1:38:00

implementation,

1:38:00

if we want to talk about the security,

1:38:02

their Private Cloud Compute might

1:38:05

be a bit better against this, because they

1:38:07

can run code in your operating system,

1:38:10

like on iOS, that validates the servers.

1:38:12

I mean, if they implement

1:38:14

this properly. With Apple, everything is

1:38:16

proprietary, so who knows what they're

1:38:18

doing. You can't really trust

1:38:19

these platforms either. But I'm just saying,

1:38:21

in theory, running a native client, there

1:38:25

could be some validation involved. But with

1:38:26

something like a web app like Confer,

1:38:29

if they change the server,

1:38:30

if they want to redirect you to a malicious

1:38:32

server,

1:38:32

they can just give you a different version

1:38:34

of the website that connects to this

1:38:36

malicious server and says, yep,

1:38:38

the server's verified.

1:38:39

It's all good.

1:38:40

And how would you know otherwise?

1:38:42

I don't think that would be very easy

1:38:46

for most people to detect.

1:38:47

So yeah, all of this private AI stuff,

1:38:50

if it's running in the cloud...

1:38:53

and not locally,

1:38:55

I wouldn't really trust it.

1:38:57

And that is the main reason that on

1:38:59

our website,

1:39:00

we do have some AI recommendations.

1:39:03

But if you're going to use AI at

1:39:04

all,

1:39:04

we only recommend local AI models at this

1:39:07

time because it's really the only way to

1:39:10

ensure that your data that you're

1:39:13

inputting into it isn't going to be

1:39:14

monitored or logged by other parties.

1:39:16

That's just the reality of the situation

1:39:19

at the moment.
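
As a minimal sketch of what the local-only approach looks like in practice, assuming you run a local model server such as Ollama on its default port (the model name here is just an example), the prompt never has to leave your machine:

```python
# Query a locally hosted model over Ollama's local HTTP API.
# Assumes Ollama is running on its default port with the model pulled;
# "llama3.2" is just an example model name.
import json
import urllib.request


def ask_local_model(prompt: str, model: str = "llama3.2") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response object
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # local server, no cloud
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


print(ask_local_model("Summarize why local AI models are better for privacy."))
```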

1:39:23

Fair enough.

1:39:26

Here's another question about Confer from

1:39:27

the forum.

1:39:28

Do you think Confer's way of doing AI

1:39:30

is something other AI products will follow

1:39:32

suit,

1:39:32

and how difficult would it be for existing

1:39:34

AI products to migrate to that?

1:39:38

I would say...

1:39:41

I think that it's likely that other AI

1:39:43

products will do this.

1:39:44

It does seem like NVIDIA is putting more

1:39:47

resources into this trusted execution

1:39:50

environment from what I've seen in their

1:39:52

GPUs for AI data centers to use.

1:39:57

Again,

1:39:59

like I said a few times on this

1:40:00

show already,

1:40:01

what Confer is doing isn't super new.

1:40:04

We've seen it from other companies like

1:40:07

Maple and like Apple.

1:40:08

I think Confer could be doing a better

1:40:11

job.

1:40:11

They probably have more security-minded

1:40:12

people behind it.

1:40:14

I don't know.

1:40:15

I haven't looked too much into it,

1:40:16

but

1:40:19

it's something that has been done before.

1:40:21

It's something that I think we'll probably

1:40:22

continue to see happening.

1:40:26

Um, and as far as I know,

1:40:28

there isn't much stopping AI companies

1:40:31

from implementing this in hardware,

1:40:34

especially as the hardware supports this

1:40:36

and gets better.

1:40:37

Um,

1:40:39

So I don't know why most companies like

1:40:43

ChatGPT or OpenAI would be incentivized to

1:40:47

do this.

1:40:47

It seems like they're perfectly happy to

1:40:49

just have all of your data.

1:40:52

So I don't know if it will happen,

1:40:55

but it definitely could happen for sure.

1:41:00

Yeah, I agree.

1:41:01

It's the incentive thing for me.

1:41:03

I'm sure, as you noted,

1:41:04

there's already other companies trying to

1:41:06

do this and trying to create those private

1:41:08

alternatives,

1:41:08

but

1:41:09

As far as the ones that are around,

1:41:10

the OpenAIs, the Anthropics,

1:41:13

I don't see what their incentive would be

1:41:15

to do it.

1:41:20

We did have one more quick thing in

1:41:24

the forum.

1:41:24

It's actually a shout out from,

1:41:27

we have a pretty active member who goes

1:41:29

by Nombre Falso.

1:41:31

And he said,

1:41:31

there's an age verification bill making

1:41:34

its way through Florida.

1:41:35

SB 482

1:41:36

would require you to verify your

1:41:38

identity to use an AI chatbot.

1:41:39

But he makes a pretty good point that

1:41:42

with AI getting so integrated into things

1:41:44

like Gemini, for example,

1:41:45

where Google's rolling it out to every

1:41:46

part of their product system,

1:41:48

does that mean that eventually they can

1:41:50

make the argument that you have to verify

1:41:51

your identity to use Gmail?

1:41:53

So pretty concerning stuff.

1:41:56

And if you're,

1:41:58

In Florida,

1:41:58

definitely go check out that link in the

1:42:00

forum and learn more about it.

1:42:04

I only have access to the YouTube chat.

1:42:08

I'm not sure if we've missed anything in

1:42:09

some of the other chats,

1:42:10

because I know we're live streaming to a

1:42:11

few different platforms right now.

1:42:14

I don't believe we have.

1:42:17

So I guess we can do kind of

1:42:18

a last call for any questions if people

1:42:20

are still wondering about anything.

1:42:23

Otherwise...

1:42:26

It looks like we've gotten through

1:42:28

everything on the forum.

1:42:31

Yeah.

1:42:32

I'm not seeing anything in the YouTube

1:42:33

chat that we haven't already addressed.

1:42:35

There's the question about Jami.

1:42:38

We did have a new member sign up

1:42:39

tonight.

1:42:41

What is that?

1:42:42

Oh, man.

1:42:42

I don't know if I can pronounce this

1:42:44

because I think this is a Greek name.

1:42:46

Ionis Karopoulos.

1:42:49

But they became a member on YouTube

1:42:51

tonight,

1:42:52

I think at the beginning of the stream.

1:42:53

And I think we missed that.

1:42:54

So thank you so much for signing up

1:42:56

and supporting Privacy Guides and our

1:42:58

mission.

1:42:59

Really appreciate it.

1:43:01

We got another question in the chat from

1:43:04

Unredacted.

1:43:05

Any update on the bad internet bills that

1:43:08

we had talked about?

1:43:10

As far as I know,

1:43:12

it's been pretty slow over the holidays.

1:43:14

I haven't seen any new news,

1:43:15

but they are continuing to advance.

1:43:18

I haven't seen any good news in that

1:43:19

direction either.

1:43:21

And I don't think...

1:43:23

no news is good news in this case.

1:43:25

I think no news means that they're working

1:43:27

on things behind the scenes to continue

1:43:30

pushing it through.

1:43:31

So yeah,

1:43:33

when we see more updates on that,

1:43:36

we'll definitely be sharing on our social

1:43:38

media and keeping people up to date.

1:43:41

But I don't know.

1:43:44

It's hard to keep track of all of

1:43:45

the many vacations that Congress feels

1:43:49

welcome to take away from their jobs.

1:43:51

And so I don't know if they're actually

1:43:53

doing anything right now over in

1:43:54

Washington or if they're just kind of

1:43:55

lounging around for the winter.

1:43:57

So that could have something to do with

1:43:59

it.

1:44:00

Yeah.

1:44:01

But yeah, if there's updates,

1:44:03

I'll let you know.

1:44:05

Yeah,

1:44:05

it looks like Congress reconvened on

1:44:09

January fifth, if I'm reading this right,

1:44:12

for the subcommittee markup of six bills.

1:44:16

It says that's the Committee on Energy and

1:44:18

Commerce,

1:44:18

which I think is what a lot of

1:44:19

these bills fell under,

1:44:20

if I remember correctly.

1:44:21

Yeah, they met up yesterday, actually.

1:44:24

And the next meeting was scheduled today at

1:44:26

three.

1:44:27

So three p.m.

1:44:29

So, I mean, theoretically,

1:44:31

if anything happened,

1:44:32

hopefully we should hear about it any day

1:44:34

now.

1:44:35

It could be ongoing right now.

1:44:37

Maybe that's the update.

1:44:38

You know, that's true.

1:44:39

For all the hate that politicians

1:44:41

rightfully get,

1:44:42

they usually do work pretty late when they

1:44:45

have these meetings.

1:44:45

So, yeah,

1:44:46

they might be talking about it as we

1:44:48

speak.

1:44:49

Everybody, uh, focus real,

1:44:51

real hard and we're going to send them

1:44:53

a message to tell them to stop being

1:44:55

stupid.

1:44:57

Please.

1:45:00

Yeah.

1:45:00

Jokes aside.

1:45:03

Um, yeah.

1:45:03

Unredacted said, thanks.

1:45:04

Hard to keep track these days.

1:45:05

Yeah.

1:45:05

Trust me.

1:45:06

You're telling me there's so much to keep

1:45:08

track of and, uh, but all righty.

1:45:17

Well,

1:45:17

I think that kind of wraps everything up

1:45:19

then Nate,

1:45:20

do you want to take the outro here?

1:45:23

Sure.

1:45:23

I can do that.

1:45:25

So all the updates from this week will

1:45:27

be shared on the blog.

1:45:28

So if you are not signed up yet

1:45:30

for the newsletter, go ahead and do that.

1:45:32

Or you can of course,

1:45:33

subscribe with your favorite RSS reader.

1:45:36

For people who prefer audio,

1:45:37

we also offer a podcast available on all

1:45:40

platforms and RSS,

1:45:41

and this video will also be synced to

1:45:43

PeerTube.

1:45:44

Privacy Guides is an impartial nonprofit

1:45:47

organization that is focused on building a

1:45:48

strong privacy advocacy community and

1:45:51

delivering the best digital privacy and

1:45:52

consumer technology rights advice on the

1:45:54

internet.

1:45:55

If you wanna support our mission,

1:45:56

then you can make a donation on our

1:45:57

website, privacyguides.org.

1:45:59

To make a donation, click the red heart icon

1:46:01

located in the top right corner of the

1:46:03

page. You can contribute using standard

1:46:05

fiat currency via debit or credit card, or

1:46:07

you can donate anonymously using Monero or

1:46:09

your favorite cryptocurrency. Becoming a

1:46:12

paid member unlocks exclusive perks like

1:46:14

early access to video content, priority

1:46:16

during the live stream Q&A. You'll also

1:46:18

get a cool badge on your profile in

1:46:19

the Privacy Guides forum, and the warm

1:46:21

fuzzy feeling of supporting independent

1:46:23

media. So thank you guys so much for

1:46:25

watching, and we'll see you next week.

1:46:27

Thanks, everyone!