All right.
Microsoft abruptly terminated the accounts
of multiple FOSS projects.
Employers are now using personal data to
figure out the lowest salary that you'll
accept.
And the FBI was able to recover deleted
signal messages from a device's local
notification database.
Busy week this week coming up on This
Week in Privacy number forty eight.
So stay tuned.
Welcome back to This Week in Privacy,
our weekly series where we discuss the
latest updates with what we've been
working on within the Privacy Guides
community,
and this week's top stories in data
privacy and cybersecurity.
I'm Jordan,
and with me this week is Nate.
How are you doing, Nate?
I'm doing pretty good.
It's been a good week.
How are you?
I'm doing good,
getting ready to dive into this top story
here.
So this week,
kind of a huge story that's been going
around.
Microsoft abruptly terminates VeraCrypt
account, halting Windows updates.
So basically, quoting from the article,
Microsoft has terminated an account
associated with VeraCrypt,
a popular and long-running piece of
encryption software,
throwing future Windows updates of the
tool into doubt.
VeraCrypt's developer told 404 Media.
The move highlights the sometimes delicate
supply chain involved in the publication
of open source software,
especially software that relies on big
companies, even tangentially.
So according to the VeraCrypt developer,
I didn't receive any emails from Microsoft
nor any prior warnings.
VeraCrypt is an open source tool for
encrypting data at rest.
Users can create encrypted partitions on
their drives or make individual encrypted
volumes to store their files in.
Like its predecessor, TrueCrypt,
which VeraCrypt is based on,
it also lets users create a second
innocuous looking volume if they are
compelled to hand over their credentials.
And I'd also like to add that you can
actually do full disk encryption with
VeraCrypt.
This is why this is quite a concerning
move: if you lose access to your full
disk encryption, then obviously you're
not going to be able to access your
files. So that is kind of a big concern
here as well. And moving on to the second
thing here: WireGuard VPN developer also
can't ship software updates after
Microsoft locks their account. If you
didn't know already, WireGuard is a
protocol that most VPN providers use to
facilitate VPN connections.
And they also have an official WireGuard
app that you can use with your WireGuard
profiles.
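For context, a WireGuard profile is just a short INI-style text file that the official app imports. A hypothetical client profile looks something like this (every key, address, and endpoint below is a made-up placeholder, not a working configuration):

```ini
[Interface]
# The client's private key (placeholder value, not a real key)
PrivateKey = yAnz5TF+lXXJte14tji3zlMNq+hd/rYcIP/G09MIHhA=
Address = 10.0.0.2/32
DNS = 10.0.0.1

[Peer]
# The server's public key and endpoint (placeholders)
PublicKey = xTIBA5rboUvnH4htodjb6e697QjLERt1NAB4mZqp8Dg=
Endpoint = vpn.example.com:51820
# Route all IPv4 and IPv6 traffic through the tunnel
AllowedIPs = 0.0.0.0/0, ::/0
```

The same profile format works across platforms, which is part of why providers can just hand you a file like this to load into the official app.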
And this app on Windows currently can't
be updated, because the WireGuard
developer has also been locked out of
their Microsoft developer account. So
that means they can't ship software
updates to Windows users.
Jason Donenfeld, the creator of the open
source WireGuard VPN software, told
TechCrunch that he has been locked out of
his Microsoft developer account and as a
result cannot sign drivers or ship
updates for WireGuard for Windows users,
which are critical for its software to
run.
And I think probably the most concerning
thing here is if there was a critical
vulnerability found in WireGuard VPN on
Windows,
they wouldn't be able to ship an update
and get it fixed.
So this is kind of a concerning thing
to be happening right now.
But we did see that there was a
sort of update to this story.
So we did want to mention this immediately
off the bat.
Zack Whittaker from TechCrunch did a
Mastodon post saying that he had heard
from VeraCrypt and WireGuard, both
telling him that they had regained access
following their Microsoft account
lockouts and can now release updates
again.
So that was on the eleventh of April,
and a lot of these articles were coming
out on the eighth of April.
So it took a couple of days for
this to happen.
And I guess a lot of backlash, actually.
So there's this article here from Bleeping
Computer that also goes a little bit more
into depth about how this worked.
So according to the VeraCrypt developer,
his account was actually terminated.
You basically need to have a Microsoft
developer account to sign Windows drivers
and the bootloader. I believe this is
because of the Secure Boot process: if
the drivers aren't signed, they won't be
able to load properly.
I've tried to contact Microsoft through
various channels,
but I've only received automated replies
and bots.
I was unable to reach a human.
I cannot publish Windows updates.
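To illustrate the mechanism being discussed: the OS verifies a signature on each driver or update before accepting it, and only the holder of the signing credentials, gated behind the Microsoft developer account, can produce that signature. Here is a toy Python sketch of that verify-before-install gate; it uses an HMAC as a stand-in for the real asymmetric Authenticode signature, so the key and filenames here are illustrative only:

```python
import hashlib
import hmac

# Stand-in for the signing credentials the developer account gates.
# Real driver signing uses asymmetric certificates, not a shared secret.
SIGNING_KEY = b"hypothetical-signing-key"

def sign_update(update_bytes: bytes) -> bytes:
    """What the (now locked-out) developer account lets you do."""
    return hmac.new(SIGNING_KEY, update_bytes, hashlib.sha256).digest()

def install_update(update_bytes: bytes, signature: bytes) -> str:
    """What the OS does: refuse anything that doesn't verify."""
    expected = hmac.new(SIGNING_KEY, update_bytes, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return "rejected: bad or missing signature"
    return "installed"

update = b"wireguard-windows-0.5.4.msi contents"
good_sig = sign_update(update)
print(install_update(update, good_sig))      # installed
print(install_update(update, b"\x00" * 32))  # rejected: bad or missing signature
```

The point of the sketch: without access to the signing step, a developer can still build a fixed binary, but no client will accept it.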
So I think this story kind of highlights
the concern of trusting a centralized
entity with the update process.
So in this case, Microsoft does have
quite a lot of control over which
developers can actually create apps and
updates on their platform, which we
obviously oppose, because people should
be able to run whatever software they
want on their computer. They shouldn't be
held hostage by a corporation.
And I think in this case,
it also ended up being a kind of
security risk, because they weren't able
to release updates and there could have
been a critical vulnerability found.
But I do want to mention that WireGuard
VPN is notoriously very stable.
And the amount of updates that have been
published for it have been quite minimal.
And there haven't been that many critical
vulnerabilities found.
So that is a benefit.
It's also because the WireGuard protocol
is much leaner than other protocols like
OpenVPN.
So that does benefit it in that
circumstance.
This article here from Bleeping Computer
also stated that dev teams from
Windscribe and MemTest86 have also been
locked out of their accounts, so big
issues for Windscribe and MemTest86 too.
And it is kind of concerning that this
happened without any warning,
no notification,
just these developers trying to access
their accounts and they weren't able to.
And like Jonah said here in the chat,
it's almost like having a triopoly on
app stores isn't a good idea.
Yeah,
I think we should be trying to focus
more on having more app stores,
having alternative ways to install
software.
I think, yeah.
I mean,
one of the benefits with Windows is you
don't have to use the app store to
install apps.
You can actually install them
independently, which is a benefit.
But it also,
like we said in this story,
there is a...
element of control that Microsoft has
because you need their permission to sign
drivers and write to the bootloader.
So if you don't have that developer
account,
then you're not going to be able to
do that properly.
In that case,
your users would have to basically bypass
that,
which is a much more technical process.
Although it is possible,
it's not really recommended and a lot of
users are probably going to feel unsafe
about doing that.
So I feel like I've talked for quite a
while here. Nate, do you have any
thoughts on this story so far?
Oh man, I mean, I think you covered most
of it for sure. I'm glad you mentioned
the full disk encryption nature of
VeraCrypt. I'm a VeraCrypt user myself,
I will admit that.
VeraCrypt can be used to encrypt
specific things: you can create a
container, or you can encrypt an
external hard drive.
Or like you said,
you can encrypt your entire Windows
computer. My wife is primarily a Windows
user, and I kind of dual boot between
Windows and Linux for the most part.
I really just use this Mac for this and
that, when I travel and stuff.
So where I'm going with that is, one of
the first things we do is encrypt our
computers with VeraCrypt.
And the developer,
I forget which article he said it in.
He may have said it in all of
them.
But the developer pointed out that if they
hadn't gotten this fixed in time before
the certificate ran out,
which I think would have been early June
or end of June, sometime in June,
then that could have potentially meant the
computers wouldn't boot.
And on top of it,
since he can't push out an update,
how are you supposed to make sure that
people know like, Hey,
decrypt your computers because they might
not boot.
So it's really, it's really,
really troubling for sure.
The thing I like about the Bleeping
Computer article is that it gives more of
Microsoft's side of the story.
And I don't like that necessarily because
I care what Microsoft's opinion is,
but just to have the full,
complete information, right?
And Microsoft claims that ever since,
I think it was April of last year,
they've had this program where developers
have to verify their identity, which
we're seeing now on the Android side of
things, right?
And Android, it's not...
I don't wanna say it's not going well
because it's not been rolled out yet,
but this just shows things can go wrong,
even well-meaning things.
There are problems with this model.
And so it's really confusing.
Did these people just somehow miss the
notifications?
Was there some kind of glitch where they
didn't get notified?
Because the Microsoft,
I think it was like a VP,
was saying that they've been sending out
emails,
they've been sending out notifications.
He said they've been emailing everyone
since October, but I think the guy from
VeraCrypt said that he noticed his
account was shut off in January.
It's just,
we're just now finding out about it for
some reason.
Not like he was hiding it.
He was just,
he was busy trying to figure it out.
And for some reason,
it's just now that he's coming forward and
being like, hey,
here's where I've been for the past couple
of months.
So, I mean, yeah,
there's so many questions here, you know?
Because, again, it's like...
Did something just get missed?
I don't know.
Why all of these?
I have a hard time believing that like
Windscribe, they have a whole team.
How did a whole team miss this?
And like, for the record,
I'm not blaming Windscribe with that.
I'm blaming Microsoft.
Like,
I don't think Microsoft did a good job
of notifying everybody if they did at all.
It's really weird.
I don't like to assume malice.
You know,
I don't like to look at this and
be like, oh,
Microsoft's trying to crush open source.
But I certainly have a lot of questions
and I don't understand how so much fell
through the cracks and so much got missed.
And
It really shouldn't take all this negative
media coverage for Microsoft to be able to
do something.
That's a big concern that I've had for
years,
that I've noticed years and years and
years ago that...
It was with Facebook specifically.
So I used to manage bands and one
of the bands I managed wanted to change
their name because there was another band
with the same name that got super,
super popular.
So they're like,
we need to rebrand because we keep getting
confused with them.
And like trying to get Facebook to rename
this band was literally a multiple months
long deal.
And it's like,
why can't I just talk to a person?
Like it doesn't even have to be real
time.
I'm not asking to call somebody or chat
with somebody.
Why can't I email a human being?
And it's just – everybody's trying to cut
down on the cost of having a support
staff, and they're trying to save money,
and it's all about shareholders making
more money and stuff.
But this is the result: things don't get
fixed. Like, what if this had never
taken off in the media the way it did?
This would have been a really big deal.
This would have been really bad.
So –
Sorry,
I'm having trouble putting my thoughts in
order.
But yeah, this was so, so not cool.
And I guess it makes me wonder.
So, like, first of all,
what do you think,
if you have any opinions,
what do you think developers could do
against this?
Because, I mean,
it really does seem like Microsoft holds
all the cards here.
And I don't know.
Like, it's...
because this is privileged software,
right?
I think that's the difference.
We were talking about this in another chat
earlier today,
is like when I install things on Windows
that don't come from the Microsoft Store,
even when they're unsigned,
there's a little pop-up that's like, hey,
this is unsigned.
And I can tell it like, yeah,
I know, install it anyways.
But for these like really high level,
very privileged things, it's like,
I don't even know if there's a way
around that.
So I mean, yes, in a perfect world,
switch to Linux, right?
But I don't know if there's any other
alternatives there.
Do you have any thoughts on that?
Yeah, I think, you know,
when there's a gatekeeper and that
gatekeeper is Microsoft,
there is like not a whole lot we
can do in this case, unfortunately,
which kind of sucks.
Like you said,
like switching to operating systems that
respect you and don't hold your computer
hostage and do this sort of silliness is,
I mean,
I'm sure there is a security benefit from
doing this, but I think, you know,
we have to look at things from like
different angles as well, right?
Like what does this power that Microsoft
holds, what can it be used for?
And if they can lock people out of their
accounts "accidentally," and I'm saying
that in very large quotation marks, then
what could they do if there was a
foreign government that said, we don't
like that you're allowing people to use
WireGuard VPN to bypass our firewall, or
something like that, right?
I think people should be able to use
their computer in whatever way they want,
right?
We shouldn't be bowing to Microsoft's
whims on what we do on our computers.
So ideally people are using Linux here.
I mean, the ability to use different
repositories is great. It's kind of
unfortunate, though, that whoever
controls the platform kind of gets to
decide these things.
And as far as I'm aware, Linux is
basically the only system where there
isn't a big tech corporation with
ultimate control over everything.
Because with Android, as we've seen,
Google owns the Android source code, so
a lot of the time they can exert things
onto the operating system that are
against the user's interests. For
instance, we've seen where they're
stopping people from installing apps on
their devices and stuff like that.
We're still kind of monitoring that
situation as well.
But I think when we operate on platforms
that are not controlled by the community,
then that's when we run into issues like
this.
So yeah, I mean,
I don't really have too much more to
add unless you have something.
No, yeah,
I guess just the last thing I wanted
to mention is,
do you happen to know off the top of
your head what Privacy Guides' official
encryption software recommendations are?
Off the top of my head,
we do recommend VeraCrypt, I believe.
Yeah.
Okay.
Yeah, I've got it pulled up here.
Let's see.
Can I share?
Yes, sharing this tab.
Okay.
So it looks like we do recommend...
If you're trying to upload to the cloud,
we do recommend Cryptomator.
We do recommend VeraCrypt for disk
encryption. Interesting, I did not know
we recommended that for full disk
encryption, but like I said, it can also
be used standalone to encrypt an
external disk. Oh, there it is:
operating system encryption. Yeah, we
typically recommend whatever comes built
in, so Linux distros will come with
LUKS, which is the Linux Unified Key
Setup. The only thing that kind of sucks
is I'm told it can only happen at the
start, when you install Linux. I don't
think you can go in and add it
afterwards, at least not very easily.
It's kind of clunky.
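For the curious, the reason the passphrase matters so much in LUKS: it gets stretched through a key derivation function (LUKS2 defaults to Argon2id; LUKS1 used PBKDF2), and the result unwraps the actual volume key. A minimal sketch of the PBKDF2 variant using Python's standard library; the iteration count and salt here are illustrative, not cryptsetup's real defaults:

```python
import hashlib
import os

def derive_unlock_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    # Stretch the passphrase so brute-forcing it is expensive.
    # LUKS1 does this with PBKDF2; LUKS2 defaults to Argon2id,
    # which the Python standard library doesn't ship.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)  # the salt lives in the LUKS header; it is not secret
key = derive_unlock_key("correct horse battery staple", salt)
print(len(key))  # 32 bytes, e.g. a key for AES-256
```

Because the derived key only unwraps a separately stored volume key, changing your passphrase doesn't require re-encrypting the whole disk.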
Macs come with FileVault,
and Windows has BitLocker.
It's kind of tricky to use, because
originally it was only in the upper
level versions, like Pro and Enterprise,
which typically cost a lot of money.
The Home version now has it if you use
an online account, which we definitely
do not recommend for a variety of
reasons.
But there are ways around that: you can
upgrade to Pro, and you can usually find
resellers online who will sell you a Pro
license for a lot cheaper. Just make
sure they're reputable.
So BitLocker, I will be honest,
me personally, I'm not too crazy about
BitLocker, because I've seen cases in
the past where there's a vulnerability
found in full disk encryption and
BitLocker is vulnerable to it, but
VeraCrypt is not.
Or I've also seen stories about there's a
bug and now BitLocker won't decrypt and
you can't boot your operating system.
You have to recover it.
So always make sure you save the recovery
methods because that's really important.
But at the same time,
I guess to make a devil's advocate
argument: we know that BitLocker is
secure.
We did cover a story about this a
while back where law enforcement requested
some BitLocker keys from Microsoft,
and Microsoft only had them because of
the whole online account thing. They
were not able to get them just from the
device.
As far as we know,
BitLocker has not been broken by law
enforcement or anything.
But also, we would have been in the same
situation of not being able to boot had
this VeraCrypt thing not been resolved,
right? And there's also plenty of
Windows software that breaks Windows
even without encryption enabled.
So I guess my point is,
I think there's pros and cons,
but BitLocker does have a lot going for
it.
Just something to be aware of.
Yeah.
Yeah,
I guess that's all I got on that
one.
Also, yeah,
just a reminder to get off Windows if
possible.
Even if you dual boot,
I mentioned I have Windows and Linux.
I don't fully run Windows all the time.
I try to use Linux whenever I can,
and then I use Windows for the more
CPU-intensive stuff,
like video editing and stuff like that,
that my Linux machine just can't really
handle.
I think that is all I've got.
All right.
And if that's all we've got on that
story, we are going to talk next about...
what is it? Surveillance wages.
Cause you know,
just when you think privacy can't get any
worse.
So this comes from MarketWatch, and it
says employers are using your personal
data to figure out the lowest salary
that you'll accept.
And yeah, man, honestly,
the headline of this article kind of
says it all.
So there's been a lot of talk lately
about surveillance pricing. As far as we
know, this is happening more online than
in person, but companies will use the
data about you to try and figure out,
oh, maybe you'll pay a little bit extra.
You'll pay a little extra for this plane
ticket.
You'll pay a little extra for this.
I think there was a story a while back
that found that if you were in the
parking lot of a Target and you opened
the store's app or website, they would
actually try to charge you more, because
they figured you probably weren't going
to go to another store. Which, I don't
know who's going to sit in the parking
lot and then order something.
That's kind of weird.
There have also been a lot of
allegations, like Uber, for example:
if they can tell your battery is low,
they'll charge you more, because they
know that you can't afford to wait for
the price to go down.
I don't think that was proven,
but that's definitely an allegation and I
wouldn't put it past them personally.
So yeah,
now the new thing is using your personal
data to figure out how much to pay
you,
which in the past has historically been
based on things like how long you've been
in this industry,
how long you've been at this job,
what the actual job title you're applying
for is, certifications, things like that.
which I'm sure will probably still play a
role.
But of course, now they've got to factor
in things like, where did it go...
things like whether you've taken out a
payday loan, or if you have a high
credit card balance.
I just made a big move and I'm
not going to lie.
We moved from a very high cost of
living area to a lower cost of living
area.
We didn't have savings because the cost of
living was so high.
So I've got a pretty high credit card
balance right now because we use that to
fund a lot of the move.
So, things like that. Further down they
did talk about... actually, I really
feel the need to point out that this is
not hypothetical, because I've been
hearing for a long time now that this is
how it works with gig work.
And with things like DoorDash, there are
certain people who start out making
more. If I put in an order to DoorDash
and it goes to two different dashers,
one of them will see a higher price than
the other one, based on things like how
picky they are: do they just accept
every single one that comes along, or do
they wait for the better ones? And the
ones that typically pay more will
usually go to those people first, right?
Which is so predatory.
It's so – I feel like I'm getting
ahead of myself,
but like –
I don't know.
Yeah.
Like, okay, we'll just jump into that
part, because again, this is the story.
Well, okay, hold on. There is one more
thing I want to point out before I jump
into the analysis portion. They say
that... the vendors that provide tools
that make this possible... oh no, it's a
little bit further up.
Yeah, a first-of-its-kind audit of five
hundred labor management artificial
intelligence companies found that
employers in healthcare, customer
service, logistics, and retail are
customers of vendors whose tools are
designed to enable this practice.
The report does not claim that all
employers using these systems engage in
wage surveillance.
Instead,
it warns that the growing use of
algorithmic tools to analyze workers'
personal data can enable these pay
practices.
And I skipped over it earlier,
but when they talk about, like,
high credit card balance and stuff,
they can also scrape your social media to
see if you are more likely to join
a union or could become pregnant.
And...
I guess the last thing I'll say is
I love down here.
They were talking about how there are
laws now that are trying to outlaw
surveillance pricing, but a lot of them
have not caught up to surveillance
wages, except for one: Colorado is
trying to pass the Prohibit Surveillance
Data to Set Prices and Wages Act, which
would ban companies from using intimate
personal data.
It carves out performance-based wages,
which, I mean, on the surface sounds
fine. I like that.
Here it is.
The bill would prohibit companies from
using workers' personal data without their
consent to determine what they're paid.
So, I mean, I read that "without their
consent" and I'm like, yeah, of course,
that's going to be buried on page fifty
of the contract, right?
That's ridiculous.
So, yeah, I don't know.
Anyway,
so getting back to the analysis portion of
this, this is really frustrating.
I had the privilege of interviewing
someone recently that we'll talk about a
little bit more later coming up here.
She spoke about how a lot of this
surveillance used to set prices and set
wages like this, it becomes
deterministic, right?
Because here's what I can see happening:
you go up to your boss and you say,
I think I deserve more. Or even
negotiating a new job, right? Because
I've done that, where I get a job and
they say, we'll pay you this much, and I
say, I think I deserve more. And I've
successfully done that.
And what happens when they come back and
say, well, we can't? You guys may or may
not know this: if you rent, sometimes
you can negotiate your rent. I've done
that. I've gone to the leasing office
and been like, I don't want to pay more,
let's see if we can come to an
agreement, and they'll walk it down a
little bit.
And then I've been to other places where
they're like, no,
that's out of our control.
We can't do anything.
And my fear is, as this continues to
grow, we're going to see more of that
second thing, where people are like, no,
we can't do anything. It's just set at
what it is, because that's what the
algorithm says, and I don't have the
authority to push back on the algorithm.
Which is demeaning to your employees.
But it's also deterministic. And it sets
us in this environment where we have no
freedom, really, almost like a lack of
free will.
We have no growth, because now it
doesn't matter how hard you work. It
doesn't matter how good you do. You'll
forever hit a cap, right?
And sure, those things will matter.
Those things will help.
But, you know,
if you've got the high credit card debt,
if you've got pro-union views on social
media,
now you can either choose to not talk
about that, or you can choose to just
forever not reach your full earning
potential, which then keeps you trapped
in this cycle.
And it's just, oh my God,
this is so predatory.
And yeah, I don't know.
I feel like I kind of rambled a
little bit on that one,
but hopefully I kind of said something
coherent.
Yeah.
I mean,
I think one important thing to add to
this is, when it comes to employment and
how much people are paid, this is
already an issue without the
surveillance stuff, right?
Like, we have studies where people apply
with the same resume but change the name
from a female name to a male name, and
they get more interviews under the male
name than the female name.
I think this is just increasing the
level of discrimination people are going
to find themselves in, right? Like, oh,
your name appears like this. And we
already kind of know that these AI
systems are incredibly biased against
people of color and marginalized groups
that are less represented, because these
systems are trained on data that doesn't
have them as the majority. So it's going
to kind of deprioritize their skills and
their experience, right?
So I think this is sort of an
additional layer to that discrimination.
I think someone said here, Plants McGee
said, "giggles in European," where this
is illegal.
Yeah,
this is illegal in a lot of the
world, actually.
I'm kind of surprised that this isn't
illegal in the US,
but I guess that is the state of
things.
You know what, though?
I don't mean to cut you off,
but I'm glad you mentioned that because I
did want to mention that.
I don't want to get after anybody here,
but I just want to point out,
in my personal opinion,
I think that's a dangerous attitude to
have.
We're like, haha,
that wouldn't happen here.
Just, what was it, late last year, early
this year? The EU was talking about
rolling back parts of GDPR to be more
competitive in the AI industry.
So, yes, laws help. Laws are good.
I'm glad you guys have that, but I just
really feel the need to point that out:
keep an eye on this stuff, because laws
can change. And I am under no delusion
that European politicians care more
about their citizens than the US ones.
They just put on a better facade about
it, in my opinion. But yeah, just keep
that in mind: laws can change and that
stuff can go away. We've got to make
sure that we're constantly fighting for
our privacy rights, not taking them for
granted.
So yeah.
Yeah, exactly.
I mean, I don't know.
This story is kind of... I don't know.
I think, yeah, there are laws in places
like the European Union where access to
personal information for employment
purposes is protected and not able to be
run through an algorithm or whatever.
And I know in Australia, technically,
that's classified as discrimination.
So it depends on the country.
But like Nate said,
I think it's important.
We can't just say, oh,
it's just the US being the US.
I think we should be constantly vigilant
of governments that are trying to do this
stuff.
Yeah, so it is a good point that AI is
just fancy autocorrect, and picking the
most likely responses will inherently
lead to a tyranny of the majority rather
than a fair system.
Exactly.
So it kind of has that effect, right?
I think it basically just mirrors
reality in a lot of cases, and the
reality is people get discriminated
against and are paid less depending on
their identity. We've done studies on
this; we know this is the case. It's
something that we're trying to fight
against and stop, but it's not something
we've completely solved.
And yeah, we should try to make sure
people are aware of this. Some of the
stupid stuff that I've seen is a lot of
companies are using AI to scan people's
resumes when they apply, and it will
check for keywords and stuff, and people
were just putting invisible keywords on
their resumes to make it detect them.
This is incredibly silly stuff. I can't
believe I have to say this, but we need
to go back to humans reading resumes and
interviewing people, and not feeding
their information through an AI system.
That's also just terrible for the
person's privacy. Like, I don't want
everyone to know my employment history,
or where I've worked, or what schools
I've been to. And that's just another
thing that we're feeding the AI systems.
Really, we're putting all this
information through these massive AI
companies with basically no corporate
controls, especially in the US. I feel
like it's very lax at the moment,
because the entire economy is basically
propped up by the AI data center
industry. It does seem like that is
starting to fall down a little bit now,
with a lot of data centers being
canceled, but I definitely think the AI
hype is propping up a lot of stuff. And
it kind of means that in a lot of cases
this behavior is allowed when it
shouldn't be.
So, yeah.
That's kind of my thoughts on that.
Did you have anything more you wanted to
add, Nate?
No.
I think just, yeah, it's such a...
it doesn't matter where you are in terms
of your economic beliefs. Even if you're
a free market person, there's always
going to be someone who can do the work
for less, and when it's a race to the
bottom like this, everyone loses.
I mean, just look at airlines, right?
Even Southwest is now doing away with
their first-come-first-serve seating and
free checked bags, because it's becoming
such a competitive market. It's such a
race to the bottom.
One of the headers here, that I think I
may have scrolled past, said "judging
our desperation rate," which, again, is
just so predatory.
Surge pricing is one thing, right?
Because surge pricing looks at the
entire market and says, using Uber as an
example: a concert just ended, there are
five thousand people in this one spot
trying to get home, that's probably too
many, so we're going to charge more.
But this is looking at an individual
person. This is looking at you
specifically and saying: I know that
you've sent out five hundred resumes
this week.
I don't know how you did that.
You probably used a bot,
which I wouldn't blame you.
You sent out five hundred resumes this
week.
You've gotten two callbacks and you've got
a thousand dollars left in your savings
account.
You will literally do anything, and I'm
going to give you the bare minimum that
I can to make you say yes, while your
other coworkers are making more than you
do.
Personal opinion,
I've always thought it was ridiculous that
you're not supposed to talk about pay at
work.
No,
I was always happy at my last job
to tell people how much I was making
because I wanted everyone else to know.
I think I've mentioned this: at my last
job, I was the highest paid person in our
job title, just by sheer coincidence and
luck. I don't know how I got that, and I
was very open about it, not because I
wanted to brag to everybody else, but
because I was like, you guys, I don't
think I'm better than everyone else. You
guys deserve to be getting paid more too.
And I would tell people that kind of
stuff all the time.
Jonah says five hundred resumes in a
week sounds like fighting AI with AI. Hey
man, you know, that's the situation we're
in, right? It's AI writing emails, AI
reading emails, AI responding to emails.
But I don't know.
It's yeah,
I'm kind of going off on a rant,
but this just makes me so mad because
it's so, I keep using the word predatory,
but it removes, it removes everything.
It removes the hard work.
It removes the whole like, and again,
it doesn't matter what your beliefs are.
Even if you're like a
pull-yourself-up-by-your-bootstraps person,
you can't anymore because it removes that
possibility because they know exactly how
much you need and they will never give
you a penny more.
It's just, it just frustrates me.
So sorry,
I feel really passionately about this
subject.
Yeah,
I think it goes without saying people
should be paid a dignified amount.
And yeah, I agree totally.
Like, I think it is important to me.
There's a weird culture around not sharing
how much money you make.
I'm not really sure what the reasoning is
behind that,
but I think it is important to be
open about that, especially because,
you know, your employer, well, I mean,
not every employer,
but the people that are using this software,
they are not holding back.
They are doing everything in their power
to pay you the least amount.
So the least you can do is discuss
this with your coworkers, um, unionize,
do all those sorts of things.
Right.
Like, I dunno, maybe that's too much,
but I think it is important.
Another thing: if your employer is doing
this sort of stuff to employ people,
name and shame. Name and shame,
like seriously.
That is the sort of thing that
people go on strike for.
So I think if there's companies that are
doing this,
definitely try and get them to stop
because like Nate said,
it's discriminatory.
It's like, you know,
it's removing people's power to control
things.
And yeah,
if you think it's not already happening,
definitely go read this article because
they lay out several scenarios they looked
at where it's like, this is happening,
like not could happen.
Like this is happening in, they mentioned,
what is it?
Staffing, gig nurses, again, gig workers,
DoorDash, Uber,
like it's already happening there.
There's no reason it's going to stop.
And God,
twice now I've had something pop into my
head and then I lost it.
I hate when that happens, but.
Yeah, it's...
This is one of those moments where laws...
I mean...
I know laws are controversial.
Like people are always afraid of
over-regulation and afraid of like,
you know, Oh,
companies break laws all the time,
but like,
what else can we do about this?
There is no, I mean, yes,
we can all take our privacy seriously and
we should,
regardless of whether or not this is
happening, but there's really like,
I don't see any way to fix this
other than just straight up outlawing it.
Like we were talking about earlier,
this is illegal in a lot of countries.
It should be illegal here in the U.S.
It should be very illegal.
Everyone should be mad and sending this to
their politicians and being like,
we need to outlaw this before it becomes
a regular practice.
Cause yeah,
Oh,
I remember what I was going to say.
Because yeah, on that note,
you can't tell me this is a problem
that the free market is just going to
fix.
Because look at Amazon.
Everyone knows Amazon,
especially the Amazon brand,
is usually cheap garbage.
And you can't tell me that Amazon became
the behemoth they are today by putting out
the best product.
They did it by undercutting everyone else,
by knocking off everyone else,
by using manipulative algorithms to
prioritize their crap first.
I guarantee you,
Amazon doesn't pay for that ad slot at
the beginning.
It's just, this is not...
Yeah, this is a complicated thing to fix.
And it's not just going to fix itself.
That's what I'm getting at.
But anyways,
I think we beat that to death unless
you have something else to add there.
Okay, so in a minute,
we're gonna talk about how the FBI was
able to recover signal messages, sort of,
from a locally stored database on a phone.
But first,
we're gonna talk about some updates about
what we've been working on at Privacy
Guides this week.
And the first thing is,
we have a new interview coming out on
Sunday.
If you guys have not seen that yet,
it's in the newsletter.
Go check out privacyguides.org slash
livestreams.
It should be there right now.
That's live streams with an S on the
end, just by the way.
And we have an interview with the one
and only executive director of the EFF,
Cindy Cohn, coming out on Sunday.
I'm super excited for it.
I was the one who got to do
the interview and I'm still excited about
it.
It was really cool.
She was awesome.
And I think it was a really good
interview.
I tried to make sure it was applicable
to everyone.
So we talked about how to stay motivated
in the fight for privacy.
We talked about how to build a good
community.
We talked about what she learned in her
time fighting with the government and her
kind of insights on that.
So really excited for that.
Make sure you're subscribed on YouTube,
on PeerTube,
because we'll be posting it there as well.
PeerTube, of course,
does not have the little premiere feature,
but obviously we do post everything on
PeerTube as well.
So
Make sure to check that out.
And just to hype you guys up for
it a little bit, on the nineteenth,
we're also going to be interviewing
Carissa Véliz about her upcoming book,
which is coincidentally about AI and all
this stuff we just talked about.
And a lot of the stuff I got,
you know,
a lot of the stuff I was saying
about how this makes it deterministic and
it removes meritocracy.
Like,
these are all things that she talks about
in her book and in her interview.
So...
Yeah.
And then my last thought is that we
are working on a video coming up soon
that many of you have requested.
And that's all I'm going to say to
kind of build a little bit of hype
for that.
So that is what is going on on
the video front here.
And I'm going to turn it over now
to Jordan to let me know what I
may have missed.
No, didn't miss anything.
But we do have some other things that
we've been working on more on the site
update section.
So there weren't any site updates this
week, but Freya has put out another
article here.
It's about OKCupid settling after selling
three million photos to a facial
recognition company.
Oh,
now that's probably not what you want to
hear about your dating app.
But yeah, if that sounds interesting,
you can check that out.
Go to privacyguides.org slash news to
check it out.
And also, our activism lead, Em,
has been working on a section for
the website.
If you haven't caught it already,
the Privacy Activist Toolbox has
been released.
So you can visit that by going to
privacyguides.org slash activism.
There's the Privacy Activist Toolbox,
which came out a couple of weeks ago.
And that has a lot of tips about
how to
be an effective privacy activist.
There's a lot of great tips in there,
but she's also released this new pull
request here on GitHub.
Basically, it's for a DPA directory.
So there are data protection authorities,
and those are basically the organizations
that you need to contact to lodge a
complaint.
So this pull request has got basically all
the regions in the world.
Well, I mean, I'm not sure if...
Yeah,
basically every region that you would
think.
I'm sure there could be some that we're
missing,
but I think Em has done a really
good job here and has covered, I think,
basically all of them.
But there could be small ones that we
didn't find.
So if there is any of those,
I guess you could take a look at
the pull request and suggest adding those.
But so far, what we've got is Africa,
Asia, Europe, North America, Oceania,
and South America.
So basically it will list the privacy law
in particular, the abbreviation,
the data protection authority.
So you can click on that.
And there's also a contact page link and
a complaint link.
So you can basically get directly to the
page that has the complaint form.
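An entry in a directory like that can be sketched as a small record. To be clear, the field names and example URLs below are my own illustration, not the actual schema in Em's pull request:

```python
from dataclasses import dataclass

# Illustrative sketch of one DPA directory entry. Field names and the
# example URLs are assumptions for illustration only; see the actual
# pull request for the real structure.
@dataclass
class DpaEntry:
    region: str          # e.g. "Europe"
    privacy_law: str     # full name of the applicable privacy law
    abbreviation: str    # e.g. "GDPR"
    authority: str       # the data protection authority to contact
    contact_url: str     # link to the DPA's contact page
    complaint_url: str   # direct link to the complaint form

ireland = DpaEntry(
    region="Europe",
    privacy_law="General Data Protection Regulation",
    abbreviation="GDPR",
    authority="Data Protection Commission (Ireland)",
    contact_url="https://example.org/contact",      # placeholder URL
    complaint_url="https://example.org/complaint",  # placeholder URL
)

# The directory's goal: take a reader straight to the complaint form.
print(f"Lodge a {ireland.abbreviation} complaint: {ireland.complaint_url}")
```

The point of the structure is that last line: each region's entry links the law, the authority, and the complaint form in one place.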
So basically what we're trying to do with
this is make it as easy as possible
for people to make a complaint against a
company, against the government, because,
you know,
this is kind of important to
utilize.
Because if you don't use your privacy
rights, well,
you're not going to have privacy.
So if there's companies that are misusing
your data,
or if you want to get something deleted,
I think using this DPA directory is going
to be really helpful.
So definitely stay tuned for that.
I know Jonah said he was taking a
look at the pull request.
So I'm sure it'll be released in the
next couple of
weeks. It does look really nice.
Definitely check out the pull request on
GitHub, there's a preview there, but it's
really well put together, so I definitely
recommend checking that out. It has, you
know, all the regions that you would
expect, but if there's any regions that
we missed or that we need to add still,
there's just so many countries on Earth,
I'm sure we might have missed one or two.
So if there's anything that you would
recommend
adding to that,
if you're from one of those countries,
definitely do reach out and let us know.
It's kind of going to be a community
project, I guess.
There's a couple of countries now that
are sort of in the process of putting
together a data protection authority,
which is really good,
like Egypt and Mexico.
So definitely keep an eye on that as well.
There's definitely a lot of important
information there,
but also it's sort of a project here
where we are trying to get community input
as well,
because we try and represent every country
here,
but I'm sure there's things that change or
if there's countries that are establishing
data protection authorities,
then that's a really positive step for
people in those countries as well.
Yep,
so that's basically everything that I've
got to talk about here.
There wasn't any articles this week,
so kind of a light week on that
side.
But yeah,
I guess we can hop right into our
next story here.
Oh, actually, before we do that,
all of this is made possible by our
supporters,
and you can sign up for a membership
or donate at privacyguides.org or pick up
some swag at shop.privacyguides.org.
I recently made another purchase on the
shop.
There's some really cool new merch that we
released for the activism section,
so definitely check that out.
and have a look if you haven't visited
it in a while.
There's some new stuff on there now.
Privacy Guides is a nonprofit which
researches and shares privacy-related
information and facilitates a community on
our forum and matrix where people can ask
questions and get advice about staying
private online and preserving their
digital rights.
Now,
let's talk about how Little Snitch is
coming to Linux.
So kind of a big announcement from Little
Snitch,
which has historically been a macOS-only
app.
They've now announced that there is a
version available for Linux.
So this is kind of reading a little
bit from their blog post here announcing
it.
I guess press release.
Recent political events have pushed
governments and organizations to seriously
question their dependence on foreign
controlled software.
The core issue is simple and
uncomfortable.
Through automatic updates,
a vendor can run any code with any
privileges on your machine at any time.
Most people know this,
but prefer not to think about it.
Linux is the obvious candidate for
reducing that dependency.
No single company controls it.
No single country owns it.
So I decided to explore it myself.
And basically the article goes on to say
that this person was trying to find an
alternative to little snitch.
They tried OpenSnitch.
which has several command line tools and
stuff like that.
But basically,
it doesn't have the same ability to show
which process is making connections,
which is basically the way that it works
on macOS.
Like any process on macOS,
you're able to see the connections and
block them if you don't want them to
be made.
As far as I'm aware,
OpenSnitch is a little bit more limited.
It kind of only does like application
level.
I'm not a hundred percent sure, because
it's been quite a few years since I've
used OpenSnitch.
And it did seem like it was a
bit more complex to use; the interface
wasn't particularly easy.
So this person at Objective Development
has created a Linux version of Little
Snitch. Now, let's kind of clear the air
on how this works. There's another app
like Little Snitch on macOS, but I can't
remember what it's called.
There is another app, it's called LuLu,
and it's by another company, a nonprofit.
So definitely, yeah,
LuLu is the other one you're thinking
about.
But yeah,
Little Snitch has previously been paid
software, so it's actually kind of
surprising that this is free and open
source.
It's licensed under the GPL.
So it could be quite cool to see
package managers just adding this by
default,
like just adding this as a package.
But basically, the way that this works
is it's a browser app, kind of.
It's like a web app, basically.
And the reason why they decided to go
with this is that it can work on
a server, right?
This is something that you just run as
a system process, and that allows you
to access the connections that the
server is making through that nice
interface.
So that's a benefit of it being in
the browser, right?
You can access that for a remote computer,
which is extremely useful.
I'm not really sure of many solutions that
do this sort of thing,
especially that easily.
You just install a package and it's
instantly monitoring.
But kind of scrolling down here,
this is basically based on a kernel
component written in eBPF.
And that's an open source component
which is available.
So just to be clear,
the UI and the eBPF filtering component
are free and open source, but the
backend, which manages rules, block
lists, and a hierarchical connection
view, is free to use but not open source.
The reasoning behind that is that that
part is kind of proprietary to Objective
Development.
They've been working on that for like,
twenty years to perfect it.
So they argue that that should be
kept closed.
And they made an important note here.
Unlike the macOS version,
Little Snitch for Linux is not a security
tool.
eBPF provides limited resources,
so it's always possible to get around the
firewall, for instance,
by flooding tables.
Its focus is privacy,
showing you what's going on and,
where needed,
blocking connections from legitimate
software that isn't actively trying to
evade it.
So if you install malware,
Little Snitch is not going to protect you
from the connections getting out, right?
And at least right now,
there are some limitations.
I did see some people having issues with
Fedora workstation working correctly.
They do note that on the page,
on the download page.
I tested it on Debian and it was
working perfectly fine for me.
You just install the package.
Basically, it only works on Linux kernel
6.12 and above.
The reasoning behind this is that older
kernels currently have a stricter eBPF
verifier maximum instruction limit.
So they kind of have to backport this
fix.
Hopefully they can kind of get in contact
with the Linux kernel developers and do
that.
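If you want a quick sanity check before installing, you can compare your running kernel against that minimum. This is just a sketch based on the 6.12 cutoff mentioned above; treat the vendor's download page as the authoritative requirement:

```python
import platform

# Minimum kernel version stated in the episode; verify on the
# Little Snitch for Linux download page before relying on it.
MIN_KERNEL = (6, 12)

def kernel_supported(release=None):
    """Return True if a kernel release string meets the 6.12 minimum."""
    release = release or platform.release()  # e.g. "6.12.9-amd64"
    version = release.split("-")[0]          # drop the distro suffix
    major, minor = (int(x) for x in version.split(".")[:2])
    return (major, minor) >= MIN_KERNEL

print(kernel_supported("6.12.0-generic"))  # True: meets the minimum
print(kernel_supported("6.1.0-amd64"))     # False: 6.1 is below 6.12
```

Note the tuple comparison: (6, 1) is less than (6, 12), which a naive string comparison of "6.1" versus "6.12" would get wrong.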
So that is an interesting thing too.
But I think this is kind of a
pretty, it's a pretty basic app so far.
Like it allows you to enable block lists
and see the connections that your computer
is making.
But I think that's really all you really
need at this point.
I think just being able to see the
connections itself,
because a lot of times you'll be using
software and you won't realize that it's
making connections to like ads and stuff
like that.
especially if you're using software that
is genuine, normal software, but it's a
proprietary app that, you know, might
have some data tracking built in, like
Discord or any other of those types of
apps. I think this is an important tool
to have on Linux, especially because
people on Linux still need to be able
to see the connections that are being
made. And, you know, there's still
privacy-invasive stuff on Linux.
You can install
Facebook Messenger on Linux,
you can install Discord on Linux,
you can install Steam,
like all these apps are not great for
your privacy,
but being able to see some of those
connections I think is pretty important.
But yeah,
we've got this little poll up on the
screen about whether you'd use Little
Snitch. You can type one, two, or three
in chat to respond and it'll pop up on
the screen.
But Nate,
did you have any thoughts on this one?
No,
I think you kind of answered the
question I was going to ask,
which is, you know,
Linux is known for being more private.
So my first thought is kind of like,
is there a use case for this?
Why would people want to use this?
And you made a really good point.
You know, this is tangentially related,
but a common question is like, how do I
get people to switch to XYZ, Signal,
Linux, whatever? And one of the things
that we, you know, myself and Jonah and
some of us, always say is you have a
lot more luck by focusing on the
features. So one thing I like to point
out: I'm trying to get my sister to
switch to Linux, because she was on
Windows ten and it's, you know, not
getting updates anymore.
I'm going to do that with her next
time I see her in person.
And one of the things I'm going to
try to convince her is, you know,
like everything you do on a windows
computer,
you can more or less do on Linux,
especially for her.
Assuming she's not using any special
software for her job, you know, you can
browse the internet, you can download
Discord. Gaming on Linux is doing really
well from what I hear, actually. For the
most part, even some video editing, you
know. The Linux computer I use is Qubes,
so that's a lost cause, but on something
like Fedora, you can run DaVinci Resolve.
Point being, Linux itself is relatively
private, but especially once you start
adding on a lot of these features that
people might use... Like if somebody
makes the switch to Linux and at first
they start using, say, Microsoft Office,
God forbid, or something. I don't even
know if that's Linux compatible, but you
know what I mean? They start off, and
then after a while they're just like,
yeah, you know, maybe I'll check out
LibreOffice or something. And, you know,
it's just helpful to have that ability
to control things.
It may also let you know,
like if you fire it up and you're
like, oh my God,
this thing is pinging like twenty
different servers ten times a day.
Like, hold on,
let me take a closer look at this
thing.
So, yeah, it's pretty cool.
And I... I don't know.
I think that's really cool that he made
this...
I guess the selfish side of me would
like to see this come to something like
Windows, even though I know we already
have things like Portmaster and, what is
it, simplewall.
But it's not quite one-to-one.
So I think anytime we have more options
is always good in my book.
So I think that's pretty cool.
Yeah,
I think I'm also kind of biased on
this because I use their software.
I use Little Snitch on Mac.
I really think it is exceptional software.
People say to use Lulu because it's free,
but it really does not do the same
thing as Little Snitch.
Little Snitch has a lot of benefits over
that.
So it's great to see them kind of
expanding because...
people have kind of been complaining
about Little Snitch. They're like, oh,
I wish it was on Windows, I wish it was
on Linux. You know, different platforms
I think is good.
I mean, I'd like to see it be on Windows
too, but, you know, I feel like when we
talk about these filtering softwares,
it's extremely specific to the platform.
Like in this case, it was using eBPF,
but on Windows, I'm sure they've got
some whole other system, right? So
making that compatible with Little
Snitch is probably a lot of work, and
they kind of have to port the entire
thing over.
which is kind of a pain, but I think
it's good to see that Little Snitch is
expanding to other platforms, and it's
free, which I think is very generous.
Yeah, I agree.
I fully recognize that it's not an easy
thing to go from platform to platform.
And I mean, even Linux, right?
You were saying some people on Fedora are
having some issues getting it up and
running.
And hopefully since it is open source like
that,
hopefully people can do what they need to
do to get it up and running.
But yeah, it's certainly no small task.
And I think that is really cool.
And actually, yeah,
I could never remember because I'm not a
Mac user.
I could never remember if Little Snitch or
Lulu was the one that was free.
And I didn't realize Little Snitch was the
paid one.
And I think that's really cool that
this is free.
So, bummer that it's not fully open
source, but I understand the logic of
like, you know, we've been doing this
for more than twenty years, and the
algorithms and concepts are something
we'd like to keep closed for the time
being. Like, I get it.
So.
Yeah, I think also here,
let's quickly cover this question we got
from Cass K. So they asked,
any tips you guys have for people who
want to start making YouTube content
related to privacy?
All right, Nate, what do you got here?
Yeah,
so I just want to mention that I
have my own website called The New Oil,
which is...
It's supposed to be like a very, very,
very beginner level to privacy stuff.
And my hope is that when people finish
reading that,
they'll move on to other resources like
privacy guides.
But over there,
I do actually have a quick start guide
on the front page for content creators.
And the reason I'm referring you over
there is because there is a lot of
different stuff there.
But it kind of goes over things like...
it's really, I mean,
as with everything in privacy, right?
It's like, what, what do you want?
What are your, your priorities and stuff?
So for example, um,
if you're going to be a Twitch streamer,
you could totally just use your handle,
right?
You know, I mean like Markiplier that,
that dude's obviously not his real name.
It's derived from his real name, I think.
But, um,
A lot of YouTubers and stuff,
they're known by their handles.
But then if you're going to be a
public figure, like a politician,
a lot of them go by nicknames.
Like Ted Cruz, his first name is Rafael.
So things like that.
It's just to keep in mind,
what are you going for will determine a
lot of those things.
But I'm a big fan of things like
using pseudonyms wherever possible.
So again, handles, fake names.
being mindful of your online presence.
You don't have to plant your flag on
every single website, but it may not hurt.
And sometimes you may not need certain
websites.
Like I remember way,
way back in the day,
there's a band I follow, uh,
Oh Sleeper actually.
Um, they only ever had a Twitter account.
They never signed up for Facebook.
They never signed up for Instagram.
Like you could only follow them on
Twitter,
which was slightly annoying as someone who
didn't use Twitter at the time,
but you know,
like that's what they wanted to stick to.
And
Again, likewise,
most bands probably don't need Twitch
unless they're going to do live streaming
nowadays.
And maybe a lot of Twitch streamers don't
need Twitter.
So just kind of asking those questions,
but
Also, a lot of the technical tools that
we recommend at Privacy Guides as well:
things like email aliasing, password
managers, basically securing your online
accounts.
Because I've also, again, I've seen bands,
their Facebook gets hacked and all of a
sudden they're spamming out like,
twenty percent off Ray Bans or whatever at
this sketchy link.
And again,
we just talked about this earlier,
Facebook does not care.
Unless you're Taylor Swift,
they don't care.
Sucks to suck, scrub.
We're not going to help you get your
account back.
I actually read one earlier today,
an article earlier today about how
Discord's support system is still the
dumbest thing that has ever been designed
by a human being.
I'm not sure a human being designed it.
It's so dumb.
But anyways, yeah.
So like just, I don't know.
I think I would check that out and
definitely bounce any of my
recommendations off privacy guides because
I will admit privacy guides has much
stricter,
I don't want to say vetting process,
criteria for a lot of their
recommendations.
So I mean,
I don't know.
Yeah, I would check both of those out.
I think they're really good resources
that'll get you started, hopefully.
Yeah, I just want to add as well,
I think there's some parts that kind of
go... I haven't read your streamer guide,
so maybe I'm just repeating what's already
on the page.
But I think stuff that's kind of important
to establish early on,
are you going to show your face?
I mean, that is kind of important, right?
I think...
You can kind of see what I'm doing
here.
Like,
I don't really want to show my face.
I'm sure someone could find what I look
like,
but it's just a layer of privacy on
that aspect.
People aren't going to recognize you in
the street or whatever,
or people aren't going to be able to
immediately know who you are.
So that is a benefit too.
And I think also thinking about things
that you share,
being very mindful of things that you
share, because, you know,
you take a picture of the room that
you're in,
someone could analyze the texture of the
roof or something,
find rental property stuff and analyze and
work out where you're living or something
like that.
You know, people are pretty creepy.
So I think being aware of
you know, what you're sharing,
how it can be used to find you
being very deliberate about posting stuff.
But I think it's definitely a personal
preference whether you want to show your
face or even show anything about you.
You can certainly be
faceless.
There's plenty of channels that do that
and pretty successfully, I'd say.
So, you know,
I think it's definitely worth thinking
about at least.
But I hope those tips were somewhat
helpful.
We try and answer people's questions in
the chat.
There was another question here from
Plants McGee,
which I'm not really sure about what it
means.
So what's the deal with Twitter, Cindy?
They just left.
So what is this referring to exactly?
They're referring to how the EFF just
left Twitter.
Real quick, I do want to say,
I didn't mention the face thing and I
really should add that because that's a
really good point.
I've been just watching random YouTube
videos lately about dinosaurs and space
because I will forever be five years old
at heart.
And yeah, a lot of those...
So it's not just a privacy thing.
A lot of those channels are faceless too.
And I'm actually sitting here thinking,
I'm like, man,
maybe I should do more faceless videos
because I bet they can pump those out
real quick.
But yeah, going back to the question,
the EFF left Twitter.
I...
I have personal opinions,
but all I'm going to say is go
check their blog post.
They laid it out in very plain numbers.
You can tell I agree with their decision.
But basically they said that they're
just not reaching people, and they only
have so many resources, and they've
decided their resources are better spent
elsewhere.
So you can disagree with them.
That's fine.
Free country for now.
You're welcome to do that.
But that's their logic.
So –
I think I'm just going to share my
thoughts on this.
Obviously, these are my thoughts, not
related to Privacy Guides as an
organization,
but I think the platform itself has kind
of become pretty toxic.
I think a lot of people are complaining
about it kind of becoming a bit of
a,
an echo chamber for like conservative
voices and stuff.
I think that's not great.
The platform's definitely changed, and
in a lot of ways it's become worse.
I think we're seeing more and more people
leaving because, you know,
it is kind of a platform that allows
in, in a lot of countries,
I would say hate speech,
maybe not in the US because the laws
there are a little bit looser, but yeah,
I can kind of understand not wanting to
be on a platform like that.
And I think, you know,
In our case,
Privacy Guides is still on Twitter posting
stuff.
I think we are getting some traction.
So maybe our strategy is different to that
of the EFF,
but we're still getting quite a lot of
traction with that.
I think it's important to reach people no
matter what platform they're on.
So, you know, in a lot of cases,
we're going to be on all these crappy
platforms.
Doesn't mean we support the platform.
We still want to make people move to
better platforms like Mastodon; we
recommend different ones that people
should move to instead.
But I think you have to meet people
where they are.
And if we just stopped posting on all
these platforms,
we wouldn't be reaching as many people and
converting them to believing different
things,
like that X is a bad platform and
invades your privacy.
Same thing Jonah said here.
It doesn't make a ton of sense to
me to leave X,
but not Facebook or TikTok,
but shrugging emoji.
I think it's definitely like a personal
choice.
If the analytics said that they weren't
reaching people, I mean, it doesn't
really make that much sense in my
opinion, because we have all these
multi-posting tools. Like, for instance,
our team here, Nate, me, and Jonah, a
lot of our posting is through Buffer.
And all it is is just ticking another
box to send it to another platform.
We're not specifically creating anything
for a specific platform.
So, I mean, if those platforms...
I mean, we can debate all day,
like how bad X is as a platform.
We can debate all day how bad Facebook
is and TikTok, but I still think,
you know,
we post on all those platforms because we
wanna be able to reach
these people. Because, you know,
everyone deserves privacy, not just
everyone on Mastodon. And especially
because these people are probably less
aware of the issue; that's why they're
on those platforms in the first place.
So, from like an ideological
perspective, if you really don't like
being on a platform that kind of
amplifies conservative voices, I can
kind of understand why you may not want
to be on a platform like that, where
you get harassed. But
from an organizational perspective,
I'm not quite sure if I agree with this,
because it is kind of easy to
cross-post. I can respect the decision,
but I'm not sure if I agree with it
particularly.
But yeah, that's kind of my thoughts.
Okay.
I don't really have much to add.
I think that was a good point about
buffer, but I don't know.
I don't, I don't have a,
what do they say?
I don't have a dog in this fight.
So it's kind of a messed up saying
now that I think about it,
don't fight dogs.
On that note,
I think we're going to move into a
story about the FBI extracting a suspect's
deleted signal messages saved in the
iPhone notification database.
So I'm not going to scroll on this
one too much because this is actually a
paid post.
But this comes from 404 Media.
Highly recommend.
It's totally worth it, in my opinion.
They do great reporting.
But Jonah did say in the chat that
he felt like this is a little bit
of a nothing burger.
And I...
I halfway agree.
Um, cause you know, it's the,
the headline kind of says it all,
but I think it's worth talking about
because it kind of points out,
they said further down in the article that
this, like,
this just kind of amplifies how difficult
it can be.
Actually,
let me see if I can find it here,
but they basically said it just points out
how difficult it can be
to think about every possible angle of
your OPSEC,
especially when it really matters this
much, you know?
So, last year in June,
we found out, and I mean,
the more technical people already know
this kind of stuff, but remember,
not everybody is super technical.
We found out last year that because push
notifications are usually not encrypted,
Apple and Google can see them,
which probably was not a shocker to most
people.
But also, basically,
the way that notifications are registered
to make sure they get to the right device,
it's another way that police can get your
data, right?
Police can go to Apple and Google and
subpoena your data.
And so we're really big fans of services
like Tuta, for example,
which does not rely on Google for push
notifications on Android.
Signal, I think,
also has their own implementation.
Proton does rely on Google,
but they encrypt it.
So there's not really anything useful
there, although there is still metadata,
which is worth noting.
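That encrypted-payload approach can be sketched roughly like this: the app's server encrypts the notification body end-to-end, so the push relay (Google or Apple) only carries an opaque blob plus routing metadata. Here's a toy sketch in Python — the XOR-keystream "cipher" is a deliberately simplified stand-in for a real AEAD like AES-GCM, and all names (`device_token`, the field layout, etc.) are made up for illustration, not any real app's protocol:

```python
import hashlib
import hmac
import json
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream from SHA-256 in counter mode -- illustration only;
    # real apps use a vetted AEAD such as AES-GCM or XChaCha20-Poly1305.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_payload(key: bytes, plaintext: bytes) -> dict:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).hexdigest()
    return {"nonce": nonce.hex(), "ciphertext": ct.hex(), "tag": tag}

def decrypt_payload(key: bytes, envelope: dict) -> bytes:
    nonce = bytes.fromhex(envelope["nonce"])
    ct = bytes.fromhex(envelope["ciphertext"])
    # Verify integrity before decrypting.
    assert hmac.compare_digest(
        envelope["tag"], hmac.new(key, nonce + ct, hashlib.sha256).hexdigest())
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Shared key known only to the app's server and the app on the device.
key = secrets.token_bytes(32)
message = b"New message from Alice: see you at 7"

# What the push relay (Google/Apple) actually carries: routing metadata
# plus an opaque blob -- no plaintext content.
push_envelope = {"device_token": "abc123", **encrypt_payload(key, message)}
relay_view = json.dumps(push_envelope)
assert b"Alice" not in relay_view.encode()

# On-device, the app decrypts and renders the local notification.
assert decrypt_payload(key, push_envelope) == message
```

The metadata point from the story still holds in this sketch: the relay can't read "Alice," but it still sees which device got a push, when, and roughly how big it was.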
But...
And now we're learning it's still even
more complicated than that, right?
Because basically what happened is they
arrested somebody,
and they ran Cellebrite,
or whichever device it was,
the forensic tools,
on the person's phone,
which was an iPhone in this case.
And this person had already deleted
Signal.
I believe they even had disappearing
messages enabled,
but don't quote me on that.
I feel like I read that in this
post somewhere.
And the police were still able to extract
some of the messages because the
notification history was stored on the
device.
And, um,
Yeah,
I did not see that coming personally.
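As I understand the mechanics being described, the key point is that there are two independent stores: the messenger's own message database, and a separate OS-level record of notifications it has displayed — and deleting the app's data doesn't touch the OS's copy. A minimal sketch of that idea with Python's sqlite3; the table and column names are invented for illustration, not Apple's actual notification schema:

```python
import sqlite3

# Two independent stores: the messaging app's own database, and a
# separate OS-level log of notifications it has displayed.
app_db = sqlite3.connect(":memory:")
os_db = sqlite3.connect(":memory:")

app_db.execute("CREATE TABLE messages (sender TEXT, body TEXT)")
os_db.execute("CREATE TABLE notification_log (app TEXT, title TEXT, body TEXT)")

def receive_message(sender: str, body: str) -> None:
    # The app saves the message, then asks the OS to show a notification;
    # the OS records what it displayed in its own store.
    app_db.execute("INSERT INTO messages VALUES (?, ?)", (sender, body))
    os_db.execute("INSERT INTO notification_log VALUES (?, ?, ?)",
                  ("Signal", sender, body))

receive_message("Alice", "meet at the usual place")

# The user "deletes everything" -- but only inside the app.
app_db.execute("DELETE FROM messages")

# A forensic tool reading the OS-level store still recovers the content
# of every *incoming* message that triggered a notification.
recovered = os_db.execute("SELECT body FROM notification_log").fetchall()
assert recovered == [("meet at the usual place",)]
# Outgoing messages never generated notifications on this device, which
# is consistent with only half the conversation being recoverable.
```

This is only a model of the behavior reported in the story, not the real iOS internals, but it shows why wiping the app alone doesn't guarantee the content is gone.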
So it is worth noting that because these
are device notifications,
they only got half of the conversation,
right?
They got the incoming messages.
They did not get everything, of course.
And I personally am a little bit unclear
on exactly how this would work.
Like for example,
how long do these notifications stay
there?
It sounds like this could be a
volatile-memory kind of thing, like RAM.
So would rebooting the phone get rid of
them?
I'm assuming not because I think if these
people were smart enough to use signal and
disappearing messages and to delete
signal,
then they probably rebooted their phone as
well.
Or maybe not since they were able to
forensically examine the phone.
I really can't say for sure.
But
I just, personally,
I have some technical questions like that.
But I think the big reminder here that
I thought was interesting I wanted to talk
about was just the reminder to be mindful
of your notifications.
One thing I really appreciate about
Signal,
and I know other apps do this too
to various extents.
Privacy apps tend to be a lot better
about this, of course,
as opposed to something like Discord.
But Signal lets you get pretty granular in
terms of like,
I can mute this chat.
I can mute this chat for an hour.
I can mute it for a day.
I can mute it indefinitely.
I can select what notifications show.
It says here that includes name and
content, name only, or no name or content.
And so this is actually what I used
to do ever since I found out about
that story from last year.
I have Signal set to send me just
a notification that says Signal.
And then from there,
I use notification profiles
to kind of manage things.
So like, when I was at my last job,
which was a more traditional nine-to-five
sort of job,
in the sense that I couldn't be on my
phone during the day and stuff like that,
I had a notification profile that would
start at working hours and at end of
day, which was never really end of day.
But at that point I'm like, whatever,
we're staying late.
I don't care.
And during that time,
pretty much the only notification that
would come through would be my wife in
case there was an emergency.
And so at that point,
I don't need the notification content,
right?
Because I know exactly who it is.
It's the only person that the notification
will get through.
And likewise, when I go to sleep,
it's my wife in case I'm traveling,
and my sister in case of emergency.
And I think that's it.
Now, I have one local friend, too,
in case of emergency.
And those are the only people.
Although lately I've been doing a good
job of it,
I try not to sleep with the phone in
the bedroom at all.
But you know what I mean?
It's like,
I'm able to craft these very specific
notification profiles.
And even now here at Privacy Guides,
I do have working hours where I'm like
at work and I try to focus.
So I've got everything that isn't, again,
like my wife and my sister and then
all the Privacy Guides people,
they're the only ones that the
notifications go through so that I don't
get distracted by other people.
And yeah.
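The logic of a notification profile like that is essentially an allow-list applied during a scheduled time window. A rough sketch of the idea — the profile names, contacts, and hour-based schedule here are mine, not Signal's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class NotificationProfile:
    """Allow-list profile: while active, only listed senders break through."""
    name: str
    allowed_senders: set = field(default_factory=set)
    start_hour: int = 0   # profile active from start_hour ...
    end_hour: int = 24    # ... up to (not including) end_hour

    def is_active(self, hour: int) -> bool:
        # Simplification: windows don't wrap past midnight.
        return self.start_hour <= hour < self.end_hour

def should_notify(profiles, sender: str, hour: int) -> bool:
    active = [p for p in profiles if p.is_active(hour)]
    if not active:
        return True  # no profile active: everything comes through
    return any(sender in p.allowed_senders for p in active)

work = NotificationProfile("work", {"wife", "privacy-guides-team"}, 9, 17)
sleep = NotificationProfile("sleep", {"wife", "sister", "local-friend"}, 22, 24)

# During working hours, only the emergency contacts get through.
assert should_notify([work, sleep], "wife", 10) is True
assert should_notify([work, sleep], "group-chat", 10) is False
# Outside any profile window, everything comes through.
assert should_notify([work, sleep], "group-chat", 18) is True
```

The design point is the same one made above: when only known contacts can break through, the notification doesn't need to carry any content for you to know whether it matters.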
I think it's about trying to figure out
how to make a device work for you in
that sense.
Somebody, I think it was in the Privacy
Guides forum when people were talking
about this story,
or it may have been in the comments
of this actual article,
mentioned changing ringtones.
Before I started using these notification
profiles,
that was something I did:
my wife had a different ringtone than
everyone else.
So that if I got a, you know,
a notification, I would know, is it her,
do I need to check this or can
I just ignore it?
And so, yeah,
it's definitely something to be mindful
of.
And something else that kind of came up
here,
I'm looking at the comments now:
on iPhones,
if you have Signal's "show calls in
Recents" setting on,
calls will show up in the actual Phone
app's log.
And on both iPhone and Android,
there's an "always relay calls" setting,
so your IP address isn't exposed.
Slight quality trade-off,
but if you're a high-risk person, or if
you live in an area where you've always
got a good signal,
it's probably not a big deal.
So just things like that to keep in
mind.
But one last thing I do want to
circle back to when Jonah said, he's like,
this is kind of a nothing burger because
Signal doesn't encrypt their own local
database in the first place.
And I think that's a valid point.
I mean, I love Signal.
I recommend Signal.
It's very user-friendly.
It's very easy.
It's cross-platform.
It's got all kinds of shiny little
features that people enjoy and makes them
more likely to use it.
But I think it is important to note
that there is no perfect tool,
whether that's Signal,
whether that's SimpleX,
whether that's
whatever messenger,
whether that's email.
Email itself is incredibly imperfect,
which we explicitly mentioned in
our latest video about email.
So, you know,
it's looking at your threat model.
It's looking at what you need from a
tool, and it's understanding what the
limitations are and how to either
eliminate them, mitigate them,
or work around them.
Because, I'm not gonna lie.
One of my thoughts I had today when
I was thinking about this story is like,
I can't control other people's phones.
You know,
I have my notifications locked down.
But when I text people,
my messages show up as notifications on
their device.
And that's just something to be mindful
of.
So yeah, kind of a shorter story,
I think today, but you know,
it's a pretty quick takeaway.
So I don't know if you had any
thoughts on that one that you wanted to
share, Jordan.
Yeah, I mean,
I think you brought up some great points
there.
Like, you know,
there's ways to at least somewhat protect
this on your end.
Like you said,
the settings in Signal itself.
I almost wonder, though,
if you could just disable notifications
in general, you know,
no notifications.
And then, you know,
there wouldn't be anything stored in the
database in this case.
But, you know,
maybe that's kind of problematic for
people to do.
But there's also the case where, you know,
at least on Android, there is the ability
to enable notification history,
so it can save notifications.
That's off by default,
but it's another thing to check,
to see if you have it enabled.
Definitely disable that;
I don't think it's necessary to enable.
The whole point of notifications is
they're kind of ephemeral,
they're there on the screen.
I kind of assumed, though,
that when you dismiss a notification,
it's gone, right?
I didn't think they would be saved
on your device in a database.
So this might be something for Apple to
actually fix because I feel like that is
a bit of a concern.
And Anonymous344 just put in the chat:
threat model, threat model, threat model.
Exactly.
Like, if your concern is your device being
seized and you want to protect the data
on it, you know,
use Molly to have an encrypted database
for Signal.
Don't use notifications,
because they can be accessed.
Right.
I think it's kind of hard though,
especially because,
in this story in particular,
They said the case was the first time
authorities charged people for alleged
Antifa activities after President Trump
designated the umbrella term a terrorist
organization.
So I don't know,
this is kind of a very nebulous thing
going on in the US.
Like what is Antifa?
Like it's not really an organization.
It's kind of a bit silly that they're
calling it that.
But I think people should be,
I guess, more vigilant than usual,
because you never know if your activities
are going to be considered Antifa and
then your devices might be seized.
So it could be worth thinking a little
bit more thoroughly about whether you
might be a target of this sort of thing,
because it does seem like the government
is cracking down on political behavior.
I can't really read the full article here,
so I'm not really sure.
It said the case involved a group of
people setting off fireworks and
vandalizing property at an ICE detention
facility.
Yeah, in Texas,
and one person shooting a police officer
in the neck.
So I'm not going to comment on whether
they were guilty or not,
but that's what the FBI is finding.
Yeah,
I think that's up to the courts to
decide.
But I think the thing is, right,
you know,
if you're doing any sort of
political action.
I'm not saying, you know,
you should go out and vandalize an ICE
facility, but, you know,
I'm saying like any sort of political
action,
whether that's peaceful protest or,
you know, marching through the streets,
you know,
I think it's important to think about ways
that your data could be extracted and used
against you.
I think in this case, you know,
it's definitely,
everyone deserves privacy,
even if these people were
vandalizing something.
So it's kind of unfortunate that the
iPhone was a bit vulnerable to this
extraction method.
I also do wonder if Lockdown Mode could
have prevented this,
because I think Lockdown Mode does block
a lot of these forensic extraction tools
that the FBI uses.
It would have been interesting to hear
whether that was the case.
I'm going to assume no, but, you know,
I feel like anyone who's going to any
political action should just enable
Lockdown Mode.
It's the least you can do.
It's a basic thing.
It's just a switch you turn on.
But I do think, uh,
calling out Antifa is big
"the hacker known as 4chan" energy.
Yeah.
It just shows that the government doesn't
really know about any sort of political
organizations.
It's a lot easier to lump a whole bunch
of people together and say this nebulous
thing is bad than, you know,
to have any sort of specifics, like,
these protesters at this specific ICE
facility.
I mean,
we don't really know if they're part of
a group or anything;
they could have just been acting
independently.
I think it's kind of ridiculous that we're
grouping it all together like that.
But yeah,
I don't really have more to add than
that.
Yeah.
The only thing I wanted to comment on
is you said that it would be best
to turn off notifications altogether,
which, yeah, I mean,
it's all about threat model, right?
Like, that wouldn't be feasible for me in
my day-to-day, for sure.
But especially if you're doing something
sensitive,
whether that's simply protesting or
whether you're taking it further,
and I'm not advocating for violence or
breaking any laws, but I'm just saying,
you know: threat model.
Yeah.
You maybe should bring a separate device.
You maybe should turn off notifications.
It's, it's tricky, but it's also just,
yeah, I don't know.
I personally just wonder how well-known
this kind of vulnerability is.
Is this the kind of thing that technical
people would look at and be like, yeah,
obviously, iPhones are keeping a...
Because it sounded like Jonah...
Full disclosure,
Jonah's the person I usually go to with
deep technical questions because he's
really smart about this kind of stuff.
When I was asking him questions about
this earlier this week,
like whether a restart would clear out
the cache or the database,
And he didn't really know either.
And he said the same thing that you
said, where he's like,
I always thought that when you swiped a
notification, it was gone.
Like, do you have to handle it differently
somehow?
And where I'm going with that is,
he's really smart, in my opinion,
and if he doesn't even know this stuff,
how many people do know it?
This is not common knowledge.
And I think we did find some court cases.
I mean, this was a real quick web search,
and we're not lawyers, obviously,
but we did find some cases where the
judge threw out certain evidence,
more or less on the grounds of:
how would a person be expected to even
know that they were leaving this evidence?
Which, I mean,
that's not exactly what the argument was,
but there's a certain level of,
it's insane that you found this,
and I'm not going to allow it in court.
So I kind of wonder how this would
qualify as well.
How well known is this kind of thing?
Yeah, super crazy.
But I think this also brings up like
the other concern with iOS, right?
Because iOS is a closed system, right?
We don't have access to the source code.
We didn't know there was a database
storing people's notifications.
Like, we're up to iOS twenty-six now.
Well, not twenty-six versions,
but I guess nineteen.
Nineteen major updates.
And we still haven't seen this be an
issue before.
So I think that's the benefit of open
source operating systems like, you know,
GrapheneOS.
We know there's no notification database.
We know those messages aren't being
stored after they're dismissed.
Whereas with iOS,
it's sort of a black box.
we can protect as much as we can
from this sort of thing.
We can make assumptions that things are
done a certain way,
but making assumptions is kind of risky,
because we don't have clear evidence of
what these systems actually do.
So I think that's where open source
operating systems are definitely going to
beat out this sort of thing.
Jonah says he was about to type what
Jordan is saying.
Yeah, so I guess we kind of had the
same thought on this. But
I think it's definitely, you know,
I think one thing to think about as
well is, you know,
there's certain apps that need
notifications and there's certain apps
that don't.
Like, we can definitely try to reduce
the amount of stuff sending notifications,
because as far as I'm aware,
on Googled Android devices and Apple
devices,
the way notifications work is they're
sent through, you know,
Google Firebase or through Apple's push
service.
And we've already had a story previously
where someone's notifications were
subpoenaed from Apple and Google,
giving access to the notification content.
And there's possibly sensitive information
there.
We did end up finding out, though,
that in a lot of cases,
a lot of these apps that are privacy
focused,
they actually encrypt the notification
content.
So Apple and Google will get notified when
a notification is arriving,
but they won't have any insight into what
it actually is.
So, you know,
I think notifications are one of those
things with a lot of foot guns;
they're a bit of a dangerous thing that
we need to consider.
I think bringing this story up brought
that to the forefront again. I think,
Obviously,
there's people that need to disable
notifications,
but I think it's good to at least
consider this as a threat,
because I guess people haven't really been
doing that.
Yeah, and not to speculate too much,
because you literally just said, like,
you know,
all we can do is speculate sometimes,
but I wonder if, because you pointed out,
it's like
there's been so many versions of iPhone
and we're just now learning this.
And part of me just wonders,
has it always,
and this is just me thinking out loud,
obviously,
I know none of us have answers to
this, but like,
has it always been doing this or is
this a new feature?
Because my thought process is,
if it's always been doing this,
I think that kind of highlights the arms
race nature of privacy and security where,
you know, before it was like,
I don't know,
just to pull random examples out of thin
air that may not fit because I'm making
this up as I go along.
But before,
the police would walk by a building,
look in the window, and go, oh,
there's my evidence.
And now they have to go deep into
the building, into the bank vault.
So it just kind of makes me wonder,
is this some new thing?
Or have they just never used it before,
because they've never needed to try so
hard to find evidence before?
Which would...
If that is true,
then that would just show how much more
secure everything is getting.
But again, we don't know.
We'll never know, probably.
Just a random thought that I had.
I think that's all I've got on that
story, personally.
Alrighty then,
I guess it's time now to move into the
forum updates this week.
So in a minute,
we'll start taking viewer questions.
So if you've been holding onto any
questions about any of the stories we've
talked about so far,
go ahead and start leaving them on our
forum thread or in the chat on the
live stream.
And just so you know,
if you're watching this and you don't
have an account on one of those platforms,
we do stream on StreamYard,
so check out the forum post;
there's a link there.
You can join without an email,
just a name, and you can ask a question.
But for now, let's stick
to our community forum.
And there's always a lot of activity
there,
but here are a few of this week's most
interesting discussions.
So there was this thread that Nate linked
here, and this one is about Wisconsin.
So Wisconsinites can keep watching porn
after governor vetoes age verification
bill.
So I guess I'm going to kind of
throw this to you, Nate,
because I feel like you have quite a
lot of thoughts on this one.
Yeah.
Um,
so I originally thought this was good
news, not just from the porn angle.
I think that's just 404 Media being
clickbaity.
Which I say with love.
I mean, to me,
it's only clickbait if you don't deliver
on the promise, and, you know,
everybody's trying to stand out.
Right.
But anyways, yeah.
Privacy is an uphill battle;
I think we all know that.
And it can be really depressing,
because if we're being honest,
we generally tend to lose more than we
win.
Which, I don't think it's a lost cause.
I think...
Especially, I think, as things get worse,
people are going to start realizing the
value of their privacy,
and hopefully we can start to reverse that
trend a little bit.
But a lot of the time,
we do take some pretty severe losses,
and so it's important to celebrate the
wins,
which I'll get to why this is a
bit of a mixed bag in a minute.
But for now,
I do want to celebrate the good sides,
which is that the governor rejected this.
This was an age verification bill.
Assembly Bill 105 would have applied to
sites where more than one third of the
material is harmful to minors,
defined as depictions of actual or
simulated sexual acts or body parts,
including, blah, blah, blah.
Female nipples,
not male nipples as always, but whatever,
that's a rant for a different time.
Anyways.
Um,
It would have required using any
commercially reasonable method that uses
public or private transactional data
gathered about the individual.
And the article says this means uploading
an ID,
showing their face for a biometric scan,
uploading credit card information,
or a combination of these.
And the governor vetoed this bill and
said,
I am vetoing this bill in its entirety
because I object to this bill's intrusion
into the personal privacy of Wisconsin
residents.
While I agree that we should protect
children from harmful material,
this bill imposes an intrusive burden on
adults who are trying to access
constitutionally protected materials.
Um,
Evers wrote that the bill doesn't prevent
platforms from giving collected personal
data to third parties,
such as the government or data brokers.
And he wrote,
this is a violation of personal privacy.
Additionally,
I'm concerned about data security and the
potential for misuse of personally
identifiable information that could be
intercepted by or transmitted to a third
party used for the basis of blackmail or
identity theft.
Further,
although the bill includes penalties for a
business entity who violates the
prohibition of retention of personal data,
those penalties cannot undo the harm.
So all really,
really good stuff that I was super stoked
to see.
Unfortunately,
I think the comment section is kind of
where it went wrong.
Some people pointed out,
and I don't know if I missed this
in the original article, because I'm still
not seeing it here either,
maybe it's in the rest of the statement
he gave,
but some people quoted that the governor
wants device-level age verification.
And apparently this is a quote from him.
Uh,
we can and should work to prevent minors
from accessing adult content.
Um,
but there are better solutions than the
one offered by this bill.
For example,
we can work with tech companies to
implement device-based age verification
that takes place on a user's phone or
computer,
which can be a more secure and effective
method.
Other states have been moving toward
device-based solutions,
and major tech companies are adopting
these options as well.
So yeah, I don't know.
I don't want to get into the age
verification debate,
because I'm still fresh from that three-
or four-week run where we talked about it
every single week.
And I still don't feel like we have
anything new to add to that,
or at least I certainly don't.
You can feel free to chime in if you
have more to add.
I don't know.
I have mixed feelings about device based
stuff, but I certainly see the drawback.
And anyways,
I guess this one's a mixed bag,
but I want to celebrate the win that he
did veto it.
And I certainly agree that this bill would
have been way worse than device-based.
I definitely know the problems with
device-based;
I'm not saying I'm in favor of it.
But there's a difference between getting a
paper cut and getting your finger chopped
off.
And I think this would have been getting
your finger chopped off.
So I think that's good that we avoided
a much worse fate.
I hope he doesn't go for the device-based
stuff.
That's kind of the drawback.
But yeah, anyways,
I'm going in circles now.
I just wanted to celebrate a small,
even if it's a mixed bag,
we did have a small win this week
that I thought was pretty cool.
Yeah, that's good to hear.
I mean, one thing that kind of bounces
off the issue we talked about in the
highlight story is, you know,
I think we should be against all these
sorts of centralized things, right?
Like, this is centralization.
Again, like we talked about app stores,
right?
Three app stores, or I guess really two,
there's like two big ones,
Google Play and Apple's app store, right?
And
I think this is just like sort of
reinforcing why these platforms are bad.
Like we talked about at the start,
like having these platforms decide what is
allowed, what is not allowed.
This is just a bad idea.
This bill in particular,
I think it would sound reasonable to most
people,
and the stuff they mentioned classifying
as adult content sounded reasonable,
I guess.
But there's all sorts of issues when it
starts covering more stuff that isn't
technically adult content,
restricting people from accessing those
applications unless they verify their
identity.
So I dunno,
I guess it's somewhat of a win.
The bill got vetoed,
but there's still the chance that
device-based age verification might make
its way through,
like we've been seeing with the app store
transparency stuff.
I think that would probably be worse in
a lot of cases,
because doing this at the device level
is a lot more invasive,
and it gives a lot more control to these
tech companies.
So I'm certainly against both.
I can't believe I keep having to say this,
but, you know, we have parental controls.
We have all these amazing tools that
people have access to.
I don't think the government needs to get
so involved on people's devices,
right?
Maybe people have different opinions.
There's a way to do this privately.
I just think there's always leaks, right?
Like, this article in particular mentions,
if you scroll down,
there's a section where they talk about
the Discord age verification stuff,
where the ID data and selfies people had
to send were exposed in a security
breach, right?
Like,
Even if we think all of these age
verification services are done properly,
a lot of times security is just not
the priority.
So that's sort of my thoughts on this
story, I guess.
And even if they were,
just to add on to that last part
you said, this week alone,
I've covered three or four stories about
insider threats and people abusing their
access to a system.
I covered one.
I need to add these to my website,
actually.
I covered one about a police officer who
was using DMV photos to make AI nudes
of women, not even making that up.
And then I found another one.
I wasn't even looking for this one.
This one didn't even come across my
newsfeed.
I was web searching for something else and
it magically showed up in the search
results.
There was a dude at Facebook who was
giving himself access to over thirty
thousand private photos.
So, yeah,
even if these systems are made
quote-unquote correctly,
and I'm not going to count Facebook as
correct, because they've had more data
breaches than there are grains of sand on
Earth,
but even if these things are made
correctly, there are still insider
threats.
Yeah,
I think I'm kind of coming around to
your view, because there are so many ways
to solve a problem, right?
And there's the technical solutions,
there's legal solutions,
but then there's like educational
solutions,
which I think is kind of where things
like Privacy Guides come in,
and this podcast and stuff.
And I think,
from what I'm seeing,
this is probably largely an educational
problem, because, again,
I mentioned the example of my sister not
even knowing that iPhones have parental
controls.
And granted, her kid is really young.
She doesn't have to worry about that yet.
He never has.
He doesn't have his own phone or tablet
or anything.
Um,
so she's not at that point where she
has to worry about it,
but like how many parents know these
things exist?
How many parents know what they're capable
of?
I've never used them myself, because I
don't have kids,
but I've heard that some of these parental
controls are actually really good.
But how many parents, you know,
just don't know they're there,
or they'll pay for some garbage
third-party thing that's going to be
selling their kids' data?
Cough, cough, Life360.
Because, you know, again,
like especially my my generation,
we came up in an era where like
Windows security was garbage.
Nobody trusted Windows firewall.
Of course,
you had to pay for a third party
antivirus.
And that's just not true anymore.
And I just I wonder how many people
even know that.
So, yeah,
I think I am kind of starting to lean
more towards the side that this is largely
an educational problem,
or at the very least,
we need to start with the educational
aspect.
And then, if we get everybody up to
speed and find out that there are cracks,
maybe we start talking about how to fix
them.
But
Yeah,
this definitely feels like an
oversimplified... I mean,
I've known that from the start.
But yeah,
age verification is just an overly simple
solution.
I think I want to add a little
bit extra onto what you're saying there
about how we should...
teach adults about these features.
We have gotten to a point, right,
where I almost feel like people don't
have an excuse,
because if you buy a new Google device,
a new Apple device,
or a new Windows device,
in the setup process it literally asks
you:
is this device for a child?
I feel like we've gotten to the point
where maybe the device has to come with
a red sheet of paper or something that
says: if this device is for a child,
please set it up during the setup process.
Like, you know what I mean?
But actually, I do want to push back
on that a little bit,
because it doesn't do that here in the
U.S.
I think it would be a good idea if
it did,
because I don't see any reason it
shouldn't.
No.
It doesn't ask?
I haven't seen a single one in the
US.
And I mean, granted,
it's been a while since I've set up
anything that wasn't... No,
even because the most recent device...
Well, I mean, okay,
this computer is a company computer,
so it was already set up when I
got it.
My Windows computer I got in...
When did I get my iPhone?
I don't know,
but they're all within the last five years
for sure.
And not a single one of them has
done it to me.
I don't know.
I don't think I know anybody who set
up a device from scratch.
I think most people I know just like
transfer their Apple ID or their Google
account or whatever.
So I don't know.
I could try to do some digging and
look into it.
But yeah,
I've never had a device ask me that
ever here in the US.
But like I said,
I don't think that would be a bad
idea because...
Yeah.
How cool would that be if, you know,
my sister goes out and buys her kid
his first iPhone and it says,
is this device for a child?
And she goes, why?
Yes.
Yes, it is.
And then it just walks her through the
parental controls.
So yeah, I think that, I mean,
I can only comment on like,
I don't have particularly new devices.
Like I have an older phone, right.
I reset it recently, and through the setup process, at least on iOS, it did ask, um, you know, during the setup process, like, is this device for a child?
Same with this Android phone on the Google
operating system.
And same with Windows, actually.
I installed Windows recently and they did
ask.
But it does say on Apple's website.
I did look into this before because I
was having this conversation with someone.
Like, there is a section on their website about what you need to do before you can set parental controls on a child's device. So it doesn't say specifically here. I mean, I don't know. I don't think it would be different in the US, but I guess you can trial-and-error this.
But it does say,
I'm seeing articles here where it says it
is...
implementing new features when you set up
a device.
So I'm not sure,
maybe we'll have to look more into how
that affects things globally.
Because I know in Australia,
we do have like age verification laws and
stuff.
So it could be applied differently here.
But I do think companies are making things
incredibly easy now to do this.
And that's why I kind of get a
bit frustrated when there's government
officials who are pushing for these really
aggressive methods to do this, right? Like, I don't know, I feel like generally it's not up to the government to decide this sort of stuff. It should be, like, a parenting decision from the parents. Like, if they want to have a child using an adult device, they can. But, um,
Yeah, I definitely think the process,
at least even if it doesn't display it
on setting up a device,
I think it's good that the integration is
already there.
The options are there.
Maybe we could do a better job showing people that this is even an option.
But I feel like it's, I don't know, maybe Jonah can comment on like a US perspective on this.
But every device I've set up so far
has asked me if it's a child's one.
So I don't know.
Yeah,
maybe I need to go reset my iPhone
and see what happens.
And I agree with you.
That's what I'm saying.
I've heard the parental controls are
really good.
It's just, at least here in the US,
I feel like it's an issue of how
do we let people know those are out
there.
I didn't even know Windows had parental
controls, to be totally honest with you.
But I don't know.
I've just...
Maybe that's something that's only rolled
out in the past couple years,
and I just barely missed it.
So...
I don't know,
but I certainly would not be opposed to
it.
To like, yeah,
these controls are already built in.
Say that this device is for a child
and we'll walk you through how to set
them up and how to use them.
I think that would be awesome.
Yeah, I've definitely seen it on Windows.
Can't recall on Android and iOS.
So I think it might've been because you got the laptop as a Windows 10 laptop and then you upgraded it to Windows 11. It might've bypassed the screen.
That could be it, yeah.
Because it, wait, was this one?
I'm not sure, actually.
I think it was more a Windows 11 feature.
So it could have been before they fully
released it all.
But the way it works on Windows is
quite good as well.
It works really well on iOS and Google
as well.
I think it's on Mac currently?
Yeah, Mac has it too.
So all the major platforms do have it.
So I kind of become a little bit
frustrated when we're
trying to push these aggressive laws.
I've seen it for Apple Watches,
says Jonah.
Yeah, I've seen it too.
Quite good if you use their online
accounts, as far as I know.
I mean, yeah.
I mean, this is kind of a drawback,
right?
I don't think Graphene OS has parental
controls built in.
Don't think that's really a priority for
them.
And definitely not Linux, so...
I mean, yeah,
that is kind of an issue with these
more open platforms.
They tend to not include these sort of
features.
So yeah,
it's definitely a good discussion to have
though.
Yeah, for sure.
And I'm definitely going to keep an eye
out for it next time I buy a
new device now,
because now I'm really curious.
I think that would be great if it
was a default prompt for sure.
So.
On that note, I think it's, I mean,
that's all I had on that forum thread.
So I think it's time to head over
to some viewer questions.
So we'll start with questions on our forum
from paying members.
If you are interested in becoming a
member,
you can go to privacyguides.org and click
the red heart icon in the top right
corner.
of the page. But we only had one question this week on our forum post about this topic, and this is from expert forty forty eight seventy, who says: what are the privacy implications of using an alternative frontend that fetches content directly from the original service rather than proxying it (something Invidious instances do, say), while using a popular VPN? How does that compare to just using the original website with a content blocker like uBlock Origin?
My experience has been that browser
fingerprinting techniques can still track
users easily,
even with content blockers enabled,
so I'm wondering whether non-proxying
frontends offer different protections.
I'll be honest.
I'm not super familiar with the technical
aspects of frontends,
so I'm not sure which ones proxy and
which ones don't.
I can talk about it if you want.
Um, I'll let you go first then.
Cause you probably know more about this
than I do.
Yeah.
So basically, I guess let's talk about the main two ones here: Piped and Invidious, at least the web-based ones. So, by default, as far as I'm aware, it depends on the instance, right? Because these are like decentralized services, so it depends on how the instance is configured.
So Piped uses a proxy. Your requests to YouTube are going through the server that you're connecting to for the website; basically, they're proxying the requests on your behalf.
And the issue with this sort of method, right, is it's a lot easier to be blocked. Because if there's ten thousand people accessing a Piped instance, YouTube's going to block that. They're going to think you're a bot, you're spamming. So that's the issue that we kind of have with Piped: they're getting blocked a lot and the access is not as good.
Invidious, in this instance, basically what it does is it strips out all the ads and tracking technology from the YouTube website, and it will actually play the video directly from Google. So you're still making a connection to Google itself. Again, though, there's a setting in the settings called proxy videos, and that will proxy it through the Invidious instance. But by default, it should play it directly from Google.
So the reason this is also becoming a problem is because a lot of VPN servers are getting restricted, and they require you to sign in to play videos. Yeah, there's more restrictions being made.
So with Invidious, you can actually check this yourself, right? You can use uBlock Origin and you can see the connections that the website is making. You'll see it's connecting to Google Video. So your IP address is visible to YouTube itself, right? But there's significantly less tracking happening, because the JavaScript on YouTube's website isn't actually loading, which is the usual concern, right? So YouTube will know that your IP address is pulling a video from their servers, but...
Yeah, then we can also talk about FreeTube, and that does give you the option when you're setting it up if you want to do fully local playback. So it's the same thing as Invidious, pulling the video directly from Google; or you can also use a Piped proxy, which, like we talked about, can have issues, but it does offer more privacy because your IP address isn't being directly exposed to Google itself.
Another thing here: using a VPN and then using Invidious and locally fetching these videos makes it a lot harder to track, because you're using an IP address that a bunch of other users have. So I think that's basically the rundown. I wouldn't use Invidious on a residential connection without a VPN, because it'll just be linked to your IP address. They'll just see your residential IP address accessing all the videos, and it'll be easier for them to track. So I'd say try using a VPN, and try using Invidious and directly fetching the videos over Piped, because Piped is blocked a lot more commonly.
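The direct-versus-proxied distinction above comes down to which hostname the stream URL resolves to. A minimal sketch, with hypothetical example URLs and instance name (real hosts vary per instance):

```python
from urllib.parse import urlparse

def classify_stream(url: str, instance_host: str) -> str:
    """Report who sees your IP for a given frontend stream URL.

    Non-proxied Invidious playback hands you a *.googlevideo.com URL,
    so Google sees your IP; proxied playback points back at the
    instance itself, which fetches the video on your behalf.
    """
    host = urlparse(url).hostname or ""
    if host == "googlevideo.com" or host.endswith(".googlevideo.com"):
        return "direct: your IP is visible to Google"
    if host == instance_host:
        return "proxied: only the instance's IP is visible to Google"
    return "unknown host: " + host

# Hypothetical example URLs, for illustration only:
print(classify_stream(
    "https://rr3---sn-abc.googlevideo.com/videoplayback?id=xyz",
    "invidious.example.com"))  # direct
print(classify_stream(
    "https://invidious.example.com/videoplayback?id=xyz",
    "invidious.example.com"))  # proxied
```

This is the same check you can do by hand with uBlock Origin's logger: direct playback shows googlevideo.com connections, proxied playback only ever shows the instance.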
So hopefully that answers the privacy
question about this topic.
It's kind of confusing,
but if I didn't answer it well,
just let me know.
Cool.
Thank you.
One thing I also just want to throw out is that we don't really know much about browser fingerprinting. Like, a year ago I made a video about that over on my YouTube, and the thing I learned is that there's like two categories.
There's the people who are like marketing
companies who are like, yeah,
we can fingerprint anybody anywhere.
And it's like, okay,
and I'm going to take you with a
grain of salt because you'll say anything
to make a sale.
And then there's the technical people, and the privacy people, who are just like, literally everything can be fingerprinted.
And I think the issue is we don't
actually know for sure
how prevalent it is,
which techniques they're actually using.
I've seen all kinds of proofs of concept
about CSS can be fingerprinted if you do
it right.
A lot of extensions can be fingerprinted.
There's so many different ways to do it,
but we don't know for sure what ways
they're doing and what ways they're using.
I'm not saying not to worry about it.
I just want to point that out.
It's really... I mean, it's not like uBlock Origin does nothing. I know it does block a lot of stuff, and then, you know, Brave obviously has a lot of built-in protections. Firefox, especially with the setting changes that we recommend, offers really good protection. It's obviously not perfect. If you need perfection, or as close to perfection as you can get, you need something like Tor and Whonix, but at that point you're probably not streaming YouTube. So just something to keep in mind; there's still definitely a place for those.
I mean, I think we do have some research that has been done specifically on browser fingerprinting. Like, when I was looking into it... we also did a video here at Privacy Guides; we interviewed someone about it as well.
There are at least some hard facts about
it.
So, I mean, definitely maybe go check out that video. We got information from someone at the Tor Project who works on a lot of the fingerprinting stuff for that, so definitely look at that. There's definitely papers that have been done; when I was researching for that video, there was quite a lot of papers specifically talking about stuff like entropy and, you know, how that affects the fingerprint. I guess I think Nate's right, though, that we don't really have the specifics of what companies are doing, because it's kind of hard to know, right? But I think going off the research that we do have, you know, increasing entropy... like with Tor Browser, I think there's
basically proof at this point that, like, you know, if you use Tor Browser, if you take all the precautions possible, you're not going to be able to identify someone specifically by their fingerprint.
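The entropy idea mentioned here can be made concrete: an attribute value shared by a fraction p of browsers contributes about -log2(p) bits of identifying information, and, naively assuming the attributes are independent, the bits add up. A toy sketch with made-up probabilities (real values vary by study and population):

```python
import math

def surprisal_bits(p: float) -> float:
    """Bits of identifying information carried by an attribute value
    shared by a fraction p of browsers (Shannon self-information)."""
    return -math.log2(p)

# Made-up illustrative shares -- real values vary by study and population.
attributes = {
    "user_agent":  0.05,  # 1 in 20 browsers share yours -> ~4.3 bits
    "screen_size": 0.10,
    "timezone":    0.25,
    "font_list":   0.01,
}

# Naively assuming the attributes are independent, the bits add up:
total = sum(surprisal_bits(p) for p in attributes.values())
print(f"~{total:.1f} bits")  # ~16.3 bits: one in 2**16.3 (~80,000) browsers
```

This is also why anti-fingerprinting browsers try to make everyone's attribute values common rather than unique: each shared value drives p up and the bits down.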
But I think we have gotten to a
point where so many tools just have all
this built in,
like Firefox and Brave both have
fingerprint protection built in by default
now.
So it's like,
Basically,
we're getting to a point where these
protections are becoming pretty
mainstream.
But I think if you need something a bit more extreme, then something like Mullvad Browser or Tor is definitely going to offer better protection.
Yeah, just to be clear, like you said,
we have a lot of research into how
good the browsers are at resisting it.
We just don't have a lot of research
into how many companies are doing it,
what exact techniques they're using,
how common it is.
I assume it's pretty common.
I assume that a lot of companies are
doing it.
They don't tell us because it's kind of
like their secret sauce for marketing,
and this is why we're so effective.
But yeah, it's just...
I guess what I'm getting at is I
think uBlock Origin and a good privacy
browser is probably a lot more effective
than we give it credit for.
But I mean,
I'll definitely never complain about
somebody going the extra mile if they feel
the need to.
So it doesn't hurt.
Yeah, I think it's definitely...
I'll just push people towards...
I know Nate did a video about it
as well.
I thought that was quite good.
We also did a video.
Definitely check both out and get different perspectives on it, because...
There's definitely a lot of different opinions, right? Because we've got the Tor people, we've got the Brave people, we've got the Firefox people, we've got the Arkenfox people; they've all got a different perspective than the Tor Browser people. So, you know, go to different places for information and try to understand the topic as best as you can.
Hopefully the resources that we've put out there are good enough to make a good judgment on it.
But, like Nate said, I feel like when we talk about this stuff, it is kind of an extreme topic. You know, having a privacy browser and uBlock Origin, like Nate said, is going to put you ahead of ninety nine percent of people.
So just put it in perspective.
Which on that note,
I've been perusing the live chat here.
And I think there's only one question we
haven't addressed so far.
But it actually kind of touches on this
a little bit.
And it says,
this comes from anonymous three four four
here in the stream yard chat.
Threat modeling should be deferred to
experts that you personally consult on
over and over again.
It's unfeasible for an individual to know
every single vulnerability and scenario
that they have to protect against.
Do you guys plan to provide privacy
consulting in the near or far future?
I mean,
I don't speak for everybody around here.
I don't think we're planning anything like
that as far as I know.
I certainly haven't heard anything about
it.
Probably not would be my guess.
Not anytime soon, at least.
I don't know about far-future plans. I personally do not make far-future plans, because you never know what will happen.
Right.
I've had my long term plans thrown into
chaos multiple times throughout the course
of my life.
So I've given up on long term plans.
I just worry about the next five years
or so and go from there.
But I do want to say that threat
modeling I don't think has to be an
expert thing because there's multiple
steps to threat modeling, right?
And one of those steps is basically
figuring out –
how bad are the risks if I fail? Like, that's one of the steps, right? And so I think for a lot of people, that's kind of where we come up with the idea of a low threat model. You know, I'm going to pick on people here, but back in the day I used to see people having really, really extreme meltdowns where they were like, oh my god, I connected to YouTube once and I didn't have my VPN on, I'm so screwed. And it's like,
Calm down.
It's not that big a deal.
Google got one IP address of yours.
It probably rotates anyways in a lot of
parts of the country or a lot of
countries around the world.
The results for most people are not that
big a deal.
And again, going to what I said earlier,
if your threat model is so high that
Google can't have your one IP address,
you probably shouldn't be going to YouTube
in the first place.
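The threat-modeling steps described above can be sketched as a toy worksheet: list scenarios, score how likely each one is and how bad failure would be, then rank them. The scenarios and 1-5 scores below are invented for illustration, not a Privacy Guides methodology:

```python
def rank_threats(threats):
    """Rank (scenario, likelihood, impact) tuples by likelihood * impact,
    highest first, so effort goes where failure would hurt most."""
    return sorted(threats, key=lambda t: t[1] * t[2], reverse=True)

# Invented example scenarios, scored 1-5 -- just the
# "how bad are the risks if I fail?" step expressed as code.
my_threats = [
    ("Targeted ads profile my browsing",     5, 2),
    ("Phone stolen without disk encryption", 2, 5),
    ("Google sees my IP once without a VPN", 4, 1),
]

for scenario, likelihood, impact in rank_threats(my_threats):
    print(f"{likelihood * impact:>2}  {scenario}")
```

Ranking like this is exactly why the one-time-IP-leak scenario lands at the bottom: low impact, so it's not worth a meltdown, while the high-impact items deserve the effort first.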
But anyways,
my point is I don't think it's something
that everybody necessarily has to go see a
professional for.
And I say this as somebody who has
done consulting in the past.
I think, yes,
if you have a very high threat model,
then yeah,
you probably shouldn't just be winging it
and trying to piece together a bunch of
random websites and YouTube videos.
But at the same time,
like if you're just like, dude, I'm not an activist, I'm not a political figure, I just want my privacy. I just want to not get targeted ads, I just want people to not be stalking me, but it's not that big a deal and I'm not willing to bend over backwards. I mean, that's part of a threat model too, right? How much effort are you willing to go through? Because not everybody is willing to go through the same amount of effort. And I'm going to say that again, because I feel like a lot of people in the privacy community forget that: not everybody is willing to go through the same amount of effort, and that's fine, as long as their threat model is being met.
So yeah, it's an, I don't know.
I think if you want to get consulting,
I mean,
if that's something you want to do,
that'll help you sleep at night,
go for it.
But I don't think it's something that should only be deferred to experts. Cause, I mean, we're human too. There's no, like, governing board that certifies privacy experts or anything. So, I don't know, those are just my thoughts.
Yeah,
it's like one of these things where like
I feel like, you know,
there's certain things where you can just
throw money at something and kind of
remove a bunch of the effort here.
Like I feel like going through and trying
to understand what are the best tools,
what do I need to do?
Like, it is kind of time consuming. Like Nate was saying, not everyone has hours every day to go through and, you know, work out the tools and update things.
So, I mean,
it can make sense to throw money at
something.
I don't think you need to.
I think everything is available for free.
Like,
we try really hard to make things
accessible to everybody.
Like, we don't want to paywall stuff.
You know,
we offer benefits to members who give us
donations because, you know,
it's the least we can do for supporting
us.
But I think, you know, if there's something that you want to kind of put on easy mode, you can talk to an expert. I mean, I'm not going to recommend it; I think all the content is available for free. And we've talked about this before: you can take things slowly, like you can just do one thing every month when you have a bit of spare time. You don't have to go all out at once. I think it was Michael Bazzell who said this: privacy is a marathon, not a sprint.
And I really like that quote because I
think it's, you know,
we think about things that we need to
do,
but I don't think we need to do
things immediately and we don't need to
try and blitz through everything in like
two days.
You certainly can if you want,
but it's definitely not required.
So, you know,
definitely put that into perspective for
you.
I always love telling people how I was
that lunatic that sat down one weekend and
went,
I'm going to move all my passwords to
a password manager and I do not recommend
it.
But,
Yeah, not to get into a big back and forth, but they said, like, yeah, for the average person the threat model is not too high. I mean, everybody should threat model, because that's how you know if you're doing enough. But going back to the notification database story, surely it would be better to consult an expert for blue team defenses.
Yeah, again,
if you're working on a professional blue
team,
if you are an activist who's facing jail
time or could potentially,
like sure at that point, but yeah,
we don't offer consulting at this time.
I don't know if there's any plans to,
but I mean, if we do,
I'm sure we'll announce it.
I think one thing also to add to
this comment, right, is they're saying,
like, threat modeling, like,
they're saying you should surely be better
to consult an expert for blue team
defenses.
I think, you know, let's be a little bit honest here: this is, like, a very privileged position to be in. Like, not everyone has the money to just throw at this.
Like, we're talking about, like,
decentralized groups of activists here.
Like,
we're not... I don't mean any shade when I say this, but a lot of organizations are cash-strapped, right? Like, they don't have money to do this sort of thing. It's not really that high on their list of priorities.
So if it's, like, a business where they have a certain budget to spend on this sort of thing, it obviously makes sense. But that's why I think it's so important to offer this stuff for free, because, you know, there are people who are in less privileged positions that also need this information.
Um,
And it should be accessible, right?
So obviously, in the best case scenario,
this person should have consulted an
expert for blue team defenses.
But I'm pretty sure that this person was
probably not someone who had the money or
the time to be investing in this sort
of protection, I guess.
Yeah, for sure.
Taking one more look at the thread here.
Doesn't look like anybody's added
anything.
Anything else you wanted to mention or
call out?
Not particularly.
I guess if no one's got any extra
questions,
I guess we can move into the outro
here.
So all the updates from this week in
privacy will be shared on the blog every
week.
So you can sign up for the newsletter
or subscribe with your favorite RSS reader
if you want to stay tuned.
For people who prefer audio,
we also offer a podcast available on all
podcast platforms and RSS.
And this video will also be synced to
PeerTube.
Privacy Guides is an impartial nonprofit
organization that is focused on building a
strong privacy advocacy community and
delivering the best digital privacy and
consumer technology rights advice on the
internet.
If you want to support our mission,
then you can make a donation on our
website at privacyguides.org.
To make a donation,
click the red heart icon located in the
top right corner of the page,
and you can contribute using standard fiat
currency via debit or credit card,
or opt to donate anonymously using Monero
or with your favorite cryptocurrency.
And becoming a paid member unlocks
exclusive perks like early access to video
content and priority during the This Week
in Privacy livestream Q&A.
And you'll also get a cool badge on
your profile in the Privacy Guides forum
and the warm,
fuzzy feeling of supporting independent
media.
Thanks for watching,
and we'll see you next week.