Many will cheer for any case that hurts Meta without reading the details, but we should be aware that these cases are among the key reasons why companies are backtracking from features like end-to-end encryption:
> The New Mexico case also raised concerns that allowing teens to use end-to-end encryption on Instagram chats — a privacy measure that blocks anyone other than sender and receiver from viewing a conversation — could make it harder for law enforcement to catch predators. Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.
Is it illegal outright, or just illegal on general-purpose platforms whose focus isn't extreme security?
We all know Meta can still read E2EE chats (otherwise they wouldn't do it) and they're using E2EE as an excuse to avoid liability for the things their platform encourages. Contrast this with something like Signal where the entire point is to be secure.
Probably their auditors? Lying about this would be tantamount to (very serious) securities fraud. Not sure what you're basing your allegations on besides "trust me bro"
Why would lying about having E2EE be securities (as in stock market) fraud? Would that make any lie ever told by a corporation equate to stock market fraud?
E2EE means end-to-end, where the ends are the participants in the chat. They can read it on your phone, but not on their servers. They need their app to separately transmit the plaintext to their servers to read it.
The first two E's in E2EE stand for end. From one end to the other. So no, Meta can't. Or put another way... if they can read those messages, then it's not E2EE.
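The property the two comments above describe can be sketched concretely. Below is a toy Diffie-Hellman key agreement in Python — illustrative only, with an intentionally weak demo prime; real E2EE deployments (e.g. the Signal protocol) use X25519 plus a double ratchet — showing why a server that merely relays the public values can never derive the conversation key:

```python
# Toy Diffie-Hellman key agreement (illustrative only — NOT secure).
# It demonstrates the point above: the relay server only ever sees
# the public values, never either party's secret, so it cannot
# compute the shared key the two ends use to encrypt messages.
import secrets

P = 2**127 - 1   # a Mersenne prime; fine for a demo, far too weak for real use
G = 5

def keypair():
    secret = secrets.randbelow(P - 2) + 1   # stays on the device ("the end")
    public = pow(G, secret, P)              # this is all the server relays
    return secret, public

alice_secret, alice_public = keypair()
bob_secret, bob_public = keypair()

# Each end combines its own secret with the other end's public value.
alice_key = pow(bob_public, alice_secret, P)   # (G^b)^a = G^(ab)
bob_key = pow(alice_public, bob_secret, P)     # (G^a)^b = G^(ab)

assert alice_key == bob_key   # same key at both ends; the server saw only G^a and G^b
```

If the operator wants the plaintext anyway, it has to come from the endpoints — e.g. the app uploading decrypted content separately — which is exactly the distinction drawn above.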
I understand the concern but then to make this available for adults you now have to provide proof of age to companies, which opens up another can of privacy worms.
Theoretically we don't actually need proof of age. Websites need to know when the user is attempting to create an account or log in from a child-locked device. Parents need to make sure their kids only have child-locked devices. Vendors need to make sure they don't sell unlocked devices to kids.
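A minimal sketch of that device-signal idea, assuming a hypothetical `Device-Child-Lock` request header (no such standard exists today; the header name and mechanism are invented purely for illustration):

```python
# Hypothetical sketch only: "Device-Child-Lock" is an invented header name.
# The idea from the comment above: a child-locked device announces itself
# with every request, and the site refuses account creation or login when
# it sees the signal — so adults never have to submit proof of age.
def should_block_signup(headers: dict) -> bool:
    """Return True if the request declares a child-locked device."""
    return headers.get("Device-Child-Lock", "").strip() == "1"

assert should_block_signup({"Device-Child-Lock": "1"}) is True
assert should_block_signup({}) is False   # no signal: treated as an adult by default
```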
Children who are smart enough to get access to a given vice without getting caught are more likely to be mature enough to be able to cope with that vice.
Well then don't give them money to do so; it's not like phones grow on trees. If you make selling a phone/internet device to a minor under a certain age threshold an illegal act severely punished by law, in the same way alcohol and cigarettes are, many cases of access are solved. Also, a paid internet subscription doesn't grow on trees either, even though there are free wifi networks.
All imperfect solutions, but they slice the original huge problem into much smaller chunks which are easier to tackle with the next approach.
You just need to provide the government with your name and address and the name and address of the counter party every time you send an encrypted message.
If you don't support this you're obviously a pedo nazi terrorist.
In a way, this is like saying that one trusts total strangers in some random large tech company and total strangers in government agencies to read and/or manipulate conversations that kids have. This also paves the way to disallow E2EE for other classes of people based on arbitrary criteria. I don’t believe this is good for society overall.
There is no reason kids should use so-called smart devices, except making certain companies richer. Kids had healthy development without such crap for thousands of years. We don't discuss what percentage of alcohol should be allowed in beer and wine for kids.
* Classifying accounts as child accounts (moderated by a parent)
* Allowing account moderators to review content in the account that is moderated (including assigning other moderation tools of choice)
In all cases transparency and enabling consumer choice should be the core focus.
Additionally: by default, treat everyone online as an adult. Parents who let their kids online without supervision, or without some setting indicating that the user agent is operated by a child, intend to allow their children to interact with strangers. This tends to work out better in more controlled and limited circumstances where the adults involved have the resources to provide suitable supervision.
At the same time, any requirements should apply only to commercial products. Community (gratis / not for profit) efforts presumably reflect the needs of a given community.
Unfair presentation. What they suggested was more akin to, "Assume someone with keys is an adult, and let them start the truck."
Dad should either know his children would never drive the truck without permission, or keep his keys as safe as his wallet (and if he can't trust his kids with keys, you bet his wallet needs protection).
We know that this isn't really going to reduce harm for children, we know Meta is not seriously going to suffer or change, and we know this is going to be used as a cudgel to beat down privacy and increase surveillance.
Why is it so important that kids have access to the internet anyway that we're willing to sacrifice both our privacy and freedom of speech rights for it when we already know it's damaging their mental health?
We don't need all this privacy invasion if we just didn't give kids a smartphone with a data plan.
This is a good thing for “social” media. If you use any social media app (especially those owned by Meta) you should assume that absolutely everything you do is for full public consumption. Maybe these changes will make everyone stop thinking that anything is private when using “social” media apps.
Centralized organizations with proprietary software can never offer meaningful end to end encryption because they can just ship an app update to disable or backdoor it at any time.
It is better for them to be forced to turn off the security theater so people that need actual privacy can research alternatives.
That's why Signal requires a phone number. You can't talk to people you don't know because complete strangers don't give you their phone number. And if you do spam random numbers, they'll report you to the police and you can be tracked down based on your identifier, which still doesn't leak the chats between you and people you actually know.
Meta has a way to read your E2EE messages. I don't know what it is, but if they didn't then they wouldn't do it.
There's a difference between E2EE between friends who want to remain secure, and E2EE between strangers in an attempt for the platform to avoid legal liability for spam.
> Another poster child for Meta's lobbying (bribery) to encourage OS level age verification. (numerous recent references in HN posts)
The references I saw showed Meta had lobbied for some of the laws that require age verification be done by the site or by third party ID services. They did not show that Meta lobbied for any of the OS bills.
Some showed that Meta had lobbied in some of the states with those bills, but they just showed Meta's total lobbying budget for those states.
> The New Mexico attorney general’s office created multiple fake Facebook and Instagram profiles posing as children as part of its investigation into Meta. Those test accounts encountered sexually suggestive content and requests to share pornographic content, the suit alleges.
> The fake child accounts were allegedly contacted and solicited for sex by the three New Mexico adult men who were arrested in May of 2024. Two of the three men were arrested at a motel, where they allegedly believed they would be meeting up with a 12-year-old girl, based on their conversations with the decoy accounts.
and
> “The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls,” Bejar said.
This is what it's about right? The article doesn't make it seem like encryption is meaningfully part of this case at all.
> Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.
There's no indication that the decision, or the announcement, is directly related to the trial; they just happened at the same time. It's a link drawn by CNN, without presenting any clear connection.
I cheer any decision that holds any private web property (like Facebook) accountable for its users' actions.
It helps to reduce the hegemony of large social platforms and promotes privately owned websites. For example, I know everyone who has permission to post on my website (or I pre-moderate strangers' comments), and who is ready to take responsibility for what my website publishes.
Currently the legal stance seems strange to me -- large media platforms are allowed to store, distribute, rank and sell strangers' data, while at the same time they claim they are not responsible for it.
If you haven't already, you should look at the court case that prompted the creation of the current legal framework of Section 230. Prodigy was sued because of the things being said in public chatrooms. Should the host for an IRC server be responsible for everything said on the IRC server? Should they pre-moderate all the messages being said there? Should dang premoderate every post on this site?
The reality is that people who cheer for this stuff are going to be unreasonably shocked when it comes to bite them later. Once the government's done going after the big guys, the little guys are next, and unlike the big guys, they can't absorb a few fines and judgments.
They had to pay about $375 million. That's a lot of money, but I suspect that Facebook has made considerably more than that on targeting children.
I'm hardly the first person to use this logic, but if they make more money breaking the law than they have to pay in fines, then it's not a fine, it's a business expense.
Agree with your take. However, to put more perspective on the amount: this is just New Mexico, so the per capita fine is actually quite large, and if it were applied similarly nationally or globally it could have a significant impact on their business, forcing some change.
Aurornis | 20 hours ago
The New York case has explicitly gone after their support of end-to-end encryption as a target: https://www.reuters.com/legal/government/meta-executive-warn...
cristoperb | 19 hours ago
That can't be true, otherwise in what sense is it E2EE?
gzread | 19 hours ago
Has anyone actually audited it?
themafia | 19 hours ago
Absolutely. Particularly where they've been found to be guilty.
> but we should be aware that these cases are one of the key reasons why companies are backtracking from features like end-to-end encryption
Why _social media_ companies are backtracking. I'm extremely nonplussed by this outcome.
> concerns that allowing teens
Yes, because that's what we all had in mind when considering the victims and perpetrators of these crimes.
simmerup | 18 hours ago
That ship has sailed
pylua | 18 hours ago
It is actually terrifying. If you write something out of context or upload an image out of context you can be in big trouble.
intended | 12 hours ago
We are at a point where we are picking and choosing collateral damage targets.
cr125rider | an hour ago
It seems a bit silly to think security abstinence is the solution.
intended | 12 hours ago
Firms have a fiduciary duty to shareholders and profit.
On the other hand, you ultimately decide the rules and goals that govern government organizations, and they do not have a profit maximization target.
They aren’t the same tool, and they work for different situations.
The E2EE slippery slope is a different challenge, and for that I have no thoughts
kelseyfrog | 18 hours ago
It's ok to drive Dad's truck unless he catches you and tells you no.
intended | 12 hours ago
Harm to kids is actually happening, and this is always going to be a hot button topic.
E2E is critical for our current ability to communicate online, but will be a lower priority when pitted against child safety.
Fighting the good fight is one thing, fighting for the sake of it, without a plan that addresses the tactical reality is another altogether.
Personally, I think E2E will be defended, but it’s becoming a lightning rod for attention. As if removing encryption will solve the emerging issues.
I suspect providing alternatives to champion, such as privacy preserving ways to verify age, will force a conversation on why E2E needs to go.
johnea | 20 hours ago
They very much want to push this liability off onto someone else...
As far as end-to-end encryption, on SM sites (social media or SadoMasochism, however you want to read it) I don't really see the need.
kstrauser | 19 hours ago
Online child exploitation should be a strict liability offense.
Aurornis | 19 hours ago
You don't see any benefit to allowing people to encrypt their private communications in a way that can't be accessed by the company?
It's weird to see tech news commenters swing from being pro-privacy to anti-privacy when the topic of social media sites comes up.
vel0city | 17 hours ago
https://en.wikipedia.org/wiki/Stratton_Oakmont,_Inc._v._Prod....