stu2b50 | 20 hours ago
Tbh I don’t really see the issue. It’s just part of the application. iOS ships with a bunch of local models for things like speech to text as well.
If anything it’s better than being a cloud model. It’s just a series of matrices in the end, it’s not radioactive.
[OP] rodrigo | 20 hours ago
iOS to this day explicitly asks if you want to turn Apple Intelligence on and respects it.
Chrome is installing a 4 GB thing with neither consent nor knowledge of the user. For most people in most parts of the planet, 4 GB isn't a negligible amount of storage memory.
stu2b50 | 19 hours ago
It asks for Apple Intelligence, but Apple does not ask if you want the models for speech to text - those are going on no matter what.
Is the issue just that it's 4 GB? It's somewhat annoying, but honestly I don't particularly care that much.
[OP] rodrigo | 18 hours ago
Yes. It's a browser downloading 4 GB without warning.
JCAPER | 20 hours ago
I get what you mean but.... It's Google:
https://www.bbc.com/news/articles/c3dr91z0g4zo
(TL;DR of the lawsuit: Google collected info from users even though those users had turned off tracking)
I'm not excusing them, to be clear. Just saying that this behavior isn't out of character for them.
babypuncher | 19 hours ago
Really it's just another reason why anything Google makes should be avoided like the plague
goose | 19 hours ago
But these are two different things, no? If you don't enable Apple Intelligence, are the models removed? If Apple updates the models in the future, even if you do not have it enabled, are those updated models shipped to the device?
[OP] rodrigo | 18 hours ago
As far as I know, they aren't installed if you don't enable Apple Intelligence and if you disable it, the LLM is removed.
goose | 18 hours ago
Ah, I see. I've never used Apple Intelligence, and don't own any of their hardware, I was unaware of the behavior difference. Thanks for clarifying. That does seem like the better way to handle it, opt in rather than opt out.
I think the bigger concern might be bandwidth? Hell, just the cache directory for my Firefox is a hair over 1 GB. 4 GB is not negligible, as you say, but on modern systems it's also not as huge a piece of disk real estate as it once was. I'm fortunate enough to have internet with unlimited bandwidth, but I have friends in countries where ISPs impose per-GB caps. The one in particular I'm thinking of gets 200 GB per month; an unexpected extra 4 GB hits harder on that scale than I think it does on disk.
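To put rough numbers on that (just the figures from this comment, nothing measured):

```python
# Share of a capped data plan eaten by one silent model download.
cap_gb = 200       # the monthly cap mentioned above
download_gb = 4    # the on-device model

fraction = download_gb / cap_gb
print(f"{fraction:.0%} of the monthly cap")  # prints "2% of the monthly cap"
```

And that's per download; a model that gets re-fetched on updates multiplies it.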
Tiraon | 17 hours ago
I think comments here demonstrate how much the personal computing landscape shifted from even a decade ago.
User has negligible agency without going far out of their way and even then it is not great.
How is a 4 GB model that is not even used in Chrome's most prominent AI workflow going to help the end user?
For that matter, did the browser at any point ask whether the AI functionality should be enabled at all? In fact, the user-facing settings are generally ridiculously useless.
vord | 9 hours ago
There is a reason Microsoft periodically 'accidentally' resets settings during updates.
wervenyt | 12 hours ago
(not directed at you)
some people need to grow the hell up and take accountability for their part in creeping fascism and the surveillance state. and no, a job is not a good justification for abdicating your duty to yourself and your neighbor. brainwashing yourself with tiktok and taking any compromise that makes your life just a little easier is, in fact, your decision.
gianni | 19 hours ago
Beyond the disk size, what's wrong with this? Isn't local machine learning better than shipping your data off to some cloud provider? Or is the problem machine learning?
To me, being upset or suspicious about this seems completely arbitrary: Chrome contains countless other binary blobs into which a user has no insight and to which they cannot consent. They are part of the application. Chrome has contained other machine learning algorithms and features for years, but those have been baked in.
If people don't like these features that's fine, there are lots of alternatives to choose from (personally, I use Helium). But to be upset about this specific instance seems arbitrary to me. And to claim that it's somehow nefarious (i.e. the consent part) seems disingenuous. Consent is granted when the user downloads and begins using Chrome—why would Chrome need additional consent to download/update one of many external components?
[OP] rodrigo | 18 hours ago
My problem with this is the size of an update issued silently. Maybe in the Global North 4 GB of storage means nothing. In other parts of the world, it isn't nothing.
Lia | 17 hours ago
I'm in the global north and it's not nothing to me.
If every piece of software I have installed started doing the same, I would quickly run out of storage. It's an unacceptable practice. As well, my building has poor wifi and I loathe anything that eats my bandwidth just to exist and function, let alone a browser that repeatedly downloads 4 effin' gigabytes without my consent! I don't need the added friction on my already strained system that cost me a lot of money even when I bought it second hand.
Lia | 17 hours ago
I guess you're assuming that other Chromium-based browsers aren't doing a similar silent install?
gianni | 17 hours ago
Yes—without auditing the source myself, my expectation from Helium is that it would not include this feature.
Despite that, I would be surprised to learn that this feature and model ships in Chromium.
all_summer_beauty | 17 hours ago
I also didn't read every single word in this post, but I think the argument that this is fine since it's a local model (and thus more private) is weakened by this part:
Here is the part that should make every privacy lawyer in the audience put their coffee down. When Chrome 147 launches against an eligible profile, the omnibox - the address bar at the top of the window, the most visible piece of real estate in the entire browser - renders an "AI Mode" pill to the right of the URL field. A reasonable user, seeing "AI Mode" sitting in their browser's most prominent UI element in 2026, with the well-publicised existence of on-device LLMs in Chrome and a 4 GB Gemini Nano binary already silently installed on their disk, is going to draw what feels like an obvious inference - that the visible AI Mode is using the on-device model, that their queries stay on the device, that the local model is what powers the local-looking surface.
Every part of that inference is wrong. The AI Mode pill in the Chrome 147 omnibox is a cloud-backed Search Generative Experience surface - every query the user types into it is sent over the network to Google's servers for processing by Google's hosted models. The on-device Nano model is not invoked by the AI Mode UI flow at all. They are entirely separate code paths - the most visible AI affordance in the browser does not use the local model the user has been silently given, and the features that do use the local model (Help-Me-Write in <textarea>, tab-group AI suggestions, smart paste, page summary) are buried in textarea-context menus and tab-group right-click menus that the average user will discover, on average, never.
Think about what that arrangement actually is. The user pays the storage cost of the silent install (4 GB on disk, plus the bandwidth of the silent download). The user's most visible AI experience - the pill they actually see and click - delivers no on-device benefit at all because it routes to Google's servers regardless. The on-device model is therefore a sunk cost imposed on the user, with no offsetting transparency benefit at the surface where transparency would matter most. To put it another way - if the on-device install had given the user a clear "your AI Mode queries stay on your device" property, the install would have a defensible privacy framing (worse storage, better data flow). It does not - the install gives Google a future-options resource (the model can be invoked by other Chrome subsystems without further server round-trips) at the user's disk-and-bandwidth expense, while the headline AI surface continues to send the user's queries to Google as before. The local model is a Google-side asset positioned on the user's device - it is not a user-side asset and one could argue it is nothing but sleight-of-hand to hide that actually, the visible AI mode is NOT using the local model.
That arrangement, on its own, engages at least three of the deceptive design pattern families catalogued in EDPB Guidelines 03/2022. It is misleading information because the visible label "AI Mode" creates a false impression about where processing occurs - the label does not say "cloud-backed" or "queries sent to Google", and a reasonable user with knowledge of on-device AI will infer locality from the proximity of an on-device 4 GB model on their disk. It is skipping because the user is not given a moment to choose between local-only and cloud-backed AI surfaces - both are switched on by the same upstream rollout, with no per-feature consent. And it is hindering because turning AI Mode off does not also remove the on-device install, and removing the on-device install does not turn AI Mode off - the two are separately controlled, and discovering both controls requires knowing about both chrome://flags and chrome://settings/ai, neither of which is obvious in default Chrome.
So: not just a non-consented install, but a non-consented install that doubles as cover for a parallel cloud-backed surface that misrepresents to the user where their typing is being processed. Both layers compound the consent problem.
I can definitely understand how some might not see this as damning, but I think it at the very least changes the conversation.
scojjac | 15 hours ago
The first paragraph of this quote is hilarious to me, because there is no way I would expect an AI Mode from Google to do anything whatsoever on-device. I'm pretty surprised they're installing a model at all, and I wonder what its purpose is. The author calls the existence of an on-device LLM in Chrome "well-publicised", yet this is the first I've heard of it.
Again, it's Google — why would anyone expect they're doing anything on device when they can do it in the cloud?
vord | 9 hours ago
As I mentioned in another post, I think this is the first step toward them offloading not just your own requests, but the processing of other people's asynchronous requests as well.
They'll save some money offloading chatbot duties. They'll save a fuckton by offloading low-priority API calls.
The same reason Microsoft enabled P2P update distribution by default.
vektor | 9 hours ago
Those are two massively different things. P2P update distribution makes a lot of sense from all kinds of perspectives. Torrents and related approaches are just ridiculously efficient. If MS didn't do that, people would complain that they were leaving emission reductions on the table...
Offloading LLM processing to other users is not at all comparable, and I think it's a bit conspiratorial to expect that until privacy-preserving LLMs are here. (The cool stuff, where you encrypt data, run a computation on it, then decrypt the result; it's all homomorphic, so the computer doesn't actually know what it's computing.) Everything else is a massive user-privacy nightmare for Google. They take on that "burden" because they trust themselves not to compromise your privacy too badly (or too noticeably, if you're being cynical) and to keep your data safe. Give user data from one user to another and you have no guarantees of any kind; a malicious user could compromise any of it at will.
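The flavor of that is easy to demo with textbook (unpadded) RSA, which happens to be multiplicatively homomorphic. This toy uses a deliberately tiny, insecure key and is nothing like the fully homomorphic schemes an LLM would need, but it shows computing on data the computer never sees in the clear:

```python
# Textbook RSA with the classic tiny demo key (p=61, q=53).
# Unpadded RSA is multiplicatively homomorphic:
#   E(a) * E(b) mod n  decrypts to  a * b.
# Insecure and illustrative only.
n, e, d = 3233, 17, 2753

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
c = (encrypt(a) * encrypt(b)) % n  # computed purely on ciphertexts
assert decrypt(c) == a * b         # the "server" never saw 7, 6, or 42
```

A scheme that could hide an entire LLM inference would need to support arbitrary additions and multiplications under encryption, which is far heavier than this single-operation trick.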
vektor | 9 hours ago
Exactly my thinking.
By the way, this is very convenient thinking if you want to make someone your enemy. The enemies of fascism are simultaneously weak enough to be undeserving of any sympathy or rights, and strong enough that "we must all band together". Migrants are simultaneously useless unemployed moochers and taking your jobs. Google is simultaneously very forward about its on-device AI efforts, and snuck this model onto your device without anyone's knowledge. So which is it? The answer depends on what the author wants you to believe.
[OP] rodrigo | 20 hours ago
At this point, Chrome's autoupdate is no different from a backdoor, is it?
skybrian | 20 hours ago
Why would it be like a backdoor? An LLM isn’t malware. My understanding is that it’s part of a proposed browser API.
Maybe some don’t like the update, but the user never had any control over browser API changes.
Pavouk106 | 20 hours ago
I think OP meant that autoupdate can basically install whatever Google wants, including a backdoor, if they wanted to.
unkz | 20 hours ago
How is that different “at this point” than before?
Pavouk106 | 19 hours ago
I didn't say it is different than before. I just wanted to add another point of view.
[OP] rodrigo | 18 hours ago
It's reiterated misbehavior by Google regarding Chrome. In fact, Chrome is a Trojan horse for Google's businesses: first (and foremost) ads, now AI.
[OP] rodrigo | 20 hours ago
Isn't it? 😁
skybrian | 19 hours ago
Only if you call anything you don’t like “malware.”
Tiraon | 2 hours ago
Malware is simply malicious software. While this may not fit the strict definition, on a scale from malware to well-behaved software I would not place it all the way to the right.
JCAPER | an hour ago
I'm not following. Could you clarify what you mean?
stu2b50 | 18 hours ago
I know this reply isn’t that serious, but genuinely, no. Shipping a local model is actually more inert than your standard software updates. A ML model is just a bunch of matrices. It’s inert in the same way an image (which is also a big matrix) is.
Shipping executable code, which is what most updates are, is much more malware-like.
skybrian | 18 hours ago
Presumably the update also includes code to do inference, so you could think of the combination as comparable to shipping a general-purpose interpreter, along with the data it needs to run.
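That split is easy to picture in miniature: the downloaded file corresponds to the inert matrices below, and the few lines of code are the "interpreter" that animates them. All shapes and numbers here are made up for illustration:

```python
# A model file is inert data (matrices); behavior lives in the small
# runtime that multiplies them. Hypothetical 2-layer network, pure Python.

weights = {  # the analogue of the 4 GB download: just numbers
    "W1": [[0.5, -0.2], [0.1, 0.4]],
    "W2": [[1.0], [-1.0]],
}

def matmul(x, W):
    # dot product of vector x with each column of W
    return [sum(xi * wij for xi, wij in zip(x, col)) for col in zip(*W)]

def relu(v):
    return [max(0.0, x) for x in v]

def forward(x):
    # the executable part: this code, not the weights, is what could misbehave
    return matmul(relu(matmul(x, weights["W1"])), weights["W2"])

print(forward([1.0, 2.0]))
```

Swapping in different weights changes what the network computes, but only within what `forward` allows, which is the point about the blob itself being inert.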
vektor | 9 hours ago
Not quite general-purpose; there's only so much behavior you can pack into an NN, and whatever I/O harness is built around this NN is the actual potential for abuse. But curiously, nary a mention of that. If the I/O code around the model unnecessarily and unpromptedly sends data to Google, that's bad. But the 4 GB blob has nothing to do with that.
I get that 4 GB of space on-device isn't nothing. But as far as malware concerns go, this is very unsubstantiated. On-device LLMs are generally privacy-friendlier than cloud ones, and you have a very clear overview of the ecological impacts; what's not to like? Oh, it's opt-out instead of opt-in? I haven't seen anyone make the case that this model is somehow activated without a user request, and even then I'd be indifferent about it as long as no sensitive telemetry is triggered.
I think it's a safe bet there's bigger privacy violations of opt-out telemetry without any AI involvement buried in the Chrome settings... Sometimes I forget how to even.
skybrian | 8 hours ago
Yeah, this is bad because it's bloat and perhaps a security issue if the JavaScript code that calls the new API is written badly. But you could do the same thing by calling an LLM remotely. Calling it malware is a distraction.
Lia | 18 hours ago
Holy crap, does this apply to all Chromium-based browsers? I absolutely can't take 4 GIGABYTES of bloat on my system from a browser, for it to do one thing, especially when I actively don't want to do said thing, FFS!
glesica | 16 hours ago
Do you not have 4GB of storage available, or do you just not like the idea of 4GB of data sitting there? Just curious. I understand that there are many people in the world for whom storage space is still a real constraint. On the other hand, my personal experience is that most of the people I interact with who complain about storage space are actually just kind of OCD about it, the kind of people who go spelunking in system directories to delete things they think they'll never use.
crialpaca | 16 hours ago
I'm not the person you replied to, but this amount of space is a big deal for me. I was alarmed when I started doing content creation recently that I had almost no storage space left. 4 gigabytes is a lot of space I could be using for photos!
glesica | 15 hours ago
That's fair. Actually, I recently tried to install World of Warcraft on my Windows machine (I haven't played in like 15 years, but a friend suggested I revisit it) and I literally didn't have enough space.
Lia | 15 hours ago
I do have, and must have at all times, lots of free storage available because I use different types of production software where a single file can be 4GB or more, and cache storage will easily run out. The software installations themselves take up space too.
A browser is one of the least important pieces of software on my device, and it's completely preposterous for one to take up this much space for one single feature. I'm vigilant enough that I don't use Chrome, but I really shouldn't have to be just so I can get my work done without some completely unnecessary and counterproductive friction where my private resources are used to benefit a bunch of out-of-touch multimillionaires.
delphi | 16 hours ago
I can't imagine it's without consent. Like, sure, maybe without what we, the nerds with strong opinions, would call consent, but there's DEFINITELY a provision somewhere in the EULA that they're allowed to do this.
vord | 9 hours ago
My money is on the EULA being updated shortly to let Google leverage your idle Chrome window (or the background update service) to process other people's requests. It'll probably be called something like 'AI mesh accelerator.'
Remember: None of the AI stuff makes economic sense until they start charging per-token at 8x the current rate. Quickest way to reduce those costs is to offload that processing to your product.
Wes | 9 hours ago
While paying with compute time actually feels like a reasonable alternative to the current ads-based ecosystem, I just don't see such a weak model being useful at any scale. It's designed for edge devices, and wouldn't cut it for a commercial service.
gianni | 16 hours ago
You are absolutely right—there is. I personally read the Chrome and Google Services EULA today after seeing this article.
https://policies.google.com/terms
kacey | 12 hours ago
I think this is the PR which added this ... in 2023? I guess Google takes a while to roll out their features, unless I grabbed the wrong change ...
(edit) also the article has a few AI-writing hallmarks; not sure if that's been pointed out already. Could also be that they're a wee bit uncreative in their wording, however.
(edit 2) I'm not familiar enough with Chromium internals to understand much, but it looks like this is the logic that determines whether one's machine is eligible for downloading the model. Not sure if someone else sees a nice way to disable it from doing so there.
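For illustration only, the general shape of such an eligibility gate might look like the sketch below. Every name and threshold here is invented, not taken from Chromium; the real checks live in Chromium's own code and presumably involve more than disk space (RAM, platform, rollout flags):

```python
# Hypothetical sketch of a device-eligibility gate for a large model
# download. All constants and names are invented for illustration.
import shutil

MODEL_SIZE_BYTES = 4 * 1024**3           # the ~4 GB on-device model
ASSUMED_HEADROOM_BYTES = 20 * 1024**3    # made-up free-space requirement

def eligible_for_model_download(path="/"):
    """Return True if the disk at `path` has room for model plus headroom."""
    free = shutil.disk_usage(path).free
    return free >= MODEL_SIZE_BYTES + ASSUMED_HEADROOM_BYTES

print(eligible_for_model_download())
```

If the real gate works anything like this, keeping free space below its threshold would be one (fragile) way to make the download skip a machine, which may be why no clean opt-out surfaces in that code.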
mellowminx | 8 hours ago
I couldn't tell if my device had this Chrome AI, but I recently switched to Brave anyway so I just went ahead and uninstalled Chrome. I did notice that Chrome occupied 6GB on my hard drive while Brave only occupied 1GB.
P.S. Just in case anyone else is thinking of switching too, I found the switch to Brave super easy, since I can still use Chrome extensions, and I was able to auto-import all my bookmarks/history/settings from Chrome too. I also like that Brave has a one-click mute/unmute audio per tab function. Never knew I needed that! Chrome doesn't have it.
Deely | 6 hours ago
It's funny to read this comment, because it just occurred to me that I have Chrome installed but last used it a few months ago... at least now I have a proper reason to finally ditch it. I have Vivaldi and Edge anyway, heh.
Update: nope. Still want to use the Chrome Remote Desktop feature.
Lia | an hour ago
In case there are folks who want to keep using Chrome for some reason, but don't want this bloat on their HD, here's an article that explains the steps of stopping this install permanently on Windows 11.
(@Deely)