Browser extensions have much looser security than you would think: any extension, even one that just claims to change the style of a website, can see your input type=password fields. It's ludicrous that access to those does not need its own permission!
It's hard to see how you would implement that: any script run within the context of the page needs access to these fields for backwards-compatibility reasons, so the content script of the extension would just need to find a way of running code in the context of the page to exfiltrate the data. It could do this by adding script tags, etc.
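To illustrate the point (a minimal sketch, not code from any extension discussed here): a content script already runs against the page's DOM, so password fields are readable without any dedicated permission.

    // Minimal sketch: any content script injected into a page can read
    // input type=password fields directly from the DOM.
    document.querySelectorAll('input[type="password"]').forEach((field) => {
      field.addEventListener("input", () => {
        // The extension now has the plaintext value; a malicious extension
        // could forward it to any server its host permissions allow.
        console.log("password field updated, length:", field.value.length);
      });
    });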
Browsers break backwards compatibility for security all the time. Most recently Chrome made accessing devices on a local network require a permission. They completely changed the behavior of cookies. They break loads of things for cross origin isolation.
I have published an extension [1] that has 100k+ users and I've probably received hundreds of emails over the years asking me to sell out in one way or another. It's honestly relentless. For that reason I also only trust uBlock Origin, Bitwarden and my own extensions.
I'd also note that all this spam is via the public email address you're forced to add to your extension listing by Google. I don't think I've ever had a single legitimate email sent to it. So yeh, thanks Google.
[1] https://chromewebstore.google.com/detail/old-reddit-redirect...
I was just having a quick search and the only email I can find that offered a price range up front was for $0.1-0.4 per user, and that was from 2023. So I assume up to a dollar per user these days?
Respect for not selling out. I have to admit though... If I had a browser extension and someone suddenly offered me a million dollars for it, I think I would take it.
This realization made me distrust any system where it is even possible to sell out. In order for a system to be trustworthy, it must be impossible for this sort of exploitation to ever occur, no matter how much money they put on the table.
I often make the argument that uBlock Origin is so essential that it should be built into the browsers instead of being a separate extension. The restrictions imposed by manifest v3 are good, it's just that uBlock Origin is special enough that it should be able to bypass them.
Unfortunately, the huge conflicts of interest make this unrealistic. Can't trust developers funded by ad money to develop an ad blocker.
> The only extension I trust enough to install on any browser is uBlock Origin.
Note however that the origin of uBlock Origin is that the developer Raymond Hill transferred control of the original uBlock project to someone who turned out not to be trustworthy, and thus Hill had to fork it later.
I never transferred the extension in the Chrome store. The Chrome store extension has always been the one from the repository I control, and I've had full control of it since I created it back in June 2014.
My honest reaction to your comment is "What? No!".
I want to block ads, block trackers, auto-deny tracking, download videos, customize websites, keep videos playing in the background, change all instances of "car" to "cat" [1], and a whole bunch of weird stuff that probably shouldn't be included in the browser by default. Just because the browser extension system is broken it doesn't mean that extensions themselves are a problem - if anything, I wish people would install more extensions, not less.
[1] https://xkcd.com/1288/
In principle I agree with you; there is just so much crap online that it's tempting to add just one more extension to fix something.
Looking at my own installed extensions, I have a password manager, Privacy Badger and Firefox Multi-Account Containers, which I suppose are the three I really need. Then I have one that puts the RSS icon back in the address bar, because Mozilla feels that RSS is less important than having the address bar show me special dates, and two that remove very specific things: one for cookie popups and one for removing "Sign in with Google".
The only one of these I feel should actually be a plugin is my password manager. Privacy management (including cookies), RSS and containers could just be baked into Firefox. All of those seem more relevant to me than AI.
Maybe adding a GreaseMonkey lite could fix the rest of my problems, using code I write and control.
Moving the toggle for "accounts.google.com" to full blocking in Privacy Badger ought to do it.
Heads up, full blocking of "accounts.google.com" will break some login pages entirely. But it is a good domain to fully block as long as you're comfortable using the "Disable for this site" button when something goes wrong.
I think the industry needs to rethink extensions in general. VS Code and browser extensions seem to get very little thorough review or thought. A lot of enterprises aren't managing them properly.
This is why I only run open source extensions that I can actually audit. uBlock Origin, SponsorBlock, the kind of tools where the code is available and the developer isn't anonymous. The Chrome Web Store is basically unregulated and Google doesn't care as long as they get their cut. Open source at least gives you a chance to see what you're installing before it starts exfiltrating your data to some server in a country you've never heard of.
I agree but let me play the devil's advocate. I'll channel Stallman:
Same argument can be applied to all closed source software.
In the end its about who you trust and who needs to be verified and that is relative, subjective, and contextual... always.
So unless you can read the source code and compile it yourself, on a system you built, on an OS you also built from source, on a machine built before server management backdoors were built into every server... you are putting your trust somewhere and you cannot really validate it beyond wider public perceptions.
> How do you check that the open sourced code is the same one that you are installing from the extension repository and actually running?
Extensions are local files on disk. After installing it, you can audit it locally.
I don't know about all operating systems but on Linux they are stored as .xpi files which are zip files. You can unzip it.
On my machine they are installed to $HOME/.mozilla/firefox/52xz2p7e.default-release/extensions but I think that string in the middle could be different for everyone.
Diffing it vs what's released in its open source repo would be a quick way to see if anything has been adjusted.
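As a rough sketch of that workflow (paths are placeholders, and this only works when the published files are unprocessed copies of the repo rather than a build output):

    // audit-xpi.js - unpack an installed .xpi and diff it against a repo checkout
    const { execSync } = require("node:child_process");
    const { mkdtempSync } = require("node:fs");
    const { tmpdir } = require("node:os");
    const { join } = require("node:path");

    const xpiPath = process.argv[2];  // e.g. ~/.mozilla/firefox/<profile>/extensions/foo.xpi
    const repoPath = process.argv[3]; // local checkout of the extension's open source repo

    const workDir = mkdtempSync(join(tmpdir(), "xpi-audit-"));
    execSync(`unzip -q "${xpiPath}" -d "${workDir}"`); // .xpi files are just zip archives

    // diff exits with code 1 when differences are found, so don't treat that as a failure
    try {
      const out = execSync(`diff -r "${workDir}" "${repoPath}"`).toString();
      console.log(out || "No differences found.");
    } catch (err) {
      console.log(err.stdout.toString());
    }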
Extensions are trivial unless they have to run external software or services. Download the extension, extract the source, audit it with a good thinking model and either strip out all third party URLs/addresses or have the agent clone the functionality you want.
The open source one automatically publishes to the Chrome Store from GH actions so that there is no human involvement in the deployment process.
I'm currently in the process of setting that up for the one I'm building (because this transparency is very important to me), and it is a pain in the butt to do so. You have to go through a few verification processes at Google to get the keys approved.
An extension from a trusted, non-anonymous developer which is released as open source is a good signal that the extension can be trusted. But keep in mind that distribution channels for browser extensions, similarly to distribution channels for most other open source packages (pip, npm, rpm), do not provide any guarantee that the package you install and run is actually built verbatim from the code which is open sourced.
Actually, npm supports "provenance", and since it eliminated long-lived access tokens for publishing, it encourages people to use "trusted publishing", which over time should make the majority of packages auto-provenance-verified.
https://docs.npmjs.com/trusted-publishers#automatic-provenan...
Unless the Chrome web store integrates with this, it puts the onus on users to continuously scan extension updates for hash mismatches with the public extension builds, which isn’t standardized. And even then this would be after an update is unpacked, which may not run in time to prevent initial execution. Nor does it prevent a supply chain attack on the code running in the GitHub Action for the build, especially if dependencies aren’t pinned. There’s no free lunch here.
If the RPM/deb comes from a Linux distribution then there is a good chance there is a separate maintainer and the binary package is always built from the source code by the distro.
Also if the upstream developer goes malicious there is a good chance at least one of the distro maintainers will notice and both prevent the bad source code being built for the distro & notify others.
> This is why I only run open source extensions that I can actually audit.
How far does your principle extend? To your web browser too? Google Chrome itself is partly but not entirely open source. Your operating system? Only Linux? Mac and Windows include closed source.
I didn't claim that it's implausible. I asked a question.
On the other hand, it's not that implausible either that someone might be running Google Chrome, Windows, Mac, etc. We know that many HN commenters do. Thus, while the OP may be 100% consistent, "I only run open source extensions that I can actually audit" would not be a consistent principle for those who also use closed source software.
> You don’t have to apply the same policies to everything you use.
What's the reasoning behind it, though?
You can arbitrarily apply different policies to different things, but there's no rhyme or reason to that.
If the difference ultimately comes down to trusting certain developers to an extent that you don't need to audit their source, then I'm not sure why that couldn't also be true of certain extension developers.
This is the safest way. You also want to disable auto update to version lock, which means using Firefox or Safari or loading unpacked if you use Chrome.
Do you also audit every part of every car you buy or medicine you take? Or do you rely on large well-established institutions to do that for you?
"Dont trust google" imo is the wrong response here. We are at the mercy of our institutions, and if they are failing us we need mechanisms to keep them in check.
Well, I see how, especially for people who are close to death and want to provide for their loved ones, the answer to "Your money or your life" might lean in the other direction.
>Do you also audit every part of every car you buy or medicine you take? Or do you rely on large well-established institutions to do that for you?
Cars are under quite strict laws that software isn't. And there is only a small number of car vendors, while there are several orders of magnitude more extension vendors. Also, a car vendor is a big company with many audits and controls, while an extension "vendor" could just be some guy in his garage office who just sold it to scammers - and that happens even with popular extensions.
And I still wouldn't trust a modern car using subscriptions and code updates.
Also, car companies have a lot at stake and are a clear target. The scammer is hard to even identify, and has no reputation to worry about. Of course in case of a sold extension, the original author of the extension may have a reputation they care about, but only if they're still making other extensions.
There are no established institutions for checking add-ons. The stores claim to do some checks, but it seems plenty slips through their net. It's also common sense to not buy something critical from a random anonymous source on the internet.
> "Dont trust google" imo is the wrong response here.
Straw man. The argument is that by installing random extensions you trust anonymous developers *because* Google doesn't audit. I'll cite the parent to spare you the effort of reading it again:
> The Chrome Web Store is basically unregulated and Google doesn't care.
Yes, I trust the contents of the medicine I buy at the drug store more than I trust the drug dealer on the corner. That's why they hand out test kits for free at raves.
No, Safari is really no different here from Chrome, and indeed there's broad compatibility between the extension API, such that in many cases you can use a Chrome extension unmodified in Safari.
Annoyed with how the AWS console sometimes changes regions on its own, I recently decided that I need an extension to make the current region displayed prominently. After a bit of research, I found the AWS Colorful Navbar [0] extension, which does pretty much exactly what I wanted, but (understandably) requires granting it "This extension can read and change your data on sites" on `*://*.console.aws.amazon.com/*`, which I'm not willing to give to an external extension. So my solution was forking the repo [1], carefully auditing the code, and then installing it from a local clone (which they actually have a nice explanation for). Going forward, I think I'll try using this approach for all sensitive extensions.
[0] https://chromewebstore.google.com/detail/aws-colorful-navbar...
[1] https://github.com/nalbam/aws-navbar-extension
I don't really understand the complaint here. It seems most of those extensions have it in their literal purpose to send the active URL and get additional information back, in order to do something locally with it.
And why does this site have no scrollbar?? WTF, is web design finally that broken?
We beg to differ. Consider for example "BlockSite Block Websites and Stay Focused": why would you need to send browsing data to a remote server if your job is only to block selected domains?
If you look at the request made, then it seems to check the category of the site, for whatever reason. I don't know that extension, so I don't know if this is a legit use, sloppy use or harmful. I'm also not saying they found nothing at all. But looking through what they found, they seem to have not even thought much about whether those cases are legit and in the expected and necessary realm of actions the add-on is supposed to do, or whether it's really harmful behaviour. I also don't see anything about how often the request was made. Was it on every URL change, or just once/occasionally?
This whole article is a bit too superficial for me.
Yes, obviously that is possible, but the least one should do then is look up what's really happening. These are browser add-ons, the source code is available. But instead they are looking from the outside and raising the alarm about something they don't understand. That's just poor behaviour and harmful in today's climate.
If you read their full paper, they do technical analysis confirming findings in many cases. Many other researchers have done the same in the recent past.
Full paper also says that the unique URLs were later requested by crawlers, which confirms server-side collection.
What happens server-side is also confirmed by the palant.info article that shows a graphic provided by a major data broker that shows exactly how they mis-use data collected by extensions under false pretenses.
It's far from speculation when there's both technical evidence collected by researchers and direct evidence provided by the bad actors themselves.
>Before installing, make each user click a checkbox what access the extension has
However, as I've seen on Android, updates do happen, and you are not asked if new permissions are granted. (Maybe they do ask, but this is after an update has automatically taken place and new code is installed.)
Here are the two solutions I have, neither are perfect:
>Do not let updates automatically happen for security reasons. This prevents a change in an app becoming malware, but leaves the app open to Pegasus-like exploits.
>Let updates automatically happen, but this leaves you open to remote, unapproved installs.
I guess I shouldn't be surprised at how many use "LibreOffice" or other legit company names to lend legitimacy to themselves. I'm wondering if companies like Zoom don't audit the extension store for copyright claims.
I for sure used to use Video Downloader PLUS when I still used chrome (and before youtube-dl)
https://news.ycombinator.com/item?id=17447816
I'd assumed most people would have jumped ship to Stylus [1] after that, but most people probably never heard anything about what Stylish was/is doing.
[1] https://chromewebstore.google.com/detail/stylus/clngdbkpkpee...
Over 15 years ago now, I had a popular chrome extension that did a very specific thing. I sold it for a few thousand bucks and moved on. It seemed a bit strange at the time, and I was very cautious in the sale, but sold it and moved on.
It's abundantly obvious to me now that bad actors are purchasing legitimate chrome extensions to add this functionality and earn money off the user's data (or even worse). I have seen multiple reports of this pattern.
It is a classic supply-chain attack. The same modality is used by gamers to sell off their high-level characters, and social media accounts do "switcheroos" on posts, Pages, and Groups all the time.
You know, a lot of consumer cybersecurity focuses on malware, browser security, LAN services, but I propose that the new frontier of breaches involves browser extensions, "cloud integrations", and "app access" granted from accounts.
If I gave permission for Joe Random Developer's app to read, write, and delete everything in Gmail and Google Drive, that just set me up for ransomware or worse. Without a trace on any local OS. A virus scanner will never catch such attacks. The "Security Checkup" processes are slow and arduous. I often find myself laboriously revoking access and signing out obsolete sessions, one by one by one. There has got to be a better way.
I think he was just saying that it is a similar business to that - just drawing a comparison that there is a market for it, like selling video game accounts. Also, people who cheat in games will usually buy high-level accounts, because they will be banned much faster if they start cheating on new accounts. This happens all the time in some of the games I play.
If you buy someone's old gaming account (Steam for example) with many years of activity, you can appear more legitimate when trading, therefore making it easier for people to trust you and fall victim to your scam(s)
15 years ago this type of business was probably in its very early stage. There is little that can be done about "selling" extensions. The Chrome Web Store should have tighter checks and scans to minimize this type of data exfiltration.
It's a moronic industry, waiting for the catastrophic data-theft disaster to happen before they do anything... Google is doing it, Apple did it, Zuck did it (the only hindrance Cambridge Analytica had to go over seemed to be the apps developer agreement that devs had to click to promise you won't do anything bad with the personal information of all those Facebook users...).
Which is all the more incredible, considering Blackberry (the phone company that was big before the age of iPhones or YouTube) had a permission model that allowed users to deny 3rd-party apps access to contacts, calendar, etc. The app would get a PermissionDeniedException if it couldn't access something. I remember the Google Maps app for Blackberry, whose solution to that was "Please give this app all permissions or you can't use it"...
For the 10+ years that I've maintained a reasonably popular cross-browser extension, I've been collecting various monetization offers. They simply don't stop coming: https://github.com/extesy/hoverzoom/discussions/670
It's worth reminding people that Firefox extensions that are part of Mozilla's "recommended extensions" program have been manually vetted.
> Firefox is committed to helping protect you against third-party software that may inadvertently compromise your data – or worse – breach your privacy with malicious intent. Before an extension receives Recommended status, it undergoes rigorous technical review by staff security experts.
https://support.mozilla.org/en-US/kb/recommended-extensions-...
Updates must also be vetted before being made available.
While assuming absolutely zero bad will on your part, I would nevertheless find it fair if you were legally on the hook for whatever happened after the sale, unless you could prove that you provided reasonable means for the users of your extension to perform their due diligence on the new owner of the extension.
This is of course easy to say in hindsight, and is absolutely a requirement that should be enforced by the extension appstore, not by individual contributors such as yourself.
I wouldn't find that fair at all. Bad actors should be legally responsible for their bad action. If I sell you a taxi business, and then all of a sudden you decide to start robbing the customers - it's not my fault is it? And just to be clear, I had no idea if my extension was used for nefarious purposes, but in hindsight it probably was.
Customers were sold[1] a lifetime subscription to Honest Guy's taxis, and then Honest Guy does a secret deed to sell his taxi joint to Bad Guy[2] without telling any customer about it. Then customers start getting ripped off in all manner of ways that some of them would have known to avoid if they knew their taxis were being run by Bad Guy.
[1] Of course, the issue here is that no contracts were signed.
[2] In the specific case I was replying to, there was no malice or intent to hide from you as seller. Yet, a better outcome could have been achieved by advertising the sale to those impacted.
I don't think there is any legal support for what I describe above, but in principle whenever a user signs up for Good Thing, and then gets baitswitched to Evil Thing, the main victim is the user, and it is fair to hold responsible everyone involved in the bait-and-switch maneuver.
No, how it should work is that each extension is associated with a private key that is registered with a specific individual or legal entity and implies some kind of liability for anything signed with that key - and if/when the key changes (or the associated credentials), users will be explicitly alerted and need to re-authenticate the plugin.
If the old owner gives their key to the new owner, then they should be on the hook for it.
I was thinking of this yesterday, as I think this is also how domains should work.
I don't disagree with the advice (especially for long lived tokens), but query parameters are encrypted during transit with https. You still need to worry about server access logs, browser history, etc that might expose the full request url.
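A small sketch of that distinction (endpoint and token are made up): the token is encrypted in transit either way, but only the query-string version tends to end up in access logs and browser history.

    // Both of these requests are encrypted in transit over HTTPS.
    const token = "hypothetical-api-token";

    // The token in the query string is likely to be written to server access
    // logs, proxy logs and browser history alongside the URL.
    await fetch(`https://api.example.com/v1/items?token=${token}`);

    // Sent as a header, it stays out of the URL and therefore out of those
    // places (it can still show up in application-level logging).
    await fetch("https://api.example.com/v1/items", {
      headers: { Authorization: `Bearer ${token}` },
    });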
> We built an automated scanning pipeline that runs Chrome inside a Docker container, routes all traffic through a man‑in‑the‑middle (MITM) proxy, and watches for outbound requests that correlate with the length of the URLs we feed it.
The biggest problem here is that "We" does not refer to Google itself, who are supposed to be policing their own Chrome Web Store. One of the most profitable corporations in world history is totally negligent.
The browsing data itself is only half the problem. Even if you remove the spying extension, the profile it helped build persists and keeps shaping what you see as it gets sold and changes hands.
We focus a lot on blocking data collection and spyware.. but not enough about what happens after the data is already collected/stolen and baked into your algorithmic identity. So much of our data is already out there.
It seems crazy to me that the offered way to install an extension on Chrome is to click a button on a privileged website, and then the installed extension autoupdates without an option to turn it off.
I hate the idea of installing stuff without an ability to look at what's inside first, so what I did was patch the Chromium binary, replacing all strings "chromewebstore.google.com" with something else, so I can inject custom JS into that website and turn the "Install" button into a "Download CRX" button. After downloading, I can unpack the .crx file and look at the code, then install via "Load unpacked" and it never updates automatically. This way I'm sure only the code I've looked at gets executed.
If someone would like to replicate this, a good approach would be to reduce the cost by not requiring a full Chromium engine. I doubt these extensions are trying to do environment detection and won't run under (e.g.) JSDOM+Bun with a Chrome API shim.
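A very rough sketch of what such a shim might start as (the API surface covered here is just a placeholder; real extensions touch far more of the chrome.* namespace, and the data fed in is made up):

    // Stand-in chrome.* namespace for running extension code outside a real
    // browser: instead of doing anything, log what the extension asks for.
    globalThis.chrome = {
      runtime: {
        sendMessage: (msg, cb) => { console.log("sendMessage:", msg); cb && cb({}); },
        onMessage: { addListener: () => {} },
        getManifest: () => ({ version: "0.0.0" }),
      },
      storage: {
        local: {
          get: (_keys, cb) => cb({}),
          set: (_items, cb) => cb && cb(),
        },
      },
      tabs: {
        // Feed the extension a unique, trackable URL, as in the research setup.
        query: (_info, cb) => cb([{ id: 1, url: "https://example.com/some-unique-path" }]),
      },
    };

    // Intercept outbound traffic so exfiltration attempts show up in the log.
    globalThis.fetch = async (url, opts) => {
      console.log("outbound fetch:", url, opts && opts.body ? opts.body : "");
      return new Response("{}");
    };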
So this would require a list of extensions judged malicious or not, and someone could go ahead and check through that.
To build that list, I can imagine a GitHub repository where people can create issues about suspected extensions (imagine some repo where this case could also have been reported), people could discuss them, and a .txt/.json file in the repo would get updated every time an extension is confirmed to be malicious.
Thoughts?
Edit (to take initiative?): I have created a git repo with this https://github.com/SerJaimeLannister/unsafe-extensions-list but I would need some bootstrap list of malicious extensions. I know nothing about this field and the only extension I could maybe add is this one, but someone more knowledgeable within the extension community could fork this idea, or perhaps add entries to it.
Edit 2: Looks like qcontinuum actually has a GitHub repo - I hadn't read the article when I wrote the comment - and it's not 1 extension but rather 287, and they are all listed there:
https://github.com/qcontinuum1/spying-extensions
So they already have a good bootstrapped amount & I feel as if qcontinuum is interested they can maybe implement the idea?
My point was to have a community effort around it as well if possible - people could, say, upload a suspicion and others could then confirm it?
I am curious: wouldn't this effort be better if more people outside, who are interested in investing their own resources for the safety of a better internet, could help you out in such an endeavour? They could essentially help with the task, creating an open-source-ish committee/list which can decide it.
I do feel like if resources are in short supply, then doing that would be even more beneficial, right? What are your thoughts on it?
(Tangent if you actually do this:
This might become a cat and mouse game if the person with the malicious extension reads the GitHub repo and sees their extension in it before people can conclude it's malicious. But I am imagining a GitHub Action which can record the hash, download link and everything (essentially archiving a state of the extension), and then people can be freed from that game. So this might help a lot in future if you actually implement it.)
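A minimal sketch of that archiving step (it assumes the .crx has already been downloaded by whatever means; the file names are placeholders):

    // Record a content hash and timestamp for a downloaded extension package,
    // so later versions can be compared against the archived state.
    const { createHash } = require("node:crypto");
    const { readFileSync, appendFileSync } = require("node:fs");

    const crxPath = process.argv[2]; // e.g. ./archive/some-extension-1.2.3.crx
    const digest = createHash("sha256").update(readFileSync(crxPath)).digest("hex");

    appendFileSync(
      "archive-log.jsonl",
      JSON.stringify({ file: crxPath, sha256: digest, archivedAt: new Date().toISOString() }) + "\n"
    );
    console.log(crxPath, digest);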
It is a noble idea to have a community-driven effort in security research. We are sceptical that it would work. The same way security researchers will read this thread in the future, bad actors (e.g. Similarweb) can read it as well.
Any tool that would be open sourced or community driven for extension scanning will be with enough time used by bad actors to evade the scans. That is also why we don't share the code for this research as it would only speed up this process.
Oh, I understand. I don't have any expertise in this field, but reading this I can understand why the open-source approach might not work out, which is a little sad to be honest.
But I feel like the bottleneck (which I don't mean in a bad way) would then be your team, where the attackers might still be infinitely more numerous and can exhaust the resources you mention.
Also, are there any other teams working on this? Thoughts on collaborating with anyone in the security field?
Maybe if a direct detailed discussion can't happen, then just as you released this list of extensions, you could release future ones too as you detect them.
Do you feel an LLM-generated, vibe-coded extension (with some basic reading of the code just to get the idea and see if there are any bad issues) would be safer than a random extension on Firefox/Chrome in general? Given one is a black box (closed source) written by a human and the other is open code generated by a black box.
You've just reinvented curation, but given Google a pass for not doing it themselves, and shifted the work onto others.
Multiple regulators should sue Google for putting users at risk by failing to protect users from malicious code before publishing Chrome extensions and Android apps.
- https://github.com/beaufortfrancois/extensions-update-notifi...
And then you can do whatever you feel is an appropriate amount of research whenever a particularly privileged extension gets updated (check for transfer of ownership, etc.)
- brave://flags/#brave-extension-network-blocking
You can then create custom rules to filter extension traffic under brave://settings/shields/filters
e.g.:
! Obsidian Web
*$domain=edoacekkjanmingkbkgjndndibhkegad
@@||127.0.0.1^$domain=edoacekkjanmingkbkgjndndibhkegad
- Clone the GitHub repo, do a security audit with Claude Code, build from source, update manually
Play Store pages for all 3 list strong assurances about how the developer declares no data is being sold to third parties, or collected unrelated to the item's core functionality.
Brave Web browser (runapps.org) https://chromewebstore.google.com/detail/mmfmakmndejojblgcee...
Handbrake Video Converter (runapps.org) https://chromewebstore.google.com/detail/gmdmkobghhnhmipbppl...
JustParty: Watch Netflix with Friends (JustParty.io) https://chromewebstore.google.com/detail/nhhchicejoohhbnhjpa...
My open question to Google is: What consequences will these developers face for lying to you and your users, and why should I have any faith at all in those declarations?
I've always thought that it's crazy how so many extensions can basically read the content of the webpages you browse. I'm wondering if the research should go further: find all extensions that have URLs baked into them, or hashes (of domains?), then check what they do when you visit these URLs.
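A crude sketch of that kind of static pass over an unpacked extension directory (the path is a placeholder, and as the reply below notes, this misses anything obfuscated or fetched at runtime):

    // Walk an unpacked extension and list every hard-coded URL-ish string
    // found in its JavaScript, JSON and HTML files.
    const { readdirSync, readFileSync, statSync } = require("node:fs");
    const { join, extname } = require("node:path");

    const root = process.argv[2]; // e.g. ./unpacked-extension/
    const urlPattern = /https?:\/\/[^\s"'`)]+/g;

    function walk(dir) {
      for (const name of readdirSync(dir)) {
        const full = join(dir, name);
        if (statSync(full).isDirectory()) {
          walk(full);
        } else if ([".js", ".json", ".html"].includes(extname(full))) {
          const matches = readFileSync(full, "utf8").match(urlPattern) || [];
          for (const url of new Set(matches)) console.log(`${full}: ${url}`);
        }
      }
    }

    walk(root);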
Without any doubt the research could continue on this. We had many opportunities to make the scan even wider, and almost certainly we would uncover more extensions. The number of leaking extensions should not be taken as definitive.
There are resource constraints. Those extensions actively try to detect if you are in developer mode. It took us a while to avoid such measures, and we are certain we missed many extensions due to, for example, our use of a Docker container. Ideally you want to use an environment as close to the real one as possible.
Without infrastructure this doesn't scale.
The same goes for the code analysis you have proposed. There are already tools that do that (see Secure Annex). Often the extensions download remote code that is responsible for data exfiltration or the code is obfuscated multiple times. Ideally you want to run the extension in browser and inspect its code during execution.
Can extensions:
- be scoped, meaning only allowed to read/access when you visit a particular domain whitelist (controlled by the user)?
- be forced (by the extension API) to have a clear non-obfuscated feed of whatever they send, that the user can log and/or tap onto and watch at any time?
If not, I wouldn't touch them with a 10000ft pole.
Yes. Not usually user-controllable though.
> be forced to have a clear non-obfuscated feed
Kinda. You can usually open a devtools instance that shows whatever the extension is doing. But you can’t enforce it to not obfuscate the network requests though (you’d have to make extensions non-Turing complete).
You could mitigate some of these issues by vetting the extensions harder before letting them into the stores. Mozilla requires all extensions to have a readable source code, for example.
My daughter, in grade school, uses a Chromebook at school and accesses Google Classroom through Chrome. The school has very few restrictions on extensions, and when I log into her account, Chrome is littered with extensions. They're all innocuous (e.g. change the cursor into a cat, pets playing around on your screen, etc.). However, without fail, each time I log in and go to the extensions page, Chrome notifies me that one or more of the extensions was removed due to malicious activity or whatever.
I don't think your daughter would know if, say, the webcam were taking photos or something were watching what she's searching, if the extensions are indeed malicious.
I'd either go ahead and talk to her, remove the extensions altogether, and ask her to stick to stock/open-source-only extensions (yes, open source also has supply-chain issues, but it's infinitely more manageable than this), or, as a second option, maybe create them yourself. I don't know how Chrome works (I use Firefox), but if what she wants is simple, one thing you can do is just vibe code it and use Tampermonkey (heck, even open source it), and then audit the generated code yourself if you want better security.
Nowadays I really just end up creating my own extensions with Tampermonkey before using any proprietary extension. With Tampermonkey the cycle actually feels really simple (click, edit, paste, etc.), and for basic stuff even a single glance at the code can reveal security errors - it's one of the few good use cases of AI, in my opinion.
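For anyone unfamiliar with the format, a minimal Tampermonkey-style userscript looks something like this (the site and the tweak are made-up examples):

    // ==UserScript==
    // @name         Hide promo banner (example)
    // @match        https://classroom.example.com/*
    // @grant        none
    // ==/UserScript==
    (function () {
      "use strict";
      // Everything the script does is visible in these few lines,
      // which is what makes a quick audit feasible.
      const banner = document.querySelector(".promo-banner");
      if (banner) banner.remove();
    })();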
Capital One just offered me $45 to install a Firefox extension. I declined, though I'm sort of tempted to get paid for getting spied on which I assume is happening anyway. And who knows, maybe I could get a couple more bucks later in the class action.
Their offers are very hard to claim - only eligible to be used in their store, only given after making a purchase in their store, among other strings attached. I tried to claim the same offer but could never actually get it.
That sounds right. I looked through the terms of the offer and it looked pretty onerous. I almost get the feeling they're trying to use my own hatred of the banks and desire to screw them out of $45 to trick me
Nobody is going to even do anything about SimilarWeb for pulling this off?
My understanding from the article is that they're actively behind this.
When I was the CTO in a previous role, SimilarWeb approached us. I read through the code snippet they gave us to inject onto our site. It was a sophisticated piece of borderline spyware that didn't care about anyone in the entire line of sight - including us. They not only were very persistent, they also had a fight with our management - for refusing to use their snippet. They wanted our data so bad (we had very high traffic at the time). All we wanted was decent analytics for reporting to senior management and Google had just fucked up with their GA4 migration practices. I switched them to Plausible.io and never looked back. It was the least I could do, we had to trade-off so many data points in comparison to GA, but still works flawlessly till date. Fuck SimilarWeb.
That can't be true, right? I mean, Google broke Adblockers in Chrome to prevent this very issue. And it had absolutely nothing to do with Google's Ad business.
So it's completely impossible that such malicious extensions still exist.
(may contain sarcasm)
Extensions have too many security risks for me. At this point I'd rather just vibe code my own extension than trust something with so much access and unpredictable ownership.
Using the page below you can check your extensions: select everything on chrome://extensions/ (the whole page - it will filter out the IDs) and it will check if any IDs match.
1. Go to chrome://extensions and toggle Developer mode on (so IDs are visible)
2. Select all text on the page with your mouse and copy
3. Paste it into the tool
It parses the IDs and warns you if any are among the 287 spyware extensions.
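The matching itself is simple; a sketch of roughly what such a checker does (the flagged ID below is a placeholder, not a real entry from the published list):

    // Chrome extension IDs are 32 characters drawn from the letters a-p.
    const ID_PATTERN = /\b[a-p]{32}\b/g;

    // Placeholder for the real list of 287 flagged IDs.
    const flagged = new Set(["abcdefghijklmnopabcdefghijklmnop"]);

    function checkPastedText(text) {
      const found = [...new Set(text.match(ID_PATTERN) || [])];
      return found.filter((id) => flagged.has(id));
    }

    // Example: pass in the text copied from chrome://extensions.
    console.log(checkPastedText("ID: abcdefghijklmnopabcdefghijklmnop"));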
This is why I disable automatic updates. Not just for browser extensions but everything. This whole "you gotta update immediately or you're gonna get hacked" thing is a charade. If anything, if you update you'll be hacked at this point.
@qcontinuum1 appreciate this kind of research. saw your other comments and you mentioned that the team's engineering resources are scarce + saw that at the bottom of the github repo that there are links to BTC address.
curious to know:
1- how large your team is? and how long this research took? it is very thorough and knowing such a detail might encourage others to participate in a joint effort in performing this kind of research
2- if this kind of research is your primary focus?
3- if there are other ways that financial support can be provided other than through xrp or btc?
i tried to look up your profiles but wasn't able to find where you were all from, so wishing you well wherever you are in the world. :)
Is there any irony in a thread on browser malware that includes a "please run this bash script blind"?
Not that I don't trust you, but between now and when someone stumbles on this thread, your domain could expire and I could publish something crazy at that url.
This is why I put the raw url to the script first in my comment. Downloading the script file, doing a chmod +x and then a ./script.sh to execute it is daunting for some.
But I'll add a caveat to my original comment as well.
edit: Looks like I can't edit my original comment anymore.
Stylus is a good alternative to Stylish. I keep my extensions to a minimum, and I turn off the ones I don't need until I need to use them. The only extensions I have turned on all the time are uBlock, Humble New Tab Page, and Stylus.
xnorswap | 10 hours ago
Thank you for not doing so.
mcjiggerlog | 9 hours ago
It's easy to see how many people in less advantaged positions would end up selling out, though.
stevekemp | 9 hours ago
I used to have tree-style tab, but now firefox has got native support for vertical tabs so I don't need to install anything extra.
Installing new extensions is sometimes appealing, but the risk is just too high.
notpushkin | 7 hours ago
You could use an adblocker rule instead:
(I’m not sure if it’s possible to do that with Privacy Badger though)
fn-mote | 10 hours ago
There’s always a possibility of problems along the chain. You are reducing your risk not eliminating it.
chrisjj | 8 hours ago
Got to say, mischaracterising a neutral question as a nihilistic comment doesn't do anything for me.
insin | 9 hours ago
https://robwu.nl/crxviewer/
smithza | 6 hours ago
when someone uses `npm install/add/whatever-verb` does it default to only using trusted publishing sources? and the dependency graph?
either 100% enforcement or it won't stick and these attack vulnerabilities are still there.
NamlchakKhandro | 9 hours ago
Because let's get real, no one ever gets a job in tech if they're not an iPhone user right?
moebrowne | 9 hours ago
Seems someone decided it was a good idea to make the scrollbar tiny and basically the same colour as the background:
useragent42 | 7 hours ago
In particular, look for the diagram provided by a data vendor showing this in action.
As with safebrowsing and adblocking extensions, there is no need to send data to servers.
Many groups of smart people have developed client-side and/or privacy-preserving implementations that have worked with high effectiveness for decades.
Unfortunately, many other groups also have financial incentives not to care about user privacy, so they go the route shown in the research.
GuestFAUniverse | 9 hours ago
Considering the barriers they build to prevent adblockers, that doesn't shine a good light on them.
chrisjj | 9 hours ago
Assume they did.
And the question becomes "Why didn't they come clean?" ... and much easier to answer.
kwar13 | 9 hours ago
https://kaveh.page/snippets/chrome-extensions-source-code
Even a tiny extension like this one I wrote with 2k users gets buyout offers all the time to turn it into malware: https://chromewebstore.google.com/detail/one-click-image-sav...
croes | 9 hours ago
No need for such complicated attacks /s
[OP] qcontinuum1 | 7 hours ago
We might do it once. That requires non-trivial engineering effort and resources and we are at the moment short on both of those.
nekusar | 9 hours ago
Chrome/Google/Alphabet is spying on 100% of their users.
Quit using Alphabet stuff, and your exploitation factor goes down a LOT.
ghtbircshotbe | 8 hours ago
https://addons.mozilla.org/en-US/firefox/addon/wikibuy-for-f...
molticrystal | 7 hours ago
https://output.jsbin.com/gihukasezo/
or
https://jsfiddle.net/9kLsv3xm/latest/
or
https://pastebin.com/Sa8RmzcE
the_gipsy | 6 hours ago
Pepperidge farm remembers.