Meanwhile, Apple iPhone sales were up 23% YoY at the end of last year. It'll likely be a good year for Apple, with a little more room in margin to make some plays, and a lot of cash.
> By contrast, Apple and Samsung are better positioned to navigate this crisis. As smaller and low-end-positioned Android vendors struggle with rising costs, Apple and Samsung could not only weather the storm but potentially expand market share as the competitive landscape tightens.
> Be honest, who had "Sam Altman kills Apple Computer" on their 2025/6 Bingo card?
Not the person Sam Altman specifically, but AI in general. It was obvious even in 2024 that braindead beancounters were jumping on the hype train, so much so that coal power plants were kept alive to satiate the power hunger [1]. The last time that shit happened, it was the coin craze [2], but unlike cryptocurrencies there was and is an actual product being made...
> Well, actually, there is one other important reason for this article’s existence I'll tack onto the end – a hope that other people start digging into what’s going on at OpenAI. I mean seriously – do we even have a single reliable audit of their financials to back up them outrageously spending this much money…for this? Heck, I’ve even heard from numerous sources that OpenAI is “buying up the manufacturing equipment as well” – and without mountains of concrete proof, and/or more input from additional sources on what that really means…I don’t feel I can touch that hot potato without getting burned…but I hope someone else will…
And I'd say if it ends up being shown there's even the slightest hint of impropriety going on, put him on trial. Up to and including capital punishment for the entire board and C-level - what OpenAI has already done, even if legal on paper, is IMHO the biggest market manipulation in history, and it's not just one competitor suffering but society as a whole.
I don't have an issue with big companies and their super-rich investors engaging in petty bitch fights. By all means, hand me some popcorn and soda. But the RAM situation, where everyone who isn't super rich and flush with cash from AI-crazed investors gets screwed royally? That is far beyond acceptable.
We need to send a message: you can't mess around with the world economy at that level without feeling serious repercussions. The lives of billions are not playthings for the select few.
And if it turns out to be outright market manipulation - striking deals without even having the money committed by others, much less actually having it on his balance sheet? Then it's time for the pitchforks; not even Madoff was this ruthless.
Holy shit, I had no idea OpenAI had such immense international power over manufacturers in independent foreign countries that they can tie the hands of RAM companies and forcibly prevent them from making more RAM.
The DRAM shortage and lack of fab capacity have also caused the PlayStation 6 to slip to 2029 or so. [1] Game consoles are vulnerable: they need a lot of RAM and have to sell at a moderate price.
The IDC article says that DRAM prices are not expected to come down again. "While memory prices are projected to stabilize by mid-2027, they are unlikely to return to previous level — making the sub-$100 segment (171 million devices) permanently uneconomical." Before, they always came back down in the next RAM glut, when everybody built too much capacity. Why is that not going to happen next time?
Because this shortage isn't natural, it's the result of OpenAI flexing monopsony power to deprive everyone else for its strategic gain. Unlike an organic shortage, there is no compelling reason for otherwise excess capacity to be built, since this artificial shortage can end as arbitrarily as it started.
The datacenters are still going to be built, and their usage won't suddenly fall just because the companies behind some of the products on them suddenly lose value. The demand is not tied to their profits, so I find it unlikely for the shortage to just end.
These data center projects are losing hundreds of billions of dollars which they don't have, and some evidence is starting to come out that they're just money-laundering schemes to get money from the government to contractors. I wouldn't bet on them all being built.
There were far too many railways, amusement parks, housing developments, and other bubble ventures that were either never completed after wasting a lot of money or went bust soon after opening.
No reason the same can't happen now - especially for something as expensive and fairly easily resellable as a datacenter and the hardware inside. Just rip it all out and sell it for parts where they're actually needed.
The data centers have already been financed, they’re not going to stop halfway through because they’ve run out of money. Whether or not they’ll make money on completion is a different story, but that’s 2-3 years away at least. Then you might see RAM prices drop, but not before.
One reason we end up with excess capacity is process improvements; adding new fabs to get more density or performance doesn't make old fabs go away, and so we go through cycles of excess capacity. Demand has been relatively constant.
Here we're facing different forces: unprecedented demand for DRAM that may be durable. But it also looks like the pace of supply growth may slow as process improvements get smaller and the industry stops moving so much in lockstep.
It still matters what happens to the demand function, though. If enough AI startups blow up that there's a lot of secondhand SDRAM in the market, and demand for new SDRAM is impacted, too, that will push things down.
Sort of like what happened with the glut of telecom equipment after the dot-com bust.
You’re asking why a market that has had 3 price fixing lawsuits in less than 2 decades (criminal convictions in 1998, civil in 2006 and 2018) isn’t going to follow market dynamics?
> The IDC article says that DRAM prices are not expected to come down again
Sure thing. I'd take a look at IDC & similar firms' forecasting history before worrying too much about what they say.
There is an AI boom right now. There will be a consolidation cycle at some point. When that happens half the players, if not more, will disappear. The huge hardware budgets will go with them.
We also can't be certain that the DRAM makers aren't capitalizing on this opportunity because they can. Remember: all of them are convicted monopolists. As in actual prison time convicted. And fined. And lost civil lawsuits. Multiple times.
I just can't see AI paying enough of a premium on HBM to justify the DRAM spikes. Frankly I can't see the volume either. Wafer starts on DRAM are dramatically bigger than you are probably imagining. DRAM is in practically everything these days. AI servers are but a drop in the bucket. 10% of the market? Yeah right; if it's 4% I'd be shocked. And you're telling me a shift of 4% of wafers to HBM is driving these prices and shortages?
I humbly suggest if you look at the numbers something smells funny.
Disclaimer: none of us has access to the actual data, a lot of it is inferred by industry players. Some are well connected and usually accurate but that is not evidence. Therefore it is possible this is a genuine market action and nothing nefarious is going on.
HBM is not normal memory. It uses a lot more area per bit and has lower yield too. So a Gb of baseline DRAM and a Gb of HBM are very different measurements, the latter equates to so much more in terms of volume.
I recently upgraded from the Pixel 7 to the 10. Nothing but regret - the phone isn't worse, but it's not better either, and I had to reinstall everything. Why did I do this?
Maybe for Fortnite players. If you just call/text/email, what's the point in upgrading? At this point I pretty much just ride out whatever iPhone SE I happen upon on deep discount until Apple finally walls me off from the OS version I need to access my bank account (thanks to the very helpful flow of the bank website forcing an App Store redirect to their app for mobile users). Then it's on to the next SE.
Now that I think on it, maybe I ought to just pay attention to jailbreakable OS version numbers again. If I stayed on one of those OS versions, I could just spoof my user agent for the bank website with a jailbroken phone.
Strange comparison. If you just call, text and check on bank apps, then the market is not for you. Just buy a used phone from 10 years ago.
It's like asking why you should get a new gaming laptop to replace your 6-year-old one when all you do is office work. If all you do is office work, why buy a gaming laptop at all? Likewise, just use a standard OK-ish smartphone or tablet.
Who are these phones even for? I see people out in public and all they do with their phone is scroll Instagram Reels or TikTok. They need all this horsepower for that? I don't think so either. This is why I brought up Fortnite, because gaming is one example where there probably is a marked difference in frames per second between models. But 99% of people are probably just looking at images, videos, and text on their smartphone; pretty low-stakes stuff.
The cool thing about Pixels is that not only will you have to pay extra for RAM because of AI, but some of the RAM you paid for will also be permanently reserved for local AI features, regardless of whether you use them.
On a Pixel phone you have only Google spyware. On another brand's phone you have all the same Google spyware, plus the spyware from that brand and a permanently locked bootloader.
You can remove third-party spyware/bloat in 15 minutes with Shizuku/Canta and a USB cable, and you won't notice anything changed on the phone. Unfortunately the Google spyware is so deeply integrated that you can't really do that unless you accept a ton of things not working - not just Google apps but also lots of third-party apps that require Play services.
Yes, if you want full degoogling you need a custom ROM like Graphene on Pixels or Lineage. The main issue these days is that bootloaders are locked. Phone manufacturers mostly refuse to give you control over your own hardware.
Somehow, with 12GB of RAM, I can't get my iPhone 17 Pro to keep more than a few Safari tabs open without having them refresh when I come back from an app or two, and it makes me want to throw my phone across the train (where the internet often cuts out!).
A lot of software has been squandering the massive hardware gains that have been made. I hope this changes when it becomes a lot harder to throw hardware at the problem.
I also wonder what this means for smartphone-esque devices like the Switch 2. If this goes on long enough, I won't be surprised if they release a 'lite' model with less RAM/storage and bifurcate their console capabilities, worse than what they did with 3DS > 2DS.
iOS I think has really aggressive background task killing, and it also drives me insane. I know they do it for battery life but I'm about ready to switch to Android, and would have a long time ago if that didn't also mean replacing my watch, headphones, etc.
Is it too much to ask for me to manage my own background processes on my phone? I don't want the OS arbitrarily deciding what to pause & kill. If it actually does OOM, give me a dialog like macOS and ask me what to kill. Then again, if a phone is going OOM with 12GB of RAM there's a serious optimization problem going on with mobile apps.
I recently started learning how to do iOS apps for work and the short answer is: you don't.
Apple seemingly wants all apps to be static JPEGs that never need to connect to any data, local or remote, and never do any processing. If you want to do something in the background so that your user can multitask, too damn bad.
You can run in the background, for a non-deterministic amount of time. If you do that, iOS nags your user to make it stop. If you access radios, iOS nags your user to disable it.
It's honestly insane. I don't know why or how anyone develops for this platform.
Not to mention the fact that you have to spend $5k minimum just to put hello world on the screen. I can't believe that Apple gets away with forcing you to buy a goddamn Mac to compile a program.
Depends on where you live. I haven't seen one for less than $1000, and that's for a five-year old model soon going out of support. Seems like a waste of money.
> If you do that, iOS nags your user to make it stop. If you access radios, iOS nags your user to disable it.
These are features, because we can't trust developers to be smart about how they implement these. In fact, we can't even trust them not to be malicious about it. User nags keep the developer honest on a device where battery life and all-day availability is arguably of utmost importance.
> you have to spend $5k minimum just to put hello world on the screen.
I've never felt nagged. Every time I get one of those popups, which isn't too often, I think "neat, good to know."
It's inconvenient that apps can't do long-running operations in the background outside of a few areas, but that's a design feature of the platform. Users of iOS are choosing to give up the ability to run torrent clients or whatever in exchange for knowing that an app isn't going to destroy their battery life in the background.
> iOS I think has really aggressive background task killing, and it also drives me insane. I know they do it for battery life but I'm about ready to switch to Android, and would have a long time ago if that didn't also mean replacing my watch, headphones, etc.
Android does all sorts of wacky stuff with background tasks too... Although I don't feel like my 6 GB Android is low memory, so maybe there's something there, but I also don't run a lot of apps, and I regularly close Firefox tabs. Android apps do mostly seem well prepared for background shenanigans, cause they happen all the time. There's the AOSP/Google Play background app controls, but also most of the OEMs do some stuff, and sometimes it's very hard to get stuff you want to run in the background to stay running.
I dunno about watches, but Airpods work fine with Android, as long as you disconnect them from FindMy cause there's no way to make them not think they're lost (he says authoritatively, hoping to be corrected).
On Android, of course, it depends on the configuration. I'm running LineageOS 23 on an older device with 6GB of RAM as well, and it would kill basically anything (making e.g. paying with a credit card a pain when you have to switch to the bank app to confirm a transaction). Had to adjust a few variables for ZRAM control and now it's seamless.
iOS doesn't have aggressive background task killing except under memory pressure. It suspends apps for battery life; it only kills them under memory constraints. If you don't want apps dying and tabs closing, use apps that use less memory. iOS doesn't have swap, out of a desire to avoid unnecessary NAND wear (and the performance impact), so it has to kill things more aggressively.
So I have Safari open and I can't switch to my email? Both native apps, and sometimes I lose the state of Safari if I'm away for more than 10 seconds? I have to keep switching between the two apps to keep my Safari tab alive? Insanity.
With dram, you have to refresh every cell within a periodic interval. Usually this is handled in hardware. It would be a crazy optimization if unused pages weren’t refreshed. There would have to be a decent amount of circuitry to decide that.
I'm not suggesting it exists, but I could plausibly see something where the range to refresh could be changed at runtime. If you could adjust refresh on your 8 GB phone in 1 GB intervals (refresh up to 1/2/4/8 GB, or refresh yes/no per 1 GB bank), the OS could keep its own memory at low addresses, periodically compact everything else into lower addresses, and disable refresh on the higher ranges from time to time. Or, since I think there are APIs for background vs. foreground allocations: put background memory at low addresses and foreground memory at high addresses, and when the OS wants to sleep, kill the process logically and turn off refresh on that RAM. When it wants to use it again later, it will have to zero the RAM, because who knows what it'll contain.
I don't work at that kind of level, so I dunno if the juice would be worth the squeeze (sleep with DRAM refresh is already very low power on phone scales), but it seems doable.
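To make the bookkeeping concrete, here's a toy model of the scheme described above. Everything is hypothetical: no phone exposes a per-1GB refresh toggle like this (though LPDDR's partial-array self refresh is in a similar spirit), and the constants are made up for illustration.

```typescript
// Toy model: refresh can supposedly be toggled per 1 GB bank. The OS
// compacts live pages into the lowest banks, then only those banks
// keep refreshing; the rest could power down (and must be re-zeroed
// before reuse, since their contents are lost).
const BANKS = 8;          // e.g. an 8 GB phone, one flag per 1 GB bank
const PAGES_PER_BANK = 4; // tiny page count to keep the example readable

function compact(livePages: number): boolean[] {
  // After compaction, live pages fill the lowest banks first, so only
  // ceil(livePages / PAGES_PER_BANK) banks still hold data.
  const usedBanks = Math.ceil(livePages / PAGES_PER_BANK);
  // Refresh stays on only for banks that still hold live data.
  return Array.from({ length: BANKS }, (_, bank) => bank < usedBanks);
}

// 6 live pages fit in the two lowest banks; the other six banks
// could stop refreshing until the OS needs them again.
const refresh = compact(6);
console.log(refresh.filter(Boolean).length); // 2
```

Whether the juice is worth the squeeze is exactly the open question: self-refresh during sleep is already very cheap, so the savings may well be negligible.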
This is an argument for having less memory on a hardware level. But once the DRAM is there, it uses power, whether or not it stores useful data or useless data.
There's a reason why we say unused RAM is wasted RAM.
Powering down unused physical RAM is absolutely a thing on some systems. For one thing, it's required if you ever want to support physical memory hotplug. The real issue however is that the gain from not doing DRAM refresh is clearly negligible: it's no more than the difference between putting a computer to sleep (ACPI S3), or putting a phone to sleep in airplane mode - and powering it off.
I really don't understand that at all. Web pages are mostly static; you'd think the iPhone would cache websites reasonably well.
I remember on Android there was an app (I don't recall the name specifically) that would let me download any website for offline browsing or something; I'd use it when I knew I might have no internet, like on a cruise.
Heck, there used to be an iOS client for HN that eventually went defunct, but it would let you cache comments and articles for offline reading.
Web pages that make sense are mostly static. But these days articles need to load each paragraph dynamically, so to save 3 KB in case you don't finish the article, you have to download 5 MB of JS, plus a bunch of extra handshakes.
It's the JS that does it, because so many web pages are terribly optimized, integrating aggressive ad waterfalls or running persistent SPA frameworks that do continual scope checks.
That being said, there's no reason the Safari context shouldn't be able to suspend the JS and simply resume when the context is brought back to the foreground. It's already sandboxed; just stop scheduling JS execution for that sandbox.
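Pages can already opt into a cooperative version of this themselves via the standard Page Visibility API; the complaint is that Safari doesn't enforce it for everyone. A minimal sketch (the `PausableTicker` class and its names are illustrative; the real browser wiring is shown in the comment):

```typescript
// A tiny scheduler that stops doing work while its page is hidden.
// In a real page you'd wire onVisibilityChange() to the standard
// `visibilitychange` event:
//   document.addEventListener("visibilitychange",
//     () => ticker.onVisibilityChange(document.visibilityState));
type Visibility = "visible" | "hidden";

class PausableTicker {
  private running = false;
  ticks = 0;

  onVisibilityChange(state: Visibility): void {
    this.running = state === "visible";
  }

  // Called on a timer (e.g. setInterval); a no-op while hidden.
  tick(): void {
    if (this.running) this.ticks += 1;
  }
}

const ticker = new PausableTicker();
ticker.onVisibilityChange("visible");
ticker.tick();                       // counted
ticker.onVisibilityChange("hidden");
ticker.tick();                       // ignored while backgrounded
ticker.onVisibilityChange("visible");
ticker.tick();                       // counted again
console.log(ticker.ticks); // 2
```

Of course this only helps when the page author bothers; suspending the whole sandbox would work for every page, cooperative or not.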
Sort of related: on my laptop running Linux, Firefox with YouTube will get progressively slower if you keep sleeping and waking the laptop. It's as though the JS is struggling to keep up with the suspend-and-wake cycles. This never happened on Windows/macOS systems, so it could just be a Linux thing.
Wasn't the 2DS just a 3DS minus the lenticular screen, and especially minus the front-facing camera that did face tracking to improve the quality of the 3D?
My understanding was that market research showed a lot of users were turning off the 3D stuff anyway, so it seemed reasonable to offer a model at lower cost without the associated hardware.
> My understanding was that market research showed a lot of users were turning off the 3D stuff anyway
It was also because young children weren't supposed to use the 3D screen due to fears of it affecting vision development. You could always lock it out via parental controls on the original, but still that was cited as a reason for adding the 2DS to the lineup.
> Fils-Aime said. “And so with the Nintendo 3DS, we were clear to parents that, ‘hey, we recommend that your children be seven and older to utilize this device.’ So clearly that creates an opportunity for five-year-olds, six-year-olds, that first-time handheld gaming consumer."
Removing docking functionality could possibly reduce RAM usage by never enabling 4K screen output. This would be similar to the Switch Lite.
Although, for a $450 device that doesn't need to make much of a profit on its own, I also don't think they went heavy on memory in the first place (12GB). You can buy top-quality Chinese Android handhelds with more RAM and better Qualcomm processors than the Switch 2 for about the same price, and those companies are making $0 in software royalties (e.g., the AYN Thor Max is $450 in a 16GB/1TB configuration).
> Removing docking functionality could possibly reduce RAM usage by never enabling 4K screen output. This would be similar to the switch lite.
Every version of the Switch 1 had 4GB of RAM; they didn't cut that on the Lite. Going back and patching every game to ensure it ran on less RAM than it was originally designed for would have been a nightmare.
> (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
AYN just announced that the Thor will get a price increase soon for obvious reasons.
Oh yeah, I accidentally implied the Switch Lite cut down RAM when it didn't.
Of course the Thor Max will have a price increase, but also, obviously 16GB/1TB is a massively bigger bill of materials than the Switch 2’s 12GB/256GB configuration.
And I forgot to mention that Nintendo has far more pricing leverage in terms of their volume.
Mine (Android Firefox) does it when I have a YouTube video paused and do something else for a bit. Whenever I stop watching a video, I have to screenshot it so I know the timestamp to try to get back to later :-/
App battery usage is unrestricted, so it's not that.
Am I too much of an idealist to hope that AI leads to less buggy software? On the one hand, it should reduce the time of development; on the other hand, I'm worried devs will just let the agents run free w/o proper design specs.
The message with AI from execs is that you have to go fast (rush!). Quality of work drops when you rush. You forget things, don’t dwell on decisions and consequences, just go-fast-and-break-things.
> The message with AI from execs is that you have to go fast (rush!). Quality of work drops when you rush.
Sure, but otherwise, the competition will be first to market, and the exec may lose their bonus. So, the exec keeps their bonus, and when the tech debt collapses, the exec will either have departed long ago or will be let go with a golden parachute, and in the worst case an entire product line goes down the drain, if not the entire company.
The financialization and stonkmarketization of everything is killing our society.
The average LLM writes cleaner, better-factored code than the average engineer at my company. However, I worry about the volume of code leading to system-scale issues. Prior to LLMs, the social contract was that a human needed to understand changes and the system as a whole.
With that contract being eroded, I think the sloppiness of testing, validation, and even architecture in many organizations is going to be exposed.
The social contract where I work is that you’re still expected to understand and be accountable for any code you ship. If you use an LLM to generate the code, it’s yours. If someone is uncomfortable with that, then they are leaning too hard on the LLM and working outside of their skill level.
It might actually turn out like that. A lot of bloat came from efforts to minimize developer time. Instead of truly native apps, a lot of stuff these days is some React-shaped tower of abstractions with little regard for hardware constraints.
That trend might reverse if porting to a best-practice native app becomes trivial.
Considering how many companies that adopted AI ended up with disastrous bugs and larger security holes?
I wouldn't call it an idealist position so much as a fool's one. Companies don't give a shit about software security or sustainable software as long as they can ship faster and pump stocks higher.
Considering that AI still can't even reliably get basic programming tasks correct, it doesn't seem very likely that turning it loose will improve software quality.
I feel like my 3GS was way better about resuming where I left off than any fancy new iPhone I’ve had in the past few years.
Big-name apps like Facebook, YouTube, Apple Music, and Apple Podcasts seem totally uninterested in preserving my place.
YouTube is the worst: I often stack a bunch of videos in the queue, pause to do something else for a while, and when I return to the app the queue has been purged.
Too slow to edit. But also, Now Playing just seems to go away after a while. Why isn't this written to some nonvolatile place and just preserved? It feels like it must be on purpose, but I wonder what the purpose is.
I assume the purpose of the Now Playing clearing after a while is the idea that when people start a "new session" with their device it should be "clean". Like, if Now Playing didn't randomly disappear then for most people it would always be on, indicating some paused music or podcast playback. It would also never give a chance for that elusive "start playing" experience that shows up in its place sometimes to recommend that I listen to one of four songs/podcast episodes.
I feel like this might be intentional to a certain degree, at least on YouTube or Facebook.
If you switched off the app while looking at a certain post or watching a certain video, that's a negative engagement indicator, so the app wants to throw you back into the algorithmic feed to show you something new instead.
Ad blockers don't work anymore, at least not with the version YT serves me. If it thinks that I have an ad blocker active (false positives happen too), it will only show a black rectangle and not even load the comments.
Try Firefox, LibreWolf, Waterfox, or Chromium. In these browsers I had uBlock Origin (Lite for Chromium), AdGuard, and NoScript (and/or Privacy Badger) on my phone and PC, and I didn't see any ads at all. (I use the Unhook and Enhancer extensions with them.)
On PC, I use Firefox with the uBlock Origin extension and I see no ads on Youtube.
Same with my pocket supercomputer: Firefox works great on Android, including for Youtube. And it uses extensions like the PC version does. No ads there, either.
On the BFT in the living room, I have a Google-manufactured Google TV device. It runs SmartTube and displays no ads on Youtube.
I even have an iPad that I use primarily for watching Youtube videos. For that, I stay completely within the confines of the walled garden and use Safari with the AdBlock add-on. And if you're guessing that I'm about to write that I have no ads on Youtube there either, you're right. There are no ads on Youtube with that device, either.
Am I doing this wrong?
Maybe my perspective differs from that of some others, but it seems to all work very well for me here in 2026. (There's been some ups and downs with this over the years, but it all finds its way back to exactly what I wrote above, anyway.)
I also use Firefox with uBlock Origin. It worked flawlessly until some time this January. It happened with the switch to a new version of the video player which changed the design and behaviour. I'd be curious if you're still on the old version or something else is different.
Roughly in that timeframe YT also successfully blocked downloads with yt-dlp for a bit. Seems like they're trying harder now because of AI scrapers.
And that's about it. I recently pruned some other Firefox extensions while troubleshooting completely unrelated issues, and all that's left is uBlock Origin, Dark Reader, and Bitwarden.
Seriously, I've had no recent issues with Youtube ads at all, and certainly none in January or February of this year. It's been smooth enough for me on all of the platforms I mentioned before (and I use them all quite a lot, except perhaps the BFT).
YouTube will literally resume back to exactly where I was, then seemingly noticing that I switched back to it, go ahead and close the video I was watching. With all sorts of animations too, it's not just a case of having showed a cached screenshot. YouTube seems to intentionally forget where in a video I was, often after having been paused in the background for only a minute or two.
Likely some kind of complex refresh operation that kicks off when entering the foreground and takes a few seconds to complete before overwriting your state.
YouTube on TVs will often keep closed captioning on when switching accounts, then notice that CC is on and turn it off. Even though every account in the household always has CC turned on.
See if turning off your ad blocker makes a difference. I've noticed that sometimes YouTube has parts of the site that apparently can look to ad blockers like they're part of an ad (maybe intentionally, to annoy people with ad blockers?).
I feel like that's definitely a choice for Facebook at least - there's no technical reason the app couldn't remember at least the post you were looking at. I think they literally don't care if you were halfway through reading something when you flicked out of the app and go back in - refreshing the page and showing you all new stuff is probably measurably "better for 'engagement'" by whatever silly metrics they use.
It’s been a while since I worked for a bigger company (not Meta), but the problem there was that you’d have a team responsible for feature A and a team responsible for feature B, and if there was any weird interop between the two, it just never got resolved because there wasn’t an owner. There was no internal incentive to fix the problem. It wasn’t deliberate, but it was structural.
I find myself saving a ton of stuff to my Watch Later list, because I can’t trust the Back button when using YouTube. This issue exists on the phone, web, and AppleTV. YouTube just likes to randomly refresh everything. It’s the most annoying “feature”.
Now is bad too, but my recollection is that the iPhone 3G-era task killer was EXTREMELY aggressive and required "tricks" to keep your state in the one app you could run
On a tangent: how about those sweet app updates, with patch notes reading "bug fixes", every week or so from the likes of Xiaomi and Anker, weighing in at 600-700 MB?
It's all gone to $hit; efficiency is gone, it's just slop on top of more slop.
Even system apps like Photos have completely given up on state restore. I'm deep in an album comparing a photo to something on the web? Sorry, Safari needs all that RAM, Photos is kicked out, and Photos can't possibly remember you were inside an album (despite, you know, all the APIs Apple specifically has to manage this [0]). They USED to care about these things and made it seamless enough that you weren't supposed to know the app was killed in the background, but they just don't seem to care anymore.
Youtube/Google just make these shitty small annoying decisions to make the iOS experience that little bit more annoying than it has to be.
Case in point — Youtube background play doesn’t pause when Siri makes an announcement, so if you’re listening to something you get two voices over each other.
I gave it the benefit of the doubt and figured it must be some kind of iOS thing, until I was listening to Audible one day and it paused automatically. So it's just a Google thing, not a third-party-apps thing.
I have the same issue with the Youtube queue — this is something that could easily be persisted, but they just choose not to.
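For scale, the state involved really is tiny. A sketch of what persisting it could look like - the `KV` interface and every name here are made up, standing in for localStorage or any on-device key-value store; this is obviously not YouTube's actual code:

```typescript
// Save the queue and playback position on every change; restore on launch.
interface KV {
  get(key: string): string | null;
  set(key: string, value: string): void;
}

interface QueueState {
  videoIds: string[];
  current: number;     // index into videoIds
  positionSec: number; // playback position in the current video
}

const KEY = "playbackQueue";

function saveQueue(store: KV, state: QueueState): void {
  store.set(KEY, JSON.stringify(state));
}

function restoreQueue(store: KV): QueueState | null {
  const raw = store.get(KEY);
  return raw === null ? null : (JSON.parse(raw) as QueueState);
}

// In-memory stand-in for localStorage, just for demonstration.
const mem = new Map<string, string>();
const store: KV = {
  get: (k) => mem.get(k) ?? null,
  set: (k, v) => void mem.set(k, v),
};

saveQueue(store, { videoIds: ["a1", "b2", "c3"], current: 1, positionSec: 372 });
const restored = restoreQueue(store);
console.log(restored?.positionSec); // 372
```

A few dozen bytes of JSON per queue; the cost clearly isn't the obstacle.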
You're just adding a step that doesn't fix the primary issue (you can already manually save any page you want without adding it to your reading list). Someone should be able to go to their translate app, then their photo gallery, and back to Safari without it needing to refresh the context.
That doesn’t save the current dynamic state of the page. It’s at most useful for static content, but even on a Wikipedia page you’ll lose your current state of expanded/collapsed sections and hence your reading position.
I honestly think the memory shortage kills the possibility of a Switch 2 Lite.
Nintendo can't realistically take memory budget away from developers after the fact. The 2DS cut the 3D feature from the 3DS, but all games were required to be playable in 2D from day 1, so no existing games broke on the cost-reduced 2DS.
I had a Chinese phone with amazing specs, but it KEPT KILLING EVERYTHING.
Hardware is pretty useless if the software that drives it is useless. I don't know, maybe it works better in China; all I know is that I went back to good old Samsung.
It's a pervasive Chinese phone problem. I've used many and they all have "Battery saving" features on by default, which means killing background apps after a while apparently. Battery life is great, but newly installed apps sometimes don't work as they should.
The market demands must be different there. I've disabled "battery optimisation" for all the apps I need to stay open (and some apps even prompt me to disable it!), and I don't have any issues in daily use.
That kind of aggressive process termination should become less common now that Android has introduced the freezer [1] optimization, which puts a background process into a completely unscheduled state.
If you run out of smartphone battery you are in much bigger trouble in China than in the West, since the phone is necessary to function almost everywhere. That's why they have rental power bank stands in literally every restaurant and every small grocery shop; in an urban area you are never further than about a 5-minute walk from one.
Btw, you can always put an app on the protected/not-optimized list, which usually solves problems with most Western apps on Chinese phones (essential Chinese apps like WeChat are on the list by default).
I have had many Chinese phones (Huawei, Oppo, Xiaomi) over the years, and the things they choose to kill in the background are odd. Web browsers and almost any kind of banking app will be killed in minutes if not seconds. VLC... depends on the day, could be minutes or days. No idea why that one.
Hard to tell if it's something I am doing or not. I will say, with all these phones and everything Google turned off, I typically get 3-4 days per charge, but that really depends on your usage.
I was trying to upload a 300mb video via the local police's web interface, a very important matter. I had to set my phone screen to stay on for 30 minutes and then leave the web browser open without touching it. Disabling all power saving measures makes no difference. This was the only way I could get it to finish uploading. I'm on a Pixel 8 Pro with GrapheneOS. Same thing in both Firefox and Vanadium. I don't think it runs out of RAM, the system is just too trigger happy. The battery still doesn't last all day anyway.
My iPhone 8 just stopped working 2 months back (the phone works but the microphone used in phone calls no longer works), so by chance my good friend gave me his Pixel 8 that was only a few months old. It got a pink line down the screen that comes and goes, which you can usually make go away by pressing in one spot, but he is a business owner and he can't risk the screen going from a line to not working for a day, as a missed communication could cost him thousands. So he said "here, take it" and he got a new one. Seems like this pink line is common and a defect in some screens.
Anyways, I wanted to say I also have a Pixel 8, but with the stock OS, and my battery typically lasts a full day with average usage. My iPhone 8 previously, even with a replacement battery, was lucky if it lasted more than 5 hours. I had to charge that thing multiple times a day.
iOS or Safari issue then. I also have 12GB of RAM on my S25+ with 25 open tabs, and I quickly did a test: there were none that were unloaded that I had to reload.
It happened a lot on my previous phone with only 4GB of RAM though.
> A lot of software has been squandering the massive hardware gains that have been made. I hope this changes when it becomes a lot harder to throw hardware at the problem.
Considering how many people are so averse to programming that they use LLMs to generate code for them? Not very likely IMO. I would like to see it happen, but people seem allergic to actually trying to be good at the craft these days.
From everything I’ve seen, LLMs aren’t exactly known for writing extremely optimized code.
Also, what happens to the stability and security of my phone after they let an LLM loose on the entire code base for a weekend?
There are 1.5 billion iPhones out there. It’s not a place to play fast and loose with bleeding edge tech known for hallucinations and poor architecture.
You can also tell it which optimization to implement.
I asked Claude to find all the valid words on a Boggle board given a dictionary and it wrote a simple implementation that basically tried to search for every single word on the board. Telling it to prune the dictionary first by building a bit mask of the letters in each word and on the board and then checking if the word is even possible to have on the board gave something like a 600x speedup with just a simple prompt of what to do.
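A minimal sketch of that pruning idea in Python (the names and exact masking scheme are my illustration, not Claude's output): build a 26-bit letter mask per word and for the board, then discard any word that needs a letter the board doesn't have.

```python
def letter_mask(s):
    # 26-bit mask: bit i is set if letter chr(ord('a') + i) occurs in s
    m = 0
    for ch in s.lower():
        if ch.isalpha():
            m |= 1 << (ord(ch) - ord('a'))
    return m

def prune_dictionary(words, board_letters):
    board = letter_mask(board_letters)
    # A word is only findable if every distinct letter it uses appears
    # somewhere on the board, i.e. its mask is a subset of the board's.
    return [w for w in words if letter_mask(w) & ~board == 0]
```

The DFS over the board then only has to consider the surviving words, which is where the large constant-factor speedup comes from.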
That does assume that one has an idea of how to optimize though and what are the bottlenecks.
Can we assume at this point if the problems are well known, the low hanging fruit has already been addressed? The Boggle example seems like a pretty basic optimization that anyone writing a Boggle-solver would do.
iOS is 19 years old, built on top of macOS, which is 24 years old, built on top of NeXTSTEP, which is 36 years old, built on top of BSD, which is 47 years old. We’re very far from greenfield.
The average developer sucks. The distribution is also unbalanced: it is bulkier on the low-skill side.
Great UIs are written by above-average or even exceptional developers. That kind of skill is tied to real-life reasoning and years of unique human experience interacting with the world. You need true general intelligence for that.
They kind of do if you prompt them. I had mine reimplement the Windows calculator (almost fully feature complete) in Rust, running in 2 MB of RAM instead of the 40 MB or whatever the Win 11 version uses, as a POC.
A handwritten C implementation would most likely be better, but there is so much to gain from just slaughtering the abstraction bloat that it doesn't really matter.
I am more worried about memory and cycles being squandered by the underlying libraries on the device itself. Not a lot you can do to optimize those.
(I'm looking at you, Liquid Glass. I would love to get back to a vintage, "flat" UI. I'll allow for anti-aliasing, Porter-Duff compositing, but that's where I draw the line.)
That is an Apple problem, and keep in mind that the iPhone doesn't do true multitasking; the fact that you are having problems with 12GB is not surprising to me.
I have to use a Macbook M4 at work with 24GB; I have a Lenovo with an AMD Ryzen 7 and 32GB running Linux Mint Cinnamon.
It is infuriating how slow this Macbook is, even to shut it down is slow asf.
macOS is no different from Windows; I cannot wait for COB to get back to my Linux laptop.
24GB is not enough, it will keep swapping, compressing, etc. I had such a device at work. 32GB is a night and day difference.
That said my workflows are such that I need at least 128GB now...
> and it makes me want to throw my phone across the train (Where the internet often cuts out!).
Spotted the German lol
The general problem is that many people don't bother testing their apps outside of their office wifi with low latency, low jitter, low packet loss and high bandwidth. Something like persisting state to some cloud endpoint when the OOM/battery-save killer comes knocking? Perfectly fine on wifi... but on a mobile connection that might just be EDGE, might be cut entirely because the user is getting a phone call and the carrier doesn't do VoLTE, or might have absurd latency? Whoops. The process killer delivers a -9 and that's it, state be gone.
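A sketch of the more robust pattern, in Python for brevity (the function names are mine): persist state locally and atomically first, and treat the cloud sync as best-effort, so a -9 or dropped connection never loses anything.

```python
import json
import os
import tempfile

def save_state(state, path):
    # Local, atomic write first -- this survives the process killer's -9,
    # unlike an in-flight request to a cloud endpoint on a flaky link.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)  # atomic rename on POSIX

def sync_state(path, upload):
    # Best-effort cloud sync; on failure the local copy remains the
    # source of truth and the upload can simply be retried later.
    try:
        with open(path) as f:
            upload(f.read())
        return True
    except OSError:
        return False
```

The key design point is that the network call is never on the critical path of saving.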
Side note: Anyone know of a way to prevent the iPhone hotspot from disassociating with a MacBook when the phone loses network connectivity? It's darn annoying, I counted having to reconnect twenty times on a train ride less than an hour.
I am on my $110 android device from 2022 (4GB RAM), and I have never faced the browsing related issues that you mentioned.
My phone came with stock android 11 ROM with no bloats, so that might've helped too I guess.
It’s not just mobile Safari; Safari on desktop does the same thing even with lots of memory available. Whatever they’re doing to limit a tab's resources needs to go, it’s so frustrating.
It's really nuts how much RAM and CPU have been squandered. In 1990, I worked on a networked graphical browser for nuclear plants. Sun workstations had 32 MB of memory. We had a requirement that the infographic screens paint in less than 2 seconds. It was a challenge but doable. The crazy thing is that computers now have 1000x the memory and something like 10,000x the CPU, and it would still be a challenge to paint screens in 2 seconds.
Yes, the web was a mistake; as a distributed document reading platform it's a decent first attempt, but as an application platform it is miserable. I'm working on a colleague's vibe-coded app right now and it's just piles and piles of code to do something fairly simple; long builds and hundreds of dependencies... most of which are because HTML is shitty, doesn't have the GUI controls that people need built in, and all of it has to be worked around as a patch after the fact. Even doing something as simple as a sortable-and-filterable table requires thousands of lines of JS when it should've just been a few extra attributes on an HTML6 <table> by now.
Back in the day with PHP things were much more understandable, it's somehow gotten objectively worse. And now, most desktop apps are their own contained browser. Somehow worse than Windows 98 .hta apps, too; where at least the system browser served a local app up, now we have ten copies of Electron running, bringing my relatively new Macbook to a crawl. Everything sucks and is way less fun than it used to be.
We have many, many examples of GUI toolkits that are extremely fast and lightweight. Isn't it time to throw the browser away, stop abusing HTML to make applications, and design something fit for purpose?
It's not "the web" or HTML, CSS, or JavaScript. That's all instant in vanilla form. Any media in today's quality will of course take time to download but, once cached, is also instant. None of the UX "requires" the crap that makes it slow, certainly not thousands of lines to make a table sortable and filterable. I could do that in IE6 without breaking a sweat. It's way easier, and faster, now. It's just people being lazy in how they do it, apparently now just accepting whatever Claude gave them as "best in show".
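For scale, here's roughly what sortable-and-filterable amounts to in vanilla JS once you separate the data from the DOM (a sketch of the approach, not a complete widget; wiring it to an actual table element is just a click handler that re-renders the rows):

```javascript
// Sort a copy of the row objects by one column, numeric or string.
function sortRows(rows, col, asc = true) {
  return [...rows].sort((a, b) => {
    const [x, y] = [a[col], b[col]];
    const cmp = typeof x === "number" ? x - y : String(x).localeCompare(String(y));
    return asc ? cmp : -cmp;
  });
}

// Keep rows where any cell contains the query, case-insensitively.
function filterRows(rows, query) {
  const q = query.toLowerCase();
  return rows.filter(r =>
    Object.values(r).some(v => String(v).toLowerCase().includes(q)));
}
```

Roughly twenty lines, not thousands, and both functions leave the original row array untouched.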
Back in PHP days you had an incentive to care about performance, because it's your servers that are overloaded. With frontend there's no such issue, because it's not your hardware that is being loaded
> Isn't it time to throw the browser away, stop abusing HTML to make applications, and design something fit for purpose?
Not going to happen until gui frameworks are as comfortable and easy to set up and use as html. Entry barrier and ergonomics are among the biggest deciding factors of winning technologies.
There are cross platform concerns as well. If the option is to build 3-4 separate apps in different languages and with different UI toolkits to support all the major devices and operating systems, or use the web and be 80% there in terms of basic functionality, and also have better branding, I think the choice is not surprising.
Cross platform GUI libraries suck. Ever used a GTK app under Windows? It looks terrible, renders terrible, doesn't support HiDPI. Qt Widgets still have weird bugs when you connect or disconnect displays it rerenders UIs twice the size. None of those kinds of bugs exist for apps written in Microsoft's UI frameworks and browsers.
The problem with cross platform UI is that it is antithetical to the purpose of an OS-native UI in its reason of existence. Cross platform tries to unify the UX while native UI tries to differentiate the UX. Native UI wants unique incompatible behavior.
So the cross-platform UI frameworks that try to use the actual OS components always end up with terrible visual bugs due to unifying things that don't want to be unified. Or worse, many "cross platform" UI frameworks try to mimic their developer's favorite OS. I have seen way too many Android apps built on "cross platform" frameworks that draw iOS UI elements.
The best way to do cross-platform applications with a GUI (I specifically avoid saying cross-platform UI) is defining yet another platform above a very basic common layer. This is what the Web has done. What a browser asks from an OS is a rectangle (a graphics buffer) and the fonts to draw a webpage. Nothing else. The entire drawing functionality and behavior is redefined from scratch. This is the advantage of the Web, and this is why Electron works so well for applications deployed on multiple OSes.
I have created and used them. They didn't look terrible on windows.
>What a browser asks from an OS is a rectangle (a graphics buffer) and the fonts to draw a webpage. Nothing else. Entire drawing functionality and the behavior is redefined from scratch. This is the advantage of Web..
I think that is exactly what Gtk does (and maybe even Qt) too...
I think it is just that there is not much funding going to those projects. The Web, on the other hand, being an ad-delivery platform, the sellers really want your browsers to work and look good...
There's loads of funding. But the ones funding Qt and GTK aren't parties interested in things like cohesion or design standards. They just needed a way to deliver their product to the user in a faster way than maintaining 2-3 OS platform apps. Wanting that shipping velocity by its nature sacrifices the above elements.
The remnants of the dotcom era for the web definitely helped shape it in a more design-conscious way, in comparison. Those standards are created and pushed a few layers above where cross-platform UIs work.
In line with "the web was a mistake" I think the idea that you can create cross platform software is an equally big mistake.
You can do the core functionality of your product as cross-platform, to some extent, but once you hit the interaction with the OS, and especially the UI libraries of the OS, I think you'd get better software if you just accept that you'll need to write multiple applications.
We see this on mobile; there are really just two target platforms, yet companies don't even want to do that.
The choice isn't surprising, in a world where companies are more concerned with saving and branding, compared to creating good products.
I've only done single-platform GUI work (Python), but I'd guess this is stuff that is ripe for transpiling, since a lot of GUI code is just reusing the same boilerplate everyone is using to get the same UI patterns everyone is using. Like, if I make something in tkinter, it seems like it should be pretty straightforward to write a tool that can translate all my function calls, as I've structured them, into a chunk of Swift that would draw the same size window, same buttons, etc.
We get into transpiling and we essentially start to rebuild yet another cross platform framework. Starts with "read this filetype and turn it into this layout" and it ends up with "we'll make sure this can deploy on X,Y,Z,W..."
It'd be nice if companies could just play nice and agree on a standard interface. That's the one good thing the web managed to do. It's just stuck to what's ultimately 3 decades of tech debt from a prototype document reader made in a few weeks.
There is a lot of stuff you can get done with the standard library alone of various languages that play nice on all major platforms. People tend to reach for whatever stack of dependencies is popular at the time, however.
Visual Basic (and other 90s visual GUI builders) were great simple options for making GUI apps, but those GUIs were rather static and limited by today's standards. People have now gotten used to responsive GUIs that resize to any window size, easy dynamic hiding of controls, and dynamic lists in any part of the GUI; you won't get them to come back to a platform where their best bet at dynamic layout is `OnResize()` and `SubmitButton.Enabled = False`.
Pretty much any non-web GUI framework I tried so far has either been terrible to set up, or terrible to deploy. Or both. Electron is stupidly simple.
ImGUI is the single exception that has been simple to set up, trivial to deploy (there is nothing to deploy, including it is all that's needed), and nice to use.
> Isn't it time to throw the browser away, stop abusing HTML to make applications, and design something fit for purpose?
Great. How do you get all the hardware and OS vendors to deploy it for free and without applying their own "vetting" or inserting themselves into the billing?
I wouldn't say that. The web has done way more good than harm overall. What I would say is that embedding the internet (and the tracking, spyware, and dark patterns that have gained prominence) into every single application that we use is what is at fault.
The web browser that we built in 1990 was all on-premise obviously. And it had a very different architecture than HTTP. There were two processes. One used TCP/IP to mirror the plant computers model into memory on the workstation. The other painted the infographics and handled the user navigating to different screens. The two processes used shared memory to communicate. It was my first job out of university.
The Internet and its consequences have been a disaster for the human race. They have greatly increased the surveillance we endure for those of us who live in "advanced" countries, but they have destabilized society, have made life unfulfilling, have subjected human beings to indignities, have led to widespread psychological suffering and have inflicted severe damage on the natural world. The continued development of technology will worsen the situation. It will certainly subject human beings to greater indignities and inflict greater damage on the natural world, it will probably lead to greater social disruption and psychological suffering, and it may lead to increased physical suffering even in "advanced" countries.
Was doing sortable-and-filterable tables in the browser without a server round-trip 20 years ago using XML/XSLT and not thousands of lines of JS but something on the order of dozens.
These feel like all the things a proper "Web 3.0" should have solved. We have decades of lessons learned that we could apply with a soft reboot to how we envision the web.
Instead it's just piling on a dozen layers of dependencies. Webassembly feels like the only real glimmer of what the "next generation" could have been like.
When I use my work PC under Win 11, I endlessly notice the lag on basically everything. Click an email in Outlook and it takes 3 seconds to draw in... that's a good 12 billion cycles on a single core to do that. Multiply that by the hundreds/thousands of events across the system and I wonder how many trillions of cycles are wasted on bloat every day.
My 17 year old core 2 duo should not feel faster on a lean linux distro than modern hardware and yet it does. Wild to see and somewhat depressing.
I see old videos (Computer Chronicles is a good example) of what could be done on a 486, for instance. While you can see the difference in overall experience, it isn't that extreme of a difference, the 486 being released 37 years ago...
That tab refreshing thing really bugs me with fan fiction. If I think I might want to reread a story someday I'll download it, because if you read fan fiction you learn that many authors come back and fiddle with their earlier stories, sometimes even replacing the entire old story with chapter 1 of a complete rewrite. Even in the rare case that they actually do eventually finish the rewrite it is often not as good as the original.
AO3 HTML downloads have the story in one long HTML file. When reading that on iPad that stupid refresh can move you to the top which is pretty damned annoying.
For that very particular situation I do have a workaround, but it involves adding some JavaScript to the downloaded HTML. If anyone else is reading downloaded AO3 HTML and would like this, I've put it on pastebin.com. Get saveplace.js [1] and ao3book.css [2] and add this at the end of the head of your AO3 download:
First, to address the tab refresh problem, whenever you change your position in the story it waits until you've stopped at a new position for a bit and then records the new position in parameters on the URL. After a refresh happens it looks for those parameters and restores the last saved position.
Second, to make the story easier to read it hides all but the first chapter, adds buttons to move forward and back by chapter, and adds a dropdown to select chapters. It also adds a button to switch between night and day mode. The day/night mode setting is saved in local storage.
Feel free to use this in anything of your own. The chapter navigation stuff is tied to AO3's HTML, but that would be easy to delete leaving just the position saving/restoring. This is in the public domain in places where it is possible to put things in the public domain. If one of us is somewhere that isn't possible you can use it under the MIT No Attribution license (MIT-0).
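For anyone curious, the position-saving part boils down to something like this (a sketch of the idea, not the actual pastebin code):

```javascript
// Encode/decode the reading position as a URL query parameter, so it
// survives Safari's background tab refresh.
function encodePosition(scrollY) {
  return "?pos=" + Math.round(scrollY);
}

function decodePosition(search) {
  const m = /[?&]pos=(\d+)/.exec(search);
  return m ? Number(m[1]) : null;
}

// Debounce: only record the position once the reader has settled for a
// moment. In the page you'd call this from a scroll listener and pass
// history.replaceState-based callbacks so no history entries pile up.
let saveTimer = null;
function schedulePositionSave(getScrollY, replaceUrl, delayMs = 500) {
  clearTimeout(saveTimer);
  saveTimer = setTimeout(() => replaceUrl(encodePosition(getScrollY())), delayMs);
}
```

On load, `decodePosition(location.search)` tells you where to scroll back to after a refresh.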
AO3 also allows downloads of different formats including epub. I often download the epub (or use fichub for other sites) and read on the Epub Reader app. If I want to read on my Kindle app or my physical Kindle, I'll send the epub to my Amazon library via email.
In fairness, back in 2017 I bought a OnePlus 5T with 12G of RAM.
That's almost a decade ago.
Phone RAM progression has stagnated for a LONG time, and during that time I doubt that webpages have become lighter, so yeah, I'm not surprised by what you are saying.
I know this article is about RAM but I truly hate how little storage the iPhone ships with their phones. I guess everyone is using iCloud but I refuse to store my personal data on the cloud. I’m constantly down to 2-3 GB on my phone. I have just 128 GB of storage that’s not upgradable. What a shame.
My in-laws have probably discarded at least five or six Apple devices on that account. Typically they get used devices with a good number of years of updates remaining, but the updates are pointless when iOS grabs 50% for itself plus the actual update, resulting in a device that you may not be able to update even if you uninstalled everything.
The devices themselves are fast enough to run everything, you just can't update and eventually apps stop being available to the old iOS version they run.
Tin foil hat theory is that iCloud subscriptions are why Image Capture hasn't been updated in years and still crashes with big transfers. Not that I'd expect them messing with it at this point to produce a more useful tool.
The current iPhone is how many times more performant than a 3GS, and what exactly are we doing differently with it? Still scrolling Instagram, texts, WhatsApp, maps, shitty mobile web; literally nothing has changed about how we use these devices. Nothing. These things should be like camels and have batteries that last for weeks by this point. The hell is all that power even going toward? These phones are like Hummers. Just wasteful.
That is what happens when people learn to code and very little value is given to algorithms and data structures, regardless of the programming language.
This is why I miss Windows Phone. My $35 Lumia with 512 MB of RAM was infinitely smoother and faster than the 2GB Samsung Galaxy flagship phone I had, and of comparable fluidity to the so-much-more-expensive iPhones with 2GB RAM.
I doubt it. Microsoft would much rather sell you a thin client & a Windows 365 subscription, and Nvidia wants you to use GeForce Now instead of buying a GPU.
The shortage is manufactured, I have my doubts it will "end" in a conventional sense. I'm more skeptical and feel like this is yet another consolidation of wealth and a means of taking away compute power from people, which prevents startup competition. This way the hyperscalers are the only ones that can offer any meaningful compute.
The latest phone reviews have been eyebrow raising.
The just-announced Pixel is the same phone as last year. I know it sounds like the usual complaint, but look at the actual specs: it literally is the same phone, with differences so small that they might have passed as regional variance.
As for the Samsung, the screen can darken when looked from the side for privacy. That’s pretty much it. Price increased though.
Coupled with the current iOS situation it seems like things are… rotting. Everything in decline.
Upcoming Apple display mounted to wall or robot arm is rumored to have audio interface and new OS without 3rd-party apps, only "AI".
Jony Ive at OpenAI is rumored to have a smart speaker, pendant, pen, and bone-conducting headset in the launch pipeline. Audio interfaces, no screens.
Meta is selling millions of smart glasses, with Apple and others following.
If the memory market was not distorted, home AI + agents + open models could have a bigger role via AMD Strix Halo. Instead, they will be reserved for those who can afford to spend five figures on 512GB or 1TB unified memory on Mac Studio Ultra devices.
> users [could] interact with Siri and future Apple devices without speaking out loud.. AI systems capable of interpreting facial expressions and subtle muscle movements to understand so-called “silent speech.”
Not sure. Some AI audio pendants are always on. The Apple device is rumored to adapt its interface to the user based on facial recognition. They could choose to start monitoring audio when it thinks a known human wants to interact with the device, https://news.ycombinator.com/item?id=47145201
Apple is developing a tabletop robot as the centerpiece of its artificial intelligence strategy, with plans to launch the device in 2027.. The robot resembles an iPad mounted on a movable limb that can swivel to follow users around a room..The company is also exploring other robotics concepts, including a mobile bot with wheels similar to Amazon’s Astro, and has discussed humanoid models..
The FaceID subsystem is already pulsing periodically (N seconds?) on iPhones, e.g. to check for human attention. Apple could also use WiFi 7 Sensing (which can fingerprint humans by heartbeat) to trigger on human presence and determine when a full facial recognition scan was needed.
OSes have been in decline for a long time. This memory price is just a blip, though. These supply and demand shocks happen periodically and always return to normal.
> Coupled with the current iOS situation it seems like things are… rotting. Everything in decline.
Just "commoditizing". Last year's microwave ovens were basically the same as 2024's too, and no one cares. You still need them, and people still buy them and use them as much as ever, but at a replacement rate, not because of fashion or innovation.
That is a good thing. It means the economy is doing what it's supposed to do and bringing maximal value to consumers so we can spend our resources more efficiently (on other fashion-driven junk in different market segments), making us richer.
It's only bad news if your business is selling "phones" and not innovative products more generally. Which, yeah, is pretty much AAPL's trap. But that's on them, not us. We're winning.
We are and have been for many years now. Check out the "free" phone tier at your mobile vendor of choice. Those are great devices!
They may not match your particular tastes, but people with inflexible taste are always the last-resort market for manufacturers of commoditized products. People still buy from Hermès even though Shein completely dwarfs them in revenue, etc... That's the way it will always be with Apple too.
> The latest phone reviews have been eyebrow raising.
It's eyebrow raising for me in other ways.
I have a Pixel 9a and it's been quite good with really solid battery life. It's barely 6 months old and I got it new straight from Google.
A few days ago I noticed the battery started to drain much faster than usual. I also noticed at the same time Google is pushing the 10a.
Nothing changed on my end. I barely use the phone in my day to day. In 10 hours today I sent 3 text messages with Whatsapp and lost 60% of my battery in that time frame. Up until a few days ago, 60% would last me 3 days.
I find it weirdly coincidental that the battery life went from amazing to worse than a 5 year old device I had prior to this just as they are releasing new phones. I've powered it down and given it a full discharge / charge too. It's still draining at an alarming rate.
Did it happen to them in the last few days? Did it fix itself?
I wish there were better options for phones. It's absolutely crazy to me that a phone can be perfectly working one day and then it starts getting issues like this out of the blue.
It makes it completely undependable. All I want is a phone I can trust traveling with where I'm not going to wake up the next day and then the phone starts draining 3-4x faster than it normally does.
There hasn't even been a system update for almost 3 weeks, so it wasn't an update that busted things.
I had the same conversation yesterday: unexplained battery drain, as of then unresolved.
They mentioned people complaining on Reddit about battery drain since the last update, but I haven’t personally seen the threads so take it with a grain of salt.
I don't think Google is gimping your device to sell you new ones.
What you probably see is Google toggling on some new AI feature, which is now doing some initial on-device computation. It will usually calm down after a few days.
> I don't think Google is gimping your device to sell you new ones.
I'm not sure.
Last year they ruined my Pixel 4a by pushing that battery patch to everyone (even to end of life devices).
Their official repair center replaced the battery as per Google's guidelines and offer but during the battery swap they managed to physically break other parts of the phone over the course of multiple visits. Each visit broke a new component.
Google support didn't do anything in the end, eventually ignoring me. This went on for almost 6 months with an email chain over 100+ replies.
Eventually the repair center gave me enough credit towards a new phone (this 9a) but only after I mentioned I was going to small claims court since they left me with a device that didn't function in the way it used to function before I visited them.
I just noticed something called Smart Downloads in YouTube: it downloads 1GB of videos in the background that I may or may not watch, enabled by default. That surely drains the battery. It might also affect the longevity of the UFS storage.
I think even going back a few generations, phones are improving at a much slower pace. You can only jam so many cameras onto a phone frame before users lose interest. A few years back there was a mad dash to add AR features to flagship phones so they could wow us with apps that never materialized. My last few upgrades have been almost imperceptible. Buyers just don't have a good reason to buy new phones every two years.
I'm actually super fine with the hardware stagnating! Work on the yields, cut the prices, simplify and make it more robust, while keeping the spec the same. It gives developers something to focus on, like a console, so the software gets better over the life of the device, not worse.
Perhaps this could give room for physical design changes instead. I'm sick of phones that are just a slab of glass. I remember fun, weird, fashionable designs! Buttons and keyboards, phones felt like an individual choice, not just this boring black mirror. I'd take ten years of stagnation on hardware development in phones in exchange for ten years of exciting form factors with improving software. Let's face it: the spec is high enough for anything we need to be doing, by now. The software is the real problem, and there's room here for massive improvement.
You'd better not look at screen-to-body ratio since the Pixel 6a; it decreased for a few generations and only now is it finally back on par. That's not what I call evolution/progress.
Same with the Amazfit Bip from around 2018: you can't buy a small, thin (<10mm) watch with a battery requiring charging once a month and an always-readable MIP display (the more sunlight the better).
> I wonder whether we’ll see a secondary effect in the resale market.
I'm paying more on eBay for ThinkCentre Tinys and ThinkPads - 12th gen Intel and newer.
Refurbished spinny drives have been steadily climbing - up 50% since late last year. That's on top of the 20% mystery jump that happened in the last week of 2024.
We already are. Check eBay at the component level, which is showing it quite clearly. Look for secondary/reclaimed/refurbished components to backfill the gaps too.
Also be aware that this stuff whipsaws: if OpenAI actually takes possession of that memory and decides they can't use it and dumps it, we're going to see a crash. Likewise if they back out of the deals with the memory fabs (or fail and default). There's some scary volatility on the horizon.
Wait until we find out that all of tech (ever) has been subsidized by the true-so-far assumption of continued growth, allowing today’s costs to be paid for by tomorrow’s larger market.
Also Python generators for the lulz. They help one to write extremely memory-efficient programs. Perhaps the memory shortage further helps cement Python in the language popularity charts, vis-à-vis languages that tend to load whole data in memory by default, like R.
If we are talking about R, a lot of people who converted from R continued to operate in the same manner, by loading entire datasets into memory with pandas and numpy.
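To make the generator point concrete, here's a minimal sketch (the function names and inline sample data are mine, purely for illustration): a generator parses one line at a time, so an aggregate can be computed over an arbitrarily large file in O(1) memory, instead of loading the whole dataset up front pandas-style.

```python
def parse_values(lines):
    """Generator: yields one float per non-empty line, never materializing the whole input."""
    for line in lines:
        line = line.strip()
        if line:
            yield float(line)

def running_mean(values):
    """Consume an iterable one item at a time, keeping only two counters in memory."""
    total = count = 0
    for x in values:
        total += x
        count += 1
    return total / count if count else 0.0

# The list below stands in for a file handle like open("data.txt");
# with a real file, only one line would be in memory at a time.
sample = ["1.0", "", "2.0", "3.0"]
print(running_mean(parse_values(sample)))  # 2.0
```

The same shape works for sums, histograms, or any streaming aggregate; the key is that nothing upstream of the loop holds more than one record.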
Over investment in AI data centers is having a huge negative impact all over the economy. Other sectors are missing out on investment limiting their growth and stalling the economy.
Companies have reduced staff prematurely on the promise of productivity improvements that have not occurred, and have lost customers to terrible customer service and declining product quality.
Many hardware launches are going to be delayed or not meet expectations which really is the tip of the iceberg.
The US/SK memory cartel understandably sold out for a massive short-term windfall, but their long-term decisions to limit supply have created a huge opportunity for China. I wouldn't be surprised if this goes down in the history books as the start of the US/SK exit from the industry and the start of Chinese dominance.
The smart phone industry is likely to respond with an increasingly hostile anti-consumer approach as they try and lock customers into the cabins of the sinking ship. I expect cheap and cheerful Chinese budget phones aren't going anywhere.
I am happy for ram, cpu and storage to stall. I want a more robust and open phone which can take a fall and be updated long after the vendor loses interest. I expect to uninstall most of my apps rather than install new ones as I increasingly disconnect from an ever more distracting and worthless medium. I have cancelled nearly every subscription service in the last 12 months. And I have been deleting a lot of free accounts and apps. Its like doing a big cleanup. Surprisingly rewarding.
HN has felt like more than 50% AI industry promoting blog spam of little interest to me as a reader for some time. I am setting a budget of ten, no make it five, more posts here. Then I am out for good. Account deletion and no looking back.
The U.S. gov't is now committing a sizeable chunk of GDP to investments and subsidies to AI companies and data centers and has reduced overall investment in wind and solar.
Brutally cold capitalist take. Go walk around your city, friend; remember the tragedy of the commons. There is a lot that needs to be done that isn't being done, because we're soaking up people's life's work on this effort that we don't even know the end goal of. It could result in some awful outcomes for everyone if not guided correctly, and it seems like it's not being guided at all - or worse, it's being guided by the Department of War.
> Companies have reduced staff prematurely on the promise of productivity improvements that have not occurred and lost customers to terrible customer service and declining product quality.
Companies have reduced staff because of the impact of tariffs, because of low consumer confidence and spending, or as a ploy to pump share prices. Then they claim it’s AI, because it sounds a lot better to say that you’re reducing headcount because of AI than it does to admit that you’re cutting costs because of falling revenue.
I agree with you on the AI blogspam. This is a lot like the dot-com era, where a profusion of capital is causing people to develop complete horseshit products nobody needs. When the shine comes off, a lot of companies will fade, but many will stick around, and become the FAANGs of the 2030s.
In some ways it's pretty interesting to watch the entire world mobilize production for AI; some folks like to call this "hyperstition" as the future AGI reaches backwards in time to compel its own creation. Wild, but when trillions of dollars - i.e. millions of people's entire life output of work - are being put into something, it's truly an effort on a scale that no societal project has ever been before. There's no leader, nobody is in control, nobody has the grand vision other than "build the thing and get rich in the process". Amazing times to live in. The best use of our time and resources and coordination? Probably not... as we look around our broken cities, stepping over our poor and hopeless...
The deeper problem is that businesses are now expected to be funded by investors. There was a time when banks funded new businesses with loans, but now most of their lending is mortgages. Banks were better because they would lend to any business they thought wouldn't go bankrupt and weren't subject to FOMO and thinking only about future profits/exit, which they weren't entitled to.
Question is, is it really impossible for businesses to fund themselves with bank loans now? John Kay wrote about this years ago arguing the finance sector is no longer a good thing for society but has become more of a leech: taxing the money supply but not supporting new businesses. I feel like it's only become worse now. Even insurance is barely really insurance any more. It's more like a savings account that you might be able to withdraw from when you need it, but not necessarily.
Are we sure these are RAM shortages or RAM "shortages"? If these are "shortages", why blame the AI companies for exposing it? You should complain to the RAM makers for refusing to manufacture more.
The AI boom is worse for DRAM fab planning than the crypto boom was for GPUs.
It's way, way bigger, somehow has huge money behind it, and DRAM fabs are investments with usually no less than 5 years of latency from breaking ground on any particular one until the first memory sticks show up on a Best Buy shelf, with the factory planned to produce continuously for 20+ years in normal times to turn a profit once lending rates/interest are factored in.
They are not curtailing themselves in a market as lucrative as the current one; if anything, they're squeezing extra output from the lines with predictive-maintenance-style measures, erring on the side of more production, given the immense profit each gigabit they can deliver this year earns them.
Literally, over half of what a consumer pays today at Best Buy could be expected to be pure factory profit, given the non-artificial supply limits and the huge AI demand pushing out all but the wealthiest home-PC buyers...
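To illustrate why the lending rate matters so much over a 20-year horizon, here's a toy net-present-value sketch. Every number below is a made-up assumption for illustration only (a $20B build spread over 5 years, then a flat $2.5B/year of net income), not industry data; the point is just that a fab plan that pencils out at one interest rate can be underwater at a slightly higher one.

```python
def npv(cashflows, rate):
    """Net present value of yearly cashflows (index 0 = this year) at a discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Assumed numbers: 5 construction years of -$4B, then 20 operating years of +$2.5B.
build = [-4e9] * 5
operate = [2.5e9] * 20

print(npv(build + operate, 0.06) > 0)  # profitable at 6% money
print(npv(build + operate, 0.10) > 0)  # underwater at 10% money
```

With these toy inputs the same fab flips from a multi-billion-dollar gain at 6% to a loss at 10%, which is why long-lived capacity decisions are so sensitive to financing conditions.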
There have been multiple RAM price fixing scandals in the past so my default position is strong skepticism in this matter. I am sure they have various "explanations" but I will wait before believing them.
Dropped my iPhone a couple of days ago, so I had to go back to an old phone: a Pixel 3a. It opens Signal and Home Assistant faster than my 2022 iPhone ever did, so why would I even buy a new phone and go back at this point? The best phones (price/value) have already been built and sold.
It's that everything has become 20% more expensive in the past year, I'm being taxed to death, fighting with companies trying to money grab me, my electric bill is now $800, and I'm now too broke to buy a new phone every 2 years when most of my income gets eaten by the "system".
I'll wait until either SPY does another 50% run or BTC does another 100% run and then I'll buy a new phone. Google, you want me to buy your new phone? Do something to make SPY or BTC go up and then we'll talk. Until then my current phone works, and the new features aren't a must-have.
Yeah, so I'm not buying unnecessary crap this year.
If the "system" wants to drive more consumption, it's on the "system" to put more buying power into my hands. Double my salary, reduce my taxes, make BTC do a big run up, something. Otherwise I'm happy staying put.
13%!!! This should be a code red level event for … the world? I … don’t understand how world leaders are just standing by? Smartphone growth/adoption has been the bedrock of a LOT of economic growth. I would have expected massive Government intervention to avoid this.
Where are the China hawks? The argument for protecting Taiwan was that without their chips the smartphone market would contract, right? That's what's happening now?!
I wonder if we will ever get an approximate percentage of GDP, or some other hard numbers, for how much Sam/OpenAI (and the manufacturers, of course) hurt the global economy with all of this?
Fewer phones, computers, consoles, servers, etc. sold (and everything that follows from this) seems a far larger impact on the economy than a few thousand new ChatGPT Pro memberships...
Let's hope the component shortages will drive performance improvements in Apps as it will be unfeasible to expect higher specs to fix performance bottlenecks. Constraints can drive good behavior.
I remember a few years back when Jon Blow (Braid, The Witness) did a few talks about the fact that the biggest progress in recent years had been in hardware performance, making lazy software development standard since the hardware made it so easy to ignore any limitations.
I'm not as much of a fan these days but I do hope these limitations have the effect of improving best practices.
Well, maybe people stop changing their smartphone every two years. Or every year. Imagine the positive impact on the environment!
I am always surprised that when the planet-caring, liberal Apple boss shows up on the big Apple event stage, he encourages people to ruin the planet through needless purchases of new hardware, even though the old hardware can do the same job easily, as the improvements are now barely incremental, if any.
selridge | 23 hours ago
paxys | 23 hours ago
selridge | 23 hours ago
dude250711 | 23 hours ago
mlyle | 23 hours ago
vessenes | 23 hours ago
inigyou | 23 hours ago
nomel | 20 hours ago
> By contrast, Apple and Samsung are better positioned to navigate this crisis. As smaller and low-end-positioned Android vendors struggle with rising costs, Apple and Samsung could not only weather the storm but potentially expand market share as the competitive landscape tightens.”
darthoctopus | 23 hours ago
[1]: https://www.mooreslawisdead.com/post/sam-altman-s-dirty-dram...
msy | 23 hours ago
lostmsu | 23 hours ago
ajross | 21 hours ago
mschuster91 | 21 hours ago
Not the person Sam Altman specifically, but AI in general. It was obvious even in 2024 that braindead beancounters were jumping on the hype train, so much so that coal power plants were kept alive to satiate the power hunger [1]. The last time that shit happened, it was the coin craze [2], but unlike cryptocurrencies there was and is an actual product being made...
[1] https://www.theregister.com/2024/10/14/ai_datacenters_coal/
[2] https://www.theguardian.com/technology/2022/feb/18/bitcoin-m...
mschuster91 | 21 hours ago
And I'd say if it ends up being shown there even is the slightest hint of impropriety going on, trial him. Up to and including capital punishment for the entire board and C level - what OpenAI already has done, even if legally on paper, IMHO is the biggest market manipulation in history, and it's not just one competitor that is suffering but society as a whole.
I don't have an issue with big companies and their super rich investors engaging in petty bitch fights. By all means, hand me some popcorn and soda. But the RAM situation, with everyone not being super rich and flush with cash from AI crazed investors being screwed royally? That is far beyond acceptable.
We need to send a message: you can't mess around with the world economy at that level without feeling serious repercussions. The lives of the billions are not playthings for the select few.
And if it turns out to be outright market manipulation, engaging in deals he doesn't even have the money committed for by others, much less actually have it on his balance sheet? Then it's time for the pitchforks, not even Madoff was this ruthless.
donkeybeer | 11 hours ago
Animats | 23 hours ago
The IDC article says that DRAM prices are not expected to come down again. "While memory prices are projected to stabilize by mid-2027, they are unlikely to return to previous level — making the sub-$100 segment (171 million devices) permanently uneconomical." Before, they always came back down in the next RAM glut, when everybody built too much capacity. Why is that not going to happen next time?
[1] https://www.heise.de/en/news/Storage-crisis-Playstation-6-co...
darthoctopus | 23 hours ago
Because this shortage isn't natural, it's the result of OpenAI flexing monopsony power to deprive everyone else for its strategic gain. Unlike an organic shortage, there is no compelling reason for otherwise excess capacity to be built, since this artificial shortage can end as arbitrarily as it started.
MadameMinty | 23 hours ago
inigyou | 23 hours ago
m4rtink | 22 hours ago
No reason the same can't happen now - especially for something as expensive and fairly easily re-sellable as a datacenter and the hardware inside. Just rip it all out and sell it for parts where they are actually needed.
mr_toad | 22 hours ago
inigyou | 21 hours ago
m4rtink | 21 hours ago
https://www.tomshardware.com/tech-industry/shareholders-sue-...
mr_toad | 9 hours ago
mlyle | 23 hours ago
Here we're facing different forces-- unprecedented demand for DRAM that may be durable. But it also looks like the pace of supply changes may be decreased as process improvements get smaller and the industry stops moving so much in lockstep.
It still matters what happens to the demand function, though. If enough AI startups blow up that there's a lot of secondhand SDRAM in the market, and demand for new SDRAM is impacted, too, that will push things down.
Sort of like what happened with the glut of telecom equipment after the dot-com bust.
vlovich123 | 23 hours ago
ErneX | 23 hours ago
bayarearefugee | 23 hours ago
xenadu02 | 21 hours ago
Sure thing. I'd take a look at IDC & similar firms' forecasting history before worrying too much about what they say.
There is an AI boom right now. There will be a consolidation cycle at some point. When that happens half the players, if not more, will disappear. The huge hardware budgets will go with them.
We also can't be certain that the DRAM makers aren't capitalizing on this opportunity because they can. Remember: all of them are convicted monopolists. As in actual prison time convicted. And fined. And lost civil lawsuits. Multiple times.
I just can't see AI paying enough of a premium on HBM to justify the DRAM spikes. Frankly I can't see the volume either. Wafer starts on DRAM are dramatically bigger than you are probably imagining. DRAM is in practically everything these days. AI servers are but a drop in the bucket. 10% of the market? Yeah right; if it's 4% I'd be shocked. And you're telling me a shift of 4% of wafers to HBM is driving these prices and shortages?
I humbly suggest if you look at the numbers something smells funny.
Disclaimer: none of us has access to the actual data, a lot of it is inferred by industry players. Some are well connected and usually accurate but that is not evidence. Therefore it is possible this is a genuine market action and nothing nefarious is going on.
zozbot234 | 20 hours ago
OsrsNeedsf2P | 23 hours ago
trvz | 23 hours ago
niek_pas | 23 hours ago
asdff | 12 hours ago
Now that I think on it maybe I ought to just pay attention to jailbreakable OS version numbers again. If I stuck on one of those OS versions I could just spoof my user agent for the bank website with a jailbroken phone.
ranguna | 8 hours ago
It's like saying why should you get a new gaming laptop to replace your 6 year old current gaming laptop, when all you do is office work. If all you do is office work, why buy a gaming laptop at all? Just use a standard okish smartphone or tablet.
asdff | 3 hours ago
jsheard | 23 hours ago
https://www.androidpolice.com/google-pixel-10-3-5-gb-ai-only...
drnick1 | 23 hours ago
inigyou | 23 hours ago
esperent | 22 hours ago
recursive | 22 hours ago
These techniques seem not to be widely known. A kagi search turned up only information about some singer.
esperent | 17 hours ago
https://github.com/RikkaApps/Shizuku
And canto not canta (search the play store).
My apologies, I got both last letters wrong!
b112 | 14 hours ago
Seems far easier to just use ADB. Especially rather than trusting a codebase you don't know, and an app you don't know.
I also find it better to use ADB, list all apps installed, remove what I personally choose, instead of a list by others.
It's fairly easy:
(I have a list of about 100 apps I do this with, on mainline Android phones.) It's the best you'll typically get: the app is deactivated for your user, but still in the ROM of course.
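The ADB flow described above can be scripted. Here's a rough sketch of the pure-logic part: it parses the output of `adb shell pm list packages` and builds per-user uninstall commands. The package names in REMOVE are hypothetical placeholders; you'd build your own list after reviewing the real output on your device.

```python
# Hypothetical examples only -- substitute packages you actually want gone.
REMOVE = {"com.example.bloatware", "com.example.adware"}

def parse_packages(pm_output):
    """Turn lines like 'package:com.foo.bar' into bare package names."""
    return [line.split(":", 1)[1].strip()
            for line in pm_output.splitlines()
            if line.startswith("package:")]

def uninstall_commands(packages, remove=REMOVE):
    """Commands that remove each listed app for user 0 only.
    The APK stays in the ROM; this just deactivates it for your user."""
    return [f"adb shell pm uninstall --user 0 {p}" for p in packages if p in remove]

sample = "package:com.example.bloatware\npackage:com.android.settings\n"
for cmd in uninstall_commands(parse_packages(sample)):
    print(cmd)
```

Keeping the removal list in a file like this also makes the cleanup repeatable across phones and after factory resets.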
inigyou | 14 hours ago
drnick1 | 22 hours ago
ProfessorLayton | 23 hours ago
A lot of software has been squandering the massive hardware gains that have been made. I hope this changes when it becomes a lot harder to throw hardware at the problem.
I also wonder what this means for smartphone-esque devices like the Switch 2. If this goes on long enough I won't be surprised if they release a 'lite' model with less RAM/Storage and bifurcate their console capabilities, worse than what they did with 3DS > 2DS .
arccy | 23 hours ago
thewebguyd | 23 hours ago
Is it too much to ask for me to manage my own background processes on my phone? I don't want the OS arbitrarily deciding what to pause & kill. If it actually does OOM, give me a dialog like macOS and ask me what to kill. Then again, if a phone is going OOM with 12GB of RAM there's a serious optimization problem going on with mobile apps.
estimator7292 | 23 hours ago
Apple seemingly wants all apps to be static jpegs that never need to connect to any data local or remote, and never do any processing. If you want to do something in the background so that your user can multitask, too damn bad.
You can run in the background, for a non-deterministic amount of time. If you do that, iOS nags your user to make it stop. If you access radios, iOS nags your user to disable it.
It's honestly insane. I don't know why or how anyone develops for this platform.
Not to mention the fact that you have to spend $5k minimum just to put hello world on the screen. I can't believe that Apple gets away with forcing you to buy a goddamn Mac to compile a program.
n8cpdx | 23 hours ago
People develop for iOS because iOS users spend more money. End of story.
homebrewer | 12 hours ago
ErneX | 12 hours ago
babypuncher | 23 hours ago
These are features, because we can't trust developers to be smart about how they implement these. In fact, we can't even trust them not to be malicious about it. User nags keep the developer honest on a device where battery life and all-day availability are arguably of utmost importance.
> you have to spend $5k minimum just to put hello world on the screen.
Now that's just nonsense.
post-it | 22 hours ago
It's inconvenient that apps can't do long-running operations in the background outside of a few areas, but that's a design feature of the platform. Users of iOS are choosing to give up the ability to run torrent clients or whatever in exchange for knowing that an app isn't going to destroy their battery life in the background.
ErneX | 13 hours ago
toast0 | 22 hours ago
Android does all sorts of wacky stuff with background tasks too... Although I don't feel like my 6 GB Android is low memory, so maybe there's something there, but I also don't run a lot of apps, and I regularly close Firefox tabs. Android apps do mostly seem well prepared for background shenanigans, cause they happen all the time. There's the AOSP/Google Play background app controls, but also most of the OEMs do some stuff, and sometimes it's very hard to get stuff you want to run in the background to stay running.
I dunno about watches, but Airpods work fine with Android, as long as you disconnect them from FindMy cause there's no way to make them not think they're lost (he says authoritatively, hoping to be corrected).
spaqin | 19 hours ago
kyralis | 17 hours ago
skeptic_ai | 17 hours ago
mosura | 23 hours ago
There is a strong argument modern mobile goes too far for this.
Gigachad | 23 hours ago
mosura | 23 hours ago
mort96 | 22 hours ago
LtWorf | 22 hours ago
goalieca | 23 hours ago
toast0 | 22 hours ago
I don't work at that kind of level, so I dunno if the juice would be worth the squeeze (sleep with DRAM refresh is already very low power on phone scales), but it seems doable.
mort96 | 22 hours ago
There's a reason why we say unused RAM is wasted RAM.
zozbot234 | 22 hours ago
mort96 | 22 hours ago
kyralis | 17 hours ago
giancarlostoro | 23 hours ago
I remember an app on Android (I don't recall the name specifically) that would let me download any website for offline browsing or something; I'd use it when I knew I might have no internet, like on a cruise.
Heck there used to be an iOS client for HN that was defunct after some time, but it would let you cache comments and articles for offline reading.
LtWorf | 23 hours ago
ibejoeb | 23 hours ago
Safari suspends backgrounded tabs. I think that's what we're observing here rather than strictly memory pressure.
deaddodo | 22 hours ago
That being said, there's no reason the Safari context shouldn't be able to suspend the JS and simply resume when the context is brought back to the foreground. It's already sandboxed; just stop scheduling JS execution for that sandbox.
HerbManic | 15 hours ago
crowfunder | 10 hours ago
I coded an extension that adds a context menu for opening videos in embed mode. https://addons.mozilla.org/pl/firefox/addon/youtube-open-as-...
mikepurvis | 23 hours ago
My understanding was that market research showed a lot of users were turning off the 3D stuff anyway, so it seemed reasonable to offer a model at lower cost without the associated hardware.
jsheard | 23 hours ago
It was also because young children weren't supposed to use the 3D screen due to fears of it affecting vision development. You could always lock it out via parental controls on the original, but still that was cited as a reason for adding the 2DS to the lineup.
https://www.ign.com/articles/2013/08/28/nintendo-announces-2...
> Fils-Aime said. “And so with the Nintendo 3DS, we were clear to parents that, ‘hey, we recommend that your children be seven and older to utilize this device.’ So clearly that creates an opportunity for five-year-olds, six-year-olds, that first-time handheld gaming consumer."
dangus | 23 hours ago
Although, for a $450 device that doesn’t need to make much of a profit on its own, I also don’t think they’re heavy on memory in the first place (12GB). You can buy top quality Chinese Android handhelds with more RAM and better Qualcomm processors than the Switch 2 for about the same price, and those companies are making $0 in software royalties (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
jsheard | 23 hours ago
Every version of the Switch 1 had 4GB of RAM; they didn't cut that on the Lite. Going back and patching every game to ensure it ran on less RAM than it was originally designed for would have been a nightmare.
> (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
AYN just announced that the Thor will get a price increase soon for obvious reasons.
https://www.reddit.com/r/SBCGaming/comments/1rf5gxq/to_thor_...
dangus | 23 hours ago
Of course the Thor Max will have a price increase, but also, obviously 16GB/1TB is a massively bigger bill of materials than the Switch 2’s 12GB/256GB configuration.
And I forgot to mention that Nintendo has far more pricing leverage in terms of their volume.
dude250711 | 23 hours ago
rationalist | 19 hours ago
App battery usage is unrestricted, so it's not that.
biophysboy | 23 hours ago
goalieca | 23 hours ago
mschuster91 | 21 hours ago
Sure, but otherwise, the competition will be first to market, and the exec may lose their bonus. So, the exec keeps their bonus, and when the tech debt collapses, the exec will either have departed long ago or will be let go with a golden parachute, and in the worst case an entire product line goes down the drain, if not the entire company.
The financialization and stonkmarketization of everything is killing our society.
tkzed49 | 23 hours ago
With that contract being eroded, I think the sloppiness of testing, validation, and even architecture in many organizations is going to be exposed.
al_borland | 21 hours ago
KeplerBoy | 23 hours ago
That trend might reverse if porting to a best practice native App becomes trivial.
fzeroracer | 23 hours ago
I wouldn't call it an idealist position as much as a fools one. Companies don't give a shit about software security or sustainable software as long as they can ship faster and pump stocks higher.
bigstrat2003 | 21 hours ago
canthonytucci | 23 hours ago
Big name apps like Facebook, YouTube, Apple Music, and Apple Podcasts seem totally disinterested in preserving my place.
YouTube being the worst where I often stack a bunch of videos in queue, pause to do something else for a while and when I return to the app the queue has been purged.
canthonytucci | 23 hours ago
idle_zealot | 23 hours ago
bakugo | 23 hours ago
If you switched off the app while looking at a certain post or watching a certain video, that's a negative engagement indicator, so the app wants to throw you back into the algorithmic feed to show you something new instead.
mcdeltat | 23 hours ago
ssl-3 | 22 hours ago
recursive | 22 hours ago
alpaca128 | 21 hours ago
user205738 | 14 hours ago
user205738 | 14 hours ago
So you don't even need an ad blocker, just a sponsor block.
By the way, this (not an extension, but logging in from a Russian IP) removes ads from all other Google services.
ssl-3 | 9 hours ago
On PC, I use Firefox with the uBlock Origin extension and I see no ads on Youtube.
Same with my pocket supercomputer: Firefox works great on Android, including for Youtube. And it uses extensions like the PC version does. No ads there, either.
On the BFT in the living room, I have a Google-manufactured Google TV device. It runs SmartTube, and displays no ads on Youtube.
I even have an iPad that I use primarily for watching Youtube videos. For that, I stay completely within the confines of the walled garden and use Safari with the AdBlock add-on. And: if you're guessing that I'm about to write that I have no ads on Youtube there either, then you're right. There are no ads on Youtube with that device, either.
Am I doing this wrong?
Maybe my perspective differs from that of some others, but it seems to all work very well for me here in 2026. (There's been some ups and downs with this over the years, but it all finds its way back to exactly what I wrote above, anyway.)
alpaca128 | 7 hours ago
Roughly in that timeframe YT also successfully blocked downloads with yt-dlp for a bit. Seems like they're trying harder now because of AI scrapers.
ssl-3 | 5 hours ago
And also this, from a couple of weeks ago: https://www.firefox.com/en-US/firefox/147.0.4/releasenotes/ (with Linux, but that probably doesn't matter at all)
And that's about it. I recently pruned some other Firefox extensions while troubleshooting completely unrelated issues, and all that's left is uBlock Origin, Dark Reader, and BitWarden.
Seriously, I've had no recent issues with Youtube ads at all and certainly none in January or February of this year. It's been smooth-enough for me on all of the platforms I mentioned before (and I use them all quite a lot, except perhaps for the BFT).
I wonder what's different on your end?
alpaca128 | 3 hours ago
Turns out if both this and uBlock are active, YouTube will refuse to work. But only uBlock works just fine.
mort96 | 22 hours ago
Why??
post-it | 22 hours ago
ndarray | 22 hours ago
01HNNWZ0MV43FF | 22 hours ago
tzs | 19 hours ago
mort96 | 14 hours ago
stephen_g | 21 hours ago
maccard | 13 hours ago
al_borland | 21 hours ago
skhr0680 | 21 hours ago
nntwozz | 21 hours ago
It's all gone to $hit, efficiency is gone it's just slop on top of more slop.
kalleboo | 21 hours ago
[0] https://developer.apple.com/documentation/SwiftUI/restoring-...
Fr0styMatt88 | 21 hours ago
Case in point — Youtube background play doesn’t pause when Siri makes an announcement, so if you’re listening to something you get two voices over each other.
I gave it the benefit of the doubt and figured it must be some kind of iOS thing, until I was listening to Audible one day and it paused automatically. So it's just a Google thing, not a third-party-apps thing.
I have the same issue with the Youtube queue; this is something that could easily be persisted, but they just choose not to.
christophilus | 20 hours ago
tomrod | 19 hours ago
jt2190 | 23 hours ago
“Save webpages to read later in Safari on iPhone” https://support.apple.com/guide/iphone/save-pages-to-a-readi...
deaddodo | 23 hours ago
layer8 | 22 hours ago
babypuncher | 23 hours ago
Nintendo can't realistically take memory budget away from developers after the fact. The 2DS cut the 3D feature from the 3DS, but all games were required to be playable in 2D from day 1, so no existing games broke on the cost-reduced 2DS.
jama211 | 22 hours ago
expedition32 | 22 hours ago
Hardware is pretty useless if the software that drives it is useless. I don't know, it probably works better in China; all I know is that I went back to good old Samsung.
Liftyee | 22 hours ago
The market demands must be different there. I've disabled "battery optimisation" for all the apps I need to stay open (and some apps even prompt me to disable it!), and I don't have any issues in daily use.
flakiness | 22 hours ago
[1] https://source.android.com/docs/core/perf/cached-apps-freeze...
kevin_thibedeau | 20 hours ago
That's social engineering to get themselves more background network activity. I wouldn't trust such an app.
Markoff | 15 hours ago
Markoff | 15 hours ago
If you run out of smartphone battery you are in much bigger trouble in China than in the West, since a phone is necessary to function almost everywhere. That's why they have rental power bank stands in literally every restaurant and every small grocery shop; you are never further than about a 5-minute walk from one in an urban area.
Btw, you can always put an app on the protected/not-optimized list, which usually solves problems with most western apps on Chinese phones (essential Chinese apps like WeChat are on the list by default).
HerbManic | 15 hours ago
Hard to tell if it's something I am doing or not. I will say, with all these phones and everything Google turned off, I typically get 3-4 days per charge, but that really depends on your usage.
brendyn | 22 hours ago
14 | 17 hours ago
Anyways, I wanted to say I also have a Pixel 8, but with the stock OS, and my battery typically lasts a full day with average usage. My iPhone 8 previously, even with a replacement battery, was lucky if it lasted more than 5 hours. I had to charge that thing multiple times a day.
marcellus23 | 16 hours ago
gib444 | 15 hours ago
(I have the exact same issue)
interloxia | 15 hours ago
https://f-droid.org/packages/com.github.muellerma.coffee/
Waterluvian | 22 hours ago
TheRoque | 22 hours ago
It happened a lot on my previous phone with only 4GB ram though
bigstrat2003 | 21 hours ago
Considering how many people are so averse to programming that they use LLMs to generate code for them? Not very likely IMO. I would like to see it happen, but people seem allergic to actually trying to be good at the craft these days.
londons_explore | 21 hours ago
Imagine you are Apple and can just set an LLM loose on the codebase for a weekend with the task to reduce RAM usage of every component by 50%...
al_borland | 21 hours ago
Also, what happens to the stability and security of my phone after they let an LLM loose on the entire code base for a weekend?
There are 1.5 billion iPhones out there. It’s not a place to play fast and loose with bleeding edge tech known for hallucinations and poor architecture.
rescbr | 21 hours ago
If you direct it to do a specific task to find memory and cpu optimization points, based on perf metrics, then it’s a completely different world.
jfim | 21 hours ago
I asked Claude to find all the valid words on a Boggle board given a dictionary, and it wrote a simple implementation that basically tried to search for every single word on the board. Telling it to prune the dictionary first, by building a bit mask of the letters in each word and on the board and then checking whether the word is even possible on the board, gave something like a 600x speedup from just a simple prompt describing what to do.
That does assume that one has an idea of how to optimize, though, and where the bottlenecks are.
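The pruning idea described above can be sketched in a few lines. This is a hypothetical illustration (the function names are mine, not from the actual Claude session): build a 26-bit letter mask for the board, then discard any dictionary word whose mask has a bit the board lacks, before running any expensive board search.

```javascript
// Build a 26-bit mask with one bit per letter present in the string.
function letterMask(s) {
  let mask = 0;
  for (const ch of s.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97; // 'a' -> bit 0
    if (i >= 0 && i < 26) mask |= 1 << i;
  }
  return mask;
}

// Keep only words whose letters all appear somewhere on the board;
// everything else can't possibly be found, so don't search for it.
function pruneDictionary(words, boardLetters) {
  const boardMask = letterMask(boardLetters);
  return words.filter(w => (letterMask(w) & ~boardMask) === 0);
}
```

For example, with board letters "tacr", `pruneDictionary(["cat", "dog", "act"], "tacr")` keeps only "cat" and "act", since "dog" contains letters that never appear on the board. The full word search then only runs on the survivors.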
al_borland | 20 hours ago
iOS is 19 years old, built on top of macOS, which is 24 years old, built on top of NeXTSTEP, which is 36 years old, built on top of BSD, which is 47 years old. We’re very far from greenfield.
teeray | 21 hours ago
They are trained on everything, and as a result write code like the Internet average developer.
okanat | 10 hours ago
Great UIs are written by above average or even exceptional developers. Such experience is tied to the real-life reasoning and combining unique years-long human experience of interacting with the world. You need true general intelligence for that.
teeray | 9 hours ago
charcircuit | 4 hours ago
Before post training (GPT3 2020 class models). Post training makes it no longer act like the average.
robinwassen | 20 hours ago
A handwritten C implementation would most likely be better, but there is so much to gain just from slaughtering the abstraction bloat that it doesn't really matter.
alpaca128 | 21 hours ago
JKCalhoun | 8 hours ago
(I'm looking at you, Liquid Glass. I would love to get back to a vintage, "flat" UI. I'll allow for anti-aliasing, Porter-Duff compositing, but that's where I draw the line.)
h4kunamata | 21 hours ago
I have to use a MacBook M4 at work with 24GB; I have an AMD Lenovo Ryzen 7 with 32GB running Linux Mint Cinnamon. It is infuriating how slow this MacBook is, even shutting it down is slow asf.
macOS is no different from Windows. I cannot wait for COB to get back to my Linux laptop.
varispeed | 21 hours ago
galangalalgol | 21 hours ago
rescbr | 21 hours ago
Companies install so much invasive shit in the name of security theater and employee control that there is a lot of waste going on.
mschuster91 | 21 hours ago
Spotted the German lol
The general problem is that many people don't bother testing their apps outside their office wifi, with its low latency, low jitter, low packet loss and high bandwidth. Something like persisting state to some cloud endpoint when the OOM/battery-saver killer comes knocking? Perfectly fine on wifi... but on a mobile connection that might be EDGE, might be cut entirely because the user is getting a phone call and the carrier doesn't do VoLTE, or might have absurd latency? Whoops. The process killer delivers a -9 and that's it, state be gone.
Side note: Anyone know of a way to prevent the iPhone hotspot from disassociating with a MacBook when the phone loses network connectivity? It's darn annoying, I counted having to reconnect twenty times on a train ride less than an hour.
shafiemoji | 21 hours ago
krieger_857 | 7 hours ago
dawnerd | 21 hours ago
gib444 | 14 hours ago
Enable debug with:
$ defaults write com.apple.Safari IncludeInternalDebugMenu -bool YES
intrasight | 20 hours ago
mikestorrent | 20 hours ago
Back in the day with PHP, things were much more understandable; it's somehow gotten objectively worse. And now most desktop apps are their own contained browser. Somehow worse than Windows 98 .hta apps, too, where at least the system browser served up a local app; now we have ten copies of Electron running, bringing my relatively new MacBook to a crawl. Everything sucks and is way less fun than it used to be.
We have many, many examples of GUI toolkits that are extremely fast and lightweight. Isn't it time to throw the browser away, stop abusing HTML to make applications, and design something fit for purpose?
kgwxd | 18 hours ago
It's not "the web" or HTML, CSS, or JavaScript. That's all instant in vanilla form. Any media at today's quality will of course take time to download but, once cached, is also instant. None of the UX "requires" the crap that makes it slow, certainly not thousands of lines to make a table sortable and filterable. I could do that in IE6 without breaking a sweat. It's way easier, and faster, now. It's just people being lazy about how they do it, apparently now just accepting whatever Claude gave them as "best in show".
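For what it's worth, the "sortable table without thousands of lines" claim holds up. A hedged sketch in plain JavaScript (names are illustrative, not from any library): the pure helper sorts row arrays by a column, and in the browser you'd attach it to `<th>` click handlers and re-render the `<tbody>`.

```javascript
// Sort an array of rows (arrays of cell values) by one column.
// Cells that both parse as numbers are compared numerically;
// everything else falls back to string comparison.
function sortRows(rows, col, descending = false) {
  return [...rows].sort((a, b) => {
    const [x, y] = [a[col], b[col]];
    const [nx, ny] = [Number(x), Number(y)];
    const cmp = !Number.isNaN(nx) && !Number.isNaN(ny)
      ? nx - ny
      : String(x).localeCompare(String(y));
    return descending ? -cmp : cmp;
  });
}
```

Filtering is just as small: `rows.filter(r => r.some(c => String(c).includes(query)))`. That's the whole feature, no framework required.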
NullPrefix | 18 hours ago
m-schuetz | 15 hours ago
Not going to happen until GUI frameworks are as comfortable and easy to set up and use as HTML. Entry barrier and ergonomics are among the biggest deciding factors in winning technologies.
dsego | 15 hours ago
qsera | 14 hours ago
okanat | 11 hours ago
The problem with cross-platform UI is that it is antithetical to the very reason an OS-native UI exists. Cross-platform tries to unify the UX, while native UI tries to differentiate the UX. Native UI wants unique, incompatible behavior.
So the cross-platform UI frameworks that try to use the actual OS components always end up with terrible visual bugs, from unifying things that don't want to be unified. Or worse, many "cross platform" UI frameworks try to mimic their developer's favorite OS. I have seen way too many Android apps built with "cross platform" frameworks that draw iOS UI elements.
The best way to do cross-platform applications with a GUI (I specifically avoid saying cross-platform UI) is to define yet another platform above a very basic common layer. This is what the Web has done. What a browser asks from an OS is a rectangle (a graphics buffer) and the fonts to draw a webpage. Nothing else. The entire drawing functionality and behavior is redefined from scratch. This is the advantage of the Web, and this is why Electron works so well for applications deployed on multiple OSes.
qsera | 10 hours ago
I have created and used them. They didn't look terrible on Windows.
>What a browser asks from an OS is a rectangle (a graphics buffer) and the fonts to draw a webpage. Nothing else. Entire drawing functionality and the behavior is redefined from scratch. This is the advantage of Web..
I think that is exactly what Gtk does (and maybe even Qt) too.
I think it is just that there is not much funding going to those projects. The Web, on the other hand, being an ad-delivery platform, the sellers really want your browsers to work and look good...
johnnyanmac | 2 hours ago
The remnants of the dotcom era definitely helped shape the web in a more design-conscious way, by comparison. Those standards are created and pushed a few layers above where cross-platform UIs operate.
mrweasel | 13 hours ago
You can do the core functionality of your product as cross-platform code, to some extent, but once you hit the interaction with the OS, and especially the UI libraries of the OS, I think you'd get better software if you just accept that you'll need to write multiple applications.
We see this on mobile: there are really just two target platforms, yet companies don't even want to do that.
The choice isn't surprising in a world where companies are more concerned with savings and branding than with creating good products.
asdff | 13 hours ago
johnnyanmac | 2 hours ago
It'd be nice if companies could just play nice and agree on a standard interface. That's the one good thing the web managed to do. It's just stuck with what's ultimately three decades of tech debt from a prototype document reader made in a few weeks.
asdff | 13 hours ago
arexxbifs | 15 hours ago
dvdkon | 13 hours ago
asdff | 13 hours ago
m-schuetz | 12 hours ago
ImGUI is the single exception that has been simple to set up, trivial to deploy (there is nothing to deploy, including it is all that's needed), and nice to use.
asdff | 12 hours ago
lpcvoid | 12 hours ago
pjc50 | 13 hours ago
Great. How do you get all the hardware and OS vendors to deploy it for free and without applying their own "vetting" or inserting themselves into the billing?
fsflover | 12 hours ago
pjc50 | 12 hours ago
fsflover | 5 hours ago
miroljub | 12 hours ago
We had Flash for exactly that purpose. For all its flaws, it was our best hope. A shame Apple and later Adobe decided to kill it in favor of HTML5.
The second best bet was Java applets, but the technology came too early and was dead before it could take off.
Some may mention WebAssembly, but I just don't see that as a viable alternative to the web mess that we already have.
SoftTalker | 6 hours ago
prmph | 12 hours ago
I agree we need built-in controls, reasonably sophisticated, properly style-able with CSS. We also need typed JS in the browser, etc.
yread | 12 hours ago
Just use jquery and this plugin, 7kB minified:
https://github.com/myspace-nu/jquery.fancyTable/blob/master/...
dzonga | 12 hours ago
what's not great are the complexity merchants who, due to money and other incentives, ship that complexity to the web.
there are better web frameworks that are lighter and faster than React, but with the hype, you know how that goes
intrasight | 7 hours ago
I wouldn't say that. The web had done way more good than harm overall. What I would say is that embedding the internet (and its tracking and spyware and dark patterns that have gain prominence) into every single application that we use is what is at fault.
The web browser that we built in 1990 was all on-premise obviously. And it had a very different architecture than HTTP. There were two processes. One used TCP/IP to mirror the plant computers model into memory on the workstation. The other painted the infographics and handled the user navigating to different screens. The two processes used shared memory to communicate. It was my first job out of university.
Henchman21 | 4 hours ago
You know, or something.
SoftTalker | 7 hours ago
intrasight | 5 hours ago
I know that Chrome pulling the plug on XSLT in the browser is imminent - so how are you refactoring?
johnnyanmac | 2 hours ago
Instead it's just piling on a dozen layers of dependencies. Webassembly feels like the only real glimmer of what the "next generation" could have been like.
guidedlight | 20 hours ago
https://en.wikipedia.org/wiki/Wirth%27s_law
pphysch | 20 hours ago
Rohansi | 16 hours ago
It's not though, is it? Even browsers are capable of painting most pages at over 60 FPS. It's all the other crappy code making everything janky.
HerbManic | 16 hours ago
My 17 year old core 2 duo should not feel faster on a lean linux distro than modern hardware and yet it does. Wild to see and somewhat depressing.
I see old videos (Computer Chronicles is a good example) of what could be done on a 486, for instance. While you can see the difference in overall experience, it isn't that extreme a difference, and the 486 was released 37 years ago...
ryanjshaw | 15 hours ago
[1] Why Aren’t Operating Systems Getting Faster As Fast as Hardware? https://web.stanford.edu/~ouster/cgi-bin/papers/osfaster.pdf
eviks | 19 hours ago
bandrami | 16 hours ago
Maybe, but I have terrible news for you about how much easier it just became to throw software at a problem
tzs | 16 hours ago
AO3 HTML downloads have the story in one long HTML file. When reading that on an iPad, that stupid refresh can move you to the top, which is pretty damned annoying.
For that very particular situation I do have a workaround, but it involves adding some JavaScript to the downloaded HTML. If anyone else is reading downloaded AO3 HTML and would like this, I've put it on pastebin.com. Get saveplace.js [1] and ao3book.css [2] and add this at the end of the head of your AO3 download:
Saveplace does two things. First, to address the tab refresh problem, whenever you change your position in the story it waits until you've stopped at a new position for a bit and then records the new position in parameters on the URL. After a refresh happens it looks for those parameters and restores the last saved position.
Second, to make the story easier to read it hides all but the first chapter, adds buttons to move forward and back by chapter, and adds a dropdown to select chapters. It also adds a button to switch between night and day mode. The day/night mode setting is saved in local storage.
Feel free to use this in anything of your own. The chapter navigation stuff is tied to AO3's HTML, but that would be easy to delete leaving just the position saving/restoring. This is in the public domain in places where it is possible to put things in the public domain. If one of us is somewhere that isn't possible you can use it under the MIT No Attribution license (MIT-0).
[1] https://pastebin.com/viTajxy3
[2] https://pastebin.com/v6AF8cmj
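The position-saving mechanism described above can be sketched roughly like this. This is a simplified illustration, not the real saveplace.js (which is at the pastebin link above), and the helper names are made up. The pure helpers encode and decode the position as a URL query parameter; in the browser they would be wired to a debounced scroll listener via `history.replaceState`, and the position restored on load.

```javascript
// Return a copy of `href` with the scroll position recorded in a
// `pos` query parameter, so it survives a tab refresh.
function withSavedPosition(href, scrollY) {
  const url = new URL(href);
  url.searchParams.set("pos", String(Math.round(scrollY)));
  return url.toString();
}

// Read back the saved position, or null if none was recorded.
function savedPosition(href) {
  const pos = new URL(href).searchParams.get("pos");
  return pos === null ? null : Number(pos);
}
```

In the page this would be used roughly as: on scroll, after ~500 ms of inactivity, call `history.replaceState(null, "", withSavedPosition(location.href, window.scrollY))`; on load, check `savedPosition(location.href)` and `window.scrollTo(0, ...)` if it isn't null.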
nehal3m | 15 hours ago
embeng4096 | 6 hours ago
alex_duf | 14 hours ago
That's almost a decade ago.
Phone RAM has stagnated for a LONG time, and during that time I doubt webpages have become lighter, so yeah, I'm not surprised by what you're saying.
dyauspitr | 14 hours ago
mrweasel | 13 hours ago
The devices themselves are fast enough to run everything; you just can't update, and eventually apps stop being available for the old iOS version they run.
asdff | 13 hours ago
arvinsim | 13 hours ago
I know this because I still get some of my web pages refreshed even if the browser is literally the only app that is running.
asdff | 13 hours ago
pjmlp | 6 hours ago
That and using SPAs for static sites.
duskdozer | 6 hours ago
keeda | 4 hours ago
bsoles | 3 hours ago
oblio | 23 hours ago
After all this churn subsides, there's a chance entry-level Windows laptops will start at 32GB of RAM and maybe 8-12GB of VRAM?
Which could end up being roughly 5-15 years of progress packed into 2-4.
loeg | 23 hours ago
oblio | 13 hours ago
But I'm betting on their shortsightedness, greed and general stupidity which will cause a crash. Leading to oversupply/huge leftover stocks.
thewebguyd | 23 hours ago
The shortage is manufactured, I have my doubts it will "end" in a conventional sense. I'm more skeptical and feel like this is yet another consolidation of wealth and a means of taking away compute power from people, which prevents startup competition. This way the hyperscalers are the only ones that can offer any meaningful compute.
kace91 | 23 hours ago
The just-announced Pixel is the same phone as last year. I know it sounds like the usual complaint, but look at the actual specs: it literally is the same phone, with differences so small that they might have passed as regional variance.
As for the Samsung, the screen can darken when viewed from the side, for privacy. That's pretty much it. The price increased, though.
Coupled with the current iOS situation it seems like things are… rotting. Everything in decline.
walterbell | 23 hours ago
Jony Ive at OpenAI is rumored to have a smart speaker, pendant, pen and bone-conducting headset in the launch pipeline. Audio interfaces, no screens.
Meta is selling millions of smart glasses, with Apple and others following.
If the memory market was not distorted, home AI + agents + open models could have a bigger role via AMD Strix Halo. Instead, they will be reserved for those who can afford to spend five figures on 512GB or 1TB unified memory on Mac Studio Ultra devices.
vessenes | 23 hours ago
walterbell | 23 hours ago
> users [could] interact with Siri and future Apple devices without speaking out loud.. AI systems capable of interpreting facial expressions and subtle muscle movements to understand so-called “silent speech.”
kace91 | 23 hours ago
vel0city | 23 hours ago
https://shokz.com/pages/openrunpro2
locusofself | 23 hours ago
So we are talking about a HomePod with a screen, or like one of those Meta "Portal" things?
walterbell | 23 hours ago
warkdarrior | 22 hours ago
Hmmm, so they traded always-on audio recording for always-on video recording. Not sure this is an improvement.
walterbell | 18 hours ago
b112 | 14 hours ago
walterbell | 13 hours ago
pshc | 22 hours ago
inigyou | 23 hours ago
whynotmaybe | 23 hours ago
Otherwise I'd still be rocking my S9.
I'm also using a pixel 2 for Android development and Google play billing isn't supported on it.
The hardware is fine but they make it obsolete with software.
I'm guessing they'll soon move to a subscription pricing for phones.
Affric | 22 hours ago
It might last until 4G is turned off.
I can’t really imagine needing greater bandwidth than I have now but I still use the phone like it’s 2010.
babypuncher | 22 hours ago
ajross | 21 hours ago
Just "commoditizing". Last year's microwave ovens were basically the same as 2024's too, and no one cares. You still need them, and people still buy them and use them as much as ever, but at a replacement rate, not because of fashion or innovation.
That is a good thing. It means the economy is doing what it's supposed to do and bringing maximal value to consumers so we can spend our resources more efficiently (on other fashion-driven junk in different market segments), making us richer.
It's only bad news if your business is selling "phones" and not innovative products more generally. Which, yeah, is pretty much AAPL's trap. But that's on them, not us. We're winning.
kace91 | 21 hours ago
ajross | 21 hours ago
They may not match your particular tastes, but people with inflexible taste are always the last-resort market for manufacturers of commoditized products. People still buy from Hermès even though Shein completely dwarfs them in revenue, etc... That's the way it will always be with Apple too.
whackernews | 21 hours ago
Richer… so we can buy more stuff I guess?
Anyway a good microwave can last you 30 years not even joking.
nickjj | 21 hours ago
It's eyebrow raising for me in other ways.
I have a Pixel 9a and it's been quite good with really solid battery life. It's barely 6 months old and I got it new straight from Google.
A few days ago I noticed the battery started to drain much faster than usual. I also noticed at the same time Google is pushing the 10a.
Nothing changed on my end. I barely use the phone in my day to day. In 10 hours today I sent 3 text messages with Whatsapp and lost 60% of my battery in that time frame. Up until a few days ago, 60% would last me 3 days.
I find it weirdly coincidental that the battery life went from amazing to worse than a 5 year old device I had prior to this just as they are releasing new phones. I've powered it down and given it a full discharge / charge too. It's still draining at an alarming rate.
kace91 | 21 hours ago
nickjj | 21 hours ago
I wish there were better options for phones. It's absolutely crazy to me that a phone can be perfectly working one day and then it starts getting issues like this out of the blue.
It makes it completely undependable. All I want is a phone I can trust traveling with where I'm not going to wake up the next day and then the phone starts draining 3-4x faster than it normally does.
There hasn't even been a system update for almost 3 weeks, so it wasn't an update that busted things.
kace91 | 20 hours ago
They mentioned people complaining on Reddit about battery drain since the last update, but I haven’t personally seen the threads so take it with a grain of salt.
FartyMcFarter | 21 hours ago
kingstnap | 15 hours ago
Battery life, RAM usage, Performance, all suffer immensely from being leaky abstractions instead of creating hard faults.
b112 | 14 hours ago
The entire point of his statement is that the broken code is on purpose.
throwa356262 | 13 hours ago
What you probably see is Google toggling on some new AI feature, which is now doing some initial on-device computation. It will usually calm down after a few days.
nickjj | 10 hours ago
I'm not sure.
Last year they ruined my Pixel 4a by pushing that battery patch to everyone (even to end of life devices).
Their official repair center replaced the battery as per Google's guidelines and offer but during the battery swap they managed to physically break other parts of the phone over the course of multiple visits. Each visit broke a new component.
Google support didn't do anything in the end, eventually ignoring me. This went on for almost 6 months with an email chain over 100+ replies.
Eventually the repair center gave me enough credit towards a new phone (this 9a) but only after I mentioned I was going to small claims court since they left me with a device that didn't function in the way it used to function before I visited them.
yread | 12 hours ago
tootie | 20 hours ago
mikestorrent | 19 hours ago
Perhaps this could give room for physical design changes instead. I'm sick of phones that are just a slab of glass. I remember fun, weird, fashionable designs! Buttons and keyboards, phones felt like an individual choice, not just this boring black mirror. I'd take ten years of stagnation on hardware development in phones in exchange for ten years of exciting form factors with improving software. Let's face it: the spec is high enough for anything we need to be doing, by now. The software is the real problem, and there's room here for massive improvement.
Markoff | 15 hours ago
same with the Amazfit Bip from around 2018: you can't buy a small, THIN (<10mm) watch with a battery that needs charging only once a month and an always-readable MIP display (the more sunlight, the better)
SoftTalker | 6 hours ago
meerita | 23 hours ago
WarOnPrivacy | 23 hours ago
I'm paying more on ebay for thinkcentre tiny and thinkpads - 12th gen intel and newer.
Refurbished spinny drives have been steadily climbing - up 50% since late last year. That's on top of the 20% mystery jump that happened in the last week of 2024.
zozbot234 | 23 hours ago
ajross | 21 hours ago
Also be aware that this stuff whipsaws: if OpenAI actually takes possession of that memory and decides they can't use it and dumps it, we're going to see a crash. Likewise if they back out of the deals with the memory fabs (or fail and default). There's some scary volatility on the horizon.
jl6 | 23 hours ago
aziaziazi | 22 hours ago
jl6 | 13 hours ago
jeffbee | 23 hours ago
Qem | 22 hours ago
kccqzy | 22 hours ago
pinkmuffinere | 22 hours ago
I know I'm not speaking to all the people that need to hear it, but used phones are very affordable, and reduce waste. A used iphone 13 is about $200 in the US: https://swappa.com/listings/apple-iphone-13?sort=price_low
shirro | 22 hours ago
Companies have reduced staff prematurely on the promise of productivity improvements that have not occurred and lost customers to terrible customer service and declining product quality.
Many hardware launches are going to be delayed or not meet expectations which really is the tip of the iceberg.
The US/SK memory cartel understandably sold out for a massive short-term windfall, but their long-term decisions to limit supply have created a huge opportunity for China. I wouldn't be surprised if this goes down in the history books as the start of the US/SK exit from the industry and the start of Chinese dominance.
The smart phone industry is likely to respond with an increasingly hostile anti-consumer approach as they try and lock customers into the cabins of the sinking ship. I expect cheap and cheerful Chinese budget phones aren't going anywhere.
I am happy for ram, cpu and storage to stall. I want a more robust and open phone which can take a fall and be updated long after the vendor loses interest. I expect to uninstall most of my apps rather than install new ones as I increasingly disconnect from an ever more distracting and worthless medium. I have cancelled nearly every subscription service in the last 12 months. And I have been deleting a lot of free accounts and apps. Its like doing a big cleanup. Surprisingly rewarding.
HN has felt like more than 50% AI industry promoting blog spam of little interest to me as a reader for some time. I am setting a budget of ten, no make it five, more posts here. Then I am out for good. Account deletion and no looking back.
crowcroft | 22 hours ago
Would love to know what sectors you would say are obviously under invested. Sounds like an opportunity.
tty456 | 22 hours ago
m4rtink | 21 hours ago
inigyou | 21 hours ago
mikestorrent | 19 hours ago
mr_toad | 22 hours ago
Companies have reduced staff because of the impact of tariffs, because of low consumer confidence and spending, or as a ploy to pump share prices. Then they claim it’s AI, because it sounds a lot better to say that you’re reducing headcount because of AI than it does to admit that you’re cutting costs because of falling revenue.
mikestorrent | 19 hours ago
In some ways it's pretty interesting to watch the entire world mobilize production for AI; some folks like to call this "hyperstition" as the future AGI reaches backwards in time to compel its own creation. Wild, but when trillions of dollars - i.e. millions of people's entire life output of work - are being put into something, it's truly an effort on a scale that no societal project has ever been before. There's no leader, nobody is in control, nobody has the grand vision other than "build the thing and get rich in the process". Amazing times to live in. The best use of our time and resources and coordination? Probably not... as we look around our broken cities, stepping over our poor and hopeless...
fsflover | 13 hours ago
Such phone exists: https://en.wikipedia.org/wiki/Librem_5
globular-toast | 13 hours ago
Question is, is it really impossible for businesses to fund themselves with bank loans now? John Kay wrote about this years ago arguing the finance sector is no longer a good thing for society but has become more of a leech: taxing the money supply but not supporting new businesses. I feel like it's only become worse now. Even insurance is barely really insurance any more. It's more like a savings account that you might be able to withdraw from when you need it, but not necessarily.
donkeybeer | 12 hours ago
namibj | 10 hours ago
They are not curtailing themselves in a market as lucrative as the current one; they are probably erring on the side of more output, using predictive-maintenance-style planning, because of the immense profit every gigabit they can deliver this year makes them. Literally over half of what a consumer pays at Best Buy today could be expected to be pure factory profit, given the non-artificial supply limits and the huge AI demand pushing out all but the wealthiest home-PC buyers...
donkeybeer | 8 hours ago
barbazoo | 22 hours ago
dheera | 22 hours ago
It's that everything has become 20% more expensive in the past year, I'm being taxed to death, fighting with companies trying to money grab me, my electric bill is now $800, and I'm now too broke to buy a new phone every 2 years when most of my income gets eaten by the "system".
I'll wait until either SPY does another 50% run or BTC does another 100% run and then I'll buy a new phone. Google, you want me to buy your new phone? Do something to make SPY or BTC go up and then we'll talk. Until then my current phone works, and the new features aren't a must-have.
inigyou | 21 hours ago
dheera | 21 hours ago
If the "system" wants to drive more consumption, it's on the "system" to put more buying power into my hands. Double my salary, reduce my taxes, make BTC do a big run up, something. Otherwise I'm happy staying put.
pier25 | 22 hours ago
donkeybeer | 11 hours ago
whackernews | 21 hours ago
anymouse123456 | 20 hours ago
tachalorah | 20 hours ago
pm90 | 17 hours ago
Where are the China hawks? The argument for protecting Taiwan was that without their chips the smartphone market would contract, right? That's what's happening now?!
hansmayer | 14 hours ago
throwaway270925 | 14 hours ago
Fewer phones, computers, consoles, servers, etc. sold (and everything that follows from this) seems a way larger impact on the economy than a few thousand new ChatGPT Pro memberships...
donkeybeer | 11 hours ago
hrpnk | 13 hours ago
ValentinPearce | 13 hours ago
I'm not as much of a fan these days but I do hope these limitations have the effect of improving best practices.
piokoch | 13 hours ago
I am always surprised that when the planet-caring, liberal Apple boss shows up on the big Apple event stage, he encourages people to ruin the planet through needless purchases of new hardware, even though the old hardware can easily do the same job, as the improvements are now barely incremental, if any.
badgersnake | 12 hours ago
This feels like an antitrust kind of situation.
2OEH8eoCRo0 | 9 hours ago