GitHub is sinking

232 points by herbertl 19 hours ago on Hacker News | 157 comments

oarsinsync | 18 hours ago

I went to look at a repo on Github today. Clicked on the "xxx commits" link to see the commit history, and got told I've hit a secondary rate limit and need to wait.

I'm the only person on this network that would even look at Github, and my connection has a dedicated IP, no CGN.

blinded | 18 hours ago

The only real way to browse the site is to be logged in.

NewJazz | 14 hours ago

They will gradually authwall everything they can. Just look at linkedin.

Asooka | 14 hours ago

Wouldn't this break Go and other build systems (npm?) that pull packages from GitHub by default? Not that I endorse the practice, but will Microsoft really kick out such a big class of users?

chrisandchris | 13 hours ago

Can't count the times a "nuget restore" in our CI fails with 401, only to succeed on a second attempt a few seconds later. Seems like the IP range is somehow flagged, so there's definitely a downside to it.
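A common workaround for transient failures like this is to wrap the restore step in a retry with exponential backoff. A minimal, self-contained sketch; the attempt count and delays are illustrative assumptions, not anything from the comment:

```python
import time


def retry_with_backoff(fn, attempts=3, base_delay=1.0, retry_on=(RuntimeError,)):
    """Call fn(); on a listed exception, back off and retry (1s, 2s, 4s, ...)."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original failure
            time.sleep(base_delay * (2 ** attempt))
```

In CI you would wrap the `nuget restore` invocation (e.g. a `subprocess.run(..., check=True)` call) in such a helper, so a one-off 401 doesn't fail the whole build.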

blinded | 13 hours ago

It does break it, from experience authorizing the pulls with a bot user fixes it.

In the case where the build happens from a GitHub Action, there are standard built-in credentials (workflow permissions).

https://docs.github.com/en/rest/using-the-rest-api/rate-limi...
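The linked page documents the `x-ratelimit-remaining` and `x-ratelimit-reset` response headers. A hedged sketch of the client-side logic (the header names are per GitHub's REST docs; the helper itself is illustrative):

```python
import time


def seconds_until_reset(headers):
    """Given GitHub API response headers, return how long to wait.

    Returns 0 if requests remain; otherwise the seconds until the
    x-ratelimit-reset epoch timestamp (clamped to >= 0).
    """
    remaining = int(headers.get("x-ratelimit-remaining", 1))
    if remaining > 0:
        return 0
    reset_at = int(headers.get("x-ratelimit-reset", 0))
    return max(0, reset_at - int(time.time()))
```

A client would sleep for that many seconds before retrying, rather than hammering the endpoint and tripping the secondary rate limit.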

negura | 12 hours ago

sadly even that isn't an option for me. i spent half an hour yesterday trying to create a github account. i couldn't. my @proton.me got rejected. captchas take several painful minutes to complete. and even when i did manage to create an account (at least the page displayed a success message), it got disabled the instant i logged in for "TOS violation". i wish i was joking, but i literally cannot create a github account. a few years ago this would have seemed crazy. but here we are.

i'm stuck having to use google (another pain in the ass) for discovering codebases that contain specific snippets. but some repo contents (such as wikis) are not exposed at all to search engines

blinded | 5 hours ago

Could you make a throw away gmail or something?

negura | 3 hours ago

currently gmail requires that you send (yes, send) a text to them with a code in order to sign up

https://discuss.privacyguides.net/t/google-account-registrat...

classified | 3 hours ago

Take that as a hint. Make an account somewhere else. The article lists several alternatives.

having an account elsewhere won't help with the use case i mentioned

graemep | 18 hours ago

Exactly the same here. I get that regularly.

noprocrasted | 18 hours ago

Yeah, this is just typical techbro gaslighting. There is no rate limit and hasn't been for years (it's just default deny), but they refuse to change the wording to reflect that.

dcrazy | 18 hours ago

Would you care to cite your source that GitHub does not apply rate limits to unauthenticated requests?

noprocrasted | 17 hours ago

The parent's experience which mirrors my own - on a clean residential IP that hasn't sent any traffic I hit that "rate-limit" on my first request to the commits list view.

So there is no rate-limit, it's a default deny for unauthenticated requests... which could be fine but at least update the error message to reflect that.

tempaccount420 | 13 hours ago

It's a rate limit of 0 RPS to that endpoint

MYEUHD | 17 hours ago

If you're on the desktop, refresh the page cache by using Ctrl + Shift + R

The page will load correctly

I regularly get 404s on legit links in slack that work for other people.

tbolt | 18 hours ago

“GitLab - enterprise grade, meaning it’s bloated and confusing but it’ll impress your boss. This could be the choice if you need multiple meetings to make the choice.”

lol!

egwor | 17 hours ago

We use GitLab at work, and I have to say that it is disappointing. The UI makes the simplest things complicated (e.g. to approve an MR you need to click a button that is actually a menu; the diffs are difficult to read; the 'To-do list' includes MRs that were already merged (how is that actionable?)), and they seem to be struggling to turn around improvements quickly. The issue of the 'To-do list' including already-merged MRs was raised years ago.

I also have to say that I'm surprised about the backlash against Bitbucket. I find the UI incredibly simple and clear, as do all of the new joiners. With Script Runner you can do some pretty amazing things. It handles huge repos well too.

pelasaco | 14 hours ago

Try to find the issue boards... it's a mess. And expensive.

IshKebab | 14 hours ago

> to approve a MR you need to click a button that is actually a menu

It's not really any better on Github. Why do I have to click on "Files changed" to approve a PR? Overall I would say it is on par with Github. Worse in some ways; better in others.

Izkata | 5 hours ago

> e.g. to approve a MR you need to click a button that is actually a menu

I think you're referring to the "Your review" button in the upper-right that's not always there, but there's a plain "Approve" button on the Overview tab that doesn't open any menu and is always there.

IshKebab | 14 hours ago

Funny but not really true. It's not really any more bloated or confusing than Github. It's not really "enterprise grade" software. If you want that look at Jira or anything Microsoft produces.

ndsipa_pomu | 2 hours ago

Made me chuckle. We use a self-hosted GitLab that I chose due to it having git and a container registry. The interface certainly can be confusing if you don't use the web UI often.

I have lost count of how many times something has gone down on GitHub since I started documenting it in this comment chain [0], and I also predicted six years ago [1] that going all in and centralizing everything on GitHub was really not a good idea if you need stability, or need to push a critical fix while GitHub Actions doesn't work.

Now, are you going to finally self host or should we continue to expect another outage on GitHub?

This time, there is no CEO of GitHub to help us. It is Copilot, and Tay.ai that are still struggling to maintain GitHub.

[0] https://news.ycombinator.com/item?id=37395238

[1] https://news.ycombinator.com/item?id=22867803

LorenDB | 17 hours ago

Why do I keep seeing people blaming Tay.ai? That was a one-off Twitter chatbot that was shut down a decade ago.

summa_tech | 18 hours ago

It sort of feels like no major open source repository can possibly be left well enough alone. I remember how SourceForge went down the drain; it's a real pity to see the same happen with GH.

Side note: I read the URL as "dBus hell". We've all been there m8

arikrahman | 18 hours ago

No m80, it's a nushell based on decibel units: dBu Shell

johnfn | 18 hours ago

Everyone wants to pin this on the Microsoft acquisition or incompetence, but it seems pretty clear to me from the material GitHub has posted that AI has 10xed the amount of code being committed to GH, which has downstream effects everywhere: CI, Actions, code ingestion, everywhere. The author pins it on weird things like MS Copilot, which kind of feels like he’s listing off things he doesn’t like rather than causal factors. This is ignoring the 800 pound gorilla in the room.

aliasxneo | 18 hours ago

Yeah, I had the exact same response after reading the post. I mean, I'm all for jumping on the Microsoft hate train, but not if it misses the elephant in the room. Let's say the _perfect_ GitHub replacement spawns tomorrow. What's preventing the same infrastructure challenge, millions of lines of AI-generated code, from destroying it too?

I think centralized code hosting is pretty much going to get killed by AI. Just like it's doing to social media.

bdangubic | 18 hours ago

of all the awful things AI is doing and will be doing to society, killing centralized code hosting and social media will be its shiniest moments; both deserve to die painful deaths

idiotsecant | 18 hours ago

Yes, the terrible sin of ... Hosting code where people can find it

icase | 17 hours ago

hosting code where people can find it is the reason LLMs can write code, so we kind of screwed ourselves there…

exe34 | 17 hours ago

How did people do it before github? Did everyone write everything with peek and poke?

doubled112 | 17 hours ago

Sourceforge

kuboble | 17 hours ago

Private people would keep their code locally and share the snapshot of the code using any file sharing or hosting option available.

Companies had been hosting their own CVS or later svn servers.

lelanthran | 14 hours ago

> How did people do it before github? Did everyone write everything with peek and poke?

I've been sharing GPL projects since 1999. We didn't need peek and poke (both of which I have also used, further back in history...), but we managed nevertheless.

Prior to github I shared software on sourceforge (and others). Prior to that I published stuff on Freshmeat.

Prior to that I downloaded games others shared (not open source) on Happy Puppy.

Prior to that I used usenet to find and download games, shareware, etc.

Prior to that I used ftp to (IIRC) ftp.sunsite.edu, ftp.nic.fi, and others.

Prior to that I got news of new releases using Gopher.

Finally, prior to that, I actually did use peek and poke to write software :-/

If github went away, and centralised repos went away, we'd still have something...

xigoi | 14 hours ago

I can’t remember the last time I looked for a project specifically on GitHub. I always come there via a link from another site.

majormajor | 18 hours ago

> I think centralized code hosting is pretty much going to get killed by AI. Just like it's doing to social media.

Private corporate codebases are a poor fit for GH because they don't benefit from public social graph effects, and the typical codebase isn't so large as to be technically challenging to deal with using OSS tools. I'd guess they make up a substantial share of revenue.

But once the reliability is called into question, self-hosted or smaller alternatives start to look good. Although there's some trickiness there if you want to be super cautious about making sure you can get to your code+infra in case of a vendor incident, especially if you're cloud based.

Why is centralized code hosting getting killed? I'm running an open source project, >99% of the code is AI-generated, and I could not do this without GitHub. AI-generated source code needs a place where AIs and people can collaborate. I'm expecting GitHub to be hugely successful, but mostly for an AI audience.

chowells | 17 hours ago

Because it's centralized. Your project pays the price for every unrelated project that's getting overloaded.

I'm sure the underlying infra is not a single server, so this is mostly a period where they have to adapt to higher loads due to AI becoming actually usable in the last 8 months. It's basically proof how well AI works these days. Give it a few months so they can scale and it'll get better. Remember Twitter fail whale? Growth pains that can and will be solved.

georgemcbay | 17 hours ago

> It's basically proof how well AI works these days. Give it a few months so they can scale and it'll get better. Remember Twitter fail whale? Growth pains that can and will be solved.

GitHub's problems can technically be solved, but that doesn't mean they can be solved in a way where the economics still work out.

If AI use is 10x-ing the amount of infrastructure costs for GitHub but not 10x-ing the amount of money Microsoft brings in from GitHub then there is certainly no guarantee they will bother to solve these issues adequately.

And I'd be shocked if the revenue side of things isn't lagging way behind the extra usage post-AI-era, both because a lot of the new use is probably on the GitHub free tier, and because even on the paid tier most usage (other than CI/Actions, AFAIK) is on a fixed subscription cost per user regardless of how much you are slamming their servers, and it is unclear how much they can raise that price without current enterprise users fleeing.

Twitter had a clearer goal that aligned with the financials... support more people stably, show more ads. Things are less clear with GitHub's business model where the free tier is a loss leader for the paid tier but the expansion in usage is likely to balloon the free tier usage at a far faster rate than the paid tier usage.

Also (and this part is admittedly far more speculative) if AI labs are to be believed this is still early days for AI usage and we'll still see massive usage growth over the next few years. If GitHub is already having existential trouble at the beginning of the curve, what hope do they have to scale up with their current business model if AI usage actually does ramp up exponentially?

Yeah, the monetization bit is challenging. I'll ask my agent to click some of the ads GitHub serves it ;-)

But getting this infrastructure right is crucial for a future where most of the code is AI generated. GitHub puts Microsoft in a good position to experiment and learn how to optimize GitHub (Enterprise) for the future.

Nate B Jones on YouTube, https://youtu.be/FDkvRl1RlT0?si=AEYlUchm_oalMSzf, argues that Atlassian might be an interesting acquisition for Anthropic, as it provides most of the context AI at enterprises will need. When executed well, GitHub Enterprise can offer Microsoft the same value: the context AI needs in the future.

lelanthran | 14 hours ago

> But getting this infrastructure right is crucial for a future where most of the code is AI generated.

That's not the problem. The revenue model they have is based on a certain amount of usage from the people who do not pay (you, for example), and a certain amount of usage from the people who do pay (enterprises).

If you 100x your usage, then they need 100x the infra, which means they need 100x the revenue.

At that sort of usage enterprises would rather self-host, and github would be left with only the free users, who are almost all like you now - hammering their servers but not paying for it.

If you self-host, for $5/m you can have your own VPS, but that doesn't really solve the problem as much as you'd think: those are all shared vCPUs, so you can't hammer them all the time either, because then the provider has to increase their infra as well so fewer accounts share a single CPU.

Either way, if you want to generate code with AI at the speed that an agent can, you'll have to pay for it one way or another.

Well either Microsoft finds a way, or Anthropic will. I'm sure they'd love to host all these projects with all the source and context. Maybe they should buy GitLab, or Atlassian.

lelanthran | 13 hours ago

> Well either Microsoft finds a way, or Anthropic will.

Just what sort of nonsense is this? Neither of them are going to operate at a loss.

Why are you so convinced that they'd be happy to continue spending money on you and getting none in return?

ncruces | 13 hours ago

Also, one thing the numbers they published show is that the bits that are growing 10x YoY (and which they expect to get “worse”) are all the things you get “unlimited” mileage from (even if you're a paying customer): repos, commits, PRs.

Things that have “usage-based billing” (like Actions minutes) grow closer to 2x YoY.

When there's a dollar amount attached, people don't 10x, because it's not worth it. They splurge when it's cheap, and unlimited.

bigstrat2003 | 10 hours ago

> But getting this infrastructure right is crucial for a future where most of the code is AI generated.

If that is the future, then source code hosting will be the least of our worries. The entire industry will collapse because the software will stop working.

Marsymars | 15 hours ago

> And I'd be shocked if the revenue side of things isn't lagging way behind the extra usage post-AI-era, both because a lot of the new use is probably on the GitHub free tier, and because even on the paid tier most usage (other than CI/Actions, AFAIK) is on a fixed subscription cost per user regardless of how much you are slamming their servers, and it is unclear how much they can raise that price without current enterprise users fleeing.

I'd guess most of the costs incurred by GitHub outside of Actions as part of the enterprise flat-rate tier are a fraction of what enterprises are paying for AI in order to incur those costs in the first place.

If a company has to pay $5 extra to GitHub for every $100 of extra AI spend due to that AI use creating disproportionate load, I've got a hard time imagining that GitHub will be the thing that gets fled from.

As far as the free tier goes, it seems like there should be a path to making prohibitively costly usage patterns high-friction (e.g. limit the free Actions minutes you get per month). As long as the limits are roughly proportional to the actual costs incurred, there's not much risk of people fleeing to a competing service, because the only way a competing service could undercut the costs is by taking steep losses itself, which isn't much of a business model for attracting people's code repositories.

lelanthran | 14 hours ago

> Ai generated source code needs a place where AIs and people can collaborate. I'm expecting GitHub to be hugely successful, but mostly for an AI audience.

Are you paying them in proportion to the resources they expend on you?

There's this thing called "sustainability", and every company needs to have it. Github cannot continue on the current trajectory where every AI-bro wants to run an agent that generates 1000s of lines of code per hour and dozens of commits per hour... and provide that for free to tens of millions of users who won't pay.

That being said, Microsoft does have an opportunity here - AI-bros are willing to pay $200/m to burn tokens so Github should offer a plan for Copilot, say $400/m, that includes a repo.

If they don't ban AI agents on free tiers, they are going to be out of business soon.

kyrra | 17 hours ago

Saas code hosting seems to be the problem here. If companies self hosted, they could deal with the scaling problems themselves.

lelanthran | 14 hours ago

> Saas code hosting seems to be the problem here. If companies self hosted, they could deal with the scaling problems themselves.

If all companies did this, there'd be no free tier on Github. You get the free tier because the SaaS customers are subsidising the free tier.

logicchains | 17 hours ago

>What's preventing the same infrastructure challenges of millions of lines of AI-generated code destroying it?

There's something called "rate limits" that engineers not working for GitHub have probably heard of; it's this crazy idea that you should limit the load on your infra in order to avoid downtime. GitHub is not the first free service to ever have to deal with bots.
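For reference, the standard way to implement that idea is a token bucket: refill tokens at a fixed rate, spend one per request, reject when the bucket is empty. A minimal sketch; the rate and burst numbers in the usage below are arbitrary examples, not anything GitHub has documented:

```python
import time


class TokenBucket:
    """Allow up to `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity    # start with a full bucket
        self.clock = clock        # injectable clock, handy for testing
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Per-IP or per-account buckets like this are how a service sheds excess load gracefully instead of going down for everyone.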

einsteinx2 | 17 hours ago

> I mean, I'm all for jumping on the Microsoft hate train, but not if it misses the elephant in the room.

That elephant didn’t even exist yet for the first few years of poor uptime shown in the graph in TFA… I don’t really disagree if we’re talking about the recent uptime issues, but how does that explain the years 2020-2023?

aliasxneo | 16 hours ago

It doesn't. It just means if they were having problems before, they've now been made significantly worse by AI (on the free tier). All I'm saying is that the problem is bigger than, "Microsoft sucks."

phpnode | 15 hours ago

Because if you were building GitHub from scratch today you wouldn't build it the same way and would benefit from many of the technological advancements of the last 2 decades (nearly).

watwut | 15 hours ago

I don't even like AI much, and this still seems to me like yet another instance of people blaming AI for ordinary mismanagement and failure.

hirako2000 | 18 hours ago

GitHub hasn't changed in any positive way since the acquisition. A decade is a long time; it shows.

GitHub Actions, Copilot. Oh, and that ugly AI search I'm unable to disable. Migration to Azure.

Yes Microsoft managed to ruin the network effect. Outages? The straw that broke the camel's back.

Pay08 | 17 hours ago

How on earth is Actions a downside?

prerok | 17 hours ago

I think they meant all the security holes that have been popping up and that there is no interest from Microsoft to fix them.

rurban | an hour ago

They do fix them, but not at the core, just in the frontends.

madeofpalk | 17 hours ago

3 months post Microsoft acquisition, GitHub expanded the free plan to include unlimited private repos.

The next year they removed the limitation on collaborators on private repos for free users.

In the last 4 years they’ve significantly improved their project management tools. I think a lot of teams can make do with GitHub Projects, they’re pretty decent.

Who knows if any of these are directly because of Microsoft or not. But there has naturally been material improvements to GitHub in the years after being bought by Microsoft.

I'm loving it. Running an open source project that's mostly AI-generated, I don't have to think about version control, building and testing my app, running AI code review, hosting my docs website, or the API and CLI that let Claude Code interact with everything.

It provides huge value for anyone running an open source, AI-generated project.

politelemon | 17 hours ago

> GitHub hasn't changed in any positive way since the acquisition.

It's more like any positive actions they have taken are being outright dismissed or forgotten. They removed several restrictions that GitHub had on private accounts, as well as on GitHub Actions. Aside from the downtimes, the GitHub of today is fantastic compared to pre-acquisition GitHub.

Github had lots of outages even before AI was introduced.

senko | 17 hours ago

The 800 pound gorilla in the room being a $3T company that also happens to be one of the largest cloud providers?

C'mon.

delusional | 17 hours ago

This would make sense if GitHub themselves cited increased traffic or load shedding as their root cause, but most of their incidents from the last month seem to cite misconfigured infrastructure or operational mistakes.

dwroberts | 17 hours ago

Even if this is true: Microsoft own an entire cloud platform. They have enormous codebases of their own and they employ ~200k people. It’s just not an excuse, especially because they consciously made decisions such as making private repositories free.

veryfancy | 17 hours ago

I’m with you here. Further: Even though I disagree with it, “GitHub down, Microsoft bad” is a defensible take, but we’ve seen it ad nauseam at this point.

hackton | 17 hours ago

If that's the case, we should also see the exact same pattern on Gitlab, Bitbucket, etc. Do we?

spiderfarmer | 17 hours ago

What is easier to 10x? A tent or a flat?

fontain | 17 hours ago

10x of nothing is nothing.

stusmall | 17 hours ago

GitHub has been basically the default for free public git hosting for a long time. I was curious what Bitbucket offers, and it looks like the free tier is so limited that I can't imagine a lot of people hosting vibe-coded open source there.

einsteinx2 | 17 hours ago

The graph in TFA shows the downtime pattern starting in January 2020. OpenAI released GPT-3.5 in November 2022 (basically December), and LLM/agentic coding didn’t really kick off in the way you’re describing until 2024, but really in 2025.

How can that explain the terrible uptime for the ~4 years post acquisition before all the AI stuff you’re talking about started?

johnfn | 16 hours ago

The subjective experience I and others report is that GitHub feels to have gotten significantly worse over the last few months. If you look at the month over month view of "Uptime history" in the cited link[1], it confirms this: it's been sub-90 (even sub-80 last month) essentially since the start of this year (i.e. when GitHub says that commit activity 10xed). Go back even a year and it's all in the high 9s.

I honestly can't explain the discrepancy between the graph in the article and the month over month stats on the same page, but the latter tracks both my own subjective experience of GitHub and their own internal metrics.

[1]: https://mrshu.github.io/github-statuses/

silverwind | 16 hours ago

I think it's just a case of brain drain followed by reckless AI adoption, both of which drove the quality down.

chilmers | 16 hours ago

The graph is not accurate, because GitHub's historical downtime data is not accurate.

For example, here is a Hacker News story about GitHub being down on July 28th 2016: https://news.ycombinator.com/item?id=12178449

Here's GitHub's historical uptime graph (on which this chart is based), saying there was no recorded downtime that day, or in fact that entire month: https://www.githubstatus.com/uptime?page=40

antiframe | 14 hours ago

Looks like it's not accurate due to under-reporting, not over-reporting. So their downtime was likely worse!

RadiozRadioz | 14 hours ago

We don't have enough data to confirm if it's over or under reporting. This sample size of 1 is enough to prove the data is not perfectly accurate, but it's not enough to prove a skew bias in the data either way.

Petersipoi | 14 hours ago

Oh please, show me a company that has ever over-reported their downtime. That's silly.

antiframe | 14 hours ago

That's fair. We don't know.

I am making an assumption that if Microsoft saw a lot of false positive outages they would fix that, but might drag their feet if there was an outage that didn't get properly recorded (assuming it's automatic to begin with; it might be that a human needs to remember to update it).

ncruces | 14 hours ago

Or things didn't change much at all except Microsoft forced them to be more honest in their reporting.

See, I can just as easily make up a story that explains the chart.

abraham | 11 hours ago

GitHub launched a new status page Dec 2018[1]. It doesn't appear as if any history before Oct 2018 was ported over.

[1] https://github.blog/engineering/infrastructure/introducing-t...

[2] https://web.archive.org/web/20181211191456/https://www.githu...

starkparker | 11 hours ago

That graph has bugged me since it went viral. The methodology is horseshit: https://github.com/DaMrNelson/github-historical-uptime

Just dumping HARs from devtools from a status site that hallucinates 100% uptime when it has no data. For example, all GitHub services had 100% uptime in June 1996: https://www.githubstatus.com/uptime?page=200

The graph gives GitHub Actions 100% uptime before it launched to GA in November 2019. That factors into the average uptime for every month on the graph before that. It's fully horseshit.

MallocVoidstar | 15 hours ago

The graph in the article is a lie, because GitHub's "historical data" is a lie.

https://www.githubstatus.com/uptime?page=3000

According to it, GitHub had 100% uptime from June to August 1996.

gverrilla | 16 hours ago

We want to thank you for your heroic service in our defense, sir. We really need people like you, who know which side they're on.

Microsoft investors

ExoticPearTree | 16 hours ago

I like to think that Microsoft is trying to run GitHub on Windows in their Azure cloud, and that every time GitHub is down it's because "someone updated the Windows servers GH runs on and had to reboot everything".

While I'm 99% sure it is not true, it makes me sleep better at night. And giggle a little when it goes down.

QuercusMax | 16 hours ago

They definitely do something with Azure. Stuff related to GitHub Actions runs is hosted on something.windows.net, which I believe is Azure.

thayne | 15 hours ago

MS isn't solely to blame for the AI increase, but they are certainly part of the problem, including their integration of copilot into Github.

namenotrequired | 15 hours ago

The author mentions this and links an article that expands on it

pelasaco | 15 hours ago

Yes, I posted the same observation 3 months ago. https://news.ycombinator.com/item?id=46877226

"Yes, it (AI) will kill open source—at least as we know it. I’m convinced that GitHub and GitLab will eventually stop offering their services for free if the flood of low-quality, "vibe-coded" projects—complete with lengthy but shallow documentation—continues to grow at the current rate."

lbrito | 14 hours ago

Gergely's newsletter claims it's more like 2.3x.

0xblinq | 14 hours ago

A big part of the problem IS the Microsoft acquisition. They forced GitHub to move to Azure, which is terrible.

Around 8 years ago I was working for a company that they also acquired, and they also forced us to move to Azure. Performance was terrible and our system just wasn’t working there as it should. A few years later our service was dead and all customers were moved to one of their Office products.

xantronix | 14 hours ago

Don't you think Microsoft ought to have thought a bit more about scale? They're not just innocent bystanders here. GitHub Copilot is a first class citizen of GitHub and so of course a lot of private enterprises are going to be using the thing that's bundled with the other thing.

pixl97 | 14 hours ago

Pray tell where are they going to get memory from?

xantronix | 8 hours ago

They're part of the circular AI finance economy, I'm sure they can figure it out.

shevy-java | 14 hours ago

And why is it wrong? The logic is there:

- Microsoft committed to AI.
- AI slop is increasing the costs of maintaining/running GitHub.
- GitHub is sinking.

This is interconnected. I can think of numerous other ways this could have been handled, but Microsoft went the AI slop way already. There is no way back for them.

UltraSane | 13 hours ago

If load has increased so much so rapidly, then GitHub should be rate limiting as needed instead of basically letting people DoS them.

turtlebits | 12 hours ago

10x the code? Easy solution: throttle unpaid customers or impose a quota.

Either way, paid customers should not be affected.

Why have they not simply asked the 800lb gorilla to solve this problem for them?

gofreddygo | 11 hours ago

For upstarts, individuals, artists and idealists, GitHub was a means to reach and distribute code reliably to a large number of people on the planet. Is that true today? Will it ever be again?

97% of code coming in is AI slop. It's owned by an evil, rent seeking corp. Reliability is a flaming dumpster fire. And everything you commit there will be used to train more AI.

Github _is_ sinking.

Got me thinking, if 99% of code pushed to GH is generated by Claude, GH just becomes a free Claude distillation service. Gotta ban it on natsec grounds obviously.

quyleanh | 10 hours ago

Totally agree. People are saying Microsoft this, Microsoft that with their Microsoft hate, but they ignore the fact that the AI trend is making GitHub worse, and GitHub is trying to fix it.

iamkrazy | 18 hours ago

I installed Forgejo on my home server and never looked back. The only problem I face is when hosting an app on DigitalOcean App Platform, Vercel, etc. They only connect to GitHub.

cobbzilla | 17 hours ago

I’m in a similar boat; I abandoned ship for Gitea years ago (prior to the Forgejo fork) and have no regrets.

For things that require GitHub I’ve been able to mirror repos there and get things working. Keeping code in sync is annoying though.

lorecore | 16 hours ago

All of the reasons to avoid GitHub are also reasons to avoid the Digital Ocean App platform and Vercel. I use Digital Ocean, but just the VPSs. Don't let yourself get vendor locked in with these middle men, retain control and shoot for the most universal level of the stack you can.

iamkrazy | 16 hours ago

It's just a step. I will eventually move to Coolify; I just haven't had time to set it up. But the problem stands: Coolify also doesn't connect to Forgejo.

sinpif | 16 hours ago

Similar situation with Apple's Xcode Cloud.

iamkrazy | 16 hours ago

I have never developed on/for an Apple platform, so I have no clue. Apple makes setting up development so hard, I wonder what motivates developers to jump through all the hoops.

skydhash | 15 hours ago

> Apple makes setting up development so hard, I wonder what motivates developers to jump through all the hoops.

Money to be made. And they have (had) nice APIs for most development needs. The actual distribution is arduous though, mostly around the Review process.

> DigitalOcean App platform[…] only connect to GitHub

They also support deployments from GitLab (so long as you're using the gitlab.com-hosted instance and not a self-hosted GitLab instance). If you've deployed your own self-hosted forge, then you can connect DigitalOcean App Platform to it by using gitlab.com as a bridge—register an account on gitlab.com once and instruct your self-hosted forge to replicate copies to gitlab.com. You don't really need to actually use GitLab.
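A minimal sketch of that bridge with plain git, assuming a box with push access to both forges (the URLs and repo names here are placeholders, and Forgejo's built-in push mirrors can do the same job without the cron step):

```shell
# One-time setup: take a bare mirror of the self-hosted repo and add
# gitlab.com as a second remote (both URLs are hypothetical).
git clone --mirror https://forge.example.com/me/myapp.git
cd myapp.git
git remote add gitlab https://gitlab.com/me/myapp.git

# Run periodically (cron, or a post-receive hook on the forge) to keep
# the gitlab.com copy in sync; App Platform then deploys from GitLab.
git fetch --prune origin
git push --mirror gitlab
```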

Having said that, considering that DigitalOcean is in the business of selling IaaS/PaaS, it's loony that they don't let you connect to, say, your own self-hosted Forgejo running on their infrastructure…

(Indeed, considering how many people would like to self-host their own forge but how few people want to actually set up and do admin for it, it's loony that DigitalOcean doesn't pick up, say, Forgejo and/or an alternative and offer a sharply discounted (e.g. $20/year) quasi-managed one-click deployment option with first-class support for connecting to their App Platform.)

mariocesar | 18 hours ago

Agree with Gitlab as an enterprise alternative. Beautifully boring and safe for complex teams and permissions. Also has good enough Terraform support, and a nice workflow for hosting docker images.

coolgoose | 18 hours ago

So, what's the actual real alternative? The one that also supports open source projects? Ironically gitlab is costlier than github, and not without its faults, but that's "maybe" the only other alternative here. Anything else?

kukkeliskuu | 18 hours ago

I just installed Gitea. It seems decent.

TheCondor | 17 hours ago

It absolutely is.

The only concerns are if it were exposed to the public internet and scale. For personal stuff? It's spectacular.

MrDrMcCoy | 17 hours ago

Codeberg, Sourcehut, or self-hosted Gitea.

I've been running a self-hosted Forgejo. Extremely responsive and I've been really happy with it.

IshKebab | 14 hours ago

How many PRs from other people have you received on your self-hosted Forgejo instance?

lelanthran | 13 hours ago

> How many PRs from other people have you received on your self-hosted Forgejo instance?

They're free to email me a diff :-)

Jokes aside, the era of community-built software is coming to an end. There is no place in the world now for a repository of open source projects.

None and I don't want to.

If I did I would host it on a VPS and make it public.

negura | 12 hours ago

Have you read the submission? Half of it is a list of alternatives.

ndsipa_pomu | 2 hours ago

It's not that difficult to self-host the free version of gitlab.

I'm not sure what to make of the graph.

On the one hand the acquisition of GitHub may have caused the availability to be worse.

On the other hand, the 100.00% availability before the acquisition looks suspicious; I wonder if the status page was just updated less diligently back then.

(I'm aware of the recent availability problems with GitHub, but on the graph the problems start in 2020 and don't seem to worsen significantly)

rbbydotdev | 17 hours ago

I wasn't expecting the outage rate to be nearly the same even before the 2023 AI inflection point.

QuiCasseRien | 17 hours ago

onedev onedev onedev

I still don't see this tool mentioned when the topic is forges. It is a fantastic tool. Seriously guys, you should really consider it!

drcongo | 15 hours ago

I'm intrigued as to why you're getting downvotes here, I'm vaguely interested in onedev.

Edit: Though I do think they're mad for not offering a hosted version, especially right now while GitHub resentment is riding high.

QuiCasseRien | 14 hours ago

There is a self-hosted version of onedev; it's OSS. But there is also an enterprise version (I have one) with very nice plugins most forges still don't have (like the web terminal for debugging actions/CI/CD).

Last week, Robin released a very nice feature for vibe coders. AWESOME

phyzix5761 | 17 hours ago

For $5 a month I can host a server and put a bunch of projects on there. Yeah, I don't have a million stars on my repos but it works for what I need and I can give access to whoever I want.

imagetic | 17 hours ago

Anyone would buckle right now. Microsoft just sucks more at it.

ChrisArchitect | 17 hours ago

Related:

Ghostty is leaving GitHub

https://news.ycombinator.com/item?id=47939579

Before GitHub

https://news.ycombinator.com/item?id=47940921

Days without GitHub incidents

https://news.ycombinator.com/item?id=48012022

GitHub Actions is the weakest link

https://news.ycombinator.com/item?id=47933257

GitHub Copilot is moving to usage-based billing

https://news.ycombinator.com/item?id=47923357

functionmouse | 16 hours ago

Extinguish

ben8bit | 15 hours ago

I would not be surprised if AI commits are the culprit. There is no way any service would cope with a constant stream of unfettered commits by sleepless always-on agents. Ironically, this same strategy seems to be what GH/MS (and other big companies) are evangelizing - and therefore dying by their own hand (in a way).

signal_lamp | 14 hours ago

AI commits are definitely causing the recent issues with their platform, as GitHub has seen an unprecedented amount of traffic since the introduction of OpenClaw. They are likely showing us a future problem that other sites have not experienced yet, as the tools have not matured enough for people to adopt them towards their eventual vision of managing someone's entire digital life.

That being said, Microsoft in general, in relation to GitHub, has historically caused more issues. Outages on GitHub have become so commonplace that I genuinely think people have simply gotten used to them. The recent round was just bad enough that people felt strongly enough to make their own down-status page: https://mrshu.github.io/github-statuses/. Whether or not you agree with how they gathered their data, there is a feeling in the community that Microsoft is not being transparent about these issues.

tristanj | 12 hours ago

Yes this is confirmed. Github activity surged about 10x in the past year: https://x.com/kdaigle/status/2040164759836778878

Platform activity is surging. There were 1 billion commits in 2025. Now, it's 275 million per week, on pace for 14 billion this year if growth remains linear (spoiler: it won't.)

GitHub Actions has grown from 500M minutes/week in 2023 to 1B minutes/week in 2025, and now 2.1B minutes so far this week.

So we're pushing incredibly hard on more CPUs, scaling services, and strengthening GitHub’s core features.
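The weekly figure in the quote does line up with the claimed yearly pace; a quick back-of-envelope check (linear extrapolation, which the quote itself says won't hold):

```python
# Sanity-check the quoted GitHub growth figures.
commits_per_week = 275_000_000
print(commits_per_week * 52)   # 14_300_000_000 -> the "14 billion" yearly pace

# Actions minutes: 500M/week (2023) vs 2.1B/week (now)
print(2.1e9 / 500e6)           # 4.2x growth in roughly two years
```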

Thom2000 | 15 hours ago

Github still doesn't support SHA-256 git repos (https://github.com/orgs/community/discussions/12490) even though their competitors (Gitlab, Codeberg) have had that for ages now.

rurban | an hour ago

And a github employee implemented that...

i_love_retros | 15 hours ago

I think they went too far with AI internally. Complete collapse in quality of internal engineering practices.

sccxy | 15 hours ago

Living in Eastern Europe has its perks. I hardly ever notice big GitHub outages because of the time zone difference.

I'm also happy with how generous their free hosting and actions are.

pelasaco | 15 hours ago

People used GitHub's free infrastructure for over a decade without complaining. Now AI-generated spam and massive amounts of low-quality code are increasing costs everywhere, and suddenly GitHub is the bad guy for acting like a business. Criticizing centralization is fair, but pretending GitHub gave nothing to open source is just dishonest. Today's alternatives are probably going to be flooded by low-quality AI-generated code tomorrow...

ricksunny | 14 hours ago

I like the "written by human" banner at the bottom - that's a first for me and I'll be glad to see others adopt similar.

>Written by human All opinions are my own and not those of a large language model. Everything I write is one hundred percent human. Because I care!

shevy-java | 14 hours ago

It will be a long while before GitHub dies, but it is definitely sinking. Slop is killing it. I think Microslop, 'xcuse me, Microsoft, realises this too, but there is nothing they can do now that they have committed fully to AI. I feel sad for the GitHub engineers, because they write pointless blog entries nobody believes anymore. Meanwhile existing services erode in quality. It's like in a submarine. You have one hole. You manage it. Well, more and more holes pop up over the following days. We know where this is headed then ...

ninkendo | 14 hours ago

I often think about how I’d do it if I ran my own company.

I would really like to see what it would be like doing all code reviews over email. The repo would just be a simple vps-style server with git-only ssh access, there’d be a particular for-review/ branch namespace for code to be reviewed, and CI would just be a bot waiting for branches to show up and would mark refs as good or not by just annotating/tagging them. It could reply in the email thread with results too.

The mailing list would have a web archive viewer, naturally. That’s how you could look at old reviews. There’s tons of existing solutions for this, and it’s just html.

Chat would be on IRC with bots to archive the channels. Easy as hell.

The whole thing (except maybe the CI runners which need beefier hardware) could be done on a very cheap server.

GitHub is waaay over engineered for what you need to run a software project. Look at the Linux kernel, they just use a simple mailing list, and it’s debatably the most successful software project of all time.

Issue/bug tracking is scarier though. Because I’d probably want to yak shave my own solution and get too involved with that and not even focus on what the company does. Maybe it could be a bug tracking software company?
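A rough sketch of the flow described above, with every name (server, list address, branch namespace, test command) made up for illustration:

```shell
# Contributor side: push work into the for-review/ namespace of the
# git-only ssh server, then mail the patches to the list.
git push origin my-feature:for-review/my-feature
git format-patch --to=dev@example.com origin/main..my-feature
git send-email *.patch

# CI-bot side: poll for-review/ refs and record results as tags,
# e.g. ci-pass/<branch> or ci-fail/<branch>.
for ref in $(git for-each-ref --format='%(refname:short)' refs/heads/for-review/); do
    name=${ref#for-review/}
    if run-test-suite "$ref"; then          # run-test-suite is hypothetical
        git tag -f "ci-pass/$name" "$ref"
    else
        git tag -f "ci-fail/$name" "$ref"
    fi
done
```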

Asooka | 13 hours ago

Ideally I would want the code review to be versioned as well with easily accessible history. That is, I would like to see the exact lines which a comment pertains to and when they were changed and switch back and forth. While e-mail is probably good enough as a protocol to exchange this data, the e-mail client is not a good way to view it in my opinion. Maybe we need a decentralised code review system as well.

dtoffe | 14 hours ago

AFAIK Sourceforge is alive and kicking, and it has an "Import from Github" feature that has been available for years.

Don't they inject malware/adware into your build artifacts?

lelanthran | 14 hours ago

> Don't they inject malware/adware into your build artifacts?

Aw, c'mon! They did this for about 4 months, ending in 2015! Prior to that they had Windows installers which did the same, but that also only lasted a few months.

It's now 2026. Exactly what software did you host on sourceforge from 2011 to 2015? Because I hosted my GPL stuff there, and I moved away because I was affected, and yet I am not concerned that they will do that again.

msyea | 14 hours ago

The issue is every AI coding tool integrates as a "GitHub App" (OAuth, PATs, webhooks etc.) first, over other code forges. This load is coming through their 3rd party app integrations. I bet the web/git volume isn't getting smashed as much.

I had to begrudgingly use GitHub over my preference GitLab to use some 3rd party AI features.

The fix for GitHub is to charge or rate-limit some of these third-party integrations and come up with an equitable arrangement.

sscaryterry | 13 hours ago

"If Linux can be maintained by sending patches to an email mailing list, “doesn’t work at scale” arguments are skill issues."

Agreed. Sick of the bloatware.

marking-time | 13 hours ago

I left Github because of some very strange activity. I had a new folder named "feature" added to one of my repos. At the time I had failed to turn off the AI integrations, so I figured that was the problem.

There is no way I'm going to let a VCS put code into one of my repos without my asking for it or my consent. Full stop. I moved all my significant code to Codeberg but kept the GitHub account, so my username doesn't get squatted.

negura | 12 hours ago

One shift in perspective I had was realizing that github is not just a code forge, but a social network.

tristanj | 12 hours ago

Github is struggling because AI-boosted coding increased the number of commits 14x in the past year, and the pace is still accelerating. The site is struggling to keep up.

Github's COO confirms it here: https://x.com/kdaigle/status/2040164759836778878

Platform activity is surging. There were 1 billion commits in 2025. Now, it's 275 million per week, on pace for 14 billion this year if growth remains linear (spoiler: it won't.)

GitHub Actions has grown from 500M minutes/week in 2023 to 1B minutes/week in 2025, and now 2.1B minutes so far this week.

So we're pushing incredibly hard on more CPUs, scaling services, and strengthening GitHub’s core features.

Who cares about bugs when their developer velocity has increased 5x!

senorqa | 5 hours ago

OneDev is pretty cool too https://onedev.io/

abstractspoon | 4 hours ago

I've never had a problem in 10 years of daily use

ozgrakkurt | 2 hours ago

A lot of posts are defending Microslop’s failure by explaining their traffic increased.

This is not a problem caused by normal people doing normal things so I don’t understand why people should take this into consideration.

- They knew the traffic would increase.

- They have the capacity to handle such traffic financially.

- There are a ton of other systems that handle more traffic and basically never go down.

- They don't have to accept slop into their platform.