Do I belong in tech anymore?

165 points by Shorden a day ago on lobsters | 49 comments

technomancy | a day ago

I left my job a month ago for similar reasons; I felt like my job was to provide adult supervision to the people on my team who had given up thinking for themselves. It was honestly terrifying to watch their ability decay over the course of a year, in some cases going from a skilled, trusted professional to someone regularly making junior level mistakes several times a week. It was sad to see what I had spent years building fall apart so quickly, but I had to get out. Luckily I found another gig where I wasn't required to use any LLM stuff, hope others can do the same.

quasilyte | 16 hours ago

Any hints that you can disclose about this new gig? What's the domain, etc.

Vaelatern | 5 hours ago

"BI." Fascinating tool to me that I wish to use someday in a context where it may make sense to pay money to the team that runs it.

Vaelatern | 5 hours ago

This is doubly astonishing to me since LLMs have proven terrible to me at writing Clojure. I think there is a disconnect between things that are Fun To Write (Clojure, lisps, etc.) and things that LLMs are good at (boilerplate lovers like Java, and I hate to say it because I love it but it's true, Go).

Your previous job used a language LLMs are not good at writing, mostly because that language focuses so much on developers having a good time it never asked if LLMs would too.

txxnano | a day ago

I'll say it's all in the first paragraph

Does any of this work actually matter?

If it does not, you will never feel fulfilled, and that is the culprit. I'll say that no matter which large language model you use, or whatever shiny new tool your CEO forces you to use, we as professionals and people who love the craft should never forget why we do the things we do.

Aside from this, I'll say I'm a bit tired of reading posts like this, and I'm under the impression that we, programmers, need to be a bit more detached from the current day to day job and take it as other professionals: as a job

Furthermore, and this is a completely personal opinion, I believe this post is simply too negative, with statements like

Ethically: Generative AI tools, powered by data centers which consume vast amounts of water and pollute our environment, are built on the collective theft of the works of millions of artists, developers, authors, and other creatives, supercharge the spread of disinformation and fascism, have repeatedly provoked psychosis and suicide, and concentrate wealth in fewer hands while providing cover for widespread layoffs.

Yes but no, many of these issues are and were caused by political movements and think tanks outside technology, and we are blaming it all on AI tools or Palantir; it's just an escape goat. AI didn't vote (yet). I'd recommend watching Trauma Zone by Adam Curtis

Eventually and as Christopher Alexander used to say, we will connect as humans, making computers do computer stuff

post_below | a day ago

I really like the idea of an escape goat. It's not fast, but you don't have to pay for gas.

thasso | a day ago

There has indeed been a significant uptick in the use of escape horses in armed robberies. Just like escape goats, escape horses remain unaffected by the recent volatility and price hikes in global oil markets. But escape horses are also fast and versatile, quite the opposite of escape goats in this regard.

cpurdy | 20 hours ago

Have you ever tried to ride a goat? This is a lot harder than it sounds. And getting a goat to do what you want, e.g. escape?!? That's a pretty terrible time to discover that your goat has a mind of its own. And far less agreeable than your typical LLM.

thesnarky1 | 20 hours ago

I'm not sure how much time you've spent around goats, but "escape goat" sounds pretty accurate to me. Plant any sort of garden on the other side of the fence and you, too, can have an escape goat.

Concur that RIDING one would be troublesome, but if your interests for freedom align with its, I suspect you'd have an excellent partner in escaping with any goat.

abathur | 17 hours ago

Just make sure it's not a fainting escape goat.

A mountain escape goat would be pretty rad, but I imagine we'd throw their balance off even if we could hang on.

kevinc | 13 hours ago

There's a puzzle game series called Escape Goat.

reezer | 22 hours ago

we, programmers, need to be a bit more detached from the current day to day job and take it as other professionals: as a job

Is that actually good? I mean in a way yes, probably, but whenever I think about that, the next question I ask myself is whether it is a good idea to detach myself from the thing that I spend the majority of my waking and focused hours on. During the prime of my life. During the time I have energy.

And then the idea slowly starts to feel unhealthy and any countering of that thought feels like coping or denying reality.

thing that I spend the majority of my waking and focused hours on

Now that's another question. Why are we spending so much time working? We really ought to have 4 day work weeks by now, with all the productivity gained over the past century.

rbr | a day ago

Yes but no, many of these issues are and were caused by political movements and think tanks outside technology

You are saying, who cares about B causing C, because A also causes C. I think in abstract it should be pretty clear that's not logical. Instead, the conclusion should be to reject both A and B, not to focus on only A or B.

k749gtnc9l3w | 23 hours ago

You are saying, who cares about B causing C, because A also causes C.

The real question is the million ways A is causing B.

I'm under the impression that we, programmers, need to be a bit more detached from the current day to day job and take it as other professionals: as a job

There is a difference between treating something as just a job that you don’t necessarily identify yourself with, vs treating it as bullshit to be phoned in so you can collect a check.

alexwennerberg | 21 hours ago

Comment removed by author

henrycatalinismith | a day ago

Ironically, what I’ve gained from AI is a deeper appreciation for human communication, in all its messy imperfection. The point of a code review is not simply for good code to make it into a codebase, but to build institutional knowledge as people debate and iterate and compromise, slow as it may be. Friction is good.

Felt this.

Time was you might give the same piece of code review feedback to the same person five to ten times and then the knowledge would probably stick for them. We were helping each other to grow at the same time as we produced code.

Agent adoption beyond a certain threshold takes that away. Your code review feedback is just another prompt for the agent. The person who's nominally running the agent doesn't have time to read and internalise any of the actual ideas expressed, because the whole point of using the agent is to save exactly the time they would have spent doing that, so they can run more agents.

I get that agents open some doors but the whole philosophy of work that's inherent to really extreme adoption does seem to structurally relegate human communication outside the critical path of the work. My current job is nowhere near as extreme as the workplace described in this post and I hope it never is. It sounds very very lonely.

ironick | a day ago

We know at work who the big users of AI are. I have noticed a big trend in our code reviews.

The AI power users consistently require the most detailed reviews, with obvious UI breaks on their branches that you’d never miss if you were looking at the result of your change.

But more to the point, I will consistently give, say a comment of feedback on a particular way of approaching something and offer a suggestion to fix it, mentioning that “this is a better way to do this”/“more consistent with the codebase” etc etc

over and over, they will fix only the instance of that issue that the comment was pegged to, leaving the rest of the PR untouched when the same issue was present multiple times, even when I mention those other instances in the original comment

And then more PRs come through with the same issues, and I keep on having to essentially go line by line and tell them each instance to fix

All in all, I just know that these people are not reading the code they submit, and are not able to implement feedback or take it on board because they’re not concerned with the quality of the code in the first place

sunshowers | 20 hours ago

over and over, they will fix only the instance of that issue that the comment was pegged to, leaving the rest of the PR untouched when the same issue was present multiple times, even when I mention those other instances in the original comment

The weird thing is LLMs are actually quite good at pattern amplification: you can fix one case yourself and ask the LLM to fix everything else in the PR/diff — it'll be more complete about it than a human could be. Your colleagues are definitely not doing that, which is unfortunate. (Speaking as someone who used LLMs extensively for a while but has become a bit more circumspect lately.)

Quite true, the current crop of LLMs is fantastic at doing refactors and cleanups. If you understand the code well enough to tell them what you want refactored or cleaned up.

danking | a day ago

over and over, they will fix only the instance of that issue that the comment was pegged to, leaving the rest of the PR untouched when the same issue was present multiple times

I have had the same experience.

I wonder if my nits and gripes even matter. The AI-inclined amongst us produce a lot of code. That code does something. Do its little warts, failures, deceptions, etc. matter? The world is transitioning to a low-trust relationship with many products.

LLMs maybe give you an answer to the question you intended to ask. Now products maybe do what you intended to request.

I kinda think that the vast majority of software that people use never needed to meet the standards I set for the software that I use.

Ameo | a day ago

Your co-workers were like this before AI was a thing. Leadership had the same narrow and close-minded view of you and your job before AI too. You were judged in very similar ways then as you are now. The value of what you do hasn't changed at all, and as other commenters have said it probably never mattered that much and never will.

AI just amplifies what was there already. The fact that the golden age of Silicon Valley has passed and software devs are no longer rare in-demand heroes contributes to this as well.

As far as the political stuff you talk about, I largely think that follows with the fact that the balance of power has shifted from employee to employer in tech in a lot of places. This year, for the first time in my career, I felt like I was in direct competition with my coworkers for my job rather than working with them as a team.

AI companies care about profits and growth more than things like the greater good, the environment, or whatever. Just like all the other companies in the world. Politicians and other people in positions of power use AI to entrench and reinforce their power, just like all other technology.

And yeah, AI kind of sucks as a person who puts value in their ability to write code. I've not been in flow state since 2025 and it used to be something I experienced more days than not.


idk what position I'm even taking here. I guess I'm just using this as an opportunity to vent or something. Who cares.

creesch | 23 hours ago

Your co-workers were like this before AI was a thing.

Some of my co-workers were like this before AI was a thing. But the blast radius of their actions was greatly reduced by their own human competence in many ways. So yeah, I guess that is the amplification you mention. But then other co-workers were actually pretty good before but now seem to have outsourced all their critical thinking to LLM models. Here I think other things are at play.

alper | 23 hours ago

But the blast radius of their actions was greatly reduced by their own human competence in many ways.

Juniors physically couldn't ship ill-conceived features end to end, simply because they were juniors and didn't have any idea how to do it. That's changed radically.

Random engineers will shoot us PRs for platform code pretty much all of which we have to reject because it’s bad.

But then other co-workers were actually pretty good before but now seem to have outsourced all their critical thinking to LLM models.

Yeah, this is the most interesting phenomenon to me. People I know who seemed like very experienced, wise, skilled engineers have produced some very bad code, and just not been able to notice that it was bad at all? I mean, I say "people", but I've done it myself as well. A lot of them (and I hope myself) have also managed to correct for this, and engaged their critical thinking again, but I think there's a very real danger of, like you say, outsourcing important parts of one's job to the LLM.

creesch | 6 hours ago

Yeah, that certainly is one part of it. I've written about lazy and non-lazy use of LLMs before, and it certainly takes a lot of effort not to drift towards the lazy approach.

I also think that there is a group of people who were deeply insecure about their skills before and honestly do think LLMs are already doing a better job than they could do before.

I've watched people I know slowly outsource their thinking to LLMs. They haven't always been like this.

durrendal | a day ago

This really resonated with me, and sadly seems to resonate with a lot of us. AI didn't just take away the fun technical parts, it took away the human components where people actually stopped and contemplated the point, intent, and process.

I think ultimately Ky is right, if something is worth doing, then it's worth doing well. Compromising that for speed and laziness is just as ethically questionable as the other AI ethics points they raise.

cpurdy | 20 hours ago

We must avoid the easy path of doomerism.

AI (rather, LLMs) can be a useful tool. It should not be a substitute for human involvement, human interaction, and human investment. It's up to us to make sure that the better path is chosen.

quasilyte | 16 hours ago

I like to write code.

Living in a world where you're treated like a weird guy for wanting to write-and-understand the code does not feel rewarding. No. It feels demoralizing. I am OK with some use cases: brainstorming for random inputs, guidance in bug searching, maybe a review that I ask for myself; I like to talk about the code, after all. But I don't want to be a manager of it. I am struggling a bit, and had to quit a couple of jobs in the past few months. I don't know how it's going to unfold.

For now, I'll consider getting back to making games. It's very crappy in terms of money (barely enough to cover monthly rent in a cheaper country), but at least I can choose to do things my way :)

I like to write code, just like some musicians would like to play their favorite instrument instead of turning on some pre-recorded discs.

denys_seguret | a day ago

Do we belong in a world in which the human mind is devalued?

Can we simply survive now that the people in power, be they official dictators or just ordinary oligarchs, don't need smart people anymore?

mattgreenrocks | a day ago

This is a profoundly unserious time in many spheres of life. Also, tech has always been quite unserious, and AI amplifies that further.

I came back to an infrastructure role about a month ago after having a break since last summer (I was working as a receptionist in the meantime - that was a refreshing change of scenery) and what I came back to is quite frankly terrifying.

Everything revolves around making skills for chat models so that we spend the least amount of tokens possible. Knowledge I was crafting since childhood is no longer needed and my expertise is instead prompted. The numbness of it all is exhausting.

Give me one more extensive pull request that wasn't even read and I might lose it.

cpurdy | 20 hours ago

Give me one more extensive pull request that wasn't even read and I might lose it.

It is important to understand that when someone issues a PR for a change that they haven't themselves reviewed, they are telling you up front that they consider your time to be of no value. Perhaps start with that question: "Why do you consider my time to be of no value?" See where the conversation leads.

I'm in security and it's actually getting really fun. We're seeing so much more code written by people who clearly don't understand what they're doing, that compiles and runs just fine but is disastrously buggy in surprising and unexpected ways. Glad I made the switch from dev (only last year).

antoinewdg | 6 hours ago

Would you mind sharing a bit more about your switch from dev to security? It's been at the back of my head since the rise of LLMs but I've never given it the time it deserves, and honestly have no idea where to start.

k749gtnc9l3w | an hour ago

Just alien bugs is better than I expected! I wonder whether there is anyone left in the loop paying attention to whether the requirements are logically consistent with each other, let alone compatible with the current model-of-the-world encoded in the software…

toastal | 21 hours ago

Seems fine to me as I’ve been feeling the same. After going thru stages of grief, I’m less hyperfocused on trying to level up on tech skills. I’m playing games with friends, looking to dabble in art again, looking at starting a small local business. It’s been more freeing to stop trying to fight it when it seems the loudest ones no longer care about the philosophy or learning—it’s all just vibes man. If I do find some work, I would be more fulfilled to program something for me, my business, or my local community.

We're burning down houses to roast marshmallows.

Two weeks ago, I quit my job. It wasn’t a bad job, not by most metrics. It ticked the boxes a job is supposed to tick: good pay. Health insurance. Remote work. Time off. Nice coworkers.

Many of us have bullshit jobs where the goal is not, in fact, to make something good, or even to learn, but to simply make money to pay rent and medical expenses

Many of us can't afford to quit our jobs. I keep my family afloat.

Does any of it really matter?

It's not just the end work we do that matters, but also how we do that work.

I get paid to exercise my professional judgement to make a profit. That's not the entire story though; the personal ethics, skills, and interactions I have with others affect their lives as well. Little things that we do every day, even just a nice compliment to someone, or automating a workflow that someone hates, make someone's life better. That's purpose enough.

“Many of us can't afford to quit our jobs. I keep my family afloat.”

Yeah. If this wasn’t the case I would’ve bailed on tech not long after COVID started. I’m fortunate to have a job that I like, and doesn’t require any use of AI, but it’s like having a pleasant job on the Titanic while it’s going down.

I spend a lot of time trying to think of things I could do outside tech and still support my family.

james | 13 hours ago

When I started full-time design and dev work in the 2010s, tech was generally understood to be a progressive place. Tech organizations have now given up on pushing back. What happened to the principles that were professed a decade ago?

Anyone asking these kinds of questions, or perceiving a change in trend in this time period, is absolutely delusional. 99% of tech exists in a space of extreme exploitation, forwarding the goals of finance, adtech, or other exploitative, extractive organisations (e.g. Amazon). Show me someone working in a tech job who thinks it's 'progressive' in some way that is measurably and provably socially good, and I'll show you someone who is either absolutely delusional, able to compartmentalise the work they do in their tech job so they don't have to deal with the actual social effects of the real use of the tech they build, or one of the very, very few lucky people who get to build systems with a net positive social effect.

Tech is just part of the social and political system. If you live in a western country, tech is employed primarily in the service of the goals of your society, and at this point in history (and for at least the last 50 years) those goals have been primarily capitalist. If you didn't realise that until now, I'm glad you do now, but to consider this a change is so delusional that I'm continually shocked at how many people working in tech have so little social and political awareness that they're only now perceiving a change, or a sense that something is wrong. This is not a change: tech and capitalism have been intertwined bedfellows for a long time. And to blame it all on the currently most visible tech artefact is similarly disappointing. This has nothing to do with whatever the current most visible output of tech is (and hasn't, since that most visible thing was a loom or a steam engine); it has to do with the systems we choose to live in, which allow those things to be used for exploitation and in the service of capital and power.

k749gtnc9l3w | an hour ago

or one of the very very few lucky people who gets to build systems with tech which have net positive social effect.

And even rarer in the Tech sector, as opposed to in-house technology services. I think that even to be neutral in the big scheme of things while slightly improving the comfort of directly affected people, it helps to talk to the users sometimes, and Tech is all about avoiding that at all costs.

If you didn't realise that until now,

I think there is an answer to the question "what happened", and it is useful to actually give it.

What happened is: those companies were always or quickly became (one could say that some initially stumbled around for a year or two) explicitly economically-anti-progressive, which is what really matters most.

So when economically progressive regulation (from government or from court cases) was the biggest risk, they said whatever non-economic stuff would look closest to progressive (but, importantly, cost them as corporations nothing to say) to distract from the economically progressive measures that could come into play, like anti-monopoly laws and limits on liability disclaimers.

Now that they think isolationist policies are the most realistic big threat, they are searching for whatever talking points from the administration look safest to them as corporations, to steer the interpretation of isolationist promises away from directions that would hurt the tech corporations and toward things they don't care about (regardless of the levels of harm to people from the points they eagerly support and from the various possible interpretations of isolationism).

So yes, they did try to look progressive, but only where it didn't cost them. It was also selling out, and selling out is all they do anyway.

jorgelbg | 2 hours ago

I find it particularly interesting how AI has become the way of ditching a lot of established patterns in the name of solving a bottleneck that was never the problem to begin with. Some of these I've shared with my colleagues in random coffee chats, and we tend to agree:

Code reviews: theoretically not needed (nor practical with the volume of slop generated by the models). But code reviews were never about verifying each line per se. From my PoV it was more about knowledge sharing among the team, sometimes learning of a different way of doing things, and understanding why something that works would not scale or would not be the best approach. Also avoiding the tunnel vision that we all get when spending a lot of time focused on a single outcome.

Understanding code: yep, this apparently is not needed anymore, and it's very much related to the previous item. Architecture of the code, putting the right abstractions in place, improving the tests.

Number of lines written as a proxy for productivity and/or value added to a project/code base.

If we squint a bit though, it is not only sane technical practices going away; we are now back to "managers" (or CEOs) enforcing tools onto developers. We had an agreement that forcing tooling onto the developers who are meant to use it was a bad idea, unless we are talking about agents, models, LLMs, which are apparently the golden path now. Letting people just write code and use whatever means make the most progress, instead of "prompting their way to success", is going out the window.

lacker | 16 hours ago

I think each workplace is going to be different here. If you don't agree with your coworkers on the basics of whether to use AI, it's going to be pretty frustrating. Maybe just impossible.

Both sides are going to be annoyed. When the author is thinking "That lazy jerk just pasted in a review from an AI tool after making me wait for days", the other person is thinking "That lazy jerk pestered me to do work that he could have had an AI do for him days ago."

In this case, the author is trying to personally abstain from AI, while the organization is mandating it. This seems like a recipe for unhappiness. It's like loving Rust and hating Go and working at a Go development shop. If you really want to work with a particular set of tools, you need to find a place to work where you're allowed to use those tools.

sugaryboa | 8 hours ago

Do I belong in tech anymore?

The author is mixing tech and politics too heavily. So while a lot of the observations (especially about AI) are astute, the general premise that an election or a president has changed that much seems to be an illusion.

First off, everything is political. Full stop. Politics shapes tech, and vice versa. Politics shapes the market, people's careers, people's experiences, people's lives. "Politics" is not floating out in space; it's connected to everything, including your career, mine, and the author's.

Second, I think saying it's an "illusion" is a massive stretch and an attempt to devalue the author's actual, real-life, lived experience. Just because it didn't change anything for you, your career, your experience doesn't mean it hasn't changed anything for them, and a whole lot of other people too.

sugaryboa | 6 hours ago

First off, everything is political. Full stop.

Yes, but the degree matters.

"Politics" is not floating out in space, it's connected to everything, including your career, mine, and the author.

I am more of a marxist/materialist in this regard. I suspect that physics and material conditions affect politics more than vice versa. Programming (what has been called "tech" in the past 15 years; I really despise this moniker) is, after all, "cybernetics", the science of control. Control reaching humans was a matter of time, not decision making.

massive stretch and attempts to devalue the author's actual, real-life, lived experience.

In my lived experience the Sun rises in the East, but in reality the Earth revolves around its axis.

antonmedv | 18 hours ago

Generative AI tools, powered by data centers which consume vast amounts of water and pollute our environment, are built on the collective theft of the works of millions of artists, developers, authors, and other creatives, supercharge the spread of disinformation and fascism,

tldr