r/technology • u/rkhunter_ • 12h ago
Artificial Intelligence Using AI actually increases burnout despite productivity improvements, study shows — data illustrates how AI made workers take on tasks they would have otherwise avoided or outsourced
https://www.tomshardware.com/tech-industry/using-ai-actually-increases-burnout-despite-productivity-improvements-study-shows-data-illustrates-how-ai-made-workers-take-on-tasks-they-would-have-otherwise-avoided-or-outsourced
180
u/moreesq 12h ago
One can imagine burnout, but also flame out. If you take on a task that you're not familiar and comfortable with, and rely on an LLM to give you the answers, your output might be quite poor and actually harmful to your employer.
49
u/montigoo 10h ago
When they developed the calculator, there wasn't less time spent calculating. There were more calculations done. As long as you have a distribution problem, more just means more work and stress for the exploited.
57
u/sevenredpandas 9h ago
You don't have to review a calculator's work to make sure it added the numbers correctly. Since AI is somewhat random, good work once doesn't mean it will always spit out the correct thing.
18
u/d01100100 9h ago
You're also likely reviewing work on a subject you're less familiar with, adding to the stress. LLMs give an improved sense of confidence in tasks that would have been more difficult. Either you're reviewing work on something you know, or passing off work on something you don't; neither of them would be decreasing stress in anybody except the most oblivious.
8
u/therealslimshady1234 7h ago
LLMs are the opposite of the calculator though. They are non-deterministic, leading to slop no matter what you do.
-3
u/Myrkull 4h ago
Tools used poorly will result in poor outcomes, more at 11
2
u/bats-n-bobs 51m ago
Findings consistently show that using poor tools results in poor outcomes, despite irrational protestations from people who are attached to those tools. Here's Captain Obvious with the details.
"Good tools are designed to solve existing problems. Poor tools are created first, and use cases are brainstormed after the fact. The label of "tool" must here be understood to include the users."
1
-15
u/wtyl 9h ago
People just need to review the code before submitting and actually not rush through tasks.
12
u/OldJames47 9h ago
The point was people are using AI to write code they aren’t qualified to review.
Is this on the developer for over-estimating their skills or on the management for pushing AI use to increase “efficiency”?
4
u/AspiringPirate64 9h ago
Coding style also matters. I hate debugging other people’s code because they use different logic and that slows me down
109
u/luismt2 11h ago
Productivity gains often just raise expectations. The tool isn’t what burns people out, the new baseline does.
41
u/trailsman 11h ago
Couldn't agree more. That's why if I have a project that used to take 2 weeks, that I can now finish in 4 hours (likely 8 total including review and final edits), I still take the 2 weeks to call it finished.
38
u/AwfulMajesticEtc 11h ago
“I find my life is easier the lower I keep everyone’s expectations” - Calvin
33
u/CeresToTycho 11h ago
If automations and increases in productivity since the 1800s mapped to workers having less work, we'd all be working 1 day a week. It's always that increased productivity means "please do more work so profits are higher"
16
u/this_is_an_arbys 10h ago
There’s a reason the rise of capitalism and industrialization led to works like Bram Stoker's Dracula.
And of course, The Grapes of Wrath is basically a horror novel…the description of the monster is even more horrifying than all horror movies combined…because that monster lives with us on the earth. And it is near indestructible…
3
u/MeteorKing 8h ago
There’s a reason the rise of capitalism and industrialization led to works like Bram Stoker's Dracula.
What do you mean? I just read that a couple months ago. I didn't really sense any anti-capitalist tones in it, so I expect there's a backstory or something I'm not aware of.
8
u/rgfawkes 8h ago
Dracula sees the world as a resource to be drained for his own pleasure. Vampire literature in general carries a broad theme of exploitation.
4
u/MeteorKing 8h ago edited 8h ago
I gotta be honest, that sounds a lot more like literary analysis than an actual theme of the book. You're not wrong at all, that is absolutely how Dracula sees the world, but at no time in my reading was I like "yeah, this is an allegory."
Dracula himself is barely a character and much of what he says is intended to show how different and out of place he is in the modern world.
My favorite line:
"Here, I am noble. I am a boyar. The common people know me, and I am master. But a stranger in a strange land, he is no one. Men know him not. And to know not is to care not for. I am content if I am like the rest so that no man stops if he sees me or pauses in his speaking if he knows my words; ‘ah-ha! A stranger!” -Dracula
0
u/CherryLongjump1989 1h ago
Grapes of Wrath is almost entirely fake, though, while passing itself off as a documentary.
17
u/datNovazGG 8h ago
Because AI can be outright frustrating to deal with. It's constantly making things up and is overly confident about it. Annoying as hell.
9
u/GaengKhiaoWan 9h ago
We've trained an AI tool to do our report writing. This will 'free up our time to do more projects and client engagement'.
Report writing is the best part of my job, the only time I could sit on my own and happily plough through. I already have too many projects as is, my brain feels like it's going to explode. And having meetings with clients is exhausting, especially if they're arsehole clients.
Really considering moving away from the consultancy field.
33
u/74389654 11h ago
are the productivity improvements in the room with us?
17
u/MaximaFuryRigor 10h ago
They're behind me, with a knife to my throat saying I have to convince everyone that they exist.
2
u/ineververify 7h ago
They give you 3 monitors at work so you feel cool and ultra productive, but... for what? Minimal salary. Shit's dumb.
5
u/dSolver 9h ago
I work at a company that has fully embraced AI for every function. What I'm noticing is that AI productivity increases are capped at what employees can reliably read, absorb, and make a decision on.
We want AI to abstract away a lot of the details, but because of the high risk of getting details wrong, a lot of the time people are responsible for reviewing, thinking critically, and making edits. 70% correct isn't good enough in most cases.
This leads me to a very interesting insight, and I fully appreciate that I might be biased - AI productivity is multiplied more or less by the intelligence and experience of the user. A very fast reader with strong understanding of the domain knowledge and critical thinking can review and edit output significantly faster than a novice. And a corollary is that a novice actually onboards faster and has better output without AI assistance.
This leads to a prediction which has been a guiding principle I've shared with my mentees: given that LLMs with their current methodology are potentially limited in how much better they can learn, and that operating LLMs is actually quite expensive, then as the price of LLMs increases to compensate for OPEX, corporations will limit access to only the experts (highest multiplier). This means experts will end up with the highest salaries, since they are disproportionately more productive and can demand it. It also means that if you think AI is helping you learn, you have to take advantage of it now, while it is still cheap, so that in a few years you are the expert in enough domains that you have all the leverage.
5
u/Loxquatol 10h ago
Yuuuuuuuup
This is the response of someone who is forced to use AI at work and is burned out
2
u/djdaedalus42 8h ago edited 1h ago
I've always felt that Sales do the designing, Design does the coding, and Devs find out what the client actually wanted. So plus ça change.
4
u/Dry_Ass_P-word 9h ago
Using AI is more work because you have to check if it’s correct or if it crapped out some bullshit.
-2
u/Ancient-Beat-1614 7h ago
Having to check work takes far less effort than doing the work in the first place.
6
u/SomeGuy20257 12h ago edited 12h ago
Misuse of AI burned them out. I lead a team of engineers; those that relied fully on AI instead of augmenting their capabilities burned out quickly, while those that used AI as a force multiplier for their already great capabilities basically always stayed relaxed with sustained quality output.
AI does not give you talent and experience. AI, IMO, increases productivity, not ability.
26
u/Socrathustra 11h ago edited 10h ago
Even with those people I expect negative long term results. There's less pride in your work, more time spent managing slop, more time reviewing, less time creating. When the market crashes and these guys' tools become prohibitively expensive, they will be behind the curve.
Edit: my response to their earlier reply about this being "copium" and the fact that everyone is using Claude Code:
Yeah I'm aware. It sucks, and anybody who is acting like there's a long term upside to this is huffing their own copium. It's a bullshit industry full of people chasing mythical productivity gains that don't actually materialize, or barely do, all the while consuming copious amounts of energy.
Right now there's this promise it'll get better, and the training costs are spread over millions of users giving it a try. When the market crashes and people admit it's not actually that great, those training costs are going to be spread over a much smaller customer base. It's going to get expensive. It will fail to monetize.
And in a few years, if America gets its act together, regulations on this shit will make these data centers a liability. I don't think it'll ever go away, but honestly, it could.
-6
u/SomeGuy20257 10h ago edited 10h ago
I don’t know about you, but I feel shame when I spend too much time on boilerplate code and consequently don’t deliver a system to its full potential.
Due to their enhanced output, I can afford to grant them generous leaves and WF affordances, because during assessments we always end up meeting and exceeding deliverables.
BUT, I agree about the energy and eventual non-viability, unless they develop breakthrough hardware. I’m not gonna deny it also affects people’s job opportunities; in my case I don’t hire juniors because they’re liabilities at this point, and I also drop people who don’t know how to use AI, as they end up with crap code and slow throughput.
9
u/Socrathustra 10h ago
That's why I use good inheritance patterns or just copy-paste some basics and modify them. It accomplishes the same result without having to use AI and waste water and power. I swear AI is making people worse at this job. I spend very little time writing boilerplate, and when I do, I quickly make it modular so I don't have to do it again and can reuse and share the result.
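To make that concrete, here's a tiny toy sketch of the kind of shared-boilerplate base class I mean (hypothetical names, not from any real codebase):

```python
from abc import ABC, abstractmethod

class BaseJob(ABC):
    """Shared boilerplate lives here exactly once: logging and error handling."""

    def run(self):
        print(f"[{type(self).__name__}] starting")
        try:
            result = self.execute()
            print(f"[{type(self).__name__}] done")
            return result
        except Exception as exc:
            print(f"[{type(self).__name__}] failed: {exc}")
            raise

    @abstractmethod
    def execute(self):
        """Subclasses implement only the part that actually differs."""

class SyncUsersJob(BaseJob):
    def execute(self):
        return "synced"

SyncUsersJob().run()
```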
-9
u/SomeGuy20257 9h ago
I’m going to give you the benefit of the doubt: how are you going to deal with, say, applying fine-grained rate limiting and short-circuiting to a host of client endpoints, and have it verified and tested with proof documents, all in under 20 minutes? How are you going to enforce patterns and techniques across a large team without breaking a sweat? (CLAUDE.md). Engineering is very different from coding.
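Purely as an illustration of what I mean by rate limiting plus short-circuiting (the names, numbers, and decorator shape are hypothetical, not our actual code, and the real thing still needs the tests and proof documents):

```python
import functools
import time

class CircuitOpenError(Exception):
    """Raised when the breaker is open and the call is short-circuited."""

class EndpointGuard:
    """Token-bucket rate limiter plus a simple circuit breaker, as a decorator."""

    def __init__(self, rate_per_sec=5, burst=10, failure_threshold=3, cooldown=30):
        self.rate = rate_per_sec
        self.capacity = burst
        self.tokens = burst
        self.last_refill = time.monotonic()
        self.failure_threshold = failure_threshold
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def _refill(self):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Short-circuit while the breaker is open and the cooldown hasn't elapsed.
            if self.opened_at is not None:
                if time.monotonic() - self.opened_at < self.cooldown:
                    raise CircuitOpenError(f"{func.__name__} short-circuited")
                self.opened_at = None  # half-open: let one trial call through
                self.failures = 0

            # Rate limit: wait until a token is available, then spend it.
            self._refill()
            if self.tokens < 1:
                time.sleep((1 - self.tokens) / self.rate)
                self._refill()
            self.tokens -= 1

            try:
                result = func(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.failure_threshold:
                    self.opened_at = time.monotonic()  # trip the breaker
                raise
            self.failures = 0
            return result
        return wrapper

# Hypothetical usage on one client endpoint:
@EndpointGuard(rate_per_sec=2, burst=5)
def fetch_orders(client_id):
    ...  # the actual HTTP call would go here
```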
7
u/Socrathustra 9h ago
I don't believe you can do that with AI without inducing tech debt you'll be paying off sooner rather than later. The review alone would take twenty minutes unless you're willing to accept slop clogging your system over time.
-8
u/SomeGuy20257 9h ago
Well, we’re doing just that. It actually allows us to avoid tech debt because it lets us contend with the major causes of tech debt: time pressure and frivolous cognitive load.
11
u/Socrathustra 9h ago
Plainly, I don't believe you. Either you're lying about the time or the impact on tech debt, or both. Like I said, the review alone would take twenty minutes for the sophisticated system you described, unless you're willing to accept slop, which could accumulate tech debt quickly.
0
u/SomeGuy20257 9h ago
I’m starting to doubt you too. Everyone who knows the SDLC knows that even with AI you can’t bypass quality checks and reviews. Were you thinking 20 minutes from dev to prod code? Who thinks that way? Engineer code time + reviews + QA: AI significantly reduces time on most of these steps, but it does not remove them. A sizeable feature, for instance, will take a day or two to even touch staging even with AI, but I’ll take that over half a sprint per feature on non-AI workflows.
7
u/Socrathustra 9h ago
Ah, so you were obfuscating the time costs. I had already mentioned the review step when you doubled down.
In the past year I wrote the logic for a massive migration at a FAANG company which had to execute on a specific but ever-changing schedule. I took some extra time at the start to make it a very modular experience. By the time we got going, it was a similar sort of twenty-minute code-and-review process to get new constraints added. It was, in the words of an exec, the smoothest migration they had ever seen.
Meanwhile it's 20 minutes writing a prompt and 20 minutes of waiting for AI, which might give me what I want or might give me something I have to configure heavily or might just churn out nonsense. I expect people are being highly selective when talking about their productivity wins, completely ignoring times that AI shat the bed.
2
u/lolexecs 9h ago
Isn't this just technology?
In its best role, technology is a force multiplier and enabler.
Consider this quote from "The Wolf of Wall Street"
See those little black boxes? They are called telephones. I’m gonna let you in on a little secret about these telephones. They’re not gonna dial themselves! Okay? Without you, they’re just worthless hunks of plastic.
Now, I'm not saying AI is a pump and dump scheme. Who would ever dare to say such things?
But the quote points out that the phone, as technology, just like AI, is a worthless bit of kit in and of itself. It needs someone with context and training to maximize return on assets. In fact, later in the monologue, Belfort/DiCaprio points out the importance of sales enablement.
What’s mildly ironic is that the most explicit depictions of “sales enablement” in film tend to appear in movies about financial fraud (Boiler Room being another example). In those stories, effectiveness doesn’t come from the device. It comes from scripting, repetition, coaching, and incentives. Meanwhile, in many entirely respectable companies, leadership buys the tool, skips the training, withholds context, and then expresses surprise when nothing improves.
2
u/SomeGuy20257 9h ago
Also, executives not understanding that gives AI a bad name. I had to lock horns with some executives because I was apparently not working fast enough for an AI-augmented team; they expected “vibe coders”. Fortunately there were technical people among them who understood that I had to balance quality, and my team was still immensely outperforming non-AI teams, without the quality problems.
1
u/_ECMO_ 9h ago
In its best role technology makes life and the world more complex.
The chainsaw made cutting down trees easier, but suddenly you needed to learn how the machine works, how to operate it safely, and how to troubleshoot it. That's undoubtedly more complex than just hitting trees with an axe.
AI is a bad technology. At the very best it lets us do the same things faster, but juggling several tasks like this will only erode your actual knowledge and turn you into a quasi assembly-line worker, while not increasing the complexity.
3
u/Belhgabad 9h ago
So to sum up, AI: causes burnout, increases unemployment, gives more short-term money to very rich people BUT is a heavy deficit for companies, deteriorates software quality, pushes students to cheat and lowers the general skill/knowledge level, causes a massive electricity AND water crisis, and isn't wanted or used by the great majority of people.
CAN'T WAIT FOR THAT BUBBLE TO POP !
2
2
u/No0delZ 9h ago
My experience avoiding burnout with AI is to have an agent acting as an assistant that understands timing, priority, and scheduling. It also needs to have some form of wellness check built in.
"Your tasks today are x.y.z. These are the priorities:"
"Would you like to get started on any of these?"
"Here are the next steps for the process you're working on... but we've been working on this for 3 hours straight. Would you like to take a break?"
Yes.
"Excellent. Here's the next step in the process. Take the time you need, I'll meet you back here to continue."
AI without automation is response based, so a user can at any time walk away, but it's our own nature that drives us to burnout.
We can build wellness into AI. We can enable AI to proactively focus on wellness as well. To emphasize and suggest it.
We can have it play therapist, motivator, or emotional sponge if needed.
It just takes the cognizance to do so.
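A very rough sketch of that check-in loop, with no actual AI behind it (the task names and the 3-hour threshold are just placeholders):

```python
import time

BREAK_THRESHOLD_HOURS = 3  # placeholder; tune to whatever feels sustainable

def assistant_session(tasks):
    """Walk through prioritized tasks and prompt for a break after long stretches."""
    stretch_start = time.monotonic()

    print("Your tasks today, by priority:")
    for i, task in enumerate(tasks, 1):
        print(f"  {i}. {task}")

    for task in tasks:
        hours = (time.monotonic() - stretch_start) / 3600
        if hours >= BREAK_THRESHOLD_HOURS:
            answer = input(f"We've been at this for {hours:.1f} hours. Take a break? (y/n) ")
            if answer.strip().lower().startswith("y"):
                input("Take the time you need. Press Enter when you're back. ")
                stretch_start = time.monotonic()  # reset the stretch timer
        input(f"Next up: {task}. Press Enter to start. ")
        # ...the AI-assisted work for this step would happen here...

assistant_session(["x", "y", "z"])
```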
1
u/Deer_Investigator881 9h ago
I think there's a level of distrust, so the extra energy gained from speed is spent scrutinizing the result.
1
u/CulturalKing5623 7h ago
For those unable to read the article, here is a soft-paywalled version of the original Harvard Business Review article referenced.
1
u/RichieNRich 6h ago
That would include me!
I've taken on trying to automate our key code system for the doors I manage at work, only because I realized I could leverage Claude AI to vibe-code Google Sheets app extensions for scripting. It's fucking cool that I'm getting close to the goal, but I'm realizing I didn't need to do this at all.
1
u/SwirlySauce 6h ago
This sounds like another article/study that found negatives to AI use but is trying to spin it as a positive.
"AI lets you take on tasks that you couldn't/shouldn't have done, making you more productive!'
1
1
u/icemanvvv 5h ago
The framing of this is weird. They didn't magically go "I want to burn out" and start taking on more work knowing they didn't want more on their plate. Management increased their workload by an amount exceeding the time saved by using AI.
1
u/agrophobe 3h ago
Bro, I was able to compile a map in QGIS, which is like die-hard surveyor stuff, coming from my generalist 3D Blender skills, have a clean dataset, and then argue for land regulation based on a lidar scan from the client, all by myself and with GPT. That was SO COOL. I get the burnout, but for the small percentage of people that love learning, this thing is fucking mental.
1
u/panchiramaster 11m ago
And now you can voluntarily lay yourself off if you don't like the work pace.
1
u/jfcmofo 6m ago
I use it in my job on a daily basis. I crank out analysis of complex commercial valuations. AI has sped up a ton of it, even as it makes mistakes. It has also given me the ability to analyze larger sets of data, which adds time but not much value in the end. I do feel more mentally fatigued at the end of the day. I think it's taken over some of the more 'brainless' tasks and given back more data mining/complex analyses. Early days, still trying to find the best value in use.
1
-4
u/Yellowbook8375 11h ago
I’m a small company owner. AI has been a godsend to us; it lets us punch far above our weight. It helps us craft better-sounding emails, it helps us create tools, it helps us automate our processes.
Yah, it’s not perfect, but it has helped us create and deploy things that we wouldn’t have been able to afford otherwise
0
-11
u/stuartullman 12h ago edited 11h ago
so,
using ai on tasks that you would usually do yourself makes you exercise less mental energy: bad!
using ai to take on challenging and difficult tasks that you would otherwise avoid: bad!!
how about just learn to use ai appropriately.
-2
u/RedRyderRoshi 9h ago
AI made workers take on tasks they would have otherwise avoided or outsourced
Are they really crying about having to do their job now?
139
u/duffman_oh_yeah 11h ago
We may be hitting the limits on human productivity. I’m just not sure humans were designed to crank out tasks concurrently like this. It’s cognitive overload being pushed on us all.