senko a day ago

The energy and/or water use comes up from time to time so I did a little digging:

The average ChatGPT query uses 0.34 Wh (1.2 kJ, 0.3 kcal) of energy and 0.32 ml of water.

This means if you ask it one question every two minutes, you're using about as much energy as a 10W LED lightbulb.

If you use it 5,000 times, you'll have used about as much energy as a medium-sized pizza contains, and about as much water as you'd need to wash your hands after that pizza.
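
If you want to check the arithmetic, here's a rough sketch in Python using the numbers above (the per-query figures are the published ones; calling ~1,460 kcal "a medium pizza" and the hand-washing flow/time are my own rough assumptions):

    # Per-query figures quoted above
    WH_PER_QUERY = 0.34        # watt-hours of electricity
    ML_PER_QUERY = 0.32        # millilitres of water

    kj_per_query = WH_PER_QUERY * 3.6       # 1 Wh = 3.6 kJ  -> ~1.2 kJ
    kcal_per_query = kj_per_query / 4.184   # -> ~0.29 kcal

    # One question every two minutes = 30 queries per hour
    avg_power_w = WH_PER_QUERY * 30         # ~10.2 W, i.e. a 10 W LED bulb

    # 5,000 queries
    total_kcal = 5000 * kcal_per_query           # ~1,460 kcal, a medium pizza
    total_water_l = 5000 * ML_PER_QUERY / 1000   # 1.6 litres

    # Hand-washing: ~3.5 L/min through an aerator for 20-30 s -> 1.2-1.75 L
    print(round(avg_power_w, 1), round(total_kcal), total_water_l)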

Sources:

Energy and water use: https://blog.samaltman.com/the-gentle-singularity

Typical tap water flow (with a perlator/aerator), 3.5 L/min: https://www.enu.hr/wp-content/uploads/2016/03/7.-Racionalno-... (US stats are 4x that: https://www.nyc.gov/assets/dep/downloads/pdf/environment/edu...)

Time required to properly wash hands: https://www.hzjz.hr/sluzba-zdravstvena-ekologija/pravilno-pr...

  • mindcandy a day ago

    > This means if you ask it one question every two minutes, you're using about as much energy as a 10W LED lightbulb.

    Meanwhile your idle desktop PC and monitor are pulling 20-100W each.

    People tend to forget that doing the work the old-fashioned way uses energy too, presumably for a longer time to get equivalent results.

  • karmakaze a day ago

    Let's do this in reverse. xAI is planning to buy an additional million Blackwell GPUs right? That's more compute than we all need to ask a question every minute or two.

  • blibble a day ago

    now provide the numbers for the training

  • hgomersall a day ago

    Those numbers from Sam Altman may be correct, but he's hardly a disinterested observer so we should be a little wary of accepting them as-is.

    • odyssey7 a day ago

      The persistent popularity of discussing how AI is supposedly bad for the environment actually gives me the opposite impression.

      Ever since reading Ryan Holiday's book, Trust Me, I'm Lying: Confessions of a Media Manipulator, popular controversies like this always make me wonder who benefits from their continued discussion.

      Every time someone talks about how AI is using energy, that's free, viral publicity advertising AI, keeping us obsessed with talking about it for one reason or another. The controversy makes it stickier in our brains. It makes it feel relevant even if you aren't using the product yet.

      And yes, when done correctly, a leading figure in AI acknowledging the allegation and responding to it helps to further our obsession with the idea. The reassurance may be less about assuaging concerns than about keeping us engaged.

    • senko a day ago

      True.

      I don't think he'd outright lie, but possibly the calculation is ... stretched ... to fit the narrative.

      Let's call it a small pizza, then :)

    • tptacek a day ago

      The reasoning is highly motivated in both directions. I would just look at other sources. I haven't seen any source with alarming numbers, though.

      • mwcampbell a day ago

        Why do you think the anti-AI side has highly motivated reasoning?

        • tptacek a day ago

          AI is polarizing. People who don't like it really don't like it. This wouldn't be the first anti-AI argument that was more about sentiment than actual analysis. Again, I think the climate impact argument here is basically imported from the argument against crypto (where I enthusiastically join the skeptics) and doesn't apply here. Either way, my point is, just go look at different sources. Simon Willison has linked to some on his blog.

  • nandomrumber a day ago

    Every time water use comes up I feel compelled to mention that water is 100% recycled and 100% recyclable.

    • sameermanek a day ago

      Unless we crack efficient rainwater harvesting, it's not. Also, good water that goes up into the air and falls back with other pollutants makes our food worse.

      • nandomrumber a day ago

        Dams aren’t exactly an esoteric technology?

    • stickman393 a day ago

      Assuming you have free energy available to do it...

thundergolfer 2 days ago

I'm very sympathetic to those who want to focus on the climate disaster, but I'm not convinced that genAI's carbon pollution is a major problem. Maybe I've missed a memo, but the biggest polluters and criminals in the climate story are still by far the fossil fuel industry (e.g. Shell, Exxon) and the agricultural industry (e.g. Cargill).

At some point I was even hearing the claim that digitization (e.g. GenAI) was finally divorcing the tight connection between economic growth and resource extraction. I'd bet it's incorrect, but it's much less fanciful than thinking that growth in oil or beef would help us grow without strip mining the earth.

Bray's first issue, the way GenAI erodes the power of labour (and of democratic publics) relative to capital, is much more important and interesting.

  • christianqchung a day ago

    > the biggest polluters and criminals in the climate story are still by far the fossil fuel industry (eg. Shell, Exxon)

    What does this actually mean? It feels like when people say this, the implication is that gas companies just burn gas randomly for no reason. They sell their gas to everyday consumers, actual people. Why is this Shell/Exxon's fault and not the consumers'?

    • triceratops a day ago

      Did consumers ask Exxon to suppress climate change research?

      https://en.wikipedia.org/wiki/ExxonMobil_climate_change_deni...

      • SR2Z a day ago

        No, but that doesn't change the fact that people in general dislike the idea of lowering their standard of living for environmental reasons.

        Nobody pollutes in a vacuum, and the truth is that our emissions are the collective responsibility of all of us and not just 10 large companies or whatever the Reddit line is these days.

        • triceratops 18 hours ago

          > people in general dislike the idea of lowering their standard of living for environmental reasons.

          > Nobody pollutes in a vacuum

          I agree, not in a vacuum. If everyone knew the true risks then they might re-think things. They might prioritize tech that could reduce pollution without lowering living standards.

          Oil companies have worked very hard at covering up the truth, purely because a transition to cleaner tech would cost them money.

        • triceratops 16 hours ago

          > our emissions are the collective responsibility of all of us

          This reminds me of the Bugs Bunny/communism meme. My profits, our emissions.

    • satyrun a day ago

      The biggest polluter is China but who cares about reality.

      Big oil fits better into my social media feed and political beliefs.

      • acdha 18 hours ago

        Speaking of political beliefs interfering with accuracy, China’s population confounds those comparisons: if you look at per capita emissions, they’re well below the top even before you consider how much their export economy is shifting emissions rather than creating them.

        https://en.wikipedia.org/wiki/List_of_countries_by_carbon_di...

    • billy99k a day ago

      I suppose it's similar to the obsession with the economies of Nordic countries, while failing to mention they're propped up by the fossil fuel industry.

      • triceratops a day ago

        Only Norway and Denmark have significant oil exports. Sweden and Finland do not.

    • hughw a day ago

      We need to rewire the economy in a zillion different ways. Nobody should be using fossil fuels if they can avoid it, but so many cannot at present, e.g. air travel and steel mills. Shell and Exxon aren't forcing people to burn their product. If they shut down tomorrow, millions worldwide would starve. We'd better keep them going as long as we need them. Edit: I work in oilfield services and probably am a little biased.

      • triceratops a day ago

        They're lying to people about the harms of their products. I'd give them a pass if they hadn't done that. https://en.wikipedia.org/wiki/ExxonMobil_climate_change_deni...

        I don't think the oil industry should shut down tomorrow. Because you're right that it would be a disaster and hundreds of millions would starve and die.

        But I also don't think the oil industry should put up any barriers to renewable energy. By doing that they force us to need oil. They should accept that they need to wind down.

        Employees should be provided good exits from the industry.

      • onedognight a day ago

        > Shell and Exxon aren't forcing people to burn their product.

        Is this a joke? They are lobbying against the subsidy of alternatives that are better for society. The repeal of subsidies was just successfully achieved in the OBBB. Yet, they continue to get subsidies themselves. Have you noticed the special exemption for oil and gas extraction on your tax forms? It’s in your face.

        • SR2Z a day ago

          So revoking a subsidy is forcing people to act a certain way?

          Dude, we burned plenty of oil even when we had the subsidies. Big automakers were on board with the idea, energy companies were pivoting to batteries and renewables - and the average person still cared more about their costs and standard of living than whether things were actually green or not.

  • fraboniface a day ago

    Fossil-fuel (in particular coal) thermal plants that were planned for shutdown are being kept online or restarted because of AI energy use. Tech had a pretty minor environmental footprint until now, but it's growing rapidly due to AI, for use cases that are clearly not vital and are in good part garbage.

  • Veedrac a day ago

    > At some point I was even hearing the claim that digitization (e.g. GenAI) was finally divorcing the tight connection between economic growth and resource extraction

    Kind of? Mostly it's a result of renewables. The US basically doesn't build new fossil fuel power any more. Not every country is on exactly the same point on the curve, but they're all approximately following the same curve. Energy use has also decoupled, which I think is what you're referring to, but I'm not sure the cause neatly decouples.

  • sshine a day ago

    From the article:

    > Then there’s the other thing that nobody talks about, the massive greenhouse-gas load that all those data centers are going to be pumping out.

    Arguably, the GPU emissions are lower than the carbon footprint of the human workers they seek to replace.

    Before you downvote: yes, we can’t just “deprovision” humans, but unless you think reproduction is immoral because of the environmental impact (an extreme position), it has to be possible to increase total industrial production and negate global warming simultaneously. It’s a big equation. The hard part seems to be coordinating anything whatsoever as a species.

    • tptacek a day ago

      I don't think we even approach this science-fiction analysis of the carbon footprint of human workers vs. LLMs, because LLM inference apparently has a carbon impact in the vicinity of, like, a couple Google searches. The environmental concerns over LLMs seem mostly to be ported in from cryptocurrency (where they were a very real concern, because crypto put a serious cash value on energy arbitrage).

      • shawndrost 14 hours ago

        You have this extremely wrong. TLDR: LLMs singlehandedly reversed the secular decline in US power emissions, which were the only reason for climate optimism.

        The story of US power sector emissions was a good story. Emissions appeared to be in secular decline from 2005 through 2022, through the crypto nonsense, and the ramp-up of EVs. In large part, this was due to stable power growth, the replacement of coal with natgas, and the adoption of wind and solar. We were on track to go to 0-10% of historical emissions by 2040. https://www.c2es.org/wp-content/uploads/2022/11/2024-GHG-Tre...

        LLMs changed that story; it is now a bad story. Emissions are back on the increase. Natural gas power plants are sold out for six years plus. We are on track to go back to 100% of historical emissions by 2040. EVs are a factor but were a factor in 2018-2022 as well. In terms of popular narratives, it's pretty accurate to say that LLMs singlehandedly reversed the only reason for climate optimism in the US.

        • tptacek 13 hours ago

          I'll read any source you have for this, but it seems unlikely given the low percentage of US energy consumption data centers account for. Your home air conditioning is most of American electrical consumption.

          • shawndrost 11 hours ago

            My statements about the recent history of declining US power sector emissions are pretty vanilla and I won't source them here, but they should be pretty easy to verify. What I'm asking you to take on faith (or to google) is that the positive trend was a mix of flat load growth and the energy transition (from coal to natgas, wind, and solar).

            Load growth was flat 2005-2020, and now it's growing at circa 2%. Recent growth is almost all commercial and industrial, not residential [1]. There has been some manufacturing reshoring, etc, but the main driver is data centers [2].

            "[D]ata centers consumed about 4.4% of total U.S. electricity in 2023 [and ~1.5% in 2014]." "[T]otal data center electricity usage climbed from 58 TWh in 2014 to 176 TWh in 2023 and estimates an increase between 325 to 580 TWh by 2028 [which would be circa 20% YOY growth]." [3]

            Total US power use in 2023 was ~4,000 TWh. Compute power demand was at 4.4% of that and has been growing at ~20% YOY. (McKinsey forecasts 23% YOY growth and 2025 data center capex is growing 30% [4], but let's be conservative.) If 20% holds from 2023 through 2030, data center power demand will be at ~630 TWh (16% of the size of the 2023 grid). If it holds through 2040, it will be at ~4,000 TWh (roughly 100% of the size of the 2023 grid). (No citations here, this paragraph is just analysis; the 630 and 4,000 TWh figures are just 176 × 1.2^7 and 176 × 1.2^17. However, these numbers are similar to the forecasts in [2] and [3].)
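
            To spell out that compounding step, here's the same back-of-the-envelope extrapolation in Python (a sketch, assuming the ~20% annual growth rate simply holds):

                # US data-center demand: 176 TWh in 2023, assumed to compound at 20%/yr
                BASE_TWH, GROWTH = 176, 1.20
                US_TOTAL_TWH_2023 = 4000   # rough total US electricity use in 2023

                for year in (2030, 2040):
                    twh = BASE_TWH * GROWTH ** (year - 2023)
                    share = twh / US_TOTAL_TWH_2023
                    print(year, round(twh), f"({share:.0%} of the 2023 grid)")

                # 2030: ~631 TWh (16%); 2040: ~3,905 TWh (~98%, roughly the whole 2023 grid)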

            New data centers are more often powered by natural gas (and less by wind/solar) than the grid at large, and this will be true for the foreseeable future. I don't have a definitive citation for this, but it's obvious to anyone close to this industry (as I am) and you can dig up any number of confirming citations about specific data center stories or utility IRPs (integrated resource plans). One concrete fact supporting this narrative is that gas turbines are sold out for the next ~6 years, which I don't believe has happened before in this century. Gas is on an absolute tear.

            (Zero-carbon data centers that you read about, like Three Mile Island or next-gen geothermal, are specifically manufactured to disrupt the powered-by-gas narrative. They are about as common as data centers built in a way that revives coal plants; that is, they are a real but small phenomenon.)

            So, that is the story. A) Power sector emissions were in decline, because of flat load growth and the replacement of coal with wind, solar, and natgas. B) This isn't true anymore; the grid is expanding quickly again, and AI is the main culprit. C) At least for now, load growth is happening in a way that is dirtier than today's grid. D) If you believe that AI is going to continue to grow for 5-15 years like it is now, you also think that AI is (and especially, will be) a major driver of US greenhouse gas emissions.

            And to add an anecdotal rebuttal to your earlier comment: sure, inference is just a few google searches' worth of power. But now google searches have inference, so they are a few times more power-hungry than they used to be. And when I code, I use way more inference than the number of times I used to google. And remember that when we say "inference", we are talking about last-gen LLMs; reasoning models (which are, as of a few months ago, the default in chatbots) are a chain of inferences which can be arbitrarily long. The current status quo is much more worrisome than your quip, and the problem compounds quickly if you extrapolate at all.

            We have discovered a new and outrageously popular way to use compute, we built $455B worth of power-hungry data centers last year [4], and it is having (and will have) a huge effect on greenhouse gas emissions.

            [1] https://www.eia.gov/todayinenergy/detail.php?id=65264 [2] https://www.utilitydive.com/news/load-growth-challenges-supp... [3] https://www.energy.gov/articles/doe-releases-new-report-eval... [4] https://www.ciodive.com/news/data-center-ai-cloud-infrastruc...

            • tptacek 6 hours ago

              Your source [2] provides numbers (and a graph) only for total US electrical usage (4,000 TWh or so). The number you cite for data center usage exceeds 2023 estimates for all data center usage globally. Do you have numbers for data center usage in the United States? It's the US TWh number that tells us whether data center usage is merely rising (of course it is!) or whether it's actually a significant component of all US usage.

              This snagged you as well with your "recent growth" claim; I have no trouble believing it's true, but what you're saying and what I'm saying can be true at the same time.

              I don't think there's anything productive I can do with your "$455B worth of data center" numbers, as there's no transform that takes me from buildout cost to electrical usage.

              Respectfully: you said I was not just wrong, but "extremely wrong", so this should be easy for you to spell out. I appreciate the effort so far!

              • shawndrost 2 hours ago

                The below DOE link substantiates my quotes about US data centers and power usage, which I'll reproduce here. "[D]ata centers consumed about 4.4% of total U.S. electricity in 2023 [and ~1.5% in 2014]." "[T]otal [US] data center electricity usage climbed from 58 TWh in 2014 to 176 TWh in 2023 and estimates an increase between 325 to 580 TWh by 2028 [which would be circa 20% YOY growth]."

                These quotes are ~compatible with your ~4,000 TWh total-usage number. If you still think the below DOE link is wrong and "...exceeds 2023 estimates for all data center usage globally", could you share why you believe that?

                (Note that my "extremely wrong" is not directed at the literal text "LLM inference has a carbon impact of, like, a couple Google searches" but with the implication that LLMs have negligible carbon impact. If you think DCs were using 4.4% of US power in 2023 and growing at 20% YOY, and are a sizable-and-fast-growing carbon impact -- but that one LLM call is a small carbon impact -- I'll concede the latter and soften "extremely wrong" to "your original comment carried implications you didn't want".)

                https://www.energy.gov/articles/doe-releases-new-report-eval...

                • tptacek 2 hours ago

                  The 58->176 TWh interval from 2014 to 2023 clearly wasn't driven by LLMs; ChatGPT wasn't released until 2022. There were of course AI/ML models that preceded it, but nothing used at the scale LLMs are now. If your whole case is that technology writ large is driving data center expansion, that's fine; my argument is simply that it doesn't make sense to single out LLMs.

                  I think at this point though we understand the contours of our respective arguments! We don't have to keep litigating. Thanks for this!

  • tuatoru a day ago

    Add tourism and fast fashion to the list of bigger problems.

  • add-sub-mul-div a day ago

    We could be doing both more efficiently, but we have to drive and we have to grow food. Unlike those cases, producing slop isn't a necessity.

eadmund 6 hours ago

> The business leaders pumping all this money of course don’t understand the technology.

True enough!

> They’re doing this for exactly one reason: They think they can discard armies of employees and replace them with LLM services, at the cost of shipping shittier products.

I imagine that they believe they can discard armies of employees and replace them with LLM services, shipping products at either the same level of quality for the same price or for a lower level of quality at a lower price.

> The first real cost is hypothetical: What if those business leaders are correct and they can gleefully dispose of millions of employees?

I am absolutely certain that when Tim Bray needs a ditch dug on his property, he does not insist that the contractors use spoons. I am pretty sure that he does not insist that the timber for his house be hand-sawn only. I am pretty sure that he does not drive a horse and buggy. I am pretty sure that he does not grow all his own food by hand, nor does he require that all his food be grown by subsistence farmers. I am absolutely certain that the computers he uses for work and play are not hand-built from raw inputs like sand and rubber trees, but are instead the end product of a massive industrial supply chain no single man can hold in his head. I think well enough of him that I expect he is not gleeful at the thought of how many more jobs he could create if only he abstained from all technology developed after, say, 753 B.C.; why will he not extend that same grace to business executives?

I’m being a bit ridiculous, but that’s kind of the point. Technological innovations free up labour for more productive uses. If you go back far enough, most of our ancestors lived miserable, short lives of back-breaking effort. It wasn’t until the 1920 census that the majority of Americans lived in cities.

There’s no ‘mental stench’ (to use his infelicitous phrase) involved in using technology to reduce the amount of precious human labour required to achieve an end. Desktop publishing made high-quality output accessible to a much larger number of people than before; so too has generative AI.

RugnirViking a day ago

I do agree. I think that's the most important point: while people rightly point out that past automation, in the long term and averaged across everyone, probably did have a positive impact on income, that's not what is prompting the current investments. The investments only make sense as, and clearly are, a bet on reduced payroll.

Reduced payroll for a large portion of an already struggling and shrinking middle class. Thinking that some natural law says the jobs that eventually come back will automatically also be middle class is wishful thinking.

  • xg15 a day ago

    This is what makes me uneasy the most right now: The promise of reduced payroll can only materialize if there are no replacement jobs created by AI - otherwise the labor cost would just shift to a different position. So mass unemployment it is then?

    At the same time, the same people are also hard at work dismantling the social security net that makes prolonged unemployment survivable.

    So what exactly is the endgame here?

    • daxfohl a day ago

      I don't know, I think there's a real possibility that this ends up increasing the demand for software engineers rather than decreasing it. The lower barrier to entry will mean lots more projects pursued by lots more people, and they'll eventually need help. And all the things non-tech companies would like to do but can't afford a whole software team for, they can now launch with a couple of engineers. All the stuff that has been on tech companies' backlogs forever, they can start tackling, and then start the new initiatives they've been dreaming about but were too busy managing fire drills and keep-the-lights-on work to pursue.

      I think as software gets cheaper to build, there's just going to be a lot more of it, for use cases we aren't even thinking about yet. And the more software there is, the more challenges it will create to manage it. Our jobs will be a lot different, but I think anyone who is laying off humans right now to save a bit of cash is really short-sighted. It's going to be a long time before AI can do everything a software engineer does (think about accountants: it seems like they could have been replaced long ago, but it's still a huge industry). But between now and then, software engineers will be the highest-ROI employees there are.

    • majormajor a day ago

      The investment makes sense even if you think there will be replacement jobs created because nobody wants to be left behind.

      If you are spending X on payroll to do Y, and your competitor is now spending X on payroll to do both Y and Z, then they can (a) undercut you on selling Y, and (b) sell Z that you can't even compete with.

      (Or, even if you aren't ever going to do Z yourself directly, the replacement jobs could be somewhere else entirely. You actually hope this is the case - if there are no replacement jobs, the market for selling Y might shrink, if everyone has less money.)

      • xg15 a day ago

        Good points.

  • jay_kyburz a day ago

    It's really short-term thinking. What do those employers think all the software developers are going to do when they get home? They're going to start competing companies.

    As an indie game developer, it's mind-boggling that MS can fire 9,000 staff, and it makes me wonder how much more flooded the indie game market can get :)

xnx a day ago

> Then there’s the other thing that nobody talks about, the massive greenhouse-gas load that all those data centers are going to be pumping out.

This is discussed ad nauseam, and the carbon accounting is very poorly done.

Looking at the capital and operating expenses of datacenters is the right way to think about it. Nothing about that tells me that AI is environmentally worse than driving a big vanity pickup truck, owning a large house, having lots of offspring, or taking many international flights.

daft_pink a day ago

So, I just want to say that increasing productivity over time should result in net higher prosperity.

There are numerous examples of disruptive technologies that reduced labor costs, and over time the world has gotten better, not become a dystopia.

I'm sure there will be winners and losers and it will take time to adjust, but dramatic increases in productivity will make a better world, because it will take less effort for you to get what you want.

  • dfxm12 a day ago

    Productivity has already been going up and the prosperity has largely been captured by the capital-owning class.

    https://www.epi.org/productivity-pay-gap/

    Add to this weakened labor, and social programs being treated like a four-letter word by most politicians, the media, and voters. AI isn't going to make the system change.

    • daft_pink a day ago

      I think this is a separate argument, as the productivity gains are getting competed away and the consumer is benefiting.

      I'm not sure if capital owners, management or workers are getting equal slices of the benefits, but I do believe that everyone is benefiting and it doesn't make sense to avoid progress as though increased productivity isn't worthwhile.

plasticeagle a day ago

Certainly Gen AI is being marketed to business leaders as being capable of reducing their payroll, but I don't believe that's what it's for as such. And as several comments have mentioned, its energy use is not even especially significant.

Gen AI exists to wrest control of information from the internet into the hands of the few. Once upon a time, the Encyclopaedia was a viable business model. It was destroyed as a business model once the internet grew to the point that a large percentage of the population was able to access it. At that point, information became free, and impossible to control.

Look at google's "AI summaries" that they've inserted at the top of their search results. Often wrong, sometimes stupid, occasionally dangerous - but think about what will happen if and when people divert their attention from "the internet" to the AI summaries of the internet. The internet as we know it, the free repository of humanity's knowledge, will wither and die.

And that is the point. The point is to once again lock up the knowledge in obscure unmodifiable black boxes, because this provides opportunity to charge for access to them. They have literally harvested the world's information, given and created freely by all of us, and are attempting to sell it back to us.

Energy use is a distraction, in terms of why we must fight Gen AI. Energy use will go down, it's an argument easily countered by the Gen AI companies. Fight Gen AI because it is an attempt to steal back what was once the property of all of us. You can't ban it, but you can and absolutely should refuse to use it.

  • exceptione a day ago

    The second-order effects are where the real dangers lie: people will lose the ability to understand their own reality. You see it on Twitter, where community notes are being replaced by AI. Stupid users asking "@grok is it true that ...?" People are gullible, putting trust where they absolutely shouldn't.

    Musk wasn't happy about some facts, so grok changed. "Sorry, I was instructed to.." These tools are seen by the clueless populace, whether it is your own aunt or some HN'ers, as an objective, factually correct oracle, free of influence.

    Then there are lobby groups pushing for AI in the judiciary. Always under the banner of "cost savings". Sure, guess whose case gets handled better.

    A debate about what a healthy society would be, what people share as a common cause, is as urgent as ever. Without the reality distortion from autocratic interest groups and allied talking heads. The AI flood is unstoppable, but with the current culturally engineered crises in many democratic countries, it will most likely result in serious catastrophe.

  • jofla_net a day ago

    We will be reduced to an infantile state, erased.

Herring a day ago

Aren't most jobs bullshit jobs already? Fingers crossed genAI doesn't change this too much.

Low empathy has been an issue with humanity since day 1. I wouldn't even know how to begin to fix it. It'll probably still be an issue long after we're dead. If it really bothers you I recommend meditation/therapy/etc.

Don't expect action on climate change until a few million in western countries are killed. Humans are terrible at slow-moving disasters. My parents both died early from being sedentary, despite my best efforts to get them to work out.

With luck, smarter decision-making with genAI might actually improve some societal systems.

  • bluefirebrand a day ago

    > smarter decision-making with genAI might actually improve some societal systems

    I wouldn't count on genAI making smarter decisions, only decisions that benefit the people who control the computers that it runs on

  • tptacek a day ago

    There's really no such thing; economists have spent the last 6 years dunking on Graeber's analysis.

falcor84 a day ago

> Is genAI useful? · Sorry, I’m having trouble even thinking about that now.

If you're only willing to look at the downsides of an issue and not at its upsides, then you can reject the entirety of human advancement a priori.

troupo a day ago

> The real problem · It’s the people who are pushing it. Their business goals are quite likely, as a side-effect, to make the world a worse place

Me, 10 months ago:

--- start quote ---

Someone on Twitter said: "Do not fear AI. Fear the people and companies that run AI".

After all, it's the same industry that came up with pervasive and invasive tracking, automated insurance refusals, automated credit lookups and checks, racial ...ahem... neighbourhood profiling for benefits etc. etc.

https://news.ycombinator.com/item?id=41414873

--- end quote ---

daxfohl 11 hours ago

I think the issue isn't who specifically is running it, but that there's nobody running it. The economy of it is being run by the same old capitalist machine with no concern except ROI. The leaders are just figureheads, easily replaced if they start to care about anything other than quarterly return.

Workaccount2 a day ago

I'm sorry, but removing barriers that stop people from creating their own computer programs, their own anime scenes, their own marimba jazz, or their own live action commercial for their used car is not going to make the world a worse place.

There will be downsides and there will be people negatively affected, but democratizing ability has never been a net negative.

  • satisfice a day ago

    “What great spin you have, grandma,” said Little Red Riding Hood.

karaterobot a day ago

This article read like a conspiracy theory, and it offers no evidence for its position. It's not even really making a case. It's just stating that companies are putting money behind AI so that they can lay off employees and reduce the quality of their products, and by saying that they apparently think they've made an argument.

  • apical_dendrite a day ago

    When I've talked to startup CEOs or execs building AI products at larger companies, their sales pitch is usually some form of:

    "Right now X is expensive because you have to hire people to do it. That means that access to X is limited. By using AI, we can provide more access to X".

    X could be anything from some aspect of healthcare, to some type of business service, or even something in the arts.

    It's very clear that the people building AI companies, and the people investing in those companies, think that there is an enormous market to use AI to automate a wide variety of work that currently requires human labor. They may not be explicitly framing it as "we get to reduce our payroll by 50%" - they may be framing it as "now everyone in the world will get access to X" (often tech executives will use a grand mission to justify horrible things), but the upshot is that companies are 100% putting money behind AI because they believe it will help them get more work out of fewer and fewer people.

    • mitjam a day ago

      I would say "works as designed" - efficiency gains add shareholder value which is what most orgs optimize for. Pitching these makes sense. As usual, the costs of second order effects are externalized.