• RememberTheApollo_@lemmy.world

    Where are my futurists now? Tell me again how technological advancement will free humans from drudgery to engage in freer, more enlightened pursuits?

  • Moorshou@lemmy.zip

    Holy crap, I thought I hated AI but I wasn’t sure. Now I’m sure I hate AI.

  • shrugs@lemmy.world

    Is nobody concerned about this:

    Behind the wall, an army of robots, also powered by new Nvidia robotics processors, will assemble your food, no humans needed. We’ve already seen the introduction of these kinds of ‘labor-saving’ technologies in the form of self-checkout counters, food ordering kiosks, and other similar human-replacements in service industries, so there’s no reason to think that this trend won’t continue with AI.

    not being seen as paradise? It’s like the Enterprise crew worrying about replicators because people will lose their jobs.

    This is madness. To be honest, this is what humankind should ultimately evolve toward: no stupid labour for anyone. But the truth is that capitalism will take care of that; it will make sure that not everyone is free, that a small percentage becomes freer while the rest get fucked. That’s where the problem lies, not in being able to make human labour obsolete.

    • anon_8675309@lemmy.world

      The wealthy ruling class has siphoned off nearly all of the productivity gains since the ’70s. AI won’t stop that machine. If half of us die of starvation and half of the rest die fighting each other for cake, they don’t care.

    • UnderpantsWeevil@lemmy.world

      I’ve been watching people try to deliver the end-to-end Food Making conveyor belt for my entire life. What I’ve consistently seen delivered are novelties, more prone to creating a giant mess in your mechanical kitchen than producing anything both efficient and edible. The closest I’ve seen are those microwaved dinners, and they’re hardly what I’d call an exciting meal.

      But they are cheap to churn out. That’s what is ultimately upsetting about this overall trend. Not that we’ll be eliminating a chronic demand on human labor, but that we’ll be excising any amount of artistry or quality from the menu in order to sell people assembly line TV dinners at 100x markups in pursuit of another percentage point of GDP growth.

      As more and more of the agricultural sector falls under the domain of business interests fixated on profits ahead of product, we’re going to see the volume and quality of food squeezed down into what a robot can shove through a tube.

  • TheFeatureCreature@lemmy.world

    On the plus side, the industry is rapidly moving towards locally-run AI models specifically because they don’t want to purchase and run fleets of these absurd things or any other expensive hardware.

    • MudMan@fedia.io

      The tragic irony of the kind of misinformed article linked here is that the server farms that would run this stuff are fairly efficient. The water is reused and recycled, and the heat is often put to use for other applications, because wasting fewer resources is cheaper than wasting more.

      But all those locally-run models on laptop CPUs and desktop GPUs? That’s grid power being turned into heat and vented into a home (probably with air conditioning on).

      The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether or not they matched the new challenges, is going to PR this tech into wasting far more energy than it would otherwise by distributing it across billions of computing devices paid for by individual users. And nobody is going to notice or care.

      I do hate our media landscape sometimes.

      • XeroxCool@lemmy.world

        If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it’s so efficient? Of course they do. The high efficiency of a data center is great, but that’s not what the article laments. The problem it calls out is the absurdly wasteful reason these farms will flourish: powering flashy programs that feign intelligence, vainly burning energy on tasks a simple program was already handling.

        It’s the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. Sure, they save energy overall (for now), but they prompted people to multiply the number of lights and the total output by an order of magnitude simply because light became so cheap. That creates a secondary issue of further increasing light pollution and intrusion.

        Greater efficiency doesn’t make things right if it comes with an increase in use.

        • MudMan@fedia.io

          For one thing, it’s absolutely not true that what these apps provide is the same as what we had. That’s another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at surfacing accurate facts but better at matching fuzzy queries without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.

          For another, some of the numbers being thrown around are not realistic or factual, are not presented in context, or are part of a power-increase trend that was already under way with earlier applications. The average high-end desktop PC ran on 250 W in the ’90s and 500 W in the 2000s; mine now runs at 1000 W. Playing a videogame used to burn as much power as a couple of lightbulbs; now it’s the equivalent of running your microwave oven.
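
          For a rough sense of scale, here’s a quick back-of-envelope sketch (in Python) using the wattage figures above; the bulb and microwave numbers are typical assumed values, not measurements:

          ```python
          # Back-of-envelope energy comparison using the wattages mentioned above.
          # All figures are rough, illustrative assumptions, not measurements.

          def kwh(watts, hours=1.0):
              """Convert a sustained power draw in watts to kilowatt-hours."""
              return watts * hours / 1000.0

          loads = {
              "1990s desktop (~250 W)": 250,
              "2000s desktop (~500 W)": 500,
              "modern gaming desktop (~1000 W)": 1000,
              "a couple of incandescent bulbs (~120 W)": 120,
              "microwave oven (~1000 W)": 1000,
          }

          for name, watts in loads.items():
              print(f"{name}: {kwh(watts):.2f} kWh per hour of use")
          ```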

          The argument that we’re burning more power because we’re using more compute for entertainment purposes is not factually incorrect, but it’s hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and not inconsistent with how we’ve used other computing features for ages.

          The only reason you’re so mad about me wasting some energy asking an AI to generate a cute picture, but not about me using AI to generate frames for my videogame, is that one of those is a viral panic that maps neatly onto the anti-crypto panic people already had, while the other is a frog that’s been slowly boiling for three decades, so nobody has had a reason to form an opinion about it.