• queermunist she/her@lemmy.ml · 7 days ago

    DeepSeek showed that these chatbots can be run much more cheaply than they have been, and that it isn’t really necessary to build giant warehouses of servers. It might be possible to run them on even tighter hardware specifications too.

    Of course, chatbots aren’t AI, and trying to use them as AI isn’t going to work out anyway lol

    • zerakith@lemmy.ml · 7 days ago

      Yes, it’s clear that throwing more and more resources at LLMs to improve quality has been a lazy, growth-focused approach, and that we could do better if we actually tried a design-focused approach.

      For me, though, it comes back to the fact that we are facing a polycrisis and most of our resources should be focused on finding solutions to that. I’m not sure what problem* this technology solves, let alone what problem relating to the polycrisis.

      *I realise that what they are designed to solve is a capitalist problem: how to avoid paying staff for service and creative jobs in order to increase profit.