• Blue_Morpho@lemmy.world · 4 months ago

    Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.

    Fiber buildouts were cancelled back in 2000 because wavelength multiplexing (DWDM) made existing fiber far more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.

    • contrafibularity@lemmy.world · 4 months ago

      yeah, genai as a technology and field of study may not disappear. genai as an overinflated product, marketed as the be-all end-all that would solve all of humanity’s problems, may. the bubble can’t burst soon enough

    • scarabic@lemmy.world · 4 months ago

      This is a good point. It’s never sat right with me that LLMs require such overwhelming resources and seemingly can’t be optimized. It’s possible that innovation has been too fast to worry about optimization yet, but all this BS about building new power plants and chip foundries for trillions of dollars just seems mad.

    • FooBarrington@lemmy.world · 4 months ago

      I’m gonna disagree - it’s not like DeepSeek uncovered some upper limit to how much compute you can throw at the problem. More efficient hardware use should be amazing for AI since it allows you to scale even further.

      This means that MS isn’t expecting these data centers to generate enough revenue to be profitable, and that they’re not willing to bet on further advancements changing that. In other words, MS doesn’t have a positive outlook for AI.

      • Blue_Morpho@lemmy.world · 4 months ago

        > More efficient hardware use should be amazing for AI since it allows you to scale even further.

        If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave RTX 5090 performance to games on GTX 1080-class hardware, would you still buy a new video card this year?

        When all the telcos scaled back on building fiber in 2000, was that because they didn’t have a positive outlook for the Internet?

        Or when video game companies went bankrupt in the 1980s, was it because video games were over as entertainment?

        There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that’s proof AI is over.

  • BlameTheAntifa@lemmy.world · 4 months ago

    There’s no need for huge, expensive datacenters when we can run everything on our own devices. SLMs (small language models) and local AI are the future.

    • yarr@feddit.nl · 4 months ago

      This feels kinda far-fetched. It’s like saying “well, we won’t need cars, because we’ll all just have jetpacks that we use to get around.” I totally agree that eventually a useful model will run on a phone. I disagree that it will happen soon enough to matter to this discussion. To give you some idea: DeepSeek is a recent model, and it has 671B parameters, while devices like phones are running 7-14B models. So eventually what you say will be feasible, but we have a ways to go.
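
      To put rough numbers on that gap, here’s a back-of-envelope sketch of weight memory alone (my own figures, assuming standard fp16 and 4-bit quantization; KV cache and activations are ignored):

      ```python
      # Rough memory footprint for model weights alone.
      BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

      def weight_gb(params_billions: float, dtype: str) -> float:
          """Gigabytes needed just to hold the weights."""
          # params_billions * 1e9 params * bytes-per-param / 1e9 bytes-per-GB
          return params_billions * BYTES_PER_PARAM[dtype]

      for name, size_b in [("DeepSeek-class (671B)", 671), ("phone-class (7B)", 7)]:
          for dtype in ("fp16", "int4"):
              print(f"{name} @ {dtype}: ~{weight_gb(size_b, dtype):,.0f} GB")

      # 671B @ fp16 is ~1,342 GB of weights; even at int4 it's ~336 GB.
      # A 4-bit 7B model is ~4 GB, which is phone-sized today.
      ```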

      • BlameTheAntifa@lemmy.world · 4 months ago

        The difference is that we’ll just be running small, specialized, on-demand models instead of huge, resource-heavy, all-purpose models. It’s already being done. Just look at how Google and Apple are approaching AI on mobile devices. You don’t need a lot of power for that, just plenty of storage.
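
        Running one of those small models locally is already simple. A minimal sketch, assuming a local runner like Ollama on its default port (the model name here is illustrative, not a recommendation):

        ```python
        # Minimal sketch: query a small local model via Ollama's HTTP API.
        # Assumes Ollama is running on its default port (11434) and that a
        # small model (illustratively "phi3") has already been pulled.
        import json
        import urllib.request

        def ask_local_model(prompt: str, model: str = "phi3") -> str:
            payload = json.dumps(
                {"model": model, "prompt": prompt, "stream": False}
            ).encode()
            req = urllib.request.Request(
                "http://localhost:11434/api/generate",
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())["response"]

        print(ask_local_model("Summarize: small local models trade generality for efficiency."))
        ```

        No GPU cluster, no API key; the whole round trip stays on the device.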