Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa

  • barsoap@lemm.ee · 1 year ago

    > but the hardware will continue to improve and get cheaper.

    Eh. I mean, sure, the likes of A100s will invariably get cheaper because they’re overpriced AF, but there isn’t really that much engineering going into those things hardware-wise: accelerating massive chains of FMAs is a much smaller challenge than designing a CPU or GPU. Meanwhile Moore’s law is – well, maybe not dead, but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn’t been true for a while now, and the physics aren’t exactly getting easier – fabs are now battling quantum uncertainty in the lithography process itself.
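    To illustrate why the workload is comparatively simple: a matrix multiply, the core operation these accelerators exist to run, is nothing but long chains of fused multiply-adds. A minimal Python sketch (not tied to any particular chip, names are illustrative):

```python
def matmul_fma(A, B):
    """Multiply two matrices purely via chains of multiply-add steps,
    the primitive that AI accelerators execute in bulk."""
    n, k, m = len(A), len(A[0]), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = 0.0
            for p in range(k):
                acc = acc + A[i][p] * B[p][j]  # one fused multiply-add per step
            C[i][j] = acc
    return C

# Each output element is just k chained FMAs; there is no control flow,
# no branch prediction, no cache-coherency cleverness to design.
print(matmul_fma([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```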

    Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it’s not like digital systems can’t exploit that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it’s an arcane art.
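    The digital precision-reduction trick mentioned above can be sketched in a few lines: squash weights down to 8-bit integer codes and back, and the error stays bounded by half a quantisation step. A toy Python example, assuming symmetric per-tensor quantisation (function name and values are made up for illustration):

```python
def quantize_int8(xs):
    """Symmetric per-tensor quantisation: map floats to int8 codes and back."""
    scale = max(abs(x) for x in xs) / 127 or 1.0   # step size; guard against all-zero input
    codes = [round(x / scale) for x in xs]         # integer codes in [-127, 127]
    return [c * scale for c in codes], scale       # dequantised approximation

weights = [0.52, -1.27, 0.003, 0.98]
approx, scale = quantize_int8(weights)
# reconstruction error is bounded by half a quantisation step – a perturbation
# small enough that a trained network typically shrugs it off
assert all(abs(a - w) <= scale / 2 for a, w in zip(approx, weights))
```

    The key difference from analogue: this error bound is deterministic and chosen at design time, whereas analogue noise drifts with temperature, ageing, and process variation.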


    tl;dr: Don’t expect large leaps, especially not multiple ones. This isn’t a noughts-era “buy a PC twice as fast at half the price two years later” kind of situation; AI accelerators are silicon like any other and already benefit from all the progress we made back then.