• MudMan@kbin.social · 8 months ago

    Yes. Technology went faster than people expected in some areas and much slower in others, to the point where the outcome may not be possible at all.

    That IS my “thesis”. The idea that in the 1960s video calls and a sentient robot cleaning your house seemed equally cartoonishly futuristic is the entire point I’m making.

    And to be clear, that holds even when restricted simply to consumer software and hardware. We got a lot better than expected at networking and data transmission… and now we’re noticeably slowing down. We are actually behind in terms of AI, but we’re way better at miniaturization.

    Again, people extrapolate from their impression of current rates of progress endlessly, but it’s hard to predict when the curve will flatten out. That’s the thesis.
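    The extrapolation point can be sketched numerically. A minimal illustration (all numbers made up purely to show the shape of the argument, not real data): naive exponential extrapolation and an S-shaped logistic curve look identical early on, then diverge wildly once capacity limits kick in.

```python
import math

def exponential(t, rate=0.5):
    # Naive extrapolation: current growth rate continues forever.
    return math.exp(rate * t)

def logistic(t, rate=0.5, ceiling=100.0):
    # Same early growth rate, but a hard ceiling flattens the curve.
    return ceiling / (1 + (ceiling - 1) * math.exp(-rate * t))

# Early on the two curves track each other closely; by t=20 the
# exponential has exploded while the logistic saturates near 100.
for t in (0, 5, 10, 20, 30):
    print(t, round(exponential(t), 1), round(logistic(t), 1))
```

    The catch, as the comment says, is that while you are still on the steep part of the curve, the two models are nearly indistinguishable, so you can't tell from inside the boom where the flattening starts.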

    • kromem@lemmy.world · 8 months ago

      > a sentient robot cleaning your house

      Again - we’re still 40 years away from that envisioned future.

      > We got a lot better than expected at networking and data transmission… and now we’re noticeably slowing down.

      The increase in the world-record data transfer rate from 2022 to 2023 was nearly 5x larger than the increase from 2020 to 2021.

      As for your claim that we’re behind on AI, I’d strongly recommend looking at what various firms were predicting AI would look like in 2020, and how literally all of them completely missed the mark on the arrival of GPT-3.

      Look at predictions for 2023 and you’ll see a lot of talk about a potential AI winter and how the data sources had been tapped. Meanwhile, the major research advances of actual 2023 were basically “how is GPT-4 so good at all these things” and “it turns out using GPT-4 to generate synthetic data can train much smaller networks to be much better than we could have achieved with previous data sets.”

      And this is all ahead of the very promising work on a shift to new chip architectures for AI workloads, specifically optoelectronics, which went from a pipe dream five years ago to a proof of concept at MIT, with DIY kits being made available to other researchers this year. So rather than hitting a plateau, much like the gains in optical networking, we’re heading towards more oil poured on the AI fire, not less.

      Your thesis is great for things like colonizing Mars or living in spaceships, but it’s kind of crap for things like AI and software advancement.

      • MudMan@kbin.social · 8 months ago

        I guarantee your Roomba will not walk around in a little maid uniform with a feather duster while balancing on a single wheel and making snarky comments in 40 years. I’ll bet any amount of money on that. Also no bubble flying cars or Rube Goldberg machines to wash your teeth. I don’t even know what point you’re trying to make at this point.

        Transfer speed records are not what I meant. Admittedly, I should have thrown the word “storage” in there; I thought I had. The idea is that while online infrastructure was exploding, it outpaced the needs for storage and transmission, so we went through a very fast rise in specs. Google was out there giving people free storage left and right because its capacity was growing faster than the needs of users.

        Now every cloud service is monetized, including Gmail’s storage, because storage growth no longer outpaces storage demand. ISPs tapped out by the time Netflix was chugging a quarter of the bandwidth with 4K video, and YouTube has been throttling resolution since the pandemic. Turns out Google won’t be adding zeros to your available free storage forever.

        As for AI, again, my entire point is that it’s not easy to know when growth will flatten out. I’d argue it already has, at least in terms of consumer access. We keep getting incrementally better image generation, but a future where every image is created by a computer and every search is done via a chatbot interface does not seem to be materializing the way early knee-jerk reactions suggested.

        Which is not to say that ML applications won’t be ubiquitous. There are tons of legit uses where the changes will be groundbreaking. But it being on a straight line that takes you to the holodeck? That line may bend down a lot sooner than that.