• R00bot@lemmy.blahaj.zone
    1 month ago

    I feel like the amount of training data required by these AIs is a pretty compelling argument that AI is nowhere near human intelligence. It shouldn’t take thousands of human lifetimes’ worth of data to train an AI if it’s truly near human-level intelligence. In fact, I think it’s an argument for them not being intelligent at all. With that much training data, everything that could be asked of them should already be in it, and yet they still fail at any task not covered by their data.

    Put simply: a human needs less than one lifetime of training data to be more intelligent than AI. If throwing more training data and compute at the problem hasn’t solved it already, I don’t think it will.

    • Todd Bonzalez@lemm.ee
      1 month ago

      A human lifetime’s worth of video is nowhere close to equaling a human lifetime of actual corporeal existence, even in the ideal scenario where the AI is as capable as a human brain.

      • R00bot@lemmy.blahaj.zone
        1 month ago

        Strange to equate the other senses with performance on intellectual tasks, but sure. Do you think feeding data from smell, touch, taste, etc. into an AI along with the video will suddenly make it intelligent? No, it will just make it better at guessing what something smells like. I think it’s very clear that our current approach to AI is missing something far more fundamental to thought than that; it’s not just a dataset problem.

    • stupidcasey@lemmy.world
      1 month ago

      You’ve had the entire history of evolution to develop the instincts you have today.

      Nature vs. nurture is a huge ongoing debate.

      Just because it takes longer to train doesn’t mean it’s not intelligent; human kids develop more slowly than chimps do.

      Also, “intelligent” doesn’t really mean anything. I personally think intelligence is the ability to distill unusable amounts of raw data and intuit a result beneficial to oneself. But very few people agree with me.

      • Peanut@sopuli.xyz
        1 month ago

        I see intelligence as filling areas of concept space within an eco-niche in a way that proves functional for action within that space. I think we are discovering that “nature” has little commitment, and is just optimizing preparedness for expected levels of entropy within the functional eco-niche.

        Most people haven’t even started paying attention to distributed systems building shared enactive models, but those systems are already capable of things that should be considered groundbreaking given the time and money spent developing them.

        That being said, localized narrow generative models are just building large individual models of predictive processing that don’t, by default, actively update their information.

        People who attack AI for just being prediction machines really need to look into predictive processing, or learn how much we organics just guess and confabulate on top of vestigial social priors.

        But no, corpos are using it, so computer bad, human good, even though the main issue here is the humans who have unlimited power and are encouraged into bad actions by flawed social-posturing systems and the conflation of wealth with competency.

    • rdri@lemmy.world
      1 month ago

      There is no “intelligence”; AI is a PR term. It’s just a language model that feeds on a lot of data.