• Rhaedas@kbin.social · 1 year ago

    It’s not AGI that’s terrifying, but how willing people are to let anything take over their control. LLMs are “just” predictive text generators with a lot of extras that make the output sound really convincing sometimes, and yet so many individuals and companies have basically handed over the keys without even second-guessing the answers.

    These past few years have shown that if (and it’s a big if) AGI/ASI comes along, we are so screwed, because we can’t even handle dumber tools well. LLMs in the hands of willing idiots can be a disaster in themselves, and it’s possible we’re already there.