• Damarus@feddit.org · 37 points · 3 months ago

    It never fails to amaze me how disconnected C-level people are from reality.

    • verdi@feddit.org · 13 points · 3 months ago

      Abstraction layers. They are so detached from everyone else through abstraction layers that we’re nothing more than D2 NPC character sheets to them. That’s why when a Luigi, allegedly, breaks through all of those abstraction layers and brings a leaded reality check to these fucking parasites, they double down on Palantir-like projects to keep themselves safe, making the state even more oppressive and invasive of everyone’s privacy.

    • Onno (VK6FLAB)@lemmy.radio · 8 points (1 down) · 3 months ago

      What you’re describing is a general experience with LLMs, not something limited to the C-level.

      If an LLM spouts rubbish, you detect it because you have external knowledge; in other words, you’re the subject matter expert.

      What makes you think that those same errors are not happening at the same rate outside your direct personal sphere of knowledge?

      Now consider what this means for the people around you, including the C-level.

      Repeat after me: AI is Assumed Intelligence and should not be considered anything more than autocorrect on steroids.

    • hcbxzz@lemmy.world · 2 points · 3 months ago

      Managers love these AI tools because that’s what they’re already doing and familiar with: talking an AI into doing something for you is not very different from instructing a mediocre worker.