• stealth_cookies@lemmy.ca
    7 months ago

    Why is this news? Every company that deals with intellectual property, proprietary information, and/or sensitive information should not be using public LLM tools due to the risk of leaking that data. That is why these companies are providing more sandboxed versions of these tools to protect against the issue.

  • surewhynotlem@lemmy.world
    7 months ago

    With Copilot you can lock your data into your own tenant. You don’t leak data that way (except to Microsoft, I guess).

    • demonsword@lemmy.world
      7 months ago

      With copilot you can lock your data into your own tenant.

      OpenAI is almost a Microsoft subsidiary, so what you said doesn’t make much sense to me.

      • brick@lemm.ee
        7 months ago

        The selling point of M365 Copilot is that it is a turnkey AI platform that does not use its enterprise customers’ input data to train generally available AI models. That keeps their internal data from being surfaced to randos using ChatGPT. OpenAI, on the other hand, definitely does use ChatGPT conversations to further train ChatGPT, so there is a real risk of data leakage there.

        Same situation with all other public LLMs. Microsoft’s investments in OpenAI aren’t really relevant in this situation.