Why is this news? Any company that handles intellectual property, proprietary information, or other sensitive data should not be using public LLM tools, because of the risk of leaking that data. That is why vendors are providing more sandboxed versions of these tools to protect against exactly this issue.
With Copilot you can lock your data into your own tenant. You don’t leak data that way (except to Microsoft, I guess).
OpenAI is almost a Microsoft branch, so what you said doesn’t make much sense to me.
The selling point for M365 Copilot is that it is a turnkey AI platform that does not use its enterprise customers’ input data to train generally available AI models. This prevents their internal data from being surfaced to random ChatGPT users. OpenAI, by contrast, does use ChatGPT conversations to further train ChatGPT, so there is a real risk of data leakage.
The same applies to every other public LLM. Microsoft’s investment in OpenAI isn’t really relevant here; what matters is the data-handling contract, not who owns whom.
This.