That is kind of assuming the worst case scenario though. You wouldn’t assume that QA can read every email you send through their mail servers “just because”.
This article sounds a bit like engagement bait based on the idea that any use of LLMs is inherently a privacy violation. I don’t see how pushing that text through a specific class of software is worse than storing confidential data in a mailbox, though.
That assumes they don’t leak the data into training, but the article doesn’t address that.
Always assume the worst; I guarantee it usually is that bad in reality. Companies absolutely hate spending money on IT, and security is always an afterthought. API logs for the production systems that contain your full legal name, DOB, SSN, and home address? Yeah, wide open and accessible by anyone. Production databases with employee SSN, address, and salary information? Same thing, look up how much the worthless management is making and cry.
Booz Allen just got shit on because of the dude they hired who specifically sought out consulting for the IRS so he could steal Trump’s IRS records.
https://home.treasury.gov/news/press-releases/sb0371
https://en.wikipedia.org/wiki/Charles_E._Littlejohn
This is some pathetic chuddery you’re spewing…
You wouldn’t assume that QA can read every email you send through their mail servers “just because”
I absolutely would, and Microsoft explicitly reserves the right to do that in their standard T&C, both for emails and for any data passed through their AI products.
https://www.microsoft.com/en-us/servicesagreement#14s_AIServices
v. Use of Your Content. As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.
We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.
We get to decide whether to use Your Content, and we don’t have to pay you, ask your permission, or tell you when we do.
Microsoft is almost certainly recording these summarization requests for QA and future training runs; that’s where the leakage would happen.
100% agree. At this point I am assuming everything sent through their servers is actively being collected for LLM training.