A coworker said, “with a paid subscription they don’t use your data/chats to train the AI.” It’s been decades since I laughed that hard.
They probably also reposted one of those “if you post this picture facebook can’t scrape all your data” images
I hear that if you super-upgrade to the enterprise plan, they will promise your legal department to be totally cool with ALL your data and prompts!
“I didn’t read the terms of service, but I’m still gonna talk like I did.”
It will be in the terms of service, but terms of service violations cost these businesses less than a day of profits, when they cost them anything at all.
To be fair, none of these LLM companies make profits, so there is nothing to fine or tax.
@baggachipz @pinball_wizard well, there is if you account for the environmental destruction and water consumption
Has me thinking about enterprise privacy. What happens if a company has secrets exposed? Will they stop supporting AI, or just fire the unlucky employee who did as instructed?
My company sent out guidelines telling us not to put confidential shit in copilot. So they’re already preemptively blaming us. Idk how they could enforce it though.
You don’t think they can track everything sent?
Worse, they’ll probably ask Claude who sent it and trust the output.
Don’t think there is an if (just maybe a “when”)… but yeah, they blame the employee for sure
Old code is insane. The coders at my work don’t want to touch the millions of lines of Visual Basic 6 and Fortran that prop up the company. No loops. No encapsulation. Just assignment and soft validations.
Copilot says that was considered safe back in the day: one team just triple-checking things and sending to production. The comments suggest the issues I hit today have been known and unaddressed for decades.
I can’t get the code to compile, and you have to pay MS if you want the VB6 IDE, so all I can do is look at the ancient texts I barely understand and ponder their implications for my job.
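For anyone who hasn’t seen this style: a hypothetical sketch (invented names, not the actual codebase) of what “no loops, no encapsulation, just assignment and soft validations” tends to look like in VB6:

```vb
' Hypothetical example only -- invented names, not the real codebase.
' Straight-line assignments into module-level globals, and a "soft"
' validation that logs a warning but carries on instead of failing.
Public Sub UpdateOrderTotals()
    gSubTotal = gQty1 * gPrice1 + gQty2 * gPrice2 + gQty3 * gPrice3
    gTaxAmt = gSubTotal * gTaxRate
    gTotal = gSubTotal + gTaxAmt

    ' Soft validation: note the problem, keep going anyway
    If gTotal < 0 Then
        Debug.Print "WARN: negative total, continuing"
    End If

    lblTotal.Caption = Format(gTotal, "Currency")
End Sub
```

No data structures, no reuse, nothing stops a bad value from reaching production, which is why those decades-old comments are still accurate.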
If you use just any old LLM, they’d probably fire you. But if the company gives you something like Copilot through an enterprise license, it has the same data protection thing (whatever they call that shit) as the rest of the suite, like SharePoint, OneDrive, and Teams. In that case it’d be a pretty big issue for Microsoft if something leaked from there.
Leaks happen out of MS products all the time.
But it seems to always turn out to be “user error” of course.
When MS really fucks up, it seems like it’s big news in the Linux communities here, and basically doesn’t make the news elsewhere.
“Aw no, darn it! Nameless Junior Engineer/intern/contractor lost all your data to the open web and a forced update corrupted your C: Drive…Again! Such a shame. Anyway you’re gonna love what’s new in M$ 365 / Azur€ / Gam£₱a$$ / Window$ €£€V€N / ¢o₱i£ot…”
–Micro$£o₱, every time
In my previous company, I pushed hard against incorporating non-local LLMs for that reason, since we dealt with very sensitive information. I was ignored with that same argument you just posted.
I’m against training on my code because my code is terrible. The only thing worse is the other code from coworkers in the same project.
I pity the people whose LLM trained on my GitHub code; my code is awfully written.