Apple is joining the growing list of companies banning ChatGPT and other third-party generative AI tools for company use.
According to a document viewed by The Wall Street Journal, Apple is worried that employees using these tools could inadvertently leak confidential information. And considering what happened at Samsung in April, when employees accidentally uploaded private source code to ChatGPT not once but three times, it's no surprise that Apple doesn't want to take that risk.
One piece of AI software that Apple specifically banned is GitHub Copilot, an AI tool that can generate and write simple code. It's a big time-saver for programmers. The issue is that data entered into these tools is stored on external servers, where it can be used to train multiple AI models, including those created and operated by other companies.
For instance, say you're working on code meant for a super-secret pair of Apple AR glasses; that's information you probably don't want to accidentally feed an AI run by a rival such as Microsoft or Google. Protecting proprietary information is why companies like Amazon, Verizon, JPMorgan Chase, and Northrop Grumman have banned ChatGPT for the time being.
OpenAI recently updated ChatGPT's privacy options to let users disable and delete their chat history, preventing those conversations from being used to train its large language model. However, according to its Data Controls FAQ, all conversations are still kept for 30 days before being permanently deleted, but only to "monitor for abuse"; OpenAI says it does not use that window to squeeze in any bonus LLM training.
In an earnings call, Apple CEO Tim Cook said it's "very important to be deliberate and thoughtful in how you approach these things, and there’s a number of issues that need to be sorted" regarding deploying AI.
Like Samsung, Apple is reportedly working on its own internal AI so that employees can take advantage of the efficiency these generative AI tools can bring without revealing the company's secrets to the world.