Even the smartest of boffins can trip up sometimes, and that's exactly what happened when a member of Microsoft's AI research team accidentally exposed 38TB of sensitive internal data by misconfiguring a link.
Wiz, a cloud security company that routinely looks for vulnerabilities and exposures in cloud-hosted data, detailed the exposure on its blog (via ITWire). It found a GitHub repository belonging to Microsoft's AI research division, hosting open-source code and AI models for image recognition. But that's not all Wiz found.
A configuration error allowed anyone to access the entire storage account, and this data included two complete PC backups belonging to Microsoft employees. According to Wiz, the data included "sensitive personal data, including passwords to Microsoft services, secret keys, and over 30,000 internal Microsoft Teams messages from 359 Microsoft employees."
Furthermore, the files weren't read-only. They could be rewritten or deleted at will. In fairness to Microsoft and the employees, access to the files wasn't completely public. Access was granted via an Azure sharing feature called a SAS (Shared Access Signature) token, a shareable link that in this case granted full access. Anyone with that link, which would include users looking to access the AI source code, would have had access.
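For the curious, a SAS token works by baking the permissions and expiry date into a signature computed with the storage account's secret key, so the link itself carries its own authorization. The sketch below is a simplified, hypothetical illustration of that signing scheme (it is not the exact Azure string-to-sign format, and the account name and key are made up): the point is that an overly broad permission string like `"rwdl"` and a far-future expiry are signed right into the link, which is what made this one so dangerous.

```python
import base64
import hashlib
import hmac

def sign_sas(account_key_b64: str, string_to_sign: str) -> str:
    # SAS-style signatures are an HMAC-SHA256 over a "string to sign",
    # keyed with the (base64-decoded) storage account key.
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical, simplified fields; the real Azure string-to-sign has more.
account = "exampleaccount"       # made-up account name
permissions = "r"                # read-only: far safer than "rwdl" (read/write/delete/list)
expiry = "2020-10-01T00:00:00Z"  # a short-lived expiry limits the blast radius
string_to_sign = "\n".join([account, permissions, expiry])

# Made-up key for illustration; a real account key is a long random secret.
fake_key = base64.b64encode(b"example-key").decode("utf-8")
token = sign_sas(fake_key, string_to_sign)
```

Because the signature covers the permission and expiry fields, anyone holding the link gets exactly those rights until the expiry passes, and the token can't be audited or revoked individually without rotating the account key itself.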
What's worse is that the data had been exposed since 2020. Microsoft was only made aware of the exposure in June this year, meaning the data was accessible for around three years.
Microsoft posted a lengthy statement on its own blog, stating "No customer data was exposed, and no other internal services were put at risk because of this issue. No customer action is required in response to this issue".
That sounds fair, but internally there are sure to be a few red faces and breathless IT personnel running this way and that to change the passwords and keys that were exposed. Just in case.
Kids, adults, gamers, and boffins alike, it's important to configure your storage accounts correctly. You never know who might come sniffing.