Facing the new realities of security for AI

Amplified risks of data leakage in AI systems

As the volume of AI-generated content expands, the potential for data leakage and exposure increases as well. Organizations face heightened risks stemming from practices such as data oversharing and shadow IT.

Without appropriate user training, the rapid proliferation of AI tools can also create environments in which users share or use data without fully understanding its sensitivity, compounding the risk of compliance violations and data breaches.

Data oversharing and breaches: Data oversharing occurs when users inadvertently gain access to sensitive information through AI applications, often due to insufficient labeling policies or inadequate access controls. This might lead to unauthorized exposure of confidential data, posing significant risks to both individuals and organizations.

Shadow IT: With 78% of AI users bringing their own AI tools to work (BYOAI),¹ sometimes without the knowledge of the IT or security group within an organization, the risk of data leakage increases. When employees use third-party AI tools and paste sensitive information such as source code, meeting notes, and data spreadsheets into user prompts, they can inadvertently expose confidential company data outside of the company.

"We want to make sure that whatever data that we feed to it, it stays within [our company], and it's not some proprietary information [that] gets leaked outside…"
— Technical Decision Maker, IT
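The shadow IT risk described above is ultimately a data-flow problem: sensitive text leaves the company boundary inside user prompts. One common mitigation is to scan and redact prompts before they reach a third-party AI tool. The sketch below illustrates the idea only; the pattern names, the regexes, and the `corp.example.com` domain are illustrative assumptions, not part of this report, and a real deployment would rely on an organization-maintained DLP policy rather than a short hard-coded list.

```python
import re

# Illustrative patterns only (assumptions for this sketch); a real
# deployment would load rules from an organization-managed DLP policy.
SENSITIVE_PATTERNS = {
    "api_key": re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "internal_host": re.compile(r"\b[\w-]+\.corp\.example\.com\b"),
}

def redact_prompt(prompt: str) -> tuple[str, list[str]]:
    """Replace sensitive matches with placeholders before the prompt
    leaves the company network; return the redacted text plus the
    names of the rules that fired (useful for audit logging)."""
    hits = []
    for name, pattern in SENSITIVE_PATTERNS.items():
        if pattern.search(prompt):
            hits.append(name)
            prompt = pattern.sub(f"[REDACTED:{name}]", prompt)
    return prompt, hits
```

A gateway sitting between users and external AI tools can apply such a filter transparently, so employees keep the productivity benefit of BYOAI while source code, credentials, and internal hostnames never leave the company in cleartext.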
