To protect your organization, ISF recommends you take these actions:
- Obtain advice on the metadata that communications providers must legally store, in every jurisdiction in which you operate.
- Collaborate across your organization and conduct a risk assessment to understand the impact of metadata lost by a communications provider.
- Engage with communications providers to agree to responsibilities and set minimum requirements for the secure storage of metadata.
- Establish if, how and when communications providers will notify you of a breach and work together to minimize impact.
Privacy regulations impede the monitoring of insider threats
According to a study released by McAfee in 2015, 43 percent of data breaches in that year were caused by insiders: users, managers, IT professionals and contractors. It should come as no surprise, then, that User Behavior Analytics (UBA) tools, which flag anomalous user behavior, have become increasingly popular: a 2016 report by MarketsAndMarkets Research predicted sales of UBA tools would increase nearly 600 percent from $131.7 million in 2016 to $908.3 million by 2021.
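To make the idea concrete, here is a highly simplified sketch of the kind of anomaly flagging UBA tools perform: compare each day's activity against a user's own baseline and flag large statistical deviations. The data and threshold are hypothetical, and real UBA products use far richer behavioral models.

```python
from statistics import mean, stdev

def flag_anomalies(daily_logins, threshold=2.5):
    """Flag days whose login count deviates more than `threshold`
    standard deviations from the user's historical mean."""
    mu = mean(daily_logins)
    sigma = stdev(daily_logins)
    return [i for i, n in enumerate(daily_logins)
            if sigma and abs(n - mu) / sigma > threshold]

# Hypothetical user: stable activity, then a sudden spike on day 9.
history = [12, 11, 13, 12, 10, 11, 12, 13, 11, 95]
print(flag_anomalies(history))  # → [9]
```

A real deployment would track many signals per user (logins, file access, data transfer volumes) and score them jointly, but the principle is the same: behavior is judged against the user's own history.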
But the ISF says new privacy regulations, such as the GDPR, South Korea's Personal Information Protection Act (PIPA), Hong Kong's Personal Data (Privacy) Ordinance and Singapore's Personal Data Protection Act, have the potential to constrain the use of such tools. They stipulate that an employer's use of monitoring tools must be controlled and transparent to the user. Under the GDPR, for instance, profiling of employees is forbidden unless the employee is informed of the logic underpinning the process. While Durbin notes that transparency and a culture of trust are good things, these regulations could also position malicious insiders to circumvent UBA, since they will know how they are being monitored.
To address the insider threat and the implications of new regulations, the ISF recommends you do the following:
- Take legal advice on restrictions regarding user profiling in every jurisdiction in which your organization operates.
- Establish a rigorous program (tied to the disciplinary process) that is transparent about any employee monitoring activity.
- Make employees aware of insider risk and train them to identify suspicious behavior.
- Undertake more regular and stringent audits of access privileges for insiders, assuring appropriate role-based access.
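The last recommendation, auditing access privileges against role-based expectations, can be sketched as a simple comparison of what each user holds versus what their role allows. The role definitions and user records below are illustrative, not drawn from any real system.

```python
# Hypothetical role definitions: the only privileges each role should grant.
ROLE_PRIVILEGES = {
    "engineer": {"repo:read", "repo:write"},
    "analyst":  {"repo:read", "reports:read"},
}

def audit_access(users):
    """Return the privileges each user holds beyond what their role allows."""
    findings = {}
    for name, info in users.items():
        allowed = ROLE_PRIVILEGES.get(info["role"], set())
        excess = info["privileges"] - allowed
        if excess:
            findings[name] = excess
    return findings

users = {
    "alice": {"role": "engineer", "privileges": {"repo:read", "repo:write"}},
    "bob":   {"role": "analyst",  "privileges": {"repo:read", "payroll:admin"}},
}
print(audit_access(users))  # → {'bob': {'payroll:admin'}}
```

Running such a check on every audit cycle surfaces privilege creep (bob's stray `payroll:admin` grant) before it can be exploited.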
A headlong rush to deploy AI leads to unexpected outcomes
AI systems represent a major innovation in terms of automation. The ability to learn independently will allow them to automate increasingly complex and non-repetitive tasks in areas ranging from manufacturing to marketing and consulting. But Durbin notes that while AI systems are no longer in their infancy, they are only likely to reach adolescence in the next two to three years. That makes them prone to errors: learning from wrong or incomplete information can lead to inaccurate conclusions, for instance.
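A toy example illustrates how incomplete training data produces confident but wrong conclusions. The nearest-centroid classifier and transaction figures below are invented for illustration: when one class is missing from training entirely, the model has no choice but to mislabel it.

```python
def nearest_centroid(train, point):
    """Classify `point` by whichever class centroid in `train` is closest."""
    centroids = {label: sum(xs) / len(xs) for label, xs in train.items()}
    return min(centroids, key=lambda label: abs(centroids[label] - point))

# Complete training data: fraudulent transactions cluster around 100.
complete = {"normal": [8, 10, 12], "fraud": [95, 100, 105]}
# Incomplete training data: no fraud examples were ever collected.
incomplete = {"normal": [8, 10, 12]}

print(nearest_centroid(complete, 98))    # → 'fraud'
print(nearest_centroid(incomplete, 98))  # → 'normal' (wrong: fraud goes unseen)
```

The model trained on incomplete data isn't broken in any detectable way; it simply cannot represent a pattern it never saw, which is exactly the failure mode an immature AI deployment risks.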