Companies that added public cloud services and remote workers during the COVID-19 pandemic discovered compliance complications and an increased likelihood of data loss.
According to a recent survey of 304 IT professionals conducted by Enterprise Strategy Group (ESG), "The State of Data Privacy and Compliance," 57% of respondents said they thought more than 20% of their sensitive data already stored in the public cloud was probably insufficiently secured. Sixty-one percent of respondents said they had lost data or suspected they had lost data.
The loss is largely attributed to human error, which has escalated under remote working policies. More than a third (36%) of respondents said the data loss was related to remote users.
From a senior management perspective, cloud applications added during the pandemic could become yet another silo that hampers the business, albeit a silo at AWS or Azure rather than one living on-premises.
“Regulations are changing, architectures are changing, and we’re replacing our legacy applications with SaaS solutions,” said Niel Nickolaisen, CIO at Sorenson Communications, a Salt Lake City, Utah-based provider of video relay and American Sign Language in-person interpretation services. “Now there’s nothing there and the data is in someone else’s data center. There are more places for bad things to happen.”
To meet this challenge, IT professionals must adjust security and access policies, regardless of where the data resides. “A well-architected environment between storage, compute and cloud is directly related to how enterprises manage compliance,” said Vinny Choinski, analyst at ESG, a division of TechTarget.
Businesses need to act quickly to secure data because any leak creates a sense of risk for their customers. After all, no one wants to do business with a company that can’t secure its data. But today’s distributed environments are moving targets, likely managed by multiple departments or even multiple companies. And the many different vendors that provide the SaaS applications these enterprises use often focus more on service availability than on security. Securing the data that lives in these services, and securing access to them, ultimately remains the responsibility of the user.
The rush to digital transformation
Underlying much of this increased activity is the fact that companies have accelerated their digital transformation projects, but in many ways the ecosystem cannot absorb the changes. For example, while larger providers, such as AWS or Microsoft, may be set up to deliver localized services, others are unprepared for that level of service segregation, Nickolaisen observed.
In a perfect world, all service providers would offer regional instances of their services to customers who want data to reside in specific geographies. This would be useful in cases where a customer wants to replicate data from one region to another using the same provider for each.

“It reduces my complexity and also improves my agility,” Nickolaisen said. “If I make a change to my services, I simply replicate them, using the same provider services, in my different regions.”
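To make the idea concrete, here is a minimal sketch (my illustration, not drawn from the article) of how same-provider, cross-region replication can be set up programmatically. It uses AWS S3 via the boto3 SDK; the bucket names and IAM role ARN are hypothetical placeholders, and replication assumes versioning on both buckets.

```python
# Illustrative sketch: replicate an S3 bucket to another region
# using the same provider's native replication feature.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Replication requires versioning on the source (and destination) bucket.
s3.put_bucket_versioning(
    Bucket="example-source-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)

# Hypothetical IAM role and destination bucket in another region.
s3.put_bucket_replication(
    Bucket="example-source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/example-replication-role",
        "Rules": [
            {
                "ID": "replicate-to-eu",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # empty filter replicates the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {"Bucket": "arn:aws:s3:::example-dest-bucket-eu"},
            }
        ],
    },
)
```

Because one provider handles both regions, a change to the source propagates without stitching together two vendors’ replication mechanisms, which is the complexity reduction Nickolaisen describes.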
Regulatory instability also adds complexity to compliance. GDPR rules, for example, are quicksand that creates uncertainty about what companies can and cannot do. Under the first version of the GDPR, Nickolaisen said, his company complied by articulating in its EULA how it would use customer data. Customers were invited to opt in.
“Even in the EU or the UK, depending on the service used, everything was fine,” he said. “Since then, it’s not as airtight as we thought, and even opt-in may be insufficient.”
Internal security risks
The ESG study pointed to other sources of cloud-resident data loss beyond remote employees. Businesses typically keep their most sensitive data in their own data centers, if only because they believe service providers face some of the same challenges their customers do, including attacks from the inside. The control in the cloud might not be worse, but it now feels one step removed.
In a query of 177 respondents, 29% said cloud-resident data loss stemmed from sensitive data being uploaded to IT-sanctioned cloud services, another 29% from data exposed on personal devices, and 25% from the use of unauthorized cloud services. Though fewer in number, 20% reported data loss due to malicious insiders.
To stay up to date – automate
One expert said users can and should fight back with automation. Companies need to look at all possible ways to monitor and manage their infrastructure, including interconnectivity and access controls as well as applications, said Andrew Plato, CEO of Zenaciti, a consultancy based in Beaverton, Ore.
Plato explained that automation makes compliance in the cloud easier, not harder, although he acknowledged that companies in the early stages of automation may find it more difficult at first. He recommended contacting cloud service providers’ representatives for assistance. They have the resources, security and development patterns to guide IT teams through the process, he said. Microsoft Azure, for example, publishes libraries of templates, blueprints and other documentation on how to implement privacy and security in Azure environments.
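To illustrate the kind of automated check this enables, here is a minimal sketch (an assumption on my part, not an example from the article or from Azure’s libraries) of a compliance scan written in Python with AWS’s boto3 SDK: it flags S3 buckets that lack a default encryption configuration. Credentials are assumed to be configured in the environment.

```python
# Illustrative compliance scan: flag S3 buckets without default encryption.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        # Raises ClientError when no default encryption is configured.
        s3.get_bucket_encryption(Bucket=name)
        print(f"OK      {name}")
    except ClientError as err:
        code = err.response["Error"]["Code"]
        if code == "ServerSideEncryptionConfigurationNotFoundError":
            print(f"FLAGGED {name}: no default encryption")
        else:
            raise
```

Run on a schedule, a script like this turns a periodic audit chore into a continuous control, which is the shift toward automation Plato advocates.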
“Service providers want you to stay on their platform,” he said. “They want you to consume services there. Anything you can get out of the platform – and compliance and security are the most important things – they’re there to help.”
But, he added, “They won’t come to you. You have to ask.”
The recent rush to build apps in the cloud is likely hitting hardest the employees who built the on-premises apps in the first place. People are attached to what they have built, Plato said. The concern is that they lose standing when the application they developed becomes a cloud service and they discover that their expertise, cultivated over years, is no longer relevant.
And while some worry that the cloud offers less control, it can arguably offer much more. “It’s finer, more granular control,” Plato said. “People are the weakest link. You have to build around scripting and DevOps, where humans are less part of the equation.”
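As a sketch of what taking humans out of the loop can look like (my illustration, not Plato’s), a scripted audit can enforce a granular control repeatably, with no one clicking through a console. This hypothetical example uses boto3 to flag IAM users who have no MFA device enrolled.

```python
# Illustrative scripted control: flag IAM users without an MFA device.
import boto3

iam = boto3.client("iam")

# Paginate through every IAM user in the account.
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        name = user["UserName"]
        if not iam.list_mfa_devices(UserName=name)["MFADevices"]:
            print(f"FLAGGED {name}: no MFA device enrolled")
```

Embedded in a CI pipeline or scheduled job, a check like this runs identically every time, which is what makes the control “finer” and less dependent on human diligence.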
While regulations and compliance standards have certainly become more complex, there are also mature technologies well suited to new remote security requirements.
Remote access technologies can constrain and limit the scope of an attack when a device is compromised, Plato said. Extended detection and response (XDR) technologies can detect and monitor not only an attack itself, but also attacker reconnaissance and the user behavior that contributes to a compromise, he added.
The two technologies work together to protect the endpoint.