It’s widely agreed that the rise of cloud services over the last few years has been one of the most significant developments in the history of the web. Cloud services power faster innovation through flexible resources and unprecedented economies of scale - and, according to the Netskope Cloud Report, they now account for 85% of all enterprise web traffic.
However, as more and more cloud services enter the enterprise environment, complexity increases for users and IT departments alike. Despite years of security advances, many businesses remain sceptical of how well the cloud can keep their organisation’s data secure; given the frequency of reports about compromised cloud data, these are valid concerns.
Defining cloud services vs the web
The line between cloud services and the broader web is a blurry one. A cloud service is generally defined as one designed for multi-tenancy, requiring a login for access and providing the ability to store and process data. So this category includes Google Apps and Microsoft Office 365, alongside social platforms like Facebook, Twitter and YouTube.
At first glance it may seem a broad definition, encompassing both enterprise and consumer services, apps likely to be sanctioned and those more likely to be found within shadow IT. But it’s worth looking at this group as a whole, because these services share significant commonalities and all of them drive very different usage patterns from more traditional web use.
The primary difference is that while traditional web traffic is primarily inbound and characterised by a limited set of methods, cloud service traffic is bi-directional and API-driven: cloud services offer users a broader set of interactions, and they encourage and enable data to move out of an organisation as often as it moves in - which means that the risks presented by cloud services are broader too.
Understanding the risks
For traditional web traffic, the Cloud Report identified that the top three policy violations are acceptable use violations, malicious site violations and malware detections - plenty of external threats, in other words. For cloud services, the top threats are all internal: data protection violations, cloud activity policy violations and anomalous activity violations top the list.
If we add to this picture our own research, which shows that nine in ten enterprise devices in use today are mobile, we add another layer of complexity and concern for security teams. Mobile devices are disconnected from a local network for more than half of the time they are used; they are often connected to unsecured public or third-party WiFi; and their widespread adoption within the enterprise was fuelled by bring your own device (BYOD) models, which means the device itself is often not owned by the IT team. These factors can all make it more challenging to control unintended data leakage through apps or services.
It’s clear that the traditional idea of perimeter security is fast becoming obsolete in this cloudy world without borders. So, how can organisations address these new risks?
Moving DLP to the cloud
It is important to recognise that data leakage risks can increase with cloud use if cloud infrastructure is not properly architected and secured. It seems that not a day goes by without another data leak coming to light, yet none of them seems enough to serve as a rallying cry for change. Leaky AWS S3 buckets are probably the most common crime scene for unsecured data in the cloud, and the number of breaches reported because of data sets left in publicly accessible cloud buckets is worrying.
The average cost of a data breach in the U.S. is $3.86m - and for large companies, the cost can be significantly higher. Punitive fines for legislative breaches can push the total far higher still, and on top of this, the reputational damage of a high-profile data loss incident can be irrecoverable.
However, cloud presents solutions as well as problems. As data - and the risk of data leakage - has moved into the cloud, so too have effective automated data protection strategies. Cloud storage services can be scanned for any sensitive content that is publicly exposed, and a continuous security assessment of the IaaS environment can detect and remediate misconfigurations. For example, we use the simple rule ‘ensure S3 bucket is not publicly accessible’ to detect leaky buckets; equally, the ransacking of records from misconfigured Elasticsearch databases can be prevented with a simple rule that alerts if TCP port 9200 is left open to the whole internet.
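To make those two rules concrete, here is a minimal sketch of how they might be automated against an AWS account using boto3. It assumes AWS credentials are already configured, and the alert() helper and alerting approach are illustrative placeholders rather than part of any particular product.

```python
"""Minimal sketch of the two example rules above, using boto3.

Assumes AWS credentials are configured; alert() is a placeholder
for a real ticketing or SIEM integration.
"""
import boto3
from botocore.exceptions import ClientError

ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"


def alert(message: str) -> None:
    # Placeholder: wire this to your alerting or remediation pipeline.
    print(f"[ALERT] {message}")


def check_s3_public_access() -> None:
    """Rule: 'ensure S3 bucket is not publicly accessible'."""
    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # An ACL grant to AllUsers means the bucket is world-accessible.
        acl = s3.get_bucket_acl(Bucket=name)
        if any(g.get("Grantee", {}).get("URI") == ALL_USERS for g in acl["Grants"]):
            alert(f"S3 bucket '{name}' grants access to AllUsers")
        # A missing or incomplete Public Access Block is also a red flag.
        try:
            cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
            if not all(cfg.values()):
                alert(f"S3 bucket '{name}' has an incomplete public access block")
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                alert(f"S3 bucket '{name}' has no public access block configured")


def check_elasticsearch_exposure() -> None:
    """Rule: alert if TCP 9200 is open to the whole internet in any security group."""
    ec2 = boto3.client("ec2")
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for perm in group.get("IpPermissions", []):
            # Note: 'all traffic' (-1) rules are ignored in this simple sketch.
            covers_9200 = (
                perm.get("IpProtocol") == "tcp"
                and perm.get("FromPort", -1) <= 9200 <= perm.get("ToPort", -1)
            )
            open_to_world = any(
                r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", [])
            )
            if covers_9200 and open_to_world:
                alert(f"Security group {group['GroupId']} exposes TCP 9200 to 0.0.0.0/0")


if __name__ == "__main__":
    check_s3_public_access()
    check_elasticsearch_exposure()
```

In practice, checks like these would run continuously and feed a remediation workflow rather than simply printing alerts.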
Technology alone is not enough; with the majority of cloud security breaches caused by human error, user education is an important component of addressing cloud security issues. Regular employee security training is essential at all levels to mitigate the human negligence that results in unsecured login credentials, publicly accessible data, misconfigurations and unpatched vulnerabilities.
Beyond the basics for DLP
Best practice in DLP requires more than the basics (which are analogous to making sure you lock the front door when you leave the house). Today, companies often implement inconsistent, ad-hoc practices and technologies - indeed, it’s this inconsistency which leads to a lack of visibility into data assets and ultimately to weak data security. Best practice combines the best technology, clearly defined processes, knowledgeable staff and wider employee awareness. Organisations need staff with DLP expertise who can evaluate a range of different data types and assess their value to the business.
This involves classifying relevant data in terms of where the data is stored, how it is handled and whether or not it is sensitive. Technology can automate the auditing and classifying of data based on its content and context - intelligently deciding into which category the information falls. After classifying data, the next step is to create (or update) policies for handling different categories of data, which adhere to both government legislation and the needs of the individual organisation.
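As an illustration of the classification step, the sketch below assigns a document to a sensitivity category based on simple content patterns alongside its storage location. The categories, regular expressions and sample data are hypothetical; real DLP engines combine far richer content inspection with context such as file ownership and sharing status.

```python
"""Minimal sketch of content-based data classification.

Categories, patterns and the sample document are illustrative
assumptions, not a production classifier.
"""
import re
from dataclasses import dataclass

# Hypothetical sensitivity categories and the patterns that suggest them.
PATTERNS = {
    "payment_data": re.compile(r"\b(?:\d[ -]?){13,16}\b"),        # card-like numbers
    "personal_data": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
    "credentials": re.compile(r"(?i)\b(password|api[_-]?key)\s*[:=]"),
}


@dataclass
class Classification:
    path: str        # where the data is stored (context)
    category: str    # which policy bucket it falls into
    sensitive: bool


def classify(path: str, content: str) -> Classification:
    """Assign a document to the first matching category, else 'general'."""
    for category, pattern in PATTERNS.items():
        if pattern.search(content):
            return Classification(path=path, category=category, sensitive=True)
    return Classification(path=path, category="general", sensitive=False)


if __name__ == "__main__":
    sample = "Contact jane.doe@example.com, password: hunter2"
    print(classify("shared-drive/notes.txt", sample))
```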
After this, policies need to be administered. The best DLP enforcement technologies provide options for handling potential security breaches. For example, if an employee is uploading potentially sensitive data to a cloud storage service (whether the cloud provider is sanctioned by IT or not) a context aware system can prompt an advisory pop-up message, or even block the action entirely. It’s all about ensuring that employees know how to adhere to these policies and the importance of doing so.
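Building on the hypothetical categories above, the following sketch shows the kind of decision logic such a context-aware system might apply to an upload: allow it, show an advisory (coaching) prompt, or block it outright. The sanctioned-service names and rules are illustrative assumptions, not any vendor's actual policy engine.

```python
"""Minimal sketch of a context-aware upload policy decision.

Service names and rules are illustrative placeholders; 'coach'
stands in for an advisory pop-up shown to the user.
"""

SANCTIONED_SERVICES = {"corporate-onedrive", "corporate-box"}  # hypothetical names


def decide_action(category: str, sensitive: bool, destination: str) -> str:
    """Return 'allow', 'coach' (advisory prompt) or 'block' for an upload."""
    if not sensitive:
        return "allow"
    if destination in SANCTIONED_SERVICES:
        # Sensitive data to a sanctioned service: allow, but remind the
        # user of the handling policy for this category.
        return "coach"
    # Sensitive data to an unsanctioned service: block the upload.
    return "block"


if __name__ == "__main__":
    print(decide_action("payment_data", True, "personal-dropbox"))  # block
    print(decide_action("personal_data", True, "corporate-box"))    # coach
    print(decide_action("general", False, "personal-dropbox"))      # allow
```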
Understand that clouds are more grey than black and white
An ‘allow or block’ approach to cloud services is pretty common for businesses of all sizes, but this binary strategy could be ineffective, as malicious actors are increasingly exploiting trusted or whitelisted services.
Indeed, while many organisations see whitelisting as a savvy approach (allowing only a few approved applications and blocking everything else), it is not always as safe as it seems. Just because you trust an application, that doesn’t mean hackers won’t also use it to get an easy ride into your systems.
Malicious actors regularly host their command-and-control infrastructure in AWS and other cloud services that are often not just whitelisted by businesses, but actually seen as mission critical. And of course, this approach doesn’t protect adequately against insider threats: employees (perhaps disgruntled ones about to leave the business) who have access to sanctioned applications and are able to exfiltrate data from them.
What’s needed is a more nuanced, context-aware approach, which can spot anomalies in usage patterns and data access. Simply put, if something unusual is happening in a whitelisted application, it needs to be identified and stopped - quickly.
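The sketch below illustrates one simple form such an anomaly check could take: comparing a user's download volume from a sanctioned app against their own historical baseline. Real platforms use far richer behavioural models; the threshold and figures here are illustrative assumptions.

```python
"""Minimal sketch of a usage-anomaly check for a sanctioned app.

Flags a day whose download volume sits far above the user's own
baseline; threshold and data are illustrative assumptions.
"""
from statistics import mean, stdev


def is_anomalous(history_mb: list[float], today_mb: float, z_threshold: float = 3.0) -> bool:
    """Flag today's volume if it is more than z_threshold deviations above baseline."""
    if len(history_mb) < 2:
        return False  # not enough history to judge
    baseline, spread = mean(history_mb), stdev(history_mb)
    if spread == 0:
        return today_mb > baseline
    return (today_mb - baseline) / spread > z_threshold


if __name__ == "__main__":
    usual = [40.0, 55.0, 38.0, 60.0, 47.0]  # daily downloads in MB
    print(is_anomalous(usual, 52.0))    # False: within the normal range
    print(is_anomalous(usual, 900.0))   # True: possible bulk exfiltration
```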
Reimagining the perimeter
Security professionals are realising that they require a new architecture to secure the multiple perimeters enterprises are charged with protecting. Networking and security-as-a-service capabilities need to be consolidated into a cloud-delivered Secure Access Service Edge (SASE).
SASE provides low-latency, direct-to-cloud connectivity, brokering access between end users, cloud services and applications. It understands data context and converges different types of network and network security controls (for example, secure web gateway and cloud access security broker). Gartner has predicted that SASE will be as disruptive to network and network security architectures as IaaS was to data centre design.
Ultimately, securing the cloud requires appropriate automation alongside user education to limit human error, as well as next-generation technology that understands context and makes it possible to build much more nuanced data policies that better reflect the complexity of cloud service usage.