One of the most immediate repercussions of the heightened regulatory environment created by the General Data Protection Regulation, the California Consumer Privacy Act, and similar legislation pending in most states is the prioritization of cybersecurity.
This legislation makes data breaches mean far more than the potential loss of information that yields competitive advantage or trade secrets. Today, breaches mean staggering financial penalties, considerable reputational damage, and legal issues that threaten continued operation.
The gravity of these considerations is only reinforced by the fact that they exist across verticals. It’s not just large enterprises dealing with these realities, but any business with even a modicum of customer or transactional data.
Consequently, it’s incumbent upon organizations to fortify their security as much as possible to minimize data breaches. Modern approaches to some of the most time-honored methods of protecting data assets deliver significant value to this endeavor. Some of the more effective include:
• Browser-based isolation: There are numerous methods for implementing isolation techniques to reinforce cybersecurity. Browser-based isolation inserts a protective layer between organizations and the internet to mitigate most forms of cyberattacks.
• Micro-segmentation: Micro-segmentation applies isolation at a finer grain, restricting communication to individual workloads or applications. Notable advances in this space include the deployment of software-defined perimeters that buttress the application layer.
• Triple attributes: Triple attributes create an additional means of protecting data within its repositories based on enterprise-defined specifications. They’re a sophisticated means of ensuring users can’t circumvent conventional security measures to access data.
Collectively, these three measures provide stalwart protection at the conventional perimeter, application and data layers. Organizations implementing all of these methods will greatly reduce their exposure to external threats, remaining vulnerable perhaps only to “inside job” exfiltration.
Cybersecurity at a glance
The core principles of browser-based isolation are derived from a cursory review of the cybersecurity landscape of the past couple of decades. “If you look at cybersecurity in general for the last 20 years, every single piece of technology ever produced and in operation today is around the basis of determining whether something is good or bad, and then acting upon it,” said Menlo Security Chief Technology Officer Kowsik Guruswamy. “They all try to see whether the link you clicked on is bad, the website that you went to is bad, the PDF that you’re trying to download is good or bad. Then they have two levers: allow or block.”
Browser-based isolation techniques invert this approach with a number of formidable assumptions. First, they assume this deterministic stance is innately flawed. The rash of security breaches in the past several years seems to attest to this viewpoint, as does the increasing sophistication of phishing, zero-day and lateral movement attacks. Moreover, browser-based isolation effectively assumes that almost all material encountered on the web is suspicious and takes protective measures accordingly. “By doing browser isolation in the cloud, we are not ever trying to determine whether a web site is good or bad,” Guruswamy said. “We’re simply taking everything that’s active, which ultimately results in malware, and executing the entire web session in our browser. That way we have really stopped playing this detection game, which means you don’t have false positives, false negatives; all of that mess just goes away.”
Browsers on demand
The architecture for browser-based isolation is particularly reassuring from a security perspective. These solutions effectively sit between organizations and the internet so there’s a “sort of browser on demand if you will,” Guruswamy said. “That way we sort of create this air gap so there’s a virtual separation.” Thus, users don’t visit the internet directly, but do so via a secure browser spun up on demand in the cloud. Any potential malware encountered affects the provider’s browser and resources — not the consumer’s. The severity of such occurrences, however, is sharply limited. According to Guruswamy, if a browser becomes infected “it’s only that browser that’s going to be affected by it, and it gets thrown away.” Browsers are constantly created in the cloud on users’ behalf and discarded when their web sessions close, meaning malware doesn’t have a substantial opportunity to persist or move laterally to affect other users on the platform.
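The disposable-browser pattern described above can be sketched in a few lines. This is a minimal illustration of the concept, not Menlo Security’s implementation; the class and function names are invented for the example.

```python
import uuid

class EphemeralBrowser:
    """Illustrative stand-in for a cloud-hosted, disposable browser.

    Every web session gets a brand-new instance, so nothing picked up
    during one session -- cookies, cache or malware -- survives to
    affect other users or later sessions.
    """

    def __init__(self):
        self.session_id = uuid.uuid4().hex  # unique per session
        self.state = {}                     # cookies, cache, downloads, ...

    def render(self, url):
        # In a real isolation platform the page executes here, in the
        # cloud; only a safe visual stream reaches the end user.
        self.state[url] = "rendered remotely"
        return f"safe-render-of:{url}"

    def destroy(self):
        # Throw the whole browser away, infected or not.
        self.state.clear()

def isolated_visit(url):
    browser = EphemeralBrowser()   # fresh browser per session
    try:
        return browser.render(url)
    finally:
        browser.destroy()          # nothing persists after the session
```

The key design point is in `isolated_visit`: teardown is unconditional, so there is no need to detect whether the session was compromised before discarding it.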
The application-level security approach of software-defined micro-segmentation similarly limits, if not outright obliterates, the possibility of lateral movement attacks for intruders. This method deftly replaces the perimeters of conventional firewalls or Virtual Private Networks with those of individual applications. According to 451 Research analyst Eric Hanselman, the former is simply no longer relevant in today’s distributed data landscape. “Work applications have been scattering themselves to the four winds for a while,” Hanselman maintained. “The difficulty is that those that were living in that happy, [conventional] perimeterized world were running into a bit of delusion.” Competitive software-defined perimeter solutions utilize several measures to ensure applications can communicate with each other remotely without any knowledge they’re connected. As such, attackers have few means of even knowing data are being transmitted, let alone accessing them. Those measures include:
• Micro-tunnels: Micro-tunnels are the essence of contemporary software-defined perimeter solutions that connect applications. DH2i CEO Don Boxley mentioned that with their discreet communication, they “don’t give a user a slice of my network. They just have a secure tunnel between two points. As a result, if somebody takes over the machine, they don’t have the opportunity to do a lateral movement attack.”
• Random port generation: The tunnels on both sides are connected through a matchmaker service in the cloud that randomly generates ports for the tunnels to use. This way, attackers can’t home in on specific ports. Once the ports are closed after connections are made, the tunnels become all but invisible.
• User Datagram Protocol (UDP): UDP is a more obscure protocol for transmitting data than the traditional Transmission Control Protocol (TCP). By adjusting UDP to include packet correction capabilities, micro-tunnels are “faster; there’s less latency,” Boxley said. “Because people don’t use it a lot, the bad guys haven’t figured it out because they don’t see it that often.”
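The combination of a point-to-point UDP tunnel on a randomly assigned port can be sketched with the standard library. This is a loopback toy, not DH2i’s product: binding to port 0 makes the operating system pick a random free ephemeral port, standing in for the matchmaker service’s random port generation, and the function names are illustrative.

```python
import socket

def open_micro_tunnel():
    """One endpoint of a point-to-point UDP 'micro-tunnel'.

    Binding to port 0 lets the OS choose a random free ephemeral
    port, so attackers can't camp on a well-known, predictable one.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", 0))      # 0 = pick a random free port
    port = sock.getsockname()[1]     # the port a matchmaker would share
    return sock, port

# Endpoint A opens its side of the tunnel on a random port.
server, port = open_micro_tunnel()

# Endpoint B, told the port out-of-band, sends a datagram over UDP --
# no slice of the network is granted, just this one point-to-point link.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"app-to-app payload", ("127.0.0.1", port))
data, _ = server.recvfrom(4096)

# Close both ends: the port stops answering and the tunnel "disappears".
client.close()
server.close()
```

Note that plain UDP is unreliable and unencrypted; the commercial tunnels described here layer packet correction and DTLS encryption on top of it.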
The data layer
Software-defined perimeter transmissions also guard information at the data layer by utilizing Datagram Transport Layer Security (DTLS) encryption and Public Key Authentication. Fortifying information assets at the data layer is likely the most dependable method of protecting them, because it’s the layer in which the data are actually stored. It’s important to distinguish data layer security from access layer security. The latter involves a process known as security filtering, in which users can access data based on particular roles or responsibilities. “You can specify filters where for a particular user or a particular role whether you could see or not see particular [data],” Franz CEO Jans Aasman said. “You could say if someone has the role administrator, we’re telling the system ‘administrators cannot see [certain data]’.”
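Security filtering of the kind Aasman describes can be sketched as a role-based rule table consulted before data is returned. This is a generic illustration, not Franz’s filtering API; the rule blocking administrators from patient data is a hypothetical example.

```python
def can_see(user_role, record, filters):
    """Illustrative access-layer security filter: a per-role rule
    table decides which record categories a user may view."""
    blocked = filters.get(user_role, set())
    return record["category"] not in blocked

# Hypothetical rule: administrators cannot see patient records.
FILTERS = {"administrator": {"patient_data"}}

can_see("administrator", {"category": "patient_data"}, FILTERS)  # False
can_see("analyst", {"category": "patient_data"}, FILTERS)        # True
```

Because this check sits architecturally above the storage layer, anything that bypasses the filtering code bypasses the protection, which is the gap triple attributes are meant to close.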
Data layer security facilitated by triple attributes greatly enhances access-based security by putting security mechanisms into the database in which the data reside. This method is enabled by smart data technologies in which data are described by triples, or semantic statements. When triples are imbued with key-value pairs related to security, only users with the requisite security credentials can view the data. Keys include security clearance level, department, and so on; values are the different security levels and departments within the organization. This methodology has plentiful advantages, including:
• Innermost layer security: Unlike security filtering that’s architecturally on top of storage layers, triple attribute security occurs “at the core of the graph database,” Aasman remarked. The dual mechanisms complement each other; triple attributes provision data access where the data reside for maximum security.
• Interminable triples: Users can attach as many attributes as are practically possible to data assets. Even if users have the credentials to satisfy some attributes, they can’t view the data unless they satisfy all the key-value pairs.
• Arbitrariness: The flexibility of the key-value pair format means the choice of pairs is entirely arbitrary. Organizations can leverage those that make the most sense for them, their use cases, or business unit needs.
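The all-or-nothing matching described in the list above can be sketched as follows. This is a simplified model, not AllegroGraph’s actual attribute machinery (which, for instance, can order clearance levels rather than requiring exact matches); the triple, attribute names and values are invented for the example.

```python
# A triple annotated with security attributes stored alongside the data.
triple = {
    "subject": "patient:42",
    "predicate": "hasDiagnosis",
    "object": "diagnosis:x",
    "attributes": {"clearance": "secret", "department": "oncology"},
}

def user_can_view(user_attrs, triple):
    """A user sees the triple only if their credentials satisfy EVERY
    security attribute attached to it -- a partial match fails."""
    return all(
        user_attrs.get(key) == value
        for key, value in triple["attributes"].items()
    )

user_can_view({"clearance": "secret", "department": "oncology"}, triple)  # True
user_can_view({"clearance": "secret", "department": "billing"}, triple)   # False
```

Because the attributes travel with the triple inside the database, the check cannot be sidestepped by querying through a different application layer.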
Moreover, triple attributes can be based on compliance needs specific to regulations — which is immensely utilitarian in the post-GDPR data landscape. “For the government you could have a feature of whether you’re a foreigner or not,” Aasman said. “HIPAA doesn’t care whether you’re a foreigner or not, but you can do a separate mechanism for it.”
The coming decade will be characterized by mounting regulatory demands. Organizations can mitigate regulatory consequences pertaining to security breaches by safeguarding data at the perimeter layer, the application layer and the data layer. Prudent firms will learn how to coalesce these approaches for their own particular needs to secure their data assets and adhere to regulations. Browser-based isolation, software-defined perimeters, and triple attribute security can directly ensure they meet these goals well into the future.