January 24th, 2010 | Information Security
[ Check out my latest post on the HP Security Blog: “The Secure Web Series, Part 2: How to Avoid User Account Harvesting” ]
The recent AURORA attack is about to change how the web browser is handled within the enterprise. This sort of exposure will introduce a concept my friend Steve Crapo has been talking about for years: virtualized, isolated browser farms.
In the beginning of Internet commerce, web-facing servers were located on the same network segment as protected internal resources, such as database servers, HR systems, etc. This was demonstrated to be a universally stupid idea, and the concept of the DMZ was born and propagated as a standard architectural practice.
The same is about to happen for web browsers within corporate networks. It will soon be considered unacceptable to have regular web clients sitting on the same network as protected systems–or even on a network with access to those systems.
In the near future, all web browser interaction with the Internet will be done virtually–from a segmented, virtualized network with multiple layers of protection between the browsing network and the Internet. Some of these will include:
- state-of-the-art proxying and real-time whitelisting/blacklisting
- sandboxing to isolate the browser from the OS
- application/executable whitelisting on the browser OS
- regular patching of all browsing VMs (near-immediate)
- regular snapshot restores of browsing VMs to known-best state
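The proxy-filtering layer above amounts to a policy check on every requested host. As a minimal sketch (the patterns and hostnames here are hypothetical, and a real enterprise proxy would pull its whitelist from a live feed rather than a hard-coded list):

```python
from fnmatch import fnmatch

# Hypothetical whitelist of host patterns an enterprise proxy might allow.
WHITELIST = ["*.example.com", "intranet.corp", "updates.vendor.net"]

def is_allowed(host, whitelist=WHITELIST):
    """Return True if the requested host matches any whitelisted pattern."""
    return any(fnmatch(host, pattern) for pattern in whitelist)
```

A request to `www.example.com` would pass, while `malware.example.org` would be blocked and could be logged for review.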
So once a user ends an Internet web session, that VM will be restored to its known-best state and returned to the pool of available browser sessions. And when patches are released, they will be applied to the pool browsers immediately after testing. Ideally, within a couple of hours of a patch for IE, Reader, or Flash being released, the browser pool could be updated with those changes.
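The checkout/restore lifecycle described above can be modeled in a few lines. This is a sketch under stated assumptions: the `BrowserVM` and `BrowserPool` classes and the "known-best" snapshot name are hypothetical, standing in for whatever hypervisor API an enterprise actually uses:

```python
class BrowserVM:
    """Hypothetical browser VM that can be reverted to a known-best snapshot."""
    def __init__(self, name, snapshot="known-best"):
        self.name = name
        self.snapshot = snapshot
        self.state = snapshot      # VM starts at the clean snapshot
        self.dirty = False

    def browse(self, url):
        self.dirty = True          # any Internet session may compromise the VM
        self.state = "in-use"

    def revert(self):
        self.state = self.snapshot # restore to the known-best snapshot
        self.dirty = False

class BrowserPool:
    """Pool of isolated browser VMs; every VM is reverted before reuse."""
    def __init__(self, size):
        self.available = [BrowserVM(f"browser-{i}") for i in range(size)]

    def checkout(self):
        return self.available.pop()

    def release(self, vm):
        vm.revert()                # assume compromise: restore before pooling
        self.available.append(vm)
```

In this model, patching means updating the known-best snapshot itself, so every VM reverted back into the pool picks up the fix automatically.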
Using this model, the enterprise will rightly begin to assume that all Internet browsers are compromised, and will take aggressive steps to ensure that users always touch the Internet through highly filtered, constantly restored systems that cannot interact with protected internal assets.
This will present a number of challenges, not the least of which is that the web browser today is used to transition seamlessly between intranet and Internet interaction. An extra step will be required to get content obtained from the Internet browser farm to user workstations, but over time this challenge will be considered small compared to that of handling browser-vectored malware on the internal LAN.