January 10, 2017 / Simon Crosby

Dear Washington – My Advice on Making America (Cyber) Safe Again


  • In part one, I shared my story of giving advice to the Trump Transition Team.
  • My bottom-line advice: “Move everything to the cloud. Fast.”
  • Then use virtualization to protect what matters most.

First, some clarity on the term “cloud”.

I’m heavily biased toward the adoption of public cloud services wherever regulations permit. The three major public clouds, AWS (including GovCloud), Azure and Google Cloud Platform, are the best known, but there are many others: a host of government contractors operate FedRAMP-accredited facilities, and SaaS application offerings from major players also count as public cloud services. Public clouds operate at a degree of scale and automation that yields cost savings, dependability and security no single enterprise can achieve on its own.

Where use of a public cloud is not possible, I strongly recommend private cloud infrastructure. VMware is clearly the infrastructure leader, but Microsoft, Citrix and Red Hat play important roles. In particular, adopt VMware NSX for network micro-segmentation: it is a key enhancement that improves isolation, and therefore the security, of private cloud infrastructure.
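To make the micro-segmentation point concrete, here is a minimal sketch of the default-deny, per-workload policy model that a distributed firewall enforces. This is not the NSX API; the tier names, rule fields and helper function are illustrative assumptions only.

```python
# Conceptual micro-segmentation policy: per-workload, default-deny rules.
# Tier names and fields are illustrative, not tied to any vendor API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    source: str       # workload tag or "any"
    destination: str  # workload tag or "any"
    port: int         # 0 means "any port"
    action: str       # "allow" or "deny"

# Only the flows the application actually needs are permitted; everything
# else, including lateral movement between workloads, is dropped.
policy = [
    Rule("web-tier", "app-tier", 8443, "allow"),
    Rule("app-tier", "db-tier", 5432, "allow"),
    Rule("any", "any", 0, "deny"),
]

def is_allowed(src: str, dst: str, port: int) -> bool:
    """First matching rule wins, mirroring a typical distributed-firewall model."""
    for r in policy:
        if r.source in (src, "any") and r.destination in (dst, "any") and r.port in (port, 0):
            return r.action == "allow"
    return False

print(is_allowed("web-tier", "app-tier", 8443))  # True
print(is_allowed("web-tier", "db-tier", 5432))   # False: no rule allows it
```

The property that matters is the final deny: a compromised workload cannot reach anything that has not been explicitly allowed.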

Private cloud offerings from other vendors exist too, and should be considered carefully. At its simplest, the mandate should be interpreted as “Get out of owning/operating data centers, servers, networks and storage as much as possible, and focus on areas of core competence – your agency’s applications”.

IT organizations across the government need to focus (only) on the applications their personnel need to successfully execute their mission, from email and browsing on PCs to complex back-end ERP logistics apps. Users need access to their applications from endpoints across the globe. Cloud-hosted IAM solutions are fundamental to secure multi-factor authentication and access control, and delivery needs to be secured at all times over state-of-the-art encrypted links. Of course, application delivery and execution technologies have changed enormously over time.
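As one hedged illustration of the IAM point, here is an AWS-style identity policy that denies every action unless the caller authenticated with multi-factor authentication. The statement name is mine, other clouds express the same control differently, and the policy is shown as a Python dict only for readability.

```python
# AWS-style IAM policy (expressed as a Python dict): deny everything unless
# the caller's session was established with MFA. The Sid is illustrative.
require_mfa_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAllWithoutMFA",
            "Effect": "Deny",
            "Action": "*",
            "Resource": "*",
            "Condition": {
                "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
            },
        }
    ],
}
```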

My recommendation to adopt cloud is literal: it comprehensively includes both user workspaces and the hosting of back-end application servers. By adopting cloud, IT can automate the lowest-value operational tasks and focus instead on high-value, mission-centric outcomes for users.

Let’s start with the more familiar use of cloud: replacing traditional on-prem, data-center-based back-end services. Below I’ve tried to picture the revolution in back-end service architectures that is under way.

Ditch data centers for the cloud.

I was shocked to hear recently from an IT Pro in the DOD that virtualization is a new technology.

It isn’t.

Legacy attitudes keep applications hostage, tied to infrastructure that is fed by an army of IT pros. Very few enterprise applications cannot be run more reliably in a private cloud, and every private cloud needs to enforce network micro-segmentation to boost security. New enterprise applications must be developed using the latest techniques for micro-virtualization, micro-services/containerization and micro-segmentation: the app is composed of ephemeral, short-lived micro-services, which are fundamental to scalability, availability and security.

Every invocation creates new components from known-good images that are discarded when they are no longer needed, typically within a few seconds or minutes. Short-lived micro-services are harder to attack, and self-remediate if attacked. Finally, in the public cloud, lambda architectures are being adopted to deliver massive scalability and security for IoT/sensor-driven applications and other event-based workflows. Examples include AWS Lambda, Azure Functions and Google Cloud Functions.
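Here is a minimal sketch of what such an ephemeral, event-driven component looks like, in the AWS Lambda style; the handler body and event fields are illustrative assumptions rather than a real agency workload.

```python
# Minimal AWS Lambda-style handler: each invocation runs in a fresh,
# short-lived execution environment built from a known-good image and
# discarded once the work is done, so there is nothing durable to infect.
import json

def handler(event, context):
    # Illustrative: summarize whatever event records triggered this run.
    records = event.get("Records", [])
    processed = [r.get("eventName", "unknown") for r in records]
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

Because nothing about the component persists between invocations, an attacker who compromises one run gains no durable foothold.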

Cloud and virtualization for the desktop.

Over 90% of enterprise breaches start with a user mistake and a “click”. End-user desktops can be delivered more securely from the cloud (whether private or public), using VDI, than on traditional PCs or laptops. But today’s VDI desktops benefit primarily from automation of patching, app deployment and delivery, not from better security. VDI endpoints are no more secure than PCs: malware hides in the user profile and reappears when the VDI image is rebuilt. The US Intelligence Advanced Research Projects Activity (IARPA) recently concluded that “…a general purpose Windows desktop in a VM is not adequate to protect sensitive government users or workloads of the future.”

What’s needed is a desktop infrastructure (both cloud-hosted and native) with security built in, to massively up-level our resilience to attack and get us out of the stale rut of vacuous promises from security vendors that try to protect the endpoint by detecting threats.

Fortunately, the world’s most secure workspace already exists. It is a hosted virtual desktop with nested micro-virtualization, and it has been widely deployed in some of the most security-critical US agencies. This desktop architecture is massively secure by design and has not suffered a single breach since it was deployed, despite daily attacks by “sophisticated nation state attackers”.

No more excuses.

Integrating core security capabilities into the desktop, rather than bolting on failed technologies like AV and DLP as an afterthought, dramatically reduces the complexity of IT infrastructure management. Here’s why: every time the user logs on, they use a known-good device or hosted desktop that automatically uses virtualization to protect each app, all data, credentials and networks, by design. And when they log off, everything is discarded, ready to start fresh at the next login.

Of course there will always be a mix: users need secure devices to access the web and ‘remoted’ applications or desktops, and also need to run fat-client applications. But the same technologies that make the cloud fundamentally more secure can transform the security of PCs and mobile devices.

Micro-virtualization plays a fundamental role in securing devices by design, enabling users to click on anything without risk of a breach. Devices that depend on legacy software, or that are unpatched, are no longer vulnerable to attack. Automation and integration into the user workflow are key: each user task or application that accesses content at a different trust level is automatically and instantly hardware-isolated in a container that enforces least privilege.
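To illustrate the idea, here is a conceptual sketch only: the function names are hypothetical, and real micro-virtualization platforms make this decision inside the platform rather than in application code. The policy amounts to routing every task that touches untrusted content into a disposable, least-privilege micro-VM.

```python
# Hypothetical sketch of per-task isolation by trust level. Real platforms
# implement this transparently; the names below are illustrative only.
from enum import Enum

class Trust(Enum):
    TRUSTED = "trusted"      # e.g. signed internal apps and documents
    UNTRUSTED = "untrusted"  # e.g. web pages, email attachments, USB files

def spawn_micro_vm(task: str) -> str:
    # Stand-in for the platform call that creates a disposable,
    # hardware-isolated micro-VM with least privilege for this one task.
    return f"{task}: opened in a throwaway micro-VM (no credentials, no intranet)"

def open_in_host(task: str) -> str:
    return f"{task}: opened in the normal desktop session"

def open_content(task: str, trust: Trust) -> str:
    """Route each task by trust level; untrusted content never touches the host."""
    if trust is Trust.UNTRUSTED:
        return spawn_micro_vm(task)
    return open_in_host(task)

print(open_content("invoice.pdf from external email", Trust.UNTRUSTED))
print(open_content("internal HR portal", Trust.TRUSTED))
```

When the task ends, the micro-VM is discarded, which is exactly the “discard and start fresh” property described above.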
