In a BYOD world, this approach is compelling. By hosting the desktop, IT owns a virtualised, generic hardware environment yet can supply it to a variety of devices: smartphones, tablets, Linux PCs and even smart TVs. Those smart TVs, in turn, could serve high-end, off-site conferences in rented facilities, or stand in as a cheaper alternative to dedicated conference room systems.
Kepler's promise - Supplying tomorrow's cloud services
Like most of Nvidia's presentations, this one from CEO Jen-Hsun Huang was visually stunning, covering the company's new high-end GeForce GTX 690 graphics card and boasting of tenfold performance improvements.
To demonstrate this prowess, the company first used its previous-generation technology to render the birth of the universe, and then used Kepler to render the collision of two galaxies, our Milky Way and Andromeda. The power required to render this destruction was the proof point for Nvidia's 10x performance claim, and, as you might imagine, it was impressive and a little humbling. (Don't worry: the collision isn't due for about 3.5 billion years, so you have plenty of time to pack and ponder the insignificance of human life.)
The big news for the enterprise, of course, isn't the graphics performance increase for an individual user. Rather, it's the fact that the Kepler processor is designed to supply cloud services, is branded VGX and includes a gaming counterpart called GeForce GRID. This means it's been tuned to provide virtualised desktops, and to do so efficiently.
On stage, Cisco showed its server based on Kepler technology. As a networking vendor, Cisco is uniquely positioned to provide a system that will require not only specialised servers but tuned networking hardware that can function with low enough latency and sufficient bandwidth to scale to enterprise levels. Vendors not on stage, but with their own plans to have hardware or services tied to Kepler, include Dell, IBM, HP, Supermicro and Amazon. In addition, Citrix, Microsoft, VMware and Xen plan to release supporting software. These are some of the most powerful names in technology, which suggests that change is in the wind.
Virtual desktop hosting beats Wild West of unregulated BYOD
Unlike earlier efforts, which provided either low performance or a dedicated one-to-one server-to-desktop mapping, hardware built on the Kepler processor is designed to auto-scale to the required performance level. It can move dynamically between shared and dedicated resources as processing loads range from light email to full-on CAD or media editing, with the result displayed on everything from traditional desktop monitors to iPads and smart TVs. That could be the ideal solution for IT managers struggling to meet the conflicting needs of a BYOD world while still providing a consistent breadth of support and security.
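The scaling behaviour described above can be sketched as a simple placement policy: light sessions share a GPU, heavy sessions get a dedicated one. This is purely illustrative; the threshold and all names here are hypothetical and do not correspond to any real VGX or GRID API.

```python
from dataclasses import dataclass

# Hypothetical load-based GPU placement: sessions below the threshold
# share a GPU; heavier sessions (CAD, media editing) are promoted to a
# dedicated GPU. Threshold and session model are illustrative only.
SHARE_THRESHOLD = 0.25  # fraction of one GPU a session may use before promotion

@dataclass
class Session:
    user: str
    load: float  # estimated GPU load, 0.0 (idle/email) to 1.0 (full CAD)

def place(sessions):
    """Split sessions into a shared pool and dedicated GPUs by load."""
    shared, dedicated = [], []
    for s in sessions:
        (dedicated if s.load > SHARE_THRESHOLD else shared).append(s)
    return shared, dedicated

shared, dedicated = place([
    Session("alice", 0.05),  # email   -> shared pool
    Session("bob", 0.90),    # CAD     -> dedicated GPU
    Session("carol", 0.10),  # browser -> shared pool
])
print([s.user for s in shared])     # ['alice', 'carol']
print([s.user for s in dedicated])  # ['bob']
```

A real scheduler would, of course, re-evaluate placements continuously as loads change rather than assigning them once, which is the "dynamic" part of Nvidia's claim.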
Consider the alternative. Amazon Web Services is quietly making its way into enterprises, as employees are increasingly willing to pay for it themselves to obtain quick solutions to common storage, database and networking problems, to name three.
This new class of Kepler-based servers will be coming into enterprises and technology-intensive small businesses. The aforementioned services from Amazon and other vendors will likely enter companies where these servers don't exist but the need for BYOD coverage does. Right now, that's the majority of businesses.
The not-so-subtle message, then, is that IT needs to get its arms around both the opportunity and the risk of technologies such as Kepler, and provide alternatives to anticipated Amazon and Amazon-like services. Otherwise, IT will find itself out of the loop for yet another layer of end-user technology, and, since budgets typically follow usage, IT budgets will suffer as a result.
We are clearly moving to PCs-as-a-service. Like any major change, this one promises reduced IT management complexity and avoids the problems associated with natively supporting BYOD hardware. It also creates another opportunity for the IT department to become obsolete. Making sure an organisation takes advantage of the former and isn't buried by the latter will differentiate the IT organisations that retain control of their computing clients from those that don't.
Those of us who have been around long enough have seen this attempted several times. In the past, the solution was either too software-centric, with no tuned hardware, or too hardware-centric, with thin clients and little supporting software. As Goldilocks would say, for once this feels like it might be just right.