In January, Kevin Kelly wrote an essay entitled "Better Than Free" that explained which concepts held value on the Internet. This generated a lot of interest, mostly around the question of how best to make money out of these concepts. As a career security guy, I found myself wondering how on earth my field will respond - how does security need to adapt to support business models based on these values? When we're used to locking everything down, how do we respond when people start calling for openness?
Kelly's essay set out one of those ideas that sound completely obvious once you've heard them: When something that can be copied comes into contact with the Internet, copies soon become freely available. And when copies become free, you need to sell something that can't be copied. That is, of course, a very brief summary of an elegantly stated argument; I urge you to go and read his original essay. It's a great read.
Kelly goes on to explain eight "generatives," things that can't be copied and so still hold value on the Internet: Immediacy, Personalisation, Interpretation, Authenticity, Accessibility, Embodiment, Patronage, and Findability. You may not want to pay for that mp3 that you can download for free, but you might pay to be able to have it right now, to have a copy tweaked to sound best on your audio setup, to have the lyrics translated into your language, to know it's the real thing, and so on.
So what do we need to do? How does security adapt itself to these generatives?
The answer is that we have more to do than you might expect. The exceptions are his principles of Embodiment and Patronage: people may pay for a physical copy of something (embodiment), or for the joy of supporting a particular artist or designer (patronage), but there's little security can do to help there except get out of the way. In each of the other areas, though, security has a big part to play, whether directly generating revenue or in a supporting role. Let's take them one by one.
Trust. Kelly passes over trust briefly, and doesn't include it as one of his generatives. But he does acknowledge its importance and rightly so, because trust underpins most successful transactions. When buying online, I won't give my credit card details to a company I don't trust, and Paypal made a lot of money out of realising that people feel that way. When buying financial products, I'll deal with a big bank in a regulated market that I trust rather than a niche company based in a small country that I can't find on a map.
So what creates trust? I suspect the answer is different depending on the person you ask, but three big components of trust for me are trusting that a company will look after whatever I entrust to their care, whether that's my money, my data, or my identity; trusting that if a problem happens on their watch they will do whatever they can to fix it; and trusting that the company will still be around for the lifetime of the deal.
This is, of course, the heartland of security. We understand trust; we understand how to support and nurture it. Perhaps we need to talk more to marketing specialists to understand better how to sell this brand of security-inspired trust to our customers, but this is our strength.
Immediacy. In our impatient society, we will pay to have something right now, even if we can have it cheaper or free in a few weeks or months. This has been said so many times by so many people it must be obvious to everyone - so why have we security folks not heard? How did we miss this?
We must have missed this, because as a consumer every security measure I see is a roadblock, a barrier to stop me spending my money. Registration processes with data validation schemes that don't recognise my address or my phone number - just give me the product. Pages of interminable licence agreements that I do not and never will understand - just give me the product. Increasingly complex authentication processes: username, password, secret word, CAPTCHA, mother's maiden name, inside leg measurement of your father's cousin's pet hamster - just give me the product!
Each of these mechanisms slows down the consumer just a little bit. And time and time again I've heard security people say "But it's just a simple verification step. This won't really get in the way." Maybe each on its own won't, but add all the security steps together, and then see which is easier to get hold of: your legitimate product or a counterfeit copy.
If it's easier to get hold of the counterfeit than the real thing - and I bet you, nine times out of ten it will be - then you're going to have a fun time trying to get your consumers to pay money for your version. I tried buying some GPS software online recently, and kept running into barriers. Every step was made that little bit more difficult by intrusive and poorly designed security mechanisms, and eventually I gave up trying to give these people my money and downloaded a pirated version instead. I spent about two hours struggling with the company's website before giving up, then spent about five minutes on Pirate Bay finding a pirate copy. I actively wanted to give this company my money, and they seemed to be actively trying to stop me. They succeeded.
Of course there are good reasons why some of these security steps need to be in place, and I've put enough of them in place in my time. I'm not for one moment suggesting we get rid of these mechanisms, but we must switch our focus. Any security mechanism has two fundamental design objectives: it must let the good guys in as easily as possible, and it must keep the bad guys out as effectively as possible. Too often, we concentrate on the bad guys and forget that it's the good guys who pay our wages.
Personalisation. There's no inherent security problem in tailoring a product to a person. The security problem comes in identifying the person in the first place, in remembering who they are and what they like, in recording what they've bought and what they decided not to.
This is identity management, one of the hardest problems in security. Over the past two decades I've seen security problems wax and wane in difficulty: firewalls have gone from specialist network routing devices to a black box, a standard building block in any network architecture; anti-virus has gone from custom-coding responses to individual Excel macro viruses to a mass-market product that even your grandparents will buy and use. But identity management remains hard.
Identity management is already one of the hardest problems, but if personalisation becomes a driving force of the new economy it's only going to get harder. Some companies do this well already: Amazon remembers me when I visit, remembers what I like and what I don't and recommends books and DVDs to me, like a village shopkeeper who knows all her customers as friends. But apart from the retail success stories, most companies are still struggling to remember who their own staff are and to manage the identities of their own employees, and are a long way from being able to extend this to their customers.
If Kelly is right about personalisation, then companies need to look at their identity management solutions now and ask whether they're future-proof. Will they still support the needs of your staff and a (hopefully) expanding customer base in five years' time? If not, now is the time to start putting this fundamental piece of IT infrastructure in place.
Interpretation. There are many ways of interpreting a software product, and one important one is security updates. This comes back to one of the building blocks of trust - knowing that your business partner can be relied upon to fix a problem if it occurs on their watch. There's money to be made, and customer loyalty to be won, in finding the security vulnerabilities and getting reliable patches out to market as quickly as possible. The majority of your customers wouldn't know a buffer overflow from a broken fan belt, but they do know that you are fixing the problem - that you are looking after them - and that encourages them to trust you.
Companies should be advertising their security updates as part of their sales material: "Buy our product, it does X, Y and Z, and if it goes wrong we'll fix it". (Maybe in a slightly more polished form than that - I never claimed to be a good copywriter - but to me that's a compelling sales pitch.) We've all been burned too many times by companies that took our money, handed over a product, and then conveniently forgot we existed when the wares started to go wrong. In other industries this problem is overcome by warranties, but in the software industry we have chosen not to guarantee that our product works. But if we want our customers to trust us, we have to give them faith that we'll fix problems that we cause, and regular bug fixes and security updates are one of our best ways of doing so.
Authenticity. Once again we're back to the heartland of security - Authenticity and Authentication. (Security people can breathe a sigh of relief, as this one's in all the textbooks.) We are the specialists in telling you what's real and what's fake. We have all sorts of tools and techniques to sign, to checksum, to watermark, to prove which is the forgery and which is the real deal. This is something we can go out and lead with.
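As a concrete illustration of those tools, here is a minimal sketch of checksum-based authentication - not any particular vendor's mechanism, and the function names are my own. The vendor publishes a SHA-256 digest of the genuine product over a trusted channel, and the customer's copy either matches it or it doesn't:

```python
import hashlib
import hmac

def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def is_authentic(data: bytes, published_digest: str) -> bool:
    """Check a downloaded copy against the digest the vendor published
    over a trusted channel. compare_digest runs in constant time, so a
    forger can't learn how much of a guessed digest matched."""
    return hmac.compare_digest(sha256_hex(data), published_digest)

genuine = b"installer v1.0 contents"
digest = sha256_hex(genuine)                    # the vendor publishes this
print(is_authentic(genuine, digest))            # True
print(is_authentic(b"tampered copy", digest))   # False
```

Real products layer digital signatures and watermarks on top of this, but the principle is the same: make it trivially easy to tell the real deal from the forgery.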
But again, we're heading in slightly the wrong direction. For the past few years, our focus has been on stopping the bad guys, not on making life easier for the good guys. Digital Rights Management is the poster child of this movement, and few other security mechanisms have ever caused so much public disgust. DRM takes the view that the bad guys must be stopped at all costs, even if that leads down the path of suing your own customers, of deliberately designing functionality out of products - 'Defective by Design' - of associating hard-won corporate reputations with bully-boy legal tactics.
Kelly reminds us that Authenticity is actually a prized value to the customer, not just to the vendor. This is why I'm willing to admit in writing that I downloaded some pirate software from the Pirate Bay a few weeks ago - because I stopped the download and never installed it. I value the authenticity of what I buy; I don't want to put a piece of pirated software on my machine because, like something the cat dragged home, you never know where it's been. Put that piece of software on the same machine that holds all my email, my work, my bank account details? No thanks, I'll wait until I can eventually get hold of a copy of the original, authentic software.
The same goes for all sorts of products: yes, you can buy cheap copies of designer clothes in all sorts of places, but over time you learn that the authentic clothes usually last longer and look better. Given the choice of an expensive Nokia phone or a cheap 'Nokai' phone from my local market, I know which one is likely to be more reliable in the long run.
We need to remember that the good guys pay our wages. Your customers aren't stupid, they can tell the difference between a cheap fake and a valuable original, and they're often willing to pay for the real thing. Authentication mechanisms don't have to make life impossible for the forger: often it's enough just to make it clear which is real and which is fake.
Accessibility. Maybe Acme Digital Warehouse can sell me on the idea that they'll organise all my data, my music, my photos, my digital identity for me, but first they're going to have to sell me on the idea that they're going to look after it properly - again, we're back to trust. Facebook learned this recently after a customer backlash regarding their use of subscribers' shopping data, and to give them credit they seemed to learn quickly and sort out the problem equally fast. They're also learning that access control needs to become ever finer-grained as they give more access. I actually have more detailed control over access to my Facebook profile than I ever had over access to confidential data in the last few companies I worked for, and more and more Facebook users are making use of these features.
Findability. When there are millions of options, being able to find the right one for you is valuable. This is why one of the most valuable tech companies in the world is Google, a company originally founded to help you find things. This is nothing new: sales, marketing and advertising teams have always known that unless people know your product exists, no one can possibly buy it.
But if a fundamental principle of marketing is Findability, a fundamental principle of security is Confidentiality. There couldn't be two more diametrically opposed principles, and in security we have a whole array of tools designed to hide, to conceal, to protect, to guard against people ever finding out what we know. These are exactly the "skills of hoarding and scarcity" that Kelly labels as obsolete.
So is Confidentiality obsolete? No, though maybe we need to ease up a bit. There are still and always will be secrets in commerce - a company's financials just before results day, personal data covered by a person's reasonable expectation of privacy, the recipe for the secret sauce - but far fewer than we might think. I remember vividly a meeting I once attended as part of a data classification scheme implementation: labelling types of data as 'public', 'confidential', 'secret' and so on so that it can be protected appropriately. The longer the meeting went on, the more got labelled as secret, until eventually it seemed that everything in the company was secret and perhaps you'd need special clearance to find your way to the coffee machine.
The truth is that most 'secrets' aren't, and needn't be. The acid test for 'secret' should be "who wants it, what can they do with it, and will that hurt me?". The company's financials before results day clearly are secret - every investor wants it, every investor can profit from it, and you'd better believe that when your regulators find out you let that information go, it's going to hurt you. Similar arguments can be made for personal data, but for so many other 'secrets' the answer to at least one of those three questions is no. Someone wants your data, and it won't hurt you? Fine, give it to them! Give it willingly, give it enthusiastically, then go back and see what they've done with it and half the time you'll either make a new customer or find a new, interesting thing you can do with your data. Either way, both sides win.
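The acid test above boils down to a very small decision procedure. This sketch is purely illustrative (the function and its parameter names are my own invention), but it makes the point: data only earns a 'secret' label when all three answers are yes.

```python
def is_secret(someone_wants_it: bool,
              they_can_exploit_it: bool,
              it_would_hurt_us: bool) -> bool:
    """The three-question acid test: who wants it, what can they do
    with it, and will that hurt me? 'Secret' requires yes to all three."""
    return someone_wants_it and they_can_exploit_it and it_would_hurt_us

# Pre-results financials: wanted, exploitable, and the regulator will hurt you.
print(is_secret(True, True, True))    # True
# Data someone wants but that can't hurt you: give it willingly.
print(is_secret(True, True, False))   # False
```

Run every item in that classification meeting through the test and far fewer of them survive as secrets.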
For the majority of these generatives, we already have the skills to do what needs to be done. What we need to do is change the way we think about security. We need to remember that the good guys pay our wages; we need to remember that trust underpins every deal, and we are the brokers of trust; and when it comes to confidentiality and authentication, sometimes a little of a good thing is quite enough.