According to Microsoft, .NET is "the Microsoft solution for Web services, the next generation of software that connects our world of information, devices, and people in a unified, personalized way". In this RTFM, we'll explain what .NET really is, in easily understood English.

Traditional programming systems (either stand-alone compilers or integrated development environments that make code-writing and debugging easier) provide a means for the developer to enter program code and have it compiled into machine code that can be executed on whatever platform it's designed for. Most languages can connect in some way to databases, networks and peripheral devices, and the programmer can use these facilities to whatever extent he or she sees fit.

This is all very well, but as any C programmer knows, while it's perfectly possible to (say) fetch a web page programmatically from within your code, it'll take many lines of code. That's because not only do you have to write the code to make the actual connection, but you also have to deal with sending requests, interpreting responses, handling errors and so on. It's doable, but it's not pretty. As any Perl programmer knows, however, fetching a web page from afar is dead easy. These days Perl includes neat modules that do all the messy communication and allow the programmer to simply say: "Here's a web page address – go fetch it and either give me the result or tell me why you couldn't get it".
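To see what that "go fetch it or tell me why you couldn't" style looks like in practice, here's a minimal sketch in Python rather than Perl – the `fetch` helper is our own invention for illustration, not part of any library discussed here:

```python
from urllib.request import urlopen
from urllib.error import URLError

def fetch(url):
    """Return the page at url as text, or a message explaining the failure.

    All the messy work – connecting, sending the request, reading the
    response – is hidden inside the library, just as described above.
    """
    try:
        with urlopen(url) as response:
            return response.read().decode("utf-8", errors="replace")
    except URLError as exc:
        return f"could not fetch {url}: {exc.reason}"
```

One call does the lot; compare that with the screenfuls of socket-handling code the equivalent raw C would need.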

The approach taken with .NET is not dissimilar to this, except that the extensions it provides are focused on distributed application development and execution. All programmers are used to writing code that calls functions, within either the current program or some code library that resides on their computer, but why not be able to call functions that live on a random computer somewhere on the Internet?

Standards-based ideas
This is entirely feasible using standards-based concepts such as HTTP (for providing the connection from the calling machine to the machine hosting the function), WSDL/SOAP (for allowing the server to define for the client precisely how to access the function) and XML (to provide a structured format in which to transfer both the input parameters and the results of the function call). The trick with .NET is that it's designed, just like the Perl web library we mentioned, to do all the rubbish in the middle and allow the programmer simply to say: "Call this function with these parameters and I don't care where it is".
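As a rough illustration of the XML part of that stack, here's how a call to a function might be wrapped up in a SOAP envelope ready to be POSTed over HTTP. The `Add` method, its parameters and the service namespace are all made up for the example:

```python
from xml.etree import ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(method, params, ns="http://example.com/mathservice"):
    """Wrap a function call in a SOAP envelope.

    The input parameters travel as structured XML inside the Body
    element; the server's reply comes back the same way.
    """
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{ns}}}{method}")
    for name, value in params.items():
        ET.SubElement(call, f"{{{ns}}}{name}").text = str(value)
    return ET.tostring(envelope, encoding="unicode")

request = build_soap_request("Add", {"a": 2, "b": 3})
```

This is exactly the sort of plumbing .NET generates and parses on your behalf – the point being that you never normally see it.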

And this is the point. If you've written a function that programs on your PC can access and you want to go the next step and make it accessible as a function to remote computers, this is as simple as telling the compiler: "By the way, make this work as a Web function". Similarly, you tell the calling program: "By the way, this function isn't local, it's over there on that machine, under this URL". There's a whole shedload of WSDL, XML and HTTP stuff happening underneath, but you don't have to worry about that as .NET does it all for you.
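The same "it's over there, under this URL" flavour can be sketched with Python's built-in XML-RPC support – a simpler cousin of SOAP, not what .NET itself uses, and the `add` function here is invented for the example:

```python
from threading import Thread
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

def add(a, b):
    return a + b

# Server side: "by the way, make this work as a Web function".
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(add)
port = server.server_address[1]
Thread(target=server.serve_forever, daemon=True).start()

# Caller side: "this function isn't local, it's under this URL".
proxy = ServerProxy(f"http://localhost:{port}/")
result = proxy.add(2, 3)   # XML and HTTP happen underneath, unseen

server.shutdown()
server.server_close()
```

The caller writes `proxy.add(2, 3)` exactly as if the function were local; everything in between is generated machinery.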

.NET, then, isn't doing anything particularly Microsoft-specific. It uses standards-based concepts to provide distributed computing in a much simpler way than ever before. This is, of course, nothing that couldn't be done by anyone else developing a computer language. The reason Microsoft can be so far ahead of the pack is that .NET bolts easily into concepts the company has been developing over the years. Need authentication? Easy: build Passport authentication into the system, or point the server that's providing your functions to the world at an Active Directory forest.

Covering all the bases
The other thing Microsoft has done is to make the development task for .NET both cheap and simple – though not at the same time. Visual Studio .NET is one of the most powerful IDEs on the market, and makes application development and debugging a breeze – albeit at a cost (VS.NET will cost you upwards of £1,000). And with the .NET incarnation, Microsoft has brought Visual C++ out of the dark ages and made it a forms-led implementation system, just as Visual Basic has been for years, by devising a "common language runtime" on which all its high-level languages are now based. If, on the other hand, you don't mind making much more effort to write your code, and can live without VS.NET doing much of the work for you, you can download a free software development kit and start writing .NET code tomorrow for nowt.

In short, then, .NET extends well-understood programming concepts into the field of distributed computing with negligible effect on how they appear to either the developer or the user. The techniques used are complex, but they're standards-based and – just as they should be – are largely hidden from the developer.

Windows Explorer made a massive leap of logic (which, in hindsight, is a completely obvious step) when Microsoft decided that a file is just a file, whether it happens to live on the local disk, on an FTP server or on a web server. Files have addresses, and as long as the user knows where the file is, Explorer will do whatever clever stuff is needed to fetch it. The user is blissfully unaware of the difference, except perhaps for a slight delay on a remote file. .NET extends this concept in just the same way into the realms of program interaction.