The UK government plans to allow driverless cars onto public roads by the end of 2013, it was announced today.

A team of researchers from Oxford University and Nissan UK will test a driverless Nissan Leaf on public roads, after successfully completing trials around Oxford Science Park.

The move is part of the Department for Transport’s new £28 billion road investment strategy, also announced today.

The report claims autonomous vehicles are capable of driving on their own "using knowledge of the environment in which they are driving".

The report also states: "Systems are now starting to emerge linking technologies such as lane keep assist, advanced intelligent cruise control and advanced emergency braking. These technologies allow a vehicle to travel along major roads, maintaining a safe distance from the vehicle in front at a set speed and without deviating from their lane - all without the driver's input."

The vehicles are guided by a system of sensors and cameras and have the potential to be safer and more efficient than conventional vehicles. As a precaution, a back-up driver will remain in the vehicle at all times, ready to take over if anything goes wrong.

Until now it has only been possible to test autonomous cars in the UK on private land.

Google is involved in a similar project in the US where it has equipped a fleet of Toyota Prius cars with autonomous technology.

"I think the self-driving car can really dramatically improve the quality of life for everyone," said Google co-founder Sergey Brin at the company’s headqaurter’s last September when the California driverless car bill was signed. He added that he thought the vehicles would be commercially available before 2020.

However, each Google vehicle is currently thought to cost approximately £100,000, with roughly half of that spent on its high-quality laser range finder.

Meanwhile, the Oxford team claims that its prototype navigation system costs less than £5,000.

“We use the mathematics of probability and estimation to allow computers in robots to interpret data from sensors like cameras, radars and lasers, aerial photos and on-the-fly internet queries,” reads the academic group’s website.
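To give a flavour of the probabilistic estimation the Oxford group describes, the sketch below shows a toy one-dimensional Kalman filter that fuses noisy range readings into a single position estimate. It is purely illustrative: the noise values and readings are invented for the example and are not details of the Oxford system.

```python
# Illustrative only: a minimal 1D Kalman filter, a simple example of the kind of
# probabilistic estimation described above. All numbers are made up for the demo.

def kalman_step(estimate, variance, measurement, process_noise=0.1, sensor_noise=2.0):
    """Fuse one noisy sensor reading into the current position estimate."""
    # Predict: uncertainty grows between readings while the vehicle moves.
    variance += process_noise
    # Update: weight the new reading by how much we trust it (the Kalman gain).
    gain = variance / (variance + sensor_noise)
    estimate += gain * (measurement - estimate)
    variance *= (1.0 - gain)
    return estimate, variance

# Noisy range readings (metres) to an obstacle ahead.
readings = [10.4, 9.8, 10.1, 9.9, 10.3]
estimate, variance = 0.0, 100.0  # start with essentially no knowledge
for r in readings:
    estimate, variance = kalman_step(estimate, variance, r)
    print(f"reading={r:.1f}  estimate={estimate:.2f}  variance={variance:.3f}")
```

Each reading pulls the estimate towards the true distance while the variance shrinks, which is the basic mechanism real systems scale up to many sensors at once.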
