In 2015, while sitting in a meeting at his full-time job, Julius Sweetland posted to Reddit about a project he had quietly been working on for years: an application that would help people with motor neurone disease communicate using just their eyes. He forgot about the post for a couple of hours before friends messaged him to say he'd made the front page.

Now, three years on, OptiKey, the open source eye-tracking communication tool, is being used by thousands of people, largely through word-of-mouth recommendations. Sweetland was speaking at GitHub Universe at the Palace of Fine Arts in San Francisco, and he took some time to speak with Techworld about the project.


"So, 2009 was when my aunt was diagnosed with motor neurone disease, and she died about a year later in 2010," says Sweetland. "I wasn't thinking about doing anything at the time - it wasn't until quite a bit later that I got an idea in my head. It was probably 2012 when I had really gone 'hang on, I think I've got something here'."

Sweetland had seen eye-tracking systems before, but never one utilising Swype-style input as you would on a phone. Your eyes move a lot faster than your fingers, Sweetland says, so they're actually a lot more accurate.

"They jump between where you're looking at an incredible speed," he adds. "I thought, I'm sure there's a way here you could flick your eyes around a keyboard and be faster than you would be if you were typing, that was really the genesis of the idea. The next problem was I have got a financial background, and I've got no idea how to do any of this stuff."

He set about learning how eye tracking, cameras and infrared all work.

"The way it figures out where your eyes are is quite interesting," he says. "You have infrared lights in fixed positions below your screen, and they create reflections called 'glints' on your eyeball. What these systems do is they detect your pupil and they detect the glints, and your pupil moves relative to the glints, which stay fixed - so you can see how the eyeball is moving. Then, if you combine that by tracking the head movement in 3D space, and calibrate it to where you knew they were looking at a certain point, you can error-correct for where they're looking on a screen.

"All cool stuff... all incredibly difficult to actually do in real life."

For his first attempt he found an Instructables demo online that was based on stripping out and rewiring a PlayStation camera. The camera is attached to a pair of glasses frames and protrudes on a little stick that points at one of your eyes, so it can track that eye's movement.

He wrote a simple piece of kit initially called ETTA - Eye Tracking Transcription Assistant.

"A bit wordy," he says. "But what it showed me was that it was possible for this to work. I was able to Swype whole words, and it would show them on the screen. As you moved on to Swype the next work, it would say, well, that's acknowledgement that the last word you Swyped is good, so I'm going to say it.

"It made sense at the time but it's actually really clunky to use because you'd be thinking about the next word, and it'd be saying the last word the whole time. But that showed me with that demo that every part of what I wanted to do was possible. That was about 2013, I'd achieved that. So I did the sensible thing, and threw it all away and started again."

Having solved the initial problem, he realised he had far bigger ideas - "I thought I may as well make this ten times more complicated," he says - so he set about creating an interface that would simulate a keyboard and a mouse.

"I want full computer control," he adds. "And that introduced literally thousands of small problems."

After two more years of churning away at the code in the middle of the night, while also being a new father to his first child, he says he was hitting a wall in terms of how much time he could give the project.

Instead of letting perfect be the enemy of good, he made that Reddit post in 2015, and that's when the project really took off, transforming from a personal project where Sweetland shouldered all of the burden into a GitHub-based community project that still thrives.

People started discovering bugs, or requesting features, and would pitch in to get involved, using the issue tracker, the pull request system, and so on. "The nuts and bolts of GitHub really came into their own for me, and it allowed the community to invert control," he says. "Instead of me saying: Hey, I know where this is going, I got this guys... it was more like them saying 'we want it', 'we want Turkish', 'we want German', 'we want Italian', 'we want to play games with it', do this, do that. They pulled me in different directions but also put their hands up to volunteer to do most of the work. It's been incredible."

OptiKey is used by thousands of people in the wild, and in fact, after Sweetland gave his presentation at GitHub Universe, someone from the crowd introduced himself in person.

"When I came out of the talk a guy came up to me and introduced himself and said thank you, because his wife's uncle used it up until he died," Sweetland says. "This is a small crowd... so if it's touched someone here, I feel very proud. It's doing its job."

In the 2015 post Sweetland said that he had wanted to disrupt the existing market, where healthcare technology companies charge high prices for eye-tracking communication technology - with some insurance providers charging as much as $15,000 for access to the life-changing equipment. Has this affordable open source alternative disrupted the market as he intended?

"To an extent," Sweetland says. "It has provided a counter-point to the expensive systems. But the problem I have is an advertising problem, people don't know it exists."

Future plans include a plug-in fork that will allow the keyboard to be customised and connected to an automated smart home, allowing the user to turn the lights on or off, for example.
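As a purely hypothetical sketch of what such a plug-in key could do - none of the names, endpoints or APIs below come from OptiKey - a custom keyboard key might simply map its label to an HTTP request against a smart-home hub:

```python
# Hypothetical example: custom keys mapped to imagined smart-home endpoints.
# The URLs and labels are assumptions for illustration, not OptiKey features.
import urllib.request

CUSTOM_KEYS = {
    "LIGHTS ON":  "http://home.local/api/lights/on",    # assumed endpoint
    "LIGHTS OFF": "http://home.local/api/lights/off",   # assumed endpoint
}

def on_key_selected(label: str) -> None:
    """Called when the user dwells on a custom key; fires the mapped request."""
    url = CUSTOM_KEYS.get(label)
    if url is not None:
        urllib.request.urlopen(url, timeout=5)
```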

Originally, Sweetland's exposure to open source had largely been through the consumption of tools such as the GIMP.

"I knew of the concept, I didn't really know how the nuts and bolts worked, I was always a little blase about how do you make money from something like that... but flipping it around again I'm still coming from the point of view that there's no money in my product, so I still don't understand how people make money in open source...

"I'm just in awe of the amount of people who will contribute to a project because they think it's cool," he says. "Everyone who's here's time is valuable, and they will give it up for projects like this. I think it's a great, really good thing."