As an economics student at Cambridge University, Tom Adeyoola knew as much about card counting at the blackjack table as he did about garment supply chains. In short: nothing.
However, over the past eight years Adeyoola has gone from building online blackjack games to 3D modelling clothes and consumers. The connective tissue is computer vision technology - where computers can learn to recognise objects in digital images.
Speaking from their London office in a redeveloped Victorian wool warehouse in Whitechapel, Tom Adeyoola, founder of the startup Metail, and his CTO Jim Downing explained how the business developed its underlying computer vision technology and how they plan to commercialise it in 2017.
In 2007 Adeyoola was sitting in his hotel room in Macau trying to solve a work problem: how to take advantage of looser rules around online gambling thanks to the Gambling Act of 2005 and create a live blackjack game for his employer Inspired Gaming Group.
He wanted to create a game "where you could have players or machines playing against a live table", he said, "and the way to solve that was by looking up and using cameras to recognise cards being dealt live in the casino".
The next step took a leap of faith. Adeyoola Googled 'leading expert in computer vision' and found Professor Roberto Cipolla at Jesus College, Cambridge. "So I cold called him from my hotel room in Hong Kong. I asked him how easy it would be to create this system from scratch and he said 'no problem, come to my lab'."
Cipolla introduced Adeyoola to one of his ex-PhD students, Duncan Robertson, whom Adeyoola commissioned to build a proof of concept for the gaming system.
The experience proved to the young entrepreneur that he was able to take a concept through to completion, and he started to set his sights on his own project. After months of soul searching he settled on online clothes shopping, an experience he calls "emotionally draining".
He asked himself: "Could I do that through creating a 3D virtual you?" and saw a range of external signals that the timing was right: the growth of online retailers like ASOS, broadband penetration, the growing maturity of computer vision and cloud services.
"I felt that people had failed to solve the hard problems," he told Techworld. "People tended to stick to the data side. So tell us loads of data about yourself and we will match you to clothes, and that was very onerous on the consumer. I didn't feel that was a way to get mass adoption. It needed to be easy."
Adeyoola says most of the retailers he spoke to had been burned in the past trying to find a workable solution, and had set their sights lower, opting simply to offer size advice rather than a 3D visual of the consumer. "For me it felt like you needed the visual to get a sense of getting something back from the user for asking them for the information," he said.
So the Metail team worked away at its core MeModel consumer application, looking for ways to improve the accuracy of visuals in order to drive adoption rates. The startup went from asking consumers to arduously submit photos of themselves to simply working from measurements.
"We established that by using height, weight and bra size, and then adding in hips and waist, you can get the accuracy level we were getting from a photo and deliver a visualisation in under ten seconds, which is key."
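Metail has not published how it maps those five measurements to a body model, but the general idea is standard: regress tape measurements onto the parameters of a low-dimensional statistical body-shape model. The sketch below is purely illustrative — the weights, parameter count and function names are invented for the example.

```python
import numpy as np

# Illustrative regression weights, as if learned offline from a corpus of
# body scans (the values here are made up for the example).
W = np.array([
    [0.012, -0.004,  0.02,  0.001,  0.003],   # shape parameter 1
    [-0.002, 0.015,  0.01,  0.006, -0.001],   # shape parameter 2
    [0.001,  0.002, -0.03,  0.012,  0.008],   # shape parameter 3
])
b = np.array([-0.5, 0.1, 0.2])

def measurements_to_shape(height_cm, weight_kg, bra_band_cm, hips_cm, waist_cm):
    """Map five tape measurements to low-dimensional body-shape parameters,
    which a renderer can then turn into a 3D avatar."""
    x = np.array([height_cm, weight_kg, bra_band_cm, hips_cm, waist_cm])
    return W @ x + b

shape = measurements_to_shape(170, 65, 75, 98, 74)
print(shape.round(3))
```

Because the mapping is a single matrix multiply, producing the shape parameters is effectively instant — the "under ten seconds" budget is spent on rendering, not on inference.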
Metail will be bringing photos back, though, to allow users to increase the accuracy of their model. "Now people are moving to mobile we have gone mobile first and we will start to allow users to do more to increase the accuracy of their model."
The CTO Jim Downing has also been working on deep learning algorithms to apply to user submitted photos. He explained: "Where we used to fall down was users submitting Facebook pictures as it was all they had. So with deep learning we could make it more robust to those inputs. It will be able to work out what the pose was and which bits of the body it is seeing."
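Downing doesn't detail the architecture, but "working out the pose" from an arbitrary photo is usually done by having a network emit one confidence heatmap per body joint and reading each joint's location off as the heatmap's peak. The network itself is assumed here; this minimal sketch shows only the decoding step.

```python
import numpy as np

def decode_keypoints(heatmaps):
    """heatmaps: (num_joints, H, W) array of per-joint confidence maps,
    as a pose-estimation network would output.
    Returns (num_joints, 2) (row, col) locations and per-joint confidences."""
    num_joints, h, w = heatmaps.shape
    flat = heatmaps.reshape(num_joints, -1)
    flat_idx = flat.argmax(axis=1)                  # peak of each map
    rows, cols = np.unravel_index(flat_idx, (h, w))
    coords = np.stack([rows, cols], axis=1)
    confidences = flat.max(axis=1)                  # how sure the net is
    return coords, confidences

# Toy example: two joints on a 5x5 grid with known peaks.
maps = np.zeros((2, 5, 5))
maps[0, 1, 2] = 0.9   # joint 0 peaks at (1, 2)
maps[1, 4, 0] = 0.7   # joint 1 peaks at (4, 0)
coords, conf = decode_keypoints(maps)
print(coords)  # [[1 2] [4 0]]
```

Low peak confidences are also how a system can tell that a casual Facebook photo shows, say, only the upper body — which is exactly the robustness Downing describes.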
Next Adeyoola turned his attention to digitising clothing items, and found the main stumbling block was cost. "No one had been able to digitise clothing for cheaper than $300-400 per garment, so it was absurdly expensive."
Adeyoola wanted to get that figure down to $3 per garment, and he turned to Duncan Robertson again to solve it "because I felt the technology direction I needed to go in was computer vision".
The eventual solution was to simplify the process by digitising garments in 2.5 dimensions instead of 3D. Adeyoola explained: "So starting with a sample we digitise against ground truth, which is a mannequin, and then based on the standard elements of how the garment changes into different sizes we use software to deform and warp that garment based not just on how it changes shape but the underlying shape of the person under the garment."
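The "digitise once, warp per size" idea can be sketched very simply: the garment photographed on the reference mannequin is stored as 2D control points, and other sizes are produced by deforming those points. In this toy version the deformation is just a per-axis scale around the garment centre, driven by the ratio of target to reference body measurements; Metail's real system would use proper grading rules and the shape of the body under the garment.

```python
import numpy as np

def warp_garment(points, ref_waist_cm, target_waist_cm,
                 ref_height_cm, target_height_cm):
    """points: (N, 2) array of (x, y) garment control points captured on the
    reference mannequin ('ground truth'). Returns the warped outline."""
    centre = points.mean(axis=0)
    sx = target_waist_cm / ref_waist_cm     # widen/narrow with the body
    sy = target_height_cm / ref_height_cm   # lengthen/shorten with the body
    return (points - centre) * np.array([sx, sy]) + centre

# A rectangular skirt panel, 10 units wide, 30 long, on the size-10 mannequin.
outline = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 30.0], [0.0, 30.0]])
warped = warp_garment(outline, ref_waist_cm=70, target_waist_cm=84,
                      ref_height_cm=165, target_height_cm=165)
print(warped)
```

Because everything after the single mannequin shoot is software, this is also why the per-garment cost can fall so far below full 3D capture.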
So now Metail had to ensure that both the underlying MeModel and the overlying garment interacted correctly, and at speeds that wouldn't put users off.
Before cloud providers like AWS allowed startups to specify the number of graphics processing units (GPUs) they wanted to use, a company like Metail had to do everything on its own servers. "So we had to build our own virtualised server farm of Mac minis, which was the best bang-for-your-buck graphics processing unit on the market. Now you can specify GPUs in AWS and that has changed things a lot."
And the cost? "Now we are at $10 and have the R&D to get down to $3 and push those boundaries," Adeyoola said. "Because we have gone out and made the process of digitising garments so cheap our big focus is digitising the world's garments. So we did 40,000 last year and want to do 100-200,000 this year."
This approach led to a new revenue stream of its own. "So this autumn we have a product called Composed Photography which is leveraging our digitisation expertise to deliver all of the onsite model garment photography a retailer needs at considerably lower cost, with increased flexibility and increased efficiency," Adeyoola said.
Metail would essentially become the photography department for a retailer, using its warping software to reduce the number of photos required to digitise a garment.
"So we needed to be able to do the photography in a way that we supply some sort of rig to go into their supply chain," he explained. "So it could be operated where the garments are, because the samples are gold dust and that is where you need to do the photography."
Downing added that the breakthrough came when they realised "that we didn’t need a person behind the camera with a standard set of settings. The idea that by using computer vision you could still get good results was controversial at the time."
The focus was on making the rig easy and simple enough to be operated by low-cost labour: for around a dollar of labour per garment, all of the images are sent to the cloud and processed by Metail.
This allows retailers to shoot a model once, shoot the collection on a mannequin, and let the software layer each item onto the model. Cutting recurring photography so drastically, and doing much of the work in software, potentially allows retailers to cut spending on models and instead hire a superstar brand ambassador they only have to shoot once. So instead of having a stable of ten models, a brand could hire one Cara Delevingne.
This also gives the startup a platform for selling its value-added services into retail customers, like the MeModel, from which it takes a revenue share on all sold items.
Eventually Adeyoola wants Metail to "be wherever anyone is having a conversation about fashion". What he means is, by digitising the world's garments a user could be reading Vogue magazine, scan the dress and see it on the MeModel, share it on Facebook and buy it on Amazon, all from their phone.
Then there is the growing data side of the business. By having detailed data on what does and doesn't fit, and on where users are abandoning items on their MeModel, the startup's data could potentially enrich retailers' recommendation engines, or automate returns information to improve size recommendations on the back end.
"As we see garments as we shoot we can use deep learning and computer vision to apply attributes to those garments." So take a brand new jacket and Metail could use the metadata to liken it to an older item and pair it with similar recommended outfits as it did before. This is particularly powerful when you pair it with data around a user's body shape.
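One simple way to realise that pairing — again an illustrative sketch, with an invented catalogue and attribute vectors — is to tag each garment with an attribute vector (produced by the vision model, assumed here) and let a new item inherit the recommendations of its nearest existing neighbour:

```python
import numpy as np

# Hypothetical catalogue: attribute vectors a vision model might emit,
# e.g. (leather-ness, short-cut, long-cut, formality). Values invented.
catalogue = {
    "biker_jacket_2015": np.array([1.0, 0.9, 0.1, 0.0]),
    "trench_coat_2014":  np.array([0.0, 0.2, 1.0, 0.8]),
    "denim_jacket_2016": np.array([0.3, 0.8, 0.2, 0.1]),
}

def nearest_garment(new_attrs, catalogue):
    """Return the catalogue item whose attribute vector is most similar
    (cosine similarity) to the new garment's attributes."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(catalogue, key=lambda name: cosine(new_attrs, catalogue[name]))

# A brand-new jacket that looks like the old biker jacket.
new_jacket = np.array([0.9, 0.85, 0.05, 0.0])
match = nearest_garment(new_jacket, catalogue)
print(match)  # biker_jacket_2015
```

Crossing these garment attributes with a user's body-shape data is what would make the recommendations body-aware, not just style-aware.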
Adeyoola has high hopes for Metail in 2017, pushing towards greater commercialisation, especially in a UK market which has been tougher to break into than Asia. He says that, regardless of the growth potential of Asia, breaking into your home market is important, if only for morale, and that one of the leading UK retailers is starting a trial next month.
Metail has raised nearly $20 million (£16 million) to date from New World Private Equity and angel investors. It is currently planning a Series C round which will likely be led by strategic Asian investors.