Building digital humans: Would-be U of T entrepreneurs to put a non-'creepy' human face on digital assistants
Would-be entrepreneurs enrolled in a business of software course at the University of Toronto are being asked to play God in the digital realm.
The theme for the course, offered by the department of computer science, is “creating virtual humans” using technology from Quantum Capture, a Toronto startup that builds tools for content creators working in virtual reality (VR), augmented reality (AR) and other 3D platforms.
The students will use Quantum Capture’s platform to create ultra-realistic 3D characters for a variety of potential industry applications – everything from building police training simulators to creating virtual tech support workers.
In effect, they will be putting a human face on virtual assistants like Apple’s Siri or Amazon’s Alexa.
“We like to think of it as the embodiment of a chatbot,” says Matt McPherson, a U of T alumnus and former sessional lecturer who’s now Quantum Capture’s head of corporate development.
The course’s instructors, Helen Kontozopoulos and Mario Grech, got the idea for the theme after Quantum Capture joined the Department of Computer Science Innovation Lab, or DCSIL, where both Kontozopoulos and Grech are co-founders and directors. The lab is one of several entrepreneurial hubs at U of T, which has emerged as a leader in developing research-based startups. The accelerator focuses on turning U of T research in areas like artificial intelligence, machine learning, virtual reality and blockchain into game-changing companies.
“This is the first time we’re actually leveraging teams out of our accelerator to produce the problem sets we use in the class for our students,” Grech says, adding that it’s possible some of the student-led teams may decide to pursue their startup ideas further by applying to DCSIL’s accelerator program upon completion of the course.
Among the problems Quantum Capture has asked the students to tackle: building realistic virtual humans for doctors, police and soldiers to interact with in training simulators; creating virtual teachers for language instruction courses; and, in what may be the toughest feat of all, developing virtual tech support workers who can deftly deal with irate customers.
While two of Quantum Capture’s founders, Morgan Young and Craig Alguire, came from the video game industry, McPherson says the technology is quickly moving beyond the entertainment space as companies seek to make their computer-generated assistants and agents more interactive and engaging – basically more like the humans they seek to replace.
“There are certain social cues that are better for conveying information during a real-life interaction,” says McPherson, who has a master's in information studies from U of T, before he opens his laptop and shows off one of the company’s emotive creations. Her name is “Alyssa.”
Quantum Capture’s approach involves taking high-quality, 360-degree images of, say, an actor with the company’s 112-camera rig, housed in an Etobicoke warehouse. The resulting 3D model is then animated using motion capture technology and made to look lifelike with behavioural software.
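To make that workflow concrete, here is a minimal, illustrative Python sketch of the general capture-to-animation flow described above: reconstruct a 3D model from the camera rig’s images, drive it with motion-capture data, then layer on behavioural cues. The class and function names are hypothetical placeholders for illustration only, not Quantum Capture’s actual tools or API.

```python
# Illustrative sketch of a generic capture-to-animation pipeline,
# loosely following the stages described in the article.
# All names here are hypothetical placeholders, not Quantum Capture's API.

from dataclasses import dataclass, field


@dataclass
class CharacterModel:
    """A reconstructed 3D character and the layers applied to it."""
    source_images: int                      # photos from the multi-camera rig
    animation_clips: list = field(default_factory=list)
    behaviours: list = field(default_factory=list)


def reconstruct_from_rig(num_cameras: int) -> CharacterModel:
    # Stage 1: photogrammetry -- build a textured 3D model from
    # synchronized photos taken by the camera rig.
    return CharacterModel(source_images=num_cameras)


def apply_motion_capture(model: CharacterModel, clip_name: str) -> None:
    # Stage 2: drive the model's skeleton with recorded motion data.
    model.animation_clips.append(clip_name)


def add_behaviour_layer(model: CharacterModel, behaviour: str) -> None:
    # Stage 3: add procedural behaviours (eye gaze, lip sync, idle motion)
    # on top of the baked animation so the character feels alive.
    model.behaviours.append(behaviour)


if __name__ == "__main__":
    character = reconstruct_from_rig(num_cameras=112)
    apply_motion_capture(character, "greeting_wave")
    add_behaviour_layer(character, "eye_gaze")
    add_behaviour_layer(character, "lip_sync")
    print(character)
```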
But making a virtual human seem real is a fraught exercise thanks to a phenomenon known as the “uncanny valley.” The term, coined by Japanese roboticist Masahiro Mori in 1970, refers to the creepy feeling one gets upon interacting with a robot or computer-generated character that appears almost human, but not quite. (The “valley” in question refers to the drop-off in “familiarity” experienced by observers, as represented on a graph.)
One of the better-known pop culture examples is the 2004 animated film The Polar Express, starring Tom Hanks. Writing on CNN.com, one reviewer called the CGI Christmas movie about a group of kids on a magical train trip to the North Pole “at best disconcerting, and at worst, a wee bit horrifying.”
Needless to say, few businesses are willing to risk inflicting a similar nightmare on their customers. As a result, there has been a tendency to avoid creating ultra-realistic virtual agents in favour of ones that are more cartoon-like.
McPherson, however, believes it’s possible to bridge the chasm with Quantum Capture’s technology.
“Our research has shown that what makes the character look creepy is when the lip synch is off and when the eye behaviour or other subtle cues aren’t quite right,” he says.
“But you can actually push through the uncanny valley in immersive environments like VR and AR. You can make the eye behaviour start to look pretty accurate based on cognitive science literature.”
To that end, Quantum Capture is incorporating a lip synch algorithm developed by U of T’s Dynamic Graphics Project lab into its platform and tools.
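As a rough illustration of how lip synch is typically driven in such systems (a generic sketch, not a description of the Dynamic Graphics Project lab’s algorithm), the example below maps a timed phoneme sequence onto mouth-shape “visemes” and keyframe weights, the kind of signal an animation engine could blend onto a character’s face.

```python
# Generic, illustrative viseme-based lip sync: map timed phonemes to
# mouth-shape targets ("visemes") that an animation engine could blend.
# A simplified placeholder, not the algorithm referenced in the article.

# A minimal phoneme-to-viseme table (real tables cover ~40 phonemes).
PHONEME_TO_VISEME = {
    "AA": "open",       # as in "father"
    "EE": "wide",       # as in "see"
    "OO": "round",      # as in "boot"
    "M":  "closed",     # lips pressed together
    "F":  "lip_teeth",  # lower lip against upper teeth
}


def phonemes_to_keyframes(timed_phonemes):
    """Convert (phoneme, start_seconds) pairs into viseme keyframes."""
    keyframes = []
    for phoneme, start in timed_phonemes:
        # Unknown phonemes fall back to a neutral mouth shape.
        viseme = PHONEME_TO_VISEME.get(phoneme, "neutral")
        keyframes.append({"time": start, "viseme": viseme, "weight": 1.0})
    return keyframes


if __name__ == "__main__":
    # "Me too" spoken as M-EE-T-OO (timings are invented for the example).
    sample = [("M", 0.00), ("EE", 0.12), ("T", 0.25), ("OO", 0.32)]
    for frame in phonemes_to_keyframes(sample):
        print(frame)
```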
McPherson says that he’s looking forward to seeing the potential applications devised by the U of T students, adding that part of the reason Quantum Capture decided to partner with U of T on the course is to test potential markets for the technology.
He believes it’s only a matter of time before virtual humans – non-creepy ones – become a familiar part of our day-to-day existence.
“We interact with humans all the time,” McPherson reasons. “It’s the ultimate user interface.”