Microsoft has filed a patent that raises the interesting prospect of digitally reincarnating people as a chat bot.
Microsoft's patent - as spotted by Ubergizmo - raises the possibility of creating a chat bot from the output of a single person, rather than taking the traditional approach of training chat bots on conversations and material from a large pool of users.
To create a profile of an individual, the system will use "social data" such as "images, voice data, social media posts, electronic messages and written letters," reports Forbes.
"Social data may be used for the creation or modification of a special index on the subject of the personality of a specific person," notes the patent. "The special index can be used to train a chat bot to converse and interact with a specific individual's personality."
According to the patent, the chat bot could even sound like the person it's imitating: in certain aspects, a specific person's voice font may be created using recordings and sound data relating to that person.
In addition, "a specific person's 2D/3D model can be generated using images, depth information, and/or video data associated with the particular person."
Dead or alive
Microsoft's patent isn't particularly fussy about who might be chosen to be the subject of one of its chat bots, stating that the subject could be dead or alive. "The specific person [who the chat bot represents] may correspond to a past or present entity (or a version thereof), such as a friend, a relative, an acquaintance, a celebrity, a fictional character, a historical figure, a random entity etc."
"The specific person may also correspond to oneself (e.g., the user creating/training the chat bot," the patent adds, raising the possibility of people training up a digital version of themselves before they die.
The patent emphasizes the degree to which this chat bot will be trained to the individual's personal traits, in particular the "conversational attributes" of the person, "such as style, diction, tone, voice, intent, sentence/dialogue length and complexity, topic and consistency".
If the chat bot doesn't have enough data to provide an answer on a specific topic, crowd-sourced conversational data stores may be used to fill in the gaps, which is almost literally putting words in people's mouths.
The patent also deals with the tricky issue of handling profiles of the dead, suggesting that the bot may even be conscious (for want of a better word) that it's imitating a dead person. For example, if the bot were asked a question about an event that took place after the person it represents died, "such questions may indicate the specific person represented by the personalized personality index (e.g., the deceased relative) possesses a perceived awareness that he/she is, in fact, deceased".
Right to opt out?
The idea of reincarnating people as chat bots obviously raises all manner of privacy implications that aren't discussed in the patent, which is, by nature, concerned with the technical workings of the system.
For example, will people be given the right to opt out of such a system? Would the relatives of the dead be able to prevent others from turning their deceased loved ones into chat bots?
Such questions are, of course, moot until Microsoft (or someone else) delivers a working prototype. But it may not be long before your personality no longer dies with you.