Replika [alert]

I know this is an odd topic to include in a blog on preserving and protecting one’s personal privacy. Stay with me for a moment, though. This post is an [alert] to highlight a disturbing and dangerous trend toward losing the privacy of … your personality, to an AI (artificial intelligence). Yes, it is the antithesis of privacy to share, endorse, and support (train) an AI to simulate your own personality, your beliefs, your … pretty much everything.

Here it is: an AI-based chatbot that I can train to talk like me and mimic my personality. It’s at least a little creepy, since the idea is that the replika, in its digital form, can survive even after my body is dead and gone.

You can reserve your own name and start training your own replika today: https://replika.ai/ The story of where it began is told at http://money.cnn.com/mostly-human/dead-irl/ . Replika is the contemporary Tamagotchi / Hatchimal, with additional features that aim to capture (your) personality so that one day it can react to future events based on your reactions to events today. More to come, clearly.

Remember the movie “Her”, in which Scarlett Johansson’s voice animated the AI “operating system”? Beyond capturing your personality via chatbot, why not give the AI the capability to write letters for you? The male protagonist in that movie is a writer paid to compose heartfelt personalized letters, and the physical letters are produced by a machine that has captured a writing sample and learned your way of applying pen to paper: your writing style, your writing personality.

If that’s not creepy enough, Adobe’s VoCo project now promises to do for audio what Photoshop does for pictures: we had seamless editing of photos before, and now we have seamless editing of audio recordings of, yes, your voice. The BBC warned about this in late 2016. The demonstration is amazingly seamless. “Photographic evidence”? “I have an audio recording”? Fake, fake, fake. Google is not out of this game either. But how very useful these tools would be if embedded into an interface to Replika, where my replika could respond to your voice.

If all this creeps you out, you’re not alone. There are so many fundamental ethical issues that I did not bother to cite them in context above, and back in 2012 data scientists were already concerned about the ethics and creepiness of capturing personal information in “big data” scenarios. There’s an important lesson here: trust your gut and not the shiny dangling bauble. If something makes you feel creepy, there is a good chance you feel that way for a good reason. Among the good reasons: your gut may be detecting that someone intends to harm you, or that the situation could harm you at some time in the future.

It should be old news, if you have followed this blog, that emails are trivial to forge, Caller ID is easily spoofed, and falsified identities are readily used by pranksters and criminals.
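To make “trivial to forge” concrete, here is a minimal sketch using only Python’s standard library (the addresses and SMTP host are made-up placeholders): the From: line of an email is just text the sender types in, and nothing in the core SMTP protocol verifies it.

```python
# Minimal sketch: the "From:" header proves nothing, because SMTP lets
# the sender set it to any value. All addresses/hosts below are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "someone.you.trust@example.com"   # arbitrary, unverified claim
msg["To"] = "victim@example.org"
msg["Subject"] = "This looks legitimate"
msg.set_content("The From: line above was typed in, not authenticated.")

# Any SMTP server that accepts the connection will relay the forged header;
# only receiving-side checks like SPF/DKIM/DMARC can flag it as suspect.
with smtplib.SMTP("smtp.example.net", 25) as server:
    server.send_message(msg)
```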

Sadly, I have to conclude “trust no one”. Trust no photograph, trust no audio recording. The only person you might trust, in the end, is yourself.
