What am I talking about? It’s more that I don’t expect my devices to talk back to me, mainly because I have turned off their ability to talk. More than that, I have turned off their ability to evaluate my commands.
Before I make myself sound even more paranoid, this comes from a simple principle: if I want something, I will ask for it. I will not say “Alexa...”, “Hey Siri,” “Hey Google” or “Hey Cortana” – I’m not really a “hey” kind of person – and expect an artificial intelligence, drawing on previous interactions, to throw up what it thinks is the right answer, or the first answer, or the answer most accessed by others.
The only virtual assistant I do use is Siri, on my Apple TV box, and purely to speak the name or title I want to search for on YouTube, Vimeo or the BBC iPlayer. It is an expediency over typing, which remains the last resort when Siri doesn’t accept how I’ve pronounced a vowel.
While the debate over virtual assistants is currently focussed on the microphone – your voice being recorded and analysed even when you are not using the device – my concern is with their ability to act as a bridge in human thought, making judgements about the answer they think I will need, without telling me how they came to that decision. There is an implication of trust on my part, which becomes a lack of trust in practice.
If that is the case, why am I happy using a search engine like Google? Is it the trust gained over twenty years of use, something virtual assistants have yet to demonstrate? (Siri, the oldest of them, launched in 2011.) Is it having a screen to view the search engine’s answers, providing the illusion of choice? Is it that the algorithms influencing the results are better known? (For Google, PageRank evaluates the links to and between pages, Panda promotes higher-quality sites, while Hummingbird emphasises natural writing over forced keywords – how am I doing so far?)
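Since PageRank gets a mention, here is a toy sketch of its underlying idea: a page’s score is fed by the scores of the pages linking to it. The three-page graph, damping factor and iteration count below are my own invented illustration, not Google’s actual parameters, and real search ranking blends in far more signals than this:

```python
# Toy PageRank by power iteration. The graph, damping factor and
# iteration count are invented for illustration only.

DAMPING = 0.85  # probability of following a link rather than jumping randomly

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # base score from the "random jump" component
        new = {p: (1 - DAMPING) / n for p in pages}
        # each page shares its current score equally among its outgoing links
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += DAMPING * share
        rank = new
    return rank

# A hypothetical three-page site: "home" is the most linked-to page.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
scores = pagerank(graph)
```

Run on this graph, the scores sum to one and "home", which every other page links to, comes out on top – the point being that the ranking falls out of the link structure alone, which is roughly what makes it explainable in a way a voice assistant’s single answer is not.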
Perhaps I am still looking at virtual assistants as being in their infancy, as having pretensions beyond the previous voice-activated units that acted as glorified hands-free switches. It comes back to trusting the answers I would expect them to give, if I tried to use them. I just need something to explain how they came to their decisions, like a screen, and something to choose and edit the answer, like a keyboard.