We do not yet have chatbots. What we do have are publicly facing, undocumented, nondeterministic command line interfaces that expect the user to guess the right commands. These interfaces further insult the user by pretending to be a person.
I couldn't disagree with the article more. I don't want a damn chat bot. I want a way to solve my problems that's not pretending to talk to a person. I don't want to have to guess a string to enter to get the results I want. Give me some sort of real UI. I can't see how a chat bot could be more effective than good documentation with good search functionality.
"Your chatbot should be purposeful, reflective of your product’s voice, and simpatico with your users. One helpful design exercise is to produce an assistant persona and personality:"<p>For a business, a "chatbot" or any feature similar to it needs to do one thing and that is solve the users problem(s). If the user wants to do X within the app or learn about Y, the chatbot needs to help the user with that efficiently and better than a human can for the feature to be successful. The "chatbot having a personality" comes second to "solve the users problem."<p>If the users are completely happy with whatever chatbot they are using, then sure adding in some "personality" might be a good idea and increase engagement slightly, but a poorly-performing chatbot that can't help the user <i>but has a personality</i> isn't going to help the business at all.
In my course on chatbot building (shameless plug: https://cognitiveclass.ai/courses/how-to-build-a-chatbot/) I cover some of these important design decisions and recommend creating a prompt that, while concise, removes the guesswork. Giving the bot a name and injecting personality is a good idea, but you should announce that the user is talking to a chatbot.

In fact, two of the worst chatbot design flows, in my opinion, are: 1) making the user figure out whether they are talking to a real person or a chatbot, and 2) prompting the user to ask you anything (e.g., "Hi, how can I help you?") rather than guiding them on the scope.
Chatbots are like IVR systems and phone trees: a neat idea, but they are a pain in the ass, nobody likes them, and their benefit over a human is marginal even in the best-case scenario.
> “I am the psychotherapist. Please, describe your problems.”<p>"M-x doctor" aside, try getting on #emacs and talking to fsbot. I swear I have seen some of the eeriest exchanges between human and robot in that channel – it's not an AI, of course, but man it fakes it well. As someone in the channel once said: "Someone's cheating on their Turing test..."
"Your chatbot should be purposeful, reflective of your product’s voice, and simpatico with your users"<p>Agreed 100% but IMO, this is not a function of "personality" but rather a function of deeply understanding user intents. A bot cannot be purposeful if its own designers don't know its purpose from a user-centric perspective.<p>(Disclaimer: I work on Chatbase, a service for analyzing and optimizing bots)
“What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.” -- oh man this is a fantastic quote. So true. AI will surely destroy humanity!
The problem is not that your chatbot needs a personality.

The problem is that you implemented a chatbot in the first place so you don't have to deal directly with customers.
Am I the only person on earth who wants my interactions with technology to be simple, effective, and straightforward? I want to give simple, straightforward commands and receive a terse confirmation or explanation as a result.

Tech is so blessedly, unfailingly logical, and we stupid humans have to sully that to make some of us feel happier, not more effective :(
I feel bad for human customer service workers. It's already a job that demands a lot of emotional labor, even without customers shouting "representative!" at them.