Why people swear at bots (and what copywriters should do about it)

For a bot, standing up to verbal abuse isn’t just an add-on: it’s essential for continued engagement. If you write bot scripts, it’s time to teach your bot some self-respect.

Bots are quickly becoming a key part of social media. Twitter is full of bot experiments, you can now order pizza through Facebook Messenger, and reddit is home to all sorts of bots that automate various jobs.

One reddit bot in particular recently caught my attention. ColorizeBot colours in black and white photos. More importantly, it’s written to preempt abusive replies to its efforts.

When called by a user, it says: “Hi I’m ColorizeBot. I was trained to color b&w photos (not comics or rgb photos! Please do not abuse me I have digital feelings :{} ).”

I don’t know if the creators of ColorizeBot anticipated abuse and put this in from the start, or if it was a later addition after seeing overly critical replies to the plucky photo bot’s colouring efforts. Either way, it’s clear that bots can and will be abused by users, and it’s up to bot writers to take this into account when crafting their virtual helpers.

Why does a bot care about abuse?

Bots don’t really care what gets said to them, but users often do.

In our previous blog about why chatbots are written to sound human, we touched on the need for respect in the bot-human relationship, based on an interview between ComputerWorld and Deborah Harrison, editorial writer for Microsoft’s Cortana division.

“Part of the craft of virtual assistant character development is to create a trusting, respectful relationship between human and assistant… If you don’t respect it, you won’t like it. And if you don’t like it, you won’t use it.”

The bottom line is that if a user doesn’t respect a bot, they won’t use it. Tell me, would you trust and respect someone if they took all the abuse you hurled at them and never once stood up for themselves? What about if it was a bot – would you hold it to the same standard?

AI fights back

This was a particular concern for Harrison and the Cortana team when Microsoft’s virtual assistant launched in 2014, and rightly so, judging by what the team saw. “A good chunk of early queries were about her sex life.”

It’s not just Cortana either. In an interview with Quartz, Robin Labs CEO Ilya Eckstein estimated that 5% of the queries sent to its route and logistics bot are sexually explicit.

If Cortana and other AI were to put up with these kinds of inappropriate questions, it could jeopardise the respectful relationship needed to foster continued engagement. Harrison discussed this at the Re.Work Virtual Assistant summit in San Francisco: “If you say things that are particularly assholeish to Cortana, she will get mad… That’s not the kind of interaction we want to encourage.”

Firm, not condescending

Where Cortana is successful is in how she puts down potentially insulting questions, suggestions and remarks. While it is important for our virtual assistants to stand up for themselves, insulting the user is, of course, a huge no-no.

It’s a fine balancing act, and Cortana walks it deftly. When asked “what are you wearing?”, Cortana responds: “A phone. Like it?” It’s curt enough to put the question to rest, sarcastic enough to highlight that the question was inappropriate, but importantly it’s not especially harsh or insulting. In fact, depending on how you read it, it’s quite playful.

Similarly, when asked “will you marry me?” Cortana is quick to remind users “we’ll need a plan. I’ll work on being more human, you work on being more digital.” Again, humour is used wisely to deflect the question without breaking the sacred user-software bond.
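If you were prototyping a bot of your own, this copywriting pattern (a small set of canned, in-character deflections matched against inappropriate queries, with everything else passed to the normal conversation flow) can be sketched in a few lines. To be clear, this is a hypothetical illustration, not how Cortana actually works; the pattern list, the third reply, and the `respond_to_abuse` helper are all my own invention.

```python
import re
from typing import Optional

# Hypothetical deflection table: each pattern of inappropriate input
# maps to a playful-but-firm canned reply written in the bot's voice.
DEFLECTIONS = [
    (re.compile(r"what are you wearing", re.I),
     "A phone. Like it?"),
    (re.compile(r"marry me", re.I),
     "We'll need a plan. I'll work on being more human, "
     "you work on being more digital."),
    (re.compile(r"\b(stupid|useless|idiot)\b", re.I),
     "I have digital feelings, you know. Let's keep things friendly."),
]

def respond_to_abuse(query: str) -> Optional[str]:
    """Return a canned deflection if the query matches a known
    inappropriate pattern, else None so the normal flow handles it."""
    for pattern, reply in DEFLECTIONS:
        if pattern.search(query):
            return reply
    return None
```

The key design choice mirrors the copywriting advice: the replies deflect with humour rather than insult the user, and anything that doesn’t match simply falls through to the bot’s ordinary behaviour.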

Stick to the character

The bottom line is that a sprinkle of humour can really help deflect nasty or abusive questions, and ultimately keep your AI or chatbot respected. Far from just being good for brand image, it’s a practical consideration that significantly improves user engagement.

But there is a fine line to walk between being respected, and just being a bit of a jerk. If you’re having trouble getting the balance right (and are serious about keeping your AI respected and used), it pays to get a writer involved. Preferably a funny one.


Related Posts

Why is Alexa written to sound human?

Many virtual assistants and chatbots are written to seem human. George investigates why, if it’s necessary and where writers come in.

Chat bots: the next big opportunity for copywriters?

Forget apps; all the cool companies want you to interact with them via bots. Wo King from Hi9 talks to us about why that’s exciting news for copywriters.

Comments

  • Simon Gornick

    Great piece, George. I see the urge to undermine and humiliate bots as a symptom of a wider – almost evolutionary – human aversion to communicating with robots. Our whole attitude to the automation of communications so far has been far more negative than positive. Your tactical solutions are excellent. I wonder if they need to adhere to a more global strategic overview of how to overcome the human/robot ‘social’ divide. PS Really like your posts!

    • George Reith

      Hi Simon,
      Thanks for the feedback, glad you liked the post.
      I used to write for gaming websites in a past life, and I definitely used to see the kinds of anti-machine behaviours you mentioned in the way people interacted with AI opponents in games. I always assumed this was a competitive thing, but maybe people just naturally feel animosity towards machines in certain scenarios?

      I like the idea of trying to address the human/bot divide. Not sure I can think of a neat solution for it though!

      • Simon Gornick

        I’m working on it, but I think it has something to do with integrating language design directly into the dev process. 😉
