Everywhere you look these days, you encounter artificial intelligence. From personal electronic devices, through online shopping, to bots on social media, AI has woven itself into our lives. We now trust it to recommend local eateries, theater shows, or films to watch, and then even allow it to guide us home afterward.
Too Much Information…
We live in an age of information overload. The sheer volume of content delivered to us through traditional and electronic media means we are constantly connected to it. In fact, it has become increasingly difficult to disconnect at all, given the constant bombardment of notifications, both social and work-related.
And someone has to write all of this content. I mean, it can’t just generate itself. Or can it? With the current demand for words, human authors are being pushed to their breaking points. People want their content to be fresh, and they want it to be relevant. So where can content producers go to satisfy their readers’ lust for material?
To answer this question, we need to look at the state of play of AI. Currently, not only can AI tirelessly generate content by monitoring how it is consumed (by whom, where, and how readers react and behave), it can also learn from that data, feed it back into the creative process, and become stronger and more focused. Add to this equation the fact that it never asks for a raise, gets drunk at the Christmas party, expects holidays, or needs to visit the bathroom, and you can see why people are finding it so attractive.
I Can Spot a Phony!
There are plenty of wise guys and gals who claim they can tell the difference between human- and computer-generated content. They say the lack of wordplay, rhetoric, and feel for the language is obvious. So, do they have a point? Let’s look at how it works to get a better idea of the processes behind it, especially the use of Natural Language Processing (NLP): using computational techniques to analyze and synthesize natural language and speech.
The Holy Trinity of NLP
You can divide NLP into three types: inquiry, conversational, and reasoning.
Inquiry NLP uses text-analytic tools to detect when someone is requesting information. The cue can range from question words, such as “who, what, when, where, why, how, is, can, does, do”, to the use of imperatives, e.g., “Show me a list of Indian restaurants.” With text-analytic tools it is possible to break the sentence into parts (subject, verb, object, manner, and place) to understand the nature of the question, cross-reference them against an ontology, for example schema.org, and then reply accordingly.
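As a toy illustration, the first step of spotting an inquiry at all can be sketched with simple keyword rules in Python. Real systems use full parsers and ontologies such as schema.org; the word lists and function below are purely illustrative:

```python
# Minimal rule-based inquiry detection (illustrative only; production
# systems parse the full sentence and consult an ontology).
QUESTION_WORDS = {"who", "what", "when", "where", "why",
                  "how", "is", "can", "does", "do"}
IMPERATIVE_VERBS = {"show", "list", "find", "give", "tell"}

def is_inquiry(sentence: str) -> bool:
    """Flag a sentence as a request for information by its first word."""
    first = sentence.strip().rstrip("?.!").split()[0].lower()
    return first in QUESTION_WORDS or first in IMPERATIVE_VERBS

print(is_inquiry("Show me a list of Indian restaurants."))  # True
print(is_inquiry("I had lunch at noon."))                   # False
```

Of course, first-word matching misses many real inquiries; it only shows where the breaking-into-parts process starts.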
Conversational NLP (or Natural Language Understanding, NLU) uses techniques to refine inquiries by engaging the questioner in conversation to clarify any uncertainties. The replies add further context. Examples of this are IBM Watson, Amazon Echo, and Siri, which use the ideas of necessity and sufficiency.
Necessity—that is, factors that have to be true in order for something else to be true: if P, then Q. Here is an example from Wikipedia:
“For it to be true that "John is a bachelor", it is necessary that it be also true that he is unmarried, male, and an adult, since to state "John is a bachelor" implies John has each of those three additional predicates.”
Sufficiency—that is, the information provided is adequate grounds to conclude that something else is true: P implies Q. Again, an example from the same Wikipedia page:
“Stating that "John is a bachelor" implies that John is male. So knowing that it is true that John is a bachelor is sufficient to know that he is a male.”
In practice, consider Siri, Apple’s flagship AI assistant. If I say, “Hey Siri, call Andrew”, Siri answers, “I have found the following Andrews in your address book. Which one would you like me to call?” By asking additional questions, Siri refines the nature of the request. Then, by applying necessity and sufficiency, Siri can act on more accurate assumptions.
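That clarifying question can be sketched as a simple contact-resolution routine. The function and address book below are hypothetical (a real assistant would speak the follow-up question aloud), but they show the branch between “sufficient information: act” and “insufficient: ask”:

```python
def resolve_contact(name, address_book):
    """Return one contact if unambiguous, else the candidates to ask about."""
    matches = [c for c in address_book if name.lower() in c.lower()]
    if len(matches) == 1:
        return matches[0]  # sufficient information: just place the call
    return matches         # ambiguous (or empty): ask a clarifying question

book = ["Andrew Smith", "Andrew Jones", "Maria Lopez"]
print(resolve_contact("Maria", book))   # 'Maria Lopez'
print(resolve_contact("Andrew", book))  # ['Andrew Smith', 'Andrew Jones']
```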
Reasoning NLP is the self-learning form of Natural Language Processing, as in the MIT project Open Mind Common Sense (OMCS). Here the AI must understand more than just the physical facts of existence, such as mass, dimensional attributes, and mechanics, or linguistic devices, such as syntax, semantics, and pragmatics; it must also deal with more abstract concepts such as culture, belief, and emotion. These are the very things that separate us from machines. Think of the Voight-Kampff test in Philip K. Dick’s novel “Do Androids Dream of Electric Sheep?”, where androids are identified by asking questions designed to provoke empathy and monitoring the responses. Navigating cultural dilemmas and applying empathy rather than pure logic are the greatest steps towards generating replies that feel truly “human”.
Give Me More Content
Gartner estimates that 20% of business content will be authored by machines by 2018. The data crunching needed to generate items such as reports, as well as less-creative content like financial and legal documents, can be automated through analysis. But even AI tools like Wordsmith aren’t fully automated: Wordsmith still requires you to add your data, write your template, preview your stories, and then publish them.
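Wordsmith’s own template format isn’t reproduced here, but the general “you bring the data and the template, the machine fills the slots” workflow can be sketched in a few lines of Python. The template text and field names are invented for illustration:

```python
# Hypothetical slot-filling template in the spirit of robot journalism:
# a human writes the template once, the machine merges in each data row.
TEMPLATE = ("{company} reported revenue of ${revenue}M in {quarter}, "
            "{direction} {change}% on the previous quarter.")

def render_story(row):
    """Fill the template from one data row, deriving the 'direction' slot."""
    data = dict(row)
    data["direction"] = "up" if data["change"] >= 0 else "down"
    data["change"] = abs(data["change"])
    return TEMPLATE.format(**data)

print(render_story({"company": "Acme", "revenue": 12.4,
                    "quarter": "Q3", "change": -3.0}))
# Acme reported revenue of $12.4M in Q3, down 3.0% on the previous quarter.
```

Note how every creative decision, from the phrasing to the derived “up/down” wording, was made by the human template author; the machine only merges data.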
So, although it’s true that a lot of news published today is machine generated, machines aren’t able to understand things such as consistency of voice or notice the sentiment between the lines. These nuances are either absent from the text or, at worst, forced. And in the case of the news, these are often stories about real people, with real emotional impact. Stripping away these elements robs news items of the very humanity they need to breathe.
Why all the Fuss?
One area where artificial intelligence comes into its own is analyzing consumer behavior: it can monitor these factors in near real time, at a level unachievable by humans. Tools such as A/B testing can therefore be automated, with the results implemented on the spot, based on the AI’s analysis, to optimize layouts and page elements. By applying best practices and trends, developers can also influence web design from the outset, taking much of the guesswork out of UX design. The same is true of content: building content around template-based personas and segmentation makes it possible to present the same, or completely different, text depending on the profile of the visitor.
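One common way to close that A/B loop automatically is an epsilon-greedy policy: mostly serve the best-converting variant, but keep exploring occasionally. This is a generic sketch, not any particular vendor’s implementation, and the variant names and numbers are invented:

```python
import random

def choose_variant(stats, epsilon=0.1):
    """Pick a layout variant. `stats` maps name -> (conversions, impressions)."""
    if random.random() < epsilon:
        return random.choice(list(stats))  # explore: try a random variant
    # exploit: serve the variant with the best observed conversion rate
    return max(stats, key=lambda v: stats[v][0] / max(stats[v][1], 1))

stats = {"layout_a": (30, 1000),   # 3.0% conversion
         "layout_b": (45, 1000)}   # 4.5% conversion
print(choose_variant(stats, epsilon=0.0))  # 'layout_b'
```

As new impressions and conversions stream in, the stats update and the “winning” layout can change there and then, with no human rerunning the test.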
Welcome to the Machine
So, should you be thinking of firing all of your copywriters and relying on AI? Depending on the creative demands of your content, probably not. And should your copywriters be watching out for an excessive whiff of engine oil in the air? Again, no: creative NLP is still too reliant on the human touch, those subtle nuances that connect with the reader, which only human writers can reliably achieve. After all, generating loads of content to meet demand is one thing, but connecting with the heart of the reader is a very different matter.
Have you read something that was blatantly machine-made? What gave it away? Maybe you are using some form of AI in your company; how much of the work does it just get on with by itself? I would love to hear your comments on this subject.