Forgive the sensationalist headline – far more tabloid than you might expect of me, but I’ve been asked this question a lot recently, along with “have you tried…?”, “what do you think about…?” and a host of other questions along similar lines.
Since the release of ChatGPT in November last year, everyone seems to be wondering if machines are soon going to be replacing humans. Of course, we’ve been wondering that in various forms for a long time – HG Wells touched on it in The Time Machine in 1895 – and it hasn’t happened yet, so I suspect the end of the world isn’t quite nigh.
I’ve been looking into it recently, so here is my take on the current rise in AI content creation and whether you should use it.
AI has a place
I suppose a good place to start is for me to say that I find technology fascinating (little-known fact: my degree includes a minor in computing), so the quest to create a ‘thinking machine’ is one I can appreciate. While AI content writers might not yet pass a Turing Test (developed to assess whether a machine can display intelligence), they certainly seem to be getting more and more sophisticated. The capability of the developers who have brought it to this point should be applauded – from a purely academic standpoint.
ChatGPT is, of course, still a developmental tool – one born in the world of R&D – so it isn’t yet considered a ‘product’ for commercial exploitation. Every time I go to try it out, it is ‘at capacity’. I was mildly amused that the first thing it did was verify I was a human… There are others, however, which claim to offer the same, or similar, services – Jasper, Grammarly and Creaitor to name a few.
And, if you are short on time and on budget, using AI to generate content for you seems a useful solution, especially when it can give you 500 possible blog topics that might work for you. Fantastic!
I know that web developers are very keen on the AI code development features that are available too – great for solving knotty coding problems and for debugging.
Can AI replace a writer?
This, of course, is the nub of the issue. ChatGPT can write poetry and has written a sonnet in the style of Shakespeare. It’s very creative and interesting, but ‘in the style of…’ isn’t Shakespeare. Part of the glory of Shakespeare is the style and metre and the way it scans. But surely part of it is the sheer glory that a person created those wonderful words. He sweated over them to get them just right. He didn’t go through 100,000 word combinations at a million operations per second until he found a match.
I think it all comes down to what you, as the business owner, want from your content.
Pages of fast copy?
Or something authentically you?
Science? Or Art?
If you are looking for ideas, or a first draft that you can then edit and rework to match your own tone of voice, then it seems a reasonable thing to do. Mind you, Google is already saying it will down-rank content created by AI, so if you want to optimise your website for search engines, it could well work against you. I’d be interested to find out its capacity for differentiation too – if four people ask an AI for an article on the same topic, will they get the same article?
A question of ethics
We now come to the knotty problem of attribution.
When you publish an article, blog, book or web page, you are implying that the content is your own. If everything has been written by a clever robot, do you need to declare that?
You don’t if it is written by a copywriter – although any good writer would have spent time with you to understand your values and perspective, so anything published under your name reflects your opinion. If you review and edit a computer-generated article, is that any different?
To my mind, though, using a professional writer to create something for you feels less ‘transactional’ than using AI. AI-generated content lacks the human touch (literally), and it feels like a deception of your audience.
What about copyright and intellectual property?
I imagine this is an issue that is yet to be fully addressed, but consider this:
You use AI to write you an e-book and prospects can download it free of charge from your website. It creates a relationship which, later, leads to you making a sale. Who is entitled to that revenue? You – most would argue.
Okay – seems fair. What about this one:
You use AI to generate an e-book which prospects download for a small fee. Who owns that revenue?
And what if it isn’t an e-book, but a business book? Or an online course?
Who is entitled to the revenue? Who owns the copyright? And who owns the intellectual property?
A quick chat with a solicitor friend of mine confirmed that this particular ‘copyright time-bomb’ hasn’t been explored but that someone, somewhere is likely to light the fuse.
AI is still a machine
Whichever tool is used, artificially created content is still a creation of technology and not of heart and mind. The developers of ChatGPT admit that it ‘lacks critical analysis’, which means it can draw information from the internet and come up with a wrong answer. And, more worryingly, it lacks moral and ethical judgement. It can be asked to comply with local laws, for example, but won’t necessarily reflect your own values or the current sway of public opinion. The site itself warns you that information may be inaccurate, harmful or biased.
The implications of that worry me a little. I was chatting to someone last week who is using ChatGPT to write the management reports on his team’s performance. He makes notes each day on what they have done and at the end of the month says, “Hey ChatGPT, can you summarise these into a report using a professional business style?”
But he wasn’t checking them…
Which means that if he had a bad day, and one of his team did something stupid that prompted him to write a thoughtless note about them, ChatGPT would simply include that in the report. It wouldn’t think, “That’s a bit harsh”, or leave it out because performance on other days was fine and reason that this poor performance was an isolated incident. Nor would it know that it was a ‘bad day’ and not a fair reflection of the manager’s feelings.
Outcome: a grievance, or someone quitting because they felt undermined, to name but a few possible consequences.
I pointed this out and the possibility hadn’t occurred to him. He will now check the reports before submitting them.
This is just one example but, as more people put their faith in technology, we could see a rise in misinformation and hurtful content.
So, am I for it or against it? And would I use it?
AI is a tool like any other. Tools are useful if used correctly.
If used incorrectly, they can be downright dangerous. If you choose to use AI to generate content, do so with an enquiring mind and with your eyes wide open. Review anything it creates, add your own tone of voice and values, and check the facts it presents. And be aware that, some time in the future, you might need to declare or credit AI content (just as you might for a guest blog), so make a note of which content is your own work and which is the work of a machine.
As to where I’d use it: I have, finally, managed to create a ChatGPT account and I will play with it to find out its capabilities – after all, I love technology and it is really clever. I have tested its ability to come up with a list of blog titles for a specific industry and target market, and I can see myself using it for that.
For actual content though? No – I see writing as an art form and content gold comes from the glory that is human. So, our clients can expect only actual intelligence creating content alchemy for the foreseeable future.
Want to join the discussion about AI content? Add a comment below. And if you want to see how Creative Words can give you real, authentic, bespoke content, contact us for a chat.