Generative AI is here to stay – and it won’t be long before tools like ChatGPT and Google Bard are part of every customer education program.
The good news is that there’s still time to get ahead of the pack and enjoy the benefits of being an early adopter.
You might recall that in advance of the 2023 CedMA Europe Conference, CloudShare and the association jointly surveyed CedMA members on key technologies in customer education.
Only 28 percent of respondents had identified potentially helpful use cases for artificial intelligence, while 3 percent identified themselves as power users. The majority were still kicking the tires and trying to figure out where to start. We can help with that – let’s go over a few best practices for incorporating generative AI into your customer training program.
There’s one critical skill you must develop before you start working with generative AI – prompt engineering. Although ChatGPT and its counterparts can respond to most questions in a reasonably humanlike fashion, that doesn’t mean you can speak to them as though they were human. There’s a science to working with the software, and it starts with knowing exactly what you want it to do.
From there, it’s a matter of describing that task with concise, straightforward language while providing the AI with the necessary context. This could include information about your business, details about your training participants, or an explanation of your specific use case. For example, let’s say your software helps businesses streamline vendor invoicing.
Do you work with small, medium, or enterprise-level organizations? In what industry or sector do the majority of your customers operate? Is your software primarily used by an organization’s sales team, its accounting department, or someone else?
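One common pattern is to keep that standing business context in a reusable template and prepend it to each concise task description. Below is a minimal sketch in Python; the company name, audience details, and task are hypothetical placeholders, not a recommendation of any specific product or API.

```python
# Illustrative only: a simple prompt template that bakes business
# context in before the actual task. All details are hypothetical.

CONTEXT = (
    "You are helping write customer training material for Acme Invoicing, "
    "a SaaS product that streamlines vendor invoicing for mid-sized "
    "manufacturing companies. The primary users are accounting teams."
)

def build_prompt(task: str) -> str:
    """Combine the standing business context with a concise task description."""
    return f"{CONTEXT}\n\nTask: {task}"

prompt = build_prompt(
    "Draft three quiz questions that test whether a new user can "
    "approve a vendor invoice."
)
print(prompt)
```

The same context block can then be reused across every training-related prompt, so the model never has to guess who your customers are or what your software does.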
If you’re interested in learning more, you can check out ChatGPT Prompt Engineering for Training Managers for additional guidance on what a good prompt looks like. Otherwise, let’s move on.
It’s well documented by now that generative AI has a tendency to hallucinate. From inventing scenarios and statistics that don’t exist to claiming ownership of content it did not create, ChatGPT has a tenuous relationship with the truth. The problem is that, for all the hype, generative AI tools are essentially highly sophisticated chatbots.
They don’t know the difference between reality and fiction. To them, it’s all just patterns in data. If they don’t have enough real-world information to fulfill a prompt, they simply extrapolate something plausible from those patterns.
In other words, don’t count on generative AI to provide you with factual information and don’t accept anything it tells you at face value.
Generative AI uses something known as a large language model. Without getting too technical, it’s basically a collection of layered machine-learning algorithms that work together similarly to neurons in the human brain. These algorithms work together to help the model eventually learn not only the semantic connections between different concepts but also how to express those connections in human language.
Doing this requires a massive volume of data – and we do mean massive. OpenAI’s GPT-3, for instance, was trained on over 570 gigabytes of text. For context, The Complete Works of Shakespeare takes up about five megabytes.
Unfortunately, there’s been some recent controversy over how and where OpenAI obtained those 570 gigabytes. Thus far, the company has been accused of breaching the GDPR, accused of copyright infringement by everyone from Lana Del Rey to The New York Times, and sued for scraping medical records and personal information.
If you plan to train your own large language model, you need to make sure you know where your training data originates. You must also incorporate data privacy controls such as anonymization and security measures such as strong encryption and access limitations. Finally, if you must use personal data to train the model, do so only with consent.
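To make the anonymization step concrete, here is a minimal sketch of pseudonymizing records before they enter a training corpus. It is an illustration only – a real pipeline would also need PII detection, salt rotation, access controls, and audit logging – and the field names and record are invented for the example.

```python
# A minimal pseudonymization sketch: replace PII fields with salted,
# one-way hashes so records can be used without exposing identities.
import hashlib

def pseudonymize(record: dict, pii_fields: set, salt: str) -> dict:
    """Return a copy of the record with PII values hashed, other fields kept."""
    cleaned = {}
    for key, value in record.items():
        if key in pii_fields:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            cleaned[key] = digest[:12]  # short, stable token; not reversible
        else:
            cleaned[key] = value
    return cleaned

record = {"name": "Jane Doe", "email": "jane@example.com", "ticket": "Login fails"}
cleaned = pseudonymize(record, {"name", "email"}, salt="rotate-me")
print(cleaned)
```

Because the hash is salted and one-way, the same person maps to the same token across records (useful for training) without the raw name or email ever entering the dataset.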
One unfortunate side effect of all the hype surrounding ChatGPT has been a significant uptick in AI-generated content. Content farms, always keen to prioritize quantity over quality, were all too eager to start using – and then eventually overusing – generative AI. In a recent investigation, news and information ratings website NewsGuard identified 49 websites whose content was entirely written by AI.
The problem is that AI doesn’t create content on its own. It iterates on and summarizes content from other sources. Ironically, it also doesn’t have the best grasp of conversational language. Text written by AI is often unnecessarily complex and verbose, with incorrect wording and unusual phrasing.
If you decide to use ChatGPT or a similar platform to generate any content for your customer training, proofread and rephrase the output so it actually sounds like it was written by a human being.
Generative AI may be the only thing anyone is currently talking about, but it actually represents only a fraction of artificial intelligence as a whole. Don’t get us wrong, it’s capable of doing a lot of different things. But it’s important to remember that there’s much more to AI than ChatGPT.
There’s still a place for traditional AI in your training, for tasks like analytics, sentiment analysis, data orchestration, and workflow automation.
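Sentiment analysis is a good example of how simple some of these traditional techniques can be. The toy lexicon-based scorer below is purely illustrative – a production system would use a trained model rather than a hand-made word list – but it shows the kind of deterministic analysis that needs no generative model at all.

```python
# A toy lexicon-based sentiment scorer for training feedback.
# Illustrative only: the word lists are tiny, invented examples.
POSITIVE = {"clear", "helpful", "great", "easy", "love"}
NEGATIVE = {"confusing", "slow", "broken", "hard", "hate"}

def sentiment(feedback: str) -> str:
    """Classify feedback as positive, negative, or neutral by word counts."""
    words = feedback.lower().replace(".", "").replace(",", "").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("The labs were clear and easy to follow"))  # positive
print(sentiment("The setup was confusing and slow"))        # negative
```

Running feedback from course surveys through even a basic classifier like this can flag which modules frustrate learners, without sending any customer data to a generative model.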
The best way to leverage generative AI for any initiative – not just customer education – is through small-scale, iterative projects. This approach allows you and your team to learn more about how AI fits into your training. It also allows you to gradually build up a strong foundation for your customer education initiative, one built on smart decisions, knowledge, and continuous improvement.
Before you do anything, however, you should determine how you intend to incorporate AI into your training. Circling back around to the survey we mentioned earlier, the majority of participants felt that content creation and operations were the two use cases with the biggest impact. Customer support, analytics, and assessments were also mentioned.
While there’s a chance that your organization is one of the rare few with the necessary in-house expertise, it’s far likelier that you lack anyone experienced in data science or artificial intelligence. Given that there’s still a considerable data science talent gap, finding a qualified employee is far easier said than done. That’s the bad news.
The good news is that you don’t necessarily have to hire someone. There are plenty of vendors and consultants with knowledge of both data science and generative AI. Working with one of those organizations is likely a better option for most companies.
The past year has been huge for generative AI. The biggest change is that ChatGPT – once the only major player in the space – now has to contend with competitors from the likes of Google, Microsoft, and even Meta. Unfortunately, as the technology continued to grow in popularity, it ran headfirst into multiple legal, ethical, and security concerns.
How do we ensure generative AI respects privacy? What can we do about biased data? Is there a surefire way to stop AI from derailing itself with hallucinations?
These questions all remain largely unanswered – but you can bet that the moment someone comes up with a concrete answer to any of them, that solution will see widespread adoption. That’s why if you intend to leverage generative AI in any capacity, you need to at least be aware of what’s going on in the industry. Keep yourself informed about the latest advancements, trends, and regulatory changes and how they might impact your customer training initiatives.
If you believe the hype, generative AI is basically a silver bullet, a revolutionary technology with the potential to change everything about how we do business. As always, the truth is a bit more grounded. While ChatGPT is certainly disruptive, it’s far from perfect.
There are certain things it can’t do. We’ve already mentioned that it can’t distinguish fact from fiction and that it doesn’t truly create original content. Generative AI is also incapable of making moral or ethical judgments, practicing empathy, or understanding context.
Perhaps most importantly, it’s incapable of training your customers. Without your guidance, it doesn’t actually know anything about your company, your industry, or your software. It’s just very good at pretending it does.
Learn more about how generative AI can empower customer training programs in our eBook Harnessing Generative AI for Effective Customer Software Training, then read 12 Ways to Propel Productivity in Customer Education With AI to learn about some of the best AI use cases.
We also recommend watching our webinar AI: The New Imperative for Advancing Customer Training, co-hosted with Kristine Kukich, Senior Director of Customer Marketing at Thought Industries.