Looking (ChatGPT) 4-Ward

How ChatGPT-4 changes the customer conversation.

Earlier this year, OpenAI announced a new iteration of its viral AI (artificial intelligence) model: ChatGPT-4. When ChatGPT initially launched, in November 2022, it was immediately hailed as “game-changing,” and was valued at $14 billion as recently as February.

While not everyone has access to the new version yet, and it likely won’t be available without a subscription, ChatGPT fans and foes alike have been keeping a close eye on its new capabilities, particularly those in the customer experience (CX) realm.

This new version brings features like visual input alongside text input and the potential for more distinct personalities. However, with the previous version of ChatGPT already raising questions about how AI will impact our daily lives and workforce, the leaps and bounds made by ChatGPT-4 will only raise more questions and potential concerns, yet also more excitement for the possibilities ahead.

In our fast-paced digital world, such large improvements between iterations no longer just placate consumers and business users; they lead them to clamor for what’s next, partly because of how impressive ChatGPT-4 is compared to the version from just a few months ago.

As soon as we understand the new capabilities of ChatGPT-4, we’ll immediately start to think of what else is possible. This means that discussions surrounding ChatGPT-4 won’t just have a note of “what’s next,” but rather will have people asking, “what’s possible?”

So, What’s New?

To reach that part of the conversation with all of the necessary information (or at least to make an educated guess about what comes next), let’s first discuss what exactly is different in this new version compared to its predecessors.

First and foremost, ChatGPT-4 is now capable of visual input. For example, instead of typing the question “What can you make with flour, eggs, and milk?”, you can upload a picture of the ingredients; ChatGPT-4 will recognize the three items present and give you a list of recipes you can make with them.

This new capability has powerful customer service benefits: a consumer or a user (such as a remote worker) could upload a picture of faulty equipment, e.g., a router, and ask, “What’s wrong with my internet?”, getting help when they have questions but don’t know where to start.
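To make that concrete, here is a minimal sketch of what such an image-based support request might look like, assuming the OpenAI Python SDK’s chat completions endpoint and an image-capable GPT-4-class model; the model name, image URL, and prompt wording are illustrative placeholders rather than anything prescribed here.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical customer photo of a router; the URL is a placeholder.
image_url = "https://example.com/uploads/router-photo.jpg"

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder for an image-capable GPT-4-class model
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": (
                        "This is a photo of my home router. "
                        "What might be wrong with my internet, and what should I try first?"
                    ),
                },
                # The image is sent alongside the text question in one message.
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```

In a real help desk, that reply would flow back into the chat window, with escalation to a human agent when the model is unsure.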

This new update launches ChatGPT into the world of multimodality, enhancing the consumer/user experience and introducing a new world of potential. Not only will the visual element change how users interact with ChatGPT, but the new version will also assist app developers who use ChatGPT’s capabilities to augment their systems.

Be My Eyes, an app that previously let visually impaired users ask human volunteers what their phone cameras are seeing, now uses ChatGPT-4’s capabilities to describe those scenes without the need for a human on the other end.

Because so many apps have a visual component, this multimodal approach to AI will be an instantly useful tool for developers everywhere.

Additionally, ChatGPT-4 has what are called longer context capabilities. In text generation, longer context, or the “context window,” refers to the amount of text an AI takes into account before generating an answer. This includes the text within a conversation, so the AI will be able to “remember” longer and longer conversations for the entire duration of the exchange.

This feature will keep customers from having to input the same information over and over, because the chatbot will simply remember details from when they were initially explained.

ChatGPT-4 is capable of remembering eight times more than its previous iterations. This allows users to point ChatGPT to a data source during a conversation, such as a news story, medical journal article, academic text, analyst research study, or blog, and have it use that source to inform its answers.

This capacity for deep memory, coupled with the ability to interpret conversations, may allow a customer service chatbot to draw from blogs published by the organization it serves to better explain concepts to a confused customer, in a way that still aligns with the organization’s thoughts and values.
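As a rough illustration of how that grounding might work, the sketch below places a published help-center article into the prompt and keeps the running conversation in memory so details don’t have to be repeated. It again assumes the OpenAI Python SDK; the file name, model name, and sample questions are hypothetical.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical help-center article published by the organization.
with open("billing_faq.txt", encoding="utf-8") as f:
    reference_text = f.read()

# Keeping the whole exchange in one message list keeps earlier details
# inside the model's context window for follow-up questions.
history = [
    {
        "role": "system",
        "content": (
            "You are a customer support assistant. Answer using only the "
            "reference text below, in the organization's tone.\n\n"
            "REFERENCE:\n" + reference_text
        ),
    },
    {
        "role": "user",
        "content": "I was charged twice this month. What does the billing policy say?",
    },
]

response = client.chat.completions.create(
    model="gpt-4",  # placeholder; a larger context window lets the whole document fit
    messages=history,
)

answer = response.choices[0].message.content
history.append({"role": "assistant", "content": answer})  # remembered for follow-ups
print(answer)
```

The design choice here is simply that the larger context window makes it practical to carry both the reference document and the full conversation in a single prompt.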

Finally, ChatGPT-4 is producing much more impressive results than its predecessor. Where ChatGPT-3.5 scored near the bottom of test takers on the bar exam, ChatGPT-4 passed it with a score around the 90th percentile nationally. On the exam’s multiple-choice section, the Multistate Bar Exam, ChatGPT-4 answered 75% of the questions correctly, compared to the human average of 68%.

Moreover, in the Biology Olympiad, ChatGPT-3.5 scored in the 31st percentile, while ChatGPT-4 scored in the 99th. This is more than the incremental improvement we’re used to seeing between new generations of technology: a jump from the 31st to the 50th percentile, for instance, would have been impressive. But 31st to 99th? Unbelievable. With this kind of “intelligence,” gone are the days of the “dumb” chatbot that can’t really understand or help in any way.

What’s Next?

The first word that comes to mind is “bigger.” If there was such a large jump in capability just between iterations 3.5 and 4, and the full capacity of AI is still unknown, then the next versions of ChatGPT are limited only by what imagination, and data, can accomplish.

As ChatGPT continues to grow in both popularity and aptitude, the amount of data needed to train further versions will grow exponentially alongside competence.

With more data being generated each year than the year before, at a truly inconceivable rate and volume, OpenAI will likely be limited only by what data it can access, not by what data is actually out there.

With questions already being raised about privacy and the normalization of personal data usage, this conversation will only be amplified. Particularly given the comprehension gap between generations, there will need to be frank and open conversations about what data is going where, and why.

Earlier, we mentioned how ChatGPT-4 enabled Be My Eyes to operate on a whole new level. This won’t be true for just a few scattered apps; rather, the strides made by ChatGPT will allow many more apps to use it however they need to augment their user experience.

The popular language-learning app Duolingo has already taken advantage of this by launching Duolingo Max, which lets subscribers further their learning through features like roleplay conversations enabled by ChatGPT-4.

Organizations across all industries will be able to apply this to their business. For instance, telecoms could build an industry-specific customer help app that’s able to instantly leverage customer data for individualized CX.

Thanks to ChatGPT-4, those former “chatbot” type interactions won’t feel bot-like at all. This could improve CX while freeing human employees to deal with complex issues that demand more time and energy.

Importantly, it won’t take a coding expert to apply this new technology to your own applications or websites.

We’re currently diving headfirst into a low-code world; Gartner estimates that low-code application building will be used to develop 65% of applications by next year. OpenAI Codex, a close relative of the models behind ChatGPT, is a driving force in this transformation: it can turn natural language prompts into code suggestions, with no prior coding knowledge required.
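As a simple illustration of that prompt-to-code idea, the sketch below sends a plain-language feature request to a GPT-4-class model through the OpenAI Python SDK and prints whatever code it proposes; the model name and prompt are placeholders, and the output would still need human review before use.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A plain-language description of the feature we want; no code knowledge required.
prompt = (
    "Write a small HTML contact form with name, email, and message fields "
    "that submits a POST request to /api/contact."
)

response = client.chat.completions.create(
    model="gpt-4",  # placeholder for a code-capable model
    messages=[{"role": "user", "content": prompt}],
)

# The model returns code as text; a person should review it before shipping.
print(response.choices[0].message.content)
```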

While this capability is still basic, ChatGPT-4 has also introduced the ability to turn visual prompts into code suggestions, lighting the path to a more accessible future. Soon, software development will be less about coding mastery and more about creative mastery.

One Word: Accessibility

Some of the fear around the latest developments in ChatGPT-4 is valid, particularly when it comes to ethical concerns ranging from hate speech to plagiarism, or even the debate surrounding “AI rights.” However, it’s important to remember that ChatGPT was not developed to take over the world.

Rather, ChatGPT and other AI tools of its ilk were developed to help everyone better their lives. Usage can be as simple as a bored student learning how to say “hello” in every language or as complex as a developer’s aid in building an entirely new application. Telecom organizations, for example, can use this AI to build a better chatbot or simply to sift through customer data more effectively and serve their customer base.

In all of these situations, however, it’s important to focus on the human aspect; none of this would be possible without the humans behind the keyboard creating the AI in the first place. And it wouldn’t be successful without other humans to fact-check and shape what the AI produces.

Given how rapidly ChatGPT has developed in just the short time it’s been publicly available, we can only expect it to become more sophisticated, and more prevalent, in the coming years.

Think of it this way: when humans first made fire, some people ran away from it in fear. Those who stayed to work with and control that scary fire thrived and grew.

If ChatGPT is going to exist in our world (and let’s be honest, it certainly will), then the best course of action is to learn, implement, and shape its capabilities to best suit the needs of you and your organization. That is, after all, what it was developed for in the first place: as a tool for growth, not to burn your house down.

Eric Carrasquilla

Eric Carrasquilla is President of Customer Experience at CSG, where he oversees product, sales and services for the business. Under his leadership, CSG is accelerating the growth of its Customer Experience practice into core vertical and international markets while driving new product innovation that helps global brands wow their customers.
