Is ChatGPT sabotaging your collections strategy?
Key Takeaways
- Horizontal generative AI tools like ChatGPT and Gemini promise efficiency in collections—but fail to connect with customers emotionally.
- Over-aggressive or tone-deaf AI messages risk overwhelming past-due customers, eroding trust and engagement.
- Without behavioral science, generative AI tools can quickly turn into a liability in delinquency management.
Horizontal generative AI tools like ChatGPT are becoming increasingly popular in collections. But while they promise efficiency, they can also backfire—eroding customer trust and engagement instead of enhancing them.
The hidden danger of generative AI in debt recovery
Generative AI's promise of efficiency comes with a critical blind spot: its inability to connect with past-due customers on a human level. It's true that ChatGPT, Gemini, and similar tools can mimic tone and craft polished messages. However, they lack the contextual understanding and behavioral science knowledge necessary to effectively connect with delinquent customers.
The real risk
Horizontal generative AI can amplify poor strategies—flooding past-due customers with an overwhelming volume of ineffective messages. Instead of motivating repayment action, this kind of communication risks frustrating customers and driving them further away.
Let's break down how these pitfalls show up in real-life examples—and the impact they can have on your collections strategy.
Real-life AI messaging mistakes you can't afford to make
Generic AI messaging often misses the mark, failing to engage past-due customers or even driving them away. Here's what that can look like in action:
Overly aggressive tone:
"Your payment is overdue. Failure to act immediately may result in serious consequences. Click here to pay now."
This message may pressure customers, but it lacks empathy, making recipients feel threatened rather than supported.
Generic message without personalization:
"We noticed your account is overdue. Please pay as soon as possible to avoid penalties."
With no acknowledgement of individual circumstances, this could alienate past-due customers, who are already under financial stress.
Overload of irrelevant messages:
"Don't miss this opportunity to resolve your account today!"
Repeated, impersonal prompts from generative AI can overwhelm customers, pushing them into avoidance rather than action.
What's at stake when AI strategies miss the mark
These missteps highlight the challenges of using horizontal generative AI in collections. So what's the real risk?
It lies in using generative AI without understanding how to align behavioral science tactics with each audience segment.
For instance, imagine a past-due customer receiving multiple overly generic payment reminders in quick succession. Because those messages ignore the customer's financial stress and emotional state, they can easily overwhelm the recipient, leading to disengagement instead of repayment. Behavioral scientists call this the "ostrich effect": individuals avoid dealing with uncomfortable situations, like unpaid bills, by ignoring communication altogether.
Generic AI tools can also inadvertently deliver tone-deaf messaging. A well-meaning email about resolving debt during the holidays, for example, might come across as insensitive if it doesn't acknowledge the customer's financial pressures during that high-stakes period. Far from inspiring repayment action, such heavy-handed tactics erode trust and damage customer relationships.
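To make this concrete, here is a minimal, purely illustrative sketch of the kind of behavioral guardrails a general-purpose model skips by default. The segment names, weekly caps, and holiday window below are assumptions for illustration, not anyone's actual policy.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical guardrail sketch: segment labels, caps, and the holiday
# window are illustrative assumptions, not a real rule set.

@dataclass
class Customer:
    segment: str              # e.g. "financially_stressed", "forgetful_payer"
    last_contacted: date
    reminders_this_week: int

def should_send_reminder(customer: Customer, today: date) -> bool:
    """Throttle outreach so generic reminders don't pile up in quick succession."""
    # Tighter weekly cap for stressed customers to avoid triggering avoidance.
    weekly_cap = 1 if customer.segment == "financially_stressed" else 3
    if customer.reminders_this_week >= weekly_cap:
        return False
    # Enforce a minimum gap between messages.
    return today - customer.last_contacted >= timedelta(days=2)

def pick_tone(customer: Customer, today: date) -> str:
    """Choose a tone that acknowledges context instead of defaulting to urgency."""
    # Assumed end-of-year window where pressure-heavy copy is avoided.
    in_holiday_window = today.month == 12 and today.day >= 15
    if customer.segment == "financially_stressed" or in_holiday_window:
        return "supportive"    # empathy-first framing, flexible-payment options
    return "neutral_reminder"  # plain, non-threatening nudge

if __name__ == "__main__":
    c = Customer(segment="financially_stressed",
                 last_contacted=date(2024, 12, 16),
                 reminders_this_week=1)
    today = date(2024, 12, 18)
    print(should_send_reminder(c, today), pick_tone(c, today))
```

Even a simple guard like this encodes customer context that a horizontal generative AI tool has no way of knowing on its own.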
Going it alone with ChatGPT is a losing strategy
Without the expertise to align AI-driven strategies with human behavior, generic generative AI can quickly become a liability, escalating problems instead of solving them.
These limitations of horizontal AI are only the beginning. In the next blog, we'll dive deeper into why generic AI needs human insights to navigate the complexities of debt recovery effectively.
Frequently Asked Questions
What is horizontal AI?
Horizontal AI refers to general-purpose AI tools like ChatGPT, Gemini, and Claude that are designed to handle a wide range of tasks across different industries. Unlike vertical AI solutions built for specific industries, horizontal AI lacks specialized knowledge of domains like debt collection, behavioral science, or regulatory compliance.
What are horizontal generative AI tools?
Horizontal generative AI tools are broad-purpose AI systems that can generate text, images, and other content. Examples include ChatGPT (OpenAI), Gemini (Google), and Claude (Anthropic). While powerful for general tasks, they lack the specialized training needed for sensitive applications like debt collection where empathy, compliance, and behavioral understanding are critical.
Why is ChatGPT risky for collections?
ChatGPT and similar tools pose risks in collections because they:
- Lack understanding of the behavioral science principles that drive payment behavior
- Can generate tone-deaf or aggressive messaging that damages customer relationships
- Don't account for regulatory requirements like the FDCPA and TCPA
- Cannot adapt messaging to individual customer circumstances and emotional states