The pitfalls of AI chatbots in customer service
// This is a machine-generated transcript. Please report any transcription errors to will-help@illinois.edu. [00:00:00] Brian Mackey: This is the 21st Show. I'm Brian Mackey. When was the last time you had to speak to a company's customer service over the phone? Did it sound a bit robotic? Maybe it asked you to tell them, in a few words, what you are calling about. Customer service seems to increasingly be handled by AI chatbots, which can bombard us with unrelated questions, fail to understand what we're saying, and really just wear us down as we try to get a hold of a human being, human, person, agent, please, who can actually solve our problems, or so we hope. It turns out those chatbots can come with security issues for the companies that use them and the people who reach out to them. Take Sears, the once-thriving department store chain. Even after a major decline that led to a bankruptcy filing in 2022, it's still a going concern. There are five stores across the U.S. None of them are in Illinois, but the headquarters still is. Sears also still has a home services business, which provides appliance repairs, and that, for customer service, relies on AI and virtual assistants. We know this in part because the company left a database of millions of interactions people had with its bots accessible to the public. That database included chat logs and audio recordings, some of which go on for hours. The man who discovered those files is cybersecurity researcher Jeremiah Fowler. He joins me now from Krakow, Poland, as we talk about the role chatbots play in customer service and the problems people have with them. Jeremiah, welcome to the 21st Show. Thanks for being with us. [00:01:49] Jeremiah Fowler: Hello, thank you. Yeah, it's good to, good to be with you today. [00:01:52] Brian Mackey: Also with us is Andy Jeon. He's an assistant professor of marketing at Northern Illinois University in DeKalb, and he researches chatbots as part of his work.
Andy, welcome to you as well. And listeners, you can join the conversation for the rest of the hour by giving us a call at 800-222-9455. Maybe you've had to deal with a customer service chatbot. What was your experience like? And how do you feel about the possibility of these chatbots fully replacing human workers in customer service altogether? The number again is 800-222-9455. That's 800-222-9455. All right. Andy Jeon, I'm gonna start with you. Talk about the uptake of these customer service chatbots in the past few years. How are we seeing companies use them? Andy, are you with us? All right, it seems we may be having some connection issues with Andy Jeon. So, Jeremiah, let me, let me come to you. What, what do you understand about why corporations like this technology? [00:03:01] Jeremiah Fowler: I think at the end of the day, it, it, you know, they're saving money. You don't have to pay a chatbot. It's kind of a one-time investment and then minor upkeep. So I think at the end of the day, that's, that's it. It's a revenue-based decision, no matter how convenient or inconvenient it is for the actual end user. [00:03:21] Brian Mackey: Yeah. Talk, talk about where privacy risks come into this. [00:03:26] Jeremiah Fowler: As a cybersecurity researcher, I've been doing this for about 15 years, and I've seen an explosion in data breaches involving artificial intelligence, because we all think that AI is just this magic. You give it information and it goes off in the cloud somewhere, but that data has to be stored somewhere, and if the storage of that data is not secure, it's a huge privacy and security risk. [00:03:53] Brian Mackey: Talk about how these programs work. And, you know, as I sometimes say, maybe explain it to me like I'm a smart teenager. [00:04:01] Jeremiah Fowler: So many of them work in different ways. Some of them can be very complex and, and use generative AI, or they can use, uh, an if-this-then-that system.
So for example, if the customer says, "I need help," OK, what do you need help with? And then there's, you know, kind of a script that it can follow. [00:04:23] Brian Mackey: So I guess this is where if you say the word "hours," then it'll direct you to a message that has the store hours. And if you say, you know, "agent" or something, maybe it connects you with a human. I guess that's the if-this-then-that sort of system. [00:04:37] Jeremiah Fowler: Yeah, these days getting connected with a human seems more and more difficult. But yeah, that's correct, to look at it that way. They're basically prompted scripts: when the customer or the user requests a certain feature or information, then that script is automatically generated, whether as text in the chatbot or as audio from a virtual assistant. [00:05:02] Brian Mackey: Hm. All right, so I think we have Andy Jeon with us. Uh, apologies for the technical difficulties there. Talk, talk about the uptake of these customer service chatbots in the past few years. Why are we seeing companies increasingly adopting these? [00:05:17] Andy Jeon: So again, I mean, it's already been said, but you know, companies try to replace human labor with a chatbot because a chatbot is much faster and much more efficient and definitely much cheaper than hiring human labor. And also in terms of the well-being of the human labor, I mean, usually when, you know, live agents are dealing with difficult customers, they're emotionally exhausted. So there are a lot of, you know, issues regarding the well-being of the human labor, you know, working in the call center or, you know, working as a live agent. So there are a lot of issues addressed. So companies just turn to, you know, chatbots. They can, you know, easily solve those issues. [00:05:58] Brian Mackey: That's interesting.
The, the labor side of it is sort of being, I actually knew somebody who was a medical professional in a town that had a big AT&T call center back in the day, and said that a lot of the patients from the AT&T place, you know, not, not giving any identifying details, but they were pretty depressed. It was a really tough job. [00:06:16] Andy Jeon: Mhm. Exactly. I mean, I think, you know, anyone has some experience of, you know, being, like, frustrated or even angry about the product of a brand, so they just like to talk to the, you know, agent. Just, just, you know, to get it out and then just talk about it. [00:06:35] Brian Mackey: Yeah, yeah, absolutely. All right, we need to take a break on the program. Your call is very important to us, but we're gonna pick it up after this. You're listening to the 21st Show. We're talking about chatbots. Stay with us. I don't even wanna think about how much of my life I've spent listening to this song, the Cisco standard, the Cisco phone system standard hold music. It's the 21st Show. I'm Brian Mackey. We're talking about AI chatbots and customer service. Corporations seem to love them. Customers, not so much. We asked about this in our listener texting group, which you can join by sending the word TALK to 217-803-0730. Janice in Crawford County told us, "I don't mind using chatbots, and they do oftentimes help and can solve a request or problem. If and when I get frustrated, I just tell the chatbot and ask to speak to a human customer service representative. Only once have I been referred to another chatbot." Terry in Batavia said, "I don't know which is worse, an offshore customer service rep with an accent who's having as much trouble understanding me as I'm having understanding him or her, or an AI chatbot that understands nothing." But Jay in Elburn said simply, "I deal with the 21st show bot several times a week." All right, thanks for those messages, Jay.
I, I promise you we have real human people who read every text message that comes in. We're talking about people's experiences with chatbots, as well as a leak that left millions of conversations people had with Sears chatbots accessible on the open web. My guests are Jeremiah Fowler, who discovered the Sears leak, and Andy Jeon, an assistant professor of marketing at Northern Illinois University whose work includes researching chatbots. You can join us live at 800-222-9455. All right, Jeremiah, say more than I did in the introduction about what was in those Sears files, broadly speaking. [00:08:58] Jeremiah Fowler: Sure, yeah, and I'll, I'll expand a little bit. So, I'm a cybersecurity researcher with around 15 years' experience. I work with Black Hills Information Security. And right now I'm speaking to you from Krakow, Poland, in the [anti-siphon] Research Center. And one of the things we do is figure out how data is exposed and how to protect it. In this particular incident, there were three separate databases with 3.7 million records. Those included chat logs, transcripts, audio recordings. And these were from 2024 to 2025, and it was completely accessible without a password to anyone with an internet connection who knew where to look. [00:09:41] Brian Mackey: Some of these included audio files that went on for hours, even after the service part of the calls was done. Talk, talk about that. [00:09:49] Jeremiah Fowler: Yeah, so a lot of times we've all had these automated calls where it's like, thank you for calling, goodbye, and you just assume that that's the end of the call. A lot of older people that don't look at their phone every 30 seconds for social media, they're just like, OK, the call is done, you set it down, and it records everything. And not only does it record, but it also created a plain-text, uh, transcript that could be used; it's searchable, it's indexable. So huge privacy risk there.
In addition to that, there's the potential of fraud or scams when you're giving someone your personally identifiable information and insider knowledge. So for example, I know what, what you have, what, you know, item you want repaired and the date you want it repaired. Now, if I took that and I called you and I said, hey, I'm calling from Sears Home Services, I see you have a Kenmore dryer, uh, give me your credit card number. The chances of that working increase exponentially, because who else knows that information? [00:10:55] Brian Mackey: Yeah, that's interesting. One of the other things you also found in, in the report is a lot of frustration directed at the chatbots. Talk about that. [00:11:05] Jeremiah Fowler: "A lot" is probably putting it mildly. You know, so yeah, some of them went on with just, you know, people out of frustration testing the chatbot, asking questions that were irrelevant to the chat. You know, in a time of frustration, a lot of times a human really can help you. It makes the difference between satisfying the customer and just tying them up or passing them on to the next link of the chain. But yeah, I saw some very, very disgruntled chats, and I cannot repeat many of the words in there. [00:11:40] Brian Mackey: Yeah, that's a good thing. Andy Jeon, what is it that makes people frustrated with these chatbots? [00:11:47] Andy Jeon: Yeah, most of the time, I mean, if we are thinking about the old-fashioned, you know, chatbot that is almost like an answering machine, then humans, I mean, the customers think that this old-fashioned, answering-machine-like chatbot cannot handle complicated issues. For example, we can ask the chatbot a simple question, but if we, uh, you know, make the question complicated, then, you know, the chatbot most likely cannot fully answer the question. And then they want to, you know, talk to, you know, a live agent for that reason.
And, you know, thinking about customers being frustrated, you know, we need to, you know, think about, you know, human psychology. So, you know, given that we have more advanced chatbots, like ChatGPT, that many people are using, they really, you know, talk like a human, and their thinking capacity is beyond our imagination. It's so good. But even so, even when, you know, the chatbot is good at handling customer issues, why do customers still want to talk to a live agent, uh, especially when they're frustrated? You know, the customer just wants to get some kind of psychological comfort from just simply talking with a live agent. I mean, yeah, we can get some kind of, you know, emotional connection with the, you know, machine, but it's just a machine, so we just want to talk to another human being to just get relief from what the customers are experiencing. [00:13:16] Brian Mackey: You know, we actually got a text message on this very point. Sonia in Decatur says, "The more technology evolves, the more humans are non-existent. I prefer to speak to a human. The only thing good about AI would be for very rude people. The bot won't take offense. I'm old. I like interacting with people. I like interactive situations, not sterile ones." Thanks for that message, Sonia. Andy, I want to come back to her, to her point on rudeness. Does, does any of the behavior we see have to do with people treating a chatbot differently, feeling like they can get away with something that they maybe wouldn't do with a human? [00:13:52] Andy Jeon: Yeah. So, you know, there is a lot of research, you know, actually discussing this. So when you believe that you're talking with a, uh, you know, machine, then you, like, uh, unconsciously believe that your interaction will just end with that machine; it's not going to be seen by human beings.
So you're kind of exposing your inner anger more brutally than when you are interacting with another social being, because when you're interacting with other, you know, people, then you consider others' emotions. But when you are interacting with a machine, we don't consider the emotions of a machine, typically, because a machine doesn't have emotions. So for that reason, you know, humans are kind of exposing their, you know, anger or frustration when they are talking with the, uh, you know, chatbot. [00:14:43] Brian Mackey: Jeremiah, I wanna come back to this Sears leak. What are some of the reasons files like this might end up on the open internet? [00:14:51] Jeremiah Fowler: There are a range of reasons. It could be just a firewall misconfiguration. It could have been an update. It could have been a migration to a new system. I don't believe these were exposed for a long period of time, and the reason for that is just because, you know, the, the bad guys are also looking to ransomware databases like this, uh, for profit. So, that's more than likely how it happened, but it's a learning experience for creators of chatbots and organizations that use chatbots, just to understand that yes, you collect that data and you store that data, but how is it stored and how is it protected? [00:15:36] Brian Mackey: Is this, I mean, to what extent do you have a sense that this is partly because Sears is this, you know, declining corporation that's, that's struggled and, and is like a shell of its former self? Should people feel more secure about, you know, more established companies or, you know, ChatGPT, that sort of thing? Or is this a lesson for us all? Just, you know, 'cause I, I don't know how you get around this problem, right? If a company wants you, you know, if the only way to interact with them is through one of these chatbots, what do you do? [00:16:07] Jeremiah Fowler: Well, you know, I think Sears has the right idea.
You know, this is a legacy brand that's been around, you know, 140 years, so it makes sense for them to come into the digital age. And, and the fact that they're embracing technology, I would say, would be an asset in the long run for their business and for their customers. So, you know, you know, I would look at it that way, as a pro as opposed to a con. But at the end of the day, you have to realize that the data you collect is, is that of real people, and it's equally as valuable as the products or services you provide. [00:16:49] Brian Mackey: What do you do when you discover a breach like this? [00:16:54] Jeremiah Fowler: First thing I do is try to find out what's there, how sensitive it is, and then validate who owns that data, and then I do a responsible disclosure notice. In this case, I reported it to them. They closed it within the same day, you know, within a matter of hours. So, they were very responsive with it, and, and that's important as well when it comes to making sure that it's not exposed any longer than it needs to be. [00:17:23] Brian Mackey: And you do that, I presume, before publishing that you found it to the, to the, to the wide world, and, you know, it's written up in Wired and that sort of thing. Yeah, yeah. All right, Andy Jeon, beyond even privacy, let's talk about what can make a chatbot a liability for a company. And I'm thinking of a particular case in Canada involving an airline chatbot. [00:17:46] Andy Jeon: So first of all, I mean, for example, when you want to build a chatbot or, you know, deploy a chatbot, it can cost like $300 per month, or, you know, when you want to build your own chatbot, it can cost like $10,000 or more. But you also need to be prepared to invest, uh, the budget in maintenance of the chatbot, in terms of security, safety, and protecting data privacy.
So it's not just building a chatbot; you know, using a chatbot is also about the good maintenance of the chatbot. So that's what the company needs to think about. And also, I mean, many people think of a chatbot as just an answering machine. But just think about it. If you go to some e-commerce website, and the first thing that is greeting you is a chatbot, right? That means that the chatbot can play a role as even a brand ambassador, meaning that a good impression coming from the chatbot can actually carry over to your evaluation of the product and even your purchase decision. So again, we need to think differently about chatbots than before, because the chatbot has a big role in building customer trust and shaping brand attitude. [00:19:05] Brian Mackey: Hm, interesting. All right, Jeremiah, let's end with maybe a practical note, right? How much of this is something customers can do something about, can control, as opposed to this being on the companies that employ this technology, in terms of securing your information in a case like this? [00:19:23] Jeremiah Fowler: Unfortunately, the customers have very little control over this. The only time that it falls to the customer is once a data incident happens, just making sure that you know what to do to not be a victim of fraud using stolen information. And that would be, you know, always question suspicious information requests. Only communicate through authorized or legitimate channels, and double-check if you are suspicious that someone is not who they say they are. You have to be proactive these days, and unfortunately, consumers don't have control of their data. Once it's gone, it's gone. [00:20:07] Brian Mackey: Jeremiah Fowler is a cybersecurity researcher who joined us today from Poland. Good evening. Thank you for being with us. And Andy Jeon is an assistant professor of marketing at Northern Illinois University in DeKalb.
Jeremiah, Andy, thanks so much for being with us. [Thank you.] And that is all the time we have for our program today. Coming up tomorrow on the 21st Show: there was a flood of super PAC money into Illinois, much of it misleading. It came from groups promoting cryptocurrency and sports betting, and also AIPAC, the American Israel Public Affairs Committee, and it was a factor in this year's elections, although not always in the way the people spending the money intended. We're going to take a deep dive into that tomorrow. Plus, with the weather getting warmer again, it's time to consider your spring and summer travel plans. We'll talk about popular destinations in Illinois, nationally, and internationally with traveling content creators Jay and Himani Patel. Before we go, I wanna urge you to join our texting group. You heard messages throughout the show today; we love sharing your comments and questions with our guests. You can join by sending the word TALK to 217-803-0730. Again, text the word TALK to 217-803-0730. You can find how to join our texting group, plus our email address, voicemail line, every other way to get in touch, all at our website, twentyfirstshow.org. You can find our past programs there and links to subscribe to our podcasts, or look us up on Apple, Spotify, YouTube, or wherever you listen. The 21st Show is a production of Illinois Public Media. I'm Brian Mackey. Thank you for listening. We will talk with you again tomorrow.
Customer service is increasingly the domain of AI chatbots. This has created a lot of frustration for people, and these chatbots can also come with security issues. A cybersecurity researcher and a marketing expert share their thoughts.
GUESTS
Jeremiah Fowler
Cybersecurity researcher
Andy Jeon
Assistant Professor of Marketing, Northern Illinois University