
Flirting with a Customer Support Chatbot? Stop it now


Expecting a human connection and quick solutions from a chatbot can put your privacy at risk, a cybersecurity expert warns

Customer support chatbots help users solve everyday problems in many areas, from online shopping to financial services, but they can also become targets for flirtatious customers. A recent survey shows that nearly 20% of Americans have flirted with a chatbot. Cybersecurity experts say that a hunger for human contact, combined with the urge to solve problems as quickly as possible, can lead to identity theft and serious privacy issues.

“Many users are still unaware that when they ask for customer support, they are usually talking to a chatbot, not a human being. People claim they flirt with chatbots out of curiosity or confusion, but loneliness and sex are also among the main reasons cited. This might sound surreal, but it raises both psychological and privacy concerns. People lose their privacy when they dump all their personal data just to get rid of a problem or to get the emotional satisfaction of being helped,” says Adrianus Warmenhoven, a cybersecurity expert at NordVPN.

Pushing yourself off a privacy cliff

Engaging in flirtatious conversations with chatbots is risky for your digital privacy. Customers tend to reveal far more personal information than they should, just to impress the imagined person on the other end of the line.

Similarly, customers tend to volunteer bits and pieces of sensitive personal data, such as an ID or Social Security number, when they are eager to solve a problem quickly but keep getting chatbot replies asking them to “rephrase the question” or “tell more about the problem.”

While data collection is not a privacy breach in itself, everything typed into the chat is collected, stored, and accumulated. And any system can contain vulnerabilities: a flaw, gap, or unintentional “backdoor” that a hacker can exploit, especially if the chatbot does not properly protect customer data with encryption.
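
To make the encryption point concrete, below is a minimal sketch of what encrypting stored transcripts could look like, assuming Python's cryptography package and its Fernet symmetric scheme. The names and flow are illustrative assumptions, not any vendor's actual pipeline; a real deployment would also need key management and access controls.

```python
# Minimal sketch: encrypting chat transcripts at rest (illustrative only).
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# In production, the key would come from a secrets manager,
# never be hard-coded, and would be rotated periodically.
key = Fernet.generate_key()
cipher = Fernet(key)

def store_transcript(transcript: str) -> bytes:
    """Encrypt a transcript before writing it to storage."""
    return cipher.encrypt(transcript.encode("utf-8"))

def load_transcript(blob: bytes) -> str:
    """Decrypt a stored transcript for an authorized reader."""
    return cipher.decrypt(blob).decode("utf-8")

encrypted = store_transcript("Customer: my order never arrived")
print(load_transcript(encrypted))  # readable only with the key
```

If transcripts exist only as ciphertext, a leaked database exposes far less than one holding raw chat logs.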

“Customer support operators used to act as a filter: they understood the domain and the privacy risks and asked only for relevant, less sensitive information. Now AI has to grasp the nuance between what people say they need and what they actually need. While turning to AI for support functions is unavoidable, consumers bear more responsibility for deciding what data to share with a chatbot. They must be extra cautious about the information they disclose, since they cannot know how it will resurface in outputs at some point in the future, especially since in some cases this data is used to train algorithms,” Warmenhoven says.

How to protect your privacy from customer support chatbots

To protect your privacy while using chatbots, Adrianus Warmenhoven offers these preventive measures:

“The main rule is not to provide more information than is needed to resolve the issue. There is no need to flirt with a chatbot or to share personal information that you would not want made public in the event of a leak.

“While drafting a request message, do not include any information that would allow someone to identify you or others. Use the order number if you are contacting an online shop, or the booking reference when dealing with an airline. That information should be enough to identify your case. Do not give a chatbot your ID, Social Security, or bank card number. And do not sign your message with your first and last name, as this is not a love letter.

“Prepare your request and information before approaching the customer support chatbot. Drafting the message in advance in a notepad app lets you think twice about its clarity and leaves more time to check that you are sharing only the necessary information.

“To protect your identity from cybercriminals, always request a verification email from the chatbot. This is nothing new, but it is an effective tool, and reputable businesses have this function in their privacy protection toolbox.”
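
As a practical illustration of the tips above, here is a small, hypothetical pre-send check in Python. The patterns are assumptions for illustration only: they catch a few obvious formats (a US-style Social Security number, long digit runs that resemble card numbers, email addresses) and are nowhere near a complete PII detector.

```python
# Hypothetical pre-send filter: warn before pasting obviously sensitive
# data into a support chatbot. Patterns are illustrative, not exhaustive.
import re

SENSITIVE_PATTERNS = {
    "possible Social Security number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "possible card number": re.compile(r"\b(?:\d[ -]?){13,19}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def check_draft(draft: str) -> list:
    """Return labels for anything in the draft that should not go to a chatbot."""
    return [label for label, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(draft)]

draft = "Hi, my SSN is 123-45-6789 and my order number is 48213."
for label in check_draft(draft):
    print(f"Remove the {label} from your draft; the order number is enough.")
```

Run against a message composed in a notepad app, a check like this would flag the Social Security number while letting the order number through.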
