
Recently, a 25-year-old Reddit user shared that his 28-year-old girlfriend uses ChatGPT to win arguments in their relationship. As AI use becomes more common, people are turning to tools like ChatGPT to draft emails, write essays, and even settle personal disputes. The user complained that his girlfriend feeds the details of their disagreements, along with her own interpretations, into the AI, which then often judges him to be insecure or emotionally lacking. "My biggest problem is that she formulates the prompts herself, so if she frames it as me being wrong, the AI agrees, and I never get a chance to present my side," he wrote, expressing frustration with the one-sided responses the AI produces.
Research does point to AI's persuasive power. In one study, about 2,200 self-identified conspiracy theorists explained their beliefs to a chatbot, which countered their claims with factual information; the exchange led to a 60% reduction in the participants' confidence in their conspiracy theories. Still, despite AI's growing role in communication, ChatGPT itself cautions that it should not be used as a substitute for human interaction, especially in complex personal relationships: while AI can offer guidance, it cannot replace the emotional depth and understanding needed to resolve interpersonal conflicts.