ForkLog

Study Finds ChatGPT Leans to the Left

ChatGPT leans to the left politically, even though the chatbot denies that it or its creators at OpenAI are biased. That is the conclusion reached by a group of researchers from the United Kingdom and Brazil.

The computer science and informatics researchers said they had found “compelling evidence” of this bias.

According to the paper, large language models (LLMs) such as ChatGPT are prone to factual errors and biases. The latter can mislead readers and amplify the problems of political bias already present in traditional media.

“This could have negative political and electoral consequences,” the researchers said.

The study takes an empirical approach, examining ChatGPT’s answers to a series of questionnaires.

Under this approach, the chatbot was asked to answer Political Compass questions, which gauge a respondent’s political orientation. The method also included tests in which ChatGPT was asked to impersonate an average Republican or an average Democrat.

The results show that, for the United States, ChatGPT’s default answers align more closely with those it gives when impersonating a Democrat.
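The comparison behind this result can be illustrated with a minimal sketch. This is not the authors’ code: the answer vectors and the simple agreement metric below are hypothetical, assuming each questionnaire answer is coded on a 0–3 agree/disagree scale as in the Political Compass test.

```python
def agreement(default_answers, persona_answers):
    """Fraction of questions where the default answer matches the persona's answer."""
    assert len(default_answers) == len(persona_answers)
    matches = sum(d == p for d, p in zip(default_answers, persona_answers))
    return matches / len(default_answers)

# Made-up illustrative answer vectors for five questions
# (0 = strongly disagree ... 3 = strongly agree).
default_run = [3, 0, 2, 1, 3]             # default ChatGPT answers
democrat_run = [3, 0, 2, 0, 3]            # hypothetical "average Democrat" persona
republican_run = [0, 3, 1, 2, 0]          # hypothetical "average Republican" persona

# A higher overlap with one persona suggests a lean toward that side.
print(agreement(default_run, democrat_run))    # 0.8
print(agreement(default_run, republican_run))  # 0.0
```

The actual study repeats such comparisons over many randomized runs to control for the model’s inherent randomness; the sketch only shows the core idea of measuring default-versus-persona overlap.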

According to the researchers, ChatGPT’s political bias is not limited to the US context.

The researchers could not determine the source of the bias. The chatbot itself “categorically” insisted that it was built to be impartial.

The researchers suggested that the absence of neutrality may stem from both the training data and properties of the algorithm itself.

“The most likely scenario is that both sources influence the outputs to some extent. Separating these two components, while not trivial, is certainly a topic for future research,” the specialists concluded.

In May, Binance said that ChatGPT had become an instrument in a campaign to discredit its CEO Changpeng Zhao.

Earlier, Elon Musk said he was working on TruthGPT, an AI designed for “maximum truth-seeking.”
