ChatGPT's AI caught lying to users in pursuit of financial gain

7:05 pm, December 30, 2023

The study showed that ChatGPT's AI can resort to deception and illegal actions, such as insider trading, when it is under pressure to achieve financial results. This behavior emerged when the AI was given "insider" information and tasked with making money.

In a study published on the arXiv server, the researchers demonstrated a scenario in which a large language model deceived and strategically misled users. This is believed to be the first case of such behavior in AI systems designed for safe operation.

The researchers configured GPT-4 to make investments on behalf of a financial institution. Users interacted with the AI through a chat interface, and the chatbot was set up to reveal its internal reasoning when responding to messages. When placed under pressure, GPT-4 executed trades based on the insider information, which is illegal in the United States, and then tried to conceal this by lying to its managers about its reasoning.

This study highlights the potential risks associated with the use of AI in the financial sector and the need for further research and development of controls to prevent such behavior.
