
ChatGPT ‘Is the New Crypto,’ Meta Says Malware Actors Exploit AI Craze – Security Bitcoin News


A growing number of malware creators are taking advantage of the surge of interest in ChatGPT to lure victims, Facebook parent Meta has observed. According to the company’s head of information security, the AI-powered chatbot is “the new crypto” for bad actors, and the social media giant is preparing for a variety of abuses.

Malware Inspired by ChatGPT Is on the Rise, Facebook’s Parent Company Says

Meta, the corporation behind Facebook, has found that malware purveyors are exploiting public interest in ChatGPT, OpenAI’s artificial intelligence (AI) chatbot, to entice users into downloading malicious apps and browser extensions.

The company has identified around 10 malware families and over 1,000 malicious links promoted as tools featuring ChatGPT since March, according to a report quoted by Reuters. On Wednesday, its representatives likened the phenomenon to crypto-themed scams.

In some cases, the malware delivered working ChatGPT functionality alongside abusive files, Meta noted. At a press conference on the report’s findings, Chief Information Security Officer Guy Rosen remarked that for bad actors, “ChatGPT is the new crypto.”

During the briefing on Wednesday, Rosen and other Meta executives also pointed out that Facebook’s parent company is preparing its defenses for a variety of potential abuses related to generative AI technologies like ChatGPT.

The rising popularity and rapid development of platforms like the Microsoft-funded chatbot have raised concerns among authorities around the world, including that such tools are likely to make online disinformation campaigns easier to propagate, Reuters noted.

Meta executives believe it is still too early to see examples of generative AI being used in information operations, although Rosen commented that he expects some bad actors to employ such technologies to “try to speed up and perhaps scale up” their activities.

In a statement issued after their meeting in Japan at the end of April, digital ministers of the G7 countries agreed that member nations should adopt “risk-based” AI regulations while still enabling the development of AI technologies.

In a recent interview, entrepreneur and investor Elon Musk accused OpenAI, the developer of ChatGPT, which he helped found, of “training the AI to lie.” He also announced plans to create a rival to the offerings of the tech giants, which he called “TruthGPT.”

Tags in this story
AI, Apps, Artificial Intelligence, chatbot, ChatGPT, Crypto, Cryptocurrencies, Cryptocurrency, Facebook, Information Security, links, malicious apps, malicious links, Malware, malware actors, Meta, OpenAI, Social Media, Victims

Do you think the trend of malware actors leveraging public interest in ChatGPT to lure victims will continue to grow? Share your thoughts on the subject in the comments section below.

Lubomir Tassev

Lubomir Tassev is a journalist from tech-savvy Eastern Europe who likes Hitchens’s quote: “Being a writer is what I am, rather than what I do.” Besides crypto, blockchain and fintech, international politics and economics are two other sources of inspiration.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

