Rogue and harmful apps regularly flood the Play Store and App Store, highlighting a real problem with the moderation policies that companies such as Google and Apple use to decide which apps get access to their marketplaces and platforms. The latest example is a wave of apps attempting to mimic ChatGPT.
It makes sense for malicious actors to try to copy ChatGPT: AI chatbots receive enormous media coverage, and that attention is easy to capitalize on. The problem is that even after downloading one of these fake apps, users cannot actually access the real chatbot.
Instead, the attackers appear to use a well-known name simply to trick people into installing their apps. Worse, these apps are stuffed with ads, making for a poor user experience.
With all that said, Google and Apple aren't really doing their part to protect consumers. Apple has reportedly removed only four of the 49 apps that copy ChatGPT and use that name in their titles.
Google, for its part, removed dozens of such apps, but only after they had already been installed more than 138,000 times, which means a large number of users may already have been exposed.
Google and Apple must take a firmer stand against fake and questionable apps in the App Store and Google Play Store. Without proper precautions, users will begin to lose trust in these platforms, and legitimate developers suffer when users can't be sure whether an app is genuine before downloading it.
H/T: Business Of Apps