A viral new app that seemingly came out of nowhere, paying people to record their phone calls so the audio can be used to train AI, has been yanked offline after a security flaw exposed user data.
Neon founder Alex Kiam told Gizmodo in an email that the app’s servers are down while the team patches the vulnerability and conducts a security audit to ensure the issue doesn’t happen again.
Neon launched just last week and quickly shot to the No. 2 spot on the iPhone's chart of top free apps before it was taken down on Thursday.
The app pays users who agree to record their calls and lets Neon sell those recordings and other data to AI companies to train their models and voice assistants. It was pitched as a way for people to earn some money from their data, which tech companies have long profited from.
“Companies collect and sell your data every day. We think you deserve a cut,” the company’s website says.
Things took a turn on Thursday after TechCrunch discovered and reported a major flaw that let nearly anyone access sensitive Neon user data, including phone numbers, call recordings, and transcripts.
While testing the app, TechCrunch used the network-traffic tool Burp Suite to analyze the data flowing in and out of it. Neon's interface shows only a simple list of a user's recent calls and how much each earned. Burp Suite, however, revealed that the app's back-end servers were returning far more information, including full call transcripts and public links to the raw audio files of other users' calls.
Probing further, TechCrunch reporters discovered they could also access call metadata from other users. That information included both parties’ phone numbers, the time and duration of a call, and how much each call earned.
Kiam said the Neon team shut down the app’s servers immediately after TechCrunch alerted them to the flaw.
In an email to users, the company said it expects to be back online soon.