
Do Chatbots Fill You With Rage? This Startup Will Pay You $100 an Hour to ‘Bully’ AI.

Why This Matters

This innovative role highlights the growing importance of user feedback in AI development, emphasizing that consumer frustrations can directly influence the improvement of AI chatbots. It also demonstrates a shift towards more accessible, user-driven approaches to refining AI technology, making it more responsive to real-world interactions. For consumers and the industry, it underscores the value of human critique in creating more reliable and user-friendly AI systems.

Key Takeaways


AI startup Memvid is paying $800 for a day of “bullying” AI chatbots.

The worker’s job is to examine where chatbots lose track of details, forget context or misrepresent data, and then feed those findings back to Memvid.

The role doesn’t require a computer science background, AI credentials or any kind of work experience.

An AI memory startup called Memvid is offering one candidate $800 for a single eight-hour shift to “bully” AI chatbots by telling them what to do on camera.

Business Insider reported this week that Memvid wants someone to spend eight hours testing and critiquing the memory of popular AI chatbots, effectively paying $100 an hour for what it has branded a “professional AI bully” role. The worker’s job is to examine where chatbots lose track of details, forget context or misrepresent data, and then feed those findings back to Memvid so the startup can improve its products.

“You’ll spend a full 8-hour day interacting with leading AI chatbots — and your only job is to be brutally honest about how frustrating they are,” the job listing reads.

The job posting

The draw is that the role doesn’t require a computer science background, AI credentials or any kind of work experience. “No prior AI bullying experience required — we all start somewhere,” the listing reads.

The requirements are deeply personal. The first requirement is an “extensive personal history of being let down by technology,” and the second desired trait is “the patience to ask a chatbot the same question four times (and the rage when it still gets it wrong).”
