
Meta To Start Capturing Employee Mouse Movements, Keystrokes For AI Training Data

Why This Matters

Meta's plan to collect detailed employee interaction data is meant to improve AI models that assist with everyday computer tasks, part of a broader industry push toward training AI agents on real-world usage data. It also raises significant privacy questions for employees and illustrates how heavily modern AI development leans on fine-grained records of human behavior, with implications for both the tech industry and user privacy expectations.

Key Takeaways

Reuters reports that Meta plans to start collecting U.S.-based employees' mouse movements, clicks, keystrokes, and occasional screen snapshots to train AI agents that can better learn how humans use computers. Meta reportedly said the data gathered by the tool, called the Model Capability Initiative (MCI), would "not be used for performance assessments or any other purpose besides model training," and that safeguards were in place to protect "sensitive content." From the report:

Meta CTO Andrew Bosworth told employees in a separate memo shared on Monday that the company would step up internal data collection as part of those "AI for Work" efforts, now rebranded as Agent Transformation Accelerator (ATA). "The vision we are building towards is one where our agents primarily do the work and our role is to direct, review and help them improve," Bosworth said. The aim, he added, was for agents to "automatically see where we felt the need to intervene so they can be better next time."

Bosworth did not explicitly spell out how those agents would be trained, but said Meta would be "rigorous" about "building up data and evals for all the types of interactions we have as we go about our work." Meta spokesperson Andy Stone acknowledged that the MCI data would be among the inputs. [...] "If we're building agents to help people complete everyday tasks using computers, our models need real examples of how people actually use them -- things like mouse movements, clicking buttons, and navigating dropdown menus," said Stone.

Read more of this story at Slashdot.