Anker’s smart home sub-brand Eufy hasn’t always had the most stellar reputation. In 2023, for example, the company came under fire after it was revealed that some of its camera feeds weren’t properly encrypted. That wasn’t a great look for a brand that sells itself on home security.
So when news popped up that Eufy was literally paying customers two bucks a pop to stage package thefts and fake car break-ins, you’d be forgiven for thinking, “Here we go again.” The idea of a smart camera company asking people to pretend to be porch pirates almost sounds like the setup for a viral YouTube prank gone wrong.
But in this case, the initial optics are worse than the reality. As TechCrunch reports, there was no security bait-and-switch here; instead, it was a relatively novel way for Eufy to improve its AI theft-detection model.
With the side effect of the cameras getting better
Apparently, the program quietly ran for several months and recruited more than 120 participants. Eufy’s stated goal was ambitious: collect over 40,000 clips of both staged and real incidents to feed its AI models. The company wanted training data showing suspicious behavior, like someone testing a locked car door or “accidentally” walking away with a delivery box. The process was simple: upload the footage through a Google Form, attach a PayPal address, and receive $2 per clip.
The key here is transparency. Eufy didn’t scrape the internet for clips, didn’t siphon real user footage without permission, and didn’t bury the details in fine print. The company explicitly told people what it was collecting, why it wanted the clips, and what the data would be used for — AI training, and nothing else. Compared to the industry standard (which often amounts to “we’ll take what we want, thanks”), this is practically wholesome.
Sure, $2 won’t pay for your next Prime order, and faking a car break-in in broad daylight probably raised a few eyebrows among the neighbors. But in an industry where “training data” often comes from scraping whatever’s available online, Eufy’s approach is, somehow, oddly respectable. Plus, it could ultimately help users’ smart homes keep them safer.
In today’s climate, it’s fair if you rolled your eyes at the news of a smart camera brand paying people to play criminals. The ironic part is that it might actually be the most ethical thing happening in AI right now.