Robot or the Room Camera?
Do you have a Roomba at your home?
Let me ask you: do you see it as a cute little robot vacuum cleaner, or as a person who does odd sweeping and cleaning jobs at your house? If you are like most people, probably the former. But that would be a big mistake.
Most of us either ignore or forget the fact that such robot cleaners are equipped with cameras that record their surroundings and process those images to determine whether something in front of them is an object, a person, a pet, or just dust that needs to be removed.
- Is this a sock or a piece of paper?
- Is this a cable or a rope?
- Is this a pen or someone’s finger?
Questions like these are easy for humans to answer. But robots need good cameras and AI technology to identify such objects on a floor.
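To make this concrete, here is a minimal sketch of the decision step a robot vacuum might run after its vision model classifies whatever lies ahead. The labels, thresholds, and rules below are illustrative assumptions for this article, not any vendor's actual logic.

```python
# Objects the robot should steer around rather than ingest
# (an assumed list, for illustration only).
AVOID = {"sock", "cable", "rope", "pen", "finger", "pet", "person", "paper"}

def next_action(label: str, confidence: float, threshold: float = 0.8) -> str:
    """Decide what to do with a detected object.

    label      -- class name predicted by the onboard vision model
    confidence -- the model's confidence in that prediction (0.0 to 1.0)
    """
    if confidence < threshold:
        # Not sure what it is: slow down and re-scan rather than risk it.
        return "approach_slowly"
    if label in AVOID:
        return "steer_around"
    return "vacuum"  # plain dust or debris: safe to clean up

print(next_action("dust", 0.95))   # vacuum
print(next_action("cable", 0.91))  # steer_around
print(next_action("sock", 0.40))   # approach_slowly
```

The point of the sketch is the confidence threshold: a sock mistaken for dust gets eaten, so a cautious robot treats low-confidence detections as obstacles first.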
What happens to the images or videos taken by the robot vacuum cleaner? Usually, they are processed locally, i.e., on the cleaner itself. But sometimes these images are sent to a server, where a human may validate the image, label the objects in it, and so on.
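One way to picture this split is a consent gate: frames are always processed on the device, and only leave it when the user has explicitly opted in and the local model is unsure. This is an assumed design sketched for illustration, not any manufacturer's actual code; all function names here are hypothetical.

```python
def classify_locally(frame: bytes) -> str:
    # Placeholder for the on-device model; real firmware would run an
    # embedded neural network here.
    return "dust" if frame else "uncertain"

def upload_for_human_review(frame: bytes) -> str:
    # Placeholder: in a real product this would send the frame to the
    # manufacturer's server for human labeling.
    return "queued_for_review"

def handle_frame(frame: bytes, user_consented: bool) -> str:
    """Process a camera frame locally; upload only with explicit consent."""
    result = classify_locally(frame)
    if result == "uncertain" and user_consented:
        return upload_for_human_review(frame)
    return result

print(handle_frame(b"\x01", user_consented=True))  # dust
print(handle_frame(b"", user_consented=False))     # uncertain
```

The design choice worth noting is that consent is checked before any byte leaves the device, which is exactly the promise buried in the license agreements discussed next.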
Sending such data explicitly requires the user of the machine to first agree to share the information with the manufacturer. In practice this is routine: most of us click through or sign the end-user license agreements for our hardware and software without even thinking.
However, such data sharing raises privacy and security concerns that most people are not even aware of. Most mid-range robotic vacuum cleaners do not carry particularly powerful cameras, but the higher-end ones do.
All of the top three robot vacuum makers by market share — iRobot, which has 30% of the market and has sold over 40 million devices since 2002; Ecovacs, with about 15%; and Roborock, which has about another 15% — have high resolution cameras in their high-end devices.
There have been instances where such photos were leaked onto social media. Though corrective action was taken after the fact in one such case, the potential damage in other cases could be disastrous.
How do companies improve their products without collecting feedback from devices in the field? How are users' concerns about their safety and privacy addressed effectively? A balance has to be struck between these two aspects.
People have to trust Artificial Intelligence-driven devices first. Legal regulations, enforced by governments on companies that collect and use such information to train their AI devices, will go a long way toward taking AI mainstream.