Connected objects: Is there a need for greater scrutiny?
What are connected objects? Consider home assistants, smart thermostats, fitness wearables, and connected cars, to name but a few, all marketed on the advantages of convenience, home efficiency, health monitoring, safety, and security. Collectively known as the internet of things (IoT), these are devices with unique identifiers that interconnect via the internet, Bluetooth, and other means.
Manufactured by hardware and software companies from around the world, they contain sensors for motion, image, sound, pressure, and optics, and serve a range of market sectors, such as industrial, medical, automotive, consumer, communications, computer, military, and aerospace. In practice, they have the capability to collect large amounts of data that can be accessed by manufacturers, mobile application companies, and third-party vendors.
With 127 new devices connected to the internet every second, there are now more than 26 billion active devices worldwide of which two-thirds are in China, North America, and Western Europe. By 2025, this is expected to rise to 75 billion devices generating revenues in trillions of US dollars.
Home alone: How visible should we be?
I had previously thought that when I entered my home and shut the front door, I could be, to some extent at least, left alone, free to access information, form opinions without interference, make important life decisions, and discuss intimate family matters. But with everyday objects able to listen, see, and thereby gather data from my home, I now ask myself whether I can effectively exercise my right to a private family and home life, including my correspondences.
The fact that our ‘always-on’ smartphones, tablets, and PCs make us visible, trackable, and identifiable now extends to televisions, refrigerators, ovens, vacuum cleaners, and other home appliances. For example, consider how smart televisions can send living-room chatter to third parties, how robot vacuum cleaners map home interiors, or how ‘smart mattresses’ and watches track your heart rate, breathing, sounds, and movements. These devices leave a trail of ‘digital breadcrumbs’ that reveals data about our behaviour and movements.
As consumers, we have some choice in the matter. We are required to consent to the terms and conditions of service agreements when purchasing these objects. We can also buy appliances that are not connected to the internet. But for how long will unconnected objects remain on offer, and will consent really be meaningful if functionality and security are limited or compromised by a decision to opt out of data-sharing arrangements? The functionalities of the iRobot vacuum cleaner and Sleep Number Bed have been shown to be limited when users do not consent to their terms and conditions of service; both products have also been criticised for being unable to guarantee that user data is stored securely and not transmitted to others.
The surveillance and predictive capabilities of connected devices, coupled with relatively inexpensive data storage, make them ever more affordable and attractive to consumers, who give little thought to the potential risks and threats, especially when devices are left unsecured. Obfuscated by lengthy and complex policies and agreements, consumers are likely to ignore, dismiss, or at best tolerate the autonomous capture, collection, and transfer of behavioural data which, cross-referenced with other data, can reveal very personal attributes and profiles.
The collection of health-related data is a particular cause of concern. Google’s (Alphabet Inc.) purchases of Nest in 2014, Fitbit in 2019, and Coefficient Insurance in 2020 provide a glimpse of the power accruing in one actor to monitor, aggregate, predict, and share health-related data. Relatedly, John Hancock, one of the largest life insurance providers in North America, will now sell only interactive policies that collect health data. By design, connected devices are paying ever closer attention to their owners by logging many of their daily activities. Such monitoring may be a friend of the healthy, but it could also be a foe of the vulnerable: those without the means to adopt a healthy lifestyle.
Regulation and governance
In the absence of specific IoT regulation, existing laws can be relied upon to protect privacy and data protection, and to tackle abusive and otherwise criminal behaviour such as harassment, spying, and surveillance. The need to balance privacy with wider commercial and public benefits can be reconciled by various instruments, initiatives, and watchdogs. Importantly, the rulings of the European Court of Justice (ECJ) and the European Court of Human Rights (ECtHR) continue to draw the red lines for what privacy and data protection look like in the digital society.
To some extent, however, the governance of the digital society is forming outside of treaty and legislative frameworks involving many non-state actors, norms, procedures, processes, and institutions. The pace of innovation is resulting in ‘law-lag’, making it difficult to control or change technology once it has become entrenched in economic markets. Consumer reliance and trusted relations between companies and users might also be blunting the regulatory efforts of states. Prevailing contracts, technical standards, and best practices can differ from legislation and regulation leading to legal conflicts and situations where different norms cover the same actors without the existence of clear rules.
What the market will bear
For companies accountable to shareholders, it is the market value of behavioural patterns and personal preferences that matters most, obtained with or without the knowledge or consent of data subjects. The billions of sensors deployed in these objects will be hard to police, especially given the proprietary nature and opacity of the algorithms that analyse the collected data. For tech companies touting the freedom to innovate, the potential threats and harms of connected objects will be negligible until there is significant evidence to the contrary. Most connected objects will simply pass by unnoticed, and the personally identifiable data leaking from them will be masked by software updates designed to fix security flaws and other bugs in an effort to protect consumers.
With developers focusing on how to monetise new streams of data, there will be no turning back from the rendition of human experience driven by market forces. Our future will be one with more and more of these objects, with consumers enchanted by their promises but ignorant of, or wilfully blind to, their capacity for misuse and even abuse.
Note: This post was originally published in 2020.