Tuesday, April 26, 2016

Obtaining users’ consent on IoT devices

1. Introduction

The number of IoT devices is growing daily. Although estimates of the current number of connected smart devices vary, it is indisputable that all these sensors produce unimaginable amounts of data. We have come from smartphones and smart fridges to smart everything. The aim behind all this data collection is to improve users’ daily lives. However, as with many good things, IoT also has side effects: it may pose risks to users’ privacy and security. For this reason, regulators believe users should be adequately informed about the possible risks so they can make an informed decision about using these novel devices.

In this post, I will focus on the privacy aspects of IoT, in particular on the issue of obtaining users’ consent. Many would argue that IoT devices process data that cannot be considered personal, such as air temperature, water temperature, the time passed since a pet dog was last fed, or the amount of clothes washed (to calculate efficient water usage); the regulators, however, disagree. Given the amount and frequency of the data collection, the data protection regulators in the EU are of the opinion that all IoT-collected data should be considered personal data. Such classification may bring IoT stakeholders under the application of data protection laws, which are becoming increasingly widespread around the world: over 100 countries and territories have adopted comprehensive data protection laws, and the number is growing.[1] Hence, companies that have not yet thought about the potential privacy implications of their IoT devices should consider implementing some data protection safeguards. The EU data protection laws are believed to be among the strictest, and the EU data protection regulators have a reputation for being on the conservative side of privacy v. technology debates.

As already indicated above, I consider obtaining valid user consent for the processing of personal data gathered by IoT devices to be one of the more pressing and troublesome issues. The regulators are not united on the type and quality of consent that should be obtained from users for it to be considered valid; however, they agree that users should be notified when at least certain types of their personal data are being gathered. Given that IoT devices come in all sizes and shapes, companies are left to find a (creative) way of obtaining valid consent. This may be particularly challenging: even in online services, where an explanation of the data usage is only a click away, it is common knowledge that very few people read the terms of service and privacy policies anymore. Even if users wanted to, it is virtually impossible given their length and the reading time they require.[2] Furthermore, it can be very challenging to set privacy settings on a small IoT device and provide valid consent for the data processing.

2. Possible solutions

I was thinking about possible ways to obtain opt-in consent on IoT devices that could be considered sufficient by (some) regulators. I believe it is hard (if not impossible) to provide notice and obtain valid consent on the device itself, so I mostly focused on getting consent via online tools, such as an IoT-focused privacy consent app. I also looked into alternative consent solutions, such as a notice in audio instead of written form, and a demo usage of the device until opt-in consent is provided. Although data protection regulators may prefer traditional forms of consent, the aim of data protection and privacy laws is to safeguard users’ rights. Hence, if a company finds innovative solutions to protect those rights, the regulators may give in, eventually.

2.1. Demo usage of IoT device until consent is provided

As already noted, IoT devices often lack the means to deliver the information that would support reading a privacy notice and giving informed consent. Thus, users may need to seek detailed information online, on the company’s website, possibly on a page dedicated to the particular IoT device. However, users do not always have the time or means to read the information before they start using a particular device. For example, if the device refuses to start until the user reads the privacy notice and provides consent, this may significantly affect the user’s experience. Although the latter is likely to represent the perfect implementation of data protection laws, it is highly impractical. I think users should have some kind of grace period in which they could use the device and take the time to properly sort out their data protection choices. However, if companies favor user experience over legal requirements, especially if their device processes sensitive personal data, they may risk huge fines. In the EU, the new data protection law provides for fines of up to €20 million, or 4% of a company’s global turnover, whichever is higher.[3]

I think it should be possible to enable a great user experience and comply with the laws at the same time, and maybe it could be done through a demo service of the device that would work like a testing period. When a user starts using the IoT device, it would collect the data needed for its operation but would retain them for a minimal amount of time. Furthermore, only certain functions of the device would be enabled. For example, the demo version of a smart thermostat would delete all the data gathered every 12 hours and would not support personalized settings over a longer period; a smart calorie meter would measure the calories burned during a workout but would delete all the information 15 minutes after the activity. Such limited use of the device would be possible for, say, a week, until the user provides the consent needed for full use of the device. Such consent could be obtained online, through the company’s website, where all the information regarding personal data processing would be available. To avoid unnecessary processing of personal data, the website should not require user registration or collect additional user data. Obtaining consent through the device identifier would ensure higher anonymity for the device owner.
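
To make the idea more concrete, here is a minimal sketch of how such a demo mode could be enforced on the device itself. It is an illustration only, assuming the thermostat example, the 12-hour retention window and the one-week grace period described above; the class and method names are invented for this sketch, not taken from any existing product or API.

```python
"""Illustrative sketch only (section 2.1): a demo-mode data store for a
hypothetical smart thermostat. The 12-hour retention window, the one-week
grace period and all names are assumptions taken from the text above."""

import time
from dataclasses import dataclass, field

DEMO_RETENTION_SECONDS = 12 * 60 * 60    # demo mode: keep readings for 12 hours
GRACE_PERIOD_SECONDS = 7 * 24 * 60 * 60  # one week to provide opt-in consent


@dataclass
class Reading:
    timestamp: float
    temperature_c: float


@dataclass
class DemoThermostat:
    first_use: float = field(default_factory=time.time)
    consent_given: bool = False
    readings: list = field(default_factory=list)

    def record(self, temperature_c):
        """Store a reading; in demo mode, purge anything older than 12 hours."""
        self.readings.append(Reading(time.time(), temperature_c))
        if not self.consent_given:
            cutoff = time.time() - DEMO_RETENTION_SECONDS
            self.readings = [r for r in self.readings if r.timestamp >= cutoff]

    def give_consent(self):
        """Called once the user has read the notice online and opted in,
        e.g. via the device identifier on the company's website."""
        self.consent_given = True

    def personalised_schedule_enabled(self):
        """Personalized settings stay locked until opt-in consent is recorded."""
        return self.consent_given

    def grace_period_expired(self):
        """After a week without consent the device would have to fall back to
        demo-only use, stop, or apply a pre-set standard privacy setting."""
        return not self.consent_given and time.time() - self.first_use > GRACE_PERIOD_SECONDS
```

The point of the sketch is simply that retention limits and feature locks can be enforced locally on the device, so no data collected during the demo period outlives the consent decision.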

However, what happens if the user does not register during the grace period? Should the device stop working? This may be a harsh move, but without sufficient consent the company may (as believed by some regulators) end up processing personal data illegally. Should the company in such a case assume that the user gave implied opt-in consent and transform the demo version into full use while applying some pre-set (standard) privacy settings?

2.2. Consent app

The consent app would work similarly to obtaining users’ opt-in consent through browser settings, which (in some cases) can already be used for the placement of cookies. I think it is an easy, user-friendly and time-saving way to collect consent. One may argue that obtaining consent through browser settings could be used for IoT devices as well. However, I think it is not sophisticated enough, and browsers may not be able (or willing) to communicate with all the IoT devices users would want to connect. No company has a browser that would be used by all of its IoT users, so their devices would necessarily need to communicate with browsers created by other companies, possibly competitors. Furthermore, it is unlikely that all browsers would support all the privacy choices needed for the type of personal data that a particular IoT device is processing. In addition, this solution only resolves the issue of obtaining consent but does not provide users with the notice needed for an informed opt-in decision.

Unlike browser settings, the consent app would provide users with more granular privacy choices. It would exist independently from the IoT device and would push the pre-set privacy settings to the device. However, it would require some interoperability and ‘device chatting’. For the app to be truly interoperable, it makes sense for it to be created by a government organization or by an independent pro-privacy entity, as companies tend to restrict interoperability with devices of other companies. From the perspective of user experience, users do not want a separate consent app for each IoT device; the idea is to have one and then push the settings to all the devices the user owns. Still, such an app could also be a solution for a company that produces/owns several IoT devices.

When a user buys a new IoT device, she would only need to connect it to the consent app. The app could provide settings for multiple different scenarios that could be modified, so users would be able to choose retention periods, location data sharing, etc. in one place and adjust them for each IoT device if they wanted. With such an app, users would have more control over the processing of their personal data. The app could also translate the privacy policies and terms of use of different IoT devices into the same style, possibly using icons. This would enable users to understand the privacy implications of different devices faster and more easily, similarly to what some websites already offer.[4]
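
As a rough illustration of this idea, the sketch below shows a single consent app holding one set of default privacy choices and pushing them, with per-device overrides, to the connected devices. It is only a sketch under assumed names (PrivacySettings, ConsentApp and the individual setting fields are all made up for the example); real devices would of course need a secure, interoperable protocol for receiving such settings rather than a print statement.

```python
"""Illustrative sketch only (section 2.2): one consent app holds the user's
default privacy choices and pushes them to every connected device, with
per-device overrides. The class names, setting fields and print-based
"push" are assumptions made for this example, not an existing API."""

from dataclasses import dataclass, replace


@dataclass(frozen=True)
class PrivacySettings:
    retention_days: int = 7         # how long the device may keep raw data
    share_location: bool = False    # whether location data may be shared
    personalised_ads: bool = False  # whether data may be used for ad targeting


class IoTDevice:
    """Stand-in for any device able to receive settings from the consent app."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.settings = None  # populated when the consent app pushes settings

    def apply_settings(self, settings):
        self.settings = settings
        print(f"{self.device_id}: applied {settings}")


class ConsentApp:
    def __init__(self, defaults):
        self.defaults = defaults   # the user's one-time, app-wide choices
        self.devices = {}          # device_id -> IoTDevice
        self.overrides = {}        # device_id -> PrivacySettings

    def settings_for(self, device_id):
        return self.overrides.get(device_id, self.defaults)

    def connect(self, device):
        """Register a newly bought device and immediately push the defaults."""
        self.devices[device.device_id] = device
        device.apply_settings(self.settings_for(device.device_id))

    def override(self, device_id, **changes):
        """Let the user adjust settings for a single device, e.g. a tracker."""
        self.overrides[device_id] = replace(self.settings_for(device_id), **changes)
        self.devices[device_id].apply_settings(self.overrides[device_id])


# Usage: one set of defaults, modified per device where the user wants to.
app = ConsentApp(PrivacySettings(retention_days=3))
app.connect(IoTDevice("thermostat"))
app.connect(IoTDevice("fitness-tracker"))
app.override("fitness-tracker", share_location=True, retention_days=30)
```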

2.3. Audio notice

The audio notice would change the form in which the notice is usually delivered to users. IoT devices that support sound files but do not have a smartphone-like screen, in particular, could try to deliver the notice in audio format. Although regulators are not enthusiastic about the smartphone screen as a solution for delivering privacy notices, it at least supports a substantial amount of text that could be read. The other issue is users, who are not keen on reading such notices regardless of the size of the screen. Taking all this into consideration, the audio format of privacy notices may be worth considering.

As mentioned above, IoT devices that support audio files and have the hardware to play them could provide the notice on their own when used for the first time. Given that reading out the whole notice may take too long, a fragment regarding consent could be delivered on the device, with an invitation to the user to read the whole notice on the company’s website. If an active reaction from the user is required for the opt-in consent to be sufficient, the user could provide it by pressing a button. Such a system may not be suitable for all IoT devices, but could, for example, work in a car or on a fridge.
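
A possible first-use flow could look like the sketch below: the device plays a short audio fragment about the data processing, points the user to the full notice online, and records opt-in consent only on an active button press. The audio and button functions are placeholders for whatever drivers the device actually has, and the URL is made up for the example.

```python
"""Illustrative sketch only (section 2.3): a first-use flow that plays a short
audio consent fragment, points to the full notice online, and records opt-in
consent on an active button press. play_audio() and wait_for_button_press()
are placeholders for the device's real drivers; the URL is hypothetical."""

import time

NOTICE_CLIP = "consent_notice.wav"               # short spoken summary of the processing
FULL_NOTICE_URL = "https://example.com/privacy"  # hypothetical full privacy notice


def play_audio(clip):
    """Placeholder for the device's audio driver."""
    print(f"(playing {clip})")


def wait_for_button_press(timeout_s):
    """Placeholder for the device's input driver; True if pressed within the timeout."""
    time.sleep(min(timeout_s, 0.1))  # simulated wait for this sketch
    return False


def first_use_consent_flow():
    """Deliver the audio fragment, invite the user to the full notice, and treat
    only an active button press as opt-in consent."""
    play_audio(NOTICE_CLIP)
    print(f"Full privacy notice available at: {FULL_NOTICE_URL}")
    if wait_for_button_press(timeout_s=60):
        print("Opt-in consent recorded with a timestamp.")
        return True
    print("No consent yet; the device could stay in demo mode (section 2.1).")
    return False


if __name__ == "__main__":
    first_use_consent_flow()
```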

3. Legal analysis of IoT consent requirements

Different legal systems impose different obligations on companies producing/owning IoT devices. Furthermore, regulators within the same (or a similar) data protection system often disagree on the implementation of the law. For example, in the EU, all the data protection laws are based on the same Data Protection Directive; however, during implementation into national law and through the interpretation of those national laws, the regulators went in different directions. The new Data Protection Regulation aims to harmonize data protection requirements across the EU, but the national regulators may still interpret it differently. Such legal uncertainty is particularly difficult for companies that try to conquer global markets with their IoT product. It is often not possible to adjust the product to several sets of legal requirements, and data protection is only one of them. Thus, companies frequently make a business decision on how globally compliant they want to be and how much risk they are willing to take with a particular product. Unfortunately, legal requirements that are not fit for the tech world not only hinder technological development but may also deprive users of novelties. Some are of the opinion that the law is at least five years behind emerging technologies.[5] So companies that work with cutting-edge technologies will always carry the policy burden of educating legislators and regulators, and will live in a gray area of compliance until the law catches up. In this section, I will briefly look into the differences between the EU and US IoT consent requirements, touch upon the issues of explicit and implied consent, and try to legally analyze the IoT consent solutions proposed above.

3.1. Comparison: EU vs US

In the EU, the Article 29 Working Party (WP29) issued an opinion on the IoT.[6] In the opinion, it emphasizes that consent should be the main legal ground for processing personal data and that such consent should be “fully informed, freely given and specific”. While the WP29 recognizes the difficulty of obtaining consent via the traditional mechanisms, it warns that low-quality consent may not fulfill the EU data protection standards. The EU data protection laws are open to other legal grounds for personal data processing, such as the legitimate interest of the company; however, the EU watchdog refers to the Google Spain[7] case, pointing out that the purely economic interests of a company may not constitute a sufficient (legitimate) interest in the absence of the user’s consent. Further, the WP29 emphasizes the importance of the purpose limitation and data minimization principles. Companies can process personal data solely for previously established purposes and should not collect data indiscriminately or define the purposes retrospectively; such conduct may violate the EU data protection laws.[8] Once a company establishes a clear purpose for the personal data collection, it should gather and process only the data that are necessary for achieving that purpose.

At the end of the opinion, the WP29 acknowledges several security risks that may “turn an everyday object into a potential privacy and information security target,”[9] and provides recommendations to IoT stakeholders. The main takeaways for companies are to carry out a privacy impact assessment in order to evaluate the potential risks to users’ fundamental rights; to maintain short retention periods; to enable users to exercise control over their data; and to provide a privacy notice and obtain opt-in consent in a user-friendly manner.[10]

A few months after the WP29, the US Federal Trade Commission (FTC) issued its Staff Report on the Internet of Things,[11] in which the staff provides recommendations for B2C cases, with a focus on security, data minimization, notice and choice. The FTC acknowledges that the safeguards implemented by companies should reflect the sensitivity of the personal information processed by the IoT device. To safeguard the privacy and security of personal information, companies should build security-by-design mechanisms into their IoT devices after assessing the potential risks to users’ personal information. Further, companies are advised to minimize the amount of data they retain and are encouraged to test the adopted security measures before launching the product. The FTC report emphasizes the importance of notice and choice for IoT users; however, it also recognizes that “providing choices for every instance of data collection is not necessary to protect privacy.”[12] The report refers to the 2012 Privacy Report, where the FTC made a distinction between expected and unexpected uses of gathered personal information: a company needs to provide choice only in cases where the use of data would be considered unexpected. The use-based approach is expected to “set “permissible” and “impermissible” uses of certain consumer data.”[13] A company also does not need to offer choices to users if it de-identifies personal data immediately after collection. The FTC further recognizes that there is no one-size-fits-all approach and that it is difficult to provide notice on IoT devices, so it leaves companies the freedom to choose the way of providing notice to users.

While both regulators seem to agree on the potential security risks and the possible solutions to mitigate them, such as security and privacy by design and the data minimization principle, they have different views regarding notice and opt-in consent. The WP29 follows a much more conservative approach than the FTC, which imposes stricter requirements only in cases of unexpected uses of personal information. Thus, consent options sufficient in the eyes of the US regulator may not fulfill the requirements of the EU watchdog.

In the US, California’s Office of Privacy Protection was the first to set up rules regarding privacy policies.[14] The California Online Privacy Protection Act (CalOPPA) now requires that a firm operating a “commercial website or online service” post a privacy policy that is available to users. Such a policy must identify the categories of personal data collected and the types of third parties with whom the company shares the collected data.[15] In the case of IoT, the CalOPPA requirements apply because the companies either maintain a website or operate an online service.[16] In practice, however, companies tend to apply minimal standards when implementing the CalOPPA provisions. With IoT products, users often have difficulty locating the relevant privacy policy, and when they do find it, these policies tend to provide very little information on how the data are shared and with whom. Furthermore, companies define personal data differently, which can create even greater confusion for consumers.[17] In contrast, global data protection regulators agreed at the 36th International Privacy Conference that all data collected by IoT devices should be considered personal.[18]

3.2. Explicit vs. implied consent (possible IoT consent)

When debating the quality of consent needed for lawful processing of personal data, we have to differentiate between explicit and implied consent. Given that it is challenging to obtain explicit opt-in consent from the users of IoT devices, the option of implied consent could be very beneficial for companies. However, implied consent for IoT data processing may not be a global solution.

In the EU, there are two directives that could serve as the legal basis for consent in IoT personal data processing: the Data Protection Directive and the e-Privacy Directive. The latter is lex specialis, and its Article 5(3) applies when the IoT stakeholder either stores or accesses data already stored on an IoT device. In this case, the company should provide users with sufficient information, such as the purposes of the processing, needed to make an informed opt-in decision.[19] Although the law does not prohibit implied consent, the WP29 has said that obtaining consent through browser settings does not fulfill the requirements of informed consent.[20] The WP29 discussed the issue of consent by way of browser settings: a company, usually for the purpose of online behavioral advertising, places a cookie in the user’s browser unless the browser (or a similar app) is pre-set to reject such cookies. Such implied consent is considered insufficient because it cannot be expected that the user agrees with the placement of a behavioral cookie, given that the majority of browsers are pre-set to allow such cookies. For the WP29, it is important that the user is able to provide informed consent. For browser settings to provide that, it should not be possible to circumvent them with the use of technology. Furthermore, consent given to a large number of cookies at once indicates that the user likely does not know the purposes of the cookies that are going to be placed. The WP29 accepts the possibility of a browser setting or a similar app providing valid consent only in cases where such a browser/app is pre-set to reject all third-party cookies.[21]

As already indicated by the WP29, some companies indeed used technical solutions to bypass browser settings and collect users’ personal data without their approval. In 2012, Google agreed to pay the largest FTC penalty to date for violating a Commission order.[22] The Safari browser was pre-set to reject third-party cookies, and Google circumvented this setting in order to obtain personal information from users who visited websites within Google’s DoubleClick advertising network.[23] This conduct also resulted in a lawsuit, in which the court stated that “the means by which defendants allegedly accomplished their tracking, i.e., by way of a deceitful override of the plaintiffs' cookie blockers, marks the serious invasion of privacy contemplated by California law.”[24] However, as CIPA in § 631(a) only refers to eavesdropping, the plaintiffs failed to show a violation of that law.[25] Regarding privacy violations in general, the court was of the opinion that “to prevail on claim of violation of right to privacy under the California Constitution, a plaintiff must possess a legally protected privacy interest, the plaintiff's expectations of privacy must be reasonable, and the plaintiff must show that the intrusion is so serious in nature, scope, and actual or potential impact as to constitute an egregious breach of the social norms.”[26] My interpretation of the above is that unless the IoT device’s data processing is particularly serious and exceeds the user’s reasonable expectation of privacy, it is unlikely to amount to a privacy violation under this standard. Similarly, the FTC pointed out that companies should provide users with a choice where the IoT data processing exceeds the expected use of the device.

Overall, there has not yet been much guidance regarding the use of implied consent (for IoT devices), apart from the FTC report presented above. In US case law, implied consent laws mostly relate to the consent motorists give to the use of breathalyzers,[27] which cannot be applied to IoT issues. However, there is some recent case law on implied consent involving Google,[28] Facebook[29] and Yahoo[30] for practices of screening/reading emails, and LinkedIn[31] for spamming users’ contacts obtained from the users’ address books. Although these cases relate to implied consent, they discuss a very specific issue and also involve the rights and expectations of people who were not direct users of those services. These cases may therefore be relevant for dealing with the consent of users indirectly affected by IoT devices.

3.3. Presented IoT consent solutions

In light of the above, I think that my proposed solutions from section 2, namely (1) the demo usage of an IoT device, (2) obtaining consent through a consent app and (3) providing the notice in audio form, could satisfy the legal requirements regarding consent on IoT devices (at least in the US). The purpose of data protection laws is to safeguard the rights and freedoms of data subjects. The new EU Data Protection Regulation emphasizes the importance of privacy impact assessments and the adoption of adequate organizational and technical measures to mitigate the risks for data subjects. Furthermore, it advocates a risk-based approach.[32] Therefore, if the IoT device does not process sensitive personal data, the risk for data subjects is lower, and the use of a demo version of the IoT device may be permissible in the eyes of the EU regulators.

Where the data processing involves sensitive personal information, the US laws impose stricter requirements, and companies should offer users a choice before engaging in such processing. The consent app, especially if it offers a broad spectrum of privacy choices and is resistant to technological tricks that bypass privacy settings, may fulfill the requirements for (implied) consent.

Lastly, there seem to be no requirements in either the EU or the US regarding the form of the notice, hence an audio notice instead of a written one may be sufficient. The FTC in particular seems open to companies’ creativity regarding IoT data processing notices. In the EU, however, consent should (in some cases) be provided in written form. The upcoming Data Protection Regulation seems to be more flexible regarding the form of consent, stating only that where data processing is based on consent, the company should be able to demonstrate that it was given.[33] I am optimistic that this provision recognizes various possibilities for obtaining users’ consent, maybe even through the solutions presented in section 2.

4. Conclusion

Legal requirements vary among countries and legal systems, which presents a difficulty for companies operating on global markets. IoT technology is on the rise, and stakeholders of various types and sizes are present on the IoT market. While differences in legal requirements regarding users’ consent may be easier to review for already established and successful businesses, startups and SMEs may need to devote a significant amount of resources to navigating the maze of obligations. This may not only hinder competition in the market but may also deprive users of novelties. At the same time, it is important that new technological solutions do not impede the fundamental rights of data subjects and that users are not forced to ‘pay’ with their freedoms for not being ‘left out’. The regulators carry the important role of balancing the interests of the tech sector against the protection of vulnerable users. In the modern world, users often do not have the power to bargain over contract provisions: privacy policies and terms of use are a clear example of “take it or leave it” conditions. However, the regulators should not take too conservative an approach but should actively engage in seeking new possibilities for safeguarding users’ rights while enabling technological development at the same time. Creating a privacy app could be one, perhaps (too) idealistic, possibility.


[1] David Banisar: National Comprehensive Data Protection/Privacy Laws and Bills 2014 Map, ARTICLE 19: Global Campaign for Free Expression, December 8, 2014, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1951416

[2] Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days, The Atlantic, available at http://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/.

[3] Consolidated text, outcome of the trialog: REGULATION (EU) No XXX/2016 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), available at http://www.emeeting.europarl.europa.eu/committees/agenda/201512/LIBE/LIBE%282015%291217_1/sitt-1739884.

[4] Terms of Service; Didn't Read, available at https://tosdr.org/.

[5] Manav Tanneeru: Can the law keep up with technology?, CNN, November 17, 2009, available at http://www.cnn.com/2009/TECH/11/17/law.technology/.

[6] Article 29 Working Party: WP 223 Opinion 8/2014 on the Recent Developments on the Internet of Things, September 16, 2014, available at http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf.

[7] Case C-131/12, available at http://curia.europa.eu/juris/liste.jsf?num=C-131/12.

[8] Law-Now: The Internet of things: a data protection challenge?, January 16, 2015, available at http://www.cms-lawnow.com/ealerts/2015/01/the-internet-of-things-a-data-protection-challenge.

[9] Article 29 Working Party: WP 223 Opinion 8/2014, supra note 6.

[10] Ibid.

[11] FTC Staff Report: Internet of Things, Privacy & Security in a Connected World, January 2015, available at https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf.

[12] Ibid.

[13] Ibid.


[15] Ibid.

[16] Ibid.

[17] Ibid.

[18] Out-law.com: 'Internet of things' data should be 'treated as personal data', say privacy watchdogs, October 21, 2014, available at http://www.out-law.com/en/articles/2014/october/internet-of-things-data-should-be-treated-as-personal-data-say-privacy-watchdogs/.


[19] Article 29 Working Party: WP 223 Opinion 8/2014, supra note 6.

[20] Article 29 Working Party: Opinion 2/2010 on online behavioural advertising, June 22, 2010, available at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp171_en.pdf.

[21] Ibid.

[22] Federal Trade Commission: Google Will Pay $22.5 Million to Settle FTC Charges it Misrepresented Privacy Assurances to Users of Apple's Safari Internet Browser, Press release, August 9, 2012, available at https://www.ftc.gov/news-events/press-releases/2012/08/google-will-pay-225-million-settle-ftc-charges-it-misrepresented.

[23] Ibid.

[24] In re Google Inc. Cookie Placement Consumer Privacy Litigation, 806 F.3d 125 (2015).

[25] Ibid.

[26] Ibid.

[27] FindLaw: Implied Consent Laws, available at http://dui.findlaw.com/dui-arrests/implied-consent-laws.html.

[28] In re Google Inc. Gmail Litigation, Not Reported in F.Supp.2d (2013) and (2014).

[29] Campbell v. Facebook Inc., 77 F.Supp.3d 834 (2014).

[30] In re Yahoo Mail Litigation, 308 F.R.D. 577 (N.D. Cal. 2015).

[31] Perkins v. LinkedIn Corporation, 53 F.Supp.3d 1190 (2014).

[32] See also Consolidated text, outcome of the trialog, supra note 3, Articles 22 and 33.

[33] See also Consolidated text, outcome of the trialog, supra note 3, Article 7(1).