Analysis of American Data Privacy and Protection Act (ADPPA) Discussion Draft Released June 3
The following information and analysis are being shared with TMA members courtesy of the Security Industry Association (SIA).
Over the last week, SIA Government Relations has been collecting and analyzing feedback from its members on the bipartisan data privacy discussion draft under consideration in the House Energy and Commerce Committee, which was released June 3. SIA has concerns about specific issues in the language as drafted that could have a sweeping impact on the use of video surveillance systems, alarm systems and biometric technologies (see below).
SIA Government Relations will be working quickly to raise these concerns with the key congressional offices to ensure they are addressed, as on its current trajectory the measure would likely clear committee before the July 4 recess and could pass the House by August. Subcommittee markup could occur as early as next week.
Below is SIA’s analysis, reflecting member input received as of June 14. Additional analysis of the House bill (and a competing Senate bill released by Sen. Cantwell that may also receive consideration this month) is also provided:
- Quick Reference Comparison
- Side-by-Side of Key Legislative Text
- Summary of Obligations under Three Corners Proposed Privacy Legislation
Framework: The overall framework of the ADPPA does not align with the GDPR and existing state data privacy laws in important respects, fails to preempt laws harmful to businesses and consumers, e.g., the Illinois Biometric Information Privacy Act (BIPA), and includes a private right of action sure to encourage abusive class action lawsuits. Specific to our industry, significant and interrelated flaws in the definitions of key terms and in key provisions make the measure incompatible with the effective use of many security systems, among other negative impacts. Without substantial revisions, the measure would have serious – and likely unintended – negative impacts on public safety.
Security Systems: The draft’s broad definition of “biometric information,” combined with the prohibitions on its collection, processing and transfer in Sec. 102(a)(3), would essentially prohibit the commercial use of security cameras of any kind without obtaining the “affirmative express consent” of all individuals who are recorded – consent a malicious actor is unlikely to provide. Beyond this concerning outcome, the requirement would create an insurmountable burden in the security setting, where such cameras are ubiquitous components of security systems widely used and accepted for protecting most businesses, commercial facilities, schools, transit systems and connected public spaces in the country – and virtually every business and nonprofit organization is considered a “covered entity” subject to requirements under the bill.
By including “facial imagery” (versus biometric measurements) in the definition of “biometrics,” any photo or video recording of a face becomes “biometric information” under the draft, regardless of how it is used. Significantly, data privacy laws in five states already specifically exclude photographs, video recordings and information derived from them from their definitions of biometric information. Additionally, all current state data privacy laws include some form of exception for security and anti-fraud purposes and/or cooperation with law enforcement. Likewise, the general exceptions in Title II, Sec. 209 of the discussion draft include (albeit narrowly) “to detect or respond to a security incident” and “to protect against fraudulent or illegal activity.” However, Title I’s Section 102(a)(3) prohibition with respect to biometric information appears to supersede the exceptions in Title II, rendering these limited security and anti-fraud exceptions inapplicable to photo and video data.
Similarly, due to the overly broad definition of “precise geolocation information” in the bill, the transfer prohibition in Section 102(a)(2) likely encompasses the date, time and location information typically associated with photo and video data when it is created and included when it is transmitted, impeding the operation of security and life safety systems. For example, remote video verification for intrusion detection systems is increasingly used to reduce false alarms; use of this technology may include the transmission of facial imagery along with this associated information, as illustrated in the sketch below. It is objectively impossible to obtain consent from all individuals who may trigger such alarms. Additionally, while the exceptions in Section 209 apply to “covered data” in the draft text, they do not also explicitly cover “sensitive covered data” (such as biometric information and precise geolocation information).
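To make the concern concrete, here is a minimal sketch of what a remote video verification event might contain when transmitted to a monitoring center. It assumes a hypothetical, illustrative payload format (none of the field names come from a real alarm protocol); the point is simply that facial imagery, timestamps and camera coordinates routinely travel together, so the “precise geolocation information” transfer prohibition could sweep in ordinary alarm traffic.

```python
# Hypothetical remote video verification event payload (illustrative only).
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class AlarmVerificationEvent:
    site_id: str       # monitored premises identifier
    captured_at: str   # ISO 8601 timestamp of the triggering clip
    latitude: float    # camera location – "precise geolocation" under the draft
    longitude: float
    clip_url: str      # link to the video clip containing facial imagery


def build_transmission(event: AlarmVerificationEvent) -> str:
    """Serialize the event for transfer to a monitoring center or first responder."""
    return json.dumps(asdict(event))


if __name__ == "__main__":
    event = AlarmVerificationEvent(
        site_id="store-1142",
        captured_at=datetime.now(timezone.utc).isoformat(),
        latitude=38.8895,
        longitude=-77.0353,
        clip_url="https://example.invalid/clips/abc123",
    )
    # Location, time and facial imagery are transmitted together in one event.
    print(build_transmission(event))
```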
For the reasons outlined above, it is imperative to, among other things, 1) provide a more robust and workable security exception in Section 209, 2) clarify that the Section 209 exceptions apply to the practices in Title I and to sensitive covered data, and 3) exclude photos and video from the definition of “biometric information.”
Consent: Instead of aligning with the GDPR and the latest state data privacy laws, which provide for a “clear affirmative act” to signify consent, the discussion draft would require a similar-sounding – but different – “affirmative express consent,” which could be read to negate the ability to use notice and consent mechanisms such as signage. “Affirmative express consent” as defined in the draft should be replaced with the common definition of consent used in the existing state data privacy laws of Colorado, Virginia, Utah and Connecticut.
Law Enforcement: All current state data privacy laws include an exception to their requirements for cooperating with or assisting law enforcement investigations, including Connecticut’s data privacy law enacted last month. Subsection (a)(9) on law enforcement cooperation in Section 209 of the draft (which is currently bracketed for review) should be retained and expanded to include facilitation of a law enforcement investigation.
Government Contractors: Related to law enforcement and other functions, many private entities collect, process and/or transfer information to federal, state or local governments as contractors, in some cases acting on their behalf. It appears that under the bill’s framework, government entities, while not covered entities, could be considered third parties in this arrangement. Therefore, it should be clarified that contractors acting in this capacity are considered service providers and not covered entities.
First Responders and Alarm Systems: Additionally, to address the third-party issue above, the exceptions in Section 209 should be clarified to include covered data that may be transmitted to first responders (as third parties) for responding to life safety emergencies, such as fires, in addition to security incidents.
Publicly Available Information: Information about individuals that is available to the public is not private and is thus excluded from the definition of covered data. However, the draft substantially narrows, with additional caveats, the commonly accepted definition of “publicly available information” used across existing state data privacy laws. This definition should be adjusted to align with the definitions in existing state laws to ensure that publicly available information continues to mean, generally, information lawfully available to the general public through government records or widely distributed media, or information required to be displayed in public by local, state or federal law.
Biometrics Definition/Provisions: As currently drafted, data could qualify as “biometric information” even if it is not actually biometric or presents no privacy risk, because there is no requirement that such data be used to identify a specific individual. The definition should be aligned with all existing state data privacy laws in the U.S. that address biometrics by requiring an identification capability or purpose for data to be considered biometric information. Also, unless the Section 102 prohibition with respect to biometric information is altered or removed, it will effectively prevent beneficial applications of biometric technologies for access control and security, where the collection of biometric data and the use of analytics are necessary to distinguish between enrolled and non-enrolled individuals.
Facial Recognition: Under Section 404(k), any state law solely addressing facial recognition is not preempted. This section should be removed. Facial recognition data is not fundamentally different from other biometric data: software-specific templates are created from biometric measurements and compared with enrolled data for similarity to make probabilistic match determinations. While some states have specifically restricted or regulated use of facial recognition by law enforcement and/or other government entities, no state has enacted a law specifically regulating or restricting use of this technology by the private-sector entities covered by the ADPPA. At the same time, commercial use of facial biometrics around the world for applications like identity protection and authenticated access to accounts and services is growing rapidly. For these reasons, it makes little sense to encourage future state laws that might be at odds with the principles and structure of national data privacy rules, and there is no reason why the rules specific to biometric information contemplated in the ADPPA would be insufficient to protect such data.
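The template-and-similarity process described above can be sketched in a few lines. This is a simplified illustration, not any vendor’s implementation: the templates are stand-in feature vectors (real systems derive them from face-analysis models), and the similarity threshold is an arbitrary placeholder. It shows why a “match” is a probabilistic determination against enrolled data rather than an exact identification.

```python
# Simplified template-based biometric matching (illustrative only).
from __future__ import annotations

import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match(probe: np.ndarray, enrolled: dict[str, np.ndarray],
          threshold: float = 0.8) -> tuple[str | None, float]:
    """Return the best-matching enrolled identity if similarity exceeds the threshold."""
    best_id, best_score = None, -1.0
    for identity, template in enrolled.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder enrolled templates; real templates come from face-analysis software.
    enrolled = {"employee-17": rng.normal(size=128), "employee-42": rng.normal(size=128)}
    probe = enrolled["employee-42"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
    print(match(probe, enrolled))  # probabilistic determination, not a certain identity
```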