FTC Delays Enforcement of its Negative Option Rule

In March 2023, the Federal Trade Commission (FTC) requested comments from industry on its Notice of Proposed Rulemaking to change how the agency enforces the 1973 Negative Option Rule. This rule has allowed the agency to combat unfair or deceptive practices related to subscriptions, memberships, and other recurring-payment programs. As online retail has become ubiquitous in recent decades, the agency is reviewing all automatic renewal programs and continuity offers in which customers are continually billed for products or services unless they expressly signal their wish to exit their agreement.

The new rule would require companies to provide specific express disclosures, obtain the consumer’s unambiguous affirmative consent to the negative option feature separately from the rest of the transaction, and offer one-click cancellation mechanisms. The proposed changes would apply to all forms of negative option marketing in all media (telephone, internet, traditional print media, and in-person transactions).

The Federal Trade Commission’s decision to expand its role in monitoring renewal and continuity agreements through new rulemaking will require companies offering subscription services to revisit their legal agreements. The new rulemaking will apply broadly to automatic renewals, continuity plans, free-to-pay conversions, and pre-notification negative option plans. The FTC will be increasing disclosure requirements, imposing specific consent requirements, mandating acceptable cancellation options, and limiting a company’s ability to attempt to “save” a deal.

Significant Rulemaking Ramifications:

Clear Disclosures: The FTC will not be altering what companies need to disclose, but rather how companies disclose information. If an advertisement is presented both visibly and audibly, the disclosure of a continuing subscription and its negative option features must likewise be detailed both visibly and audibly. A hidden hyperlink to the disclosure will not be acceptable.

Consent: The new rulemaking would expand “express informed consent” to “unambiguous affirmative consent.” The negative option offer must be separately and clearly defined in the transaction. A check box or simple click may be used to signal a consumer’s unambiguous affirmative consent.

Simple Cancellation: Companies must provide an easy method of cancelling the negative option feature, and the FTC suggests a “click-to-cancel” option.

Annual Reminders for Non-Physical Goods: Companies offering non-physical goods must provide at least annual reminders of the service and its terms, along with a means to cancel. The definition of “annual” is under review at the FTC and has not been fully determined.

Penalties:

A key change to the FTC’s enforcement is the introduction of financial penalties for companies in violation of the new rulemaking.

Any violation of the FTC’s Negative Option Rule can result in civil penalties of up to $50,120 per violation.

Additional Information & Dates:

In October 2024, the FTC published its final Negative Option Rule, which applies to any negative option program, including those using online, telephone, print, or in-person media. The Rule applies to both business-to-business and business-to-consumer transactions. The final Rule requires businesses to obtain a consumer’s “express informed consent” to the negative option feature before charging the consumer.

Companies must be in compliance with obtaining informed consent and providing a simple cancellation mechanism by May 14, 2025.

The FTC provided an extended compliance deadline of July 14, 2025 for the remaining provisions of the Rule.

Alarm monitoring companies should use this additional time to ensure full compliance with the rule’s requirements, which include clear disclosures, obtaining express consent, and providing straightforward cancellation methods for all negative option features, including auto-renewing subscriptions and sales.  The rule is currently facing legal challenges before the Eighth Circuit Court of Appeals and a decision there could impact the rule’s future enforcement.

Court Overturns FTC’s Ban on Non-Compete Clauses; Sept. 4 Deadline for Notifying Employees Is Cancelled

On August 20, the US District Court for the Northern District of Texas overturned the FTC’s recently adopted ban on the use of non-compete restrictions in employee contracts. The rule was scheduled to go into effect on September 4 and would have required employers to proactively notify their employees as of that date that most existing non-compete clauses would not be enforced and that no new non-compete arrangements could be imposed. The Court set aside the Non-Compete Rule in all respects, finding it to be arbitrary and capricious and beyond the authority of the FTC (especially since the ban applied to all non-compete clauses, with no attempt to identify only those restrictions that were harmful to competition and did not serve a legitimate need). Consequently, the Court made it clear that “the Rule shall not be enforced or otherwise take effect on its effective date of September 4, 2024 or thereafter.” The FTC may appeal this decision to the US Court of Appeals, so employers will need to stay abreast of any further rulings in this matter. But it is clear that employers do not have to send their employees a non-compete notice on September 4 (unless a higher court issues a stay of the District Court’s decision, which is unlikely, since postponing the Sept. 4 deadline simply maintains the status quo).

Please consult with employment counsel if you have any questions.

TMA/AICC Meet with Congressional Committee on Privacy Act

AICC attorney John Prendergast, Managing Partner, Blooston, Mordkofsky, Dickens & Prendergast, LLP, and lobbyist Bill Signer, Carmen Group, Inc., met this morning with House Energy and Commerce Democratic staff to discuss the American Privacy Rights Act. Several issues were raised: two specific to the alarm industry and three generic to the bill that will impact any company subject to the Act.


Next steps:

  • A memo containing recommendations for modifications and the needed legislative language will be drafted.
  • Outreach to the majority staff has been made.
  • Outreach to the Senate Commerce staff will also be made.

Member Call-for-Action – FCC Ruling


The Alarm Industry Communications Committee (AICC), a committee of The Monitoring Association (TMA), is requesting your help in this survey to protect your company’s interest in Internet communications in a proceeding now before the Federal Communications Commission (FCC). The FCC is once again considering, after several attempts, rules of the road for broadband Internet access service (BIAS) providers which are fair to industries that use the Internet, as the monitoring industry does.

These BIAS carriers are the companies with whom you contract for Internet service, and which allow the transmission of information and access to third party websites like YouTube. Importantly, quality of Internet service directly impacts the quality and speed of video transmission, which is an integral and growing aspect of alarm monitoring.

Please answer the following questions to the best of your knowledge. If you do not know an answer, simply indicate so. If this has occurred more than once, describe the one you feel impacted your company the most. Your company’s identity and specific responses will be kept confidential and will not be disclosed to third parties, including government agencies, without your permission.

Your time and attention are greatly appreciated. If you have any questions or comments, please contact Ben Dickens (bhd@bloostonlaw.com, 202-828-5510) or Sal Taillefer (sta@bloostonlaw.com, 202-828-5562).



WA and VA ECCs Go Live with TMA’s ASAP Service

TMA welcomes its 121st and 122nd ECCs to its ASAP-to-PSAP service – the Snohomish County WA 911 (Sno911) in Washington and the New River Valley Emergency Communications Regional Authority (NRV 911 Authority) in Virginia.

Snohomish County WA  #121

The Snohomish County WA 911 (Sno911) is the 121st ECC in the United States and the 5th agency in the state of Washington to implement the Automated Secure Alarm Protocol (ASAP). The Snohomish County ECC went live January 31, 2023, with: Vector Security, Rapid Response Monitoring, Security Central, Guardian Protection, Securitas, United Central Control, National Monitoring Center, Johnson Controls (Tyco), Affiliated Monitoring, Protection One, Brinks, Stanley Security, Vivint and ADT.

Andie Burton, Director of Operations, Snohomish County said, “The installation of the Automated Secure Alarm Protocol program into our system will streamline and increase efficiency of processing various alarm calls for service.  Use of the program will also free up call takers to answer other incoming 9-1-1 calls.  In a large Emergency Communications Center environment, preserving these precious resources will help provide better service to our community.”

New River Valley (Montgomery County) VA  #122

The New River Valley Emergency Communications Regional Authority (NRV 911 Authority) is the 122nd ECC in the United States and the 22nd agency in the state of Virginia to implement the Automated Secure Alarm Protocol (ASAP).  The NRV 911 Authority went live February 27, 2023 with: Rapid Response Monitoring, Vector Security, Stanley Security, Securitas, Protection One, United Central Controls, National Monitoring Center, Guardian Protection, Affiliated Monitoring, Brinks Home Security, Vivint, Security Central, CPI Security, Johnson Controls (Tyco) and ADT.

Learn more about ASAP-to-PSAP service HERE.

New York Governor signs Fair Repair Act, with amendments to protect most alarm devices and services

At year end, New York Governor Hochul signed the Fair Repair Act into law, making it easier for consumers to repair their own electronic devices.  Fortunately, in response to concerns raised by AICC and members of the alarm industry, the Governor simultaneously enacted amendments to the Act to help prevent the compromise of alarm systems in a way that would endanger customers and the public.

The original version of the Act (S. 4104-A) would have required manufacturers of “digital electronic equipment”, including alarm systems, to make product manuals, repair tools, lockout codes, passwords, system design schematics and other information available to customers and third-party contractors, so that they can attempt repairs on their own. Both houses of the New York legislature passed the bill in June 2022. Before the bill was forwarded to the Governor for signing, AICC and members of the alarm industry weighed in with a request asking for chapter amendments to the Act, so as to exempt central station alarm operations from the disclosure requirements in the bill. AICC pointed out that if access codes, passwords, or alarm system schematics are provided to customers or their contractors, and then either hacked or innocently made public (e.g., as part of a YouTube self-help video), it could allow bad actors to disable alarm systems, endangering tens or hundreds of thousands of New Yorkers.

The Governor’s amendment to the Fair Repair Act was responsive to alarm industry concerns in multiple ways:

  • Creation of a specific exemption for “home” security devices and alarm systems;
  • Creation of a specific exemption for equipment sold under a specific business-to-government or business-to-business contract, which is not otherwise offered for sale directly by a retail seller;
  • Deletion of the requirement for any alarm manufacturer or provider to disclose security codes and passwords for alarm equipment under any circumstance;
  • Creation of an exemption for “medical devices,” which should be broad enough to cover many security/medical monitoring pendants (to the extent that these devices are not already protected by the “home” alarm exemption discussed above);
  • Grandfathering of pre-July 1, 2023 equipment, narrowing the scope of the new disclosure requirements and giving the alarm industry time to prepare for the new law (to the extent any devices don’t qualify for one of the above exemptions);
  • Creation of an exemption for certain anti-theft security measures;
  • Allowing the provision of replacement part assemblies rather than individual components; and
  • Protecting alarm and other manufacturers from having to disclose trade secrets or intellectual property.

AICC is seeking clarification of some of the terms in the new law and expects more guidance in the coming weeks. The alarm industry should move expeditiously to urge the adoption of similar exemptions and measures in other pending state and federal Right to Repair legislation.

Provided courtesy of AICC.

TMA Honors Fiore’s Lifetime Contribution with Everlasting Scholarship

The Monitoring Association’s (TMA) President Morgan Hertel announced the establishment of the Louis T. Fiore Electronic Communications Scholarship on Mon., Oct. 31st during the TMA General Business Meeting at the 2022 Annual Meeting on Marco Island, FL. The scholarship honors Mr. Fiore’s enduring contribution to the alarm industry and the Alarm Industry Communications Committee (AICC) and is intended to promote careers in electronic communications.

“How do you recognize someone who’s given his heart and soul for more than 30 years to an organization?” asked President Hertel when introducing the new scholarship. “Lou has served the alarm industry in many capacities. We wanted to honor his immeasurable contributions in an everlasting manner to make certain he and his work are never forgotten.” The room rose with a standing ovation as Fiore approached the stage to accept a small token of recognition from President Hertel.

“I am honored. As chair of the AICC for 30 years, I worked hard to make certain the Committee and its members thrived. The work was very close to my heart. Thank you very much. This is incredible,” stated Fiore in his acceptance comments.

The scholarship is open to anyone studying electrical engineering, specifically electronic communications, or software development related to electronic communications, regardless of financial need. Students at any nationally accredited educational institution, including vocational schools, two-year colleges, or other institutions of higher learning are eligible.

For more information on this scholarship, including donating, go to http://tma.us/louis-t-fiore-electronic-communications-scholarship/

Pictured (l to r): Bill Signer, Carmen Group; Ret. U.S. Rep. Peter T. King [R-NY]; and Louis T. Fiore


E&C Announces Subcommittee Markup of Bipartisan, Bicameral Privacy Legislation & Seven Other Bills

Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-NJ), Ranking Member Cathy McMorris Rodgers (R-WA), Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL), and Subcommittee Ranking Member Gus Bilirakis (R-FL) announced today that the Consumer Protection and Commerce Subcommittee will hold a markup on Thursday, June 23, at 10:30 a.m. (EDT) in the John D. Dingell Room, 2123 of the Rayburn House Office Building.

“This week, we will take another major step in putting people back in control of their data and strengthening our nation’s privacy and data security protections by marking up the bipartisan American Data Privacy and Protection Act,” Pallone, Rodgers, Schakowsky, and Bilirakis said. “We continue to welcome and encourage input as we begin this next step in the regular order process. The Subcommittee will also consider seven other bills, including legislation to protect children from dangerous products, prevent unwanted recording by smart devices, and defend horses from inhumane practices. We look forward to working with Committee members on both sides of the aisle to advance these important bills.”   

The Subcommittee will consider the following bills:

  • H.R. 8152, the “American Data Privacy and Protection Act,” which was formally introduced in the House today by Pallone, Rodgers, Schakowsky, and Bilirakis.
  • H.R. 3355, the “Save America’s Forgotten Equines Act of 2021” or the “SAFE Act,” which was introduced by Reps. Schakowsky and Vern Buchanan (R-FL).
  • H.R. 3962, the “Securing and Enabling Commerce Using Remote and Electronic Notarization Act of 2021,” which was introduced by Reps. Madeleine Dean (D-PA), Kelly Armstrong (R-ND), and 32 original bipartisan cosponsors.
  • H.R. 4081, the “Informing Consumers About Smart Devices Act,” which was introduced by Reps. John Curtis (R-UT) and Seth Moulton (D-MA).
  • H.R. 4551, the “Reporting Attacks from Nations Selected for Oversight and Monitoring Web Attacks and Ransomware from Enemies Act” or the “RANSOMWARE Act,” which was introduced by Rep. Bilirakis.
  • H.R. 5313, “Reese’s Law,” which was introduced by Reps. Robin Kelly (D-IL), Jodey Arrington (R-TX), and Ted Lieu (D-CA).
  • H.R. 5441, the “Prevent All Soring Tactics Act of 2021” or the “PAST Act,” which was introduced by Rep. Steve Cohen (D-TN) and 212 other original bipartisan cosponsors.
  • H.R. 6290, the “Manufacturing.gov Act,” which was introduced by Reps. Paul Tonko (D-NY), Cindy Axne (D-IA), and Fred Upton (R-MI).

This will be a hybrid markup that includes both in person and remote member attendance via Cisco Webex video conferencing. Members of the public may view the markup via live webcast accessible on the Energy and Commerce Committee’s website. Please note the webcast will not be available until the markup begins.

IMPORTANT NOTE:

Our language is included under Sec. 101(b)(4) as a permissible service:

(4) To prevent, detect, protect against, or respond to a security incident, or fulfill a product or service warranty. For purposes of this paragraph, security is defined as network security as well as intrusion, medical alerts, fire alarms, and access control security.

Analysis of American Data Privacy and Protection Act (ADPPA) Discussion Draft Released June 3

The following information and analysis is being shared with TMA members courtesy of The Security Industry Association (SIA).

Over the last week, SIA Government Relations has been collecting and analyzing feedback from its members on the bipartisan data privacy discussion draft under consideration in the House Energy and Commerce Committee (released June 3). SIA has concerns that some specific issues in the language as drafted could have a sweeping impact on the use of video surveillance systems, alarm systems, and biometric technologies (see below).

SIA Government Relations will be working quickly to raise concerns about these issues with the key Congressional offices to ensure they are addressed, as it is likely this measure would clear committee before the July 4 recess and possibly the House by August on its current trajectory.  Subcommittee markup could occur as early as next week.

Below is SIA’s analysis and member input received as of June 14. Additional analysis of the House bill (and a competing Senate bill released by Sen. Cantwell that may also receive consideration this month) is also provided.

Framework: The overall framework of the ADPPA does not align with the GDPR and existing state data privacy laws in important respects, fails to preempt laws harmful to businesses and consumers (e.g., the Illinois Biometric Information Privacy Act (BIPA)), and includes a private right of action sure to encourage abusive class action lawsuits. Specific to our industry, significant and interrelated flaws in the definitions of key terms and provisions make it incompatible with the effective use of many security systems, among other negative impacts. Without substantial revisions, the measure would have serious – and likely unintended – negative impacts on public safety.

Security Systems: The draft’s broad definition of “biometric information” combined with the prohibitions on its collection, processing, and transfer in Sec. 102(a)(3) would essentially prohibit the commercial use of security cameras of any kind without obtaining the “affirmative express consent” of all individuals who are recorded – consent a malicious actor is unlikely to provide. Apart from this concerning outcome, this would create an insurmountable burden in the security setting, when such cameras are ubiquitous components of security systems widely used and accepted for protecting most businesses, commercial facilities, schools, transit systems, and connected public spaces in the country – and virtually every business and non-profit organization is considered a “covered entity” subject to requirements under the bill.

By including “facial imagery” (versus biometric measurements) in the definition of “biometrics,” any photo or video recording of a face becomes “biometric information” under the draft, regardless of how it is used. Significantly, data privacy laws in five states already specifically exclude photographs, video recordings, and derived information from their definitions of biometric information. Additionally, all current state data privacy laws include some form of exception for security and anti-fraud purposes and/or cooperation with law enforcement. Likewise, the general exceptions in Title II, Sec. 209 of the discussion draft include (albeit narrowly) “to detect or respond to a security incident” and “to protect against fraudulent or illegal activity.” However, Title I’s Section 102(a)(3) prohibition with respect to biometric information appears to supersede the exceptions in Title II, rendering these limited security and anti-fraud exceptions inapplicable to photo and video data.

Similarly, due to the overly broad definition of “precise geolocation information” in the bill, the transfer prohibition in Section 102 (a)(2) likely encompasses the date, time and location information typically associated with photo and video data when it is created and included when it is transmitted, impeding operation of security and life safety systems. For example, remote video verification for intrusion detection systems is increasingly utilized to reduce false alarms. Use of this technology may include the transmission of facial imagery along with this associated information. It is objectively impossible to obtain consent from all individuals that may trigger such alarms. Additionally, while the exceptions in Section 209 apply to “covered data” in the draft text, this does not also explicitly include “sensitive covered data” (like biometric information and precise geolocation information).

For the reasons outlined above, it is imperative to, among other things, 1) provide a more robust and workable security exception in Section 209, 2) clarify that Section 209 exceptions apply to practices in Title 1 and to sensitive covered data, and 3) exclude photos and video from the definition of “biometric information.”

Consent: Instead of aligning with the GDPR and the latest state data privacy laws, which provide for a “clear affirmative act” to signify consent, the discussion draft would require a similar-sounding but different “affirmative express consent,” which could be read to negate the ability to use notice-and-consent mechanisms like signage. “Affirmative express consent” as defined in the draft should be replaced with the common definition of consent used in existing state data privacy laws in Colorado, Virginia, Utah, and Connecticut.

Law Enforcement: All current state data privacy laws include an exception to requirements when it comes to cooperating with or assisting law enforcement investigations, including Connecticut’s data privacy law enacted last month. Subsection (a)(9) on law enforcement cooperation in Section 209 of the draft (which is currently bracketed for review) should be retained and expanded to include facilitation with a law enforcement investigation.

Government Contractors: Related to law enforcement and other functions, there are many private entities that collect, process and/or transfer information to federal, state or local governments as a contractor, including acting upon their behalf in some cases. It appears under the bill’s framework that government entities, while not covered entities, could be considered third parties in this arrangement. Therefore, it should be clarified that contractors acting in this capacity are considered service providers and not covered entities.

First Responders and Alarm Systems: Additionally, to address the third-party issue above, the exceptions in Section 209 should be clarified to include covered data that may be transmitted to first responders (as third parties) for responding to life safety emergencies, such as fire, in addition to security incidents.

Publicly Available Information: Information about individuals that is available to the public is not private and thus is excluded from the definition of covered data. However, the draft substantially narrows the commonly accepted definition of “publicly available information” used across existing state data privacy laws with additional caveats. This definition should be adjusted to align with definitions in existing state laws to ensure publicly available information continues to generally mean information lawfully available to the general public through government records, widely distributed media or required to be displayed in public by local, state or Federal law.

Biometrics Definition/Provisions: As currently drafted, data could be included as “biometric information” that is not actually biometric or does not present a privacy risk, because there is no requirement that such data is used to identify a specific individual. The definition should be aligned with all existing state data privacy laws in the U.S. that address biometrics, by requiring an identification capability or purpose for data to be considered biometric information. Also, unless the Section 102 prohibition with respect to biometric information is altered or removed, it will effectively prevent beneficial applications of biometric technologies for access control and security, where the collection of biometric data and use of analytics is necessary to distinguish between enrolled/non-enrolled individuals.

Facial Recognition: Under Section 404(k), any state law solely addressing facial recognition is not preempted. This section should be removed. Facial recognition data is not fundamentally different from all other biometric data. Software-specific templates are created based on biometric measurements that are compared with enrolled data for similarity to make probabilistic match determinations. While some states have specifically restricted or regulated use of facial recognition by law enforcement and/or other government entities, no state has enacted a law specifically regulating or restricting use of this technology by the private sector entities covered by the ADPPA. At the same time, commercial use of facial biometrics around the world for applications like identity protection and authenticated access to accounts and services is rapidly growing. For these reasons it makes little sense to encourage future state laws that might be at odds with the principles and structure of national data privacy rules, and there is no reason why the rules specific to biometric information contemplated in the ADPPA would be insufficient to protect such data.