FCRA Boundaries Debated

Industry, Privacy Groups Clash in Consumer Scoring Comments to FTC

Technology and marketing associations are disputing calls for legislation on consumer data analytics products, in FTC comments following a recent seminar on the issue (WID March 20 p3). The Direct Marketing Association (DMA) and the Software and Information Industry Association (SIIA) stressed the current regulatory framework’s effectiveness and urged the FTC to focus on enforcement rather than new regulations. Privacy advocates cautioned that many analytical products aren’t regulated and are thus shielded from consumer and government scrutiny, creating myriad possibilities for discrimination. Comments on the issue were due Monday.

The FTC and Congress have been focused on data broker business practices in recent months, including data analytics used to segment consumers, called alternative scoring products or predictive consumer scoring. The FTC probed representatives on both sides of the debate at the March seminar, and the commission has been hinting for months at the upcoming release of a years-long overarching data broker study (WID March 7 p4). FTC officials have also taken to Capitol Hill to explain their concerns for consumers (WID Dec 19 p1).

Comments to the FTC on consumer scoring split along industry and consumer interest lines. The two sides differed on whether current law covers consumer scoring, how harmful the use of consumer scores -- both accurate and inaccurate -- is for consumers, and whether consumers have enough information about these scores.

SIIA cited the March 19 seminar. Many examples of harmful consumer scoring raised there are either not harmful to consumers or are already covered by current laws, the group said. Panelists, who didn’t include an SIIA representative, discussed the Klout score -- a number the company Klout produces based on a user’s social network connections and followers. Consumer advocates like World Privacy Forum (WPF) Executive Director Pam Dixon cited the example of an employer declining to hire a job candidate because of a low Klout score as evidence of possible harm not covered by current law. But a Klout score is transparent and people are able to opt out of it, SIIA said. “The major business function of a Klout score is to enable businesses to reach social media participants who might be especially influential if they were persuaded that a particular product or service was valuable.” Hotel discounts and flight upgrades were two common examples of such offers cited in comments. “None of the activities of Klout appear to be harmful to the consumer.” Current laws like the Fair Credit Reporting Act (FCRA) need not apply, it said. Klout is not an SIIA member, according to SIIA’s website.

"People can opt out of having a Klout score if they dare, but in some professions, not having a Klout score would be a professional liability,” said WPF. Dixon, who wrote the comments, conceded the transparency of the Klout score itself, “but the secrecy of the Klout score’s composition and opacity of use remain prime concerns,” she said. Unlike credit scores -- available to the individual but not the public without constraint -- Klout scores can possibly be used by data brokers to combine with other publicly available scores to profile consumers, said WPF. “If so, the Klout score may have more influence than even the company itself realizes, because the score could be used in algorithms that determine consumer placement and rank on lists for a wide range of consumer offers and non-offers."

"In those circumstances, however, FCRA does apply,” said SIIA. Any time a consumer score is used to “make a final eligibility decision or take an adverse action regarding these eligibility contexts” is covered by FCRA, said SIIA. “But there are different circumstances where … the final eligibility decision or adverse action is still to be made and the score is used solely to directly market to people to encourage more useful applications.” These circumstances are “marketing information, not eligibility information,” SIIA said. Marketing information shouldn’t be under FCRA, it said, quoting a line from a FTC 2012 report on consumer privacy: “For data used solely for marketing purposes, the Commission agrees with the commenters who stated that the costs of providing individualized access and correction rights would likely outweigh the benefits."

The argument that consumer scores are used solely for marketing doesn’t “hold water,” said the U.S. Public Interest Research Group (U.S. PIRG) and the Center for Digital Democracy (CDD) in joint comments echoing many WPF concerns. “As ads or other forms of marketing (such as using social media techniques) promoting credit cards or loan products appear in real time directly on a consumer’s phone or computer … these scoring systems should be considered to result in prescreened offers of credit and thus trigger the protections of the FCRA.” If not, the FTC should subject these practices “to a set of parallel regulatory protections,” said CDD and U.S. PIRG.

"Robust enforcement” of DMA’s self-regulatory data-use guidelines provides a more “nimble” method of addressing any harms arising from consumer scores, said that association. “Unlike legislation, which is static and runs the risk of codifying practices that may become out-of-date even before a bill turns into law, industry self-regulation is nimble by its very nature and thus better suited to provide protections."

The FTC should focus on private sector outreach, said the Information Technology and Innovation Foundation-backed Center for Data Innovation. “While we will never have perfect predictions, we are moving towards better accuracy, and this creates more fairness for consumers,” said CDI Director Daniel Castro, who wrote the comments. “The key is for policymakers to reduce the risk of inaccurate data and, in some cases, evenly distribute risk among citizens, such as with health insurance.” To achieve this, the FTC “should work with the private sector to encourage more widespread use of predictive analytics,” said Castro. He said eliminating restrictions on data sharing would allow the FTC to focus on ensuring “individuals are treated fairly.”