Worker surveillance must comply with credit reporting rules

US Consumer Financial Protection Bureau demands transparency, accountability from sellers of employee metrics
The US Consumer Financial Protection Bureau on Thursday published guidance advising businesses that third-party reports about workers must comply with the consent and transparency requirements set forth in the Fair Credit Reporting Act.

The Fair Credit Reporting Act (FCRA) was enacted in 1970 to ensure the accuracy, fairness, and privacy of information in profiles maintained by credit reporting agencies. But it also includes provisions that apply when a consumer report is used to make employment decisions.

The Bureau (CFPB) is concerned that companies may be using third-party reports about worker activity or behavior to inform adverse employment decisions (such as firing workers) based on undisclosed surveillance or opaque algorithmic scores.

"Workers shouldn't be subject to unchecked surveillance or have their careers determined by opaque third-party reports without basic protections," declared CFPB director Rohit Chopra in a statement. "The kind of scoring and profiling we've long seen in credit markets is now creeping into employment and other aspects of our lives. Our action today makes clear that longstanding consumer protections apply to these new domains just as they do to traditional credit reports."

Worries about workplace surveillance and unaccountable algorithmic decision-making have proliferated with the adoption of machine learning models, the growing sophistication of online analytics, the uptake of sensor-laden mobile phones, and the need to manage remote workers. Cracked Labs, an Austrian nonprofit research group, has been exploring the topic in a recent series of reports.

While algorithmic wage discrimination remains a significant concern for employees, the CFPB is focused specifically on consumer reports used to predict worker behavior (such as guessing whether a worker will join a union), to automate job assignments based on performance data, to issue warnings or disciplinary actions without human oversight, and to assess social media use in employee evaluations.

Chopra elaborated on these concerns in public remarks on Thursday at the Michigan Nurses Association, noting how he has received questions from workers about being obligated to carry a device or install an app that surveils them.

"I have serious concerns about how background dossiers and reputation scores can be used in hiring, promotion, and reassignment," he noted. "If an employer purchases a report that details whether a worker was a steward in a union, utilized family leave, enrolled their spouse and children in benefits programs, was cited for poor performance, or was deemed to be productive, this can raise serious issues about privacy and fairness. And if this information is converted into some sort of score using an opaque algorithm, that makes it even more suspicious."

Consumer reporting agencies and background screening companies, according to the CFPB, now offer employers data about workers' activities and personal lives.

"For example, some employers now use third parties to monitor workers' sales interactions, to track workers' driving habits, to measure the time that workers take to complete tasks, to record the number of messages workers send and the quantity and duration of meetings they attend, and to calculate workers' time spent off-task through documenting their web browsing, taking screenshots of computers, and measuring keystroke frequency," the agency reported. "In some circumstances, this information might be sold by 'consumer reporting agencies' to prospective or current employers."

The CFPB circular explains the agency's legal basis for applying the FCRA to data collected about workers and its requirements for businesses that rely on such data. Companies that wish to use such data must obtain employee consent before doing so. They must provide detailed information about any data used to make adverse employment decisions. When workers dispute that data, companies must correct inaccuracies. And such data can be used only for purposes allowed under the law – companies can't, for example, sell the information or use it to market financial products to workers. ®
