
Freshfields TQ

Technology quotient - the ability of an individual, team or organization to harness the power of technology


“Please show your contact tracing app at the door”: Legal implications of contact tracing apps for US companies

This post is part of a series on contact tracing apps. You can read our introduction to the series and get links to the other entries here.

You’re a business that desperately wants to reopen its doors in the midst of the COVID-19 pandemic. At the same time, you obviously don’t want to expose your employees, customers, and visitors to risk if you don’t have to. You would rather invite back individuals who pose lower risk, while (politely) asking people who pose higher risk to stay away. It’s not practical to do a medical background check on each person waiting to get into your building. Instead, could you demand that individuals show you their contact tracing apps?

By way of reminder, contact tracing apps have emerged as a compelling method for containing the spread of COVID-19, particularly as parts of the US begin to lift restrictions. Essentially, these apps let users know if they might have come into close contact with other users who have tested positive for the virus. Because governments in the United States have largely stayed out of the development of contact tracing apps (as we discussed in our first blog post on this subject), individuals are legally free to use or not use them as they wish. The point of this blog post is to explore a slightly different question: what if use of these apps is required not by governments, but by businesses?
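For readers who want a concrete picture of the mechanics, here is a deliberately simplified sketch, in Python, of how a decentralized exposure-notification scheme can work. It is illustrative only and does not reproduce any particular app’s actual protocol; real implementations (such as apps built on the Apple/Google exposure notification framework) add rotating cryptographic identifiers, Bluetooth signal-strength estimates, and time windows that this toy version omits.

```python
import secrets

# Toy model of decentralized exposure notification (illustrative only).
# Devices broadcast random tokens, remember tokens they overhear nearby,
# and check those tokens against a published list from users who later
# test positive. No identity or location is exchanged in this sketch.

class Device:
    def __init__(self):
        self.my_tokens = []        # random tokens this device has broadcast
        self.heard_tokens = set()  # tokens overheard from nearby devices

    def broadcast(self):
        # Emit a fresh random token; nothing about the user is transmitted.
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def hear(self, token):
        # Record a token received from a device in close physical proximity.
        self.heard_tokens.add(token)

    def check_exposure(self, published_positive_tokens):
        # Report possible exposure if any overheard token matches a token
        # later published by a user who tested positive.
        return bool(self.heard_tokens & set(published_positive_tokens))


# Example: Alice and Bob cross paths; Alice later tests positive.
alice, bob = Device(), Device()
bob.hear(alice.broadcast())           # they were near each other
published = alice.my_tokens           # Alice uploads her tokens on a positive test
print(bob.check_exposure(published))  # True -> Bob's app would say "exposed"
```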

In short, there’s not much stopping businesses from asking employees, customers, or other visitors to show their contact tracing apps at the door. But a business that chooses to do so will have to remain within the bounds of data privacy laws and discrimination laws. It should also be mindful of the potential for bias or disparate impact from any measures taken.

Privacy Laws

Privacy is top of mind whenever a company asks individuals for information, especially health-related information. But is it a real concern here? Suppose you have a guard at the door who merely asks people to hold up their phone and show their contact tracing app. The guard doesn’t ask for the visitor’s name, ID number, or anything else about the visitor. The guard simply determines whether the app says “not exposed” or “exposed” (or whatever the various apps end up saying). In that situation, it seems unlikely that privacy laws will be engaged. Because you haven’t asked the visitor for any identifying information, there isn’t really any collection of personal information, regardless of whether you’re talking about “personally identifiable information,” “personal information,” or something else.

The situation is harder when the guard knows the identity of visitors. Maybe the visitors are employees and the guard has gotten to know them over time. Maybe visitors have to sign in, or maybe employees flash a key card that lets the guard know their name. In these cases, there may be a “collection” of personal information by the guard, even if the information isn’t recorded on paper or in a computer. The collection may be momentary, fleeting, and trivial—but it’s still technically a “collection.”

Still, privacy law doesn’t pose a significant challenge. Take the California Consumer Privacy Act (CCPA), for example.

If you’re in California and you’re in the situation we’ve just described, you might be required to provide a “collection notice,” even though the collection of personal information is trivial. Luckily, the CCPA requires collection notices to state the categories of information collected and the business purposes of the collection, but not much more. (You may also need to provide a link to your online privacy policy.) “We’re looking at your contact tracing app status to secure our building against the risk of infection” would likely suffice. And the CCPA’s proposed regulations allow notices to be provided orally or on a reasonably visible sign, so a small placard on your front desk will likely be fine.

There is also a very, very literal argument that your organization’s privacy policy needs to reflect the collection of contact tracing app data. But because CCPA privacy policies are structured around broad categories of data, many privacy policies may already be flexible enough to accommodate it. And this requirement wouldn’t apply to employees, due to the CCPA’s complicated exceptions.

What if the business refuses service to anyone who declines to show their contact tracing app? This may be the trickiest part. Under the CCPA, businesses aren’t supposed to give consumers different levels of service (or refuse service entirely) just because the consumer refuses to provide personal information. A business can deny service only if the decision is reasonably related to the value of the consumer’s data, and only if the business jumps through some hoops. Among other things, under the proposed regulations, the business must give a “financial incentive” notice that discloses its valuation of the personal information and how it calculated that valuation. That means a company that wants to ask consumers to show their contact tracing apps at the door, in a way that could be deemed a collection of personal information, may need to come up with a good faith “valuation” of knowing whether a customer is at risk of carrying COVID-19 and then disclose it. Alternatively, the business might take a risk-based decision that the California AG wouldn’t spend its effort on a collection of personal information of such a trivial, fleeting nature. Or it might save itself the trouble and make sure it asks to see contact tracing apps on a purely anonymous basis.

Moving past the CCPA, we’re sure to hear someone shout: “But HIPAA!” Well, not really. HIPAA is an important law governing medical privacy, but it generally applies only to particular types of businesses: medical providers, medical insurers, medical information clearinghouses, and the service providers (called “business associates”) of entities in those categories. The average business doesn’t need to worry about it.

The bottom line is that companies should be applauded for thinking about privacy if they require visitors to show contact tracing apps (privacy by design!). But the burden should not be too great, and it can be managed.

Discrimination Law

What about discrimination laws?  If a business refuses to allow entrance to people who have had contact with an infected person, is that “discrimination”?  What about the mere act of denying entrance to people who refuse to show their contact tracing app—or who don’t have the app at all?

Disability-Based Discrimination 

Companies that require visitors (whether employees, customers, or otherwise) to show their contact tracing apps should first and foremost be mindful of the Americans with Disabilities Act (ADA). The ADA prohibits discrimination against a person based on that person’s disability, and illnesses can sometimes be deemed disabilities.

In the employment context, the ADA is administered by the Equal Employment Opportunity Commission (EEOC), so let’s start with what the EEOC has said on the topic. Current guidance from the EEOC does not address the use of contact tracing apps specifically, but it does address similar employer measures. Under the ADA, an employer can’t make a disability-related inquiry of an employee or require an employee to submit to a medical examination, except where these measures are job-related and consistent with business necessity. That threshold is satisfied where, for example, the employee would pose a “direct threat” to her own health and safety or to that of others because of a medical condition. The EEOC’s recently published guidance on COVID-19 expressly states that the current pandemic meets the ADA’s “direct threat” standard. So even if COVID-19 were considered a “disability,” the ADA would allow employers to make disability-related inquiries and require medical examinations in the context of COVID-19, including administering COVID-19 tests and measuring employees’ body temperature. Asking to see someone’s contact tracing app, which is much less invasive, would likely be fine, too. In any event, the EEOC’s guidance expressly takes the position that an employer can ask an employee about exposure to COVID-19, even if the employee exhibits no symptoms. Asking to see a contact tracing app is really just another way of asking about an employee’s exposure.

Although the EEOC’s guidance doesn’t apply when it comes to customers and visitors, it’s likely that a similar approach would hold: the ADA’s “direct threat” exception applies to employees, customers, and visitors alike.

Disparate Impact

But what about other types of discrimination? This might be the thorniest part. Here’s the problem: contact tracing apps are really only viable on smartphones, since the whole point is to record a user’s physical proximity to others, and that usually requires Bluetooth or GPS technology. Not everybody has a smartphone. On the contrary, there’s a “digital divide,” or rather, a number of digital divides. The rate of smartphone adoption differs depending on things like socio-economic status, race or national origin, age, and disability (e.g., visual impairment). It has already been noted how these digital divides might affect the ability of certain communities to access, for example, personal health records. So if you exclude customers who can’t show you a contact tracing app, you exclude people who don’t have smartphones; and if you exclude people who don’t have smartphones, will you unintentionally exclude people in part based on protected classifications like race or age? Anti-discrimination law for employment prohibits employers from practices that cause a statistical “disparate impact” on groups based on these classifications, unless the employer shows that the practice is “job-related and consistent with business necessity.” Anti-discrimination law for public accommodations may do the same (the law isn’t settled on that yet), and anti-discrimination law in other contexts varies a bit. What this means is that a business that wants to require employees, customers, or other visitors to show a clean contact tracing app may need to consider and document why it believes doing so is a business necessity. With no consistent guidelines across the country on how to reopen businesses safely, this may be easier said than done.
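To make the statistical side of “disparate impact” concrete, the sketch below applies the EEOC’s “four-fifths” rule of thumb to entirely hypothetical entry numbers: if the rate at which one group clears a screening practice is less than 80 per cent of the rate for the most-favored group, that can be treated as preliminary evidence of adverse impact. The group labels and counts are invented for illustration, and the four-fifths rule is a screening heuristic, not a definitive legal test.

```python
# Hypothetical screening outcomes: how many people in each group could show
# a contact tracing app at the door, out of how many who tried to enter.
# Group names and counts are invented purely for illustration.
admitted = {"group_a": 450, "group_b": 280}
attempted = {"group_a": 500, "group_b": 400}

rates = {g: admitted[g] / attempted[g] for g in admitted}
best = max(rates.values())

# EEOC "four-fifths" rule of thumb: a selection rate below 80% of the
# highest group's rate is often treated as evidence of adverse impact.
for group, rate in rates.items():
    ratio = rate / best
    flag = "possible adverse impact" if ratio < 0.8 else "within rule of thumb"
    print(f"{group}: admission rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```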

Employee, customer, and visitor reaction

Companies should also consider how people will react. We’ve seen significant pushback against COVID-19 measures in some places. Employees, for example, may object to an employer demanding to see their contact tracing apps. If employees object as a group, this may well engage collective bargaining rights under the National Labor Relations Act. And if employers meet those objections by taking adverse employment actions against objecting employees, they may face retaliation claims.

More broadly, companies should consider their overall relationship with employees, customers, and visitors. The main apps proposed in the US offer robust privacy protections, but the average person may not understand the technologies that achieve those protections, and some distrust of the apps is inevitable. Will the mere act of asking to see contact tracing apps at the door cause some people to turn away?


Tags

covid-19, americas, intellectual property, data, data protection