
Data Privacy Predictions and Protections

The law around data privacy is entering extremely interesting territory, thanks to recent Supreme Court decisions, ongoing cybersecurity trends, and more. Increasingly, ethics and compliance professionals find themselves having to navigate these issues in ways they never expected. Data privacy expert Miriam Wugmeister of Morrison Foerster discusses what she sees coming, and how to prepare for it.

October is Cybersecurity Awareness Month, and data privacy is a huge element of this. From your perspective, what is the most notable challenge or opportunity you are counseling clients to take action on when it comes to data privacy readiness as we head into 2023?

From a privacy perspective—by that I mean what personal information you collect, the purposes for which you use it, and how you share it—the major opportunity for organizations is to think globally. Some 150 countries have privacy laws of some kind. In the U.S., we have five states with omnibus privacy laws and we’re going to have more, as well as Federal laws, so it’s really easy to lose sight of the forest for the trees.

The good news is that just about every single privacy law is based on the same core set of privacy principles: notice, choice, access and correction, supervision of service providers, and data security. The way organizations often deal with this is to base a privacy compliance program on those core privacy principles. Focus on the forest, not the trees.

From a cybersecurity perspective, it’s a little bit different. The bad guys are only getting more active. The amount of money to be made, particularly by cybercriminals, is continuing to increase. So there, the opportunity for organizations is to make it harder for the bad guys to get into your system. You know the phrase, an ounce of prevention is worth a pound of cure? In cyber, an ounce of prevention is worth many pounds of cure. If you can make yourself an unattractive target—multi-factor authentication, good firewalls, good monitoring, passwords that aren’t easily guessed—that’s the opportunity. It’s prevention.

As the business world settles into remote work as a permanent or semi-permanent condition, what data privacy issues do you see that companies have yet to really adapt to?

I think the issue that companies are still grappling with is how much monitoring of employees makes sense, and how much do employees want?

On the one hand, organizations need to monitor certain activities to make sure that the information that’s entrusted to them is protected. You need to know, for example, if Miriam logged in at 8:00 am from Connecticut, and then also logged in at 8:02 am from China. That’s an impossible login. A certain amount of monitoring has to happen in order to keep the system secure. But do you need to capture every keystroke I make just because I’m working at home? Do you need to tell me that I have to have my camera on so that you can make sure I’m actually at my desk and working all the time? I think a lot of organizations are still struggling to find the right balance.
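To make the "impossible login" check concrete, here is a minimal sketch of the kind of rule a monitoring system might apply, flagging consecutive logins whose implied travel speed is physically impossible. The speed threshold, coordinates, and helper names are illustrative assumptions, not anything described in the interview.

```python
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0
MAX_PLAUSIBLE_SPEED_KMH = 900.0  # assumption: roughly airliner cruising speed

@dataclass
class LoginEvent:
    user: str
    timestamp: datetime
    lat: float  # latitude from geolocating the login's IP address
    lon: float  # longitude from geolocating the login's IP address

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def is_impossible_travel(prev: LoginEvent, curr: LoginEvent) -> bool:
    """Flag consecutive logins whose implied travel speed no real traveler
    could achieve."""
    distance_km = haversine_km(prev.lat, prev.lon, curr.lat, curr.lon)
    hours = (curr.timestamp - prev.timestamp).total_seconds() / 3600.0
    if hours <= 0:
        # Same moment (or out-of-order events) from materially different places.
        return distance_km > 50.0
    return distance_km / hours > MAX_PLAUSIBLE_SPEED_KMH

# The interview's example: Connecticut at 8:00 am, China at 8:02 am.
ct = LoginEvent("miriam", datetime(2022, 10, 3, 8, 0), lat=41.6, lon=-72.7)
cn = LoginEvent("miriam", datetime(2022, 10, 3, 8, 2), lat=39.9, lon=116.4)
print(is_impossible_travel(ct, cn))  # True: roughly 11,000 km in two minutes
```

In practice, IP geolocation is approximate, so a real system would treat a rule like this as one signal among several rather than conclusive evidence of compromise.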

Over-monitoring will make people not want to work for an organization. Employee trust is really important to the culture of many organizations. Obviously, the more sensitive the information you have, the more complicated this gets, and there’s a whole concept of insider threat and of monitoring the monitors.

I think a good test, if you’re trying to decide what monitoring to apply to, let’s say, a line worker, is to ask how your C-suite would respond if you applied it to them. How would our CEO feel about having this applied to her? How would our CISO feel about this being applied to everybody on their team? If your C-suite says it feels invasive to them, maybe that’s how the line workers feel too.

The New York Department of Financial Services recently released new draft amendments to its controversial cybersecurity rule that would include significant changes such as a mandatory 24-hour notification for cyber ransom payments, heightened cyber expertise requirements for board members, and new access restrictions to privileged accounts. Can you walk us through what companies can expect from this rule, and others like it?

The New York DFS rule is part of a larger trend. We have 72-hour notice requirements for cybersecurity incidents in a lot of jurisdictions, so that is not new. India just passed a rule that requires notice of a breach, in certain circumstances, within six hours. This is part of a continuum: the period within which regulated and non-regulated entities need to give notice of cybersecurity incidents keeps getting shorter.

The part about giving notice of a ransom or extortion payment is new. The New York DFS has it. Some of the new rules being proposed at the U.S. Federal level also have that requirement. Part of that is really that law enforcement and regulators believe they can’t figure out what the right rules should be if they don’t know how often people pay, or how much. They don’t know the factors that organizations consider when deciding whether or not to make a payment. Is it risk to life and safety? Is it existential risk to the business? There is this new desire on the part of law enforcement and regulators to gather more information quickly. The 24 hours, I think, is pretty hard, because if you’re in the middle of a ransomware attack, there is a lot of stuff going on. Companies will have to make sure that they report any extortion payment, assuming the New York DFS and Federal rules go into effect.

Another trend we are seeing at both the NY DFS and Federal levels (the SEC in particular) is a heightened desire to have the Board of Directors, as opposed to management, engaged in the evaluation and mitigation of cyber risk. Under the draft SEC and NY DFS rules, companies are going to have to publish whether or not the Board has cyber expertise. And if the NY DFS rule goes into effect, the CISO is going to have specific reporting obligations to the Board of Directors, and you will also need a Board-approved cybersecurity policy. That’s interesting, because normally organizations determine what their highest risks are and then share those with the Board; it isn’t usually statute that dictates the specific issues the Board has to think about.

The other thing we are starting to see is the way the laws evolve. First, we had laws that said a company should have reasonable organizational and technical measures to protect data. Then we had breach notification laws, which said that if an organization does not protect the information, it is required to publicly disclose that fact to individuals and regulators. That requirement was designed to encourage organizations to adopt reasonable organizational and technical measures to protect personal data. Now we’re seeing the third phase: legislators are not going to wait until a company has a breach; they are going to tell organizations the technical and organizational measures they have to put in place to protect data in a way the legislators and regulators think is reasonable.

Again, this is not unique to NY DFS. We’re seeing it at the Federal level in the U.S. We’re seeing it in other countries as well. This is the evolution of the cybersecurity statutory landscape. NY DFS is just part of it, and a lot of what it mandates are current best practices, such as having an asset inventory. There’s nothing in the rule that’s shocking. But it’s going to be a big lift for certain organizations.

What might a fourth phase of cyber regulation look like?

I don’t have a good answer as to what comes next. One of the problems when legislation dictates particular technical controls is that those controls become outmoded. A good current example is encryption. We have lots of laws that say organizations must use encryption, but we know that with quantum computing, today’s encryption is going to become antiquated. So you’re going to have laws on the books that say you have to have encryption when we know it doesn’t work. What are you going to do then? Will we advise companies, don’t encrypt because it no longer provides protection? Or will we say, encrypt because you have to, until the legislatures get around to updating the laws? That’s going to be an interesting question. That’s what worries me about the next phase: laws and regulations that are simply becoming antiquated.

California is currently opposing the American Data Privacy and Protection Act, the proposed federal data privacy law. What’s going on here?

The latest thing is that California is the roadblock, but the truth is, it’s exactly the same roadblock we’ve had for the last 10 years whenever we try to put in place any kind of Federal privacy or breach notification law: should there be a private right of action, and should there be preemption? Those two issues are not new. They’re most recently expressed by the California delegation, but they’re not new.

Many companies would rather have one privacy law that applies across the country, even if it’s not the best law possible, because it is very onerous for organizations to comply with 50 different state laws. I believe we should have one Federal law that preempts all the state laws. If the Federal law is a floor, not a ceiling, then we’re going to be in exactly the same situation we’re in right now, with 50 different iterations plus a Federal law. That makes it very challenging to get it right.

Morrison Foerster has published a terrific set of guidance documents regarding privacy best practices for individuals, health care providers, and technology companies in light of the U.S. Supreme Court’s reversal of Roe v. Wade. Can you talk about this project?

There has been a lot of attention on the reversal of Roe v. Wade, but one thing that has not gotten enough attention is the set of privacy issues associated with information about reproductive healthcare. Individuals and organizations create and share lots of information about reproductive health. Once that information is in a system or on the internet, it has a way of replicating and spreading. It is really important to think about how individuals and organizations can minimize their digital footprints, so that information doesn’t proliferate and can’t be misused. We created the MoFo Privacy Tips for Protecting Reproductive Rights to help individuals and organizations better protect personal information associated with reproductive healthcare.

I think the average consumer would be shocked if they understood the amount of information that’s collected about them, particularly in the context of reproductive rights. This becomes an even more important issue in those states that have now banned abortion and are limiting access to reproductive care.

Some employers, for example, have said they will provide employees who live in a state where abortion is limited with money to travel to another state for that care. But think about what information is generated and where it goes. The HR department is going to know, because it has to be able to make sure the employee is eligible for the benefit. Because it’s likely a taxable benefit, the company’s tax preparer is going to receive the information. If the benefit is provided through an insurance policy, the insurance company needs the information too. Those are absolutely legitimate reasons for a company to collect information relating to reproductive health and to share it. However, that’s three different organizations that may all receive a subpoena from law enforcement in a state that restricts reproductive healthcare. I don’t think enough attention has been paid to minimizing the information at the front end. If the information doesn’t get created, it can’t be turned over.

We created the Privacy Tips to help individuals and organizations take steps to protect themselves, because there are so many ways for data to proliferate, and it’s unclear how states may use data to prosecute individuals seeking reproductive healthcare and those who help them.

 

ABOUT THE EXPERT

Miriam Wugmeister is a Partner with Morrison Foerster in New York. She is a leading expert in privacy and data security laws, obligations, and practices. As Co-chair of Morrison Foerster’s preeminent Global Privacy and Data Security Group, and ranked among the top in the profession by all major directories, Miriam is regularly called upon by some of the world’s largest and most complex multinational organizations to confront their most difficult U.S. and international privacy challenges.

