Interview with Rachel R. Watkins Schoenig and Donna McPartland

Our latest interview is with Rachel R. Watkins Schoenig and Donna McPartland, attorneys with significant experience in the assessment industry. They have worked together on various projects, and both have a special interest in the privacy space, which is one of our main interests as well. So, naturally, we sent them a few questions. Here's what they had to say…

Can you briefly introduce yourselves?

We are both attorneys with significant experience in the assessment industry. We have worked together on various projects and both of us have a special interest in the privacy space.

Rachel is the founder of Cornerstone Strategies, a boutique firm specializing in security, privacy, and statistical detection of test fraud. Cornerstone clients deliver assessment services around the globe, and range from educational clients to healthcare, credentialing programs, and governmental licensing entities. An attorney by trade, Rachel has spent decades protecting her clients’ legal rights and advising them on regulatory issues. Today, she focuses on the needs of the assessment and learning industries. She is regularly asked to speak on security and integrity matters and has a number of relevant industry-related publications. Rachel founded Cornerstone with a firm belief in the value of quality assessments, a passion to serve others, and the desire to enhance trust in testing.

Donna is the founder of McPartland Privacy Advising LLC, where she is a solo practitioner providing data privacy legal services to a wide range of clients in the education, testing and credentialing industries. Donna has over 20 years of legal experience with many years in-house in Chief Privacy Officer roles in the educational and testing industry. Most recently, Donna was the Chief Privacy Officer at the College Board. Donna was also the CPO at the Graduate Management Admission Council (GMAC), which owns the GMAT exam. Donna is a Fellow of Information Privacy (FIP), a Certified Information Privacy Manager (CIPM) and a Certified Information Privacy Professional (CIPP/US).
Both of us regularly volunteer time to help enhance the testing industry, including sharing our knowledge regarding exam security and privacy.

What do you see as the main challenges for our privacy today?

Rachel: One challenge is the gradual erosion of privacy occurring with the ever-increasing incorporation of technology into our lives, and especially the lives of our children. For example, educational technologies are proliferating and gathering data about our children at younger and younger ages. Today, as schools have shifted to teaching and testing at home in response to the COVID-19 pandemic, collaborative learning technologies and remote proctoring tools are collecting even more data about our families and homes in unprecedented ways. This isn’t to imply these technologies are bad, but rather to recognize that their inclusion in our lives and our children’s lives has a marked impact on privacy, and one that we are not fully prepared to deal with from a legal or societal perspective.

Another challenge is the use of artificial intelligence (AI), including machine learning, by organizations to make predictions, recommendations, and classifications about people, including children. AI is an umbrella term for a range of technologies and approaches that often attempt to mimic human thought to solve complex tasks. Machine learning is generally considered a subset of AI and is based on algorithms that can learn from data without relying on rules-based programming. AI often relies on personal information, and in some instances it creates more personal information. For example, when AI is used to make decisions about individuals, such as decisions about their competencies or whether they engaged in testing misconduct, that information itself becomes personal information. While there are some efforts around the globe to address the rights of individuals when their personal information is used in connection with AI, this is a new area that is not well understood and, oftentimes, not well regulated. As a result, there is the potential for AI-generated decisions or information that are not easily explainable or are based on biased data. Creating ethical guidelines around the use of personal information and AI, and shaping regulatory frameworks that balance the privacy rights of individuals with the progress of technology, remain challenges for our industry.

Donna: I agree with both of the above issues and think that they reflect a general loss of control over our personal information because of the prevalence of information-gathering technologies. So many organizations need to utilize personal information to provide their products and services that it has taken on incredibly strategic importance to their bottom lines. AI technologies may be promoted with little understanding of their potentially harmful and discriminatory impacts, especially in education. There are certainly important benefits to AI, but they need to be balanced against the potential detrimental impacts, such as incorrect classifications and discriminatory effects. It is really important that organizations understand the technologies they are using, and the transparent and ethical use of such technologies is critical.

What can we as individuals do about it?

Rachel: Be aware and involved. Understand how third parties and schools are using information about your child. Know what is being collected about your child, by whom, and for what purpose. While there are some statutory requirements dealing with the private information of a child, such as the Family Educational Rights and Privacy Act (“FERPA”) and the Children’s Online Privacy Protection Act (“COPPA”), these may have unexpected applications and gaps as education and assessment technologies are used more frequently at home. Also, seek to understand where AI is employed to make decisions about your child and request information that explains how it is used. If key decisions are being made using AI, ask additional questions to understand whether humans are also involved in the process and how the AI is being monitored to ensure the data and results are not biased.

Donna: I agree that education and being proactive are great steps. Don’t be afraid to ask what technologies are being used, how they are used and by whom, and what oversight exists concerning the use of AI. More guidance is coming out from the AI Now Institute at NYU and from the Turing Institute in the UK. More generally, given the increase in online activities, be more careful than ever about sharing personal information.

Can VPNs help? Do you use one?

For us, VPNs are helpful, especially when we travel (which both of us did on a fairly regular basis until the recent COVID-19 restrictions). A VPN can help protect your personal information by shielding it from eavesdroppers when you are communicating over public Wi-Fi.

For students, schools may want to consider providing their students with VPNs as an added level of protection.

What do you do to protect your (or your child’s) personal information?

Rachel: Some steps for protecting personal information are simple, such as avoiding the use of public Wi-Fi without additional protections (such as a VPN) and not saving passwords on commonly used websites. Other steps are more involved and require more time to understand what data you are actually relinquishing to others. Thoughtfully reviewing privacy notices and being aware of what data you are permitting an application to access is important.

It can be more complicated to protect your child’s personal information. As they get older, it’s important to talk with children about what they are sharing online and how it can be used, now or in the future, to cause them damage. It’s also important to know what information schools and education technology vendors are collecting about your children and how it will be used. Legal and regulatory improvements over the last few years have done a better job of addressing these activities, but parents should not assume that the school will have the same privacy concerns that they do. Ask questions and take steps to protect your child’s information. This can include requiring vendors to destroy your child’s data.

Donna: Also, take control of your technology. For example, limit the apps on your phone, and check all settings on your apps and devices to make them more privacy-protective, such as turning off sharing functions. It is also wise to use two-factor authentication on your devices so that only you can access them.

Do you have some other advice for our readers so they could, at least partially, regain their privacy?

Rachel: Get involved in understanding the legislative and regulatory landscapes. In some jurisdictions, privacy regulations are quite advanced, while in others there is very little oversight. For example, in the United States, protections vary by state.

Also, be aware of efforts to create ethical guidelines for the use of personal information and AI. Especially when personal information of children is involved, it is important to require that data scientists and corporations develop and adhere to common ethical norms that protect children from the potential negative implications of weak security, privacy, and AI use.

Donna: Stay informed, ask questions, and take control of your personal information to the extent you can. For instance, residents of California can opt out of data sharing by many companies.