Whose Data is it Anyway?
March 29, 2021
By Justin Walker
Dorothy Leidner has a tradition with her oldest daughter. Around the holidays, the pair like to sit down and watch Hallmark Christmas films together. They enjoy them for their values and the lack of foul language, Leidner said.
However, this past Christmas, Leidner was reading a digital newspaper when she came across an advertisement for a sweatshirt reading "My Hallmark Movie Watching Shirt." She had never searched for any of their films before, nor had she ever watched one on her laptop.
"It was weird," Leidner, the Ferguson Professor of Information Systems at the Hankamer School of Business, said. "I was curious how they knew that about me."
This incident is what Leidner would call a personal data digitalization encounter, a phenomenon she has studied in her research.
Back in March 2018, MIS Quarterly issued a special call for theory development papers. Fields such as psychology and sociology are continually developing theories, Leidner said. Even in the business world, researchers in management have produced numerous theories. Information systems researchers, however, usually apply theories from other fields to their research, she said.
Leidner's research has primarily focused on information systems and their influence on human and organizational behavior. Lately, she has also been drawn to the concept of dignity and how it extends the conversation beyond the business realm into society as a whole.
"There is all of this data about ourselves that is being collected—some of it we know about, but much of it we don't—used in all sorts of ways," she said. "There are real implications for who we are as humans because of what is happening with this data. That for me was the key to tie it to dignity."
Dignity is what separates humans from other species, Leidner said. This paper, "The CARE Theory of Dignity Amid Personal Data Digitalization," dives into the implications of personal data digitalization for someone's dignity as a human being. Personal data refers to any information—biological, behavioral, physical, cognitive, social—relating to a person. When this data is digitalized for use in various systems, everything from wearable devices to search engines, social media and more, the potential arises to both know and show the self.
For example, wearable fitness devices enable one to know more about one's daily exercise, while social media applications allow one to show oneself or one's viewpoint to others. Virus tracking applications are another example of the former, and organizational gamification systems of the latter. At the same time, there is the potential to know and show others' data.
In many cases, the knower and shower are organizations and governments, and the personal data is gathered and used without the individual's knowledge. Many are aware this raises privacy issues, but even more troublesome are the potential affronts to human dignity.
There are three fundamental notions of dignity according to literature: behavioral, where one's dignity is expressed through their character; status, where dignity is associated with high social status, honors and respectful treatment; and inherent, which views all human beings as equally entitled to moral respect regardless of behavior or status.
As aspects of a person become digitalized, Leidner believes several risks arise. First, an affront, or challenge, is posed to a person's behavioral dignity. Platforms will often use your data to predict what news or advertisements you would like to see, therefore removing your choice in the matter, she said.
"You may not even realize this is happening behind you," Leidner said. "Your ability to live as you want to live is being affronted by some of these technologies."
Inherent dignity can also be affronted, she said. Certain technologies are designed to push the user to make decisions. In these cases, the user is treated as an object to be managed and manipulated, which goes against the belief that human beings should not be treated as a means to an end.
By understanding dignity, Leidner and her fellow researcher Olgerta Tona, from the University of Gothenburg in Sweden, were able to structure their theory. First, personal data digitalization encounters will produce claims (rewards) or affronts (challenges) to one's dignity, causing dignity disequilibrium. Second, the claim or affront will force the user to respond to the disequilibrium. Third, because technologies are continually finding new ways to analyze our information, our notion of dignity will continue to evolve.
"At some point, we start to question whether it is just humans that have this inherent value," Leidner said. "Is there another species that has dignity? Already we see systems emerging to digitalize farm animals' well-being—their feeding, sleeping, and roaming habits—to give farmers data on how their animals are doing. And how about nature? The Klamath River has been given personhood rights in recognition of the inherent value it has to the Native Americans. If nature can have personhood rights, what does that mean for the human obligation to treat nature in a certain way?"
The other question this theory raises revolves around ownership, Leidner said. When it comes to personal data, who does this data belong to exactly? Does it belong to the user, since it is in fact data about them? If one takes a video of someone else without their permission, is it right to upload this video to a site for others to see, effectively showing the individual in the video to others without the individual's permission?
For data collected online—does the company that collects the data own it? Is the individual not owed any compensation if the company sells or even gives the information to a third-party organization? And what about government-collected personal data? Using the perspective of protecting human dignity helps provide insights to answer these questions and inform policies, Leidner said.
This research and theory development has several implications for both the business and education worlds, Leidner said. From an industry perspective, employee data is a big topic. A lot of the discussion around employee data goes back to the previous questions on ownership.
"Is everything you do between 9 and 5—any behavior you undertake—considered organizational data because you are being paid during that time?" she said. "Or is some of that still personal? It may be work-related but does that still mean that anything you do can be digitalized and used to evaluate you and your performance within your organization? And who has a right to know this data?"
From there, it becomes a question of who manages that information and what governance processes are in place to ensure that employees are aware of what data the organization is tracking about them. Leidner hopes to have answers in future research, as this topic will only grow in importance as more work moves virtual and more tools become available to digitalize employee behaviors.
In the classroom, Leidner sees this theory as a resource for students to gain the tools to understand their own decisions regarding their personal data. It gives them the language to talk about the world they live in, both now and going forward, she said.
"It is not just businesses we need to be concerned about," Leidner said. "We need to be concerned about humanity as a whole with these technology developments."