In the 1997 film Gattaca, the character Vincent Freeman purchases the genetic identity of another man to fulfill his dream of joining the space program after his own genes are deemed inferior because he was conceived outside a state-controlled eugenics program. While that dystopian premise, that people's genetics could one day determine their ability to pursue their chosen careers, has not come to pass, decisions about athletes' eligibility to compete are already being made on the basis of sensor data.
Sensors collect data on everything from gait consistency, body temperature, heart rate, and force exerted on the body to distance traveled and balance. An ongoing study at the UO is using sensors and biomechanics to determine whether cleat design is contributing to ACL injuries in female athletes.
But where is the ethical line between using sensor data to help an athlete improve their performance, and even avoid injury, and using that same data to sideline them or surveil their behavior? Courtney Cox is an assistant professor in the Department of Indigenous, Race, and Ethnic Studies at the University of Oregon. She studies sport as a way to conceptualize culture and technology. In her research, she wrestles with the claim that technology can make sports more efficient and safer for athletes.
“It all started with the Moneyball idea—build a team based on the analytics of folks’ past performances,” Cox said, referring to the book and film of that name in which Oakland Athletics general manager Billy Beane used sabermetrics to build out the 2002 baseball team that went on to a record-breaking winning streak. “The Moneyball concept has evolved somewhat into chasing the perfect algorithm for sports. It has also evolved in other ways: Where is the barrier between safety and surveillance? If you’re watching me sleep, does my sleep determine whether you let me start or keep my scholarship?”
At what point is it a stethoscope, and at what point is it an ankle monitor?
Cox notes that despite the promises sensors offer of helping people improve their performance, track exercise routes, and more, sensor tech itself is still rooted in a morally questionable tenet of capitalism: the human body and its labor as a commodity. She questions whether players, especially at the high school and college levels, receive an explanation of how sensor data can be used to both help and harm them.
“What agency do people have to opt out?” she asked. “If not, what are the hidden costs? What do folks have to give up in terms of their own safety or privacy? Are we giving people a fair opportunity to make their own informed decisions?”
The risk of homogenization
Philosopher Colin Koopman says that while many, if not most, people would consider Gattaca-level behavioral genetics and eugenics practices repugnant, he doubts as many would intuitively recoil at data collected on high school athletes being recruited to play college sports.
“It’s one thing to amass data on professional athletes, but when it starts happening at lower levels and ages, it raises equity concerns,” Koopman said. “It raises concerns about how equitable the data systems are themselves. You could imagine a coaching team that gets deep into these data systems and bases a lot of their decisions off them. Then a middle-performing player comes along who has a really different style. They’re strong on the court, maybe not a standout, but their performance against the data is seen as weak because they have a unique style. Collecting data on athletic performance encourages homogenization of playing abilities and styles.”
Koopman notes that algorithms and data analysis software are tuned to look for trends and the expected. There is no one-size-fits-all method to analyzing an athlete’s individual playing style.
“Does an analytics approach leave room for the next Messi or Maradona or Jordan? We’re talking about people who come along and shake up a sport,” he said.
The bottom line of Koopman's ethical argument is that offloading decision-making to an algorithm limits not only an individual's future possibilities but also the possibility of further developing a sport.
“We need to ask ourselves: What really is the point of a sport, especially at the high school level?” Koopman said. “Is it to develop pros, or is it to give large numbers of people a sense of camaraderie and sportsmanship, to train people how to treat others with fairness?”
Who makes the decisions for whom?
For sports ethicist Peg Brand Weiser, the use of sensors in sports is just the latest data point in a long history of decision-making about athletic performance.
“Sensors are just a sophisticated way to monitor a body, which coaches and trainers have been doing for thousands of years,” she said. “In ethics, we try to look for justifications for moral judgments.”
Weiser notes there are at least three philosophical approaches to the question of sensor use in sports. The first, consequentialism, argues for the greatest good for the greatest number of people: if you're only risking a small percentage of the population for the enjoyment of many millions worldwide, this philosophy argues it's worth it. Adults have the right to make educated decisions about the level of risk they are willing to incur; we give autonomy to people's choices.
The second, a Kantian perspective, holds that sensors should be used only if they're available to everyone.
“We’re talking about an economic issue—who gets access to sensors? Not that many people,” Weiser said. “You’re giving a sports advantage to certain people and withholding its benefit from others.”
The third, virtue theory, would argue that neither individual consequences nor universal laws are the point at all.
“This is actually not a new idea at all, but rather one that goes all the way back to ancient Greece,” she said. “Aristotle taught that you participate in sport and competition to become a better citizen and person, to increase your own virtue; spectators and money be damned.”
Koopman noted that interrogating when and how sensors are used through the lens of sports brings many people into the conversation who might not otherwise participate. This same thought exercise on the outcomes of sensor data can be applied beyond the realm of sports, too. What would it mean if health insurance companies began adjusting premiums based on data from your Apple Watch?
— By Kelley Christensen, Office of the Vice President for Research and Innovation