Critical Conversations Panel on Artificial Intelligence Explores Ethics, Possibilities
What is artificial intelligence (AI)? How will it affect healthcare, the way we do business, and the way we drive? And what is its social impact?
Panelists explored those questions and others during the recent Critical Conversations: Artificial Intelligence panel discussion in the Rubin Campus Center.
The panelists, who also fielded questions from the packed audience, were Dmitry Korkin, associate professor and director of the bioinformatics and computational biology program; Bengisu Tulu, associate professor, Foisie Business School; Xinming Huang, professor and Dean’s Excellence and Joseph Samuel Satin Fellow, electrical and computer engineering; Matthew Spofford ‘22, computer science; and Soundar Srinivasan, head and director of the AI program at Microsoft New England.
Jean King, professor and Peterson Family Dean of Arts and Sciences, moderated the event.
Robotics and AI are part of WPI’s Smart World initiative, which includes the construction of a new academic and research facility on campus, King said. Development of AI is at the forefront of the Fourth Industrial Revolution and will transform every aspect of society over the next 10 years, according to the panelists.
“It is a combination of WPI strengths across all disciplines,” Korkin said, “and a great opportunity where we can look at the robots not as individual components, but as part of society and human interactions.”
AI will also impact business degrees, Tulu explained, with WPI and other institutions embedding data science into the core of all undergraduate programs.
“Any business school that refuses to do that will be left behind,” Tulu said.
Huang said all students will have to deal with data in some capacity after they graduate from WPI.
“It is an opportunity to learn all new topics and enhance your chance in the job market,” he said. “[Our students] will be at the forefront of AI.”
Panel Discussion Includes Ethics, Concerns of Privacy
AI, also referred to as machine intelligence, is transforming the world as we know it. Panelists agreed that it is important to consider ethical implications of the emerging technology, including privacy issues related to data collection and the impact on workers potentially displaced by the technology.
“Social justice is a piece of what we do,” King said. “We are never just building technology and doing science. We are trying to do that while impacting the world in a positive way.”
Huang said those advancing AI do not want the technology to replace workers, but rather to create jobs. AI technology has advanced, but it still cannot substitute for humans in some of the most common business operations.
AI cannot currently account for certain human factors, he said, such as one driver waving another ahead at a busy intersection, a cue self-driving trucks cannot yet interpret. Researchers need to consider how autonomous vehicles interact with human drivers when designing self-driving technology, he said, while also considering displaced workers.
“Truck driving is big business. There are a lot of workers,” Huang said. “We need to look at how we use AI and create jobs for those people. We don’t want them left out of society.”
Panelists cited data collection as another ethical concern for AI researchers.
King asked the audience how many people who use Amazon’s cloud-based Alexa Voice Service (AVS) have misgivings about using it at home. Many in the audience raised their hands or nodded.
The concern, King said, is how much of a person’s life is being exposed, and fears that data collected is “identifying you to the microcosm.”
Korkin responded that his kids, ages 6 and 9, are using Alexa, which is “more frightening than an adult who is able to control what to feed it.”
“Kids are just talking to it and telling their life stories and telling it what Dad and Mom do,” Korkin said. “It is natural for them to talk to a device like it’s a human,” he said, because children are growing up with smart devices.
Panelists agreed that those working in data collection should adhere to strict ethics, and that additional public policies are needed to protect privacy.
A second panel discussion on AI as part of the Critical Conversations forum—which has tackled other prominent issues, such as gene editing, 5G, and online education—is planned for the spring.
By Paula Owen