Using Amazon Echo, Google Home to Learn: Skill of the Future or Bad Idea?
The technology company Datch Systems offers factories a voice assistant that allows workers to dictate maintenance orders and other requests without having to hand-write or type. Company representatives spend two full days training each new set of employees to use the tool.
Mark Fosdyke, the company's CEO and co-founder, recently showed the Datch tool to his younger brother, who's in his early 20s. He "picked it up in five minutes," Fosdyke said. His 5-year-old niece, meanwhile, is even more comfortable with voice assistants: she already knows how to seamlessly request a specific song from her Google Home device.
"She speaks to it in a completely different way than she speaks to me or her parents," Fosdyke said. "It's a lot more structured, it's a lot more patient."
Voice technology has quickly become a common tech tool in homes and workplaces, and many children are as comfortable giving verbal instructions to an Amazon Echo or Google Home device as they are talking to a person. These developments are forcing educators to think more deeply about the role they play in developing the technological fluency necessary for students to communicate effectively with artificially intelligent machines.
But the use of voice-activated technologies in school settings also raises big concerns about the privacy of student data and whether schools should allow them in the classroom at all. What if a child shares personal information while talking to a device, and that information is then stored where others can see it?
Recent news reports have highlighted instances in which private conversations recorded by voice assistant devices were posted online or monitored by outside observers. The U.S. Education Department provides a list of several instances in which recordings from voice assistants in the classroom could be considered violations of federal student privacy laws, such as an audio recording of a student being disciplined or having a medical emergency.
That worries student-data privacy advocates like Leonie Haimson, the founder and executive director of Class Size Matters.
"I don't see any evidence that students are being 'taught' here how to use these devices appropriately or warned of their risks," Haimson said. "Instead, it appears that there is a push to condition students early to accept the inevitability of surveillance and the violation of privacy it entails, as well as the mechanization of their classroom experiences, rather than resist this."
Teachers generally are not designing lessons around explicit goals of teaching students how to communicate effectively with voice-activated technologies. But one real-world consequence of using voice tech as a classroom tool is that students become more familiar with how the tools function and can use them more effectively in the workplace. Some educators, while acknowledging unresolved privacy issues, emphasize that gaining experience with these technologies now could help students become critical evaluators, and even developers, of them later in life.
"We all just need to make sure that we're giving the students opportunity to know how the things happening now could impact the future," said Rachelle Dene Poth, who teaches foreign languages and STEAM at Riverview Junior/Senior High in Oakmont, Pa.
Poth has had students in her Spanish class ask Siri to read them a children's story from Bolivia and to answer questions she can't immediately address herself. In the process, she helps them understand the pace of technological change by explaining how she would have found the same information when she was a student.
"We had to go to the library, look it up in a book," Poth said. "It's not just instant knowledge coming at you."
Testing Classroom Use
During the 2017-18 school year, Tonia Dousay, an associate professor of education at the University of Idaho's Doceo Center for Innovation + Learning, conducted a study in which she and a colleague used voice assistants and other artificial intelligence tools in classrooms across four Idaho K-12 school districts. She witnessed a wide range of uses, from helping students with learning disabilities develop communication skills to getting students excited about specific research projects.
In a geography class Dousay observed, students got into a heated debate when an Amazon Alexa told them the capital of Australia was different from what they had previously learned. They realized that Australia has had several capital cities, which led them to wonder why, and where else such transitions had occurred.
"This could have been completely missed had it been a traditional classroom," Dousay said.
Knowing how to get the most out of voice assistants, while understanding that they're fallible, will be essential as workforce needs evolve, according to Sanjay Pothen, the director of Emerson Launch, the entrepreneurial hub at Emerson College in Boston.
"Just as when mobile [technologies] arose, there was a big need for mobile designers, we feel the same thing's going to happen in voice," Pothen said.
To prepare to meet that demand, Emerson has begun offering a professional studies certificate program in voice design, and a minor is soon to follow, he said. The college has received support from Amazon's Alexa Fellowship, which provides some funding as well as resources and technical support for a handful of higher education institutions nationwide.
Writing skills are crucial for people developing voice technologies, Pothen said. So is the ability to game out human-tech interactions, known as "conversational design." That is why scriptwriters and people with storytelling or film experience tend to be natural fits for Emerson's voice program, he said.
The minor will incorporate voice design as well as voice ethics, to ensure that students examine all the potential ramifications of voice tech.
Instructional Obstacles
Data privacy concerns are the principal obstacle to wider adoption of voice technology as a classroom tool, said Julie Daniel Davis, the director of instructional technology and innovation for the Chattanooga Christian School in Tennessee. Davis is one of the leading K-12 advocates for voice technology use in classrooms and has worked hard to get educators on board.
Davis would love to eventually see students building their own Alexa "skills," voice-driven applications that are roughly the Alexa equivalent of apps on a smartphone. But she's wary of having students create Amazon accounts, with all of the potential privacy problems that come with doing so.
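For readers unfamiliar with the term, a "skill" is a small program that tells Alexa how to respond when a user invokes it by name. As a rough illustration only (not a project described in this article), a minimal skill built with Amazon's Alexa Skills Kit SDK for Python might look like the sketch below; the skill's purpose and spoken text are hypothetical.

```python
# Minimal sketch of an Alexa "skill" using the Alexa Skills Kit SDK for Python
# (ask-sdk-core). The skill name and response text are hypothetical examples.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type


class LaunchRequestHandler(AbstractRequestHandler):
    """Handles the request Alexa sends when a user opens the skill by name."""

    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        # The text the device speaks aloud; a classroom project would swap in
        # its own school facts or prompts here.
        speech = "Welcome to the school facts skill. Ask me a question about the school."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(LaunchRequestHandler())

# Entry point for AWS Lambda, where Alexa skills are commonly hosted.
handler = sb.lambda_handler()
```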
And just getting teachers to experiment with new technologies can be challenging. "Teachers don't have a lot of time. When something doesn't go right, it's really hard to get them to use it," she said.
Still, Davis has seen firsthand at recent voice tech conferences that voice assistants are already taking hold in industries such as health care (for doctors taking notes and accessing records) and finance (for managing bank accounts). Using these tools requires shifts in thinking that might not be intuitive, such as tailoring search terms to meet a voice assistant's specifications.
"We've got to start preparing students to figure out what that looks like for their future," Davis said.
For the Montour school district in Pennsylvania, that means training students to be technology "producers," not just consumers, said Justin Aglio, the district's director of academic achievement and district innovation.
Students worked with the company WIT Lingo to produce school- and district-specific Alexa knowledge, including a set of school facts and a series of interviews with students and teachers about what middle schoolers can expect at Montour High. That information became part of Alexa's database. The district is also starting to offer lessons on the ethics of voice tech and on designing voice technologies that contribute positively to society.
To Use or Not to Use?
But data privacy advocates like Haimson see such initiatives as troubling. "If high school students are now able to develop the information that Google or Alexa provide to device owners, who is doing the fact-checking at these companies, and how do we know that others may not make up 'fake news' and spread it via these devices in order to sow disinformation and division?" she asked.
Haimson believes any device that collects student voices should require informed parental consent from all the students in the classroom whose voices may be captured, along with strict limits on how that data can be processed and shared, and requirements for deleting it. Many school districts share those concerns: to protect against potential problems, they have policies prohibiting teachers from using voice assistants in classrooms.
Others see the positives significantly outweighing the negatives. Voice assistants can encourage creativity, said Rebecca Dwenger, an instructional technology consultant for schools in Hamilton County, Ohio.
After seeing her young son, who has some language-learning difficulties, successfully get homework help from her at-home device, Dwenger has been pushing instructors to experiment with the technology. One teacher in her district has students publish their written work on a voice device authoring platform so they can share it with other users and have it read aloud.
"We're at the cusp," Dwenger said. "In Hamilton County, I would say there might be one or two teachers in every school trying to use it. There's a lot of work that still needs to be done."