Washington: For New York teacher Michael Flanagan, the pandemic was a crash course in new technology. Schools rushed to hand out laptops to students stuck at home and moved their hectic school lives online.
Students have long since returned to school, but the technology lives on, and a new generation of apps has emerged to monitor students online.
The programs scan students’ online activity, social media posts and more, with the goal of keeping students focused, detecting mental health problems and flagging potential violence.
“You can’t unring the bell,” said Flanagan, who teaches social studies and economics. “Everyone has a device.”
But new trends in tracking have raised concerns that some apps may disproportionately target minority students, that others may out LGBT+ students without their consent, and that many are used to discipline students as much as to care for them.
That is why Flanagan, unlike many of his colleagues, never uses such apps to monitor students online.
He remembers seeing a demonstration of one such program, GoGuardian, in which a teacher showed what a student was doing on his computer in real time. The child was at home, on a day off from school.
Such scrutiny raised a huge red flag for Flanagan.
“I have a school-issued device, but I know I can’t expect privacy. But I’m an adult — these kids don’t know that,” he said.
A spokesperson for the New York City Department of Education said GoGuardian Teacher is used only to “allow teachers to see what’s currently on a student’s screen, provide prompts to refocus, and prevent access to inappropriate content.”
Worth over US$1 billion (RM4.27bil), GoGuardian is one of a handful of apps to gain traction in the market, and is currently used to supervise more than 22 million students, including those in the public school systems of New York City, Chicago and Los Angeles.
Globally, the education technology sector is expected to grow by US$133 billion (RM568.44bil) from 2021 to 2026, market research firm Technavio said last year.
Parents expect schools to keep their children safe in classrooms and on field trips, and schools “have a responsibility to keep students safe in digital spaces and on school-issued devices,” GoGuardian said in a statement.
The company says it “provides educators the ability to protect students from harmful or explicit content.”
Online surveillance is now “just part of the school environment,” said Jamie Gorosh, policy adviser at the Future of Privacy Forum, a watchdog group.
And even as schools emerge from the pandemic, “it doesn’t look like we’re going back,” she said.
Guns and depression
A key priority of the monitoring is keeping students focused on their studies, but it also addresses fast-growing concerns over school violence and children’s mental health, which medical groups declared a national emergency in 2021.
Federal data released this month shows that 82% of U.S. schools now train staff to spot mental health problems, up from 60% in 2018, while 65% have confidential threat-reporting systems, a 15% increase over the same period.
In a survey last year by the nonprofit Center for Democracy and Technology (CDT), 89% of teachers reported that their school monitors their students’ online activities.
But it’s not clear if the software will make schools safer.
Gorosh cited last May’s shooting in Uvalde, Texas, which left 21 people dead in a school district that had invested heavily in surveillance technology.
Some worry that tracking apps are actively causing harm.
For example, the CDT report found that while administrators overwhelmingly said the purpose of the monitoring software was student safety, “the software is much more commonly used for disciplinary purposes, and we see discrepancies along racial lines,” said Elizabeth Laird, director of equity for CDT’s civic technology program.
The programs’ use of artificial intelligence to scan for keywords can also out students without their consent, she said, with 29% of students who identify as LGBT+ saying that they, or someone they know, had experienced this.
And over a third of teachers say their school automatically sends alerts to law enforcement after school hours.
“The stated purpose is to ensure student safety. Yet we have set up a system here where law enforcement regularly gets access to this information and uses it to show up at students’ homes,” Laird said.
“Preyed upon”
A report last year by U.S. lawmakers into four companies producing student-monitoring software found that they had made no effort to determine whether their programs disproportionately targeted marginalized students.
Massachusetts Senator Ed Markey, one of the report’s co-authors, said in a statement to the Thomson Reuters Foundation: “Students should not be monitored on the same platforms they use for their schooling.
“As school districts bring technology into the classroom, we need to ensure that children and teenagers are not preyed upon by targeted advertising and intrusive surveillance webs.”
The U.S. Department of Education promised to release guidelines on the use of AI earlier this year.
A spokesperson said the department is “committed to protecting the civil rights of all students.”
Aside from the ethical issues around spying on their children, many parents are frustrated by the lack of transparency.
“We need to be more explicit about whether data, especially sensitive data, is being collected,” said Cathy Cresswell.
Cresswell, who has a daughter in a public school in Chicago, said some parents have been sent warnings about their children’s online searches.
Another child was repeatedly warned not to play certain games, even though the student was playing them at home on the family computer.
Cresswell and others acknowledge that the problems monitoring seeks to address (bullying, depression, violence) are real and need to be addressed, but they question whether technology is the solution.
“If we’re talking about monitoring self-harm, is this the best way to tackle the problem?”
Pointing to evidence suggesting that AI is imperfect at catching warning signs, she said more funding for school counselors might be better suited to tackling the problem.
“There are big concerns,” she said. “But technology may not be the first step in answering some of these questions.” – Thomson Reuters Foundation