The increase in the use of online monitoring tools follows a similar trend in the US, and comes as education departments continue to grapple with an online bullying epidemic in schools awash in electronic devices.
CyberHound’s ClearView is one example of the kinds of surveillance products that are already being rolled out in schools to address everything from radicalisation and school violence to instances of self-harm.
Billed as a ‘behavioural analytics’ tool, ClearView tracks the entirety of what students are doing on the web – including what they are searching for on YouTube, Wikipedia or Google – and reports potentially dangerous or risky interactions to nominated teachers, welfare staff and school leaders.
ClearView can also flag risks across communications that are made outside of class time – if they occur on the weekend or the previous night, for example – and has the capability to monitor student activity on school-provided devices beyond the school gates.
CyberHound calls the software a form of "actionable intelligence" that "gives schools the ability to see risks for students before they escalate, enabling better and more efficient support to be provided in a timely manner".
The company’s CEO, John Fison, says ClearView has prevented suicides, alerted schools to instances of self-harm, identified predatory approaches and reduced the incidence of bullying.
The technology is already being used in almost 500 schools, and the company is in discussions about implementing it across state networks.
But some critics have questioned whether a solution like ClearView goes too far when it comes to balancing student welfare against concerns around privacy – or, indeed, if it’s even the right technological tool for the job.
Dr Katina Michael from the School of Computing and Information Technology at the University of Wollongong says CyberHound should be commended for “investing time and energy” into developing solutions around student cyber safety.
But she remains cautious about the implications of media surveillance on younger generations.
Michael is concerned about whether students at schools using ClearView know if they are being monitored.
While the company advises schools to inform students that its technology is being used, CyberHound says it’s “not their place” to tell schools what to do, and, in line with most schools’ acceptable use policies for ICT equipment, they don’t have to.
Surveilling students who aren’t aware that they are being monitored opens up a number of ethical questions, Michael says.
“I know of many instances in Australia, and even in my local area, where social media monitoring has been occurring and there has been no direct instruction or awareness made to students, staff, or even to their families...”
But she can also point to problems associated with students being knowingly surveilled.
Michael says explicit surveillance runs the risk of creating “a culture of self-censorship” in schools that teaches students to temper their opinions and stifle self-expression.
“...So, rather than saying what I really feel, I say things that might be playing to an audience – that might be better respected as content. Or I don’t say anything at all ... It teaches you to be mute as you get older.
"Even if you see things that you don’t really agree with – you might not say anything because you think somebody might be watching. And that might be in your workplace, or at your university.”
Fison sees technology like ClearView as simply necessary in the world students live in; at a time when everyone’s data is being tracked and recorded to some degree, CyberHound is at least doing so in the name of a good cause.
“We’re not recording every communication and we’re not providing the data to anyone that hasn’t been authorised within the school ... All we’re doing is giving [schools] data that can really help make a massive difference to students’ welfare and lives,” he says.
“So when we hear about cases where this technology has helped prevent what the school tells us would have probably led to a suicide, our response to any sort of criticism about privacy is that it’s worth having these tools.”
Privacy issues aside, however, it’s still worth debating whether monitoring tools like ClearView are the best mechanisms for addressing issues of cyberbullying and online threats to student welfare.
Michael believes that while there’s a place for a technological solution to issues like cyber bullying, “paternalistic” surveillance tools aren’t necessarily the answer.
For one, determined bullies who know they are being monitored will simply find ways around the surveillance tools, she says.
“Perpetrators who really have it in their guts to offend will find any means to perpetrate, and they will use technology to even overcome technological barriers placed in front of them.”
Instead, the academic is advocating for the use of more “persuasive” technologies that are geared towards encouraging behavioural change and self-regulation at the early stages in a bully's ‘career’.
Michael points to tools like health apps that remind the user how many steps they’ve taken in a day as an example of a different, more “individualised” approach.
“The persuasion here is an intervention that basically says, 'be careful’.”
“It’s instructional; it's not pressuring you to go out there and do 9000 steps at midnight, but it's trying to persuade you to change your behaviour,” she says.
She imagines a similar tool for students to help them learn to regulate their behaviour online – reminding them that they have sworn a certain number of times in their communications, or prompting them to think before they send an abusive message or threat – without needing to report them to an authority, or “tell them that they’re a bad person”.
What both parties can agree on is that technological solutions are not “a silver bullet” for cyberbullying or online harassment.
Michael says any tech solution needs to work in tandem with an educational one – a course or program, for example, that focuses on teaching students to be responsible digital citizens from a young age, in the same way we prepare them to use other adult tools.
“At the moment we've got schools like the ones my kids attend handing out iPads and saying, 'you shall use an iPad, it's a requirement of our school to use an iPad'.
“And that’s like saying, 'here everyone, have some chocolate but don't open it until the end of class', or 'here kids, I'm going to give you a weapon but really you should never use it'."
“We don't give [the use of social media] the same weight as driving a car or [using] a weapon and yet some of these people are using technology in a weaponised form.”