
POLICIES AND PROCEDURES
Digging into data privacy
In their hasty bid to deliver online learning, schools may be unwittingly putting their students in the way of damaging long-term data privacy complications, digital security experts have warned. Here’s how to do your due diligence to ensure your staff and students are safe online.
BY SARAH DUGGAN
COLIN Anson, CEO of pixevety – a child image protection and photo storage solution for schools – says that many school leaders may fail to grasp the fact that popular platforms used for online learning, like Microsoft Teams, Zoom and Canvas, actually link with many other companies – and that this presents a real issue when it comes to ensuring data privacy.
“When you start doing a bit of due diligence into the product itself, you start to realise that they actually connect to multiple other services and share information. So, for example, I think Canvas has close to 24 companies that it shares data with.

“Now as a user, or as a teacher, you can’t be bothered by that. But from a school leader’s perspective, they must have done their due diligence first on the application.
“If you don’t understand how to use these services, you’re opening yourself up to a whole lot of pain. And quite often schools don’t,” Anson says.
It should not be left up to individual teachers to look into these kinds of specifics either, he asserts. Nor should it be OK for educators to pick and choose digital tools on a whim.

It falls to school leadership, he says, “to provide that sort of guidance. They need to understand what tools they’re using and how they’re using them…”
The case of some educators using Pinterest as a “centralised platform to put ideas and thoughts on” is risky business – and an example of staff making a decision without bothering to scour the fine print, Anson indicates.
“Pinterest is only a few clicks away from pretty damned adult content. Also, with that data stored, who has access to it and for what purpose?” he queries.
As we dive deeper into the digital age, Anson says school leaders really need to include IT experts more rigorously in their decision-making circle, because so much relies on those people.

“They are the custodians of some very sensitive and private information. And just to [say] ‘oh well, let Microsoft deal with it’ – and Microsoft is probably one of the better ones – it’s just not good enough.
“You can’t let a teacher, just because they’re technically savvy, take over your IT department, because their number one skill-set is educating kids and that’s what you want them doing.
“Reading T’s and C’s and understanding data sovereignty and what not, and encryption etc, is not their core job.
“So putting the right people in the right place to do the right due diligence to then make an assessment and get consent, that’s the key,” Anson says.
Susan McLean, known by her moniker the ‘cyber cop’, is the founder of consultancy group Cyber Safety Solutions and a former Victoria Police officer specialising in online safety.
She says that when it comes to delivering learning online, schools have no excuse for not knowing the privacy implications they are exposing their students to.
“I get that there’s not a teacher out there that sets a lesson thinking they’re putting their child at risk. But ignorance is no excuse.
“If you are a teacher of any description, child safety and child protection needs to be front and centre – it shouldn’t have to be taught, it shouldn’t be an afterthought.”
McLean reports that the pandemic has seen some schools forced to write policy “on the fly”, a move which has put their students’ digital safety in the back seat.
“It’s the [schools] that are reacting to an issue and then trying to fix it, they’re the ones that are having problems,” she says.
Recording online lessons poses a real problem, Anson and McLean agree.
“One of the biggest complaints I got with remote learning the first time around was children complaining to parents that the teacher or other students were recording the lesson. And it made them feel uncomfortable. They weren’t told about it. They didn’t want to participate because they didn’t want to sound stupid or look stupid, or things like that. So it is a privacy thing, but it’s (thinking about) best practice as well,” McLean says.
The data gleaned from a recorded class may be accessible indefinitely, Anson notes.
“Where is that recording stored? Who has access to that recording and what’s it going to be used for?”
Parental consent is imperative here, he says.
“Parents might not have consented or are not aware (of recording taking place) – a school is at risk because it hasn’t sought informed consent.

“Informed consent means it’s ‘VICS’, which is ‘voluntary, informed, current and specific’. So ‘we want to use [Microsoft Teams] and will be recording your child and we’ll be storing it here for this purpose’.”
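To make the ‘VICS’ test concrete, here is a minimal sketch of how a school records system might capture and check consent before a lesson is recorded. The ConsentRecord structure, its fields and the permits check are hypothetical names chosen for illustration in Python – they are not drawn from any real platform or school system.

```python
# A rough sketch, under assumed names, of checking the 'VICS' test
# (voluntary, informed, current and specific) before recording a lesson.
# Nothing here reflects a real school system or vendor API.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class ConsentRecord:
    student: str
    tool: str                       # e.g. "Microsoft Teams"
    purpose: str                    # why the recording or data collection happens
    storage_location: str           # where recordings or data will be kept
    voluntary: bool                 # given freely, not as a condition of enrolment
    informed: bool                  # parents shown what is collected, shared and stored
    granted_on: date
    expires_on: date                # consent must be re-sought once it lapses
    specific_activities: List[str] = field(default_factory=list)

    def permits(self, activity: str, today: Optional[date] = None) -> bool:
        """Return True only if all four VICS criteria hold for this activity."""
        today = today or date.today()
        return (
            self.voluntary                                     # voluntary
            and self.informed                                  # informed
            and self.granted_on <= today <= self.expires_on    # current
            and activity in self.specific_activities           # specific
        )


if __name__ == "__main__":
    consent = ConsentRecord(
        student="Student A",
        tool="Microsoft Teams",
        purpose="Recording remote lessons for students who are absent",
        storage_location="School-managed cloud tenancy",
        voluntary=True,
        informed=True,
        granted_on=date(2021, 2, 1),
        expires_on=date(2021, 12, 31),
        specific_activities=["lesson recording"],
    )
    print(consent.permits("lesson recording", today=date(2021, 6, 1)))    # True
    print(consent.permits("photo publication", today=date(2021, 6, 1)))   # False
```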
Information about student behaviour, attendance and even academic performance during a recorded class may well compromise their opportunities in the future, Anson adds.
“Say, for example, a child is in remedial maths – or even in top maths, it doesn’t really make any difference – and they say or do something stupid. That’s recorded forever.
“You know, files have the potential to live forever.”
When it comes to students’ photos being shared and footage being recorded online, school leaders may not realise which AI systems are being brought into the mix.
“There’s a big difference between a yearbook and Facebook, they are fundamentally different,” Anson warns.

Facial recognition tools are just the start.
“Bear in mind facial recognition is not a perfect biometric. It’s not even close. But it’s going to get there in time,” Anson says.
“The issue you’ve got with some of the uses of facial recognition – or the things that you’ve got to be careful of with any AI tool studying behaviour, for that matter – is that there’s so much information that it’s gathering that you don’t know about.”
Mood, even one’s sexual preference, can be analysed, the expert says.
“How in God’s name a system can work that out from facial recognition [I don’t know], but some of the services do this. They aggregate it all, and that data is kept somewhere – now who has access to it and what do they do with it?

“[Unless you read the terms and] conditions, and then not just those terms and conditions – the terms and conditions of the other services that have access to it – you’re not going to know.
“There are certain things that we all do as kids that [are really all] about us growing up and [these don’t] need to be recorded and regurgitated any time you feel like it.”
McLean says that technically speaking, all parents should be provided with concrete information on what data each platform is gathering, and where it is being stored.
And in the absence of a clear leadership policy on this, educators are more likely to act irresponsibly.
“I get teachers saying, ‘walk around your house and film yourself doing stuff, film you in your house and upload it’.
“They’re adding to the child’s digital footprint in ways that could be quite dangerous, because we know that there has been a 400 per cent increase in child sex offenders being online during COVID,” McLean says.
Wrapping your head around the juggernaut of digital privacy terms and conditions with any one service can be an overwhelming prospect – but it’s now absolutely essential, Anson makes clear.
“Understand the tools you’re using, and the potential risk of those tools. If you’re using a particular service, and it links to multiple other services, what are those services and what are they using it for?”
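One practical way to act on that advice is to keep a simple due-diligence register: each tool, the data it stores, and every linked service it shares data with. The sketch below is a hypothetical illustration in Python – the ToolAssessment and LinkedService structures and the example entries are assumptions for demonstration, not an audit of any real product.

```python
# A rough sketch of the kind of due-diligence register described above:
# for each tool, list the third-party services it shares data with and why.
# Service names and example entries are placeholders, not statements about
# what any real platform actually shares.
from dataclasses import dataclass, field
from typing import List


@dataclass
class LinkedService:
    name: str
    data_shared: List[str]     # e.g. ["email address", "usage analytics"]
    purpose: str               # why the vendor says the data is shared


@dataclass
class ToolAssessment:
    tool: str
    data_stored: List[str]
    storage_region: str
    linked_services: List[LinkedService] = field(default_factory=list)

    def open_questions(self) -> List[str]:
        """Flag linked services whose purpose is unknown - these need
        follow-up before the tool is approved for classroom use."""
        return [
            s.name for s in self.linked_services
            if not s.purpose or s.purpose.lower() == "unknown"
        ]


if __name__ == "__main__":
    assessment = ToolAssessment(
        tool="ExampleLMS",     # placeholder, not a review of a real product
        data_stored=["student names", "lesson recordings"],
        storage_region="unknown",
        linked_services=[
            LinkedService("Analytics vendor", ["usage analytics"], "product telemetry"),
            LinkedService("Ad network", ["device identifiers"], "unknown"),
        ],
    )
    print("Needs follow-up:", assessment.open_questions())   # ['Ad network']
```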