
Web filtering and mental health

April 30, 2021

Cybersafety and mental health go hand in hand, as the internet is now woven into student life, from technology in the classroom to tools for remote education to recreational screen time.

With the shift to social distancing and online learning, student mental health has become a district priority. Technology enhances education, but when not managed properly it also presents new social and emotional risks for children.


Districts are turning to cyber safety solutions like web filtering software to not only block content but to actively help them identify students who may be facing mental health challenges. Due to their increased visibility into students’ online behavior, web filters have become valuable mental health aids in uncovering early signs of violence, cyberbullying, and self-harm. 

Here’s what districts need to know about the value and limitations of web filtering when it comes to protecting and addressing students’ mental health. 

How Web Filtering Supports District Mental Health Initiatives

Web filtering software was originally designed simply to control which online content students were allowed to access. To keep up with the changing internet landscape and the rise of technology to support distance learning, web filters have become far more sophisticated in tracking students’ online activities. Modern cybersafety software must protect students while still allowing them to access digital tools that enhance their education.

“When I started in filtering around 12 years ago, it was all about, can you handle the amount of bandwidth that my network demands, and can you block pornography?” shares Ross Young, VP of Linewize in North America. “Education has really moved almost 100% online, especially because of COVID. The demands and the skills required to manage a filter have grown to include mental health, visibility that a filter can have because of searches, and the use of AI to look for early indicators of self-harm, school violence, cyberbullying.”

Web filtering solutions like Linewize can do more than just block websites — the software can identify search terms, monitor for keyword phrases, track which videos students have watched, and more. This brings tremendous value to district mental health initiatives, as a tool to detect early warning signs of mental health issues.

For instance, Linewize’s red flag safety alerts send notifications when a student conducts an internet search that may indicate harmful behavior or violence. With timely alerts, professionals can intervene in an appropriate manner to help a student in need. For students battling severe mental health challenges, this can mean the difference between life and death.
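Conceptually, this kind of red flag detection can be pictured as a watchlist applied to student search queries. The sketch below is purely illustrative: the term list, function names, and matching logic are invented for this example, and production tools like Linewize use AI models rather than simple substring matching.

```python
# Illustrative only: a toy keyword watchlist for flagging search queries.
# Real cybersafety products use far richer signals (context, AI models,
# behavior over time), not plain substring matching.
RED_FLAG_TERMS = {"hurt myself", "self-harm", "bullied at school"}

def check_search(query: str) -> list[str]:
    """Return any watchlist terms found in a student's search query."""
    lowered = query.lower()
    return sorted(term for term in RED_FLAG_TERMS if term in lowered)

def route_alert(query: str) -> str:
    """Flag a query for human review; a counselor, not software, decides the response."""
    hits = check_search(query)
    if hits:
        return f"ALERT: review query (matched: {', '.join(hits)})"
    return "OK: no red flags"
```

Note that even in this toy version, the software only surfaces a potential concern; deciding what the match means and how to respond is left to a trained adult, which mirrors the division of labor the article describes below.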


Having this information in real time helps counselors, principals, teachers, and other staff intervene in a timely manner and start a conversation with a student or their family. With so much of student life happening online, technology is a must-have for monitoring cyber safety, but it can't be the only tool districts have.

Identifying Mental Health Issues is Only Half the Solution

When it comes to handling student safety the right way, identifying a mental health risk is only one piece of the puzzle; the other piece is knowing what to do with the information once it’s been brought to attention.

Is this student being bullied? Who should be responsible for that, and when? Is this student showing signs of depression or simply having an off day? How urgent is the need for intervention?

Districts must have protocols and processes in place to enable the appropriate people — such as school counselors and psychologists — to intervene when necessary. Keep in mind that network admins who manage a district’s web filter are not trained mental health professionals, nor should they be responsible for a student’s immediate safety. The district-wide strategy should be to get technical visibility of warning signs to the right people and provide them with the context to determine whether a student is a threat to themselves or others.

This is where organizations like Gaggle, a leader in managing student cyber safety and addressing mental health threats, are helping to bridge the gap between web filters and mental health treatment. Gaggle helps districts set up processes that take red flag alerts from their web filtering software and route them to the appropriate people, who can determine the right response.

Districts can leverage web filtering software to be their eyes and ears, particularly where teachers and faculty are not able to directly monitor every student’s online behavior at all times of the day. But software can only highlight a potential risk — districts need processes in place to bring alerts to the right people at the right time, to adequately protect students.

Interested in learning more about cybersafety?

Read our blog "Why you Need Both AI and Human Monitoring For Student Cybersafety"


Topics: Cyber Safety, EdTech, Human Monitoring, Mental Health

