Nowadays everybody wants to talk like they got something to say but nothing comes out when they move their lips just a bunch of gibberish.

  • 0 Posts
  • 71 Comments
Joined 3 months ago
Cake day: May 28th, 2024


  • Allow me to make it easier for you by posting the entire article, and you can point out to me where it says "the use of this invasive software is required in order to attend public school".

    Imagine your search terms, key-strokes, private chats and photographs are being monitored every time they are sent. Millions of students across the country don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued machines and accounts. As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth.

    The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, given that suicide is the second-leading cause of death among American youth aged 10-14, but no comprehensive or independent study has shown an increase in student safety linked to the use of this software. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.

    That study also found that how to respond to alerts is left to the discretion of the school districts themselves. Lacking the resources to deal with mental health, schools often refer these alerts to law enforcement officers who are untrained and ill-equipped to handle youth mental health crises. When police respond to youth who are having such episodes, the resulting encounters can end in disaster. So why are schools still using the software, when a congressional investigation found a need for “federal action to protect students’ civil rights, safety, and privacy”? Why are they trading their students’ privacy for a dubious-at-best marketing claim of safety?

    Experts suggest it’s because these supposed technical solutions are easier to implement than the effective social measures that schools often lack the resources for. I spoke with Isabelle Barbour, a public health consultant who has experience working with schools to implement mental health supports. She pointed out that there are considerable barriers to families, kids, and youth accessing health care and mental health supports at the community level, as well as a lack of investment in helping schools effectively address student health and well-being. The result is that many students come to school with unmet needs, and those needs impact their ability to learn. Although there are clear and proven measures that address the burdens youth face, schools often need support (time, mental health expertise, community partners, and a budget) to implement them. Edtech companies market largely unproven plug-and-play products to educational professionals who are stretched thin and seeking a path forward to help kids. Is it any wonder that schools sign contracts which are easy to point to when questioned about what they are doing about the youth mental health epidemic?

    One example: in its marketing to school districts, Gaggle claims to have saved 5,790 student lives between 2018 and 2023, according to shaky metrics it designed itself. All the while, it keeps the inner workings of its AI monitoring secret, making it difficult for outsiders to scrutinize and measure its effectiveness. We give Gaggle an “F”.

    Reports of errors, and of the AI flagging’s inability to understand context, keep popping up. When the Lawrence, Kansas school district signed a $162,000 contract with Gaggle, no one batted an eye: it joined a growing number of school districts (currently ~1,500) nationwide using the software. Then school administrators called in nearly an entire class to explain photographs Gaggle’s AI had labeled as “nudity”, because the software wouldn’t tell them:

    “Yet all students involved maintain that none of their photos had nudity in them. Some were even able to determine which images were deleted by comparing backup storage systems to what remained on their school accounts. Still, the photos were deleted from school accounts, so there is no way to verify what Gaggle detected. Even school administrators can’t see the images it flags.”

    Young journalists within the school district raised concerns about how Gaggle’s surveillance of students impacted their privacy and free speech rights. As journalist Max McCoy points out in his article for the Kansas Reflector, “newsgathering is a constitutionally protected activity and those in authority shouldn’t have access to a journalist’s notes, photos and other unpublished work.” Despite having renewed Gaggle’s contract, the district removed the surveillance software from the devices of student journalists. Here, a successful awareness campaign resulted in a tangible win for some of the students affected. While ad-hoc protections for journalists are helpful, more is needed to honor all students’ fundamental right to privacy against this new front of technological invasions.

    Tips for Students to Reclaim Their Privacy

    Students struggling with the invasiveness of school surveillance AI may find some reprieve by taking measures and forming habits to avoid monitoring. Some considerations:

    Consider any school-issued device a spying tool.
    Don’t try to hack or remove the monitoring software unless specifically allowed by your school: doing so may bring significant consequences from your school or from law enforcement.
    Instead, turn school-issued devices completely off when they aren’t being used, especially while at home. This prevents the devices from activating the camera, the microphone, and the surveillance software.
    If they aren’t needed, consider leaving school-issued devices in your school locker: this avoids depending on them to log in to personal accounts, which keeps data from those accounts safe from prying eyes.
    Don’t log in to personal accounts on a school-issued device, if you can avoid it (we understand a school-issued device is sometimes the only computer a student has access to). Instead, use a personal device for all personal communications and accounts (e.g., email, social media). Maybe your personal phone is the only device you have to log in to social media and chat with friends. That’s okay: keeping separate devices for separate purposes reduces the risk that your data is leaked or surveilled.
    Don’t log in to school-controlled accounts or apps on your personal device: those can be monitored, too.
    Instead, create another email address on a service the school doesn’t control, just for personal communications, and tell your friends to contact you at that address outside of school.

    Finally, voice your concern and discomfort about such software being installed on devices you rely on. There are plenty of resources to point to, many of them linked in this post, when raising concerns about these technologies. As the young journalists at Lawrence High School have shown, writing about it can be an effective avenue for bringing these issues to school administrators. At the very least, it will signal to those in charge that students are uncomfortable trading their right to privacy for an elusive promise of security.

    Schools Can Do Better to Protect Students’ Safety and Privacy

    It’s not only students who are concerned about AI spying in the classroom and beyond. Parents are often unaware of the spyware deployed on the school-issued laptops their children bring home. And when a family uses a shared, privately owned computer that is logged in to a school-issued Google Workspace or Microsoft account, a parent’s web searches are available to the monitoring AI as well.

    New studies have uncovered some of the mental health harms that surveillance causes. Despite this, and despite the array of First Amendment questions these student surveillance technologies raise, schools have rushed to adopt these unproven and invasive tools. As Barbour put it:

    “While ballooning class sizes and the elimination of school positions are considerable challenges, we know that a positive school climate helps kids feel safe and supported. This allows kids to talk about what they need with caring adults. Adults can then work with others to identify supports. This type of environment helps not only kids who are suffering with mental health problems, it helps everyone.”

    We urge schools to focus on creating that environment, rather than subjecting students to ever-increasing scrutiny through school surveillance AI.



  • Cowardly slinking away into the shadows is par for the course.

    Offer a real conversation to others instead of being combative, and maybe you will have better results with people in the future.

    The whole pro company schtick is getting old.

    I never once stated I was “pro company”. Another tip, free of charge: don’t put words in people’s mouths. It makes your already faulty argument moot.

    Is everyone free not to use them?

    Yes.

    Can people afford to make another choice while at the same time being forced to use laptops?

    No one is forced to use a laptop. Libraries exist with free internet access and computers.

    Is the type of monitoring reasonable and proportional?

    Yes. If you do not think it is, I can only assume you weren’t a child when the internet became a thing. I was “monitored” on school computers in the ’90s, and this is no different.

    Now are you actually going to participate in good faith or should I simply block you?


  • The school is the counterparty, and a state actor with everything that entails, signing a poorly negotiated, likely corruption-ridden contract with some “trust me bro, we don’t sell data” vendor, i.e. a third party.

    What could be gained by monitoring someone’s school activity that is not already bought and sold by the social media companies the majority use excessively and daily?

    Now your child is subject to a contract arrangement that you are not privy to, one that enables some “dudes” to track your child’s usage of the equipment.

    Don’t use third party hardware if you are worried about being monitored.

    If you don’t see this as overreach, society has really degraded, especially in the context of the child abuse issues we are coming to grips with.

    If you could make a real argument that isn’t a personal attack or logical fallacy that would be great.