Canvas hack hit students at the worst time, and it’s a wake-up call for schools everywhere


A cyberattack on Canvas could not have come at a worse time. The learning platform, used by schools and universities for assignments, exams, grades, lecture materials, and class communication, went down during finals week, leaving students and instructors scrambling for alternatives.

The incident has been linked to ShinyHunters, a hacking group known for data theft and extortion. According to BleepingComputer, Canvas login portals at hundreds of institutions were defaced with a ransom-style message warning that stolen student data would be leaked unless the attackers were contacted. The group claimed to have obtained data tied to millions of students, teachers, and staff across thousands of schools.

What went wrong inside Canvas?

Instructure, the company behind Canvas, said hackers exploited an issue related to its Free-for-Teacher accounts, forcing it to take the platform offline temporarily while it investigated. The outage caused chaos in the middle of finals season, as students and teachers were suddenly locked out of the platform.

During the initial outage, the Canvas login screen reportedly displayed a message from ShinyHunters claiming it had breached Instructure “again” and warning schools to make contact before a May 12, 2026, deadline to prevent stolen data from being published. The message also included a list of affected schools, making it clear the attack was part of an extortion attempt.

Why did this hit students so hard?

The hack led some institutions to postpone exams, while others asked faculty to be flexible with deadlines and course requirements. For students already in the middle of finals, the outage created more stress around study materials, submissions, and exam schedules.

While Instructure says no passwords or financial details were compromised in the attack, the hackers did obtain millions of usernames, email addresses, student IDs, and internal messages. That information could easily be used to craft phishing attacks that mention real classes, schools, or instructors.

Haven’t we seen ShinyHunters before?

ShinyHunters has been connected to several major breaches in the past, including incidents involving Ticketmaster and Rockstar. Even Instructure has had previous run-ins with the hacker group. In September 2025, ShinyHunters targeted Instructure’s Salesforce environment through social engineering to access business systems, but Instructure said no Canvas product data was accessed and that the exposed information was mainly public business contact details.

What now?

Canvas coming back online does not end the problem. Hackers are still holding data from millions of users for ransom, which means the risk remains. That said, ShinyHunters has reportedly removed Instructure from its “Pay or Leak” portal, suggesting negotiations may be underway.

The attack should be a wake-up call for every school that relies on a few digital platforms to run classes, exams, and communication. These tools are now essential to how schools operate, which means they need stronger cybersecurity to protect student data and backup plans in case another outage or attack happens.






Researchers at the University of Washington have developed a new prototype system that could change how people interact with artificial intelligence in daily life. Called VueBuds, the system integrates tiny cameras into standard wireless earbuds, allowing users to ask an AI model questions about the world around them in near real time.

The concept is simple but powerful. A user can look at an object, such as a food package in a foreign language, and ask the AI to translate it. Within about a second, the system responds with an answer through the earbuds, creating a seamless, hands-free interaction.

A Different Approach To AI Wearables

Unlike smart glasses, which have struggled with adoption due to privacy concerns and design limitations, VueBuds takes a more subtle approach. The system uses low-resolution, black-and-white cameras embedded in earbuds to capture still images rather than continuous video.

These images are transmitted via Bluetooth to a connected device, where a small AI model processes them locally. This on-device processing ensures that data does not need to be sent to the cloud, addressing one of the biggest concerns around wearable cameras.

To further enhance privacy, the earbuds include a visible indicator light when recording and allow users to delete captured images instantly.

Engineering Around Power And Performance Limits

One of the biggest challenges the research team faced was power consumption. Cameras require significantly more energy than microphones, making it impractical to use high-resolution sensors like those found in smart glasses.

To solve this, the team used a camera roughly the size of a grain of rice, capturing low-resolution grayscale images. This approach reduces battery usage and allows efficient Bluetooth transmission without compromising responsiveness.

Placement was another key consideration. By angling the cameras slightly outward, the system achieves a field of view between 98 and 108 degrees. While there is a small blind spot for objects held extremely close, researchers found this does not affect typical usage.

The system also combines images from both earbuds into a single frame, improving processing speed. This allows VueBuds to respond in about one second, compared to two seconds when handling images separately.
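The researchers have not published their stitching code, but the idea behind combining the two earbud frames can be sketched in a few lines: place the left and right grayscale images side by side so the AI model runs one inference pass instead of two. The frame dimensions and the `combine_frames` helper below are illustrative assumptions, not the team's actual implementation.

```python
import numpy as np

def combine_frames(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Stitch the left- and right-earbud grayscale frames into one image.

    Feeding the model a single combined frame means one inference call
    instead of two, which is the kind of saving behind the reported
    ~1 s response time versus ~2 s for processing each image separately.
    """
    if left.shape != right.shape:
        raise ValueError("expected matching frame sizes from both earbuds")
    # Stack along the width axis: (H, W) + (H, W) -> (H, 2*W)
    return np.hstack([left, right])

# Illustrative low-resolution frames (the exact sensor resolution is not public).
left = np.zeros((240, 320), dtype=np.uint8)    # left-earbud capture
right = np.full((240, 320), 255, dtype=np.uint8)  # right-earbud capture
frame = combine_frames(left, right)
print(frame.shape)  # (240, 640)
```

A single wide frame also lets the model reason about both viewpoints at once, which matters when an object spans the small blind spot between the two cameras.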

Performance Compared To Smart Glasses

In testing, 74 participants compared VueBuds with smart glasses such as Meta’s Ray-Ban models. Despite using lower-resolution images and local processing, VueBuds performed similarly overall.

The report showed participants preferred VueBuds for translation tasks, while smart glasses performed better at counting objects. In separate trials, VueBuds achieved accuracy rates of around 83–84% for translation and object identification, and up to 93% for identifying book titles and authors.

Why This Matters And What Comes Next

The research highlights a potential shift in how AI-powered wearables are designed. By embedding visual intelligence into a device people already use, the system avoids many of the barriers faced by smart glasses.

However, limitations remain. The current system cannot interpret color, and its capabilities are still in early stages. The team plans to explore adding color sensors and developing specialised AI models for tasks like translation and accessibility support.

The researchers will present their findings at the Association for Computing Machinery Conference on Human Factors in Computing Systems in Barcelona, offering a glimpse into a future where everyday devices quietly become intelligent assistants.


