
  • Not my screenshot, but yeah, this seems to be several standard KDE widgets bundled together. You can always open the System Monitor app, though, if you want to check your system through a customizable, organized dashboard. Or, as done here, group standard widgets to enjoy them all in one neat panel.

    (Both can be set up to show you literally anything, and you’d be surprised how many sensors are in your computer)


  • All main desktop environment users triggered in 3…2…1…

    But seriously, as a KDE Plasma user, I have to note that it’s extremely customizable. It doesn’t have to look or behave like Windows at all; that’s just the default.

    An entirely different look? Sure! All sorts of completely customizable shortcuts? Yep! Tiling? If you so wish!

    The thing that made Plasma my forever choice is that whatever I want to make it, it delivers. It has settings for everything.

    Here are just two examples of non-standard KDE looks, by the way:

    [screenshots: two customized Plasma desktops]

  • su often takes more time and is more involved, even if the difference is only between very little effort and practically none at all.

    For example, I update and install apps through the CLI about once a week, and I’d rather just bang out sudo <update command> than run su, enter the root credentials, and only then get to what I wanted in the first place.
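    To illustrate the difference, here’s a rough sketch of the two workflows (assuming a Debian-style system; the apt commands are just placeholders for whatever your distro uses):

    ```shell
    # sudo: authenticate once with your own password and run a single command as root
    sudo apt update && sudo apt upgrade

    # su: switch to a full root shell first (prompts for the root password),
    # run the commands, then remember to drop privileges again
    su -
    apt update && apt upgrade
    exit
    ```

    With sudo it’s one line and you stay in your own shell; with su you pick up an extra password prompt, a shell switch, and the responsibility of exiting root when you’re done.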

  • That would be true if children were abused specifically to obtain the training data. But what I’m talking about is using data that already exists, taken from police investigations and other sources. Of course, it also requires the victims’ consent (once they are old enough), as not everyone will agree to have materials of their abuse proliferate in any form.

    Police have already used CSAM, with victims’ consent, to more convincingly impersonate CSAM platform admins in investigative operations, leading to arrests of more child abusers and of those sharing the materials. While controversial, this was a net benefit, as it reduced both the number of avenues for sharing CSAM and the number of people able to do so.

    The case with AI is milder, as it requires minimal human interaction: no one has to re-watch the materials as long as the victims are already identified. It’s enough for the police to contact the victims, obtain their consent, and feed the data into the AI without releasing the source material. With enough data, AI could improve image and video generation, drawing viewers away from real CSAM and reducing rates of abuse.

    That is, if it works this way. There’s a glaring research hole in this area, and I believe it is paramount to figure out whether it actually helps. Then we could decide whether to include already-produced CSAM in the training data, or whether adult data alone is sufficient to make the output good enough for the intended audience to switch.