• 0 Posts
  • 58 Comments
Joined 1 year ago
Cake day: August 8th, 2023

  • Lemmy generally attracts the same kind of person that would also use Linux. Both are open-source, community-driven alternatives to software otherwise provided by large corporations and milked for every last cent. Both require just a bit more knowledge to use comfortably: Linux with all its distros and desktop environments, Lemmy with all its instances and apps/front-ends. We’re very much a bubble here.


  • I try to steer as many people as I know towards Signal, but I don’t want to be the type of person who accepts no compromise, so I also use a bunch of others. WhatsApp is the most common, as pretty much everyone here in the Netherlands uses it. I used to use Telegram, but nowadays I trust it less than WhatsApp, and all my Telegram chats have moved to Signal. SMS is only there as a backup and for older people who don’t use other apps. And Discord is there for people who want their messages to never be read, because that app is a dumpster fire that constantly makes me miss messages.



  • Video games are honestly incredible. Prices have stayed relatively the same for a very long time, despite inflation, and yet the quality has shot up immensely. On one end you have AAA games like Cyberpunk, Jedi: Survivor, and RDR2, which look absolutely stunning. I’ve spent significant amounts of time in games like those just being in awe of the graphics and taking screenshots. These worlds are so big and immersive, and there are so many tiny details.

    Then you have the huge indie/smaller game scene. There are so many good games these days that it’s impossible to play them all. Factorio, Satisfactory, Celeste, Stardew Valley, Valheim, BAR, the list goes on and on. And all for a low price or even no money at all.


  • I always thought LibreOffice was shit, and it always felt like I was using a “replacement”. However, after finally using Word again after many years, I’ve come to the conclusion that it’s actually not miles ahead and is also quite shit. The docx format is bad, so Word is still better at dealing with it purely because it’s their format, but LibreOffice honestly has a more logical, if uglier, design. The Word top bar is pure pain.


  • No, people just don’t like crypto because it’s a huge waste of energy that currently has no use for the average person and is mostly used by rich people to get richer without much regulation. Don’t get me wrong, it may well become useful in the future if used correctly: wasting less energy by ditching proof of work, becoming actually useful for normal transactions, etc. But right now it’s just an overhyped technology for obnoxious cryptobros.


  • I work as a Java programmer, which means I spend about 50% of my day complaining about Java. Why doesn’t it have enums like Rust (see the sketch below)? Why are there no tuples? How many goat sacrifices do the Java gods require to support named optional arguments to functions like Python? In the remaining time I have meetings, write docs, write tests, and sometimes even code. Nothing to complain about though; compared to people I know who work as taxi drivers or in elderly care, we programmers are basically treated as gods.
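    The closest Java currently gets to Rust-style enums is a sealed interface over records, matched exhaustively with a switch (Java 21). A minimal sketch, with illustrative names:

    ```java
    // Sealed interface + records: the nearest Java analogue of a Rust enum
    // whose variants carry data. All names here are illustrative.
    sealed interface Shape permits Circle, Rect {}
    record Circle(double radius) implements Shape {}
    record Rect(double width, double height) implements Shape {}

    class EnumEnvy {
        static double area(Shape s) {
            // Exhaustive switch over the sealed hierarchy (Java 21),
            // roughly Java's answer to Rust's `match`.
            return switch (s) {
                case Circle c -> Math.PI * c.radius() * c.radius();
                case Rect r -> r.width() * r.height();
            };
        }

        public static void main(String[] args) {
            System.out.println(area(new Circle(1.0)));    // ~3.14159
            System.out.println(area(new Rect(2.0, 3.0))); // 6.0
        }
    }
    ```

    It works, but needing three separate language features to approximate one Rust construct is a fair chunk of the 50%.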

    As for music, I like Hardstyle and Drum and Bass primarily. Examples: Hardstyle: “Phuture Noize & Devin Wild - Waves”; DnB: “Telomic & Susan H - Underwater”. I’ll be visiting Liquicity Festival, a DnB festival, this weekend, so I’m very hyped right now.



  • Machine learning and compression have always been closely tied together. A model tries to learn the “rules” that describe the data rather than memorizing all of it.

    I remember implementing a paper older than me in our “Information Theory” course at university that treated the creation of a decision tree as compression. Their algorithm considered the total cost of sending the decision tree itself plus all the exceptions to it, i.e. the training examples the tree misclassifies. If a node in the tree increased the overall message size, it would simply be pruned. This way they ensured that you wouldn’t draw conclusions from very little data and would only capture the big patterns in the data.
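    A minimal sketch of that trade-off, assuming a deliberately crude cost model (the paper’s actual encodings are more careful, and all names here are mine):

    ```java
    // Hedged sketch of MDL-style pruning à la Quinlan & Rivest: keep a split
    // only if encoding the subtree plus its exceptions is cheaper than
    // encoding a plain leaf plus its exceptions.
    class MdlPruneSketch {
        // Crude cost in bits to transmit k exceptions out of n examples,
        // approximating log2(n choose k) by k * log2(n).
        static double exceptionCost(int k, int n) {
            return k == 0 ? 0.0 : k * (Math.log(n) / Math.log(2));
        }

        // Keep the split only if it pays for itself in total message length.
        static boolean keepSplit(double subtreeBits, int subtreeErrors,
                                 double leafBits, int leafErrors, int n) {
            double splitCost = subtreeBits + exceptionCost(subtreeErrors, n);
            double pruneCost = leafBits + exceptionCost(leafErrors, n);
            return splitCost < pruneCost;
        }

        public static void main(String[] args) {
            // A 12-bit split that cuts errors from 10 to 2 on 100 examples
            // shortens the message; the same split fixing one error doesn't.
            System.out.println(keepSplit(12.0, 2, 1.0, 10, 100)); // true
            System.out.println(keepSplit(12.0, 9, 1.0, 10, 100)); // false
        }
    }
    ```

    The point is just the comparison: a node is only worth transmitting if it shortens the total message.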

    Fundamentally it is still compression; it’s just a far better method of compression than all the models we had before.

    EDIT: The paper I’m talking about is “Inferring decision trees using the minimum description length principle” by J. Ross Quinlan & Ronald L. Rivest.



    • Actually owning my phone. Apple decides everything for its users and allows them little freedom. I want to be able to put random apps on my phone, maybe even ones I wrote myself.

    • Price. Shit’s expensive. I recently got a Pixel 8 for less than 500 euros; before that I had phones in the 300 euro price range.

    • Their ecosystem. They try to lure you into an all-Apple ecosystem. Stuff like iMessage is horrible for consumers. With an Android phone I have a choice of apps, smartwatch, earbuds, etc. Apple will always try to force you into buying their fancy but expensive things.

    • No benefit. There are plenty of cool Android phones.

    Etc etc.


  • I’m on Arch (actually a converted Antergos) and I have an NVIDIA card as well. My first attempt at Wayland a few months ago was horrible, bricking my system and requiring a bootable USB and a whole evening to get Linux working again.

    My second attempt was recently, and it went a lot better. X11 no longer seems to work, so I’m kinda stuck with Wayland, but it feels snappy as long as my second monitor is disconnected. I’ve yet to try gaming. My main monitor is a VRR 144Hz panel with garbage-tier HDR. The HDR worked out of the box on KDE Plasma, with the same shitty quality as on Windows, so I immediately turned it off again. When my second monitor is connected I get terrible hitching: every second or so the screen just freezes for hundreds of milliseconds. Something about it (1280x1024, 75Hz, DVI) must not make Wayland happy. No settings seem to change anything; only physically disconnecting the monitor helps.



  • You call it “quick to judge and superficial”, but imo that’s the wrong attitude. Every tool we use as humans should be designed to be as intuitive as possible, because that makes it easiest for people to learn how to use it. That doesn’t mean a tool cannot be complex or customizable, but the default experience should make it easy for new users to quickly achieve something. Once they grow accustomed to the tool, they can tailor it to their own needs.

    No tool has to do this, but if it wants to be widely used then this is kinda necessary.

    There’s a reason why there are whole fields of study into human-media interaction, and why software companies hire UI designers. Everything that is intuitive enough not to need explaining in words and text saves mental overhead for the user and makes the application more accessible.