• 0 Posts
  • 15 Comments
Joined 4 months ago
Cake day: March 3rd, 2024


  • IBM then. or, i don’t know, the British Royal Family?

    the reality of talking about extremist economics is that no one knows how it would work out in the long term. but regardless, if it happened tomorrow, we’d already have a Microsoft to deal with.

    “taxation is theft”; “wage labour is exploitation”

    sometimes things are subtle and complicated and can’t be practically boiled down to absolutes.


  • “we don’t know how” != “it’s not possible”

    i think OpenAI, more than anyone, knows the challenges of scaling data and training. anyone working on AI knows the line: “a baby can learn to recognize elephants from a single instance”. reducing training data and training time is fundamental to advancement. don’t get me wrong, it’s great to put numbers to these things; i just don’t think this paper is super groundbreaking or profound. a bit clickbaity and sensational for Computerphile.
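
    to illustrate the “single instance” idea, here’s a toy one-shot classification sketch (my own illustration, not from the paper or the video): store one labeled embedding per class and classify queries by nearest embedding. the random vectors stand in for a pretrained encoder’s output.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def embed(x):
        # stand-in for a pretrained encoder; normalize to unit length
        return x / np.linalg.norm(x)

    # one "training" example per class -- the single-instance setting
    prototypes = {
        "elephant": embed(rng.normal(size=512)),
        "giraffe": embed(rng.normal(size=512)),
    }

    def classify(query):
        q = embed(query)
        # cosine similarity is just a dot product on unit vectors
        return max(prototypes, key=lambda label: float(prototypes[label] @ q))

    # a query near the elephant prototype classifies as "elephant"
    noisy = prototypes["elephant"] * 10 + rng.normal(scale=0.1, size=512)
    print(classify(noisy))  # elephant
    ```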




  • chrash0@lemmy.world to Linux@lemmy.ml · Lix - a new fork of Nix
    +34 / -2 · edited · 2 months ago

    i really want to like Nix.

    gave it a shot a few years ago, but i felt like documentation and community support weren’t really there yet. this was long before Nix surpassed Arch in number of available packages. now people still complain about the documentation, especially for the Nix language. i see a lot of package authors using it, which tempts me to start with at least the package manager, but a lot of packages don’t support it.

    the allure of GitOpsing my entire OS is very tempting, but then there have been these rumors (now confirmed) of new forks, and Guix splintered off much earlier. for something that’s ostensibly the most stable OS, that makes me nervous. it also seems to carry some nontrivial overhead: building packages, retaining old packages, etc.

    the pitch for Nix is really appealing, but with so much uncertainty it’s hard to pull the trigger on migrating anything. heck, if i could pull off some PoCs, i think my enterprise job might consider adopting it, but it’s as hard a recommend for me today as it was 5 years ago.



  • chrash0@lemmy.world to Technology@lemmy.world · Rabbit R1 is Just an Android App
    +65 / -7 · 2 months ago

    what else would it be? it’s a pretty common embedded target. dev kits from Qualcomm come with Android and use the Android bootloader and debug protocols at the very least.

    nobody is out here running a plain Linux kernel and maintaining a UI stack while AOSP exists. would be a foolish waste of time for companies like Rabbit to use anything else imo.

    to say it’s “just an Android device” is both true and a mischaracterization. it likely has a lot in common with a smartphone, but they’ve made modifications and aren’t supporting app stores or sideloading. that doesn’t mean you can’t do it; just don’t be surprised when it doesn’t work 1:1.



  • it’s not a password; it’s closer to a username.

    but realistically it’s not in my personal threat model to worry about being tied down and forced to unlock my phone. everyone with windows on their house should know that security is mostly about how far an adversary is willing to go to steal from you.

    personally, i like the natural daylight, and i’m not paranoid enough to brick up my windows just because it’s a potential ingress.


  • seems like chip designers are being a lot more conservative from a design perspective. NPUs are generally a shitton of 8-bit registers with optimized matrix multiplication. the “AI” that’s important isn’t the stuff in the news or the startups; it’s the things we’re already taking for granted: speech to text, text to speech, semantic analysis, image processing, semantic search, etc. sure, there’s a drive to put larger language models or image generation models on embedded devices, but a lot of these applications are battle tested and would be missed or hampered if that hardware weren’t there.

    “AI” is a buzzword and a goalpost that moves at 90 mph. machine learning, and the hardware and software ecosystem that’s developed more or less quietly in the background (at least compared to ChatGPT) over the past 15 or so years, is revolutionary tech that will be with us for a while.
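
    to make the “8-bit registers with optimized matrix multiplication” point concrete, here’s a rough sketch of symmetric int8 quantization around a matmul. this is my own toy illustration, not any particular NPU’s scheme; real hardware adds zero-points, saturating accumulators, and fused requantization on top of this.

    ```python
    import numpy as np

    def quantize(x):
        # symmetric int8 quantization: map the tensor's float range onto [-127, 127]
        scale = np.abs(x).max() / 127.0
        q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
        return q, scale

    a = np.random.randn(4, 8).astype(np.float32)
    b = np.random.randn(8, 4).astype(np.float32)

    qa, sa = quantize(a)
    qb, sb = quantize(b)

    # the hot loop is pure int8 multiplies accumulating into int32 --
    # the operation NPUs pack thousands of units for
    acc = qa.astype(np.int32) @ qb.astype(np.int32)

    # one float rescale at the end recovers an approximation of the fp32 result
    approx = acc * (sa * sb)
    print(np.max(np.abs(approx - a @ b)))  # small quantization error
    ```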

    blockchain currencies never made sense to me from a UX or ROI perspective. they were designed to get more power hungry as adoption took off, and the power and compute optimizations were always conjecture. the way wallets are handled, and how privacy was barely a concern, was never going to fly with the masses. pile on that finance is just a trash profession that requires goggles turning every person and thing into an evaluated commodity, and you have a recipe for a grift economy.

    a lot of startups will fail, but “AI” isn’t going anywhere; it’s been around as long as computers have. i think we’re going to see a similarly cautious approach from companies like Google and Apple, much like the chip designers’, as more semantic search, image editing, and conversation bot advancements make their way to the edge.