  • Great to hear this story of success. That plus

    $266.99 per probe for the original proprietary one

    Reminds me of Schneider’s stupid proprietary dongle for programming their PLCs. It’s just a CH341 in a funny-shaped case that fits into the funny-shaped slot on the PLC, where it plugs onto an ordinary 0.1" pin header to talk logic-level serial. Any generic USB-serial adapter can do the same job (rough sketch below).

    Plus it has a custom USB ID of course. Probably costs $2 to manufacture, sells for almost $300 as well.
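    For context, here’s roughly all that link amounts to. A minimal Python sketch using pyserial with a generic CH340/CH341 adapter; the device path, baud rate and request bytes are placeholders, since the actual protocol is undocumented:

        import serial  # pyserial

        # Hypothetical settings; the real PLC's framing is undocumented
        PORT = "/dev/ttyUSB0"  # a generic CH341 enumerates as a plain USB-serial port
        BAUD = 9600

        with serial.Serial(PORT, BAUD, timeout=1) as link:
            link.write(b"\x10\x02")   # placeholder request bytes
            reply = link.read(64)     # read up to 64 bytes of response
            print(reply.hex())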


  • I don’t see how people like you miss the entire concept of “base load”.

    I live in a region with vast amounts of renewable energy resources. It’s always windy and the sun shines almost every day. I have solar panels on my house that cover most of my DHW (domestic hot water) and a large fraction of my summer cooling load, and keep most of my appliances running.

    But right now, the sun is down and the wind is flat. And I still need power. My battery storage would be depleted by morning, and draining it that far would damage it through overdischarge, so I have to buy power from the grid instead (napkin math below).

    And it’s a lovely summer evening with no heating or cooling demand! What about midwinter, -35C and dark and snowy? Where is my power coming from on that day, after a month of days just like it?

    Nuclear.
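    The battery math is simple, and it doesn’t take much of a night to run short. A sketch with made-up but plausible numbers (capacity, usable depth of discharge and overnight load are all assumptions, not my actual system):

        # All figures are hypothetical, for illustration only
        capacity_kwh = 13.5      # nominal battery capacity
        usable_fraction = 0.80   # stay above 20% charge to avoid damaging overdischarge
        overnight_load_kw = 1.2  # average household draw, no heating or cooling
        dark_hours = 14          # a long winter night

        usable_kwh = capacity_kwh * usable_fraction  # 10.8 kWh available
        needed_kwh = overnight_load_kw * dark_hours  # 16.8 kWh required

        print(f"usable: {usable_kwh:.1f} kWh, needed: {needed_kwh:.1f} kWh")
        # usable: 10.8 kWh, needed: 16.8 kWh -> the grid covers the shortfall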





    Right, we need to come up with better terms for talking about “AI”. Personally, at the moment I consider any transformer-type ML system to be part of the category; as you said, none of them is any more “intelligent” than the others. They’re all just a big stack of tensor operations (see the sketch below). So if one is AI, they all are.

    Remember long ago when “fuzzy logic” was all the hype and considered to be AI? It was just a very early form of classifier network, but everyone was super excited at the time.
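    To make “a big stack of tensor operations” concrete, this is single-head scaled dot-product attention, the core operation of every transformer, in a few lines of NumPy (toy sizes and random weights, just to show the shape of the thing):

        import numpy as np

        rng = np.random.default_rng(0)
        seq_len, d_model = 4, 8                      # toy sizes

        x = rng.standard_normal((seq_len, d_model))  # token embeddings
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))

        Q, K, V = x @ Wq, x @ Wk, x @ Wv             # three matrix multiplies
        scores = Q @ K.T / np.sqrt(d_model)          # similarity of every token pair
        weights = np.exp(scores)                     # row-wise softmax...
        weights /= weights.sum(axis=1, keepdims=True)
        out = weights @ V                            # weighted mix of the values

        print(out.shape)                             # (4, 8)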


  • I’m just stating that “AI” is a broad field. These lightweight and useful transformer models are a direct product of other AI research.

    I know what you mean, but simply saying “don’t use AI” isn’t really valid anymore, since these ML models will soon be a common component. There are even libraries and hardware acceleration support for tensor operations on the ESP32-S3 (small example below).
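    For a taste of that on-device: MicroPython’s ulab module gives numpy-style tensor operations on ESP32-class boards. This assumes firmware with ulab compiled in; the S3’s actual hardware-accelerated kernels are exposed through Espressif’s C libraries such as ESP-NN, not through Python:

        # Runs under MicroPython with the ulab module compiled in
        from ulab import numpy as np

        x = np.array([1.0, 2.0, 3.0])
        w = np.array([[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]])

        y = np.dot(x, w)  # a small "tensor operation" on the microcontroller
        print(y)          # array([2.2, 2.8], ...)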


  • It’s possible for local AI models to be very economical on energy, if used for the right tasks.

    For example, I’m running RapidOCR, which uses a modern transformer architecture and absolutely blows away traditional OCR at capturing data from character displays.

    It doesn’t even need a GPU, and returns results in under a second on a modern CPU. No preprocessing needed, just feed it an image (usage below). This little multimodal transformer is just as much “AI” as the bloated general-purpose GPTs, but it’s cheap, fast and useful.
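    Usage really is that minimal. A sketch using the rapidocr_onnxruntime package (the image path is a placeholder, and the API has shifted between releases, so check the project’s README):

        from rapidocr_onnxruntime import RapidOCR

        engine = RapidOCR()                     # loads the ONNX detection/recognition models
        result, elapse = engine("display.jpg")  # accepts an image path or ndarray

        # result is a list of [box, text, confidence] entries (None if nothing found)
        for box, text, confidence in result or []:
            print(f"{confidence:.2f}  {text}")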









  • evranch@lemmy.ca to Microblog Memes@lemmy.world · Unprovoked · 3 months ago

    Another idealist who has mixed up “allies” with “friends”. We don’t need to approve of anything our allies do, or even think that they’re good guys. All that “allies” means is that they would have our back against our common enemies, just like we have theirs.

    This is because “enemies” in this case refers to nations like Iran, which hate our entire way of life and aren’t afraid to say they would like to see us all dead. It doesn’t matter to them whether you think your government represents you (I certainly don’t), but if you believe we deserve to die, why not offer yourself up?

    You might get lucky and get thrown off a building, or a fairly quick beheading with a dull-ish knife. Or less lucky, and get roasted alive in a cage over a fire, dragged behind a vehicle, or hacked into various pieces.

    “We” are allies with Israel only because they are a shining light of sanity in a region that could otherwise be classified as “batshit insane”.


  • I feel the OOP debate got a bit out of hand. I hate OOP as well, as a paradigm.

    But I love objects. An object is just a struct that can perform operations on itself (tiny sketch below). It’s super useful. So many problems lend themselves to the use of objects.

    I’ve been writing a mix of C and C++ for so long that I don’t even know where the line is supposed to be. It’s “C with objects”. I probably use only 1% of the functionality of C++, but that 1% is a huge upgrade over bare C, IMO.
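    If the distinction isn’t clear, “a struct that can perform operations on itself” is the whole idea. A minimal, hypothetical sketch (Python here just for brevity; in C++ it’s literally a struct with member functions):

        from dataclasses import dataclass

        # Hypothetical example: the "struct" part is plain data...
        @dataclass
        class Point:
            x: float
            y: float

            # ...and the "object" part is operations the data performs on itself
            def scale(self, factor: float) -> None:
                self.x *= factor
                self.y *= factor

            def length(self) -> float:
                return (self.x ** 2 + self.y ** 2) ** 0.5

        p = Point(3.0, 4.0)
        p.scale(2.0)
        print(p.length())  # 10.0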


  • I was more referring to the fact that everything is immutable by default. As someone who’s just starting to get old (40) and literally grew up with C, it’s just ingrained in me that a variable is… Variable.

    If I want a variable to be immutable, I declare it const, and I’m just not used to the opposite. So when playing with Rust, the tutorial said that “most people find themselves fighting with the borrow checker”, and sure enough, that’s what I ended up doing!

    I like the concepts behind it; it really encourages writing safe code, and I feel like it’s not just going to be a fad language but will likely end up underlying the secure systems of the future. Linux kernel rewrite in Rust when?

    It’s just that personally I don’t have the flow of writing code like I do in C/C++; I’m simply not used to it. The scoping, and the way you pass variables and can sort of “use up” a reference so it’s not available anymore, feels cumbersome compared to just passing &memory_location and getting on with it, lol


  • evranch@lemmy.ca to Microblog Memes@lemmy.world · Unprovoked · 3 months ago

    Obviously? Like, this is how war and geopolitics have worked since the invention of the pointy stick.

    Whether you like them or not, Israel is an ally and Iran is a self-proclaimed enemy of the Western world.

    An enemy jumps into a war and bombs one of our allies, and you expect no reaction? Duh


  • Rust is heresy. Everything should be mutable, the way that God intended it to be!

    Seriously though, as someone who has mainly done embedded work for decades and got used to constrained environments, I find the everything-is-immutable paradigm clunky and inelegant. I don’t want to copy everything all the time.

    Now if you’ll excuse me, these null pointers aren’t going to dereference themselves