

But bacterial infections aren’t true. RFK Jr. told us so.
I’m starting to wonder how many of these explosions are acts of corporate sabotage.
because they want more human brains?
It’s life Jim, but not as we know it.
Not as we know it.
Not as we know it.
Star Trekkin’ Across The Universe…
Ahh, seeing images load line by line and getting excited as you can actually start to tell what the image is!
I’m finding I’m using YouTube less and less, to the point that I can’t actually remember the last time I used it for anything. YouTube has so much crap now.
Because of their monetisation scheme for content creators, it became profitable to drag 15 minutes of information out to an hour, to use shocked faces as thumbnails to get more clicks, and to chase viewers who only watch a bit of each video, as viewership dwindles into mindlessly following suggestions.
I feel sorry for those creators who have gone balls deep with youtube content and who have no “plan b”.
Any suggestions for solutions?
You’re conflating me asking how to use these tools with you misusing them. I see you still don’t accept that what you’re doing is wrong. But go you.
Please be very careful. The Python code it’ll spit out will most likely be outdated (example below) and won’t work as well as it should (the code isn’t “thought out” the way it would be if a human wrote it).
If you want to learn, dive in, set yourself tasks, get stuck, and f around.
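Just as an illustration of the “outdated” bit: it’s usually code that still runs but has been superseded, like `datetime.utcnow()`, which Python deprecated in 3.12.

```python
# Illustration only: code that "works" but is stale.
from datetime import datetime, timezone

# The sort of thing you'll often get suggested -- deprecated since Python 3.12,
# and it returns a naive (timezone-unaware) datetime:
stale = datetime.utcnow()

# The current idiom -- timezone-aware UTC:
current = datetime.now(timezone.utc)

print(stale, current)
```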
Yeah, shell scripts are one of those things where you never remember how to do something and always have to look it up!
Was this system vibe coded? I get the feeling it was…
lol. Way to contradict yourself.
I haven’t actually found the coder-specific ones to be much (if at all) better than the generic ones. I wish I could have. Hopefully LLMs can become more efficient in the very near future.
Some questions, and because you don’t actually understand, also the answers.
I mean yeah, it’s a use case, but own up to the fact that you’re wrong. Or be pissy. I don’t care.
Doesn’t Twitch own all the data that’s written, and doesn’t their TOS state something like you can’t store the data locally yourself?
No, what is it? How do I try it?
Surely none of that uses a small LLM <= 3B?
But won’t this be a mish-mash of different Docker containers and projects, creating an installation, dependency, and upgrade nightmare?
But its website is in Chinese. Also, what’s the GitHub?
Thanks, when I get some time soon I’ll have another look at it and Cherry AI with a local install of Ollama.
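In case it saves you a step when you do: here’s a minimal sketch of talking to a local Ollama install directly over its HTTP API, assuming the default port (11434) and that you’ve already pulled a model (the model name below is just a placeholder).

```python
# Minimal sketch: query a local Ollama instance via its HTTP API.
# Assumes Ollama is running on the default port 11434 and a model has
# already been pulled (e.g. `ollama pull llama3.2`) -- swap in your own.
import json
import urllib.request

payload = {
    "model": "llama3.2",  # placeholder: use whatever model you've pulled
    "prompt": "In one sentence, what is Ollama?",
    "stream": False,      # get a single JSON object back instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Any front-end that supports Ollama as a provider should just need that same http://localhost:11434 base URL.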