Check if you find anything about this in the kernel log (dmesg).
Generally, I tend to think that there is some misunderstanding happening, rather than people being stupid. Maybe that is just the optimist in me.
What exactly is meant when people say they don’t know git? Do they mean the repository data format? Do they mean the network protocol? Do they mean the command line utility? Or just how to work with git as a developer, which is similar to other VCSs?
I think if you use some git GUI, you can get very far without needing to understand “git”, which, I would argue, most people who use it daily don’t, at least not fully.
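To illustrate the “repository data format” level from above: a repo is just content-addressed objects under `.git/objects`, and you can poke at them directly. A small sketch (the repo name `demo`, file name, and identity are made up):

```shell
# The CLI is porcelain on top of content-addressed objects.
git init -q demo && cd demo
echo "hello" > file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.org commit -qm "initial commit"

git cat-file -t HEAD          # prints the object type of the commit
git cat-file -p "HEAD^{tree}" # prints the tree it points at, listing file.txt
```

Most people using a GUI never need any of this, which is kind of the point.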
“you” as in a person with the required skills, resources and access to a chip fabrication facility. Everyone else can just buy something designed and produced by others, or play around a bit on FPGAs.
We will also see how much variation will actually happen with RISC-V, because if every processor is a unique piece of engineering, it is really hard to write software that works on every one.
Even with ARM there are arguably too many designs out there, which currently take a lot of effort to integrate.
Depends a bit on what the default cloning URL will be. If the domain is under Mozilla’s control and forwards to GitHub, then fine. If most people start using the GitHub URL instead, then it is still vendor lock-in, because many people and projects will use it, and that is not so easy to move away from.
Update: To the people down-voting my comment, I would love to hear why you either disagree with me, or find that my contribution to this discussion is worthless.
The upstream URL of a project or repo is important, because it will be used in other projects, for example in build scripts that fetch the sources. If a project changes that URL in the future and the old URL is no longer available/functional, all those scripts need to be changed, and old versions of these scripts no longer work out of the box.
If the project owns the URL, they can add redirect rules that might help alleviate some of these issues. I don’t think GitHub allows projects that move away from it to do that. So this is a sort of vendor lock-in: the project needs to keep maintaining the repo on GitHub, because they want to break the internet as little as possible.
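When upstream cannot redirect, consumers of a repo can at least rewrite the dead URL locally; a sketch using git’s `url.<base>.insteadOf` config (both URLs here are hypothetical):

```shell
# Transparently rewrite the old clone URL to the project's new home
# for every clone/fetch on this machine.
git config --global \
    url."https://new.example.org/project".insteadOf \
    "https://github.com/oldorg/project"
```

Of course this only fixes machines you control, which is exactly why the publicly advertised URL matters so much.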
Also, state-owned only really makes sense for infrastructure, where it doesn’t make sense to have multiple providers and monopolies are easily attainable: roads, rails, electricity, internet backbone infrastructure and providers, social media, etc. Democracy is currently the best way we know of to manage monopolies.
For other stuff, you probably want employee-owned democratic collectives. You would still have competition on the market, but it’s ordinary people who have the say. This would give more power to the people enthused about the tech and its long-term success than to short-term gains.
There is a different term for that: source-available.
So you can play a racing game in your car, while letting the autopilot kill you.
And so that the manufacturer can sell you a new car, so that you can play newer games.
What I meant is that it caches the password database for offline use.
There is not much difference between having two apps (password manager and authenticator app) or one app that does both on the same device.
So, if you want more security, you have to deal with a hardware token, not an authenticator app. But then, if you lose your token, you are in trouble.
I used to use Aegis, but after setting up my own vaultwarden, I use the normal bitwarden app/plugin on all my systems for passwords and TOTP.
The advantages are that I don’t need my phone to log in, and the keys are synced and backed up in the encrypted vaultwarden database, which I can then handle with normal server backup tools. It still works offline, because the bitwarden app caches the password database.
This is IMO much more convenient and secure (in the sense that losing access to a device doesn’t lock you out, and you don’t need to trust third parties) than most other solutions.
It is not the drama itself that should influence your judgment, but how they deal with it.
Whenever people work together on something, there will be some drama, but as long as they deal with it, that should be fine.
Nix and NixOS are big enough that even if this fails, there are enough other people who will continue the project, maybe under a different name.
Even if that causes a hard fork, which I currently think is unlikely, there are many examples where that worked and resolved itself over time without too much of a burden on the users, meaning there were clear migration processes available: owncloud/nextcloud, Gogs/Gitea/Forgejo, redis/valkey, …
“Non-profit organizations”: that sounds like the minority of developers. Most projects are from single developers who just throw their project on GitHub et al. and release it from there.
This is only really nice when no CLA is required and every contributor retains their copyright. Ente doesn’t seem to require a CLA.
Otherwise it allows the owner to just take the changes from their contributors and change the license at a later date.
The AI part is what makes this fuck up special and international news.
We are used to human fuck-ups, but an in-person event where the organizers were so lazy that they used AI to create the content, and where that content sucked, is something novel.
AI-generated pictures, blogs and books are old news; generated in-person events are new.
Well, there are also the mobile variants of Firefox, which are more of their own thing.
IMO Mozilla limited itself a bit too much to Firefox, which results in their web engine not attracting many developers outside Mozilla.
Embedding gecko in your own app was much easier in the past. This is now mostly taken over by CEF and WPE for Blink and WebKit respectively.
Also stuff like B2G (Boot 2 Gecko) or FirefoxOS are dead as well.
A goal of open source should also be to be hacker-friendly, and there Blink/WebKit is currently leading. There are so many more projects around those engines than around Gecko, which is sad.
Or other standard archiving formats like WARC.
There also is https://github.com/ArchiveBox/ArchiveBox which looks a bit similar.
It contains only code, no assets or textures.
Snap is just one case where Ubuntu is annoying.
It is also a commercial distribution. If you have ever used a community distribution like Arch, Gentoo or even Debian, you will notice that they encourage participation much more. You can contribute your ideas and work without being required to sign any CLAs.
Because Ubuntu wants to control/own parts of the system, they tend to create their own, often subpar, software that requires CLAs, rather than contributing to existing solutions. See upstart vs OpenRC and later systemd, Mir vs Wayland (both of which they later adopted anyway), Unity vs GNOME, snap vs flatpak, microk8s vs k3s, bazaar vs git or mercurial, … The NIH syndrome is pretty strong in Ubuntu. And even where Ubuntu came first with one of these solutions, the community had to create the alternative because Ubuntu was controlling it.
I mod my games on my PC and sync them to my Steam Deck. I also sync the save files back and forth to continue playing on different devices. Mostly non-Steam games.
I also sync my eBook collection to my eink reader with syncthing.
Everything is also mirrored to my always-on NAS, so syncing always works.
“Copying is theft” has been the argument of corporations for ages, but when they want our data and information to integrate into their business, then suddenly they have the right to it.
If copying is not theft, then we have the right to copy their software and AI models as well, since they are available on the open web.
They got themselves into quite a contradiction.