I guess it’s worth checking that those names point to the expected binaries, but I also think it would be highly unlikely for them to be anything other than just /usr/bin/ssh and /usr/bin/ssh-agent.
As mentioned, -v (or -vv) helps to analyze the situation.
My theory is that you already have something providing the ssh agent service, but that process is somehow stuck, and when ssh tries to connect to it, it either doesn’t respond to the connect, or it accepts the connection but doesn’t actually interact with ssh. Quite possibly ssh doesn’t have a timeout for interacting with the ssh-agent.
Using eval $(ssh-agent -s) starts a new ssh agent and replaces the environment variables in question with new ones, thereby avoiding the stuck process.
If this is the actual problem here, then before running the eval, echo $SSH_AUTH_SOCK would show the path of the existing ssh agent socket. If that is the case, you can use lsof $SSH_AUTH_SOCK to see which process is behind it. Quite possibly it’s provided by gnome-keyring-daemon if you’re running GNOME. As to why that process would not be working, I don’t have any ideas.
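For illustration, a rough diagnostic sequence along those lines could look like this (user@host is just a placeholder for whatever you’re connecting to):

```
# Which agent socket would ssh currently use?
echo $SSH_AUTH_SOCK

# Which process is holding that socket?
lsof "$SSH_AUTH_SOCK"

# If it looks stuck, start a fresh agent for this shell and retry
eval $(ssh-agent -s)
ssh-add
ssh -v user@host
```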
Another way to analyze the problem is strace -o logfile -f ssh .. and then check what is at the end of the logfile. If the theory applies, the last thing there would likely be a connect call for the ssh-agent socket.
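For example (again, user@host is just a placeholder):

```
# Trace ssh and all of its child processes into logfile
strace -o logfile -f ssh user@host

# After interrupting the hung ssh, look at the end of the trace
tail -n 30 logfile
```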
I highly doubt businesses would have been this fast in making the switch.
Well, that’s exactly the worry. Why shouldn’t it be? It is their business and livelihood.
Apparently Lapce has remote development as its core feature. But I only (re?)learned of it today…
In what way did tramp not work out for you?
A great Git integration can work well in an editor. I use Magit in Emacs, which is probably as full-featured a Git client as there can be. Granted, for operations such as cherry-picking, rebasing on top of a branch, or git reset, I most often use the command line (but Magit for interactive rebase).
But editor support for version management can give other benefits as well, for example visually showing which lines are different from the latest version, easy access to file history, easy access to line-based history data (blame), jumping to versions based on that data, etc.
As I understand it, vscode’s support for Git is so basic that it’s easy to understand why one would not see any benefits in it.
It still maintains their market position, which has value. For example, you might not visit other sites because they don’t have the content you want (and the content stays on YT because they have the viewers), or you might even share YT links to other people.
Yes, just mount to /mnt/videos and symlink that as needed.
I guess there are some benefits in mounting directly under $HOME, though, such as find/fd working “as expected”, and permissions being limited automatically per the $HOME permissions (but those can be adjusted manually). For finding files I use plocate, though, so I wouldn’t get that marginal benefit from mounting below $HOME.
My /home is also on a separate filesystem, so in principle I don’t like mounting data under there, because then I cannot unmount /home (e.g. for fsck purposes) without also unmounting all the other filesystems mounted below it. I keep all my filesystems on LVM.
So I just mount to /mnt and use symlinks.
Exception: sshfs I often mount to home.
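As a sketch of that approach, with a made-up LVM volume name (device and paths are only examples):

```
# Mount the data volume outside /home
sudo mkdir -p /mnt/videos
sudo mount /dev/mapper/vg0-videos /mnt/videos

# Make it conveniently reachable from the home directory
ln -s /mnt/videos ~/Videos
```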
But how many use it for browsing, which I imagine this data is from?
Papermerge version 2.0, version 2.1 and version 3.0 are entirely different and incompatible applications.
That doesn’t exactly inspire confidence in the future versions of this application, given in particular the use case of long-term document archival :).
I think the second point is the biggest for me: it’s almost like Canonical wanted to have a single dominant store for apps, as the ecosystem they are building supports only one. And, apparently, that one server is also closed?
So if you try to set up an alternative source and give people instructions on how to configure their snap installation to use it (I found this information very hard to find for some reason…), your “store” probably won’t have the same packages Canonical’s has, so users won’t be able to find the packages, and I imagine updates are also broken at that point?
Contrasting this with flatpak: you just install apps from wherever. Or from flathub. Or your own site. Doesn’t matter. There is no business incentive built into the tools to make everyone use flathub.org.
I just noticed https://lemmy.ml/u/giloronfoo@beehaw.org had proposed the same, but here’s the same but with more words ;).
I would propose you try to split your data manually into logically separate parts, so that you could fit, say, 0.8 TB on one drive, 0.4 TB on another, and maybe two 0.2 TB sets on a third one. Then you’d have a script that takes a traditional targeted-backup approach with modern backup apps, backing up the particular data set for the disk you currently have attached to the system. This approach lets you painlessly use modern “forever incremental” backups, where older versions of data are kept without doing separate full and incremental backups. You should also write a script to check that no important data is left out of the backups and that nothing is backed up twice (except for data you want to back up twice?).
For example, you could have a physical drive with sticker “photos and music” on it to back up your ~/Photos and ~/Music.
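A minimal sketch of such a per-drive script, here using borgbackup and assuming the drive is mounted at /mnt/backup-photos-music (all paths and names are made up; kopia or another modern tool would work just as well):

```
#!/bin/sh
# Back up the "photos and music" data set to the drive currently attached.
# The repository must have been created once with: borg init --encryption=repokey "$REPO"
REPO=/mnt/backup-photos-music/borg

borg create --stats --compression zstd \
    "$REPO::photos-music-{now:%Y-%m-%d}" \
    ~/Photos ~/Music

# Keep a limited history of old versions on this drive
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 12 "$REPO"
```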
At some point some of those splits might become too large to fit into their allocated drives, which would mean additional manual maintenance. Apply foresight to avoid these situations :).
If that kind of separation is not possible, then I guess tar with multi-volume splitting is one option, as suggested elsewhere.
The pins are part of the window, so… You can access old closed windows through the history menu, which I believe still works after quitting and starting a new session.
I have 64GB RAM and my 64GB swap still gets filled to 60% over time.
It just happens that apps end up touching some memory once that they then never use again. Better to use some SSD for that instead of RAM.
Do share if you have experiences using yabridge with the flatpak distribution of Bitwig! My existing setup did not work with that, but the deb version worked ok on Debian, so I keep using that.
yabridge works really great for working with Windows plugins. I have quite a few of them working out just fine—at least with Bitwig, which is a native application.
That said, I’ve also seen some plugins that did not work. In particular the problems can be related to license management; the plugins probably get confused about what kind of system they are running on…
In my view yabridge is easy to use, but on the other hand I have a decent amount of Linux experience, so perhaps the experience can vary.
There is the DjVu format for this exact use case, but you’d need to convert the files to, say, PDF for many use cases. It’s also a bit old and perhaps not maintained, so…
HEIF and other formats based on modern video codecs (HEIF uses H.265) should fare a lot better than JPEG, though.
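As a quick way to test that claim on your own scans, ImageMagick can write HEIF if it was built with libheif support (file names and the quality value here are arbitrary):

```
# Re-encode a scanned page as HEIF and compare sizes
magick scan.jpg -quality 50 scan.heic
ls -lh scan.jpg scan.heic
```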
I believe it’s quite possible that there is no information about where a removed file belonged in ext4 and its ilk; after all, they also have the concept of “lost and found” files, and files there don’t have that information either. If the directory the file was contained in gets overwritten in a form where the file is no longer listed, then that information is likely gone, along with the name of the file.
Just get a secondary device, recover everything you can, and pick the files you need. Consider yourself lucky if you get to restore the file you lost.
Good luck recovering your files! For the future, I recommend making backups. I use kopia; borgbackup is also popular.
At the end of the log you find:
822413 connect(4, {sa_family=AF_UNIX, sun_path="/run/user/1000/gcr/ssh"}, 110) = 0
...
822413 read(4,
meaning ssh is trying to interact with the ssh-agent, but in the end never gets a response.
Use the lsof command to figure out which program is providing the agent service and try to resolve the issue that way. If it’s not the OpenSSH ssh-agent, then maybe you can disable its ssh-agent functionality and use the real ssh-agent in its place… My wild guess is that the program might be trying to interactively verify the use of the key with you, but is not succeeding in doing that for some reason.
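With the socket path from the trace above, that would be something like:

```
# Which process is serving the agent socket?
lsof /run/user/1000/gcr/ssh
```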