Actually, ufw has its own separate issue you may need to deal with. (Or bind ports to localhost/127.0.0.1 as others have stated.)
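If the ufw issue is the usual Docker one (published ports bypass ufw because Docker writes its own iptables rules), binding the published port to loopback sidesteps it. Rough sketch only - the image and ports are just placeholders:

```
# Publish the container port on 127.0.0.1 only, so it isn't reachable
# from the LAN/WAN regardless of what ufw allows:
docker run -d -p 127.0.0.1:8080:80 nginx

# Or the equivalent in docker-compose.yml:
#   ports:
#     - "127.0.0.1:8080:80"
```

Then put a reverse proxy (or SSH tunnel) in front if you actually need remote access.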
Thank you for posting this, hadn’t heard of it before.
Yes, I also work in IT.
The paid GUI version (basically a wrapper around the CLI) is extremely cautious with auto-updates - perhaps a bit too cautious. The free CLI version is just as careful about not breaking your existing backup storage.
For example, they recently added zstd compression, yet existing storages stay on lz4 unless you force it - and even then, the two compression methods can coexist in the same backup destination. It’s extremely robust in that regard: if you start forcing zstd, or create a new zstd backup destination, you can still use the newest CLI to copy the data back to an older lz4 storage and revert - just as an example. And of course you can compile it yourself years from now.
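A rough sketch of that copy-back workflow, from memory - storage names, snapshot id, and paths are placeholders, so check the docs before trusting it:

```
# Add a second, copy-compatible storage to an existing repository
# (args are: storage name, snapshot id, storage url):
duplicacy add -copy default lz4-storage my-backups /path/to/old/lz4/storage

# Copy existing snapshots from the current storage back to the lz4 one:
duplicacy copy -from default -to lz4-storage
```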
The licence is pretty clear - the CLI version is entirely free for personal use (commercial use requires a licence, and the GUI is optional). If you don’t like the licence, that’s fine, but it’s hardly ‘disingenuous’ when it is free for personal use, and has been for many years.
IMHO, Duplicacy is better than all of them at all those things - multi-machine, cross-platform, zstd compression, encryption, incrementals, de-duplication.
Wouldn’t you be on CGNAT though? How are they blocking it - at the DNS level? Have you tried a CNAME record that points your own domain to the actual duckdns domain? Just curious how/why they might be doing this.
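Something like this in your own domain’s zone, if you haven’t tried it (hostnames made up, obviously):

```
; alias your own subdomain to the DuckDNS name
home.example.com.   300   IN   CNAME   yourbox.duckdns.org.
```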
Well your account is on lemmy.world so how d’ya know the issue isn’t with your own access to the front end?
Many of us don’t interact with lemmy.world directly, so we might only see delays in post propagation (if there is such an issue on the backend - I don’t see any, but I could be wrong).
I agree picking the biggest instances isn’t great from a scaling perspective, but it’s gonna be hard to move any community once it’s established.
+1 for Duplicacy. Been using it solidly for nearly 6 years - with local storage, sftp, and cloud. Rclone for chonky media. Veeam Agent for local PC backups as a secondary method.
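For the media side, a minimal rclone sketch - the remote name and paths are placeholders for whatever you’ve configured:

```
# One-way sync of the media library to a configured rclone remote
rclone sync /mnt/media remote:media-backup --progress --transfers 4
```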
More people should use BiglyBT and its Swarm Merging feature. You get the ability to seed or download chunks from peers across separate torrent files.
It’s a shame, because if more people used it, the BiglyBT devs might add hash-based merging (with v2 torrents) instead of just size-based. Hybrid/v2 merging is still possible, but size-based matching is less reliable and only works for files larger than 50MB.
Some kinda auto v1/v2/hybrid private<->public torrent maker plugin for BiglyBT would be… bigly.