For a file of a few hundred kilobytes, sure, the difference is pocket change. For a larger one you’d choose the right tool for the job though, especially for things like a split archive or a database.
Username checks out! Also you’re absolutely right: just last month I was looking for the best compression algorithms/packages to archive a 70 GB DB.
What did you find?
I ended up with xz. According to this page it’s the one with the best compression ratio. It’s also the slowest, but since it was a one-off job I didn’t mind.
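Not my exact command, but a minimal sketch of what a one-off pass like that looks like with Python’s stdlib lzma module (the same LZMA algorithm behind the xz CLI); the filename and preset here are made up:

```python
import lzma
import shutil

# preset=9 favors ratio over speed, which is fine for a one-off job.
with open("db_dump.sql", "rb") as src, \
        lzma.open("db_dump.sql.xz", "wb", preset=9) as dst:
    # Stream in chunks so a 70 GB dump never has to fit in memory.
    shutil.copyfileobj(src, dst, length=1024 * 1024)
```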
Why isn’t everyone using .7z?
For archiving/backing up *NIX files, tar.whatever still wins, as it preserves permissions while 7z, zip, and rar don’t (quick sketch below).
Oh, and while 7z is FOSS and supported out of the box on most Linux desktop OSes and on macOS, Windows users will complain they need to install stuff to open your .7z. Somehow, tar.gz is supported out of the box on Linux, macOS, and yes, Windows 10 and 11!
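A quick sketch of the permissions point, assuming Python’s stdlib tarfile and a placeholder script name:

```python
import tarfile

# Pack: the Unix mode bits of backup.sh (say 0o755) are recorded in the
# member's header, which is exactly what zip-style tools tend to drop.
with tarfile.open("backup.tar.gz", "w:gz") as tar:
    tar.add("backup.sh")

# The stored mode survives the round trip and is reapplied on extraction.
with tarfile.open("backup.tar.gz", "r:gz") as tar:
    print(oct(tar.getmember("backup.sh").mode))  # e.g. 0o755
```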
Because gzip and bz2 exist. 7z is almost always a plugin, an add-on, or an extra application, while the first two work out of the box pretty much everywhere. It also depends on frequency of access, frequency of additions, size, type of data, etc. If you have an archive you frequently need to add new files to, 7z is going to start grating on you with the compression times. But it’s fine if you’re going to extract very frequently from an archive that will never change. gz and bz2 are the overall “good enough for every use case” formats.
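To show the “out of the box” bit: both gzip and bz2 ship in the Python stdlib on every platform, while 7z needs a third-party package (py7zr, for instance). Toy data, just to illustrate:

```python
import bz2
import gzip

data = b"the same log line repeated\n" * 10_000

gz = gzip.compress(data)  # fast, decent ratio
bz = bz2.compress(data)   # slower, often tighter on repetitive text
print(f"raw {len(data)}  gzip {len(gz)}  bz2 {len(bz)}")
```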
Just reserve your dislike for the ones still doing .bin, .img, and .cue.
A 1:1 copy of the bits on the disc is a valid option that some people prefer, especially if you want to burn your own physical disc or make compressed files encoded in a very specific way. It’s also the most reliable way to archive a disc for long-term storage.
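If you go that route, a small sketch of the long-term-storage angle: hash the raw image once at archive time and re-hash later to catch bit rot (“game.bin” is a placeholder path):

```python
import hashlib

def sha256_of(path: str, chunk: int = 1024 * 1024) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so even a full-disc image never loads at once.
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

print(sha256_of("game.bin"))  # store the digest alongside the image
```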