I used 7-Zip to compress a 163MB folder. The folder only has photos, so I chose:
Compression level: [b:2mkuoqn3]Normal[/b:2mkuoqn3]
Compression method: LZMA
After compression:
It's 157MB.
Now this is what I don't understand: I also have tried the:
Compression level: [b:2mkuoqn3]Ultra[/b:2mkuoqn3]
Compression method: LZMA
After compression:
It's 157MB.
Does it mean Normal = Ultra = 157MB?
Can someone test the two compression levels on pictures?
Hi Joel,
What you're encountering is actually quite a common situation: images are already compressed, and it is difficult to compress them further.
You didn't mention what type of images they are (JPG, TIF, PNG), but unless you've specifically saved them in an uncompressed format, the images are already compressed. JPG, TIF, PNG, etc. all use varying compression algorithms, and files like these typically don't compress much further.
If you were to take different files (text, pdfs, etc) and compress them you would find that they compressed much better. Database files are awesome candidates for compression.
In short, 163 MB of images = 157 MB of zipped images. Already-compressed files can't be compressed much further.
For more info: [url:1wjuht6n]http://en.wikipedia.org/wiki/Image_compression[/url:1wjuht6n]
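You can see this for yourself with a quick test. This is just an illustrative sketch using Python's zlib (not 7-Zip's LZMA, but the principle is the same): random bytes stand in for already-compressed JPEG data, which barely shrinks, while repetitive text shrinks dramatically.

```python
import os
import zlib

# Random bytes simulate already-compressed data (JPEG payloads look
# statistically random), while repeated text is highly compressible.
random_data = os.urandom(100_000)              # stand-in for JPEG data
text_data = b"the quick brown fox " * 5_000    # 100,000 bytes of text

random_ratio = len(zlib.compress(random_data, 9)) / len(random_data)
text_ratio = len(zlib.compress(text_data, 9)) / len(text_data)

print(f"random / 'JPEG-like' data: {random_ratio:.3f} of original size")
print(f"repetitive text:           {text_ratio:.3f} of original size")
```

The "compressed" random data ends up essentially the same size (or slightly larger), while the text collapses to a tiny fraction of its original size, which mirrors your 163 MB to 157 MB result.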
Just adding my 2 cents worth here, but what I find silly are those who upload a torrent file (TV show, using P2P) that's broken down into, say, 20 compressed files with the .rar extension. Torrent contents are already in a compressed format, so it actually speaks volumes of ignorance on those who think they are helping others. Personally I avoid them, as a compressed file can contain more than is advertised, Mindblower!
"For the needy, not the greedy"
SFX is simply a self extracting archive. It's an EXE file so someone without unzipping software can extract files from an archive.
[url=http://en.wikipedia.org/wiki/Self-extracting_archive:3oybmv87]Wikipedia - SFX Archive[/url:3oybmv87]
[url=http://en.wikipedia.org/wiki/Dictionary_coder:3oybmv87]Dictionary Encoding[/url:3oybmv87] is a type of compression that searches for known words or phrases and can substitute a memory reference in their place when compressing. A memory reference is much smaller than the actual text. The dictionary size simply refers to how large the dictionary (list of known words or phrases) can be within the compression technique.
Word size refers to how large a word or phrase can be in the dictionary.
Solid Block is a way of grouping files so they are compressed together as one continuous stream. The size is the maximum size of these blocks.
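To make the dictionary idea concrete, here's a toy sketch (nothing like LZMA's real implementation, which works on byte sequences rather than words): known phrases are swapped for tiny index tokens, and the result is shorter because a reference is smaller than the text it stands for.

```python
# Toy dictionary coder: replace known phrases with a short token
# ("\x00<index>\x00") that references the phrase's slot in the dictionary.
def dict_encode(text: str, phrases: list[str]) -> str:
    for i, phrase in enumerate(phrases):
        text = text.replace(phrase, f"\x00{i}\x00")
    return text

def dict_decode(text: str, phrases: list[str]) -> str:
    for i, phrase in enumerate(phrases):
        text = text.replace(f"\x00{i}\x00", phrase)
    return text

phrases = ["compression", "dictionary"]        # the "dictionary"
original = "dictionary compression uses a dictionary of phrases"
encoded = dict_encode(original, phrases)

print(len(original), "->", len(encoded))       # prints: 51 -> 29
assert dict_decode(encoded, phrases) == original
```

A larger dictionary size lets the compressor remember more (and longer) repeated data, which is why raising it can improve ratios on big, repetitive files but does nothing for data with no repetition.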
--Zig
Hey Joel - I'm a bit late with this response.....sorry about that. I just found out that FILEminimizer Pictures has now gone freeware!
If you are looking to compress images in particular, to help save space, then you cannot go past FILEminimizer Pictures. This freeware is dedicated solely to image compression and will help you gain maximum compression with minimum quality loss.....it is [i:2spn7j0r]the[/i:2spn7j0r] best.
Check it out further [url=http://www.balesio.com/fileminimizerpictures/eng/index.php:2spn7j0r]HERE.[/url:2spn7j0r]
Cheers.....Jim
[quote="Mindblower":aw8ie7ob]Just adding my 2 cents worth here, but what I find silly are those who upload a torrent file (tv show , using p2p), and it's broken down into say 20 compressed files, with the rar extension. Torrent files are already in compressed format, so it's actually speaks volumes of ignorance on those who think they are helping others. Personally I avoid them, as a compressed file can contain more than is advertised, Mindblower![/quote:aw8ie7ob]
This originated on UseNet and is very useful when you think about it. Not all connections are solid, and uploads/downloads can corrupt. When this happens, the corruption is confined to a smaller subset of the file instead of the entire file. Of course, UseNet uploaders usually include parity files so any missing or corrupt files can be easily rebuilt.
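The parity trick is worth a quick illustration. This is a minimal sketch of the idea behind those recovery files (real PAR2 uses much more capable Reed-Solomon codes, not plain XOR): XOR-ing all the blocks together gives a parity block that can rebuild any single missing block from the survivors.

```python
from functools import reduce

def xor_blocks(a: bytes, b: bytes) -> bytes:
    # Byte-wise XOR of two equal-sized blocks.
    return bytes(x ^ y for x, y in zip(a, b))

blocks = [b"AAAA", b"BBBB", b"CCCC"]    # equal-sized data blocks
parity = reduce(xor_blocks, blocks)     # uploaded alongside the data

# Suppose block 1 (b"BBBB") is lost in transit: XOR the surviving
# blocks together with the parity block to rebuild it.
survivors = [blocks[0], blocks[2]]
rebuilt = reduce(xor_blocks, survivors, parity)
print(rebuilt)                          # prints: b'BBBB'
```

That's why splitting an upload into many small blocks plus parity works so well on UseNet: a corrupt part doesn't force a full re-download, it just gets rebuilt.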
[quote="Ozbloke":aw8ie7ob]If you are looking to compress images in particular, to help save space, then you cannot go past FILEminimizer Pictures. This freeware is dedicated solely to image compression and will help you gain maximum compression with minimum quality loss.[/quote:aw8ie7ob]
If I remember correctly from past investigation, programs like this actually search for, and remove, embedded content like Exif data, preview images, etc., while retaining the full actual image.
[quote="Mindblower":32i2b1p7]Dave, am I correct in believing you agree that torrent files should be broken down into RAR or ZIP, Mindblower![/quote:32i2b1p7]
Well, I was just pointing out that the practice originated with the UseNet crowd and serves them well. There are obvious infrastructure differences between UseNet and Torrents and the P2P model. Files broken down into smaller blocks with parity is extremely effective on UseNet. The P2P model is more robust than UseNet, but a little error correction and recovery never hurts.
I see your point. The biggest obstacle when d/l'ing these packets is NOT reaching 100%. Some players, like GOM, can skip over sections, so one can still view the bulk of what was d/l'd. Is there any way to reconstruct a file when the packet is not completed? That's the hurdle, Mindblower!
"For the needy, not the greedy"