lossless compression of still images - recommendations?

Dave Stevens geek at uniserve.com
Sun Feb 19 23:51:21 UTC 2017


On Sun, 19 Feb 2017 23:37:49 +0100
Ralf Mardorf <silver.bullet at zoho.com> wrote:

> Data compression might be useless if the data is needed for active
> work and not just for archiving.
> 
> 17,280 photos a day = every 5 seconds one photo = 6,307,200
> photos/year
> 
> Assuming each photo has a size of about 10 MiB, that would be
> 63,072,000 MiB / 1024 = 61,594 GiB
>                         61,594 GiB / 1024 ≈ 60 TiB
> 
> That is a lot of HDD space. Even assuming such an amount of data can
> be compressed substantially, another issue arises.
> 
> How long does it take
> 
>   1. to compress the data?
> 
> and much more important
> 
>   2. to extract the data for usage?
> 
>   3. What amount of uncompressed data is required for the work?
> 
> IOW it might turn out that working with the data requires a large
> amount of it in uncompressed form anyway, and compressing and
> extracting the data could take a very long time.
> 
> Regards,
> Ralf
> 
> 

That's pretty much the size analysis I did, Ralf, except with an image
size closer to 2.6 MiB. The device produces .jpgs, and so far that's
all the testing I've done. I can tell you that running 1750 images
(about a tenth of a day's worth) through ffmpeg gives 149 MiB of .mp4,
so there's some serious compression going on there (having started
with about 4 GiB).
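
FWIW the ffmpeg step is just stringing the JPEGs together into a
video; something along these lines will reproduce that kind of test
(the glob pattern, framerate and quality setting below are
illustrative, not necessarily what I ran):

  ffmpeg -framerate 10 -pattern_type glob -i '*.jpg' \
         -c:v libx264 -crf 23 day-part.mp4

Bear in mind that x264 at its default settings is lossy, so it's not a
fair comparison for this thread; a lossless encode (e.g. -c:v libx264
-qp 0, or FFV1 in an MKV container) will come out considerably larger.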

The time taken for compression and extraction is not a critical
factor. Trying the previous suggestion of tar+bz2 gave a small
reduction in file size and took a lot longer than tar.gz (by a factor
of about ten), so it's really not very worthwhile. I'll try some other
compression options.
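
Other options worth a try (directory and file names below are just
placeholders) would be xz via GNU tar:

  tar -cJf photos.tar.xz photos/

or piping through xz to pick the compression level explicitly:

  tar -cf - photos/ | xz -9 > photos.tar.xz

zstd might also be worth a look if speed ends up mattering. That said,
.jpg files are already entropy-coded internally, so general-purpose
compressors tend to shave off only a few percent, which would explain
the poor showing of bz2 and gz here.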

D


-- 
In modern fantasy (literary or governmental), killing people is the
usual solution to the so-called war between good and evil. My books are
not conceived in terms of such a war, and offer no simple answers to
simplistic questions.

----- Ursula Le Guin



