Linked by Thom Holwerda on Sun 16th Sep 2007 12:59 UTC, submitted by anonymous
Graphics, User Interfaces "The HTML file that contains all the text for this article is about 25000 bytes. That's less than one of the image files that was also downloaded when you selected this page. Since image files typically are larger than text files and since web pages often contain many images that are transmitted across connections that can be slow, it's helpful to have a way to represent images in a compact format. In this article, we'll see how a JPEG file represents an image using a fraction of the computer storage that might be expected. We'll also look at some of the mathematics behind the newer JPEG 2000 standard."
Thread beginning with comment 271759
ValiSystem Member since:
2006-02-28

Very strange. To me, JPEG 2000 is the grail for huge images, and it needs no external indexing because of its tiled, multi-resolution format design.

To give an idea: you can extract a small portion of a 200,000x200,000-pixel compressed file, with in-place resizing, faster than from a flat uncompressed file when you downscale heavily, and on cheap hardware (CPU and memory). This is simply wonderful, though it's mainly used by cartographers.
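The "extract a small portion" trick works because a tiled format only has to decode the tiles that intersect the requested window. A minimal sketch of that arithmetic in Python (the tile size and function name are illustrative, not taken from any real JPEG 2000 codec):

```python
# Sketch: which tiles of a giant tiled image cover a requested window?
# Tile size and helper are illustrative, not from a real JPEG 2000 library.

def tiles_for_window(x0, y0, w, h, tile=1024):
    """Return (col, row) indices of every tile intersecting the window."""
    first_col, last_col = x0 // tile, (x0 + w - 1) // tile
    first_row, last_row = y0 // tile, (y0 + h - 1) // tile
    return [(c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

# A 512x512 crop out of a 200,000 x 200,000 image touches at most 4 tiles,
# so the decoder can skip the other ~38,000 tiles entirely.
needed = tiles_for_window(99_900, 99_900, 512, 512)
print(len(needed))  # 4
```

A flat uncompressed file offers the same random access, but every byte of the crop still has to come off disk at full resolution; the tiled, multi-resolution layout lets you decode the crop directly at the scale you need.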

A format like JPEG 2000 allows completely new approaches to old problems. For example, we're starting to see heterogeneous screen DPI, from 75 to 130, and I wouldn't be surprised to see 200+ DPI screens in the coming years. Think about what that does to images on web pages: on high-DPI screens images become too small (or have to be overzoomed), and on low-DPI screens they're too big (or need heavy resizing). With some engineering, JPEG 2000 lets you stop the download once you have enough data: the format lets you build an overview of the image from just the beginning of the file. So a low-DPI screen might download only 12 kB to display a well-fitted image, while a high-DPI screen downloads 50 kB of the same file.
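That "stop the download early" behaviour comes from resolution scalability: wavelet coefficients are ordered coarse-to-fine, so a prefix of the stream already contains a complete low-resolution image. A toy 1D Haar illustration of the principle (pure Python; real JPEG 2000 uses 2D wavelets plus entropy coding, so this is only a sketch):

```python
# Toy illustration of resolution scalability with a 1D Haar transform.
# A client that stops after the coarse half still has a valid, smaller image;
# a client that keeps reading reconstructs the full-resolution original.

def haar_forward(signal):
    """One Haar level: return (coarse averages, detail differences)."""
    coarse = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return coarse, detail

def haar_inverse(coarse, detail):
    """Undo one Haar level exactly."""
    out = []
    for a, d in zip(coarse, detail):
        out += [a + d, a - d]
    return out

row = [10, 12, 14, 80, 82, 84, 20, 22]
coarse, detail = haar_forward(row)

# "Low-DPI client": stop after the coarse half -- a usable half-size row.
print(coarse)                               # [11.0, 47.0, 83.0, 21.0]
# "High-DPI client": read the details too and recover the row exactly.
print(haar_inverse(coarse, detail) == row)  # True
```

In the real codestream the coarse coefficients of every tile are grouped toward the front (depending on the progression order), which is exactly what makes a truncated download render as a clean overview rather than a corrupt file.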

If this format weren't crippled by patents, it would already be the de facto standard for high-definition images.

Reply Parent Score: 3

bnolsen Member since:
2006-01-06

I work with photogrammetry and GIS.
JPEG 2000 doesn't allow embedded indexing of the tiles; a separate "index file" has to be generated.

Additionally, at even modest compression ratios (~5:1 or better), the tile boundaries become clearly visible unless custom smoothing is applied across them after decompression and image formation.
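The post-decompression smoothing described here can be as simple as a linear crossfade over a small overlap strip at each seam. A minimal 1D sketch (the overlap width and function are my own illustrative assumptions, not a specific GIS pipeline):

```python
# Sketch of seam smoothing: crossfade two adjacent decoded tiles across a
# small overlap strip. Illustrative only -- real pipelines work in 2D and
# choose the overlap width to match the visible blocking artifacts.

def blend_seam(left_tile, right_tile, overlap):
    """Join two 1D tiles, linearly crossfading over `overlap` samples."""
    body_l = left_tile[:-overlap]
    body_r = right_tile[overlap:]
    seam = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)          # ramps from ~0 toward 1
        seam.append((1 - w) * left_tile[len(left_tile) - overlap + i]
                    + w * right_tile[i])
    return body_l + seam + body_r

# Two tiles that decoded to slightly different flat levels (a visible seam):
left, right = [100.0] * 8, [90.0] * 8
joined = blend_seam(left, right, overlap=4)
# The 100 -> 90 step is now spread over 4 samples instead of one hard edge.
print(joined)
```

This hides the discontinuity but invents pixel values near the seam, which is exactly why it counts as an extra correction step rather than something the codec gives you for free.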

The SPIHT family of wavelet compression schemes is superior to what JPEG 2000 uses. The problem is that SPIHT is a patent minefield, and in its original form it is prohibitively expensive in both CPU and memory.

JPEG 2000 is an improvement time-wise but not complexity-wise: the bit-encoding scheme is very complex, and a clean-room implementation of it is very difficult.

BCWT is pretty revolutionary and very clean. Implementation is easy, the algorithm is small, and BCWT outperforms JPEG 2000 at the same compression rates WITHOUT doing any entropy coding of the output stream.

I don't know what the patent situation is with BCWT, but today it would be a great place to start from, moving ahead to fix the problems JPEG 2000 has (the container issues).

Reply Parent Score: 2