November 14, 2018
Most direct imaging machines are driven by bitmaps. An exposure head containing thousands or millions of tiny beams scans over the substrate. As it moves, a bitmap is used to turn each beam on and off. The result is the desired image patterned onto the substrate.
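As a sketch of this driving scheme, here is how a row of on/off pixel values might be packed into bitmap bytes. This is illustrative only: the function name is mine, and MSB-first bit order is an assumption (real machine formats vary).

```python
def pack_row(pixels):
    """Pack a row of on/off pixel values into bytes, MSB first (assumed order)."""
    out = bytearray()
    for i in range(0, len(pixels), 8):
        chunk = pixels[i:i + 8]
        b = 0
        for bit in chunk:
            b = (b << 1) | (1 if bit else 0)
        b <<= 8 - len(chunk)   # pad a trailing partial byte on the right
        out.append(b)
    return bytes(out)

print(pack_row([1, 0, 1, 0, 1, 0, 1, 0]).hex())   # → aa
```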
If you want to image a line with any accuracy, you need to place the beam centers accurately. For example, if you want to image a 10 um wide line and hold a line width tolerance of 10% (i.e. 1 um), then you need to control each beam "shot" to within 1 um.
This means you need a bitmap whose pixels are spaced 1 um apart.
So how many pixels do we need in total if the substrate is 500 x 500 mm?
(500 mm x 1000 pixels/mm) x (500 mm x 1000 pixels/mm) = 250E9 pixels.
Since we can pack 8 pixels in a byte, the number of bytes needed:
(250E9 pixels) / (8 pixels/byte) = 31.2 GByte
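The arithmetic above can be parameterized in a few lines. This is just a sketch of the calculation; the function name is mine.

```python
def bitmap_bytes(substrate_mm: float, pixel_um: float) -> float:
    """Uncompressed size of a 1-bit bitmap covering a square substrate."""
    pixels_per_side = substrate_mm * (1000.0 / pixel_um)  # pixels along one edge
    return pixels_per_side ** 2 / 8.0                     # 8 pixels per byte

print(bitmap_bytes(500, 1.0) / 1e9)   # → 31.25 (GBytes at 1 um pixels)
print(bitmap_bytes(500, 0.4) / 1e9)   # → 195.3125 (GBytes at 0.4 um pixels)
```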
Clearly we can't put this many pixels into a TIFF file, because the maximum file size supported by TIFF is 4 GB.
If we compress the bitmap before storing it as TIFF, we can expect about a 10X reduction in file size. So our 31.2 GBytes is reduced to 3.12 GBytes, which is clearly under the 4 GB limit.
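The kind of compression in question here is run-length encoding; TIFF's PackBits scheme (mentioned again at the end of this note) is a simple byte-oriented RLE. Below is a minimal sketch of an encoder and decoder, not the production codec, with function names of my choosing. Sparse rows compress far better than 10X; 10X is a reasonable average for real artwork.

```python
def packbits_encode(data: bytes) -> bytes:
    """Minimal PackBits-style run-length encoder (sketch, not optimal)."""
    out = bytearray()
    i, n = 0, len(data)
    while i < n:
        run = 1
        while i + run < n and run < 128 and data[i + run] == data[i]:
            run += 1
        if run >= 2:
            out.append(257 - run)       # header 257-n: repeat next byte n times
            out.append(data[i])
            i += run
        else:
            start = i
            i += 1
            while i < n and i - start < 128 and not (i + 1 < n and data[i] == data[i + 1]):
                i += 1
            out.append(i - start - 1)   # header n-1: copy next n bytes literally
            out.extend(data[start:i])
    return bytes(out)

def packbits_decode(enc: bytes) -> bytes:
    """Decoder for round-trip checking."""
    out = bytearray()
    i = 0
    while i < len(enc):
        h = enc[i]; i += 1
        if h < 128:                     # literal: copy h+1 bytes
            out += enc[i:i + h + 1]; i += h + 1
        elif h > 128:                   # repeat: next byte 257-h times
            out += bytes([enc[i]]) * (257 - h); i += 1
        # h == 128 is a no-op
    return bytes(out)

blank_row = bytes(5000)                 # an all-zero scanline
print(len(packbits_encode(blank_row)))  # → 80 (5000 bytes down to 80)
```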
Finer and Finer
Technology doesn't stand still and soon we will have a request for finer lines - say 4 um. To maintain the same line tolerance we now need a pixel size of 0.4 um. How many bytes do we have to deal with for the same 500 x 500 mm substrate?
(500 mm x 2500 pixels/mm) x (500 mm x 2500 pixels/mm) = 1560E9 pixels.
The byte size is:
(1560E9 pixels) / (8 pixels/byte) = 195 GByte
Even after a 10X compression the resulting 19.5 GBytes can't be put into a TIFF file.
I am not even going to compute the byte size for a 0.25 um pixel. (OK, I did: we get 4000E9 pixels, representing 500 GBytes; after compression we are still at 50 GBytes.)
The TIFF header uses 4-byte (32-bit) offsets. An unsigned 32-bit offset can address at most ~4 GBytes, and that is what limits the file size.
Why hasn't TIFF been updated to support larger files? First, TIFF is not controlled by a standards-setting body -- it's owned by Adobe. Second, the last official revision of TIFF is v6.0, released in 1992. Yes, 1992. Reviewing my notes, most desktop computers had about 2-4 MBytes of RAM at that time, so a 4 GByte limit seemed to have plenty of headroom.
Since it is unlikely that Adobe will be extending TIFF officially any time soon, others have taken it upon themselves to address the limit.
Since TIFF has been in use for so long and there are many libraries in use, the plan is to make only the few changes necessary to support larger files. This should minimize the work required to upgrade readers and writers to support BigTIFF.
Here is a summary of the changes proposed for BigTIFF (from the web page BigTIFF Design):
| Field | TIFF | BigTIFF | Comments |
|---|---|---|---|
| Version | 42 | 43 | So a reader knows it is BigTIFF |
| Offset to First IFD | 4 bytes | 8 bytes | |
| Datatype | LONG | LONG8 | New datatype for BigTIFF |
| Strip and Tile Offsets | LONG | LONG/LONG8 | Can now use longer offsets to support larger file sizes |
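These changes show up directly in the file header. A sketch of the BigTIFF header layout, as described on the Aware Systems page:

```python
import struct

# BigTIFF header (16 bytes): byte order mark, version 43, the size of
# offsets in bytes (always 8), a reserved word that must be 0, and then
# an 8-byte offset to the first IFD.
header = struct.pack("<2sHHHQ", b"II", 43, 8, 0, 16)  # IFD right after header
order, version, off_size, reserved, first_ifd = struct.unpack("<2sHHHQ", header)
assert version == 43 and off_size == 8 and reserved == 0

# 64-bit offsets raise the addressable limit from 2**32 to 2**64 bytes.
print(struct.calcsize("<2sHHHQ"))   # → 16
```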
We also want to thank Aware Systems for their excellent description of BigTIFF.
Artwork does not provide a BigTIFF output at this time. Many of our customers use our raw bitmap output (no header), which has no file size limit because there are no headers or associated offsets.
However, if compression is desired and a standard TIFF with PackBits would exceed 4 GB, we are prepared to offer a BigTIFF alternative.