Saturday, February 13, 2010

The Myth of Graphic Perfection

From time to time in the course of my printshop work day, I find myself conversing with pressmen. They are, in most cases, a rare breed of crusty individuals who, much like movie theatre projectionists, IT specialists and auto mechanics, believe themselves to be the keepers of special, arcane, under-appreciated knowledge. The world at large just doesn't understand them, and things in general just aren't as good as they used to be. Just ask 'em, they'll tell ya. Every damn day.

Anyway, yesterday's conversation centered around digital printing and poly-plates versus traditional metal plates, and their relative accuracy. This press guy says to me, "Do me a favor... sometime, when you've got time*, I want you to make a square, 8" x 8", and print it out on one of those copiers up front, and then measure it to see how far off it is. I guarantee you the image will be distorted." I reply, "Yeah, I know... if I output that same image on 4 different printers, and the plate-setter, I'll get 5 different results, because none of them are 100% accurate, and they're never calibrated the same way at the same time."

"Exactly," he says, "people think them damn computers are perfect, but they can't even make a perfect square." I take a breath, count quickly to 10 and say "well, the computer is fine, it can make a perfect square, it's the output devices that can't accurately reproduce that perfect digital information." He takes a quarter of a second to not even attempt to process the fact that I've presented him with a contradictory viewpoint and says "that's what I mean, you never have that problem with negative to metal plate, it comes out the same way every time." He then walks back to his press, confident that he has just imparted an important piece of lithography gospel to the foolish young upstart who runs that pre-press department. Never mind the fact that I'm 46 years old, and I've worked in printing for a quarter of a century, and I've probably shot, developed and stripped more film negatives than he has, and the fact that he's barely 10 years older than me, and... well, just never mind. This guy never wants to have a real discussion, he just wants me to hold still while he explains to me that I don't know things that I actually do know.

Here's where the pressman and I agree:

He's right about the fact that a perfect square (or a circle, or a trapezoid, or any font, or the half-tone dots of any continuous tone photo) will almost never come out of a printer or plate-setter 100% accurate. The image will stretch, sometimes vertically, sometimes horizontally. These are machines, with hundreds of moving parts, and they do wear down and need to be re-calibrated on a regular basis. The software will always send accurate information to the output device, but due to wear and tear, differences in stock thickness and even room temperature, the device will probably not reproduce the same image the same way two days in a row. This is a solid fact, and I will never dispute it.

Here's where the pressman is just plain wrong:

The problem with this guy's bullshit opinion is his insistence that the above listed inaccuracies don't extend to more traditional methods of image reproduction. They do. In traditional offset lithography, an image is transferred onto a piece of negative film (the black parts are clear, and the white parts are black). That film is then taped to a paper or vinyl sheet by a stripper (yes, that's what they're called, stop giggling). Then that stripped negative is placed over an emulsion-coated metal plate and exposed to light. Then the plate is developed, and it's time to fire up the press.

Nowadays, that film negative is generated by sending a document from a computer to a film output device, which uses a laser to expose the image onto the film one line at a time (industry standard is somewhere in the 133 to 150 lines per inch range). The problem here is, it's just another instance of a computer sending digital information to an output device, and all the potential problems covered a couple of paragraphs back still apply. On top of that, the exposed film has to be sent through a processor, where it is developed, fixed and washed, like any other piece of photographic film. The developer and fixer chemicals slowly lose potency over time, and can be radically affected by temperature and by human error.

The pre-computer version of this process is even more troublesome. Back in the day, when I and that pesky pressman learned our stuff, film was developed in a darkroom, by hand. You stuck a finger in the developer to see if it was warm enough, and development time was usually determined by watching the film develop and checking your wristwatch. To make matters worse, the camera that was used to photograph the original graphic image was a behemoth of a machine, and image size was controlled by moving the artwork closer to or farther away from the camera lens. This was achieved by turning two cranks with both hands while staring at a small piece of tape with percentages from 1% to maybe 500% printed on it, as it whirled by. The camera operator rested his head against the camera, turned the cranks, and stopped when the tape reached the desired percentage. This means that image size could vary depending on the size of the camera operator's head, as it affected the angle at which the percentage tape was viewed. I'm not making this up. This is how we used to do it. 100% wasn't even exactly the same from minute to minute, on the same machine with the same operator.

And I haven't even mentioned the possibility of plates stretching once you put them on the press. Don't worry, I won't give you the details, just believe me when I say that this can happen.

What the hell does this all mean at the end of the day?

What it means is this: ultimate perfection in the graphic arts is a myth. It's not possible. It can't be achieved. The best you can hope for is to reach a level of attempted perfection that exceeds the human eye's range of perception. At 150 lines per inch, the edges of a curved line do indeed look perfect to the naked eye, but magnify that printed image past what 150 lines per inch can hide, and that curve will begin to break up, and you'll see the jagged edges of all the little squares the computer used to make it. A billboard along the highway looks perfect to the naked eye as you drive by at 65 mph, but if you were crazy enough to pull over and climb up the ladder to stand right in front of that billboard, you'd see jagged edges in every curve and half-tone dots the size of silver dollars. To put it another way, if you stare too closely at a sausage, sooner or later you're going to see what it's made of.

So why does most printing look "perfect" to most people? People see what they want to see. As your eyes scan across the printed page, they process the information as quickly as possible. As long as the imperfections are smaller than your ability to detect them, your brain will satisfy itself that it has seen a perfect image. It's a trick. An optical illusion. And as long as the information is conveyed from the page to your brain, it's good enough.
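If you want some rough numbers behind that, here's a back-of-the-envelope sketch; the viewing distances, the one-arcminute acuity figure and the billboard screen ruling are assumed values for illustration, not anything measured in the shop. At 150 lines per inch, each halftone cell is 1/150 of an inch, roughly 0.007", or about 0.17 mm across. A healthy eye resolves detail down to about one minute of arc, which at a reading distance of 12 to 18 inches works out to roughly 0.0035" to 0.005". So the dots sit right around that threshold, and since they're soft, low-contrast spots of ink rather than hard black gaps, they smear together into what your brain registers as a smooth curve or a continuous tone. The billboard pulls the same trick at the other extreme: screened at, say, a coarse 15 lines per inch, its dots are nearly 2 mm across, but from 500 feet away each one subtends only a few hundredths of an arcminute, far below anything you can pick out at 65 mph.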

Would this argument convince the pressman of anything?

Of course not. He doesn't need to actually be right, he just needs to maintain his personal belief that he is right. Much like the myth of perfection in graphics, he only needs to process information from the physical world at a safe distance that allows him to see exactly what he wants to see. In fact, he wouldn't even sit still long enough to listen to this whole diatribe. He's already moved on to telling you that he's in his fifties, was the youngest ever certified auto mechanic in the state, spent some time in the armed forces, and has somehow also spent over 40 years in the printing business. And he bagged a 14 point buck last fall. And his secret biscuit recipe is the best in the world. Just ask him. He'll tell ya. Every damn day.

* This will never happen, and he knows it.

1 comment:

  1. Did I work with you or does every shop have a "gentleman" like the one you described? Oh wait, my guy couldn't bake biscuits...

    Great post!
