An I For An Eye


I’m having the video equivalent of what serious audiophiles must have gone through when digital recording appeared in the Eighties. I can barely believe I used the term “video,” but that’s the age we live in. Sometimes it’s all too digital.

We bought each other a holiday gift this year, a new tv to replace the ten-plus-year-old one. Back in vinyl days, all you had to worry about on your stereo was whether the left channel was connected to the left speaker: you uncrated and assembled in half an hour and thought you were MacGyver. But today’s gear is so complicated that rather than bang our heads in frustration, we just call in “Agents” from the Geek Squad — Best Buy is literally across the street from us, maybe 100 paces away. First we got a consultation visit. All our stuff still worked, but we were wondering if we weren’t missing out on some tech developments over the last decade, and on a few components we were. (Not everything new is necessarily good. We were actually warned against buying a set that could play 3-D; evidently that’s the Google Glass of home video just now.)

When we installed our old tv a decade ago, the big new thing was high definition in broadcast, Blu-Ray in physical media. Most major network shows had only recently gone hi-def for a quantum leap in picture clarity: video-taped chat shows appeared to be coming through a window, and DPed film series like LOST were crisp and sharp, stone cold gorgeous. People my age can remember when they first had access to a color tv set: you’d uncritically watch anything just because it was in color. Same deal here. Hi-def was the bee’s knees.

Now they’ve ramped it up to “Ultra” HD, “4K” encoding. By coincidence we’ve caught this wave earlier in the cycle. The networks aren’t there yet, but Netflix is already streaming in 4K, and no big home video release arrives without an “Ultra HD” version (Criterion, continuing to represent the leading edge, now issues all its releases from 4K transfers). So, let’s give the new tube a spin!

The resolution is indeed immaculate: you can see pores on the anchorman’s face, a tiny drop of hot-light sweat from a talk-show guest that would have been undetectable before. For live or taped material it’s as if you were sitting in the control room with the director. Amazing. Then I put in a Blu-Ray of a film and the strangest thing happened: all of a sudden, I didn’t like the effect any more.

To my surprise, even images captured on celluloid and realized using an emulsion looked like studio-bound video tape. On old black-and-white pictures the effect can be refreshing, making them appear to be immaculately preserved. But everything else was somehow cheapened, as if we were screening videocam dailies rather than the full cinematic monty. Expensive visual effects looked awful, traveling mattes shimmering, CGI performers out of match with their real-world counterparts. Everything was this way. THE GODFATHER. THE WIZARD OF OZ. AKIRA KUROSAWA’S frickin DREAMS! Films I knew by heart looked as if their production budgets had been cut in half. Yes, resolution was indeed off the scale, but the bald, flat end result was plug ugly to me.

I had a week with the system before my Geek Squad agent returned to install a small piece of audio gear, and by then I was ready for some help. At first I had a little trouble explaining what was wrong, but then he said, “You mean everything looks like a soap opera.” Exactly! “That really annoys some people.” Count me in! “Easy fix.” Evidently it’s the set’s motion-smoothing feature, which interpolates extra frames to make movement look more fluid, and you can toggle it off. He did, and now the ultra-sharp image wasn’t quite as ultra-sharp as before, but a movie looked like a movie again. (I’ve noticed that an object which isn’t moving at all, like a framed photograph on a desk, can look a tad sharper than the rest of the scene.)

My reaction was interesting because it goes against the grain. My eyeballs have become desensitized enough that I actually prefer digital images. I first noticed it at the New York Film Festival two years ago, when P. T. Anderson made a big deal out of the fact that we were going to screen his new movie INHERENT VICE by actually running film through a projector. Goosebumps! Then the thing rolled and my heart sank, because the very imprecision that makes film film now reads as murkiness, deterioration of focus, mere proximity to the image I really wanted. The same sequence repeated this year with James Gray’s THE LOST CITY OF Z. Note that I’m not commenting here on the artistic quality of the work, only the physicality of the visual image as seen through my eyes. Celluloid projected at 24 frames per second can only approach perfection. Digital projection ensures focus, balance, and no deterioration whatsoever. (Also no reel-change dots: some people think that’s weird.)

I’m not saying digital is necessarily better. But it is what I’ve become accustomed to, what I expect. Hard-core stereo freaks had to clap their ears closed at compact discs when they first appeared, and even I could imagine a clipped, mechanical aspect to early full-digital recordings like TRICYCLE by Flim & the BB’s or Dire Straits’ BROTHERS IN ARMS. And why not? It’s the difference between the physical back-and-forth vibration of a record needle and the ja-or-nein precision of a stored byte, the way a string player creates vibrato versus the way a bunch of electronic cables are routed. But thirty-plus years later, digital audio sounds normal to me. It’s what I’ve become accustomed to. What I expect.

Filmmakers who still shoot on film are a dying breed. Anderson, Gray, Spielberg, Tarantino, Christopher Nolan, Judd Apatow — you can almost call the roll in your head. Even the Coens have thrown in the towel. Digital is just faster and cheaper (the current trendiness of handheld doesn’t hurt), making it the medium of choice for young DPs, and in a generation or two it’ll be hard to put together a “slow” celluloid crew at all. TV and indie crews tear through many times the script pages of a lumbering big-time feature, but do you have any tech complaints about, say, GAME OF THRONES?

Digital still looks great when projected, not at all like the “soap opera” on my badass monitor. I cannot recognize a digital shoot just by looking at it. Again and again I’ve been surprised by end credits or festival Q&As when it’s revealed. (Some productions even brag: it’s no longer uncommon to see “Captured in…” rather than “Filmed in…”) Vinyl-record devotees still maintain their purity, and someday film snobs will rage, rage against the dying of the light. But face it: we’ll still call them “films,” just as we still call them “albums,” which they haven’t been since the days of the 78rpm single. Things change — which I believe is also a comprehensively stated history of the universe.

One Response to An I For An Eye

  1. klhoughton says:

    “Also no reel-change dots: some people think that’s weird.”

    And we are firm in that opinion.

    “I cannot recognize a digital shoot just by looking at it.”

    But I’ll give you odds that, on those few occasions a radio station doesn’t play an mp3, you can tell.
