It seems everyone who has ever set foot in a movie theater has an opinion on Walter Murch’s take on stereoscopic cinema, quoted in Roger Ebert’s blog. I’m not interested in going toe to toe with Murch. He’s articulate, brilliant, and famous – and he hates 3D movies. I’m not nearly as articulate or brilliant, and I’ll never be famous – but I like 3D movies.
So what? Some people like them and some don’t. And some people get headaches watching them. No one spends more time thinking about, writing about, and talking about editing and how people see those edits than Walter Murch. I had been a professional editor for several years before I read Murch’s In the Blink of an Eye, and it was only then that I fully understood why cuts work. (Murch reasons that through blinking the brain “cuts” our visual stream into segments. Thus we actually think in cuts, so cutting is a natural way to present a series of clips.)
It’s a bit of a leap to equate blinking when we turn our head 15 or 20 degrees with a cut in the action that takes us from New York to London in 1/24th of a second, but the brain is adaptable. It makes it work. Murch’s theory on blinking and cuts has literally changed the way I look at the world.
It’s precisely that depth of reasoning that makes his current theory on 3D cinema’s shortcomings fail my sniff test.
The biggest problem with 3D, though, is the “convergence/focus” issue. A couple of the other issues — darkness and “smallness” — are at least theoretically solvable. But the deeper problem is that the audience must focus their eyes at the plane of the screen — say it is 80 feet away. This is constant no matter what.
But their eyes must converge at perhaps 10 feet away, then 60 feet, then 120 feet, and so on, depending on what the illusion is. So 3D films require us to focus at one distance and converge at another. And 600 million years of evolution has never presented this problem before. All living things with eyes have always focussed and converged at the same point.
Not really. Murch is only half right. Focusing is a physical act, but convergence is a brain trick – just like reframing with a blink. Think of the eye test where you move a pencil toward your face until you can no longer see a single pencil: you aren’t physically crossing your eyes to converge on it until it’s just inches away. Beyond a few feet, convergence isn’t physically demanding, so when you’re sitting 30 feet from the screen your eyes aren’t doing any physical work that competes with their attempts to focus.
Getting the brain to adapt to strange visual cues isn’t all that difficult. Back in 1896, George Stratton ran his famous perceptual-adaptation experiment, wearing glasses that inverted the image reaching his eyes so the world appeared upside down. Within a surprisingly short time, his brain compensated and the world looked right side up again. The whole 600-million-year focus-and-convergence theory doesn’t hold up.
Murch also complains of the strobing of 3D images.
I edited one 3D film back in the 1980s — “Captain Eo” — and also noticed that horizontal movement will strobe much sooner in 3D than it does in 2D. This was true then, and it is still true now. It has something to do with the amount of brain power dedicated to studying the edges of things. The more conscious we are of edges, the earlier strobing kicks in.
He’s right, but it’s nothing directors and editors haven’t faced before. I remember making the transition from SD to HD and noting that camera moves needed to be slowed down. Different visual problem, same solution. In a similar vein, as a television editor, going to the cinema after a day in the cutting room could be torturous. After a day of closely watching the world go by at 30 fps, a 24 fps film looks downright staccato – for the first 10 minutes. And then the brain adapts.
Just a couple of weeks ago I was in LA visiting some cutting rooms working in 3D. DPs’ and directors’ approaches to stereoscopic shooting have evolved — so much of that edginess has been softened. Lighting and angles have evolved as well. And just as we learned with Avatar, some of the best 3D is subtle 3D.
Though the jury is still out on 3D cinema, 3D television shows promise. Anyone who frequents live sporting events knows how flat a traditional 2D broadcast feels at home. At the stadium I know where the ball is going to land; I have depth perception. On TV I have to rely on Joe Buck to tell me where it’s going. 3D changes that – the viewer can see the play unfold, and Joe Buck can say less. That alone makes the case for 3D. As previously noted here, the Economist published a very good article on the topic nine months ago that still holds.
Murch may very well be right. 3D might fizzle, but we won’t need to look back 600 million years for the reason. More likely it will be the $3-per-ticket premium that buries it.