Friday, December 14, 2012

Me on Hobbits and High Frame Rates

Concerning Hobbits....

(These are my opinions and not those of my employer etc. etc....)




A while back, I wrote a post called "Me on 3D", where I explained the fear and uncertainty over whether Peter Jackson had made the right choice in deciding to go to 48 fps for the Hobbit.


The post is here, if you want to read it first.

So I had my whole family there, at opening night, and out of curiosity, I decided to try the HFR 3D version. Not because I actually expected to enjoy it, but as a... let's call it "technical study".

So in short - did my fears that High Frame Rate cinema looks shitty hold up?

I really wish I could yell a resounding Yes to that, but there are a few niggly parts that make me not do that. Make no mistake, though, it's still an 85-90% "Yes", but it is not the 110% doubt-free utter hate I had expected myself to feel. Which is very strange, and is the reason it took me so long to write this post.

So does the Hobbit at 48 FPS look like shitty soap opera?

Oddly - depends on who you ask!

My wife liked it. She thinks I'm being a luddite of some kind.

And to my surprise, my middle son, Oscar, also said he liked it. This is strange to me, because he was the kid who, when we were watching "2012" in theatres, even though he was just 12 years old at the time, leaned over to me during those few shots shot with a 360 degree shutter (something which is perceptually very similar to a high frame rate) and with NO prompting from me whispered "Dad, did you see that? The framerate was off!"

Oliver (youngest) had no opinion, and Victor (the oldest) was... uncertain.

And if you ask me?

Oh for sure, to me it looks like crap. There is absolutely no doubt there.

...which is partially the reason I don't know what to think. I had expected universal loathing, because to me it is so unquestionably awful that these other reactions confuse me... and seed.... doubt.


So what's my history with HFR?

As long as I've owned video cameras, I've always despised the "look", and it was only when digital video with DV cameras came onto the scene, and you could start to deinterlace the footage, that something happened.

I live in a PAL country, where video is normally shot at 50 "fields" per second, which, while not 50 whole "frames", kinda is for all practical purposes (each field is only every other line, but ignore that for now; the perceived temporal resolution is 50 images per second). And it looks like shite.

Then people realized that when you removed the interlace (effectively tossing out every other line) you also caused the image to be only 25 frames per second... and it looks way better. It stops looking like crappy video, and starts to look "cinematic".

Yes, "cinematic". An elusive word we will be re-visiting more in this post.

Soon, cameras had this as a feature: "Progressive". So instead of the insanity of two interlaced half-pictures, you had whole frames, at 25 fps. "25P" became a big selling point on video cameras. I paid almost $2000 for a video camera once, specifically so it would do "25P".



And it wasn't just me. As a matter of fact, every TV program (at least in Sweden) worth its salt that wanted to look "cool" immediately went to progressive mode, and 25 fps. Only news, sports, documentaries and other stuff that tries to "depict reality" stayed at 50i; everything dramatic and narrative was at 25P.

This has been going on for, what, 15+ years now in TV land?

All was good in the world until .... 

...some crazy person, in hopes of putting butts in seats, decided to yet again revive the recurring dud of "3D" in movies.

And 24P does have trouble with 3D. The "stroby" motion, which gives such wonderful surreal-looking action in 2D, plays all sorts of havoc with your spatial sense when blown out to 3D. So 3D in 24/25P barely works. This is known.
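
(Quick back-of-the-envelope numbers, if you want them: exposure time per frame is just the shutter angle divided by 360, divided by the frame rate. A 180 degree shutter at 24 fps leaves half of every frame interval unrecorded - that's the strobing - while a 360 degree shutter, like those "2012" shots Oscar spotted, covers the whole interval, which is a big part of why it reads like video. The 270 degrees at 48 fps line is just the commonly reported figure for how the Hobbit was shot, so take it as approximate.)

```python
# How shutter angle and frame rate set exposure time, and how much of each
# frame interval actually gets recorded. Gaps = stroby motion.
def exposure(fps, shutter_angle_deg):
    """Exposure time per frame, and the fraction of the frame interval it covers."""
    frame_interval = 1.0 / fps
    exposure_time = (shutter_angle_deg / 360.0) * frame_interval
    return exposure_time, exposure_time / frame_interval

for fps, angle, label in [
    (24, 180, "the classic film look"),
    (24, 360, "those '2012' shots"),
    (48, 270, "roughly how the Hobbit was reportedly shot"),
]:
    exp, cov = exposure(fps, angle)
    print(f"{fps:2d} fps @ {angle:3d} deg shutter: {exp * 1000:5.1f} ms exposure, "
          f"{cov:.0%} of the frame interval covered ({label})")
```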

So now people have tried to "fix" this with HFR (claiming it as something magically "new", while it's just back to the interlaced video look of 20 years ago - hardly "new" in any way, shape or form).

And sure, it does help 3D.

But.... what else does it do?

Why does HFR look Crappy?

This is the key question, the core of the poodle, as we say in Sweden. Why? What causes this? Or does nothing cause this?

There is a whole set of theories:


Theory #1: Familiarity / "Learned Response"

When I was a kid growing up, we had a lot of British TV. One interesting aspect of BBC TV at the time was that stuff shot indoors was shot to tape on huge tube-based video cameras as large as houses. Since these were pretty much immobile, anything shot outdoors was shot on film. This created a very odd sensation that, within the same TV show, the perceived framerate could shift between "indoor" and "outdoor" looks. And I remember even then thinking that "somehow, the indoor looks cheaper and crappier". I couldn't even articulate why, just that something was cheesy about it, and the outdoor was "cooler".

So there are people who claim that this is a "learned" thing, that we old farts have "learned" that "movies look like this, TV looks like this other thing", and then by repeated association connected one look with "quality" and one look with "crap".

I contest that, because I recall even at a super young age (as per above) pegging the 50i as cheesy looking, without knowing why. And within the same TV programme, which kinda makes the "by association" theory fly out the window, IMHO.

Yet the fact that some youngsters - and my wife - actually had nowhere near the amount of trouble watching this thing that I had... means it can't be as universal as I thought.

The people in the "learned response" camp also bring up things like "people were panning CDs in the beginning as bad sounding", or "people didn't like color film at first", or "talkies didn't take root immediately"... I don't believe that.

For example, CDs vs. vinyl... vinyl disks need massive compression (audio level compression here) or the needle would - literally - jump out of the track. But CDs were touted as "high dynamic range" and used very little compression in the beginning. Turns out, psycho-acoustically, we perceive more dynamic compression as "better" for some reason. Not a "learned response", but a psycho-acoustic fact. Today, the audiophile complaint is, ironically, that CDs are overcompressed instead (which they are, because the race in today's world is "to sound the loudest on the radio", and the more you compress, the higher you bring up the average sound pressure level, and the "louder" you sound. And the wimpier things like drums sound.)
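
(If you want to see that loudness effect in numbers, here's a little toy sketch - mine, with made-up "music", not real mastering code: squash the peaks, bring the whole thing back up to full scale, and the average level goes up while the drum hits get wimpier.)

```python
# Why heavier dynamic-range compression "sounds louder": squashing the peaks lets
# you raise the whole signal back to the same peak level, raising the average (RMS).
import numpy as np

def compress(signal, threshold=0.3, ratio=4.0):
    """Very crude downward compressor: reduce anything above the threshold."""
    out = signal.copy()
    over = np.abs(out) > threshold
    out[over] = np.sign(out[over]) * (threshold + (np.abs(out[over]) - threshold) / ratio)
    return out

def normalize_peak(signal, peak=1.0):
    """Bring the loudest sample back up to the target peak (mastering 'to full scale')."""
    return signal * (peak / np.max(np.abs(signal)))

# Fake "music": a quiet body with occasional loud drum-like transients.
body = 0.2 * np.sin(2 * np.pi * 220 * np.linspace(0, 1, 44100))
hits = np.zeros(44100)
hits[::4410] = 1.0
music = normalize_peak(body + hits)

loud = normalize_peak(compress(music))

rms = lambda x: np.sqrt(np.mean(x ** 2))
print(f"uncompressed RMS: {rms(music):.3f}")   # lower average level
print(f"compressed RMS:   {rms(loud):.3f}")    # same peak, higher average -> "louder"
```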



Theory #2: Idealized Movement theory

This is a theory I believe more in myself. Because in all the HFR examples I've seen (and that includes the HFR panel at SIGGRAPH 2012, where lots of different frame rates etc. were showcased - more on this at a later time) I noticed one very peculiar effect: HFR makes you a worse actor!

Yes, it is weird. Somehow, every mistake and every nuance of movement is suddenly present, and you can tell the tiny fidgety moves people do. The theory is that when we only "sample" the motion at 24 points every second, our brain fills in the intermediate movement with some "ideal", but when we actually have the HFR information, no such "motion smoothing" can occur in our brain, and we see the movements as jiggly and imprecise as they are, making them look more real but then also more like an actor acting.

Also, as many people have pointed out, HFR makes you see more detail. It has to do with how we perceive motion blur... not just that there is more motion blur in 25P content, but the fact that the brain actually adds blur in certain cases (long story, will explain later).

This causes you to see stuff you never saw... including... slight differences in color between prosthetics and skin, contact lenses being visible, how cheaply constructed some sets are, etc. etc. Even the ridiculousness of proportions somehow becomes more "obvious"... such as the dwarves in the Hobbit... never sold as real creatures, but as the over-exaggerated prosthetics they were - sadly.



Theory #3: The constant "Stopped Clock Illusion"

This is my personal theory, came up with it all meself' - so sue me.

Part of the reason I came up with this hare-brained theory is the repeated reports from people saying that they perceive HFR imagery as "sped up", as if they were watching something on a slight "fast forward", as if time were passing magically faster... while in fact it is not. And conversely, that the "cinematic" feel of 24P makes things seem "slower" in some... odd way.

What do I think?

Well, I think there is something inherent in the 24p rate that links to the speed of the saccadic movements our eyes make. Our brain is a freaky thing, and it invents almost everything you see on very flimsy foundations (the optic nerve simply doesn't have the bandwidth to give you as much visual information as you think you see - 'tis all an illusion made by the brain). If you look up the Stopped Clock Illusion you will find some interesting psycho-optical perceptual freakery that can make your head fall out your ear...

...so what if 24 frames per second happens to match the average length of a saccade, and the fact that we are fed a new still image every 24th of a second causes our brain to, effectively, be in a constant state of the "Stopped Clock Illusion"? Something that higher frame rates do not do.

The eye... is freaky. Our psycho-optical subsystem is... freakier. I had an interesting demo program once that scrolled some text. I happened - at the time - to own both a CRT monitor and an LCD monitor. The scrolling graphics were completely readable on the CRT, but a smeary mess on the LCD. This was not because the LCD was "slow"... it was due to the temporal characteristics of the two media.


Why? Well, it is similar to when I wrote my first raytracer, RayTracker, back in 1989. I had it render every 8th pixel first, then every 4th, then every 2nd, etc. If I did this with little dots on a black background, it looked much better than if I did this with blocks. The eye could interpolate and "assume" the missing information when on black, but when it was actual large blocks, it could not. The blocks were perceived as a much lower resolution than the dots. Psycho-optics at work!
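
(From memory - this is a paraphrase of the idea, not the actual 1989 RayTracker code - the two preview styles looked roughly like this: the same sparse set of samples either left as lone dots on black, or smeared out into big blocks.)

```python
# Two ways of showing the same sparse samples from a progressive render:
# isolated dots on black (eye fills in the gaps) vs. replicated blocks (reads as low-res).
import numpy as np

def preview_dots(image, step=8):
    """Keep every step-th pixel, leave the rest black."""
    out = np.zeros_like(image)
    out[::step, ::step] = image[::step, ::step]
    return out

def preview_blocks(image, step=8):
    """Replicate every step-th pixel over a step x step block."""
    samples = image[::step, ::step]
    return np.kron(samples, np.ones((step, step)))[:image.shape[0], :image.shape[1]]

# Toy "render": a smooth gradient standing in for a raytraced frame.
h, w = 256, 256
frame = np.linspace(0, 1, h * w).reshape(h, w)

dots = preview_dots(frame)      # sparse points; the eye interpolates the rest
blocks = preview_blocks(frame)  # big flat blocks; perceived as genuinely low resolution
print(dots.shape, blocks.shape)
```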


The same thing happens in time: a CRT basically "blinks" the image at you. The image exists for a super brief moment of time, and then fades. Whereas an LCD holds the image for a much longer time. The eye can sort of "interpolate" the movement in the case of the blinking, but can not in the case of the non-blinking LCD. The eye is trying to figure out the "ideal movement" based on the scrolling text, and the actual deviation from this (the text standing still for x milliseconds) is viewed as a discrepancy and hence a blur. Whereas the blink looked to the eye like a point-sample, and it could merrily interpolate the in-between information.
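
(A rough sketch of that, with made-up numbers: if your eye smoothly tracks the scrolling text, the image slides across your retina for as long as the display holds each frame. The smear is roughly scroll speed times hold time - a millisecond-ish phosphor flash on a CRT versus a full 1/60th of a second on a sample-and-hold LCD.)

```python
# Back-of-the-envelope retinal smear for a tracked, scrolling image:
# the longer the display holds each frame, the further the image slides on the retina.
def retinal_smear(scroll_speed_px_per_s, hold_time_s):
    return scroll_speed_px_per_s * hold_time_s

speed = 600                      # scrolling text, pixels per second (made-up figure)
crt_hold = 0.002                 # CRT phosphor flash: image visible a millisecond or two
lcd_hold = 1.0 / 60              # sample-and-hold LCD: image held for the full frame

print(f"CRT-ish smear: {retinal_smear(speed, crt_hold):.1f} px")   # ~1 px: text stays readable
print(f"LCD-ish smear: {retinal_smear(speed, lcd_hold):.1f} px")   # ~10 px: a smeary mess
```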



Or What?

I don't know.

Frankly, the fact that not everybody hates this crap is one thing that disturbs me. If my theories were right, everyone and their mother would see that, indeed, the emperor has no clothes, and 48 fps is just ugly-as-crap insanity. But they don't. Some people seem to... like it.

The second problem is that there were pieces of this movie where I didn't care. I.e. for certain segments, at certain... brief... moments... I found myself not caring. That disturbed me, because I was expecting to hate all of it with the fury of a thousand exploding suns. I didn't. There were whole stretches - sometimes up to almost a whole minute in length - where I wasn't annoyed out of my skull by the 48 fps. That smells of danger to me. Why was that? I bet the gold lies in knowing when it doesn't annoy you out of your skull.

I don't know.


Alas..... I ramble.


I'll shut up now.


Oh wait. The movie?


Could have lost about 40+ minutes, especially in the beginning. Some cool fanboy-serving bits. Gollum was nice. The lead CG goblin looked lead CG (Gollum much less so). Some action was stretched out beyond absurdity. But sure... it was viewable. Worth watching, even.

I'm just not completely sure it's necessary to watch 24 extra frames of it every second....


And the re-sizing of Gandalf was awesome and flawless. Wish I could say the same of the incessant color correction of his face....



/Z

ADDENDUM:

Stu Maschwitz is a very clever guy, and has a few posts on this topic that are worth reading w.r.t. this one: