… and we’re back. I realise I haven’t posted for a couple of weeks. This is down to a nasty bout of flu which laid me very low and generally made the world seem an iller place. This week I want to talk a bit about high frame rate technology in movies, and the storm of slightly odd criticism being thrown at The Hobbit. I realise no one is terribly interested in what I think about this, but hey, when does that ever stop the average blogger?

First a bit of background. Most movies update at 24 frames per second – this is pretty well-known. Yes, yes, yes, TV is often interlaced, but the complete image still updates 24 times per second, even if the lines are staggered. As far as I can tell, this dates back to the earliest days of cinema, when it was recognised as the lowest frame rate that gave a convincing illusion of movement rather than a sequence of still images. Like all the best standards, it’s persisted for an impressively long time.

Recently, though, a plethora of HD technologies has been foisted on an unsuspecting public. We’ve gone relatively rapidly from VHS to DVD to Blu-ray and 3D. The TV in my front room has eye-wateringly crisp definition and magnificent contrast and depth of colour. It’s a joy to watch, and the movies I play on it are incredibly well mastered and encoded.

All of which makes it a bit odd that, by default, it came with literally dozens of digital post-processing algorithms that try to reprocess poor-quality video input into, well, something better. At least, I assume that’s the intention, but in fact all the dynamic contrast enhancement, edge detection, smart vector deinterlacing and deblocking conspired to make my Blu-rays look frankly ludicrous. I spent three weeks finding all the settings on the TV and switching everything off, then doing the same with my Blu-ray player software and again on my graphics card driver (I use an HTPC). Having finally gotten to the bottom of them all, I still find I occasionally have to repeat the process after software updates helpfully turn a few randomly back on.

Of all of these things, two are the most infuriating. The first is smart deinterlacing, which is so smart that it tries to deinterlace content that is not in fact interlaced in the first place, and manages to turn 1080p into something resembling over-played VHS. Seriously, turn this one off and you’ll see a VAST improvement in picture quality. If in doubt, deactivate deinterlacing completely; you won’t regret it.

The second, however, is the real troll under the bridge: smoothing. On my TV this is called MotionPlus, but technically it’s interpolation in the time domain. From a technical standpoint this is actually a pretty obvious thing to do. All the improvements in definition I mentioned above improve the spatial resolution of the individual images in the movie: smaller dots, larger images, simply more pixels. This is great, but it’s only half the story. Because the images are animated, you could also add in extra frames in between the 24 you’ve already got each second.
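
To make that concrete, here’s a minimal sketch (in Python, with a made-up interpolate_frames helper – this is my own illustration, not how any particular TV does it) of what temporal interpolation boils down to: synthesising new frames between each pair of source frames. Real sets use motion-compensated interpolation, which is far cleverer than the plain cross-fade below, but the principle is the same.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, num_inbetweens):
    """Generate evenly spaced in-between frames by linear blending.

    frame_a, frame_b: H x W x 3 float arrays (values 0..1) representing two
    consecutive source frames. A real processor would estimate motion and
    shift pixels along it rather than simply cross-fading like this.
    """
    inbetweens = []
    for i in range(1, num_inbetweens + 1):
        t = i / (num_inbetweens + 1)              # blend factor between the two frames
        inbetweens.append((1.0 - t) * frame_a + t * frame_b)
    return inbetweens

# Toy usage: two 4x4 frames (black and white), with 7 blended frames in between.
a = np.zeros((4, 4, 3))
b = np.ones((4, 4, 3))
mid = interpolate_frames(a, b, 7)
print(len(mid), mid[3].mean())                    # 7 frames; the middle one is mid-grey
```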

And this is where it starts to get interesting, because we’re starting to talk about things that don’t just involve the technology but also involve human perception. Throughout the history of cinema it’s been assumed that 24fps is enough to fool a person into thinking that a sequence of still images is actually a smooth flow of animation, but, as I mentioned at the beginning, it was adopted because it was the slowest speed that did the job, and this does make a difference.

If you try hard, however (and largely ignore the detail of what you’re watching), it is possible to see the transitions between frames. This is actually easier with HD stuff – the picture is so crisp that the tiny judder due to the frame rate is easier to spot. The reason we overlook this, I believe, is that all of us have grown up watching 24fps video and accepting it as such. We subconsciously ignore the fact that the illusion isn’t quite perfect.

So, enter temporal interpolation. With hardware acceleration, you can interpolate frames in real time, and with a decent enough algorithm you can increase the apparent frame rate quite a lot. My TV’s top setting is 200Hz. That’s pretty smooth. Newer sets are (discreetly) claiming frame rates of 400Hz. That’s a hell of a lot of interpolation.
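
Just to put numbers on “a hell of a lot” (this is my own back-of-the-envelope arithmetic, not a manufacturer’s figure): upconverting 24fps material to 200Hz means inventing roughly seven new frames for every one that was actually filmed, and at 400Hz it’s more like fifteen.

```python
# Back-of-the-envelope: interpolated frames per filmed frame when upconverting 24fps.
SOURCE_FPS = 24
for display_hz in (200, 400):
    frames_per_source = display_hz / SOURCE_FPS   # display frames shown per filmed frame
    synthesised = frames_per_source - 1           # everything beyond the original is invented
    print(f"{display_hz}Hz: ~{synthesised:.1f} interpolated frames per filmed frame")
```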

I don’t use it. Why not? Because it makes beautifully directed movies with high production values look about as convincingly real as local amateur dramatics (I’m not having a go at amateur dramatics, it’s just that production values tend to be a bit cheap and cheerful. This, of course, is not why you go!). A lot of the time it’s completely unwatchable. Interestingly, this is all very similar to the criticisms levelled at The Hobbit, which was shot at 48fps; this suggests that it’s not an artefact of the interpolation but the increased temporal resolution itself that’s to blame.

But why on earth should this be? Improving the frame rate should reduce judder, and Peter Jackson is quoted as saying that filming at 48fps almost eliminates motion blur – both of which should improve the viewing experience, not make things unwatchable.

So, here’s my two cents’ worth: the effect is perceptual. I’ve already mentioned that, because we’ve always watched movies at 24fps, we’re conditioned to it. You could extend this argument to suggest that, subconsciously, we have a mental category for filmed material – we accept things animated in this way as fictions, separate from reality.

To develop this idea completely I need to introduce one more concept: the Uncanny Valley. The Uncanny Valley is a well-documented effect in computer graphics, but it is basically perceptual. It works like this: imagine a simple cartoon, like a stick-man. We accept this as a rough representation of a person. It’s not very accurate, but we accept it as a representation of a person nonetheless. If we add more detail, like a face or some feet, we accept this as well. In fact, this cartoon is slightly more convincing than before.

Add more and more detail – more realistic shape, colouring, clothing, more nuanced behaviour – and we find the illusion more and more convincing, whilst still being aware that it is a cartoon. This continues up to a point, but once a representation of a person gets very close to being completely realistic we start to reject it – the cartoon character becomes a doll-eyed automaton and the illusion is ruined.

This is the Uncanny Valley, and it’s a valley of perception. At a certain level of realism there is a shift in perception: we cease to accept it as a cartoon and start to put it in the same category as actual people. At this stage different mechanisms kick in and we apply different standards: this is an illusion, and our inner cave-men will not be fooled by this sorcery. Ha!

This is a difficult thing to overcome. The Polar Express is often touted as a movie that suffered hugely from the Uncanny Valley. It attempted to be photorealistic and failed. The same was true of Final Fantasy: The Spirits Within, but isn’t true of (say) Tintin, which has people who objectively look nothing like real people but are paradoxically easier to accept as such.

And this is what I think the problem is with higher frame rates – they cause us to categorise what we are seeing as real life rather than movies, and as a consequence they look like actors on film sets rather than convincing illusions. Of course, with practice you can train yourself to recognise them as movies and all becomes right with the cosmos. I watched season 2 of Game of Thrones at a friend’s house on their 200Hz TV. It took seven episodes before I could look past the overt hyper-reality of the interpolation, and I’m still not completely past it; at home I just deactivate it, as it only distracts from enjoying the movie.

There’s one final thing to add here. This effect is made worse by bad lighting. Interpolated content shot outside with natural lighting doesn’t grate nearly as much as poorly lit studio shots. I’m guessing this just adds to the unreality.

So that’s my thought on the subject: perceptual shifts causing movies to look unrealistic. If this is true, I would imagine it would be an interesting challenge for a director: can you make a convincing-looking movie at 48fps? It would probably end up being spectacular.

 

Update: It seems that the situation in early cinema with regard to frame rate is quite interesting. Silent movies typically had frame rates of 20-26 fps, although this was more a function of equipment design than anything perceptual. Thomas Edison apparently insisted that 46 fps was the minimum, and that “anything less will strain the eye.” Interesting.

Furthermore, the perception of very short events is quite complex. Flashes of darkness as short as 10ms are perceptible, but flashes of light produce a “persistence of vision” effect that can make a 10ms flash of light appear more like 100ms long, and can cause consecutive flashes of different colours to merge together so that, for example, red plus green is perceived as yellow.
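
As a toy illustration of that last point (my own sketch, not anything from the perception literature): if two flashes are too brief to resolve separately, the eye effectively averages the light over time, and the time-average of a red flash and a green flash comes out as a dimmer yellow.

```python
# Toy illustration: two flashes too brief to resolve separately are, in effect,
# averaged over time by the eye. The average of red and green light is yellow
# (at half the intensity). RGB values here are linear light in the range 0..1.
red   = (1.0, 0.0, 0.0)
green = (0.0, 1.0, 0.0)

merged = tuple((r + g) / 2 for r, g in zip(red, green))
print(merged)   # (0.5, 0.5, 0.0) -> a half-intensity yellow
```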
