
Either CGI designers get their act together or our televisions will keep putting their movies on the ropes


Who would have thought? ‘Iron Man’, the 2008 film directed by Jon Favreau, features computer-generated imagery (CGI) more convincing than that of a good part of the latest movies released by Marvel. Some sequences of ‘Spider-Man: No Way Home’ or ‘Doctor Strange in the Multiverse of Madness’ have a surprisingly less polished finish than the production that laid the foundation stone of the Marvel Cinematic Universe.

‘Iron Man’ hit theaters roughly fourteen years before the two films I just mentioned, and yet its visual finish is more polished. The worrying thing is that this is not an isolated case, and it doesn’t just affect Marvel. Here is another, perhaps even more revealing example: ‘Jurassic Park’, the production with which Steven Spielberg left us speechless in 1993, features CGI and special effects that are more believable and better resolved than those of many films released almost three decades later.

Badly Resolved CGI Can Ruin Suspension of Disbelief

It is clear that the artists involved in designing the CGI of today’s movies have not forgotten how to do their work. So what has caused this decline in the quality of computer-generated images? One reason is that more and more movies resort to this technique; moreover, CGI is present in an ever larger share of the footage, which often reduces the time designers have to refine the finished images.

However, this is not all. There is also a technical reason why CGI often falls short: it is usually rendered at 2K (2048 x 1080 pixels), while most movies shot with digital cameras are captured at 4K resolution. Both elements must coexist in the same frame, so to homogenize this difference in resolution the footage filmed at 4K has to be downscaled to 2K.

Then, once the digital elements have been integrated with the footage at this lower resolution, all frames are upscaled to 4K using artificial intelligence algorithms. This last scaling step is necessary because some movie theaters project the images with 4K projectors and, above all, because televisions with a 4K UHD panel dominate the domestic market by an overwhelming margin.
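As a rough illustration of this round trip between resolutions, here is a minimal Python sketch using OpenCV. The file names are hypothetical, the compositing step is reduced to a placeholder, and plain bicubic interpolation stands in for the AI upscalers studios actually use; the sketch also ignores the slight aspect-ratio difference between 2K and 4K UHD that a real pipeline would handle with cropping or letterboxing.

import cv2

# Hypothetical live-action frame shot at 4K UHD (3840 x 2160).
plate_4k = cv2.imread("plate_4k.png")

# Step 1: downscale the 4K plate to 2K (2048 x 1080) so it matches the
# resolution at which the CGI elements are rendered. INTER_AREA is the
# interpolation OpenCV recommends for shrinking images.
plate_2k = cv2.resize(plate_4k, (2048, 1080), interpolation=cv2.INTER_AREA)

# Step 2 (placeholder): composite the rendered CGI over the 2K plate.
# In a real pipeline this happens in dedicated compositing software.
composite_2k = plate_2k

# Step 3: upscale the finished 2K frame to 4K UHD for distribution.
# Bicubic interpolation here only illustrates the step; studios rely on
# far more sophisticated AI super-resolution for this pass.
final_4k = cv2.resize(composite_2k, (3840, 2160), interpolation=cv2.INTER_CUBIC)
cv2.imwrite("final_4k.png", final_4k)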


The problem is that if the CGI is not very well resolved, the 2K render and its subsequent upscale to 4K may not be up to par when we enjoy that content on a television with a 4K UHD panel, especially if we use a latest-generation device capable of recovering a large amount of detail. Currently, a large part of cinemas project at resolutions lower than 4K, so as long as the CGI is carefully done, the fact that it was originally rendered at 2K is not too problematic in that context.

However, TVs with a 4K UHD panel are unforgiving. Many of the movies whose CGI seems correct in theaters offer us a much less satisfying experience when we see them on our 4K UHD TV. Frequently the digital elements of some frames are not as believable as they should be, and when this happens the magic of cinema can go to waste because the suspension of disbelief stops working.


The CGI of the first images we saw of ‘She-Hulk’, which will arrive on Disney+ on August 17, left much to be desired. In the latest trailer for the series it seems to have improved, but we won’t know for sure until we see how it looks on 4K UHD TVs.

When we watch a movie, especially if it is science fiction, fantasy, action or adventure, we voluntarily set aside the criteria we usually use to judge the real world. We accept the rules the film proposes because, if we don’t, it is essentially impossible to enjoy it. Of course, for this tacit agreement to work, what we see has to seem coherent to us. And it isn’t always.


Poorly executed CGI can ruin our experience no matter how good our intentions are. It can take us out of the movie, especially if its presence in the footage is constant. This happens often in the superhero movies that have filled theaters for years, which is why some productions featuring Marvel and DC characters have been criticized by many fans for their poor CGI.


The solution to this problem lies in refining the computer-generated images without being strangled by the tight deadlines film production companies work with. Movies like ‘Iron Man’, ‘Jurassic Park’ and many others show that it is possible to make believable and highly satisfying CGI. And filmmakers know it. But this is not enough. If our homes are mostly equipped with 4K UHD televisions and we want the CGI not to falter when we watch movies on them, the ideal is for it to be rendered natively in 4K.

The problem is that time is a very valuable resource, and it is not clear that production companies are willing to extend the time they invest in the post-production of some of their films. The other ingredient of the recipe, 4K rendering, also represents a significant commitment, because the computational effort required to render a frame at that resolution is much greater than what is involved in rendering it at 2K. In any case, sooner or later filmmakers will have to bite the bullet. Otherwise our televisions will continue to expose every imperfection in their work.
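The pixel counts alone make that cost gap concrete. Assuming, as a crude approximation, that render cost grows at least linearly with the number of pixels per frame:

pixels_2k = 2048 * 1080       # 2K render target: 2,211,840 pixels
pixels_4k = 3840 * 2160       # 4K UHD panel: 8,294,400 pixels
print(pixels_4k / pixels_2k)  # 3.75

Nearly four times as many pixels per frame, frame after frame, across an entire film.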
