
Is Film Grain Good or Bad?

Film grain is a term we often hear when discussing the appearance of celluloid films, specifically the textured look some of them have. It’s the result of tiny particles of metallic silver that remain in the emulsion after celluloid film is processed, though the look of film grain is also recreated digitally in post-production. But is film grain good or bad?

At first, newly aspiring filmmakers and editors might assume that film grain is a bad thing. Why would we want a crisp, clean, high-resolution digital image with sharp edges to look “textured” or “grainy”?

However, as we look to answer the common question, “Is film grain good or bad?”, it’s important to understand that some filmmakers deliberately recreate this look, and plenty of directors don’t think it’s all that bad.

So, what do you think? Is film grain good, or bad?

What is Film Grain?

In its simplest definition, film grain is the texture created by tiny specks of metallic silver in processed film, and those specks can vary in size and in the patterns they form.

The term “film grain” comes from the early days of celluloid film, which was shot and processed under very different conditions than the digital footage we produce today.

Film grain is also sometimes referred to as granularity, a term for the random variation in grain size, and the grains appear larger or smaller on screen depending on how much the image is enlarged.

Analog film grain was long referred to as “film noise” by filmmakers. In fact, the term primarily described what viewers might see as a film reel started or came to an end.

You might recall the sputtering, random flashes of specks, hairs, and other visible artifacts that signaled the start, or the end, of an old film.

In that historic, analog context, film grain was a byproduct of processing and was generally considered a “bad” thing. But is film grain good or bad now? The answer to that question is still largely up to the individual, or the Director.

Physical Film Grain vs Digital Film Grain

Anytime analog or celluloid film is processed, film grain is going to be present. Grain is simply a byproduct of processing physical film stock, so it’s always there to some degree.

Physical film grain may be more or less noticeable depending on a variety of underlying factors, but it’s certain to be present in some capacity with celluloid films.

Changes in shutter speed, ISO, and aperture all affect how prominent the grain, or noise, appears, with higher ISO settings producing more visible noise in your digital footage. When shooting digitally, as most of us now do, your number one defense against a grainy look is your ISO.

Shooting at a lower ISO will, in most cases, reduce the risk of noticeable grain in your footage, especially if you’re fortunate enough to shoot in well-lit environments.

The addition of grain to digital productions can actually benefit some films, especially those that aim to give the audience the look of a movie produced in the 20th century.

In fact, although digital cameras don’t produce film grain in the literal sense, because they don’t rely on a photochemical process or leave behind the metallic silver particles that cause grain, many Directors find contemporary production techniques and the addition of grain in post-production incredibly useful.
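To make the idea of adding grain in post a little more concrete, here is a minimal sketch in Python using NumPy and Pillow. It simply layers monochromatic Gaussian noise over a still frame; the function name, file names, and strength value are illustrative assumptions rather than a standard tool or workflow, and dedicated grain plugins model grain size and per-channel response far more carefully.

```python
# Minimal sketch: overlay simulated grain on a single frame.
# Assumes Pillow and NumPy are installed; "frame.png" is a placeholder path.
import numpy as np
from PIL import Image


def add_grain(image: Image.Image, strength: float = 12.0, seed: int | None = None) -> Image.Image:
    """Add monochromatic Gaussian noise to mimic the look of film grain.

    strength is the noise standard deviation in 8-bit levels; higher values
    give a coarser, more visible texture.
    """
    rng = np.random.default_rng(seed)
    pixels = np.asarray(image.convert("RGB"), dtype=np.float32)

    # Use one noise plane for all three channels so the grain reads as
    # luminance texture rather than colored speckle.
    noise = rng.normal(loc=0.0, scale=strength, size=pixels.shape[:2])
    grained = pixels + noise[..., np.newaxis]

    return Image.fromarray(np.clip(grained, 0, 255).astype(np.uint8))


if __name__ == "__main__":
    frame = Image.open("frame.png")  # placeholder file name
    add_grain(frame, strength=12.0, seed=42).save("frame_grained.png")
```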

So, is film grain good or bad? The answer, oddly enough, is “yes” to both. Film grain is good. Film grain is bad. It can really go either way, depending on the needs of the individual production.
