Why it is possible to automatically detect poor images: about blurred photos

The two kinds of blurred images

Do not get me wrong: there are great images that are blurred. Below are just two examples: one image out of focus on purpose, the other a very nice, well-managed motion blur effect.

Photo by Manh Nghiem on Unsplash
Photo by Damon Lam on Unsplash

Art has no limits. It is therefore wrong to qualify an image as “great” or “poor” just because it is not sharp. Some photographers seem to be obsessed with how sharp their new lens can be, others make their images blurrier on purpose. That is fine; nobody is right or wrong from that perspective.

It is nevertheless important to remember that there are basically two kinds of blurred images: those blurred on purpose, and those blurred unintentionally.

About unintentional blur

Because we are not all professional photographers, we make mistakes, and even pros can fail; nobody is perfect, are they?

Beyond the basics of managing shutter speed well to avoid blur when you do not want any, in some circumstances it can be difficult to avoid blur even when you are not a beginner.

One can argue that it is easy to delete these images when you download your photo session. And this is true. At the same time, culling images is not, in my opinion, very exciting, and I have wished for a long time that part of the culling process could be automated. Culling blurred images is one of the features I have been looking for. Let’s come back to blurred images:

First case: there is not enough light. Finding the right balance between noise (or blur after noise reduction) from high ISO, the shutter speed you need, too much underexposure, or the wrong (too shallow) depth of field: that is typical of low-light shooting.

Second case: out-of-focus images can still easily happen even with top-notch autofocus in 2020 (e.g. during a wedding, kids playing, or wildlife like birds, …). The more subjects are moving, and the faster they move, the more challenging it gets. Expect some images to be blurred.

And both can be combined, of course: shooting moving subjects in low light maximizes the chances of failing. In conclusion, blurred images are not what you wanted, but they are what you got.

Some challenges when it comes to auto-detecting “poor” images where blur is not intentional

The two main causes of blur are well known: missed focus and a shutter speed not matched to the motion. But it is not so easy. Some images can be blurred almost everywhere yet should not be rejected, because the photographer focused on a small detail on purpose. Similarly, motion-blur images where the main subject is deliberately shown moving are not unintentionally blurred. Here are two examples of images that are blurred almost everywhere, on purpose, and that should pass the blur tests:

Photo by lalo Hernandez on Unsplash
Photo by Luke Porter on Unsplash

However, images where nothing is in focus may be classified as blurred. Of course, again, some photographers will do this on purpose, but for all other photographers these images are failed ones. Similarly, images with motion blur where nothing is sharp are failed ones as well.

The two images below are simply failed images. When I shot them, I was either using too slow a shutter speed or the AF missed the point, literally (on the underwater image).

Too slow a shutter speed, or unwanted rotation of the camera during the shot. Or both!
Autofocus failed to do the job
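
To illustrate why this kind of failure can be detected automatically, here is a minimal sketch in Python using a common technique, the variance of the Laplacian, with OpenCV. It is only an illustration of the idea, not how Futura Photo actually computes acutance, and the tile grid, scores, and file name are my own assumptions. Scoring the sharpest tile rather than the whole frame is one way to avoid rejecting images that are deliberately focused on a small detail, while images like the two failed shots above score low everywhere.

```python
import cv2
import numpy as np

def global_sharpness(gray: np.ndarray) -> float:
    # Variance of the Laplacian: a low value suggests nothing in the frame is sharp.
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def sharpest_tile(gray: np.ndarray, tiles: int = 8) -> float:
    # Score each tile separately and keep the best one, so an image focused
    # on a small detail is not penalized for its intentionally blurred background.
    h, w = gray.shape
    scores = [
        cv2.Laplacian(gray[i * h // tiles:(i + 1) * h // tiles,
                           j * w // tiles:(j + 1) * w // tiles], cv2.CV_64F).var()
        for i in range(tiles) for j in range(tiles)
    ]
    return max(scores)

# Hypothetical usage: both scores would be low for the two failed shots above.
img = cv2.imread("IMG_0001.jpg", cv2.IMREAD_GRAYSCALE)
print(global_sharpness(img), sharpest_tile(img))
```

An image with selective focus still scores high on its sharpest tile even if the global score is low; an image where nothing is sharp scores low on both.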

Some solutions to automate the culling of blurred images

Some software aims to auto-detect these images. But I tend to believe that is not enough. It has to be part of something bigger: a quality gate during pre-processing, where eliminating blurred images (unintentional ones, of course) is just one rule in a set of rules that lets the photographer focus only on the successful images. We tend to shoot too many images, and this quality gate helps reduce the frustration of later browsing tons of failed images, reduce the footprint generated by our digital content, and let the photographer spend more time enhancing the images that have passed these technical tests. It has nothing to do with art; it is more about whether you have delivered what you wanted, whatever you are trying to do.
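
As a rough sketch of what such a quality gate could look like (the rules, thresholds, and file handling below are illustrative assumptions, not Futura Photo's implementation), each image runs through a list of independent rules, and the blur rule is only one of them:

```python
from pathlib import Path
import cv2

# Illustrative rules only: each takes a grayscale image and returns True if it passes.
def passes_blur_rule(gray, threshold=100.0):
    return cv2.Laplacian(gray, cv2.CV_64F).var() > threshold

def passes_exposure_rule(gray, low=10, high=245):
    return low < gray.mean() < high

RULES = [passes_blur_rule, passes_exposure_rule]

def quality_gate(folder: str):
    keep, reject = [], []
    for path in sorted(Path(folder).glob("*.jpg")):
        gray = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
        if gray is None:
            continue  # unreadable file, skip it
        (keep if all(rule(gray) for rule in RULES) else reject).append(path)
    return keep, reject
```

The point is that rejecting unintentionally blurred images is just one entry in the rule set; the same gate could also flag badly exposed frames, duplicates, and so on, before anything reaches the enhancement or DAM software.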

I have been practising photography for a while. I still like to try new ways to shoot, and therefore I am still generating failed images where acutance is not what I wanted. This is just one of the reasons why I have been developing this acutance rule in Futura Photo, from the company Camera Futura. The rule is embedded in a full quality gate at the pre-processing step, before moving on to enhancement and DAM software (like Lightroom, Capture One, …).

The idea has been to make the rule as simple as possible, hiding the complexity from the user. It still seems necessary to offer some customization of the rule, as what is acceptable for one photographer will not be for another. But it is managed very simply so far, thanks to a trivial choice between a “demanding” and a “not demanding” acutance level, depending on your needs.
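
Conceptually, such a choice can simply map to two threshold presets for the acutance score. The names and values below are assumptions for illustration, not the actual settings used in Futura Photo:

```python
import cv2

# Hypothetical presets mirroring a "demanding" / "not demanding" choice.
ACUTANCE_THRESHOLDS = {"demanding": 200.0, "not demanding": 50.0}

def passes_acutance(gray, mode="not demanding"):
    # The stricter preset rejects anything that is not clearly sharp;
    # the relaxed one only rejects images with no sharp area at all.
    return cv2.Laplacian(gray, cv2.CV_64F).var() > ACUTANCE_THRESHOLDS[mode]
```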

And you, what kind of acutance rule are you looking for? Did you try the rule within Futura Photo? Do not hesitate to let me know what you think.

Disclaimer: I am the CEO of Camera Futura and probably biased. But I wanted to explain in more detail why and how this rule has been developed.
