Muzahir Hussain

Reputation: 1033

Why do images get blurry after removing noise?

When we remove noise from a gray-scale image using a weighted average, why does the image get blurry? All we do is take the average of the neighboring pixels and replace the middle pixel with it. That pixel should get darker or brighter depending on the value, but not blurred. So why does it blur? Thanks in advance...

Upvotes: 0

Views: 1427

Answers (3)

igweyn

Reputation: 141

To understand this you need to understand the concept of low-frequency and high-frequency components in an image. Regions that change little over space are called low-frequency regions (for example, a flat region like a plain wall). High frequencies occupy the regions where pixel intensity varies a lot (i.e. regions with lots of edges):

Low/High frequency comparison

Averaging filters are classified as low-pass filters. Take the Gaussian blur function as an example. Since the Fourier transform of a Gaussian is another Gaussian, it acts as a low-pass filter. This means it filters out the high-frequency information (edges and regions with a lot of variation) from the image. That's why an image convolved with a low-pass filter looks blurry.
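You can check the low-pass claim numerically with NumPy's FFT. This is just an illustrative sketch; the grid size of 64 samples and the kernel width of 2 pixels are arbitrary choices, not anything from the answer:

```python
import numpy as np

# Discrete Gaussian kernel (sigma = 2 pixels) on a 64-sample grid.
n = 64
x = np.arange(n) - n // 2
g = np.exp(-x**2 / (2 * 2.0**2))
g /= g.sum()  # normalize so the filter's DC gain is exactly 1

# Magnitude of the kernel's frequency response.
spectrum = np.abs(np.fft.fft(np.fft.ifftshift(g)))

# spectrum[0] is the gain for flat (low-frequency) content: 1, passed through.
# spectrum[n // 2] is the gain at the highest frequency: nearly 0, suppressed.
```

The spectrum itself is Gaussian-shaped: gain 1 at zero frequency, falling off smoothly toward the highest frequencies — exactly the low-pass behaviour described above.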
Now let's say you don't want to understand it using frequency-domain analysis. A blur function smooths an image, i.e. it makes the transition from one intensity to another across a region very smooth. In doing so, a blur function reduces the edge content in the image. Take the following 1D array as an example:

A= [0 0 0 255 0 0 0]

If you average it with a window of size 3, this is the result:

B = [0 0 85 85 85 0 0]

So we observe that the darker pixels became brighter and the bright pixel became relatively dark. Another way to look at it is that the edge was smoothed.
Since natural images tend to have a lot of edge detail, an averaging operation smooths the image content and removes high-frequency details (edges + noise) from the image. An image looks sharp because of the edge information in it. Hence, an image that has been averaged or convolved with a low-pass filter looks blurry.
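The 1D example above can be reproduced with a plain moving average, e.g. in NumPy (a minimal sketch using integer division, as in the answer):

```python
import numpy as np

A = np.array([0, 0, 0, 255, 0, 0, 0])

# Moving average with a window of size 3; borders are zero-padded.
B = np.convolve(A, np.ones(3, dtype=int), mode='same') // 3

print(B)  # [ 0  0 85 85 85  0  0] -- the spike has been spread over 3 pixels
```

Note that the total brightness (255) is preserved; averaging only redistributes it over the neighbourhood, which is exactly what blur is.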

Upvotes: 4

user1196549

Reputation:

Think of an image that is half white and half black, with a sharp transition.

Imagine a sliding window that progressively crosses the border. The average inside the window will first be pure white, then gray, getting progressively darker and darker, up to full black. This is because the window contains a variable mixture of white and black.

In the end, you will get a smooth transition area as large as the window: the border is blurred.
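This sliding-window picture is easy to reproduce on a 1D "image" (a sketch; the row length of 10 and the window size of 3 are arbitrary):

```python
import numpy as np

# Half white, half black, with a sharp transition.
row = np.array([255] * 5 + [0] * 5)

# Average inside a sliding window of size 3, at fully valid positions only.
smoothed = np.convolve(row, np.ones(3, dtype=int), mode='valid') // 3

print(smoothed)  # [255 255 255 170  85   0   0   0] -- a ramp replaces the step
```

The sharp step has become a ramp whose width matches the window: that ramp is the blurred border.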

Blur arises because the pixels that make up image details are blended together. Actually, image information is found along the edges, and these get "erased" by averaging.

Better denoising methods (such as the bilateral filter) take care to avoid averaging the edges (where you can't see the noise anyway).
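A toy 1D version of the bilateral filter illustrates the difference: unlike a plain average, it weights each neighbour by intensity similarity as well as distance, so pixels across an edge barely contribute. This is only a sketch; the `radius`, `sigma_s`, and `sigma_r` values are arbitrary illustrative choices:

```python
import numpy as np

def bilateral_1d(x, radius=2, sigma_s=1.0, sigma_r=30.0):
    """Toy 1D bilateral filter: a weighted average whose weights combine
    spatial distance (sigma_s) with intensity difference (sigma_r)."""
    out = np.empty_like(x, dtype=float)
    offsets = np.arange(-radius, radius + 1)
    spatial = np.exp(-offsets**2 / (2 * sigma_s**2))
    for i in range(len(x)):
        idx = np.clip(i + offsets, 0, len(x) - 1)  # clamp at the borders
        neigh = x[idx].astype(float)
        rng = np.exp(-(neigh - x[i])**2 / (2 * sigma_r**2))
        w = spatial * rng
        out[i] = np.sum(w * neigh) / np.sum(w)
    return out

edge = np.array([0, 0, 0, 0, 255, 255, 255, 255], dtype=float)
box = np.convolve(edge, np.ones(3) / 3, mode='same')  # plain average: a ramp
bil = bilateral_1d(edge)                              # edge stays sharp
```

The box filter turns the step into a ramp (85 and 170 appear on either side of it), while the bilateral output stays essentially 0 / 255: the intensity term gives near-zero weight to neighbours on the other side of the edge.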

Upvotes: 1

Piglet

Reputation: 28974

An image is a discrete representation of a light distribution captured by your camera. Assuming that your lens was focused and your optical resolution was sufficient, every pixel carries unique information. As soon as you replace that unique information with an average of that pixel's surroundings, you lose that information, or let's say you spread it over its neighbourhood. Of course the pixel itself only changes its value, but this is done to all pixels. That's where you lose high frequencies and your image gets blurred.

Stupid example: take 3 different paints, make 3 dots next to each other and then mix them with your finger. The locations where those dots used to be have only slightly changed their colours, but you don't see any dots and you don't see 3 different colours anymore.
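The paint analogy maps directly onto a box average: a single bright "dot" gets smeared over its neighbourhood while the total brightness is conserved. A minimal sketch (the 5x5 size and the dot value 9.0 are arbitrary):

```python
import numpy as np

# A single bright "dot" in an otherwise flat image.
img = np.zeros((5, 5))
img[2, 2] = 9.0

# 3x3 box average applied by hand, with windows clamped at the borders.
blurred = np.zeros_like(img)
for y in range(5):
    for x in range(5):
        y0, y1 = max(0, y - 1), min(5, y + 2)
        x0, x1 = max(0, x - 1), min(5, x + 2)
        blurred[y, x] = img[y0:y1, x0:x1].mean()

# The dot's value 9.0 is now spread as 1.0 over the 3x3 block around it:
# you can't see the dot anymore, only its smeared-out trace.
```

Every window that contained the dot now carries a small share of its brightness, which is exactly "spreading the information over the neighbourhood".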

Upvotes: 2

Related Questions