When you see a beautiful scene, you take out your phone (or camera) to capture it, don't you? But the picture often turns out somewhat different (unless you own an expensive DSLR). The touch of beauty that appears to our eyes is missing from the picture. Why can't we capture exactly what we see? You are about to find out. It is all a matter of resolution.
The human eye has a resolution of approximately 576 megapixels, according to Dr. Roger Clark, a digital and film imaging professional who has also worked on several NASA space-imaging projects. And this is what makes the real difference between our vision and our pictures.
Consider a view in front of you that is 90 degrees by 90 degrees, like looking through an open window at a scene, and assume the eye resolves detail down to about 0.3 arc-minutes per pixel. The number of pixels would be:

90 degrees × 60 arc-minutes/degree × (1/0.3) × 90 × 60 × (1/0.3) = 324,000,000 pixels (324 megapixels).
At any one moment you do not actually perceive that many pixels, but your eye moves around the scene to take in all the detail you want. And the human eye really covers a larger field of view, close to 180 degrees. Let's be conservative and use 120 degrees for the field of view. Then we would see:
120 × 120 × 60 × 60 / (0.3 × 0.3) = 576,000,000 pixels (576 megapixels).
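The arithmetic above can be sketched in a few lines of Python. The only assumption carried over from the text is the 0.3 arc-minute-per-pixel acuity figure; the function name is just for illustration.

```python
ACUITY_ARCMIN = 0.3        # assumed angular resolution per pixel, in arc-minutes
ARCMIN_PER_DEGREE = 60

def eye_megapixels(fov_degrees: float) -> int:
    """Pixel count for a square field of view, rounded to whole megapixels."""
    pixels_per_side = fov_degrees * ARCMIN_PER_DEGREE / ACUITY_ARCMIN
    return round(pixels_per_side ** 2 / 1e6)

print(eye_megapixels(90))    # 324 -> the 324-megapixel figure
print(eye_megapixels(120))   # 576 -> the 576-megapixel figure
```

Note how sensitive the result is to the field of view: widening it from 90 to 120 degrees nearly doubles the pixel count, because the count grows with the square of the angle.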
The full angle of human vision would require even more megapixels; recording this much image detail would take a large-format camera.
Amazing, isn't it? To display all of this, we would need a 32,000 × 18,000-pixel monitor, roughly equivalent to an array of about 278 1080p monitors. In short, you would need a 576-megapixel image to stop your brain from telling a picture apart from reality.
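A quick sanity check on that monitor count, using the 32,000 × 18,000 display from the text and the standard Full HD resolution:

```python
# Total pixels of the hypothetical 32000 x 18000 display vs. one 1080p monitor.
TOTAL_PIXELS = 32000 * 18000        # 576,000,000 pixels (576 megapixels)
PIXELS_1080P = 1920 * 1080          # 2,073,600 pixels per Full HD monitor

monitors = TOTAL_PIXELS / PIXELS_1080P
print(f"{monitors:.1f} Full HD monitors")   # 277.8 Full HD monitors
```

So the often-quoted figure of "a few hundred 1080p screens" works out to just under 278 monitors.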
The Eye is not a Digital Device
However, one has to remember that the eye is not a digital imaging device and does not capture images like a digital camera. Instead of taking a single snapshot, the eye is constantly moving, and the brain stitches these stimuli together to form the images we see.
Moreover, the number of pixels is only one element of resolution quality. Other factors matter too, including lighting, distance, and spatial resolution. With spatial resolution, the number of pixels on the screen stays the same even when an object goes out of focus, yet our perception of the image quality drops.
So, in short, the resolution of the human eye is a rather complex topic, and there is no simple answer.
How does our eye actually function?
- Light enters the eye through the cornea, the clear front surface of the eye, which acts like a camera lens.
- The iris works much like the diaphragm of a camera, controlling how much light reaches the back of the eye. It does this by automatically adjusting the size of the pupil, which in this analogy functions like a camera's aperture.
- The eye’s crystalline lens sits just behind the pupil and acts like an autofocus camera lens, adjusting its shape to focus on objects at different distances.
- Focused by the cornea and the crystalline lens, the light makes its way to the retina. This is the light-sensitive lining in the back of the eye. Think of the retina as the electronic image sensor of a digital camera. Its job is to convert images into electronic signals and send them to the optic nerve.
- The optic nerve then transmits these signals to the visual cortex of the brain which creates our sense of sight.
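The iris-as-aperture analogy in the steps above can be made concrete: the amount of light reaching the retina scales with the pupil's area, just as a camera's light intake scales with its aperture. The 2 mm and 8 mm diameters below are illustrative round numbers for a constricted versus a fully dilated pupil, not figures from the text.

```python
import math

def relative_light(diameter_mm: float, reference_mm: float = 2.0) -> float:
    """Light intake relative to a reference pupil diameter (ratio of areas)."""
    area = math.pi * (diameter_mm / 2) ** 2
    ref_area = math.pi * (reference_mm / 2) ** 2
    return area / ref_area

print(relative_light(8.0))   # 16.0 -> a fully dilated pupil admits ~16x more light
```

Because area grows with the square of the diameter, a four-fold wider pupil lets in sixteen times as much light, which is why the iris is such an effective exposure control.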
Perception, Color, and Image
The eye’s retina contains millions of tiny light-sensing nerve cells called rods and cones, which are named for their unique shapes.
- Cones are responsible for perceiving color and detail.
- Rods are responsible for night vision, peripheral or side vision, and detecting motion.
Rods and cones convert light striking the retina into electrical impulses, which the optic nerve carries to the brain, where an image is produced. The macula is the part of the retina that gives us central vision. It's how we see the form, color, and detail in our direct line of sight.
Video courtesy: "Techquickie"