What is Resolution?
Resolution is not the same as pixels: a pixel is the smallest element of an electronic image, whereas resolution refers to the quality of an image made up of those pixels. It describes the fine detail of the image and depends on the amount of light, the size of the digital sensor and the distance from which the image is viewed.
Resolution is also determined by how many pixels are used to construct an image; this is called spatial resolution. If something on a screen goes out of focus, the number of pixels in the video frame stays the same, but the image loses detail to the eye.
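Spatial resolution in this sense is just a pixel count: image width times image height. As a rough sketch (the frame size here is an illustrative example, not one taken from the article), a Full HD frame works out to about 2.1 megapixels:

```python
# Spatial resolution as a pixel count: width x height.
# Illustrative figures: a 1920 x 1080 "Full HD" video frame.
width, height = 1920, 1080

pixels = width * height
megapixels = pixels / 1_000_000

print(pixels)                 # 2073600
print(round(megapixels, 1))   # 2.1
```

Defocusing such a frame changes none of these numbers, which is exactly why pixel count alone does not capture perceived detail.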
Problems with comparing human vision to a digital image
There are several problems with comparing human vision to a digital image. For example, a digital camera snaps a single image in one go whereas our eyes are constantly moving about, and the brain must make sense of a stream of information to form what we call vision. The image created by the eye alone during a single glance is not the image which we interpret.
Unlike a camera, we have things obstructing our field of vision; for example, we are always looking at our own nose. Fortunately, our brains filter out these irrelevant features.
We also have blind spots: the point where the optic nerve meets the retina contains no photoreceptors, so no visual information is received there. You wouldn't expect that from a camera.
In addition to this, humans can have refractive errors such as myopia and hyperopia. Some people also have tetrachromacy, which means they can distinguish more colour varieties than the average person.
Our fovea is another factor which makes it difficult to compare human vision to a digital image. The fovea is the part of your retina that provides the clearest vision. It receives light from the central two degrees of your field of view, which is roughly the area covered by both of your thumbs when held at arm’s length away. Colour vision and 20/20 vision are only possible within that small area.
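The thumb comparison can be checked with some basic trigonometry. Assuming a thumb roughly 2 cm wide held at roughly 60 cm from the eye (both figures are rough assumptions for illustration, not values from the article), the visual angle it subtends comes out close to two degrees:

```python
import math

# Hypothetical figures: a thumb about 2 cm wide, held at roughly
# arm's length (about 60 cm) from the eye.
thumb_width_cm = 2.0
arm_length_cm = 60.0

# Visual angle subtended by the thumb, in degrees.
angle_deg = math.degrees(2 * math.atan(thumb_width_cm / (2 * arm_length_cm)))

print(round(angle_deg, 1))  # about 1.9 degrees
```

That is close to the fovea's central two degrees, which is why the thumb makes a handy rule of, well, thumb.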
Our eyes are constantly moving, and our brain merges this stream of visual information, fills in details and makes guesses to form images that make sense; what we see is therefore a processed image.
How many pixels can our vision appreciate?
According to scientist and photographer Roger M. Clark of Clark Vision, a screen would have to have a density of 576 megapixels in order to encompass our entire field of view.
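Clark's figure can be reproduced roughly from the assumptions he describes: visual acuity of about 0.3 arc-minutes per pixel across a field of view of about 120 by 120 degrees. This sketch simply redoes that multiplication (the exact inputs are approximations):

```python
# Rough reconstruction of Clark's estimate, assuming ~0.3 arc-minute
# acuity over a ~120 x 120 degree field of view.
field_deg = 120
arcmin_per_deg = 60
acuity_arcmin = 0.3

pixels_per_side = field_deg * arcmin_per_deg / acuity_arcmin  # 24000.0
total_megapixels = pixels_per_side ** 2 / 1_000_000

print(int(total_megapixels))  # 576
```

The striking size of the number comes mostly from squaring: doubling the field of view quadruples the pixel count.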
Nevertheless, there is a problem with this question, because our eyes work differently to cameras. Our eyes move about rapidly, taking in lots of visual information which the brain then processes into detailed images. The brain also combines what your two eyes see, assembling a higher-resolution image than the photoreceptors in the retina could capture alone.
Our eyes do not take in all visual information equally; we only see fine detail within the fovea. A 576-megapixel screen would therefore contain more detail than we could take in at any one moment.
We can see about 7 megapixels' worth of detail in our foveal range; it has been roughly estimated that the rest of our field of view would only need about 1 megapixel of additional information to render an image.
You simply cannot compare human vision to a digital image, because the human eye does not contain pixels. Our visual system works differently from a camera: what we see is a picture assembled by our eyes and brain, and not necessarily an exact record of reality.