Resolution of the human eye?
4 Answers
Just heard that we are due to have photorealistic 3D modelling in about ten years' time, and it got me thinking. At what stage will the resolution of computer monitors become so detailed that we can't see any improvements? i.e. how many pixels could an eye feasibly recognise? As an afterthought, do you think that, as we watch more and higher-quality computer animation, our eyes may evolve further to be able to distinguish more? I know it's a bit of an odd question, but ... I was bored, sue me ;o)
Answers
I think the human eye is pretty sensitive - I remember reading that it can pick out a single match burning from an extraordinarily long distance away (20 miles I think, but don't quote me). However, the brain probably filters out a lot of the information, concentrating only on what it thinks is important. (It can even cope with watching fuzzy Channel 5.) So extremely high resolution monitors probably aren't needed for realism - what you probably need is a monitor (or glasses) that can mimic 3D vision.
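As a rough sanity check on that match figure (the flame size of about 1 cm is my own guess, not part of the anecdote), the angle it would subtend at 20 miles works out far below what the eye can resolve - so it's really about detecting a point of light in the dark rather than resolving detail:

```python
import math

# Rough sanity check of the "match visible at 20 miles" anecdote.
# The ~1 cm flame size is my own assumption, not from the answer above.
flame_size_m = 0.01            # ~1 cm flame (assumption)
distance_m = 20 * 1609.34      # 20 miles in metres

angle_rad = flame_size_m / distance_m          # small-angle approximation
angle_arcmin = math.degrees(angle_rad) * 60    # convert to arc minutes

print(f"Flame subtends about {angle_arcmin:.4f} arc minutes")
# -> roughly 0.001 arc minutes, about a thousand times smaller than the
#    eye's ~1 arc minute resolving limit, so this is detecting a point
#    of light, not resolving any detail.
```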
This is a very interesting question, so I decided to do a small test. I just printed off some very small black and white squares arranged like a chessboard, with 16 of these squares per centimetre on a small section of the paper. I supposed that if resolution is thought of as "the ability to distinguish between two points", then all I would have to do is walk away from the paper until I couldn't tell that the squares were not just one solid colour. Admittedly, this is a pretty crude test. Added to that, I have been staring at a computer screen for over an hour, it's quite dark, and I already know there are squares on the paper, so that's what I am expecting.
Right, I've just done the test once and the squares started to form a sort of speckled grey at around 2.8m from the paper. In terms of angle, this is a resolution of about 0.013°, or roughly ¾ of an arc minute (one arc minute is equal to one sixtieth of a degree). This figure, while not accurately measured, ties in quite well with the figures given on the site below.
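If anyone wants to check the arithmetic behind that figure, here it is as a few lines of Python (the 16 squares per centimetre and the 2.8 m are the numbers from the test above; the calculation itself is just the small-angle approximation):

```python
import math

# The chessboard test: 16 squares per centimetre, viewed from 2.8 m.
squares_per_cm = 16
distance_m = 2.8

square_size_m = (1 / squares_per_cm) / 100     # one square, in metres
angle_rad = square_size_m / distance_m         # small-angle approximation
angle_deg = math.degrees(angle_rad)
angle_arcmin = angle_deg * 60

print(f"{angle_deg:.4f} degrees = {angle_arcmin:.2f} arc minutes")
# -> about 0.013 degrees, i.e. roughly three quarters of an arc minute
```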
The site also quotes the fovea (the high-resolution part of the eye, the part that sees in the best quality) as having a total of 24 million pixels. If I understand this correctly, and the fovea is roughly circular, that works out at about 5,500 points across its diameter (the square root of 24 million divided by pi gives a radius of roughly 2,800 points). As for its size, I have learned from a none-too-reliable source that the fovea is about 2 mm across. That gives it an approximate resolution of 27,000 pixels per centimetre, or around 70,000 pixels per inch. (Please don't take any figures I tell you to be accurate.)
http://www.opticalphysics.com/Vision.htm
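And the same sort of back-of-the-envelope sum for the fovea figures, using the 24 million pixels quoted on that site and the shaky ~2 mm diameter (treat the output with the same scepticism as everything else here):

```python
import math

# Back-of-the-envelope fovea figures: 24 million "pixels" (from the site
# above) spread over a roughly circular fovea about 2 mm across (a shaky
# figure, as noted).
fovea_pixels = 24_000_000
fovea_diameter_mm = 2.0

# For a circle, area = pi * r^2, so pixels across the diameter = 2*sqrt(N/pi)
pixels_across = 2 * math.sqrt(fovea_pixels / math.pi)

pixels_per_cm = pixels_across / (fovea_diameter_mm / 10)
pixels_per_inch = pixels_per_cm * 2.54

print(f"~{pixels_across:,.0f} pixels across the fovea")
print(f"~{pixels_per_cm:,.0f} pixels per cm, ~{pixels_per_inch:,.0f} per inch")
# -> roughly 5,500 across, ~27,600 per cm, ~70,000 per inch
```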
Looking to the future, my personal guess is that in 10 years TVs will have the resolution that computers do now, in 20 years realistic computer resolution will be close to perfected, and in 25 years everyone will have it. I do not believe that the human eye, or in fact any part of the body, is evolving. If anything, compared to normal evolution, ours must be even slower, because it is not just the people with perfect vision who survive to pass on their genes; it is everyone.
(Apologies for the long, drawn-out answer.)
Okay, I'm no expert, but I did work at an optician's for four years, and I remember having this discussion with one of the optometrists. Her professional opinion was that the human eye is extremely inefficient as far as the lens and cornea are concerned. Compare the focussing ability of the human eye to a top-end camera and the camera will win hands down every time. The reason we see so well (and the reason we see the right way up!) is the brain: it fills in the gaps, cleans everything up and turns it all the right way round. Squirrel is also right that the fovea is the part of the retina that sees all the detail. Only the very central point of our visual image, that small few-centimetres' radius in the middle of our vision, is actually seen in any detail at all; the rest is peripheral and filled in by the brain.