This article is dedicated to a friend of mine, Mike.
There is a common misconception that our eyes can only interpret 30 Frames Per Second. This misconception dates back to the earliest motion photography, when a galloping horse was photographed frame by frame to settle exactly how its legs move mid-stride. Those early films evolved to run at 24 Frames Per Second, which has been the standard for close to a century.
There is a reason a movie theatre film running at 24 FPS (Frames Per Second) works. A theatre uses a projector, so each frame is shown on the large screen all at once. Because human eyes perceive motion blur, and because each frame of a movie is drawn all at once, motion blur smooths the gaps between those few frames, which results in a lifelike perceptual picture. I'll explain the Human Eye and how it works in detail later on in this multi-page article.
Ever since the first CRT TV was released, televisions have been running at roughly 30 Frames Per Second. TVs in homes today use the standard 60Hz (Hertz) refresh rate, and because the picture is interlaced, two refreshes make up one frame: 60/2 equals 30 Frames Per Second. A TV works by drawing each horizontal line of resolution piece by piece, using an electron gun to excite the phosphors on the TV screen. And because the frame rate is half the refresh rate, transitions between frames look a lot smoother. Without going into detail and making this a 30-page article on advanced physics, I think you'll understand those points.
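As a quick sketch of that arithmetic (my own illustration, assuming NTSC-style interlacing where two alternating fields make one complete frame):

```python
# Quick sketch of the refresh-rate arithmetic above, assuming NTSC-style
# interlacing where two alternating fields (odd lines, then even lines)
# make up one complete frame.

REFRESH_RATE_HZ = 60       # field refreshes per second drawn by the electron gun
FIELDS_PER_FRAME = 2       # interlaced: odd field + even field = one full frame

frames_per_second = REFRESH_RATE_HZ / FIELDS_PER_FRAME
print(f"{REFRESH_RATE_HZ} Hz interlaced -> {frames_per_second:.0f} frames per second")
```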
Moving on with frame rate: motion blur, again, is a very important part of making video look seamless. With motion blur, those two refreshes per frame give the impression of two frames to our eyes, which makes a really well-encoded DVD look absolutely incredible. Another factor to consider is that neither movies nor video dip in frame rate during complex scenes. With no frame rate drops, the action is, again, seamless.
Computer Games and how their industry drives Frames Per Second
The TV, movies, and the technology behind them are easy to understand. Computers are much more complex, and the most complex part of all is the actual physiology and neuro-ethology of the visual system. A computer monitor costs much more than a TV CRT (Cathode Ray Tube) of a similar size. That's because the phosphors and the dot pitch of computer monitors are much smaller and much closer together, making far greater detail and much higher resolutions possible. Your computer monitor also refreshes much more rapidly, and if you look at your monitor through your peripheral vision you can actually watch the lines being drawn on your screen. You can also observe this technology difference when watching TV footage that happens to have a computer monitor in the background.
A frame or scene on a computer is first set up by your video card in a frame buffer. The frame/image is then sent to the RAMDAC (Random Access Memory Digital-to-Analog Converter) for final display on your display device. Liquid Crystal Displays and FPD Plasma displays use a higher-quality, strictly digital representation, so the transfer of information, in this case a scene, is much quicker. After the scene has been sent to the monitor it is rendered and displayed in full. One thing is missing, however: the faster you do this, and the more frames you plan on sending to the screen per second, the better your hardware needs to be. Computer programmers and computer game developers, who have been working strictly with computers, can't reproduce motion blur in these scenes. Even though 30 frames are displayed per second, the scenes don't look as smooth as on a TV. Well, that is until we get to more than 30 FPS.
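To make that pipeline a little more concrete, here is a minimal sketch of the render loop in Python. The helpers render_scene and send_to_display are hypothetical stand-ins of my own, not a real graphics API; the point is only the flow the article describes: build the frame in a buffer, hand the finished frame to the display hardware, repeat.

```python
import time

# Minimal sketch of the frame buffer -> display pipeline described above.
# render_scene and send_to_display are hypothetical stand-ins, not calls
# into any real graphics API.

def render_scene(frame_number):
    """Pretend to draw one complete frame into the frame buffer."""
    return f"frame {frame_number}"

def send_to_display(frame):
    """Stand-in for the RAMDAC or digital link pushing the frame to the screen."""
    pass

frames_drawn = 0
start = time.perf_counter()
while time.perf_counter() - start < 1.0:      # run for about one second
    buffer = render_scene(frames_drawn)       # video card fills the frame buffer
    send_to_display(buffer)                   # finished frame goes out to the display
    frames_drawn += 1

# The faster the hardware completes each loop, the higher this number goes.
print(f"Achieved roughly {frames_drawn} frames per second")
```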
NVIDIA, a computer video card maker that recently purchased 3dfx, another computer video card maker, just finished a GPU (Graphics Processing Unit) for the XBOX from Microsoft. Increasing amounts of rendering capability and memory, as well as more transistors and instructions per second, equate to more frames per second in a computer video game, or on computer displays in general. There is no motion blur, so the transition from frame to frame is not as smooth as in movies, at least at 30 FPS. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 fps and the other half at 60 fps. The results? There is a definite difference between the two halves, with 60 fps looking much better and smoother than 30 fps.
Even if you could put motion blur into games, it would be a waste. The Human Eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, implemented motion blur would cause the game to behave erratically; the programming wouldn't be as precise. Take a game like Unreal Tournament: if motion blur were used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit anything with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned; that is, the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel and each object is exactly where it should be in the set space and time.
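Here is a rough sketch of that point about exact positions (my own illustration, not code from any real game engine): with discrete frames, a target occupies exactly one position per frame, so a hit test is just a comparison against that single coordinate.

```python
# Rough sketch (not from any real game engine): with discrete frames,
# an object has exactly one position per frame, so a hit test is a
# simple comparison against that position.

from dataclasses import dataclass

@dataclass
class Player:
    x: float
    y: float
    z: float

def shot_hits(target: Player, shot_x: float, shot_y: float, shot_z: float,
              radius: float = 1.0) -> bool:
    """A shot hits if it lands within `radius` of the target's exact
    position for this frame; there is no 'blur' of possible positions."""
    dx, dy, dz = target.x - shot_x, target.y - shot_y, target.z - shot_z
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= radius

# Each frame, the target occupies one exact XYZ coordinate:
target = Player(x=10.0, y=2.0, z=5.0)
print(shot_hits(target, 10.2, 2.1, 5.0))   # True: inside the hit radius
print(shot_hits(target, 14.0, 2.0, 5.0))   # False: clearly a miss
```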
The overwhelming solution to more realistic gameplay, or computer video in general, has been to push past the misconception that the human eye can only perceive 30 FPS. Pushing past 30 FPS to 60 FPS and even 120 FPS is possible; ask the video card manufacturers, an eye doctor, or a physiologist. We as humans CAN and DO see more than 60 frames a second.
With computer video cards and computer programming, the actual frame rate can vary from moment to moment. Microsoft came up with a great way to handle this when building one of their games (Flight Simulator): the ability to lock the frame rate.
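A frame-rate lock of that kind usually boils down to a time budget per frame. Here is a minimal sketch in Python (my own illustration, not Microsoft's actual code): render, then sleep away whatever remains of each frame's slot so the game never exceeds the locked rate.

```python
import time

# Minimal sketch of a frame-rate lock (illustrative only, not any game's
# real code): render as fast as the hardware allows, then wait out the
# remainder of each frame's time budget so the locked rate is never exceeded.

LOCKED_FPS = 30
FRAME_BUDGET = 1.0 / LOCKED_FPS   # seconds allotted to each frame

def render_frame(n):
    """Stand-in for the real rendering work."""
    pass

for n in range(3):                           # a few frames for illustration
    start = time.perf_counter()
    render_frame(n)
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)   # idle until the frame's slot is up
```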
The Human Eye and its real capabilities - ta-da!
This is where this article gets even longer, but read on, please. I will explain how the Human Eye can perceive well past the misconception of 30 FPS, well past 60 FPS, even surpassing 200 FPS.
We humans see light when it's focused onto the retina of the eye by the lens. Light rays are perceived by our eyes as light enters - well, at the speed of light. I must stress again that we live in an infinite world where information is continuously streamed to us. Our retinas interpret light with two types of specialized cells: the rods and the cones. Rods and cones are responsible for every aspect of receiving the light rays focused onto our retinas. In fact, rods and cones are the cells on the surface of the retina, and a loss of them is a leading cause of blindness.
Information such as intensity, color, and position (relative to the cell on the retina) is transmitted by our retinas to our optic nerves. The optic nerve in turn sends this data through its pipeline (at nerve impulse speed) on to the Visual Cortex portion of our Brains, where it is interpreted.
Rods are the simpler of the two cell types, as they really only interpret dim light. Since rods are light-intensity-specific cells, they respond very fast, and to this day they rival the response time of the fastest computer. A rod controls the amount of neurotransmitter it releases based on how much light is stimulating it at that precise moment. Microscopic examination of the retina has shown a much greater concentration of rods along its outer edges. One simple experiment taught to students studying the eye is to go out at night and look at the stars (preferably the Orion constellation) out of your peripheral vision (side view). Pick out a faint star from your periphery and then look at it directly. The star should disappear, and when you again turn and look at it from the periphery, it will pop back into view.
Cones are the second specialized retinal cell type, and these are much more complex. Cones on our retinas are the closest thing we have to the RGB inputs that computer monitors and graphics use. There are three kinds of cone, each absorbing a different band of wavelengths and releasing differing amounts of neurotransmitter depending on the wavelength and intensity of that light. Think of our cones as RGB computer equivalents: receptors tuned to the red, green, or blue parts of the wavelength spectrum. Depending on the intensity at each wavelength, each receptor releases varying levels of neurotransmitter on through the optic nerve, and for some colors, none at all. Because color vision involves three receptor types rather than the rods' one, the cones' response is more complex and slower than a rod's.
Our optic nerves are the visual information highway by which our lens and then our retina, with its specialized cells, transmit the visual data on to our Brain's Visual Cortex for interpretation. This all begins with a nerve impulse in the optic nerve, triggered by rhodopsin in the retina, a reaction that takes all of a picosecond to occur. A picosecond is one trillionth of a second, so theoretically we could calculate our eyes' "response time" and from it a theoretical frames per second (but I won't even go there now). Keep reading.
The optic nerves average 2 to 3 centimeters in length, so it's a short trip to reach our Visual Cortex. Like data on the internet, the data traveling in our optic nerves eventually reaches its destination, in this case the Visual Cortex - the processor/interpreter.
Unfortunately, neuroscience only goes so far in understanding exactly how our visual cortex, in such a small place, can produce such amazing images, unlike anything a computer can currently create. We only know so much, but scientists have theorized that the visual cortex acts as a sort of filter and blender, streaming the information into our consciousness. We're bound to learn, many years from now, just how much we've underestimated our own abilities as humans once again. Ontogeny recapitulates phylogeny (history repeats itself).
There are many examples that show how the Human Visual System operates differently than, say, an Eagle's. One of them involves a snowflake, but let me create a new one.
You're in an airplane, looking down at all the tiny cars and buildings. You are in a fast-moving object, but distance and speed place you above the objects below. Now, let's pretend that a plane going 100 times as fast quickly flies below you. It was a blur, wasn't it?
Regardless of an object's speed, at any instant it occupies a fixed position in space-time. If the plane that just flew by had been going only a little faster than you, you probably would have been able to see it clearly. Since your incredible auto-focus eyes had been concentrating on the ground before it flew below, your visual cortex decided that it was there but moving really fast, and not as important. A really fast camera with a really fast shutter speed would have been able to capture the plane in full detail. Not to limit our eyes' ability: we did see the plane, but we didn't isolate the frame; we streamed it relative to the last object we were looking at, the ground, moving slowly below.
Our eyes, technically, are the most advanced auto-focus system around; they make even cameras look weak. Using the same scenario with an Eagle in the passenger seat: the Eagle, its eyes relying on rods and the distance to its visual cortex being 1/16 of ours, wouldn't have seen as much blur in the plane. However, from what we understand of the Visual Cortex and of rods and cones, even Eagles see dizzying, blurry objects at times.
What is often called motion blur is really how our unique vision handles motion: as a stream, not frame by frame. If our eyes only saw frames (i.e., 30 images a second), like a single-lens reflex camera, we'd see images pop in and out of existence, which would be really annoying and far less advantageous to us in our three-dimensional space and bodies.
So how can you test how many Frames Per Second we as Humans can see?
My favorite test to mention to people is simply to look around their environment, then back at their TV or monitor. How much more detail do you see versus your monitor? You see depth, shading, a wider array of colors, and it's all streamed to you. Sure, we're smart enough to take a 24-frame movie and piece it together, and sure, we can make sense of video footage filmed in NTSC or PAL, but can you imagine the devices of the future?
You can also do the more technical and less imaginative tests above, including the star gazing and this TV/monitor test: a TV running at only 30 FPS films a computer monitor in the background, and in the 30 FPS TV output you can see the refreshes rolling across the computer monitor, which is running at 60 Hz or more. That same flicker is why low refresh rates lead to eyestrain with computer monitors; higher refresh rates do not.
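As a rough sketch of why those rolling bands appear (my own numbers, chosen purely for illustration, assuming a 30 FPS camera filming a 75 Hz monitor):

```python
# Rough aliasing sketch (illustrative numbers only): when a camera samples
# a CRT whose refresh rate is not an exact multiple of the camera's frame
# rate, the leftover fraction of a refresh per frame shows up as a slowly
# rolling dark band on the filmed monitor.

CAMERA_FPS = 30.0          # frames the camera captures per second
MONITOR_REFRESH_HZ = 75.0  # refreshes the monitor completes per second

refreshes_per_camera_frame = MONITOR_REFRESH_HZ / CAMERA_FPS
fractional_offset = refreshes_per_camera_frame % 1.0   # how far the band drifts per frame
band_cycle_seconds = (1.0 / fractional_offset) / CAMERA_FPS if fractional_offset else float("inf")

print(f"Each camera frame spans {refreshes_per_camera_frame:.2f} monitor refreshes")
print(f"The band completes a full roll roughly every {band_cycle_seconds:.2f} s")
```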
Don't underestimate your own eyes Buddy...
We as humans have a very advanced visual system. Please understand that a computer, with all its processor strength, still doesn't match our own brain, or the complexity of a single Deoxyribonucleic Acid strand. While some animals out there have sharper vision than us humans, something is usually given up for it: for eagles it is color, and for owls it is the ability to move the eye in its socket. With our outstanding human visual system, we can see billions of colors (and some testing suggests women perceive as much as 30% more colors than men do). Our eyes can indeed perceive well over 200 frames per second from a simple little display device (the number is mainly that low because of current hardware, not our own limits). Our eyes are also highly movable, able to focus as close as a few inches or as far as infinity, and able to change focus faster than the most complex and expensive high-speed auto-focus cameras. Our Human Visual system receives data constantly and decodes it nearly instantaneously. With our field of view being about 170 degrees, and our fine focus covering nearly 30 degrees, our eyes are still more advanced than even the most advanced visual technology in existence today.
So what is the answer to how many frames per second we should be looking for? If current science is any clue, it's somewhere in sync with full saturation of our Visual Cortex, just like in real life. That number, my friend, is way up there, given what we know about our eyes and brains.
It used to be that anything over 30 FPS was too much. (Is that why you're here, by chance?) :) Then, for a while, anything over 60 was sufficient. After even more new video cards, it became 72 FPS. Now new monitors and new display types, like organic LEDs and FPDs, promise to raise the bar even higher. Current LCD monitors' response times are approaching the microsecond barrier, far better than a millisecond, which translates into even more possible FPS.
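To see why response time matters, here is a quick arithmetic sketch (my own, with illustrative numbers; real panels differ): a display's pixel response time puts a rough ceiling on how many distinct frames it can show per second.

```python
# Quick arithmetic sketch: a display's pixel response time puts a rough
# ceiling on how many distinct frames it can show per second.
# Illustrative numbers only; real panels differ.

def max_fps_for_response_time(response_time_seconds: float) -> float:
    """Upper bound on distinct frames per second a panel could show."""
    return 1.0 / response_time_seconds

print(max_fps_for_response_time(16e-3))   # ~62 FPS for a 16 ms panel
print(max_fps_for_response_time(1e-3))    # 1000 FPS for a 1 ms panel
print(max_fps_for_response_time(1e-6))    # 1,000,000 FPS at the microsecond barrier
```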
If this old United States Air Force study is any clue, we've only scratched the surface, both in knowing our FPS limits and in coming up with hardware that can match, or even approach, them.
The USAF, in testing its pilots for visual response time, used a simple test to see whether the pilots could distinguish small changes in light. In the experiment, a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple, specific situation demonstrates not only the ability to perceive one image within 1/220th of a second, but the ability to interpret higher FPS.
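Working that number through (a simple back-of-the-envelope calculation of mine, not part of the study itself): if an image shown for 1/220th of a second can be seen and identified, the eye resolved information presented at an effective rate of at least 220 images per second.

```python
# Back-of-the-envelope arithmetic from the USAF example: an image flashed
# for 1/220th of a second that can still be identified implies the eye
# resolved information at an effective rate of at least 220 images/second.

flash_duration_seconds = 1.0 / 220.0
implied_rate = 1.0 / flash_duration_seconds
print(f"Flash lasted {flash_duration_seconds * 1000:.2f} ms "
      f"-> at least {implied_rate:.0f} images per second perceivable")
```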
This article was updated: 7/27/2002 due to its popularity and to reflect in more detail the science involved with our eyes and their ability to interpret more than 60 FPS.
To Mike (and everyone else), from Dustin D. Brand... Human Eye Frames Per Second - Part 2