Re: Friday Facts #264 - Texture streaming
Posted: Thu Oct 18, 2018 2:47 pm
This is something I don't understand at all given modern hardware.
Back in the day we had cathode ray tube monitors with an electron beam sweeping left to right very fast and top to bottom more slowly, so it passed over every single pixel on the monitor. Both motions were driven by magnetic fields at a fixed rate. Later, multi-sync monitors could change those rates, but switching takes a second or two, and then you stick with the selected rates for every frame. That means for each frame you must have the pixel data ready at the exact moment the electron beam passes over the pixel. You can't use it earlier or later.
Now fast forward to today. We have TFT monitors with a big chunk of memory where they store a full frame and drive the individual pixels from it internally to produce the image. They can even scale the image, e.g. from 640x480 up to the full HD resolution the panel actually has.
There is also no need for any kind of fixed refresh rate. It's not as if a pixel fades if it isn't refreshed at 60Hz.
So why don't we have graphics cards that send out screen refreshes to the monitor whenever we tell them to? Why not send frame 1 after 10ms, frame 2 after another 17ms, frame 3 after another 12ms, and so on? Why can't the monitor update when we tell it to, instead of at a fixed rate that we have to keep up with?
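To put some numbers on that mismatch: with a fixed 60Hz scanout, a frame that finishes rendering between refreshes has to wait for the next tick before it reaches the screen. Here's a toy calculation (not real driver code, just illustrative arithmetic using the frame times from the question):

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # ~16.67 ms between scanouts at a fixed 60 Hz

def next_scanout(ready_ms):
    """Time at which a frame finished at `ready_ms` actually reaches the screen."""
    return math.ceil(ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

# Frames finished after 10 ms, 10+17 = 27 ms, and 27+12 = 39 ms of rendering:
for ready in (10, 27, 39):
    shown = next_scanout(ready)
    print(f"frame ready at {ready:5.1f} ms -> displayed at {shown:6.2f} ms "
          f"(waited {shown - ready:5.2f} ms)")
```

Every frame sits in a buffer for several milliseconds doing nothing, which is exactly the latency a "refresh when I tell you" scheme would avoid.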