Video display notes
Many-element video display - TV and monitor notes (and a little film)
Backlit flat-panel displays
CCFL or LED backlight
https://nl.wikipedia.org/wiki/CCFL
Self-lit
OLED
OLEDs are organic LEDs, which in itself is partly just a practical production detail - they are really just LEDs. (...though you can get fancy in the production process, e.g. pricey see-through displays are often OLED with substrate trickery(verify))
While OLED is also a thing in lighting, OLED usually comes up in the context of OLED displays.
OLED displays are mainly contrasted with backlit displays.
OLEDs that are off just emit no light at all (compared to pixels that merely block a backlight, because it is hard to get those to block all of it).
So the blacks are blacker, and you can go brighter at the same time.
There are some other technical reasons why they tend to look a little crisper.
Viewing angles are also better, roughly because the light source is closer to the surface.
PMOLED versus AMOLED makes no difference to the light emission, just to the way we address them (Passive Matrix, Active Matrix).
AMOLED allows somewhat lower power, higher speed, and more options along that scale(verify), all of which makes it interesting for mobile uses. It also scales better to larger monitors.
POLED (and confusingly, pOLED is a trademark) uses a polymer instead of the glass, so it is less likely to break, but has other potential issues.
QLED
On image persistence / burn-in
VFD
- larger segments
- dot matrix VFD
Vacuum Fluorescent Displays are vacuum tubes applied in a specific way - see Lightbulb_notes#VFDs for more details.
Some theory - on reproduction
Reproduction that flashes
Film
Mechanical film projectors flash individual film frames while that film is being held entirely still, before advancing that film to the next (while no light is coming out) and repeating.
(see e.g. this and note that the film advance happens so quickly that you don't even see it move. Separately, if you slow playback you can also see that it flashes twice before it advances the film - we'll get to why)
This requires a shutter, i.e. not letting through any light a moderate part of the time (specifically while it's advancing the film). We are counting on our eyes to sort of ignore that.
One design concept very relevant to this type of reproduction is the flicker fusion threshold, the "frequency at which an intermittent light stimulus appears to be steady light" to our eyes - because separately from the actual image being shown, having it appear smooth is, you know, nice.
Research shows that this varies somewhat with conditions, but in most conditions practical for showing people images, it lies somewhere between 50Hz and 90Hz.
Since people are sensitive to flicker to varying degrees, and flicker can lead to eyestrain and headaches,
we aim towards the high end of that range whenever that is not hard to do.
In fact, we did so even with film. While film is 24fps and was initially shown with 24 flashes per second, movie projectors soon introduced two-blade and then three-blade shutters, showing each image two or three times before advancing. So while they still show only 24 distinct images per second, they flash each one twice or three times, for a regular 48Hz or 72Hz flicker. No more detail, but a bunch less eyestrain.
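A quick sketch of that arithmetic (the 24fps and the blade counts are the ones mentioned above):

 # Flash rate seen by the eye is frames per second times the number of
 # shutter blades, since each frame is flashed once per blade.
 def flicker_rate(fps, blades):
     return fps * blades

 for blades in (1, 2, 3):
     print(f"24 fps, {blades}-blade shutter: {flicker_rate(24, blades)} Hz flicker")
 # 24, 48, 72 Hz -- still 24 distinct images per second, but the higher two
 # are at or near the ~50-90Hz flicker fusion range, while 24Hz is well below it.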
As to what is actually being shown, an arguably even more basic constraint is the rate of new images that we accept as fluid movement.
- Anything under 10fps looks jerky and stilted - or at least like a deliberate choice.
- western and eastern animation was rarely higher than 12fps, or 8 or 6 for the simpler/cheaper productions
- around 20fps we start readily accepting it as continuous movement,
- above 30 or 40fps it looks smooth,
- and above that it keeps on looking a little better yet, with quickly diminishing returns
So why 24?
Film's 24fps was not universal at the time, and has no strong significance then or now. When a standard was needed, 24 was a chosen balance between various aspects: it is enough for fluid movement and relatively few scenes need more, film stock is expensive, and projection needed a single standard (adaptable or even multiple projectors would be too expensive for most cinemas).
The reason we still use 24fps today is more faceted, and doesn't really have a one-sentence answer.
But part of it is that making movies go faster is not always well received.
It seems we associate 24fps with the feel of movies, while 50/60fps feels like shaky-cam home movies made by dad's camcorder (when those were still a thing) or sports broadcasts (which went for higher rates even when that reduced detail), with their tense, immediate, real-world associations. So higher, while technically better, is also associated with a specific aesthetic. It may work well for action movies, yet less so for others.
There is an argument that 24fps's sluggishness puts us more at ease, reminds us that it isn't real, and seems associated with storytelling, a dreamlike state, memory recall.
Even if we can't put our finger on why, such senses persist.
CRT screens
CRT monitors do something vaguely similar to movie projectors, in that they light up an image so-many times a second.
Where with film you light up the entire frame at once (minus some time for the shutter coming in and out, but ignore that for now),
a CRT lights up one spot at a time - there is a beam constantly being dragged line by line across the screen -- look at slow motion footage like this.
Except what they light up is different. A film projector is just bouncing light off something white.
A CRT is pushing energy into phosphorescence - lighting up a pixel's worth of phosphor at a time. That phosphor has a softish onset, and retains light for a little while.
...but still mostly fallen off a millisecond or two later(verify), so they're definitely pulsing.
The largest reason these pulsing phosphors don't look like harsh blinking is our persistence of vision which, combined with the fact that the image is relatively bright, makes it end up looking fairly constant. (You could say our eyes' framerate sucks, though 'framerate' is a poor name for our eyes' actual mechanics.)
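To put rough numbers on "mostly fallen off a millisecond or two later", here is a toy model - the exponential shape and the 0.5ms time constant are illustrative assumptions, not measured values; real phosphors vary a lot and don't decay purely exponentially:

 import math

 TAU_MS = 0.5  # made-up illustrative decay constant

 def relative_brightness(t_ms):
     # brightness of one spot, t_ms after the beam has passed it
     return math.exp(-t_ms / TAU_MS)

 for t in (0.0, 0.5, 1.0, 2.0):
     print(f"{t:.1f} ms after the beam passes: {relative_brightness(t):.1%} of peak")
 # With ~16.7 ms between refreshes at 60Hz, each spot spends most of the
 # frame dark -- hence 'definitely pulsing'.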
While TVs were fixed to 50Hz or 60Hz, primarily because they had to deal with one specific broadcast standard,
most CRT monitors can be told to refresh at different rates.
There's a classic 60Hz mode that was easy to support, but people often preferred 72Hz or 75Hz or 85Hz or higher modes, primarily because they reduced eyestrain.
And yes, after working behind one of those faster-rate monitors, moving to a 60Hz monitor would be really noticeable,
because even when we accept it as smooth enough, we still perceive it as blinking.
To recap, in TVs and CRT monitors, we light up a line at a time (the line nature is not the only way to use a CRT, just the easiest way to fill the entire screen. For alternatives, see e.g. vector monitors, and CRT oscilloscopes), in fact a pixel at a time. Which happens so fast -- necessarily so -- that you would need a very fast camera to notice this. Take a look at [1].
This means that there needs to be something that controls the left-to-right and top-to-bottom steering[2] - and because you're really just bending a beam back and forth, there are times at which it would trace a visible line (mostly horizontal-ish between lines, one diagonal between frames).
We solve that by simply not emitting electrons at those times, and call those the blanking intervals.
That's a lot of intricately timed things, and this is done within the set. In computers, there are hsync and vsync pulses, which if I understand correctly are not so much control signals as... suggestions, interpreted by the monitor as "does that look like the timing of a mode I know? Okay, then I'll do the rest".(verify)
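To get a feel for how much of the time those blanking intervals take up, here's a sketch using the classic VESA 640x480@60 timings (these specific numbers come from that mode's definition, not from anything above):

 pixel_clock_hz = 25_175_000
 h_active, h_front, h_sync, h_back = 640, 16, 96, 48   # pixel times per line
 v_active, v_front, v_sync, v_back = 480, 10, 2, 33    # lines per frame

 h_total = h_active + h_front + h_sync + h_back        # 800 pixel times per line
 v_total = v_active + v_front + v_sync + v_back        # 525 lines per frame

 line_rate_hz = pixel_clock_hz / h_total               # ~31.5 kHz horizontal sweep
 refresh_hz = line_rate_hz / v_total                   # ~59.94 Hz vertical refresh

 blanked = 1 - (h_active * v_active) / (h_total * v_total)
 print(f"line rate {line_rate_hz / 1e3:.2f} kHz, refresh {refresh_hz:.2f} Hz")
 print(f"about {blanked:.0%} of each frame's time is blanking (beam on its way back)")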
CRTs were driven relatively directly from the graphics card, in the sense that the value we're beaming onto a pixel is whatever value is on the wire at that very moment.
Later monitors were not tied to that mechanism,
but there was still not that much reason to deviate.
You could add buffers, but why? It would cost more, be more complex, and probably a little slower.
How are CRT monitors different from CRT TVs?
In TVs, redraw speeds were basically set in stone, as were some decoding details.
The timing of each part was still synchronized from the received broadcast signal, but the speed was basically fixed, as that made things easier.
On color TV there were some extra details, but a good deal worked the same way.
Early game consoles/computers just generated a TV signal, so that you could use the TV you already had - which was slightly awkward to do, but a lot cheaper than something dedicated.
After that, CRT monitors started out as adapted CRT TVs,
and it didn't take long at all before the speed at which things were drawn became configurable.
By the nineties it wasn't too unusual to drive a CRT monitor at 56, 60, 72, 75, perhaps 85, and sometimes 90(verify), 100, or 120Hz.
We also grew an increasing number of resolutions that the monitor should be capable of displaying - or rather, resolution-refresh combinations. Detecting and dealing with that is a topic in and of itself.
Yet at the CRT level, they were driven much the same way -
synchronization timing to let the monitor know when and how fast to sweep the beams around,
and a stream of pixels passed through as they arrive on the wires.
So a bunch of the TV mechanism lived on into CRT monitors - and even into the flatscreen era.
Flatscreens
For context: in film, and in CRTs, the mechanism that lights up the screen is the same mechanism as the one that shows you the image.
For LCD/TFT-style flatscreens, the image updates and the lighting are now different mechanisms.
Basically, there is one constant big white area behind all the pixels, and each pixel can block that light to different degrees - or rather, each pixel's red-filtered, green-filtered, and blue-filtered subpixels can each block it to different degrees (and those subpixels stay like that until asked to change later).
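As a toy model of that arrangement - ignoring gamma, the panel's response curve, and the fact that 'fully closed' still leaks a bit of light:

 # One LCD pixel: a constant white backlight behind three colored filters,
 # each with its own adjustable 'shutter' (the liquid crystal cell).
 def pixel_output(r, g, b, backlight=1.0):
     # each subpixel passes a fraction of the backlight through its filter
     return (backlight * r / 255,   # red-filtered subpixel
             backlight * g / 255,   # green-filtered subpixel
             backlight * b / 255)   # blue-filtered subpixel

 print(pixel_output(255, 255, 255))  # white: all subpixels fully open
 print(pixel_output(0, 0, 0))        # 'black': all closed (real panels still leak a little)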
If you take a high speed camera, you may still not see it flicker - see this part of the same slow motion video - because the pixels sort of slide between values, and the backlight is often relatively constant (it may or may not pulse somewhat).
Say, a CRT at 30Hz would look very blinky and probably strain most eyes, while a flatscreen receiving 30fps updates will look choppier, but not blinkier or more eyestraining.
So the difference between, say, a 60fps and 240fps monitor isn't in the lighting, it's how fast the light-blocking pixels in front of that constant backlight change.
On a 60fps monitor you can change its pixels every 16ms (1/60 sec), on a 240fps monitor every 4ms (1/240 sec). The light just stays on.
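The same arithmetic for a few common refresh rates:

 # The only thing that changes with refresh rate is how often the pixels are
 # told to take new values; the backlight just stays on.
 for hz in (30, 60, 144, 240):
     print(f"{hz:>3} Hz -> new pixel values every {1000 / hz:.1f} ms")
 # 33.3, 16.7, 6.9, 4.2 ms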
Footnotes to LCD backlight
How blinky it is instead depends on the backlight. That global backlight tends to be lit fairly continuously, and some monitors make more of a point of the way they do that.
CCFL backlights seem to be intentionally made with slowly-decaying phosphors.
LED backlights are often PWM'd at kHz speeds(verify), or current-limited(verify), which are both potentially much smoother.
Worst case, it's a PWM slow enough that you do perceive it - and note that dimming a PWM'd backlight will effectively change the flicker a little. At high enough speeds this should not matter perceptibly, though.
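A sketch of why the PWM speed matters, especially when dimmed - the PWM frequencies below are illustrative guesses, not values from any particular monitor:

 # Brightness is set by the duty cycle; the flicker frequency is the PWM
 # frequency itself. Dimming lengthens the dark part of each cycle.
 def off_time_ms(pwm_hz, duty):
     # how long the backlight is dark within each PWM cycle
     return (1 - duty) * 1000 / pwm_hz

 for pwm_hz in (240, 1000, 20000):          # slow, moderate, fast PWM
     for duty in (1.0, 0.5, 0.2):           # full, half, dim brightness
         print(f"{pwm_hz:>5} Hz PWM at {duty:.0%} brightness: "
               f"dark for {off_time_ms(pwm_hz, duty):.3f} ms per cycle")
 # Slow PWM at low brightness leaves the light off for milliseconds at a
 # time, which is exactly the case flicker-sensitive people notice.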
There are a few more reasons why both can flicker a little more than that suggests, but only mildly so.
People who are more sensitive to eyestrain and related headaches will want to know the details of the backlight, because the slowest LED PWM will still annoy them; you're looking for faster PWM or current-limited dimming.
...but it's often not specced very well.
On sending and updating pixels