Video display notes



Many-element video display - TV and monitor notes (and a little film)

Backlit flat-panel displays

CCFL or LED backlight

https://nl.wikipedia.org/wiki/CCFL

Self-lit

OLED

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

OLEDs are organic LEDs, which is in itself partly just a practical production detail - they are really just LEDs. (...though you can get fancy in the production process, e.g. pricy see-through displays are often OLED with substrate trickery(verify))

While OLED is also a thing in lighting, OLED usually comes up in the context of OLED displays.


OLED displays are mainly contrasted with backlit displays. An OLED pixel that is off emits no light at all, whereas a backlit pixel has to block the backlight, and it is hard to get pixels to block all of it. So the blacks are blacker, and you can go brighter at the same time. There are some other technical details why they tend to look a little crisper.

Viewing angles are also better, roughly because the light source is closer to the surface.


PMOLED versus AMOLED makes no difference to the light emission, just to the way we access them (Passive Matrix, Active Matrix).

AMOLED allows somewhat lower power, higher speed, and more options along that scale(verify), all of which makes it interesting for mobile uses. It also scales better to larger monitors.

POLED (and, confusingly, pOLED is a trademark) uses a polymer instead of the glass, so it is less likely to break, but it has other potential issues.


QLED

On image persistence / burn-in

VFD

Vacuum Fluorescent Displays are vacuum tubes applied in a specific way - see Lightbulb_notes#VFDs for more details.



Some theory - on reproduction

Reproduction that flashes

Film
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

Mechanical film projectors flash individual film frames while the film is held entirely still, before advancing it to the next frame (while no light is coming out) and repeating.

(see e.g. this slow-motion footage, and note that the film advances so quickly that you don't even see it move. Separately, if you slow playback you can also see that the projector flashes twice before it advances the film - we'll get to why.)

This requires a shutter - not letting through any light for a moderate part of the time (specifically while the film is advancing). We are counting on our eyes to mostly ignore that.


One significant design concept very relevant to this type of reproduction is the flicker fusion threshold, the "frequency at which an intermittent light stimulus appears to be steady light" to our eyes. Separately from the actual image being shown, appearing smooth rather than flickering is, you know, nice.

Research shows that this varies somewhat with conditions, but in most conditions practical for showing people images, it is somewhere between 50Hz and 90Hz.


Since people are sensitive to flicker to varying degrees, and flicker can lead to eyestrain and headaches, we aim towards the high end of that range whenever that is not hard to do.

In fact, we did so even with film. While film is 24fps and was initially shown with 24Hz flashes, movie projectors soon introduced two-blade and then three-blade shutters, showing each image two or three times before advancing. While they still only show 24 distinct images per second, they flash each image twice or three times for a regular 48Hz or 72Hz flicker. No more detail, but a bunch less eyestrain.
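The multi-blade shutter arithmetic above is simple enough to sketch - the flicker rate is just the frame rate times the number of blade interruptions per frame (a minimal illustration of the numbers in the text, nothing more):

```python
# Effective flicker rate of a film projector: the film still advances at
# 24 frames per second, but a multi-blade shutter interrupts the light
# once per blade for every frame shown.

FILM_FPS = 24

for blades in (1, 2, 3):
    flicker_hz = FILM_FPS * blades
    print(f"{blades}-blade shutter: {FILM_FPS} distinct images/s, {flicker_hz} Hz flicker")
```

With two or three blades the flicker lands at 48Hz or 72Hz, comfortably inside the 50-90Hz fusion range mentioned above, without needing any more film stock.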


As to what is actually being shown, an arguably even more basic constraint is the rate of new images that we accept as fluid movement.

Anything under 10fps looks jerky and stilted
or at least like a choice.
western and eastern animations were rarely higher than 12, or 8 or 6 for the simpler/cheaper ones
around 20fps we start readily accepting it as continuous movement,
above 30 or 40fps it looks smooth,
and above that it keeps on looking a little better yet, with quickly diminishing returns



So why 24?

Film's 24 was not universal at the time, and has no strong significance then or now. It's just that when a standard was needed, the number 24 was chosen as a balance between various aspects: it's enough for fluid movement and relatively few scenes need higher, film stock is expensive, and projection needed a single standard (adaptable or even multiple projectors would be too expensive for most cinemas).


The reason we still use 24fps today is more multi-faceted, and doesn't really have a one-sentence answer.

But part of it is that making movies go faster is not always well received.

It seems that we came to associate 24fps with the feel of movies, while 50/60fps feels like shaky-cam home movies made with dad's camcorder (when those were still a thing), or like sports broadcasts (shot at higher rates even though it reduced detail), with their tense, immediate, real-world associations. So higher, while technically better, was also associated with a specific aesthetic. It may work well for action movies, yet less for others.

There is an argument that 24fps's sluggishness puts us more at ease, reminds us that it isn't real, seems associated with storytelling, a dreamlike state, memory recall.

Even if we can't put our finger on why, such senses persist.


CRT screens
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

CRT monitors do something vaguely similar to movie projectors, in that they light up an image so-many times a second.


Where with film you light up the entire frame at once (maybe with some time for the shutter coming in and out - ignore that for now), a CRT lights up one spot at a time - there is a beam constantly being dragged line by line across the screen -- look at slow-motion footage like this.


Except what they light up is different. A film projector is just bouncing light off something white.

A CRT is pushing energy into phosphorescence - lighting up a pixel's worth of phosphor at a time. That phosphor has a softish onset, and retains light for a little while.

...but it has still mostly fallen off a millisecond or two later(verify), so they're definitely pulsing.

The largest reason these pulsing phosphors don't look like harsh blinking is persistence of vision: combined with the fact that the display is relatively bright, they end up looking fairly constant. (You could say our eyes' framerate sucks, though "framerate" is actually a poor name for our eyes' actual mechanics.)


While TVs were fixed to 50Hz or 60Hz, primarily because they had to deal with one specific broadcast standard, most CRT monitors can be told to refresh at different rates.

There's a classic 60Hz mode that was easy to support, but people often preferred 72Hz or 75Hz or 85Hz or higher modes, primarily because they reduced eyestrain.


And yes, after working behind one of those faster-rate monitors, moving to a 60Hz monitor would be really noticeable. Even when we accept it as smooth enough, we still perceive it as blinking.


To recap, in TVs and CRT monitors we light up a line at a time (the line-by-line scan is not the only way to use a CRT, just the easiest way to fill the entire screen - for alternatives, see e.g. vector monitors and CRT oscilloscopes), in fact a pixel at a time. This happens so fast -- necessarily so -- that you would need a very fast camera to notice it. Take a look at [1].


This means that there needs to be something that controls the left-to-right and top-to-bottom steering[2] - and because you're really just bending a stream back and forth, there are times at which the retrace would be a visible line (mostly-horizontal between lines, one diagonal between frames).

We solve that by simply not emitting electrons just then, and call those periods the blanking intervals.

That's a lot of intricately timed things, and this is done within the set. In computers, there are hsync and vsync pulses, which if I understand correctly are not so much control signals as... suggestions, interpreted by the monitor as "does that look like the timing of a mode I know? Okay, then I'll do the rest".(verify)
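As a sketch of how those timings fit together, here is the widely documented 640x480@60 VGA mode. The signal a monitor receives is larger than the visible image, because every line and every frame includes blanking (sync pulse plus "porches" on either side) in which the beam retraces:

```python
# CRT/VGA mode timing arithmetic, using the classic 640x480@60 mode
# (standard published values: 25.175 MHz pixel clock, 800 total pixel
# clocks per line, 525 total lines per frame).

pixel_clock_hz = 25_175_000

h_active, h_front, h_sync, h_back = 640, 16, 96, 48   # pixel clocks per line
v_active, v_front, v_sync, v_back = 480, 10, 2, 33    # lines per frame

h_total = h_active + h_front + h_sync + h_back        # 800 - includes blanking
v_total = v_active + v_front + v_sync + v_back        # 525 - includes blanking

line_rate_hz = pixel_clock_hz / h_total               # ~31.47 kHz (hsync rate)
refresh_hz = pixel_clock_hz / (h_total * v_total)     # ~59.94 Hz (vsync rate)

print(f"totals: {h_total} x {v_total}")
print(f"line rate: {line_rate_hz / 1000:.2f} kHz, refresh: {refresh_hz:.2f} Hz")
```

Note that the "60Hz" mode actually refreshes at roughly 59.94Hz - the refresh rate simply falls out of the pixel clock divided by the total (visible plus blanking) pixel count.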


CRTs were driven relatively directly from the graphics card, in the sense that the value of the pixel we're currently beaming onto the screen is whatever value is on the wire at that very moment.


Later monitors were not tied to that mechanism, but there was still not that much reason to deviate.

You could add buffers, but why? It would cost more, be more complex, and probably a little slower.



How are CRT monitors different from CRT TVs?

In TVs, redraw speeds were basically set in stone, as were some decoding details.

When each part happened was still synchronized from the received broadcast signal, but the speed was basically fixed, as that made things easier.

On color TV there were some extra details, but a good deal worked the same way.

Early game consoles/computers just generated a TV signal, so that you could use the TV you already had - slightly awkward to do, but a lot cheaper than something dedicated.


After that, CRT monitors started out as adapted CRT TVs, and it didn't take long at all before the speed at which things were drawn became configurable. By the nineties it wasn't too unusual to drive a CRT monitor at 56, 60, 72, 75, perhaps 85, and sometimes 90(verify), 100, or 120Hz.

We also grew an increasing number of resolutions that the monitor should be capable of displaying. Or rather, resolution-refresh combinations. Detecting and dealing with that is a topic in and of itself.


Yet at the CRT level, they were driven much the same way - synchronization timing to let the monitor know when and how fast to sweep the beams around, and a stream of pixels passed through as they arrive on the wires.

So a bunch of the TV mechanism lived on into CRT monitors - and even into the flatscreen era.

Flatscreens

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

For context: in film and in CRTs, the mechanism that lights up the screen is the same mechanism as the one that shows you the image. And, as a result, blinky.


For LCD-style flatscreens, the image updates and the lighting are now different mechanisms.

Basically, there is one constant big white area behind all the pixels, and each pixel blocks light.


If you take a high-speed camera, you may still not see it flicker - see this part of the same slow-motion video - because the pixels sort of slide to their values, and the backlight is often relatively constant (it may or may not pulse somewhat).


So the difference between, say, a 60fps and a 240fps monitor isn't in the lighting, it's how fast the light-blocking pixels in front of that constant backlight change. A 60fps monitor can change its pixels every 16ms (1/60 sec); a 240fps monitor can change them every 4ms (1/240 sec). The light just stays on.
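Those frame intervals are just the reciprocal of the refresh rate - a quick illustration of where the 16ms and 4ms figures come from:

```python
# Frame interval at various refresh rates: the time available per pixel
# update, i.e. how long each drawn image stays before the next can appear.

for hz in (30, 60, 120, 240):
    interval_ms = 1000 / hz
    print(f"{hz:>3} Hz -> {interval_ms:.2f} ms per frame")
```

The 60Hz and 240Hz rows work out to roughly 16.7ms and 4.2ms, matching the rounded figures above.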

As such, while a CRT at 30Hz would look very blinky and be hard on the eyes, a flatscreen receiving 30fps updates will look choppier, but not blinkier or harder on the eyes.



On sending and updating pixels
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
Screen tearing and vsync
On pixel response time and blur

Vsync

Variable refresh rate

arguments for 60fps / 60Hz in gaming

Reaction time

On end-to-end latency

Tracking objects?

On unintentional motion blur

On intentional motion blur

On contrast ratio / dynamic range

On perceiving

The framerate of our eyes

On human reaction time

On resolution

see also

Visuals_DIY#Analog_video_notes