Display types

Just a few elements

Lighting

Nixie tubes


Dekatron

Eggcrate display

Mechanical

Mechanical counter

https://en.wikipedia.org/wiki/Mechanical_counter


Split-flap

https://en.wikipedia.org/wiki/Split-flap_display


Vane display

Flip-disc

https://en.wikipedia.org/wiki/Flip-disc_display


Other flipping types

LED segments

7-segment and others

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
7-segment, 9-segment, 14-segment, and 16-segment displays. If meant for numbers there will be a dot next to each digit (dots are common in general); if meant for time there will be a colon in one position.


These are really just separate lights that happen to be arranged in a useful shape.

Very typically LEDs (with a common cathode or anode), though similar ideas are sometimes implemented in other display types - notably the electromechanical one, and also sometimes VFD.


Even the simplest, the 7-segment LED, involves a bunch of connections, so they are:

  • often driven multiplexed, so that only one digit is on at a time (see the sketch below)
  • often driven via a controller that handles that multiplexing for you
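
A minimal sketch of that multiplexing, Arduino-style. The pin numbers and the common-cathode wiring are made up for illustration, and a real circuit also wants current-limiting resistors (and, for more current, driver transistors):

  // Multiplexed 4-digit 7-segment display: light one digit at a time,
  // fast enough that persistence of vision shows all four at once.
  #include <Arduino.h>

  const uint8_t SEG_PINS[7]   = {2, 3, 4, 5, 6, 7, 8};  // segments a..g (assumed wiring)
  const uint8_t DIGIT_PINS[4] = {9, 10, 11, 12};        // common cathode per digit (assumed)

  // Segment patterns for digits 0-9; bit 0 = segment a ... bit 6 = segment g
  const uint8_t FONT[10] = {
    0b0111111, 0b0000110, 0b1011011, 0b1001111, 0b1100110,
    0b1101101, 0b1111101, 0b0000111, 0b1111111, 0b1101111
  };

  uint8_t digits[4] = {1, 2, 3, 4};  // what to show

  void setup() {
    for (uint8_t p : SEG_PINS)   pinMode(p, OUTPUT);
    for (uint8_t p : DIGIT_PINS) { pinMode(p, OUTPUT); digitalWrite(p, HIGH); }
  }

  void loop() {
    for (uint8_t d = 0; d < 4; d++) {
      for (uint8_t s = 0; s < 7; s++)
        digitalWrite(SEG_PINS[s], (FONT[digits[d]] >> s) & 1);
      digitalWrite(DIGIT_PINS[d], LOW);   // sink the common cathode: digit on
      delay(2);                           // 4 digits x 2ms = ~125Hz refresh
      digitalWrite(DIGIT_PINS[d], HIGH);  // digit off before changing segments
    }
  }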


Seven segments are the minimal and classical case, good enough to display numbers and thus e.g. times, but not really for characters.

More-than-7-segment displays are preferred for that.


https://en.wikipedia.org/wiki/Seven-segment_display

DIY

LCD character dislays

Character displays are basically those with predefined (and occasionally rewritable) fonts.


Classical interface

The more barebones interface is often a 16 pin line with a pinout like

  • Ground
  • Vcc
  • Contrast
usually there's a (trim)pot from Vcc, or a resistor if it's fixed


  • RS: Register Select (character or instruction)
in instruction mode, it receives commands like 'clear display', 'move cursor',
in character mode, it receives the characters to show at the current cursor position
  • RW: Read/Write
tied to ground is write, which is usually the only thing you do
  • ENable / clk (for writing)
  • 8 data lines, but you can do most things over 4 of them


  • backlight Vcc
  • Backlight gnd


The minimal, write-only setup is:

  • tie RW to ground
  • connect RS, EN, D7, D6, D5, and D4 to digital outs
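
With that wiring, Arduino's stock LiquidCrystal library drives it in 4-bit mode. A minimal sketch - the pin numbers are just an example:

  // 16x2 HD44780-style character LCD, 4-bit, write-only (RW tied to ground).
  #include <Arduino.h>
  #include <LiquidCrystal.h>

  const int RS = 12, EN = 11, D4 = 5, D5 = 4, D6 = 3, D7 = 2;
  LiquidCrystal lcd(RS, EN, D4, D5, D6, D7);

  void setup() {
    lcd.begin(16, 2);           // columns, rows
    lcd.print("hello, world");
  }

  void loop() {
    lcd.setCursor(0, 1);        // column 0, second row
    lcd.print(millis() / 1000); // seconds since reset
  }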


I2C and other

Matrix displays

(near-)monochrome

SSD1306

OLED, 128x64@4 colors(verify)

https://cdn-shop.adafruit.com/datasheets/SSD1306.pdf
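
For a sense of what talking to one looks like, a minimal sketch using the common Adafruit_SSD1306 Arduino library - the 128x64 size and the 0x3C I2C address are assumptions, boards vary:

  #include <Wire.h>
  #include <Adafruit_GFX.h>
  #include <Adafruit_SSD1306.h>

  Adafruit_SSD1306 display(128, 64, &Wire, -1);  // width, height, I2C bus, no reset pin

  void setup() {
    display.begin(SSD1306_SWITCHCAPVCC, 0x3C);   // generate panel voltage internally
    display.clearDisplay();
    display.setTextSize(1);
    display.setTextColor(SSD1306_WHITE);
    display.setCursor(0, 0);
    display.println("SSD1306 up");
    display.display();   // push the MCU-side framebuffer to the panel
  }

  void loop() {}

Note the explicit display() call: the library keeps a framebuffer on the microcontroller and sends the whole thing over I2C when you ask.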

SH1107

OLED,

https://datasheetspdf.com/pdf-file/1481276/SINOWEALTH/SH1107/1

Small LCD/TFTs / OLEDs

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

Small as in on the order of an inch or two (because the controllers are designed for a limited resolution?(verify)).


💤 Note that, like with monitors, marketers really don't mind if you confuse LED-backlit LCD with OLED,

and some of the eBay and AliExpress sellers of the world will happily 'accidentally' call any small screen OLED if it means they sell more.

This is further made more confusing by the fact that there are

  • few-color OLEDs (2 to 8 colors or so, great for high contrast but only high contrast),
  • high color OLEDs (65K),

...so you sometimes need to dig into the tech specs to see the difference between high color LCD and high color OLED.



When all pixels are off they give zero light pollution (unlike most LCDs) which might be nice in the dark. These seem to appear in smaller sizes than small LCDs, so are great as compact indicators.


Can it do video or not?

If it speaks e.g. MIPI it's basically just a monitor, probably capable of decent-speed updates - but then the things you can connect it to will (on the scale of microcontroller to mini-PC) be the moderately powerful ones, e.g. a Raspberry Pi.

The controllers listed below, however, don't connect to PC video cables.

Still, they have their own controller, and can hold their pixel state one way or the other, but you connect via something more command-like - so you can update a moderate amount of pixels via an interface that is much less speedy or complex.

You might get reasonable results over SPI / I2C for a lot of e.g. basic interfaces and gauges. By the time you try to display video you have to think about your design more.

In large part because the amount of pixels to update, times the rate of frames per second, has to fit through the communication channel (...and also through the display's own capabilities). There is a semi-standard parallel interface that might make video-speed things feasible; it is faster than the SPI/I2C option, though not always by that much, depending on hardware details.


Even if the specs of the screen can do it in theory, you also have to have the video ready to send. If you're running it from an RP2040 or ESP32, don't expect to run libav/ffmpeg.

Say, something like the TinyTV runs a 216x135 65Kcolor display from an RP2040.

Also note that such hardware won't be decoding and rescaling arbitrary video files; they will use specifically pre-converted video.


In your choices, also consider libraries. Things like TFT_eSPI have a compatibility list you will care about.
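
For example, a minimal TFT_eSPI sketch looks something like the following. TFT_eSPI takes the panel type and pin mapping from its User_Setup.h configuration file, so none of that appears in the sketch itself:

  #include <TFT_eSPI.h>   // panel/pins are configured in the library's User_Setup.h

  TFT_eSPI tft = TFT_eSPI();

  void setup() {
    tft.init();
    tft.setRotation(1);                       // landscape
    tft.fillScreen(TFT_BLACK);
    tft.setTextColor(TFT_WHITE, TFT_BLACK);   // foreground, background
    tft.drawString("gauge demo", 10, 10, 2);  // x, y, built-in font 2
  }

  void loop() {
    tft.drawNumber(millis() / 1000, 10, 40, 4);  // seconds since boot, font 4
    delay(100);
  }

Since a background color is set, redrawn digits overwrite the old ones - which is the kind of partial-update thinking these command-style interfaces reward.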



Interfaces

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.


ST7735

LCD, 132x162@16bits RGB


ST7789

LCD, 240x320@16bits RGB

https://www.waveshare.com/w/upload/a/ae/ST7789_Datasheet.pdf

SSD1331

OLED, 96x 64, 16bits RGB

https://cdn-shop.adafruit.com/datasheets/SSD1331_1.2.pdf


SSD1309

OLED, 128 x 64, single color?

https://www.hpinfotech.ro/SSD1309.pdf

SSD1351

OLED, 65K color

https://newhavendisplay.com/content/app_notes/SSD1351.pdf

HX8352C

LCD https://www.ramtex.dk/display-controller-driver/rgb/hx8352.htm


HX8357C

R61581

ILI9163

LCD, 162x132@16-bit RGB

http://www.hpinfotech.ro/ILI9163.pdf

ILI9341

https://cdn-shop.adafruit.com/datasheets/ILI9341.pdf

ILI9486

LCD, 480x320@16-bit RGB

https://www.hpinfotech.ro/ILI9486.pdf

ILI9488

LCD

https://www.hpinfotech.ro/ILI9488.pdf

PCF8833

LCD, 132×132 16-bit RGB

https://www.olimex.com/Products/Modules/LCD/MOD-LCD6610/resources/PCF8833.pdf

SEPS225

OLED

https://vfdclock.jimdofree.com/app/download/7279155568/SEPS225.pdf


RM68140

LCD

https://www.melt.com.ru/docs/RM68140_datasheet_V0.3_20120605.pdf

GC9A01

LCD, 65K colors, SPI

These seem to often be used on round displays(verify)

https://www.buydisplay.com/download/ic/GC9A01A.pdf

Epaper

SSD1619

https://cursedhardware.github.io/epd-driver-ic/SSD1619A.pdf


Many-element - TV and monitor notes (and a little film)

Backlit flat-panel displays

CCFL or LED backlight

https://nl.wikipedia.org/wiki/CCFL

Self-lit

OLED

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

OLEDs are organic LEDs, which is in itself partly just a practical production detail - they are really just LEDs. (...though you can get fancy in the production process, e.g. pricy see-through displays are often OLED with substrate trickery(verify))

While OLED is also a thing in lighting, OLED usually comes up in the context of OLED displays.


OLED displays are mainly contrasted with backlit displays. OLED pixels that are off emit no light at all (whereas pixels that merely block a backlight are hard to get to block all of it), so the blacks are blacker, and you can go brighter at the same time. There are some other technical details why they tend to look a little crisper.

Viewing angles are also better, roughly because the light source is closer to the surface.


PMOLED versus AMOLED makes no difference to the light emission, just to the way we access them (Passive Matrix, Active Matrix).

AMOLED can do somewhat lower power, higher speed, and more options along that scale(verify), all of which makes it interesting for mobile uses. It also scales better to larger monitors.

POLED (and, confusingly, pOLED is a trademark) uses a polymer instead of the glass, so it is less likely to break, but has other potential issues.


QLED

On image persistence / burn-in

VFD

Vacuum Fluorescent Displays are vacuum tubes applied in a specific way - see Lightbulb_notes#VFDs for more details.



Some theory - on reproduction

Reproduction that flashes

Film
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

Mechanical film projectors flash individual film frames while that film is being held entirely still, before advancing that film to the next (while no light is coming out) and repeating.

(see e.g. this footage and note that the film is advanced so quickly that you don't even see it move. Separately, if you slow playback you can also see that it flashes twice before it advances the film - we'll get to why.)

This requires a shutter, i.e. not letting through any light a moderate part of the time (specifically while it's advancing the film). We are counting on our eyes to sort of ignore that.


One significant design concept very relevant to this type of reproduction is the flicker fusion threshold, the "frequency at which intermittent light stimulus appears to be steady light" to our eyes - because, separately from the actual image being shown, having it appear smooth is, you know, nice.

Research shows that this varies somewhat with conditions, but in most conditions practical for showing people images, it's somewhere between 50Hz and 90Hz.


Since people are sensitive to flicker to varying degrees, and flicker can lead to eyestrain and headaches, we aim towards the high end of that range whenever that is not hard to do.

In fact, we did so even with film. While film is 24fps and was initially shown at 24Hz flashes, movie projectors soon introduced two-blade and then three-blade shutters, showing each image two or three times before advancing, meaning that while they still only show 24 distinct images per second, they flash it twice or three times for a regular 48Hz or 72Hz flicker. No more detail, but a bunch less eyestrain.


As to what is actually being shown, an arguably even more basic constraint is the rate of new images that we accept as fluid movement.

  • Anything under 10fps looks jerky and stilted - or at least like a choice. Western and eastern animations were rarely higher than 12fps, or 8 or 6 for the simpler/cheaper ones.
  • Around 20fps we start readily accepting it as continuous movement.
  • Above 30 or 40fps it looks smooth.
  • Above that it keeps on looking a little better yet, with quickly diminishing returns.



So why 24?

Film's 24 was not universal at the time, and has no strong significance then or now. It's just that, when a standard was needed, 24 was a chosen balance between various aspects: it's enough for fluid movement and relatively few scenes need more, film stock is expensive, and projection needed a single standard (adaptable or even multiple projectors would be too expensive for most cinemas).


The reason we still use 24fps today is more faceted, and doesn't really have a one-sentence answer.

But part of it is that making movies go faster is not always well received.

It seems that we associate 24fps with the feel of movies, while 50/60fps feels like shaky-cam home movies made by dad's camcorder (when those were still a thing), or like sports broadcasts (which went for rate even though it reduced detail), with their tense, immediate, real-world associations. So higher, while technically better, was also associated with a specific aesthetic. It may work well for action movies, yet less for others.

There is an argument that 24fps's sluggishness puts us more at ease, reminds us that it isn't real, seems associated with storytelling, a dreamlike state, memory recall.

Even if we can't put our finger on why, such senses persist.


CRT screens
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

CRT monitors do something vaguely similar to movie projectors, in that they light up an image so-many times a second.


Where with film you light up the entire frame at once (with some time for the shutter coming in and out; ignore that for now), a CRT lights up one spot at a time - there is a beam constantly being dragged line by line across the screen -- look at slow motion footage like this: https://youtu.be/3BJU2drrtCM?t=137


Except what they light up is different. A film projector is just bouncing light off something white.

A CRT is pushing energy into phosphorescence - lighting up a pixel's worth of phosphor at a time. That phosphor has a softish onset, and retains light for a little while.

...but still mostly fallen off a millisecond or two later(verify), so they're definitely pulsing.

The largest reason that these pulsing phosphors don't look like harsh blinking is our persistence of vision, which, combined with the fact that it's relatively bright, makes it end up looking fairly constant. (You could say our eyes' framerate sucks, though framerate is actually a poor name for our eyes' actual mechanics.)


While TVs were fixed to 50Hz or 60Hz, primarily because they had to deal with one specific broadcast standard, most CRT monitors can be told to refresh at different rates.

There's a classic 60Hz mode that was easy to support, but people often preferred 72Hz or 75Hz or 85Hz or higher modes, primarily because they reduced eyestrain.


And yes, after working behind one of those faster-rate monitors, moving to a 60Hz monitor would be really noticeable. Because even when we accept it as smooth enough, we still perceive it as blinking.


To recap, in TVs and CRT monitors, we light up a line at a time (the line nature is not the only way to use a CRT, just the easiest way to fill the entire screen; for alternatives, see e.g. vector monitors, and CRT oscilloscopes), in fact a pixel at a time. Which happens so fast -- necessarily so -- that you would need a very fast camera to notice this. Take a look at https://youtu.be/3BJU2drrtCM?t=52


This means that there needs to be something that controls the left-to-right and top-to-bottom steering (https://youtu.be/l4UgZBs7ZGo?t=307) - and because you're really just bending a stream back and forth, there are times at which the return trip would be a visible line (mostly horizontal-ish between lines, one diagonal between frames).

We solve that by just not emitting electrons at those times, and call those the blanking intervals.

That's a lot of intricately timed things, and this is done within the set. In computers, there are hsync and vsync pulses, which if I understand correctly are not so much control signals as... suggestions, interpreted by the monitor as "does that look like the timing of a mode I know? Okay, then I'll do the rest".(verify)
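
To put numbers on 'intricately timed', a back-of-the-envelope calculation for the classic 640x480@60 VGA mode (800x525 total including the blanking intervals is the standard timing for that mode; the spec pixel clock is 25.175 MHz at its exact ~59.94Hz rate):

  #include <cstdio>

  int main() {
      // Classic 640x480@60 VGA: visible area plus blanking intervals.
      const int h_visible = 640, h_total = 800;  // 160 pixel times of horizontal blanking
      const int v_visible = 480, v_total = 525;  // 45 line times of vertical blanking
      const double refresh = 60.0;               // nominal; the real mode is ~59.94Hz

      double pixel_clock = (double)h_total * v_total * refresh;  // pixels per second
      double line_rate   = (double)v_total * refresh;            // lines per second

      printf("pixel clock ~ %.1f MHz (spec: 25.175 MHz)\n", pixel_clock / 1e6);
      printf("line rate   ~ %.1f kHz\n", line_rate / 1e3);
      printf("time spent on visible pixels: %.0f%%\n",
             100.0 * ((double)h_visible * v_visible) / ((double)h_total * v_total));
      return 0;
  }

Which works out to roughly a 25.2 MHz pixel clock, a 31.5 kHz line rate, and about 73% of each frame interval spent putting visible pixels on the wire.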


CRTs were driven relatively directly from the graphics card, in the sense that the value we're beaming onto a pixel is whatever value is on the wire at that very moment.


Later monitors were not tied to that mechanism, but there was still not that much reason to deviate.

You could add buffers, but why? It would cost more, be more complex, and probably a little slower.



How are CRT monitors different from CRT TVs?

In TVs, redraw speeds were basically set in stone, as were some decoding details.

When each part happened was still synchronized from the received broadcast signal, but the speed was basically fixed, as that made things easier.

On color TV there were some extra details, but a good deal worked the same way.

Early game consoles/computers just generated a TV signal, so that you could use the TV you already had. Which was slightly awkward to do, but a lot cheaper than something dedicated.


After that, CRT monitors started out as adapted CRT TVs, and it didn't take long at all before the speed at which things are drawn was configurable. By the nineties it wasn't too unusual to drive a CRT monitor at 56, 60, 72, 75, perhaps 85, and sometimes 90(verify), 100, or 120Hz.

We also grew an increasing amount of resolutions that the monitor should be capable of displaying. Or rather, resolution-refresh combinations. Detecting and dealing with those is a topic in and of itself.


Yet at the CRT level, they were driven much the same way - synchronization timing to let the monitor know when and how fast to sweep the beams around, and a stream of pixels passed through as they arrive on the wires.

So a bunch of the TV mechanism lived on into CRT monitors - and even into the flatscreen era.

Flatscreens

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

For context: in film, and in CRTs, the mechanism that lights up the screen is the same mechanism as the one that shows you the image. And, as a result, blinky.


In LCD-style flatscreens, the image updates and the lighting are different mechanisms.

Basically, each pixel blocks light, and there is one big white area behind all the pixels.


If you take a high speed camera, you may still not see it flicker until you get really fast and specific - see this part of the same slow motion video: https://youtu.be/3BJU2drrtCM?t=267 (note how the backlight appears constant even when the pixel update is crawling by).


So the difference between, say, a 60fps and 240fps monitor isn't in the lighting, it's how fast the light-blocking pixels in front of that constant backlight change. A 60fps monitor changes its pixels every 16ms (1/60 sec), a 240fps monitor every 4ms (1/240 sec). The light just stays on.

As such, while a CRT at 30Hz would look very blinky and be hard on the eyes, a flatscreen at 30fps updates looks choppier, but not blinkier or more eyestrainy.



On sending and updating pixels
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.


On pixel response time and blur

Vsync

Adaptive sync

On perceiving

The framerate of our eyes

arguments for 60fps / 60Hz in gaming

On reaction time

On end-to-end latency

Tracking objects?

On unintentional motion blur

On intentional motion blur

On resolution

On contrast ratio / dynamic range

see also

Visuals_DIY#Analog_video_notes

Monitor mounts

VESA mounts

The sizes are often one of:

  • 7.5 cm x 7.5 cm (2.95 inches), 8kg max
  • 10 cm x 10 cm (3.94 inches), 12kg max
  • 20 cm x 20 cm (7.87 inches), 50kg+

...though there are smaller and larger variants, and also non-square ones.

Most products will have holes to fit more than one.

10cm was apparently the original, 7.5cm was added for smaller displays, though note that lightish displays could use either.


See also:

Monitor faults

Permanent lines on monitor