Latency

===Compared to what?===
The more separate devices you add, the more you have to synchronize,
or at the very least verify.
A lot of this is best-effort at the low levels, which is part of why it can work so well,
and also why verifying it is hard.
How do you even measure how late a USB input was?
You can electronically measure when the button got pressed,
but compare it to what?
"Why is my 120Hz TV faster than my 390Hz monitor?"
Because you confused input latency with refresh interval or even the pixel's response time.
That number you give describes the middle one, suggests an upper limit to the third, and says nothing about the first.
LG C2 120Hz Native Resolution @ Max Hz: 5.3 ms
Acer Nitro 390Hz Native Resolution @ Max Hz: 1.8 ms
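The arithmetic behind those numbers can be made concrete. A minimal sketch (the millisecond figures are the ones quoted above; everything else is illustrative):

```python
def refresh_interval_ms(hz: float) -> float:
    """Time between frame starts, in milliseconds -- set by refresh rate alone."""
    return 1000.0 / hz

# Refresh intervals for the two displays mentioned above:
tv_interval = refresh_interval_ms(120)       # ~8.33 ms between frames
monitor_interval = refresh_interval_ms(390)  # ~2.56 ms between frames

# The quoted figures (5.3 ms and 1.8 ms) are pixel response times:
# a separate property of the panel that must fit inside the interval
# but is not determined by it. Input latency is a third quantity
# that neither number tells you anything about.
```

Note that the 120Hz panel's 5.3 ms response time fits comfortably in its 8.33 ms interval, while the 390Hz panel's 1.8 ms is tight against its 2.56 ms interval; neither figure is the display's input latency.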






(tape delay was ''intentional'' latency)
Audio latency originates both from the
"different devices aren't necessarily synced" detail
and from the fact that the typical solution involves a small buffer.
As such, driving audio latency below a few milliseconds requires shrinking that buffer, at the cost of a greater risk of dropouts.
So it's ''not'' that the devices couldn't time the samples -- the timing of the sound itself is accurate to dozens of ''micro''seconds.
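The buffer's contribution to that latency is simple arithmetic. A small sketch (the buffer sizes and sample rate are illustrative, not taken from any particular device):

```python
def buffer_latency_ms(frames: int, sample_rate_hz: int) -> float:
    """Latency added by one audio buffer of the given size, in milliseconds."""
    return 1000.0 * frames / sample_rate_hz

# Typical values at 48 kHz:
#   256 frames -> ~5.3 ms of added latency
#    64 frames -> ~1.3 ms, but with more frequent buffer underruns
typical = buffer_latency_ms(256, 48000)
small = buffer_latency_ms(64, 48000)
```

This is why sub-millisecond audio latency is hard even though the samples themselves are timed to microseconds: the buffer, not the clock, sets the floor.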






-->

Latest revision as of 18:46, 12 March 2024
