A few weeks ago, an interesting question cropped up: How fast is a PS/2 keyboard? That is to say, how quickly can it send scan codes (bytes) to the keyboard controller?
One might also ask, does it really matter? Sure enough, it does. As it turns out, the Borland Turbo Pascal 6.0 run-time, and probably a few related versions, handle keyboard input in a rather unorthodox way. The run-time installs its own INT 9/IRQ 1 handler (keyboard interrupt) which reads port 60h (keyboard data) and then chains to the original INT 9 handler… which reads port 60h again, expecting to read the same value.
That is a completely crazy approach, unless there is a solid guarantee that the keyboard can’t send a new byte of data before port 60h is read the second time. The two reads are done more or less back to back, with interrupts disabled, so much time cannot elapse between the two. But there will be some period of time where the keyboard might send further data. So, how quickly can a keyboard do that?
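The double-read pattern can be illustrated with a small simulation. This is only a sketch: the class and function names are hypothetical, and the model merely mimics the port 60h latch behavior described above, where the latched byte stays put until the keyboard sends a new one.

```python
# Illustrative simulation of the Turbo Pascal double-read pattern.
# All names here are hypothetical; this only models the behavior
# described in the text, not real hardware.

class KeyboardLatch:
    """Models port 60h: holds the last byte sent by the keyboard."""
    def __init__(self):
        self.data = 0x00
        self.reads = []

    def send(self, byte):
        # Keyboard pushes a new scan code into the latch.
        self.data = byte

    def read_port_60h(self):
        # Host reads port 60h; the latch is not cleared by reading.
        self.reads.append(self.data)
        return self.data

def runtime_int9_handler(port, chain):
    scan = port.read_port_60h()   # run-time reads port 60h first...
    chain(port)                   # ...then chains to the original INT 9
    return scan

def original_int9_handler(port):
    # The original handler reads port 60h again, expecting the same value.
    port.read_port_60h()

port = KeyboardLatch()
port.send(0x1D)                   # keyboard delivers a scan code
runtime_int9_handler(port, original_int9_handler)

# With interrupts disabled, no new byte can arrive between the two
# back-to-back reads, so both see the same value.
assert port.reads == [0x1D, 0x1D]
```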
The theoretical answer lies in the IBM PS/2 Technical Reference and other similar manuals. The PS/2 keyboard communicates using a fairly straightforward serial protocol, with separate clock and data lines. At least on the keyboard controller side, the protocol is implemented in software (that is, microcontroller ROM).
The protocol uses one start bit, 8 data bits, a parity bit, and a stop bit. That is 11 bits total. If we consider the best case (or is it worst case?) scenario, the keyboard controller is infinitely fast and only the time to transfer those eleven bits matters. The PS/2 Keyboard and Auxiliary Device Controller reference, for unknown reasons, specifies the timings only for auxiliary devices, but PS/2 keyboards should behave the same.
IBM gives the CLK inactive period as 30-50 μs, and the CLK active period as 30-50 μs as well. The time to transfer one bit is then 60-100 μs. That’s a clock rate of about 10 to 16.67 kHz, which at 11 bits per byte translates to about 900 to 1,500 bytes per second. In other words, it takes 660 to 1,100 μs to transfer one byte (scan code) from the keyboard.
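The arithmetic can be checked with a few lines of Python; the constants below are simply the IBM figures quoted above.

```python
# Derive byte transfer times from IBM's published CLK periods.
BIT_TIME_US = (30 + 30, 50 + 50)   # inactive + active phase per bit: 60-100 us
BITS_PER_BYTE = 11                 # start + 8 data + parity + stop

byte_time_us = tuple(t * BITS_PER_BYTE for t in BIT_TIME_US)
clock_khz = tuple(1000 / t for t in BIT_TIME_US)
bytes_per_sec = tuple(1_000_000 / t for t in byte_time_us)

print(byte_time_us)     # (660, 1100) us per scan code byte
print(clock_khz)        # ~16.67 kHz down to 10 kHz
print(bytes_per_sec)    # ~1515 down to ~909 bytes per second
```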
The absolute best/worst case would then be 660 μs; that particular clock starts counting when the host reads the keyboard data from port 60h the first time. To put it differently, after reading from port 60h once, software has at least 660 μs to read from port 60h again without worrying that a new byte might have arrived. In reality the time is likely longer because the keyboard controller is not infinitely fast and the keyboard is probably not communicating at the maximum allowed speed.
In CPU terms, 660 μs (well over half a millisecond) is a long time, enough to execute many instructions even on a very slow PC. The only scenario where the Borland keyboard logic might get upset would be a long NMI or SMI blocking execution between the two port accesses. But a system that can just “lose” half a millisecond or more of CPU time arguably has serious issues already.
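As a rough illustration, assuming a 4.77 MHz 8088 as in the original IBM PC (an assumption for the sake of the example, not something measured here), the 660 μs window translates into thousands of CPU clock cycles:

```python
# Rough cycle counts that fit into the 660 us window at two example
# clock speeds: a 4.77 MHz 8088 and the 166 MHz Pentium MMX used in
# the measurements below.
WINDOW_US = 660

for mhz in (4.77, 166):
    cycles = WINDOW_US * mhz       # us * MHz = clock cycles
    print(f"{mhz} MHz: {cycles:,.0f} cycles")
```

Even at 4.77 MHz that is over three thousand clock cycles, far more than the handful of instructions between the two port reads.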
Note: XT keyboards are not considered here. Those use a somewhat different protocol, with different timings.
To determine whether the theory has any bearing on reality, I set out to measure how fast an actual PS/2 keyboard sends data. The host cannot measure this precisely, because it does not “see” the bits on the keyboard wire, but it is possible to estimate the speed well enough.
The method is to measure how fast keyboard interrupts can occur. The limiting factor would typically be the human pressing the keys. Instead of asking the user to randomly bang the keys and hope they’re pressed in quick enough succession, it is much better to press keys which generate extended scan codes in Scan Set 2. That is, two-byte scan codes with E0h prefix and a data byte. These sequences are generated by the keyboard in response to a single press and thus represent the true top speed at which the keyboard sends data.
Measurements were taken on an IBM ThinkPad 760XL, using its built-in keyboard. The system in question has a 166 MHz Pentium MMX processor; the TSC was used for maximum accuracy.
The initial results were surprising. It took about one millisecond for the byte after the E0h prefix to arrive when pressing a key… but two milliseconds when releasing the key. What?!
On closer inspection, that is perfectly logical. The keyboard by default uses Scan Set 2 (AT style), which the keyboard controller translates to Scan Set 1 (XT style). For example, for the Right Ctrl key, the host sees an E0 1D sequence when the key is pressed, and E0 9D when the key is released; that’s Scan Set 1. But the keyboard in fact sends data in Scan Set 2 format, where the Right Ctrl press sequence is E0 14 and the release sequence is E0 F0 14. That is, the key release sequence consists of three bytes instead of two; for the host to see one byte after the E0 prefix when releasing the key, the keyboard has to send two bytes (the conversion is done by the keyboard controller). Therefore, the delay between the E0 prefix and the next byte seen by the host is about twice as long for a key release as for a key press.
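The translation can be sketched as follows. This is a deliberately minimal model of the keyboard controller’s Scan Set 2 to Scan Set 1 translation, with a table covering only the Ctrl key rather than the controller’s full table:

```python
# Simplified model of the 8042-style Scan Set 2 -> Scan Set 1
# translation, reduced to the one key used in the example.
SET2_TO_SET1 = {0x14: 0x1D}        # Ctrl: Set 2 code 14h -> Set 1 code 1Dh

def translate(set2_bytes):
    """Translate a Set 2 sequence the way the controller would."""
    out, break_code = [], False
    for b in set2_bytes:
        if b == 0xE0:              # extended prefix passes through
            out.append(b)
        elif b == 0xF0:            # break prefix is absorbed...
            break_code = True
        else:
            code = SET2_TO_SET1[b]
            out.append(code | 0x80 if break_code else code)
            break_code = False     # ...and folded into the high bit
    return out

# Press: two bytes in, two bytes out.
assert translate([0xE0, 0x14]) == [0xE0, 0x1D]
# Release: three bytes in, two bytes out -- hence the doubled delay.
assert translate([0xE0, 0xF0, 0x14]) == [0xE0, 0x9D]
```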
Now the results make much more sense: It takes about a millisecond for the keyboard to send a byte, which is less than the longest theoretical time of 1.1 ms but more than the shortest theoretical time of 0.66 ms. In other words, well within the theoretical range. So maybe the Borland run-time is not completely crazy after all.
What About USB?
The method used for testing PS/2 keyboard speed can also be applied to USB keyboards with “legacy” support, although it will obviously measure very different things. In a test system with an Intel DZ68DB board and Core i7-3770 CPU, the elapsed time between E0h prefix and the next byte was close to 16 milliseconds. More or less identical results were obtained on an Intel DX48BT2 board with a Core 2 Extreme QX9770 CPU.
It should be noted that there are several possible ways of providing PS/2 keyboard (and mouse) compatibility with USB keyboards (and mice), and I did not investigate which one the tested Intel boards use. What’s notable is that compared to a PS/2 keyboard, the delays are much longer (16 ms vs. 1 or 2 ms), and that there is no difference between key presses and key releases, because there is no Scan Set 2 to Scan Set 1 translation behind the scenes.
As to why 16 milliseconds: that appears to be an artifact of the particular (common) BIOS USB keyboard support implementation. All keyboard-related processing is done at a 16 ms interval, including conversion of a USB key event into two (or more) scan codes seen by the keyboard interrupt handler.
The upshot is that a USB keyboard in legacy mode has a much less precise response than a USB keyboard with native drivers; in this particular setup, it will take up to 16 milliseconds for a key press or release to be reported, or longer if the key sends a multi-byte response.
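Under the assumption that each scan code byte is delivered in a separate 16 ms service tick, as the measurements suggest, the worst-case delivery times work out as follows:

```python
# Worst-case legacy-USB delivery latency, assuming one scan code
# byte per 16 ms BIOS service tick (an assumption inferred from the
# measurements, not a documented guarantee).
SERVICE_MS = 16

for name, nbytes in (("plain key", 1), ("E0-prefixed key", 2)):
    print(f"{name}: up to {nbytes * SERVICE_MS} ms")
```

That is, a plain key can take up to 16 ms to be reported, and an E0-prefixed key up to 32 ms, compared to roughly 1-2 ms over a real PS/2 connection.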