BT.601 dates from 1982 and was the first widely adopted standard for digitizing component video: it samples an analog signal into three color components (Y'CbCr, often loosely called YUV), with luma sampled at 13.5 MHz. Before BT.601, the main standard was SMPTE 244M from the Society of Motion Picture and Television Engineers, a composite video standard that sampled the signal at 14.32 MHz. All else equal, a higher sampling rate is generally better. The reason BT.601 went lower (13.5 MHz) was a compromise, equal parts technical and political.
Analog television was created in the 1930s as a black-and-white composite standard, and in 1953 color was added by a very clever hack that kept all broadcasts backward compatible with existing B&W TVs. Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with significant technical compromises which complicated and degraded analog video for over 50 years.

The composite and component sampling rates (14.32 MHz and 13.5 MHz) both trace back to the color subcarrier frequency of analog television: 14.32 MHz is exactly 4x the NTSC subcarrier (3.58 MHz), and 13.5 MHz is the nearby "4x" rate chosen to work for both 525- and 625-line systems. Those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc). To be clear, analog television signals were never pixels. Each horizontal scanline was only ever an oscillating electrical voltage, from the moment photons struck an analog tube in a TV camera to the home viewer's cathode ray tube (CRT). Early digital video resolutions were simply based on how many samples an analog-to-digital converter would need to fully recreate the original electrical voltage.
For example, 720 is tied to 13.5 MHz because sampling the roughly 53.3 µs active picture area of an analog scanline at 13.5 MHz yields 720 luma samples (in 4:2:2, the two chroma components add 360 samples each, for 1440 samples per active line in total). Similarly, 768 is tied to 14.32 MHz: the same active area yields roughly 763 samples, rounded up to 768. VGA's horizontal resolution of 640 comes from correcting analog video's non-square pixels to square ones (704 * 10/11 = 640). It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s, based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities of the time.
This is repeated often and simply isn't true.
if (rtt < config.min_rtt)
rtt = config.min_rtt;
else if (rtt > config.max_rtt)
rtt = config.max_rtt;
Wouldn't this hide bugs in the code or network anomalies? Replies from localhost typically seem to arrive in less than 50 µs. Comments in an earlier version [2] make no sense to me:
/* Use standard timersub for more accurate results */
if (rtt < 0)
rtt = 0;
/* Cap at reasonable maximum to handle outliers */
if (rtt > 1000)
rtt = 1000;
[1] https://github.com/davidesantangelo/fastrace/blob/5b843a197b...
[2] https://github.com/davidesantangelo/fastrace/commit/79d92744...
Did you find out how this came to be, or was it just a random typo?
Curious what the purpose of calling a pay phone was? (That wasn't possible in my country.)
- GarageFS
- OpenStack Swift
- Ceph Object Gateway (RADOS Gateway)
- Riak CS
- OpenIO