Since it’s off-topic, I won’t go into great detail, but I think you’re not understanding the constraints of real-time, -live- broadcast-quality HD video from the broadcast/satellite world. Think CNN, BBC, Fox, Superbowl, World Series etc. I wasn’t referring to streaming static files in a PC context.
Consider, for a moment, a single stream moving 60 frames per second of video and audio through parallel audio/video transcoding paths while maintaining lip sync and clock sync accuracy that varies less than +/-500ns (unique clock per stream), delivered over UDP with no opportunity for loss or delay. Then do that scaled across 36 inputs at 54 (or more) megabits per second each (1,944 megabits/second total on the input) and deliver the live output at 36 x 19.4 Mbps (~698 megabits/second). The system is moving ~2600 megabits/second (spread across multiple network cards and IRQs) in an environment -extremely- sensitive to jitter and loss. The system needs to perform without losing a single 188-byte frame of data - 24/7/365. And each individual stream needs to be able to be stopped, started or modified without the stop/start impacting any of the parallel streams in any way. As well, any stream on the input may be mildly or heavily impaired without warning due to signal impairment in the upstream transmission path (like rain fade, a satellite transponder malfunction, or an originating piece of equipment hitting a bug). To make things more interesting, consider that hardware inputs other than ethernet also need to be supported for these live signals, like satellite reception, off-air antenna reception, or legacy digital transports like ASI (serial interfaces typically running at up to 108 Mbps or more).
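For anyone wanting to check the aggregate numbers above, a quick back-of-the-envelope calculation (all values in Mbps, taken straight from the figures quoted):

```python
# Sanity check of the throughput figures quoted above (all values in Mbps).
inputs = 36
input_rate = 54.0    # Mbps per input stream
output_rate = 19.4   # Mbps per live output stream

total_in = inputs * input_rate     # 1944 Mbps in
total_out = inputs * output_rate   # ~698.4 Mbps out
total = total_in + total_out       # ~2642 Mbps total, i.e. "~2600 Mbps"

print(f"in: {total_in:.1f}  out: {total_out:.1f}  total: {total:.1f}")
```

That total also has to be sustained while spreading interrupt load across multiple NICs, which is why the IRQ layout matters as much as the raw bandwidth.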
Not to mention actually transcoding the video payload from MPEG-2 to MPEG-4 compression for all streams, transcoding 1-4 live lip-sync audio streams -per- video stream and re-multiplexing them back together in real time, as well as doing filter and transform operations on the various sub-payloads of the overall transport stream. While doing this, also provide an interactive web interface for configuration and live status monitoring and alarming for seven different conditions of each stream. I think you get the idea. There are also subtitle burn-in considerations and ad splicing/program substitution scenarios to consider.
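To make the 188-byte framing concrete: every MPEG transport-stream packet is exactly 188 bytes and begins with the sync byte 0x47, so "not losing a single frame" means every one of those boundaries has to stay intact end to end. A minimal sketch of the kind of integrity check involved (my own illustrative function, not code from the product described):

```python
# MPEG transport stream constants (per ISO/IEC 13818-1).
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def check_ts_sync(buf: bytes) -> bool:
    """Return True if buf is a whole number of 188-byte TS packets,
    each starting with the 0x47 sync byte."""
    if len(buf) == 0 or len(buf) % TS_PACKET_SIZE != 0:
        return False
    return all(buf[i] == SYNC_BYTE
               for i in range(0, len(buf), TS_PACKET_SIZE))
```

A single dropped or shifted byte breaks sync for every packet that follows, which is why jitter and loss are so unforgiving in this domain.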
Many firms use an RTOS to help achieve that, but the load conditions described above are being met 24/7/365 on systems with a single (older-generation) i5-class CPU running a fairly stock Debian Linux base, slightly older kernel, booting from flash and running from RAM disk. This is achieved (and this is a mature commercial product) with efficient parallelization of the video-specific code, dedication of specific cores to various functions, and careful tuning of low-level hardware and OS parameters: IRQ balancing, buffer and queue depths, optimized network driver configurations (but stock drivers otherwise), etc. As challenging as CNC is (which also doesn’t run 24/7/365),
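For a flavor of the kind of low-level tuning I mean, here is a hypothetical sketch (the IRQ numbers, core IDs, interface names, and values are made-up examples; every machine is different, and this is not the actual product's tuning):

```shell
# Pin a NIC's receive IRQ to a dedicated core (example: IRQ 45 -> CPU 2).
echo 4 > /proc/irq/45/smp_affinity        # bitmask 0x4 = CPU 2

# Pin a transcode worker to cores 4-7, away from the IRQ-handling cores.
taskset -c 4-7 ./transcode_worker         # hypothetical worker binary

# Deepen the NIC receive ring to ride out scheduling jitter (driver-dependent).
ethtool -G eth1 rx 4096

# Enlarge kernel UDP receive buffers so traffic bursts are not dropped.
sysctl -w net.core.rmem_max=67108864
```

The point is that none of this requires exotic software; it's careful use of knobs a stock kernel already exposes.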
I would humbly suggest the Maslow does not represent a higher real-time load or precision demand than one or two of the kinds of streams described above, let alone an entire system.
-Jeff