1. Sure isn't 16:9 though: anamorphic widescreen (which covers most movies) is 2.35:1 = 23.5:10 = 21.15:9, so you're still getting black bars on a 16:9 screen anyway.

2. It does, actually. With 16:9 panels being all the rage, pretty much all current 16:10 monitors are high-end IPS panels, as opposed to the piles of cheap, shitty 16:9 TN panels.

3. Fair point, but you're still getting black bars on 16:9, even on your 80" HDTV.

4. 4K-capable output already exists:

4.1. The RED Scarlet (a fairly popular 4K camera) is selling quite well to both pros and semi-pros, so recording in 4K is not an issue.

4.2.1. File networking is covered: the highest 1080p bitrate I've seen is about 42 Mbps, and scaling linearly with pixel count (a factor of 4) gives a required bandwidth of up to 168 Mbps. Round that up very generously to, say, 250 Mbps. We already have 1000 Mbps deployed in LANs (my computers are wired with gigabit Ethernet most of the time) and in many WANs. Current Wi-Fi tech could theoretically provide the necessary bandwidth, but we all know how that works in practice.

4.2.2. Signal output is covered too: HDMI 1.4 and DisplayPort 1.2 both support 4K, at 24-30 Hz and 60 Hz respectively. Decoders/encoders already exist as well, so no issue there either.

4.3. 4K panels already exist. Bloody expensive though: you're looking at US$2.5k+ for a single desktop-class monitor.

4.4. Theatres already use 4K.

At this point, we need releases and screens. By the looks of it, 4K screens should reach the high end around 2015-2016. Hopefully the movie industry can adopt the tech quickly for once... otherwise we end up in the HD situation again, where people had 1920x1200 screens on their computers (like the 15.4" Latitude D830 from 2005) and could only watch SD content...

On that note, I'd like 3 nice 30" 120Hz 5120x3200 (a precise doubling of 2560x1600) IPS screens. With Thunderbolt.
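To make the black-bar point from 1. and 3. concrete, here's a minimal sketch of the letterboxing math (the helper function and the 1080p example are just illustrative, not from any particular player):

```python
# Letterbox bar height when fitting wide content onto a 16:9 panel.
def letterbox_bars(screen_w, screen_h, content_ratio):
    """Return (content_height, total_bar_height) in pixels,
    assuming the content is scaled to fill the screen width."""
    content_h = round(screen_w / content_ratio)
    return content_h, screen_h - content_h

# A 2.35:1 anamorphic movie on a 1080p (16:9) panel:
h, bars = letterbox_bars(1920, 1080, 2.35)
print(h, bars)  # -> 817 263
```

So roughly a quarter of a 16:9 panel's height goes to black bars for 2.35:1 content, which is the point: 16:9 doesn't eliminate letterboxing for movies either.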
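The bandwidth estimate in 4.2.1 is just linear scaling of bitrate with pixel count; a quick sketch of that back-of-the-envelope calculation (function name and the 42 Mbps figure are the ones assumed above):

```python
# Rough bitrate estimate: scale a known bitrate by the pixel-count ratio,
# assuming bitrate grows linearly with resolution (a pessimistic assumption,
# since codecs usually do better than linear on larger frames).
def scaled_bitrate(base_mbps, base_res, target_res):
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_mbps * target_px / base_px

# A 42 Mbps 1080p stream scaled to 4K UHD (exactly 4x the pixels):
req = scaled_bitrate(42, (1920, 1080), (3840, 2160))
print(req)  # -> 168.0
```

168 Mbps, rounded up generously to 250 Mbps, still fits comfortably inside a gigabit LAN link, which is why wired networking isn't the bottleneck.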