It's amazing that pixel density has been stagnant since 2014, when the first 5k TV (low ppi) and 5k desktop displays (>200 ppi) were released. It's 2023 now and it still takes a kidney to get 5k >200 ppi, and we only recently got the 6k >200 ppi option for 2 kidneys. Pixel density, however, is stagnant.
I expect the industry realizes 6k is basically the stopping point so they're intentionally approaching it very slowly.
Edit: Updated with ppi specs to draw focus to pixel density of desktop displays.
I saw a Samsung 8k screen and the resolution was amazing. Of course there was not much more content available than just that demo movie, but still it was interesting.
I do see individual pixels on retina screens but I don't notice them on more organic stuff like movies on my screen. I guess computer content is just perfect squares and then perfect circles and fonts with some jaggy edges where they're rounded off, which is really not that hard to spot on retina screens. The difference between 4K and 8K in my opinion is that you can better see little glimmers, textures and things like that. But the 8K demo movie of course was highly optimized to show exactly that.
What I really like is HDR on true black OLED, killer feature.
But what I want more than anything is a "compression" feature on the audio of movies. I mean I just want to hear what people are saying to each other without the neighbors calling the police when going into a scene with explosions and/or music. I'm using subtitles in English for English movies nowadays FFS!!!
> "But what I want more than anything is a "compression" feature on the audio of movies. I mean I just want to hear what people are saying to each other without the neighbors calling the police when going into a scene with explosions and/or music."
This is due to the (misconfigured? buggy?) way that 5.1 surround audio sometimes gets mapped to 2-channel output. Dialogue in a film is usually carried on the centre channel, which in a proper cinema will be a pretty powerful and loud set of speakers. Music and surround effects are carried on the other channels.
But for whatever reason, sometimes when down-mixing the centre channel comes through way too quiet and gets drowned out by the others. This seems to have gotten better in recent years (Apple TV, Netflix, etc are either good at down-mixing or the streams come with good 2-channel audio), but some TVs still make a mess of it.
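For reference, here's a minimal sketch of a typical ITU-style 5.1-to-stereo downmix. The -3 dB coefficients are the common defaults, not from any specific decoder (real decoders read the mix levels from bitstream metadata), and it shows how a centre level that comes through too low buries dialogue:

    def downmix_5_1_to_stereo(L, R, C, Ls, Rs, clev_db=-3.0, slev_db=-3.0):
        # Mix-level coefficients: dB -> linear gain (-3 dB ~ 0.707)
        clev = 10 ** (clev_db / 20)
        slev = 10 ** (slev_db / 20)
        lo = L + clev * C + slev * Ls
        ro = R + clev * C + slev * Rs
        return lo, ro

    # A decoder applying -9 dB instead of -3 dB to the centre leaves
    # dialogue at half the intended amplitude relative to the effects.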
No, it's due to bad mixes being trendy these days. Downmix issues are certainly a thing, but the real problem starts in Hollywood, with auteurs like Nolan that no sound professionals can say 'No' to.
With Nolan at least, he's acknowledged it was a deliberate "creative decision".
> “We made carefully considered creative decisions,” Nolan explained. “There are particular moments in this film where I decided to use dialogue as a sound effect, so sometimes it’s mixed slightly underneath the other sound effects or in the other sound effects to emphasize how loud the surrounding noise is. It’s not that nobody has ever done these things before, but it’s a little unconventional for a Hollywood movie.”
Also, in case it helps someone: OSMC/Kodi has a similar “downmix center mix level” option. Mine is always at -7dB for 5.1 audio. Works like a charm most of the time.
Or if you have a PS5, plug the controller into the console (must be wired, not BT) and then headphones into the controller. The virtual surround is surprisingly convincing and sounds quite natural and not headphoney.
I'm not so sure that's the case. I have true 5.1 and I still have to have the center channel at a relative +7db minimum for the dialog to be as audible as I want it to be, compared to the rest of the audio.
Yeah, but that's probably because your centre channel is relatively smaller than on the cinema audio systems the 5.1 track was designed for. So setting it to +7dB at home is probably normal/expected.
After all, (Tenet aside) we don't experience these problems at actual cinemas, where the centre channel is a huge speaker stack behind the screen, and the surrounds are relatively small by comparison. But I agree that it's a common issue on home setups.
If you've got a receiver, a lot of them have audio compression options. Not sure how well they work though, but they're there. I think some TVs have them too. A receiver should also let you increase the levels on the center speaker vs the others (maybe even if you only have a 'virtual' center where you don't have an actual center speaker and the audio goes to left and right). The center speaker gets mostly dialogue, so that can help a lot.
Ha, I thought I was weird for using subs while watching English stuff. Have small kids sleeping not so far away, can't wake them up just because some idiotic sound mixer thought that some explosion/crash/whatever should shatter your windows and that's how everybody should experience a given movie.
Older movies did so many things so much better. Sound 'friendliness' is definitely one of them.
Yeah, the blacks absolutely. Recently upgraded to a PS5 (for 4K discs) and a Samsung S95B. Some of the most impressive discs I own are well-restored films from the b&w era. The dead blacks and (with HDR) bright highlights give it a wonderful, natural projected look. The quality is probably better than even the most pristine release print, since these days these restorations are done by scanning the original camera negative, several analog generations better than the prints used for projection.
I just bought a new receiver after my old one was killed by lightning, and I was very surprised to find that few receivers, and virtually no inexpensive ones (< $500), can pair with Bluetooth headphones. They all support Bluetooth, but only as an audio source.
If you use your receiver to send to a Bluetooth device, it's not really doing its job of digital to analog conversion or amplification anymore, it's just acting as an audio relay. Most times the source that is plugged into your receiver can do the pairing to Bluetooth, so the receiver is sort of redundant. I admit it'd be a nice feature to have for occasions when you want it, though.
The positive of using the receiver is that the receiver is also usually the AV switch determining what device is actually active. Otherwise you're having to change what device you're paired with when you change inputs.
Also, I haven't been shopping for things like Blu-Ray players and Rokus/Fire TVs/other media boxes in a while, do things like that really do Bluetooth pairing?
I'm not sure if any of those do bluetooth pairing these days. In retrospect I probably shouldn't have said "most" sources will do the pairing themselves. Some will.
I'd like it in the receiver because Bluetooth introduces a delay that my receiver can compensate for and I can configure that in one place rather than in every source.
You're right. I've been thinking about picking one up because the software in my Sony TV is pretty bad and Sony stopped supporting it a couple of years ago (the TV is a 2019 model).
I'm probably going to buy one because the ethernet port in my TV recently failed and I had to switch it to WiFi. Streaming on the TV seems to be less reliable and an AppleTV would fix that as well.
>> I guess the solution is to buy a surround sound system and hope it's good enough.
This made a huge difference for me. I went from TV > Sonos Beam > Sonos Beam + two surrounds. I've tested going back to just the Beam (I wanted to use the surrounds as stereo music speakers) but the huge decrease in my ability to understand dialog made me keep the surrounds. It's particularly problematic on TV shows where background music is used a lot. Having that come out of the surrounds and dialogue from the Beam made things so much better.
> I guess the solution is to buy a surround sound system and hope it's good enough.
Yet many people opt for a soundbar which has similar issues with the physics of thin/small speakers. I personally have a set of restored Bozak B-302As from 1958, big box speakers that are a solid meter cube, and they sound great :)
The consequences of high dynamic range video are trivial to enclose within the bounds of one room in a way that high dynamic range audio is decidedly not trivial to enclose.
Hearing loss or damage is more of a challenge than vision issues as well. Wider dynamic range doesn't hinder visual understanding the same as it does audio.
I recommend getting a used receiver from the pre- to early 2000s and some used passive speakers and a sub to go with it. I spent $100 on my setup and it sounds better than any sound bar and can be endlessly adjusted to match my room’s acoustics.
Agree with this, lots of sound bars are crap and even the ones that aren't are very constrained by their form factor. Good speakers can actually make very dynamic audio be less annoying as well, by virtue of being less distorted. Same volume, more signal, less noise.
Speaking of affordable, Apple TV + Dolby Atmos movies + AirPods Pro 2 with spatial audio enabled is probably the consumer friendly way to go for dynamic surround sound that won't get you evicted.
We can probably draw an analogy with frame rate: why do movie fans insist on 24 fps, when TV is 50/60 fps, and you could play your games at 240 fps if you really wanted to?... Because "more" is not always better.
Btw. I do actually prefer LESS contrast in video sometimes, just as I prefer a compressed dynamic range in audio. This has a lot to do with the environment I'm watching / listening in, and the quality of my screen and headphones or speakers.
And finally, using dynamic range to "enhance" the sound of detonations is just awfully childish. If you are watching a movie just because the detonations are loud in it, I would argue you are watching the wrong movie...
> We can probably draw an analogy with frame rate: why do movie fans insist on 24 fps, when TV is 50/60 fps, and you could play your games at 240 fps if you really wanted to?... Because "more" is not always better.
That seems to be more "they can get away with more at 24fps", so using the same techniques but just upping the FPS results in more of it looking "fake", and a lot of "it doesn't look like the movies I'm used to".
Also sometimes referred to as the Soap Opera effect, as those shows tended to be shot on video instead of film and ended up with higher frame rates. Somehow we've associated that smooth motion with cheap and shallow content, which turns people off to it.
Maybe they’re looking for contrast in a particular scene, eg bright beside dark. If a sequence as a whole is very loud, while a later sequence is quiet, they might have to drop the volume down because of the loud part, effectively compressing the range of the quiet part to the point that it is unintelligible.
TV resolution goes at the pace of available content, and everything goes at the pace of context (living room 8-12' away), which is presently stuck at roughly what the human eye can resolve at that distance, which is about 4k.
There isn't a world where a 5k or 6k tv ever makes sense. Monitor, yes. But not for TVs.
4k does not exist because it's optimal for the human eye. It's dominant right now because anything higher carries a really significant bandwidth penalty, which would be hard to do for both streaming and optical media.
If we lived in a world where blu-ray disks could hold 1TB, and global internet speeds were 10x what they are now, 8k TVs would be the norm.
People have been saying "You can't tell the difference between X and Y" since the dawn of HD graphics in general.
"You can't tell the difference between HD and FHD." FHD and QDH. QHD and 4K. 4k and 8K. 60FPS and 120. 120 and 240. A higher resolution texture. Higher resolution audio. A higher resolution mesh. Etc.
Every single time someone makes the claim it makes no difference, yet we do it regardless.
For something that seemingly makes no difference, people sure seem to continue improving it regardless. And shockingly, consumers seem to really enjoy the products too.
Maybe. That test doesn't cover everything. Not all content, not all TV sizes, not all individuals, not all display technology, not all environments, not all compression tech, etc.
What about images that use HDR10+? Or Dolby Vision? What about completely uncompressed lossless video? What about common encoders like x264 and x265? What about on an 80-inch screen? What about in dark vs bright scenes? Or even environments? (Human eyes handle images differently at low vs high brightness)
I'm not saying the data is worthless because it's not, but I am saying that I think it's too sweeping to say it's completely un-noticable. Not enough data for that conclusion. Vision and display technology are both very complicated topics and there's always room for improvement IMO.
Pixel density has diminishing returns; we've known forever it's not linear. It's the same reason why phones aren't all 4k by now: nobody can tell the difference.
Every single upgrade in resolution has been diminishing returns, and that hasn't stopped anyone from doing it anyways.
The only reason it will ever stop is if we hit major technological or even physical barriers. And even then, we'll most likely just move onto another technology that allows yet higher resolution.
Yeah but you can keep source material at 44.1k and oversample before the DAC.
And 24 bit is nice purely because you can regulate volume in software without losing fidelity (if DAC is good enough). But again, no such requirement on source, just the processing pipeline
That's not what I'm getting at. For most purposes the 22KHz limit is a hard one. What I mean is that the eardrums do not have a linear response at any frequency.
Actually it is, but I get stoned to death even when I share audible sound deltas between lossy and lossless codecs, so I'll just leave it at one sentence:
Higher sampling rates and higher quality encodings primarily result in a broader, more impressive and immersive sound stage; extra detail is secondary, and in some cases elusive to chase after.
So I have a setup: Akai AM2850 Amplifier, a pair of HECO Celan GT 302 speakers, a Yamaha CD-S300 CD player, which has USB input with iPod interface and MP3 capability.
A well mastered album (read: no brickwalling) like Wasting Light (Foo Fighters) or Brothers in Arms (Dire Straits) will give you a larger soundstage when you listen in front of this set from CD (or any lossless source, but let's keep everything in the same DAC), even compared with 320kbps MP3 (it's minimal, but audible).
This can be dismissed as placebo, alright, but doing hard science on audio is hard. It's subjective, and there's an aura of lies around it due to the great deal of real snake oil in this space.
Yet I've listened to the same amplifier for 30 years, and I know how it behaves with any genre, any sound source and any input level, so I can make sense of the subtle sound stage changes, because I'm so familiar with it.
Honestly, I've played in large orchestras and whatnot, and when you listen intently, the difference is there. It's the difference between a good cup of Tchibo and a good cup of Davidoff coffee, but it's there.
I'm currently not at my home, but if you want I can provide you a fresh set of sound deltas (FLAC vs. 320kbps MP3) when I return.
I agree that higher encoding quality can provide the differences you describe, but higher sampling rates (beyond 48kHz, that is) cannot.
Personally, I try to acquire all music on CD or as FLAC from e.g. Qobuz or Bandcamp. But I see no reason to go beyond 24-bit 48kHz for any kind of audio unless I plan on slowing it down for e.g., a slow-mo video.
I have a 24-bit WAV version of Radiohead's "OK Computer OKNOTOK" directly from the band, and it sounds absolutely beautiful, though I don't remember whether it's 48K or 96K.
In any case, I really don't believe I have a system that can render 96K sound meaningfully differently from 48K. I'm also not sure I could tell the difference unless I tried very intently.
I'm also an avid Bandcamp shopper. Some of the people there make really beautiful music.
Lastly, if I'm happy with the sound I get, I don't chase numbers. I use a Behringer Bass V-Amp as my bass processor, and even though it's far inferior on paper, the results don't disappoint my ears.
Heck, if Haggard can use this thing for live and studio, why can't I?
A sound stage is not quantifiable; it's subjective and probably not real. I've heard similar things said many times about a number of audio systems and encodings, and in a blind test it is just not discernible IME.
It's discernible and measurable. It's called stereo separation and instrument placeability in mastering parlance. iZotope Ozone has a nice stereo imaging view to show the stereo separation in tracks.
When you mic a big orchestra (or any orchestra) clearly, you can capture the stereo image (which is generally done with non-instrument ambient microphones), too.
Mixing this ambience into the final stereo mix can increase the instrument placing when done correctly, but needs a good room with good acoustics. This is why we have Atmos and other tech. It allows you to virtually position channels in a room, and increases this separation, in theory. Nice when done subtly, ugly when done overboard.
Also, the speakers' sound rendering has something to do with it. For example, the Celans I have can fill a room with positionally correct instruments fairly impressively. What's more impressive is that the Creative Gigaworks speakers are also capable of doing this while being desk speakers (they have kevlar cones and silk tweeters too).
There's a sensible upper limit to pay for audio systems. Mine is neither top of the line, nor too cheap to be true. It's a powerful, yet vintage all analog powerhouse, and I have it for 30 years or so.
After some point, even if the resolution provided by the system is justified by the money it requires, finding source material that can saturate it is nearly impossible. So when you find something you like, you stop there. I plan no changes to my audio system, for example.
There are an infinite number of variables in audio (system, source, room, placement, etc.), but sound resolution and soundstage are not placebo effects. If they were, a $4 headphone would sound the same as a $44 or even a $440 one, yet they don't.
Same for speakers, amps, etc. Until a certain point as I said.
48khz is better for video due to the way it can be divided more evenly with popular video frame rates - and besides it’s the standard for audio alongside video so you might as well use it for everything.
> 48khz is better for video due to the way it can be divided more evenly with popular video frame rates
Is the half sample per frame at 44,100 Hz and 24 fps really an issue? DVDs support 48kHz, but they only support 25 fps and 29.97 fps. You're going to run into so much NTSC content that I'm not sure even divisibility is a thing.
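The arithmetic is easy to check with a quick sketch; note that 44.1 kHz actually divides evenly by 25 and 30 fps, and it's 24 fps and NTSC rates that get awkward:

    # Samples per video frame at common rates: 48 kHz divides evenly by
    # 24/25/30 fps, 44.1 kHz does not divide evenly by 24 fps.
    for rate in (44100, 48000):
        for fps in (24, 25, 30, 30000 / 1001):
            print(f"{rate} Hz @ {fps:7.3f} fps -> {rate / fps:10.3f} samples/frame")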
Why do you keep repeating 24 bits? It is way too much, even at 16 bits the dynamic range is far more than what you can reasonably hear (unless you want to damage your ears!) - especially if you are listening in a real world environment, and not in a completely silent room... Vinyl has a significantly worse dynamic range, yet it is loved by hi-end fans, sometimes even preferred to CD!
24 bits is useful for mixing various sources, but otherwise it is a waste of space.
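The numbers behind that claim: the usual rule of thumb for linear PCM dynamic range is about 6 dB per bit, so:

    # Rule-of-thumb dynamic range of linear PCM: ~(6.02 * bits + 1.76) dB
    for bits in (16, 24):
        print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB")
    # 16-bit: ~98 dB  -- already beyond most listening environments
    # 24-bit: ~146 dB -- headroom for mixing, wasted on final playback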
Tbh, if you're editing, remixing, etc, 32-bit float at higher sample rates obviously make sense. But not for a final delivery, and not for your playback devices. Even when I edit 192kHz 32-bit float audio (e.g. for a slow-motion video sequence where I need to slow the audio to match) my playback setup still only supports 16-bit 48kHz (and I don't need more).
My 20+ years of being involved in broadcast/TV/video technology say otherwise about TV resolution going along with content. It's actually a weird chicken/egg situation that involves more than screens or content, such as delivery, the price of content creation, and customer acceptance.
A larger TV is meant to be viewed from a further distance though. At the suggested distance, your eyes shouldn't be able to resolve the individual pixels.
You're right, the angular size is the true determining factor. But most people don't have a large enough living room to sit 12' from their TV :)
e.g. I've got a 48" UHD TV at 2.5m (8') distance, and a 27" UHD monitor that I use at 60cm distance, and while the TV is beyond the resolution limit of my eyes, the PC monitor isn't.
Even with really good film scanners, 4K seems to be about the limit of what's worthwhile. Even at 4K compared to 1080, film grain is much more evident (not a bad thing per se, but a thing).
You've missed screen size as a factor. TVs have been getting larger by about an inch per year since flat screens took over, and that trend doesn't seem like it's stopping any time soon.
There's also the type of content and your personal visual acuity. If you're young, your eyesight is probably quite capable of resolving much finer details than normal 6/6 vision. I can tell the difference between 4K and 8K at a normal sitting distance on a 65" TV.
However... the difference isn't enough for me to care. I'm much more interested in brightness, colour, framerate and a load of other things over another resolution bump.
As someone who used to run a photography company: total costs of producing 4k content (camera system, computer setup to handle file management and editing, time required to process and manage content, storage, headaches, etc) is absolutely enormous compared to 1080p. Higher resolution content like 8k seems like a nightmare for everyone but the consumer. I do think and hope we’ll get to the point where it’s not a big deal, but even today a brand new macbook pro will struggle to render a basic 1080p composite in After Effects. Even basic 1080p footage editing in Premiere Pro can overwhelm reasonably modern machines.
(Yes, I know that you do X on your Y machine without problem, but my point is that it’s easy to forget to account for all these little costs, and they pile up! Especially for professional workloads where people love to push the bar higher and higher.)
Also, from experience: The bitrate and bandwidth of streaming services is so bad that the actual resolution doesn't really matter.
A good high-quality 1080p export will have significantly higher perceived quality than a "4K" (UHD) video on YouTube, Netflix, Amazon Prime or Apple TV.
I've decided that I'll produce all content in 1080p at the current time, even though I'm recording video in DCI 4K oversampled from a 6K sensor. No viewer is ever going to see the difference anyway.
There are several studies which say that viewer emotional engagement caps out at 1080p, and the only thing that drives more engagement beyond 1080p is HDR/WCG and HFR.
Customers are also not as willing to pay for 4k as those invested in UHD would like to think. It's a "nice to have" and having big numbers makes people feel good, but it doesn't actually make them love the content more.
> I know that you do X on your Y machine without problem, but my point is that it’s easy to forget to account for all these little costs, and they pile up!
Cloud-based After Effects render farms do help significantly though.
I think I do. The point is that you can build your video in After Effects locally at 1080p (or less) on a standard Macbook, and then render the video in the cloud at 8k really easily and at quite a low cost. You don't need to "do X on your Y machine". You can rent someone else's machine for that.
So, let's actually look at that. For a "low-budget" 8K pipeline you'd first need a camera that can shoot at 8K or more. Ideally you'd want more so you can crop-in or do stabilization in post. So you'd be shooting on the Blackmagic Ursa Mini Pro 12K (~$6k) [1].
Usually you're using 5-10% of the material that you shot in the final edit, for some movies that can go as low as 1% (e.g., Apocalypse Now [2]), but let's calculate with a 45min documentary and 7.5% material used.
That means you've now got 30 TB of raw material [3]. You'll obviously use proxies for editing, but at least for color grading and for delivery you'll need access to the full material.
So now you'll need to store 30 TB of raw material somewhere in the cloud accessible to the machine that's doing the delivery. Even assuming you've got a symmetric fiber connection so we can disregard potential traffic limits and transfer speeds for initially uploading the material, you'll still need to pay for the cloud storage.
Creative Cloud has a storage limit of 100 GB for individuals or 1 TB for business plans [4], which is obviously far too limiting, so you'd need to use something like LucidLink. As you'd be working with large video files, you'd need high-performance storage, so you'd have to calculate with their performance plan, which is another $80 per TB per month, so $2400 per month just for this one single project [5].
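For anyone who wants to verify the napkin math above, a quick sketch; the ~830 MB/s data rate is an assumed figure for 12K raw chosen to illustrate, since actual rates depend heavily on codec settings:

    final_minutes = 45
    shoot_ratio = 0.075                       # 7.5% of footage makes the cut
    raw_minutes = final_minutes / shoot_ratio # 600 min of raw material
    data_rate = 830e6                         # assumed ~830 MB/s for 12K raw
    raw_tb = raw_minutes * 60 * data_rate / 1e12
    print(f"~{raw_tb:.0f} TB raw")                        # ~30 TB
    print(f"~${raw_tb * 80:.0f}/month at $80/TB/month")   # ~$2400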
And of course all of the above may be "peanuts" for professional studios, but I am not aware of any hardware/workflow/money/things you can throw at the problem to make the editing and production experience smooth. There are tons of inconveniences, practical barriers, bandwidth issues (and I'm not even talking about network/cloud bandwidth, that's a whole other thing-- even just disk IO, moving stuff around via USB-C, backing up stuff, etc... It's all super labour-intensive and annoying).
Well, if you're editing with proxies (which is really easy with Resolve or Media Composer), the editing experience is really smooth. But that doesn't help for grading or delivery, where you'll still need the full resolution files.
Even with Gen 4 NVMe storage you'll quickly hit bottlenecks at those resolutions.
You'd be surprised at how much buffering and loading is needed to just playback 1080p content (even directly from your local machine's built-in SSD) in the video editor before it's rendered. It's incredibly frustrating to do video work on "regular"/prosumer (macbook pro) hardware.
A lot of that is on Adobe more than anything else, mind you. Same with Premiere, too. These tasks can be a lot faster than Adobe software allows them to be. But there is a limit, of course
I don't think these products are directly comparable. I think Final Cut Pro probably maps better to Adobe Premiere Pro... But even then, it's not a complete overlap. There are things you can easily do in Premiere that you can't easily do in FCP, and vice-versa. In general, I find that Premiere is a bit more powerful/flexible than FCP, but FCP is much better software and does what it does much smoother and much faster.
No, they're for completely different use cases. While it's technically possible to edit a video in After Effects, its main use case is compositing VFX and advanced motion graphics.
A lot of people are watching content on their phones, for one thing.
I create some travel video content and up until last year, most of my clients weren't even fussed about 4K. I had been shooting at 5.1K for most of last year and then dropped to 4K this year because no one has needed it, but the storage and delivery costs are higher.
Cost of delivery. Most stuff is being streamed these days, and nobody wants to have to pay for the cost of streaming 100gb files. Plus the cost of commercial videography equipment is insane. Why upgrade before you have to?
The BBC probably thought that when they recorded over old Dr. Who episodes. It’s up to our grandchildren to decide what they want to see and what they don’t.
Yeah, for TVs that sounds right, but not for desktop displays. I wonder if desktop display pixel density has only stagnated because the TV industry has so much influence on the momentum and the TV industry thinks 4k is good enough.
I think that influence is pretty clear. In my neck of the woods, when pc monitors in 16:9 became a thing, the marketing copy was basically how great they were for watching movies.
Only now do some manufacturers start to put out taller monitors, but they're still rare, at least in my market.
Is it even needed on desktop displays? My experience has been that on anything under 32” you need to use scaling to make anything readable on a 4K display. I'm currently using a 43” 4K display as my main monitor; sitting not very far from it, I can't distinguish individual pixels as it is, and frankly even this is uncomfortably large, so I have gutters set up around the edges so that windows maximise to a comfortably viewable size.
Every time you increase the resolution, the requirements in terms of CPU, RAM, bandwidth and storage explode for production, media distribution and the end user.
It is not just swapping one screen for another on the client's side.
I don't care about resolutions > 4K, like, at all.
Things I care about:
- More HiDPI laptops that are not 16:9, preferably 15"-16".
- More 4K Monitors in the 22"-24" size range, preferably not 16:9.
Having more of those things would have a much bigger impact on my daily display usage than a 5K TV. Apparently, the market doesn't share my preferences in displays very much. We're finally getting 16:10 and 3:2 alternatives in the laptop space, after more than a decade of enduring 16:9, but other than that, almost everything 4K is 16:9 14" and 27"-32".
If you do code or anything that has text in it, the vertical space needed is infinite.
And even if you edit 16:9 video content, you need extra space for your controls.
Incidentally, on a 16:10 monitor you can have a 16:9 movie or game take up the whole upper part of the screen and still have your dock or task bar visible.
For "infinite vertical space" you can do 9:16, i.e. portrait.
That said, widescreen is quite useful for two 8:9 windows side by side, which works less well on more square ratios.
> For "infinite vertical space" you can do 9:16, i.e. portrait.
The top feels too high for me then. Most ergonomic advice says your eyes should be level with the top of the monitor, doesn't it?
Also you will run into some web interface that was designed with landscape in mind sooner or later.
I have a friend who keeps two 16:9 monitors, one landscape and one portrait. That may work.
> That said, widescreen is quite useful for two 8:9 windows side by side, which works less well on more square ratios.
Matter of taste. I prefer multiple smaller monitors so I can keep a full monitor in my field of view. What you describe requires a monitor so large that you can't look at it entirely from programming distance.
> Matter of taste. I prefer multiple smaller monitors so i can keep a full monitor in my field of view
I think this is mostly due to software support, e.g. for ultrawides (21:9) you often can have it pretend to be two or three separate monitors. At that point the only difference between multiple smaller monitors and one wide one is whether you have bezels.
That said, in my opinion the problem is not the aspect ratio but the number of pixels. For example, I could take my 2560x1440 (16:9) display and put a black bar on the right to make it 1920x1440. That would be a "better" ratio, but having the extra horizontal pixels does not hurt (e.g. for a dock). The problem is that Xx1080 is just not enough vertical pixels, period. No matter whether it is 1920x1080 or 1080x1080.
Huawei Mateview is 3840x2560, so you even get a few more vertical pixels for a 3:2 aspect ratio. There also is the Surface Studio at 4500 x 3000, which, however, is an all-in-one.
I haven't tried either of them, but there are (expensive) options.
Definitely agree. For software development I always end up rotating my monitors to portrait. You are always going to have more lines in a file than will fit on a landscape 16:9 screen. A taller screen gives you a much better "view" of the context of the code.
And with window title bars, tabs, URL bars, bookmarks, headings and subheadings, the first Jira issue appears more than halfway down the screen.
Display technology is not the limiting factor in resolution viability - remember that driving pixels is expensive for the whole system. A 4K 120Hz panel takes ~30Gb/s of data to drive. For an 8K display, this is ~120Gb/s.
That's a lot of data, for which content needs to be continuously rendered, read from VRAM, sent over a link during scanout, and finally processed by a fast enough display controller to update the pixels. High-end systems can do this, but for these resolutions to become the norm you need to be able to do it from low-power media boxes and entry-level laptops. Only then will such displays become common and prices go down.
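Those figures are easy to reproduce; this counts raw pixel data only, ignoring blanking intervals and line-coding overhead, so real link rates run somewhat higher:

    # Raw pixel data rate for a given mode.
    def pixel_rate_gbps(w, h, hz, bpp=30):   # 30 bpp = 10-bit RGB
        return w * h * hz * bpp / 1e9

    print(f"4K120: ~{pixel_rate_gbps(3840, 2160, 120):.0f} Gb/s")  # ~30
    print(f"8K120: ~{pixel_rate_gbps(7680, 4320, 120):.0f} Gb/s")  # ~119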
That's by far the cheapest part of it; decoding will eat more power and transistors than just a few differential lines. Add a few orders of magnitude if you need to generate it in a video game.
Differential transceivers and high-speed circuit design are not cheap before economies of scale kick in for that particular unit, but the mention of throughput was not so much to describe the link as to describe the speed at which content must render, scan out and decode...
Sony uses screens with 3840 x 1644 pixel resolution and 120 Hz in their top-level smartphones, so a laptop is probably able to do significantly more.
That's not really how rendering works - there is no 2d or 3d, just variable complexity. On that phone, some UI activities like simple scrolling in some apps might be able to run at a full 120Hz. Many apps won't, and games will either dynamically render at a much lower resolution to keep up (a common performance technique), or just render nowhere near 120Hz, with sub-60 being more likely.
And that's at 4k. This thread is about higher resolutions. To give the same meh performance at 6k, the phone would have to be more than 2x as fast in every metric. At 8k, more than 4x as fast. All while staying within the same power envelope.
Pixel density alone doesn't matter; what matters is angular pixel density. Your TV is 2x bigger than your monitor, but you sit at least 2x further away, so the angular density is the same.
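The standard way to quantify that is pixels per degree of visual angle. A quick sketch (the example sizes and distances are illustrative, chosen to show the 2x-size-at-2x-distance case):

    import math

    def pixels_per_degree(ppi, distance_in):
        # Pixels subtended by one degree of visual angle at this distance.
        return ppi * distance_in * math.tan(math.radians(1))

    print(pixels_per_degree(80, 96))   # 55" 4K TV at 8 ft       -> ~134 ppd
    print(pixels_per_degree(160, 48))  # 27.5" 4K monitor at 4 ft -> ~134 ppd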
8k is double the vertical and horizontal resolution of 4k. You can argue that 4k was deceptive, since it switched from measuring the vertical resolution (2160p) to the horizontal, but 8k is just sticking with the established standards.
4k was the marketers cashing in a puffery token they'd been carrying for quite a long time by accurately measuring the short side of displays.
They moved to measuring the long side AND exaggerating by rounding up in the same generation, stealing the name of a slightly-better existing standard in the process.
You're right that 8k makes sense so long as you accept 4k, though.
Well, 4K is called that because when 1080p became common at home, cinemas switched to DCI 4K and suddenly TV manufacturers had to compete with that, so they branded the closest they could get to DCI 4K as their own "UHD 4K".
Originally 4K came from the DCI spec for digital cinema, where it was 4096x2160. For consumer use it was shrunk to UHD, which is 3840x2160, but cinemas still use 4K (although most of the screens actually use 2K).
Now 4K refers to any image roughly 4000 pixels wide, according to Wikipedia [1].
All new/upgraded auditoriums installed 4K (laser) projectors when I left the business 5 years ago. So not sure if most auditoriums only have 2K these days?
IMAX has only started switching to 4K laser projectors a few years ago, they were one of the last hold outs for 2K projectors (and many IMAX cinemas still use 2K projectors sadly).
Cinema projectors. It should be said that I worked in the business in Norway which had converted all auditoriums to digital in 2010, so many theatres were looking at upgrading their projectors around the time I left for something else.
There really isn't much benefit of going above 4K with either TVs or desktop displays and multiple drawbacks. I run dual 4K monitors on my desktop and the pixel size is smaller than you can resolve at reasonable viewing distances, while I've yet to find an OS that deals with HiDPI displays well (Windows and Linux are both terrible at this). If you want to game you are pushing 4X the pixels as with 1080p so you don't get good framerates unless you spend an exorbitant amount on a GPU.
For content viewing, 4K is probably the upper limit of what you want, and if you game there probably isn't much benefit for going above 1440p.
What does this mean? The "retina" resolutions are all over the place and depend on the device size and type. Also they seem to always be somewhere in the middle of pc options, e.g. for the macbook air 13 you have 1920x1200 (average pc) < 2560x1664 (air or expensive pc) < 3840x2400 (overkill pc). For your example at 27" that seems to be 4k (average pc) < 5k (studio display) < 6k (dell etc.) < 8k (overkill).
All displays from Apple have different resolutions. Natively they are comparable to previous Apple displays of half the linear density (1280x800-ish for 13”, 1080 for 15” and so on).
Okay, so you can "see the difference" (positively, I assume?) of this to a 27" 4k monitor.
What about 27" 6k or 14" 4k?
Side rant: Yeah, apple makes great stuff, but the naming is obnoxious. It is not "hidpi" it is "retina", not "high refresh rate" but "promotion" and then you have to look through the marketing material to figure out how '14" retina' compares to a normal UHD+ display.
If I can see the difference I would assume I will be able to see a difference between 27" 4k and 27" 6k.
My Canon 80D has a resolution of 6k x 4k.
I can also perceive the sharpness of text.
I don't mind not running 8K if driving the display pixels consumes too much energy. But the performance argument shouldn't really exist: it's many more pixels, true, but there are plenty of techniques to decouple render resolution from display resolution.
Even in gaming they have adaptive rendering, but the display resolution itself stays the same.
The power consumption is not so much on the gpu side. Denser pixels require stronger backlights because less light gets through. So even if you were to run the screen at half the max resolution, you get worse battery life. Similarly, oled is awesome tech, but high brightness requires more energy than led backlights.
My comment was based on visual acuity limits. A person with 20/20 vision and a 32" 4K display will hit that limit at around 2ft. Going to a higher resolution you won't be able to see individual pixels anymore unless you have better than average eyesight or sit close to your monitor.
The GPU bit was purely about games and how many pixels you are pushing in games vs cost for a minimum quality and performance level. For content consumption or creation use it isn't a problem.
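The 2 ft figure above checks out under the usual 1-arcminute assumption for 20/20 acuity; a quick sketch:

    import math

    # 20/20 vision resolves about 1 arcminute; find the distance at which
    # one pixel of a 32" 4K display subtends that angle.
    diag, w, h = 32, 3840, 2160
    ppi = math.hypot(w, h) / diag                   # ~138 ppi
    limit_in = (1 / ppi) / math.tan(math.radians(1 / 60))
    print(f"~{limit_in:.0f} in (~{limit_in / 12:.1f} ft)")  # ~25 in, ~2.1 ft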
Speak for yourself. I want a single curved 55" 8k display to replace my three 32" 4k monitors. Going from 6k to 8k with no bezels is an upgrade I will pay $$$$ for.
I like having a 4k monitor with a somewhat underpowered video card for the task. Most games that are graphically intense let you set the 3D render scale to something lower. At 70% render scale I get the benefit of sharp 4K text and UI stuff but with the 3D parts being closer to 1440p resolution. Upscaling can be really ugly at lower pixel densities, but rendering 1440p scale on 4K is hardly distinguishable from 1440p native, in my experience.
I've got a 5K 27" display at home (the much maligned LG Ultrafine) and generally use a 4K 27" one at work. The difference is very obvious.
That said I wouldn't see a huge amount of point in going far _beyond_ 5K at that screen size, and 27" is about as big as I want to go for a screen, so for my purposes we've hit the limits (albeit only expensively).
I've been wearing glasses for most of my life due to nearsightedness, and I don't care at all about density anymore. Even 4k is a bit luxurious. As I get older, it matters even less. I simply cannot tell the pixels apart anymore.
The real limiting factor is display size at the current densities. I can always use more real estate. A 49" super ultra wide is still fairly expensive (but getting cheaper quite quickly) even at not 4k densities.
Even with phones, Apple pretty much nailed it with the Retina display. The only thing I notice with newer displays is that the colours are better. In terms of smoothness, I can't tell the difference.
An HMD is much closer to your eyes than a computer monitor, and a monitor will be much closer to your eyes than a TV in your living room. The closer it is to your eyes, the more PPI matters.
Here in Japan, every big electronics store sells 8k televisions from a number of Japanese brands. NHK also broadcasts some (all?) of its content in 8k. I don't know much about the technical aspects so I can't attest to whether it is “true” 8k being streamed, but it looks much better than the 4k OLED televisions, which also look amazing.
NHK only broadcasts 8K on one of their channels. And AFAIK all 8K TVs are in the premium segment.
The EU has new energy efficiency requirements, which basically means 8K in the EU market won't happen unless there is substantial improvement in the technology. 8K has more exposure in Japan mainly because they were the first proponent of the technology.
I'd much rather we settle on 6K and HDR10, leaving room for 60Hz or 120Hz for the types of video that require those framerates.
Samsung's already sidestepped the 90W regulation for the EU by shipping 8K displays this year by dimming the display in factory settings (you can turn this setting off)
There's also nothing physically stopping engineers from designing more efficient 8K displays that are sub 90W (in fairness they generally are more than double that atm, though)
Except for cost. Even before the EU regulation, 8K had been a hot topic between panel vendors and TV brands, pushed mainly by Japan. And the market was showing signs that 8K might be just like 3D TV. The EU regulation was more like the nail in the coffin, forcing panel makers to focus on aspects other than pixel density, which is something much harder and more costly to achieve with self-emissive panel technology.
Also worth mentioning that not all pixels are created equal. Many years ago a Nokia phone’s camera had 41MP, but the images were worse than those from many 12MP phone cameras.
Yes. 4K TV is enough for people with 20/20 vision. That is roughly 80% of the population. And even with better vision, the differences are mostly only visible in static screen comparisons. In motion, as in video, they are not as visible.
We really should have taken the middle ground between 8K and 4K and settle on 6K.
Instead we now have 4K that is good enough but not the best for most people.
It’s no secret that higher pixel counts are harder to achieve since computational power required grows approximately quadratically while GPU and CPU power improvements are more linear.
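Concretely, each resolution step roughly quadruples the pixel count:

    # Pixel counts quadruple with each resolution step.
    for name, (w, h) in {"1080p": (1920, 1080),
                         "4K":    (3840, 2160),
                         "8K":    (7680, 4320)}.items():
        print(f"{name}: {w * h / 1e6:.1f} MP")
    # 1080p: 2.1 MP, 4K: 8.3 MP, 8K: 33.2 MP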
We also just didn't have any good ways of driving such high PPI displays until recently without having to resort to janky hacks like dual cable setups (and the lines down the middle that those tend to have).
A lot of laptops still don't support HDMI 2.1, and DP 2.0/2.1 only just came out (and I don't know if any displays are using it yet). There's also the bullshit the HDMI forum pulled where they relabeled HDMI 2.0 as HDMI 2.1 (years after HDMI 2.1 was out and established as "the one that can do 4k120hz no chroma subsampling" (a lot of implementations were around 40Gbps instead of the full 48Gbps because it was cheaper, and they didn't need the full 48Gbps for 4k120hz 10bit)), despite them having hugely different specs.
Displayport 2.0/2.1 is also a clusterfuck because they decided that it would max out at either 40Gbps or 80Gbps depending on the implementation (DP40 and DP80), and companies don't generally tell you what their implementation actually supports.
Macbook Pros only got HDMI 2.1 this year, iirc base and airs still don't have it. Intel iGPUs only do HDMI 2.0 and DP 1.4, AMD's Zen4 laptops can do HDMI 2.1 and DP 2.1 (idk what bandwidth). For dGPUs, AMD's 6000 series is HDMI 2.1+DP 1.4 (but there are issues with DSC), while 7000 series is HDMI 2.1 and DP 2.1 (DP80). Intel Arc dGPUs are DP 2.0 (DP40) and HDMI 2.1. Nvidia's 30 and 40 series are HDMI 2.1 and only DP 1.4.
edit: Also, people don't realize just how fucking huge an 8k framebuffer is, and you need multiple for a swapchain. Even for just basic 8bit colour you're looking at ~133MB per framebuffer.
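The arithmetic, for the curious:

    # One 8K RGBA8 framebuffer, and a triple-buffered swapchain.
    w, h, bytes_px = 7680, 4320, 4
    fb_mb = w * h * bytes_px / 1e6
    print(f"~{fb_mb:.0f} MB per framebuffer")      # ~133 MB
    print(f"~{3 * fb_mb:.0f} MB triple-buffered")  # ~398 MB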
Great comment. I just bought a new Dell 6K display and I'm struggling to imagine a purpose for 8k, but it's incredible that we have the technology to drive that.
> I just bought a new Dell 6K display and I'm struggling to imagine a purpose for 8k
It's for large monitors. I run a 43" 4k tv as a monitor. I would absolutely benefit from a similar sized tv in 8k. And people who run closer to the 50" size would benefit even more.
I think you're probably right about traditionally sized monitors, though.
I'm a bit surprised we don't see point-to-point ethernet cables doing this. They are standardised, reliable, and carry a lot of data. My napkin math says an 8K display at 40 bits per pixel and 60 fps is about 80 Gbps, within reach of fancy (100GbE) ethernet.
And, if consumer electronics can drive down the costs of these interfaces, everyone wins.
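Redoing that napkin math (uncompressed, ignoring any transport overhead):

    # Uncompressed 8K video over a network link.
    w, h, bpp, fps = 7680, 4320, 40, 60
    gbps = w * h * bpp * fps / 1e9
    print(f"~{gbps:.0f} Gb/s")  # ~80 Gb/s: past 40GbE, within 100GbE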
That's not a lot for people in the market for 8K monitors. 24G VRAM in the form of used 3090 cards is way cheaper than high end monitors. But yeah lack of DP2.1 support on current-gen cards like the 4090 is a problem
> you need multiple for a swapchain
There should be just two with variable refresh rate
On an 8GB (eg. base Macbook) or 16GB laptop where you're sharing ram between the CPU and iGPU it's a ton.
People want 8k for better PPI for text rendering, not for games, meaning they want to use it with their work laptop.
> There should be just two with variable refresh rate
There's still a lot of software out there that's using DXGI blit model instead of DXGI flip model, and a triple buffered vsync+windowed blit app means you're looking at 5? framebuffers in flight at once. (ignoring any additional intermediate buffers for each app, and potentially having multiple overlapping windows drawing onscreen at one time).
Thanks to Moore's Law, computer power has been growing exponentially since forever. Single-core performance has stagnated, sure, but total performance is still following the curve. Graphics is easy to parallelise, so throwing more cores at it is not a problem. In fact, this is precisely what GPUs do.
In other words, the available "operations per pixel per second" have gone up over the years, despite the increasing resolution!
In fact, we're at the point now that we can even do real time ray tracing, which was unthinkable at any typical resolution just a decade ago. Top-end RTX 4090 GPUs can even do ray tracing at 4K, which is just crazy.
Yes, but for GPUs performance scales fairly close to 1:1 with increasing transistor counts. Newer GPUs generally have more of the same type of compute units tiled out.
The original general-purpose GPU, the NVIDIA 8800 GT, had 112 cores. The RTX 4090 has 16,384! Each core has pretty much remained the same: a 32-bit arithmetic/logic unit (ALU).
Moore’s law is a thing yes, but in the real world, power consumption and heat dissipation issues seem to be the actual constraints rather than transistor counts.
Vertical scaling is of course possible, but it’s not really useful for driving a higher resolution monitor in your living room because of bottlenecks being completely elsewhere.
>Power is definitely a major factor. Unless games do not exist in your world
Maybe you don't realize you're on a worldwide forum where anyone can respond to anyone else for any reason? I'm perfectly entitled to respond to your comment as someone who does not game. The person you replied to isn't the only person in this entire thread. And suggesting that it's strange for me to reply to your comment is just strange in itself.
For the 10 minutes every other day I spend doing it, it's just fine. I don't waste hours gaming like some people. LCD displays interest me, and so I came into the thread - is that reasonable enough to satisfy your question?
I think PC displays are not the innovation driver here. They get technology leftovers from TV. And TV doesn't care about Pixel density because of viewing distance.
Meanwhile phones (and apple's VR thing) have massive pixel density, but PC displays are more like scaled down tvs than scaled up phones.
> It's amazing that pixel density has been stagnant since 2014, when the first 5k TV (low ppi) and 5k desktop displays (>200 ppi) were released. It's 2023 now and it still takes a kidney to get 5k >200 ppi, and we only recently got the 6k >200 ppi option for 2 kidneys. Pixel density, however, is stagnant.
Maybe this is so for desktop size screens, but for larger things, the big advance in the last 18 months is real 8K 60Hz displays at 65 to 80 inch diagonal size going from "absurdly incredibly expensive" to being priced similar to where 4K displays were when they were introduced. 8K displays are absolutely viable now.
There is obviously a lack of real high bitrate 8K content unless you're shooting it yourself.
The rationale behind buying PC monitors for gaming purposes is very different from the rationale behind buying TVs.
As a PC gamer, you take your GPU into account when choosing a monitor. It doesn't help to buy a huge screen, even if it's relatively cheap, when you can't afford the high-end GPU you need to drive those pixels.
The TV buyer doesn't face this problem at all. Usually no signal source is connected to the (smart) TV anyway, and the internal streaming apps have no problem to play 4k content, as long as the subscription allows it.
Pixel density is pointless after you get above your retina’s “resolution”. I can’t really tell 4K from 1080p from my couch. What I can tell apart is HDR from SDR. I think a lot of progress will continue on that area because it results in higher perceived quality.
If I had a pet owl, they’d probably be the only ones in the house to be able to tell content apart by resolution.
Even for my 27” desk monitors, it’s almost pointless to go beyond 5K.
It just doesn't matter that much for the average consumer. Most office workers (that is, not silicon valley programmers) work on 1920x1080 and that's fine. There really is not much to gain by doubling or tripling the resolution.
I'm working on a 27" 2560x1440 screen. I can see pixels, but that really doesn't matter. Text is readable, nothing is blurry, I can do my work and get on with my day. Screens are good enough, they work, and there is not much to gain by having higher resolutions.
It's probably a matter of demand vs cost; I'm confident technology can make displays at a higher DPI, but... why? If it's not for consumer use (e.g. 4K, now 8K screens which are IMO completely gratuitous), if they can't scale up to produce millions of panels, it's not worth the investment beyond the research of whether it's possible.
Anyway, the next frontier is (has been?) VR, which needs / needed high DPI but small screens.
The only area where I’d think we want more pixels would be medical imaging, but even then we can always zoom in on the interesting parts of the image (and rely on machines to look for interesting parts).
My phone screen is >600ppi. My e-book reader (e-ink) is 300ppi. For me 300ppi seems to be the point at which things start to look "pixel-less". A 300ppi e-ink display looks remarkably close to paper print quality. I think popularity of sans serif fonts online is mostly due to serif fonts looking shit on low PPI displays. It would be really nice to have a 300ppi monitor just so I could switch to more pleasing fonts.
There are more metrics than just pixel density that they need to optimize for. HDR, WCG, refresh rates and latency, aging/color stability, ...
4k is good enough for most needs, so the demand is elsewhere. Plus the gaming market is limited by what GPUs can render.
I feel like these DPIs have improved beyond the point of it mattering. Diminishing returns given the limits of human vision. 5k, 6k, seems more about marketing, like 96khz in the audiophile world. Makes little subjective difference.
I'm guessing it's due to the influence of the TV market on the digital display industry. TV users don't care for >4k, which could make manufacturing >4k more difficult. It's only a guess, and not a very educated one either.
I got a 218 ppi 27 incher, and I was disappointed when English is crisp but my native language is a pain at small font sizes. My 350+ ppi mobile renders them fine at those sizes.
Talking about PPI without including distance to the screen is like talking about how far your car can drive on a tank of gas without mentioning how big the gas tank is.
220 PPI on a TV screen that you're sitting 8 feet away from? More than most people will ever notice. 220 PPI on a monitor that you're 18-24 inches away from? Probably the ideal density. On a phone or tablet? Meh. On a VR display? Absolutely unusable.
Oh god. I hate that Apple helped stick this BS in people's heads.
The human eye can resolve way, way past 220PPI. Even at distance.
I'm sitting here typing on a surface book with a 13" panel that is 267PPI and I would adore it to be double that. I run at native resolution with no scaling.
For the "average" human eye (20/20 vision) it's something like ~338ppi at 25cm right? If you have better vision (my corrected vision is 20/10 which is near the theoretical max for a human (20/8 I think is the max~?).
It's aggravating that people are like "over 300PPI is the max so screw improving". It's literally just the max at which you can accurately discern the spacing between pixels. Higher PPIs/DPIs still lead to an increase in objective clarity and crispness, as you are able to improve aliasing etc.
Some of my old devices had PPIs in excess of 520PPI and I absolutely adored it (Note Edge). I would kill for a desktop monitor with similar PPI.
So yeah, I guess "diminishing returns" is a thing, but I wish it was at least a readily available option. I would adore a 27-30" monitor with ~500ppi... sighs
I don't have a definite reference for anything between 185 and 217, but it is a big difference visually. Somewhere in that range, the pixels disappear at a reasonable viewing distance.
There's no reason to go higher. 4k at most sizes in most living rooms is already at a point of diminishing returns. 8k TVs do exist, but nobody bothers making content for them. There's no appetite for it.
I agree 4k is enough for TVs, but not for computer displays. 27"@4k still looks pretty bad on a desktop display. For desktop display, 220 ppi (27"@5k or 32"@6k) is perfect.
Yes but those displays are stretched over a much wider field of view. When I sit on my couch and look at my TV it is covering maybe 30 degrees of my FOV. A VR headset typically would like to have 100+ degree FOV, which means a 4K VR display will have way less fidelity than a 4K TV.
This is the fundamental struggle with VR displays. Human eyes have very good visual fidelity and it would be real nice to match or exceed that with a VR headset. This presents many technical challenges.
Yes, it is around 3400ppi. This has been achievable for a bit in very small displays (these are 1.4in), although I believe there are difficulties with anything larger.