View Full Version : 8K not OK
Hans Tucker
28-12-2022, 03:09 PM
The progression from Full HD to 4K was relatively rapid, but alas the same can't be said for 4K to 8K ... it seems to have stalled, with no progress over the past year.
The local JB HiFi has had this Samsung 8K 65" TV for sale for the past year, with a couple of attempts to move it on via price reductions, but no takers. It made me think that 8K never really progressed .. it fizzled out. So, is 4K the limit?
Oh .. should add that it has the worst energy star rating at 1 star, whereas its 4K counterpart is 4.5 stars. Does anyone purchase based on the energy rating?
mura_gadi
28-12-2022, 03:52 PM
For energy ratings, I do factor them in a lot, especially for TVs; you don't turn TVs off anymore, you place them in standby mode.
A plasma TV can draw around 60% of its running power in standby mode. If your plasma uses about 180 watts to run, that's roughly 0.1 kW in standby, times 24 hours, or about 2.4 kWh a day. Check your energy bill, but say 25 cents a kilowatt hour: 2.4 kWh x 25 cents x 365 days comes to roughly $220 a year, which is not a small sum of money for something you haven't even turned on yet.
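A quick back-of-envelope sketch of that arithmetic (the 180 W running figure, the 60% standby fraction and the 25 c/kWh tariff are just the example numbers above, so substitute your own; without the rounding to 0.1 kW it lands a little higher):

# Rough yearly cost of standby power, using the example figures above.
running_watts = 180        # plasma running power (example figure)
standby_fraction = 0.60    # claimed standby draw as a fraction of running power
tariff = 0.25              # dollars per kWh (check your own bill)

standby_kw = running_watts * standby_fraction / 1000   # ~0.11 kW
kwh_per_day = standby_kw * 24                           # ~2.6 kWh per day
cost_per_year = kwh_per_day * tariff * 365
print(f"{kwh_per_day:.1f} kWh/day, about ${cost_per_year:.0f} a year in standby")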
AstroViking
28-12-2022, 07:58 PM
There's also the bandwidth issue to deal with. Double the X and Y resolution and there's a quadrupling of the amount of data required for each frame.
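To put rough numbers on that scaling, here is an illustrative sketch of the raw pixel counts and uncompressed frame sizes (assuming 3 bytes per pixel, before any codec gets involved):

# Raw pixel counts and uncompressed frame sizes at 24 bits per pixel.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    pixels = w * h
    mb_per_frame = pixels * 3 / 1e6
    print(f"{name}: {pixels:,} pixels, ~{mb_per_frame:.0f} MB per uncompressed frame")

Each step up roughly quadruples the data per frame, which is what the compression and delivery chain then has to absorb.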
I think that 8K video will eventually arrive, but there's the whole production / distribution chain to consider - from "filming" at 8K, to processing the video streams (non-linear video editing requires massive compute resources), to physical media capacities for people who like region-locked discs, and the Internet speeds / throughput to stream it.
In the meantime, I'll stick to reading a good book.
DarkArts
29-12-2022, 12:26 AM
Depending on viewing distance and screen size, many people would be hard pressed to see the difference between 2K and 4K, let alone 8K. I think that's why it hasn't taken off - it doesn't offer an advantage to the average home viewer.
My current 55" OLED screen uses 0.1W on standby and my old 42" plasma uses 0.7W on standby - measured with the same power meter.
I am still hanging on to my own Panasonic plasma, which is 1080p. At my viewing distances, I wouldn't appreciate 4K, let alone 8K.
dikman
02-01-2023, 08:40 AM
I've always said that most people wouldn't know a good picture if it slapped them in the face (people often oversaturate the colour and contrast, for example). For the majority of viewers HD is more than adequate, with 4k being a status symbol.
It comes down to content.
About the only streaming 8K content readily available at the moment is demos on YouTube.
But 4K content has become ubiquitous, with an increasing number of
movies on services such as Netflix and Apple TV available in Ultra HD.
In the case of a smart TV, streaming service apps for Netflix and the like are typically preinstalled. Pay for your subscription with the 4K premium and off you go.
If you play content via some connected device, such as a home theatre computer over an HDMI connection, that device of course needs to support 4K. Older computers often aren't fast enough.
A Foxtel box offers 4K on sports and some movies.
Even consumer video cameras have no shortage of 4K offerings including
many in the sub-one-thousand-dollar range.
But in home video, there is now no shortage of 8K offerings as well. Less than $1500 will buy you a Nikon camera that can shoot 8K at 30fps.
No free-to-air TV stations in Australia are broadcasting in 4K at the moment. Elsewhere in the world there are trials of codecs that support 4K, but they require the TVs to have tuners that support them. So if you buy a 4K TV and watch mainly free-to-air, you are really only watching HD, which the set then upscales.
In Japan, NHK has an 8K channel.
And an increasing number of movies are shot in 8K on high-end cameras from companies like RED.
At the end of the day, the take-up of 8K comes down primarily to the small amount of 8K content and the bandwidth required. Plus it comes on the back of 4K itself being relatively new-ish; a lot of consumers probably still have not-so-old 4K sets, and many HD sets that are still reliable and that they are not about to part with.
My eyesight at a distance is pretty good. The last time I had an eye
exam the optometrist said, "Your vision exceeds that of the requirements
of a fighter pilot". It made such an impression that on the drive home,
all other vehicles looked like targets. :lol:
Seriously though, whenever I have shown 4K content streaming on a TV here to friends and family, their reaction has always been that they find it stunning compared to HD.
It's a self-evidently dramatic improvement in image quality that makes it more realistic.
Hmmm. For those who can't see the difference as dramatically obvious, it
might be time for that eye check-up. :thumbsup:
astro744
02-01-2023, 01:19 PM
Give it a few years and you'll be asking the same question of 16K. I just typed "is 16K TV an option yet?" into Safari and got a response saying most people wouldn't notice the difference between it and 8K. You'll also read that Sony has a $5M 16K set with plans to develop further. 32K is then mentioned in the search results.
The trash on TV won't be any better at 4K, 8K or 16K, whether it is upscaled or not. I know many people stream movies, but I'd rather spend money on other things than internet speed and bandwidth just to get good streaming; each to their own.
DarkArts
02-01-2023, 09:10 PM
I think we should all be a little circumspect when anything is given as "self evident".
I expect people will mostly continue to spend their money wherever there's content to support their wishful thinking, and then convince themselves that their purchase was worthwhile (classic examples being Hi-Fi systems and vinyl records). But will they (most people) actually see or hear the difference? Doubtful.
The medical science and maths are pretty straightforward. A very good human eye (6/6 in metres or 20/20 in feet visual acuity - suitable to be a fighter pilot without corrective lenses) can resolve approximately 1 arcmin. From that, the maths gives - for an optimistic 100 inch screen size - the following maximum viewing distances at which each format's resolution can still be fully resolved:
2K (FHD/regular BluRay) - 3.96 metres
4K (UHD/4K BluRay) - 1.98 m
8K (https://en.wikipedia.org/wiki/8K_resolution) (still called UHD, apparently) - 0.99 m
This all scales linearly for screen size, pixels, viewing distance and visual acuity.
If you sit further away than the distances above, your (20/20) eye cannot resolve all the additional detail and the higher resolution is lost. As an example, at my current viewing distance (measured) of 3.3m, and assuming excellent vision, I would not see the difference between 4K and 8K, but I might see some extra detail in 4K ... not forgetting that's for an extremely large 100 inch screen. For my 55" screen, I would not see any difference between 2K and 4K (which is what I observe ... yes, of course, I've tested it!).
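For anyone who wants to reproduce those figures, here is a minimal sketch of the same geometry (assuming a 16:9 screen and 1 arcmin acuity; screen size, pixel count and acuity are all just parameters):

import math

def max_viewing_distance_m(diagonal_inches, vertical_pixels, acuity_arcmin=1.0):
    # Distance beyond which an eye with the given acuity can no longer
    # resolve individual pixels on a 16:9 screen of this diagonal size.
    height_m = diagonal_inches * 0.0254 * 9 / math.hypot(16, 9)
    pixel_m = height_m / vertical_pixels
    angle_rad = math.radians(acuity_arcmin / 60)
    return pixel_m / angle_rad   # small-angle approximation

for name, rows in (("2K", 1080), ("4K", 2160), ("8K", 4320)):
    print(f'{name}: {max_viewing_distance_m(100, rows):.2f} m for a 100 inch screen')

That prints 3.96 m, 1.98 m and 0.99 m, matching the list above, and because everything scales linearly you can halve the diagonal to halve each distance.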
Many people will buy whatever the marketing convinces them they "need", like overly-large SUVs, cars that can do 280km/hr, $1000+ smartphones, the latest clothing 'fast fashion' and ever-larger screens with pixels too small to resolve with the human eye from typical viewing distance.
Renato1
02-01-2023, 09:14 PM
When I walk past 4K TVs in shops I look at them and think,
"Very nice....shame it's not real".
Same when I see most published photos of birds or other things.
I don't see the world with high contrast, high dynamic range vision.
I recently decided it wasn't worth paying for High Definition on Netflix, and dropped down to Standard Definition (i.e. DVD quality). It really doesn't worry me if I can't quite see the pattern on someone's tie in a detective drama.
Regards.
Unfortunately, when it comes to the human visual system, there is absolutely nothing straightforward about the medical science and maths. :)
Sounds absolutely plausible, but it unfortunately fails to explain why people can see the difference. :)
A common mistake some make is that they know a little about how
digital cameras work and assume the human visual system is analogous.
For example, they assume that the three cone cell types, because of their
different spectral sensitivities, must work in a way analogous to the RGB
sensors of a camera. Yet how we perceive color doesn't work in that
simplistic way at all and there are a set of beautiful and startling
experiments one can do that throw that model completely out the door.
Likewise some know a little about visual acuity and the established one
arc-minute minimum angle resolution for acute vision that you can
determine from an eye chart test or by trying to split two stars with the naked eye.
They then might assume the eye must work like a CMOS sensor array,
throw in a little about the Rayleigh criterion and diffraction limits,
chuck in the Nyquist sampling criterion, do the numbers and convince
themselves there is no way they are going to see a difference between
two different resolution TV images at some distance because the
theory says the light from two point sources will overlap the two
photodetectors in the eye. QED.
Fortunately for us humans, when viewing complex scenes rather than just
a couple of dots, such as stars, we can do way, way better than that.
In fact, in some visual tasks our visual acuity is as good as 5 arc-seconds. That means an observer can reliably detect features a little over 0.02 mm apart at a distance of 1 metre. That's like picking out a twenty-cent coin from more than a kilometre away.
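A quick sanity check on those angles (the 28.65 mm diameter of an Australian twenty-cent coin is my figure, added for the comparison):

import math

acuity_rad = math.radians(5 / 3600)   # 5 arc-seconds in radians

# Smallest separation resolvable at 1 metre under that acuity.
print(f"{1.0 * acuity_rad * 1000:.3f} mm at 1 m")   # ~0.024 mm

# Distance at which a 28.65 mm coin subtends 5 arc-seconds.
print(f"{0.02865 / acuity_rad:.0f} m")   # ~1180 m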
We do this through a mechanism termed hyperacuity. The first experiments
demonstrating it go back to the late 1890s.
For certain visual tasks, like determining if two lines are aligned, it kicks
in and we can detect any misalignment with a precision ten times
better than with visual acuity.
We don't know entirely how it works - there are several theories - but like
color processing what we do know is that it is a function of both the brain
and the eye, not just the eye alone.
The fovea plays a big part, and microsaccades - those little jerky involuntary eye movements we make while processing a scene - appear to contribute to our being able to see important details finer than simple diffraction-limited visual acuity would predict.
In fact, part of the trick appears to be having more than one photoreceptor cell triggered at a time by a line.
The human visual system does a lot of tricks to get the most out of its
hardware. Prior to writing this response I went for a walk. As I came
around the corner of my house to my driveway, I spotted a small snake
that my path would have intercepted in about 5 metres if I had not
screeched to a halt. In seconds my eye and brain processed it. I could
see its tiny black eyes as it raised its head toward me, I could see the
flitting of its tongue and I could see it was a green tree snake and not the
similar sized venomous whip snake I had only seen a couple of days
earlier.
Looking back at what I saw, I know exactly where I was standing but
anything that was not the snake is somewhat of a blur.
We don't look at the world like one big camera taking it all in at once.
Our eyes flit around and, together with the neural network in our brain, we process
the important things - the snake on the path, a person's face or what
we deem to be important when processing the images from a TV screen.
We do it in the time domain. Except when we are at the optometrist
or splitting double stars, we don't rely on just raw visual acuity.
We use hyperacuity for many visual tasks as well.
Electrical engineers are well aware of the human capability of hyperacuity
and papers modelling it appear in the professional journals. Designers of
high-end video cameras, computer graphics engineers and
engineers who design laser printers are particularly aware of it.
They appreciate that hyperacuity allows the brain to extract visual information from high-resolution imagery - things like smooth curvature detection, which helps the brain process things like faces and makes them look more realistic to us.
We also do see the world in high dynamic range. We would be in trouble
if we didn't. But that's another topic and one that in part blends in with
how we process color in an area of the brain at the back of our heads.
The_bluester
04-01-2023, 11:44 AM
From my own perspective, we have a good-sized 4K set which we watch from about 3 m. The colour rendition was an improvement on the old 2K, slightly smaller set it replaced, but resolution-wise there was not a huge amount in it. For 4K to 8K, I reckon they would have to hit the point at which we bought the 4K set, that being when 4K sets were almost at price parity with 2K sets and easier to lay your hands on.
We are stuck with Malcolm (Turnbull) spec fixed wireless NBN, so we can't rely on having the bandwidth to stream 4K without resolution drops, let alone 8K. With physical media slowly going the way of the dodo, it is going to be a pretty tough sell here to see any benefit in an 8K set.