The Future Of Display Technology

The TV-making industry has had a hard time
in recent years – just ask Sony. After the advent of the high-definition plasma
and then the LCD TV, things looked rosy for a while, when prices were high and
demand was growing.
Too many companies jumped on the bandwagon,
though, and soon prices were dropping dramatically and with them the margins
necessary to cover the research investment that had been made to build big
digital displays.

3D Failure

I know Sky TV still thinks it’s great, but
the rest of the world has pretty much accepted that 3D TV has failed to convince
people to buy this service or the TVs to watch it.
Marketing material showing TVs doing things
they patently can’t is always a hint that expectation is unlikely to win a
confrontation with reality, and that’s exactly how it was with 3D. However, 3D
TV was doomed from the outset, coming as it did off the back of yet another attempt to float it as the must-see technology for film.

Most people agree 3D TV is dead. Perhaps creating promotional images like this one, which show the TV doing something it can’t possibly do, didn’t help.
You can blame the studios, James Cameron or
whomever, but film goers across the world have been aggressively fleeced
numerous times over to see films in 3D. Or rather, to see limited scenes in
poor stereographic separation, to be wholly accurate.
If you don’t believe me, try going to a
movie, waiting for a dialogue-heavy scene and removing the specs. Hmmm… looks
suspiciously like 2D.
Very few films are entirely in 3D, and most
look much better when viewed in much brighter and definitely clearer 2D.
In terms of the TV, the onslaught really
got going in 2012, when almost every new TV model presented at the CES show in
Las Vegas was promoted as 3D. In 2013, almost none of the new products were
pushing this aspect.
It could be argued that the problems with 3D
on TV were not entirely separate from those in the cinema, where a price premium
was charged for what the public expected to be inclusive.
However, 3D had many other problems, some
of which the designers have since desperately tried to address. First of these
was the glasses, which in early TV products you needed to buy separately.
People don’t like wearing them, and they often don’t work for people who have
prescription lenses or who suffer from one of a number of ocular conditions,
including eye dominance.
Even those who don’t mind the glasses
and have perfect vision can suffer eye strain, headaches and, for some,
nausea (or is that the movies?).
Many people spend an average of five hours
a day in front of the TV, so making a good portion of them feel unwell probably
isn’t going to be a popular move.
There have been some attempts at
glasses-free 3D, but none of them have been totally successful or been
delivered at an attractive price.
When you also factor in that it halves the
image quality, and in games the frame rate too, you can see that 3D had a big
public perception problem that was unlikely to be easily overcome.
However, most of this was entirely
predictable, because we’ve seen 3D launched to a marketing onslaught before,
even in my lifetime. The first 3D movie was screened in the 1920s, before it
was brought back in the 1950s. I recall seeing some horrible 3D movies in the
early 80s, and now it’s back again.
It’s failed each time, because it’s not
remotely like the 3D experience we get in our normal lives every day, and it’s
not different to that in a good way either.
“Help me, Obi-Wan Kenobi, you’re my
only hope.” In a word, Leia, ‘No’.

Touchy Failure

As a person who writes about TV and film,
along with computers,
I always find it fascinating that some
films are massively influential, even if they’re actually not great. One of
those must be the Tom Cruise sci-fi actioner Minority Report, another in a long
list of ideas of the great Philip K Dick that Hollywood has plundered.

In the movie, Tom uses a computer system by
gesture, and some technical people, mostly in Microsoft’s HQ in Redmond, saw
giant dollar signs instead.
All those subtle, tiny mouse movements were
all wrong, they concluded; we should be waving at our systems like they’re
brave soldiers returning from war.
I don’t need to really detail how that’s
working, because it’s utterly obvious to all but the most hardened Microsoft
fan that it isn’t.
It’s a shame therefore that Microsoft got
so entirely engrossed in how this notion would take over the world that it
redesigned Windows around it, forgetting that at the time almost nobody in the
world had a touch-controlled monitor.

One can only assume that Microsoft believed
that once we’d seen Windows 8 in action, we’d all head to our hardware vendors
demanding touch laptops, touch desktops and touchy-feely monitors. We didn’t.
Just six months after Windows 8 launched,
less than 5% of all laptop sales were for touch-enabled ones, and the sales of
touch monitors have been even more abysmal.
Microsoft reacted by making some
adjustments to Windows 8 in the 8.1 release, most of which were to address the
large number of people who had no intention of operating their PC with their
fingers. At this time, it seems to be clinging to the hope that eventually all
laptops will come with touch, or that we’ll all agree to change our minds, both
of which seem utterly delusional.
While touch on a phone or tablet seems
appropriate, on a desktop PC it’s ridiculous, unless you’re a big fan of greasy
smudges and obscuring what you’re doing while you try to do it.
That places touch displays in the unique
position of being a total success and yet a massive failure at precisely the
same time.

Where Next?

Having dramatically failed to convince the
buying public to accept 3D or touch, where do the masters of technology want to
take us next? There are some new directions and some old ones reworked.

4K Resolution

Surely displays are high enough resolution
these days? Nah, they’re horrible and blocky, like Tetris played on Ceefax. Or,
that’s the way that some in the display industry would like people to think
about 1080p panels, as they start to churn out 4K ones.
The problem with 4K, as I’ve documented
here before, is that unless you’re less than eight feet away from a 56″
TV, you can’t see the difference between a 1080p and 4K panel. Walk up to them,
say two feet away, and you can see the difference. But most people, unless
they’re drastically myopic, don’t sit a few feet away from their TV.
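The viewing-distance claim can be sanity-checked with a little trigonometry. Below is a rough sketch, assuming the common rule of thumb that the eye resolves about one arcminute of detail; the function name and the acuity figure are illustrative, not from any standard:

```python
import math

def max_useful_distance(diagonal_in, horiz_px, aspect=16 / 9):
    """Farthest viewing distance (inches) at which one pixel still
    subtends a full arcminute - beyond this, extra pixels are invisible."""
    width = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_pitch = width / horiz_px                        # inches per pixel
    return pixel_pitch / math.tan(math.radians(1 / 60))   # 1 arcminute

# A 56" 1080p panel stops revealing its pixels at roughly 7.3 feet;
# the same size in 4K stops at about half that, well inside typical
# sofa distance - which squares with the "eight feet" figure.
print(max_useful_distance(56, 1920) / 12)  # ~7.3 feet for 1080p
print(max_useful_distance(56, 3840) / 12)  # ~3.6 feet for 4K
```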
These details don’t appear to be stopping
any of the TV makers from diving headlong into 4K, despite some real challenges
in making displays of this resolution cost effectively.
Currently, Sharp is the expert in making
them and has designed a whole new Plasma Enhanced Physical Vapour Deposition
system around its IGZO metal oxide technology in an attempt to solve some of
these problems.
Apple used an IGZO panel in the new iPad
Air, as it provided a display with a lower weight, smaller volume and greater
brightness and substantially better power efficiency.
However, the likes of Samsung and LG are looking
at focusing more on the physical properties of displays and the cost of making
them, rather than their size or resolution.

OLED, Again

I think every year for at least the last
five, I’ve written something about how next year would be the ‘OLED’ year,
because in terms of colours represented, it’s a stunning solution.
There are currently large R&D budgets
being spent on making OLED more affordable, as the cost of these panels has
limited them almost exclusively to phones and digital cameras.
They get used in these applications because
they’re supermodel thin, have excellent colour representation and amazing
black levels. The downsides to OLED are the cost and the power consumption,
which has never achieved the efficiency that was first promised for it.
Could OLED ever have its technological day
in the sun?
Maybe. The University of Bonn has been
working on some remarkable quantum analysis of OLED displays in an effort to
make them cheaper.
One of the reasons they cost so much is
that achieving the power efficiency they do have requires the addition of
metals like platinum and iridium to the display matrix, neither of which is
renowned for being cheap.
The work in Bonn has discovered that by
adding a new organic layer (i.e. cheap), they can reduce the amount of light
energy that gets converted to heat and therefore do away with the coating of
precious metals. This could make OLED panels substantially cheaper and easier
to produce, and practical for desktop PC applications or even large TVs.
In breakdowns of phones that use an OLED
display, it has been calculated that often the panel costs as much as the rest
of the phone, so any reduction in cost could have a big impact on making these
excellent displays more common. Here’s hoping.

Bendy Displays

When you first see a flexible display, it’s
hard not to be impressed by the fact it’s possible, but then you wonder what
possible practical applications there might be. So far, a few prototypes have
been shown and a few curved TVs have been made, but already most people are
wondering ‘why?’

The answer, if slightly implausible, is
mostly ‘wearable tech’. Prototypes of phone devices made as flexible bracelets
have been shown, and the ability to bend a display around the arm does seem an
attractive idea.
None of this is especially new thinking.
About five years ago, a company called Polymer Vision was developing a display
technology so flexible that it could be folded. The key to its new method was
electrophoretic technology that allows for entirely new form factors, where a
larger display is folded into a smaller space for pocketing.


LG and Samsung both make curved TVs, for
those that hate the rectangle.
Unfortunately, the company folded before
it was bought out of bankruptcy by Wistron, which is still interested in
developing the idea. It’s produced some new product designs incorporating a
colour panel, but no products to buy.
What has actually made it to market are
some curved TV designs, where the display is bent to take account of the
increased distance of the sides from the centre.
While these panels look very interesting,
they can only work for a limited number of people sat at the focal projection
of the curve, and for anyone sat outside the sweet spot they’ll look much worse
than a flat panel.
Public reaction, even in Japan, to curved
TVs has so far been lukewarm. The problems facing the flexible display are
highlighted by how little production capacity both Samsung and LG have
so far committed to building them.
Samsung has a production line that could
make only 1.5 million 6″ displays a month, and that’s if it could achieve a
100% yield. LG has only a third of that capacity. Until they work out the
production bugs that are keeping yields far below that figure, they’re
unlikely to ramp up these facilities or start pushing their bendy panels on the
phone makers.
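The gap between nameplate capacity and real output is easy to quantify. A minimal sketch using the figures above; the yield rates tried below are illustrative assumptions, not reported numbers:

```python
def effective_capacity(panels_per_month, yield_rate):
    """Usable panels per month once defective units are scrapped."""
    return panels_per_month * yield_rate

# Samsung's line: 1.5m 6" panels a month at a perfect 100% yield;
# LG is credited with a third of that. The yields are guesses.
for maker, nominal in [("Samsung", 1_500_000), ("LG", 500_000)]:
    for y in (0.4, 0.7, 1.0):
        print(f"{maker} at {y:.0%} yield: "
              f"{effective_capacity(nominal, y):,.0f} panels/month")
```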
With the exception of yet more curved
prototypes, we’re unlikely to see flexible screens on many products in the next
year and possibly not for another couple.

Augmented Reality

Google Glass wasn’t a revolutionary idea
from the outset, but it’s still being vaunted as a ‘game changer’ by many. The
idea of a mobile device that you wear that overlays information in front of
your eyes sounds wonderful initially.
For starters, it could reduce the number of
people who walk into others, street furniture and under public transport while
obsessing over their phones.

In addition to letting you look where
you’re going, it could also enhance the world around you with directions and
the locations of services and outlets. And in doing all this, in theory it also
blows the whole ‘big resolution’ screen idea out of the window, because
augmented reality could make your entire field of vision a movie screen and
link it directly to your head movement.

However, there are problems with products
like Google Glass. Not least the disconcerting notion that if you’re meeting
someone for the first time, they could access information about you from the
internet.
Google Glass is due to become a buyable
product in May 2014, at which time we’ll all either wonder what the fuss was
about or be ordering Cantonese without an English menu.
What Google Glass can’t currently do, and
has no plans to offer, is a solution where your entire field of vision
becomes augmented or computer generated.
That idea is being more extensively explored
by Oculus VR, a company that began with a Kickstarter project and has now
evolved into something bigger. Its first product, Oculus Rift, has yet to reach
commercial availability, but it’s already talking about future designs.

Its head-mounted display prototype had only
a 1280×800 screen, whereas the released retail product will
feature a 1080p display split between two eyes. CEO Brendan Iribe is already
talking about 4K and even 8K variants, and it’s also trying to reduce the
latency between the wearer’s motion and the display updating. The first kits
worked with a 40-50ms delay, but Oculus already has prototypes that work with
just 15ms delay, which it describes as ‘magical’ to use.
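To see why the drop from roughly 45ms to 15ms matters, it helps to express motion-to-photon delay in display refresh intervals. A quick sketch; the 60Hz refresh rate is an assumption for illustration:

```python
def latency_in_frames(latency_ms, refresh_hz=60):
    """Number of refresh intervals a motion-to-photon delay spans."""
    frame_time_ms = 1000 / refresh_hz
    return latency_ms / frame_time_ms

# Early Rift kits: the view lags head movement by nearly three frames.
print(latency_in_frames(45))  # ~2.7 frames
# The 15ms prototypes update within a single frame of the motion.
print(latency_in_frames(15))  # ~0.9 frames
```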
Inevitably, the computing power needed to
drive these resolutions at speed won’t deliver a solution that could be easily
disconnected from a desktop PC. And that’s something that’s unlikely to change
in the short, medium or even long term. If Oculus-like devices do become popular,
though, they could stimulate yet another relaunch of 3D TV. Personally, I can’t
wait.
