The future is here! Sony's QD-OLED TV reviewed
I was particularly looking forward to this test: Sony's A95K shows why QD-OLED could soon be the new benchmark among picture technologies. But QD-OLED is still too expensive for me.
OLED TVs offer the best picture quality money can buy, and that won't change in 2022. But the technology has evolved in a way that could shift the balance of power in an OLED market long dominated by LG. Its name: QD-OLED.
QD-OLED is a technology from Samsung. The "QD" in QD-OLED stands for quantum dots - tiny colour converters, simply put. These not only make for more beautiful colours, but are also supposed to increase the maximum brightness of the TV - particularly welcome by otherwise rather modest OLED standards. This is exactly what Sony did not want to miss out on: the Bravia A95K, the new flagship of the Japanese TV giant, no longer gets its panel from LG, but from Samsung.
Are we witnessing the beginning of a new TV era?
Design and sound: I love Sony's trademark!
This year, too, Sony's flagship TV comes with a pedestal-less design. Meaning: the A95K is designed in such a way that its panel does not "sit" on a stand, but leans against it. A bit like a picture frame that you don't hang up but prop up. So if you look at the TV head-on, your focus is on the picture. Nice.
But where to put the soundbar? For me at least, that was the question, especially since my TV unit doesn't have any extra space for it. So I hid my Sonos Arc behind the TV. Not ideal: the soundbar fires its sound straight into the back of the panel. For some people, that could be a deal-breaker.
Acoustic Surface provides good sound
However, if it were up to Sony, you would manage the sound differently anyway. For years, the Japanese have been relying on their in-house sound technology "Acoustic Surface Audio+": four drivers installed behind the TV do not cause air to vibrate like conventional speakers, but the panel itself:
- 2x actuators (20 watts each)
- 2x subwoofers (10 watts each)
Strictly speaking, a 2.2 system - though Sony won't commit to calling it that. Thanks to "3D Surround Upscaling" - a nicer term for digital sound processing - more speakers are supposed to be simulated than are physically present. That's why the system also supports Dolby Atmos.
What can I say? Even after years, I'm still amazed at how well this system works: No other TV creates such a voluminous and, at the same time, powerful sound image. I would even go so far as to say that "Acoustic Surface Audio+" easily replaces a medium-priced soundbar. But if you want surround sound - real surround sound - you can't do without a home cinema system.
Sony knows that. That's why this feature is not new, but still relevant: within a hi-fi system, the TV can serve as the centre speaker, so you don't need a separate centre speaker or soundbar. Or you can go straight for the HT-A9 sound system: four speakers create a 360-degree sound stage, no matter where you place them.
If you do want a soundbar and have found a solution to the placement problem: connecting a Sony soundbar won't turn the A95K into a centre speaker, but it will focus the TV's sound on high frequencies and voices. This in turn relieves the soundbar, which can use the freed-up processing capacity to improve the mids and lows.
About the connections. They are behind the TV and on the side:
- 2× HDMI 2.1 connections (4K120Hz, ALLM and HDMI Forum VRR).
- One of them with eARC (HDMI 3)
- 2× HDMI 2.0 ports
- 2× USB 2.0 ports
- 1× USB port for external HDDs
- 1× output for Toslink
- 1× LAN port
- 1× CI+ 1.4
- Antenna and satellite ports
- Bluetooth (BT 4.2)
All four HDMI inputs support HLG, HDR10 and Dolby Vision.
The 65-inch version of the TV provided to me by Sony weighs a whopping 51 kilograms. If you want to mount it on the wall - without the stand it still weighs 27 kilograms - you'll need a VESA 300×300mm mount. Not a bad idea if you want to use the TV together with a soundbar.
QD-OLED in a nutshell
It would actually take a whole article to explain QD-OLED properly. In fact, I've already written one. If that's too long for you, here's the short version. And if you just want to know how good the A95K is, skip ahead to the chapter "Measurements: QD-OLED flexes its muscles".
First things first: Before I can explain QD-OLED to you, you need to know why OLED is (still) considered the best picture technology on the display market. The special thing about OLED pixels is that they not only generate the image, but also their own light. LCD pixels cannot do that. This has a big impact on the picture quality. I have also written about this:
The decisive advantage of OLED is its ability to display true black - and the better contrast that comes with it. On the other hand, OLED pixels shine less brightly than conventional LEDs. This is due to their light-absorbing colour filters. If OLED TVs were to compensate by shining brighter - supplying more energy to the pixels - more heat would develop. This, in turn, would accelerate material wear and lead more quickly to burn-in: unsightly, ghostly image residues, which I've explained before.
Years ago, LG was the first manufacturer to find a way to improve the brightness of its panels without significantly increasing the risk of burn-in: by adding a white subpixel to the pixel architecture. Until then, a pixel consisted only of a red, green and blue subpixel. The white subpixel provides extra brightness while reducing the energy load per subpixel, and with it the burn-in risk. On the other hand, it tends to wash out the other colours. Not that OLED colours are bad because of this - quite the opposite. But they don't exploit their full potential.
The name of this technology: WOLED.
And now comes Samsung's QD-OLED. What's different about it? The quantum dots that give the technology its name, shown in the graphic below as the QDCC layer. Hence QD-OLED. Put simply, Samsung's quantum dots don't filter - they recolour. That is the crucial difference. Light is lost during filtering. The quantum dots, on the other hand, change the wavelength of the light and thus its colour. Samsung's QD-OLED technology therefore doesn't need an additional white subpixel to artificially boost brightness.
A small change - with a potentially huge effect.
All in all, Samsung's QD layer exploits more of the OLED pixels' potential than LG's approach: they shine brighter and more powerfully with the same energy input. That matters, too. Remember: more energy equals more heat equals higher burn-in risk. No wonder Sony wants to jump on the QD-OLED bandwagon.
Measurements: QD-OLED flexes its muscles
What comes next goes even deeper into the matter than the QD-OLED explanation above. If tables and diagrams don't interest you, you can skip all that and scroll directly to the chapter "The picture: powerful yet natural". From there on, you'll find my subjective impressions with lots of video material. Have fun!
Now to the measurements. Of course, I could simply show you filmed or photographed displays and point out strengths and weaknesses. But then I would only be giving you my subjective impression. How bright, natural and accurate a television actually is can also be expressed in numbers. This has one advantage: numbers are more objective than I am.
To offer you this new service - so far it's only available in my review of Samsung's 2022 Neo QLED (QN95B) - we in the editorial department have acquired professional tools from Portrait Displays.
I measured all of the TV's picture modes, from "Brilliant" to "Standard" to "Dolby Vision Bright" and "Dolby Vision Dark" - without calibration or manual changes to the settings, just like most people use a television. After all, you want a TV that is as accurate and true to colour as possible out of the box, without expensive professional calibration. I only switched off the automatic brightness sensors. The best values were achieved with "Dolby Vision Bright" for HDR content and "Cinema" mode for SDR content.
The measurements listed below therefore refer to "Dolby Vision Bright".
Maximum brightness
Brightness is important for two reasons. On the one hand, it influences the contrast value, which in turn determines how many different colours a television can display. On the other hand, brightness matters if you often watch TV during the day in light-flooded rooms. If a TV is not bright enough, it gets outshone by the ambient light, and the picture looks pale.
Let's look at the brightness of the A95K.
Nit is the common name for candela per square metre (cd/m²), the unit of luminance, i.e. brightness. 100 nits correspond roughly to the brightness of the full moon in the night sky. There are two axes: the vertical stands for brightness, the horizontal for the window size in which the brightness is measured. At two per cent of the entire screen surface - i.e. selectively, with very small image areas - Sony's QD-OLED achieves a luminance of 998 nits, insanely high by OLED standards. And this in Dolby Vision mode, which is rather darker than the TV's "Standard" or "Brilliant" modes.
By way of comparison: the usual value for OLED TVs would be around 700 nits, and even that only with the picture settings at maximum brightness, which has nothing whatsoever to do with natural colours. Only LG's Evo panel, used exclusively in LG's own OLED TVs, can keep up to some extent: it scored around 850 nits in most tests last year.
Far less superior is the TV's brightness at full window size: 204 nits. That's still a lot for an OLED TV - LG's Evo panel came in at 170 nits last year - but LCD TVs shine much brighter. Samsung's QN95B, for example, manages 658 nits.
What does that tell us? If you put a QD-OLED TV next to an OLED TV showing a full-screen image, you won't notice much difference in brightness. The much higher peak brightness in small picture areas, however, promises better contrast and thus more displayable colours.
The white balance
What exactly does white look like? That depends on the colour temperature - the warmth or coldness of the white. Warm tends towards yellow and orange, cold towards blue. This in turn affects how colours are displayed. For calibration, the industry has agreed on a white of 6500 Kelvin, in short: white point D65. Most people would perceive it as a warm white, and the same goes for the resulting colours - this is roughly what "Film" mode delivers. The white and the colours in "Standard" mode are much colder. For this reason alone, "Standard" mode does not produce an accurate picture.
White is created on a television when the red, green and blue subpixels of a pixel light up simultaneously and with equal intensity. Full brightness therefore produces the brightest white; the lowest brightness produces the deepest black. Everything in between is nothing more than shades of grey. The accuracy of the white balance is measured with two charts:
- Greyscale Delta E (dE)
- RGB balance
The greyscale dE shows how much the greyscales produced by the television deviate from the reference value. The RGB balance shows in which direction they deviate. Why is this important? Let's take a look at the A95K's results:
The graph on the left reads quite simply: the deviation from the reference value is called delta E, or dE for short. If you were to place the TV directly next to a reference monitor, this means:
- Value is 5 or higher: Most people can tell the difference.
- Value between 3 and 5: Only experts and enthusiasts can tell the difference.
- Value between 1 and 3: Only experts can tell the difference, the enthusiasts drop out.
- Value below 1: Even experts cannot tell the difference.
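The rule-of-thumb thresholds above can be expressed as a tiny helper function. This is purely my own illustration of the review's categories, not part of any measurement software:

```python
def describe_delta_e(de: float) -> str:
    """Map a Delta E value to the rule-of-thumb perceptibility classes
    used in this review (illustrative thresholds, not an official scale)."""
    if de < 1:
        return "invisible even to experts"
    if de < 3:
        return "visible only to experts"
    if de < 5:
        return "visible to experts and enthusiasts"
    return "visible to most people"

# The A95K's average colour error of 2.64 (measured later in this review):
print(describe_delta_e(2.64))  # visible only to experts
```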
Any value below 5 is very good for a non-calibrated TV. Sony's A95K manages this up to about 70 per cent white. Then the value briefly rises above 5 before dropping back below it at around 90 per cent white. All in all, most people would not even notice the deviation from the reference value.
What exactly does "deviate" mean here? A look at the RGB balance shows it. In the "problem zone" between 70 and 90 per cent white, the green subpixels radiate somewhat weakly. Even though the blue and red subpixels do not radiate excessively strongly, the imbalance can still lead to a slight blue or red cast.
The colour gamut
We continue with the colour gamut - the coverage of the most common colour spaces. The greater the contrast, the more colours can be represented and the more natural the picture looks. This is why gamut matters especially for HDR content, which uses large colour spaces for its eponymous high dynamic range.
- Rec. 709: 16.7 million colours, the standard colour space for SDR content such as live TV and Blu-rays.
- DCI-P3 uv: 1.07 billion colours, the standard colour space for HDR content, from HDR10 to Dolby Vision.
- Rec. 2020 / BT.2020 uv: 69 billion colours, still hardly used in the film and series industry.
The large "colour blob", including the darkened areas, shows the entire colour palette perceptible to the human eye. The lightened area on the left shows the BT.2020 colour space; on the right, the same for the smaller DCI-P3 colour space. The white boxes mark the nominal boundaries of the respective colour spaces. The black circles show the limits actually measured.
The following colour space coverage was measured:
- Rec. 709: 100% (good = 100%)
- DCI-P3 uv: >100% (good = >90%)
- Rec. 2020 / BT.2020 uv: 99.86% (good = >90%)
The numbers up there, dear readers, are a statement. Sony's QD-OLED TV covers over 100 per cent of the important DCI-P3 colour space. The only reason I don't know the exact value is that my software's scale doesn't go above 100 per cent. For comparison: Samsung's Neo QLED achieves an (also very good) 92.49 per cent here; a conventional OLED TV would come in slightly higher. QD-OLED beats both hands down.
The measurement of the BT.2020 colour space is in the same vein: 99.86 per cent. Wow! Neo-QLED and OLED TVs currently achieve a maximum of between 71 and about 75 per cent. Which, by the way, is also the reason why the film and series industry calibrates its HDR content almost exclusively in the much more widespread and better covered DCI-P3 colour space. The BT.2020 colour space is therefore considered the colour space of the future, and the coverage value an indicator of future suitability. In this respect, QD-OLED clearly shows who is the boss in the ring.
The colour error
Even more important than colour space coverage is the colour error. Or, put as a question: what do accurate colours mean? For the TV, colours are not colours but numbers - numbers that precisely define each colour within a given colour space. Red, for example. Ivy green. Or cadet blue. When you watch television, these numbers are sent to your TV in the video signal, which interprets them and displays the corresponding colours. Simple, right?
No. TVs can process and display most signals within the most common colour spaces. But that does not mean that they will display the colours accurately. Otherwise the picture would look exactly the same on all televisions. Therefore, the more the colours displayed correspond to those on reference monitors, the more accurate and better the television.
As with the greyscales above, the deviation of the TV from the reference value is called dE. The white boxes show the reference colours sent to the TV by the test pattern generator. The black circles, on the other hand, show the colours actually measured. Again, dE values below 5 are good for non-calibrated TVs.
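In case you're curious what a dE number actually is: in its simplest form, it's just the geometric distance between two colours in the perceptual L*a*b* colour space. A minimal sketch of the oldest variant, CIE76 - note that professional tools like the ones I used typically apply newer, more elaborate formulas such as dE2000:

```python
import math

def delta_e_76(lab1: tuple, lab2: tuple) -> float:
    """CIE76 colour difference: the Euclidean distance between two
    colours in L*a*b* space. The simplest dE variant, shown purely
    for illustration."""
    return math.dist(lab1, lab2)

# A measured colour slightly off its reference value:
reference = (50, 20, -10)
measured = (52, 21, -12)
print(round(delta_e_76(reference, measured), 2))  # 3.0
```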
The graph gives it away: Sony's A95K already has very high colour fidelity. Across 40 measured values, I recorded an excellent average dE of 2.64 - better than the 2.97 of Samsung's Neo QLED. Sure, calibration could push the value below 2, maybe even towards 1. But even now, the difference to a reference monitor is so small that experts can hardly see it.
For comparison: In standard mode, the dE was 11.47 - no comparison to the "Dolby Vision Bright" mode, to which - as a reminder - all measurements listed here refer.
Interim conclusion after the measurement
Let's draw a brief conclusion. The measurements show that the A95K has a bright picture by OLED standards, especially at peak brightness in small picture areas. The full-screen brightness of the QD-OLED TV, however, is only slightly higher than that of a conventional OLED TV. All the more impressive is the coverage of the most common colour spaces, Rec. 709 and DCI-P3: 100 per cent and more. And almost as much for the even larger BT.2020 colour space - that's top class. In addition, the colour fidelity is particularly good, even if the RGB balance hints at a possible slight colour cast.
Time to test the theory in practice.
The picture: powerful yet natural
The measurements above attest to the TV's good colour space coverage with very high colour fidelity. Theoretically. How does it look in practice?
Colour reproduction
Hardly any other film is as colourful as "Guardians of the Galaxy, Vol. 2". Hardly any other scene uses the entire colour spectrum like this one. And in no other video clip do you see the advantages of QD OLED as much as here. Especially compared to LG's C2, which looks downright pale: The scene in front of Ego's palace, bathed in sunset red, pops in an even more saturated red, draws even the finest details in the sky without over-emphasising them, and has that certain punch in the picture that I like so much about OLED displays - whether with "QD" or "W" in front.
Samsung's QN95B, the South Korean flagship LCD TV with mini-LED technology, holds up surprisingly well in comparison. No wonder: In my test, I attested that the display was very well calibrated and had high colour fidelity. That's why the picture is warmer than LG's and Philips' OLEDs, which struggle with a slight blue and green cast due to their technology.
But colours don't always have to pop. Take "Knives Out", where a treacherous murderer is on the loose and director Rian Johnson wants the picture to be as natural as possible. You can see how well a TV plays along especially in the skin tones.
Compared to Samsung's Neo QLED, you'll notice that the colour tone is similar, but Sony's picture is stronger. Look out for the red wooden façade. Or the hanging notes in the background. This is what I mean when I talk about the certain punch that OLED TVs pack in my tests. LG has a similar punch in comparison, but it also has a slight blue cast. Look out for old Harlan Thrombey's shirt, for example.
Black Crush and Shadow Details
Not all scenes are bright; some are really dark. That's why I want to test Sony's ability to show details in dark areas of the picture. This time, I first compare the A95K with its OLED competition, and for good reason: each OLED pixel emits its own light, so each pixel can also be switched off with pinpoint accuracy. That's why OLED TVs can display perfect black - and why dark scenes are their speciality.
For example, in the video below, in "Blade Runner 2049". With both Sony's QD-OLED and LG's OLED, the scene is wonderfully dark. Naturally so: if you film against the light, it's normal for everything else to disappear into black silhouettes. With LG, however, more details are swallowed up by the darkness - so-called black crush. This could be intentional on LG's part, or it could be down to the brighter QD-OLED panel. Either way, I have seen worse black crush.
The second comparison in the video above, on the other hand, shows quite well the difference between OLED pixels (Sony and LG) and LCD pixels with mini-LED backlight (Samsung). Unlike OLED pixels, LCD pixels don't emit their own light, and their backlight can only be dimmed in zones, not per pixel. This causes blooming, a kind of halo - easy to see around the windows. You never see that on OLED TVs. Next, pay attention to the details in dark areas of the picture. Samsung's Neo QLED brightens areas that I don't think should be brightened. It looks wrong.
Exactly: filming against the light.
Brightness gradations
A final picture test: detail reproduction in bright areas of the picture. Here the balance of power between OLED and LCD is reversed: LCD TVs often handle bright picture areas better and lose fewer details in them. In the following "Jurassic World" example, look at the sun in the background: even in such a bright area, Samsung's Neo QLED still renders gradations so finely that the sun remains recognisable as a sphere in the sky. This is much less the case with LG's and Philips' OLED TVs.
Sony's A95K, on the other hand, keeps up pretty darn well with Samsung's QN95B. This is where the 998-nit peak brightness of Sony's QD-OLED panel becomes apparent. In other respects, too, the picture looks the most natural to me - and the punchiest, especially when I pay attention to skin colour. With LG and Philips, the picture is too cold.
Processor
The processor is the brain of the TV. Its main task is to receive, process and display picture signals. Processing means that it recognises poor picture quality and enhances it. Sony calls it the "Cognitive Processor XR" and says that it "displays content with consistently vivid colours and lifelike textures at all brightness levels" and "reproduces natural shades and hues that are perceived as beautiful by the human brain".
Behind all the marketing gobbledygook is that the processor is supposed to remove noise, enhance colours, smooth edges, make motion smoother and add any missing pixel information.
Motion Processing and Judder
To start with, I give the processor a really hard time: judder, a phenomenon all TVs have to deal with. Judder occurs when the picture signal and the TV panel do not run at matching frame rates - with cinema films, for example. Sony's A95K can display up to 120 frames per second, but films are shot at 24 frames per second. Processors bridge this gap with frame interpolation. If the processor is too aggressive, the picture looks as exaggeratedly fluid as a soap opera à la "Good Times, Bad Times". But if it holds back, the picture stutters, especially during long camera pans. The film seems nervous, jittery - hence the word "judder".
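The arithmetic behind this is worth a quick sketch (my own illustration): 24 fps divides evenly into a 120Hz panel, so each film frame can simply be shown five times, whereas on an older 60Hz panel the cadence has to be uneven (the classic 3:2 pulldown) - one source of visible judder.

```python
def pulldown_pattern(film_fps: int, panel_hz: int) -> list[int]:
    """How often each film frame must be repeated so that film_fps
    source frames fill panel_hz refreshes per second (simplified)."""
    base, extra = divmod(panel_hz, film_fps)
    # The first `extra` frames get one extra repetition (e.g. 3:2 pulldown).
    return [base + 1 if i < extra else base for i in range(film_fps)]

print(set(pulldown_pattern(24, 120)))          # {5}: even cadence
print(sorted(set(pulldown_pattern(24, 60))))   # [2, 3]: uneven 3:2 cadence
```

Even with the perfectly even 5:5 cadence, 24 discrete frames per second can still look stuttery in slow pans - which is the judder the "1917" test below provokes.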
Sam Mendes' "1917" is full of such steady, slow-flowing camera movements and thus perfect for the judder test. With Sony's A95K, judder is immediately visible - pay particular attention to the vertical bars in the barn. The Japanese manufacturer hardly intervenes in judder reduction by design. Film, according to Sony's thinking, has to judder, like cinema used to before the digital age. Beautifully old-fashioned. Or old-fashioned beautiful? For me, at least, it's too much judder.
Of course, this can be changed and removed in the advanced picture settings under "Motion Flow". By the way, I also did this with Samsung and LG. Only with Philips did I find the judder reduction without manual intervention to be very good.
Next scene from "1917". Again, Mendes' camera work provides an immense challenge for most processors. Especially with hard edges against a blurred background, for example around the helmets of the two soldiers. Here, both processor and pixel have to react incredibly fast.
Sony's processor does very well, even if it doesn't flex its muscles quite as much as LG's or Philips' processor. Nevertheless, the picture flows, but never looks unnatural.
Pixel response time
Next up is the Apple original "For All Mankind". I want to see how long a single pixel takes to change colour. If this doesn't happen fast enough, the image appears to streak - an effect called ghosting. I compare this directly with TCL's C82, the second mini-LED TV in this review after Samsung's Neo QLED. When the camera pans over the surface of the moon, pay attention to the superimposed text. Then you'll see the streaks I'm talking about on the TCL on the right:
With Sony on the left, on the other hand, you see almost none at all. On the one hand, this speaks for an excellent processor; on the other, the video shows the excellent pixel response times so typical of OLED TVs. That's also why they're considered excellent gaming displays. LCD TVs are usually at a disadvantage here.
Upscaling
Now the most difficult test: how well does the processor upscale lower-quality sources? Blu-rays or good old live television, for example. Or "The Walking Dead": the series was deliberately shot on 16mm film, its old-fashioned grain and image noise creating the feeling of a broken, post-apocalyptic world.
Sony's "Cognitive Processor XR" can clearly upscale. In the filmed footage above, around 75 per cent of what you see has been computed: the SDR source with its 2 million pixels has been blown up to UHD's 8.3 million pixels. Pay attention to the sharpness and edge smoothing. Only the noise looks too much like a snow flurry to me - here I see Samsung and especially LG ahead. Fortunately, the noise can be reduced somewhat in the settings. But I wouldn't be too aggressive: too much noise reduction quickly makes people look like wax figures.
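Where the "75 per cent" comes from is simple arithmetic - a Full HD source versus a UHD panel:

```python
full_hd = 1920 * 1080   # the SDR source: ~2.07 million pixels
uhd = 3840 * 2160       # the UHD panel: ~8.29 million pixels

# Share of displayed pixels the processor has to invent:
computed_share = 1 - full_hd / uhd
print(f"{computed_share:.0%} of the pixels are computed by the processor")  # 75%
```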
Gaming: Input lag and game mode
The final test: is Sony's TV also suitable for gaming? Absolutely - and highly recommended at that. The TV supports all features relevant to gamers:
- 2× HDMI 2.1 connections (HDMI 3 and 4, 4K120Hz)
- Auto Low Latency Mode (ALLM)
- Variable frame rates (HDMI Forum VRR)
For this purpose, Sony - just like LG, Samsung, Philips and Panasonic - has entered into a partnership with many major game studios: the HGiG, or HDR Gaming Interest Group. According to the manufacturer, this ensures that HDR is displayed the way the game developers intended. PC gamers in particular can tell you a thing or two about poorly rendered HDR.
In fact, using Leo Bodnar's lag tester, I measured a very good average input lag of 8.1 milliseconds for a 4K 120Hz signal and 15.8 milliseconds for a 4K 60Hz signal, without any overly serious drops in picture quality. Specifically, the average dE for colour error in game mode was an excellent 2.67 - for example, when playing "Spider-Man: Miles Morales" on my PlayStation 5.
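To put those lag figures in perspective, it helps to express them in refresh cycles - at 120Hz, one frame lasts 1000/120 ≈ 8.33 ms, so 8.1 ms is under a single frame of delay. A quick back-of-the-envelope helper (my own illustration, not how the lag tester reports its values):

```python
def frames_of_lag(lag_ms: float, refresh_hz: int) -> float:
    """Express input lag as a number of refresh cycles at the given rate."""
    frame_time_ms = 1000 / refresh_hz
    return lag_ms / frame_time_ms

print(round(frames_of_lag(8.1, 120), 2))   # 0.97 frames at 4K 120Hz
print(round(frames_of_lag(15.8, 60), 2))   # 0.95 frames at 4K 60Hz
```

In both modes, the TV adds less than one frame of delay - which is why the input lag is effectively imperceptible in play.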
I'm pleased: the colours are bright, black is really black, edges look sharp, and the picture doesn't blur too much even with fast, jerky camera pans. Notice Miles' dark silhouette against the light, the detailed textures of snow-covered New York, the beautifully warm colours and the clearly visible details in the clouds. This is what a good game mode looks like.
What Sony lacks is a dedicated gaming submenu like those on LG or Samsung TVs - seen at the beginning of the video above - where you can make fine adjustments and read off the current frame rate. At least Sony's A95K supports the PS5's new VRR 120Hz mode without any problems. However, I first had to go to the TV's system settings, under inputs, and tick the box next to "VRR and ALLM".
Conclusion: the first generation is already convincing
Will QD-OLED run LG's ageing WOLED technology into the ground? That's the big question. Let me split the answer in two.
On the one hand, it is already clear that QD-OLED is better than WOLED. The bare figures show it: the colour fidelity is brilliant straight out of the box, and no other TV has ever covered the colour spaces as well as Sony's A95K. Add to that the head-to-head comparisons, in which Sony's QD-OLED TV was superior to its rivals in almost all disciplines.
On the other hand, I don't think QD-OLED is so much better yet that it justifies the current horrendous surcharge. The A95K costs just over 4000 francs at the time of this review. LG's flagship OLED with the Evo panel, the OLED G2, currently costs 600 francs less; the somewhat less well-equipped C2 with the same Evo panel even costs 1000 francs less. Yet most people would probably only notice the difference in quality if the TVs were standing right next to each other.
For me, the price gap between QD-OLED and OLED is still too wide. But I'm not surprised: as an early adopter of new technology, you always pay extra - above all for the years of high research and development costs. And for the teething troubles, especially those that only come to light over months and years. It may sound unfair to hold this against Sony and Samsung now. Perhaps it is. But I'm speaking from experience - take, for example, the green-violet discoloured edges on QD-OLED monitors that I reported on last March. To reassure you: with the A95K, while watching films and series or gaming a good three metres from the TV, I could not see the discoloured edges. Only in certain menus or scenes, when I paused the picture and looked very closely at the screen.
So, enough doubts - before you read the conclusion too negatively: QD-OLED already surpasses WOLED in its first, most likely not fully matured generation. In other words, there is still room for improvement. That's why I'm already a fan of QD-OLED. LG, meanwhile, is countering with the Evo panel - "Evo" because of a new, more heat-resistant material. Soon, however, they too should follow suit with their own QD-OLED panels. My opinion. Anything else would, in fact, be a surprise.