Horizon 7.4

VMware released Horizon 7.4 and, besides some interesting features like collaboration, I noticed one other announcement. In a previous article, VMware Horizon 7.2: New kid, Blast (Extreme) is the rising star, I wrote about the differences between the protocols and showed them. One of the blog readers, P.Cruiser, also noticed that there was a rather important announcement in version 7.4. So this short article is about that announcement and why it is such a big deal, or at least I think it is.

Chroma Subsampling

Before I go into the announcement, let me briefly explain what Chroma Subsampling is. To put it as simply as possible: delivering content over a network comes with a challenge. Content, in our case images, is delivered more slowly because of bandwidth limitations. A (video) signal is made up of luminance and colour information.

  • Colour information is, well, the colour you see in a picture. Important, but not the most important part of a picture.
  • Luminance information is what makes up the picture; it shows you the details. Luminance is the contrast of the picture, so you want to make sure the luminance of the signal is not harmed, as it is what makes the picture “sharp”. Colour does not make a picture sharper, just nicer perhaps.

With Chroma Subsampling you make sure the luminance is sent in full and you drop some of the colour. Halving the chroma samples, as 4:2:2 does, cuts the colour data by 50%, so the picture still gets transported and displayed with its sharpness intact while the overall size goes down.
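As a toy sketch of that idea (illustrative only, plain Python, made-up sample values): the luma row is kept untouched while each horizontal pair of chroma samples is averaged into one, halving the colour data:

```python
# Toy illustration of 4:2:2-style horizontal chroma subsampling.
# Luma (Y) is kept at full resolution; each pair of neighbouring
# chroma samples is averaged into one, halving the chroma data.

def subsample_chroma_422(chroma_row):
    """Average adjacent horizontal pairs of chroma samples."""
    return [(chroma_row[i] + chroma_row[i + 1]) // 2
            for i in range(0, len(chroma_row), 2)]

luma_row   = [16, 32, 64, 128, 200, 220, 240, 255]  # sent as-is
chroma_row = [100, 102, 50, 60, 90, 90, 10, 30]     # gets halved

sub = subsample_chroma_422(chroma_row)
print(len(luma_row), len(sub))  # 8 luma samples, 4 chroma samples
```

The receiving side simply reuses each remaining chroma sample for two pixels, which is why sharpness (luma) survives while fine colour detail is lost.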

In the YUV format the luminance channel on its own takes up only one-third of the full signal. Putting this all together gets you a good picture: sharp, with the right colour, and without bandwidth issues.

4:2:2 or 4:4:4

There are more Chroma Subsampling options; 4:2:2 and 4:4:4 are not the only two. If we look back a bit, the old HDCAM used 3:1:1, while PAL, HDTV and Internet video use 4:2:0 subsampling. Do the math on the differences and you will see that, for example, 4:2:2 carries noticeably more colour information than 3:1:1 or 4:2:0 (and only 4:4:4 carries more than 4:2:2).

4:4:4 is the option without any chroma subsampling at all and will deliver an even better result. The cost of this result is that the file size is larger. 4:2:2 (which VMware used to use with H.264 and Blast) is around 70% of the size of what 4:4:4 sampling will deliver. So if your image is 100MB in size delivered with 4:4:4, with 4:2:2 it would be around 70MB. Go further back, all the way to 3:1:1, and you end up at roughly 55% of the file size, but you lose quality fast.
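These size figures follow directly from the J:a:b notation if you count raw, uncompressed samples (a simplification: real codecs compress further, so take the ratios, not the absolute sizes):

```python
# Relative raw data size of a J:a:b subsampled signal versus full 4:4:4.
# In J:a:b notation, a 2-row block of J pixels carries 2*J luma samples
# plus (a + b) samples each for the two chroma channels (Cb and Cr).

def size_vs_444(j, a, b):
    samples = 2 * j + 2 * (a + b)   # luma + Cb + Cr samples per block
    return samples / (6 * j)        # 4:4:4 carries 6*j samples per block

print(round(size_vs_444(4, 4, 4), 3))  # 4:4:4 -> 1.0
print(round(size_vs_444(4, 2, 2), 3))  # 4:2:2 -> 0.667 (~70MB per 100MB)
print(round(size_vs_444(4, 2, 0), 3))  # 4:2:0 -> 0.5
print(round(size_vs_444(3, 1, 1), 3))  # 3:1:1 -> ~0.556
```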

As you see, the sampling rate defines both the size and the quality; I guess you’re not surprised so far. For a lot of scenarios a lower sampling rate will do fine. For text, however, you need good quality. VMware realized that too, and the first versions of Blast were using the not so optimal 4:2:2 YUV sampling. It was one of the reasons many customers stayed on PCoIP even though VMware is putting all their effort into Blast.

If you want more information on 4:4:4 and the other sampling schemes, look at Wikipedia or the Adobe blog and you will find a ton of information. The differences between the schemes are explained there; one of the examples they use is shown here.

[Images: chroma subsampling comparison examples. Copyright Wikipedia]

There is more. Together with Chroma Subsampling, there is also the YUV format itself.

YUV can be split into Y, which is the luminance, and UV, which are the chrominance components. The Y is the brightness of the picture and the UV add the colour. If you want to map YUV back to the old days: the YPbPr colour model was used in analogue component video, and the digital version we used later on is YCbCr. I was reading several pages (doing some research on this matter) and I have to mention that YUV is not technically equal to YCbCr; the name just stuck as if it were the official one (let’s not go there).

On the left side of the screen you see the RGB version of an image, broken down into its red, green and blue channels. On the right you see the same image in a YUV breakdown, which is rather different from the RGB version: there is a grayscale image delivered in one channel (Y) and the colour information delivered in the two other channels (U and V).

So this is interesting: the Y of YUV was always there, even in the black-and-white age the Y was delivered. When we switched to colour, the U and V were added to bring in the colour. So now you know how the image is transferred in the YUV format.
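A minimal sketch of that RGB-to-YCbCr split, assuming the common full-range BT.601 (JPEG-style) coefficients; other standards such as BT.709 use slightly different weights:

```python
# RGB -> YCbCr split, full-range BT.601 coefficients (JPEG-style).
# Y carries the grayscale/brightness detail; Cb and Cr carry the colour.
# Coefficient choice is an assumption: other standards differ slightly.

def rgb_to_ycbcr(r, g, b):
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    clamp = lambda v: max(0, min(255, round(v)))  # keep 8-bit range
    return clamp(y), clamp(cb), clamp(cr)

print(rgb_to_ycbcr(128, 128, 128))  # neutral grey -> (128, 128, 128)
print(rgb_to_ycbcr(255, 0, 0))      # pure red -> (76, 85, 255)
```

Note how a neutral grey pixel lands entirely in Y, with both chroma channels at their midpoint (128): that is exactly the “black-and-white signal plus added colour” split described above.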




Back to Horizon 7.4

Why is this important for Horizon 7.4? Well, 4:4:4 means no chroma subsampling at all, in contrast with the 4:2:2 setup. So even though the image quality is far better, the bandwidth requirement goes up massively. Although I come from an area where we all have a fibre Internet connection at home, bandwidth in many parts of the world is still an issue. To cope with that, the luma channel of the YUV format helps: on its own it is only one-third of the signal, which is good for our bandwidth. Of course, I’m interested to see how this works out in practice, as the 4:4:4 format might still be heavy to deliver.
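To put rough numbers on that, here is a back-of-the-envelope calculation of the raw, pre-compression data rate per subsampling mode (illustrative assumptions: 8 bits per sample, 1920x1080 at 30 fps; H.264 compresses this heavily, but the ratios show why 4:4:4 costs more):

```python
# Back-of-the-envelope raw (pre-H.264) data rate per subsampling mode.
# Assumptions for illustration: 8 bits per sample, 1920x1080 at 30 fps.

def raw_mbps(width, height, fps, samples_per_pixel):
    bits = width * height * fps * samples_per_pixel * 8
    return bits / 1_000_000  # megabits per second

for name, spp in [("4:4:4", 3.0), ("4:2:2", 2.0), ("4:2:0", 1.5)]:
    print(f"{name}: {raw_mbps(1920, 1080, 30, spp):,.0f} Mbps uncompressed")
```

Roughly 1.5 Gbps of raw 4:4:4 data versus 1 Gbps for 4:2:2 before the encoder gets to work: the codec does the real saving, but it starts from a 50% larger input.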

Perhaps that is the reason that the 4:2:2 colour space is still the default setting and that you need to enable 4:4:4 through policies. The documentation also says it might degrade performance when using multiple monitors, which is of course true, as more data has to be transferred to deliver the image.

I got some valuable information from VMware soon after I posted the article. I will quote the comment here and have removed my previous assumption: “The main reason why we didn’t turn 444 on by default is that it is decoded on the client by a software decoder, not hardware, so it will affect client battery usage. This is because most video cards are unable to do hardware decode of 444 streams because it’s not a common thing to do. When someone is talking 444, it’s usually in the context of sending a high-quality picture to an external display.”

So I think having the option to pick the 4:4:4 colour space is a good thing; I’m just not sure whether it is a little too late to add it. Perhaps this option should have been there when they claimed to be on par with the features of PCoIP. I know some customers that will test this as soon as they are on 7.4, as they were already looking to move to Blast but ran into the blurry text issues with the 4:2:2 colour space.

I’m interested to see how this is received by our customers. Good move for sure.


2 thoughts on “VMware Horizon 7.4 – Blast gets necessary update to support Chroma Subsampling 4:4:4 with H.264”
  1. Hi, VMware engineer here. The main reason why we didn’t turn 444 on by default is that it is decoded on the client by a software decoder, not hardware, so it will affect client battery usage. This is because most video cards are unable to do hardware decode of 444 streams, because it’s not a common thing to do. When someone is talking 444, it’s usually in the context of sending a high quality picture to an external display (e.g. the hours I spent trying to get my multimedia setup to send 4K 444 to my TV 🙂

  2. I’m guessing the method Microsoft used to get “4:4:4 quality text with 4:2:0 hardware encoders / decoders” with AVC 444 mode in RDP 10 (2 years ago btw) isn’t something that VMware could have attempted? Not everyone can afford to dump their existing investments in laptops and thin clients just to have readable text.
