>I'm an absolute newbie trying to understand the whole DI workflow.
>If we need to use an Apple or 3rd party LCD for 2K color grading, how different is the EDP100 compared to the Decklink Multibridge Express?
>Also, 2K would be 2048 x 1556, so effectively only the Apple 30 inch LCD can show you all the pixels?
I am also reading Steve's excellent website, DigitalPraxis, to understand the different aspects of the DI process.
> how different is the EDP100 compared to Decklink Multibridge Express
> Also a 2K would be 2048 x 1556 so effectively only the Apple 30 inch LCD can show you all the pixels?
>I can't speak for Blackmagic, of course; however, these are very different products. More importantly, we are focusing on complete reference-grade display systems rather than individual processor boxes. In that context I can tell you the following.
>The DCM23 basic is our new HD-centric reference-grade TFT monitoring system with guaranteed calibration and colorimetry across serial numbers and locations. In short, pull it out of the box, turn it on, and your picture, in India, is guaranteed to match the picture of someone you might be collaborating with in London. The basic single-link system is driven by the EDP100A display processor.
>In about three months we'll have a different display processor (EDP200) ready to work with the DCM23. This will give you 2K capabilities as well as 3D LUT and more. With the DCM23, 2K data can either be scaled to fit HD or explored pixel-for-pixel in "pan and scan" mode.
>The EDP200 will also be capable of driving Apple's 30 inch LCD in various modes, including pixel-for-pixel 2K data and a wide range of scaling modes. This, however, will not be the best color-reference solution, as the 30 inch Cinema Display falls short of some of the most basic requirements in this regard.
>In terms of 3D LUT capabilities, at NAB we introduced OpenLUT. This is a standard, created by eCinema, that allows third parties with LUT technology to operate the internals of our display processors. Look for announcements of support for OpenLUT from major LUT players in the months to come. Of course, end-users can have access to OpenLUT functionality for the purpose of creating their own LUTs, encrypted if necessary.
>In general terms, no off-the-shelf consumer TFT is adequate for high-end color reference work. This is a very important point to understand. They are designed for a typical office environment and without much regard for the needs of the film, video and DI community. In fact, most of those designing these monitors barely know (or care) that we exist.
>I took the EDP100 + Cinema Display solution as far as it could go and even pulled some amazing tricks to enhance colorimetry without loss of dynamic range. However, with the new Apple design and other offerings by several manufacturers (Sony SDM, LG, BenQ, etc.) their consumer-oriented bloodlines came to the surface in a significant way. It was clear that our ability to further professionalize the solution with upcoming consumer display generations was going to be compromised. Hence the decision to produce our own display, devoid of all the consumer-land limitations, designed with only one purpose in mind: to replace A-grade CRTs with a reference-grade LCD. NAB feedback seems to indicate that this goal was met.
>Still, the EDP100 + Apple 23 inch Cinema Display is an excellent high-grade and reasonably color accurate solution. Colorimetry is best in class and signal processing is suitable for many demanding applications. We have customers using this solution to edit and color-time shows going out to millions of viewers.
>The DCM23-based system finds its domain in high-end color-critical applications.
>Lastly, for cost-effective non-reference-grade HD/SD monitoring, various companies offer interesting and viable solutions:
>You should visit their sites and learn about these very capable products.
eCinema Systems, Inc.
fax: 661-775-4876 www.ecinemasys.com
>The Multibridge Express is a very new thing, not seen or tested yet. It is a Multibridge married to an HD-Link. I've ordered 5 Multibridges in the past, and two were delivered. Both were non-working, so I cannot comment on how well they work. I did, however, get an HD-Link, and in fact I am using it to convert HD-SDI to DVI to run a projector.
>Re the 30" Apple display, I have three. You're right, it is the only display that can show a 2K image pixel-for-pixel. Actually, there are also a Viewsonic and an IBM that can show 4K images (nearly so).
>But these new LCDs, while quite outstanding to work on, are still a bit behind for critical colour correction. For one thing they are 8-bit, and for another they are quite hard to calibrate. So, for the moment, CRTs rule somewhat.
>There are, however, one Sharp and one LaCie LCD that are 10-bit, so maybe suitable for DI, but I don't know anyone who uses these for grading. Besides, they are both 1600x1200, so maybe good for proxies.
>Otherwise, you still need to toss up between a D-ILA/DLP projector or a Sony 23" CRT, like the one that comes with Lustre/IFFFS.
> Actually there's also a Viewsonic and an IBM that can even show 4k images (nearly so)
>Correct; however, the drawback of those displays is that they have a 12Hz refresh frequency. They are designed for the medical market, where this is not an issue, and are not really suitable for film work, unless you're talking about some drawn animations that were done at 12fps...
> But these new LCDs, while quite outstanding to work on, are still a bit behind for critical colour correction. For one thing they are 8-bit, and for another
> There is however one Sharp and one LaCie LCD that are 10-bit
>10 bit what... None of those displays take a 10 bit input, as there is simply no real way to put a 10 bit image into an LCD at this point. Dual DVI is theoretically capable of doing this, but nobody supports that.
>What those 10 bit displays do is process their internal gamma and temperature controls in 10 bit, which is very desirable, but they also take only an 8 bit image as an input.
>I don't think the 8 bit issue is really as important as people make you think. As long as all the processing is done at a higher bit depth, the grading, the calibration etc, I would say that 8 bit without additional processing is pretty OK unless you work with artificial gradients a lot.
>For that, however, you absolutely must make sure that your monitor is not doing any gamma or temperature adjustments on the 8 bit image. Any calculations applied to the 8 bit image in the display will degrade the image and lower the actual bit depth. Depending on the monitor you use, it can be a little difficult to determine what the "neutral" state is. That's one of the reasons why displays like Martin's are important: they take out all the guesswork.
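A minimal Python sketch of why any in-display adjustment on an 8 bit signal costs real bit depth (the 2.2 exponent is an arbitrary illustration, not any particular monitor's correction):

```python
# An 8-bit -> 8-bit gamma lookup table cannot be one-to-one unless it is
# the identity: some output codes are produced more than once, others never.
GAMMA = 2.2  # arbitrary example curve, not any specific monitor's

lut = [round(255.0 * (i / 255.0) ** GAMMA) for i in range(256)]
distinct = len(set(lut))

print(f"distinct output codes after the LUT: {distinct} of 256")
# distinct < 256, so the displayed image has fewer usable levels than the
# 8 bits the interface delivered.
```

The same counting argument applies to any white-point or gamma tweak computed on the 8 bit image inside the monitor, which is why knowing the "neutral" state matters.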
>What we do in SpeedGrade is all of our color processing, including calibration, on a 16 bit or 32 bit floating point frame buffer. The result is then passed through the 8 bit DVI and displayed without further processing. Alternatively, you can go out to an HD SDI signal that is 10 bit YUV through the Quadro FX 4400 SDI. You can get into endless arguments about whether the 10 bit YUV actually contains more relevant color information than an 8 bit RGB signal. I'll leave it at that.
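For illustration, a small sketch (not SpeedGrade's actual code) contrasting 8 bit intermediates with a float pipeline quantized once at the end; the gamma/inverse-gamma pair stands in for any chain of grading and calibration operations:

```python
# Sketch: why processing in float and quantizing once beats quantizing
# between every step of an 8-bit pipeline.
GAMMA = 2.4  # example operation: apply a gamma, then undo it

def q8(x):
    """Quantize a [0, 1] float to an 8-bit code."""
    return min(255, max(0, round(x * 255.0)))

# Path A: quantize to 8 bits after every step
path_a = [q8((q8((i / 255.0) ** GAMMA) / 255.0) ** (1.0 / GAMMA))
          for i in range(256)]
# Path B: stay in float, quantize once at the end
path_b = [q8(((i / 255.0) ** GAMMA) ** (1.0 / GAMMA)) for i in range(256)]

lost = sum(1 for i in range(256) if path_a[i] != i)
print(f"codes damaged by 8-bit intermediates: {lost}")  # dozens of dark codes
print(f"codes damaged by the float pipeline: "
      f"{sum(1 for i in range(256) if path_b[i] != i)}")  # 0
```

Path A crushes the dark codes to black on the first step and can never recover them; path B returns every code intact.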
>One big question about grading on an LCD is still the fact that even the best LCDs today start shifting brightness and hue once you don't look at it right on. Martin, how are you addressing this issue?
>My feeling is that you need to take the director on your lap to do color grading on an LCD - which might be an option in some European countries, but surely not in the rest of the world.
>> Actually there's also a Viewsonic and an IBM that can even show 4k images (nearly so)
> the drawback of those displays is that they have a 12Hz refresh frequency.
>Yes. The TFT panel used in those displays is made by the same company in Asia. The pixel response time is 50ms (yes, five-zero). Not suitable for anything at normal frame rates at all. Lots of pixels though.
>This brings me to a topic that is being greatly abused by several monitor (and panel) manufacturers. To clarify, "monitor" means a finished product that an end user would purchase. In contrast, "panel" means the raw LCD element used by monitor manufacturers in fabricating their product.
>This business of pixel response time has to do with how quickly the pixel is able to fully turn on (white) and then off (black).
>If you were clicking out Morse code with a flashlight, you couldn't go any faster than the time it took for the light bulb to go from full off to full on and then full off again. For example, an LED flashlight can do this significantly faster than a conventional filament bulb flashlight.
>In the context of LCDs, the only response time number that is of any relevance to film/video applications is a measurement taken from black to black. In other words, start with a black display and flash it to full white, returning to black. This is the only number you want a display manufacturer to quote you for response time.
>Marketing guys have been known to bend things --a lot sometimes-- and so, it should come as no surprise that some are quoting response times that are simply not achievable within today's constraints. The latest one I heard during NAB is 6 milliseconds. I had people coming to the booth saying "how come yours couldn't do 6 milliseconds, "x" are saying that theirs do".
>The critical question here was "Is it from black to black?". When asked, the answer was a resounding "no". People are taking measurements from some level of grey to another level of grey and quoting this as the display response time. You might as well be quoting the time it takes the average person to sneeze, because it would be just as irrelevant. Be aware of this and don't fall prey to contrived marketing.
>Best-in-class response time today is somewhere around 15 milliseconds for all LCDs manufactured anywhere in the world. Why? Because there is only ONE liquid crystal fluid manufacturer that supplies fluid to the makers of ALL the good panels. The response time is largely a function of what the fluid can do. Today, the fastest fluid available can do about 15 milliseconds. Remember that, and smile as a sales guy quotes you a 2 millisecond response time. It might happen.
>> There is however one Sharp and one LaCie LCD that are 10-bit
> 10 bit what... None of those displays take a 10 bit input
>It's the painful truth. There are also some out there that are implying that the DVI interface is somehow inferior to interfacing directly to the panel within the monitor. This is not true. Most color and performance limitations have nothing to do with how the display connects to anything. The panels themselves impose most of the limits.
> you absolutely must make sure that your monitor is not doing any gamma or temperature adjustments on the 8 bit image.
>Yes, yes, yes!
>This is what I've been saying for a long time: gross color correction and calibration of LCDs using lookup tables is the WRONG way to manage color, due to potentially severe degradation of the image.
>There are electronic tricks you can use to manage color that do not use or affect any of the 8 bits of dynamic range, thereby producing the best possible image.
> One big question about grading on an LCD is still the fact that even the best LCDs today start shifting brightness and hue once you don't look at it right on. Martin, how are you addressing this issue?
>I wish I had control over the laws of physics. I don't.
>I've done a little optical work on the display to --in my opinion-- make it usable within a +/- 45 degree cone. Maybe it can go a little farther than that, but not much more.
>The usable viewing angle can be enhanced both electronically and mechanically, however, this comes at the expense of response time. And, being that about 15 milliseconds of response time is just about as slow as you'd want to go, I don't think we have any room to make adjustments.
> My feeling is that you need to take the director on your lap to do color grading on an LCD
>We are going to offer just that at the booth during IBC, where it wouldn't be out of context.
> You can get into endless arguments about whether the 10 bit YUV actually contains more relevant color information than an 8 bit RGB signal.
>Why endless? The arguments seem straightforward. Let me try to state them clearly:
>To begin with, if a display accepts only RGB 8b input, then any Y'CbCr representation, whether 10b or even of infinite precision, has to go through RGB 8b to get displayed, so any additional information in the Y'CbCr image is completely irrelevant to what is displayed or seen.
>What every color display actually displays is RGB color. If a display can make use of higher-precision image data than 8b, then there are two types of potential additional information in Y'CbCr 10b to consider : out-of-gamut and subquantized.
>About 3/4 of all possible Y'CbCr values are outside the RGB gamut of a display, and thus irrelevant to what is displayed or seen. However, if the extent of the RGB space of a particular RGB 8b image does not match that of the display, then a Y'CbCr 10b or even 8b version of that image may contain values outside the RGB image's gamut but within the display's, which would thus be relevant to what is displayed and seen.
>Nevertheless, ultimately, all color images - whether from an image sensor in a camera or a renderer in a computer - originate in spectral (nominally RGB) space, not in Y'CbCr space. Thus this situation can only properly arise when an image is converted to Y'CbCr space, edited there in such a way as to create out-of-gamut colors (i.e. filter undershoot and overshoot), and converted back to RGB space. So, frankly, if those out-of-gamut colors are truly desired and displayable, then the RGB images should use a wider gamut in the first place, rather than wasting 3/4 of all possible values by using a Y'CbCr representation. For a diagram of the wasted space in the Y'CbCr cube, see
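The "about 3/4" figure is easy to check numerically. The sketch below samples the full-range Y'CbCr cube uniformly and counts how often the decoded triple lands inside the RGB unit cube; BT.601 luma coefficients are assumed here, and BT.709 gives a similar fraction:

```python
# Monte Carlo estimate of how much of the Y'CbCr cube decodes to legal RGB.
import random

KR, KB = 0.299, 0.114          # BT.601 luma coefficients (assumed)
KG = 1.0 - KR - KB

def decodes_in_gamut(y, cb, cr):
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return all(0.0 <= c <= 1.0 for c in (r, g, b))

random.seed(0)
n = 200_000
hits = sum(decodes_in_gamut(random.random(),
                            random.random() - 0.5,
                            random.random() - 0.5)
           for _ in range(n))
frac = hits / n
print(f"in-gamut fraction: {frac:.3f}")
# Close to 0.236 -- the exact value is the determinant of the linear
# RGB -> Y'CbCr transform -- so roughly 3/4 of the cube is indeed wasted.
```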
>By virtue of its higher precision, the Y'CbCr 10b space quantizes the RGB gamut more finely than RGB 8b does. That is, the distorted RGB cube within a Y'CbCr cube of 10 bits per component contains more distinct values than an RGB cube of 8 bits per component. This subquantization is irrelevant to the displayed image as long as the RGB gamut of the image has the same orientation as that of the display - that is, as long as the RGB image gamut can be mapped to the display gamut by one-dimensional transfer functions.
>However, if any rotation, skew, or nonlinear 3-dimensional transformation is required to map the RGB image space to the display space, then a Y'CbCr 10b or higher-precision version of the RGB 8b image may yield distinct RGB display values that fall between the transformed and requantized RGB display values of the RGB image, which would thus be relevant to what is displayed and seen. This situation can only properly arise when the original RGB image is converted to higher-precision Y'CbCr space, edited there in such a way as to create subquantized colors (i.e. just about any filtering), and converted back to RGB space. But if this higher precision is truly desired, then because Y'CbCr representation wastes so much space, it would be more efficient to use RGB 10b.
>In sum, Y'CbCr 10b 4:4:4 images can contain information that is not in corresponding RGB 8b images, and in certain circumstances this additional information is actually visible. As to the opposite question, whether RGB 8b images can contain information that is not in corresponding Y'CbCr 10b images, the answer is no, as proven by our Synchromy technology, which can losslessly convert from RGB 8b to Y'CbCr 10b and back.
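Synchromy itself is proprietary, but the lossless round trip is plausible on quantization-error grounds alone. The sketch below, assuming full-range quantization and BT.601 coefficients, recovers every 8 bit RGB triple exactly after a trip through 10 bit Y'CbCr:

```python
# Sketch of a lossless RGB 8b -> Y'CbCr 10b -> RGB 8b round trip
# (full-range 10-bit codes, BT.601 coefficients assumed).
KR, KB = 0.299, 0.114
KG = 1.0 - KR - KB

def rgb8_to_ycbcr10(r, g, b):
    rf, gf, bf = r / 255.0, g / 255.0, b / 255.0
    y = KR * rf + KG * gf + KB * bf
    cb = (bf - y) / (2.0 * (1.0 - KB))     # in [-0.5, 0.5]
    cr = (rf - y) / (2.0 * (1.0 - KR))
    return (round(y * 1023.0),
            round((cb + 0.5) * 1023.0),
            round((cr + 0.5) * 1023.0))

def ycbcr10_to_rgb8(y10, cb10, cr10):
    y = y10 / 1023.0
    cb = cb10 / 1023.0 - 0.5
    cr = cr10 / 1023.0 - 0.5
    rf = y + 2.0 * (1.0 - KR) * cr
    bf = y + 2.0 * (1.0 - KB) * cb
    gf = (y - KR * rf - KB * bf) / KG
    return tuple(min(255, max(0, round(c * 255.0))) for c in (rf, gf, bf))
```

The worst-case reconstruction error works out to well under half an 8 bit code, so rounding snaps back to the original value; with only 8 bit chroma the bound exceeds half a code and the trip becomes lossy.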
>Why endless? The arguments seem straightforward. Let me try to state them clearly:
>I think I save more posts from CML than from any other list I subscribe to.
SFD vfx & creative post
Santa Monica, CA
>>If we need to use an Apple or 3rd Party LCD for 2K Color Grading how different is the EDP100 compared to Decklink Multibridge Express?
>Well, for starters the Decklink Multibridge doesn't actually work.
>We were one of the first in the US to get one, last fall, and we are still waiting for the fix that will enable it to perform the primary task we originally purchased it for: converting HD component analog to HD-SDI. However, all the other functions work perfectly, and it is quite a versatile device; in fact, for the price it is a good deal just for the features that do work. Also, Blackmagic support is very responsive. I do think that they release products too soon. We also expect that we will receive the fix for component/HD-SDI soon.
>As for the difference between this and the EDP100, it's night and day, both in price and performance. The HD-SDI to DVI box works perfectly, but it is a simple device, and doesn't come close to what the EDP100 accomplishes to make the Cinema Display perform well. We bought one to have a simple and cheap large monitor for use on set for various purposes, largely instructional and demonstrations of camera operation, knowing that soon we would need another critical monitor, or projector, or both.
>The Blackmagic HDLink is excellent for that task. Very small and low power consumption.
>Andreas Wittenstein wrote:
> This situation can only properly arise when the original RGB image is converted to higher-precision Y'CbCr space, edited there in such a way as to create subquantized colors (i.e. just about any filtering), and converted back to RGB space.
>The RGB image is likely to run through YCbCr space already for color correction. I have done some calculations on RGB -> YCbCr -> RGB. The problem is that the blue coefficients are very low, so if you stay in 8-bit space, you will have banding in blue gradients after just one generation.
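That banding is easy to demonstrate. The sketch below, assuming full-range BT.601 coefficients (BT.709's even smaller blue coefficient behaves slightly worse), pushes a pure blue ramp through one 8 bit Y'CbCr generation and counts the codes that come back changed:

```python
# One generation of 8-bit Y'CbCr on a pure blue ramp (BT.601 assumed).
KR, KB = 0.299, 0.114
KG = 1.0 - KR - KB

def roundtrip8_blue(b):
    bf = b / 255.0
    y = KB * bf                           # r = g = 0 on a pure blue ramp
    cb = (bf - y) / (2.0 * (1.0 - KB))
    # quantize Y' and Cb to 8 bits each (full range)
    y8 = round(y * 255.0)
    cb8 = round((cb + 0.5) * 255.0)
    # decode back to an 8-bit blue value
    b_out = y8 / 255.0 + 2.0 * (1.0 - KB) * (cb8 / 255.0 - 0.5)
    return min(255, max(0, round(b_out * 255.0)))

changed = sum(1 for b in range(256) if roundtrip8_blue(b) != b)
print(f"blue codes changed after one generation: {changed} of 256")
```

The chroma quantization error is amplified by the large decode multiplier for blue, so many adjacent codes land on the same output value: that is the banding.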