D800E and pro video?


Kent Shafer

<p>In the latest issue of Popular Photography magazine, as part of the D800 test report, there's a sidebar on page 82 that says the D800E version "was made by request for pro video shooters." Does this make sense? Can anyone explain it? (The only other similar reference I recall came from the salesman in my local shop, who clearly had no idea what the difference was between the two models but had been told by the Nikon rep that the E version was for "Hollywood types.")</p>

<p>I always thought aliasing was <em>more</em> of an issue with video than with still photography. (We've all seen the dancing suits and ties on TV.) And I don't understand why video shooters would care about eking out the last little bit of resolution.</p>

<p>I don't shoot video so this question has no practical relevance for me, but I'm still curious about it.</p>

 


<p>You can read this interview with Nikon engineers: <a href="http://imaging.nikon.com/history/scenes/32/index.htm">http://imaging.nikon.com/history/scenes/32/index.htm</a><br>

There is a short discussion on the D800 vs. D800E at the end of that interview:</p>

<blockquote>

<p>Nevertheless, there are those who will only use the camera for landscapes or who need photographs with even higher resolution, and it was for these users that we wanted to create a model that went the extra mile in terms of resolution: the D800E.</p>

</blockquote>

<p>It seems pretty clear to me that Nikon offers the D800E variation for those who need an extra bit of sharpness for still image capture from a 36MP camera. That has been my understanding all along.</p>

<p>I have been comparing the D800 and D800E side by side for two weeks. Like many other reviewers, my conclusion is that the difference between the two models is very small.</p>


<p>As someone who has shot a good amount of professional video and motion-picture film, I can say the form factor is all wrong with a DSLR. Yes, there are workarounds, and sure, some things have been shot with them, like that House episode, but as far as I know they only did that once.<br>

One of my DSLRs has video built in, and I have used it twice. Both times were just to see how to make it work and what it looked like.</p>

 


<p>Kent, the D800E was made for video like the 600mm f/4 was made to be a walk-around lens that fits in your pocket. Video mode on the Nikon D800 uses only about 2MP out of 36MP; at that kind of extreme downsampling, any effect the D800E would have on still-image sharpness is non-existent in video, and yes, the aliasing will be much worse (it's already bad with the D800). The uncompressed HDMI out was made by request for pro video shooters, but we video shooters would have preferred a heavier anti-aliasing filter, not getting rid of it. The people over at Popular Photography made a mistake; they should have said that pro video shooters want to stay away from the D800E.</p>

<p>I shoot video professionally, having just wrapped several videos for a major music label, and we wouldn't have been able to use the D7000 or the video-capable Canon DSLRs, including the Mark III, because they don't record enough data to key a green screen (about half of the video was green screen). The uncompressed HDMI out on the D800 let us record with far less lossy compression, and that footage keyed really well; it's the best thing for filmmakers since shooting digital. But the anti-aliasing filter? Video pros around the world cringed when they heard that; they certainly didn't request it.</p>


<p>Resampling from high to low resolution always gives sharper video than shooting at the low resolution in the first place. You notice it particularly in the rendering of straight lines: SD video has objectionable staircasing, whereas HD-to-SD resampling almost completely eliminates it. The shimmering effect you see in plaid fabric is due to aliasing of the electronic signal from multiplexing, not the moiré in repetitive patterns associated with digital photography. High-contrast patterns require higher bandwidth in the parts of the spectrum "stolen" to make color television compatible with B&W reception.</p>
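<p>The staircasing point can be sketched numerically. This is a toy model (the grid sizes and the box filter are my own assumptions for illustration, not any real broadcast pipeline): point-sampling a diagonal edge on a coarse grid yields only pure black and white, while rendering at 8x resolution and box-averaging down yields the intermediate tones that smooth the staircase.</p>

```python
import numpy as np

N, S = 32, 8  # target (low) resolution, supersampling factor

def edge(n):
    """1.0 above the diagonal y = x, 0.0 below, sampled on an n x n grid."""
    y, x = np.mgrid[0:n, 0:n]
    return (y > x).astype(float)

native = edge(N)                                  # point-sampled at low res
hi = edge(N * S)                                  # rendered at 8x resolution
down = hi.reshape(N, S, N, S).mean(axis=(1, 3))   # box-average back to N x N

print(sorted(set(native.ravel().tolist())))  # [0.0, 1.0]: a hard staircase
print(len(set(down.ravel().tolist())))       # 3: edge blocks pick up mid-gray
```

<p>The averaged version contains a 0.4375 gray wherever an 8x8 block straddles the edge, which is exactly the partial coverage a native low-resolution point sample throws away.</p>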

<p>That said, no video "professional" is willing to sacrifice the controls and ergonomics of a professional video camera for the novelty of video in a DSLR. At the very least, an HD video camera has three sensors, each with more than 2M pixels, which are combined into a single 2M-pixel frame. I can shoot still frames with a video camera, in the middle of a video shoot, with as much resolution as an 8MP DSLR, and I can and do use them for publicity and cover shots.</p>


<p>Firstly, I'm not sure the D800E is even for amateur video. I'm sure I'll use it (when I get one) as such occasionally, but the increased probability of moiré (visible at the macro scale) means it's far from ideal. Arguably the unmangled HDMI out is a pro feature, but the plain D800 is a much better choice than the E.</p>

 

<blockquote>Resampling from high to low resolution video always gives sharper results than shooting in low resolution originally.</blockquote>

 

<p>Unless you have a D4, from the reports I've heard. "Always" depends on a decent downsampling algorithm. Nikon seem to have over-done the softness, since CX crop seems to be doing better. (Unless they've fixed this?)</p>

 

<blockquote>At very least, an HD video camera has three sensors, each with more than 2M pixels, which are combined into a single 2M frame.</blockquote>

 

<p>Now, maybe. Plenty of professional HD was shot on a 1440x1080 triple sensor a few years back, and I'm not convinced that some devices that output 720p weren't 1280x720. I'm prepared to believe that the state of the art has moved on, but I'd not say it was always so. Besides, Canon seem to be charging an awful lot for a 1Dc and video lenses if a Bayer sensor is useless for video.<br />

<br />

By "more than 2M pixels" I hope you mean <i>much</i> more than 2M pixels, otherwise the image is likely to get worse before it gets better. From an image processing perspective, it drove me nuts that so many "HD" TVs were launched with 1366x768 panels (or 1024x768 for some rear projectors). Then everyone went "ooh, 1080 is much better", which may have had a lot less to do with the 50% linear resolution increase than with the lack of arbitrary stretching - assuming your 1920x1080 panel is actually set up to map one pixel to one pixel, which is astonishingly hard to persuade some televisions to do. I live in hope that the TV industry will eventually stop messing around with the image and just display the content as provided. It took years of people claiming that the technology wasn't capable of the bandwidth before you could even feed 1920x1080 p60 (like my monitor from 1996) to a TV and actually have it show it; even using standard timings still seems to be rocket science. I'm vaguely hopeful that some higher frame rate content (the Hobbit, etc.) might start making people sort the mess out. But I'm not bitter; at least it's not NTSC.</p>


<p>I don't think E vs. plain D800 makes any significant difference for video, since even the standard D800 skips a lot of pixels (or rows and columns) rather than averaging them to come up with the video feed. Thus there will be aliasing in either case and the difference in low pass filtering between the two cameras would not change that appreciably.</p>
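<p>A minimal numerical sketch of that skipping-versus-averaging point (the readout model here is an assumption for illustration, not Nikon's documented behaviour): decimate a fine stripe pattern by keeping every third pixel and the stripes alias into a false flat tone, while averaging each group of three pixels first filters them out.</p>

```python
import numpy as np

n, k = 36, 3
x = np.arange(n)
stripes = np.cos(2 * np.pi * x / 3)        # fine stripes: 12 cycles in 36 px

skipped = stripes[::k]                     # pixel skipping: keep every 3rd px
averaged = stripes.reshape(-1, k).mean(1)  # average each group of 3 px first

print(np.allclose(skipped, 1.0))   # True: stripes alias to a flat false tone
print(np.allclose(averaged, 0.0))  # True: averaging filters the stripes out
```

<p>The skipped readout lands exactly on the same phase of every stripe, so fine detail survives as a spurious signal; the box average sums a full stripe period and cancels it, which is the low-pass filtering the skipping readout lacks.</p>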

<p>Ilkka - I assume the D800 does just skip pixels, and that the D4 does some averaging, which would explain the reports of apparent softness at downsampled resolutions. (Filtering properly wouldn't appear soft, but requires a lot more work; the D800 would look "sharp" because of aliasing, not because of better filtering.) I don't actually know what either does, but these assumptions make the most sense to me. Canon also introduced filtering - hopefully <i>decent</i> filtering - in the 5D3, whereas I believe the 5D2 point samples.<br />

<br />

The reason I expect a difference between the D800E and the plain D800 is that I'm assuming this downsampling happens after debayering: an RGB triplet is generated at each sample position, even though the sample positions are spaced apart. If this weren't how it worked, the colour moiré would be as bad as an HD-resolution sensor with no low-pass filter over it, and that could produce false colours with quite low-frequency detail. The problem the D800E has is that the RGB values it generates are just as prone to colour moiré as still images (no better, no worse, except that an unlucky pixel is likely to be hidden in the crowd in a still image but might distractingly pop up on a frame or two of video). The D800 is (slightly) less prone to false colours, so should be better here.<br />

<br />

Of course, if you're videoing something with high frequency detail and either D800 really is just doing point sampling rather than interpolating the intervening pixels, you're still quite likely to get moiré effects - just not false colours.</p>


<blockquote>

<p>That said, no video "professional" is willing to sacrifice the controls and ergonomics of a professional video camera for the novelty of video in a DSLR. At very least, an HD video camera has three sensors, each with more than 2M pixels, which are combined into a single 2M frame.</p>

</blockquote>

<p>There is absolutely no basis for your claims, Edward, especially the three-sensor bit; that had me laughing so hard. The Avengers was shot on the $80,000 (body only) Arri Alexa, which is a single-sensor camera. The Amazing Spider-Man was shot on the $60,000 (body only) Red Epic, also a single-sensor camera (and for the record it was a 3D rig, which means two $60,000 cameras).<br>

Proof enough that "real" HD cameras don't have to have three sensors? OK then, here are some more:<br>

The Sony F3, a popular "budget" Super 35mm camera, is single sensor, and the new Canon C300 & C500 are single-sensor cameras. Come to think of it, I'm having a hard time thinking of any professional Super 35mm camera that is three-chip. Many professional "news-style" HD cameras do have three chips, like the Sony EX-1 & EX-3 series or Panasonic's popular line of professional HD cameras, but DSLRs are not in their league; they are in the 35mm film camera league, which is pretty much all single chip.</p>

<p>Shows like "House" did indeed use the 5D for one shoot and then went back to high-end HD cameras, but places like the Food Network and MTV are using DSLRs more and more for their reality-style shows. Last month I worked a $100,000+ commercial shot in San Francisco at the California Academy of Arts and Sciences, shot entirely on a 5D Mk III. I asked the producer why, because he traditionally used the Arri Alexa (the $80,000 professional camera), and he said that once it went to broadcast TV, he couldn't tell the difference and neither could his clients. A good friend of mine just wrapped a music video for Montgomery Gentry shot entirely on the 5D Mk II, not even the Mk III, and another good friend shot a bunch of Ronnie Dunn's (as in Brooks & Dunn) music videos entirely on the 1D Mk IV. I just wrapped several music videos for the largest record label in the world, shot entirely on the Nikon D800. I don't know what your definition of "video professional" is, but all the video professionals I know choose the right camera for the job, and in many cases the right camera is a DSLR.</p>

<p>Not to say we don't get frustrated with them from time to time; they are not designed for filmmakers, they are designed for photographers, so we do miss things. But as in the case of the D800 and the 5D Mk II & III, those cameras have a sensor twice the size of the aforementioned $80,000 and $60,000 cameras used to shoot some of this summer's biggest blockbusters. It's the difference between shooting DX and FX, and as many of you know, you just can't replicate the look of FX even with the corresponding focal length, or, for that matter, vice versa. Sometimes I choose to shoot FX because I like the way it looks over a Super 35mm camera, regardless of what form factor it comes in, which right now limits me to DSLRs.</p>

<p>To top it all off, Canon just released the 1D-C, which is a video camera in a DSLR body. It shoots 4K and is being marketed to producers who use high-end 4K+ cameras as a way to shoot 4K in a ridiculously small package. While no production would likely use it as an A-cam, it's perfect for places you can't fit huge film cameras.<br>

Novelty? No, it's real. DSLRs have changed, and continue to change, the video industry. The highest-end professionals of course have better options for their kind of work, but for many, DSLRs are the bread and butter of their living. It's no different from photography: just because you bought an $80,000 Hassy doesn't mean you are going to show up and blow the D4 away at the Olympics. It's a different style of camera for a different style of shooting. While feature films that "create" things have about anything at their disposal, a reality-style TV show has to have a portable camera that can be operated by one person, with interchangeable lenses, and not very many "real HD" cameras do that well; actually, I take that back, no "real HD" cameras do that. DSLRs fill a void, maybe a small one, and you certainly won't see the next Spider-Man or Avengers shot on one, but that doesn't mean video professionals don't take them seriously and use them frequently over "real HD" cameras.</p>



<p>@ Andrew: Ah! The great "HD ready" scam, perpetrated on the unsuspecting and technically challenged public. If people had armed themselves with half an ounce of knowledge before parting with their cash for a crappy 720-pixel-high, 6-bit screen, then the industry might well have been forced to get its act together a bit sooner. As it is, there's one born less than every minute, and someone equally ready to part them from their cash in the same amount of time.</p>

<p>I just walked out of the shop when told a couple of years ago that it was "impossible" to make a reasonably small screen that supported 1080p full HD - muttering that the assistant should look at what resolution was available on a computer monitor at the time.</p>


<p>Skyler: Thank you for that state of the industry report! I'm off to rummage on Wikipedia and get the hang of these formats. I see the red EF-mount lenses that Canon launched with the 1Dc, check the prices in B&H, and have trouble believing they're not considered "high end" (and also that someone can charge that much for relatively low-spec glass). A 100x zoom is a nice party trick (meh, I can do 8mm-800mm on my D700 if you don't mind the 8mm being a fish-eye), but I wonder when content production devices are going to get out of the "charge what you like" mentality. I suspect it came from the cost of consuming 35mm film...<br />

<br />

RJ: I gave the short version of my rant. :-) Knowing a bit about displays, I would have been an early adopter if I'd not had to wait so long for someone to produce an HD, no I mean matching one of the standard resolutions, okay that apparently means 1080, no I mean 1080p, no I mean 1080p<i>60</i>, actually I meant at 1:1 pixels and not forced into overscan... television. The frustrating thing is that I've had a T221 since 2004, which (as a 3840x2400 monitor) is capable of displaying both 1920x1080 and 1280x720 content without any interpolating - but, of course, it doesn't support HDCP (you know, the standard encryption for which a way of cracking the master key was published well before it became a standard) so it's not actually useful as a TV.<br />

<br />

Oh, and being a 16:10 screen, it would have had some black bars, because nobody involved in the standards had ever used a computer, and thought that a) two resolution standards wouldn't be a problem, b) 16:9 is a sensible aspect ratio (it's much worse for general computer use, but since screens are sold by the diagonal and 16:9 takes up less glass than 16:10 we're stuck with money-grabbing manufacturers producing inferior hardware at a budget), and c) the fact that 1080 lines doesn't divide by 16 (the height of an MPEG block in the chroma channels) didn't matter. And by "two resolution standards", I of course ignore the fact that I'm in the UK, which means that I can expect content to be 24fps (progressive or interlaced), 25fps (progressive or interlaced), 29.97fps (progressive or interlaced), 30fps (progressive or interlaced), 48fps (progressive), 50fps (progressive), 59.94fps (progressive) or 60fps (progressive) depending on whence it came. On a good day, my TV may cope with some of these formats. My old T221, of course, can handle any of them without batting an eyelid (and its EDID block is sensible, too).<br />

<br />

The only decision matching that genius is 1366x768, which allegedly came about because people adapted 1280x768 hardware by adding columns rather than removing rows, resulting in the amazing number 1366, which is twice 683, a prime, and about as awkward as you can get for a number of image-processing algorithms (such as a fast Fourier transform). Although whoever thought it was a good idea to allow H.264 to encode interlaced and non-interlaced blocks in the same frame (allegedly BBC iPlayer actually uses this!) comes close as well.<br />

<br />

But, like I said, I'm not bitter, even though almost every single decision seems to have been made to be annoying.</p>
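<p>For what it's worth, the 1366 arithmetic checks out. A quick trial-division factorization in plain Python (nothing assumed here about any particular FFT library) shows why 1366 is so much more awkward than 1920:</p>

```python
def factorize(n):
    """Return the prime factors of n in ascending order (trial division)."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever remains is prime
    return factors

print(factorize(1366))  # [2, 683]  -> one large prime factor
print(factorize(1920))  # [2, 2, 2, 2, 2, 2, 2, 3, 5]  -> FFT-friendly
```

<p>Mixed-radix and Bluestein-style FFTs can handle a prime length like 683, but they fall back to slower general-case machinery, whereas 1920's small factors decompose cleanly.</p>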

