820 was nominated for deletion. The discussion (Wikipedia:Articles for deletion/820 i) was closed on 22 January 2011 with a consensus to merge. Its contents were merged into 720p. The original page is now a redirect to this page. For the contribution history and old versions of the redirected article, please see its history; for its talk page, see here.
This is the talk page for discussing improvements to the 720p article. This is not a forum for general discussion of the article's subject.
This article is rated C-class on Wikipedia's content assessment scale.
NBC Resolution
edit"NBC uses the tagline "the nation's finest high-definition standard" in advertising its 720p programming."
- But the NBC website claims that all NBC HD broadcasts are in 1080i:
- "NBC broadcasts HDTV in the 1080i format, which provides the highest possible resolution to our audience." [1]
- Lee M 17:24, 12 Dec 2004 (UTC)
- I see that the reference to NBC has now been changed to Fox. It's all a bit academic to me anyway because I'm in the UK... Lee M 17:08, 17 Dec 2004 (UTC)
which one is better 720p or 1080i? cheers
- Neither. The two standards exist because each is better suited to different purposes, and the technology of the time wasn't good enough to provide what was actually desired - i.e. full rate 1080p. Instead we have the twin compromises of 720p and 1080i, the former better for high motion material where the effective display would otherwise be a rather weird 1920x540 and suffer interfield motion artefacts, and the latter better for bringing out finer details in low motion video and 24-frame filmed material (and avoids the need for HDTVs to have an initially rather expensive scan doubling framebuffer to prevent flicker, despite otherwise only having half the ideal framerate). If one was automatically "better" than the other in all situations, why would both standards have been enshrined? It's a waste of time and effort, doing something like that for no reason. 80.189.129.216 (talk) 23:56, 23 January 2018 (UTC)
- There is a consensus building that 720p is better for moving images, whereas 1080i is better for video with a lot of still or near-still footage. Thus, movies, nature, and drama content are better in 1080i, like on PBS-HD and NBC. Sports are better in 720p, like on ABC, Fox, and ESPN-HD. Obviously, 1080p60 would be the best of both worlds, but too bad, we can't have it. --Locarno 22:56, 9 Feb 2005 (UTC)
- 1080p60 just requires more efficient data compression. They'll get there eventually, and then they'll start working on even higher definition systems.
Interesting that the ATSC saw the two systems as content-specific, but the Networks, possibly for financial reasons, have each opted for just one. It'll be interesting to see what happens when Sky HD launches in the UK.... incidentally (or perhaps not) Britain will use 1080i25 and 720p50, meaning that despite having the same picture resolution as the US there'll still be frame-rate conversion issues to contend with. Not to mention the whole thorny question of up-and-down converting from PAL and NTSC.Lee M 02:50, 8 Mar 2005 (UTC)
- Sorry, but what Locarno says is only true when comparing, say, 720p/60 with 1080i/30. If that is the case he should have stated so. If no refresh rates are specified one is forced to assume they are the same, in which case the opposite of what he wrote is true. Motion portrayal is smoother when using interlaced scanning, for a given refresh rate. 83.104.249.240 (talk) 05:25, 13 March 2008 (UTC)
- No, that's not a sensible comparison because now you have 1080i at its maximum temporal resolution against 720p with its bandwidth cut in half for no stated reason. In a "which is better" comparison, you have to assume that both contenders are working at their maximum potential. Anyone who is interested in motion portrayal would use the maximum frame rate available for a given system, so 720p would have the advantage of less line crawl on moving objects. Algr (talk) 15:11, 4 June 2008 (UTC)
- I disagree with the first sentence. Such an assumption certainly cannot be made. The majority of 720p programming is in fact at 24 fps therefore there would be no reason to assume a higher frame rate. 83.104.249.240 (talk) 21:20, 22 June 2008 (UTC)
- The majority of driving happens at maybe 24 mph, but you wouldn't assume that that is as fast as a car can go. Using 24 fps is an artistic choice for drama, no one is ever required to use it. News, game and talk shows are shot at 60 fps, and they add up very quickly, so I doubt that the majority of 720p programming is 24 fps. Algr (talk) 15:09, 24 June 2008 (UTC)
- Whether that is the case or not is hardly relevant. My point was and remains the fact that unless explicitly stated I wouldn't automatically assume one frame rate for one resolution/scan mode combination and another for a different combination. The driving analogy is irrelevant. What does the average driving speed versus the maximum have to do with video refresh rates? I would make no assumption about either and would expect both to be explicitly stated. Similarly, going back a bit, why would I assume the same bandwidth unless it was explicitly stated? There are various other parameters to be considered and I would make no assumptions about any of them. Why should I? 83.104.249.240 (talk) 19:38, 19 September 2008 (UTC)
- If you are making such assumptions then why assume that a "frame" contains two images taken at separate points in time? Who would expect that to happen if it wasn't explicitly stated? Algr (talk) 16:43, 20 September 2008 (UTC)
- The point is that nothing should be assumed when making comparisons of this type, therefore all parameters need to be explicitly stated: number of active lines, scan mode and refresh rate. MegaPedant (talk) 02:14, 5 October 2008 (UTC)
- That is what I was telling 83.104.249.240. His claim was based on some illogical assumptions about how video would be used. Algr (talk) 19:04, 5 October 2008 (UTC)
- I suspect the two of you are talking at cross purposes. Rewinding back to the top of the page, Locarno answers the question "Which one is better, 720p or 1080i?" without mentioning the relative refresh rates of the two standards he's comparing. I think Algr assumed that he was comparing 720p/50 with 1080i/25 (say) because the bandwidth used by both is similar. I think 83.104.249.240 assumed that he was comparing 720p at any frame rate with 1080i at any frame rate. From those standpoints both make valid points. My pedantic point is that all three people were making assumptions, none of which can be taken for granted to be valid. It is only meaningful to compare different line standards and scan modes if the refresh rate is also specified because it simply can not be assumed. One thing that has to be considered is that 720p/24, 720p/25 and 720p/29.97 are seen by some programme originators as a cheap way of getting into HD as it uses about half the storage of 1080i at the same refresh rate. Now, this doesn't give the comparatively smooth motion portrayal that 720p/50, 720p/59.94, 1080i/25 or 1080i/29.97 do but smooth motion portrayal isn't necessarily what is being sought after by the production company whose inexpensive HDV camcorders don't do 720p at the higher frame rates. MegaPedant (talk) 01:07, 6 October 2008 (UTC)
Television resolution chart
How does one edit the Television resolution chart on this page? It contains incorrect references to "240i" and "288i". In fact, these are progressive (240p) even when displayed on NTSC and PAL sets. Algr 05:01, 22 February 2006 (UTC)
Fox Revert
Fox uses 720p, but it is possible that some of their affiliates convert this to 1080i before broadcast. Many sets do a poor job with native 720p, and look much better if such a signal is converted to 1080i by something else first. Algr 05:49, 30 April 2006 (UTC)
citation needed?
Some U.S. broadcasters use 720p60 as their primary high-definition format; others use the 1080i standard.
- We already have references showing that both 720p and 1080i are in use. Algr 07:05, 13 August 2006 (UTC)
- What Algr says is certainly true, however I would say that the 720p material is almost entirely captured at 24 frames per second (or the NTSC-friendly 23.976 Hz variant). The small amount that isn't is almost certainly captured at 29.97 frames per second in the U.S. The assertion that "Some U.S. broadcasters use 720p60 as their primary high definition format" therefore does require citation. 83.104.249.240 (talk) 05:14, 13 March 2008 (UTC)
Interlace is misunderstood ...
These 2 sentences contain gross misunderstandings ...
Progressive scanning reduces the need to prevent flicker by filtering out fine details, so spatial resolution is much closer to 1080i than the number of scan lines would suggest.
and
The main tradeoff between the two is that 1080i may show more detail than 720p for a stationary shot of a subject at the expense of a lower effective refresh rate and the introduction of interlace artifacts during motion.
Interlace doesn't "filter out fine details". Some interlace systems may low-pass filter vertical detail (only) to a very slight degree, in order to eliminate interline twitter that occurs when the subject includes narrow horizontal stripes that approach the vertical field resolution. 1920 horizontal pixel resolution is definitely greater than 1280.
Interlace doesn't introduce artifacts during motion. Interlaced video is quite good at capturing and portraying motion. Interlace artifacts are only introduced when interlaced video is converted to progressive scan.
- That is completely incorrect. Interlacing can introduce spatio-temporal aliasing (edge flicker, line crawl, etc) unless video is properly low-pass filtered in the vertical dimension. Interlacing samples space-time at half the rate of progressive scanning.
Don - both of us are guilty of not signing our posts. Clearly you know what you are talking about... but I don't completely agree with your statements. As you know, aliasing can occur when you digitally sample something that has information at a frequency greater than half the sampling frequency (the Nyquist frequency). Interlace has half of the vertical resolution per field, but the same vertical resolution per frame as progressive scan. If the picture being captured by the video camera contains information with a lot of detail, artifacts can occur. However there are many reasons why this is not normally a problem. The camera would have to be in focus on the object, otherwise the lack of focus serves as a low-pass filter. If the object or camera is in motion, the motion blur low-pass filters the detail. The accuracy of the lens and the sensor (CCD, CMOS) can low-pass filter the detail. In general, most scenes being shot don't have the type of image detail that causes visible artifacts. Very high-end cameras filter the vertical resolution in order to reduce interline twitter, but most cameras don't... it isn't necessary for most situations. Finally, I would have to say that interlacing samples space-time at the same rate as progressive scanning, to be fair. Space is sampled at half the quantization level at twice the frequency... but the trick of offsetting the samples between frames by one line (spatially) gives the eye a moving average of the detail and the motion. If the object is moving, your eye gets updated information at a better refresh rate. If the object isn't moving, your eye still gets all of the detail of the full frame (due to the way the human visual system works). I've got nothing against 720p... it's great stuff. Progressive scan has advantages over interlace, particularly when it comes to editing (like slow motion detail), or display on progressive scan displays (which are all the rage for TVs these days, plus they are a part of every PC). But interlace is a technique that works, even if it isn't well understood. Tvaughan1 23:52, 26 October 2006 (UTC)
- It is difficult to work out who wrote what in the discussion above as only the last contribution is signed but it seems to me that both contributors are making assumptions without expressing them. Tvaughan1 is assuming the same frame rate between the two scanning modes, such as 1080i/25 and 1080p/25. The former refers to a system with 1080 lines of vertical resolution interlace scanned at 50 fields per second, giving a refresh rate of 25Hz. The latter refers to a system with the same number of lines of vertical resolution progressively scanned at 25 frames per second, also giving a refresh rate of 25Hz. This is the correct way of making comparisons between the two scanning modes, the result being that for static images the two modes give identical results whereas for moving pictures the progressively scanned system gives a better perception of vertical resolution at the expense of the quality of the motion portrayal, for a given refresh rate. The other contributor (Don?) is mistakenly assuming that moving from interlace scanning to progressive scanning necessarily involves doubling the frame rate. He is therefore comparing, say, 1080i/25 with 1080p/50 without explicitly stating so. This is an unfair comparison because the latter uses twice the bandwidth of the former and should naturally look better to the viewer—though I would say not better by a factor of two. 83.104.249.240 (talk) 05:01, 13 March 2008 (UTC)
- I didn't continue to follow this thread, but you can find a discussion of the theory of interlacing in a paper I wrote for SIGGRAPH 1990. This discusses the mechanism of aliasing that can cause moving edges to flicker. Television engineers sometimes talk about the "Kell Factor" which is lower for interlaced video. http://www.mentallandscape.com/Papers_siggraph90.pdf DonPMitchell (talk) 01:12, 6 February 2010 (UTC)
720p vs. 1080i
Just a suggestion, but I think it would be better to have a separate article for 1080i vs. 720p, and on that page include a table of what broadcaster uses what standard. Maybe even a separate page for that, updating as each one goes along? It would take some clutter out of this article, and would probably make a more sensible link on the 1080i page. Uagent 10:41, 5 February 2007 (UTC)
In order to compare the quality of motion captured and displayed at 720p with 1080i the frame rate of each must be specified or the results are meaningless. This section as it stands doesn't make sense. Is the author comparing 1080i/30 with 720p/30 or with 720p/60, or even with the much more common 720p/24? 83.104.249.240 (talk) 03:47, 13 March 2008 (UTC)
768
Almost all LCD televisions in the 20"-40" range have resolutions of 1366x768. (Plasmas seem to be 1024x768, which is 4:3, but they have rectangular pixels.) Wouldn't this provide a worse image than an LCD with only 720 lines? With LCD computer monitors, you have to set your desktop resolution to the native resolution of your monitor or you'll get blurry images. Maybe because computer monitors deal with text, the problem is more pronounced on
Why does 720p even exist?
Does anyone know how this, or any of the other HD standard resolutions, came about? The numbers simply don't make sense to me. Can someone who really knows this stuff comment on my stream-of-consciousness below?
The maximum theoretical resolution of NTSC is 525 lines, and of this 486 lines are visible. When they introduced DVDs, the obvious solution was to round this off to 480 lines, and provide the rest of the 525 "on the fly" in the players. So far, so good.
And then comes HDTV. This is where I get really confused. The painfully obvious selection for HDTV would be 480 x 2 = 960. This is very close to the existing 1080 standard, but because it's an even multiple it makes the scaling a whole lot simpler. So why isn't 1080 actually 960? Ok, so maybe you want a standard that can do both NTSC, PAL and SECAM, but that suggests you'd want to find a factor between 480 and 576, which suggests a number based on a multiple of 96... like 960.
The history section notes that the 720 was originally an analog system that was later converted to digital. OK, but how is it that they selected this format? I seem to recall the Japanese standard was something close to 1050 (which makes perfect sense given they used 525 at the time) so it seems like they simply selected this number "out of thin air". Does anyone know why it's 720 specifically?
And how was it that this selection of two incompatible standards has managed to survive to this day? It's one thing to pick a lower resolution standard for an analog system, but since everything went digital the cost of the extra resolution is pretty small. Why didn't it get eliminated somewhere along the line? Or, if the article is correct and the resolution of 720p is actually as good as 1080i, then how is it that 1080i survived? How did we end up in this ridiculous situation of having two standards that don't even have a common multiple?
Maury (talk) 21:57, 9 January 2008 (UTC)
- I'm not sure about the exact numbers, but it takes about the same analog bandwidth to transmit 720p as 1080i:
- 720x1280x60=55,296,000
- 1080x1920x30=62,208,000
- In the early '90s the computer industry experimented with interlaced computer displays, and consumers absolutely hated them, so they wanted to make sure that progressive displays were standard. For things like reading text on a screen, a 720p CRT would look much better than 1080i. Algr (talk) 00:07, 26 February 2008 (UTC)
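As a rough illustration of the arithmetic quoted above, here is a minimal sketch (mine, not part of the discussion) that counts active pixels only and deliberately pairs 720p at 60 frames per second with 1080i at 30 frames per second, ignoring blanking and bit depth. The helper name is illustrative.

```python
# Raw image-sample throughput of 720p60 versus 1080i30 (active pixels only).
def pixel_rate(width, height, images_per_second):
    return width * height * images_per_second

rate_720p60 = pixel_rate(1280, 720, 60)    # 60 full progressive frames per second
rate_1080i30 = pixel_rate(1920, 1080, 30)  # 30 full frames per second (60 interlaced fields)

print(f"720p60 : {rate_720p60:,} pixels/s")   # 55,296,000
print(f"1080i30: {rate_1080i30:,} pixels/s")  # 62,208,000
```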
- The above assumes that the 720p signal runs at twice the frame rate as the 1080i signal—in this case 60Hz and 30Hz, respectively. However, that is by no means universally the case. 720p/24 is in fact much more common.
- I believe that a vertical resolution of 1080 pixels is derived from the number of active lines used by the NHK 1125-line analogue HiVision system. The 1080-line standard was designed with interlace in mind, and with progressive scanning as an option. The 720-line standard was designed only for progressive scanning from the outset and the 33% reduction in line count was felt to give a similar perception of vertical resolution as the 1080-line interlaced standard, at a given refresh rate. Ideally both standards would have the same number of horizontal pixels because interlace has no effect on horizontal resolution but to maintain an aspect ratio of 16:9 with square pixels means that the 720-line standard needs 1280 of them while the 1080-line standard needs 1920. So, while it can be argued that 720p gives a similar perception of vertical resolution to 1080i at the same frame rate, 1080i has potentially a significantly higher horizontal resolution. I say "potentially" because 1920 pixels of horizontal resolution are not often achieved in practice. Until quite recently HD television cameras had CCDs with only 1440 (rectangular) pixels horizontally. The most common HD videotape recording formats (HDV, DVCPro-HD and HDCAM) down-sample each line and record only 1440 anamorphic samples. In the case of DVCPro-HD working at 1080i/29.97 it actually records only 1280 samples per line, levelling the playing field a little, but not fully because at 720p/24 it records only 960 samples per line. The only two videotape formats in common use that record the full horizontal resolution of 1920 samples per line are HD-D5 and HDCAM SR. Then again, in order to save bandwidth broadcasters often reduce the horizontal resolution of the 1080i signal to 1440 anamorphic pixels when coding it up for transmission. How many people have a 1920×1080 resolution television display, anyway? I know the number is growing but most displays branded "HD Ready" have only around 1360×760 pixels. Even if you've paid the huge premium and invested in a real 1920×1080 display are you sure you're getting 1:1 pixel mapping, and can you see the difference anyway? It really is a minefield. 83.104.249.240 (talk) 04:36, 13 March 2008 (UTC)
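To put the horizontal-resolution point above in numbers, a small sketch (my own, using the per-format sample counts quoted in the comment) comparing active luma samples per frame:

```python
# Active luma samples per frame for the formats mentioned above.
formats = {
    "720p full raster (1280x720)":       (1280, 720),
    "1080 full raster (1920x1080)":      (1920, 1080),
    "1080 HDV/HDCAM, 1440 anamorphic":   (1440, 1080),
    "1080i DVCPro-HD, 1280 anamorphic":  (1280, 1080),
    "720p DVCPro-HD, 960 anamorphic":    (960, 720),
}
for name, (w, h) in formats.items():
    print(f"{name:34s} {w * h:>9,} samples")
```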
- 720p and 1080i were proposed at about the same time, in response to the analog HD format developed in Japan. The television industry backed the interlaced format, because it was a familiar trick for them. The computer industry was afraid that interlacing would become commoditized and begin to appear on computer monitors, where it has very poor properties (text and lines flicker). Furthermore, the effective resolution of 1080i is not really much better than 720p, because interlaced video has to be vertically filtered to avoid 30Hz flicker and other motion and aliasing artifacts. DonPMitchell (talk) 02:58, 22 May 2011 (UTC)
- There's a thing known as the Kell Factor which estimates the perceptual loss of resolution with interlaced video, and it comes from the CRT era (which is where these standards were also birthed). It's nothing to do with electronic filtering, but the blooming and overlap between the lines of each field (which are offset from each other by exactly one-half of a scanline, and the beam width is usually enough for a slight overlap between them; evidence of this can be seen in how the dark "scanlines" of old computer/console images on CRT TVs - caused by them sending a series of noninterlaced fields of the same type, instead of alternating "top"/"bottom", so a gap is left between each line - are not as tall as the bright lines in between), and the clarity lost by the human eye operating near the edge of persistence/flicker fusion and the brain having to fill in the gaps. Essentially, interlacing still gets you more useful resolution than the same field/frame rate without interlace, but it's not double... closer to 1.5x instead (I forget the actual figure for typical CRTs, but it's between 1 and 2...). That's how, for example, VHS and most analogue TV broadcasts could get away with surprisingly low horizontal resolution compared to their interlaced vertical rez - once the Kell factor is taken into account, they're actually reasonably well balanced against each other.
- However, I have a feeling this isn't the reason for the two standards - they do, after all, have square pixels, and appear designed with the potential of display on extra high rez CRTs (computer grade, far in advance of TV by the late 90s), possibly with scan doubling, or LCD/Plasma flatpanels, where the Kell factor simply doesn't exist. I expect it's more for the flexibility of having a high frequency progressive mode that gives good clarity for high-motion material despite otherwise having 2/3 the spatial resolution in each direction and approx half the total pixels, and another that has greater total resolution but a slower total framerate, for lower-motion material (or for movie transfers), with the interlacing acting both to still give smooth motion where needed (but with only half the vertical resolution, for the fastest objects), as well as reducing flicker on screens which were hard-synched to the input signal.
- The loss of resolution for fast motion in interlace mode is possibly a very crucial thing - it reduces the apparent rez to 1920x540 in the worst cases (whole scene pans, for example). Compared to that, 1280x720 has about the same number of pixels overall, but has a much better balance between horizontal and vertical - and also doesn't suffer the half-line zigzag combing that can make pulling a clear static image from an interlaced original a complete pain in the backside; instead you get 50 or 60 complete, independent pictures every second, no combing, no loss of clarity. Moreover, compared to SD it's at least 1.5x the rez in each direction (nearly 2.5x overall) AND doesn't suffer those interlacing drawbacks inherent to SD recording, AND scans at the same effective field rate (so is no more flickery - less so in fact, as it doesn't have the 25/30hz flicker some fine detailed material can exhibit). Which is why it's the preferred mode for sports broadcasts, if 1080p isn't an option, and even the US military for their surveillance drone cameras - clear, high-motion, and not as fatiguing to stare at for long periods.
- For filmed material that never goes above 24/25fps anyway, and that you want to extract as much detail from as possible (as even 16mm has the potential to exceed 720p resolution with good filmstock, and 35mm is closer to 4k), as well as for news or studio-recorded programmes which tend to have much more in the way of static or low-motion footage, 1080i is your guy. Each pair of fields carries twice the pixel count of 720p, the interlacing means it doesn't look significantly more flickery for most material at normal viewing distances, it appears to maintain full resolution with slow pans or slowly moving objects even at 50/60hz, and if displayed on a flatpanel or a framebuffered, double-scanning CRT, films will look just the same with 1080i as they would with 1080p at the same framerate as 720p (on a single-scanning screen, they'll look that good divided by the Kell factor...).
- Broadcasters don't seem to pay as much attention to choosing between the two options as they should, and tend to stick with 1080i in a lot of cases (though the improved broadcast bandwidth available over digital terrestrial, and particularly satellite, cable or IPTV, means sports channels go for 1080p instead of 720p), but that's what the two standards are best suited to, and I dimly remember reading that there was some thought behind it, setting it up like that deliberately.
- As for why that resolution particularly... it's just what worked. 1080 lines was what was settled on as a common halfway house between 960 (twice NTSC active height) and 1152 (twice PAL) (or, maybe more particularly, the 1125 total scanline count is close to 525+625; having a common picture resolution just makes life SO much easier when moving material between regions, even if the framerates are still different), that was divisible by 16 and would produce a horizontal width also divisible by 16, with square pixels at 16:9 aspect ratio (meaning it's locked in to heights that are multiples of 72; in this case, x10 and x15), and interlaced because that's what was necessary at the time in order to make the engineering practical (bringing the image data rate down to roughly that of cheaper-grade computer equipment, rather than twice as much). When the thought occurred that a progressive standard would also be rather desirable, it had to be as large as possible whilst still fitting within the same overall limits (data rate, aspect ratio, both dimensions divisible by 16), and preferably be produced from the exact same hardware (including pixel clock) just with different programming. There's not many modes which satisfy those requirements. As a nice bonus, the horizontal dimension for both modes is a multiple of 640 (and the verticals can be padded or cropped to 768 or 1024 without adding huge borders or losing anything from the "safe" area), making scaling to and reuse of commodity monitor tubes and panels rather simpler... 80.189.129.216 (talk) 23:46, 23 January 2018 (UTC)
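A quick sketch (my own check, not part of the comment above) of the square-pixel constraint described there: which candidate heights even give an integer 16:9 width, and how each dimension lines up with a 16-pixel grid.

```python
# For square pixels at 16:9, width = height * 16 / 9 must be an integer.
for height in (480, 576, 720, 960, 1080, 1152):
    if (height * 16) % 9:
        print(f"height {height}: no integer 16:9 width with square pixels")
        continue
    width = height * 16 // 9
    print(f"{width}x{height}  (width % 16 = {width % 16}, height % 16 = {height % 16})")
```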
720p vs 1080p - Screen Size
If a TV's size is only, let's say, 1366 by 768 pixels, is there a noticeable difference between 720p and 1080p on a screen of this size?--Just James T/C 14:45, 3 June 2008 (UTC)
- Because 720 and 768 are so close, there is a fair amount of loss when images are scaled - every 15 lines in the signal has to be stretched into 16 lines on the display, resulting in a lot of detail falling between the cracks. A true 720p set would look much better, given a 720p signal. The 1080p signal would suffer less from this effect, but would not look as good as a hypothetical 1366 by 768 image. (Such as if the set were being used as a computer monitor.) Algr (talk) 15:21, 4 June 2008 (UTC)
- In practice, though, there's still some overscan anyway. E.g. a lot of supposed 1080p sets stretch out the centre 1856x1044 or so pixels to fill their 1920x1080 (for why, I'm not sure, maybe it's to compensate in case SD material is carelessly upscaled without the ragged edges being cropped off? It's a figure I've run across a handful of times with different devices, usually when they've glitched out and think that a PC input is actually broadcast material, and you lose the edges of the image, which is very inconvenient if it happens to be your OS desktop), and we can assume HD Ready 720p or 768p sets do the same. So we have 1044 lines being crammed down, or approx 696 lines stretched out, to 768. In any case, the amount of upscaling is small enough that it doesn't significantly affect clarity, and the downscaling only loses about a quarter of the total transmitted. The difference between the two will depend on the quality of the built-in scaler; a poor one will end up smearing 720p material (and you'll probably end up hunting out the Service Mode codes in order to enforce 1:1 scaling for that mode, same as I had to in order to make PC 1920x1080 modes work correctly), but give a decent impression of 1080-line (maybe with some interlace artefacts for 1080i), but a good one should give an only slightly softer image for 720p vs 1080i/p, and deinterlace 1080i intelligently too. 80.189.129.216 (talk) 22:59, 23 January 2018 (UTC)
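A minimal sketch of the scaling ratios being discussed (the 1044-line overscan crop is the commenter's observation rather than a published figure; the helper name is illustrative):

```python
# Vertical scale factors when a 1366x768-class "HD Ready" panel shows HD material.
def scale(source_lines, panel_lines=768):
    return panel_lines / source_lines

print(f"720  -> 768: x{scale(720):.4f}")   # 1.0667, i.e. every 15 lines stretched to 16
print(f"1080 -> 768: x{scale(1080):.4f}")  # 0.7111, straight downscale of a full frame
print(f"1044 -> 768: x{scale(1044):.4f}")  # 0.7356, downscale after an overscan crop
print(f"696  -> 768: x{scale(696):.4f}")   # 1.1034, upscale of overscan-cropped 720p
```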
BBC
edit"The BBC is one of the EBU members transmitting in HDTV. It has not yet made a final decision on picture scanning format."
I think the BBC has decided on 1080i50, as have all other UK broadcasters. I do not have a source for this, but I don't see a source to say it is wrong either. --Iain (talk) 19:38, 1 September 2008 (UTC)
- In Europe the channels are larger than in the US - 7-8 MHz each. With this and H.264, and the slower refresh rate, I don't see why they keep going with half measures. Why not 1080/50p? Several cameras that can do this already exist. (The Red One for example.) Algr (talk) 18:48, 2 September 2008 (UTC)
- 720p and 1080i were already defined as broadcast standards when HD started in Europe. Many people would have already had the TVs for those standards, and there would have been negligible benefit of going down that route. Bandwidth is limited (especially terrestrial), and film based material, for example, has no benefit of being broadcast at 50p. Even though there would have been benefits in sports broadcasting, it would not have been economical for such a drastic change in infrastructure with added expense of bandwidth.--Iain (talk) 21:04, 3 September 2008 (UTC)
- As far as I know the BBC hasn't made an announcement of any decision but all the HD work I've been involved with in the UK has invariably been 1080i/25. The BBC's HD delivery format was until recently HDCAM (which doesn't even support 720p) with multichannel sound having to be Dolby E encoded due to the format only supporting four regular PCM audio tracks. Their future delivery format will be the much superior but significantly more expensive HDCAM SR format. Algr: sorry to disappoint you, but in Europe at least, HD isn't about quality. It's driven by the equipment manufacturers and it's solely and cynically about separating (mostly gullible) customers from their money. A lot of these people have, in the last five years, already upgraded their television receivers from standard definition 4:3 to standard definition 16:9, then the early adopters of HD bought 1366 x 768 so-called "HD Ready" branded equipment which they are now being encouraged to ditch in favour of "true HD". Meanwhile it's virtually impossible to buy a CRT, whether for domestic or professional use, but the replacement display technologies are all markedly inferior. Meanwhile the digital replacement for analogue broadcasting is sacrificing quality for quantity. In the UK the only SD channel that makes any attempt at maintaining technical quality and which uses a data rate that's consistently over 4Mb/s is BBC ONE. Viewed on a CRT, digital BBC ONE flagship programmes such as Strictly Come Dancing and Doctor Who can look absolutely cracking, but they are the exception, not the rule. Everything else is poor by comparison. The situation in the US may well be different - I have no recent experience. I was last there in 1996, working at the Olympic Games in Atlanta and my opinion after watching several hours of television in my hotel room is that it was dreadful, both technically and in terms of its production values. I do hope it has improved in the intervening 12 years. MegaPedant (talk) 23:33, 9 October 2008 (UTC)
- Let's put this BS to bed.
- *presses "Info" button on freeview PVR remote whilst tuned to a BBC HD channel*
- Says 1080i50, and I think if anyone should know, it's the receiver decoding the broadcast. Is that good enough for you or do we need to find an actual published citation to close this ten-year-old squabble? 80.189.129.216 (talk) 22:49, 23 January 2018 (UTC)
Bitrate
What is the bitrate of 720p? Do you need 5 megabit to stream it? —Preceding unsigned comment added by 83.108.232.65 (talk) 21:37, 24 March 2009 (UTC)
- Depends on the content. Mypal125 (talk) 22:35, 3 April 2016 (UTC)
- To wit: Uncompressed RGB, including usual blanking, is 1782mbit (4:2:0 YUV without blanking is 1188mbit, if my maths are correct). At the other extreme, the 25fps version used by the BBC on the iPlayer VOD site runs around 3mbit; probably if that was compressed with 2-pass VBR instead of a realtime CBR encoder it might average more like 2.5mbit. The upper limit of Blu-ray material (and HD video cameras) is somewhere around 21~24mbit. Most compressed cases (which I assume is what you're after) will be some arbitrary figure in-between those latter rates. 80.189.129.216 (talk) 22:46, 23 January 2018 (UTC)
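For reference, a sketch of how such uncompressed figures fall out of the frame arithmetic (my own calculation; the bit depth assumed, and whether blanking is counted, moves the totals around considerably). The 1782 and 1188 Mbit/s figures quoted above correspond to the full 1650×750 raster at 24 and 16 bits per pixel respectively.

```python
# Uncompressed 720p60 data rates under a few different assumptions.
def mbit_per_s(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e6

# Active picture only (1280x720), 8 bits per sample:
print(mbit_per_s(1280, 720, 60, 24))  # ~1327 Mbit/s  RGB or 4:4:4
print(mbit_per_s(1280, 720, 60, 16))  # ~885 Mbit/s   4:2:2
print(mbit_per_s(1280, 720, 60, 12))  # ~663 Mbit/s   4:2:0

# Full raster including blanking (1650x750), as a serial link carries it:
print(mbit_per_s(1650, 750, 60, 24))  # 1782 Mbit/s
print(mbit_per_s(1650, 750, 60, 16))  # 1188 Mbit/s
```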
what does this mean?
edit"Unlike NTSC, the ATSC standards do allow for a true 1080p broadcast leaving this battle almost lost on both sides."
If the ATSC standards do allow for a true 1080p why is the battle lost on that side? Confusing. —Preceding unsigned comment added by 66.46.84.66 (talk) 19:45, 8 July 2009 (UTC)
Objection to comment in opening paragraph
I don't know why the following is allowed in the first paragraph or anywhere in Wikipedia, because it is quite simply incorrect:
"so spatial resolution (sharpness)" FIXED by SadxPandaxFace
This implies that spatial resolution and sharpness are equivalent, and they are not. I am going to change it to say "so sharpness is much closer to 1080i". 62.244.190.66 (talk) 11:32, 11 December 2009 (UTC)
Frames v Fields
It seems this article frequently and erroneously uses the word "frames" where it should be using the word "fields". It is my understanding that the NTSC standard is 30 or 29.97 FRAMES per second, but 60 or 59.94 FIELDS per second. Each frame consists of two fields, each consisting of half the number of horizontal lines that make up the full frame. Even the first referenced URL in the article ([2]) notes that this is the case. Unless there is evidence to the contrary, can we have all erroneous instances in this article corrected? --Ds093 (talk) 15:10, 1 January 2010 (UTC)
- It's irrelevant in the case of 720p (or 1080p, 2160p... etc), as it's not interlaced. There are no fields, or perhaps more accurately every field is also a frame. Each 1/60th (or 1/59.94th) of a second, you get a full block of 720 lines, and each new block completely replaces the immediately previous one, instead of weaving inbetween them. Thus although the NTSC rate (60hz mono or 60/1.001hz colour) still holds sway for compatibility's sake, it's now the frame rate, not the field rate as it was previously. Of course, for 1080i, the distinction is still important, but this is an article about 720p ;) 80.189.129.216 (talk) 22:38, 23 January 2018 (UTC)
What is 720p when aspect ratio is not 16:9?
How do you define 720p when the aspect ratio of the video is not 16:9? A lot of movies use e.g. 2.35:1 and in my experience these films usually have a definition of 1280x544. So it's not really the height of the signal that is constant in 720p, it's actually the width. So why don't we call it 1280p then, which would be the correct width in every video. And what would the definition be if the aspect ratio is 4:3? 1280x960, or 960x720? Cachinnus (talk) 03:11, 7 November 2010 (UTC)
- It's 720p even so, because the transmitted image (which is all that matters) will still be either 1280x720 (16:9) or 960x720 (4:3), just with hard coded letterboxing down to the reduced vertical height. Broadcast TV, Blu-ray video encoding, and indeed computer monitor specifications are not the same as the pirate DivXs you're probably thinking of (which have the letterboxing cropped off for economy and best multi-shape display compatibility, and can afford to do that as each programme or movie is individually encoded by an intelligent agent, ie a person, and meant for playback on a device powerful and flexible enough to deal with an entirely arbitrary video format, ie a computer; conversely, broadcast video is designed for entirely automated operation, and compatibility with the simplest and cheapest possible device that can be built, so tends to stick to a small number of very broad standards), and so you shouldn't fall into the trap of thinking they need to be defined the same way. If you play that 544-line video file out over a TV transmission, or directly to a TV in 720p mode or to a 1280x720 monitor, the actual format that finds its way to the screen is still 720-line, with the extra two 88-line high blocks of black padding (for 16:9, obviously it'll be more for 4:3, but that needs more maths because your video will have to be shrunk both horizontally and further vertically to fit inside the 960-pixel width) being added onto the "useful" data before it leaves the computer, media player, or broadcast tower.
- You're still free to call the file resolution whatever you like, and your monitor rez as well if it's different from 720 lines high. Because they themselves are not the same as the broadcast standard, and can be named accordingly and independently. 80.189.129.216 (talk) 22:34, 23 January 2018 (UTC)
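A small sketch of the letterbox arithmetic in the reply above (my own illustration; it assumes square pixels and rounds the active height down to a codec-friendly multiple of 16):

```python
# Active height and bar size when a wide source is carried inside a 1280x720 frame.
def letterbox(frame_w, frame_h, source_aspect, mod=16):
    active_h = int(frame_w / source_aspect) // mod * mod  # round down to a multiple of 16
    bar = (frame_h - active_h) // 2
    return active_h, bar

print(letterbox(1280, 720, 2.35))    # (544, 88) -- the 1280x544 files mentioned above
print(letterbox(1280, 720, 16 / 9))  # (720, 0)  -- full-height 16:9, no bars
```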
720p is not frequently used in Europe?
This article says »720p is not frequently used in Europe.« but in the German Wikipedia we can read that »ARTE HD Deutschland, Das Erste HD, Eins Festival HD, ZDF HD, ORF 1 HD, ORF 2 HD, HD suisse, SVT, NRK, DR und YLE« are using 720p. So this information is definitely false. —Preceding unsigned comment added by 95.115.238.66 (talk) 17:02, 4 April 2011 (UTC)
Incorrect about 3:2 pulldown?
- A 720p60 (720p at 60Hz) video has advantage over 480i and 1080i60 (29.97/30 frame/s, 59.94/60 Hz) in that it comparably reduces the number of 3:2 artifacts introduced during transfer from 24 frame/s film.
I don't think that's true. There will be just as much "judder" - caused by repeated fields/frames - on 720p60 as 480i60 or 1080i60. David (talk) 14:21, 2 March 2013 (UTC)
- It does, however, make interframe smoothing/motion blurring a bit easier, potentially improves the overall resolution (depending on how IVTC was performed before), and gets rid of the problem of deinterlacing/detelecine-ing glitches (back-forth jitter, excessive combing, etc) and bleed through between fields, which can make 3:2 pulldown even uglier than it already is. Also, I expect there's probably the option of using the "repeat frame"/IVTC flags as per DVD, both economising on bandwidth and allowing suitably intelligent displays (e.g. 120hz panels) to re-time a series of 60hz frames sent as (1)-R-(2)-R-R-(3)-R-(4)-R-R (etc) to a jitter free 24hz stream, without the additional complication of having to reconstruct the picture from individual fields that can come in four different arrangements. Effectively, the only image data being transmitted is the 24 full frames every second, same as with 25/50hz, just with a bunch of repeat flags inserted. However, it still loses out in terms of resolution, and if your hardware is sophisticated enough and transmission of good enough technical quality it should be able to reconstruct 60i into 24p just as well as 60p to 24p, just with twice as many pixels... 80.189.129.216 (talk) 22:22, 23 January 2018 (UTC)
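A minimal sketch of the 3:2 cadence being described (my own illustration; real encoders can signal the repeats with flags rather than re-sending the frames, and the 2-3 versus 3-2 phase can differ):

```python
# 3:2 pulldown at the frame level: alternating 3 and 2 repeats turn 24 fps into 60 fps.
def pulldown_32(film_frames):
    shown = []
    for i, frame in enumerate(film_frames):
        shown.extend([frame] * (3 if i % 2 == 0 else 2))
    return shown

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 second at 24 fps
print(pulldown_32(film))      # ['A','A','A','B','B','C','C','C','D','D'] = 10 frames = 1/6 s at 60 fps
```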
720p60 CVT-RB
I am looking for some information for possible use in improving this article (assuming I can find a reliable source).
Background: unless I am mistaken, the specifications for 720p60 CEA-861 video are:
1280 × 720 Resolution
59.940 Hz Refresh
44.955 kHz Horizontal (line rate)
74.176 MHz Pixel clock
What I cannot find are the specifications for 720p60 CVT-RB (reduced blanking) video.
CVT-RB is a VESA standard (Coordinated Video Timings, VESA-2003-9).
VESA-2003-9 is available at ftp://ftp.cis.nctu.edu.tw/pub/csie/Software/X11/private/VeSaSpEcS/VESA_Document_Center_Monitor_Interface/CVTv1_1.pdf (despite the "Microsoft Word" in the URL, it is a PDF document).
It has some formulas, but does not specifically list timings for 720p60 CVT-RB. Does anyone know where to find the timing for 720p60 CVT-RB? --Guy Macon (talk) 14:30, 10 March 2013 (UTC)
- (Sound of Crickets...) --Guy Macon (talk) 08:16, 17 March 2013 (UTC)
- AFAIK you just use a typical computer monitor modeline calculator to find that. However I'm pretty sure I've seen it listed amongst a bunch of other modes somewhere online recently, whilst looking up other specs, so at this point, nearly 5 years on from the original question, it may now be googleable? I'll write it in here in a bit if I find it. Also, thanks for the standard Hsync (45/1.001khz) and pixel clock (74.25/1.001mhz) rates, searching for confirmation on those is what brought me here in the first place (I think they need copying onto the main article, as there's no clue about the overall frame structure at all there)... The latter is what I arrived at independently but it seemed rather too high, as it adds up to 1650 pixels total. Little wonder a reduced blanking mode is also available... active width being only 77.6% of a line is a little narrow even in SDTV terms, let alone the modern era. 80.189.129.216 (talk)
- (some time later)
- OK, so, I did a bit of research, and some calculation... and the answer is...
- ...no such thing as 720p60 reduced blanking exists. Because there's no reason for it.
- The reduced blanking versions of CVT modes were produced as a way to push more pixels per second down the bandwidth-limited links represented by VGA, DVI, and early HDMI cables between PCs (or other output devices) and their monitors (or other display devices). As the regular 720p and 1080i HDTV modes are what both DVI and HDMI were designed to transmit, and they're well within the capabilities of a decent quality analogue VGA cable (and don't suffer too badly from a poor quality one), only one standard exists for each of them: the CEA-861s you already mentioned. And actually, that's a pretty efficient standard, at least compared to regular-type DMT, GTF and CVT - it seems rather that instead of allowing for poor quality electronics and thus providing plenty of overscan blanking as the other more general standards do, whoever put CEA-861 together instead came up with a timing spec that would work with the desired resolutions/framerates and available timing crystals, and challenged HDTV hardware designers to produce equipment that could successfully scan it.
- CVT is made for maximum compatibility, even with sloppily engineered CRTs that absolutely depend on the transmitted sync and the blanking period around it to return the electron beam deflection back to the left and/or top of the tube, without either losing any of the picture off the side/spread out through the flyback, OR being *too* wasteful.
- CVT-RB is made mainly for flatpanels, or at least CRTs with framebuffers - either way, displays that don't depend strictly on the blanking period for their internal synchronisation, but just use it as a basis for their own independent timing circuits, and something to centre the active image area between. It cuts the inactive display areas down to the bare minimum that can still contain a recognisable sync pulse for the screen to latch onto, and expands the active duty cycle out as far as possible around that.
- CEA-861 sits in-between the two, but closer to CVT-RB than CVT, allowing a reasonable amount of blanking around the sync pulse, but without being particularly forgiving. For example, with 1080i30, there's just 300 pixel times of horizontal blank and sync allowed on top of the 1920 active pixels, which is roughly what you might expect for a GTF/CVT/etc PC mode of half that resolution. Nearly 88% of each line is active picture, when normally you'd expect 75-80%, and there's about 45 lines of blanking (22.5 per field) in 1080i mode/30 in 720p, which is on the lower side of average (for comparison, NTSC 480i has 45 lines itself, as does basic VGA 640x480, and PAL 576i has 49 lines).
- Now, at the higher resolutions and refresh rates, as used for really fancy monitors (or 4K video), even the small difference between CEA standard and CVT-RB can be significant, where there are even CEA standards anyway; they only define a small number of core resolutions, so you're otherwise stuck with CVT or similar in most cases, and use of RB is pretty crucial. But at the lower, default-standard resolutions defined by the CEA, there's just no point in messing with the timing. The output hardware can handle it just fine, so can the target display, and the cable can DEFINITELY handle it. All you're doing is fussing with unnecessary details. RB is used to either increase the number of pixels, and/or the refresh rate, beyond what your video chip, cable, or display would otherwise be able to handle... and, well, once you increase one, other, or both of (960/1280 pixels by) 720 lines or 60 frames per second... the standard ceases to be 720p60.
- Of course, what you actually do by reducing the blanking is reduce the required bandwidth for image transmission... but, if you're having trouble sending the 74.25mhz of standard HD video from any kind of modern computer to any kind of modern screen, over almost any practical cable you could name, you've got bigger problems than the scanning specs of the signal itself. Any of the main three mentioned can more than double that data rate (ie, meaning they can easily handle 1080p60, 720p60 3D, or twin 720p60 screens down the one line), and a high quality VGA, dual-link DVI, or any HDMI cable and send/receive equipment from v1.3 standard onwards can double that again (so, 3D or twin 1080p60 over one cable, or a single 2560x1600 60hz progressive).
- Technically, the limit for single-link DVI or original-form HDMI is 165mhz, exact, and a common grade VGA can about match that; dual-link, HDMI 1.3+ or high grade VGA can hit 330mhz (and later HDMI standards take that to 500mhz or more, at which point the bottleneck is more likely to be in your video card or monitor instead). What can you do with 165mhz? With normal blanking, 1280x720 at up to 121.9hz (133.3hz with blanking equivalent to CEA-861), 1440x1080 at up to 74.8hz, 1920x1035 at up to 60hz progressive (CVT; similar to original MUSE hi-def), 1920x1080 at up to 57.6hz (CVT, or 66.7hz with CEA-equivalent blanking... so 60hz progressive is no trouble) - etc, etc - and 2560x1600 at 30hz (progressive with scan doubling/buffered flatpanel, or 60hz interlace). With reduced blanking, you can push 1280x720 to 148.2hz, 1440x1080 to 91.4hz, 1920x1035 to 74hz, 1920x1080 to 71hz, 2048x1080 ("2k") to 67hz, and 2560x1600 to 37.2hz (or 74.4hz interlace).
- Now perhaps you really want to get that 148hz at 720p resolution... but that's 720p148, not 720p60. Maybe you want to run two 720p60 images down the same cable... well, that's an entirely valid aspiration, but you can do that within the parameters of CEA-861, so reduced blanking is still needless.
- If you have a better-than-default cable, capable of 330mhz, you can double all those framerates... or push the resolution higher - 2880x1760 at 59.94hz rates just a smidge under the maximum, with reduced blanking. 3840 or 4096x2160 ("4k" resolutions) work nicely at 25hz with regular CVT, or 30hz with RB (or indeed CEA-equivalent) timing... in fact, up to 70 (4096-pixel) or 75hz (3840-pixel) interlace in the latter case. And if you do some manual massaging of the blanking to trim it by just a few pixels and/or lines from the RB spec, if your monitor can handle it, you can squeeze out 4096x2560 at 29.97hz (59.94hz interlace) or 4096x3072 at 25hz (50hz interlace)...
- Beyond that, you'll either need even-later-standard HDMI, or a recent revision of Displayport or Lightning. But hopefully the point has been driven home that the system has more than enough capacity to handle 720p60 with even the most extremely wasteful conceivable blanking characteristics, let alone the default, so RB isn't needed for it, and so isn't defined as part of any standard (and, in any case, it only gives about 11% extra framerate vs CEA, rather than the 28% vs plain CVT). The figures I give for it above were calculated using a spreadsheet I found on some random website, and I'm not entirely sure of its accuracy... 80.189.129.216 (talk) 01:19, 24 January 2018 (UTC)
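For anyone wanting to reproduce the timing arithmetic above, a short sketch (mine) using the commonly published CEA-861 totals for 720p (1650 pixels per line, 750 lines per frame) and the 165 MHz single-link pixel-clock limit mentioned in the thread:

```python
# CEA-861 720p60 timing arithmetic, and what a 165 MHz single link allows.
H_TOTAL, V_TOTAL = 1650, 750      # total pixels per line / total lines per frame
FRAME_RATE = 60 / 1.001           # the NTSC-family "60 Hz" rate

pixel_clock = H_TOTAL * V_TOTAL * FRAME_RATE
line_rate = V_TOTAL * FRAME_RATE
print(f"pixel clock ~ {pixel_clock / 1e6:.3f} MHz")  # ~74.176 MHz
print(f"line rate   ~ {line_rate / 1e3:.3f} kHz")    # ~44.955 kHz

# Highest progressive refresh this raster could reach on a 165 MHz link:
print(f"max refresh ~ {165e6 / (H_TOTAL * V_TOTAL):.1f} Hz")  # ~133.3 Hz
```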
Notes 1. = not 60 FPS
According to my calculator 60 * 1000/1001 = 59.94005994005994005994005994006
59.940059 940059 940059 940059 940059 ... (the number with whitespaces between blocks)
59.940059 940059 94 (number on the page)
59.940059 940059 94940059 940059 94940059 940059 94 (extended period)
The number on the page is not the same as the calculator outputs.
Either the formula or the value is wrong
So the number should be 59.940059 - the underlined part is the part that repeats. (No clue how to put the line over the text.) — Preceding unsigned comment added by 77.23.100.19 (talk) 12:13, 19 September 2016 (UTC)
- Whilst this is true, it's an arguably entirely pointless level of pedantic precision. The difference between 59.94 and 59.940059 - never mind between the latter and any more clearly defined figure - is 0.0000599400 frames per second. Or, in other words, one frame per 16683.3 seconds. Or one for every 4 hours, 38 minutes and 3.3 seconds. I think it's safe to say there are almost no circumstances where having the framerate defined to just 3dp / 5sf, never mind the 14dp / 16sf that you complained about, will cause any kind of noticeable problem, because the slight desync blips between individual programmes, or even programmes and ad breaks / continuity announcements, will cause a reset long before it becomes an issue. I think the longest ever single programme I ever recorded was only 4 hours long...
- Additionally, the difference between 59.9400599400594 (itself ludicrously over-specified) and 59.940059 is - as best as Windows Calculator can report, anyway (my physical Casio scientific calculator not having enough digits on its screen to perform the task) - equivalent to one frame every 1.87 million seconds, or one dropped every 59.3 years. 59 years ago was 1959, a mere 6 years after the initial adoption of the 60/1.001 standard as part of the launch of colour TV. The timing error is literally so small that if you'd started a pair of displays running the first day that analogue NTSC colour was broadcast, one at 59.9400599400594hz (assuming you had any way of producing an output of that exact frequency, seeing as it requires precision of a minuscule fraction of a "pixel" per frame), and the other at exactly (60/1.001)hz, they would only just have ended up one full frame out of sync at the point where analogue broadcasting ceased in the USA. That's assuming you're relaying footage from a station that always shows colour footage instead of swapping between that and monochrome, and never has any breakdowns or retunes.
- In real use, the timing variability in recording, playback and broadcast equipment is enough that specifying the frame rate even to 59.94006hz (or indeed 50.00000hz for PAL) would be a little OTT. The colourburst crystals that much of NTSC timing is based off of are only rated to 7sf themselves (eg 14.31818mhz), or even just 5sf (14.318mhz), and even a single oscillator can vary slightly with temperature and other factors. And then there's cheap or badly made equipment that might not count the right number of colour clocks per line, or the right number of lines per frame, not to mention videotape that can stretch and warp, transports that don't quite get up to speed, etc. That's why there's a horizontal sync pulse, and a colourburst pulse itself, to lock the receiver back into the transmitted signal on each and every line, and a vertical sync as well, with the interlace offset built right into it by dint of when it actually happens, with the receiver being quite forgiving about what it accepts as valid input, and just freerunning at whatever speed it likes otherwise. If you pull the specs of basically any and every last computer and games console that connects to a TV for its display, hardly any of them actually produce a proper (15750/1.001)khz horizontal, (60/1.001)hz vertical NTSC compliant output signal, though most of them do at least have a correct 3.579545mhz colourburst...
- tl;dr - precision to that degree often isn't necessary for everyday things. Unless you're measuring things on the atomic scale, you just don't need to bother. There's a whole lot of slop in any given system, and if it's normally just given as "59.94hz", that's probably because that's as exact as it needs to be to work correctly.
- As for "either the formular or my calculater is wrong" (sic), methinks you need to go find out what "rounding" and "limited precision representation" are. You're actually getting in to the realms where consumer level CPUs and FPUs start running into rounding errors, because they just don't represent numbers with enough binary digits (bits) internally to show any greater precision on the output. Even so, the figures still look like they've been calculated correctly - they've just been truncated to a sensible number of decimal digits, with rounding where necessary (e.g. xxx59 gets rounded to xxx6, which is correct unless you decide to add an unnecessary 0 on the end afterwards). Naturally, if you try to subtract a rounded version of a number from its exact rational-fraction or recurring-decimal form, you're going to get a non-zero result. It's just that, as you're not a computer yourself, and therefore imbued with some intelligence and human intuition, you're expected to not do such silly things and expect them to work.80.189.129.216 (talk) 00:34, 24 January 2018 (UTC)
Interlacing
720 is like webTV. 30fps at and interlace at a "SPECIFIC" 120khs refresh rate. Well I could go on about Sytax and Vizio but some people care about 30fps or 60fps. Interesting talk though. — Preceding unsigned comment added by 2602:301:7751:160:DD52:A7BA:E943:107C (talk) 18:18, 28 March 2017 (UTC)
- What? Are you feeling OK? 80.189.129.216 (talk) 20:23, 23 January 2018 (UTC)
1080x720? 3:2 aspect? Rec 601 spec? eh?
OK, let's see... in a great many years, I've never seen the 1080x720 resolution mentioned anywhere before, nor any TV standard that has a 3:2 aspect ratio. Rec 601, at least as far as I knew, and indeed as far as the directly linked WP article is concerned, is an early 80s format that long pre-dates any serious attempt at establishing HD standards, and only defines 525- and 625-line interlaced SDTV modes. No 720p or indeed 1080i/p of any kind, and no horizontal resolutions wider than 768 pixels.
Is there any kind of veracity behind this oddity, or is it just some random misguided wikignome's wild imagining that can be safely deleted? I'm feeling strongly inclined to Be Bold with it. 80.189.129.216 (talk) 20:22, 23 January 2018 (UTC)