I'm pretty sure 1080i draws half the pixels in one pass then the other pass draws the other half alternating while 1080p draws the full picture in every pass. Is that not why 1080p uses twice the bandwidth of 1080i?
1080i
Most HDTV channels are broadcast in 1080i resolution. Programs broadcast in 1080i are sent at 30 frames per second. Each frame is 1,920 pixels wide by 1,080 lines tall, delivered as two interlaced fields (the odd-numbered lines, then the even-numbered lines) that combine to create a continuous picture on your TV screen. Because 1080i is an interlaced image, fast-moving objects such as a speeding car can blur. The lines of the image are not rendered contiguously, which can cause a "messy" or blurred picture. It can also cause image skipping, where it looks like the image is delivered in stop-motion photography.
1080p
1080p HDTV has the same number of pixels per frame as 1080i, but the image is sent at 60 full frames per second instead of 30. The higher frame rate of 1080p's progressive scanning means it takes up to twice the bandwidth of 1080i. Consequently, many TV cable companies transmit or broadcast in 1080i to prevent lag in the cable feed. This results in a lower-quality picture in motion, but without the inconvenience of having your TV picture freeze or skip ahead without showing all of the program you are watching.
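The "twice the bandwidth" claim can be sanity-checked with a quick pixel-throughput calculation. This is only a back-of-the-envelope sketch: real broadcast bitrates are dominated by compression (MPEG-2/H.264) and overhead, which this ignores.

```python
# Rough uncompressed pixel-throughput comparison of 1080i vs 1080p.
WIDTH, HEIGHT = 1920, 1080

# 1080i: 60 fields per second, each field carrying half the lines
# (odd or even), which works out to 30 full frames per second.
pixels_per_sec_1080i = WIDTH * (HEIGHT // 2) * 60

# 1080p: 60 full frames per second.
pixels_per_sec_1080p = WIDTH * HEIGHT * 60

print(pixels_per_sec_1080i)                         # 62208000
print(pixels_per_sec_1080p)                         # 124416000
print(pixels_per_sec_1080p / pixels_per_sec_1080i)  # 2.0
```

So at the same 60 Hz refresh, 1080p pushes exactly twice the raw pixels per second of 1080i, which is where the factor of two comes from.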
No, it's like how Battlefield 4 on Xbox One is 30 FPS but on PC is 60 FPS (if played at the same resolution and graphics settings). It's just a smoother frame rate. If you've ever used Sony Vegas, it'd be like rendering something at 30 FPS being much quicker than rendering it at 60 FPS, even though the number of pixels is the same.
Isn't that what I just said?
(really 24 fps, but techniques are used to make it 30 fps)
The resolution of a 1080p 16:9 image is 1920 x 1080. The resolution of a 2K 16:9 image is 2048 x 1152. That's not a noticeable difference. Television resolutions such as 480p, 720p, & 1080p are measured vertically. Cinematic resolutions like 2K, 4K, & 8K are measured horizontally.
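The "not a noticeable difference" claim is easy to check by comparing total pixel counts; this is plain arithmetic on the two resolutions given above, nothing more:

```python
# Total pixel counts for 16:9 1080p vs cinema 2K.
pixels_1080p = 1920 * 1080  # TV resolutions are named by the vertical measure
pixels_2k    = 2048 * 1152  # cinema resolutions are named by the horizontal measure

print(pixels_1080p)              # 2073600
print(pixels_2k)                 # 2359296
print(pixels_2k / pixels_1080p)  # ~1.14, i.e. only about 14% more pixels
```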