When and how did they invent 240p?

It’s effin’ okay man.

Actually, this computer had only 10 function keys. I stand corrected. F11 and F12 were introduced in 1986 with the Model M keyboard.

2 Likes

To add to the “it’s cheaper to do 240p”, 320x200 was chosen as a resolution because it totals up to 64k pixels. You can fit those in the screen RAM, which is sized depending on how many colours you want your computer to display. A computer with 16k of RAM dedicated to the display can have four colours per frame, 32k gives 16 colours and 64k gives 256 colours. By the same calculation, you could use 16k RAM for 640x200 in two colours (DOS prompt, that LCD display I posted above) and 160x200 in glorious 16 colours.

If you insisted on using 320x240 you’d have had to double the RAM allocated for just a few more lines. And RAM wasn’t cheap back then.
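
To put rough numbers on that, here’s a quick sketch of the arithmetic (my own illustration, not any particular machine’s video hardware; the helper name is just made up for the example):

```python
# Screen RAM for a bitmapped mode: width * height * bits_per_pixel / 8
def screen_ram_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

print(screen_ram_bytes(320, 200, 2))  # 16,000 bytes ->  4 colours
print(screen_ram_bytes(320, 200, 4))  # 32,000 bytes -> 16 colours
print(screen_ram_bytes(320, 200, 8))  # 64,000 bytes -> 256 colours
print(screen_ram_bytes(640, 200, 1))  # 16,000 bytes ->  2 colours
print(screen_ram_bytes(160, 200, 4))  # 16,000 bytes -> 16 colours
print(screen_ram_bytes(320, 240, 8))  # 76,800 bytes -> no longer fits in 64k
```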

3 Likes

Great video!

1 Like

Be sure to like it on YouTube - it’s not got many so far

Spacewar! ran on a radar-type CRT (a point-plotting display rather than a raster one), so the screen had no interlacing at all. So 240p isn’t there from the start in video games.

The Magnavox Odyssey appears to use 240p, so that’s possibly the start in games, 1972? Or possibly Computer Space (1971) or Pong (1972) in the arcades. But no doubt the idea came earlier on some early computer.

Been a while since I’ve gotten engrossed in a Wikipedia article and just started link hopping, so thanks for that.

2 Likes

Ha! Glad I could help!

I lived this entire era. I often reflect on being 46 today and being exactly the right age to be a kid as Pong entered the home, arcades started booming and pinball was still doing very well. As a five-year-old in 1977, I was already enthralled with gaming because we got an Odyssey 300. When we went to the VFW for dinner, my brother would let me operate one flipper for one ball on their one pinball machine. No, I don’t even know what game it was…

I literally lived my life alongside the coming of age of gaming and yet there’s so much I still learn from articles like that and from Retro Gamer’s great coverage of all those eras and the people who built it all.

A lot of times, though, questions like the one @Peltz posed have rather simple answers: they did things the way they did because it was the path of least resistance to getting something out there, or because they were stymied by physical limitations, specifically processing power and RAM. @khaz’s great description of the 64k pixels pretty much explains why they used that specific resolution. I can echo him on “RAM wasn’t cheap back then,” as it wasn’t until much, much later that it got reasonably priced.

Think of this too… at one time, TVs were an installed base game makers had to contend with! You weren’t even sure everyone was going to play on a color TV! That’s when I started gaming. That Odyssey was hooked to a black-and-white set in the basement! We had a gigantic console CRT in the living room with the built-in AM/FM radio, record player and eight-track player. It was a beautiful piece of furniture, but it was not to be trifled with for gaming! Well, not until I got the 2600 anyway… or the Video Computer System, as it was known at first.

Anyway, glad to send you down a rabbit hole. :smiley:

2 Likes

I have heard elsewhere that the main reason gaming systems used 240p was that it took much less processing power to run a game with 240 lines versus 480.

I remember my grandparents had one of those large radios that was about a full bed length in size, and 3 feet high. It wasn’t even a TV, just a radio, 8 track player, record player, and speakers.

The main reason is less flicker on static screens and a higher framerate.

It’s the same number of pixels per second (and the same number of lines per second) either way.
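
Back-of-the-envelope, with rough NTSC-ish numbers (a sketch, not exact timings):

```python
passes_per_second = 60   # ~60 beam passes per second in both 240p and 480i
lines_per_pass = 240     # roughly 240 drawn lines per pass either way

line_rate = passes_per_second * lines_per_pass  # ~14,400 drawn lines/sec in both cases

refresh_480i = passes_per_second / 2  # each individual line redrawn ~30x/sec -> flicker
refresh_240p = passes_per_second      # each line redrawn ~60x/sec -> stable image
print(line_rate, refresh_480i, refresh_240p)
```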

Ah, that makes sense

Very informative video, and there is a 2nd part about CRT aesthetics


1 Like

Not watched the videos, but I think the PC and the need to read text from a screen all day is the most important factor in the earliest days of 240p. When I run Windows at 480i through CRT_Emudriver, I can attest to the complete inadequacy of interlaced signals for text you’ll be looking at for a long time.

It’s not like you could magically double the video bandwidth for the screens at the time, so 240p (or anything lower than that) was vastly superior for text.

Could be wrong though, I was not around in the time of TV-based computers.

Watch the videos :wink:

:wink: Will do! Happy birthday!

1 Like

I’ve been curious this whole time: when a TV is receiving 240p, it’s still spitting out a 480i picture on the screen, isn’t it?

Sort of, yes. A 480i signal alternates between the even and the odd horizontal lines of pixels on each successive pass of the beam (each “field”), 60 fields per second.

240p is just a “hack” of 480i. Rather than alternating between the even and odd lines on each pass, the same half of the horizontal lines is drawn on all 60 passes per second. The same set of lines is lit every time and the other set stays blank every time, giving it the “scanline” look.

As far as an SD CRT knows though, it’s still receiving and displaying a 480i image. That’s why there’s no “pause” when switching between the two resolutions on a native display the same way there could be when switching between other resolution formats.

Keep in mind, the OSSC and Framemeister are able to detect the difference between the two. So they aren’t exactly the same signal. The answer really is “sort of”.
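
As a rough illustration of the “same half of the lines every time” point (my own sketch, treating the tube as a plain 480-line raster and ignoring real NTSC timing details):

```python
VISIBLE_LINES = 480

def lines_drawn_480i(pass_index):
    # Interlaced: alternate between the even and the odd set of lines.
    return set(range(pass_index % 2, VISIBLE_LINES, 2))

def lines_drawn_240p(pass_index):
    # "240p hack": the console never alternates, so the same set of lines
    # is lit on every pass and the other set stays dark (the scanline gaps).
    return set(range(0, VISIBLE_LINES, 2))

for p in range(4):
    print(p, sorted(lines_drawn_480i(p))[:4], sorted(lines_drawn_240p(p))[:4])
# 480i flips between {0, 2, 4, ...} and {1, 3, 5, ...};
# 240p stays on {0, 2, 4, ...} every single pass.
```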

1 Like

It seems funny that my DVD/VCR combo is just as fine at resolution switching with old TV sets when all my other equipment has the delay. It’s an oddball thing to think about, since the OSSC is literally line-doubling the displayed lines; perhaps this is a case where it should be given an input mode that handles both 240p and 480i without requiring the re-detect?

I’m not a fan of describing SD CRTs as 480i machines. CRTs are ancient tech from the very early 20th century. As such, they merely convert an analogue electrical signal into an electron beam of variable intensity that excites phosphor to make it glow, just like a turntable converts the bobbing of a needle into a vibrating membrane. The SYNC signal gives the tempo that turns that one-dimensional signal into a 2D image, and that is what gives it properties like the number of lines and the frequency of images. Assuming the horizontal and vertical scanning speeds of the beam are fixed, you can manipulate the SYNC signal to have the image start at whatever height you want and return to the top after however many lines you like (incidentally affecting the refresh rate).

I believe that when the apparatus was originally conceived, images were shown progressively, and that interlacing is actually a trick to fake a higher resolution: by delaying VSYNC by half a line, the lines of the next image land in between the lines of the previous image.

Aerial TV (NTSC) standardised how images should be transmitted, with how many lines and how often, and that gave us 480i at 60Hz. But the sync could still be exploited nonetheless, which makers of non-broadcast video, i.e. computers and video game consoles, eagerly did to show a more stable and less headache-inducing progressive picture.
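
To make the half-line trick concrete, here’s a toy model (my own sketch, assuming the beam steps down one line height per horizontal line and retraces vertically on each VSYNC):

```python
# NTSC interlacing sends a half-integer number of lines per field (262.5),
# so every other field starts half a line lower and its scanlines land in
# between the previous field's. Sending a whole number of lines per field
# (262 or 263) makes every field start at the same height: "240p".

def field_start_offsets(lines_per_field, fields=4):
    """Fractional vertical offset (in line heights) of each field's first line."""
    position = 0.0
    offsets = []
    for _ in range(fields):
        offsets.append(position % 1.0)
        position += lines_per_field
    return offsets

print(field_start_offsets(262.5))  # [0.0, 0.5, 0.0, 0.5] -> fields interleave (480i)
print(field_start_offsets(262))    # [0.0, 0.0, 0.0, 0.0] -> lines overlap exactly (240p)
```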

1 Like