When and how did they invent 240p?

It’s such an odd thing. Was 240p a thing outside of the video game context? If not, when and/or how did video game and computer hardware manufacturers invent 240p? I feel like there has to be a story there.

Is 240p as old as video games? Or was there a time when games were interlaced prior to the 240p video game standard of the 80s and 90s?

I don’t have any sources for this, but I believe it came about due to technical limitations and to save data by halving the vertical resolution. I think it was also more of a hack of the 480i standard than any kind of official standard, which is why so many things have an issue with handling 240p and treat it like it’s 480i. Especially when you consider that 240p is kind of a catchall, and a lot of the time the actual vertical resolution could vary a good deal from that. Look at the SNES, for instance, which supported progressive resolutions of 256x224, 512x224, 256x239, and 512x239.

I’m not sure technically why, but 320x240 was the standard for computer screens on IBM PCs. That was eventually doubled to 640x480, etc. I don’t know why that was the choice early on, but it is what I remember from the beginning. It probably has something to do with memory size. Remember, 640k was all we ever needed!
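On the memory angle, here’s a quick back-of-the-envelope (my own rough numbers, not from any source) showing why these resolutions were memory-friendly on early PCs:

```python
# Rough framebuffer-size arithmetic -- my own numbers, just to illustrate
# why certain resolutions were convenient on early PC hardware.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Size of a packed framebuffer in bytes."""
    return width * height * bits_per_pixel // 8

# VGA mode 13h: 320x200 at 8 bpp fits in a single 64 KiB real-mode segment.
print(framebuffer_bytes(320, 200, 8))   # 64000  (< 65536)

# 320x240 at 8 bpp does NOT fit in one 64 KiB segment -- one reason
# 320x200 stuck around even after VGA could scan out 240 lines.
print(framebuffer_bytes(320, 240, 8))   # 76800

# The doubled standard: 640x480 at 4 bpp (16 colours) on stock VGA.
print(framebuffer_bytes(640, 480, 4))   # 153600
```

So "something to do with memory size" checks out, at least for why 320x200 rather than 320x240 was the easy mode to program.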

The signalling for those resolutions is all the same: more lines are simply active in the vertical case, and more data is packed into each scanline in the horizontal.

It’s certainly all a hack of 480i, the only difference being not including the half-an-hsync offset between fields.
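To put numbers on that hack (standard NTSC figures, but the framing is my own sketch):

```python
# Back-of-the-envelope NTSC timing, illustrating the 240p "hack".
# The constants are standard NTSC numbers; the framing is my own.

LINE_PERIOD_US = 63.5555  # one horizontal scan period (~15.734 kHz)

# Interlaced 480i: 525 lines per frame, split into two 262.5-line fields.
# The leftover half line offsets each field vertically, interleaving them.
lines_per_field_480i = 525 / 2                   # 262.5
field_offset_480i = lines_per_field_480i % 1.0   # 0.5 -> fields interleave

# "240p": emit a whole number of lines (262) per field instead.
# With no half-line offset, every field lands on the same scan positions,
# so the TV effectively draws a progressive picture at ~60 Hz.
lines_per_field_240p = 262
field_offset_240p = lines_per_field_240p % 1.0   # 0.0 -> fields overlap

field_rate_240p = 1e6 / (lines_per_field_240p * LINE_PERIOD_US)
print(field_offset_480i, field_offset_240p)  # 0.5 0.0
print(round(field_rate_240p, 2))             # ~60.05 Hz
```

That slightly-off ~60.05 Hz field rate is also why some consoles ran a hair faster than broadcast NTSC.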

Most DOS games actually render in 320 x 200, though, with the monitor stretching them vertically to fit a 4:3 aspect, which is the exact opposite of many consoles at the time, where pixels were normally stretched horizontally. I love this article, which talks a bit about it with some great examples:

(Bonus Sonic composite “transparency” reference at the end!)
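The vertical-vs-horizontal stretching works out neatly as pixel aspect ratios (my own arithmetic, using made-up helper names):

```python
# Pixel aspect ratio (PAR) for non-square-pixel modes -- my own arithmetic.
# PAR = display aspect / storage aspect; >1 means pixels wider than tall.
from fractions import Fraction

def pixel_aspect(width, height, display_aspect=Fraction(4, 3)):
    """Width:height of a single pixel when the image fills the display."""
    return display_aspect / Fraction(width, height)

# DOS 320x200 stretched to a 4:3 screen: pixels are taller than wide.
print(pixel_aspect(320, 200))   # 5/6

# A console-style 256x224 image on the same 4:3 screen: pixels are
# wider than tall -- the horizontal stretch mentioned above.
print(pixel_aspect(256, 224))   # 7/6
```

Same 4:3 glass, opposite distortions, which is why naive square-pixel emulator output looks squashed one way for DOS games and the other way for consoles.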

Yea, that Gamasutra article is very informative and interesting, especially to someone like myself who never owned a PC back when such games were made.

Hmmm… yeah, you’re right! I was a little confused there. My PC gaming began in the 320x200 era so I should have known that. :frowning:

640x480 was an IBM standard though. Here’s a Wikipedia article with some good information…


TV Computer: Non-interlaced TV-as-monitor

Various Apple, Atari, Commodore, Sinclair, Acorn, Tandy and other home and small-office computers introduced from 1977 through to the mid-1980s. They used televisions for display output and had a typical usable screen resolution from 102–320 pixels wide and usually 192–256 lines high, in non-interlaced (NI) mode for a more stable image (displaying a full image on each 1/50th / 1/60th-second field, instead of splitting it across each frame). The limited resolution led to displays with a characteristic wide overscan border around the active area. Some more powerful machines were able to display higher horizontal resolutions—either in text-mode alone or in low-colour bitmap graphics, and typically by halving the width of each pixel, rather than physically expanding the display area—but were still confined in the vertical dimension by the relatively slow horizontal scanning rate of a domestic TV set. These same standards—albeit with progressively greater colour depth and upstream graphical processing ability—would see extended use and popularity in TV-connected game consoles right through to the end of the 20th century.

Resolutions: 140×192 NI (low-end), 320×200 NI (typical), 640×256 NI (high-end)
Aspect ratio: 4:3 (non-square pixels)
Colour depth: 1–4 bpp typical, 2 or 3 bpp common

and the entries immediately following it… because I think that had as much to do with game development and the evolution of home consoles and their standards from those early home computers.


Fascinating stuff!

It sucks that all modern displays use fixed pixels. It leads to a lot of wasted power just trying to get good IQ, either through AA methods or by shooting for native screen res.

I still feel like CRTs would be the best display tech ever made even in the OLED HDR age if they were still being made for today’s resolutions.

We are only now starting to get consumer displays that exceed what we had in the 90s in terms of color representation and accuracy.

If I’m not mistaken, the Amiga was the computer of choice for Genesis game development, at least early on. It had the same processor inside, so it was a perfect fit. Many games were written in assembly language, though, for speed.

You can (IMO) thank Microsoft partially for today’s display technology and how it has evolved. The PC-ization of video display hardware (and consoles) really came via them moving aggressively into the consumer side. If that doesn’t happen, it’s certainly possible that things evolve differently, but the CRT being so heavy and cumbersome as screens grew was probably the biggest overall issue facing its continuation.

Think about how many people literally give away top quality CRT sets now. They do it because of size and weight more than obsolescence. It’s just unfortunate for us that all the stuff that came before wasn’t really optimized for what came next.


CRTs weren’t perfect. Yeah, they had great color and black levels, but I don’t think any CRT ever made had perfect geometry, which got even worse when the move to widescreen happened. I went through a ton of widescreen HD CRTs because every single one of them, and I mean EVERY SINGLE ONE of them, had major geometry issues that just couldn’t be fixed. I went through a few where a straight line from one end of the screen to the other would bend so much that on one end it would be an inch from the top of the screen and on the other end it’d be 2 1/2 inches.

I mean, CRTs survived into the HD age. Hell, CRT computer monitors were doing HD-comparable resolutions for ages. If I remember right, most of the original HD TVs out there were CRTs. The problem wasn’t hitting the resolution; people jumped on LCDs and plasmas because they were a fraction of the weight and thickness of CRTs. That’s what drove LCD adoption.

I still think SED could have been the perfect blend of CRT tech and the thinness and lighter weight of an LCD. To quote Wikipedia:

A surface-conduction electron-emitter display (SED) is a display technology for flat panel displays developed by a number of companies. SEDs use nanoscopic-scale electron emitters to energize colored phosphors and produce an image. In a general sense, an SED consists of a matrix of tiny cathode ray tubes, each “tube” forming a single sub-pixel on the screen, grouped in threes to form red-green-blue (RGB) pixels. SEDs combine the advantages of CRTs, namely their high contrast ratios, wide viewing angles and very fast response times, with the packaging advantages of LCD and other flat panel displays. They also use much less power than an LCD television of the same size.

It’s just a shame that they were never able to make the manufacturing cheap enough to commercialize the displays. I really think this could have given us the best of both worlds.


Great points and informative post.


Aspect ratio is overrated

(IBM 5140 Convertible PC, 1986)


Apparently, as are color graphics and backlighting.


As is grammar: “Press an F key to select an application”. How many F keys did this thing have? Lol.


Well, thirteen if you really want to be pedantic

Surely that thing has 10? Seems to from the images.

Or does it go up to 11?


12 is standard. What kind of keyboards are you guys using?

I’ve not watched this… yet


Yea, that video is actually pretty informative.

Don’t mind me, I straight up forgot about F1 - F12, and was thinking of the F letter key, haha. I guess “an F key” works in that context.

I played myself. Press F for Respects :slightly_frowning_face: