> Williams always used a framebuffer and blitter or GPU combo, from Defender through
> when they went out of the arcade business. In the case of the 340x0 games, the
> blitter was special-function hardware in the CPU itself, but they never did raw
> software drawing.
Correction: Midway didn't use the 340x0 for anything except as a CRTC and a CPU. It was too slow to do the blitting they needed, so it was offloaded to a custom blitter, much like the old 6809 games did.
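For anyone who hasn't thought about what a blitter actually does, here's a rough sketch in Python of the core operation: copying a rectangle of sprite pixels into a framebuffer while skipping a transparent color. This is purely illustrative (the function and names are made up for the example, not any real chip's interface), but it's exactly the kind of per-pixel loop that was too slow in software on the 34010 and got offloaded to dedicated logic instead.

```python
# Illustrative sketch of the core operation a hardware blitter performs:
# copy a w x h rectangle of source pixels into a framebuffer at (dx, dy),
# skipping a designated transparent color. Real blitters do this in
# dedicated hardware, not a CPU loop; names here are invented.

TRANSPARENT = 0  # palette index treated as "no pixel"

def blit(fb, fb_width, src, src_width, sx, sy, dx, dy, w, h):
    """Copy a w*h block from src (flat pixel list) into fb (flat pixel list)."""
    for row in range(h):
        for col in range(w):
            pixel = src[(sy + row) * src_width + (sx + col)]
            if pixel != TRANSPARENT:
                fb[(dy + row) * fb_width + (dx + col)] = pixel
    return fb

# Tiny demo: a 2x2 sprite with one transparent pixel, drawn into a
# 4x4 framebuffer at position (1, 1).
framebuffer = [9] * 16          # 4x4 framebuffer filled with background color 9
sprite = [1, 0,                 # 2x2 sprite; the 0 is transparent
          2, 3]
blit(framebuffer, 4, sprite, 2, 0, 0, 1, 1, 2, 2)
```

The background (9) shows through where the sprite was transparent, which is the whole reason the blitter checks each pixel instead of doing a straight memory copy.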
Many companies mixed framebuffers in here and there as well. A lot of early Atari games like Missile Command, Cloak & Dagger, and Crystal Castles used them, and they showed up in 16-bit land as well for games like Rampart. In the days when people designed hardware for specific games, it was not all that uncommon.
Strata/Incredible Technologies games are all framebuffer-based as well. Same for the Art & Magic games. And the Leland/Cinematronics games used a framebuffer for sprites but mixed it with a tilemap for the background.
> Tilemaps stuck around a little longer
> in arcades because they were cheap and made it easy to get the HUD/text layer going
> quickly. Atari's STUN Runner style hardware, Sega's Model 1/2/3 and Namco's System
> 21/22/23 all did this to at least some degree, and you could program if each tilemap
> layer appeared in front or behind the 3D framebuffer on most of those systems.
Actually, the Atari polygon games rendered the HUD using the TMS34010 and some tricks, and didn't have any tilemaps. They used the same font as their usual tilemaps, though, so you could be forgiven for believing that's what they were doing.
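To make the front/behind mixing concrete on the systems that do have tilemaps: per dot, the video circuitry picks either the tilemap pixel or the 3D framebuffer pixel based on the layer's programmed priority and transparency. Here's a generic sketch in Python of that per-scanline decision (no real system's register model is implied, and treating 0 as transparent is an assumption for the example):

```python
# Per-pixel priority mixing of one tilemap layer with a 3D framebuffer,
# as a generic illustration only. Pixel value 0 is treated as transparent
# in both layers, which is an assumption made for this example.

def mix_scanline(fb_line, tile_line, tilemap_in_front):
    """Return the displayed scanline given one 3D framebuffer line and one
    tilemap line, with the tilemap layer either in front of or behind the 3D."""
    out = []
    for fb_px, tile_px in zip(fb_line, tile_line):
        if tilemap_in_front:
            # Tilemap wins wherever it's opaque: HUD/text drawn over the scene.
            out.append(tile_px if tile_px != 0 else fb_px)
        else:
            # 3D wins; the tilemap only shows where the 3D drew nothing.
            out.append(fb_px if fb_px != 0 else tile_px)
    return out

# Demo: same two lines mixed both ways.
fb_line   = [5, 5, 0, 0]   # 3D pixels, 0 = nothing rendered there
tile_line = [0, 7, 7, 0]   # HUD text pixels, 0 = transparent
front = mix_scanline(fb_line, tile_line, tilemap_in_front=True)
back  = mix_scanline(fb_line, tile_line, tilemap_in_front=False)
```

Flipping that one priority bit per layer is all "in front or behind the 3D framebuffer" amounts to, which is why it was cheap to make programmable.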