I'm curious. What was a typical clock speed for a general-use 32-bit CPU in 1991? I couldn't actually think of any myself...
Well, the 68040 was released in 1991 at 20MHz-40MHz, and the 50MHz 80486DX also came out in 1991.
But those were high-end chips, way too expensive for cheap consumer consoles.
The 68000 came out in 1979 with an 8MHz clock speed, didn't hit home computers until 1984, and was still being used in the Genesis at roughly the same clock speed in 1989.
In 1991 the Sharp X68000 XVI was released with a 68000 running at 16MHz.
Move forward to 1992 and you get the Amiga 1200 running an 8-year-old 68020 at 14MHz, or the Atari Falcon running a 68030 at 16MHz.
10MHz sounds slow ... but if it was a pipelined architecture averaging 2 or 3 clock cycles per 32-bit instruction (a 6502-like cycles-per-instruction figure), it could still have beaten the heck out of a much faster-clocked 68000 (which can take dozens of cycles per instruction).
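To put rough numbers on that: instruction throughput is roughly clock speed divided by average cycles per instruction (CPI). Here's a quick back-of-the-envelope sketch; both CPI figures are illustrative assumptions, not measured values:

```python
# Rough throughput estimate: MIPS ~= clock (MHz) / average cycles per instruction.
# Both CPI values below are assumptions for illustration, not measurements.
cpus = {
    "hypothetical 10MHz pipelined CPU": (10.0, 2.5),  # assuming ~2-3 cycles/instruction
    "8MHz 68000": (8.0, 12.0),                        # assuming ~a dozen cycles on average
}

for name, (clock_mhz, cpi) in cpus.items():
    print(f"{name}: ~{clock_mhz / cpi:.1f} MIPS")
```

By that crude measure, the slower-clocked pipelined chip comes out roughly six times ahead (~4.0 MIPS vs ~0.7 MIPS).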
And the other thing ... sprite-and-background-based consoles really don't do a lot of processing on the CPU.
It wasn't until the 3D generation that processors needed to be really fast, with lots of math.