Which consoles would that be? I was (perhaps naively) under the impression that pretty much everything up to and including 16-bit era consoles had no input lag from the hardware side?


@mega post that shows the whole thread from twitter.


Thanks man


SNES and Genesis games.


That’s input lag from the game software, not the console / hardware.


I know it’s kinda trolly, but I don’t think it will be the same thing. There is an analog output option that doesn’t add lag, and we have a solution for analog-to-digital that doesn’t add lag (the OSSC). I’m sure people are going to code a better scaler than the one built into the MiSTer core.

I never got Run Ahead working in a uniform way either. If I remember correctly, you had to set it up for each core, or maybe per game. It wasn’t ideal, and it meant the user had to do some guesswork too.


@tomwhite2004 True, but the important takeaway is that playing on original hardware, in the case of pretty much all games, has this added lag and the MiSTer supposedly doesn’t.

@kawika Run Ahead is easy to set up and there’s no guesswork… although yeah, it does need to be configured on a per-game basis. The video I posted yesterday shows how, and it takes very little time. If you’re not constantly hopping between new games it’s not so bad! As for inconsistent performance: it’s all a matter of having a computer fast enough to use it. Like many other RetroArch settings, it isn’t expected to perform identically across all cores and games, since some are more demanding than others.
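For reference, these are the Run Ahead keys as they appear in retroarch.cfg (a sketch only; in practice you’d normally set them from the Quick Menu and let RetroArch write the per-game override file for you):

```ini
; Run Ahead settings in retroarch.cfg (the same keys can live in a
; per-game override file, which is why setup is per-game).
run_ahead_enabled = "true"

; Number of frames to run ahead; should match the game's internal
; input lag, which is the part you have to find out per game.
run_ahead_frames = "1"

; Use a hidden second core instance instead of savestate rollback.
; Costs more CPU but avoids audio/state glitches in some cores.
run_ahead_secondary_instance = "true"
```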


The lag created by a game’s logic taking a certain number of frames to respond will still be there on MiSTer too; you don’t and can’t remove it just because you are using an FPGA.


Alright, then I have to wonder what the guy was referring to in the bolded part or if he’s just wrong.

New scaler cores are being written which should allow for lower lag via HDMI, but two frames is still very low, since the cores themselves essentially have no added output lag (compared to the OG console / computer) before it reaches the scaler


He is correct: the emulated hardware core on its own has no lag. But this variable game-logic lag (for want of a better term!), which is what Run Ahead removes, is separate from that. I’d be interested to know whether the same save-state manipulation could be done on an FPGA clone to achieve the same result.
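For what it’s worth, the save-state trick itself is simple: advance the real timeline one frame, speculatively run a few more frames with the current input held, display the speculated frame, then rewind. A toy sketch of the idea (all names here are made up for illustration; this is not RetroArch’s actual code):

```python
import copy

class ToyCore:
    """Stand-in for a game with 2 frames of internal input lag:
    a pressed button only reaches the screen two frames later."""
    def __init__(self, internal_lag=2):
        self.pipeline = [0] * internal_lag  # inputs queued by game logic
        self.screen = 0                     # what the player sees

    def step(self, inp):
        self.pipeline.append(inp)
        self.screen = self.pipeline.pop(0)  # game reacts to an old input

    def save_state(self):
        return copy.deepcopy(self.__dict__)

    def load_state(self, state):
        self.__dict__.update(copy.deepcopy(state))

def run_ahead(core, inp, frames):
    """One displayed frame with run-ahead: speculate `frames` frames
    past the present, show the result, rewind to the true timeline."""
    core.step(inp)             # the real frame
    state = core.save_state()  # remember the true timeline
    for _ in range(frames):
        core.step(inp)         # run ahead with the same input held
    shown = core.screen        # this is what gets displayed
    core.load_state(state)     # rewind; only one real frame elapsed
    return shown
```

With `internal_lag=2` and `frames=2`, a button press shows up on the very frame it was made, which is exactly the effect Run Ahead produces; the open question in the post above is whether an FPGA core can save and load its state fast enough to do the same.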


Some things I’ve observed in the dev community in the last few days:

  1. The replacement open-source video scaler is targeting a mode with only three scanlines of latency; that’s OSSC territory.
  2. An ultra-low-latency I/O board is in development to support real console controllers with the least possible latency.
  3. PCE CD development is in its early stages. I’m glad I didn’t buy the Super SD System 3.

So latency is being addressed and CD support is in development. Work done on the PCE CD plus the new open-source cycle-accurate 68000 processor core will both help with Mega Drive CD down the road.
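To put "three scanlines" in perspective, a quick back-of-the-envelope calculation (assuming the standard NTSC 240p horizontal rate of roughly 15.73 kHz, which is what these consoles output):

```python
# Rough latency of a three-scanline video scaler at NTSC line rates.
h_freq = 15_734                      # Hz, NTSC horizontal line rate
scanline_us = 1_000_000 / h_freq     # ~63.6 microseconds per scanline
latency_us = 3 * scanline_us         # ~191 microseconds total
frame_ms = 1000 / 60.0               # ~16.7 ms per frame at 60 Hz
frames = (latency_us / 1000) / frame_ms
print(f"{latency_us:.0f} us, about {frames:.3f} of a frame")
```

So three scanlines works out to roughly a fifth of a millisecond, around one hundredth of a frame, which is why it gets compared to the OSSC.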


I think I read ElectronAsh mentioning that CPS2 is on his list of things he’s looking at/will look at, whether an actual core materialises is another thing but it’s certainly got me excited at the thought.


Ash is absolutely working on various CPS components. CPS2 has a lot of on-hardware encryption, to the point of using a special custom-designed 68000 processor with internal encryption. So I expect to see CPS way before CPS2.


Is it not possible to use unencrypted game ROMs on CPS2? I should really Google that.


MiSTer now has cycle accurate 68000 CPU in the Mega Drive core.


We have been able to use unencrypted ROMs since 2001; that made emulation possible and gave rise to Phoenix ROMs (which are no longer necessary) for use with real hardware.

In the past couple of years, the custom chips and the encryption process have been completely reverse-engineered and understood. There really shouldn’t be any roadblocks to an FPGA implementation of the hardware, other than perhaps accurate QSound, as that DSP is more powerful than the rest of the CPS2 hardware itself. MAME, for example, has not yet included its cycle-accurate QSound emulation because it would make the system unplayable on everything but the best i7s and make CPS2 one of the hardest systems to run.


That’s wild. I wonder why they developed it that way. Is there anywhere that I can read about it?


Everything I know has come from random snippets from MAME devs on Reddit over the years, so I don’t really have any reading material, I’m afraid; I can only go on what I remember. There was a great discussion around MAME 0.196 when the LLE emulation was first announced, but I just can’t find it.


Some great QSound/Q1 rips here: https://minirevver.weebly.com/qsound-music.html

MAME QSound/DSP discussion http://www.mameworld.info/ubbthreads/showflat.php?Cat=&Number=374742&page=&view=&sb=5&o=&vc=1

The QSound DSP runs at 30 MHz (a 60 MHz clock input divided by two). On CPS2, it’s already clocked at almost three times the speed of the main 68k CPU, it can issue an instruction every one or two cycles, and it can do up to four operations per instruction. The 68k main CPU takes several cycles to execute most instructions, and multiplies are particularly time-consuming. The QSound DSP is far more powerful than the main CPU: maybe 40 times the throughput.
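The "~40x" figure roughly checks out from the numbers in that paragraph. A back-of-the-envelope sketch (the cycle counts and the 68k clock here are my own assumptions filled in from "almost three times the speed" and "several cycles per instruction", not measured figures):

```python
# Rough throughput comparison: QSound DSP vs CPS2 main 68000.
dsp_clock = 30_000_000       # Hz: 60 MHz input clock divided by two
dsp_cycles_per_instr = 2     # "one or two cycles"; assume the worse case
dsp_ops_per_instr = 4        # "up to four operations per instruction"

m68k_clock = 11_800_000      # Hz: assumed, ~1/3 of the DSP clock
m68k_cycles_per_instr = 8    # assumed average; "several cycles" each

dsp_ops = dsp_clock / dsp_cycles_per_instr * dsp_ops_per_instr  # 60M ops/s
cpu_ops = m68k_clock / m68k_cycles_per_instr                    # ~1.5M ops/s
print(f"DSP/CPU throughput ratio: ~{dsp_ops / cpu_ops:.0f}x")
```

Under those assumptions the ratio lands right around 40x, which is also a hint at why cycle-accurate QSound emulation is so expensive on a PC.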


That paragraph was the exact one I was looking for, well found!