You're right, I should have mentioned that crank fore-and-aft play (crank walk) does play a role. Some of the guys complained about this on the cheap China engines, and they changed the code to average the pulse timing, like Eccentric mentioned. On the better-built engines this wasn't much of a problem once the engine got above 500 RPM. I know that with the aftermarket race-car ignitions that use a crank pickup, if the sensor is too far away from the crank magnets, crank walk can show up as jitter at low RPM (<2,000). Another problem with those is crank flex and detonation, which shows up at higher RPM and load. How much jitter crank walk causes also depends on how the magnet and Hall sensor are positioned. For example, if the magnet is placed near the edge of the flywheel with the south pole pointing outward, instead of on the face with the south pole pointing toward the front or back of the engine, there is less jitter and crank walk is not much of a factor - unless crank walk is really bad, in which case something is really wrong with that engine. Almost everyone drilled a hole/pocket in the flywheel so the south pole was pointing out.

I remain surprised that works. Of course, the processor is much faster than the mechanical bits, but even at 0.001 degree jitter (which astonishes me that you can get it that small - I'd have expected simple slop in the crank bearings to contribute more variability than that!), that's still 10% of your (500 RPM) pulse width. It doesn't matter how fast the processor can calculate/time things; if the RPM estimate is wrong, it seems like that'd play havoc with calculating the right value for the delay between seeing the pulse and firing the plug.
Unless I'm completely misthinking something, a 10% error in the pulse width produces a 10% error in the RPM estimate. And a 10% error in the RPM estimate should produce a 10% error in your delay-until-firing calculation. That could be 36 degrees...
Ah, but wait wait wait -- you're not predicting a delay for an almost complete revolution; your pulse occurs somewhere "close to but before" the earliest possible firing advance, so somewhere maybe 45 deg BTDC? A 10% error in that is 4.5 degrees, but maybe that's not critical?
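To put rough numbers on that last bit - the 45 deg BTDC pickup position and the 10% figure here are just the ones from your post, used as an illustration, not measured values:

```c
/* Rough numbers only - 500 RPM, a pickup at 45 deg BTDC, and a 10% timing
   error are taken from the discussion above as an illustration.            */
#include <stdio.h>

int main(void)
{
    double rpm        = 500.0;
    double rev_ms     = 60000.0 / rpm;   /* one revolution in milliseconds  */
    double pickup_deg = 45.0;            /* pickup angle before TDC          */
    double err        = 0.10;            /* assumed 10% timing error         */

    /* The error only acts over the pickup-to-fire span, not a whole rev. */
    printf("one revolution     : %.1f ms\n", rev_ms);
    printf("error over 360 deg : %.1f deg (the 36 deg worry)\n", 360.0 * err);
    printf("error over pickup  : %.1f deg (what you actually see)\n", pickup_deg * err);
    return 0;
}
```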
As far as the error is concerned, what's the real difference between 500 and 550 RPM to an engine? Not much. If we're talking about a hit-n-miss, one can adjust the distance and orientation of the magnet/Hall to manage error, or change either the magnet or the Hall-effect sensor. We also found that adjusting the pull-up resistor value helped a lot with error, along with using a Hall-effect sensor that wasn't too sensitive. The most error I saw was <= 5 degrees on the cheap China engines, and it wasn't constant. You are kind of right in your later thinking, because most of the time the pickup signal was around 45 deg BTDC, and that angle had to be entered into the software program; it needed to know where in degrees the pickup was. The more advance you need, the farther before TDC the pickup has to be placed.
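The program side was basically this. A minimal sketch of the idea only, not the actual code - the names, the 4-sample average, the microsecond timer, and the 45 deg pickup angle are all just for illustration:

```c
/* Sketch of pulse-period averaging plus the fire-delay calculation.
   Everything here is illustrative; the real firmware hangs this off a
   capture/input interrupt, and the period array should be primed with a
   few real pulses before firing is enabled.                               */
#include <stdint.h>

#define PICKUP_BTDC_DEG   45.0f   /* where the magnet/Hall actually sits   */
#define AVG_SAMPLES       4       /* simple average to smooth out jitter   */

static uint32_t period_us[AVG_SAMPLES];
static uint8_t  idx;

/* Call on every pickup pulse with the measured time since the last pulse.
   Returns the averaged period of one full revolution, in microseconds.    */
uint32_t update_period(uint32_t last_period_us)
{
    period_us[idx] = last_period_us;
    idx = (idx + 1) % AVG_SAMPLES;

    uint32_t sum = 0;
    for (uint8_t i = 0; i < AVG_SAMPLES; i++)
        sum += period_us[i];
    return sum / AVG_SAMPLES;
}

/* Delay from "saw the pulse" to "fire the coil", in microseconds.
   advance_deg is the wanted spark advance BTDC; it has to be less than
   the pickup angle, or the spark would have to fire before the pulse.     */
uint32_t fire_delay_us(uint32_t avg_period_us, float advance_deg)
{
    float deg_to_wait = PICKUP_BTDC_DEG - advance_deg;
    return (uint32_t)(avg_period_us * (deg_to_wait / 360.0f));
}
```

At 1,200 RPM (50,000 usec per revolution) with 20 deg of advance, that works out to roughly 3,470 usec of delay after the pulse.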
Also, at cranking speed (roughly 200 RPM): 200 RPM x 360 deg = 72,000 deg per minute; 72,000 / 60 = 1,200 degrees per second; so 1,200 deg/sec x 1 msec = 1.2 degrees. With a 10% error on that 1 msec: 1,200 x 0.9 msec = 1.08 degrees, and 1,200 x 1.1 msec = 1.32 degrees. The amount of error can be corrected with testing; it's a trial & error thingy.
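Same arithmetic in code form, if anyone wants to plug in other speeds; the 200 RPM and the 0.9/1.1 msec figures are just the ones above:

```c
/* Crank degrees swept during a given timing interval, at a given RPM. */
#include <stdio.h>

static double deg_per_ms(double rpm)
{
    return rpm * 360.0 / 60.0 / 1000.0;   /* deg/min -> deg/sec -> deg/msec */
}

int main(void)
{
    double dps = deg_per_ms(200.0);                 /* cranking speed        */
    printf("200 RPM: %.2f deg per msec\n", dps);    /* 1.20 deg              */
    printf("0.9 msec -> %.2f deg\n", dps * 0.9);    /* 1.08 deg              */
    printf("1.1 msec -> %.2f deg\n", dps * 1.1);    /* 1.32 deg              */
    return 0;
}
```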
One of the biggest problems to get rid of was the noise from the ignition spark. Things like the negative spike from the ignition coil caused anything from a second spark to burned-up parts. Also, because it was a CDI ignition, we had to use 'snubbers' on the capacitor-charging circuit (primary side of the transformer).
Why do I feel like I hijacked this thread? Sorry.
Cheers
Ray