Model engine CDI easy and cheap

@willray

The coil I am using likes to see, as you say, about 4 ms of dwell. This is the time needed to charge the coil and it determines the energy of the spark. If my engine is running at 4000 RPM and I am using wasted spark, where I fire both spark plugs every revolution of the crankshaft, I need to fire the spark plugs every 15 milliseconds (4000 RPM is 66.7 revolutions per second, so each revolution takes 1/66.7 = 0.015 s, or 15 ms). Of this 15 milliseconds, 4 ms is used to charge the coil. At 7500 RPM the crankshaft is turning 125 times a second, or once every 8 milliseconds, and we still have half of that time to charge the coil.
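For anyone following along, here is the same arithmetic written out as a small code fragment (just the figures from this post, nothing more):

```cpp
// The dwell-vs-RPM arithmetic above, written out (illustration only):
//   4000 RPM -> 4000 / 60 = 66.7 rev/s -> 1 / 66.7 = 0.015 s = 15 ms per rev
//   7500 RPM -> 7500 / 60 = 125  rev/s -> 1 / 125  = 0.008 s =  8 ms per rev
// With a 4 ms dwell that leaves 11 ms spare at 4000 RPM and 4 ms spare at 7500 RPM.
unsigned long revPeriodMs(unsigned long rpm) { return 60000UL / rpm; }  // 60,000 ms per minute
```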

When the engine is running, the crankshaft rotates and the magnetic sensor produces a reference pulse each revolution. These are fed to the microcontroller, which records the time of each reference to microsecond accuracy on an internal clock. From these times the RPM is calculated and the required timing advance angle is derived. The actual time of the next spark is calculated, and when the clock reaches this value, the pair of plugs fire. For starting, the priority is to prevent kickback and ensure a big spark. To achieve this the spark occurs at 3 degrees AFTER TDC. The active edge of the magnet is physically installed at this point and this generates the spark immediately. Both spark plugs are fired in my twin engine, thus one spark is “wasted” as it occurs at the end of the exhaust stroke, and so has no effect. The benefit of this is that no distributor is required and the accuracy of the spark timing is improved, as the timing is taken directly from the crank, not through the gear driven camshaft. The microcontroller does a number of other things as well as controlling the main timing process.
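A rough Arduino-style sketch of that flow is below, purely for illustration - it is not my actual firmware, and the pin numbers and the stand-in advance function are placeholders:

```cpp
// Rough Arduino-style illustration of the flow described above - NOT the actual
// firmware. The Hall reference edge sits at 3 degrees ATDC; each edge is
// timestamped, the period gives the RPM, and the next spark is scheduled from it.

const byte HALL_PIN = 2;   // placeholder pin assignments
const byte COIL_PIN = 9;

volatile unsigned long lastRef = 0;   // micros() at the last reference pulse
volatile unsigned long period  = 0;   // duration of the last revolution (us)
volatile bool newRef = false;

void hallISR() {
  unsigned long now = micros();
  period  = now - lastRef;
  lastRef = now;
  newRef  = true;
}

// stand-in advance map: a flat 10 degrees BTDC (the real schedule is in the next paragraph)
unsigned int lookupAdvance(unsigned long rpm) { (void)rpm; return 10; }

unsigned long sparkAt = 0;
bool sparkPending = false;

void setup() {
  pinMode(HALL_PIN, INPUT);
  pinMode(COIL_PIN, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(HALL_PIN), hallISR, RISING);
}

void loop() {
  if (newRef && period > 0) {
    newRef = false;
    unsigned long rpm      = 60000000UL / period;   // 60e6 microseconds per minute
    unsigned long usPerDeg = period / 360;
    // the reference is 3 deg ATDC, so the next TDC is 357 deg away;
    // firing "advance" degrees early means (357 - advance) degrees after the edge
    sparkAt = lastRef + (357UL - lookupAdvance(rpm)) * usPerDeg;
    sparkPending = true;
  }
  if (sparkPending && (long)(micros() - sparkAt) >= 0) {
    digitalWrite(COIL_PIN, HIGH);   // trigger the spark (dwell control omitted for brevity)
    digitalWrite(COIL_PIN, LOW);
    sparkPending = false;
  }
}
```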

The spark timing is as follows: from 0 to 500 RPM the spark is fired 3 degrees after TDC. From 500 RPM to 1000 RPM the spark is advanced to 7 degrees before TDC. From 1000 RPM to 3000 RPM the timing is advanced linearly from 7 to 25 degrees before TDC. From 3000 RPM to 5000 RPM the timing is held at 25 degrees before TDC. Above 5000 RPM the advance is removed and the spark is set back to TDC, which acts as an over-rev limiter of sorts.
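Written out as code, that schedule looks something like this (my own paraphrase of the numbers above, not the actual firmware table):

```cpp
// The advance schedule above, expressed as a function of RPM.
// Returns degrees BEFORE TDC; a negative value means after TDC.
int sparkAdvanceDeg(unsigned int rpm) {
  if (rpm < 500)  return -3;                                          // cranking: 3 deg ATDC
  if (rpm < 1000) return 7;                                           // 7 deg BTDC
  if (rpm < 3000) return 7 + (int)((rpm - 1000UL) * 18UL / 2000UL);   // 7 -> 25 deg linearly
  if (rpm < 5000) return 25;                                          // held at 25 deg BTDC
  return 0;                                                           // above 5000 RPM: back to TDC
}
```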

The microcontroller uses at minimum a 25 microsecond time base, which equates to one degree timing resolution up to 6667 RPM.
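In case it helps to see where that figure comes from, a quick check (my own arithmetic written as code, nothing more):

```cpp
// Sanity check on the "25 us = 1 degree at 6667 RPM" figure:
// one crank degree takes (60,000,000 / RPM) / 360 microseconds.
constexpr double usPerDegree(double rpm) { return 60.0e6 / rpm / 360.0; }
// usPerDegree(6667) ~= 25.0 us, so a 25 us tick is about one crank degree at 6667 RPM;
// at lower RPM each tick is a smaller fraction of a degree (finer resolution).
```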

Does this answer your question?
It seems you're forgetting one thing - the amount of time the uC takes to calculate all the variables and do the lookup/s. This time has to be measured and subtracted from the delay calculation for accurate ignition timing. I have done this both by counting clock ticks (at 16 MHz) and by working in microseconds; this is where a test bench and a scope come in handy. For those that don't know, the uC can only make delays to set the timing: less delay advances the timing and, vice versa, more delay retards it - it cannot predict the future. I try to set the trigger pulse between 45 and 90 degrees BTDC to give the uC enough time, depending on the application.
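In generic form the correction looks something like this (not my actual code, just the shape of the calculation; the names are made up for illustration):

```cpp
// Generic form of the delay calculation described above (all times in microseconds).
// processing_us is MEASURED on the bench with a scope; it is not guessed.
unsigned long sparkDelay(unsigned long period_us,      // time of the last revolution
                         unsigned int  trigger_btdc,   // where the trigger sits, deg BTDC
                         unsigned int  advance_btdc,   // desired spark advance, deg BTDC
                         unsigned long processing_us)  // ISR entry + math + lookup time
{
  unsigned long usPerDeg = period_us / 360;
  unsigned long delay_us = (unsigned long)(trigger_btdc - advance_btdc) * usPerDeg;
  return (delay_us > processing_us) ? delay_us - processing_us : 0;
}
```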

I'm not sure what uC you are talking about, but a time base of 25 msec equates to a 40 kHz uC?
With a UNO-R3 running at 16 MHz the time base is 62.5 ns between clock ticks. Using an interrupt, as far as I know it takes about 5 clock cycles just to enter the interrupt handler before any of your code executes. This is where you need to start a timer when the trigger pulse is seen and stop it when the interrupt exits, then take the number of counts and convert it to time if you wish. You then subtract this from your ignition delay (the actual ignition timing). I use clock counts to control my ignition time delay. All of this (trigger pulse width = RPM, calculation time, and timing lookup) must be measured as processing time. If you only have one coil firing per rev, then add the coil charging time to your processing time to get your maximum RPM, i.e. calibration.
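One rough way to measure that on a 328P-based UNO is to let Timer1 free-run at the full 16 MHz and read the count before and after the work - something along these lines (a bench-test sketch for illustration, not my ignition code):

```cpp
// Bench-test sketch for an ATmega328P (UNO R3): Timer1 free-runs with no prescaler,
// so each count is one 16 MHz clock, i.e. 62.5 ns. Illustration only.
// Note: this does not include the ~5-cycle interrupt entry overhead mentioned above.

void setup() {
  Serial.begin(115200);
  TCCR1A = 0;             // Timer1 in normal (free-running) mode
  TCCR1B = _BV(CS10);     // prescaler = 1, timer clocked at 16 MHz
}

void loop() {
  uint16_t start = TCNT1;

  // --- put the work to be timed here: RPM math, table lookup, etc. ---
  volatile unsigned long rpm = 60000000UL / 15000UL;   // stand-in calculation
  (void)rpm;

  uint16_t ticks = TCNT1 - start;    // valid for intervals shorter than ~4 ms

  Serial.print(ticks);
  Serial.print(" ticks = ");
  Serial.print(ticks * 62.5);
  Serial.println(" ns");
  delay(500);
}
```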

Calculating the RPM and firing delays is not that easy. Microchip has a microcontroller family, the PIC16LF1619, that was designed to use angular velocity to do the RPM and timing processing, but it does not use pulse width; it instead uses the time between two trigger pulses. It was only designed for single-cylinder engines or wasted spark on twins. They have demo software available, but their IDE has a big learning curve compared to Arduino. On RCGroups we used the PIC12LF1840 (16 MHz) and got 11,000 RPM (single cylinder) out of it with two ignition curves stored on it. Programmable-Open-Source-CD-Ignition-PIC1840

Anyway, that's my 2 cents worth for now and I hope it helps. Oh, the Excel spreadsheet I have included is for a V8 with triggers every 90 degrees, but it still shows what is going on.

Cheers
Ray
 

Attachments

  • Count to RPM to Degrees Per Second (version 1).xls
    762 KB
Perfectly - thanks!

It looks like this would be easy to implement as an interrupt-driven process that simply wakes up the uC every time it sees the reference pulse and calculates the delay until it needs to turn the coil back on for the next planned plug firing.

For some reason I was imagining the calculation of the anticipated delay to be harder than this - I'm not sure why my brain wanted to use a sliding average over several revolutions...

One question: It seems like the straightforward way to implement this strategy would result in missing (firing extremely late) on the first revolution after the very first successful "pop" when starting the engine: after the first cylinder fires for the first time while cranking, the next revolution will be much faster than anticipated from the previous one. At least in theory, this could result in the uC not getting around to turning on power to the coil until after TDC comes around and it should have fired.

At least conceptually, this seems like it would limit your acceleration - below 500 RPM, if the next revolution comes around more than 4 ms sooner than the previous one predicted, the coil turn-on would happen after the plug should already have fired?

I take it that in practice, this isn't a problem?
Using an interrupt stops the uC from executing any other code until the interrupt is finished. Calculating the ignition delay/timing isn't that easy - trust me, I've been there and done that. The actual processing time must be found out (measured), and it never changes unless the code changes; a scope really helps for this stuff. Otherwise it's just guesswork and a lot of playing around with the code. For example, if you want 10 degrees BTDC you need a degree timing wheel, a timing light and a pointer. So now you start the engine (hopefully) and find that the timing is 5 degrees BTDC, and you have to take some delay out. Do you use time or clock pulses, and by how much?

Are you using pulse width or the time between triggers to calculate RPM? Are you using the leading edge of the trigger or the trailing edge? If you use the trailing edge, that time gets shorter as RPM increases, throwing off the timing - an unintended auto-advance. Magnet strength, the type of Hall-effect sensor, and the distance between them will also change the timing. Using the leading edge means that, so long as the trigger source devices don't change, the moment the trigger happens won't change. Using the pulse width means the timing is based on that revolution (the end of it); using the time between pulses means the timing is late by one revolution. Either way your timing will only be out by one rev - reaction, not forecasting.
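To put a number on that 10-vs-5 degree example (hypothetical figures, assuming 3000 RPM):

```cpp
// Hypothetical worked example: the timing light shows the spark 5 degrees late at 3000 RPM.
const unsigned long rpm        = 3000;
const unsigned long usPerDeg   = 60000000UL / rpm / 360UL;  // 55 us per crank degree (55.6 exact)
const unsigned long errorDeg   = 5;
const unsigned long correction = errorDeg * usPerDeg;       // about 275 us of delay to remove
const unsigned long ticks16MHz = correction * 16UL;         // about 4400 clock counts at 16 MHz
```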

Cheers
Ray
 
Using an interrupt stops the uC from executing any other code until the interrupt is finished.
Or until interrupts are re-enabled. In real-time systems that I have worked on, one either makes the interrupt routine very short and quick, or one implements a flag to prevent re-entry and one re-enables interrupts so that other routines can run as needed.
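A bare-bones sketch of that pattern (in Arduino C++ for the sake of this thread; pin 2 is just an assumption):

```cpp
// Keep the ISR tiny: grab a timestamp, set a flag, get out. The main loop does the work.
volatile bool triggerSeen = false;
volatile unsigned long triggerTime = 0;

void hallISR() {
  triggerTime = micros();
  triggerSeen = true;
}

void setup() {
  pinMode(2, INPUT);
  attachInterrupt(digitalPinToInterrupt(2), hallISR, RISING);
}

void loop() {
  if (triggerSeen) {
    noInterrupts();                 // copy the shared variables atomically
    unsigned long t = triggerTime;
    triggerSeen = false;
    interrupts();
    // ...the longer calculations run here, with interrupts enabled the whole time...
    (void)t;
  }
}
```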
 
Or until interrupts are re-enabled. In real-time systems that I have worked on, one either makes the interrupt routine very short and quick, or one implements a flag to prevent re-entry and one re-enables interrupts so that other routines can run as needed.
You're correct. :)
I'm still trying to learn how to do interrupts with dual and quad cores. I haven't had time to figure that one out yet. Hopefully I can have one core handle some interrupts and the other core handle some other ones. Any ideas?

Cheers
Ray
 
Using an interrupt stops the uC from executing any other code until the interrupt is finished.
As I imagine you already know, that's why interrupts are kept as short as possible.
As for delays, see the Arduino "Blink Without Delay" example.
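For anyone who hasn't seen it, the whole trick in that example is non-blocking timing with millis(); something like:

```cpp
// The "Blink Without Delay" idea: poll millis() instead of blocking in delay().
const unsigned long intervalMs = 1000;
unsigned long lastToggle = 0;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  if (millis() - lastToggle >= intervalMs) {
    lastToggle += intervalMs;
    digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN));  // toggle the LED
    // the loop never blocks, so other work (e.g. ignition scheduling) can run here
  }
}
```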
 
You're correct. :)
I'm still trying to learn how to do interrupts with dual and quad cores. I haven't had time to figure that one out yet. Hopefully I can have one core handle some interrupts and the other core handle some other ones. Any ideas?

Cheers
Ray
Great question, Ray. I have to confess that all of my microcontroller experience is with single-core processors, not multi-core. I find Andreas Spiess ("the guy with the Swiss accent") on YouTube to be a good resource; sometime back he dug into the dual cores of the ESP32 a bit. As I recall, the standard configuration of the ESP32 uses one core to handle the wifi / bluetooth, and leaves the other core free to do whatever is needed - but it is possible to schedule tasks on the wifi/bluetooth core as well. The key - if I remember correctly, which is not at all assured - is that there is actually a mini-RTOS running on the ESP32 behind the scenes.

Extrapolating from that fuzzy memory, I wonder if the key to using multi-core processors as microcontrollers might be some sort of RTOS to allow you to specify exactly which cores handle which tasks - ?? The more I think about it, I'm not sure how you would achieve determinacy with multi-cores without an RTOS to help. But again I stress - none of what I've said is based on personal experience, so please apply a liberal helping of NaCl!
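Digging into that fuzzy memory a bit further: on the ESP32 under the Arduino core, the FreeRTOS call for pinning a task to a particular core is xTaskCreatePinnedToCore(). Something like the sketch below - but this is written from memory, so please verify the details against the Espressif docs before trusting it:

```cpp
// ESP32 / Arduino core: create a FreeRTOS task pinned to a specific core.
// Sketch from memory - verify against the Espressif documentation.

void timingTask(void *param) {
  (void)param;
  for (;;) {
    // time-critical work would go here
    vTaskDelay(1);    // hand control back to the scheduler
  }
}

void setup() {
  Serial.begin(115200);
  // arguments: task function, name, stack size, parameter, priority, task handle, core id
  // core 0 normally runs the WiFi/Bluetooth stack; core 1 runs the Arduino loop task
  xTaskCreatePinnedToCore(timingTask, "timing", 4096, nullptr, 2, nullptr, 1);
}

void loop() {
  delay(100);         // housekeeping can carry on here
}
```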
 
Great question, Ray. I have to confess that all of my microcontroller experience is with single-core processors, not multi-core. I find Andreas Spiess ("the guy with the Swiss accent") on YouTube to be a good resource; sometime back he dug into the dual cores of the ESP32 a bit. As I recall, the standard configuration of the ESP32 uses one core to handle the wifi / bluetooth, and leaves the other core free to do whatever is needed - but it is possible to schedule tasks on the wifi/bluetooth core as well. The key - if I remember correctly, which is not at all assured - is that there is actually a mini-RTOS running on the ESP32 behind the scenes.

Extrapolating from that fuzzy memory, I wonder if the key to using multi-core processors as microcontrollers might be some sort of RTOS to allow you to specify exactly which cores handle which tasks - ?? The more I think about it, I'm not sure how you would achieve determinacy with multi-cores without an RTOS to help. But again I stress - none of what I've said is based on personal experience, so please apply a liberal helping of NaCl!
Same here, all my experience has been with single cores. I have an RTOS that I had a hell of a time getting from STMicro, but it is the only one certified by usULc that is free for life-critical operations. I have been reading up on it and I think you're right that an RTOS is needed for multi-core processing. Most of my experience with multi-cores has been with PCs. They generally turn on an extra core when the first (main) core reaches about 60% usage, and so on for additional cores. I have noticed my new AMD Ryzen 7 turns on whichever core other than the main core (#1) has the lowest usage. From what I have read, this STMicro RTOS lets you choose which core runs which code. You still have a main core running the main code; it talks to the second core and combines any results from the second core back into the main-core code for further processing - I believe that's how it works. Once again, take with some salt. Not sure yet which ARM processor I will be using. Also, most RTOSes I have checked out have a maximum cert rating for appliances, LOL.

Cheers
Ray
 
The electronic ignition I am working on is a simple embedded solution, no RTOS. I have an analog circuit that uses the Hall sensor to create a 4 ms pulse, firing the spark plug about 3 degrees after top dead center on the compression stroke. When the engine is running below 500 RPM (for example, when starting), this is passed straight to the COP and fires the plug - no software in the loop. I use two interrupts with just a few lines of code in them. One is triggered by the Hall sensor and the other is triggered by a timer and is used to position and time the trigger pulse. I remain in the time domain and do no calculations in inverse time; that is, I measure the time between events and do not use RPM. The main loop uses a lookup table to generate the timing of the pulse depending on the time of one revolution (I actually take a running average of several revolutions, minor detail).

Post #118 describes my approach. Also be careful not to confuse microseconds with milliseconds.
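Purely to illustrate the time-domain idea, a paraphrase in Arduino-style C++ (not my actual code - the table values and the pin number are made up):

```cpp
// Illustration only. Everything stays in the time domain: the measured revolution
// time (averaged over a few revs) indexes straight into a delay lookup - no RPM math.

volatile unsigned long lastEdgeUs = 0;
volatile unsigned long periodUs   = 0;

void hallISR() {
  unsigned long now = micros();
  periodUs   = now - lastEdgeUs;
  lastEdgeUs = now;
}

// crude 4-point running average of the revolution time
unsigned long averagePeriod(unsigned long latest) {
  static unsigned long hist[4] = {0, 0, 0, 0};
  static byte idx = 0;
  hist[idx] = latest;
  idx = (idx + 1) & 3;
  return (hist[0] + hist[1] + hist[2] + hist[3]) / 4;
}

// toy lookup: delay from the reference edge as a fraction of the revolution time
// (slower engine -> longer delay -> less advance; real values would come from a table)
unsigned long delayFromPeriod(unsigned long p) {
  if (p > 120000UL) return p;                 // below ~500 RPM the analog path handles it
  if (p > 60000UL)  return p * 96UL / 100UL;  // a little advance between 500 and 1000 RPM
  return p * 90UL / 100UL;                    // more advance above that
}

void setup() {
  pinMode(2, INPUT);
  attachInterrupt(digitalPinToInterrupt(2), hallISR, RISING);
  // the second interrupt (a timer compare) would fire the coil after the computed delay
}

void loop() {
  noInterrupts();
  unsigned long p = periodUs;   // atomic copy of the shared value
  interrupts();
  unsigned long d = delayFromPeriod(averagePeriod(p));
  (void)d;                      // in the real thing this would be loaded into the timer
}
```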
 
The electronic ignition I am working on is a simple embedded solution, no RTOS. I have an analog circuit that uses the Hall sensor to create a 4 ms pulse, firing the spark plug about 3 degrees after top dead center on the compression stroke. When the engine is running below 500 RPM (for example, when starting), this is passed straight to the COP and fires the plug - no software in the loop. I use two interrupts with just a few lines of code in them. One is triggered by the Hall sensor and the other is triggered by a timer and is used to position and time the trigger pulse. I remain in the time domain and do no calculations in inverse time; that is, I measure the time between events and do not use RPM. The main loop uses a lookup table to generate the timing of the pulse depending on the time of one revolution (I actually take a running average of several revolutions, minor detail).

Post #118 describes my approach. Also be careful not to confuse microseconds with milliseconds.
Right - we are getting a bit far afield / overkill with multi-core RTOS solutions. I see no reason a basic Arduino couldn't handle this task just fine.

I do wonder if the hybrid analog / digital approach you describe may be less simple than a purely digital approach ... maybe ?
 
Yeah, you are probably right. I built the analog circuitry first, then decided to use a microcontroller. When it is all said and done, the majority of the analog circuitry may not be needed.
Yes, so true. I recently ran into the analog vs micro dilemma in an air cylinder project. At first, there was just going to be one precisely timed blast of high pressure air. Then it snowballed into a purge cycle, and then a dwell, and then a return. All was done with 555 timers, hooked in series, with individual and combined outputs to MOSFETs to drive the solenoid valves. With trim pots at each of the 555's, the timing can be controlled, but in the long run, a micro probably would have been easier. But then again, the timing can be tweaked with a simple screwdriver.

But my real dilemma was that my last micro work was with a Parallax Stamp over 10 years ago (which I loved), and now (at 71) the micro technology is light years ahead of that and I don't know where to start. Yet, the Stamp was so simple to program and use. My needs are very Basic (yes, pun intended), so what do I try that has the easiest learning curve, with no overkill on capability?
Thanks, Lloyd
P.S. This thread has taken many twists and turns, so I hope my question is in tune with the spirit.
 
Not sure if this is relevant to the mention of the spark pulses affecting the electronics, but it seems that some kind of shielding would be helpful.
I was working on a different kind of project a few years ago and had to prove to myself that the EMI was causing a problem.
Lloyd

Edit: But it just occurred to me that the spark doesn't seem to be affecting the 555 that is attached to the coil.
I don't know.

 
Yes, so true. I recently ran into the analog vs micro dilemma in an air cylinder project. At first, there was just going to be one precisely timed blast of high pressure air. Then it snowballed into a purge cycle, and then a dwell, and then a return. All was done with 555 timers, hooked in series, with individual and combined outputs to MOSFETs to drive the solenoid valves. With trim pots at each of the 555's, the timing can be controlled, but in the long run, a micro probably would have been easier. But then again, the timing can be tweaked with a simple screwdriver.

But my real dilemma was that my last micro work was with a Parallax Stamp over 10 years ago (which I loved), and now (at 71) the micro technology is light years ahead of that and I don't know where to start. Yet, the Stamp was so simple to program and use. My needs are very Basic (yes, pun intended), so what do I try that has the easiest learning curve, with no overkill on capability?
Thanks, Lloyd
P.S. This thread has taken many twists and turns, so I hope my question is in tune with the spirit.
Lloyd, a couple of options to consider if you want to wade back into the microcontroller waters.

1) Arduino - which is not just a microcontroller, but a very broad and deep ecosystem. What I mean by that is, if you learn to use the Arduino IDE with a simple low-end 8-bit Arduino Nano, you can use the exact same IDE with a higher end STM32 32-bit ARM microcontroller, or an ESP32, or ... the list goes on. The ecosystem includes extensive libraries, so you do not have to re-invent the wheel. Want to include an LCD screen interface? There's a library to handle that. How about a temperature sensor? There are both libraries and auxiliary boards ready to go. And so on. If you go the standard Arduino route, you will be programming in C++, which may sound daunting ... but I am confident, if you have used a Stamp, you will be able to get the hang of it quickly. Arduino really aims to support beginners as well as more sophisticated users. And likely you will find it easier in many ways than the microcontrollers of years gone by - just plug the Arduino into a USB port and away you go.

2) Adafruit - which is a company that specializes in hobby-level microcontrollers and accessories. They support Arduino (and Raspberry Pi) as well, but they also produce a line of "Circuit Python" controllers, which, as the name suggests, allow you to program in Python. I have to confess that I have not used Python, but it is a popular language - and I suspect it may be closer to BASIC than C++ will be. Like the typical BASIC implementation, Python is interpreted rather than compiled. However, unlike the typical BASIC implementation, Python *really* cares about indents. And that's about as much as I know, so I'll stop there, except to say, as I recall, Adafruit has a wide variety of libraries to support the wide variety of sensors and adapters that are designed to work with their boards.

There are others, of course, but these are the two that I perceive to be most beginner- or getting-back-into-it-friendly. I have used the Arduinos a fair bit, and like them very much; I have not used the Circuit Python boards.
 
Lloyd, a couple of options to consider if you want to wade back into the microcontroller waters.

Andy, thanks, that is the kind of answer I was hoping to get. The electronics is enjoyable, and clean and quiet, and can be done whenever I feel like it. The PBASIC of the Stamp was easy to learn because it was so much like Fortran (wow! throwback). If the Arduino IDE is as logical as Basic, it could definitely be a winner for me. Time to dive back in. I will have to look for some kind of "hobbyist starter kit" for Arduino, where a lot of the overwhelming head-scratching decisions have already been made.
 
Here you go: https://www.amazon.com/ELEGOO-Project-Tutorial-Controller-Projects/dp/B01D8KOZF4/ :)

No affiliation, and no experience with this product - this is simply the first link that popped up on Amazon when I searched for "Arduino starter kit" - there were many others, ranging upwards in price.

One nice thing with the Arduinos (and the Adafruit offerings as well) is that you can start experimenting with nothing more than just the board. You can write some simple routines to flash the onboard LEDs to get you started, and all you need is a USB connection to your computer. You can either download the Arduino IDE and run it on your computer (my preference) or run it in the cloud from their website.

But of course, the starter kit is attractive because it includes documentation, peripherals, parts, and instructions to walk you through using them. Not only may that help move you up the learning curve more quickly, but it might also save you from burning out a board. I won't say that *I* have ever done this, but I have a "friend" who tried to hook up a circuit to an Arduino, only to see the magic smoke depart. Ahem. Fortunately, Arduino boards tend to be inexpensive - probably less so at the moment, but at least a couple of years ago I could pick up 5 Arduino Nano boards for $10. At that price, it's not too painful if an experiment goes awry ... :)
 
Awake,

I just ordered one based on your description and what I read in the reviews. I can barely send an email without my wife's help, but I would like to know more about this stuff. My ultimate goal is to assemble a small rotary table that is indexable via the computer. May post an update in the future, but only if successful!
 
If we can help, don't hesitate to ask. I'm not the brightest bulb in the chandelier (see reference to magic smoke above), but there are many here who clearly know what they are doing. :)
 
Andy,
Each of those "smoke" eruptions is just another learning experience, just like each "close call" in the shop makes us take shop-safety more seriously.

Vietti,
When I first got into the Parallax Stamp micro over 10 years ago, I was totally clueless. But I worked with a bunch of really sharp people and was explaining to one of the EEs what I was trying to do at home. He said, no big deal, the Stamp will do what you need, I will help you get started. He'd give me some hints, and I'd experiment at home. I'd come to work with questions, he'd give me some more answers and suggestions, and before I knew it, it started making sense. I think those EEs had been sand-bagging all along, making us mere mortals think that there was some sort of voodoo in what they were doing. ;) And the programmer weenies were even worse, with their secret lingo like proffy bus and interrupts and clock speeds. Ha ha. Nope, mostly just common sense, with maybe a teeny-tiny bit of voodoo thrown in.
Lloyd
 
