Perhaps someone with EDC16 experience can help a noob out on this one.
I have logged a WOT run using HDS.
I logged rpm, SOI, duration, IQ, rail pressure (bar) and rail duty.
Comparing against the SOI tables, the data is bang on expected, and the axes are mg/str (factor 0.01) vs rpm, with the output in crank degrees (factor 0.23437).
Most tables are IQ vs rpm.
The problem is the duration table.
If the axes are mg/str (0.01) vs bar (0.1), output in microseconds (1.0)... it seems way out.
If the axes are mm3 (0.01) vs bar (0.1), output in microseconds (1.0), then (assuming ~1.2 mm3 per mg) it looks much closer.
I thought EDC16 used either mg/str or mm3, not both?
Should I be looking for a fuel temp vs mg-to-mm3 conversion table?
All points were logged at 1600 bar rail pressure:
rpm    logged     expected (table @ mg axis)
4613   1000 us    700 us @ 42.0 mg
4500   1050 us    800 us @ 45.5 mg
4000   1275 us    975 us @ 52.5 mg
3500   1375 us    1050 us @ 57.0 mg
3000   1425 us    1200 us @ 63.0 mg
2500   1500 us    1200 us @ 64.0 mg