
View Full Version : Voltage offsets in analog input V2



joeygc5
January 13th, 2012, 11:20 AM
I upgraded my V2 cable last month to the newest V7 and V8 software releases. After getting all drivers and configurations correct I noticed that my analog input #1 wasn't reading correctly compared to the Prologger output. So I followed the instructions for calibration and verified that inputs 2 and 4 are reading within 0.05VDC of the supplied potential. But inputs #1 and 3 have a cascading effect: for instance, applying 1.52VDC registers 1.72VDC on the scanner regardless of the calibration offset adjustment, and supplying 4.76VDC registers 5.68VDC, again regardless of offset adjustment. Lately I've just been monitoring input #2 and would really like to get this issue resolved, whether it be hardware or software related. Voltages were verified using my 3-month-old Fluke ScopeMeter and a precision 0-5VDC analog supply, by the way. Any suggestions?

joecar
January 13th, 2012, 12:29 PM
Hi Joey,

Which wideband do you have...? If it is supported by V2 serial comms, then you would be better off connecting using serial comms.

joeygc5
January 14th, 2012, 06:30 AM
I use a "Prologger" brand wideband/boost controller

http://www.prologger.com/

mr.prick
January 14th, 2012, 11:54 AM
V2 A/D Calibration - FAQ (forum.efilive.com/showthread.php?4593-V2-A-D-Calibration-FAQ)

joeygc5
January 15th, 2012, 10:13 AM
V2 A/D Calibration - FAQ (forum.efilive.com/showthread.php?4593-V2-A-D-Calibration-FAQ)

I'm glad you read my post, where it states I followed the calibration procedure. I didn't have this issue before the update, so I was curious if there was a step I was missing. I've tried reinstalling the newest Ver 7 and Ver 8 with no difference in the analog inputs from what is stated above. Previously I had no issues with analog inputs 1-4 until the updates.

mr.prick
January 15th, 2012, 12:26 PM
I'm glad you read my post, where it states I followed the calibration procedure. I didn't have this issue before the update, so I was curious if there was a step I was missing. I've tried reinstalling the newest Ver 7 and Ver 8 with no difference in the analog inputs from what is stated above. Previously I had no issues with analog inputs 1-4 until the updates.

I guess it was a little condescending of me to add the link. :ermm:
2 & 4 are good after re-calibrating or were always right?

There have been instances where a firmware update has "silenced" one of the A/D inputs.
This is the only post on this so far but stranger things have happened.

Have you tried viewing the voltage and calibrating the ADs through the V2 alone without the software?
Is the software showing incorrect voltage or the V2 or both?

What is the firmware version & software build?

joeygc5
January 15th, 2012, 07:26 PM
I guess it was a little condescending of me to add the link. :ermm:
2 & 4 are good after re-calibrating or were always right?

There have been instances where a firmware update has "silenced" one of the A/D inputs.
This is the only post on this so far but stranger things have happened.

Have you tried viewing the voltage and calibrating the ADs through the V2 alone without the software?
Is the software showing incorrect voltage or the V2 or both?

What is the firmware version & software build?

Version is 7.5.7, build 180.
Analog 2 has always worked correctly; I never needed analog 3 or 4 before. However, before the update analog input 1 always read the true value within +/-0.010 volts of the input.

No, I haven't tried using the V2 without the software to verify the condition. The software is showing the correct voltage for inputs 2 and 4. For inputs 1 and 3 there is a cascading or pull-up effect, where the displayed value climbs roughly 0.2VDC for every volt applied. So applying a verified 4.76VDC during calibration shows a value of 5.68VDC. During logging I'm showing over a full AFR point higher than my wideband display/output when using input 1 or 3. I haven't had time to mess with this since my last dyno session. I was just curious whether this problem has ever been seen before, or if maybe I have a hardware/device issue.

Thank you for any insight by the way

Tordne
January 17th, 2012, 01:14 PM
There is a hidden debug option on the FlashScan V2 itself. It is accessed by pressing F4->F4->F4 on the keypad (the last F4 option will not be shown in the menu list).

It would be "interesting" to see what the device itself is reporting. Paul, who does the firmware programming, is on holiday for a couple of weeks, so if needed he would have to look at it further when he returns.

isak81
January 17th, 2012, 03:08 PM
I have this same problem with my AD1. Now I'm going out to check my other inputs.
I'm glad it's not just me.

joeygc5
January 18th, 2012, 03:42 AM
There is a hidden debug option on the FlashScan V2 itself. It is accessed by pressing F4->F4->F4 on the keypad (the last F4 option will not be shown in the menu list).

It would be "interesting" to see what the device itself is reporting. Paul, who does the firmware programming, is on holiday for a couple of weeks, so if needed he would have to look at it further when he returns.
I'll try the device-only F4-F4-F4 procedure while looking through the sub-menus and report back.

Blacky
January 24th, 2012, 12:50 PM
I upgraded my V2 cable last month to the newest V7 and V8 software releases. After getting all drivers and configurations correct I noticed that my analog input #1 wasn't reading correctly compared to the Prologger output. So I followed the instructions for calibration and verified that inputs 2 and 4 are reading within 0.05VDC of the supplied potential. But inputs #1 and 3 have a cascading effect: for instance, applying 1.52VDC registers 1.72VDC on the scanner regardless of the calibration offset adjustment, and supplying 4.76VDC registers 5.68VDC, again regardless of offset adjustment. Lately I've just been monitoring input #2 and would really like to get this issue resolved, whether it be hardware or software related. Voltages were verified using my 3-month-old Fluke ScopeMeter and a precision 0-5VDC analog supply, by the way. Any suggestions?

The relationship between the input voltage and the A/D count is linear, so the firmware scaling routines in the V2 should allow you to calibrate the A/D inputs to overcome what you described.
In fact, what you are seeing is probably caused by an incorrect calibration factor.

For example if the raw A/D counts were as follows (these are not real figures, I just pulled them out of the air):
- applying 1500mV the A/D count is 250
- applying 4500mV the A/D count is 700

To scale that into a 0-5V range the calibration factor and offset would need to be set to:
factor = (4500-1500)/(700-250) = 6.666667
offset = 1500-(250*6.6667) = -166.6667

So when 1500mV is applied to the A/D input, it would generate an A/D count of 250 which is converted to mV using the calibration equation:
250*6.666667-166.6667 = 1500mV
And when 4500mV is applied to the A/D input, it would generate an A/D count of 700 which is converted to mV using the calibration equation:
700*6.666667-166.6667 = 4500mV

If the factor were set just a little too high, say 7.0 instead of 6.666667, the results would be:
250*7.0-166.6667 = 1583mV
700*7.0-166.6667 = 4733mV
which is similar to what you describe happening.

Note: the factors and offsets are configured per A/D input which could explain why some are accurate and some are not.
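The two-point calibration above can be sketched in a few lines of Python. This is illustrative only: the A/D counts (250 and 700) are the made-up figures from the example, not real hardware values, and the function names are mine, not anything in the EFILive software.

```python
# Linear two-point A/D calibration, as described above.
# Counts and voltages are the example's made-up figures.

def fit_calibration(mv_lo, count_lo, mv_hi, count_hi):
    """Derive factor and offset from two known (mV, count) pairs."""
    factor = (mv_hi - mv_lo) / (count_hi - count_lo)
    offset = mv_lo - count_lo * factor
    return factor, offset

def counts_to_mv(count, factor, offset):
    """Convert a raw A/D count to millivolts."""
    return count * factor + offset

factor, offset = fit_calibration(1500, 250, 4500, 700)
# factor ~ 6.666667, offset ~ -166.6667

counts_to_mv(250, factor, offset)   # ~ 1500 mV
counts_to_mv(700, factor, offset)   # ~ 4500 mV

# A slightly-too-high factor reproduces the reported drift,
# growing with the applied voltage:
counts_to_mv(250, 7.0, offset)      # ~ 1583 mV
counts_to_mv(700, 7.0, offset)      # ~ 4733 mV
```

Because the error from a bad factor grows with the input (while a bad offset shifts every reading by the same amount), readings that drift further off as voltage rises point at the factor, which matches the symptom in this thread.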

Regards
Paul