Hi,
since moving to V3 (3.030) my ECUMaster oil temp sensor no longer works correctly. It shows -26°C when connected (16°C ambient) but changes to 100°C once I unplug it. It's connected to pin B17. I've also noticed the calibration isn't the same as in the old V2 software (the V2 cal was entered as per the instructions that came with the sensor); in V3 I used the sensor wizard. Maybe I'm doing something wrong, but I don't understand why there is a difference between the two calibrations if ECUMaster themselves supplied the information and built the software.
Yes, I agree the pull-up is possibly wrong, but isn't the whole idea of the sensor wizard that it does everything for you? I shouldn't have to change any settings.
The pull-up value depends on the user. You should select 330R as advised in the manual, but the value depends on your goal (the range where you want the best precision), so the pull-up can vary.
Also, the default pull-up value is the internal EMU pull-up, which is used for most temperature sensors.
So if 330R is the default pull-up resistance for this sensor, why doesn't the software wizard enter it automatically? The value I get each time is 2200.
It looks like you don't understand the pull-up's role… The pull-up depends on the sensor and on which temperature range matters most to you, because that's where you get the best resolution. So in the case of the ECUMaster sensor used for oil temp, I use a 330R pull-up to get good resolution at higher temperatures; I don't care if the sensor reads less accurately at lower temperatures. But if I'm using the same sensor for coolant temp or power steering, I use a 1k pull-up to get a nicer reading around the normal working temperature, like in the photo below.
Compare the curves below: the same sensor with different pull-ups.
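To put some rough numbers on it, here's a quick Python sketch of the voltage divider the ECU sees. The values are generic assumptions (5V reference, NTC of 2.2k at 25°C with Beta 3950, sensor to ground and pull-up to the reference), not the actual ECUMaster calibration, so treat it as an illustration only:

```python
import math

VREF = 5.0     # assumed reference feeding the pull-up
R25 = 2200.0   # assumed NTC resistance at 25 C (generic, not the ECUMaster cal)
BETA = 3950.0  # assumed Beta coefficient

def ntc_resistance(temp_c):
    """NTC resistance at temp_c using the simple Beta equation."""
    t_k = temp_c + 273.15
    return R25 * math.exp(BETA * (1.0 / t_k - 1.0 / 298.15))

def divider_voltage(temp_c, pullup):
    """Voltage at the ECU input: sensor to ground, pull-up to VREF."""
    r = ntc_resistance(temp_c)
    return VREF * r / (r + pullup)

for t in (20, 60, 100, 140):
    print(f"{t:4d} C   330R: {divider_voltage(t, 330.0):.2f} V   "
          f"2200R: {divider_voltage(t, 2200.0):.2f} V")
```

With these assumed values the 330R curve still swings almost a volt between 100°C and 140°C while the 2200R curve is squashed into a couple hundred millivolts near ground, whereas around 20°C the situation reverses. That's the resolution trade-off described above.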
After changing the wizard's default 2200 ohm pull-up setting for the oil temp cal to 330 ohm, the sensor now reads correctly. Personally, I feel the wizard should enter the default 330R for the oil temp sensor automatically. Anyway, thanks for the help.