
Thread: HR80 Temperature Calibration

  1. #1
    Automated Home Jr Member
    Join Date
    Nov 2014
    Location
    UK
    Posts
    31

    Default HR80 Temperature Calibration

    Hi, I have been looking into calibration of my HR80 radiator valves.

    Following a good deal of work, I have concluded that one of my HR80s is reading 1 C low. I will be looking at my other HR80s once I have an answer to the question of how to calibrate them.

    I have had a look through the instructions and searched on Google, but can't find a procedure to 'calibrate', 'offset' or 'adjust' the measured temperature of an HR80.

    Why adjust at the valve? I am using the Evohome system with multiple indication and control points via the API, e.g.:
    [Attachment: Evohom Custom API 2016-Tiles Theme.jpg]
    Perhaps one leaves the valve controllers alone and there is a setting in the Evohome unit itself?

    Thanks for taking time to read

    Conseils

  2. #2
    Automated Home Legend
    Join Date
    Sep 2014
    Location
    Scotland
    Posts
    2,289

    Default

    The only temperature calibration setting on the Evotouch itself is to calibrate its own internal temperature sensor, this is in the installer menu. There are no settings to calibrate externally reported temperatures here.

    The HR92 has a calibrate option in its own setup menu but I am not familiar with the HR80 - a quick look at the online installer manual for the HR80 does not show a calibration option so I guess you are out of luck.

    By the way you might want to read a post I wrote some time ago that goes into the shortcomings of trying to measure temperature at the radiator valve itself:

    http://www.automatedhome.co.uk/vbull...ll=1#post25191

    If you have a particularly troublesome room where you can't get an accurate temperature reading at the HR80, it's very unlikely to be the HR80 itself at fault; it's simply the setup of the room and the location of the radiator and HR80 relative to other things in the room. In that case a wall-mounted thermostat like a DTS92, mounted away from the radiator, is your solution.

    A remotely mounted wall thermostat is always going to give a much more accurate reading, provided it is located in a sensible place.
    Last edited by DBMandrake; 12th September 2016 at 02:51 PM.

  3. #3
    Automated Home Jr Member
    Join Date
    Nov 2014
    Location
    UK
    Posts
    31

    Default

    Many thanks, I had struggled to find anything on calibrating an HR80.

    Your treatise on room temperature measurement is nicely put together.

    My calibration technique was to place the HR80 in a closed chamber with a calibrated (at 0 and 100 C) type K thermocouple bead taped to the front with Kapton tape, just inside the left vent by the thermistor, above the 'y' in Honeywell.

    The stabilisation period was 2 hours, with samples of measured data taken and averaged from all channels concerned. The reference measurement was by a Fluke 52 twin-channel thermocouple meter, together with two DS18B20 sensors on an ESP8266 (which were also being checked, for exactly the purpose you allude to). Ambient was 21 C for all instrumentation, with at least 1 hour of warm-up, and fresh batteries were fitted in all battery-operated equipment. Measurements were repeated over 5 hours by data logging.

    I see a ~1.5 C difference; at the moment I am assuming an offset rather than a scaling error. It's the discussions with one's partner that get exciting: I set 21 C in my room, but she says it's too warm when she walks in! Well, now I know why... If I have my difference correctly mapped, it is actually controlling to ~22.5 C at the valve. With the thermal gradient in the room, it probably does feel quite a bit warmer than the other rooms.
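    Roughly, in Python, the offset-only assumption works out like this (a sketch with my numbers, not anything from Honeywell):

    ```python
    # Offset-only model: if the valve's sensor under-reads by a fixed amount,
    # the control loop settles when (true temp - offset) equals the set-point,
    # so the room overshoots the set-point by that same offset.
    def actual_room_temp(setpoint_c: float, sensor_offset_c: float) -> float:
        """True controlled temperature for a sensor reading sensor_offset_c low."""
        return setpoint_c + sensor_offset_c

    print(actual_room_temp(21.0, 1.5))  # set 21 C, control to ~22.5 C
    ```
    
    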

    I was expecting no more than 0.5 C error. Of course the actual value is largely irrelevant, until one is trying to compare multiple rooms to 'feel' the same. I will see how some of the other valves perform if I get the chance and this late warm weather holds.

    Having put together my own web-based control and logging system, I can now correct the readings there by including a 'calibration factor', though the ideal would have been to 'tweak' the value read out at the valve itself.
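    The correction in my system amounts to something like this (valve names and offsets below are made-up examples, not my actual zone list):

    ```python
    # Per-valve calibration offsets in degrees C, applied to raw API readings.
    # A valve that reads 1.5 C low gets a +1.5 correction.
    CALIBRATION_OFFSETS = {
        "lounge": 1.5,   # hypothetical: the HR80 measured 1.5 C low
        "bedroom": 0.0,  # hypothetical: reads true
    }

    def corrected_reading(zone: str, raw_reading_c: float) -> float:
        """Apply the stored offset for a zone; unknown zones pass through."""
        return raw_reading_c + CALIBRATION_OFFSETS.get(zone, 0.0)

    print(corrected_reading("lounge", 21.0))  # logs 22.5 for a raw 21.0
    ```
    
    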

    Thanks for taking the time to look into the 'calibrate' question.

  4. #4
    Automated Home Legend
    Join Date
    Sep 2014
    Location
    Scotland
    Posts
    2,289

    Default

    Wow you've really done your homework there and done a thorough job of measuring this.

    It does indeed sound like your HR80's calibration is out - while that seems unusual, it's certainly not impossible, and to be honest I have no direct experience with the HR80 and assumed it to be more similar in functionality to the HR92 than it actually seems to be! Have you tried testing another one under the same conditions to see whether you have just one bad unit or whether they are all a bit out?

    I'm assuming from the screenshot in your initial post that you are polling your API data from the "old" V1 API, which has just recently started providing two-decimal-place, un-biased readings? This is currently the only way (other than an HGI80 interface) that I'm aware of to get "true" temperature readings from the system, rather than readings that are rounded to the nearest half degree and then biased another half degree towards the set-point. Before this change it was very difficult to do an accurate comparison with another thermometer.
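    To illustrate why the old displayed values were hard to compare against a reference, here is one possible reading of the round-then-bias behaviour in Python (my interpretation of what I've observed, not Honeywell's documented algorithm):

    ```python
    # Sketch of the pre-change displayed value: round the true temperature
    # to the nearest 0.5 C, then nudge it 0.5 C towards the set-point.
    # This is an assumed model of the behaviour, for illustration only.
    def displayed_temp(true_c: float, setpoint_c: float) -> float:
        rounded = round(true_c * 2) / 2  # nearest half degree
        if rounded < setpoint_c:
            return rounded + 0.5
        if rounded > setpoint_c:
            return rounded - 0.5
        return rounded

    # A room at 20.3 C with a 21 C set-point could show 21.0 C,
    # masking a 0.7 C shortfall from any reference thermometer.
    print(displayed_temp(20.3, 21.0))
    ```
    
    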

    For what it's worth since this API change occurred I've done some comparisons with a weather station thermometer (inside/wireless outside) that I use as my "reference", which whilst being a bit slow to update, I believe is pretty accurate, down to about 0.1 degrees, and agrees very closely with other digital thermometers I have.

    I have found the DTS92E wall stat (of which I only have one) to be absolutely spot on compared to my reference thermometer - if they are right beside each other they read within 0.1 degrees of each other as polled through the API. The indicated temperature on the display is however somewhat inaccurate due to the rounding and set-point biasing, as is the reading on the evotouch. (And the two often don't quite agree with each other as they seem to round and bias slightly differently ) But the reading from the API is spot on.

    The internal temperature sensor of the Evotouch, when it is on a wall mount, I have found to read almost exactly 1.0 degree higher than reality. I put this down to internal heat generation in the device from the display, CPU, etc. The power dissipation in a DTS92E would be absolutely minuscule, so it would not have this same temperature-rise effect.

    As long as the power dissipation in the evotouch is constant, the temperature offset will also be constant as a constant power dissipation with a constant thermal resistance to the surroundings (case, mounting location etc) will cause a constant rise above ambient temperature. So it should be possible to correct for this error by using the calibration setting in the installer menu. I have this set to -1.0 and measurements I've done suggest that it now agrees with my reference thermometer placed beside it on the wall within about 0.2 degrees or so. (Not quite as consistent as the DTS92E)
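    The constant-rise argument is just the thermal version of Ohm's law; as a sketch (the wattage and thermal resistance below are illustrative numbers, not measured Evotouch figures):

    ```python
    # Constant self-heating: a fixed power dissipation P (watts) through a
    # fixed thermal resistance R_th (C per watt) to the surroundings gives
    # a fixed temperature rise above ambient, regardless of what ambient is.
    def self_heating_rise_c(power_w: float, r_th_c_per_w: float) -> float:
        return power_w * r_th_c_per_w

    # e.g. a hypothetical 0.05 W dissipated through 20 C/W -> a steady
    # 1.0 C rise, which a fixed -1.0 calibration offset can cancel out.
    print(self_heating_rise_c(0.05, 20.0))
    ```
    
    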

    If the batteries are charging, the additional heat generated would throw this calibration off slightly (another forum member has reported this effect), but I find that with mine left on the wall mount it very infrequently charges the battery, so it's not an issue.

    I have found the temperature measurement of the HR92UK also to be very accurate, spot on within about 0.1 degrees as measured through the V1 API, provided that the radiator is completely cold (no localised heating), the reference thermometer is placed right beside the HR92, and the HR92 calibration is set to the default of 0. This suggests to me that, electronically, the HR92s are very accurate at measuring temperature.

    However the cold radiator and measurement location are a massive "gotcha". Just because the sensor itself is very accurate doesn't mean it will necessarily give an accurate indication of the temperature out in the room, for the reasons discussed in the other thread. I find that typically there is about a 1 degree rise in reading over ambient when the radiator is hot due to localised heating. And that is with a good, un-obscured convector radiator. If the radiator is not a convector (old fashioned panels only) or is partially covered with clothing reducing convection, I find that the error can increase to as much as 2-3 degrees. Because of this 1 degree error I have most of my HR92's calibrated to -1. (Unfortunately whilst the calibration of the evotouch sensor can be adjusted in 0.1 degree increments, the HR92 calibration can only be adjusted in 1 degree increments from -3 to +3)
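    Since the HR92 only accepts whole-degree steps, picking a setting from a measured error is just a round-and-clamp; roughly (the measured offsets in the example are hypothetical):

    ```python
    # Snap a desired calibration correction to the HR92's whole-degree
    # steps, clamped to its -3..+3 range (unlike the Evotouch, which
    # adjusts in 0.1 degree increments).
    def hr92_offset(desired_correction_c: float) -> int:
        return max(-3, min(3, round(desired_correction_c)))

    # e.g. a measured ~1.1 degree over-read -> set the HR92 to -1;
    # anything beyond 3 degrees cannot be fully corrected at the valve.
    print(hr92_offset(-1.1))
    ```
    
    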

    Of course with that calibration it will now read 1 degree below the true temperature when the radiator cools down, and also I find that a near-floor measurement location is typically at least 0.5 degrees colder than a 1.2 metre high measurement location, so the cold radiator reading is now out quite a bit. The only real solution to that is a wall mounted thermostat, which I'm now using in the Living room, and which gives a very accurate representation of the room regardless of whether the radiator is hot or cold.
    Last edited by DBMandrake; 13th September 2016 at 10:33 AM.
