Like it or not, there are tolerances on everything electronic, computer controlled or not. Many electronic components are spec'ed with either 5% or 10% tolerance. For example, the second meter we used at the Halloween Classic, a Fluke 77, has a published accuracy of +/- 0.3%. There is no specification for repeatability, so even if the battery is exactly 8.400 volts, the meter could read 8.425 volts. Conversely, an illegal 8.425 V battery could read 8.400 V. Adam claims this would result in the racer being DQ'd for that round, when in fact it could be the measuring device that is wrong. Note also that Fluke lists this figure as its "Best Accuracy," whatever that means, and the values also vary with temperature.
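To put numbers on that, here is a minimal sketch of the arithmetic. It assumes the +/- 0.3% applies to the reading itself and ignores any additional digit-count error the full Fluke spec sheet may add:

```python
# Rough tolerance-band check for a meter spec'ed at +/- 0.3% of reading.
# LIMIT and ACCURACY are taken from the discussion above; this is not
# the Fluke 77's complete spec (which may also include +/- counts).
LIMIT = 8.400      # ROAR voltage limit, in volts
ACCURACY = 0.003   # +/- 0.3% of reading

band = LIMIT * ACCURACY
low, high = LIMIT - band, LIMIT + band
print(f"A true {LIMIT:.3f} V pack may read anywhere from {low:.4f} to {high:.4f} V")
# The band is about +/- 0.025 V, so a perfectly legal 8.400 V pack can
# read roughly 8.425 V, and a pack reading 8.400 V could really be ~8.425 V.
```

The same +/- 0.025 V band applies in both directions, which is why neither the racer nor the tech inspector can tell, from a single reading near the limit, which side of 8.400 V the pack is actually on.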
To the best of my knowledge, my ICE and Checkpoint chargers aren't upgradable, and are pre-programmed to stop charging at a non-configurable voltage (mine read back between 8.401 and 8.405 volts when charging is complete). So if the tolerances line up such that the charger's built-in meter reads low and the ROAR meter reads high, what should the racer do? (Don't forget the repeatability factor.) If your answer is that racers should check the voltage on the ROAR meter after every charging cycle at the Nats before coming through tech, you're not being realistic.
Why not use some common sense and allow the racer to run the car to drop the voltage into the "legal" range before being allowed through tech? The last thing that this hobby needs is to alienate paying customers in the current economy.