What happens when you take readings from the quantized value rather than from the actual value?
A detailed explanation would be much appreciated.
-
If you mean that the answer is truncated for some reason, then consider this.
Suppose you have a voltmeter that reads to 2 decimal places. You could say the 'quantum' is 0.01 V because that is the ultimate sensitivity of the meter. If it reads 2.36 V, the actual value could be off by +/- 1 digit in the last place, so the best value would be (2.36 +/- 0.01) volts. This gives you +/- 1 part in 236, or about +/- 0.4 %. This assumes that the meter is otherwise correct.
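Here is a minimal sketch in Python of the idea, assuming a hypothetical "true" voltage and a 0.01 V quantum; the `quantize` helper and the sample value are illustrative, not anything standard:

```python
def quantize(value, quantum=0.01):
    """Round a true value to the nearest multiple of the quantum,
    mimicking a voltmeter that displays 2 decimal places."""
    return round(value / quantum) * quantum

true_voltage = 2.3641              # hypothetical actual value (assumption)
reading = quantize(true_voltage)   # what the meter would display: 2.36

abs_uncertainty = 0.01             # +/- 1 digit in the last place
rel_uncertainty = abs_uncertainty / reading

print(f"reading = {reading:.2f} V +/- {abs_uncertainty} V")
print(f"relative uncertainty = +/- {rel_uncertainty:.1%}")  # about 0.4 %
```

Running this prints a reading of 2.36 V with a relative uncertainty of roughly 0.4 %, matching the arithmetic above: the quantization hides everything below the last displayed digit, so that digit sets the floor on your measurement uncertainty.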