Measurement Current in Voltmeters: Understanding the Draw and Impact
Introduction
When measuring electrical potential differences in a circuit, one must consider the effects of the measuring device itself. Voltmeters are designed to measure voltage without significantly altering the circuit's behavior. This article explores the current draw of voltmeters during measurement, providing insights into the impact of different designs and the importance of high internal resistance.
Understanding Voltmeter Design and Functionality
A voltmeter's primary function is to measure the voltage across two points in an electrical circuit while disturbing that circuit as little as possible. To achieve this, a voltmeter must have a very high input resistance, typically on the order of megohms (MΩ) or more.
This high input resistance ensures that the current drawn during measurement is extremely low, often in the microampere (μA) range or below. The minimal current draw prevents the voltmeter from loading the circuit, so the reading reflects the voltage that is actually present and the behavior of the circuit under test is preserved.
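As a rough illustration, the sketch below (in Python) shows how little current such a meter draws at a few common test voltages; the 10 MΩ input resistance is an assumed, typical value, not a property of any specific instrument.

```python
# Minimal sketch: current drawn from the circuit by a voltmeter's input resistance.
# The 10 MΩ value is an assumed, typical figure, not a specification of any real meter.

def meter_current(voltage_v: float, input_resistance_ohm: float) -> float:
    """Current (in amperes) the meter draws while measuring voltage_v (Ohm's law)."""
    return voltage_v / input_resistance_ohm

R_IN = 10e6  # assumed 10 MΩ input resistance

for v in (1.0, 12.0, 120.0):
    i = meter_current(v, R_IN)
    print(f"{v:7.1f} V across {R_IN / 1e6:.0f} MΩ -> {i * 1e6:5.2f} µA drawn")
```

Even at 120 V, an assumed 10 MΩ input draws only 12 µA, which is negligible in most circuits.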
Historical Perspective on Voltmeters
Early analog multimeters, known as VOMs (volt-ohm-milliammeters), drew their operating current from the circuit under test. These instruments were less precise and had a much lower input resistance than modern meters. For instance, a VOM built around a 1 milliampere full-scale movement has a sensitivity of only 1000 ohms per volt, so the series (multiplier) resistance for each voltage range was found by multiplying the full-scale range by 1000, as the sketch below illustrates.
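To make that arithmetic concrete, the following sketch assumes a hypothetical 1 mA movement with a 50 Ω coil; the component values are illustrative, not taken from any particular meter.

```python
# Rough sketch of the ohms-per-volt arithmetic described above. The 1 mA movement
# and the 50 Ω coil resistance are hypothetical, illustrative values.

FULL_SCALE_A = 1e-3   # 1 mA full-scale meter movement
COIL_OHM = 50.0       # assumed resistance of the movement's coil

sensitivity = 1.0 / FULL_SCALE_A  # 1000 ohms per volt for a 1 mA movement

for v_range in (1, 10, 100):
    total_r = v_range * sensitivity  # total resistance needed on this range
    series_r = total_r - COIL_OHM    # multiplier resistor placed in series with the movement
    print(f"{v_range:3d} V range: total {total_r:>9,.0f} Ω, series resistor {series_r:>9,.0f} Ω")
```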
Examples of such older meters include the Allied VOM, rated at an impressive 100,000 ohms per volt with a roughly 9 microampere full-scale movement, and Heathkit's VTVM, which offered a 1 megohm input impedance. Despite their modest input resistance by today's standards, many of these instruments still function today, a testament to robust design.
Modern Voltmeters and Their Advancements
Modern digital voltmeters (DVMs) typically present an input resistance of 10 megohms (10 MΩ). Higher-end bench and laboratory instruments often exceed this, with input resistances in the gigohm (GΩ) range on some settings. Such high input resistance keeps the measurement current negligible, making these meters well suited to precise, non-invasive measurements.
It is worth noting that 'ohms per volt' is a sensitivity rating for analog meters, whose input resistance changes with the selected range; it does not describe digital meters. A typical digital meter presents the same input resistance, usually 10 MΩ, on every range, which makes its loading of the circuit consistent and predictable.
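The practical consequence of a fixed 10 MΩ input is a small, predictable loading error whenever the measured point has a nonzero source resistance. The sketch below models that error as a simple voltage divider; the 5 V source and the example source resistances are assumed values.

```python
# Sketch of the loading (voltage-divider) error introduced by an assumed 10 MΩ meter
# when the measured point has a nonzero source resistance. Voltages and source
# resistances here are illustrative assumptions.

def reading(v_source: float, r_source: float, r_meter: float = 10e6) -> float:
    """Voltage the meter actually indicates: v_source * r_meter / (r_source + r_meter)."""
    return v_source * r_meter / (r_source + r_meter)

V_TRUE = 5.0  # assumed open-circuit voltage at the measured point

for r_src in (1e3, 100e3, 1e6):
    v = reading(V_TRUE, r_src)
    error_pct = 100.0 * (V_TRUE - v) / V_TRUE
    print(f"source R = {r_src / 1e3:6.0f} kΩ -> reading {v:.4f} V ({error_pct:.2f}% low)")
```

With these assumed values, a 1 kΩ source barely registers an error, while a 1 MΩ source reads about 9% low, which is why high-impedance nodes call for meters with even higher input resistance.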
Factors Affecting Current Draw in Voltmeters
The current a voltmeter draws depends on its design and operating requirements. Less expensive DVMs typically present an input resistance of about 10 MΩ, while more sophisticated instruments can present several gigohms, drawing correspondingly less current; the comparison below gives a sense of the difference.
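The sketch below compares an assumed 10 MΩ handheld meter with an assumed 10 GΩ bench meter measuring the same high-impedance node; the node voltage and the 1 MΩ source resistance are illustrative assumptions, not specifications of real instruments.

```python
# Comparison sketch: an assumed 10 MΩ handheld meter versus an assumed 10 GΩ bench
# meter, both measuring a 5 V node behind a 1 MΩ source resistance.

V_SOURCE = 5.0   # assumed open-circuit node voltage
R_SOURCE = 1e6   # assumed 1 MΩ source resistance

for label, r_in in (("10 MΩ meter", 10e6), ("10 GΩ meter", 10e9)):
    v_read = V_SOURCE * r_in / (R_SOURCE + r_in)  # voltage-divider result the meter sees
    i_draw = v_read / r_in                        # current the meter pulls from the node
    print(f"{label}: reads {v_read:.4f} V, draws {i_draw * 1e9:7.1f} nA")
```

Under these assumptions, the gigohm-input meter reads within a hundredth of a percent of the true value while drawing only about half a nanoampere.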
While a standard voltmeter draws minimal current, multimeters that also measure current rely on low-resistance shunts for their current ranges. Those ranges are deliberately low-impedance and serve a different purpose; for voltage measurement, high input resistance remains the goal in precision work.
Conclusion
Keeping the current drawn by a voltmeter small is critical to accurate, non-invasive voltage measurement. By presenting a very high input resistance, modern voltmeters minimize this current, allowing precise readings without altering the circuit. Whether the instrument is a vintage VOM or a modern DVM, the principle is the same: high input resistance yields reliable, accurate voltage measurements.