The Hall effect is named after its discoverer, the American physicist and thermoelectric researcher Edwin Herbert Hall. In a Hall sensor, a magnetic field applied perpendicular to a current-carrying conductor creates an electrical voltage transverse to the direction of current flow. This physical principle has many possible applications, since it allows the strength of an external magnetic field to be determined and measured. Current measurements are also possible: every current-carrying conductor creates its own magnetic field, which can be used to measure the current indirectly. Since the Hall effect is most pronounced in semiconductors, a small plate of semiconductor material is typically used as the Hall element.
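
As a point of reference, the magnitude of the Hall voltage in an idealized conducting plate is commonly written in the textbook form

\[ V_H = \frac{I\,B}{n\,q\,d} \]

where I is the bias current, B the flux density component perpendicular to the plate, n the charge carrier density, q the elementary charge, and d the plate thickness. The low carrier density of semiconductors is what makes the Hall voltage large enough to measure conveniently.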

In a semiconductor die, the Hall voltage is generated by an external magnetic field acting perpendicular to the current direction. Hall sensors measure the component of the magnetic flux perpendicular to the chip surface. In today’s Hall sensors, the required evaluation electronics are normally integrated on the same chip as the Hall plate. Field lines that penetrate the Hall element vertically generate a proportional electrical voltage, which is processed by a digital signal processor and converted into a usable output format.
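
As a rough illustration of that last step, the sketch below converts the raw ADC reading of a hypothetical ratiometric linear Hall sensor into a flux density value. The supply voltage, quiescent output, and sensitivity figures are placeholder assumptions for the example, not values from a specific TDK-Micronas datasheet.

```python
# Illustrative conversion of a ratiometric linear Hall sensor output to flux density.
# All numeric parameters are assumptions for the example, not datasheet values.

ADC_BITS = 12          # resolution of the microcontroller ADC (assumed)
V_SUPPLY = 5.0         # sensor supply voltage in volts (assumed)
V_QUIESCENT = 2.5      # output at 0 mT, typically half the supply (assumed)
SENSITIVITY = 0.050    # output slope in volts per millitesla (assumed)

def adc_to_millitesla(adc_code: int) -> float:
    """Convert a raw ADC code into magnetic flux density in mT."""
    v_out = adc_code / (2 ** ADC_BITS - 1) * V_SUPPLY   # ADC code -> voltage
    return (v_out - V_QUIESCENT) / SENSITIVITY          # voltage -> flux density

if __name__ == "__main__":
    print(f"{adc_to_millitesla(2457):+.2f} mT")  # reading above mid-scale -> positive field
```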

The first CMOS-based Hall sensors were pioneered and developed by what is now TDK-Micronas in the early 1990s. As the manufacturing process became cheaper, more functions could be combined on a single integrated circuit (IC), allowing the construction of more complex circuits. Today, digital and analog functions can be combined on the same chip as mixed-signal ICs (see Figure 1).

Fig. 1 - Semiconductor chip of a linear Hall sensor including Hall element and active circuits (monolithic integration) and the associated block diagram. The internal signal conditioning linearizes the output characteristic.

Hall sensors operate at very low signal voltages and are therefore prone to offset drifts caused by temperature and supply voltage fluctuations, as well as mechanical stress. To combat this, newer designs use active offset compensation, which limits offset fluctuations to only a few microvolts over the entire operating temperature and voltage range and improves the electromagnetic compatibility (EMC) of the circuit. The result is a complete CMOS Hall sensor system with on-chip signal processing and integrated nonvolatile memory.
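
The article does not detail the on-chip implementation, but one widely used approach to active offset compensation in CMOS Hall sensors is the spinning-current technique. The toy model below only illustrates the principle: readings taken with alternating bias directions are averaged, which cancels a static offset while preserving the Hall signal.

```python
# Toy model of dynamic offset cancellation (spinning-current principle).
# In each phase the Hall plate is biased along a different pair of contacts:
# the true Hall signal keeps its sign while the resistive offset alternates,
# so averaging over the phases suppresses the offset term.

def measure_phase(hall_signal_uv: float, offset_uv: float, phase: int) -> float:
    """Return the raw plate voltage (in microvolts) for one bias phase."""
    sign = 1.0 if phase % 2 == 0 else -1.0   # offset flips with the bias direction
    return hall_signal_uv + sign * offset_uv

def compensated_reading(hall_signal_uv: float, offset_uv: float, phases: int = 4) -> float:
    """Average over all bias phases to remove the static offset."""
    samples = [measure_phase(hall_signal_uv, offset_uv, p) for p in range(phases)]
    return sum(samples) / len(samples)

if __name__ == "__main__":
    # A 200 uV Hall signal with a 5 mV static offset: the average recovers 200 uV.
    print(compensated_reading(hall_signal_uv=200.0, offset_uv=5000.0))
```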

In addition, by combining vertical and horizontal Hall elements, high-precision angle sensors and multidimensional magnetic field measurements are now possible: Vertical Hall elements detect magnetic field lines parallel to the sensor surface, while horizontal Hall elements detect the component perpendicular to the chip surface. This is crucial for use in the medical industry, which requires tighter integration and EMC optimization of components. Such sensors with integrated blocking capacitors must meet the EMC requirements of IEC 60601-1-2. In safety-critical applications, high reliability can be achieved by means of the redundancy principle, using two independent semiconductor chips in a single housing.
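
As a sketch of how such a two-axis measurement translates into a rotation angle, the snippet below combines two orthogonal in-plane field components with atan2. The component names are generic placeholders, not the register map of a particular sensor.

```python
import math

def field_to_angle_deg(b_x: float, b_y: float) -> float:
    """Compute the in-plane magnet angle (0..360 degrees) from two orthogonal
    field components, e.g. as measured by two vertical Hall elements."""
    angle = math.degrees(math.atan2(b_y, b_x))
    return angle % 360.0

if __name__ == "__main__":
    print(field_to_angle_deg(b_x=0.0, b_y=12.5))   # field along y -> 90.0 degrees
```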

The Tunnel Effect: TMR Sensors

Tunnel magnetoresistive (TMR) technology is based on a quantum mechanical effect. In a magnetoresistive (MR) effect, a change in the magnetic field leads to a change in electrical resistance (see Figure 2). This allows a magnetic field to be determined from a simple electrical resistance measurement. TMR is a special form of MR technology in which the so-called tunnel effect produces a much greater change in resistance as soon as a magnetic field is applied. As a result, TMR sensors offer the advantages of higher sensitivity and accuracy.
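
That greater change in resistance is usually quantified by the TMR ratio, defined from the resistance of the stack in the antiparallel and parallel magnetization states:

\[ \mathrm{TMR} = \frac{R_\mathrm{AP} - R_\mathrm{P}}{R_\mathrm{P}} \]

Modern TMR stacks typically reach ratios well above 100 percent at room temperature, compared with a few percent for AMR and tens of percent for GMR, which is the basis of the sensitivity advantage mentioned above.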

Fig. 2 - Functional principle of TMR technology.

In TMR sensors, a stack of magnetic layers is separated in the middle by a thin electrical insulator, the so-called tunnel barrier. The direction of the magnetization of the free layer follows the external magnetic field, while the direction of the fixed (pinned) layer remains unchanged. The resistance of the TMR element depends on the relative angle between the free and fixed layers.
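
A commonly used first-order model for this angular dependence interpolates between the parallel and antiparallel resistance values with the cosine of the relative angle θ between the two magnetizations:

\[ R(\theta) \approx R_\mathrm{P} + \frac{R_\mathrm{AP} - R_\mathrm{P}}{2}\,(1 - \cos\theta) \]

so the element has its lowest resistance when the layers are parallel and its highest when they are antiparallel.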

Modern TMR sensors grew out of AMR (anisotropic magnetoresistive) and GMR (giant magnetoresistance) technology, which was also developed during the early 1990s. Compact components based on these technologies led hard drive manufacturers to adopt microscopic MR read heads in place of larger inductive heads, allowing higher storage densities. TMR technology is manufactured in a process similar to CMOS: the different layers are deposited and patterned on a silicon wafer. Individual TMR elements are connected in series to form a resistor, and these resistors are usually arranged in groups of four as a Wheatstone bridge circuit. This type of interconnection generates differential electrical signals that can be evaluated directly or by means of downstream electronics (an ASIC).
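
As a sketch of why the bridge arrangement is convenient, the function below computes the differential output of an ideal Wheatstone bridge in which opposite arms change their resistance in opposite directions with the applied field. The supply voltage and resistance values are illustrative only.

```python
def bridge_output(v_supply: float, r_nominal: float, delta_r: float) -> float:
    """Differential output voltage of an ideal full bridge whose opposite arms
    change by +delta_r and -delta_r with the magnetic field."""
    r_plus, r_minus = r_nominal + delta_r, r_nominal - delta_r
    v_left = v_supply * r_plus / (r_plus + r_minus)    # divider on one half-bridge
    v_right = v_supply * r_minus / (r_plus + r_minus)  # divider on the other half
    return v_left - v_right                            # equals v_supply * delta_r / r_nominal

if __name__ == "__main__":
    # 3.3 V supply, 10 kOhm nominal arms, 5 % field-induced change -> 165 mV output
    print(f"{bridge_output(3.3, 10e3, 500.0) * 1e3:.1f} mV")
```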

A key advantage for medical applications is that the sensor output voltage can be read directly by conventional microcontrollers; as a result, additional amplifier circuits, resistors, and capacitors can be eliminated. TMR-based angle sensors, speed sensors, and linear sensors can be used to realize a variety of applications. The simple design of a TMR sensor has many advantages, including higher reliability, resulting in better functional safety and superior monitoring of sensor behavior during operation.

TMR sensors can be used in accordance with IEC 60601-1-2 to meet the highest safety requirements in the medical industry. Another important feature is the very high angular accuracy that can be achieved with TMR sensors. This performance translates into increased efficiency and reduced noise, especially in the control of brushless DC (BLDC) motors.

Hall Effect Sensor or TMR Sensor?

When determining which type of sensor should be used in a medical application, the specific properties of each technology should guide the choice. CMOS Hall sensors feature cost-effective monolithic integration of the Hall element and evaluation electronics on a single chip. This makes them ideal for cost-sensitive applications or for detecting stronger magnetic fields.

Compared to magnetic field sensors based on other technologies (Hall/AMR/GMR), TMR sensors have a better signal-to-noise ratio and stand out with very high accuracy and very low power consumption. TMR sensors offer reliable and stable performance over temperature and over the lifetime of the sensor. As a result, TMR sensors are preferred in very demanding applications.

In many cases, combining both technologies can also be very beneficial. For example, in a current sensor module, a TMR sensor can measure weak currents while a Hall sensor detects high currents. The measuring range can thus be extended significantly with increased measuring accuracy. A second application for both technologies is determining the rotor position in a BLDC motor. In this case, a high-precision TMR sensor can be combined with a redundant Hall sensor to increase overall system reliability, eliminating the need for a separate angle encoder.
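
A minimal sketch of how such a dual-sensor current module could choose between its two readings is shown below, with hysteresis around the hand-over point. The threshold values and the overall behavior are assumptions made for the illustration, not the specification of any particular TDK-Micronas product.

```python
# Illustrative range switching for a combined TMR + Hall current sensor module.
# The thresholds and sensor characteristics are assumptions for the example.

TMR_FULL_SCALE_A = 5.0   # assumed upper limit of the high-resolution TMR range
HYSTERESIS_A = 0.5       # assumed hysteresis to avoid toggling at the boundary

def select_current(i_tmr: float, i_hall: float, using_tmr: bool) -> tuple[float, bool]:
    """Use the TMR reading for small currents and the Hall reading for large ones."""
    if using_tmr and abs(i_tmr) > TMR_FULL_SCALE_A:
        using_tmr = False                 # TMR range exceeded -> switch to Hall sensor
    elif not using_tmr and abs(i_hall) < TMR_FULL_SCALE_A - HYSTERESIS_A:
        using_tmr = True                  # back inside the precise TMR range
    return (i_tmr if using_tmr else i_hall), using_tmr

if __name__ == "__main__":
    reading, in_tmr_range = select_current(i_tmr=0.8, i_hall=0.9, using_tmr=True)
    print(reading, in_tmr_range)          # small current: the TMR value is used
```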

Conclusion

Regardless of whether a Hall effect sensor, a TMR sensor, or both are used, these technologies can reduce the overall bill of materials and cost in a medical design when compared to using other dedicated sensors. By better understanding the technology and applications of each type of sensor, engineers are better able to make informed decisions resulting in safer, more reliable designs.

This article was written by Andy Phillips, North America Sales Manager for TDK-Micronas GmbH, Detroit, MI.