By HIDEO KONDO
Product Marketing Manager, ON Semiconductor
www.onsemi.com
The wearable/portable electronics market is experiencing significant growth, not least in the health sector, where patients are now monitored with some of the most advanced equipment available. Wearable electronics are also taking off in personal health monitoring: many health-conscious members of the public now wear wrist-worn monitors that measure heart rate, energy output (in calories), and even sleep patterns.
The annual market for wearable electronics will be valued at more than $25 billion by 2020, according to figures from Allied Market Research. While this growth is good news for manufacturers of the kind of monitoring equipment that fits into portable units, it exposes a major problem. If a unit shuts down without warning because its battery is nearly exhausted, even though the display indicates a reasonable amount of charge remaining, the result is frustrating at best; at worst, in a medical environment, such a failure can be life-threatening.
As technology advances, portable devices are becoming thinner and less costly, which means that batteries need to offer increased capacity in a smaller size. Additionally, users need enough warning to complete the task at hand with confidence that sufficient power remains in the battery. Even when a sudden power cut poses no danger to life, the resulting data loss can put people off using such wearable systems, meaning that the market's growth could be curtailed considerably.
Besides keeping the wearer informed about charge levels, it is important to get the best possible energy efficiency out of the batteries that are fitted into the equipment — which are usually lithium-ion (Li-ion) batteries.
Li-ion replaced nickel-cadmium and nickel-metal-hydride some years ago as the preferred battery chemistry for portable equipment power sources. The main reason for this is that the energy density of Li-ion is about double that of standard nickel-cadmium, offering a cell voltage of 3.6 V. Lithium-ion also offers extremely low maintenance.
However, Li-ion cells require built-in protection circuits to keep the cell voltage from dropping too low on discharge, and the chemistry ages: a battery typically fails after two to three years, although five years of use have been recorded.
With these features in mind, getting the maximum possible power efficiency out of Li-ion batteries is crucial if the wearable electronics device user is to benefit. Of equal importance is keeping users aware of battery usage and charge levels so that they can avoid the aforementioned inconvenient shutdowns.
When looking at how to make this possible, a key consideration is that space in such devices is very limited: adding anything to provide the features mentioned above is likely to increase the size and weight of the device, making it more cumbersome and therefore less likely to be bought and used. Another factor is cost, because adding components risks pricing the device out of an already extremely competitive market. Finally, any added circuitry will divert some of the device's power, which is already at a premium.
The problem with traditional methods of gauging how much charge batteries have left, such as coulomb counting, is that they are notoriously inaccurate. A coulomb counter may be accurate only to within about 8%, which means that a portable device could have just 2% of its charge left while indicating 10% (see Fig. 1). Conversely, this inaccuracy can leave the user worrying unnecessarily about a shutdown even when the device shows as much as a 20% charge.
Fig. 1: Traditional methods of gauging battery power are accurate to only about 8%.
Coulomb counting works by continually measuring the current flowing into and out of the battery through a sense resistor (see Fig. 2) and integrating it over time. The accumulated charge is then compared with the battery's full capacity to work out how much charge is left. However, the battery's self-discharge current does not pass through the sense resistor, so it cannot be detected. The sense current also degrades measurement accuracy by heating the resistor and shifting its resistance. Furthermore, an accurate measurement can be taken only if the battery is fully charged each time, because the counter needs a known starting point. Finally, sense resistors are expensive, dissipate vital battery power, and take up precious space on the portable device's printed-circuit board (PCB).
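The integration described above can be sketched in a few lines of Python. This is an illustrative model only (the function name, sample data, and capacity figure are all hypothetical, not taken from any real device):

```python
# Minimal coulomb-counting sketch: integrate sampled current over time
# and track remaining charge against the battery's rated capacity.
# All values are illustrative.

def coulomb_count(samples_ma, dt_s, full_capacity_mah, initial_mah=None):
    """Estimate remaining charge (mAh) from periodic current samples.

    samples_ma: current readings in mA (positive = charging,
                negative = discharging), one per dt_s interval.
    """
    remaining = full_capacity_mah if initial_mah is None else initial_mah
    for i_ma in samples_ma:
        remaining += i_ma * dt_s / 3600.0  # mA * s -> mAh
        remaining = min(max(remaining, 0.0), full_capacity_mah)
    return remaining

# One hour of a constant 100-mA discharge, sampled once per second,
# from a full 400-mAh battery:
left = coulomb_count([-100.0] * 3600, dt_s=1.0, full_capacity_mah=400.0)
print(round(left, 3))                  # 300.0 mAh remaining
print(round(100 * left / 400.0, 1))    # 75.0 % indicated charge
```

Note that self-discharge never flows through the sense resistor, so a loop like this cannot see it; the estimate drifts over time unless it is recalibrated at a known full charge.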
Fig. 2: Coulomb counting does not detect the self-discharge current and takes an accurate measurement only if the battery is fully charged.
A much better method of measuring remaining battery power accurately — and so making efficient use of all of the available energy — is to use an on-board fuel gauge, which is based on precision analog-to-digital conversion (ADC) technology and has error correction and temperature compensation built in. Not only does this approach deliver a much higher level of accuracy, but it also allows manufacturers of wearable electronics products to develop more streamlined and cost-effective gauging systems. ON Semiconductor offers a solution — the LC709203F battery-voltage sensing fuel gauge (see Fig. 3 ) — that combines low-power operation with highly accurate gauging and ensures longer battery life through less wear. Also, not having an external sense resistor means that there is no power loss and that valuable space on the PCB is saved.
Fig. 3: The LC709203F battery-voltage sensing fuel gauge.
Based on a gauging method called HG-CVR, or hybrid gauging by current-voltage tracking with internal resistance (see Fig. 4), the Smart LiB fuel gauge measures the battery's relative state of charge (RSOC) to an accuracy of 2.8%, even under varying conditions of temperature, aging, load, and self-discharge.
Fig. 4: The hybrid gauging by current-voltage tracking technique.
Essentially, the gauge works by storing a reference table in its memory that contains data on the voltage/capacity, impedance/capacity, and impedance/temperature features of the battery. It then compares these values with the measured voltage and temperature on a continual basis and uses this information to calculate what battery capacity is left.
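A table-based estimate of this kind can be sketched as follows. The voltage-to-capacity profile and internal-resistance figure below are hypothetical placeholders; a real gauge stores battery-specific voltage, impedance, and temperature tables in its memory:

```python
import bisect

# Hypothetical open-circuit-voltage (V) -> relative state of charge (%)
# profile for a single Li-ion cell, sorted by voltage. Illustrative only.
OCV_PROFILE = [
    (3.00, 0), (3.45, 5), (3.60, 20), (3.70, 45),
    (3.80, 65), (3.95, 85), (4.10, 97), (4.20, 100),
]

def rsoc_from_voltage(v_cell):
    """Linearly interpolate RSOC (%) from the voltage profile table."""
    volts = [v for v, _ in OCV_PROFILE]
    if v_cell <= volts[0]:
        return 0.0
    if v_cell >= volts[-1]:
        return 100.0
    i = bisect.bisect_right(volts, v_cell)
    v0, s0 = OCV_PROFILE[i - 1]
    v1, s1 = OCV_PROFILE[i]
    return s0 + (s1 - s0) * (v_cell - v0) / (v1 - v0)

def rsoc_under_load(v_measured, i_load_a, r_internal_ohm):
    """Correct the measured terminal voltage for the IR drop across the
    cell's internal resistance (from an impedance table) before lookup."""
    return rsoc_from_voltage(v_measured + i_load_a * r_internal_ohm)

print(round(rsoc_from_voltage(3.75), 1))   # midway between 45 and 65 -> 55.0
```

The second function hints at why the impedance table matters: under a 1-A load, a cell with 100 mΩ of internal resistance reads 100 mV lower at its terminals than its true open-circuit voltage, and a voltage-only lookup would understate the remaining charge.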
In the HG-CVR method, the transferred charge (in coulombs) is derived from voltage monitoring alone, calculated from the stored impedance and voltage profile tables. For an even more accurate prediction of when the battery will be completely depleted, the fuel gauge also monitors the battery's temperature and takes readings more frequently as the battery voltage falls.
Unlike other methods of charge measurement, the Smart LiB system doesn’t need the device’s battery to be fully charged for calibration. This means that the remaining battery life can be calculated accurately, even if the battery has only been charged to 50%.
As mentioned before, power saving is essential in any gauging system, so the LC709203F unit puts itself into energy-saving sleep mode in between measurements. With no sense resistor needed, active power consumption is reduced. In terms of space saving, the gauge measures 1.76 × 1.6 mm, which is about 45% smaller than alternative units. The gauge circuit's overall PCB footprint is 77% less than anything else on the market.
Finally, in terms of total power consumption, the LC709203F has been shown to have an operating current of 15 µA — significantly lower than the 118 µA of its nearest competitor. Power consumption in active mode is 2 µA and in sleep mode is 0.2 µA, compared with alternative units that require 25 µA and 0.5 µA, respectively.
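To put the active- and sleep-mode figures in context, a back-of-the-envelope calculation shows how the average gauge current depends on duty cycle. The 1% active fraction below is an assumption for illustration, not a datasheet value; the currents are the figures quoted above:

```python
# Average current for a gauge that duty-cycles between active and
# sleep modes. The active fraction is an illustrative assumption.

def avg_current_ua(active_ua, sleep_ua, active_fraction):
    """Time-weighted average current in microamps."""
    return active_ua * active_fraction + sleep_ua * (1 - active_fraction)

gauge = avg_current_ua(2.0, 0.2, 0.01)    # article's LC709203F figures
rival = avg_current_ua(25.0, 0.5, 0.01)   # article's competitor figures
print(round(gauge, 3), round(rival, 3))   # prints "0.218 0.745"
```

At this duty cycle the sleep current dominates, which is why a low sleep-mode figure matters so much for a gauge that spends most of its life idle.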