SABINE JUD, Marketing Manager, Mobility Sensors, and
GERNOT HEHN, Applications Engineer, Automotive Business Unit,
ams, www.ams.com
Great hope has been invested in the technology of electric vehicles (EVs): it is expected to reduce greenhouse gas (GHG) emissions, promote sustainable energy development and improve air quality in urban areas. Mass adoption of EVs would bring benefits to the planet and its people.
Yet today, the EV is making slow progress in its battle to replace the conventional internal combustion engine (ICE) passenger vehicle. At the top of the list of barriers facing EV manufacturers are consumers’ reluctance to adopt new technology, the high initial purchase cost of EVs and the lack of a widely available charging infrastructure. So even though EVs produce zero tailpipe emissions of pollutants such as nitrogen oxides, carbon monoxide and hydrocarbons, few consumers yet consider them for purchase.
In fact, for many years EVs have been successful in niche markets that were well suited to a low-power, short-range traction technology: examples include milk floats and golf buggies. But since the launch of pioneering car designs such as the Nissan Leaf and the Chevrolet Volt in late 2010, plug-in electric passenger vehicles (PEVs) have become more widely available. Hybrid electric vehicles (HEVs), which first appeared a decade earlier, are also now selling steadily. According to a report from Pike Research, annual worldwide sales of these various types of EV are forecast to reach 3.8 million by 2020.
The biggest factor affecting the rate of growth in adoption of EVs is the battery: it is the single biggest contributor to the purchase price of an EV, and its performance (range, lifetime and power-output characteristics) largely determines whether the consumer sees the EV as an acceptable alternative to an ICE vehicle, or as unacceptably compromised.
The battery’s sensor has a crucial role to play in the performance of the battery as a whole. Keeping the battery charged and functioning correctly depends on the accuracy and reliability of measurements captured by the sensor. As this article will show, improvements in battery sensor technology today are helping to deliver important improvements to the performance of the battery, and therefore to the consumer appeal of EVs.
The functions of a battery management system
A battery management system (BMS) is absolutely necessary to protect the battery from damage, prolong the life of the battery, and maintain the battery’s fitness to perform the functions of the application for which it was specified. In other words, a BMS supports both the safety and performance requirements of the vehicle.
Basic BMS functions, drawn together in the code sketch after this list, include:
• Measure the pack voltage
• Measure the current flowing into (when charging) or out of (when discharging) the battery
• Measure individual cell voltages
• Measure the temperature of the cells
• Disconnect the battery when these values exceed the absolute maximum/minimum permitted
• Balance the charge stored by each cell in a stack
• Check the operational status of system components to guarantee the functional safety of the BMS
• Calculate the state of charge (SOC), state of health (SOH) and state of function (SOF) of the battery
• Communicate these data to the vehicle
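To show how these functions might fit together, the following is a minimal sketch of one periodic BMS supervision step in C. Every name, limit and hardware-access stub in it is an illustrative assumption, not the interface of any real BMS or of the AS8510 discussed later.

/* Minimal sketch of a periodic BMS supervision step.
   All names, limits and stubs below are illustrative assumptions. */
#include <stdint.h>
#include <stdio.h>

#define NUM_CELLS    96      /* assumed pack configuration     */
#define CELL_MV_MAX  4200    /* assumed Li-ion cell limits, mV */
#define CELL_MV_MIN  2500
#define TEMP_C_MAX   60

/* Hardware-access stubs; a real BMS reads these from its sensor interface. */
static int32_t read_pack_current_ma(void)     { return 12000; }
static int32_t read_cell_voltage_mv(int cell) { (void)cell; return 3700; }
static int32_t read_cell_temp_c(int cell)     { (void)cell; return 25; }
static void    open_contactor(void)           { puts("contactor opened"); }
static void    balance_cell(int cell)         { printf("balancing cell %d\n", cell); }

/* One pass: measure, protect, balance, and integrate charge for SOC. */
void bms_step(int64_t *charge_mc, int32_t dt_ms)
{
    int32_t i_ma = read_pack_current_ma();       /* + charging, - discharging */
    *charge_mc += (int64_t)i_ma * dt_ms / 1000;  /* coulomb count toward SOC  */

    int32_t v_min_mv = CELL_MV_MAX;
    for (int c = 0; c < NUM_CELLS; c++) {
        int32_t v_mv = read_cell_voltage_mv(c);
        if (v_mv < v_min_mv) v_min_mv = v_mv;
        /* Disconnect on any absolute maximum/minimum violation. */
        if (v_mv > CELL_MV_MAX || v_mv < CELL_MV_MIN ||
            read_cell_temp_c(c) > TEMP_C_MAX)
            open_contactor();
    }
    /* Crude balancing: bleed any cell sitting well above the weakest. */
    for (int c = 0; c < NUM_CELLS; c++)
        if (read_cell_voltage_mv(c) > v_min_mv + 10)
            balance_cell(c);
}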
The ability of the BMS to perform these functions depends on the accuracy of its sensor inputs. This article now takes a close look at the specifics of current sensing to compare the various technologies available and to show the benefits and drawbacks of each.
Comparison of Hall-based and shunt-based sensors
There are two fundamental measurement technologies available to measure current flow (the basic conversion behind each is sketched in code after this list):
• A circuit may directly measure the voltage drop across a shunt resistor of a known resistance value, and convert this voltage measurement into a measurement of current flow
• A magnetic field sensing element, known as a Hall sensor, may measure the magnetic flux created by a conductor carrying current. The strength of the magnetic flux varies as the current varies.
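Both conversions can be stated compactly in code. The sketch below is illustrative only: the 100 µΩ value anticipates the shunt used later in this article, while the Hall sensitivity is a purely invented figure, not taken from any datasheet.

/* Sketch of both conversions. R_SHUNT anticipates the 100 µΩ shunt used
   later in this article; K_HALL is a purely illustrative sensitivity. */
#include <stdio.h>

#define R_SHUNT_OHM     100e-6  /* shunt resistance, ohms             */
#define K_HALL_V_PER_A  0.9e-3  /* assumed open-loop sensitivity, V/A */

/* Shunt: Ohm's law, I = V / R. */
static double shunt_current_a(double v_drop_v) { return v_drop_v / R_SHUNT_OHM; }

/* Open-loop Hall: output ideally scales linearly with current,
   I = V_hall / k; real Hall elements deviate from this straight line. */
static double hall_current_a(double v_hall_v)  { return v_hall_v / K_HALL_V_PER_A; }

int main(void)
{
    /* 100 mV across a 100 µΩ shunt corresponds to 1 kA. */
    printf("shunt: %.0f A\n", shunt_current_a(100e-3));  /* -> 1000 A */
    printf("hall:  %.0f A\n", hall_current_a(0.9));      /* -> 1000 A */
    return 0;
}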
Both methods have inherent advantages and disadvantages:
• A shunt-based sensor must be inserted into the current path, which means that it affects the system it is measuring. In the past, shunt-based systems had relatively poor sensitivity, so they required a shunt with a high resistance value in order to capture current measurements with the necessary accuracy and resolution. The high resistance created a high insertion loss and an undesirable waste of power. Today’s shunt-based sensors, however, benefit from huge improvements in analog circuit design and fabrication, and can measure current accurately and precisely with a shunt of just 100 µΩ; they have roughly the same insertion loss as magnetic sensors. In these systems, the voltage drop across the resistor is amplified (preferably using an architecture that cancels offset) and digitized.
• In a magnetic current sensor, the current conductor is placed in close proximity to a Hall sensor element. This is accomplished either by integrating the current conductor and the Hall element into a single package, or by using a ring of ferrous metal around the conductor as a flux concentrator and integrating the sensor element into a gap in the ring. The magnetic flux is then either sensed directly by the Hall element, or a closed-loop architecture is used to compensate the total magnetic flux to zero; the correction voltage is then proportional to the current being measured. Since the magnetic system does not need to be inserted into the current path, it is intrinsically isolated and therefore easier to install. It suffers from non-linearity, however (since Hall elements are intrinsically non-linear), and is susceptible to interference from stray magnetic fields.
Table 1 shows that a shunt-based current sensor has the advantage over a magnetic sensor on most parameters of interest to the system manufacturer.
Table 1: Comparison of the main characteristics of the two current sensing methods
Extremely accurate battery voltage and current measurements in EVs
The most challenging aspect of the design of an automotive battery sensor is the need for very precise measurement over a very wide current range, from 1 mA to 1 kA. This requires a sensor interface with a measurement range of more than 100 mV and a resolution of better than 1 µV.
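These figures follow directly from Ohm’s law, assuming the 100 µΩ shunt described later in this article:

V = I × R
1 kA × 100 µΩ = 100 mV (full-scale signal)
1 mA × 100 µΩ = 0.1 µV (smallest signal)

The top of the current range thus sets the >100 mV measurement range, while the bottom of the range pushes the required voltage resolution down to the sub-microvolt level.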
The key attributes of such a measurement system are:
• Very low noise
• High linearity
• Zero offset
The AS8510, a highly integrated sensor interface from ams, offers these characteristics. Featuring best-in-class accuracy in current and voltage measurement, the device offers two independent data acquisition channels which can simultaneously measure current and voltage signals of both polarities, with no offset.
In a lithium-ion battery, SOC data are obtained through the accurate measurement of the current flowing in and out of the battery over time, with the help of calibration cycles based on similarly accurate measurements of no-load battery voltage, together with the battery’s temperature. Accurate SOC data capture requires precise current measurement over the entire signal and temperature ranges as well as an exact time base (provided by an external quartz-based clock). SOC data inform the driver’s ‘remaining range’ indicator, and also enable the BMS to prevent damaging over-discharge events.
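A minimal sketch of such a coulomb-counting scheme follows. The capacity figure, the rest-detection flag and the linear no-load-voltage map are illustrative assumptions, not the algorithm of any particular BMS.

/* Coulomb-counting SOC estimate with no-load-voltage recalibration.
   All constants and the OCV map below are illustrative assumptions. */

#define CAPACITY_MAH 60000.0    /* assumed pack capacity: 60 Ah */

/* Toy linear open-circuit-voltage map; real packs use measured tables. */
static double soc_from_ocv_pct(double pack_v)
{
    double pct = (pack_v - 300.0) / (400.0 - 300.0) * 100.0;
    return pct < 0.0 ? 0.0 : (pct > 100.0 ? 100.0 : pct);
}

/* Called at a fixed rate derived from the precise quartz time base. */
double soc_update(double soc_pct, double i_ma, double dt_s,
                  int at_rest, double pack_v)
{
    /* Integrate current over time; sign convention: + is charging. */
    soc_pct += (i_ma * dt_s / 3600.0) / CAPACITY_MAH * 100.0;

    /* Recalibrate against the no-load voltage once the pack has rested. */
    if (at_rest)
        soc_pct = soc_from_ocv_pct(pack_v);

    if (soc_pct > 100.0) soc_pct = 100.0;
    if (soc_pct < 0.0)   soc_pct = 0.0;
    return soc_pct;
}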
Current is measured through a 100 µΩ Manganin shunt, an extremely precise and stable resistor. Thanks to features such as highly linear 16-bit sigma-delta ADCs, a zero-offset architecture and an ADC reference which is temperature-trimmed in the factory, the AS8510 can offer typical accuracy of 0.2% over the full automotive temperature range, input range and lifetime. What is more, the accuracy of the reference and gain over the device’s lifetime has been certified as part of the device’s AEC-Q100 qualification.
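As a rough illustration of that signal chain, converting a raw ADC code into a current reading might look like the sketch below. The reference voltage, gain and code format are assumptions chosen for illustration, not AS8510 register or datasheet values.

/* Sketch: raw sigma-delta ADC code -> amplifier output -> shunt
   voltage -> current. VREF, GAIN and the signed 16-bit code format
   are illustrative assumptions, not AS8510 datasheet values. */
#include <stdint.h>

#define VREF_V      1.2      /* assumed ADC reference voltage    */
#define GAIN        40.0     /* assumed front-end amplifier gain */
#define R_SHUNT_OHM 100e-6   /* the 100 µΩ Manganin shunt        */

double current_from_code_a(int16_t code)
{
    /* Signed full-scale code maps to ±VREF at the amplifier output. */
    double v_amp   = (double)code / 32768.0 * VREF_V;
    double v_shunt = v_amp / GAIN;     /* undo the amplifier gain    */
    return v_shunt / R_SHUNT_OHM;      /* Ohm's law: I = V / R       */
}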
Voltage is another key parameter that a BMS must measure. Accurate voltage measurement is required, in particular, for safe and efficient charging, and for cell balancing in Li-ion batteries. In the AS8510, the externally attenuated battery voltage is directly digitized, either simultaneously with the current measurement, or at a different sample rate if so configured. As with the current measurement function, voltage measurement accuracy of 0.2% is possible on the voltage channel (when software correction of the ADC reference’s drift over temperature is implemented).
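The software correction mentioned above can be as simple as rescaling the reference by a characterized temperature coefficient, as in the hypothetical sketch below; the divider ratio and drift figure are placeholders, not characterized AS8510 values.

/* Sketch: battery-voltage channel with external attenuation and a
   software correction for ADC-reference drift over temperature.
   The divider ratio and drift coefficient are illustrative assumptions. */
#include <stdint.h>

#define DIVIDER_RATIO  401.0    /* assumed external attenuator ratio  */
#define VREF_NOM_V     1.2      /* assumed nominal ADC reference      */
#define VREF_TC_PER_C  20e-6    /* assumed reference drift, 20 ppm/°C */

double battery_voltage_v(int16_t code, double temp_c)
{
    /* Correct the reference for its characterized drift from 25 °C. */
    double vref  = VREF_NOM_V * (1.0 + VREF_TC_PER_C * (temp_c - 25.0));
    double v_adc = (double)code / 32768.0 * vref;
    return v_adc * DIVIDER_RATIO;   /* undo the external attenuation */
}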
Fig. 1: A BMW i3 at a charging station in Paris, France. The i3 includes an AS8510 sensor interface in the BMS of its Li-ion traction power unit
Offering better accuracy, stability and lifetime than any other battery measurement IC, the AS8510 is set to enable EV and HEV manufacturers to meet market demand for vehicles that more closely match drivers’ expectations for range between charges, while lowering the lifetime cost of owning the vehicle.
Performance of the AS8510 on the road
The BMW i3 is an example of an EV in volume production today with an AS8510 in its battery sensor. The BMS monitors the voltage and current of the 400 V Li-ion battery powering the i3’s electric motors, and ensures the functional safety of the vehicle’s battery systems. When backed by a special calibration scheme, the sensor system in the BMS is able to measure current to an accuracy of ±0.5% and voltage to an accuracy of better than ±0.1% over the full operating temperature range of the AS8510 (-40°C to +125°C) and over its lifetime.