Calculating Battery Life in IoT Applications
The internet has dramatically changed the way we design most electronic systems, with everything from signage on bus stops to complex industrial systems now using connectivity as a key part of their functionality. Perhaps the biggest change, however, is the introduction of sensor systems that collect data and pass the information to the cloud. These range from temperature monitoring and heating control in homes to location-tracking systems for logistics companies.
Unlike many larger connected systems, these small “things” often do not have access to mains power. This means that they must have a means of powering themselves, something that is achieved using either batteries or energy harvesting.
For many applications, energy harvesting offers the most promising solution. Because the energy required by the system is taken from the environment, using technologies ranging from solar panels to devices that capture the energy of movement or even the push of a switch, energy harvesting offers the prospect of indefinite operation, provided the device can be designed to use less power than the harvester can supply.
Although an increasing number of applications can now be developed at the ultra-low power levels required for energy harvesting, many more are not suitable for this approach. Perhaps the power required for processing data on the device is too high, the needs of the communication technology are too demanding, or there simply isn't a good source of energy to harvest. In these cases, batteries are needed to power the system.
Unlike energy-harvesting products, which can run indefinitely as long as they demand less power than the environment supplies, battery-powered devices will need their batteries replaced at some point. With the cost of replacing batteries often higher than the cost of the IoT device itself, calculating battery lifetime is critical.
Factors Affecting Battery Life of IoT Devices
The battery life of an IoT device is determined by a simple calculation: the battery capacity divided by the average rate of discharge. Minimising the energy used by the device, or increasing the battery capacity, will increase the lifetime of the battery and reduce the total cost of ownership of the product.
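As a minimal worked example of this formula (the capacity and average current below are illustrative assumptions, not values for any particular design), in Python:

    # Minimal sketch: battery life = capacity / average discharge current.
    # Both values below are illustrative assumptions.
    capacity_mah = 210.0          # nominal capacity of a typical coin cell
    average_current_ma = 0.005    # 5 uA average drain, assumed for illustration

    lifetime_hours = capacity_mah / average_current_ma
    lifetime_years = lifetime_hours / (24 * 365)
    print(f"Estimated lifetime: {lifetime_years:.1f} years")   # roughly 4.8 years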
As the battery is often the physically largest component in an IoT sensor system, engineers may have limited freedom in choosing which one to use. With a wide range of processors, communications technologies and software algorithms available, however, the system can be designed to achieve the required lifetime. IoT sensors are often designed to operate for their entire life on the original battery, as the labour cost of replacement is so high.
An IoT Battery Life Calculator
With battery life such a critical part of IoT design, we have developed a calculator that lets you estimate the battery life of your IoT system quickly and easily. It allows you to enter parameters for your processor, communications device, sensor and battery, as well as defining how your software operates, and then estimates the battery life of your design.
This valuable tool provides a first-pass estimate to ensure your IoT design is feasible. It will also allow you to experiment with different approaches, showing the impact of changing processor, communications technology, battery or software algorithm.
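As a much-simplified sketch of the kind of estimate such a calculator produces, the snippet below combines assumed sleep, processing and transmission phases into an average current; every figure in it is an illustrative assumption rather than a value for any particular design.

    # Much-simplified battery life estimate built from assumed duty-cycle phases.
    # Every figure below is an illustrative assumption.
    CAPACITY_MAH = 210.0                 # nominal coin-cell capacity

    # phase name: (current in mA, time in seconds per one-minute cycle)
    PHASES = {
        "sleep":    (0.003, 59.94),      # standby for most of each minute
        "process":  (3.0,    0.05),      # CPU and sensor active
        "transmit": (9.4,    0.01),      # radio on
    }

    cycle_s = sum(t for _, t in PHASES.values())
    charge_mas = sum(i * t for i, t in PHASES.values())        # mA*s per cycle
    average_ma = charge_mas / cycle_s

    lifetime_years = CAPACITY_MAH / average_ma / (24 * 365)
    print(f"Average current: {average_ma * 1000:.1f} uA")      # ~7.1 uA
    print(f"Estimated life:  {lifetime_years:.1f} years")      # ~3.4 years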
The IoT Battery Life Calculator ensures that you don’t waste time on products that cannot achieve acceptable battery life. This article explains the operation of the calculator, and discusses how even more accurate calculations can be made, highlighting where the calculator does not exactly reflect the real world.
IoT Processor Sleep Modes
Processors designed for IoT applications offer a variety of ultra-low-power sleep modes. The intention is that the processor stays in one of these modes for the vast majority of the time, waking for only a short period to gather or process data, or to transmit information to the network.
Consider the TI CC2650MODA. Figure 1 shows the current consumed when operating in different states. The power consumption varies by six orders of magnitude from shutdown to active operation.

Figure 1 - Core Power Consumption of TI CC2650MODA
Unless sampling of the data is very infrequent, shutting down the processor offers few advantages. Additional circuitry and code will be needed to restart, adding to cost and complexity. Furthermore, the standby modes consume less than 3µA, a level that would take at least eight years to discharge the battery: longer than the lifetime of many IoT devices, and as long as the shelf life of a CR2032 battery. There is therefore usually little benefit in shutting down the processor completely.
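The eight-year figure is easy to check, assuming a nominal 210mAh coin cell (the exact capacity depends on the cell chosen):

    # Sanity check: time for a 3 uA standby current alone to drain the cell.
    # The 210 mAh capacity is an assumed nominal value.
    capacity_mah = 210.0
    standby_current_ma = 0.003     # 3 uA

    years = capacity_mah / standby_current_ma / (24 * 365)
    print(f"{years:.1f} years on standby current alone")   # ~8.0 years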
Selecting the appropriate standby mode can be important. The lowest-power standby mode consumes around one third of the current of the highest-power option, but critically very little of the processor state is retained. Although some IoT applications will need to select the lowest-power sleep modes, many will choose to preserve the cache to minimise the number of cycles required to perform the processing needed in active mode.
Processing in active mode is a trade-off. Figure 1 shows that power consumption increases linearly with clock frequency, as expected for the CMOS technology used in IoT processors like this one. Faster clock speeds might therefore seem to mean shorter battery life, but because there is a "base" current of 1.45mA drawn regardless of frequency, running the same algorithm at a higher clock speed finishes sooner and allows an earlier return to standby. Slowing the clock can therefore be a false economy that actually reduces battery life.
The calculation, however, is not quite this simple: there is also a finite wake-up time when switching from one mode to another. Moving from standby to active, the CC2650MODA takes 151µs, which at the maximum clock frequency of 48MHz means that power is burnt for more than 7000 clock cycles while the processor wakes. For applications where only a small amount of code runs on each wake, this fixed overhead dominates, and slowing the clock to trade a longer execution time for lower power during wake-up is likely to extend battery life. Equally, minimising the number of wakes and performing as many tasks as possible before returning to standby can also increase battery life.
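The trade-off can be explored with a rough model. The sketch below assumes the 1.45mA base current and 151µs wake-up time quoted above, a frequency-dependent term of roughly 31µA per MHz (consistent with the 318µA-for-10MHz figure later in this article), and a hypothetical workload of 20,000 cycles per wake; real numbers should come from the datasheet of your chosen device.

    # Rough model of the charge drawn per wake at different clock frequencies.
    # All parameters are assumptions for illustration, not datasheet guarantees.
    BASE_CURRENT_MA = 1.45        # frequency-independent active current
    MA_PER_MHZ = 0.031            # assumed frequency-dependent current
    WAKE_TIME_S = 151e-6          # standby-to-active transition time
    TASK_CYCLES = 20_000          # hypothetical workload per wake

    def charge_per_wake_uc(freq_mhz):
        """Charge in microcoulombs drawn during one wake at freq_mhz."""
        active_current_ma = BASE_CURRENT_MA + MA_PER_MHZ * freq_mhz
        run_time_s = TASK_CYCLES / (freq_mhz * 1e6)
        return active_current_ma * (WAKE_TIME_S + run_time_s) * 1000   # mA*s -> uC

    for freq in (12, 24, 48):
        print(f"{freq} MHz: {charge_per_wake_uc(freq):.2f} uC per wake")

With these assumed figures, the fastest clock actually draws the least charge per wake despite its higher instantaneous current; shrink the workload far enough and the fixed wake-up time dominates, shifting the balance back towards slower clocks.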
Understanding how to balance clock speed, number of wakes and run time is complex, but the IoT battery life calculator offers an ideal way to experiment with different scenarios.
Peripheral Power Consumption in IoT Applications
Modern IoT devices are complex products that integrate many peripherals, allowing a single chip to meet a wide range of requirements. Frequently, however, IoT devices – particularly simple sensors – don't need all of this functionality, and it is therefore important to turn off the unused peripherals.

Figure 2 - TI CC2650 MODA Block Diagram
Figure 3 gives the power consumption of the peripherals available on the TI CC2650MODA family. Although each peripheral only consumes of the order of tens or low hundreds of µA, disabling the ones you don't use adds up: if no serial connectivity is required, a total of 318µA can be saved. This may not sound like much, but it has a real impact on battery life; the same current budget could instead be used to increase the clock speed by 10MHz, reducing the time spent in battery-draining active mode.

Figure 3 - TI CC2650 Peripheral Power Consumption
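A quick calculation puts the 318µA saving into context; it assumes a nominal 210mAh cell and an illustrative 10µA average drain for the rest of the design.

    # Effect on battery life of leaving 318 uA of unused peripherals enabled.
    # The cell capacity and baseline average current are illustrative assumptions.
    capacity_mah = 210.0
    baseline_avg_ma = 0.010     # assumed average drain of the rest of the design
    peripheral_ma = 0.318       # unused serial peripherals left powered

    life_off = capacity_mah / baseline_avg_ma / (24 * 365)
    life_on = capacity_mah / (baseline_avg_ma + peripheral_ma) / (24 * 365)
    print(f"Peripherals disabled: {life_off:.2f} years")   # ~2.40 years
    print(f"Peripherals enabled:  {life_on:.2f} years")    # ~0.07 years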
IoT Communications Technologies
The choice of communications technology is usually determined by the system requirements. For battery-powered IoT systems, this almost always means using an RF link: a wired connection would throw away the main benefit of battery power, namely the elimination of cables.
For wireless communications, increased range or higher data rate will typically demand higher power consumption, and therefore the lowest-power communications technology that will meet these demands is often the obvious choice. Powering the system using batteries will make some technologies impractical: for example, a CR2032 will not have sufficient capacity to support a 3G modem, although larger batteries and developments in cellular technology for IoT are having an impact.
For IoT sensors there are several popular technologies. LoRa, for example, offers the capability to build a low-power, long-range WAN spanning several kilometres, while Bluetooth Low Energy (BLE) only communicates over short distances but consumes significantly less current.
Another decision that must be made is whether to use an on-chip device, or to select a separate chip to handle communications. Typically, on-chip offers lower overall power consumption, although sometimes it’s not possible to find an integrated solution and therefore a separate device is the only option.
Managing the communications interface is critical, as even low-power communications technologies will drain a battery very quickly, and the current required by the radio is often higher than that drawn by the processor. Take the TI CC2650MODA, which needs 9.4mA to power the transmit circuitry that supports BLE and IEEE 802.15.4, both very low-power communications standards: this current is three times that drawn by the CPU when running at maximum frequency.
To maximise the battery capacity devoted to communications, many IoT systems will perform some pre-processing and collation of data, only waking the communications circuits when they have sufficient data to make transmission worthwhile. Analysing the impact of aggregating data to reduce the frequency of transmission is easy using the IoT battery life calculator.
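The effect of aggregation can be sketched with a simple duty-cycle model. It uses the 9.4mA radio current quoted above, together with an assumed 5ms of radio-on time per transmission and a 3µA standby floor; all other consumption is ignored for simplicity.

    # Average current for different transmission intervals, ignoring everything
    # except the radio and standby current. On-air time and standby current
    # are assumptions; 9.4 mA is the transmit current quoted in the text.
    RADIO_CURRENT_MA = 9.4
    RADIO_ON_TIME_S = 0.005
    STANDBY_CURRENT_MA = 0.003

    def average_current_ma(tx_interval_s):
        """Duty-cycle average current for one transmission every tx_interval_s."""
        radio_charge = RADIO_CURRENT_MA * RADIO_ON_TIME_S                    # mA*s
        standby_charge = STANDBY_CURRENT_MA * (tx_interval_s - RADIO_ON_TIME_S)
        return (radio_charge + standby_charge) / tx_interval_s

    for interval in (1, 60, 600):   # every second, every minute, every ten minutes
        print(f"TX every {interval:>4} s: {average_current_ma(interval) * 1000:.1f} uA")

In this model, transmitting once a minute instead of once a second cuts the average current by more than a factor of ten.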
Selecting an IoT Sensor to Maximise Battery Life
With many IoT devices designed primarily to capture environmental data, sensors can have a significant impact on the battery life of an IoT system. Choosing the right technology and mode of operation are critical decisions.
Take a temperature sensor, for example. An RTD (resistance temperature detector) such as the Honeywell HEL-777, or a thermistor such as the Honeywell 135-104LAF-J01, changes resistance with temperature. A simple application, where accuracy isn't important, might use a voltage divider, but high-precision systems would need a current source, which requires more power.
For many applications, integrated temperature sensors such as the TI LM35DZ are a good solution: this device is accurate to ±¼°C at room temperature and draws only 60µA.
Whatever sensor is chosen, it is critical that it only draws power when it is being used. Powering the sensor when the processor is not taking measurements wastes battery capacity: even the low-power LM35DZ draws around 30 times the current of the CC2650MODA processor in standby mode.
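A rough comparison shows why power gating matters, using the 60µA LM35DZ figure with an assumed 10ms power-up-and-measure window and one reading per minute:

    # Charge drawn by the sensor per reading, powered continuously vs. gated.
    # The 10 ms measurement window and one-minute interval are assumptions.
    SENSOR_CURRENT_MA = 0.060       # LM35DZ quiescent current
    MEASUREMENT_TIME_S = 0.010      # assumed power-up plus conversion window
    INTERVAL_S = 60.0               # one reading per minute

    always_on_uc = SENSOR_CURRENT_MA * INTERVAL_S * 1000           # mA*s -> uC
    gated_uc = SENSOR_CURRENT_MA * MEASUREMENT_TIME_S * 1000
    print(f"Always powered: {always_on_uc:.1f} uC per reading")    # 3600.0 uC
    print(f"Power gated:    {gated_uc:.1f} uC per reading")        # 0.6 uC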
Understanding Battery Technologies for IoT
Several different batteries are popular for IoT applications. Increasingly the CR2032 “coin cell” is the product of choice because it offers a compact form-factor with sufficient capacity to allow IoT products to operate for years.
The first thing you notice about batteries is the limited data that is available for many of them. Other than the physical dimensions and output voltage, often the only other parameter specified is capacity. The battery capacity is obviously critical, as this determines the total energy available for your IoT device.
Battery quality has a significant impact on capacity. Simply specifying a CR2032 cell risks buying a cheaper device with much lower capacity, reducing the battery life of your IoT device and storing up expensive battery-replacement costs for the future. There may also be batteries with different chemistries available in the form factor you have chosen: using a different chemistry can have a dramatic impact on battery life.
The Farnell element14 IoT calculator offers a choice of two CR2032 batteries, both of which use lithium manganese dioxide chemistry. One of them, however, is specified to offer around 10% less capacity than the other, although for many applications this may be justified by the fact that it is available for less than half the price of the higher-capacity cell.
Battery Specifications are Approximations
With the brief data sheets supplied for many batteries, it is tempting to assume batteries are very simple devices. The IoT calculator takes a similar approach, and assumes that the capacity of the battery is fixed, but in practice this is simply not true: consider the Multicomp CR2032. Figure 4 shows how capacity changes with load and temperature.

Figure 4 - Capacity of Multicomp CR2032 Battery
The first thing to note is that the quoted capacity of 210mAh is based on optimum conditions. If the load demands more current, the effective capacity is reduced dramatically. More importantly for some applications, for example temperature tracking of refrigerated items, the capacity of the battery also drops considerably as the temperature falls.
IoT applications draw current in pulses: the processor and sensor might draw several mA for a short burst and then switch into a low-power mode for a long period. Drawing current in pulses causes the battery's output voltage to drop. Figure 5 shows that even a 2mA pulsed load will cause the output of a CR2032 to fall from 3V to around 2.2V.

Figure 5 - Pulse Discharge of Multicomp CR2032 Battery
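One way to fold these effects into a first-pass estimate is to apply simple derating factors to the nominal capacity. The factors below are placeholders rather than values taken from the Multicomp datasheet, and should be replaced with figures read from curves like those in Figures 4 and 5.

    # First-pass derating of nominal capacity for load and temperature.
    # Both derating factors are illustrative placeholders, not datasheet values.
    nominal_capacity_mah = 210.0
    load_derating = 0.85           # heavier pulsed load than the rated condition
    temperature_derating = 0.70    # operation well below room temperature

    usable_capacity_mah = nominal_capacity_mah * load_derating * temperature_derating
    print(f"Usable capacity: {usable_capacity_mah:.0f} mAh")   # ~125 mAh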
Battery Shelf Life and IoT Applications
Battery shelf life is often ignored by engineers: after all, it refers to storing rather than using the battery, doesn't it? IoT applications, however, often need to operate for years from a single battery, making shelf life a critical factor.
Two specifications determine the life of a battery when not being used: the shelf life and the self-discharge. Where specified, self-discharge of a CR2032 battery is typically only 1-2% per year, but this does not mean that a battery has a shelf life of 50 to 100 years: in fact, most batteries offer a quoted shelf life of only seven or eight years. This apparent paradox is due to the non-linear behaviour of the battery chemistry.
The IoT battery life calculator makes a sweeping assumption: it simply assumes that the battery leaks its entire charge linearly over the quoted shelf life. Reality is different, but this makes little difference to the result if the device being analysed has a battery life of less than a couple of years, and it ensures the calculator produces a conservative estimate if the product is designed to operate for periods approaching the battery's shelf life.
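The calculator's assumption can be written down explicitly. The sketch below applies the same linear-leak model, assuming a 210mAh cell, an eight-year quoted shelf life and an illustrative 5µA average load.

    # Linear self-discharge model: the cell is assumed to leak its whole
    # capacity evenly over its quoted shelf life. All values are assumptions.
    capacity_mah = 210.0
    shelf_life_years = 8.0
    load_current_ma = 0.005                    # 5 uA average load from the design

    leak_current_ma = capacity_mah / (shelf_life_years * 365 * 24)
    lifetime_years = capacity_mah / (load_current_ma + leak_current_ma) / (24 * 365)
    print(f"Equivalent leak current: {leak_current_ma * 1000:.1f} uA")   # ~3.0 uA
    print(f"Estimated battery life:  {lifetime_years:.1f} years")        # ~3.0 years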
Conclusion: Maximising Battery Life in IoT Applications
Developing an IoT device that can operate from a battery requires careful engineering. Although the choice of components is important, bad design decisions can swamp the benefit of a lower-power processor. The key to achieving good battery life is to ensure the processor is in a low-power standby mode as much as possible, and that the use of wireless communications is minimised.
With many factors impacting the battery life, it can be a complex task to estimate battery life for a design, and very time consuming to compare different approaches. Although any calculator necessarily makes approximations, the IoT battery life calculator offers an easy way to test hypotheses and understand the impact of choosing different algorithms, components or communications technologies.
Published 4th May 2017 by Farnell element14