The paper explains how the Bluetooth Mesh standard came about to address the problem of numerous BLE meshing solutions that were not interoperable. It includes a great introduction to Bluetooth LE and Mesh, with some statistical and experimental insights into mesh performance.
The authors explain how the choice of advertising at a 100% duty cycle, to achieve lower end-to-end delay, has degraded the low energy advantage of BLE advertising, thus limiting its usefulness in power (battery) sensitive applications.
The paper contains some useful insights:
The backoff mechanism, used to decrease the chance of mesh network collisions, contributes most to the communication delay. However, as the authors identify, it’s this mechanism that provides reliability and scalability in larger networks. Disabling the backoff mechanism decreases the delay but makes the network less scalable and robust (a minimal sketch of this trade-off follows these insights).
Making the network more dense has a positive effect on the round trip time (RTT). However, too dense a network leads to more collisions.
Increasing the number of hops needed, by making the network more sparse, has a negative effect on the RTT.
“It is clear that there are a lot of factors influencing the communication flows within a Bluetooth Mesh network, requiring more advanced management mechanism for optimizing the performance of the mesh network.”
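To make the trade-off concrete, here is a minimal sketch of a randomised backoff delay of the kind described. It is a generic illustration only, not the exact mechanism used in the paper or in the Bluetooth Mesh specification, and the slot length and slot count are assumed values:

```c
/* Generic illustration of a randomised backoff before a node relays a
 * packet. Not the exact mechanism from the paper or the Bluetooth Mesh
 * specification; SLOT_MS and MAX_SLOTS are assumed values. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define SLOT_MS   5   /* assumed length of one backoff slot (ms)  */
#define MAX_SLOTS 4   /* assumed maximum number of random slots   */

/* Wait a random number of slots before transmitting so that neighbouring
 * relays, which all received the same packet, are unlikely to transmit
 * at the same instant and collide. The cost is added end-to-end delay. */
static uint32_t backoff_delay_ms(void)
{
    return (uint32_t)(rand() % (MAX_SLOTS + 1)) * SLOT_MS;
}

int main(void)
{
    srand((unsigned)time(NULL));
    /* With backoff disabled the delay would always be 0 ms: lower latency
     * but a much higher chance of collisions as the network grows. */
    printf("Relay after %u ms\n", (unsigned)backoff_delay_ms());
    return 0;
}
```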
However, the research had some limitations. Noise was simulated by introducing non-mesh beacons advertising every 20ms. This wasn’t very realistic given that most beacons advertise in the range 100ms to 1000ms. Re-transmit time was considered, which complicated the calculations, especially as re-transmission is application specific. It wasn’t mentioned that in many mesh sensing applications unacknowledged messages are acceptable, such that there’s no re-transmit. Also, the effect of other mesh network traffic on the round trip time wasn’t considered – only one mesh transmission at a time was considered.
Connected factory implementations require a large number of connected assets for condition-based monitoring, asset tracking, inventory (stock) management or building automation. Bluetooth is a secure, low cost, low power and reliable solution suitable for use in connected factories. In this post, we examine the reasoning behind some out-of-date thinking on industrial wireless, uncover the real problems in factories and explain how Bluetooth overcomes these challenges.
Operations teams are usually very sceptical about industrial wireless. They have usually tried proprietary industrial wireless solutions with mixed results. They might have experienced how connections, such as WiFi, can become unreliable in the more electrically noisy areas of factories. The usual approach is to use cable. However, this can become expensive and time consuming. Using cable isn’t possible when assets are moving and becomes impractical when the number of connected items becomes large, as in the case of connected factories. As we shall explain, Bluetooth is intrinsically more reliable than WiFi even though they share the same 2.4GHz frequency band.
There’s usually lots of electrical noise in an industrial environment that tends to be one of two types:
Electromagnetic radiation emitted by equipment. This typically includes engines, charging devices, frequency converters, power converters and welding. It also includes transmissions from other radio equipment such as DECT phones and mobile telephones.
Multipath propagation, where radio signals reflect off surfaces, usually metallic ones, and are received again slightly later.
It’s important to understand how Bluetooth and other competing technologies react to these types of interference. There’s a useful study (pdf) by Linköping University, the Swedish Defence Research Agency (FOI) and the University of Gävle on noise in industrial environments.
Noise in industrial environments tends to have the following spectral pattern:
There’s usually lots of electrical noise up to about 500MHz. This means wireless communication using lower frequencies, such as two way radio, experiences a lot of noise. Pertinently, several wireless solutions for industrial applications use frequencies in the 30–80 MHz and 400–450 MHz bands. Bluetooth’s and WiFi’s 2.4GHz frequency is well above 500MHz and so exhibits better reliability than some industrial wireless solutions. Incidentally, in the study’s charts, the peaks around 900 MHz and 1800 MHz come from mobile phone signals and those at 1880–1890 MHz from DECT phones.
The magnitude of multipath propagation depends on the environment. It’s greatest in buildings with highly reflective, usually metallic, floors, walls and roofs. If you imagine a radio signal being received and then received again nanoseconds later, you can see how both the amplitude (peaks) and the phase of the received signal become distorted over time. Bluetooth uses Adaptive Frequency Hopping (AFH), which means that packets transferred consecutively in time do not use the same frequency. Each packet therefore acts like a single narrowband transmission and there’s less effect of one packet on the next. What is more affected is the amplitude, which manifests itself as the received signal strength indicator (RSSI). RSSI is often used by Bluetooth applications to infer the distance from sender to receiver. We will mention mitigations for varying RSSI later.
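As an aside on how RSSI is commonly turned into a distance, here is a minimal sketch of the log-distance path loss model. The calibrated power at 1m and the path loss exponent used below are assumptions that need measuring per site, and the multipath effects just described are exactly what make the estimate fluctuate:

```c
/* Minimal sketch of the log-distance path loss model often used to infer
 * distance from RSSI. tx_power (expected RSSI at 1 m) and the path loss
 * exponent n are assumptions that need calibrating per site. */
#include <math.h>
#include <stdio.h>

/* rssi     - received signal strength in dBm
 * tx_power - expected RSSI at 1 m, often advertised by the beacon
 * n        - path loss exponent: ~2.0 free space, typically 2.5-4.0 indoors */
static double distance_from_rssi(double rssi, double tx_power, double n)
{
    return pow(10.0, (tx_power - rssi) / (10.0 * n));
}

int main(void)
{
    /* Example: beacon calibrated to -59 dBm at 1 m, reading of -75 dBm,
     * assumed indoor path loss exponent of 2.5 gives roughly 4.4 m. */
    printf("Estimated distance: %.1f m\n", distance_from_rssi(-75.0, -59.0, 2.5));
    return 0;
}
```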
It’s important to consider what happens when there is electrical noise. It turns out that techniques invented to ensure reliable transmission behave much less well in noisy situations. One such technique is carrier sense multiple access (CSMA), used by WLAN (WiFi), which listens to the channel before transmitting and waits until a free channel is observed. CSMA and automatic repeat request (ARQ) also have re-transmission mechanisms. The retrying can incur significant extra traffic that can overwhelm communication in noisy environments. Bluetooth doesn’t use such schemes.
The previously mentioned research classifies different wireless technologies according to the delay when used in noisy environments:
Bluetooth (and WISA) is a good choice for noisier environments. It’s particularly suited for applications with lower data rates and sending at periodic intervals.
A final consideration is interference between Bluetooth and other technologies, such as WiFi, that use similar 2.4GHz frequencies. As mentioned in a previous post, there’s negligible overlap between Bluetooth and WiFi channel frequencies.
In summary, Bluetooth is more suited to electrically noisy environments and also offers low cost, low power and secure wireless communication.
These conclusions correlate well with our own empirical observations. We have found that Bluetooth advertising is still received in environments we have measured, using an RF spectrum analyser, to be electrically noisy around 2.4GHz. We believe this is because Bluetooth advertising hops across three frequencies, so there’s less likelihood of noise on all three at once. If, say, each advertising channel has a 30% chance of being blocked at a given instant, the chance of all three being blocked is only about 3%. Advertising is also very short, typically taking 1 or 2 ms, making the coincidence of the noise and the advertising less likely than would be the case for a longer transmission.
It has been our experience that solutions using Bluetooth advertising are more reliable than those using Bluetooth (GATT) connections, especially in noisy environments where it’s difficult to maintain the chatty protocol of a connection over a long time period. In noisy situations, if the first advert is lost it’s usually picked up on a subsequent transmit/scan. By coincidence or design, Bluetooth Mesh is built on communication via advertising rather than connection and for this reason is also reliable on the factory floor.
However, using Bluetooth isn’t a silver bullet. There are situations where factories, or more usually parts of factories, have reflective surfaces or unusual radio frequency (RF) characteristics stretching into unforeseen frequencies. Poorer performing WiFi also needs to be considered in the context of choosing between Ethernet gateways, WiFi gateways and Bluetooth mesh.
It’s important to do a site survey including RF spectral analysis. This will uncover nuances of particular critical locations or coverage that can drive subsequent hardware planning. It can also feed into requirements for software processing, for example particular settings for processing within a real time locating system (RTLS) to cater for more varying RSSI.
The traditional way of tracking assets using barcodes, NFC or RFID requires that someone or something scan the items at very close range. Bluetooth has the advantage that it works up to 70m, sometimes up to 300m, allowing the reading to be done:
Without moving the items, saving infrastructure such as conveyor belts
Without human involvement, saving time
Continuously
The effect of ‘continuously’ is subtle but powerful. With traditional scanning, information as to the whereabouts of an item is only as good as the last scan, which could have been minutes, hours or even days ago. If the item moves without being scanned, finding it can be very difficult. Bluetooth asset tracking is updated continuously.
Although beacons cost more than barcodes, NFC and RFID, the readers, usually gateways, cost considerably (roughly 10x) less. As the beacons are Bluetooth, for some scenarios the readers are ‘free’ because you can use smartphones already in use. Nevertheless, beacons ($5 to $40) cost more than barcodes, so they tend to be used on aggregated items such as pallets and sub-assemblies or on single valuable items.
Beacons go beyond simple asset tracking into the Internet of Things (IoT). The same beacons can monitor quantities such as vibration, temperature, humidity, light, proximity, smoke and gas. Using beacons for extra purposes such as sensing and providing triggered information about assets can often be the most compelling aspect of using beacons.
When it comes to software, think carefully. Most people expect functionality similar to traditional barcode-based asset tracking, with software on a server somewhere. While the equivalent exists in the form of RTLS systems that put beacons onto maps and plans, it’s sometimes possible to implement a simpler solution to get the job done. Could your requirements be met with just an app? One such example is the work we did for Malvern Instruments: a simple app that does a stock check by scanning for beacons as the user moves about their site. Also, we have found that many organisations don’t actually need a full asset management solution but instead need something that can capture beacon data and make it available to their existing systems. Our BeaconServer™ fulfils that role.
Russ Sharer, Vice-President of Global Marketing for Fulham, a manufacturer of energy-efficient lighting sub-systems, has written an article in Health Estate Journal (pdf) on the use of iBeacons in healthcare.
Russ says it’s often difficult to find life-saving equipment in hospitals and many organisations have to compensate by purchasing more equipment than they need. However, in use, equipment still gets misplaced, usually just at the critical time it is needed. He explains how the use of Bluetooth beacons and mesh can solve this problem. The article provides a great introduction to iBeacons and some issues such as the effect of transmission frequency on battery life.
While the article mentions Bluetooth Mesh and iBeacons, these specific technologies don’t always have to be used. Gateways can be used instead of mesh to allow greater throughput of data. Also, any beacons, not just iBeacons, can be used as it’s usually the MAC address of the beacon that’s used for identification purposes. Using sensor beacons allows further scenarios, for example, monitoring the temperature of expensive medicines.
There are also many more scenarios for the use of beacons in healthcare than are mentioned in the article. Our beacons are being used to track hundreds of dementia patients. We have also been involved in a project to use beacons for navigation in large hospitals. Once there’s a network of beacons in a hospital, it’s possible to add lots of widely varying solutions.
In the past, before Bluetooth Low Energy (LE) was introduced in 2010, real time locating systems (RTLS) used costly Ultra Wide Band (UWB) devices or radio frequency identification (RFID).
Bluetooth LE is increasingly being used for RTLS due to:
the availability of much lower cost Bluetooth devices, a quarter or less of the price of UWB.
a much longer battery life than UWB, 5+ years for some devices.
a much longer range than RFID, up to 300m vs centimetres.
the ease of detecting beacons via apps and single board computers using existing operating system Bluetooth APIs.
The 4th Industrial Revolution (4IR), also known as Industry 4.0, is the use of technology to improve operational efficiency, increase throughput, minimise downtime, improve quality and lower costs. We have an article that explains how beacons are part of 4IR.
There’s a lot more to 4IR than tracking items and analysing data. It also includes areas such as automation, robotics, cyber security and 3D printing. There’s a free online Industry 4.0 Magazine that can help you get up to speed.
This is part 3 of a 3 part series that explains what’s inside a beacon. In this part we take a look at the System on a Chip (SoC) software and programming for the Nordic nRF range found in the majority of beacons.
Despite the small size and memory, a typical beacon contains lots of code written in the C programming language. The code required to implement Bluetooth, called the Bluetooth stack, is very complex. It also has to pass tests by the Bluetooth SIG, called qualification. To prevent every product vendor using the SoC having to write the Bluetooth part themselves, Nordic supply what’s called a SoftDevice. A SoftDevice is a precompiled and linked binary library implementing a wireless protocol, Bluetooth in our case.
For the nRF52, the S132 SoftDevice provides a qualified Bluetooth® 5 low energy (BLE) Central and Peripheral protocol stack solution. It supports up to eight connections, with an Observer and a Broadcaster role, all running concurrently. Use of a SoftDevice allows developers to concentrate on their own high level product functionality rather than lower level complexities.
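To give a feel for how little of the Bluetooth stack the application itself contains, here is a minimal sketch of enabling the SoftDevice from application code, following the pattern in Nordic’s nRF5 SDK examples. Function names and signatures vary between SDK and SoftDevice versions, so treat it as illustrative:

```c
#include "app_error.h"
#include "nrf_sdh.h"
#include "nrf_sdh_ble.h"

#define APP_BLE_CONN_CFG_TAG 1  /* tag identifying the BLE configuration used */

/* Request the SoftDevice and enable the BLE stack. After this the
 * application can configure advertising, GAP and GATT via sd_* calls. */
static void ble_stack_init(void)
{
    ret_code_t err_code = nrf_sdh_enable_request();
    APP_ERROR_CHECK(err_code);

    /* The SoftDevice reserves RAM below ram_start for its own use;
     * the helper fills in the value based on the default configuration. */
    uint32_t ram_start = 0;
    err_code = nrf_sdh_ble_default_cfg_set(APP_BLE_CONN_CFG_TAG, &ram_start);
    APP_ERROR_CHECK(err_code);

    err_code = nrf_sdh_ble_enable(&ram_start);
    APP_ERROR_CHECK(err_code);
}
```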
Beacon manufacturers or 3rd party developers such as ourselves create a program using either SEGGER Embedded Studio (SES), MDK-ARM Keil µVision, GNU/GCC or IAR Workbench. Most development now uses SEGGER Embedded Studio because Nordic have licensed it so that developers targeting Nordic devices can use it free of charge. Most Nordic code examples in the nRF52 SDK now include a SEGGER Embedded Studio project file.
There are two ways of programming: either pre-programming the SoC with production code before it is mounted, using socket programming, or programming the SoC in-circuit after mounting. The PCB holes mentioned in part 1 are used to program the beacon in the circuit. A jig with pogo pins (pins with springs) can be used to help when programming larger numbers of devices.
The other end plugs into an nRF52 DK, which has a debug out header at the top right.
If you keep the pins connected to your beacon, you can run and debug the code, in situ, via the SEGGER IDE. However, debugging is limited because it’s not possible to continue from breakpoints. You have to re-run, or rely on lots of logging to the console.
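The console logging referred to here is typically done with the SDK’s logger module. A minimal sketch, assuming the nRF5 SDK (module and macro names differ between SDK versions):

```c
#include "app_error.h"
#include "nrf_log.h"
#include "nrf_log_ctrl.h"
#include "nrf_log_default_backends.h"

/* Initialise logging so messages appear on the debug console. */
static void log_init(void)
{
    APP_ERROR_CHECK(NRF_LOG_INIT(NULL));   /* NULL = no timestamp function */
    NRF_LOG_DEFAULT_BACKENDS_INIT();       /* e.g. RTT via the debug probe */
}

/* Example of logging from application code. */
static void on_advertising_started(void)
{
    NRF_LOG_INFO("Advertising started");
    NRF_LOG_FLUSH();
}
```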
The nRF52 DK also contains an nRF52, which means it can be used in the initial stages of product development prior to moving to actual hardware.
This is part 2 of a 3 part series that explains what’s inside a beacon. In this part we take a look at the System on a Chip (SoC), in particular the Nordic nRF range, found in many beacons.
In part 1 we identified the Nordic nRF52832 SoC. The nRF52 is a newer version of the Nordic nRF51 that has been used in millions of beacons. The new version has more memory, uses less power and includes NFC. The extra memory is useful for applications such as Bluetooth Mesh.
The Nordic nRF52832 SoC wasn’t created just for beacons. It’s a general purpose device for any electronics that needs to have 2.4GHz wireless communications and software processing. The nRF51 and nRF52 series can be found in many fitness trackers and wearables. For example, the BBC micro:bit, the Polar GPS multisport watch and Garmin’s child activity monitor.
The SoC is a stand alone computer having an ARM® Cortex™-M4 CPU with a floating point unit. The NFC-A Tag can be used in pairing and payment solutions which makes it suited for use with smartphone apps. The SoC also has digital peripherals and interfaces such as PDM and I2S for digital microphones and audio.
It has very low power consumption via an on-chip adaptive power management system. It uses between 0.3 μA and 1.9 μA, depending on the mode, and can still respond to events. For beacons, it periodically wakes up for about 1ms, during which it uses about 5.3 mA (at 0 dBm power output).
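As a rough worked example of what those figures mean for battery life, here is a small calculation. The 1 second advertising interval and the 220 mAh CR2032 coin cell are assumptions for illustration, and the sums ignore radio output power settings and battery self-discharge:

```c
/* Rough beacon battery life estimate from the currents quoted above.
 * The advertising interval and battery capacity are assumed values. */
#include <stdio.h>

int main(void)
{
    const double sleep_ua    = 1.9;    /* sleep current, microamps          */
    const double active_ma   = 5.3;    /* current while awake, milliamps    */
    const double active_ms   = 1.0;    /* awake time per advert, ~1 ms      */
    const double interval_ms = 1000.0; /* assumed advertising interval, 1 s */
    const double battery_mah = 220.0;  /* assumed CR2032 capacity           */

    /* Average current = active share + sleep share, in microamps. */
    double duty   = active_ms / interval_ms;
    double avg_ua = duty * active_ma * 1000.0 + (1.0 - duty) * sleep_ua;

    double hours = (battery_mah * 1000.0) / avg_ua;   /* uAh / uA = hours */
    printf("Average current %.1f uA, battery life about %.1f years\n",
           avg_ua, hours / (24.0 * 365.0));
    return 0;
}
```

With these assumed numbers the average current works out at roughly 7 µA, giving a battery life of around three and a half years.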
The SoC supports ANT™, IEEE 802.15.4, Thread, and proprietary protocols operating in the 2.4 GHz bandwidth as well as Bluetooth®.
The marking on the chip denotes the variant, each having a different RAM and flash combination.
The image in part 1 shows the i7 beacon has the QFAA variant with 64 kB RAM and 512 kB flash. As with SSDs, the flash can only be erased and written a limited number of times. For the nRF52832 this is 10,000 erase/write cycles. This is irrelevant for most beacons as they save very little data, irregularly, usually only when settings are changed. However, for applications such as mesh, the number of erase/write cycles needs to be minimised to prevent the device wearing out in a short period of time. For example, a flash page erased and rewritten once an hour would reach 10,000 cycles in a little over a year.