BLE sounds simple on paper. A device broadcasts, a phone receives, you get a location. I've spent long enough building Bluetooth into JavaScript to know the paper lies.
The product I'm writing from is BikeMesh, a tracker that helps people find their bike in a rack of twenty. Built in React Native, on top of an NXP Bluetooth 5 chip whose stock factory firmware I reconfigured into a broadcast format that works identically on iOS and Android. It's the subject of the talk I'm giving at BelfastJS on 13 May.
This is the shape of what it took.
A quick vocabulary
Before anything else, four terms. Loose, not a test:
- BLE: Bluetooth Low Energy. The short-range radio in almost every battery-powered device you own: earbuds, fitness bands, car keys, smart bulbs.
- A BLE device can shout (broadcast into the air, no pairing) or chat (pair with a specific phone, exchange data back and forth). Different purposes, very different code.
- RSSI: the signal strength of a received packet. Always negative. Closer to zero means the device is closer to you. -50 dBm is in your hand; -90 is almost gone.
- iBeacon: a specific shape Apple designed for a Bluetooth shout. Fixed structure. Works identically on iOS and Android.
Hold those loosely. They all come back.
The hardware decision
The BikeMesh hardware is an NXP QN9090, a Bluetooth 5.0 chip on a development kit called the DK6. Out of the box, it ships speaking the Bluetooth Proximity Profile: a paired, session-based mode where a phone connects to the chip and exchanges commands. Beautiful for a factory demo. Wrong shape for a tracker.
Because what the product needs is different. A user walks up to a rack of twenty bikes and their phone should see all of them at once. Not pair to each one in turn. See all of them.
That means the chip has to stop chatting and start shouting. A day with NXP's MCUXpresso SDK, a few configuration changes, reflash the firmware over the debug probe, and the chip now broadcasts as an iBeacon ten times a second, shouting its identity into the air, no pairing required.
The decision I made here, before writing a line of JavaScript, is the one that quietly carried the rest of the system. It's worth lingering on: the hardest calls in BLE aren't made in your app. They're made in firmware, or in the advertising format you pick. The JavaScript side inherits the consequences.
Meeting JavaScript
On to the software.
JavaScript, of course, doesn't talk to a radio directly. A library does it on your behalf, written in native code (Swift on iOS, Java on Android, C on Node), exposing a clean JavaScript API that hides the radio-level ugliness.
In React Native, the library everyone uses is react-native-ble-plx. It gives you one function, startDeviceScan, and a callback that fires every time the phone's radio picks up a packet.
A few lines of code to start scanning:

```typescript
import { BleManager } from 'react-native-ble-plx';

const ble = new BleManager();

ble.startDeviceScan(null, null, (err, device) => {
  console.log(device?.id, device?.rssi);
});
```
You run it. You expect a flood: devices, signal strengths, a wall of text in the console.
You get nothing.
Or, on iOS, your app disappears. No error. No log. No crash dialog. The process is simply terminated.
The silent failure
This is the wall everyone meets on their first BLE project.
On iOS, the operating system requires a specific string in Info.plist, NSBluetoothAlwaysUsageDescription, explaining to the user why the app needs Bluetooth. Omit it, and iOS terminates your process the instant your code touches the radio. Not a warning. Not a dialog. Termination.
In Expo, you declare it in app.json:
```json
{
  "expo": {
    "ios": {
      "infoPlist": {
        "NSBluetoothAlwaysUsageDescription": "We use Bluetooth to find your bike."
      }
    }
  }
}
```
Android is more forgiving (it denies rather than crashes), but on Android 12 and above (API 31+), you need three separate runtime permissions, including one for location, even when you're not using GPS:
- BLUETOOTH_SCAN
- BLUETOOTH_CONNECT
- ACCESS_FINE_LOCATION
Miss any of them and your scan returns zero results, silently.
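The Android side can be declared in the same Expo app.json, next to the iOS string. A sketch of the shape (Expo may inject some of these automatically depending on SDK version, and on Android 12+ you still have to request the scan permissions at runtime before starting a scan):

```json
{
  "expo": {
    "android": {
      "permissions": [
        "android.permission.BLUETOOTH_SCAN",
        "android.permission.BLUETOOTH_CONNECT",
        "android.permission.ACCESS_FINE_LOCATION"
      ]
    }
  }
}
```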
Fix the configuration. Try again. Packets begin to flow.
The packets
This is the part people remember.
What you get back is not a nice object with a device name and a clean identifier. You get a blob of base64. You decode it, and what's left is raw bytes, up to thirty-one per packet. If you want to know what's inside, you parse them yourself.
This is the parser that turned BikeMesh from zero to working, in full:
```typescript
import { BleManager } from 'react-native-ble-plx';
import { Buffer } from 'buffer';

const ble = new BleManager();
const APPLE_COMPANY_ID = 0x004C;

export function startDiscovery(
  onFound: (d: { uuid: string; rssi: number }) => void
) {
  ble.startDeviceScan(null, { allowDuplicates: false }, (err, dev) => {
    if (err || !dev?.manufacturerData) return;
    const bytes = Buffer.from(dev.manufacturerData, 'base64');
    // Apple company ID (little-endian: bytes[0] = 0x4C, bytes[1] = 0x00)
    if (((bytes[1] << 8) | bytes[0]) !== APPLE_COMPANY_ID) return;
    // iBeacon subtype 0x02, data length 0x15
    if (bytes[2] !== 0x02 || bytes[3] !== 0x15) return;
    // 16-byte UUID lives in bytes 4–20
    const uuid = bytes.slice(4, 20).toString('hex');
    onFound({ uuid, rssi: dev.rssi ?? 0 });
  });
}
```
A couple dozen lines. Notice the shape.
There's a bit-shift, bytes[1] << 8 | bytes[0], reading a two-byte number in little-endian. There's a hexadecimal literal, 0x004C, Apple's company ID in the iBeacon standard. There's bytes.slice(4, 20), pulling sixteen bytes out of the middle of the packet because that's where the device's chosen UUID lives.
TypeScript doing bit operations on raw bytes. This is the moment the illusion breaks: BLE in JavaScript is not a high-level thing. It's a thin veneer over a radio protocol, and you have to know what's inside the packet.
The good news is that once you've parsed one, you've parsed them all. The iBeacon layout is fixed:
```
[ companyID:2 ] [ subtype:1 = 0x02 ] [ length:1 = 0x15 ]
[ UUID:16 ] [ major:2 ] [ minor:2 ] [ txPower:1 ]
```

Twenty-five bytes of structured data, twenty-three of them after the two-byte company ID. Deterministic. Portable between iOS and Android.
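Because the layout is fixed, the parsing logic can be verified offline against a hand-built packet, no radio required. A minimal sketch (the UUID is a made-up example, and buildIBeacon is a test helper, not part of the app):

```typescript
import { Buffer } from 'buffer';

// Test helper: assemble the same manufacturer-data layout the scanner
// receives over the air. Major/minor are big-endian per the iBeacon format.
function buildIBeacon(uuidHex: string, major: number, minor: number, txPower: number): Buffer {
  const header = Buffer.from([0x4c, 0x00, 0x02, 0x15]); // company ID (LE) + subtype + length
  const tail = Buffer.alloc(5);
  tail.writeUInt16BE(major, 0);
  tail.writeUInt16BE(minor, 2);
  tail.writeInt8(txPower, 4); // calibrated TX power, signed dBm
  return Buffer.concat([header, Buffer.from(uuidHex, 'hex'), tail]);
}

// Mirror of the scanner's parsing logic: pull the payload UUID back out.
function parseUuid(bytes: Buffer): string | null {
  if (((bytes[1] << 8) | bytes[0]) !== 0x004c) return null; // not Apple
  if (bytes[2] !== 0x02 || bytes[3] !== 0x15) return null;  // not iBeacon
  return bytes.slice(4, 20).toString('hex');
}

const pkt = buildIBeacon('426b654d6573680000000000000000ff', 1, 7, -59);
console.log(pkt.length);     // 25: 4-byte header + 16-byte UUID + 2 + 2 + 1
console.log(parseUuid(pkt)); // the UUID round-trips unchanged
```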
Identity across platforms
Now you have packets. You need to recognise the same device next time you see it.
On Android, easy: the phone gives you the device's hardware MAC address, a stable fingerprint that doesn't change.
On iOS, Apple never exposes the MAC. For privacy reasons, iOS invents its own UUID for each peripheral, and it's different per app install. The same physical bike, same radio, same signal, shows up as a completely different device on a user's phone after they reinstall the app.
It's tempting to fight this. Write normalisation layers, store mappings, hope for the best. Don't.
The answer was built into the architecture earlier. Because the firmware broadcasts iBeacon, every packet carries a sixteen-byte UUID that I chose. The parser pulls it straight out of the bytes. It works identically on iOS and Android because identity lives in the payload, not in whatever the OS decided to label the device with.
The principle that falls out of this, if you ever build cross-platform BLE: never trust the identifier the operating system hands you. Put identity in the payload whenever you can.
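In code, that principle means keying the registry of seen devices by the payload UUID and treating the OS identifier as disposable metadata. A sketch (the Sighting shape is illustrative, not from the BikeMesh source):

```typescript
type Sighting = { osDeviceId: string; rssi: number; seenAt: number };

// Keyed by the UUID carried in the advertisement payload. The OS-assigned
// id is kept only as metadata; on iOS it can change under you.
const devices = new Map<string, Sighting>();

function recordSighting(uuid: string, osDeviceId: string, rssi: number): void {
  devices.set(uuid, { osDeviceId, rssi, seenAt: Date.now() });
}

// The same physical tag seen under two different OS-assigned ids
// (say, before and after an iOS reinstall): still one device.
const uuid = '426b654d6573680000000000000000ff';
recordSighting(uuid, 'AA:BB:CC:11:22:33', -70);
recordSighting(uuid, '6F2D1A9E-3C41-4B7A-9A10-2D5E8C000000', -64);
console.log(devices.size); // 1
```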
The lying number
Every packet comes with RSSI. The temptation is immediate: stronger signal, closer device. Just show the user the number.
RSSI lies.
A device sitting motionless on a desk, nothing moving, nothing changing, will report values that swing by twenty dBm in ten seconds. Walls reflect the signal. Human bodies absorb it. A car parked a few metres away changes everything. The way the user holds their phone changes everything.
A real snapshot from a device that wasn't moving:
```
-65 -72 -78 -70 -85 -68 -92 -74 -69 -88
```
You have a number that looks precise ("minus seventy-four!") but is actually a noisy distribution. Showing that number to the user produces a jittering display that makes the app look broken.
The app isn't broken. Physics is.
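You can take some of the edge off before the value reaches any UI with an exponential moving average: each reading nudges a running estimate instead of replacing it. A sketch (the 0.2 smoothing factor is a tuning choice of mine, not from the BikeMesh source); it narrows the jitter, but it can't conjure precision out of a noisy radio:

```typescript
// Exponentially weighted moving average over RSSI readings.
// alpha near 0 = heavy smoothing; alpha near 1 = trust each new reading.
function makeRssiSmoother(alpha = 0.2) {
  let estimate: number | null = null;
  return (rssi: number): number => {
    estimate = estimate === null ? rssi : alpha * rssi + (1 - alpha) * estimate;
    return estimate;
  };
}

const smooth = makeRssiSmoother();
const raw = [-65, -72, -78, -70, -85, -68, -92, -74, -69, -88]; // the snapshot above
const smoothed = raw.map(smooth);
// The raw series swings 27 dBm; the smoothed one swings far less.
console.log(Math.max(...smoothed) - Math.min(...smoothed));
```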
Bands and haptics
The right response isn't to fix the number. It's to meet the user where the data actually is:
```typescript
const getBand = (rssi: number) => {
  if (rssi >= -50) return { label: 'Very Near', meters: '< 1m' };
  if (rssi >= -60) return { label: 'Near', meters: '1–3m' };
  if (rssi >= -70) return { label: 'Mid', meters: '3–10m' };
  if (rssi >= -80) return { label: 'Far', meters: '10–20m' };
  if (rssi >= -90) return { label: 'Very Far', meters: '20–50m' };
  return { label: 'Out of range', meters: null };
};
```
Six bands. Each covers ten dBm, wide enough to absorb the jitter. "Near" spans a two-metre range, so small fluctuations don't throw the user out of the band they're in.
Every time the user crosses from one band to the next (Mid into Near, say), something happens. A colour change. A short haptic pulse. A tap on their hand.
That tap tells them "you're getting closer" far more effectively than a number ever could. Human brains are built for gradients, not for dBm.
The general lesson here, the one that travels beyond BLE, is that when your signal is noisy, your UI has to match the grain of the data. Don't pretend a messy sensor is precise. Quantise, bucket, give your user a gradient they can feel.
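One refinement worth adding on top of the band function: hysteresis. When the signal hovers exactly on a boundary, the band itself can still flicker; requiring a reading to clear the edge by a few dBm before switching stops that. A sketch (the 3 dBm margin is an illustrative tuning value, not from the BikeMesh source):

```typescript
const THRESHOLDS = [-50, -60, -70, -80, -90]; // band edges, strongest first

// 0 = Very Near … 4 = Very Far, 5 = Out of range
function bandIndex(rssi: number): number {
  const i = THRESHOLDS.findIndex((t) => rssi >= t);
  return i === -1 ? THRESHOLDS.length : i;
}

// Only move to an adjacent band when the reading clears the shared edge
// by `margin` dBm, so boundary-hugging values don't flip-flop.
function makeBandTracker(margin = 3) {
  let current: number | null = null;
  return (rssi: number): number => {
    const next = bandIndex(rssi);
    if (current === null || Math.abs(next - current) > 1) {
      current = next; // first reading, or a jump of 2+ bands: accept it
      return current;
    }
    const edge = THRESHOLDS[Math.min(current, next)]; // edge between the two bands
    if (next < current && rssi >= edge + margin) current = next; // moving closer
    if (next > current && rssi < edge - margin) current = next;  // moving away
    return current;
  };
}

const track = makeBandTracker();
console.log(track(-55)); // 1: Near
console.log(track(-49)); // 1: still Near, hasn't cleared the -50 edge by 3 dBm
console.log(track(-46)); // 0: Very Near
```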
The other platform
I tested everything on Android first. Scan, parse, band, haptic: all working.
Then I deployed to iOS and hit a wall I didn't expect: background scanning is almost entirely locked down. You can't leave a scan running. The OS kills it.
The workaround is elegant.
Apple has a feature called iBeacon region monitoring. You tell iOS: "wake my app when you see a device broadcasting this specific UUID." The OS runs the scanner itself, silently, in the background, forever. When a bike comes into range, the OS wakes your app.
I didn't write the background scanner. Apple ran it for me.
This is the hidden payoff of the very first decision, back when the chip was still sitting on my desk. Because BikeMesh broadcasts iBeacon, it gets background scanning on iOS essentially for free. The call I made before writing a line of JavaScript is the one that carried the entire system.
Security
Everything I've described so far has a property worth naming. The tracker broadcasts its UUID in the clear, ten times a second, to anyone listening. No encryption. No handshake. No authentication.
That is how iBeacon works by design, and it is the source of every useful property (works without pairing, works cross-platform, works in the background) and every real-world risk (it can be tracked, and it can be spoofed).
Two concrete implications for anyone building this:
- Tracking. A static UUID is a persistent identifier. If you hand a tag to a user and it broadcasts the same UUID forever, anyone with a BLE scanner can trace that user's movements. Apple's own Find My network solves this with rotating identifiers: the tag derives a new UUID on a schedule, and only the owner's paired device can compute the current value. If privacy matters to your product, a static UUID is a bug.
- Spoofing. Anyone who scans the tag once learns its UUID and can rebroadcast the same value from another device. For a bike tracker that might be harmless; for a proximity unlock, it's an attack. The usual mitigation is to move identity verification to the pairing layer (GATT with LE Secure Connections and ECDH key exchange, Bluetooth 4.2+) or to rotate the minor field with an HMAC so each advertisement is bound to a shared secret.
And one rule that belongs on every React Native BLE project, independent of the radio: do not ship secrets in the JavaScript bundle. RN bundles are trivially extractable on both platforms. Anything sensitive lives on the server, and the client verifies a signed response.
The short version: iBeacon is convenient because it's a shout. That shout is also the threat model.
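To make the minor-rotation idea concrete: a sketch of deriving a rotating minor value from a shared secret and a time slot, using Node's crypto module. This illustrates the HMAC mechanism, not a scheme BikeMesh ships, and it leaves out the hard part (provisioning the secret into firmware and keeping it off the client bundle):

```typescript
import { createHmac } from 'crypto';

// Derive a 16-bit `minor` from a shared secret and the current time slot.
// The tag's firmware and the verifier run the same function; a replayed
// advertisement stops verifying once its slot has passed.
function rotatingMinor(secret: Buffer, slot: number): number {
  const mac = createHmac('sha256', secret).update(String(slot)).digest();
  return mac.readUInt16BE(0); // fold the MAC down to the 2-byte minor field
}

// Accept the current or previous slot, to tolerate clock drift
// between the tag and the phone.
function verifyMinor(secret: Buffer, advertised: number, slot: number): boolean {
  return advertised === rotatingMinor(secret, slot) ||
         advertised === rotatingMinor(secret, slot - 1);
}

const secret = Buffer.from('a-shared-secret-for-illustration');
const slot = Math.floor(Date.now() / 1000 / 900); // 15-minute slots
const minor = rotatingMinor(secret, slot);
console.log(verifyMinor(secret, minor, slot)); // true
```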
Working with maps
A question that comes up whenever I describe this: how does BLE know where the bike actually is on a map?
It doesn't. BLE is a metres-scale radio; it has no idea where it is in the world. The phone's GPS does that job. The pattern I use, and the one that fits most tracking products, is a handoff:
- When the phone sees the bike's iBeacon, it records the phone's own GPS coordinates as the bike's last known location.
- That coordinate is rendered on a map (react-native-maps for the base layer, react-native-maps-directions if you want routing to the pin).
- When the user gets close enough to the map pin, the phone starts seeing the iBeacon again and the bands-and-haptics UX takes over for the final few metres.
BLE and GPS are not competitors. They live at different scales. GPS gets you within a street of the bike; BLE gets you to the bike. The Google Maps API (or any map tile provider) is just the surface you draw the GPS coordinate on. There is no satellite relationship between the tag and the map; the tag never meets a satellite.
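The handoff is small in code: every sighting stamps the bike's record with the phone's current GPS fix, and the map simply renders whatever is stored. A sketch (the LocationFix shape and the hard-coded Belfast coordinates are illustrative, not from the BikeMesh source):

```typescript
type LocationFix = { lat: number; lng: number; at: number };

// Last known location per bike, keyed by the payload UUID.
const lastKnown = new Map<string, LocationFix>();

// Called on every iBeacon sighting. `fix` is the phone's own position,
// supplied by whatever geolocation layer the app uses.
function onSighting(uuid: string, fix: LocationFix): void {
  lastKnown.set(uuid, fix);
}

// Later, with no beacon in range, the map renders the stored pin.
onSighting('bike-a1', { lat: 54.5973, lng: -5.9301, at: Date.now() });
console.log(lastKnown.get('bike-a1')?.lat); // 54.5973
```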
What the shape looks like
Pulling the whole thing into one view:
- A hardware decision, shout rather than chat, made in firmware before any app code was written.
- A permission handshake on both OSes. Miss it, and your scanner dies silently.
- Byte-level packet parsing, not event listening. You decode; the library doesn't.
- Identity in the payload, not in the OS.
- A UI that matches the grain of a noisy signal: bands and haptics, not numbers.
- Background scanning for free on iOS, because of the format chosen at the very start.
This is hardware-software integration in BLE, end to end, in JavaScript. There's a whole other evening on the chatting side (GATT services, characteristics, streaming sensor data, over-the-air firmware updates), but once you have the shape, everything else is detail.
Where this shape goes next
The architecture I walked through, a shouting tag, a parsing scanner, identity in the payload, bands on top of a noisy signal, shows up in more products than you'd think. A few that fit it well:
- Asset trackers, the shape BikeMesh uses: bikes, tools on a construction site, cameras at an event, luggage, anything that gets lost in a pile of similar things.
- Indoor navigation. GPS dies indoors. A grid of fixed beacons in an airport, hospital, museum, or warehouse gives the phone a positioning signal strong enough to route a visitor to a specific room.
- Proximity unlocks. Cars, doors, lockers. The beacon lives on the thing being unlocked; the phone carries the identity. This one needs the pairing side of BLE, not just broadcast, because the threat model is different.
- Event check-in. Attendees walk past a beacon at the entrance; their phone records the check-in automatically. Faster than QR codes, and it works when the attendee's hands are full.
- Retail and venue proximity. The original iBeacon use case: the store or gallery beacon triggers app content as visitors move through the space.
- Safety and care products. Child tags, elderly wander alerts, lone-worker safety: the phone watches for its paired beacon and raises an alert when the signal disappears.
Each of these uses the same underlying pieces. The part that changes is the product layer: what you do with a sighting, how you draw the map, how you design the feedback for the user holding the phone.
Code + talk
The BikeMesh source is on GitHub: github.com/ifedayoagboola/bike-mesh.
I'm giving this as a talk at BelfastJS on 13 May 2026. If you're in Belfast, come.
Ifedayo "Sam" Agboola is a software engineer and founder based in Belfast. He builds at the intersection of JavaScript and hardware: BikeMesh, Sellexa, and the projects that live between them. Find more at dev.to/blackscripts and linkedin.com/in/ifedayo-agboola.