The technology is at risk of dying off — and that would be a shame.
iBeacons and various BLE technologies have the potential to shake up many established ways of doing business by streamlining interactions. Although there are potentially many uses for iBeacons, much of the initial discussion has focused on retail. (I’ll follow up with some examples of iBeacon applications outside retail in a future post.)
As I described in my initial post in this series, all an iBeacon does is send out advertisement packets. iBeacon transmissions let a receiver perform two tasks: uniquely identify what things they are near and estimate the distance to them. With such a simple protocol, iBeacons cannot:
- Receive anything. (Many iBeacon devices will have two-way Bluetooth interfaces so they can receive configurations, but the iBeacon specification does not require reception.)
- Report on clients they have seen. Wi-Fi-based proximity systems use transmissions from mobile devices to uniquely identify visitors to a space; carry a smartphone into an area covered by such a system and you can be singled out. Because an iBeacon is only a transmitter, it cannot receive Bluetooth messages from mobile devices and therefore cannot identify visitors.
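The advertisement format behind those two tasks can be sketched in code. This minimal Python parser assumes the widely documented iBeacon frame layout (Apple's company ID, a fixed subtype and length byte, then a 16-byte UUID, big-endian major and minor, and one signed byte of calibrated transmit power); the example UUID and values are made up for illustration:

```python
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    """Parse the manufacturer-specific data of a BLE advertisement.
    Returns (uuid, major, minor, tx_power) or None if the frame
    is not an iBeacon."""
    # iBeacon frames: Apple company ID 0x004C (little-endian on the
    # wire), subtype 0x02, payload length 0x15 (21 bytes).
    if len(mfg_data) != 25 or mfg_data[:4] != b"\x4c\x00\x02\x15":
        return None
    proximity_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    # Calibrated signal strength at 1 m, as a signed dBm value;
    # this is what lets a receiver estimate distance.
    tx_power = struct.unpack(">b", mfg_data[24:25])[0]
    return proximity_uuid, major, minor, tx_power

# Hypothetical frame: major 1, minor 7, calibrated power -59 dBm.
frame = (b"\x4c\x00\x02\x15"
         + bytes.fromhex("e2c56db5dffb48d2b060d0f5a71096e0")
         + b"\x00\x01\x00\x07" + b"\xc5")
print(parse_ibeacon(frame))
```

Note that everything here is identification and calibration data; there is no channel back to the beacon, which is exactly the limitation the list above describes.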
iBeacons don't communicate directly with end users — applications are required for translation and action execution.
Once you are set up with an iBeacon, no matter whether it is a dedicated device or a program running on a host device, you are ready to start writing applications. The iBeacon “protocol” is simple, as we saw in the introductory post: it defines regions in space as “where I see a specified combination of UUID, major, and minor numbers.” There is no descriptive text or mapping transmitted in the packets sent by a beacon. Translation between the beacon’s transmissions and any actions is done entirely within an application running on the receiving device, even if the action is as simple as displaying a text message that says “welcome to this place.”
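A receiving application might represent that “region = UUID/major/minor combination” idea along these lines. This is a language-neutral sketch in Python with hypothetical names; on iOS the equivalent role is played by CoreLocation's beacon regions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class BeaconRegion:
    """A region is simply 'where I see this UUID (and, optionally,
    this major and minor)' -- the beacon transmits no names or maps."""
    uuid: str
    major: Optional[int] = None
    minor: Optional[int] = None

    def matches(self, uuid: str, major: int, minor: int) -> bool:
        # None acts as a wildcard, mirroring how a region can be
        # defined by UUID alone, UUID+major, or UUID+major+minor.
        return (self.uuid == uuid
                and self.major in (None, major)
                and self.minor in (None, minor))

# The mapping from region to meaning lives in the app, not the beacon.
STORE = BeaconRegion("e2c56db5-dffb-48d2-b060-d0f5a71096e0")
if STORE.matches("e2c56db5-dffb-48d2-b060-d0f5a71096e0", 1, 7):
    print("welcome to this place")
```

The point of the sketch is where the knowledge sits: the beacon side is nothing but three numbers, and every string a user ever sees comes from the application.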
For developing applications on iOS, the core documentation is the developer’s guide to region monitoring. For many years, iOS has enabled applications to use the physical location of a device through the CoreLocation framework, which is what users enable in the Location Services settings in the Privacy panel. iBeacon functions were added to CoreLocation in iOS 7.0. Naturally, devices must have hardware support for the underlying Bluetooth Low Energy functions, which in practice means an iPhone 4S or later iOS device. Macs sold since late 2011 also have the required Bluetooth hardware.
The difference between location and proximity: knowing you’re in the restaurant vs. knowing what table you’re sitting at.
As the old proverb goes, “close only counts in horseshoes and hand grenades.” It doesn’t quite apply when building mobile applications, however. Smaller screens and the resistance to extensive keyboard input define the input and output constraints of mobile apps, but there is something more fundamental that a mobile application can do. At its best, an application that knows where you are will augment reality to help you navigate and interact with the physical world. These applications are sometimes described as “location” applications and sometimes described as “proximity” applications. Many people are guilty of using the words interchangeably — myself included. In response to my post on iBeacon basics, Alasdair Allan called me out on Twitter:
— Alasdair Allan (@aallan) March 23, 2014
His comment was spot-on. In this post, I’ll define the difference precisely; that distinction leads directly to the level of interest in proximity, and it informs the excitement around technologies like iBeacons.
Proximity is the 'Hello World' of mobility.
As any programmer knows, writing the “hello, world” program is the canonical elementary exercise in any new programming language. Getting devices to interact with the world is the foundation of the Internet of Things, and enabling devices to learn about their surroundings is the “hello world” of mobility.
On a recent trip to Washington D.C., I attended the first DC iBeacon Meetup. iBeacons are exciting. Retailers are revolutionizing shopping by applying new indoor proximity technologies and developing the physical world analog of the data that a web-based retailer like Amazon can routinely collect. A few days ago, I tweeted about an analysis of the beacon market, which noted that “[beacons] are poised to transform how retailers, event organizers, transit systems, enterprises, and educational institutions communicate with people indoors” — and could even be used in home automation systems.
I got to see the ground floor of the disruption in action at the meetup in DC, which featured presentations by a few notable local companies, including Radius Networks, the developer of the CES scavenger hunt app for iOS. When I first heard of the app, I almost bought a ticket to Las Vegas to experience the app for myself, so it was something of a cool moment to hear about the technology from the developer of an application that I’d admired from afar.
After the presentations, I had a chance to talk with David Helms of Radius. Helms was drawn to work at Radius for the same reason I was compelled to attend the iBeacon meetup. As he put it, “The first step in extending the mobile computing experience beyond the confines of that slab of glass in your pocket is when it can recognize the world around it and interact with it, and proximity is the ‘Hello’ of the Internet of Things revolution.”
Power limitations with mobile devices are just the tip of the iceberg.
I’ve spent the past decade of my professional life working to enable connectivity everywhere with Wi-Fi. Back when I started working with Wi-Fi, it was a way of connecting laptops to the network more easily. These days, Wi-Fi is more likely to be used as a way of getting an entirely new type of device connected — a phone, tablet, or even a sensor or automation device.
The downside to all this mobile computing is that batteries are not keeping up with the demands placed on them. Computing power grows exponentially, following Moore’s Law, but the ability of batteries to store energy grows much more slowly. To take one example, the battery capacity of the iPhone has grown only about 15% across its various models since the device’s introduction, while the capabilities of the device have far outstripped that mere 15% growth.
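To put that mismatch in rough numbers, here is a back-of-the-envelope comparison; the seven-year window (roughly iPhone launch to the time of writing) and the Moore's-Law doubling period are my assumptions, while the 15% battery figure comes from the text:

```python
# Compound compute growth vs. flat battery growth, very roughly.
years = 7                          # assumed: ~2007 iPhone launch to 2014
compute_growth = 2 ** (years / 2)  # Moore's-Law doubling every ~2 years
battery_growth = 1.15              # ~15% total capacity growth (from text)

# How far demand has outrun supply, to one decimal place.
print(round(compute_growth / battery_growth, 1))  # -> 9.8
```

Even with generous rounding, the gap is around an order of magnitude, which is why power budgets, not raw compute, dominate mobile design.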
Engineers working on battery technology continue to eke out gains, and the energy storage capability of lithium rechargeable batteries is high. According to one memorable turn of phrase, the energy density of lithium batteries is now comparable to hand grenades, even if it is only about 4% of the density of gasoline.
Wearables can help bridge the gap between batch and real-time communications.
I drown in e-mail, which is a common affliction. With meetings during the day, I need to defer e-mail to breaks between meetings or until the evening, which prevents it from being a real-time communications medium.
Everybody builds a communication “bubble” around themselves, sometimes by design and sometimes by necessity. Robert Reich’s memoir Locked in the Cabinet describes the process of staffing his office and, ultimately, building that bubble. He resists, but eventually succumbs to the necessity of filtering communications when managing such a large organization.
One of the reasons I’m fascinated by wearable technology is that it is one way of bridging the gap between batch and real-time communications. Wearable technology has smaller screens, and many early products use low-power screen technology that lacks the ability to display vibrant colors. Some may view these qualities as drawbacks, but in return, it is possible to display critical information in an easily viewable — and immediate — way.
This lower-cost technology could greatly enhance consumer convenience for many applications.
I’ve been thinking a lot about the new low-energy form of Bluetooth (BLE) recently, with an eye toward ways it can be used. The core advantage the protocol has over similar standards is that it’s optimized for low data rates and extremely long battery life. While we may complain about how much energy a Wi-Fi device uses, it’s acceptable to charge your phone once a day. If we could eliminate the need to recharge, what lower-data-rate applications could we build?
The most obvious consequence of BLE’s design is that it communicates over a shorter range and can therefore provide more precise location information. Companies like Euclid Analytics measure foot traffic by using Wi-Fi signals, but Wi-Fi’s longer reach makes that location data fairly rough. BLE devices have a smaller operating range, and thus would be able to report what aisle a person is in instead of a broad area of the store. (And yes, there are obvious privacy concerns here, especially given that many users tend to accept all the privileges requested by an app running on their phone, which might make BLE-enabled location personally identifiable.)
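The way a receiver typically turns a BLE signal into a distance figure is the log-distance path-loss model. This sketch assumes a calibrated transmit power of -59 dBm at one meter and a free-space path-loss exponent; real indoor environments need a higher exponent and give noisier results:

```python
def estimate_distance(rssi: int, tx_power: int = -59, n: float = 2.0) -> float:
    """Estimate distance in meters from received signal strength,
    using the log-distance path-loss model.

    rssi:     received signal strength in dBm
    tx_power: calibrated RSSI at 1 m (assumed value; beacons
              broadcast their own calibration byte)
    n:        path-loss exponent (~2 in free space, higher indoors)
    """
    return 10 ** ((tx_power - rssi) / (10 * n))

print(round(estimate_distance(-59), 2))  # at calibrated power -> ~1 m
print(round(estimate_distance(-79), 2))  # 20 dB weaker -> ~10 m free space
```

The steep falloff is exactly why short-range BLE can resolve an aisle while longer-range Wi-Fi resolves only a zone: small changes in distance produce large, measurable changes in signal strength.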
Very much a work in progress, the Internet of Things will require rethinking every layer of the protocol stack.
When I flip through a book on networking, one of the first things I look for is the protocol stack diagram. An elegant representation of the protocol stack can help you make sense of where to put things, separate out important mental concepts, and help explain how a technology is organized.
I’m no stranger to the idea of trying to diagram out a protocol space; I had a successful effort back when the second edition of my book on 802.11 was published. I’ve been a party to several awesome conversations recently about how to organize the broad space that’s referred to as the “Internet of Things.”
Let’s start at the bottom, with the hardware layer, which is labeled Things. These are devices that aren’t typically thought of as computers. Sure, they wind up using a computing ecosystem, but they are not really general-purpose computers. These devices are embedded devices that interact with the physical world, such as sensors and motors. Primarily, they will either be the eyes and ears of the overall system, or the channel for actions on the physical world. They will be designed around low power consumption, and therefore will use a low throughput communication channel. If they communicate with a network, it will typically be through a radio interface, but the tyranny of limited power consumption means that the network interface will usually be limited in some way.
Things provide their information or are instructed to act on the world by connecting through a Network Transport layer. Networking allows reports from devices to be received and acted on. In this model, the network transport consists of a set of technologies that move data around, and is a combination of the OSI data link, networking, and transport layers. Mapping into technologies that we use, it would be TCP/IP on Wi-Fi for packet transport, with data carried over a protocol like REST.
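As a concrete sketch of that mapping, here is what one sensor report carried over such a stack might look like. The endpoint, device ID, and field names are all hypothetical; the point is only that the Thing's low-throughput radio needs to move nothing more than a small structured payload:

```python
import json

def sensor_report(device_id: str, metric: str, value: float) -> bytes:
    """Encode one reading as the JSON body of a hypothetical
    'POST /readings' REST call, carried over HTTP on TCP/IP
    on whatever radio link the device has."""
    return json.dumps({"device": device_id,
                       "metric": metric,
                       "value": value}).encode("utf-8")

body = sensor_report("thermo-42", "temperature_c", 21.5)
print(body.decode("utf-8"))
```

A few dozen bytes per report is well within the budget of a power-limited radio, which is why a REST-style transport layers so naturally on top of constrained Things.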
The Jawbone UP shows the promise available in all kinds of wearable sensors.
In a recent conversation, I described my phone as “everything that Compaq marketing promised the iPAQ was going to be.” It was the first device I really carried around and used as an extension of my normal computing activities. Of course, everything I did on the iPAQ can be done much more easily on a smartphone these days, so my iPAQ sits in a closet, hoping that one day I might notice and run Linux on it.
In the decade and a half since the iPAQ hit the market, battery capacity has improved and power consumption has gone down for many types of computing devices. In the Wi-Fi arena, we’ve turned phones into sensors to track motion throughout public spaces, and, in essence, “outsourced” the sensor to individual customers.
Phones, however, are relatively large devices, and the I/O capabilities of the phone aren’t needed in most sensor operations. A smartphone today can measure motion and acceleration, and even position through GPS. In many cases, though, a display isn’t needed on the sensor itself, and the data to be collected might require a different type of sensor. Many inexpensive sensors are available today to measure temperature, humidity, or even air quality. By moving the I/O from the sensor itself onto a centralized device, battery power can be devoted almost entirely to collecting data.