Getting what you need out of the xBee API Library for Arduino

Aside: All this code is tested on Series 2 and Pro models of the xBee. With that said, understanding this code will allow you to write something very similar if you need to use Series 1 modules.

Understanding this library was one of the major early challenges for my thesis project. Once I was able to get a handle on the structure of their code it became easy to write code to suit the application rather than the other way around.

The original library was written in 2009 by Andrew Rapp. There have been sporadic updates since then, but in order to get some semblance of an up-to-date version I had to go through a mailing list for the lib. [Update: it appears that the software serial release now exists officially, download your files from the main source]

Armed with what worked in the latest Arduino IDE I began to explore. [Aside: I often say ‘around line XXX’ to keep things simple and semi future compatible]

There are two files for the library: XBee.h and XBee.cpp.

Let's take the ‘get something together right away’ approach to learning this.

If you open the XBee.h file, you will notice a great deal of defined constants. Thankfully they have readable names and some sporadic comments to help us guess our way through this.

If you scroll down to around line 82 you will notice the API ID constants. These will help us check what kind of packet is being sent or received. In my case I wanted to consider two packets in particular. The ZB_RX_RESPONSE and the ZB_IO_SAMPLE_RESPONSE both deal with packets received by the connected xBee. Note that ZB here stands for ZigBee, which is the communication mode of the Series 2 xBee modules.

What do these constants represent? A HEX value that each packet uses to identify its nature to the recipient.
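For reference, the two we care about should look something like this in XBee.h (the exact line numbers drift between releases, but the hex values come from the XBee API frame format itself):

  // From XBee.h (abridged): frame type identifiers used in the API ID field
  #define ZB_RX_RESPONSE              0x90
  #define ZB_IO_SAMPLE_RESPONSE       0x92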

What else can we get from this file? The class structure. Around line 151 you will notice the XBeeResponse class being declared and its public methods listed.

If you are new to C++, don’t worry; I will describe what this means to us as end users. Public methods are the things we really care about: they are the ‘functions‘ that we would want to call in our program.

The public methods directly in the XBeeResponse class are a little high level for our purposes. How do we find what we need?

Earlier I mentioned the ZB_IO_SAMPLE_RESPONSE constant. If you search the file for that name, you will be drawn to somewhere around line 220. There you can see that we have the getZBRxIoSampleResponse function. Let's click some pieces in: if you want to read the value of a pin on an external xBee (let's say you are looking to read a temperature sensor attached to it), you will be looking at an IO response from that xBee.

The constant mentioned earlier lets us verify that the packet we just received is of a specific type, in this case ZB_IO_SAMPLE_RESPONSE. Once this is verified we can call the painfully named getZBRxIoSampleResponse function in order to get at the data available within.

Now that we are that far in our thought experiment, let's look at the final stretch. Still in the XBee.h file, let's search for “ZBRxIoSampleResponse” (getZBRxIoSampleResponse without the “get”). This brings us to a class that deals with this type of packet specifically.

To understand the structure of the XBee library better, it is very important to note that this class inherits from another via the ": public ZBRxResponse" in its declaration. This means that the public methods of that class are also available to you when you call getZBRxIoSampleResponse. This is chained all the way back to the first class we looked at, XBeeResponse, and you can trace it by searching backwards for “XBeeResponse.”
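Stripped of all the member details, that chain looks something like this (a simplified sketch, not a verbatim copy of the header):

  // Simplified view of the inheritance chain in XBee.h (members omitted)
  class XBeeResponse { /* generic packet handling */ };
  class RxDataResponse : public XBeeResponse { /* access to the received payload */ };
  class ZBRxResponse : public RxDataResponse { /* ZigBee receive packet, remote addresses */ };
  class ZBRxIoSampleResponse : public ZBRxResponse { /* analog/digital sample accessors */ };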

Now that that is out of the way, let's see what packet-specific functionality is exposed to us:

  bool containsAnalog();
  bool containsDigital();
  /**
   * Returns true if the pin is enabled
   */
  bool isAnalogEnabled(uint8_t pin);
  /**
   * Returns true if the pin is enabled
   */
  bool isDigitalEnabled(uint8_t pin);
  /**
   * Returns the 10-bit analog reading of the specified pin.
   * Valid pins include ADC:xxx.
   */
  uint16_t getAnalog(uint8_t pin);
  /**
   * Returns true if the specified pin is high/on.
   * Valid pins include DIO:xxx.
   */
  bool isDigitalOn(uint8_t pin);

These are very well commented, so with that packet specifically we can quickly check:

  • Which pins are enabled /and/ set to analog or digital.
  • The data for those pins as stored in the packet.

For the use case of grabbing a sensor value on pin 1 this is all that we need to complete our code. Which brings me to the part I am sure you’ve all been waiting for.

The Code

  #include <XBee.h>

  XBee xbee = XBee();
  ZBRxIoSampleResponse ZBioSample = ZBRxIoSampleResponse();

  void setup() {
      Serial.begin(9600);
      xbee.begin(Serial);
  }

  void loop() {

    xbee.readPacket();

    if (xbee.getResponse().isAvailable()) {
      if (xbee.getResponse().getApiId() == ZB_IO_SAMPLE_RESPONSE) {

        xbee.getResponse().getZBRxIoSampleResponse(ZBioSample);

        if (ZBioSample.isAnalogEnabled(1)) {
          int data = ZBioSample.getAnalog(1);
          // now you have the pin data!
        }
      }
    }
  }

Ta-da! If you read the above explanation, the meat of the loop shouldn’t be surprising. With that said, there are a few lines that are required by the library.

First we include it like any other Arduino library. Then we create an instance of XBee, allowing us to call it.

The next line is what might be confusing: what we are doing is creating a form for our specific packet to fill (just an instance of the ZBRxIoSampleResponse we looked at above!). This is used to store that packet and call the functions I mentioned earlier on it.

In the setup we have a standard Serial begin, and then we initialize our xbee, passing the library the Serial ‘line’ we will communicate with it on. (Note that you can use software serial here as well: just begin it and pass it to the xbee, as in the sketch below.)
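For example, a minimal software serial setup might look like this (the pin numbers are just placeholders; use whatever free pins you have wired to the xBee):

  #include <XBee.h>
  #include <SoftwareSerial.h>

  XBee xbee = XBee();
  // Hypothetical wiring: xBee DOUT to pin 10, xBee DIN to pin 11
  SoftwareSerial xbeeSerial(10, 11);

  void setup() {
      xbeeSerial.begin(9600);  // begin the software serial 'line'
      xbee.begin(xbeeSerial);  // pass it to the library instead of Serial
  }

  void loop() {
      // same packet-reading loop as in the sketch above
  }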

Then we get to the stuff that actually does our work. First we want to constantly watch for a packet to come in, so we run the xbee.readPacket(); function right in the main loop. Next we check if our xBee has received a packet (via the general xbee.getResponse().isAvailable()). Once a packet is confirmed we can look back at our prior reasoning.

We check the API ID of the packet and reference it against the constant we want to look for (in our case ZB_IO_SAMPLE_RESPONSE). Once that is also confirmed, we finally know that we have an IO packet from our device!

We call the function we found earlier in order to populate the ZBioSample shell:

  xbee.getResponse().getZBRxIoSampleResponse(ZBioSample);

Now you have the latest packet data available to you via ZBioSample. With that, just use any of the functions you find in the ZBRxIoSampleResponse class that we looked at earlier.

In our example we first check if pin one on the xBee is set to analog with ZBioSample.isAnalogEnabled(1); if that test passes, we grab the data from the pin via ZBioSample.getAnalog(1).

Your Use Case

This is all fine and dandy if all you want to do is what I wanted to do. Which is why I spent all that time at the start showing you a couple of bits from the XBee.h file.

If you look through the defined API ID constants again and search for their respective classes, you can quickly get your bearings and start dealing with many other types of packets that your xBee can receive. Once you know what type of data you want to grab or send, and have found the class that gets you there the fastest, take a look through the available methods. And don’t forget to check the classes that they reference for their methods as well. As an example, let's say I also want to grab the address of the xBee that sent the packet.

If you look through the methods provided by the ZBRxIoSampleResponse class, you can see that it doesn’t appear to contain such a method. Now let's look at the class that it is referencing: ZBRxResponse (around line 381).

Note that it has a method getRemoteAddress16. This does just what we want, and we can still call it on the ZBioSample instance we created earlier:

  ZBioSample.getRemoteAddress16();

This gives us the 16 bit address of the xBee, which we can use to give some more context to the prior sensor value.
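Dropped into the inner if of our earlier sketch, that could look something like this:

  if (ZBioSample.isAnalogEnabled(1)) {
    int data = ZBioSample.getAnalog(1);                 // the sensor reading
    uint16_t sender = ZBioSample.getRemoteAddress16();  // which xBee sent it
    // now you have both the value and the address it came from
  }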

The avid XBee.h explorer has probably also noticed that ZBRxResponse is itself referencing RxDataResponse. If you can’t find what you are looking for, just try looking all the way down the rabbit hole!

I hope that this has been at least somewhat informative. I didn’t want to just throw a chunk of code out there. Let's call this ‘teaching how to fish and giving you the first catch.’

For anyone that missed the link at the top, make sure to grab the library from the official source.

Outline

This is the preliminary outline for the final paper submitted with my thesis project.

1: Introduction, Context, and Background Research

This chapter will focus on clearly defining the goals for this project as well as looking at the technologies and advancements in this area. Existing wearable API services primarily focus on tracking the activity of the body (e.g. fitness trackers, medical trackers), while ambient environment trackers tend to focus on specific stationary environments. Examples of these will be listed.

2: Overall Project Description and Narrative

Here I will describe the possible short and long term uses for this type of ambient data API, from commercial, artistic, and medical standpoints. This chapter will also provide an overview of the design decisions within this project, as well as provide a walkthrough of the concept artwork produced.

3: Prototyping, Testing, and Experiments

This chapter will focus on three parts of the project.

  • Replication of the design aesthetic established by the concept art.
  • Physical computing successes and challenges.
  • Developments in the codebase, like the communication protocol and organization of data.

4: Final Project

This part of the paper will focus on the wearability and usefulness of the final prototype. I will conduct some rudimentary analysis of the output data in order to illustrate some of the more easily discernible patterns (transitions from indoor to outdoor environments and from crowded to private locations).

5: Future Work

This will focus in more depth on the various possible applications for this ambient API, as well as where the project can be expanded via further modules and sensors. The section will also look at the various technologies that had to be removed or adjusted during the process and whether they would be more beneficial in the future. (Bluetooth Low Energy for communication between modules)

Upcoming Challenges

This semester will have a number of key tasks and challenges that I will need to overcome in order to complete the Thesis project to my satisfaction.

1. Communication

Due to the unforeseen issues with the RBL BLE modules described in a prior post, I need to decide whether to continue pursuing this route or go with a different wireless solution. This is a primary issue and needs to be solved within the next few days. I am beginning to lean toward a solution I have already worked with in a prior project, as it would enable both much simpler networking and a broader range of devices that I can use as the final receiver. With that said, the financial cost of the RBL modules is not insignificant, and their advertised benefits would be a great asset to this project (provided I can get them to work within the next week).

2. Structure

This is another yet-to-be-solved problem. Although 3D printing is an effective prototyping method, and I will continue with it, it is unable to duplicate the aesthetics established by my design process. It’s important to follow through from the designs in order to have a coherent final style guide. My current consideration is to create a simplified 3D printed electronics housing for each module and then attach other materials onto it via adhesives or mounting methods (snap-in, bolts, etc.). This will require a set of quick tests to decide what would be best for this process.

3. Write-out

The final outcome for this semester will be presented at the OCADU GradX, meaning that there are some considerations that need to be taken to make the invisible visible for the public. Data gathered by the sensors will have to be presented in a compelling and perhaps interactive manner. This will be a challenge after the codebase is complete, but must be considered throughout.

4. Functions

Communication is one thing, and although the code will not be challenging it will need to have a fairly high degree of stability. It will also need to be very general.

Each module can have a sensor or a part of a sensor (for example, half of a 360-degree ambient sound sensor on one shoulder), so the message that the module sends out to the collector will need to note very clearly which sensor is which. The collector will have to process this data in order to unify the sensors that are split among modules.

Tag structure: module ID (1 through 3), followed by sensor ID (1 through 3), followed by the sensor value, i.e. 1|1|255 <- neck module, light sensor, full bright. If there are multiple sensor ‘pieces’ on different modules they are easily distinguishable (1|1|255, 2|1|255, etc.). This is a preliminary idea, and there might need to be more metadata passed through.
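As a very rough sketch (the module ID and the send target here are placeholders for whatever the real firmware ends up using), composing such a tag on a module could look like this:

  // Illustrative only: build a "module|sensor|value" tag and send it out
  const unsigned int MODULE_ID = 1;  // hypothetical: 1 = neck module

  void sendReading(unsigned int sensorId, unsigned int value) {
    char tag[16];
    // e.g. "1|1|255" -> neck module, light sensor, full bright
    snprintf(tag, sizeof(tag), "%u|%u|%u", MODULE_ID, sensorId, value);
    Serial.println(tag);  // stand-in for the actual wireless send
  }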

A lot of the data noise will need to be cleaned up on the software level as well (likely with basic thresholds).

The collector module (upper hip) will need to consolidate this data into a package that can be logged internally and synced as appropriate to an internet enabled device (currently a mobile phone). Each package will be given a distinct time stamp, and any message backlog will be pushed to the phone at the next opportunity.
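A first pass at that packaging might be as simple as a time-stamped struct pushed into a small backlog buffer and flushed whenever the phone is reachable (all names here are hypothetical, and millis() stands in for a proper time stamp):

  // Illustrative only: buffer time-stamped readings until they can be synced
  struct LoggedReading {
    unsigned long timestamp;  // millis() as a stand-in for a real time stamp
    byte moduleId;
    byte sensorId;
    unsigned int value;
  };

  const byte BACKLOG_SIZE = 32;
  LoggedReading backlog[BACKLOG_SIZE];
  byte backlogCount = 0;

  void logReading(byte moduleId, byte sensorId, unsigned int value) {
    if (backlogCount < BACKLOG_SIZE) {
      LoggedReading r;
      r.timestamp = millis();
      r.moduleId = moduleId;
      r.sensorId = sensorId;
      r.value = value;
      backlog[backlogCount++] = r;
    }
  }

  void flushBacklog() {
    // called whenever the internet enabled device is within reach
    for (byte i = 0; i < backlogCount; i++) {
      // push backlog[i] to the phone here
    }
    backlogCount = 0;
  }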

With the above code completed the project will have everything it needs to provide a clean interface for the gathered ambient data.

There are also a few parts that need to be ordered soon:

  • 2x Colour light sensor (and a focus lens that needs to be tested)
  • 4x Wireless charging coil and circuit

Setting up the BLE Mini from Red Bear Lab

I picked up four of these bluetooth low energy (BLE) modules with the intention of setting up three wireless sensors and a receiver. After a long struggle I managed to get some basic functionality going and would like to document that to hopefully save some time for anyone else trying them out.

Main components:

  1. BLE Mini
  2. Arduino Pro Mini 3.3v
  3. Analog Sensors

Other:

Bluetooth 4.0 compatible device. Here are some of the devices that support this:

  • Android
    • Nexus 4
    • Nexus 7
    • Samsung Note 3
  • iOS
    • iPhone 5 (all models)
    • iPhone 4S
    • iPad Air
    • iPad (3rd gen or later)
    • iPad mini (all models)
    • iPod touch (5th gen or later)

RedBearLab outlines some basic instructions on how to set the BLE Mini up with iOS and Android here. With that said, the instructions really need to be expanded further. Here is a breakdown of the iOS path:

Connect pins from BLE Mini “J4” to Arduino board
VIN > 5V
GND > GND
TX > Default RX (Pin 0)
RX > Default TX (Pin 1)

The Green LED at D1 should light up, otherwise please check the Troubleshooting section below.

Notes: VIN doesn’t need to be five volts; 3.3 volts will work just fine. For me, the LED did not light up on one device. I was able to fix that by uploading the latest firmware to it with the instructions a little further down.

Download our latest RedBearLab Library.

Notes: The ZIP file provided is the latest, but if you want to dig through the source and be a bit more selective with the download, here is the RBL GitHub page.

Although the device firmware was changed quite a few months ago, a much older version appears to come with all the modules I purchased. Make sure you grab the latest from here and follow these instructions carefully.

Unzip the file and copy the “RBL_BLEMini” subfolder in BLEMini/Arduino/libraries to Arduino’s libraries folder.
For more information about Arduino’s libraries folder, please visit http://arduino.cc/en/Guide/Libraries.

Open our BleFirmata sketch: “File” > “Examples” > “RBL_BLEMini” > “BLEFirmataSketch”.

Compile and upload the program to your Arduino board.

Notes: Big note: use Arduino 1.0.5 and not the 1.5 nightlies, or BLEFirmata won’t compile. Although this setup is fairly common, we will get to some problems with BLEFirmata a little bit later.

Download our BLE Arduino App from Apple’s iTunes Store.

Turn on Bluetooth on your iOS device. (*Please note that BLE Mini or any Bluetooth Low Energy device will not show in the “Devices” list as pairing to BLE device is not required)

Start our BLE Arduino App and press “Connect”.

Note: The main issue here is that you only get three boards to choose from when you connect via the iOS app. This is pretty backwards and I don’t have any decent solutions. In my testing the A(n) pins tend to be the same across boards, so I used the UNO and LEONARDO modes to test the minis. The only issue is that the UNO setting is the only one that shows the ‘analog’ option for the A(n) pins, which lets you see the analog signal live in the app. When a pin is set to ‘input’ in the LEONARDO mode, no data is ever displayed in the iOS app.

Once that’s done you can puppeteer your Arduino of choice via the BLE App.

Some notes on BLEFirmata: Currently only UNO, MEGA and LEONARDO are officially supported. You can get something like the pro mini to work but you can run into issues with mismatched I/O and software serial. I wish I had a fix for this right now but I don’t even know where to begin.

RBL released a library for using one BLE Mini as a master and connecting slave devices to it. Unfortunately any version of the HCI library they provide that I try to upload to the boards ends up turning off the green status LED immediately. I am checking with support about this issue now.

Overall the impressive part of this module is the stability once it’s connected. I have been using a retina iPad to test the app, and once connected I haven’t had any kind of signal loss. Hopefully this stability can be translated into more custom projects.

Resources:

This thread provided some very good insight.

Although light on content, the RBL Zendesk support site is worth further examination.

A good project on instructables.com

Wraithguard

or what was supposed to be Wraithguard

This is my final project for the wearable technology class this semester. I really wanted to focus on something more frivolous than some of the other projects I made in the past three years. Thus the concept ended up being rooted in media that I enjoy.

At first I really wanted to create a version of Wraithguard from Morrowind. I am still struggling to finish up a post about Morrowind which would explain my desire to bother with this. After looking over the possibilities for duplicating or mimicking this design I came to the conclusion that there wasn’t a plausible way for me to make it look pleasing. Adjusting my influences I looked at some other fan projects I could undertake.

Once I tinkered a bit, things started to look very much like a cyberpunk garment. I realized that this could be an object built not by professionals or for mass production, but rather in an environment where people would scramble to create something out of the junk they had lying around: a tool that would be wasteful, suboptimal, and dangerous to use, but one that would complete the required task.

Here is the initial glove work:

Beyond aesthetics, I wanted to tackle the technical challenge of running a large amount of EL material in a purely wearable project (aka no cord to the wall). Here began some heavier challenges: at first I didn’t think a 12 volt inverter would be a good idea, so I decided to pick up some 2xAA battery + inverter packs. After hooking one up to the wire I noticed that it was horribly dim. Realizing that the inverter only takes in 3 volts, I decided to test it with a 3.7 volt LiPo battery. That didn’t go over well; the inverter lit the wire for a second then promptly burned up.

After some testing I decided to just go for as much voltage as I could muster, hoping to counteract the inevitable bulkiness by driving more EL material and having a more stunning final effect.

I picked up two 7.4 volt LiPo batteries, hooked them up together, and powered a Leonardo with an EL Escudo Duo shield. This let me run an EL panel, tape, and wire much brighter than even a 12 volt wall power supply did during testing.

I was really happy with just how bright the EL materials got, but had a new issue to deal with. This was getting quite heavy quite fast; the solution was to separate all the circuitry from the hand and move it to the upper arm. I created a sturdy cord, wrapped it a few times in leftover EL wire, and the outcome didn’t look too bad. All the components were strapped to a very wide Velcro strap:

Once all that came together I wanted to add some interactivity. Going back to my inspirations, I realized that the constantly-on EL materials made it look very clean and too functional. To randomize things and add a believable electrical flicker, I attached a 3-axis accelerometer to the hand. This gave me three distinct movement values to work with in my code. I used them to adjust the pattern and delay between flickers of each individual piece of EL.

Accelerometer WIP:

And here are some shots of the final result:

Also Kate and Michael wearing it at the critique today:

Overall I am very happy with the outcome; it isn’t the Wraithguard I hoped for, but I think it has a really cool effect to it. I am also attaching the short three-slide PDF I used to present this earlier. [Wearables 1 2013 Final]

Technical Details

The code is available in this gist. I tried to make it readable; most of it was done by slightly adjusting the values. These few lines comprise the chunk that ended up controlling flicker timings.

While this dealt with some randomization of the sensor data.
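As an illustration of the general idea only (this is not the actual gist code; the analog pins and EL channel numbers are placeholders), mapping the accelerometer axes onto flicker delays could look roughly like this:

  // Illustrative only: more movement on an axis -> faster, jumpier flicker
  const int axisPins[3] = {A0, A1, A2};  // hypothetical accelerometer inputs
  const int elChannels[3] = {2, 3, 4};   // hypothetical EL Escudo channels

  void setup() {
    for (int i = 0; i < 3; i++) {
      pinMode(elChannels[i], OUTPUT);
    }
  }

  void loop() {
    for (int i = 0; i < 3; i++) {
      int movement = analogRead(axisPins[i]);
      // scale the reading into a delay: more movement, shorter pause
      int flickerDelay = map(movement, 0, 1023, 200, 20);

      digitalWrite(elChannels[i], HIGH);
      delay(flickerDelay + random(0, 30));  // a little randomness for believability
      digitalWrite(elChannels[i], LOW);
      delay(random(10, 40));
    }
  }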

The circuit itself was fairly basic. The accelerometer is connected straight to the Arduino pins via a cable, and the only other modification was adding a connector for the batteries that also contained a switch to turn everything on and off.

End of semester thesis post

This past week was probably the most interesting of this semester. After the initial 3D printing test from last week, Rickee and I went all out and spent a few long evenings at the Komodo Lab. The results on my end are two well-formed prototypes and models.

Here are some process images:

I also did a quick vine, as I found some of the sounds the printer’s stepper motors make to be awesome.

The neck module had to be scaled down in the current iteration, so it is more of a demo of the shape. The arm module is 1:1 scale, unless that changes in the future. It looks like there is enough room for all the components, but that will come early next semester. Expect the flood of part and circuit pictures to come back (thank god my soldering skills are slightly upgraded.)

I am still really enjoying using OpenSCAD, and the 3D models are developing nicely. You can take a look at both on my GitHub.

Here are the final pieces:

Thesis Update: 3D printing

Did my first few test 3D prints this week. Very excited to keep iterating and testing out what is possible with the printers. Here are some pictures:

The printer has quite a few quirks; there is a chance of warping due to heat that only becomes noticeable about 20% into a model. Hairspray appears to be the industry-standard solution to this issue. Another issue that makes building one-piece models very difficult is that the piece needs to be as flat as possible at the bottom (with the next test I will be trying some 45 degree bevels though). There is also the obvious limitation that this particular method can’t print anything without direct support from the base, meaning horizontal connections and sharp protrusions cannot have open air below them.

After struggling with various desktop and online 3D tools, I realized that it would take me longer to master them than to search for non-GUI alternatives. I ended up stumbling onto OpenSCAD, and after giving it a try I immediately felt right at home and was able to put together some more precise STL files for printing. OpenSCAD just has two panels: code and render. There are only a few language-specific commands, allowing for a very low learning curve if you know some basic programming (just check out how tiny the complete cheat sheet is).

With this new tool in tow I was finally able to do some rapid prototyping and ideation with just a bit of code.

I am applying some pointers from the prior model to this one, upping the scale and letting the printer take care of how to hollow out the solid areas, as well as providing a more solid base to combat warping.

I am also working on setting up a well-structured thesis project repository to keep track of all my STL and design files. (I already have a GitHub repository set up, but haven’t done much there in a few months now.)

Here it is: DFI-Thesis

Big thanks to the Komodo Lab and Rickee Charbonneau for all the help this week.