Technical Implementation
Technical work
Our technical implementation involved a lot of learning by doing. Our input options went through several cycles that allowed us to learn what worked (and what didn’t):

Nokia Go API
Our first attempt utilized Nokia Go, a low-cost fitness tracker similar to a FitBit. The Nokia platform has an API that allows developers to read information such as daily step counts and acceleration. However, once we had set up both our device and our API keys, we could not read any data from the server. We spent over a week debugging the API but could not find a workable solution.
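For reference, the sketch below shows roughly what we were attempting: request a day of activity for an authorized user and read back the step data. The endpoint, parameters, and response fields here are illustrative rather than the exact Nokia API contract, and the real requests also had to be signed with our API keys.

```javascript
// Rough sketch of the request we were attempting (endpoint, parameters, and
// response shape are illustrative; the real API also requires OAuth-signed requests).
const fetch = require('node-fetch');

async function getDailyActivity(userId, date) {
  const url = 'https://api.health.nokia.com/v2/measure' +
              `?action=getactivity&userid=${userId}` +
              `&startdateymd=${date}&enddateymd=${date}`;
  const response = await fetch(url);   // in practice, signed with our API keys
  const json = await response.json();
  if (json.status !== 0) {             // non-zero status means the server rejected the request
    throw new Error(`API returned status ${json.status}`);
  }
  return json.body;                    // daily activity, including step counts
}

getDailyActivity('12345', '2018-04-01')
  .then(body => console.log(body))
  .catch(err => console.error('Could not read data from the server:', err)); // where we got stuck
```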
Arduino (LightBlue Bean)
Our second attempt utilized the LightBlue Bean, an Arduino-compatible Bluetooth Low Energy development board. While we were able to link the Bean to our computers and successfully change its LED color based on acceleration, we were not able to read the values in the serial monitor on our macOS system. Thus, we decided to switch to a PC command-line interface (CLI) to program the Bean. We installed the Node.js software development kit, Microsoft Build Tools, and the Bean CLI loader. We ran into a small setback, since the Bean requires a Bluetooth dongle to connect to a PC. Once we acquired the dongle, we had trouble scanning for the Bean; removing and resetting the battery allowed us to discover the device. However, an error with the Python link meant we were ultimately unable to upload sketches.
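For illustration, a minimal version of the sketch we ran on the Bean looks something like the following. This is a reconstruction from memory using the Bean Arduino library’s documented LED and accelerometer calls, not our exact code.

```cpp
// Map each accelerometer axis onto one channel of the Bean's RGB LED.
void setup() {
  Serial.begin(57600);  // virtual serial over BLE -- the values we could not read on macOS
}

void loop() {
  AccelerationReading accel = Bean.getAcceleration();  // raw axis values, roughly -512..511

  // Scale the magnitude of each axis into a 0-255 color channel.
  uint8_t red   = constrain(abs(accel.xAxis) / 2, 0, 255);
  uint8_t green = constrain(abs(accel.yAxis) / 2, 0, 255);
  uint8_t blue  = constrain(abs(accel.zAxis) / 2, 0, 255);
  Bean.setLed(red, green, blue);

  // Log the readings for debugging.
  Serial.print(accel.xAxis); Serial.print('\t');
  Serial.print(accel.yAxis); Serial.print('\t');
  Serial.println(accel.zAxis);

  Bean.sleep(250);  // low-power delay between readings
}
```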
OpenHumans Data
After this trial and error with our input devices, we decided to use fixed data for the remainder of our project. OpenHumans is a project that allows people to share their personal data for scientific research. We used JSON files from 28 OpenHumans FitBit users as our input data.
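As a sketch of how that fixed input can be loaded in Node.js (the directory name and the field we pull out are placeholders, since the exact shape of each export varies):

```javascript
// Load every user's FitBit export from a folder of OpenHumans JSON files.
const fs = require('fs');
const path = require('path');

function loadUsers(dir) {
  return fs.readdirSync(dir)
    .filter(file => file.endsWith('.json'))
    .map(file => JSON.parse(fs.readFileSync(path.join(dir, file), 'utf8')));
}

// e.g. pull each user's daily step series -- adjust the key to the real schema
const users = loadUsers('./openhumans-fitbit');
console.log(users.map(user => user['activities-steps']));
```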
Output
Our data output was built on an existing JavaScript library, D3. According to its documentation, D3 lets a developer bind arbitrary data to the Document Object Model (DOM) and then apply data-driven transformations to turn it into a visualization. We adapted an open-source visualization from GitHub, Mike Bostock’s “OMG Particles!”.
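As a simplified illustration of that data-join pattern (a minimal stand-in, not the actual “OMG Particles!” code), the snippet below binds an array of step counts to SVG circles whose radii grow with the number of steps:

```javascript
// Bind placeholder step counts to circles: one DOM element per data point.
const steps = [5200, 10450, 7800, 12000];

const svg = d3.select('body').append('svg')
    .attr('width', 600)
    .attr('height', 200);

svg.selectAll('circle')
    .data(steps)                       // bind the data to the selection
  .enter().append('circle')            // create one circle per datum
    .attr('cx', (d, i) => 75 + i * 150)
    .attr('cy', 100)
    .attr('r', d => Math.sqrt(d) / 4)  // radius scales with step count
    .attr('fill', 'steelblue');
```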
While we primarily found success building our visualization with D3 and the FitBit data, we ran into another challenge with the saving functionality. The intent was for the user to swipe right or left to save the visualization as an image or movie file. We tried a few different methods: appending the visualization to a canvas and exporting it, using a more native D3 saving function, and various other suggestions we found online. Ultimately, we were unable to get it working within the constraints of our technical skills (none of us study software engineering) and our time. We wish we had been able to figure this out, but we were able to fake it for our demo/video and still produced an output we are proud of.
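For context, the canvas-based approach we attempted looked roughly like the sketch below, reconstructed from memory (it assumes the visualization lives in an SVG element with the placeholder id “viz”): serialize the SVG, rasterize it onto a canvas, and trigger a PNG download.

```javascript
// Attempted save path: SVG -> serialized XML -> Image -> canvas -> PNG download.
function saveVisualization() {
  const svg = document.getElementById('viz');
  const xml = new XMLSerializer().serializeToString(svg);
  const blob = new Blob([xml], { type: 'image/svg+xml;charset=utf-8' });
  const url = URL.createObjectURL(blob);

  const img = new Image();
  img.onload = () => {
    const { width, height } = svg.getBoundingClientRect();
    const canvas = document.createElement('canvas');
    canvas.width = width;
    canvas.height = height;
    canvas.getContext('2d').drawImage(img, 0, 0);
    URL.revokeObjectURL(url);

    const link = document.createElement('a');
    link.download = 'visualization.png';
    link.href = canvas.toDataURL('image/png');  // fails if the canvas ends up "tainted"
    link.click();
  };
  img.src = url;
}
```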