GPS through Arduino and B_GSM Boards

For our GSM class final, Clara, Karthik and I created a geo-fence hooked to an Arduino Uno.

If you're receiving garbled characters when reading the incoming data from the GSM module through Arduino, you should check whether both boards are working at the same baud rate (9600). To change the GSM board's –Quectel M10– baud rate through an FTDI adaptor and CoolTerm:

  1. AT+IPR? //Response (most likely): 115200
  2. AT+IPR=9600 //Response: OK
  3. —[Disconnect] change the terminal's baud rate to 9600 [Connect]—
  4. AT&W //Save the settings. Response: OK

Now plug the GSM board into the chosen SoftwareSerial pins (10 and 11), then set the fence's center point (homeLat and homeLon), the fence's radius in kilometers (thresholdDistance), and the phone number the fence will be controlled from (phoneNumber). Upload and open the Serial Monitor. The code can be found in this Github Repo. Enjoy!
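For reference, the fence check itself boils down to comparing a great-circle distance against the radius. Here's a rough plain-C++ sketch of how the parameters above (homeLat, homeLon, thresholdDistance) might be used; the actual code in the repo may differ:

```cpp
#include <cmath>

// Haversine great-circle distance between two coordinates, in kilometers.
double haversineKm(double lat1, double lon1, double lat2, double lon2) {
    const double R = 6371.0;                       // mean Earth radius in km
    const double toRad = 3.14159265358979323846 / 180.0;
    double dLat = (lat2 - lat1) * toRad;
    double dLon = (lon2 - lon1) * toRad;
    double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
               std::cos(lat1 * toRad) * std::cos(lat2 * toRad) *
               std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * R * std::asin(std::sqrt(a));
}

// True when a GPS reading falls outside the fence radius (in km).
bool outsideFence(double lat, double lon,
                  double homeLat, double homeLon, double thresholdDistance) {
    return haversineKm(lat, lon, homeLat, homeLon) > thresholdDistance;
}
```

A reading farther than thresholdDistance kilometers from home would then trigger the SMS alert.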

The hardware used for this project was an Arduino UNO and one of Benedetta's custom GSM boards with GPS enabled. This setup could potentially work stand-alone; however, it's advisable to make sure beforehand that the power source can supply enough watts to run the setup for the intended time.

Kinetic Energy Challenge

Concept Development

Oryan Inbar and I decided to address the kinetic energy challenge by powering an LED through a bicycle-trainer setup. After repurposing the stepper motor from a bill printer, we began exploring different circuit possibilities around capacitor, resistor and charging configurations. In the end, our circuit is composed of the stepper's two rectified coils connected in series, a two-way switch that lets us charge the capacitors first and light the LED afterwards, three 1F capacitors, one 330 Ohm resistor, and a counter LED (which we believe lights up at around 1.7 Volts).


It was surprising to see the voltage collapse whenever the LED was plugged in: from an open-circuit reading of around 29V down to 2.2V under load. We also decided to feed the two coils into two bridge rectifiers that power the circuit in series. This, together with the overall capacitance, indicated that we needed to charge the capacitors before connecting the LED, which is the reason behind the two-way switch. After sorting out the general circuitry, we decided to use the body's strongest muscles as the power source, along with an already solved mechanism: the bicycle.
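For a ballpark of what the storage stage holds, here's a quick calculation sketch. Note that the parallel wiring of the three capacitors is an assumption on my part (the wiring isn't pinned down above), and the function names are illustrative:

```cpp
// Capacitors add in parallel (assumed wiring; series would divide instead).
double parallelCapacitance(double c1, double c2, double c3) {
    return c1 + c2 + c3;
}

// Energy stored in a capacitor bank: E = 0.5 * C * V^2, in joules.
double storedEnergyJ(double capacitanceF, double volts) {
    return 0.5 * capacitanceF * volts * volts;
}
```

At, say, the 2.2V we saw under load, a 3F bank would hold roughly 7.26 joules.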


We created a bicycle trainer to interface the bicycle with the stepper motor, which we reused from a bill printer taken from the shop's junk shelf. The overall kinetic energy put into the stepper motor can be estimated from the gear configuration. We reused the printer's embedded gear system and realized that the driver gear has an 11:1 ratio relative to the driven motor gear. This gear system's driver, in turn, is connected to the bike's back wheel at more or less a 1:35 ratio.
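Under those estimates, the speed multiplication from wheel to motor shaft is just the product of the two ratios (assuming both stages multiply speed in the same direction):

```cpp
// Overall speed multiplication from back wheel to motor shaft, using the
// ratios estimated above (1:35 wheel-to-driver, 11:1 driver-to-motor gear).
double overallRatio(double wheelToDriver, double driverToMotor) {
    return wheelToDriver * driverToMotor; // one wheel turn -> this many motor turns
}
```

So one revolution of the back wheel would spin the motor roughly 385 times, under those rough ratios.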

Counter UX —Physical + Mobile—


The general idea came from the hula-hoop toy as a cyclical activity. After some thought and happy accidents, it was clear that a scaled-down version would be more suitable. As a rapid prototyping strategy, the use of glowing necklaces came in pretty handy: it not only fits the assignment's brief quite well, but it clearly communicates a tentative data visualization of the activity's feedback.

Insight from User Test

After some user tests, it was clear that people intuitively spin the ring in either direction –clockwise or counterclockwise–, which means that the counting task should be designed around this ambivalence.

Design Principle

This is the reason behind the object's two sides (white and black). When spinning the object with the white side up, it counts; whenever the black side is up, it undoes the counts. The general idea is to see the circle light up in relation to the goal's progression, whether counting up or down. The mode and quantity are set through embedded knobs, as shown in the third illustration.
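The counting rule can be sketched as a tiny state holder; the names here are illustrative, not from any actual firmware:

```cpp
// Sketch of the counting rule: spins with the white side up count toward
// the goal; spins with the black side up undo counts.
struct RingCounter {
    int count = 0;
    int goal;
    explicit RingCounter(int g) : goal(g) {}

    void spin(bool whiteSideUp) {
        if (whiteSideUp) count++;
        else if (count > 0) count--;      // undo, but never below zero
    }

    // Fraction of the ring to light up (0.0 to 1.0)
    double progress() const {
        return goal > 0 ? static_cast<double>(count) / goal : 0.0;
    }
};
```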

Digital Translation

Having a prototyped experience makes sketching the mobile UX easier. Still, these diagrams make the overall panoramic view of the experience much clearer.

Designing Interactions

Translating this physical experience into a mobile digital one could go two ways, and in the end the chosen alternative will rely on user tests. The initial idea is to spin the mobile device to count and spin it the other way to undo the count. However, spinning the physical device might not be intuitive enough in mobile applications. The other hypothetical alternative is a circular swiping gesture, applied consistently across the whole app's interactions.

Airplane Food Order

The ideal experience behind ordering on a plane –and maybe elsewhere as well– would be to receive food-pairing suggestions that correlate the person's agenda, a rest prediction from biometric sensor data, and their medical history. Before creating the wireframe, I deconstructed the information into a Hierarchical Task Analysis to get a better sense of the drill-down flow of the overall UI.

There are three sub-levels involved in the order flow, except for coffee, which takes two additional ones (type of milk and sweetness). By creating this, I was able to decide on micro-interactions, such as reducing the choices to yes/no answers whenever a refined choice is needed, for instance, water with or without ice. This allows for an overall consistent UI flow.

The overall circular layout is a tentative proposal pointing toward the cyclic rituals behind meals.

Box Fab


We decided to work with live hinges for our first project. We started off by proving the concept with black foam.


After some tests, we chose the "parametric kerf #6" pattern, given its generous flexibility. For our overall box concept, we combined the live-hinge method with a semi-cubed volume meant for dice. The next step was to start cutting the two apparently identical pieces.


However, our estimates for covering the half circles were inaccurate, preventing the planes from fully assembling with one another.


For our second iteration, we followed Eric's advice and jumped to prototyping with our final material, wood. We planned this out and did some calculations to make sure the sides' height would match the half circles' perimeters. We also planned for 45º edges, so we rastered 5mm inner reference edges to sand down after cutting. Since the material is 5mm thick, we realized that for 45º edges we needed a "square" reference to know, more or less, our limit when sanding off the residue.

On our second laser-cutting attempt, we ran into some unexpected technical obstacles. Besides setting the power estimate a bit too high, the machine also cut with an offset (reason still unknown). Last but not least, the 60W laser cutter's settings differ from the 50W's when it comes to rastering in black. This third setback was in fact a happy accident that allowed us to realize we could simplify the entire process by scaling one of the sides by the thickness of the material. Our third cut ran quite smoothly.

Error Correction and Experimentation

We even explored ways of conveniently bending wood with warm water and overnight drying. The result wasn't perfect, but from this first experiment we now know how to make a perfectly matching wood bend. In the end, the magnetized closing lid we had thought of wasn't necessary. This is our final prototype, along with the dice that inspired it.



A simple snippet to make an LED light up when receiving an SMS, with one of Benedetta's GSM shields.

It lights the LED on pin 13 of the Arduino board. After several failed attempts at writing from the Arduino Serial Monitor, we decided to do it through CoolTerm.

This is the code:

#include <SoftwareSerial.h>

SoftwareSerial mySerial(2, 3); // RX, TX
char inChar = 0;
char message[] = "que pasa HUMBA!";

void setup() {
  Serial.begin(9600);
  Serial.println("Hello Debug Terminal!");
  // set the data rate for the SoftwareSerial port
  mySerial.begin(9600);
  pinMode(13, OUTPUT);
//  //Turn off echo from GSM
//  mySerial.print("ATE0");
//  mySerial.print("\r");
//  delay(300);
//  //Set the module to text mode
//  mySerial.print("AT+CMGF=1");
//  mySerial.print("\r");
//  delay(500);
//  //Send the following SMS to the following phone number
//  mySerial.write("AT+CMGS=\"");
//  // 129 for domestic #s, 145 if with + in front of #
//  mySerial.write("6313180614\",129");
//  mySerial.write("\r");
//  delay(300);
//  mySerial.write(message);
//  // Special character to tell the module to send the message
//  mySerial.write(0x1A);
//  delay(500);
}

void loop() { // run over and over
  if (mySerial.available()) {
    inChar = mySerial.read();      // read the byte coming from the GSM module
    Serial.write(inChar);          // echo it to the debug terminal
    digitalWrite(13, HIGH);        // blink the LED when data (an SMS) arrives
    delay(100);
    digitalWrite(13, LOW);
  }
  if (Serial.available() > 0) {
    mySerial.write(Serial.read()); // forward terminal input to the GSM module
  }
}

Health Applications –Pain Tracking–

We chose two mobile applications that should ideally help patients collect meaningful information about their symptoms and share it with their doctors, so the doctors can make better recommendations. We looked at three overall qualities in the applications: first, that using the app doesn't add frustration on top of the patient's health concerns; second, that what they register is easy to input; and third, that what's being registered is useful for the doctor. After researching the abundant alternatives, we chose RheumaTrack and Pain Coach, discarding Track React and Catch My Pain. Overall, we sought the best ones in order to ultimately decide which of the two was better. It's fair to say that both have useful and usable affordances, but RheumaTrack adds value that Pain Coach doesn't.

Overall, we realized RheumaTrack is the better application because of one particular function: the way people input their joint pain. This interface is, in a nutshell, a meaningful (useful and usable) way for both patient and doctor to visualize and record the pain condition in a really predictable manner. The overall process of adding a new entry (pain, medication, and activity), though a bit cramped, is clearer than others and pretty straightforward. The dashboard follows conventional standards of mobile GUI design: items and affordances are perceivable (easily readable) and predictable, and navigation gives clear feedback. I did spot a couple of simple UX improvements. When adding a "New Check" there's no progress bar to predict how long the task is going to take, and the "Activity" interface could visually improve in various ways: first, by generating better contrast between the recorded data and the layers of pain intensity to enhance perceivability (readability), and second, because the tags' date format can be confusing. Nevertheless, the overall purpose of the "Activity" function is very useful for doctors.

Object Reflections

This backpack caught my sight immediately and I've carried it ever since –eight years ago–. An outer, clean, minimalist silhouette tainted with coal and dark black communicated elegant simplicity. The continuity from the lateral, surrounding body fabric onto the handles reinforced this minimalist perception and added structure and endurance. To this day, its outer simplicity disguises its inner complexity of vast services, to the extent that pockets often pass unnoticed. Various adventurous stories with its rogue laptop compartment have made it feel priceless in my mind. I'm still discovering alternate uses for the side and handle straps, such as water bottle holder, umbrella drainer, or pen/marker holder. And besides its impeccable impermeability, this awesome backpack is awfully comfortable.

This other object keeps tormenting our daily experiences, even though solutions have been crafted by now. In a nutshell, this remote frustrates people by cognitively loading us with excessive affordances (buttons). It's fair to clarify that the tasks all these affordances tackle may address interesting user needs. However, the frequency at which these needs arise doesn't make up for the cognitive load. For example, as a beginner user, I don't know what the A, B, C, and D buttons are for. Even though they may not be significantly big compared to other buttons, the fact that they are colored distracts from reading the overall control layout. A good solution already on the market is the Apple TV remote. It's consistent with the laptop controls Apple created back in the mid-00s, allowing people to learn it easily and fast.

Drawing Object

I chose a gyroscope top. It resembles a slick whipping top, which embodies the concept of dynamic equilibrium quite curiously.

These objects run thanks to centrifugal and centripetal forces, which result from the momentum of the center load. Since this exercise is centered on laser cutting, I've deliberately ignored the center load that drives the momentum, since the tentative materials to solve it cannot be laser cut by the machines in the shop. It would be really cool to have some sort of stone-like material for this exercise, though.

The overall shapes could be any type of wood; the smallest circles, though –Ds diameter–, should be a hardwood to ensure a smoother spin for the center load.


In-Class Drawing Exercise

Considering the 15-minute time span we had for this exercise, my approach was a communicative one. In other words, I did not care much about details, but about the overall understanding of how I planned to translate the process of fragmenting the object into the sliced fabrication method.

Mind the Needle — Popping Balloons with Your Mind 0.2



Time's running out! Will your concentration drive the needle fast enough? Through the MindWave, a consumer EEG headset, visualize how your concentration level drives the speed of the needle's arm and, maybe, pops the balloon!

Second UI Exploration


Development & UI

I designed, coded, and fabricated the entire experience as an excuse to explore how people approach interfaces for the first time and imagine how things could or should be used.

The current UI focuses on the experience's challenge: 5 seconds to pop the balloon. The previous UI focused more on visually communicating the concentration signal (from now on called the ATTENTION signal).

This is why the timer is prominent in dimension, location, and color: it is bigger than the attention signal and the needle's digital representation, and it is positioned at the left so people read it first. Even though the attention signal is visually represented, the recurring question that emerged at NYC Media Lab's "The Future of Interfaces" and ITP's "Winter Show" was: what should I think of?



What drives the needle is the intensity of concentration, or overall electrical brain activity, which can be raised in different ways, such as solving basic math problems –a recurrent, successful on-site exercise–. More importantly, this question might point to an underlying lack of feedback from the physical device itself, so a more revealing question would be: how could feedback in BCIs be better? Another reflection upon this interactive experience was: what would happen if this playful challenge were addressed differently, by moving the needle only when the attention signal exceeds a certain threshold?
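That thresholded alternative could be sketched like this; the 0-100 attention range matches what the MindWave reports, but the threshold and speed mapping are hypothetical:

```cpp
#include <algorithm>

// Sketch of the thresholded alternative: the needle only advances while the
// attention value exceeds a threshold; below it, the needle holds still.
double needleSpeed(int attention, int threshold, double maxSpeed) {
    if (attention < threshold) return 0.0;  // below threshold: needle holds
    // Scale the excess attention linearly into 0..maxSpeed
    double excess = static_cast<double>(attention - threshold);
    double range  = static_cast<double>(100 - threshold);
    return maxSpeed * std::min(1.0, excess / range);
}
```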

Previous Iterations

Panic App

By the end of 2014, the death rate from crime in Bogotá, Colombia had decreased. Muggings, however, remained, and at Pinedot Studios we attempted to tackle the common theft of smartphones. We created this concept app and pitched it to Intel Colombia.



Palindrome Hour Web-Clock

This is a project that celebrates hours that can be read either from left to right or from right to left, just like palindrome text –flee to me, remote elf–. A concept of living symmetry overlaid with pleasing coincidence and chunks of daily serendipity.

I designed & coded this project in JavaScript with the creative toolkit p5.js. Hop in and catch the palindrome hours! Link To Project Here
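The project itself is p5.js; purely for illustration, the core check can be sketched in C++ like this: strip the colons and compare the digits with their reverse.

```cpp
#include <string>

// True when a time string reads the same forward and backward,
// ignoring the colon separators (e.g. "12:21", "03:30").
bool isPalindromeTime(const std::string& t) {
    std::string digits;
    for (char c : t)
        if (c != ':') digits += c;        // keep only the digits
    std::string reversed(digits.rbegin(), digits.rend());
    return digits == reversed;
}
```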

Previous Iteration

UI Drafts

Generative Soundscape 0.1.2

This installation pursues playful collaboration. The idea behind this collective experience is that, by placing the modules in arbitrary configurations, people can collaboratively create infinite layouts that generate perceivable chain reactions. The installation is triggered through a playful, bocce-like gesture in which spheres can ignite the layout anywhere in the installation.


After an apparent success –context-specific– and a consequent failure –altered context–, the project turned to a functional alternative. The following process illustrates it better.

These images show the initial thought-out circuit, which included a working sound trigger driven by a static threshold. We also experimented with Adafruit's Trinket, aiming toward circuit simplification, cost-effectiveness, and miniaturization. This shrunken microcontroller is built around an ATtiny85, nicely breadboarded and bootloaded. In the beginning we were able to upload digital output sequences to drive the speaker and LED included in the circuit design. The main blockage, which we managed to overcome in the end, was reading the analog input signal used by the microphone. The last image illustrates the incorporation of a speaker amplifier to improve the speaker's output.

The next two videos show:

1. the functional prototype, which includes a hearing window –if the microphone senses a value greater than the set threshold, it stops listening for a determined time–

2. the difference between a normal speaker output signal and an amplified speaker output signal.

After the first full-on tryout, it was clear that we needed a dynamic threshold –one where the value that sets the trigger adapts to its ambient sound level–. The microphone, however, broke one day before the deadline, so we never got to try this tentative solution –even though there's an initial version of the code–.

Plan B: use the call-to-interaction event instead. In other words, use the collision and the vibration it generates to trigger the modules through a piezo. Here's the code.

A couple of videos illustrate the colliding key moments that trigger the beginning of a thrilling pursuit.


And because sometimes plan B also glitches... Special thanks to Catherine, Suprit and Rubin for play-testing.


Translated Code –Processing to OF–

This is a book on generative design, and the examples I've selected are oriented toward data visualization. The main limitation in this overall pursuit is the underlying library –Generative Design– which doesn't exist in OF yet.

The Processing examples use libraries that can be found among OF's addons, which draws attention to the limits of pursuing a full translation of the examples. There are other examples that use the Geomerative and Generative Design libraries, which are only available for Processing –or Java-based IDEs–. Anyhow, this particular example used PDF-export and Calendar libraries to export the application's canvas to images with a timestamp. In the failed attempt I was able to include a calendar addon that I didn't end up using in the working one.

Even though there's a Project Generator that will include whatever addon is needed, it doesn't work every time. Since this was one of those times, I ended up creating the failed attempt in the same folder as the ofxICalendar addon. To try to solve one of the primitive drawing elements, I sought out another addon called ofxVectorGraphics, but I could never get it working in an already created project.

There are primitive drawing functions in OF similar to Processing's; the arc, however, is not one of them. Instead, there are two ways around this: the addon mentioned before, or an object called ofPath that contains an arc function. After a lot of trial and error I was finally able to get an arc drawn in an isolated project. As in any OF project, you declare the variables and objects in the *.h file and then work with them in the *.cpp file. What I came to know, after figuring out the specifics of not filling, outlining, setting the resolution, and –to an extent– not closing arcs, was that you have to invoke a draw call to actually render the path. This was completely counterintuitive coming from my previous programming experience.
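For anyone hitting the same wall, this is roughly what it boiled down to, assuming a recent openFrameworks API (with an ofPath path; declared in the *.h file). It's a fragment, not a complete ofApp:

```cpp
// In setup(): configure the path once
path.setFilled(false);                 // outline only, no fill
path.setStrokeColor(ofColor::black);
path.setStrokeWidth(2);
path.setCircleResolution(60);          // resolution of the curve
path.arc(200, 200, 100, 100, 0, 120);  // center x/y, radii, start/end angles

// In draw(): the counterintuitive part is that nothing appears
// until you explicitly draw the path
path.draw();
```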

After Kyle McDonald's introductory OF workshop, I learned that the project could be simplified significantly into a single *.cpp file. This meant, however, that I wouldn't be able to include the feature of exporting an image with a timestamp. Currently, this is the working translated project. I would also like to thank AV –Sehyun Kim– for helping me out on how to –again– draw the arcs.

Labs –DC Motor, External Power Source & H-Bridge–



Generative Soundscape Diagram, Time Table & BOM

System Diagram

This is the basic behavior of the system, where sound is the medium of communication. The Trigger is the element that initiates everything: it translates the rolling motion into the sound that wakes the modules lying on the ground, starting the chain reaction.
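As a toy model of that behavior, think of each module as firing once it can "hear" an already-fired neighbor; here's an illustrative simulation (positions and hearing range are made up, not from the actual installation):

```cpp
#include <vector>
#include <cmath>

// A module at a position on the floor; fired once it has been triggered.
struct Module { double x, y; bool fired = false; };

// Fire every module within hearing range of the trigger point, then let
// fired modules trigger their neighbors until the reaction dies out.
// Returns how many modules fired in total.
int propagate(std::vector<Module>& modules, double tx, double ty, double range) {
    int fired = 0;
    for (auto& m : modules)
        if (std::hypot(m.x - tx, m.y - ty) <= range) { m.fired = true; fired++; }
    bool changed = true;
    while (changed) {
        changed = false;
        for (auto& m : modules) {
            if (m.fired) continue;
            for (const auto& n : modules)
                if (n.fired && std::hypot(m.x - n.x, m.y - n.y) <= range) {
                    m.fired = true; fired++; changed = true; break;
                }
        }
    }
    return fired;
}
```

Modules close enough form a chain; an isolated module never fires, which is exactly why the layout's configuration shapes the soundscape.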

Time Table

This is our initial time table. We've divided the overall project's development into two main blocks: the trigger element –marked [T] in the timeline– and the module –marked [M]–. Follow this link for a detailed description of each of the activities involved in our time table.

Bill of Materials


This bill of materials covers an initial prototype of one Trigger and two Modules. There's still a lot to figure out, but so far this is how it looks. You can follow this link for future reference.

Generative Gestaltung OF Translation

This project takes some Processing code examples from the book and translates them into openFrameworks. It is directed at anyone interested in learning OF who has previous knowledge of Processing. Not only is there generative design involved, but also data visualization and a new programming language/framework.

This is a book on generative design, and the examples I've selected are oriented toward data visualization. The main limitation in this overall pursuit is the underlying library –Generative Design– which doesn't exist in OF yet.

Generative Soundscape Concept

This is an evolved, collaborative idea that grew out of the Generative Sculptural Synth. The ideal concept is an interactive synthesizer made up of replicated sound-generating modules, triggered by a sphere that creates a chain reaction throughout the installation's configuration.

It started out as a reconfigurable soundscape and evolved into an interactive –bocce-like– generative instrument. Here's an inside scoop of the brainstorming session where my teammate and I sought common ground. (1. Roy's ideal pursuit 2. My ideal pursuit 3. Converged ideal)

Audio Input Instructable


Littlebits –whatever works–

After the slam-dunk failure of the DIY audio input, I realized the –limited– convenience of prototyping with littleBits. This way, I could start concentrating on the trigger event rather than getting stuck at circuit sketching. I was able to program a simple timer for the module to "hear" –a boolean triggered by the microphone– and a timer for the module to "speak" –a boolean that generates a tone–. What I learned about the limitations of the littleBits sensors is twofold. They offer a Sound Trigger and a conventional microphone. Both bits come with their circuits already solved, which turned out to be useful but limiting. The Sound Trigger has an adjustable gain, an embedded –uncontrollable– 2-second timer, and a pseudo-boolean output signal. So even though you can adjust its sensitivity, you can't actually work with its raw values in the Arduino IDE. The microphone bit had an offset (around ±515 in serial values), but its gain was rather insensitive.
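The hear/speak timers boil down to a small state machine; here's an illustrative sketch of the logic (durations made up, not littleBits code):

```cpp
// Hear/speak cycle: after the microphone boolean trips, the module
// "speaks" (tone on) for a fixed window, then goes back to listening.
enum class State { LISTENING, SPEAKING };

struct ModuleTimer {
    State state = State::LISTENING;
    unsigned long speakStart = 0;
    unsigned long speakMs;             // how long to speak, in ms
    explicit ModuleTimer(unsigned long ms) : speakMs(ms) {}

    // Feed the microphone boolean and the current time (e.g. millis()).
    // Returns true while the tone should play.
    bool update(bool soundHeard, unsigned long nowMs) {
        if (state == State::LISTENING && soundHeard) {
            state = State::SPEAKING;   // heard something: start speaking
            speakStart = nowMs;
        } else if (state == State::SPEAKING && nowMs - speakStart >= speakMs) {
            state = State::LISTENING;  // window over: back to listening
        }
        return state == State::SPEAKING;
    }
};
```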

This is why, when conveniently using the Sound Triggers, pitch is proportional to distance: the modules are triggered from closer when lower pitches are sensed, and vice versa. However, since these Sound Trigger bits are pseudo-boolean, there can't be any frequency analysis.