Getting the Space Rocks to talk to each other

Configuring the XBees

Am using the Digi XBee Wireless Connectivity Kit to allow the Space Rocks to communicate with each other. Following the SparkFun tutorials, I have managed to get them talking via XCTU, which is a good start.

XBee 1 talking
XBee 2 talking


And these two are also now communicating with each other.

XBee communicating with Arduino shield

And here they are communicating from inside the plastic shells, at a distance of about 5 feet apart.
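
As a reference for wiring this into my own sketches later, a minimal pass-through sketch for the Arduino + shield side might look like the following (assuming the XBee sits on pins 2 and 3 via SoftwareSerial, as with the SparkFun shield's DLINE setting – check the shield's wiring):

#include <SoftwareSerial.h>

// RX, TX – pin choices are an assumption; match the shield's switch/jumpers
SoftwareSerial xbee(2, 3);

void setup() {
  Serial.begin(9600); // USB serial monitor
  xbee.begin(9600);   // XBee modules default to 9600 baud
}

void loop() {
  // relay anything typed in the monitor out over the radio...
  if (Serial.available()) {
    xbee.write(Serial.read());
  }
  // ...and print anything the radio hears back to the monitor
  if (xbee.available()) {
    Serial.write(xbee.read());
  }
}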

Expressive, Instructional or Instrumental?

Some further notes from Tom Igoe, from a 2016 presentation about physicality. He categorises physical computing projects as Expressive, Instructional or Instrumental.

  • Expressive works are often the least directly interactive, because they’re usually about expressing an artistic point of view. They’re useful for learning about control of physical systems, and control of aesthetics, like any expressive work, though. Example project: Matthew Richard – Estrella Intersects the Plane
  • Instructional works aim to demonstrate or illustrate a phenomenon. I think this is one area where phys comp techniques shine. You learn many things best by experiencing them directly. Example project: Jill Haefele – Human:Nature
  • Instrumental projects can be purely utilitarian, or they can be purely whimsical, but they exist to enable some other behavior. You generally don’t look at the instrument, you look at, or listen to, what it produces. Example project: John Schimmel – RAMPS – a wheelchair DJ

“Physical Computing’s Greatest Hits”

Stumbled across this blog post from 2008 discussing the physical computing “wheels” that people reinvent again and again:

  1. theremin-like instruments
  2. drum gloves (tangible vs intangible)
  3. dance floors
  4. Scooby-Doo paintings: paintings that react to presence (easy to sense presence, hard to sense attention)
  5. body-as-cursor
  6. video mirrors (aka, hand wavers, because people always wave their hands)
  7. mechanical pixels
  8. hand-as-cursor (aka Minority Report)
  9. multi-touch surfaces (exercise: operate an iPhone while it’s in your pocket)
  10. tilty stands and tables
  11. tilty controllers
  12. things you yell at
  13. meditation helpers
  14. fields of grass (running your hand across it affects it)
  15. dolls and pets
  16. remote hugs
  17. LED fetishism

The two interesting things (for me) from the conversation part of this post are, first, that these are in fact design patterns, which have likely developed because physical computing is now a mature field with its own traditions; and second, the suggestion that there should be “a museum of interactive technology. Then students can start their studies with a baseline in work that has been done before. Like playing the scales or imitating the masters.” Despite this being posted in 2008, I am not sure that is yet the case.

This blog post is a more in-depth version of the same list by Tom Igoe.

I also need to check out the Fashionable Technology book mentioned in the comments.


Arduino – Mozzi Piezo input tests

“A piezo creates a varying voltage when you squeeze it. [We’ll] use it as a sensor by measuring the voltage the piezo produces across a 1 megaOhm resistor…Notice it has a very fast response, especially a sharp attack and decay when it’s knocked.”

This first test changed the frequency of a tone as the piezo was scraped and touched.
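
For reference, the shape of the sketch is roughly this – a cut-down version in the style of Mozzi's piezo examples (the pin number and frequency mapping here are assumptions to match the circuit):

#include <MozziGuts.h>
#include <Oscil.h>
#include <tables/sin2048_int8.h>

#define PIEZO_PIN 3 // analogue pin fed by the piezo + 1 megaohm resistor (an assumption)

// a sine-wave oscillator running at audio rate; output is on pin 9 by default
Oscil<SIN2048_NUM_CELLS, AUDIO_RATE> aSin(SIN2048_DATA);

void setup() {
  startMozzi(); // start audio generation at Mozzi's default control rate
}

void updateControl() {
  // read the piezo (0-1023) and map it onto an audible frequency range
  int piezoValue = mozziAnalogRead(PIEZO_PIN);
  aSin.setFreq(100 + piezoValue); // harder knocks and scrapes = higher pitch
}

int updateAudio() {
  return aSin.next(); // next sample of the sine wave
}

void loop() {
  audioHook(); // required in every Mozzi sketch; fills the audio buffer
}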

And the audio version:


Piezo sample trigger (& speed)


Piezo sample scrub

And finally an example of an external speaker circuit. Will need to work on the output level for this.

Arduino audio – further Mozzi tests

A video showing an Arduino Nano running the Mozzi library, generating sound from sensor input. This uses LDR + resistor voltage dividers as inputs to analogue pins A1 + A2, running the code from Arduino➞File➞Examples➞Mozzi➞03.Sensors➞Knob_LightLevel_x2_FMsynth.

And here’s the attendant audio.
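
Stripped down to just the two light inputs, the core of that example works like this (the base frequencies and scaling below are assumptions, not the example's exact values):

#include <MozziGuts.h>
#include <Oscil.h>
#include <tables/cos2048_int8.h>

// two oscillators: one heard directly, one bending the other's phase (FM)
Oscil<COS2048_NUM_CELLS, AUDIO_RATE> aCarrier(COS2048_DATA);
Oscil<COS2048_NUM_CELLS, AUDIO_RATE> aModulator(COS2048_DATA);

long fm_intensity; // how hard the modulator bends the carrier

void setup() {
  startMozzi();
  aCarrier.setFreq(220); // base pitch - an assumption
}

void updateControl() {
  int light1 = mozziAnalogRead(1); // LDR divider on A1
  int light2 = mozziAnalogRead(2); // LDR divider on A2
  aModulator.setFreq(20 + (light1 >> 1)); // one light level sets modulator speed
  fm_intensity = light2;                  // the other sets modulation depth
}

int updateAudio() {
  long modulation = fm_intensity * aModulator.next(); // phase modulation amount
  return aCarrier.phMod(modulation);
}

void loop() {
  audioHook();
}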

Arduino audio tests

Arduino audio tests using the Mozzi library, a Nano board, a light dependent resistor and a potentiometer (volume control).

http://sensorium.github.io/Mozzi/learn/introductory-tutorial/
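
A rough sketch of that setup, assuming both the LDR and the pot are read in software (the pins, and the pot acting as a software rather than passive volume control, are assumptions):

#include <MozziGuts.h>
#include <Oscil.h>
#include <tables/sin2048_int8.h>

#define LDR_PIN 1 // analogue pin assumptions
#define POT_PIN 0

Oscil<SIN2048_NUM_CELLS, AUDIO_RATE> aSin(SIN2048_DATA);
byte volume; // 0-255 gain applied to every sample

void setup() {
  startMozzi();
}

void updateControl() {
  aSin.setFreq(100 + mozziAnalogRead(LDR_PIN)); // light level sets pitch
  volume = mozziAnalogRead(POT_PIN) >> 2;       // pot 0-1023 scaled to 0-255
}

int updateAudio() {
  // scale each sample by the volume, then shift back into output range
  return ((int)aSin.next() * volume) >> 8;
}

void loop() {
  audioHook();
}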


Arduino networked lamp test

Working through this tutorial at the moment, trying to understand how Processing can be used to network an Arduino and drive the colour of a lamp from words featured in an XML feed (in this case my blog feed – replacing the tutorial’s word ‘love’ with ‘space’ and the word ‘peace’ with ‘rock’). This generates the colour #3C4C2C.

Space, rock and Arduino
Rock, space and Arduino

And after adding this post to the feed…
Note the slight colour change.

This is the circuit I used, from the website below. The LED is a four-pin RGB type, which can mix any combination of red, green and blue as light:

https://mayorquinmachines.weebly.com/blog/arduino-project-arduino-networked-lamp

And the two versions of it that I built:

Arduino networked lamp circuit v1, with RGB LEDs
Arduino networked lamp circuit v2, with one LED

Here is the Processing side. Rather than the full tutorial sketch, below is a minimal reconstruction of the idea – the feed URL, the count-to-colour scaling and the serial port index are assumptions to adapt. It fetches the feed, counts the words, mixes the counts into a colour, and writes it to the Arduino as a ‘#RRGGBB’ string.
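
import processing.serial.*;

String feed = "https://example.com/feed.xml"; // placeholder - use the blog's RSS URL
int interval = 10; // seconds between feed fetches
int lastTime;
int space = 0;   // occurrences of "space" in the feed
int rock = 0;    // occurrences of "rock"
int arduino = 0; // occurrences of "arduino"
Serial port;

void setup() {
  size(640, 480);
  frameRate(10);
  // pick the serial port the Arduino is on - index 0 is an assumption
  String arduinoPort = Serial.list()[0];
  port = new Serial(this, arduinoPort, 9600);
  fetchData();
  lastTime = millis();
}

void draw() {
  // map the word counts into 0-255 channel values (the x5 scaling is an assumption)
  color c = color(constrain(rock * 5, 0, 255),
                  constrain(space * 5, 0, 255),
                  constrain(arduino * 5, 0, 255));
  background(c);
  String cs = "#" + hex(c, 6); // e.g. "#3C4C2C"
  port.write(cs);              // send the colour to the Arduino
  if (millis() - lastTime > interval * 1000) {
    fetchData();
    lastTime = millis();
  }
}

// fetch the feed and count occurrences of the three words
void fetchData() {
  space = 0;
  rock = 0;
  arduino = 0;
  String[] data = loadStrings(feed);
  for (int i = 0; i < data.length; i++) {
    String[] words = splitTokens(data[i].toLowerCase(), " ,.\"<>/");
    for (int j = 0; j < words.length; j++) {
      if (words[j].equals("space")) space++;
      if (words[j].equals("rock")) rock++;
      if (words[j].equals("arduino")) arduino++;
    }
  }
}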

And the code in Arduino:


// Arduino code for the Arduino Networked Lamp

#define SENSOR 0  // analogue pin for the light sensor
#define R_LED 9
#define G_LED 10
#define B_LED 11
#define BUTTON 12

int val = 0;       // value coming from the sensor
int btn = LOW;
int old_btn = LOW;
int state = 0;     // lamp on or off
char buffer[7];    // holds the six hex digits of an incoming colour
int pointer = 0;
byte inByte = 0;
byte r = 0;
byte g = 0;
byte b = 0;

void setup() {
  Serial.begin(9600); // open up the serial port
  pinMode(BUTTON, INPUT);
}

void loop() {
  val = analogRead(SENSOR);
  Serial.println(val); // report the light level back to Processing

  if (Serial.available() > 0) {
    // read the incoming byte
    inByte = Serial.read();
    if (inByte == '#') {
      // a colour follows: read six hex digits, waiting for each to arrive
      while (pointer < 6) {
        if (Serial.available() > 0) {
          buffer[pointer] = Serial.read();
          pointer++;
        }
      }
      // decode the three colours, stored as pairs of hex digits, into 3 bytes
      r = hex2dec(buffer[1]) + hex2dec(buffer[0]) * 16;
      g = hex2dec(buffer[3]) + hex2dec(buffer[2]) * 16;
      b = hex2dec(buffer[5]) + hex2dec(buffer[4]) * 16;
      pointer = 0; // reset the pointer
    }
  }

  btn = digitalRead(BUTTON);
  // check if there was a transition
  if ((btn == HIGH) && (old_btn == LOW)) {
    state = 1 - state;
  }
  old_btn = btn; // the value is now old, let's store it

  if (state == 1) {
    analogWrite(R_LED, r);
    analogWrite(G_LED, g);
    analogWrite(B_LED, b);
  } else {
    analogWrite(R_LED, 0);
    analogWrite(G_LED, 0);
    analogWrite(B_LED, 0);
  }
  delay(100);
}

// convert a single ASCII hex digit ('0'-'9', 'A'-'F') to its value
int hex2dec(byte c) {
  if (c >= '0' && c <= '9') {
    return c - '0';
  } else if (c >= 'A' && c <= 'F') {
    return c - 'A' + 10;
  }
  return 0; // fall back for unexpected characters
}

This didn’t work the first time I ran it, so I had to specify the Arduino port that Processing should use and then…
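
For anyone hitting the same thing: the port is picked in the Processing sketch by index from the list of available serial devices, and the right index varies per machine. Something like this (index 1 is an assumption):

import processing.serial.*;

Serial port;

void setup() {
  // print every available port, then pick the Arduino's by index
  printArray(Serial.list());
  String arduinoPort = Serial.list()[1]; // check the printed list for the right index
  port = new Serial(this, arduinoPort, 9600);
}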


Arduino research 17/05/2018

Today I’ve been looking at how to build the various Arduino circuits for the Space Rocks, and at some examples of relevant projects.

I made some basic sensor experiments with a light-dependent resistor (LDR), ultimately using this code from the Make: Getting Started with Arduino book.

// Example 06B: Set the brightness of the LED to
// a value specified by the analogue input

#define LED 9 // the pin for the LED

int val = 0; // variable used to store the value
             // coming from the sensor

void setup() {
  pinMode(LED, OUTPUT); // LED is an output

  // note: analogue pins are
  // automatically set as inputs
}

void loop() {
  val = analogRead(0); // read the value from the sensor (0-1023)

  analogWrite(LED, val / 4); // turn the LED on at the brightness
                             // set by the sensor (analogWrite
                             // takes 0-255, hence the division)

  delay(10); // stop the program for some time
}

And this circuit:

LDR – LED circuit diagram
LDR – LED circuit

Also stumbled upon “How to Build an Arduino synthesizer with Mozzi library”.

The Mozzi library looks super-useful for sound generation:

Currently your Arduino can only beep like a microwave oven. Mozzi brings your Arduino to life by allowing it to produce much more complex and interesting growls, sweeps and chorusing atmospherics. These sounds can be quickly and easily constructed from familiar synthesis units like oscillators, delays, filters and envelopes.

You can use Mozzi to generate algorithmic music for an installation or performance, or make interactive sonifications of sensors, on a small, modular and super cheap Arduino, without the need for additional shields, message passing or external synths.

Note to self to also check out the Mozzi examples gallery.

This week’s research – 06/04/2018

Thinking this week about echolocation, in the context of the space objects ‘talking’ to each other and sensing the distances between themselves:
https://en.wikipedia.org/wiki/Animal_echolocation
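
As a possible starting point for prototyping this (an assumption on my part – not something I have built yet), the classic Arduino approach is an ultrasonic sensor such as the HC-SR04, which literally echolocates: it sends a ping and times the returning echo.

// Echolocation test with an HC-SR04 ultrasonic sensor:
// send a ping, time the echo, convert the round trip to a distance.
#define TRIG_PIN 7 // pin choices are assumptions
#define ECHO_PIN 8

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // a 10 microsecond pulse triggers a burst of ultrasound
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // time until the echo returns, then convert:
  // sound travels ~0.034 cm per microsecond, halved for the round trip
  long duration = pulseIn(ECHO_PIN, HIGH);
  float distanceCm = duration * 0.034 / 2;

  Serial.println(distanceCm);
  delay(100);
}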

Found some inspiring projects in A Touch of Code. Most notably (so far):

  • Markus Kison’s Touched Echo, using sound conducted through bone. Visitors put themselves in the place of the people who shut their ears against the noise of the explosions. While leaning on the balustrade, the sound of airplanes and explosions is transmitted from the swinging balustrade through their arm directly into the inner ear (bone conduction).
  • WhiteVoid’s ‘unstuck’ augmented game.
  • “Experiencing Abstract Information” by Jochen Winker and Stefan Kraiss
  • And Leonel Moura’s Robotarium. The first zoo in the world for artificial life.
  • Drawing Machine by Fernando Orellana. Explores the notion of generative art, or art that makes art on its own. The piece consists of a three-tiered mobile sculpture driven by the vibration of a motor.
  • Literally Speaking by Torsten Posselt and Martin Kim Luge – transforms tweets from Twitter users into the sound of singing birds.
  • Kathrin Strumreich’s fabric machine. Two fabric loops, driven by a motor, create a division in space. Light sensors measure the opacity of the textile.

Love the design and the sounds of these Bivalvia mini synths.

And some musical inspiration for the Space Rock objects from Hatis Noit; especially the way the first track here plays with voice – using various layers, some treated and distorted, some not.