Testing the XBees – at distance and enclosed

XBees turning on a red LED from four feet away. I also tried this from one floor down, and it still worked perfectly.

And testing them both inside Space Rock ‘shells’.

This week’s (or month’s) research – 11/07/18

I haven't done one of these for a while, but I thought it would be useful to round up some of the research I have been carrying out over the past few weeks.

Firstly, I am looking at the work of Forensic Architecture, exploring their use of design as a process of archaeology.

I am also exploring the language of objects and existing archetypes, reading Don Norman’s Emotional Design, and investigating techniques to undermine or subvert the messages that objects communicate.

On the tech front, I am now looking into using the XBee RSSI (received signal strength indicator) to produce the interactions between the four Space Rocks, using the relative strength of the signal as a proxy for distance and changing the light (colour/intensity) and sound parameters as the objects move closer to or further from each other. See also Reading XBee RSSI with Arduino. I also need to dig out my copy of Making Things Talk.
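
As a starting point for the RSSI work, here is a rough, untested sketch of one common approach: with the XBee in transparent (AT) mode, its RSSI/PWM0 pin (typically pin 6 on the module) outputs a PWM signal whose duty cycle tracks the strength of the last received packet, and the Arduino can measure that with pulseIn(). The pin number, wiring and the 0–100 mapping below are my own placeholder choices, not from the tutorial.

// Rough sketch: estimate signal strength from the XBee's RSSI/PWM0 output.
// Assumes that pin is wired (with suitable level shifting) to Arduino pin 7
// and that the XBee is running in transparent (AT) mode.

const int RSSI_PIN = 7;

void setup() {
  pinMode(RSSI_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // measure one high and one low pulse to estimate the PWM duty cycle
  unsigned long highTime = pulseIn(RSSI_PIN, HIGH, 50000UL); // 50 ms timeout
  unsigned long lowTime  = pulseIn(RSSI_PIN, LOW, 50000UL);

  if (highTime + lowTime > 0) {
    // rough 0-100 signal strength; this is the value that could drive the
    // LED colour/intensity and sound parameters as the rocks move about
    int strength = (100UL * highTime) / (highTime + lowTime);
    Serial.println(strength);
  } else {
    Serial.println(0); // no recent packet, or the pin is being held steady
  }

  delay(200);
}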

Also looking into the concepts of:

  • Pattern recognition – Humans Are the World’s Best Pattern-Recognition Machines, But for How Long?
  • The overview effect – the experience of seeing firsthand the reality of the Earth in space, which is immediately understood to be a tiny, fragile ball of life, “hanging in the void”, shielded and nourished by a paper-thin atmosphere. From space, national boundaries vanish, the conflicts that divide people become less important, and the need to create a planetary society with the united will to protect this “pale blue dot” becomes both obvious and imperative.
  • Bicameralism – the condition of being divided into “two chambers” – is a hypothesis in psychology which argues that the human mind once operated in a state in which cognitive functions were divided between one part of the brain which appears to be “speaking”, and a second part which listens and obeys: a bicameral mind.

Attended an interesting talk at Futurefest, Speaking with Aliens, featuring Clara Sousa-Silva (a quantum astrochemist at MIT tasked with finding alien life on a molecular level) and Jill A. Stuart (a space law expert and director at METI International, working on different scenarios for encounters with intelligent life). It triggered some very interesting ideas around communicating with extra-terrestrial life away from Earth. I was most interested in Clara Sousa-Silva’s mention of trying to communicate with light, as light can be seen everywhere and white light can be split into infinite colours via a prism (for example). She also mentioned maths as a communication tool, though this seems too tied to human communication to fit the concept of a universal language, and suggested studying inter-species communication on Earth to inform communication with aliens. The panel also discussed the idea of the Dark Forest, suggesting that perhaps we should not try to communicate with aliens at all.

This has also led me to research Danielle Wood, director of Space Enabled, and the idea of a space that has not been colonised by entrepreneurs such as Elon Musk and other business interests. It also reminded me of this article in the Guardian discussing attempts to map underground spaces, which are generally less regulated than the space above ground.

Research Workshop – Critical Thinking

A couple of sketches for the final ‘We Are Here’ presentation, with contributions from the Critical Thinking workshop group. Presenting the objects in a dark space would help enhance the audio content.

Also useful for the final show piece is this Scientific American article about the brain compensating for the loss of one sense by enhancing others.

Superpowers for the Blind and Deaf.
The brain rewires itself to boost the remaining senses. If one sense is lost, the areas of the brain normally devoted to handling that sensory information do not go unused — they get rewired and put to work processing other senses. Brain imaging studies show the visual cortex in the blind is taken over by other senses, such as hearing or touch.

https://www.scientificamerican.com/article/superpowers-for-the-blind-and-deaf/

A visit to The Future…

…Starts Here exhibition at the V&A. Some photos from a visit on Friday 25th May.

I was also fascinated by the data visualisation of Stamen Design’s Big Glass Microphone. I have long been interested in how to turn environmental data, such as electronic/radio signals and sound/vibration levels, into aesthetically appealing visual or audio representations, and this is an engaging interactive example of how to do that. The piece visualises the vibrations picked up by fibre optic cables placed in a figure of eight around the Stanford University building – vibrations from the ground above, such as passing traffic, become the visualisation.

Arduino networked lamp test

I am currently working through this tutorial, trying to understand how Processing can be used to network an Arduino and drive the colour of a lamp from words featured in an XML feed (in this case my blog feed, with the word ‘love’ replaced by ‘space’ and the word ‘peace’ by ‘rock’). This generates the colour #3C4C2C.
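
(For reference, that hex string decodes channel by channel, exactly as the Arduino sketch below decodes it: #3C4C2C gives R = 0x3C = 60, G = 0x4C = 76, B = 0x2C = 44 – a dim, muted olive green.)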

Space, rock and Arduino
Rock, space and Arduino

And after adding this post to the feed…
Note the slight colour change.

This is the circuit I used, from this website. The LED is a four-pin RGB type, which mixes its red, green and blue elements to generate any combination of colour:

https://mayorquinmachines.weebly.com/blog/arduino-project-arduino-networked-lamp

And the two versions of it that I built:

Arduino networked lamp circuit v1, with RGB LEDs
Arduino networked lamp circuit v2, with one LED

The networking happens on the Processing side: the tutorial’s sketch fetches the feed, counts occurrences of the two chosen words, converts the counts into a single hex colour string and sends it to the Arduino over the serial port.

And the code on the Arduino side, which receives the hex string and sets the LED colour:


// Arduino code for the Arduino Networked Lamp

#define SENSOR 0   // analogue pin for the light sensor
#define R_LED 9
#define G_LED 10
#define B_LED 11
#define BUTTON 12

int val = 0;        // value coming from the sensor
int btn = LOW;
int old_btn = LOW;
int state = 0;
char buffer[7];     // holds the six hex digits of the colour
int pointer = 0;
byte inByte = 0;
byte r = 0;
byte g = 0;
byte b = 0;

void setup() {
  Serial.begin(9600); // open up the serial port
  pinMode(BUTTON, INPUT);
}

void loop() {
  val = analogRead(SENSOR);
  Serial.println(val); // send the sensor reading to Processing

  if (Serial.available() > 0) {
    // read the incoming byte
    inByte = Serial.read();
    if (inByte == '#') {
      // a colour string follows: read the six hex digits
      while (pointer < 6) {
        if (Serial.available() > 0) { // wait until the next byte has arrived
          buffer[pointer] = Serial.read();
          pointer++;
        }
      }
      // decode the three colours, stored as pairs of hex digits, into 3 bytes
      r = hex2dec(buffer[1]) + hex2dec(buffer[0]) * 16;
      g = hex2dec(buffer[3]) + hex2dec(buffer[2]) * 16;
      b = hex2dec(buffer[5]) + hex2dec(buffer[4]) * 16;
      pointer = 0; // reset the pointer
    }
  }

  btn = digitalRead(BUTTON);
  // check if there was a transition on the button
  if ((btn == HIGH) && (old_btn == LOW)) {
    state = 1 - state;
  }
  old_btn = btn; // the reading is now old, let's store it

  if (state == 1) {
    analogWrite(R_LED, r);
    analogWrite(G_LED, g);
    analogWrite(B_LED, b);
  } else {
    analogWrite(R_LED, 0);
    analogWrite(G_LED, 0);
    analogWrite(B_LED, 0);
  }

  delay(100);
}

int hex2dec(byte c) {
  if (c >= '0' && c <= '9') {
    return c - '0';
  } else if (c >= 'A' && c <= 'F') {
    return c - 'A' + 10;
  }
  return 0; // not a valid hex digit
}

This didn’t work the first time I ran it, so I had to specify the Arduino port that Processing should use and then…

 

Arduino research 17/05/2018

Today I’ve been looking at how to build the various Arduino circuits for the Space Rocks, and at some examples of relevant projects.

I made some basic sensor experiments with a light-dependent resistor, ultimately using this code from the Make: Getting Started with Arduino book.

// Example 06B: set the brightness of the LED to
// a value specified by the analogue input

#define LED 9   // the pin for the LED

int val = 0;    // variable used to store the value
                // coming from the sensor

void setup() {
  pinMode(LED, OUTPUT); // LED is an output

  // note: analogue pins are
  // automatically set as inputs
}

void loop() {
  val = analogRead(0);      // read the value from the sensor (0-1023)

  analogWrite(LED, val/4);  // turn the LED on at the brightness set
                            // by the sensor; analogWrite expects 0-255,
                            // hence the division by 4

  delay(10);                // stop the program for some time
}

And this circuit:

LDR – LED circuit diagram
LDR – LED circuit

Also stumbled upon How to Build an Arduino synthesizer with Mozzi library.

The Mozzi library looks super-useful for sound generation:

Currently your Arduino can only beep like a microwave oven. Mozzi brings your Arduino to life by allowing it to produce much more complex and interesting growls, sweeps and chorusing atmospherics. These sounds can be quickly and easily constructed from familiar synthesis units like oscillators, delays, filters and envelopes.

You can use Mozzi to generate algorithmic music for an installation or performance, or make interactive sonifications of sensors, on a small, modular and super cheap Arduino, without the need for additional shields, message passing or external synths.

Note to self to also check out the Mozzi examples gallery.
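
To get a quick feel for the library, the sketch below is a minimal sine-wave example along the lines of the basic one in the Mozzi documentation (written against the classic MozziGuts API and untested here, so treat it as a sketch rather than a definitive version).

// A minimal Mozzi sketch, adapted from the library's basic sine-wave example:
// a single wavetable oscillator whose pitch is set once in setup().

#include <MozziGuts.h>
#include <Oscil.h>                // oscillator template
#include <tables/sin2048_int8.h>  // 2048-sample sine wavetable

#define CONTROL_RATE 64           // control updates per second

// an oscillator playing the sine table at audio rate
Oscil <SIN2048_NUM_CELLS, AUDIO_RATE> aSin(SIN2048_DATA);

void setup() {
  startMozzi(CONTROL_RATE);
  aSin.setFreq(440); // A4
}

void updateControl() {
  // change frequency, envelopes etc. here, e.g. from a sensor reading
}

int updateAudio() {
  return aSin.next(); // next sample from the oscillator, -128 to 127
}

void loop() {
  audioHook(); // required; keeps the audio output buffer filled
}

On an Uno the audio should come out of Mozzi’s default output pin (pin 9), so a small speaker or amplifier there would give a steady 440 Hz tone; any sensor-driven parameter changes for the Space Rocks would go in updateControl().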

This week’s research – 06/04/2018

Thinking this week about echolocation, in the context of the space objects ‘talking’ to each other and sensing the distances between themselves.
https://en.wikipedia.org/wiki/Animal_echolocation

Found some inspiring projects in A Touch of Code. Most notably (so far):

  • Markus Kison’s Touched Echo, using sound conducted through bones. Visitors put themselves into the place of the people who shut their ears away from the noise of the explosions. While leaning on the balustrade, the sound of airplanes and explosions is transmitted from the swinging balustrade through their arm directly into the inner ear (bone conduction).
  • WhiteVoid’s ‘unstuck’ augmented game.
  • “Experiencing Abstract Information” by Jochen Winker and Stefan Kraiss
  • And Leonel Moura’s Robotarium. The first zoo in the world for artificial life.
  • Drawing Machine by Fernando Orellana. Explores the notion of generative art, or art that makes art on its own. The piece consists of a three-tiered mobile sculpture that is driven by the vibration of a motor.
  • Literally Speaking by Torsten Posselt and Martin Kim Luge – transforms tweets from Twitter users into the sound of singing birds.
  • Kathrin Strumreich’s fabric machine. Two fabric loops, driven by a motor, create a division in space. Light sensors measure the opacity of the textile.

Love the design and the sounds of these Bivalvia mini synths.

And some musical inspiration for the Space Rock objects from Hatis Noit; especially the way the first track here plays with voice – using various layers, some treated and distorted, some not.