
Annoying!

March 16, 2009 · inf2241_class

First week [Ideas]

Finally, a weekly project that is off to a good start! This past session was indeed fruitful: we have a solid idea of what to build and have tested the two main components of our project successfully.

We were fortunate enough to have the WiiChuck tutorial left on our table, and it became the starting point of our idea. As we tried to figure out what kind of data we could gather from the accelerometer readings shown in the Arduino Serial window, we toyed with ideas involving graphical displays of bodily information. As a starting point, we discussed an automated dance notation system, inspired by the locative possibilities of the WiiChuck as well as existing graphical dance notation displays [see link 1 link 2].

However, we wanted the project and final device to address the idea of space, namely the line between private and public space, in a more direct manner. We opted for a more… “annoying” approach: a device that emits different sounds depending on your body movements and the relative position of your limbs in space. How exactly does that speak to public space? We’re not sure yet, but we can tell you that the first experiment with the sound component was completely successful at alerting and annoying the people around us… even ourselves.

First week [Technical]

The basic WiiChuck circuit was left untouched from the tutorial. What we need to understand is how the x, y and z components displayed in the serial window respond to different types of movement. That will in turn let us trigger and shape different sounds with those variables. Additionally, we successfully completed a sound tutorial in which we produced a basic song [see video] using a small speaker.
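For watching those readings, a minimal sketch along the lines of the Windmeadow Labs Nunchuck code reused in the second week’s full listing could just stream the raw accelerometer values to the serial window. This is only a sketch of that idea: the pre-1.0 Wire.send()/Wire.receive() calls and the power-pin trick mirror the full listing, and the wiring is assumed to follow the WiiChuck tutorial.

#include <Wire.h>

static uint8_t buf[6];              // raw 6-byte payload from the Nunchuck

void setup() {
  Serial.begin(9600);

  // power the Nunchuck from analog pins 2 & 3, as in the full listing below
  DDRC  |= _BV(PC3) | _BV(PC2);
  PORTC &= ~_BV(PC2);               // analog 2 acts as ground
  PORTC |=  _BV(PC3);               // analog 3 acts as power
  delay(100);

  Wire.begin();                     // join the I2C bus as master
  Wire.beginTransmission(0x52);     // the Nunchuck answers at address 0x52
  Wire.send(0x40);                  // initialization handshake
  Wire.send(0x00);
  Wire.endTransmission();
}

void loop() {
  Wire.requestFrom(0x52, 6);        // ask for the six data bytes
  int cnt = 0;
  while (Wire.available() && cnt < 6) {
    buf[cnt++] = (Wire.receive() ^ 0x17) + 0x17;   // decode each byte
  }
  Wire.beginTransmission(0x52);     // request the next payload
  Wire.send(0x00);
  Wire.endTransmission();

  // bytes 2-4 carry the upper 8 bits of the x, y and z accelerometers (0-255)
  Serial.print(buf[2], DEC); Serial.print(" ");
  Serial.print(buf[3], DEC); Serial.print(" ");
  Serial.println(buf[4], DEC);

  delay(100);                       // about ten readings per second
}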

WiiChuck connected to Arduino

Creating music using the WiiChuck

An experiment that took us a great deal of time was controlling the melody example (from the Arduino website) with the WiiChuck. As a preliminary modification of the Arduino code, we introduced an analog input that lets us control different variables while the sound is playing. Once this is mastered, we will look at how the accelerometers in the WiiChuck can add further control to our wearable device.

Testing the Melody code on Arduino

——————————–

int speakerPin = 9;
int analogIn = 0;

int length = 15; // the number of notes
char notes[] = "ccggaagffeeddc "; // a space represents a rest
int beats[] = { 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 2, 4 };
int tempo = 100;

void playTone(int tone, int duration) {
  tempo = analogRead(analogIn);
  for (long i = 0; i < duration * 1000L; i += tone * 2) {
    digitalWrite(speakerPin, HIGH);
    delayMicroseconds(tone);
    digitalWrite(speakerPin, LOW);
    delayMicroseconds(tone);
  }
}

void playNote(char note, int duration) {
  char names[] = { 'c', 'd', 'e', 'f', 'g', 'a', 'b', 'C' };
  int tones[] = { 1915, 1700, 1519, 1432, 1275, 1136, 1014, 956 };

  // play the tone corresponding to the note name
  for (int i = 0; i < 8; i++) {
    if (names[i] == note) {
      playTone(tones[i], duration);
    }
  }
}

void setup() {
  pinMode(speakerPin, OUTPUT);
  pinMode(analogIn, INPUT);
}

void loop() {
  for (int i = 0; i < length; i++) {
    if (notes[i] == ' ') {
      delay(beats[i] * tempo); // rest
    } else {
      playNote(notes[i], beats[i] * tempo);
    }

    // pause between notes
    delay(tempo / 2);
  }
}

Second week [Ideas]

It’s alive! And pretty impressive to boot. We have managed to use the data from the WiiChuck to create a melody. Well, sort of a melody. It’s actually kind of an annoying beeping (Marie-Eve dubbed it the TechnoDJ 1980). Moving the WiiChuck vertically controls the pitch, while moving it horizontally controls the rhythm. Early testing revealed that trying this thing out was extremely annoying, so we introduced a button to start the ‘melody’ (the button on the front of the WiiChuck). This turned out to be an interesting turning point: the entire purpose (morality) of the device changed with the addition of an on/off button. With it, the user has much more control – it is possible for them to present a particular melody at a particular time, as opposed to the one that is automatically generated as they move through a public space. In turn, the perceptions of people around the user are changed…
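Condensed from the full listing in the second week’s technical section below, the heart of that mapping (button gate, vertical tilt to pitch, horizontal tilt to note length) is roughly:

// Inside nunchuck_print_data(), once the accelerometer values are unpacked:
if (z_button == 0) {                           // z_button reads 0 while the button is held
  setTone(accel_x_axis, 60, 160);              // vertical tilt picks one of eight notes
  tempo = map(accel_y_axis, 70, 180, 1, 500);  // horizontal tilt sets the note length (ms)
  if (myNote > 0)
    playTone(myNote, tempo);                   // bit-bang a square wave on the speaker pin
}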

We also managed to hook the Arduino up to a battery; once the code was loaded onto the board, the entire setup could be hidden away inside someone’s clothing. That way, the noises were less locatable, with nothing to immediately identify who was creating them. Again, this changes how the user and device interact with the space around them. We noticed this when wandering around New College. People would notice the odd sounds, but would frequently not be able to establish their source…

We had an interesting discussion regarding whether we should further disassemble the WiiChuck. We concluded that, while doing so would allow slightly more comfort for the user, it would also make the device less repurposable. In a similar vein, we decided that we would pass the code on to the wearer (or the purchaser, or whoever), to allow them some control over the tones emitted. As it stands, the user does not have much agency over the device; allowing them to adjust the code would pass some of it back to them (see the small test sketch below). We could even hope that they would start to tinker with the code or circuitry, then pass it on…
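One concrete way to hand some of that control back: a small standalone test sketch, reusing playTone() and the eight half-period values from the full listing below, that a wearer could upload to audition a different scale before strapping the device on. This is only a sketch of the idea, not part of the device’s code.

int speakerPin = 13;                    // same speaker pin as the device
// half-periods in microseconds; smaller numbers mean higher pitch
int scale[] = { 956, 1014, 1136, 1275, 1432, 1519, 1700, 1915 };

// bit-bang a square wave, exactly as playTone() does in the listing below
void playTone(int tone, int duration) {
  for (long i = 0; i < duration * 1000L; i += tone * 2) {
    digitalWrite(speakerPin, HIGH);
    delayMicroseconds(tone);
    digitalWrite(speakerPin, LOW);
    delayMicroseconds(tone);
  }
}

void setup() {
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  for (int i = 0; i < 8; i++) {
    playTone(scale[i], 200);            // 200 ms per step of the scale
    delay(100);
  }
  delay(2000);                          // pause before repeating
}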

Discussion of articles

Mann, S. (1996). Smart clothing: The shift to wearable computing. Communications of the ACM, 39(8), 23-24.

Knight et al. (2007). Uses of accelerometer data collected from a wearable system. Personal and Ubiquitous Computing, 11(2), 133-143.

Kirschenbaum, M. G. (2004). “So the Colors Cover the Wires”: Interface, aesthetics, and usability. In S. Schreibman, R. Siemens, & J. Unsworth (Eds.), A Companion to Digital Humanities. Oxford: Blackwell.

In his article “Smart clothing: the shift to wearable computing”, Steve Mann examines the fundamental issues inherent in this (then) new discipline. As one of the pioneers of the wearable computing field, Mann addresses the social and physical awkwardness of wearable devices, yet expresses a sense of self-empowerment as these tools became part of him. Mann then briefly looks at the idea of privacy and public space, envisioning a possible global village of people wearing cameras instead of being watched by CCTVs. He pursues this idea further in terms of clothing versus uniform, where the Orwellian risk of being forced to wear computing devices for surveillance purposes might become a reality. As wearable computing devices normally sit at the core of the wearer’s personal space, that space stands to be both violated and protected.

In their 2007 article, Knight et al. discuss current uses for accelerometer data. They outline a wide range of uses, from teaching physics to coaching sports and research on human movement. One particularly interesting use involves correlating accelerometer output with measures of physical activity, such as heart rate. With sufficient data, accelerometer readings could be mapped to energy output; the user could then use body-mounted devices to know, for example, the number of calories burned while performing a physical activity. The article concludes with some suggestions for obtaining effective readings (regarding noise and sampling frequency) and some suggestions for future uses. For our project, the article was useful for its broad range of potential applications; less useful for our purposes was its heavy emphasis on technical aspects, which skew towards the scientific end of the spectrum.
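To make that idea concrete: this is purely illustrative and not from the article; REST_LEVEL and KCAL_PER_COUNT are made-up placeholders that would have to be calibrated against a reference measure such as heart rate, and analogRead() here just stands in for a real accelerometer sample. A naive mapping from raw readings to an “activity count” and onward to an energy estimate could be sketched as:

// Purely illustrative: accumulate how far a raw accelerometer reading strays
// from its resting value, then scale that "activity count" to an energy figure.
// Both constants are hypothetical and would need real calibration.
const int REST_LEVEL = 125;            // roughly what one axis reads at rest
const float KCAL_PER_COUNT = 0.0001;   // made-up calibration coefficient

long activityCount = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(0);         // stand-in for an accelerometer sample
  activityCount += abs(reading - REST_LEVEL);
  Serial.println(activityCount * KCAL_PER_COUNT);   // running energy estimate
  delay(100);
}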

Kirschenbaum (2004) discusses the role and importance of interfaces. Interfaces have usually meant graphical user interfaces (GUIs), treated as a minor element of the development process: they are commonly left to the final stage of development and built without any interaction with the eventual users. Kirschenbaum offers an alternative framework, grounded in the critical approach the humanities can bring to this predominant way of thinking about GUIs. That approach becomes especially significant when designing wearable devices, since the boundary between the content and the medium of a wearable artefact becomes unclear. As Kirschenbaum suggests, “form and content are almost instinctively understood as inextricable from one another.”

… dancing your own beep: notes on the evolution of wearable ideas

The initial ideas, inspired by the dance notation and the puzzling comfort of the WiiChuck, have evolved into interesting intuitions about wearable devices and relations of power, ownership and surveillance, and now of creativity, the tactile and the social.

Starting up with WiiChuck…

During the early stages of the design process, a challenging idea kept our focus: imagining how to capture the analog readings from the WiiChuck and correlate them with an automated construction of diagrams using an appropriate dance notation. The very thought was just fascinating!! Although not doable within the time-span for our wearable device (perhaps because “we are not Pixar,” as some of us put it), the challenge did not stop us; it led us further, to carefully explore the complexity of what we had right in front of us – the data we could obtain from the WiiChuck and its quite “magical” roughness. This exploration took us a while… not actually the time to make it work, which was the easiest part, since we re-used some freely available code for Arduino (with some nice steps further assisted by the instructor). Just ready-to-use, ready-to-be-re-purposed, although at first sight somewhat overwhelming: five streams of numbers, ranging from 0 to 255, and two Boolean values (0 or 1) when pressing the buttons. The first day just went on…
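Concretely, the six bytes we kept staring at break down like this (just a commented restatement of what nunchuck_print_data() in the listing further down unpacks):

// nunchuck_buf[0]  joystick x              (0-255)
// nunchuck_buf[1]  joystick y              (0-255)
// nunchuck_buf[2]  accelerometer x, upper 8 of its 10 bits
// nunchuck_buf[3]  accelerometer y, upper 8 of its 10 bits
// nunchuck_buf[4]  accelerometer z, upper 8 of its 10 bits
// nunchuck_buf[5]  bit 0: z button, bit 1: c button (0 while pressed),
//                  bits 2-7: the two low bits of each accelerometer axis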


First metamorphosis… “talk about you as you walk”

It was during that week, in a discussion about the issues of “Privacy and Piracy,” that the automated dance notation suddenly mutated into a surveillance device. We then wanted to build a heavily-loaded sensor (“we might have to take apart some of these WiiChucks,” we thought), using multiple pieces, so that we could build a machine that could “talk about you” secretly, informing on you as you walk. There we were; only now, in retrospect, does it seem clear. Then came another prototype, a huge one again, a device capable of “learning to dance,” but full of rather secret purposes, “noble” ones, some might have shouted out loud. The chunks of Arduino code started fitting together smoothly once we discovered that the println() statements were messing everything up. It became evident how expensive (in processing power) the unnecessary console output was. Once it was commented out, the WiiChuck seemed to wake up as we had never seen before: larger ranges of values started to show up…
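We simply commented the prints out, but for anyone repurposing the code, a compile-time switch is another way to keep them around without paying for them on every pass through loop(). This is only a sketch: the DEBUG flag and DBG_PRINTLN macro are hypothetical, not part of the listing below.

//#define DEBUG                     // uncomment to get the serial spew back

#ifdef DEBUG
  #define DBG_PRINTLN(x) Serial.println(x)
#else
  #define DBG_PRINTLN(x)            // compiles away to nothing
#endif

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(0);      // stand-in for a WiiChuck reading
  DBG_PRINTLN(reading);             // costs nothing unless DEBUG is defined
  delay(1);
}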

Second metamorphosis… “dancing our own beep”

The week started quite “overloaded,” though it was not really clear what kind of semantic “overload” it was, nor which uncertain shifts were awaiting. Some spikes in our minds; resonating concepts, old words, known and old, but not quite. It was then present, that strange sense of proximity and distance which invites such novelty. These very old-new words were making sense together, yes!! Resonating as a melody sometimes does, but also as the not-so-melodic does. There we were, carrying all of them with us: “movement and dance,” WiiChuck of course, but also “social,” “capital,” “collective.” They could not have been missed; the discussion on Facebook had just tuned them up in a flow of consciousness. Not quite graceful, the mood – the “critical mood,” not the shift, or rather the reaction to uses and, even more, misuses of classifications when modeling subjectivity (ed. ummmmm). Pushing metadata and building identity. Pulling as someone might pull on a leaf, so that a big chunk of the roots comes apart with it. At first, so tacitly, “efficiency,” “effectiveness,” “customer-behaviors” and so many others that we could not read, as they were too close. Amid the noise, and the proximal-distant, the novelty in our minds went back home, “dancing our own beep.”
Below is the “hybrid” code we have come up with… load it, add to it, comment on it. Dare to explore multiple modalities of perception; “critical thinking” might become “critical making” for you as well…   A. (m/16)

Second week [Technical]

————————————

/*
* Generic Wiichuck code
*
* Adapted from
* NunchuckServo
*
* 2007 Tod E. Kurt, http://todbot.com/blog/
*
* The Wii Nunchuck reading code is taken from Windmeadow Labs
*   http://www.windmeadow.com/node/42
*
*
* 2009 Thanks to Matt R. for fixing a bit more and making this accessible
*
* 2009 Thanks to Anto for extracting playTone() from the Arduino melody example and making it work here
*
*/

#include <Wire.h>

// wii vars

int joy_x_axis;
int joy_y_axis;
int accel_x_axis;
int accel_y_axis;
int accel_z_axis;

int z_button = 0;
int c_button = 0;

int loop_cnt = 0; // control var counting msec passed
int loop_cnt_tot = 200; //  msec to wait until data is read again
int controlToneVar;

// melody vars
int speakerPin = 13;
int analogIn = 5;
int tempo = 200;
int sensorVar = 0;
int myNote = 0;

void setup() {
pinMode(speakerPin, OUTPUT);
pinMode(analogIn, INPUT);
Serial.begin(9600); // initialize serial port

// wii vars
nunchuck_setpowerpins(); // use analog pins 2&3 as fake gnd & pwr
nunchuck_init(); // send the initialization handshake
}

void loop() {

// Wii procedures

// get var from the Wii
checkNunchuck();

// control melody

//myNote = map(sensorVar,0,1023,100,5000);

/*
Serial.println("");
Serial.print("My Tone: ");
*/

//Serial.println(controlToneVar);
//Serial.println(tempo);

delay(1);

//tempo=sensorVar;

}

// melody function: play a tone
void playTone(int tone, int duration) {
for (long i = 0; i < duration * 1000L; i += tone * 2) {
digitalWrite(speakerPin, HIGH);
delayMicroseconds(tone);
digitalWrite(speakerPin, LOW);
delayMicroseconds(tone);
}
}
/*
Sets the tone to play based on other data available
*/
void setTone(int measure, int start, int end) {

//myNote = map(measure,start, end, 100,3000);
//myNote = measure;

// convert the given range to the 0 - 1023 range
int sensorVarNew = 0;
sensorVarNew = map(measure,start, end, 0,1023);

//Serial.println(measure);
///*
// play only the scale
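// The bands below split the 0-1023 range into eight roughly equal slices;
// each slice picks one half-period (in microseconds) from the melody
// example's scale, so a lower reading means a shorter period and a higher beep.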
if ((sensorVarNew>=0) && (sensorVarNew<=126)){
myNote = 956;
controlToneVar = 1;
} else if ((sensorVarNew>=127) && (sensorVarNew<=253)) {
myNote = 1014;
controlToneVar = 2;
} else if ((sensorVarNew>=254) && (sensorVarNew<=381)) {
myNote = 1136;
controlToneVar = 3;
} else if ((sensorVarNew>=382) && (sensorVarNew<=507)) {
myNote = 1275;
controlToneVar = 4;
} else if ((sensorVarNew>=508) && (sensorVarNew<=634)) {
myNote = 1432;
controlToneVar = 5;
} else if ((sensorVarNew>=635) && (sensorVarNew<=761)) {
myNote = 1519;
controlToneVar = 6;
} else if ((sensorVarNew>=762) && (sensorVarNew<=888)) {
myNote = 1700;
controlToneVar = 7;
} else if ((sensorVarNew>=889) && (sensorVarNew<=1023)) {
myNote = 1915;
controlToneVar = 8;
}
//*/
}

// Wii functions

void checkNunchuck()
{
if( loop_cnt > loop_cnt_tot ) {  // loop() runs roughly every 1 msec; this block runs every loop_cnt_tot msec

nunchuck_get_data();
nunchuck_print_data();

loop_cnt = 0;  // reset the counter
}
loop_cnt++;

}

//
// Nunchuck functions
//

static uint8_t nunchuck_buf[6];   // array to store nunchuck data,

// Uses port C (analog in) pins as power & ground for Nunchuck
static void nunchuck_setpowerpins()
{
#define pwrpin PC3
#define gndpin PC2
DDRC |= _BV(pwrpin) | _BV(gndpin);
PORTC &=~ _BV(gndpin);
PORTC |=  _BV(pwrpin);
delay(100);  // wait for things to stabilize
}

// initialize the I2C system, join the I2C bus,
// and tell the nunchuck we’re talking to it
void nunchuck_init()
{
Wire.begin();                    // join i2c bus as master
Wire.beginTransmission(0x52);    // transmit to device 0x52
Wire.send(0x40);        // sends memory address
Wire.send(0x00);        // sends a zero
Wire.endTransmission();    // stop transmitting
}

// Send a request for data to the nunchuck
// was “send_zero()”
void nunchuck_send_request()
{
Wire.beginTransmission(0x52);    // transmit to device 0x52
Wire.send(0x00);        // sends one byte
Wire.endTransmission();    // stop transmitting
}

// Receive data back from the nunchuck,
// returns 1 on successful read. returns 0 on failure
int nunchuck_get_data()
{
int cnt=0;
Wire.requestFrom (0x52, 6);    // request data from nunchuck
while (Wire.available ()) {
// receive byte as an integer
nunchuck_buf[cnt] = nunchuk_decode_byte(Wire.receive());
cnt++;
}
nunchuck_send_request();  // send request for next data payload
// If we received the 6 bytes, then go print them
if (cnt >= 5) {
return 1;   // success
}
return 0; //failure
}

// Print the input data we have received
// accel data is 10 bits long
// so we read 8 bits, then we have to add
// on the last 2 bits.  That is why I
// multiply them by 2 * 2
void nunchuck_print_data()
{
static int i=0;
joy_x_axis = nunchuck_buf[0];
joy_y_axis = nunchuck_buf[1];
accel_x_axis = nunchuck_buf[2]; // * 2 * 2;
accel_y_axis = nunchuck_buf[3]; // * 2 * 2;
accel_z_axis = nunchuck_buf[4]; // * 2 * 2;

z_button = 0;
c_button = 0;

// byte nunchuck_buf[5] contains bits for z and c buttons
// it also contains the least significant bits for the accelerometer data
// so we have to check each bit of byte outbuf[5]
if ((nunchuck_buf[5] >> 0) & 1)
z_button = 1;
if ((nunchuck_buf[5] >> 1) & 1)
c_button = 1;

if ((nunchuck_buf[5] >> 2) & 1)
accel_x_axis += 2;
if ((nunchuck_buf[5] >> 3) & 1)
accel_x_axis += 1;

if ((nunchuck_buf[5] >> 4) & 1)
accel_y_axis += 2;
if ((nunchuck_buf[5] >> 5) & 1)
accel_y_axis += 1;

if ((nunchuck_buf[5] >> 6) & 1)
accel_z_axis += 2;
if ((nunchuck_buf[5] >> 7) & 1)
accel_z_axis += 1;

//accel_y_axis *= 11;

/*
// Printing data
Serial.print(i,DEC);
Serial.print("\t");

Serial.print("joy:");
Serial.print(joy_x_axis,DEC);
Serial.print(",");
Serial.print(joy_y_axis, DEC);
Serial.print("  \t");

Serial.print("acc:");
Serial.print(accel_x_axis, DEC);
Serial.print(",");
Serial.print(accel_y_axis, DEC);
Serial.print(",");
Serial.print(accel_z_axis, DEC);
Serial.print("\t");

Serial.print("but:");
Serial.print(z_button, DEC);
Serial.print(",");
Serial.print(c_button, DEC);

Serial.print("\r\n");  // newline
*/
i++;

// Play tone when z button pressed
if (z_button==0)
{
setTone(accel_x_axis, 60, 160);
//setTone(accel_y_axis, 70, 190);
//setTone(accel_y_axis, 80, 100);
//setTone(accel_y_axis, 0, 255);

// set the tempo using Wii controller
//tempo = map(joy_y_axis,0,255,10,500);
//tempo = map(accel_x_axis,60,150,1,500);
tempo = map(accel_y_axis,70,180,1,500);

if (myNote>0)
playTone(myNote, tempo);
}

// fix negative tempos
if (tempo<0)
{
tempo = 1;
}
//Serial.println(tempo);

//int p1 = map(accel_x_axis);

// exporting to processing
/*
Serial.print(accel_x_axis, BYTE);
Serial.print(accel_y_axis, BYTE);
Serial.print(accel_z_axis, BYTE);
*/

//Serial.print(joy_x_axis, BYTE);
//Serial.print(joy_y_axis, BYTE);

//delay(50);

}

// Encode data to format that most wiimote drivers accept
// only needed if you use one of the regular wiimote drivers
char nunchuk_decode_byte (char x)
{
x = (x ^ 0x17) + 0x17;
return x;
}

// returns zbutton state: 1=pressed, 0=notpressed
int nunchuck_zbutton()
{
return ((nunchuck_buf[5] >> 0) & 1) ? 0 : 1;  // voodoo
}

// returns cbutton state: 1=pressed, 0=notpressed
int nunchuck_cbutton()
{
return ((nunchuck_buf[5] >> 1) & 1) ? 0 : 1;  // voodoo
}

// returns value of x-axis joystick
int nunchuck_joyx()
{
return nunchuck_buf[0];
}

// returns value of y-axis joystick
int nunchuck_joyy()
{
return nunchuck_buf[1];
}

// returns value of x-axis accelerometer
int nunchuck_accelx()
{
return nunchuck_buf[2];   // FIXME: this leaves out 2-bits of the data
}

// returns value of y-axis accelerometer
int nunchuck_accely()
{
return nunchuck_buf[3];   // FIXME: this leaves out 2-bits of the data
}

// returns value of z-axis accelerometer
int nunchuck_accelz()
{
return nunchuck_buf[4];   // FIXME: this leaves out 2-bits of the data
}
————————————-
