criticalmaking.com

Bruno’s Bellybutton saves the world!!

April 11, 2009 · inf2241_class · fi2241

In daily life the individual ordinarily speaks for himself, speaks, as it were, in his ‘own’ character. However, when one examines speech, especially the informal variety, this traditional view proves inadequate…When a speaker employs conventional brackets to warn us that what he is saying is meant to be taken in jest, or as mere repeating of words by someone else, then it is clear that he means to stand in a relation of reduced personal responsibility for what he is saying. He splits himself off from the content of the words by expressing that their speaker is not he himself or not he himself in a serious way.
– Erving Goffman

All the world’s a stage, and all the men and women merely players.
– William Shakespeare

Sabrina stabbed listlessly at her steak tartare and stared a little to the left of Matt’s nose to appear to be listening to him. “Does he never shut up?” she thought, likening the sound of his voice to a thousand bees dancing deep within her skull. A quick glance to the left and right showed that Bob and Nathaniel were as enthralled as she. As the winners of the Very Bestest and Insightfullest Project Award of the 2009 Critical Making Lab for their brilliant Social/Lites from Bruno’s Bellybutton, the three tortured graduate students were treated to dinner by their verbose professor. Sacrifices must be made for free food.

Just as the monotony was reaching a fever pitch of violent beige dismalness, Matt tensed slightly and stared at his glass, which had just then lit up with a slight warming glow, an indecipherable question on his face. Suddenly silent, he grinned sheepishly and looked around the table. "Oops! My 'glass' is trying to tell me something."

Sabrina giggled to herself as Bob launched into a passionate account of the shortcomings of Arduino and Nathaniel twiddled his wineglass. Social/Lites from Bruno’s Bellybutton actually worked! It actually managed to send a non-verbal message to the intended receiver, who received it, understood it, and reacted to it. In this case, someone had told Matt to shut up. But who? It wasn’t her. Matt could have sent it to himself to get out of talking in circles, but that seemed unlikely. Bob usually kept his opinions to himself unless asked directly. Only Nathaniel was lacking the internal editor that would stop normal people from telling their professor to clam it.  Luckily, Matt had a good sense of humour. A very good sense of humour. A very, VERY good sense of humour …

Social/Lites from Bruno’s Bellybutton is a social device that enhances pre-existing communication currents through technology and human social knowledge. The concept is simple: each place at a table is equipped with a foot pedal attached by wire to a central Arduino hub, which is itself connected to specially constructed glasses at each place. Each pedal contains pressure switches, one assigned to each glass at the table, allowing the pedal to communicate with any glass. The glasses, in turn, are able to both send and receive messages by means of LEDs and mercury switches. When a participant wants to send a message to a particular companion, she activates the corresponding pressure switch in the foot pedal and tilts her glass in one of three predetermined directions. The mercury switches in the glass (separated from the liquid for health reasons) read the direction the glass is tilted and send a message to the intended recipient. Each direction corresponds to one of three commonly known iconic symbols: smiley face, winky face, and frowny face. Thus the sender chooses the recipient with the pedal and the message with the glass. The recipient sees one of three backlit emoticons glowing at the bottom of her glass (emoticons were chosen for their simplicity and increasing universality). The light is strong enough to make the glass glow, alerting everyone at the table that a message has arrived, though they do not know what the message is or who sent it.
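For the technically curious, the send logic can be sketched in a few lines of Arduino code. This is a minimal illustration for a single sender-recipient pair, not our actual firmware: the pin numbers are invented, and the real build runs one pressure switch per companion and three mercury switches per glass through the central hub.

// Minimal sketch of the Social/Lites send logic (illustrative only).
// Assumes: one pedal pressure switch on pin 2 selects the recipient,
// three mercury tilt switches on pins 3-5 pick the emoticon, and the
// recipient's glass holds three LEDs on pins 9-11 (smiley, winky, frowny).

const int pedalPin = 2;                   // pressure switch in the sender's pedal
const int tiltPins[3] = {3, 4, 5};        // mercury switches: three tilt directions
const int emoticonLeds[3] = {9, 10, 11};  // backlit emoticons in the recipient's glass

void setup() {
  pinMode(pedalPin, INPUT);
  for (int i = 0; i < 3; i++) {
    pinMode(tiltPins[i], INPUT);
    pinMode(emoticonLeds[i], OUTPUT);
  }
}

void loop() {
  // A message goes out only while the pedal selects this recipient.
  if (digitalRead(pedalPin) == HIGH) {
    for (int i = 0; i < 3; i++) {
      if (digitalRead(tiltPins[i]) == HIGH) {
        digitalWrite(emoticonLeds[i], HIGH);  // the glass glows with the emoticon
        delay(3000);                          // hold the message long enough to read
        digitalWrite(emoticonLeds[i], LOW);
      }
    }
  }
}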

Recognizing that the majority of face-to-face socializing takes place around tables, Social/Lites from Bruno’s Bellybutton explores the never-verbalized communication socializers employ to make subtle comments on events unfolding during the meal. In the example above, one party member decided to tell another that he was monopolizing the conversation. In the somewhat formal circumstances of restaurant dining with a professor, where simply interrupting would be inappropriate, participants rely on subtle non-verbal cues to broadcast their message. While the main character tried to appear interested although she wasn’t, it was clear to her that her two silent companions were not interested in the conversation. To some speakers, this dispassionate ambience would be sufficient to deduce that it may be time to invite others to speak. When the speaker fails to notice this, the other party members must resort to more active tactics that still remain within the boundaries of social etiquette, such as meaningful glances or fidgeting with silverware. Should this fail, meal participants are forced to resort to increasingly overt means of appropriating the conversational space, increasing the risk of breaking social norms.

Social/Lites examines these communications and heightens them in a lighthearted manner. While acknowledging that it is impossible to grab someone’s attention overtly in a situation where social norms prohibit such blatancy, the device nevertheless allows a certain amount of discretion for both participants. The recipient knows that the message’s contents and its sender are not broadcast to every individual at the table. The sender has the same assurance, and the other participants benefit from a fair amount of entertainment as they try to guess what message was sent and by whom.

As the meal progresses, participants increasingly attempt to piece together who “says” what to whom. They already have at their disposal the general context and knowledge of the relationships between their companions, as well as the more immediate circumstances of the overt communication occurring. These, along with the sudden glowing of the glasses and participant reactions, are the tools used to gauge the kinds of covert communication from which they are partially excluded. In this respect, Social/Lites allows groups to enhance pre-existing currents of communication and add solidity to the dinner theatre in which they play their roles, while at the same time forcing each other to take responsibility for the particulars of their interactions, since nothing is strictly anonymous, especially for the receivers.

The strength of Social/Lites is that it allows users enough cover to send messages back and forth with confidence that they are received and understood, without breaking social taboos or rules of etiquette. The professor in the opening excerpt receives a clear message that inflicts only a small injury to his dignity and allows him to save face through humour. Perhaps now-separated new parents Bristol Palin and Levi Johnston would have been able to continue their romance if they had had the opportunity to send winkies and smilies back and forth throughout their many official meals during the 2008 American election campaign. It is conceivable that international relations would have gone better for America had Prime Minister Koizumi of Japan been able to send a covert frowny to President Bush after his ill-conceived shoulder rub of Germany’s Chancellor Merkel, saying, “Hey dude! Not cool.”

Critical Making && Inclusive Design Studio Open Show!

April 9, 2009 · matt.ratto · events, fi2241

Critical Making && Inclusive Design Studio open show

April 16, 2009: 2-5pm.

Rm. 728, Bissell Building, University of Toronto.

FI2241 and FI2196 invite you to join us in Bissell, rm. 728 on Thursday, April 16 from 2-5pm for an open show of our final projects. Free wine and other refreshments will be served.

Critical Making – FI2241 – Using design-based research on physical computing as an adjunct to information scholarship, the course has explored critical issues of intellectual property, technological bias, identity and public space. The projects to be shown include:

•    Social/Lites: An experiment in collaborative dinner theatre. (Bruno’s  BellyButton)
•    Roomalizer: Breathing New Life Into Public Spaces. (Under Construction)
•    Rage Against the Machine: Let us tell you how you feel. (Shake n’ Bake)
•    The Hipster: Flaneur 2.0 (NUTS)

Inclusive Design – FIS2187/FIS2196 – Students will be presenting projects resulting from engagement with both theoretical and practical issues in inclusive design, including designs or demos for:

•    An accessible web chat client in an online community
•    Implementing and testing Flexible User Interface (FLUID) components  designed for learning management systems such as Sakai and ATutor
•    An accessible mobile client for an indoor positioning system
•    Inclusive policies for online communities (involving children and  adults)
•    Formats for import and export of video description and captioning
•    Translating Cultural Artifacts across sensory modalities

Come join us, interact and explore creative solutions, new forms of ubiquitous computing, and engaging solutions to issues of inclusive design. Email matt.ratto@utoronto.ca if you need directions or more info!

Roomalizer

April 7, 2009 · inf2241_class · fi2241

“You don’t want to make anything too useful” (Ratto, 2009)

With those inspirational words, and our own initial ideas about linking personal feelings with public space, we set about designing our final project… the Roomalizer

During our initial brainstorm, we thought about spidering Twitter or some other microblogging site for emotionally loaded words (love, hate, jealous) and then translating them into a physical representation. We also discussed ideas about a wearable that would display thoughts that would be otherwise inappropriate for a given social context (“you’re standing too close!” or maybe “yes, you do look fat in those jeans”).

We eventually settled somewhere in between. We’re looking to design a system that monitors the excitement level in a room and translates it into a physical effect. The current installation of choice? A wall that breathes. As the level of excitement in the room goes up (measured via some yet-to-be-determined variables), the physical output of the device increases… the wall starts breathing a little faster, changes colours, the display output goes haywire, etc.

Week 1

Our first session in the lab revolved around what effect we were going for and how we would make it work. We discussed using infrared, pressure and aural sensors to gauge the feeling in a room. We eventually settled on Matt’s suggestion of extracting data from a webcam feed. For the physical representation, we looked into using fans, servos or solenoids to simulate breathing. We left the lab undecided, but with plenty to think about. Below are some initial drafts of the physical mechanism, which would translate the readings from a camera into physical movement.

Breathing mechanism

Servo motor function

Initial sketch for servo motor (click for a video demo)

Note: around this mechanism there will be a fabric covering, with a lamp at the bottom displaying RGB colors

Week 2

While part of the team got to work hacking the webcam code, the others started experimenting with fans and fabric to create the lungs. The schematics shown above weren’t feasible, and the fan idea was quickly discarded as a dead end. We then borrowed (repurposed? stole?) BBB’s breadboard assemblage and servo, taped on some cardboard and a Starbucks cup, loaded the Barragan servo code and stuck the whole contraption behind a white piece of fabric. To our surprise, it kind of worked. We fiddled a little with the code to control the range (reduced from 180 degrees to about 75) and the speed (controlled with the delay function, which we’ll later feed with a value derived from the webcam data), and then called it a day.

Code

#include <Servo.h>

Servo myservo;  // create servo object to control a servo
                // a maximum of eight servo objects can be created

int pos = 0;             // variable to store the servo position
int breathe_speed = 0;   // variable to store the delay derived from excitement
//int breathe_speed_out = 0; // need two vars?

void setup()
{
  myservo.attach(9);  // attaches the servo on pin 9 to the servo object
}

void loop()
{
  for (pos = 45; pos < 150; pos += 1)   // sweep up from 45 to 150 degrees
  {                                     // in steps of 1 degree
    myservo.write(pos);                 // tell servo to go to position 'pos'
    delay(breathe_speed);               // wait between steps (this value will
  }                                     // later come from the webcam data)
  for (pos = 150; pos >= 45; pos -= 1)  // sweep back down from 150 to 45 degrees
  {
    myservo.write(pos);
    delay(breathe_speed);
  }
  breathe_speed += 1;  // placeholder: each breath a little slower than the last
}

Week 3

We arrived at the lab early, eager to get our lungs breathing. We quickly decided that two servos with cardboard arms and balloons attached at the ends were our best bet. We spent the next few hours synchronizing the servos (which was surprisingly difficult, because they need to move in opposite directions and are restricted to 180 degrees of movement), assembling the whole contraption, and combining the webcam code with the servo code.
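The combined code isn’t ready to post yet, but the coupling we are aiming for looks roughly like the sketch below: the laptop analyses the webcam feed and writes an excitement value (0-255) to the serial port, and the Arduino maps that value onto the delay that sets the breathing rate. The protocol and ranges here are placeholders, not our final code.

#include <Servo.h>

// Illustrative sketch of the webcam-to-servo coupling: the laptop sends an
// excitement byte (0 = calm, 255 = frantic) over serial; higher excitement
// means a shorter delay between servo steps, i.e. faster breathing.

Servo lung;
int breathe_speed = 20;  // ms delay per degree of servo travel

void setup() {
  lung.attach(9);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    int excitement = Serial.read();                  // 0-255 from the webcam code
    breathe_speed = map(excitement, 0, 255, 40, 2);  // inverted: excited = fast
  }
  for (int pos = 45; pos < 150; pos++) {   // inhale
    lung.write(pos);
    delay(breathe_speed);
  }
  for (int pos = 150; pos >= 45; pos--) {  // exhale
    lung.write(pos);
    delay(breathe_speed);
  }
}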

MORE TO COME…

Parallel Conversation Table

March 25, 2009 · inf2241_class · fi2241

2009 March 15, overheard somewhere deep within Bruno’s Bellybutton:

BB1: What are we going to make?  How are we going to make it?

BB2: I have an idea… let’s wire a dinner table where people sit down normally but can send each other secret messages, kind of like kicking someone or putting your hand on someone’s thigh under the table while still maintaining good table etiquette…

BB1: Can we also send electric shocks to people who talk too much/we don’t like?

BB3: I heart this idea.

BB2: Yes… and an ‘eject’ button.

BB1: And a speaker that makes it sound like someone across the table farted.

BB2: Critical Information Studies / Design-Oriented Research Final Project: “Breaking Bread… and Wind”

BB1: “Breaking Bread, Wind and Hearts”  It could be used to tell dates that it isn’t working, while you go to the washroom or make your escape.

BB3: A “Bad Date” escape device?  Cha-ching!

Phillips (2009) describes identity and social relations as “performances” negotiated in social settings and recalls Goffman’s (1959) metaphors of ‘front stage’ and ‘back stage’ as the means by which we selectively reveal ourselves to those around us.  At a sit-down dinner party, conversation above the table, characterized by self-conscious words, careful etiquette, as well as generously-filled wine glasses is ‘front stage.’  Yet beneath the table, an entirely different, surreptitious ‘back stage’ conversation is carried out – literally, a subtext to the formal social occasion.  Perhaps information technology can act to mediate these parallel conversations, or better yet, bring them to new and wonderful heights of extreme discomfort.

//BBB

Waste your precious time. C’mon, do it.

March 19, 2009 · inf2241_class · fi2241

Imagine being forced to sit down in the Bloor-Yonge subway station at 8:30 AM in the midst of rush hour pedestrian traffic. Just sitting there on one of those red benches, the ones no one ever sits on. Imagine all the things you would notice that you had never noticed before: the dust and grime on the ceiling, the conversations of those walking by, the flickering lights. This idea of spending a great deal of time in a space you would never choose to linger in is not a new idea. Activities such as slow walking allow the individual to make use of space in a similar fashion.

In our next project we would like to explore this idea by creating a mutated scavenger hunt. Instead of having individuals go from spot to spot as quickly as they can, we want to force partakers to stop and smell the exhaust. They will have to sit still in a space for a good deal of time before the sensor they are carrying tells them they have spent enough time there.

We plan on creating the device using RFID tags and an RFID reader. The device will be very simple: the user will carry it around to specified locations. The RFID reader will pick up the RFID tag and, after 5 minutes or however long, a small LED light or speaker will be activated to inform the user that they have in fact spent enough time in the location and that it is time to move on – if they wish.
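The logic is simple enough to sketch already. The sketch below simplifies the RFID side to a single digital “tag in range” signal; a real reader would need serial parsing instead, and the pin choices and the 5-minute constant are placeholders.

// Sketch of the dwell-timer logic (assumptions: the RFID reader is reduced
// to a digital "tag in range" signal on pin 2; the "done" LED is on pin 13).

const int tagPresentPin = 2;
const int donePin = 13;                    // LED (or speaker) for "time's up"
const unsigned long dwellTime = 300000UL;  // 5 minutes, in milliseconds

unsigned long arrivedAt = 0;
boolean timing = false;

void setup() {
  pinMode(tagPresentPin, INPUT);
  pinMode(donePin, OUTPUT);
}

void loop() {
  if (digitalRead(tagPresentPin) == HIGH) {
    if (!timing) {                  // just arrived at the location
      timing = true;
      arrivedAt = millis();
      digitalWrite(donePin, LOW);
    } else if (millis() - arrivedAt >= dwellTime) {
      digitalWrite(donePin, HIGH);  // enough time served; move on, if you wish
    }
  } else {
    timing = false;                 // left early: the clock resets
  }
}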

This device is attractive to us for many reasons. Initially we were attracted to the idea of creating a device that does not reinforce consumption – whether through actually consuming items or through activities that require users to consume, like attending a party (buying alcohol, nice clothes, etc.). We are also interested in creating a device that forces people to stop and connect with physical space, instead of one that helps navigate people from point A to point B while avoiding human contact at all points, because we believe space is an important infrastructure, part of the public domain, where culture is lived and transformed. A device that promotes the use of, and reflection on, such space is highly attractive to our group.

Ruminations on the final project

March 17, 2009 · inf2241_class · fi2241

Final project potential: use two Arduinos to communicate and cross-reference personal information about the wearers of a device.

Sort of inspired by the ‘annoying’ project, possible antithesis :)

What we have ultimately decided to follow through on is the idea of a physical emoticon – a more literal approach to wearing your heart on your sleeve, if you will. Our plan is to create a wearable device that will interpret the wearer’s voice patterns and translate the results into a physical representation of the person’s emotions, in the form of a large “emoticon” made up of LEDs. The emoticon will change depending on whether the user’s voice patterns indicate that they are happy, upset or neutral. As such, the frequency of social misunderstandings will be reduced; those communicating with the wearer will know exactly what that person is feeling, even when they are lying or being sarcastic. The device is also completely ubiquitous: because it is sewn into a chic, stylish sweater, the user has only to get dressed in the morning, turn the device on, and let it work its magic.

There are a couple of reasons why we chose to pursue a wearable emoticon display. The first is that online communication has become so prevalent (especially instant messaging services such as MSN, which rely heavily upon emoticons to communicate tone and emotion) that nearly everyone has a frame of reference for what a simple line drawing of a smiley face actually represents. We were also compelled by the idea that our reliance upon online communication might have had an effect (or perhaps will someday) upon our ability to interpret emotion, tone and intent through verbal communication and face-to-face interactions with other people. Just as tone can be incredibly difficult to discern through an email or instant messaging conversation, we wanted to explore similar difficulties associated with in-person conversation. Few people are strangers to the feeling of not being able to interpret exactly what someone is trying to say when their tone is ambiguous or contradicts their words.

From the more serious side of things, there are a few technical aspects of the project that influence not only how the device operates, but the overall impact it has upon the user from a theoretical perspective. As we played around with different ways to build the device over the last few weeks, we determined that it would only be possible to introduce a microphone to the circuit (using Arduino, of course) to read voice intensity (i.e. volume), as opposed to tone or pitch. This means that the terms by which the device interprets emotion are extremely narrow and not very objective: volume is hardly enough to establish something as complicated as emotion, even though our voices do tend to grow softer or louder in some cases, depending on what emotion we are expressing. Therefore, not only is the imposition of a specific emotion onto the user a very subjective outcome, but it also forces the user to adapt his or her manner of speaking in order to elicit a more desirable response, or at least try to evade an incorrect one. In addition, they might have to explain to others why the emoticon shown on the LED board is not actually representative of what they are feeling.
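To make that narrowness concrete, the mapping we have in mind looks roughly like the sketch below: measure the peak-to-peak amplitude of the microphone signal over a short window and bin it into three moods. The thresholds and pin numbers are guesses at this stage, and a real version would need calibration to the wearer’s voice (and a real LED matrix instead of three single LEDs).

// Rough sketch of the volume-to-emoticon mapping; all values are placeholders.

const int micPin = 0;        // amplified microphone on analog pin 0
const int happyLed = 9;      // each LED stands in for one emoticon pattern
const int neutralLed = 10;
const int upsetLed = 11;

void setup() {
  pinMode(happyLed, OUTPUT);
  pinMode(neutralLed, OUTPUT);
  pinMode(upsetLed, OUTPUT);
}

void loop() {
  // Measure peak-to-peak amplitude over a 50 ms window as a crude volume level.
  int minVal = 1023, maxVal = 0;
  unsigned long start = millis();
  while (millis() - start < 50) {
    int sample = analogRead(micPin);
    if (sample < minVal) minVal = sample;
    if (sample > maxVal) maxVal = sample;
  }
  int volume = maxVal - minVal;

  digitalWrite(happyLed, LOW);
  digitalWrite(neutralLed, LOW);
  digitalWrite(upsetLed, LOW);

  if (volume > 200) {
    digitalWrite(upsetLed, HIGH);    // loud: read as upset
  } else if (volume > 60) {
    digitalWrite(happyLed, HIGH);    // animated: read as happy
  } else {
    digitalWrite(neutralLed, HIGH);  // quiet: read as neutral
  }
}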

Annoying!

March 16, 2009 · inf2241_class · fi2241

First week [Ideas]

Finally, a weekly project that is off to a good start! This past session was indeed fruitful: we have a solid idea of what to build and have tested the two main components of our project successfully.

We were fortunate enough to have the WiiChuck tutorial left on our table. This was the starting point of our idea. As we tried to figure out what type of data we could gather from the various readings the accelerometers would give us in the Serial window in Arduino, we toyed with ideas involving graphical displays of bodily information. We discussed, as a starting point, an automated dance notation system inspired by the locative possibilities of the WiiChuck as well as already existing graphical dance notation displays [see link 1 link 2].

However, we wanted the project and final device to address the idea of space, namely the line between private and public space, in a more forward manner. We opted for a more… “annoying” solution to this matter: a device that emits different sounds depending on your body movements and the relative position of your limbs in space. How does that speak to public space exactly? We’re not sure yet, but we can tell you for sure that the first experiment with the sound component was completely successful at alerting and annoying the people around us… even ourselves.

First week [Technical]

The basic WiiChuck circuit was left untouched from the tutorial. What we need to understand is how the x, y and z components displayed in the serial window are affected, and by what type of movement. This will in turn enable us to direct and trigger different sounds with the various variables. Additionally, we have successfully completed a sound tutorial in which we produced a basic song [see video] using a small speaker.

WiiChuck connected to Arduino

Creating music using the WiiChuck

An experiment that took us a great deal of time was controlling the melody example (from the Arduino website) using the WiiChuck. A preliminary modification of the Arduino code was the introduction of an analog input that allows us to control different variables when playing the sound. Once this is mastered, we will analyse how to use the accelerometers in the WiiChuck to introduce more control to our wearable device.

Testing the Melody code on Arduino

——————————–

int speakerPin = 9;
int analogIn = 0;

int length = 15; // the number of notes
char notes[] = "ccggaagffeeddc "; // a space represents a rest
int beats[] = { 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 2, 4 };
int tempo = 100;

void playTone(int tone, int duration) {
  tempo = analogRead(analogIn);
  for (long i = 0; i < duration * 1000L; i += tone * 2) {
    digitalWrite(speakerPin, HIGH);
    delayMicroseconds(tone);
    digitalWrite(speakerPin, LOW);
    delayMicroseconds(tone);
  }
}

void playNote(char note, int duration) {
  char names[] = { 'c', 'd', 'e', 'f', 'g', 'a', 'b', 'C' };
  int tones[] = { 1915, 1700, 1519, 1432, 1275, 1136, 1014, 956 };

  // play the tone corresponding to the note name
  for (int i = 0; i < 8; i++) {
    if (names[i] == note) {
      playTone(tones[i], duration);
    }
  }
}

void setup() {
  pinMode(speakerPin, OUTPUT);
  pinMode(analogIn, INPUT);
}

void loop() {
  for (int i = 0; i < length; i++) {
    if (notes[i] == ' ') {
      delay(beats[i] * tempo); // rest
    } else {
      playNote(notes[i], beats[i] * tempo);
    }

    // pause between notes
    delay(tempo / 2);
  }
}

Second week [Ideas]

It’s alive! And pretty impressive to boot. We have managed to use the data from the WiiChuck to create a melody. Well, sort of a melody; it’s actually kind of an annoying beeping (Marie-Eve dubbed it the TechnoDJ 1980). Moving the WiiChuck vertically controls the pitch, while moving it horizontally controls the rhythm. Early testing revealed that testing this thing was extremely annoying, so we introduced a button to start the ‘melody’ (the button on the front of the WiiChuck). This turned out to be an interesting turning point: the entire purpose (morality) of the device changed with the additional functionality of an on/off button. With it, the user has much more control – it is possible for them to present a particular melody at a particular time, as opposed to one automatically generated as they move through a public space. In turn, the perceptions of the people around the user are changed…

We also managed to hook the Arduino up to a battery; once the code was loaded onto the board, the entire setup could be hidden away inside someone’s clothing. That way, the noises were less locatable, with nothing to immediately identify who was creating them. Again, this changes how the user and device interact with the space around them. We noticed this when wandering around New College: people would notice the odd sounds, but would frequently not be able to establish their source…

We had an interesting discussion about whether we should further disassemble the WiiChuck. We concluded that, while it would allow slightly more comfort for the user, it would also make the device less repurposable. In a similar vein, we decided that we would pass the code on to the wearer (or the purchaser, or whoever) to allow them some control over the tones emitted. As it stands, the user does not have much agency over the device; allowing them to adjust the code would pass some of it back to them. We could even hope that they would start to tinker with the code or circuitry, then pass it on…

Discussion of articles

Mann, S. (1996). Smart clothing: The shift to wearable computing. Communications of the ACM, 39(8), 23-24.

Knight, J. F., et al. (2007). Uses of accelerometer data collected from a wearable system. Personal and Ubiquitous Computing, 11(2), 133-143.

Kirschenbaum, M. G. (2004). “So the colors cover the wires”: Interface, aesthetics, and usability. In S. Schreibman, R. Siemens, & J. Unsworth (Eds.), A Companion to Digital Humanities. Oxford: Blackwell.

In his article “Smart clothing: The shift to wearable computing,” Steve Mann examines the fundamental issues inherent to this (then) new discipline. As one of the pioneers of the wearable computing field, Mann addresses the social and physical awkwardness of wearable devices, yet expresses a sense of self-empowerment as these tools became a part of him. Mann then briefly looks at ideas of privacy and public space, envisioning a possible global village of people wearing cameras instead of being watched by CCTVs. Mann further pursues this idea in terms of clothing vs. uniform, where the Orwellian risk of being forced to wear computing devices for surveillance purposes might become a reality. As wearable computing devices are normally situated in the core of the wearer’s personal space, there is a danger of this space being violated as well as a potential for it to be protected.

In their 2007 article, Knight et al. discuss current uses for accelerometer data. They outline a wide range of uses, from teaching physics to coaching sports to research on human movement. One particularly interesting use involves correlating accelerometer output with measures of physical activity, such as heart rate. With sufficient data, accelerometer readings could be mapped to energy output; the user could then use body-mounted devices to know, for example, the number of calories burned while performing a physical activity. The article concludes with some suggestions for using accelerometers to obtain effective readings (regarding noise and sampling frequency) and some suggestions for future uses. For our project, the article was useful for its broad range of potential applications; less useful for our purposes was its heavy emphasis on technical aspects, which skew more towards the scientific end of the spectrum.
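To give a flavour of the kind of mapping Knight et al. describe, a toy “activity count” might be accumulated as in the sketch below. Everything here is invented for illustration (one analog accelerometer axis, a mid-scale rest level, and a calibration constant pulled out of thin air); real systems fit such constants against measured energy expenditure.

// Toy accelerometer-to-energy mapping: accumulate deviations from rest as
// "activity counts" and convert them to a calorie estimate. The constants
// are fictional placeholders, not values from Knight et al.

const int accelPin = 0;     // one accelerometer axis on analog pin 0
const int restLevel = 512;  // assumed reading at rest (mid-scale)
long activityCounts = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(accelPin);
  int deviation = abs(reading - restLevel);
  if (deviation > 10) {     // ignore small sensor noise
    activityCounts += deviation;
  }
  // Fictional calibration: 100,000 counts ~ 1 kcal.
  Serial.print("kcal burned (rough): ");
  Serial.println(activityCounts / 100000.0);
  delay(100);               // ~10 Hz sampling
}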

Kirschenbaum (2004) discusses the role and importance of interfaces. In the past, interfaces were usually Graphical User Interfaces (GUIs) and were considered a minor element of the development process; it is common for interfaces to be left to the final stage of development and built without any interaction with the final users. Kirschenbaum offers an alternative framework based on the critical approach that the humanities can bring to the predominant approach to GUIs. This approach becomes especially significant when designing wearable devices, since the boundary between the content and the medium of a wearable artefact becomes unclear. As Kirschenbaum suggests, “form and content are almost instinctively understood as inextricable from one another.”

… dancing your own beep: notes on the evolution of wearable ideas

The initial ideas inspired by dance notation and the puzzling comfort of the WiiChuck have evolved into interesting intuitions about wearable devices and relations of power, ownership and surveillance – and now creativity, the “tactile” and the social.

Starting up with WiiChuck…

During the early stages of the design process, a challenging idea kept our focus: imagining how to capture the analog readings from the WiiChuck and correlate them with an automated construction of diagrams using an appropriate dance notation. The very thought was just fascinating!! Although not doable within the time-span for our wearable device (perhaps because “we are not Pixar,” as some of us thought), the challenge did not stop us, but led us further, to carefully explore the complexity of what we had just in front of us – the data we could obtain from the WiiChuck and its quite “magical” roughness. This exploration took us a while… not actually time to make it work, which was the easiest part, since we had re-used some freely available code for Arduino (and some nice steps further assisted by the instructor). Just ready-to-use, ready-to-be-re-purposed, although at first sight somewhat overwhelming: five arrays of numbers ranging from 0 to 255, and two Boolean values (0 or 1) while pressing the buttons. The first day just went on…


First metamorphosis… “talk about you as you walk”

It was during that week, during a discussion about the issues of “Privacy and Piracy,” that the automated dance notation suddenly mutated into a surveillance device. Indeed, we then wanted to build a heavily-loaded sensor (“we might have to take apart some of these WiiChucks,” we thought), using multiple pieces so that we could build this machine, one which could “talk about you” secretly, informing on you as you walk. There we were; in retrospect it only seems clear now. Then, another prototype, a huge one again, a device capable of “learning to dance,” but so full of rather secret purposes – “noble,” some may have shouted out loud. The chunks of Arduino code were now fitting smoothly, once we discovered that the println() statements were messing everything up. It was now evident how expensive (in machine processing power) the unnecessary console displays were. Once they were commented out, the WiiChuck seemed to awake as we had never seen: larger ranges of values started to show up…

Second metamorphosis… “dancing our own beep”

The week started quite “overloaded,” but it was not really clear what kind of semantic “overload” it was, nor which uncertain shifts were awaiting. Some spikes on our minds; resonating concepts, old words, known and old, but not quite. It was then present, that strange sense of proximity and distance which invites such novelty. These very old-new words were making sense together, yes!! Resonating as a melody sometimes does, but as the not-so-melodic do as well. There we were, carrying all of them with us: “movement and dance,” WiiChuck of course, but also “social,” “capital,” “collective.” They could not have been missed; the discussion on Facebook had just tuned them up in a consciousness’ flow. Not quite graceful the mood – the “critical mood,” not the shift, or rather the reaction about uses and not-quite-misuses of classifications when modeling subjectivity (ed. ummmmm). Pushing metadata and building identity. Pulling as someone might have done with a leaf, and with it a big chunk of the roots came apart. At first, so tacitly, “efficiency,” “effectiveness,” “customer-behaviors” and so many others that we could not read as they were too close. Among the noise, and the proximal-distant, quite some novelty in our minds went back home, “dancing our own beep.”
Below is the “hybrid” code we have come up with… load it, add to it, comment on it. Dare to explore multiple modalities of perception; “critical thinking” might become “critical making” for you as well… A. (m/16)

Second week [Technical]

————————————

/*
* Generic Wiichuck code
*
* Adapted from
* NunchuckServo
*
* 2007 Tod E. Kurt, http://todbot.com/blog/
*
* The Wii Nunchuck reading code is taken from Windmeadow Labs
*   http://www.windmeadow.com/node/42
*
*
* 2009 Thanks to Matt R. for fixing a bit more and making this accessible
*
* 2009 Thanks Anto for extracting the playTone() from Arduino and making it work here
*
*/

#include <Wire.h>

// wii vars

int joy_x_axis;
int joy_y_axis;
int accel_x_axis;
int accel_y_axis;
int accel_z_axis;

int z_button = 0;
int c_button = 0;

int loop_cnt = 0; // control var counting msec passed
int loop_cnt_tot = 200; //  msec to wait until data is read again
int controlToneVar;

// melody vars
int speakerPin = 13;
int analogIn = 5;
int tempo = 200;
int sensorVar = 0;
int myNote = 0;

void setup() {
pinMode(speakerPin, OUTPUT);
pinMode(analogIn, INPUT);
Serial.begin(9600); // initialize serial port

// wii vars
nunchuck_setpowerpins(); // use analog pins 2&3 as fake gnd & pwr
nunchuck_init(); // send the initialization handshake
}

void loop() {

// Wii procedures

// get var from the Wii
checkNunchuck();

// control melody

//myNote = map(sensorVar,0,1023,100,5000);

/*
Serial.println("");
Serial.print("My Tone: ");
*/

//Serial.println(controlToneVar);
//Serial.println(tempo);

delay(1);

//tempo=sensorVar;

}

// melody function: play a tone
void playTone(int tone, int duration) {
for (long i = 0; i < duration * 1000L; i += tone * 2) {
digitalWrite(speakerPin, HIGH);
delayMicroseconds(tone);
digitalWrite(speakerPin, LOW);
delayMicroseconds(tone);
}
}
/*
Sets the tone to play based on other data available
*/
void setTone(int measure, int start, int end) {

//myNote = map(measure,start, end, 100,3000);
//myNote = measure;

// convert the given range to the 0 - 1023 range
int sensorVarNew = 0;
sensorVarNew = map(measure,start, end, 0,1023);

//Serial.println(measure);
///*
// play only the scale
if ((sensorVarNew>=0) && (sensorVarNew<=126)){
myNote = 956;
controlToneVar = 1;
} else if ((sensorVarNew>=127) && (sensorVarNew<=253)) {
myNote = 1014;
controlToneVar = 2;
} else if ((sensorVarNew>=254) && (sensorVarNew<=381)) {
myNote = 1136;
controlToneVar = 3;
} else if ((sensorVarNew>=382) && (sensorVarNew<=507)) {
myNote = 1275;
controlToneVar = 4;
} else if ((sensorVarNew>=508) && (sensorVarNew<=634)) {
myNote = 1432;
controlToneVar = 5;
} else if ((sensorVarNew>=635) && (sensorVarNew<=761)) {
myNote = 1519;
controlToneVar = 6;
} else if ((sensorVarNew>=762) && (sensorVarNew<=888)) {
myNote = 1700;
controlToneVar = 7;
} else if ((sensorVarNew>=889) && (sensorVarNew<=1023)) {
myNote = 1915;
controlToneVar = 8;
}
//*/
}

// Wii functions

void checkNunchuck()
{
if( loop_cnt > loop_cnt_tot ) {  // loop() runs roughly every 1 msec; this fires every loop_cnt_tot msec

nunchuck_get_data();
nunchuck_print_data();

loop_cnt = 0;  // reset for the next interval
}
loop_cnt++;

}

//
// Nunchuck functions
//

static uint8_t nunchuck_buf[6];   // array to store nunchuck data,

// Uses port C (analog in) pins as power & ground for Nunchuck
static void nunchuck_setpowerpins()
{
#define pwrpin PC3
#define gndpin PC2
DDRC |= _BV(pwrpin) | _BV(gndpin);
PORTC &=~ _BV(gndpin);
PORTC |=  _BV(pwrpin);
delay(100);  // wait for things to stabilize
}

// initialize the I2C system, join the I2C bus,
// and tell the nunchuck we’re talking to it
void nunchuck_init()
{
Wire.begin();                    // join i2c bus as master
Wire.beginTransmission(0x52);    // transmit to device 0x52
Wire.send(0x40);        // sends memory address
Wire.send(0x00);        // sends a zero
Wire.endTransmission();    // stop transmitting
}

// Send a request for data to the nunchuck
// was “send_zero()”
void nunchuck_send_request()
{
Wire.beginTransmission(0x52);    // transmit to device 0x52
Wire.send(0x00);        // sends one byte
Wire.endTransmission();    // stop transmitting
}

// Receive data back from the nunchuck,
// returns 1 on successful read. returns 0 on failure
int nunchuck_get_data()
{
int cnt=0;
Wire.requestFrom (0x52, 6);    // request data from nunchuck
while (Wire.available ()) {
// receive byte as an integer
nunchuck_buf[cnt] = nunchuk_decode_byte(Wire.receive());
cnt++;
}
nunchuck_send_request();  // send request for next data payload
// If we received the 6 bytes, then go print them
if (cnt >= 5) {
return 1;   // success
}
return 0; //failure
}

// Print the input data we have received
// accel data is 10 bits long
// so we read 8 bits, then we have to add
// on the last 2 bits.  That is why I
// multiply them by 2 * 2
void nunchuck_print_data()
{
static int i=0;
joy_x_axis = nunchuck_buf[0];
joy_y_axis = nunchuck_buf[1];
accel_x_axis = nunchuck_buf[2]; // * 2 * 2;
accel_y_axis = nunchuck_buf[3]; // * 2 * 2;
accel_z_axis = nunchuck_buf[4]; // * 2 * 2;

z_button = 0;
c_button = 0;

// byte nunchuck_buf[5] contains bits for z and c buttons
// it also contains the least significant bits for the accelerometer data
// so we have to check each bit of byte outbuf[5]
if ((nunchuck_buf[5] >> 0) & 1)
z_button = 1;
if ((nunchuck_buf[5] >> 1) & 1)
c_button = 1;

if ((nunchuck_buf[5] >> 2) & 1)
accel_x_axis += 2;
if ((nunchuck_buf[5] >> 3) & 1)
accel_x_axis += 1;

if ((nunchuck_buf[5] >> 4) & 1)
accel_y_axis += 2;
if ((nunchuck_buf[5] >> 5) & 1)
accel_y_axis += 1;

if ((nunchuck_buf[5] >> 6) & 1)
accel_z_axis += 2;
if ((nunchuck_buf[5] >> 7) & 1)
accel_z_axis += 1;

//accel_y_axis *= 11;

/*
// Printing data
Serial.print(i,DEC);
Serial.print("\t");

Serial.print("joy:");
Serial.print(joy_x_axis,DEC);
Serial.print(",");
Serial.print(joy_y_axis, DEC);
Serial.print("  \t");

Serial.print("acc:");
Serial.print(accel_x_axis, DEC);
Serial.print(",");
Serial.print(accel_y_axis, DEC);
Serial.print(",");
Serial.print(accel_z_axis, DEC);
Serial.print("\t");

Serial.print("but:");
Serial.print(z_button, DEC);
Serial.print(",");
Serial.print(c_button, DEC);

Serial.print("\r\n");  // newline
*/
i++;

// Play tone when z button pressed
if (z_button==0)
{
setTone(accel_x_axis, 60, 160);
//setTone(accel_y_axis, 70, 190);
//setTone(accel_y_axis, 80, 100);
//setTone(accel_y_axis, 0, 255);

// set the tempo using Wii controller
//tempo = map(joy_y_axis,0,255,10,500);
//tempo = map(accel_x_axis,60,150,1,500);
tempo = map(accel_y_axis,70,180,1,500);

if (myNote>0)
playTone(myNote, tempo);
}

// fix negative tempos
if (tempo<0)
{
tempo = 1;
}
//Serial.println(tempo);

//int p1 = map(accel_x_axis);

// exporting to processing
/*
Serial.print(accel_x_axis, BYTE);
Serial.print(accel_y_axis, BYTE);
Serial.print(accel_z_axis, BYTE);
*/

//Serial.print(joy_x_axis, BYTE);
//Serial.print(joy_y_axis, BYTE);

//delay(50);

}

// Encode data to the format that most wiimote drivers expect
// only needed if you use one of the regular wiimote drivers
char nunchuk_decode_byte (char x)
{
x = (x ^ 0x17) + 0x17;
return x;
}

// returns zbutton state: 1=pressed, 0=notpressed
int nunchuck_zbutton()
{
return ((nunchuck_buf[5] >> 0) & 1) ? 0 : 1;  // voodoo
}

// returns zbutton state: 1=pressed, 0=notpressed
int nunchuck_cbutton()
{
return ((nunchuck_buf[5] >> 1) & 1) ? 0 : 1;  // voodoo
}

// returns value of x-axis joystick
int nunchuck_joyx()
{
return nunchuck_buf[0];
}

// returns value of y-axis joystick
int nunchuck_joyy()
{
return nunchuck_buf[1];
}

// returns value of x-axis accelerometer
int nunchuck_accelx()
{
return nunchuck_buf[2];   // FIXME: this leaves out 2-bits of the data
}

// returns value of y-axis accelerometer
int nunchuck_accely()
{
return nunchuck_buf[3];   // FIXME: this leaves out 2-bits of the data
}

// returns value of z-axis accelerometer
int nunchuck_accelz()
{
return nunchuck_buf[4];   // FIXME: this leaves out 2-bits of the data
}
————————————-

DRM in the physical world – Presentation, or DRM with RFID at its best

March 13, 2009 · inf2241_class · fi2241

On March 11, 2009 we presented our DRM-enhanced photocopier. Was it a success? Well, it was well received and we all had a very good time. Please check out all the other posts, because each project was very good and together they show how interesting this course is.

The last two blog posts describe our thoughts, document our work, and include the code we used, so try it out at home and post comments or questions.

We will add some pictures of the machine during the presentation later on. Below is a discussion of an article that takes a critical view of DRM and how it is used in the industry. Other articles were discussed in previous posts, but we chose this one as our main article.

Discussion of articles:

We chose Monika Roth’s article (Roth, 2008) as our main article. Roth’s analysis sheds light on the issues that caused the music industry to turn away from DRM. EMI was the first major music label to announce that all of its music would be distributed without DRM in the future; Apple’s iTunes picked this up, and DRM-free music is now used as a marketing instrument. However, Roth shows that reasons other than pure altruism are at work when music labels turn their backs on DRM. One of them may be growing frustration among users. Roth’s central point is that the wide usage of DRM could lead to a series of lawsuits against music companies and distributors alike. She uses Apple’s iTunes and iPod products to show how DRM does more than limit the usage of files within the realm of IP law: it also helps Apple bundle iTunes with the iPod. Roth argues that this could violate the Sherman Act, which regulates the bundling of products.

We think the value of the article comes from its critical look at DRM technology. For a period of time, DRM coupled with anti-circumvention laws seemed the ultimate tool to prevent infringement of intellectual property laws. Roth shows the legal and economic problems that DRM creates for its own creators.

References
Roth, M. (2008). Entering the DRM-free zone: An intellectual property and antitrust analysis of the online music industry. Fordham Intellectual Property, Media & Entertainment Law Journal, 18(2), 515-540.

Social Crowning… or the Physical Facebook

This week in class, we had an interesting discussion of how technology affects our notions of social capital, in addition to the way it affects notions of public space (as we discussed in previous classes).

During the lab, we decided to scrap the RFID idea we had last week, realizing that it was too complex for this project and might be a good idea for our final project instead. The team decided to brainstorm a whole new idea and build it in one lab session. WOW! We had the option to build a wearable device or a project that implemented Digital Rights Management (DRM). Our initial idea combined both, but we decided instead to build a wearable device.

crown, Arduino, breadboard, battery

Some silly team member suggested making a crown. It started with an idea to build a wearable that could indicate the wearer’s mood. Then the discussion turned to ideas of social capital and social networking technologies. We eventually decided to make the crown a wearable device that would indicate different elements of “social status”. Though at first we designed it with high school students in mind, the final device is an object meant to be worn by undergraduates or young adults aged 18-22 at social events (the types of events attended for the purpose of meeting new “sexy” people). The crown has four categories, Single, Straight, Sober and ?, each with an LED embedded above it. By turning on certain lights, the wearer announces their status to people at the event. The categories are humourously intended and are meant to critique the often limiting choices available for describing one’s social status on social networking sites such as Facebook (for example, the option of straight/NOT straight excludes other possible sexual preferences).

During class, Matt focused on the question of “how can we design objects that are meant to be re-purposed?” It is quite possible for wearers to re-purpose this technology to overcome these constraining and somewhat degrading categories, in a very grassroots way. I hope this doesn’t shatter the illusion for anyone, but those categories are made using black pen on yellow electrical tape. Anyone could peel them off and create their own categories. Symbols or logos could also be used, especially to indicate secret status or membership in a secret society. This technology could even be used to indicate serious allergies or health conditions.

Because of the powerful, transformational potential of this object, we’ve decided to make the code open source:

// CROWNTASTIC! by Mike, Nancy and Marta

int led_13 = 13;
int inPin2 = 2;
int val2 = 0;
int led_12 = 12;
int inPin3 = 3;
int val3 = 0;
int led_11 = 11;
int inPin4 = 4;
int val4 = 0;
int led_10 = 10;
int inPin5 = 5;
int val5 = 0;

void setup() {
pinMode(led_13, OUTPUT);  // declare LED as output
pinMode(inPin2, INPUT);    // declare pushbutton as input
pinMode(led_12, OUTPUT);  // declare LED as output
pinMode(inPin3, INPUT);    // declare pushbutton as input
pinMode(led_11, OUTPUT);  // declare LED as output
pinMode(inPin4, INPUT);    // declare pushbutton as input
pinMode(led_10, OUTPUT);  // declare LED as output
pinMode(inPin5, INPUT);    // declare pushbutton as input
}

void loop(){
val2 = digitalRead(inPin2);  // read input value
if (val2 == HIGH) {         // check if the input is HIGH (button released)
digitalWrite(led_13, LOW);  // turn LED OFF
} else {
digitalWrite(led_13, HIGH);  // turn LED ON
}

val3 = digitalRead(inPin3);  // read input value
if (val3 == HIGH) {         // check if the input is HIGH (button released)
digitalWrite(led_12, LOW);  // turn LED OFF
} else {
digitalWrite(led_12, HIGH);  // turn LED ON
}

val4 = digitalRead(inPin4);  // read input value
if (val4 == HIGH) {         // check if the input is HIGH (button released)
digitalWrite(led_11, LOW);  // turn LED OFF
} else {
digitalWrite(led_11, HIGH);  // turn LED ON
}

val5 = digitalRead(inPin5);  // read input value
if (val5 == HIGH) {         // check if the input is HIGH (button released)
digitalWrite(led_10, LOW);  // turn LED OFF
} else {
digitalWrite(led_10, HIGH);  // turn LED ON
}
}

DRM in the physical world – Control the usage

March 11, 2009 · inf2241_class · fi2241

On March 4, 2009 we started building our physical DRM system. The idea is to control the physical artefact and its usage.

We want to restrict the usage of books within the controlled environment of a library. Books in libraries today are often equipped with RFID tags, which are used to detect whether somebody tries to take a book out of the library without checking it out. We want to use these RFID tags to control the copying of books within the library. Of course this setting has a limitation: it is still possible to borrow the book and copy it somewhere else. But every copying machine could be equipped with such a technology.

But back to the idea. We want an RFID reader in the copying machine, which allows the machine to detect which book is about to be copied. If the user presses the copy button, our system looks up the book, identified through its RFID signature, in a database and decides whether the copying is permitted. The database could also carry information on how many pages may be copied.

The number of pages allowed for copying could be set by the librarian upon request. It could also be determined as a quota: a percentage of pages allowed to be copied, depending on the overall page count.

Our prototype uses an RFID reader on a breadboard. The RFID reader is powered and controlled through the Arduino board. We placed it within a flatbed scanner (under the glass) and used the scanner’s built-in button to start a copy process. We want to place a camera above the glass that takes a picture of the book, so the book has to be placed the other way around on the scanner. Once the button is pressed, the RFID information of the book is compared with a file on the controlling laptop (connected via serial console). Only if the identifier of the book is in the file and has a copy count greater than zero will the laptop take the picture.

The DRM enforcer 1.0 – The publisher’s dream

To be clear, we did not do this project to help the publishing industry enforce IP laws. Even with our device there will always be ways to circumvent such technology (take a digital camera and the book can be scanned). It is more a way to explore digital DRM in the physical world. Digital DRM can also be circumvented, but the law disallows this. DRM and anti-circumvention laws try to frustrate hackers and discourage the copying of digital content. In the end they frustrate the user who paid for the usage of the digital content: the user cannot use and possess his property in the same way he could possess physical property.

Code for anyone who wishes to take a look (1st part is python, 2nd is Wiring for Arduino):

#!/usr/bin/env python
import serial
import sqlite3
import re
import os

re_id = re.compile(r'[0-9A-Z]{10}')
re_snap = re.compile(r'snap!')
con = sqlite3.connect('ids.sql')
count = 0
s = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)
#s = open('ids.txt', 'r')  # handy for testing without the reader
while 1:
    item = s.readline()
    # a tag ID was read: look up its remaining copy count
    for id in re_id.findall(item):
        row = con.execute("SELECT count FROM ids WHERE id = '%s';" % (id,)).fetchone()
        try:
            count = int(row[0])
        except:
            count = 0
        print id, count
    # the scanner button was pressed: decrement the count and take the picture
    for snap in re_snap.findall(item):
        count = count - 1
        if count > 0:
            con.execute("UPDATE ids SET count = '%s' WHERE id = '%s';" % (count, id))
            con.commit()
            os.popen('uvccapture -x960 -y720 -o ~/%s-%s.jpg' % (id, count))
            os.popen4('eog ~/%s-%s.jpg' % (id, count))
        else:
            os.popen4('eog ~/fail.jpg')

Arduino Sketch:

// RFID reader ID-12 for Arduino
// Based on code by BARRAGAN
// and code from HC Gilje – http://hcgilje.wordpress.com/resources/rfid_id12_tagreader/
// Modified for Arudino by djmatic
// Modified for ID-12 and checksum by Martijn The – http://www.martijnthe.nl/
//
// Use the drawings from HC Gilje to wire up the ID-12.
// Remark: disconnect the rx serial wire to the ID-12 when uploading the sketch

int controlPin = 2;
int pushPin = 12;
int ledPin = 13;
int state = HIGH;
int reading;
int previous = LOW;
long time = 0;
long debounce = 200;

void setup() {
pinMode(controlPin, OUTPUT);
pinMode(pushPin, INPUT);
Serial.begin(9600); // connect to the serial port
}

void loop () {
digitalWrite(controlPin, HIGH);
byte i = 0;
byte val = 0;
byte code[6];
byte checksum = 0;
byte bytesread = 0;
byte tempbyte = 0;

if(Serial.available() > 0) {
if((val = Serial.read()) == 2) { // check for header
bytesread = 0;
while (bytesread < 12) { // read 10 digit code + 2 digit checksum
if(Serial.available() > 0) {
val = Serial.read();
if((val == 0x0D)||(val == 0x0A)||(val == 0x03)||(val == 0x02)) { // if header or stop bytes before the 10 digit reading
break; // stop reading
}

// Do ASCII/hex conversion:
if ((val >= '0') && (val <= '9')) {
val = val - '0';
} else if ((val >= 'A') && (val <= 'F')) {
val = 10 + val - 'A';
}

// Every two hex digits, add a byte to the code:
if (bytesread & 1 == 1) {
// make space for this hex digit by shifting the previous one 4 bits to the left:
code[bytesread >> 1] = (val | (tempbyte << 4));

if (bytesread >> 1 != 5) { // unless we're at the checksum byte,
checksum ^= code[bytesread >> 1]; // calculate the checksum... (XOR)
};
} else {
tempbyte = val; // store the first hex digit first...
};

bytesread++; // ready to read next digit
}
}

// Output to Serial:

if (bytesread == 12) { // if 12 digit read is complete
Serial.print("5-byte code: ");
for (i=0; i<5; i++) {
if (code[i] < 16) Serial.print("0"); // pad single hex digits
Serial.print(code[i], HEX);
Serial.print(" ");
}
Serial.println();
}
bytesread = 0;
}
}

// Debounced scanner button: toggle the LED and tell the laptop to take a picture
reading = digitalRead(pushPin);
if (reading == HIGH && previous == LOW && millis() - time > debounce) {
// ... invert the output
if (state == HIGH)
state = LOW;
else
state = HIGH;
Serial.print("snap!");

// ... and remember when the last button press was
time = millis();
}

digitalWrite(ledPin, state);

previous = reading;

}
