Wednesday, February 24, 2010

Half-finished table


Table with a screen - a fun idea, but not very ergonomic for Jonas.
Trying out the tea mug detector.



After our session yesterday, when we put the screen into the table (pictures coming), I was completely bewitched by the fact that you could use the table to display pictures. For some reason the pictures got a completely different presence when shown lying on the table.

We put photo sensors and heat sensors into the table, connected to an Arduino which in turn is connected to Processing.

After a short brainstorm we decided that the table should sense when someone puts a cup on it and then start showing a series of images. The series should then advance as the sensors get warmer. With one cup you get to see about half the series; with two cups you see all of it.

When the sensors cool down again, the "film" runs backwards.
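A minimal sketch of the intended logic, in plain Python rather than the Processing we will actually use, and with invented threshold and frame-count values: a cup darkening a photocell starts the slideshow, and the frame index simply follows the sensor temperature up and down.

```python
# Sketch of the planned table logic (Python stand-in for the Processing
# version; all thresholds and frame counts are made up for illustration).

N_FRAMES = 20  # total number of images in the series

def cup_present(photocell, dark_threshold=200):
    """A cup covering the photocell makes the reading drop."""
    return photocell < dark_threshold

def frame_for(temp, cold=20.0, hot=40.0, n_frames=N_FRAMES):
    """Map sensor temperature linearly onto a frame index.

    As the sensor warms, the "film" runs forward; as it cools,
    the very same mapping naturally runs it backwards."""
    t = max(cold, min(hot, temp))
    return int((t - cold) / (hot - cold) * (n_frames - 1))

# One cup warming one sensor reaches roughly half the series;
# two cups can push the temperature high enough to reach the end.
assert frame_for(20.0) == 0
assert frame_for(30.0) == (N_FRAMES - 1) // 2
assert frame_for(40.0) == N_FRAMES - 1
```

Because the frame index is a pure function of temperature, no extra "reverse playback" code is needed: cooling sensors walk the index back down by themselves.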

Working on getting fun things to show, and on a simple image switcher in Processing.

Monday, February 22, 2010

A hole in the table

Today I saw what my project mates had decided on and programmed during the week I was away.

A really good job, with some really nice Processing code that morphed a text in and out, and a fun interface based on heat and simple object detection with photocells.

Today we essentially just made an approximately A4-sized hole in a table bought by Johan, for a screen also brought by Johan. Really good of him to get them.

I'm really happy my project mates didn't use my idea of a medusa lamp, because it wouldn't have given good interaction in a simple way. I'll have to explore the medusa lamp interaction further, but in another project.

We had good help from elab with drills and so on for making the hole. Unfortunately they didn't have a jigsaw ("sticksåg"), which made it necessary to use a drill and hand saws in a creative way. We missed some holes for the on-button of the screen - embarrassing. Luckily we can go drill tomorrow as well.

Tomorrow we will maybe paint the table (but where?) and make sure the interaction works as expected.

We also have problems configuring the sensors to map well to the program (especially since the light conditions "always" change, depending on position and so on). I will investigate a way to dynamically adjust the baseline "strength" of the light with some simple integration method from control theory. The main problem seems to be detecting when a photocell is covered (and hence should not be integrated and compensated for) and when it is open to the light. We need to explore that with some different data series. Hopefully some processing and controlled experiments can suggest good limits. This is engineering!
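One shape the integration idea could take, sketched here in Python with invented constants: track the ambient level with a slow leaky integrator, but freeze the tracking while the cell looks covered, so that the cover itself is not integrated into the baseline.

```python
class AmbientTracker:
    """Leaky-integrator baseline for a photocell (illustrative constants).

    The baseline follows slow ambient-light drift, but adaptation is
    frozen while the reading sits far below the baseline, i.e. while
    the cell is presumably covered by an object."""

    def __init__(self, initial, gain=0.05, covered_drop=100):
        self.baseline = float(initial)
        self.gain = gain                  # integration speed (0..1)
        self.covered_drop = covered_drop  # drop that counts as "covered"

    def update(self, reading):
        covered = (self.baseline - reading) > self.covered_drop
        if not covered:
            # simple first-order integrator toward the current reading
            self.baseline += self.gain * (reading - self.baseline)
        return covered

tracker = AmbientTracker(initial=600)
# slow ambient drift is absorbed into the baseline...
for r in [605, 610, 615, 620]:
    assert tracker.update(r) is False
# ...but a sudden large drop is flagged as "covered" and not integrated
before = tracker.baseline
assert tracker.update(150) is True
assert tracker.baseline == before
```

The open question from the entry remains: `covered_drop` and `gain` are exactly the limits the data series and controlled experiments would have to pin down.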

Monday, February 8, 2010

two entries for the main blog

I made two entries for the main course blog, one about Photosynth and one about the interesting libavg.

Wednesday, February 3, 2010

Daily report, Feb 3







Walkthroughs of Arduino and a bit of electronics.

Compared to Phidgets, Arduino seems like a dream to program. But it has other problems. For example, we immediately ran into trouble when we wanted to transfer several values at once from the Arduino to the computer. Right away you have to start taking the data types into account, etc.
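One common workaround for sending several readings at once is to keep everything in ASCII: print the values comma-separated and newline-terminated on the Arduino side (roughly `Serial.print(a); Serial.print(","); Serial.println(b);`), and split the line on the computer. A sketch of the receiving end in Python; the function name is mine, not from any library:

```python
def parse_line(line):
    """Parse one 'a,b,c\n' line of ASCII sensor values into ints.

    Returns None for empty or garbled lines, which do happen right
    after opening the serial port."""
    line = line.strip()
    if not line:
        return None
    try:
        return [int(field) for field in line.split(",")]
    except ValueError:
        return None

assert parse_line("512,1023,7\n") == [512, 1023, 7]
assert parse_line("5,10,\n") is None   # truncated line is rejected
assert parse_line("") is None
```

The point of the text protocol is exactly that no binary data types need to agree between the two sides; everything is numbers in a string until the final `int()`.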

Maybe Firmata is something for us.


Flex sensors through a voltage divider (1 k resistor) into the Arduino. Only one of the resistors is used!

Flex sensor output in Processing (based on the potentiometer example from the Arduino website).


The Arduino hooked up. Only one of the resistors is used!
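For reference, the divider output the ADC sees follows Vout = Vin · R2 / (R1 + R2). Sketched below under assumptions of mine (flex sensor on the top leg, the fixed 1 k resistor on the bottom, Vin = 5 V, 10-bit ADC, and typical datasheet-style sensor resistances of about 10 k flat to 30 k bent - not measurements from our actual sensor or wiring):

```python
# Voltage-divider sketch: expected ADC reading vs. flex-sensor resistance.
# Assumed wiring: flex sensor on top, fixed 1 k resistor to ground,
# Vout measured across the fixed resistor. The real circuit may differ.

VIN = 5.0          # supply voltage, volts
R_FIXED = 1_000.0  # fixed divider resistor, ohms

def adc_reading(r_sensor):
    """10-bit ADC value for a given sensor resistance (ohms)."""
    v_out = VIN * R_FIXED / (r_sensor + R_FIXED)
    return round(v_out / VIN * 1023)

assert adc_reading(10_000) == 93   # sensor flat, ~10 k
assert adc_reading(30_000) == 33   # sensor bent, ~30 k
```

Note how small the usable range becomes with a 1 k fixed resistor against a 10-30 k sensor; a larger fixed resistor would spread the readings over more of the 0-1023 scale.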

An example (with bad sound) showing the whole thing in action.

Today's source code:

Arduino
/*
  Graph

  A simple example of communication from the Arduino board to the computer:
  the value of an analog input is sent out over the serial port. We call this
  "serial" communication because the connection appears to both the Arduino
  and the computer as a serial port, even though it may actually use a USB
  cable. Bytes are sent one after another (serially) from the Arduino to
  the computer.

  You can use the Arduino serial monitor to view the sent data, or it can
  be read by Processing, PD, Max/MSP, or any other program capable of reading
  data from a serial port. The Processing code below graphs the data received
  so you can see the value of the analog input changing over time.

  The circuit:
  the flex-sensor voltage divider is attached to analog in pin 3.

  created 2006
  by David A. Mellis
  modified 14 Apr 2009
  by Tom Igoe and Scott Fitzgerald
*/

void setup() {
  // initialize the serial communication:
  Serial.begin(9600);
}

void loop() {
  // send the value of analog input 3:
  Serial.println(analogRead(3));
  // wait a bit for the analog-to-digital converter
  // to stabilize after the last reading:
  delay(100);
}

Processing:
Instantiates a sine oscillator and draws the graph on the screen. On each serial event it updates the curve height and sets the oscillator frequency.

// Graphing sketch

// This program takes ASCII-encoded strings
// from the serial port at 9600 baud and graphs them. It expects values in the
// range 0 to 1023, followed by a newline, or newline and carriage return

// Created 20 Apr 2005
// Updated 18 Jan 2008
// by Tom Igoe

import processing.serial.*;

// SOUND:
import ddf.minim.*;
import ddf.minim.signals.*;

Serial myPort;  // the serial port
int xPos = 1;   // horizontal position of the graph

Minim minim;
AudioOutput out;
SineWave sine;

void setup() {
  // set the window size:
  size(400, 300);

  // List all the available serial ports
  println(Serial.list());
  // I know that the first port in the serial list on my mac
  // is always my Arduino, so I open Serial.list()[0].
  // Open whatever port is the one you're using.
  myPort = new Serial(this, Serial.list()[0], 9600);
  // don't generate a serialEvent() unless you get a newline character:
  myPort.bufferUntil('\n');
  // set initial background:
  background(0);

  // sound
  minim = new Minim(this);
  // get a line out from Minim; default bufferSize is 1024,
  // default sample rate is 44100, bit depth is 16
  out = minim.getLineOut(Minim.STEREO);
  // create a sine wave oscillator, set to 440 Hz, at 0.5 amplitude,
  // sample rate from line out
  sine = new SineWave(440, 0.5, out.sampleRate());
  // set the portamento speed on the oscillator to 200 milliseconds
  sine.portamento(200);
  // add the oscillator to the line out
  out.addSignal(sine);
}

void draw() {
  // everything happens in the serialEvent()
}

void serialEvent(Serial myPort) {
  // get the ASCII string:
  String inString = myPort.readStringUntil('\n');

  if (inString != null) {
    // trim off any whitespace and convert to a float:
    float inByte = float(trim(inString));

    // our flex sensor only spans roughly 9-34 of the 0-1023 range,
    // so map that interval directly to the screen height:
    stroke(127, 34, 255);
    line(xPos, height, xPos, map(inByte, 9, 34, 0, height));
    // map the same sensor range onto a frequency range for the oscillator:
    float freq = map(inByte, 9, 34, 1500, 60);
    sine.setFreq(freq);

    // at the edge of the screen, go back to the beginning:
    if (xPos >= width) {
      xPos = 0;
      background(0);
    }
    else {
      // increment the horizontal position:
      xPos++;
    }
  }
}

void stop() {
  out.close();
  minim.stop();
  super.stop();
}


MIDI from Processing to Ableton Live!


There was success! I can now send MIDI messages from Processing to Ableton Live.
I use the MidiBus for Processing, which has some basic examples that got me started, and the internal MIDI loopback device in Mac OS X (called busses, reachable from Applications/Utilities/Audio MIDI Setup). Just open the MIDI window via the Window menu and make sure the IAC driver is enabled, like in the picture.

Then it's just a matter of enabling that MIDI input as a Track (or Remote) input in Ableton.

The Basic.pde that comes with MidiBus shows the following devices:

Available MIDI Devices:
----------Input----------
[0] "Buss 1"
[1] "Port 1"
[2] "Real Time Sequencer"
----------Output----------
[0] "Buss 1"
[1] "Port 1"
[2] "Real Time Sequencer"
[3] "Java Sound Synthesizer"
If I initialize the midi object with Buss 1 as output, I can talk to Live via MIDI!

I found a handy MIDI reference (though I know MIDI quite well from before). This one is at the bit level. This one is also at the bit level, and is MIDI's own.

However, there are some problems involved in handling MIDI events. First and foremost: notes have to be turned off. This can be solved by using some kind of queue for all the events. A more serious problem is that every program handles MIDI in a different way - after all, MIDI has very rudimentary "data types": note on/off, control changes and patch changes.
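The note-off problem really can be handled with a queue: every note-on schedules a matching note-off at now + duration, and each tick releases whatever has expired. A minimal sketch in Python, as a stand-in for the Processing/MidiBus version; the `send` callback is a hypothetical placeholder for the real MIDI-out call:

```python
import heapq

class NoteOffQueue:
    """Schedule note-offs so every note-on is eventually turned off.

    `send(kind, channel, pitch, velocity)` is a hypothetical stand-in
    for the actual MIDI output (e.g. MidiBus note calls in Processing)."""

    def __init__(self, send):
        self.send = send
        self.pending = []  # min-heap of (off_time, channel, pitch)

    def note_on(self, now, channel, pitch, velocity, duration):
        self.send("on", channel, pitch, velocity)
        heapq.heappush(self.pending, (now + duration, channel, pitch))

    def tick(self, now):
        # release every note whose off-time has passed
        while self.pending and self.pending[0][0] <= now:
            _, channel, pitch = heapq.heappop(self.pending)
            self.send("off", channel, pitch, 0)

events = []
q = NoteOffQueue(lambda *e: events.append(e))
q.note_on(0.0, 1, 60, 100, duration=0.5)
q.note_on(0.1, 1, 64, 100, duration=0.2)
q.tick(0.4)   # only the short note has expired so far
q.tick(1.0)   # now the long note is released too
assert events == [("on", 1, 60, 100), ("on", 1, 64, 100),
                  ("off", 1, 64, 0), ("off", 1, 60, 0)]
```

Calling `tick()` once per draw() frame would be enough; the heap keeps the soonest-expiring note at the front regardless of the order the notes were started in.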

This can be overcome by making a special setup in Live that is always loaded when one wants to work with the prototype. That's acceptable.

Now it's just a matter of getting to work! :)

Interface of wireless router


It would be nice to have a better interface for a wireless router. In a seminar I heard that someone had looked into this: they studied how people used routers in their homes - some even hid them behind their radiators to get rid of them (with overheating as a result). I'm not satisfied with my own either. It's there, taking up space and an electrical outlet.

It would also be much better to be able to control the elementary functions with physical switches instead of a web interface. That way you don't need a network cable attached to be able to configure it. Maybe switches and buttons, and a little window displaying the wifi password.

That contradicts the possibility of hiding it away, though...



MIDI controller ideas from the Arduino seminar



http://itp.nyu.edu/physcomp/Labs/MIDIOutput

A very easy way to implement MIDI with Arduino! Maybe I could build a pattern trigger for Ableton Live. It would look like a little matrix of buttons, an analogue of the pattern trigger in Live.

This is mostly a copy of the Akai APC20, though, so it's probably cheaper to just buy one.


What else can one do with a similar setup? Maybe a more plausible pattern trigger? Something you could keep lying around on your desktop? Or put on your synthesizer or guitar? Something for loop-based performance?

Maybe something with servos? Or flexsensors?

Tuesday, February 2, 2010

The language hunt

To be able to express oneself well, one needs a good language. After reading Paul Graham's book "Hackers and Painters", I was quite overwhelmed by his ideas about what makes a language good. He talks about efficiency and endorses Lisp, a quite theoretical but also quite pragmatic language. Even though it's very efficient in many ways, it has some severe shortcomings; one of them is the state of its libraries and the lack of coherent support for things such as threads.

I've been trying out the Clojure implementation a bit, but it's actually been somewhat abstract so far. However, when I then program Java, I feel restricted in some sense.

An interesting speech by Rich Hickey, the creator of Clojure.

Ericsson, the famous telephone company, developed the language Erlang. Their main focus was reliability and "updateability", and that's really what you get. While most programs need to be restarted (or, in the case of Windows or Mac OS, the machine often rebooted), Erlang can change a program while it's running. No downtime whatsoever.

http://ftp.sunet.se/pub/lang/erlang/about.html


Feedback-pan-test




And the day after, I tried it with pans (a possibility I found by systematically exploring the design space of materials to build the lear-synthesizer with; everything from porcelain to tin foil was on that list).

A quite different result compared to my hand test. It was hard to shape the resonance in the way needed to express different tones. The pans actually worked better for dub-delay-style sounds, and were quite hard to get to resonate at different tones, even though I used a resonator. It was hard to "find" the tones.

It would be highly interesting to test it on a more appropriate form, a trumpet for example.

The setup is that my built-in microphone records the sound coming out of the loudspeakers, which is then filtered, delayed (because of latency) and sent out again.



The chain is Mic -> Saturator (distorting amplifier) -> Resonator (narrow-band musical EQ) -> Compressor with high ratio/limiter -> Speaker (mic)


Feedback-hand-test

I couldn't resist testing how the feedback could be controlled by hand. Starring: my hand, my computer and Ableton Live 7.

It took some time to be able to choose between tones; the possibility of controlling the sound is rather limited with only the hand as reflector.


Phidgetmania





Today's hacking wasn't particularly productive. At first we couldn't find the servo controller board, and searched desperately for solutions. Jonas joined the group, so now we were four people. Far too many to work productively - it almost seems as if efficiency is inversely proportional to the number of people in the group.

We wanted to detect presence - that a person was there - using a pressure sensor. Unfortunately it was impossible to get Processing to work with two different Phidgets boards connected while the webcam was running. The program produced very hard-to-diagnose stack traces and crashed now and then. We cursed the Java threads, and I quietly wonder whether there really isn't a better way to program interaction. Oh well, it was a sketch program, and it worked as one.

The presentation went fairly well, but many others had almost more imaginative and polished prototypes; the overall quality was actually very high.

Lessons learned
  • Don't have too big a group
  • Reserve the equipment you need in advance
  • Don't use too many sensors/technologies at once
  • Use Eclipse (maybe?) for better access to different libraries
  • Plan for the interaction and set aside time to configure it properly
The last point refers to the fact that we never actually got around to configuring the interaction.

I'm looking around for a programming language other than Java, since I find it too verbose. It takes many pages of code to describe what you really wanted to write in a few lines. Of course that affects the finished result. Maybe Python is better.


Monday, February 1, 2010

Brain wave sensors

Google: Brain wave sensor

gave some interesting results:

Brainwave Sensor Headsets Control Computers, Mobile Phones or Games using Thoughts & Emotions in Mind


"NeuroSky is a technology, reports MyTreo, that captures a person’s brainwave signals, eye movements, and other bio-signals via a patented Dry-Active sensor headset allowing users to control a computer or a mobile device with thoughts in the mind."

"During a keynote presentation at the conference, Ambient's Callahan demonstrated this technology by using the device to place the world's first voiceless cellphone call to Mike Hames, a TI senior vice president. The Audeo can send information to a mobile phone using Bluetooth. The company has also successfully controlled a wheelchair without the need of physical movement, again using brain signals. Ambient expects its first product to enable speech for individuals with ALS will be available before the end of 2008. More information can be found at http://www.theaudeo.com."

Now that's cool.

Let's see what we can do with it. It would be awesome to be able to connect to someone in a "non-intrusive way" and say hello to them. The sexual possibilities of this are not to be underestimated. The porn industry has been very innovative. Very.