scripting – john locke
http://gracefulspoon.com/blog
adventures in architecture

void in the center
http://gracefulspoon.com/blog/2012/02/04/void-in-the-center/
Sun, 05 Feb 2012 04:47:11 +0000

 

I’m pulling the last 100 tweets from within a half-mile radius of latitude 40.800808 x longitude -73.965154 (otherwise known as the desk in my bedroom where I’m typing this now). Right off the bat I can see that the tweeting frequency of some of my neighbors is impressive: out of 100 tweets there were only 42 different users, all of whose profile images are displayed above according to the frequency of their messaging. Voyeurism is built into New York’s DNA, the simultaneous repulsion and attraction of surveillance that was so effectively conveyed in Rear Window. Sometimes when riding the train, on the rare occasions when you’re sans earphones, you can’t help overhearing fragments and context-less snippets of random strangers’ conversations. Most of the time they’re pretty banal, on the order of sports predictions and office gossip, about nothing interesting but still interesting. And that’s what makes the hidden, invisible conversations going on in this five-block vicinity so fascinating to me in a way I can’t really describe. 100 random tweets hold no mysteries, but the 100 tweets of the people around me do. A secret knowledge that gives added meaning to the ruby aficionado I see walking down the street or the Mavs fan at the bar, all faces that are part of a huge story that can never end. I’ve started following ThatsOro.
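The frequency count described above (100 tweets, 42 unique users, avatars sized by how often each user appears) boils down to a tally over each tweet's sender. A minimal Python sketch of that tally, with a made-up tweet list standing in for the real search results:

```python
from collections import Counter

# Hypothetical stand-in for the 100 tweets returned by the geocoded search;
# each entry is just the screen name the tweet came from.
tweets = ["alice", "bob", "alice", "carol", "alice", "bob", "dave"]

counts = Counter(tweets)                # user -> number of tweets
unique_users = len(counts)              # how many distinct neighbors
# Scale each avatar by the user's share of the conversation (arbitrary sizing rule).
sizes = {user: 32 + 8 * n for user, n in counts.items()}

print(unique_users)          # 4 distinct users in this toy sample
print(counts.most_common(1)) # [('alice', 3)] -- the chattiest neighbor
```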

 

Click more for the code. Based on great examples here and here.

 

 

import twitter4j.conf.*;
import twitter4j.internal.async.*;
import twitter4j.internal.org.json.*;
import twitter4j.internal.logging.*;
import twitter4j.json.*;
import twitter4j.internal.util.*;
import twitter4j.management.*;
import twitter4j.auth.*;
import twitter4j.api.*;
import twitter4j.util.*;
import twitter4j.internal.http.*;
import twitter4j.*;
import twitter4j.internal.json.*;

ArrayList<String> username = new ArrayList<String>();
ArrayList<String> imgs = new ArrayList<String>();


double lat;
double lon;
double res;
String resUnit;

PImage twitimg;

void setup() {
  size(990, 480);
  background(255);
  smooth();
  noStroke();

  ConfigurationBuilder cb = new ConfigurationBuilder();
  cb.setOAuthConsumerKey("XXXXX");
  cb.setOAuthConsumerSecret("XXXXX");
  cb.setOAuthAccessToken("XXXXX");
  cb.setOAuthAccessTokenSecret("XXXXX");

  lat = 40.800808;
  lon = -73.965154;
  res = 1;
  resUnit="mi";

  try {

    Twitter twitter = new TwitterFactory(cb.build()).getInstance();

    Query query = new Query();
    GeoLocation nyc = new GeoLocation(lat, lon);
    query.setGeoCode(nyc, res, resUnit);
    query.setRpp(100); // results per page; the old Search API caps this at 100

    QueryResult result = twitter.search(query);


    ArrayList tweets = (ArrayList) result.getTweets();


    for (int i = 0; i < tweets.size(); i++) {
      Tweet t = (Tweet) tweets.get(i);
      String user = t.getFromUser();
      Long id = t.getFromUserId();
      String url = t.getProfileImageUrl();
      String msg = t.getText();
      Date d = t.getCreatedAt();

      username.add(user);

      imgs.add(url);
    };
  }

  catch (TwitterException te) {
    println("Couldn't connect: " + te);
  };
}


void draw() {

  int k = (frameCount % imgs.size()); 
  String pix = imgs.get(k);
  String users = username.get(k);

  PFont font;
  font = loadFont("FuturaLT-ExtraBoldOblique-18.vlw"); // note: loading the font every frame is wasteful; ideally load once in setup()
  fill(0);
  textFont(font);
  text(k + " " + users, 0, 35);

  fill(255, 200);

  rect(0, 0, 220, 45);

  twitimg = loadImage(pix, "png");
  image(twitimg, random(width), random(height));
  //text(users, twitimg.x, twitimg.y);

  fill(255, 1);
  rect(0, 0, width, height);

  fill(222, random(50, 150));
  textSize(random(10, 30));
}
parametric image sampling
http://gracefulspoon.com/blog/2009/10/15/parametric-image-sampling/
Fri, 16 Oct 2009 02:08:48 +0000

 

A simple test based on Sanghoon Yoon’s Grasshopper definition for using the new image sampler node. I swapped the source image out for an image of text, because, well, I just like fonts and 3D I guess. One of the cool things is that the image is “live,” so as you change the text, the Grasshopper definition updates. And of course you can also parametrically control the size of the pixels, the multiplication of the heightfield and the overall size of the surface. To get a random color on each polysurface, I modified Dale Fugier’s script from the RhinoScript wiki page to include a function that assigns the object color to the material color so it will render out in V-Ray. See Grasshopper definition and code below:
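The image-sampler idea boils down to reading a brightness value per grid cell and multiplying it into an extrusion height. A rough Python sketch of that mapping, using a hand-made grayscale grid in place of the sampled bitmap (the pixel size and multiplier are stand-ins for the sliders the definition exposes):

```python
# Toy grayscale "image": brightness values 0-255, standing in for the
# bitmap the Grasshopper image sampler would read.
image = [
    [0,   64,  128],
    [64,  128, 192],
    [128, 192, 255],
]

pixel_size = 10.0   # plan size of each box, like a pixel-size slider
multiplier = 0.5    # heightfield multiplication factor

# One (x, y, height) box per pixel: brightness drives extrusion height.
boxes = [
    (col * pixel_size, row * pixel_size, brightness * multiplier)
    for row, row_vals in enumerate(image)
    for col, brightness in enumerate(row_vals)
]

print(boxes[0])   # (0.0, 0.0, 0.0)    -- darkest pixel stays flat
print(boxes[-1])  # (20.0, 20.0, 127.5) -- brightest pixel extrudes highest
```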

 

Edit: Added link to download the Grasshopper definition and source image file. Click here (zip file).

 



 

From Dale Fugier’s ObjectColor.rvb:

 

Sub SetObjectColorRandom

	Dim objects, red, green, blue, i, material

	objects = Rhino.GetObjects("Select objects for random color change", 0, True, True)
	If IsNull(objects) Then Exit Sub

	Rhino.EnableRedraw False
	For i = 0 To UBound(objects)
		red = Int(125 * Rnd)
		green = Int(200 * Rnd)
		blue = Int(180)
		Call Rhino.ObjectColor(objects(i), RGB(red, green, blue))
		' Assign the same color to the object's material so it renders in V-Ray
		material = Rhino.AddMaterialToObject(objects(i))
		Call Rhino.MaterialColor(material, RGB(red, green, blue))
		Call Rhino.MaterialShine(material, 255)
	Next
	Rhino.EnableRedraw True

End Sub

 
interactive elevator installation
http://gracefulspoon.com/blog/2009/05/24/interactive-elevator-installation/
Sun, 24 May 2009 21:03:50 +0000
For our living architecture course, we created an interactive light installation in the elevator of Avery Hall, controllable by anyone with a cell phone and a twitter account. The simplified process involves texting an emotion to twitter from any cellular phone using the #livarch hashtag. That tweet is then picked up by a realtime search, fed through our twitterfeed rss, then added to our own twitter account. For a more detailed explanation, see this previous post on getting multiple twitter users onto one twitter feed. That emotion is then directed to our pachube feed and sent through processing to an arduino microcontroller that controls the color and pulsing of the individual leds. The installation attaches non-invasively to the surface of the elevator via magnets, allowing it to be placed on any metal surface, such as a building exterior, furniture, or a vehicle.

 

The lights within the elevator respond to the mood of the user. For instance, if a student texted “happy #livarch”, the space within the elevator would begin to slowly pulse with a greenish-blue hue. However, if another student sent “angry #livarch”, the first light would quickly flash a bright red. There are twelve lights in total, showing the collective mood of the twelve most recent users.
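The mood-to-light behavior described here (happy = slow greenish-blue pulse, angry = fast red flash) amounts to a lookup from a mood keyword to a color and pulse speed. A Python sketch with hypothetical values (the real ones live in the Processing sketch further down):

```python
# Hypothetical mood -> (r, g, b, pulse_ms) table matching the behaviors
# described above; values are illustrative, not the installation's.
moods = {
    "happy": (0, 200, 255, 1000),   # slow greenish-blue pulse
    "angry": (255, 0, 0, 20),       # rapid bright-red flash
    "calm":  (0, 130, 255, 1200),
    "sad":   (0, 0, 255, 400),
}

def light_for(text):
    # Pull the first recognized mood keyword out of a "#livarch" tweet.
    for word in text.lower().replace("#livarch", "").split():
        if word in moods:
            return moods[word]
    return None

print(light_for("happy #livarch"))  # (0, 200, 255, 1000)
```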

 

In this way, the elevator becomes a living representation of the collective mood of the building, but it is also hoped that a feedback loop can be created, a loop that actually influences the mood of those who ride the elevator. The emotion felt in the lobby will be altered by the time you reach the sixth floor, and that new emotion becomes what gets texted back to the elevator.

 

Lastly, future installations will be physically located away from the target user. For instance, Avery’s mood will be projected to the elevator in Uris Hall and vice versa. In this manner, we can create a new form of pen-pal between distant locations, and we hope that our mood, whether angry, sad, happy or nervous, will both manifest itself in a new form of architecture and have an effect on the greater world around us.

 

The project team also included Talya Jacobs and Guanghong Ou.
See more for video and code:

 


Super-Long Source Code:
The main thing to remember is to load the ‘StandardFirmata’ sketch onto the Arduino, and in Processing to use the EEML call d.getStringValue(0) so the value of stream id 0 comes through as a word, or String, rather than a numeric value.
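In the draw() loop below, the six LED statuses behave like a fixed-length queue: each new mood shifts the previous ones down one slot (Led6 = Led5, … Led1 = myValue) and the oldest falls off. A Python sketch of that shift, with made-up mood strings:

```python
from collections import deque

# Six slots, one per LED; newest mood at the front, oldest falls off the end.
leds = deque(maxlen=6)

for mood in ["happy", "angry", "calm", "sad", "nervous", "happy", "angry"]:
    leds.appendleft(mood)  # same effect as the Led6=Led5 ... Led1=myValue cascade

print(list(leds))  # ['angry', 'happy', 'nervous', 'sad', 'calm', 'angry']
```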

import processing.serial.*;
import cc.arduino.*;

import eeml.*;

Serial port; //Create object from Serial class

Arduino arduino;

String myValue;

String Led1Status;
String Led2Status;
String Led3Status;
String Led4Status;
String Led5Status;
String Led6Status;

String turnAngry = "angry";
String turnAngryLiv = "angry_#livarch";

String turnCalm = "calm";
String turnCalmLiv = "calm_#livarch";

String turnRelaxed = "relaxed";
String turnRelaxedLiv = "relaxed_#livarch";

String turnNervous = "nervous";
String turnNervousLiv = "nervous_#livarch";

String turnHappy = "happy";
String turnHappyLiv = "happy_#livarch";

String turnSad = "sad";
String turnSadLiv = "sad_#livarch";

DataIn dIn;

int value = 0;
int x = 0;
//int waitTime= 30; // 10ms delay

int Led1Red = 13;
int Led1Green = 2;
int Led1Blue = 3;

int Led2Red = 4;
int Led2Green = 5;
int Led2Blue = 6;

int Led3Red = 7;
int Led3Green = 8;
int Led3Blue = 9;

int Led4Red = 10;
int Led4Green = 11;
int Led4Blue = 12;

int Led5Red = 14;
int Led5Green = 15;
int Led5Blue = 16;

int Led6Red = 17;
int Led6Green = 18;
int Led6Blue = 19;

void setup()
{
frameRate(200);

println(Arduino.list());
arduino = new Arduino(this, Arduino.list()[0], 115200);

dIn = new DataIn(this, "http://www.pachube.com/api/1499.xml", "fe2ac5cde083af08a353b9862a8d65b4d62caf94a40bfa5e6ea90f82f244f0ac", 50000);
println(Serial.list());

arduino.pinMode(Led6Red, Arduino.OUTPUT);
arduino.pinMode(Led6Green, Arduino.OUTPUT);
arduino.pinMode(Led6Blue, Arduino.OUTPUT);
}

void draw(){
x += 1;
if (x>2){ //Set Time Here
x = 0;
x+= 1;
}

println(x);
if (x == 2) {
//if (Led1Status != myValue){
Led6Status = Led5Status;
Led5Status = Led4Status;
Led4Status = Led3Status;
Led3Status = Led2Status;
Led2Status = Led1Status;
Led1Status = myValue;
//}
}
println("Led1Status: " + Led1Status);
println("Led2Status: " + Led2Status);
println("Led3Status: " + Led3Status);
println("Led4Status: " + Led4Status);
println("Led5Status: " + Led5Status);
println("Led6Status: " + Led6Status);

//——————–IF/THEN-LED5————————————————————-

if((turnAngry.equals(Led5Status)) || (turnAngryLiv.equals(Led5Status))){
angry5();
}
else

if((turnCalm.equals(Led5Status)) || (turnCalmLiv.equals(Led5Status))){
calm5();
}
else

if((turnNervous.equals(Led5Status)) || (turnNervousLiv.equals(Led5Status))){
nervous5();
}
else

if((turnHappy.equals(Led5Status)) || (turnHappyLiv.equals(Led5Status))){
happy5();
}
else

if((turnSad.equals(Led5Status)) || (turnSadLiv.equals(Led5Status))){
sad5();
}

//——————–IF/THEN-LED4————————————————————-

if((turnAngry.equals(Led4Status)) || (turnAngryLiv.equals(Led4Status))){
angry4();
}
else

if((turnCalm.equals(Led4Status)) || (turnCalmLiv.equals(Led4Status))){
calm4();
}
else

if((turnNervous.equals(Led4Status)) || (turnNervousLiv.equals(Led4Status))){
nervous4();
}
else

if((turnHappy.equals(Led4Status)) || (turnHappyLiv.equals(Led4Status))){
happy4();
}
else

if((turnSad.equals(Led4Status)) || (turnSadLiv.equals(Led4Status))){
sad4();
}

//——————–IF/THEN-LED3————————————————————-

if((turnAngry.equals(Led3Status)) || (turnAngryLiv.equals(Led3Status))){
angry3();
}
else

if((turnCalm.equals(Led3Status)) || (turnCalmLiv.equals(Led3Status))){
calm3();
}
else

if((turnNervous.equals(Led3Status)) || (turnNervousLiv.equals(Led3Status))){
nervous3();
}
else

if((turnHappy.equals(Led3Status)) || (turnHappyLiv.equals(Led3Status))){
happy3();
}
else

if((turnSad.equals(Led3Status)) || (turnSadLiv.equals(Led3Status))){
sad3();
}

//——————–IF/THEN-LED2————————————————————-

if((turnAngry.equals(Led2Status)) || (turnAngryLiv.equals(Led2Status))){
angry2();
}
else

if((turnCalm.equals(Led2Status)) || (turnCalmLiv.equals(Led2Status))){
calm2();
}
else

if((turnNervous.equals(Led2Status)) || (turnNervousLiv.equals(Led2Status))){
nervous2();
}
else

if((turnHappy.equals(Led2Status)) || (turnHappyLiv.equals(Led2Status))){
happy2();
}
else

if((turnSad.equals(Led2Status)) || (turnSadLiv.equals(Led2Status))){
sad2();
}

//——————–IF/THEN-LED1————————————————————-

if((turnAngry.equals(Led1Status)) || (turnAngryLiv.equals(Led1Status))){
angry1();
}
else

if((turnCalm.equals(Led1Status)) || (turnCalmLiv.equals(Led1Status))){
calm1();
}
else

if((turnNervous.equals(Led1Status)) || (turnNervousLiv.equals(Led1Status))){
nervous1();
}
else

if((turnHappy.equals(Led1Status)) || (turnHappyLiv.equals(Led1Status))){
happy1();
}
else

if((turnSad.equals(Led1Status)) || (turnSadLiv.equals(Led1Status))){
sad1();
}

//——————–IF/THEN-LED6————————————————————-

if((turnAngry.equals(Led6Status)) || (turnAngryLiv.equals(Led6Status))){
angry6();
}
else

if((turnCalm.equals(Led6Status)) || (turnCalmLiv.equals(Led6Status))){
calm6();
}
else

if((turnNervous.equals(Led6Status)) || (turnNervousLiv.equals(Led6Status))){
nervous6();
}
else

if((turnHappy.equals(Led6Status)) || (turnHappyLiv.equals(Led6Status))){
happy6();
}
else

if((turnSad.equals(Led6Status)) || (turnSadLiv.equals(Led6Status))){
sad6();
}

}

//——————–CALM——–CALM————————————————————–

void calm1(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led1Blue, value);
arduino.analogWrite(Led1Green, 130);
arduino.analogWrite(Led1Red, 0);
delay(25);
}
for(value = 255; value >=0; value--)
{
arduino.analogWrite(Led1Blue, value);
arduino.analogWrite(Led1Green, 130);
arduino.analogWrite(Led1Red, 0);
delay(25);
}
}

void calm2(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led2Blue, value);
arduino.analogWrite(Led2Green, 130);
arduino.analogWrite(Led2Red, 0);
delay(25);
}
for(value = 255; value >=0; value--)
{
arduino.analogWrite(Led2Blue, value);
arduino.analogWrite(Led2Green, 130);
arduino.analogWrite(Led2Red, 0);
delay(25);
}
}

void calm3(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led3Blue, value);
arduino.analogWrite(Led3Green, 130);
arduino.analogWrite(Led3Red, 0);
delay(25);
}
for(value = 255; value >=0; value--)
{
arduino.analogWrite(Led3Blue, value);
arduino.analogWrite(Led3Green, 130);
arduino.analogWrite(Led3Red, 0);
delay(25);
}
}

void calm4(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led4Blue, value);
arduino.analogWrite(Led4Green, 130);
arduino.analogWrite(Led4Red, 0);
delay(25);
}
for(value = 255; value >=0; value--)
{
arduino.analogWrite(Led4Blue, value);
arduino.analogWrite(Led4Green, 130);
arduino.analogWrite(Led4Red, 0);
delay(25);
}
}

void calm5(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led5Blue, value);
arduino.analogWrite(Led5Green, 130);
arduino.analogWrite(Led5Red, 0);
delay(25);
}
for(value = 255; value >=0; value--)
{
arduino.analogWrite(Led5Blue, value);
arduino.analogWrite(Led5Green, 130);
arduino.analogWrite(Led5Red, 0);
delay(25);
}
}

void calm6(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led6Blue, value);
arduino.analogWrite(Led6Green, 130);
arduino.analogWrite(Led6Red, 0);
delay(25);
}
for(value = 255; value >=0; value--)
{
arduino.analogWrite(Led6Blue, value);
arduino.analogWrite(Led6Green, 130);
arduino.analogWrite(Led6Red, 0);
delay(25);
}
}

//——————–ANGRY——–ANGRY————————————————————–

void angry1(){
arduino.analogWrite(Led1Red,255);
//arduino.analgoWrite(Led1Green, 0);
arduino.analogWrite(Led1Blue, 0);
delay(10);
arduino.analogWrite(Led1Red,0);
//arduino.analgoWrite(Led1Green, 0);
arduino.analogWrite(Led1Blue, 0);
delay(10);
}

void angry2(){
arduino.analogWrite(Led2Red,255);
//arduino.analgoWrite(Led2Green, 0);
arduino.analogWrite(Led2Blue, 0);
delay(10);
arduino.analogWrite(Led2Red,0);
//arduino.analgoWrite(Led2Green, 0);
arduino.analogWrite(Led2Blue, 0);
delay(10);
}

void angry3(){
arduino.analogWrite(Led3Red,255);
//arduino.analgoWrite(Led3Green, 0);
arduino.analogWrite(Led3Blue, 0);
delay(10);
arduino.analogWrite(Led3Red,0);
//arduino.analgoWrite(Led3Green, 0);
arduino.analogWrite(Led3Blue, 0);
delay(10);
}

void angry4(){
arduino.analogWrite(Led4Red,255);
//arduino.analgoWrite(Led4Green, 0);
arduino.analogWrite(Led4Blue, 0);
delay(10);
arduino.analogWrite(Led4Red,0);
//arduino.analgoWrite(Led4Green, 0);
arduino.analogWrite(Led4Blue, 0);
delay(10);
}

void angry5(){
arduino.analogWrite(Led5Red,255);
//arduino.analgoWrite(Led5Green, 0);
arduino.analogWrite(Led5Blue, 0);
delay(10);
arduino.analogWrite(Led5Red,0);
//arduino.analgoWrite(Led5Green, 0);
arduino.analogWrite(Led5Blue, 0);
delay(10);
}

void angry6(){
arduino.analogWrite(Led6Red,255);
//arduino.analgoWrite(Led6Green, 0);
arduino.analogWrite(Led6Blue, 0);
delay(10);
arduino.analogWrite(Led6Red,0);
//arduino.analgoWrite(Led6Green, 0);
arduino.analogWrite(Led6Blue, 0);
delay(10);
}

//——————–Nervous——–Nervous————————————————————–

void nervous1(){
arduino.analogWrite(Led1Green,255);
arduino.analogWrite(Led1Red, 120);
arduino.analogWrite(Led1Blue, 0);
delay(300);
arduino.analogWrite(Led1Green,0);
arduino.analogWrite(Led1Red,120);
arduino.analogWrite(Led1Blue, 0);
delay(300);
}

void nervous2(){
arduino.analogWrite(Led2Green,255);
arduino.analogWrite(Led2Red, 120);
arduino.analogWrite(Led2Blue, 0);
delay(300);
arduino.analogWrite(Led2Green,0);
arduino.analogWrite(Led2Red,120);
arduino.analogWrite(Led2Blue, 0);
delay(300);
}

void nervous3(){
arduino.analogWrite(Led3Green,255);
arduino.analogWrite(Led3Red, 120);
arduino.analogWrite(Led3Blue, 0);
delay(300);
arduino.analogWrite(Led3Green,0);
arduino.analogWrite(Led3Red,120);
arduino.analogWrite(Led3Blue, 0);
delay(300);
}

void nervous4(){
arduino.analogWrite(Led4Green,255);
arduino.analogWrite(Led4Red, 120);
arduino.analogWrite(Led4Blue, 0);
delay(300);
arduino.analogWrite(Led4Green,0);
arduino.analogWrite(Led4Red,120);
arduino.analogWrite(Led4Blue, 0);
delay(300);
}

void nervous5(){
arduino.analogWrite(Led5Green,255);
arduino.analogWrite(Led5Red, 120);
arduino.analogWrite(Led5Blue, 0);
delay(300);
arduino.analogWrite(Led5Green,0);
arduino.analogWrite(Led5Red,120);
arduino.analogWrite(Led5Blue, 0);
delay(300);
}

void nervous6(){
arduino.analogWrite(Led6Green,255);
arduino.analogWrite(Led6Red, 120);
arduino.analogWrite(Led6Blue, 0);
delay(300);
arduino.analogWrite(Led6Green,0);
arduino.analogWrite(Led6Red,120);
arduino.analogWrite(Led6Blue, 0);
delay(300);
}

//——————–Happy——–Happy————————————————————–

void happy1(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led1Blue, value);
arduino.analogWrite(Led1Green, 255-value);
arduino.analogWrite(Led1Red, 0);
delay(25);
}
for(value = 255; value >=0; value -=15)
{
arduino.analogWrite(Led1Blue, value);
arduino.analogWrite(Led1Green, 255-value);
arduino.analogWrite(Led1Red, 0);
delay(25);
}
}

void happy2(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led2Blue, value);
arduino.analogWrite(Led2Green, 255-value);
arduino.analogWrite(Led2Red, 0);
delay(25);
}
for(value = 255; value >=0; value -=15)
{
arduino.analogWrite(Led2Blue, value);
arduino.analogWrite(Led2Green, 255-value);
arduino.analogWrite(Led2Red, 0);
delay(25);
}
}

void happy3(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led3Blue, value);
arduino.analogWrite(Led3Green, 255-value);
arduino.analogWrite(Led3Red, 0);
delay(25);
}
for(value = 255; value >=0; value -=15)
{
arduino.analogWrite(Led3Blue, value);
arduino.analogWrite(Led3Green, 255-value);
arduino.analogWrite(Led3Red, 0);
delay(25);
}
}

void happy4(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led4Blue, value);
arduino.analogWrite(Led4Green, 255-value);
arduino.analogWrite(Led4Red, 0);
delay(25);
}
for(value = 255; value >=0; value -=15)
{
arduino.analogWrite(Led4Blue, value);
arduino.analogWrite(Led4Green, 255-value);
arduino.analogWrite(Led4Red, 0);
delay(25);
}
}

void happy5(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led5Blue, value);
arduino.analogWrite(Led5Green, 255-value);
arduino.analogWrite(Led5Red, 0);
delay(25);
}
for(value = 255; value >=0; value -=15)
{
arduino.analogWrite(Led5Blue, value);
arduino.analogWrite(Led5Green, 255-value);
arduino.analogWrite(Led5Red, 0);
delay(25);
}
}

void happy6(){

for(value = 0; value<=255; value++){
arduino.analogWrite(Led6Blue, value);
arduino.analogWrite(Led6Green, 255-value);
arduino.analogWrite(Led6Red, 0);
delay(25);
}
for(value = 255; value >=0; value -=15)
{
arduino.analogWrite(Led6Blue, value);
arduino.analogWrite(Led6Green, 255-value);
arduino.analogWrite(Led6Red, 0);
delay(25);
}
}

//——————–Sad——–Sad————————————————————–

void sad1(){

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led1Blue, value);
arduino.analogWrite(Led1Green, 0);
arduino.analogWrite(Led1Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led1Blue, 255);
arduino.analogWrite(Led1Green, value);
arduino.analogWrite(Led1Red, 0);
delay(10);
}

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led1Blue, 255-value);
arduino.analogWrite(Led1Green, 255);
arduino.analogWrite(Led1Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led1Blue, 120);
arduino.analogWrite(Led1Green, 255-value);
arduino.analogWrite(Led1Red, 0);
delay(10);
}
}

void sad2(){

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led2Blue, value);
arduino.analogWrite(Led2Green, 0);
arduino.analogWrite(Led2Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led2Blue, 255);
arduino.analogWrite(Led2Green, value);
arduino.analogWrite(Led2Red, 0);
delay(10);
}

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led2Blue, 255-value);
arduino.analogWrite(Led2Green, 255);
arduino.analogWrite(Led2Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led2Blue, 120);
arduino.analogWrite(Led2Green, 255-value);
arduino.analogWrite(Led2Red, 0);
delay(10);
}
}

void sad3(){

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led3Blue, value);
arduino.analogWrite(Led3Green, 0);
arduino.analogWrite(Led3Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led3Blue, 255);
arduino.analogWrite(Led3Green, value);
arduino.analogWrite(Led3Red, 0);
delay(10);
}

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led3Blue, 255-value);
arduino.analogWrite(Led3Green, 255);
arduino.analogWrite(Led3Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led3Blue, 120 );
arduino.analogWrite(Led3Green, 255-value);
arduino.analogWrite(Led3Red, 0);
delay(10);
}
}

void sad4(){

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led4Blue, value);
arduino.analogWrite(Led4Green, 0);
arduino.analogWrite(Led4Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led4Blue, 255);
arduino.analogWrite(Led4Green, value);
arduino.analogWrite(Led4Red, 0);
delay(10);
}

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led4Blue, 255-value);
arduino.analogWrite(Led4Green, 255);
arduino.analogWrite(Led4Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led4Blue, 120);
arduino.analogWrite(Led4Green, 255-value);
arduino.analogWrite(Led4Red, 0);
delay(10);
}
}

void sad5(){

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led5Blue, value);
arduino.analogWrite(Led5Green, 0);
arduino.analogWrite(Led5Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led5Blue, 255);
arduino.analogWrite(Led5Green, value);
arduino.analogWrite(Led5Red, 0);
delay(10);
}

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led5Blue, 255-value);
arduino.analogWrite(Led5Green, 255);
arduino.analogWrite(Led5Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led5Blue, 120 );
arduino.analogWrite(Led5Green, 255-value);
arduino.analogWrite(Led5Red, 0);
delay(10);
}
}

void sad6(){

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led6Blue, value);
arduino.analogWrite(Led6Green, 0);
arduino.analogWrite(Led6Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led6Blue, 255);
arduino.analogWrite(Led6Green, value);
arduino.analogWrite(Led6Red, 0);
delay(10);
}

for(value = 0; value<=255; value+=5){
arduino.analogWrite(Led6Blue, 255-value);
arduino.analogWrite(Led6Green, 255);
arduino.analogWrite(Led6Red, 0);
delay(10);
}
for(value = 255; value >=0; value -=5)
{
arduino.analogWrite(Led6Blue, 120 );
arduino.analogWrite(Led6Green, 255-value);
arduino.analogWrite(Led6Red, 0);
delay(10);
}
}

void onReceiveEEML(DataIn d){

//set the value of remoteValue to equal the value of stream id 0 of the feed notated by d
String remoteValue = d.getStringValue(0);

myValue = (remoteValue);

}

the remix
http://gracefulspoon.com/blog/2009/05/03/the-remix/
Sun, 03 May 2009 05:00:51 +0000
Mark Collins & Toru Hasegawa, the masterminds behind Proxyarch and instructors of the course Search: Advanced Algorithmic Design at Columbia, ‘remixed’ the audio waveform code into something much smoother and more elegant. They’re awesome, and there were a lot of super interesting projects from the course, which can all be viewed in the video here.

visualizing sound in processing
http://gracefulspoon.com/blog/2009/04/02/visualizing-sound-in-processing/
Fri, 03 Apr 2009 03:33:51 +0000
This was the final applet in motion. Using the minim library for Processing, each waveform is generated in realtime as the two sounds play over each other, creating a pretty chaotic sound, but there are some instances of overlapping patterns where the mashup works pretty well. In the third version of the code, the boolean of the two waveforms is generated, producing a new way to visualize them. View the YouTube video here, but I really need to figure out a way to add sound to the video; silence doesn’t do it justice. Charlie Parker, Iggy Pop and Richard Wagner comparison + code:
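The “boolean” of the two waveforms is built band by band: the code below compares the two spectra and keeps the dominant one (plotBoolean() works from the max of the two FFT averages). A small Python sketch of that per-band combination, with two made-up spectra:

```python
# Two made-up FFT average spectra, one per song (e.g. iggy vs. wagner);
# each entry is the energy in one frequency band.
iggy   = [0.2, 0.9, 0.1, 0.5]
wagner = [0.4, 0.3, 0.6, 0.5]

# Per-band "boolean": keep whichever song dominates each band.
combined = [max(a, b) for a, b in zip(iggy, wagner)]

print(combined)  # [0.4, 0.9, 0.6, 0.5]
```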

 


import processing.dxf.*;
import ddf.minim.analysis.*;
import ddf.minim.*;
FFT fftLog1;
FFT fftLog2;

Waveform myRects;

Minim minim;
AudioPlayer groove1;
AudioPlayer groove2;

boolean record;

PFont font;
PFont fontoutline;

void setup(){
size(1200,600,P3D);
noStroke();
minim = new Minim(this);
groove1 = minim.loadFile("groove_iggy.mp3");
groove2 = minim.loadFile("groove_wagner.mp3");

groove1.loop();//repeat each song
groove2.loop();

font = loadFont("HelveticaNeueLT-Bold-18.vlw");
fontoutline = loadFont("HelveticaNeueLT-Bold-18.vlw");

fftLog1 = new FFT(groove1.bufferSize(),groove1.sampleRate()); //create the FFT logarithmic scale
fftLog2 = new FFT(groove2.bufferSize(),groove2.sampleRate());
fftLog1.logAverages(22,4); //adjust numbers to adjust spacing
fftLog2.logAverages(22,4);

float w1 = float ((fftLog1.avgSize()+fftLog2.avgSize())/2);
float x = w1;
float y = 0;
float z = 50;
myRects = new Waveform(x,y,z);

}

void draw(){
background(15);
directionalLight(126,126,126,sin(radians(frameCount)),cos(radians(frameCount)),1);
ambientLight(152,152,152);

for(int i = 0; i < fftLog1.avgSize(); i++){
int w = int(width/fftLog1.avgSize());
float zoom = 1;
float jitter = (max(fftLog1.getAvg(i)*200, fftLog2.getAvg(i)*200)); //jitter in camera influenced by waveform
PVector foc = new PVector((myRects.x*.5+jitter*.5), myRects.y+jitter, 0);
PVector cam = new PVector(zoom, zoom, -zoom);
if (frameCount < 260){
camera(foc.x+cam.x, foc.y+(cam.y-1500*(cos(radians(-frameCount+60)))), foc.z+cam.z-400, foc.x, foc.y, foc.z-100, 0, 0, 1);
//println(-1500*(cos(radians(-frameCount+60))));
}
else {
camera(foc.x+cam.x, foc.y+(cam.y+1418.278), foc.z+cam.z-400, foc.x, foc.y, foc.z-100, 0, 0, 1);
}
}

fftLog1.forward(groove1.mix); //play each song
fftLog2.forward(groove2.mix);

myRects.update1(); //update each waveform+boolean
myRects.update2();
myRects.update3();
myRects.textdraw1(); //draw z height for song waveforms
myRects.textdraw2();

if(record){
beginRaw(DXF, "output.dxf");
}
// DXF will export the stuff drawn between here.

myRects.plotBoolean(); //create surfaces
myRects.plotTrace1();
myRects.plotTrace2();

if(record){
endRaw();
record = false;
println("Done DXF~!");
}
}

void stop() {
groove1.close(); // always close Minim audio classes when you finish with them
groove2.close();
minim.stop(); // always stop Minim before exiting
super.stop();
}

class Waveform{
float x,y,z;

PVector[] pts1 = new PVector[fftLog1.avgSize()];
PVector[] pts2 = new PVector[fftLog2.avgSize()];
PVector[] pts3 = new PVector[fftLog1.avgSize()]; //needed for boolean waveform

PVector[] trace1 = new PVector[0];
PVector[] trace2 = new PVector[0];
PVector[] trace3 = new PVector[0]; //needed for boolean waveform

Waveform(float incomingX, float incomingY, float incomingZ){
x = incomingX;
y = incomingY;
z = incomingZ;
}

void update1(){ //plot boolean waveform
plotB();
}

void plotB(){
for(int i = 0; i < fftLog1.avgSize(); i++){
int w = int(width/fftLog1.avgSize());
x = i*w-1050; //adjust the x position of the waveform here
y = frameCount*5;
z = height/4-fftLog1.getAvg(i)*10;
stroke(0);
point(x, y, z);
pts1[i] = new PVector(x, y, z);
//increase size of array trace by length+1
trace1 = (PVector[]) expand(trace1, trace1.length+1);
//always get the next to last
trace1[trace1.length-1] = new PVector(pts1[i].x, pts1[i].y, pts1[i].z);
}
}

void plotBoolean(){
stroke(255,80);
int inc = (fftLog1.avgSize()+fftLog2.avgSize())/2;
for(int i=1; i<(trace1.length+trace2.length)/2-inc; i++){
if(i%inc != 0){
beginShape(TRIANGLE_STRIP);
float value = (trace1[i].z*100);
float m = map(value, -500, 20000, 0, 255);
fill(m*2, 125, -m*2, 140);
int threshold = 15;
if (trace1[i].z220){
textFont(fontoutline, 24);
fill(155);
text(“wagner”,200,500,0);
text(“iggy”,900,500,0);
text(“max(iggy-wagner)”,500,500,0);
}
}
}
void plotTrace1(){
stroke(255,80);
int inc = fftLog1.avgSize();

for(int i=1; i

iggy wave
http://gracefulspoon.com/blog/2009/03/06/iggy-wave/
Sat, 07 Mar 2009 03:11:38 +0000
“Now I Wanna Be Your Dog” as a 3D landscape. I was using the minim library in Processing to visualize the sound level data stream, then exporting out to Rhino. Many thanks to the proxyarch team for help with the code.
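The landscape comes from stacking one spectrum per frame: each frame contributes a row of points whose x is the frequency band, y is the frame number, and z is that band’s level (this is what plot() does below with fftLog.getAvg(i)). A Python sketch of the accumulation, with made-up levels:

```python
# Each frame of audio yields one spectrum: a list of per-band levels.
# Hypothetical values standing in for fftLog.getAvg(i).
frames = [
    [0.1, 0.8, 0.3],  # frame 0
    [0.2, 0.6, 0.4],  # frame 1
]

band_width = 10   # spacing between bands along x
row_step = 5      # spacing between frames along y (frameCount*5 in the sketch)

# Accumulate a 3D point cloud: the "trace" the Processing code exports to DXF.
trace = [
    (band * band_width, frame * row_step, level)
    for frame, spectrum in enumerate(frames)
    for band, level in enumerate(spectrum)
]

print(len(trace))  # 6 points: 2 frames x 3 bands
print(trace[4])    # (10, 5, 0.6) -- band 1 of frame 1
```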

 

EDIT:
Added link to processing app, see it in action (loud rock music will begin playing…so turn it up!)

 

http://gracefulspoon.com/processingapps/singlewave/index.html

 



import processing.dxf.*;
import ddf.minim.analysis.*;
import ddf.minim.*;
FFT fftLin;
FFT fftLog;

Waveform myRects;

Minim minim;
AudioPlayer groove;

boolean record;

void setup(){
size(1000, 500, P3D);
noStroke();
minim = new Minim(this);
groove = minim.loadFile("groove.mp3");
groove.loop();
background(255);

fftLog = new FFT(groove.bufferSize(), groove.sampleRate());
fftLog.logAverages(22, 4); //adjust spacing here

float w = float(width/fftLog.avgSize());
float x = w;
float y = 0;
float z = 50;
float radius = 10;
myRects = new Waveform(x,y,z,radius);
}

void draw(){
background(0);
directionalLight(126,126,126,sin(radians(frameCount)),cos(radians(frameCount)),1);
ambientLight(102,102,102);

float zoom = 1000;
PVector foc = new PVector(myRects.x*0.5, myRects.y*0.5, 0);
PVector cam = new PVector(zoom*sin(radians(frameCount)), zoom*cos(radians(frameCount)), -zoom);
camera(foc.x+cam.x,foc.y+cam.y,foc.z+cam.z,foc.x,foc.y,foc.z,0,0,1);

//play the song
fftLog.forward(groove.mix);

myRects.update();

if(record){
beginRaw(DXF, "output.dxf");
}
// DXF will export the stuff drawn between here.

myRects.plotTrace();

if(record){
endRaw();
record = false;
println("Done DXF~!");
}
}

//press 'r' for a one-frame DXF export; without this the record flag is never set
void keyPressed(){
if(key == 'r'){
record = true;
}
}

void stop() {
// always close Minim audio classes when you finish with them
groove.close();
// always stop Minim before exiting
minim.stop();
super.stop();
}

class Waveform{
float x,y,z;
float radius;

PVector[] pts = new PVector[fftLog.avgSize()];

PVector[] trace = new PVector[0];

Waveform(float incomingX, float incomingY, float incomingZ, float incomingRadius){
x = incomingX;
y = incomingY;
z = incomingZ;
radius = incomingRadius;
}

void update(){
plot();
}

void plot(){
for(int i = 0; i < fftLog.avgSize(); i++){ int w = int(width/fftLog.avgSize()); x = i*w; y = frameCount*5; z = height/4-fftLog.getAvg(i)*10; stroke(0); point(x, y, z); pts[i] = new PVector(x, y, z); trace = (PVector[]) expand(trace, trace.length+1); trace[trace.length-1] = new PVector(pts[i].x, pts[i].y, pts[i].z); } } void plotTrace(){ /* //drawing points for(int i=0; i ]]> http://gracefulspoon.com/blog/2009/03/06/iggy-wave/feed/ 44 airport studio http://gracefulspoon.com/blog/2009/03/06/airport-studio/ http://gracefulspoon.com/blog/2009/03/06/airport-studio/#comments Sat, 07 Mar 2009 00:16:10 +0000 http://gracefulspoon.com/blog/?p=310 airport

 

Quick Project Description: Airports typically attempt to be all things to all people, resulting in general inefficiency and awkward relationships between program spaces. By seeking new opportunities via trade-offs, for instance a tourist class passenger waiting longer but flying for free, or a business class passenger paying a higher ticket price in exchange for shorter waits in a more luxurious setting, a new circulation map and airport space is created that addresses these disparate groups' needs. Optimal relationships between airlines, airport, and users are handled through parametric models and genetic algorithms.

 

What is the metric for a good design? Or rather, now that parametric modelling allows us to easily create thousands of variations of a given design, how do we choose the “correct” one?
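One common answer is to collapse the design position into a single scalar the optimizer can rank, such as a weighted sum of normalized objectives. A hypothetical Java sketch (the weights and objective names are invented for illustration, not taken from the studio's modeFrontier setup):

```java
public class DesignScore {
    // score = weighted sum of normalized objectives; higher is better
    static double score(double efficiency, double daylight, double cost,
                        double wEff, double wDay, double wCost) {
        // cost is a "lower is better" objective, so it enters negatively
        return wEff * efficiency + wDay * daylight - wCost * cost;
    }

    public static void main(String[] args) {
        // two candidate variants, objectives already normalized to 0..1
        double a = score(0.9, 0.4, 0.3, 1.0, 1.0, 1.0);  // 1.0
        double b = score(0.6, 0.8, 0.5, 1.0, 1.0, 1.0);  // 0.9
        System.out.println(a > b ? "pick A" : "pick B");  // pick A
    }
}
```

The catch, as the next paragraph gets at, is that the weights themselves encode a design position; change them and a different variant wins.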

 

First, we created a parametric model in catia whose inputs are optimized through the engineering program modeFrontier, with additional structural finite element analysis coming from autodesk's newly acquired robot. The challenge became how to convert your design position, parti, whatever, into a quantifiable metric that the software can optimize for. For instance, to optimize for material efficiency, you could let the software search for a shape with maximum volume and minimal surface area. After 3000 designs you'd have a sphere, but things get very complex fast when you begin optimizing for competing objectives. See our complete studio blog here. Project description…
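The sphere claim is just the isoperimetric inequality at work: for a fixed surface area, no shape encloses more volume than a sphere. A small Java check comparing a sphere and a cube of equal skin (illustrative numbers only):

```java
public class Isoperimetric {
    // volume of the sphere whose surface area equals S
    static double sphereVolume(double S) {
        double r = Math.sqrt(S / (4 * Math.PI));
        return 4.0 / 3.0 * Math.PI * r * r * r;
    }

    // volume of the cube whose surface area equals S
    static double cubeVolume(double S) {
        double side = Math.sqrt(S / 6.0);
        return side * side * side;
    }

    public static void main(String[] args) {
        double S = 6.0;  // surface area of a unit cube
        // for the same skin, the sphere encloses about 38% more volume
        System.out.printf("sphere %.3f vs cube %.3f%n",
                sphereVolume(S), cubeVolume(S));
    }
}
```

So a single-objective optimizer chasing volume-per-surface has only one place to go; the interesting behavior starts when objectives pull against each other.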
I was drawn to the metrics of passenger economy and profit. Airports typically attempt to be all things to all people, resulting in general inefficiency and awkward relationships between program spaces and passengers, especially business and tourist class. By seeking new opportunities via tradeoffs, for instance a tourist class passenger waiting longer but flying for free, or a business class passenger’s ticket price rises while creating multiple, separate dedicated entry points that allow shorter waits, a new optimized circulation map presents itself.

 

Each hanging element is a program + structural column connected by a circulation tube. Within the circulation tube, tourist class passengers have the opportunity to fly for free by passing through each commercial program space. One objective is to maximize the length of the tube, thereby allowing more passengers to fly for free and maximizing the airport's ancillary profits. Another objective is to create an unobstructed space for business class passengers, requiring few of the program spaces to touch the ground but rather hang, allowing business class passengers to pass freely below. The more columns touch the ground, the more structurally stable the ceiling space frame becomes, allowing more housing towers above. The program mediates between these competing objectives, finding high-performing, unexpected solutions, and it becomes the role of the user to rank and choose designs based on desired criteria. Most housing = most columns = fewer business class travellers, etc…
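The high-performing solutions a multi-objective optimizer surfaces are the non-dominated ones: designs no other design beats on every objective at once. A minimal Java sketch of that Pareto filter, with made-up numbers for two of the objectives above (tube length to maximize, ground columns to minimize):

```java
import java.util.ArrayList;
import java.util.List;

public class ParetoFilter {
    // a design is {tubeLength (maximize), groundColumns (minimize)};
    // a dominates b if it is at least as good on both objectives
    // and strictly better on at least one
    static boolean dominates(double[] a, double[] b) {
        boolean geq = a[0] >= b[0] && a[1] <= b[1];
        boolean gt = a[0] > b[0] || a[1] < b[1];
        return geq && gt;
    }

    static List<double[]> paretoFront(List<double[]> all) {
        List<double[]> front = new ArrayList<>();
        for (double[] d : all) {
            boolean dominated = false;
            for (double[] other : all) {
                if (dominates(other, d)) { dominated = true; break; }
            }
            if (!dominated) front.add(d);
        }
        return front;
    }

    public static void main(String[] args) {
        List<double[]> designs = List.of(
                new double[]{400, 12},  // long tube, many columns
                new double[]{250, 4},   // short tube, few columns
                new double[]{240, 9});  // beaten by {250, 4} on both counts
        // the front keeps the two designs that trade off against each other
        System.out.println(paretoFront(designs).size());  // 2
    }
}
```

Everything on the front is a legitimate answer; picking among them is exactly the user's ranking job described above.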

 

airport2
locke_matrix_final

 


airport01
airport02
airport03
airport04
Freedom of movement for business class passengers on the left, forced path through retail space for tourist class passengers on the right. Trade-offs of flying for free.
airport05
airport06
airport07
airport08
airport08b
airport08c
The addition of structure as an objective creates more complex results
airport09
airport10
Overlay of 2500 designs to detect trends
airport11
airport12
Unexpected, high-performing results. Columns shift to one side with housing towers above, while cantilevered roof tapers to minimize movement.
airport13
airport14
airport15

http://gracefulspoon.com/blog/2009/03/06/airport-studio/feed/ 5
processing growth http://gracefulspoon.com/blog/2009/02/28/processing-growth/ http://gracefulspoon.com/blog/2009/02/28/processing-growth/#respond Sat, 28 Feb 2009 19:48:31 +0000 http://gracefulspoon.com/blog/?p=146 growingpoints

 

Looking at simple growth with classes and arrays. Next up, programming more behavior into the system.
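The pattern behind this kind of Processing sketch is a class that owns a point array grown by one slot per step, as in the waveform sketches above. The same idea in plain Java (a hypothetical Grower class, with a manual array copy standing in for Processing's expand()):

```java
import java.util.Arrays;

public class Grower {
    // growing array of 2D points, stored as {x, y} pairs
    float[][] pts = new float[0][];

    // add one point by expanding the array by one slot,
    // like Processing's expand(array, array.length + 1)
    void grow(float x, float y) {
        pts = Arrays.copyOf(pts, pts.length + 1);
        pts[pts.length - 1] = new float[]{x, y};
    }

    public static void main(String[] args) {
        Grower g = new Grower();
        // three growth steps drifting up and to the right
        for (int i = 0; i < 3; i++) {
            g.grow(i * 10f, i * 5f);
        }
        System.out.println(g.pts.length);  // 3
    }
}
```

Adding behavior then means making grow() depend on the existing points, e.g. branching from the nearest neighbor instead of always appending to the end.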

http://gracefulspoon.com/blog/2009/02/28/processing-growth/feed/ 0
Processing to Rhino http://gracefulspoon.com/blog/2009/02/28/processing-to-rhino/ http://gracefulspoon.com/blog/2009/02/28/processing-to-rhino/#respond Sat, 28 Feb 2009 19:12:27 +0000 http://gracefulspoon.com/blog/?p=139 upload011
upload021

 

I was playing around with different processing sketches and export procedures to get cool images out of rhino. The dxf exporter works great for lines and solids; once in rhino, run a simple grasshopper command to pipe all the curves.

http://gracefulspoon.com/blog/2009/02/28/processing-to-rhino/feed/ 0