µViz

µViz is a graphics tablet paired with software that creates beautiful visualizations of music.

I created µViz with Jamie Haberman and Ethan Deseautel over a weekend in November 2017 in San Francisco. I was lucky enough to be one of the 15 students selected to attend Red Bull Hack the Hits, where accomplished acts from the SF music scene mentored us as we worked through the nights to create the future of music technology. It ended up being an amazing, fun-filled weekend where I learned a lot of new skills (from laser-cutting to visualization in processing.js), and made great new friends! I would definitely recommend applying to Hack the Hits if technology and music are two of your passions.

We created µViz as a way to visualize music. It is a graphics tablet that makes music as you draw on it; every drawing makes a unique song, and vice versa. It is also connected to a contact microphone that makes objects vibrate at the same frequency as the music.

Below, I explain how µViz was implemented. If you would like to try it out for yourself (you don't necessarily need a graphics tablet; a mouse can be used instead), you can download the two GitHub scripts here and run them simultaneously. You will need processing.js for the visualization, and the numpy, mouse, and pyaudio Python packages for the audio, all of which can be installed via pip.

The Visualization

We used processing.js, a JavaScript framework, to create the visualization. We programmed a rotating rectangle whose color changes as it is dragged along the x-axis and whose size changes along the y-axis. The two color palettes were inspired by the viridis palettes from ggplot.

There are also some fun details; for example, the rectangle's full path stays on the screen but continuously fades to gray over time, shifting the focus to its current position and recent history.

Here’s the annotated implementation:

// set initial values
float angle = 0;

// define color palettes

color[] colarray = {#FCF5BB, #FCCD94, #FB996E, #F06260, #CF4470,
                    #AF397B, #7E2A81, #481B76, #3D186F, #0C0925};
color[] colarray2 = {#430753, #482975, #414986, #35648B, #2D7D8D,
                     #28968B, #32AE80, #5AC46B, #97D64C, #DCE137};

boolean changecolor = false;
int col;

void setup() { // runs once at startup
  size(1200, 800);     // window size
  background(0, 0, 0); // black background
  colorMode(HSB);      // interpret colors as hue/saturation/brightness
}

void draw() { // runs once per frame

  // draw a translucent dark rectangle over the whole window;
  // this is what makes older strokes gradually fade to gray
  fill(0, 20, 20, 2);
  rect(0, 0, width, height);

  // prime the fill color (the palette functions below override this)
  for (int j = 0; j < 10; j++) {
    col = colarray[j];
    fill(col);
  }

  // move the origin to the pen/mouse position
  translate(mouseX, mouseY);

  // continuously rotate the rectangle
  angle += 0.1;
  rotate(angle);

  // default color before the palette functions are applied
  col = colarray[0];
  fill(col);

  // map some key presses; these also work with Wacom tablet buttons
  if (keyPressed) {
    if (keyCode == 16) { // shift: switch to the magma palette
      changecolor = true;
    }
    if (keyCode == 17) { // ctrl: switch to the viridis palette
      changecolor = false;
    }
    if (keyCode == 18) { // alt: clear the screen
      background(0, 0, 0);
    }
  }

  // pick the fill color from the active palette (defined below)
  if (changecolor) {
    magma();
  } else {
    viridis();
  }

  stroke(0);

  // draw nested rounded rectangles; the pen's y position sets their overall size
  for (int i = 0; i < mouseY; i++) {
    rect(-i/16, -i/16, i/8, i/8, i % 10);
  }
  noFill();
  noStroke();
  rect(frameCount * frameCount % width, 0, 40, height);
}

// map palette 1 (magma) to the x-axis: each tenth of the window width gets its own color
void magma() {
  for (int i = 0; i < 10; i++) {
    if (mouseX >= width*i/10 && mouseX < width*(i+1)/10) {
      col = colarray[i];
      fill(col);
    }
  }
}

// map palette 2 (viridis) to the x-axis in the same way
void viridis() {
  for (int j = 0; j < 10; j++) {
    if (mouseX >= width*j/10 && mouseX <= width*(j+1)/10) {
      col = colarray2[j];
      fill(col);
    }
  }
}

A screenshot of what this looks like:

Or using another color palette:

Audio

The audio was mapped to the position of the tablet's pen (or, equivalently, the mouse), with frequency varying along the x-axis and amplitude along the y-axis.

We can define a dictionary whose values are note frequencies. Interestingly, the frequencies of musical notes follow a base-2 exponential: each octave doubles the frequency, so each of the 12 semitones within an octave multiplies it by 2^(1/12).

notes = {1:130.8, 2:146.8, 3: 164.8, 4: 185, 5:196, 6:220, 7:246.9,
         8:261.6, 9:293.6, 10: 329.6, 11: 370, 12:392, 13:440, 14:523.2}
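
As a quick illustration of that exponential relationship, here is a short sketch of mine (not part of the original scripts; the MIDI note numbers are just for illustration):

# equal temperament: every semitone multiplies frequency by 2**(1/12),
# so 12 semitones (an octave) exactly double it
def note_frequency(midi_note, a4=440.0):
    return a4 * 2 ** ((midi_note - 69) / 12)

print(note_frequency(48))  # ~130.8 Hz, C3 -- matches notes[1] above
print(note_frequency(60))  # ~261.6 Hz, middle C -- matches notes[8] above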

We can then define a function that plays a note at a given frequency and volume (amplitude) using the pyaudio library:

import pyaudio
import numpy as np

def play_sin(frequency, volume=0.2):
    p = pyaudio.PyAudio()
    volume = volume              # range [0.0, 1.0]
    fs = 44100                   # sampling rate, Hz, must be an integer
    duration = 300 / frequency   # seconds, i.e. 300 cycles of the sine wave
    f = frequency                # sine frequency, Hz, may be a float

    # generate samples; note the conversion to a float32 array
    samples = (np.sin(2*np.pi*np.arange(fs*duration)*f/fs)).astype(np.float32)

    # for paFloat32, sample values must be in range [-1.0, 1.0]
    stream = p.open(format=pyaudio.paFloat32,
                    channels=1,
                    rate=fs,
                    output=True)

    # play; pyaudio expects raw bytes, hence the tobytes() conversion
    stream.write((volume * samples).tobytes())
    stream.stop_stream()
    stream.close()

    p.terminate()
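
If run interactively, a couple of quick calls confirm it works (using the notes dictionary above):

play_sin(notes[13])             # a short A4 (440 Hz) blip
play_sin(notes[8], volume=0.5)  # middle C, louder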

Then we define a window size and break it up into a virtual grid (to which frequencies and amplitudes will later be mapped):

nx, ny = (14, 10)             # 14 notes along x, 10 volume steps along y
x = np.linspace(0, 1200, nx)  # grid lines across the window width
y = np.linspace(0, 800, ny)   # grid lines down the window height
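
To see how this grid gets used below: np.searchsorted finds where a pen position falls among the grid lines, and that index picks the note. For example (the pen position here is a hypothetical value):

# a pen at x = 650 px lands between the 7th and 8th grid lines,
# so searchsorted returns 8, which maps to notes[8] = 261.6 Hz (middle C)
print(np.searchsorted(x, [650], side='right')[0])  # prints 8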

Finally, we can run a while loop to continuously play the music:

import mouse

idx = -1

while True:
    # play some initial note before the first pen position is read
    if idx == -1:
        play_sin(notes[1])
        idx = 1

    else:
        # map the pen's x position to a note and its y position to a volume
        pos = mouse.get_position()
        idx = np.searchsorted(x, [pos[0]], side='right')[0]
        idxy = np.searchsorted(y, [pos[1]], side='right')[0]
        play_sin(frequency=notes[idx], volume=0.1*idxy)
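
Note that play_sin blocks while each note plays, and since duration = 300/frequency each note lasts exactly 300 cycles of the sine wave (about 0.7 s for a 440 Hz A); the loop then re-reads the pen position before playing the next note.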

Beyond the software, we took a purple acrylic sheet and laser-cut it into a cover for a box with contact microphones inside and a tablet mount on top (pictured at the top). Each part of the project was unique, fun, and highly educational. We also got to demo it at the end of the weekend at the TechShop San Francisco gallery, in front of some of the hottest acts in the SF electronic music scene! It was also a pretty unique experience to spend the weekend in TechShop SF, which filed for bankruptcy the morning after we left, making us some of the last people ever to use it.

Here’s the team demoing this on stage!

And some more photos from the amazing weekend!

