Data + Publics Wk 2: Data Drawing

For the assignment to create a data visualization (using non-digital tools) on a postcard that shows a ‘snapshot’ of a particular slice of a public data set, I chose to use the NYC Department of Health and Mental Hygiene’s dataset on NYC Restaurant Inspection Results.

I was thinking of sending it to my cousin, who got food poisoning when he visited NYC a couple months ago.

[Scans: the data drawing and its legend]

The legend shows: (1) each violation type has its own pattern, (2) each restaurant type has its own color, and (3) critical/not critical status changes the direction of the box’s drop shadow.

Below is the final postcard:

[Photo: the final postcard set]

Data + Publics Wk 1: Reading Reflections

1.) Turning the Data Around

This article’s general sentiment to “consider the human” resonates strongly with me. The idea that data visualization should benefit the community or the people from whom the data is taken seems only right. Reading this article also makes me reflect on all the past data visualization projects I have done. I can’t help but wonder if our ‘New Yorkers who Clean Up and New Yorkers who Complain’ piece helped sanitation workers in any way – did it give back to them? Did any of my previous ‘Data + Art’ projects do that? I would like to think that they shed light on the fact that garbage collection is a dangerous job, but did they? Did they benefit the workers? How would I improve upon my previous data projects using this mindset of ‘turning the data around’? How would the workers feel if they saw the visualization? Would they feel like their issues were being heard? Would they feel like they had been part of a study they never agreed to join? Or worse, would they feel uncomfortable seeing it? I cringe thinking about it, but it’s very possible that it left them feeling uncomfortable and unheard. It would have been better to get to know the sanitation workers – to get to know the people first – not just make a data visualization. It’s good to be self-critical, and it’s helpful to read this article while keeping in mind the community the data comes from. I’d also like to learn more ways to empower the people whom data is collected from, beyond putting it in a public setting.

2.) Chapter Two: On Rational, Scientific, Objective Viewpoints from Mythical, Imaginary, Impossible Standpoints

A great phrase the article repeats is “data visceralization, not visualization.” Designing for visceralizations requires a much more holistic embodiment from the person. As the article mentions multiple times, it’s important to keep in mind that objective, rational “truths” are impossible. The argument for keeping data “objective” is faulty in that everyone’s “truth” is different. We live in a subjective, non-binary world full of many unique truths. Another great point is that novel representations of data are much more memorable than a typical bar graph. Even “chart junk” – illustrated data visualizations – is more distinctive and therefore more effective. It was helpful to read this and remember that when making my own data visualizations it is important to a.) show, not tell, b.) think of unique forms to represent the data, c.) convey the emotions of the dataset, and d.) design for “data visceralizations,” which require a “holistic conception of the viewer” and treat viewers as more than just a pair of eyes.

 

Data + Publics Wk 1: Open Data NYC Zine

Lydia and I worked together to make a zine about Open Data NYC! We covered: a.) the history of Open Data NYC and what it is, b.) what 311 calls are, c.) how to submit a 311 request, d.) how to access 311 data, e.) how to use that data, f.) what biases are involved, and g.) what is missing from the data. We also included an activity guide for others on the back. This was fun to make – we hope to continue with this and make more data how-to guides!

[Photos of the finished zine]

Setting Up Domain Name, Server & SSL: Notes

My NYU-hosted website doesn’t work anymore, so I’m trying to get my previous Live Web + Data Art projects that use an SSL certificate up and running again. In the hopes that I don’t forget how to do this, I’m documenting my process of hosting my own domain name, setting up my own server, and registering for SSL certificates. Should be simple, but let’s see!

Step 1: Registered for a new domain name on DreamHost. My new domain name: dadododoes.tech

Step 2: Created a new droplet on DigitalOcean.

Step 3: Added an SSH key. This video was quite helpful for understanding how to add an SSH key to my DigitalOcean droplet. With all this set up, I am now able to SSH in from the terminal:

ssh root@ipaddress

I am also able to log in with the Fetch file-transfer client:

Hostname: ipaddress
Username: root

Step 4: Pointed my domain registrar’s nameservers to DigitalOcean. This was a helpful video and article.

Step 5: Updated my previous code with the new SSL certificate and RSA private key that DreamHost provided.

Step 6: Ran into a lot of issues with not being able to run “node server.js” in my terminal once my files were uploaded via Fetch. Realized I had to install Node on the droplet first, using these commands (the Live Web class page has a helpful tutorial for this):

apt-get update
apt-get install nodejs
apt-get install nodejs-legacy
apt-get install npm

Step 7: Also had to reinstall forever, the process manager that keeps the Node server running.
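The exact commands aren’t recorded above, but reinstalling forever typically looks something like this (assuming npm is already installed):

```shell
# install the forever process manager globally
npm install -g forever

# start the app under forever so it keeps running after logout
forever start server.js

# list the processes forever is managing
forever list
```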

Step 8: On DigitalOcean, the ‘www’ subdomain also needs a record pointing to the droplet’s IP address.
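In the DigitalOcean DNS panel this amounts to two A records, one for the apex domain and one for ‘www’ (the IP below is a placeholder):

```
A    @      203.0.113.10    ; apex domain -> droplet IP
A    www    203.0.113.10    ; 'www' subdomain -> same droplet IP
```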

To copy the SSH public key’s contents (on macOS):

pbcopy < ~/.ssh/id_rsa.pub

 

Live Web Wk 11-13: Final

Final: https://el3015.itp.io:3128/play.html

Final code: here

About:

Sound of Color is a web and mobile app that visualizes the relationship between color and sound frequency.

Process:

Final project proposal

Process

Steps for this week: a.) fix the line animation on the “draw” page/finally get that to work properly, b.) style the page: resize the pixelate canvas to be full width, c.) animate the playhead, d.) fix the mouse move sizing issue, e.) get the iPad camera to face out, not the user, f.) have Billy convert the sound frequencies I have into MIDI notes, and g.) fix the playback speed, which gets faster every time you play and pause in ‘playback’ mode.

a.) fix the line animation on the “draw” page/finally get that to work properly

It now animates! Many thanks to Shawn for the logic advice! The drawing needed to go in a main draw loop, and the selected sine lines needed to be pushed into an array.

But other issues emerged… for example, it draws the full history of the lines selected. Meh! This took way too long to figure out, but perhaps this is the solution: https://stackoverflow.com/questions/23092624/socket-io-removing-specific-listener

Tried a bunch of things with “socket.off” and “socket.removeEventListener” but it’s still not working. Need to ask Shawn/Mimi.


Great – asked, and it turns out it was a simple solve! canvas.removeEventListener was the right code to use. Turns out I was just removing the line that draws the sine wave, when I needed to remove the whole handler function. Below is the winning script (this seriously took way too long to get)!

redSelect.addEventListener('click', () => {
    evl = drawRedSine;

    let removeRed = (e) => {
        console.log('evl for red: ' + evl);
        console.log(e);
        let p = {
            x: e.clientX,
            y: e.clientY
        };
        socket.emit('sendRedData', p);
        console.log("EMITTED RED");
        // the handler removes itself after one click
        canvas.removeEventListener('click', removeRed);
    };

    canvas.addEventListener('click', removeRed);
});

b.) Resizing context + canvas to be full width (see d for answer)

Question: when I resize the canvas to be full width, the mouse pick still picks from the original context size – it is not picking the right pixel; the context didn’t seem to scale with the canvas.

c.) Animating the shape in playback mode

How do you return a value from setInterval()? Having issues with the rectangle drawing across the screen… when it is drawn in setInterval(), it disappears too quickly because it is redrawn every 500 milliseconds.

Never mind – fixed! All I had to do was draw the shape in the main loop function.

d.) The mouse move was not picking from the canvas.

The issue was that the canvas width was set at 100vh in the CSS, so the actual video being drawn into the canvas was not the full window width. Figured out that you have to set the canvas width and height in the JavaScript.

the key code:

ctx.canvas.width = window.innerWidth;
ctx.canvas.height = window.innerHeight;

// set video full width
video.width = window.innerWidth;
video.height = window.innerHeight;

e.) To get the camera to face out, I had to use facingMode: 'environment'

let constraints = {
    audio: false,
    video: {
        // 'environment' selects the rear-facing camera
        facingMode: "environment"
    }
};

f.) Converting sound frequencies to notes. Thank you Billy for converting the frequencies to actual notes! Woo!
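Billy’s conversion code isn’t included in this post, but as a sketch, the standard frequency-to-MIDI mapping (MIDI note 69 = A4 = 440 Hz) is a small function:

```javascript
// Convert a frequency in Hz to the nearest MIDI note number.
// MIDI note 69 is A4 (440 Hz); each semitone is a factor of 2^(1/12).
function freqToMidi(freq, a4 = 440) {
  return Math.round(69 + 12 * Math.log2(freq / a4));
}

// freqToMidi(440) -> 69 (A4); freqToMidi(261.63) -> 60 (middle C)
```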

g.) Fixing the speed increase! This took so long, but thank you Craig for helping troubleshoot this one! The issue was that enableFakePick() already attaches an event listener, so listeners kept stacking up every time I added another one in the drop-down selection code. The fix was simply to get rid of the extra event listener in the drop-down part. Below is the winning code!

if (i === 1) {
    let playButtonDiv = document.getElementById('bottom-button-play-div');
    let pauseButtonDiv = document.getElementById('bottom-button-pause-div');
    playButtonDiv.style.display = 'inline-block';

    evl = i;
    console.log("evl: " + evl);

    if (evl != null) {
        let playButtonDiv = document.getElementById('bottom-button-play-div');
        playButtonDiv.style.pointerEvents = 'auto';
        console.log('playback');
        enableFakePick();
        // playButtonDiv.addEventListener('click', enableFakePick); // enable play button
    }

    // remove mouse pick function
    document.removeEventListener('mousemove', enableMouseMove);

} else if (i === 2) {
    // document.removeEventListener('click', enableFakePick);
    evl = i;

    if (evl != null) {
        // show color with pick function
        console.log('mouse move');
        document.addEventListener('mousemove', enableMouseMove);
    }

    let playButtonDiv = document.getElementById('bottom-button-play-div');
    playButtonDiv.style.pointerEvents = 'none'; // disable play button
}

Helpful Resources:

Credits!

Biggest thank you to Professor Shawn for constantly solving the technical issues I ran into. I so appreciate all the help and the steering of this project in a better direction. I learned so much plain JavaScript from doing this project and from being in Live Web – very grateful I took this course.

Shout out to sound expert Billy Bennett for converting the frequency sounds into midi notes! It sounds infinitely better!!

Also, thank you so much for the last-minute troubleshooting help, Dan Oved and Professor Mimi.

Data Art Wk 12-14: Data Critique

Final site: here

Final code: here

Eva and I wanted to continue thinking through the topic of garbage. We had started this project thinking we would continue building out our original designs from the previous Data & Publics assignment. But after hearing Genevieve’s lecture and seeing the examples of how people use their projects to critique an aspect of data culture, we decided to rethink our next steps.

From the lecture, I especially liked Giorgia Lupi’s notion of ‘data is people.’ At the same time, I’ve also been thinking about Dr. Robin Nagle’s book ‘Picking Up’ and her experience working alongside the sanitation workers of NYC. From these two sources, we decided to use this opportunity to move beyond a general data visualization. Instead, we hoped to focus on the people. Who are the sanitation workers and what are their experiences?

[Book cover: ‘Picking Up’ by Robin Nagle]

At one point we wanted to interview sanitation workers, but after talking to Robin, we realized how tricky that could get. After doing some more research, we found that sanitation workers have shorter lifespans and more injuries due to the back-breaking and dangerous nature of the work. We then got data from the Department of Sanitation New York (DSNY) that shows the different types of injuries by year and borough.

We were also interested in using 311 complaint data from NYC’s Open Data API to show the complaints made about sanitation. By placing the two perspectives next to each other – from those who complain to those who clean up – we hope to show a larger picture of sanitation in the city. We hope that this piece serves as a reminder of the labor that often goes unnoticed but is crucial to making our daily lives run.

a.) Finding our datasets. Using these two datasets – the most recent 311 complaints and the DSNY sanitation worker injuries (for the five boroughs, Jan–Nov 2019) – we brainstormed possible forms this could take. We split up the tasks: Eva worked on the 311 complaint data using text analysis, and I worked on the sanitation worker injury data.
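We pulled 311 complaints through NYC Open Data’s Socrata API; as a sketch, a small helper for building such query URLs could look like this (the dataset ID and filters in the usage comment are assumptions, not our exact query):

```javascript
// Build a Socrata (SODA) query URL for an NYC Open Data dataset.
function socrataUrl(datasetId, params) {
  const query = Object.entries(params)
    .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
    .join('&');
  return `https://data.cityofnewyork.us/resource/${datasetId}.json?${query}`;
}

// Hypothetical usage: recent sanitation-related 311 complaints
// socrataUrl('erm2-nwe9', { agency: 'DSNY', $limit: 100, $order: 'created_date DESC' })
```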

b.) Designing it. For the injury side, we wanted to capture the physicality and humanness of injuries, so each borough would have textural marks, each indicating a different type of injury. We were inspired by the textural quality of this website: https://canners.nyc/

First was designing rough layouts of the pages.

Next was illustrating the different types of injuries as marks and placing the marks within the shape of each borough.


The last part of the design process was combining the layout with the illustrations.


 

c.) Coding it. The idea is that when you mouse over a textural mark, the count for that type of injury shows. This was a lot of div work! It required a separate div for every single injury in every borough. A lot of the code was about showing and hiding divs when you selected a specific borough.
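Our actual div-toggling code isn’t reproduced here, but the core show/hide decision reduces to a pure function like the following (the data values and the ‘borough-injury’ div-ID naming scheme are hypothetical):

```javascript
// Hypothetical data: injury counts per borough (values are placeholders).
const injuries = {
  Brooklyn: { sprain: 120, fracture: 30 },
  Queens:   { sprain: 95,  fracture: 22 },
};

// Given the selected borough, decide which per-injury divs to show.
// Div IDs follow a hypothetical "<borough>-<injury>" naming scheme.
function visibleDivIds(selectedBorough, data) {
  const ids = [];
  for (const [borough, types] of Object.entries(data)) {
    for (const injury of Object.keys(types)) {
      if (borough === selectedBorough) ids.push(`${borough}-${injury}`);
    }
  }
  return ids;
}

// In the browser, each returned id would get style.display = 'block',
// and every other injury div would get style.display = 'none'.
```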

d.) Final touches. Next was designing and coding all the other parts of the website – an intro page and a menu.


Eva’s steps with the 311 complaints and more can be read here.

Thank you!

Biggest thank you to Eva for always being stellar to work with! Also, big thank you to Rashida for brainstorming with us and giving us the best ideas for this! Excited to continue collaborating! As always, thank you Genevieve for inspiring us to think more critically about how data is collected, visualized and used. The readings and conversations have helped us be more self-aware when working in this space.

Understanding Networks Wk 10-13: Co-Draw

Xiaotong and I worked together on this project! The goal was to create a collaborative live drawing app using ITP’s AxiDraw machine and RESTful APIs.

The functions we aimed for: 

  • Co-Draw allows people to contribute to a drawing using a web interface
  • Control the AxiDraw’s position based on the user’s mouse x and y position on the website.
  • Use socket.io to allow many people to draw at the same time.

When coding this project, we came across many challenges and we ended up with 2 versions:

  • version 1: tweaked the example code for the AxiDraw and added sockets to it (as a backup)
  • version 2 (final): uses our own RESTful API that communicates from client → server → (serially) → machine

Final code without sockets



Final Version


This version uses our own RESTful API, which was built with a lot of help from Professor Tom Igoe. We were able to get as far as sending commands from the client side → server side → (serially) → AxiDraw machine.

Systems Diagram




Server side: 

a.) GET and POST endpoints

// here are all your endpoints. The pattern is:
// GET the current value, or
// POST the new value as a request param:
server.get('/mouse', getMouseState);
server.get('/command/:command', runRemoteCommand);
server.post('/mouse/:mouse_x/:mouse_y', moveMouse);
server.post('/mouse/:mouse_y', handlePostRequest);
server.get('/state', handleGetRequest);
server.post('/state/:state', handlePostRequest);
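With the server running, these endpoints can be exercised from the command line, e.g. with curl (the host and port here are assumptions):

```shell
# POST a new pen position (x = -550, y = 0)
curl -X POST http://localhost:8080/mouse/-550/0

# GET the current mouse state
curl http://localhost:8080/mouse
```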

b.) moveMouse() function that gets the request from the client and builds a command string to send over serial

function moveMouse(request, response) {
    // request is /mouse/:mouse_x/:mouse_y
    // get the position from the request
    // e.g. SM,1000,-250,766\r
    let result = 'SM,' + '1000,' + request.params.mouse_x + "," + request.params.mouse_y;

    // send it to the serial port as a command
    sendSerialData(result);

    // wait for confirmation from the serial port
    // send a response back to the user
    response.send(result); // send the command string
}

c.) Sending the command from moveMouse() to serial

function sendSerialData(command) {
    console.log(command + '\r');
    myPort.write(command + '\r');
    console.log("Sending something out the serial port");
}

Client Side

a.) setMouse() sends a POST request to the server using the httpDo function of p5.js

function setMouse(x, y) {
    var path = '/mouse/' + x + '/' + y; // assemble the full URL
    var content = '';
    console.log('path: ' + path);
    httpDo(path, 'POST', content, 'text', responseHandler); // HTTP POST the change
}

b.) setMouse() is called in the index page

<!-- top left -->
<button onclick="setMouse('-550', '0'); moveTopLeft()">top left</button>

<!-- top -->
<button onclick="setMouse('-1000', '550'); moveTop()">top</button>

<!-- top right -->
<button onclick="setMouse('0', '550'); moveTopRight()">top right</button>

<!-- bottom left -->
<button onclick="setMouse('0', '-550'); moveButtomLeft()">bottom left</button>

<!-- bottom -->
<button onclick="setMouse('550', '-550'); moveButtom()">bottom</button>

<!-- bottom right -->
<button onclick="setMouse('1000', '500'); moveButtomRight()">bottom right</button>

Socket Code

a.) sendmouse() function for emitting data from the client to the server

// Function for sending to the socket
function sendmouse(xpos, ypos) {
    // We are sending!
    console.log("sendmouse: " + xpos + " " + ypos);

    // Make a little object with x and y
    var data = {
        x: xpos,
        y: ypos
    };

    // Send that object to the socket
    socket.emit('mouse', data);
}

b.) Socket code on server side for listening for data, then emitting

// When this user emits, client side: socket.emit('otherevent', somedata);
socket.on('mouse',
    function (data) {
        // Data comes in as whatever was sent, including objects
        console.log("Received: 'mouse' " + data.x + " " + data.y);

        // Send it to all other clients
        socket.broadcast.emit('mouse', data);

        // This is a way to send to everyone including sender
        // io.sockets.emit('message', "this goes to everyone");
    }
);

 

Version 1 (sockets added to the AxiDraw example code)

For this version, we used the cncserver code as our example: https://github.com/techninja/cncserver

Run cncserver.js

install Node.js
install npm
node cncserver.js

 


 

References we used: 

 


 

Process:

a.) Our backup plan of using the AxiDraw example code and just adding sockets to it (when our own code wasn’t working).

 

b.) Practicing sockets: disconnecting a user after 30 seconds.

 

c.) Initial start: simply communicating with the AxiDraw using serialport.

 



So Many Thanks:

We bugged way too many people and professors for this project. I’m so grateful to everyone who took the time to help and listen. The biggest thank you to Professor Tom Igoe for being patient and spending so much time to help us understand how to write RESTful APIs and how to communicate with the AxiDraw using serial.

Thank you so much to Professor Shawn for providing guidance and troubleshooting help. Thank you Professor Mimi for helping us last minute with the socket pairing.

Thank you Dana and Noah for asking critical questions and giving good feedback during the start/ideation of this project. Thank you Jackie and Andrew for guiding us through the logic.