Data Art Wk 12-14: Data Critique

Final site: here

Final code: here

Eva and I wanted to keep thinking through the topic of garbage. We started this project assuming we would build out our original designs from the previous Data & Publics assignment. But after hearing Genevieve’s lecture and seeing examples of how people use their projects to critique aspects of data culture, we decided to rethink our next steps.

From the lecture, I especially liked Giorgia Lupi’s notion that ‘data is people.’ At the same time, I’ve also been thinking about Dr. Robin Nagle’s book ‘Picking Up’ and her experience working alongside the sanitation workers of NYC. From these two sources, we decided to use this opportunity to move beyond a general data visualization. Instead, we hoped to focus on the people. Who are the sanitation workers, and what are their experiences?


At one point we wanted to interview sanitation workers, but after talking to Robin, we realized how tricky that could get. After doing more research, we found that sanitation workers have shorter lifespans and more injuries due to the back-breaking and dangerous nature of the work. We then got data from the Department of Sanitation New York (DSNY) that shows the different types of injuries by year and borough.

We were also interested in using 311 complaint data from NYC’s Open Data API to show the complaints made about sanitation. By placing the two perspectives next to each other – from those who complain to those who clean up – we hope to show a larger picture of sanitation in the city. We hope that this piece serves as a reminder of the labor that often goes unnoticed but is crucial to making our daily lives run.

a.) Finding our datasets. Using these 2 datasets – the 311 complaints (the most recent ones) and the DSNY sanitation worker injuries (for the 5 boroughs, Jan–Nov 2019) – we brainstormed possible forms this could take. We split up the tasks: Eva worked on the 311 complaint data using text analysis, and I worked on the sanitation worker injury data.

b.) Designing it. For the injury side, we wanted to capture the physicality and humanness of injuries, so each borough would have textural marks, each indicating a different type of injury. We were inspired by the textural quality of this website: https://canners.nyc/

First was designing rough layouts of the pages.

Next was illustrating the different types of injuries as marks and fitting the marks within the shape of each borough.


The last part of the design process was combining the layout with the illustrations.


 

c.) Coding it. The idea is that when you mouse over a textural mark, the count for that type of injury shows. This was a lot of div work! It required a separate div for every single injury across all the boroughs, and much of the code was about showing and hiding divs when you selected a specific borough.
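A minimal sketch of that show/hide pattern (the class names, data attributes, and #count element here are illustrative placeholders, not our exact markup):

// Each injury mark is its own div, tagged with its borough and count.
const marks = document.querySelectorAll('.injury-mark');

// Show only the marks belonging to the selected borough.
function showBorough(borough) {
  marks.forEach((mark) => {
    mark.style.display = mark.dataset.borough === borough ? 'block' : 'none';
  });
}

// On mouseover, reveal how many injuries of that type occurred.
marks.forEach((mark) => {
  mark.addEventListener('mouseover', () => {
    document.getElementById('count').textContent =
      mark.dataset.type + ': ' + mark.dataset.count;
  });
});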

d.) Final touches. Next was designing and coding all the other parts of the website – an intro page and a menu.


Eva’s steps with the 311 complaints and more can be read here.

Thank you!

Biggest thank you to Eva for always being stellar to work with! Also, big thank you to Rashida for brainstorming with us and giving us the best ideas for this! Excited to continue collaborating! As always, thank you Genevieve for inspiring us to think more critically about how data is collected, visualized and used. The readings and conversations have helped us be more self-aware when working in this space.

Understanding Networks Wk 10-13: Co-Draw

Xiaotong and I worked together on this project! The goal was to create a collaborative live drawing app using ITP’s Axi-Draw Machine and RESTful APIs.

The functions we aimed for: 

  • Co-Draw allows people to contribute to a drawing using a web interface
  • Control the AxiDraw’s position based on the user’s mouse x and y position on the website.
  • Use socket.io to allow many people to draw at the same time.

When coding this project, we came across many challenges and ended up with 2 versions:

  • version 1 (backup): tweaked the AxiDraw example code and added sockets to it
  • version 2 (final): uses our own RESTful API that communicates from client → server → (serially) → machine

Final code without sockets



Final Version


This version uses our own RESTful API, which was built with a lot of help from Professor Tom Igoe. We got as far as sending commands from the client side → server side → (serially) → AxiDraw machine.

Systems Diagram




Server side: 

a.) GET and POST endpoints

// here are all your endpoints. The pattern is:
// GET the current value, or
// POST the new value as a request param:
server.get('/mouse', getMouseState);
server.get('/command/:command', runRemoteCommand);
server.post('/mouse/:mouse_x/:mouse_y', moveMouse);
server.post('/mouse/:mouse_y', handlePostRequest);
server.get('/state', handleGetRequest);
server.post('/state/:state', handlePostRequest);

b.) moveMouse() function that takes the request from the client and builds a command string to send over serial

function moveMouse(request, response) {
  // request is /mouse/:mouse_x/:mouse_y
  // get the position from the request
  // e.g. SM,1000,-250,766\r
  let result = 'SM,' + '1000,' + request.params.mouse_x + ',' + request.params.mouse_y;
  // send it to the serial port as a command
  sendSerialData(result);
  // wait for confirmation from the serial port
  // send a response back to the user
  response.send(result); // send the string back
}

c.) Send the command from moveMouse() out the serial port

function sendSerialData(command) {
  console.log(command + '\r');
  myPort.write(command + '\r');
  console.log('Sending something out the serial port');
}

Client Side

a.) setMouse() sends a POST request to the server using p5.js’s httpDo() function

function setMouse(x, y) {
  var path = '/mouse/' + x + '/' + y; // assemble the full URL
  var content = '';
  console.log('path: ' + path);
  //httpDo(path, 'GET', content, 'text', getResponse);
  httpDo(path, 'POST', content, 'text', responseHandler); // HTTP POST the change
}

b.) setMouse() is called from the buttons in index.html

<!--top left-->
<button onclick="setMouse('-550', '0'); moveTopLeft()">top left</button>

<!--top-->
<button onclick="setMouse('-1000', '550'); moveTop()">top</button>

<!--top right-->
<button onclick="setMouse('0', '550'); moveTopRight()">top right</button>

<!--bottom left-->
<button onclick="setMouse('0', '-550'); moveButtomLeft()">bottom left</button>

<!--bottom-->
<button onclick="setMouse('550', '-550'); moveButtom()">bottom</button>

<!--bottom right-->
<button onclick="setMouse('1000', '500'); moveButtomRight()">bottom right</button>

Socket Code

a.) sendmouse() function for emitting data from the client to the server

// Function for sending to the socket
function sendmouse(xpos, ypos) {
  // We are sending!
  console.log('sendmouse: ' + xpos + ' ' + ypos);
  // Make a little object with x and y
  var data = {
    x: xpos,
    y: ypos
  };
  // Send that object to the socket
  socket.emit('mouse', data);
}

b.) Socket code on server side for listening for data, then emitting

// When this user emits, client side: socket.emit('mouse', data);
socket.on('mouse', function (data) {
  // Data comes in as whatever was sent, including objects
  console.log("Received: 'mouse' " + data.x + ' ' + data.y);
  // Send it to all the other clients
  socket.broadcast.emit('mouse', data);
  // This is a way to send to everyone including the sender:
  // io.sockets.emit('message', "this goes to everyone");
});

 

Version 1 (backup: sockets added to the AxiDraw example code)

For this version, we used the cncserver code as our example: https://github.com/techninja/cncserver

Run cncserver.js

install Node.js (npm comes bundled with it)
npm install
node cncserver.js

 


 


Process:

a.) Our backup plan of using the AxiDraw code and just adding sockets to it (for when our own code wasn’t working).

 

b.) Practicing sockets by disconnecting a user after 30 seconds.
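A server-side sketch of that exercise, assuming a standard socket.io setup (the port number and timing here are placeholders):

const server = require('http').createServer();
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  console.log('user connected: ' + socket.id);
  // After 30 seconds, force this user to disconnect.
  setTimeout(() => {
    socket.disconnect(true);
  }, 30 * 1000);
});

server.listen(8080);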

 

c.) Initial start: simply communicating with the AxiDraw using serialport.
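A minimal version of that first test, using the serialport npm package (the port path and baud rate are placeholder assumptions; the SM command format is the one used in moveMouse() above):

// Open the port and send one EBB 'SM' move command.
const SerialPort = require('serialport');
const myPort = new SerialPort('/dev/tty.usbmodem1411', { baudRate: 9600 });

myPort.on('open', () => {
  // SM,<duration ms>,<axis 1 steps>,<axis 2 steps>
  myPort.write('SM,1000,-250,766\r');
});

myPort.on('data', (data) => {
  console.log('AxiDraw replied: ' + data.toString());
});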

 



So Many Thanks:

We bugged way too many people and professors for this project. I’m so grateful to everyone who took the time to help and listen. The biggest thank you to Professor Tom Igoe for being patient and spending so much time helping us understand how to write RESTful APIs and how to communicate with the AxiDraw over serial.

Thank you so much to Professor Shawn for providing guidance and troubleshooting help. Thank you Professor Mimi for helping us last minute with the socket pairing.

Thank you Dana and Noah for asking critical questions and giving good feedback during the start/ideation of this project. Thank you Jackie and Andrew for guiding us through the logic.

Live Web Wk 10: Finals Progress (v1)

Steps for Week v1 (11/14/19 – 11/21/19) 

a.) Pixelate live stream + get RGB values of each pixel

First, I pixelated the live stream, using this reference code: pixelate effect (using this one!). Then I found the RGB value for each pixel, using the MDN example: getting pixel data from context.
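Roughly, the combined approach looks like this (a sketch – the element IDs and block count are my own placeholders): draw the video small, scale it back up with smoothing disabled, then read pixels with getImageData().

const video = document.getElementById('video');
const canvas = document.getElementById('canvas');
const ctx = canvas.getContext('2d');
const blocks = 16; // how many "pixels" across

function draw() {
  ctx.imageSmoothingEnabled = false;
  ctx.drawImage(video, 0, 0, blocks, blocks); // tiny copy
  ctx.drawImage(canvas, 0, 0, blocks, blocks, // blow it back up
                0, 0, canvas.width, canvas.height);
  // Read the RGB of the top-left block (MDN getImageData pattern):
  const p = ctx.getImageData(0, 0, 1, 1).data;
  console.log('r: ' + p[0] + ' g: ' + p[1] + ' b: ' + p[2]);
  requestAnimationFrame(draw);
}
video.addEventListener('play', draw);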


—–

b.) Converting RGB to Frequency

  • RGB to HSL
    • After pixelating the live stream, I needed to convert RGB → HSL (hue, saturation, lightness), then hue → wavelength (nm), then wavelength (nm) → frequency (THz). This keeps the conversion from color to sound scientifically grounded.
    • First things first: I converted the RGB to HSL in order to access the hue value (see the sketch after this list). Helpful resource for converting RGB to HSL: here.
    • Here’s my test code for pixelation with the RGB and HSL values printed in the innerHTML: here
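The hue extraction itself is the standard RGB-to-HSL formula; paraphrased as a sketch:

// Convert RGB (0–255) to hue (0–360 degrees).
function rgbToHue(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  const d = max - min;
  if (d === 0) return 0; // grey: hue is undefined, default to 0
  let h;
  if (max === r) h = ((g - b) / d) % 6;
  else if (max === g) h = (b - r) / d + 2;
  else h = (r - g) / d + 4;
  h *= 60;
  return h < 0 ? h + 360 : h;
}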


—–

c.) Convert Hue to Wavelength (nm)

Helpful visualizer for converting hue to wavelength: here.
stackoverflow – convert hue to wavelength
convert hue to wavelength

    //convert hue to wavelength
    // Estimating that the usable part of the visible spectrum is 450-620nm, 
    // with wavelength (in nm) and hue value (in degrees), you can improvise this:

    let wavelength;
    wavelength = Math.ceil(620 - 170 / 270 * h);

I checked using this site that converts color to wavelength (nm). For example, I tested to make sure blue comes out in the 450–470 nm range.

—–

d.) Convert Wavelength (nm) to Frequency (THz)

Helpful site for understanding and converting wavelength to frequency: here

Based on this equation:
Wavelength (λ) = Wave Velocity (v) / Frequency (f)

Rearranged with v = c = 3×10⁸ m/s and λ in nm, the frequency comes out in THz as f = 3×10⁵ / λ. So the code for converting from wavelength (nm) to frequency (THz) is:

frequency = 3 * (Math.pow(10, 5)) / wl; // wl in nm → frequency in THz

The above code works for the conversion! I checked it by making sure this blue lands around 600–668 THz.


—–

e.) Using frequency (THz) to make sound

I took the functions I wrote for midterms to get a sound that maps the color frequency range (400–789 THz) to a pitch frequency range (20–3000 Hz).
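I haven’t posted the midterm functions here, but the gist is a linear map plus a Web Audio oscillator – a rough re-sketch:

// Linear map from color frequency (THz) to pitch (Hz), then play it.
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((outMax - outMin) * (v - inMin)) / (inMax - inMin);
}

const audioCtx = new AudioContext();

function playPixel(colorTHz) {
  const osc = audioCtx.createOscillator();
  osc.frequency.value = mapRange(colorTHz, 400, 789, 20, 3000);
  osc.connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.2); // a short blip per pixel
}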

Previously, the sketch was giving me an interesting feedback-y type of sound. I wasn’t sure what caused it, but Shawn added a snippet of code that helped give each pixel a distinct sound.

Next step is to figure out how to get the sound to play only once on mouseover.

—–

f.) Design for Play Modes

Example sketch layout for introduction to the Play app. This is where people can select which mode they want to try out.


Example layout for the 2 different modes. They don’t differ much visually, just in how the user interacts: Live mode has a playhead, so it’s more like a soundtrack, while Interact mode lets you mouse over each color pixel.

 

Joy + Games Wk 10: Fruit Bowling

(video doesn’t have the audio)

For this week, I made a new game that simulates bowling with fruits! I came up with this idea after watching a very helpful bowling-in-Unity tutorial. Right now, the instructions are: 1.) space bar to start bowling / move the watermelon forward, 2.) right arrow to move right, 3.) left arrow to move left, 4.) R to refresh.

There are many features I still want to include – an intro page, a scoring system, changing which fruit to bowl with. I am also interested in putting this on itch.io! Will aim to do that this week!


Live Web Wk 10: Final Project Proposal

Sound + Color + Environment! 

Objective: When people take a photo of their surroundings using the app, the photo will turn into colored pixels. Each of these “pixels” will have a sound that matches its color. The goal is to create a tool that helps one experience the world in pure sounds and colors. I also intend for this to be an extension of my previous “Sound + Color” project. By including that project’s more educational “color frequency” page, it will be easier to explain how I chose the sounds of the colors for this “Environment” page.


Functions + Features:

  • Environment Page (mobile):
    • when user takes a photo –> photo pixelates into color blocks
    • user can click over one color block at a time to hear that sound
    • or user can click on the play button (at the bottom of the page) to hear the full soundtrack of all the colors
    • a sound wave that matches with the frequency of the color + sound
  • Environment Page (desktop):
    • same as above, except the photo is captured from the computer’s camera
  • The Environment page will need to work on both mobile and desktop
  • Recreate the previous web version (with the Intro, Learn, and Draw pages) so that it is mobile friendly
  • Add an about page that explains the science a little more and includes some of my research references

Here is an example of the pixelation effect that I am hoping to achieve:

Data Art Wk 8-10: Data & Publics

Website so far (map only works locally)
Code here

Eva and I worked together to better understand where our garbage goes. We were interested in finding out where our different types of trash end up – does it stay in New York, end up in a different state, or even get sent out of the country? First things first, we researched how trash is handled in New York. I wish I had more time to dig deeper, as there is a lot to dig into, but we managed to gather some facts at least.

a.) Research! Here are some notes captured through the research process: 

Notes on the Guardian article:
  • NYC generated over 3 million tons of household waste in 2015
  • types of garbage:
    • mixed solid waste → curb → waste transfer station → landfill or waste-to-energy plant
    • paper recyclables → curb → handling and recovery facility → domestic or international paper mills
    • metals, glass and plastic → handling and recovery facility → domestic/international recyclers
    • * compost is not listed in the diagram below but is an important type to consider

Where New York City Garbage Goes

  • NYC relies on a complex waste-management ecosystem encompassing 2 city agencies, 3 modes of transport (trucks, trains, and barges), 248 private waste-hauling companies, and temporary and permanent facilities
  • History of NYC waste management:
    • for most of its history, until the mid-1900s, the primary method of disposing of waste was to dump it into the ocean
    • at one point, 80% of the garbage went into the sea
    • the city used some of its garbage (ash, rubble, debris) to create artificial land → increased its own size. Much of the city’s land today, including some of its priciest neighborhoods, is built on garbage. For example, a 1660s map shows how much of the city is made of rubble + debris. *Is this not amazing?!

A map of 1660s Manhattan overlaid on modern New York shows how much of the city’s land is manmade.

  • 2 waste systems: 1 public, 1 private
    • 3/4 of NY’s garbage is generated by commercial businesses; most of it is rubble + debris from construction projects
    • the garbage-hauling industry has ties to organized crime
  • 12,000 tons of garbage each day
  • 2,230 collection trucks
  • moved to transfer facilities → carted off to landfills in various surrounding states, which are now nearly all at capacity
  • NYC spends almost $1 billion per year on trash and recyclables collection

b.) After brainstorming, we realized we had many functions we wanted this website to have. Some of the functions we thought of and designed:

  • An intro animation page: gives a general overview of some NYC garbage facts
  • A journey section: when you type in your zip code, it tells you exactly the journey your trash takes – from curb to transfer station to landfill. Ideally, it will show an image of what each step looks like.
  • A map section: a way to see how much trash each neighborhood generates. We were also hoping to be able to filter by time, income, and type.
  • A take-action section: action items that can help us be better trash citizens
  • A resource page: we used many datasets and read several articles, so it would be good and transparent to have a bibliography page
  • A QR code + sticker campaign: we thought of placing stickers with QR codes on trash bins. The QR code would show the route the bin’s contents take from school/home/work to landfill. We hoped this would be a simple way to bring the data closer to people.

c.) We next came up with designs and sketches! Eva worked on some very thorough UX layouts based on what we had brainstormed.

I made some visual designs based on her great UX layouts.

 


d.) Next we worked on coding this to life! Exciting!

e.) Conceptual mockups for our civic approach.
We still need to tweak the facts, make the QR code, and create the page that links the QR code to the journey section of the website.


f.) To be continued…
Since we couldn’t get the data for the journey map within this time frame, we will continue working on it. We also want to get the messaging right for the stickers, with a stronger focus on the compost program!

We are excited to keep going!!!

 

In the meantime, we have made this click-through InVision prototype to show what we are imagining: https://emily511438.invisionapp.com/public/share/UKWU4NCMF


Understanding Networks Wk 10: Restful API Drawing Machine (Updated)

Live Drawing Show!

Goal:

  • The user will draw on the website’s draw page, and the AxiDraw will update based on the user’s drawing. There will be a camera recording the show, and the recording will be sent to the website’s live stream page so others can watch the drawing as it happens.

Functionality:

  • Co-Draw allows people to contribute to a drawing using a web interface
  • Control the AxiDraw’s position based on the user’s mouse x and y position on the website.
  • Include a camera that records the live drawing show. The live stream will be posted on a separate page of the website.
  • Ideally: use socket.io to allow 2 people to draw at the same time.

REST API:

1.) Identify the AxiDraw machine

  • Address: /identify_axi
  • Method: ?


2.) Find the current position of the user’s mouse x and mouse y on the website

 

Address: /pen_position

Method: post

Body:

{
   "mouse_x": 2344,
   "mouse_y": 281,
   "state": 1  // pen state is from 0 to 1 (down/on)
}

Response:

{
  "coordinates": [
    {
      "success": {
        "type": "200", // ok
        "mouse_x": "2344",
        "mouse_y": "281",
        "state": "1"
      },
      "error": {
        "type": "400", // error
        "details": "invalid inputs"
      }
    }
  ]
}
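As a sanity check, a client request against this spec might look like this (a hypothetical sketch, not our final client code):

// POST the pen position and log the response.
fetch('/pen_position', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ mouse_x: 2344, mouse_y: 281, state: 1 })
})
  .then((res) => res.json())
  .then((data) => console.log(data.coordinates))
  .catch((err) => console.error(err));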

3.) Set to the original position when the user presses the Reset button

Address: /original
Method: POST
Body: 

{
   "mouse_x": 0,
   "mouse_y": 0,
   "state": 0 // off
}

Response: 

{
  "original_position": {
    "success": {
      "mouse_x": 0,
      "mouse_y": 0,
      "state": 0 // off
    },
    "error": {
      "type": "400",
      "details": "invalid inputs"
    }
  }
}

Video for live stream

4.) Get channel 

Address: /channel
Method: GET
Body: 

Response: 

{
  "channel": {
    "success": {
      "recording": "on",          // turn channel on or off
      "channel_id": {channel_id}, // integer
      "width": 1920,              // px
      "height": 1080              // px
    },
    "error": {
      "type": "400",
      "details": "invalid inputs"
    }
  }
}

5.) Post channel data to client

Address: /channel
Method: POST

{
  "channel": {
    "recording": "on",          // turn channel on or off
    "channel_id": {channel_id}, // integer
    "width": 1920,              // px
    "height": 1080              // px
  }
}


Live Web Wk 9: WebRTC Data Channels

Try it out!
Code here

For this week, I made a very slow type of chat messaging (calling it snail msg for now?). The idea is that you can only send one letter at a time. I guess I am making this as a reminder to slow down and to be OK with taking my time. Sometimes I expect things to arrive immediately – a thesis idea, great project concepts, someone’s email/text response, skills, knowledge. It’s helpful, especially right now, to remind myself not to rush. This is a super simple chat messaging system (a lot of the code was tweaked from the class example), but I enjoy the fact that it forces me to pay attention to every letter I type.
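At its core, the one-letter-at-a-time idea rides on an RTCDataChannel. A bare-bones sketch (the peer signaling is omitted, and the channel name is illustrative):

const pc = new RTCPeerConnection();
const channel = pc.createDataChannel('snail');

// Send exactly one letter per message.
channel.onopen = () => channel.send('h');

channel.onmessage = (event) => {
  console.log('received letter: ' + event.data);
};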

Notes & Questions:

  • In WebRTC, there are three main JavaScript APIs: 1. MediaStream, 2. RTCPeerConnection, 3. RTCDataChannel.
  • Why do I keep getting “Resource interpreted as Stylesheet but transferred with MIME type text/html” and can’t link my stylesheet? I have usually been able to in the past – what is different this time? I searched through Stack Overflow but couldn’t find a solution, so in the meantime I just added the style to the HTML page. (This warning usually means the server returned an HTML page, often a 404, instead of the CSS file, so the stylesheet path probably didn’t match a route the server was actually serving.)
    Screenshot 2019-11-06 00.21.52
  • I had issues with creating a new div for each piece of data sent over the socket. I realized I was creating a new textNode to append to the div, when I should have been appending a new div and setting its innerHTML. That was the solution! (See the sketch after this list.)
  • add a class to a newly appended div: newDiv.className = 'name of class'
  • I also got errors when using my own fonts (an OTS parsing error): https://stackoverflow.com/questions/34133808/webpack-ots-parsing-error-loading-fonts
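For reference, the append pattern that ended up working (the element and class names are illustrative):

// Append a new div per incoming letter.
function addLetter(letter) {
  const newDiv = document.createElement('div');
  newDiv.innerHTML = letter; // set content via innerHTML (the fix)
  newDiv.className = 'msg';  // add a class to the appended div
  document.getElementById('chat').appendChild(newDiv);
}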


Lines to Remember:

npm install nedb

Understanding Networks Wk 9: Firewall

Assignment:
“Run a Linux host for several days, with a firewall in place and a public IP address. Make a table of all the IP addresses that attempt to connect to your host. Find out where they are located. What organizations are they associated with? What service providers are providing their IP addresses?”

a.) Step 1: On my DigitalOcean Linux host, I installed and configured the ufw firewall and enabled TCP connections on a handful of ports, following this super helpful tutorial from the ITP Networks site.


b.) Step 2: I collected a table of all the IP addresses that tried to connect to my host on Nov. 2, Nov. 3, and Nov. 4. I was planning to combine all 3 days’ worth of data into 1 set, but that was too large and hard to manage in Excel.

I decided to just focus on the Nov. 4 data, which was more than enough data points to work with (5,640 hits across 1,111 unique sources).


c.) Step 3: Using Excel, I found the total hits from all sources, the number of unique sources, and the top 10 most frequent sources (including their IP addresses).


Other Data (Nov 4): 

  • Total Hits from All the Sources: 5,640
  • Number of Unique Sources: 1,111

d.) Step 4: Using ipinfo.io, I looked up where each of the top IP addresses comes from and which organizations they are associated with. The top 5 are below.

  • #1 hits: 185.156.73.52
    • Moscow, RU
    • OOO Patent-Media


  • #2 hits: 185.176.27.254
    • Moscow, RU
    • Balkan Internet Exchange Ltd


  • #3 hits: 185.176.27.162
    • Moscow, RU
    • Balkan Internet Exchange Ltd


  • #4 hits: 80.82.64.73
    • Amsterdam, NL
    • Incrediserve.net


Side note: Incrediserve.net is an IP trading company! This is a dumb question, but do they mean IP addresses or IP as in intellectual property?


  • #5 hits: 96.250.119.232
    • New York City, NY
    • MCI Communications Services, Inc. Verizon Business


e.) Step 5: Reflection.

UM, why is someone from Russia constantly hitting my IP address?! And so many times in 1 day! Is this maybe tied to the hacking of our elections? I tried to look into this a little and found this forum. Some people on the forum say it is “normal” to get probes from many countries like Russia, China, Vietnam, etc. Basically, I shouldn’t let my biases get in the way, even though the top 3 sources hitting me are from Russia.

Thank You: 

  • So much gratitude to Rashida for answering my many questions about this!
Thank you for the straightforward guide to setting up the firewall and IP tables, Professor Tom! Having initially tried to do this assignment without the guide, I realized how confusing setting up a firewall and configuring the IP tables could have been. The internet has so many resources, but not all are structured so well!

Helpful resources for understanding UFW (initially, I forgot about the ITP tutorial above, so I found some forums/helpful links online to try to understand what to do):

Lines to remember:

sudo dmesg | grep '\[UFW'
grep UFW /var/log/syslog
/var/log
sudo tail ufw.log

remember:
scp root@.....:/var/log/ufw.log .

Joy & Games Wk 8: Simulation

Assignment: make a simulation in Unity. I decided to simulate a calm, fall-inspired boat ride. This was chosen partly because of skill level, partly because of time, and partly because of all the beautiful yellow leaves I’m seeing! This isn’t the most exciting or goofy interaction, but I’ve been feeling the need to calm down lately. Perhaps that is why I’ve ended up with something that is simply about staring at nature? I really should just go out for a hike instead of simulating nature on a screen lol.
