Feel the Traffic

Once seen as a luxury, technology is now a tool of necessity, and it is reshaping our day-to-day activities, creating a new interface for our lifestyle. Born in the late '80s and raised a child of the '90s, I have witnessed firsthand a fair share of the technological development that has molded society into the way it functions today. We've gone from a time when technology was developed and shaped to fit within the parameters of society to a point where society is shaped to fit technology. This exponential growth of the digital age breeds a new generation with no connection to the days before iPhones, high-speed Internet, YouTube, and the end-all, be-all search engine. We've become slaves to the interface, where for most people technological understanding is only surface deep and stops at the screen. This is what inspired me to bring to the foreground what goes on in the background of one digital technology easily taken for granted: the Internet.

As the semester progressed I realized that I wanted to repurpose one of the works that we covered in class and make it my own. From experience I find that mimicking a piece is the best way to deepen your understanding and knowledge of the work. I was interested in Carnivore/CarnivorePE, created by Alexander Galloway and the Radical Software Group (RSG) in 2000, and in Jonah Brucker-Cohen's Alerting Infrastructure! of 2003,[1] ultimately deciding that I would combine the two. CarnivorePE is a server-based packet-sniffing program for Processing with its origins in the infamous software of the same name (Carnivore), which was created by the FBI for network surveillance and was rewritten and repurposed by Alexander Galloway as a tool for “generative” new media and a way to visualize data.[2] I was drawn to it because it generates data in real time by eavesdropping on network traffic. Unlike its traditional uses for graphical visualization, I would be creating a somewhat physically immersive environment from the data stream. Due to my limited coding abilities, I could not build my project solely around Carnivore. While I was searching for another element, we covered Alerting Infrastructure! by Jonah Brucker-Cohen. This piece is generated by the interaction of online users in real time: every virtual hit to the website translates into a physical hit to the gallery wall by means of a jackhammer, creating a “shift in context” by actualizing the website's activity.[3] The idea of representing a virtual activity in the physical realm intrigued me, and even more so I was drawn to the concept of how the advancement of one medium was directly related to the destruction of the other, although for the purposes of my project I would take a different approach than this piece.

I started the construction of my project by thinking about what kind of data generated by the Internet could be used, what tool could funnel this data, and how I could present it. It was only later in the project's development that I realized what kind of generative data I would use, but I knew immediately which tool was capable of tapping into this stream: Processing. Having previous experience with the program, I had a working knowledge of a few examples that used live feeds from the Internet, such as RSS, weather applications, and search engine results.[4] From this starting point, I narrowed down the type of data that would be readable by the program and how I might use it, deciding that results from a search engine such as Google or Yahoo would be how I powered my visuals. I spent the first month or so in trial and error with the code, ultimately without success. I came to find out that some of the developer kits packaged for use with Processing were out of date and no longer functional. So I did further research, signed up for the newer Yahoo SDK (Software Development Kit), and downloaded Google's SDK as well, which allows access to different social media feeds. The only problem with this new candy for my project was the growing complexity that comes with this kind of programming, not to mention integrating a different programming language from the SDK into Processing. Due to time constraints and my limited coding ability, I was forced to reconfigure my idea.

Brainstorm after brainstorm, I was determined to make use of the user-generated data provided by the web. Not long after the project came to a complete halt in every aspect, stalled by technical difficulties and code sophisticated well over my head, we covered the CarnivorePE project in class. I soon learned not only that Processing was used to create it but also that the code was open source. I found the source code on the RSG (Radical Software Group) website, tested it, and was now able to successfully generate a live feed from the Internet. The next thing to do was sift through the lengthy, complex code to figure out how the values were being generated, in order to filter out unusable information. This was a necessity in narrowing down the code, because not all of the data types being generated (e.g. booleans, bytes, chars (characters), floats, or integers) can be used for outputting values capable of powering the physical aspect of the project.

Over the next few weeks a few more problems arose, mostly from a beta update to Processing that rendered some Java-based components of the Carnivore client incompatible. Once the code was up and running again, I decided how I would use the information being generated. This required me to create a custom variable within the code that outputs the number of active clients on the network as a usable data type. One function of every IDE (Integrated Development Environment, the tool that compiles and runs the code) is a println function (the syntax varies with the programming language), read "print line," which lets you view the data being produced by a particular function or variable. From this, I decided to take the project from a graphical representation of the data stream to a physical platform.
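
To make that step concrete, here is a minimal sketch of the idea, assuming the CarnivoreP5 library from the RSG site is installed. The packetEvent callback and the senderAddress field follow the library's bundled examples, and the client-counting variable is my own addition for illustration rather than part of the original code.

```java
// Minimal Processing sketch (assumes the CarnivoreP5 library from RSG is installed).
// Every sniffed packet triggers packetEvent(); distinct sender IPs stand in for
// "active clients," and println() shows the value the rest of the project will use.
import org.rsg.carnivore.*;
import java.util.HashSet;

HashSet<String> clients = new HashSet<String>();   // distinct machines seen so far

void setup() {
  size(400, 200);
  new CarnivoreP5(this);            // start eavesdropping on the local network
}

void draw() {
  // nothing drawn yet; the numbers appear in the console
}

// Called by the library each time a packet crosses the network
void packetEvent(CarnivorePacket packet) {
  clients.add(packet.senderAddress.toString());    // remember the sender
  println("active clients so far: " + clients.size());
}
```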

The next step, after solidifying the bare bones of the code in conjunction with the concept for the project, was to figure out how to bring the generated data from a screen-based environment into the physical one. This would require an additional piece of hardware known as Arduino, a microcontroller whose programming environment is closely modeled on Processing. Arduino is basically the simplest form of a computer: it provides a blank slate onto which you can upload custom code, for tasks as basic as turning on a light or creating a remote control, up to more complex tasks such as wireless functionality and even domestic integration, like monitoring the wattage usage in your home. If it involves technology, Arduino can be programmed to be part of it or even to repurpose the device.
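
To give a sense of how basic that "turning on a light" task really is, here is the standard introductory Arduino sketch; pin 13 drives the built-in LED on most boards.

```cpp
// The classic Arduino "blink" sketch: the simplest task the board can do.
const int ledPin = 13;            // pin wired to the on-board LED on most Arduinos

void setup() {
  pinMode(ledPin, OUTPUT);        // configure the pin to drive the LED
}

void loop() {
  digitalWrite(ledPin, HIGH);     // light on
  delay(1000);                    // wait one second
  digitalWrite(ledPin, LOW);      // light off
  delay(1000);
}
```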

Now that Arduino was part of the equation, I needed to get the two programs talking to one another. This is done through serial communication: a signal is sent over a specified serial port that one program writes to while the other reads it, known as a serial read. Once I had the programs communicating, I had to create a custom if-then statement in Processing. What this does is tell the program that if a predefined value is reached, then send a signal to the Arduino, which in turn tells it to turn on or activate whatever is hooked up to it, in my case a single LED light. Without this functionality there is no project. Now I had the most dumbed-down version of what I wanted: a value generated by CarnivorePE, read and output by Processing, then read by the Arduino, which turns on the light. For the next couple of weeks I tried to fine-tune the code for a more desirable result in how the Arduino handles the information. The next problem I ran into was finding the right kind of parts to help actualize the network traffic. In the end I was unsuccessful in finding the right parts, which meant the conceptual execution needed to be reinvented once more. This time I decided to use sound to intrude upon the humble environment of the computer user, otherwise untainted by the Internet's traffic. Almost as effective as my original idea, this version completed the project.
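
A stripped-down sketch of that pipeline might look like the following. It builds on the earlier CarnivoreP5 example; the serial port index, baud rate, and threshold value are placeholder assumptions to be adjusted for a particular machine, not the exact values used in the finished piece.

```java
// Processing side (builds on the earlier CarnivoreP5 sketch): the "if-then"
// statement writes a byte over the serial port once the client count passes
// a threshold. Serial.list()[0], 9600 baud, and the threshold of 20 are
// illustrative values only.
import org.rsg.carnivore.*;
import processing.serial.*;
import java.util.HashSet;

Serial arduino;
HashSet<String> clients = new HashSet<String>();
int threshold = 20;                               // hypothetical trigger value

void setup() {
  size(400, 200);
  new CarnivoreP5(this);                          // start sniffing the local network
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() { }

void packetEvent(CarnivorePacket packet) {
  clients.add(packet.senderAddress.toString());
  arduino.write(clients.size() > threshold ? '1' : '0');   // signal the Arduino
}
```

```cpp
// Arduino side: read the byte sent by Processing and switch the LED accordingly.
const int ledPin = 13;

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);                  // must match the baud rate used in Processing
}

void loop() {
  if (Serial.available() > 0) {        // a byte has arrived from Processing
    char command = Serial.read();
    digitalWrite(ledPin, command == '1' ? HIGH : LOW);
  }
}
```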

The general idea behind combining the two works is to use the CarnivorePE client as a form of data mining, extracting the number of people on LSU's local network, and then to translate this would-be hidden data into the physical world, as in Alerting Infrastructure!. I want to make people aware of the activity going on in their local network by introducing the frustrations of real-world traffic, and to give physical form to the sheer volume of network traffic that easily goes unnoticed apart from the occasional sub-par loading times of browsers or downloads. I think about sitting in 5 o'clock traffic with the windows down and how my senses are constantly violated by my surroundings. When connected to a network, a user is completely oblivious to how many people are sharing the same cyber highways with them. I want to convey this feeling of angst and disdain so closely tied to traffic and taint the bubble of isolation inherent in computer usage.

The more I thought about the two different types of transportation, the more I started to realize how many qualities they share. Each car ultimately represents a person; likewise, a connection to the network typically represents the person on the other side of the screen. Just as every building has a mailing address, every website has an IP address; and just as most buildings have multiple ways of entering, so do computer ports: some are blocked or locked, others require permission, and for the most part there is one path that is open and free to come and go. In both cases the level of security dictates how this happens. On the flip side, the main noticeable difference between the two forms of traffic is the environment that houses it. But until there is a change in infrastructure or interface, these are the protocols that govern the methods of transportation. This raises an interesting thought: how would Internet usage have evolved differently if every time you used your web browser you were forced to deal with the same inconveniences that plague the daily commute? Visiting Google would be like fighting your way through mall traffic on Black Friday, and you would find another way to figure out the meaning of your unknown queries.


[1] Jonah Brucker-Cohen, “Alerting Infrastructure! Challenging the Temporality of Physical versus Virtual Environments,” in Net Works: Case Studies in Web Art and Design, ed. xtine burrough (New York: Routledge, 2012), 222.

[2] Christiane Paul, Digital Art (New York: Thames & Hudson, 2003), 179-180.

[3] Jonah Brucker-Cohen, “Alerting Infrastructure! Challenging the Temporality of Physical versus Virtual Environments,” in Net Works: Case Studies in Web Art and Design, ed. xtine burrough (New York: Routledge, 2012), 221-223.

[4] Daniel Shiffman, Learning Processing: A Beginner's Guide to Programming Images, Animation and Interaction (Burlington, MA: Morgan Kaufmann, 2008), chap. 18.
