Posts Tagged ‘Spatial Media’

Spatial Media: Look out below!

Tuesday, April 7th, 2009

This project aims to be much more lighthearted than the TrafficFlow project. My primary interest in the Spatial Media final project was to do something large – at an architectural scale. Anderson Miller had a similar desire and we came up with an utterly ridiculous idea. (more…)

Counting Change

Thursday, March 26th, 2009


As an in-class exercise for Spatial Media we were asked to develop a program which would identify coins in a series of supplied images and tally up their total value. In one hour.

Since time was limited, I decided to use pixel count as a rough estimate of each coin's size. This works very reliably with the sample set of images, but that's likely because the images consist of duplicated coins on a solid white background. Several additional methods would likely need to be implemented to deal with real-world situations. (Code below) (more…)

TrafficFlow: Prototype

Friday, March 13th, 2009

Ameya and I presented the prototype of TrafficFlow yesterday at ITP for our Spatial Media midterm project. It is an installation-based table that visualizes wireless traffic on a local network as gracefully flowing rivers of light.

Each user on a network has an individual connection to the internet and may have a conceptual model of a personalized “tube”. However, all traffic on a typical network shares the same infrastructure, commingles at some point, and, if unencrypted, is available to any other member of the network. TrafficFlow aims to make this hidden connection visible. (more…)

Spatial Media: TrafficFlow update

Thursday, March 5th, 2009

Simple updates on the project. The previz applet has mouse control for each of the emitters. It’s interesting to see how the glowing packets influence each other when in close proximity. Here’s the applet.


Camera vision: I started working with openFrameworks to do the visual tracking of objects on the table. Unexposed portions of developed film negatives block most visible light but let IR light pass. Using this as a filter over the built-in iSight camera, I was able to test a rudimentary camera tracking system. It would likely be better to use a more robust library like OpenCV, but writing the tracking myself helped me learn how it works.

Still have to merge these two components. It looks like I’ll have to rewrite the flocking system in C++ since Processing/Java is getting bogged down. Ameya is working on the web side of the project – messing with a linux-based router to handle the proxy and packet sniffing as well as a database system to manage the data.

The table was built over this past week; we still need to mount the camera and projector in an effective way for under-table projection and sensing… any advice?

Spatial Media: previz

Thursday, February 26th, 2009

Started to put together the initial visualizations of the data flowing between the “cloud” and the potential devices on the table. The central area is the router, or the network conduit to the internet; each node around the periphery is a device on the table. In this rough demo the red and blue streams are outbound and inbound packets, respectively. This version uses modified flocking code from Dan Shiffman's “Nature of Code” examples. I've tried another version using a particle system instead; it has a different look which I'm not quite happy with.

I’ve already hit some performance issues, even using OpenGL in Java. It’s likely that my code isn’t that efficient, but unless I find something glaringly wrong I may have to rewrite this in C++. So it goes. Next is to get the various packet types identified by color.

Animated demo applet here.

Ameya has been working with the interface for the web side of the initialization process. We’re planning on using an open wifi network with a proxy server to handle devices on the table. This should provide some built-in constraints to make the process manageable. All devices will connect to the table’s network. They will then be walked through a brief process to spatially locate the device on the table. Afterwards, the traffic will be passed on to the internet as expected. Here are his sketches.

Spatial Media: Dining Table

Monday, February 23rd, 2009

Situational Lighting for dining table.

Dining tables in the home seem to be used for many functions and can be a central hub of activity in small living spaces. Three situations quickly come to mind: entertaining, working and eating. I propose a situational reactive illumination system for the dining room table. (more…)

Spatial Media: TrafficFlow Implementation

Thursday, February 19th, 2009

The TrafficFlow project has changed due to comments received during the first critique.

The table is circular with a smooth surface. A user places an internet-connected device on the table (mobile phone, PDA, laptop). After visiting the project’s website with the device and performing a brief initialization, particles representing internet traffic emanate from the device and swarm off to the center of the table, where they become obscured in “the cloud”. As data flows into and out of the device, particles flow between the cloud and the device. The movement of the particles will follow some type of fluid dynamics, tracking gracefully as the user moves their device. When the device is lifted from the table, the flow ceases.

Each particle represents a packet of TCP/IP data. The particles will be color-coded by data type (web, e-mail, FTP, torrent, TCP overhead) and may be strung together to indicate groups of related packets in a data stream.

Methods:
Blob / edge detection for devices on the table.
Packet sniffing (Carnivore / libpcap) to watch network traffic and power the visualization.
Ajax web application to register users (associating an IP with the physical footprint of a device).

System Diagram:
particle system visualization
projector (spec. TBD)
projection surface (allowing for rear projection)
camera with IR emitter (location of emitter TBD)
sensing (IR, WiFi traffic)

Procedural Direction:
1. user joins our public network
2. user places device on surface
3. user is instructed to visit the initialization site
4. site asks user to hit the connect button
5. site recognizes the user’s IP / UserAgent string
6. visual confirmation, ensuring the correct device is correlated with the IP
7. association is made
8. start sniffing packets
a. read packet header
b. determine source, destination, data type (as inferred by port number)
c. pass this data to the visualization in the form of parameters
9. begin visualization
10. if the blob of an associated device is lost, the Ajax interface asks the user if they wish to reconnect

Points of Failure:
- confusion for those connected to the internet via mobile data service (GPRS, EDGE, 3G)
- simultaneous user logins
- inability to maintain tracking of devices

Spatial Media: Traffic Flow

Thursday, February 12th, 2009

Tangible interface for exploring local network traffic.

The installation is designed to visualize the flow of information within a local area network. The goal is to make visible the invisible layer of information that comprises our information infrastructure and generally flows beneath our awareness. (more…)

Spatial Media: Fivesies

Thursday, February 5th, 2009

The couch is often prime space in a living room during social gatherings and securing a spot on it can be a priority. Eventually, the couch sitter will have to get up: to use the restroom, to get a drink or snacks, to take a phone call, etc. “Calling fives” or “fivesies” declares your intention to leave only momentarily and requests (demands!) that your seat be reserved for five minutes. There are times when someone walks into the room in your absence and is unaware that “fives” had been called on the vacant couch seat, or others when the time remaining is in question. This display will let everyone know the couch “fivesies” status. (more…)

Spatial Media: Camera Coding

Friday, January 30th, 2009

Assignment 2(b):


Make an app that allows you to get a single pixel color from live video and fill a rectangle with that color.

I’ve done similar things countless times in Processing (and Director before that), so I used this as an opportunity to dive into openFrameworks. I have been building up my C++ experience by working on Golden Cheetah, and I can finally (sort of) read the syntax comfortably. I’m also getting used to working in Xcode.

Anyway, here is the result of the quick sketch of the app. I still need to check over my coding conventions to make sure that I’m not doing anything really inefficient, but this works for now. I feel like I’m starting over again…which is fine.

Here is the application (os x) and source code (xcode project, for use with oF v0.05).