Posts Tagged ‘visualization’

Spatial Media: previz

Thursday, February 26th, 2009

Started to put together the initial visualizations of the data flowing between the “cloud” and the potential devices on the table. The central area is the router, or the network conduit to the internet; each node around the periphery is a device on the table. In this rough demo the red and blue streams are outbound and inbound packets, respectively. This version is using modified flocking code from Dan Shiffman’s “Nature of Code” examples. I’ve tried another version using a particle system instead; it has a different look which I’m not quite happy with.
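
Since the structure may not be obvious from the description, here is a stripped-down Processing sketch of the underlying idea (not the actual flocking code; the names and numbers are placeholders). “Packets” steer between a central cloud point and a single device node on the rim, colored by direction:

// Stripped-down sketch: packets steer between a central "cloud" and a device
// node on the table edge. Red = outbound, blue = inbound. Illustrative only.
ArrayList<Packet> packets = new ArrayList<Packet>();
PVector cloud, device;

void setup() {
  size(600, 600);
  cloud = new PVector(width/2, height/2);      // router / "the cloud"
  device = new PVector(width - 60, height/2);  // one device on the periphery
}

void draw() {
  background(0);
  if (frameCount % 3 == 0) {                   // spawn a steady trickle of packets
    boolean outbound = random(1) < 0.5;
    packets.add(new Packet(outbound ? device.copy() : cloud.copy(), outbound));
  }
  for (Packet p : packets) {
    p.update();
    p.display();
  }
}

class Packet {
  PVector pos, vel = new PVector();
  boolean outbound;
  Packet(PVector start, boolean out) { pos = start; outbound = out; }
  void update() {
    PVector target = outbound ? cloud : device;
    PVector steer = PVector.sub(target, pos);
    steer.limit(0.2f);                         // gentle steering gives a drifting feel
    vel.add(steer);
    vel.limit(2.5f);
    pos.add(vel);
  }
  void display() {
    stroke(outbound ? color(255, 60, 60) : color(60, 60, 255));
    point(pos.x, pos.y);
  }
}

The working version swaps this simple steering for the modified flocking behaviors, so the streams clump and weave instead of flying straight.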

I’ve already hit some performance issues, even using OpenGL in Java. It’s likely that my code isn’t that efficient, but unless I find something glaringly wrong I may have to rewrite this in C++. So it goes. Next is to get the various packet types identified by color.

Animated demo applet here.

Ameya has been working on the interface for the web side of the initialization process. We’re planning on using an open WiFi network with a proxy server to handle devices on the table. This should provide some built-in constraints to make the process manageable. All devices will connect to the table’s network. They will then be walked through a brief process to spatially locate the device on the table. Afterwards, their traffic will be passed on to the internet as expected. Here are his sketches.

Spatial Media: TrafficFlow Implementation

Thursday, February 19th, 2009

The TrafficFlow project has changed due to comments received during the first critique.

The table is circular with a smooth surface. A user places an internet-connected device on the table (mobile phone, PDA, laptop). After visiting the project’s website with the device and performing a brief initialization, particles representing internet traffic emanate from the device and begin to swarm off to the center of the table, where they become obscured in “the cloud”. As data flows into and out of the device, particles will flow between the cloud and the device. The movement of the particles will follow some type of fluid dynamics, tracking gracefully as the user moves their device. As the device is lifted from the table, the flow will cease.

Each particle represents a packet of TCP/IP data. The particles will be color-coded by data type (web, e-mail, FTP, torrent, TCP overhead) and may be strung together to indicate groups of related packets in a data stream.
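
As a rough illustration, the data type (and a display color) can be inferred with a simple lookup on the port number. The ports and palette below are my placeholders, not a final mapping:

// Illustrative only: coarse data type and color from a TCP/UDP port number.
String classifyPort(int port) {
  switch (port) {
    case 80: case 443: case 8080:                    return "web";
    case 25: case 110: case 143: case 993: case 995: return "email";
    case 20: case 21:                                return "ftp";
    case 6881: case 6882: case 6883: case 6889:      return "torrent";
    default:                                         return "overhead"; // everything else
  }
}

color colorFor(String type) {
  if (type.equals("web"))     return color(255, 80, 80);
  if (type.equals("email"))   return color(80, 160, 255);
  if (type.equals("ftp"))     return color(80, 255, 120);
  if (type.equals("torrent")) return color(255, 200, 60);
  return color(160, 160, 160);                       // TCP overhead / unknown
}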

Methods:
Blob / Edge detection for devices on the table.
Packet Sniffing (Carnivore / libpcap) to watch network traffic and power the visualization (see the sketch after this list)
Ajax web application to register users (associate IP with physical footprint of device)
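
Here is the rough shape of the Carnivore side in Processing. I’m working from the library’s examples, so the constructor, the packetEvent() callback, and the packet field names should be treated as assumptions until it’s actually wired up:

// Rough sketch only: Carnivore-based sniffing in Processing.
// The API details below (CarnivoreP5, packetEvent, field names) are taken
// from the library's examples and may differ in the version we end up using.
import org.rsg.carnivore.*;

CarnivoreP5 sniffer;

void setup() {
  size(600, 600);
  sniffer = new CarnivoreP5(this);   // start listening on the local interface
}

void draw() {
  // the particle visualization would be drawn here
}

// Called by the library whenever a packet is captured.
void packetEvent(CarnivorePacket packet) {
  // Source, destination, and ports (for type inference) get handed to the visualization.
  println(packet.senderAddress + ":" + packet.senderPort +
          " -> " + packet.receiverAddress + ":" + packet.receiverPort);
}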

System Diagram:
particle system visualization
projector (spec. TBD)
projection surface (allowing for rear projection)
camera with IR emitter (location of emitter TBD)
sensing (IR, WiFi traffic)

Procedural Direction:
1. user joins our public network
2. user places device on surface
3. user instructed to visit initialization site
4. site asks user to hit the connect button
5. site recognizes the user’s IP / User-Agent string (see the sketch after this list)
6. visual confirmation, ensuring the correct device is correlated with the IP
7. association is made
8. start sniffing packets
a. read packet header
b. determine source, destination, data type (as inferred by port number)
c. pass this data to visualization in the form of parameters
9. begin visualization
10. if the blob of an associated device is lost, the Ajax interface asks the user if they wish to reconnect
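
Steps 4 through 7 amount to “read the request, remember who sent it.” As a stand-in for the actual Ajax application (which will sit behind the table’s proxy), here is a bare-bones version using the JDK’s built-in HTTP server; the path, port, and response are placeholders:

// Stand-in only: register a device by its IP and User-Agent string.
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class RegisterServer {
  public static void main(String[] args) throws Exception {
    HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
    server.createContext("/connect", exchange -> {
      // Step 5: recognize the device by its IP and User-Agent string.
      String ip = exchange.getRemoteAddress().getAddress().getHostAddress();
      String agent = exchange.getRequestHeaders().getFirst("User-Agent");
      // Steps 6-7: in the real system the association is confirmed visually
      // on the table before the IP is handed to the packet sniffer.
      byte[] body = ("Registered " + ip + " (" + agent + ")").getBytes(StandardCharsets.UTF_8);
      exchange.sendResponseHeaders(200, body.length);
      exchange.getResponseBody().write(body);
      exchange.close();
    });
    server.setExecutor(null);        // default executor is fine for this sketch
    server.start();
  }
}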

Points of Failure:
- confusion for those connected to the internet via mobile data service (GPRS, Edge, 3G)
- simultaneous user logins
- inability to maintain tracking of devices

Visualizing (proxemic) space

Wednesday, December 17th, 2008

In conducting research for the CycleSense bicycle traffic proximity system, I gathered data about the amount of space behind a bicycle rider traveling through typical New York City traffic at various times and locations. The initial visualization of this data was to graph the distances over time to look for specific events that the system would need to detect in order to be useful. I cross-referenced the space data with video taken on the same rides.
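
The graphing itself is nothing fancy; a minimal Processing sketch along these lines does the job. The file name, the one-reading-per-line format, and the 4 m ceiling are placeholders for whatever the logger actually writes:

// Minimal sketch: plot logged range readings over time.
float[] distances;
float maxRange = 400;                       // assumed rangefinder ceiling, in cm

void setup() {
  size(800, 300);
  String[] lines = loadStrings("ride_log.txt");
  distances = new float[lines.length];
  for (int i = 0; i < lines.length; i++) {
    distances[i] = float(trim(lines[i]));
  }
  noLoop();
}

void draw() {
  background(255);
  stroke(0);
  for (int i = 1; i < distances.length; i++) {
    float x0 = map(i - 1, 0, distances.length - 1, 0, width);
    float x1 = map(i,     0, distances.length - 1, 0, width);
    float y0 = map(distances[i - 1], 0, maxRange, height, 0);
    float y1 = map(distances[i],     0, maxRange, height, 0);
    line(x0, y0, x1, y1);                   // dips toward the bottom are close passes
  }
}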

(more…)

space

Tuesday, November 25th, 2008

In reaction to seeing my research on the cycling proximity project, Jane sent me some really interesting information about Edward Hall and his work on Proxemics. I’m still reading up on the concept, but in brief it pertains to the study of the space that individuals maintain in various social interactions, and specifically how different cultures maintain different norms.

Here is an update to the proximity visualization application incorporating the four Proxemic distance classifications of Intimate, Personal, Social and Public. I’m hoping to show that while riding a bicycle in traffic, riders experience frequent intrusions by vehicles into their Personal space. The haptic feedback device being devised for the CycleSense project will transpose events in Social and Personal space into Intimate space, providing an immediate awareness of intrusions that generally go unseen because they occur behind the rider.
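
For reference, the classification in the application is just a set of distance thresholds. The cutoffs below are the commonly cited values for Hall’s zones (roughly 0.45 m, 1.2 m, 3.6 m); the ones I actually use may shift as I calibrate against the rangefinder:

// Bucket a range reading (in meters) into Hall's four proxemic zones.
String proxemicZone(float meters) {
  if (meters < 0.45) return "intimate";
  if (meters < 1.2)  return "personal";
  if (meters < 3.6)  return "social";
  return "public";
}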

The next HUGE challenge is to refine the rangefinder sensor package, which I’ve never been able to get completely reliable.

data logging…mobile

Tuesday, November 11th, 2008

(this is an initial report for a longer post)

In conducting field research for the CycleSense bicycle proximity sensor, I’m looking to gather data about actual proximity events while riding and to correlate these events with video documentation and personal annotation from the test subject…probably just me.

To that end, I’ve worked on rigging up a data logging solution for the sensor package. There was some information on using Bluetooth-enabled mobile phones as a storage device, communicating with a Bluetooth module such as the BlueSMiRF attached to a microcontroller. In this case, an ultrasonic rangefinder is read by an Arduino, which sends the range values through the BlueSMiRF to a Nokia phone. (more…)
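
The phone side is the interesting part and will come in the longer post. Purely as a bench-test stand-in (not the Bluetooth-to-phone setup described above), the same stream of readings can be read and logged on a laptop with Processing’s Serial library; the port index, baud rate, and file name are placeholders:

// Bench-test stand-in: read newline-terminated range values from a serial port
// (e.g. the Arduino over USB) and append them, timestamped, to a local file.
import processing.serial.*;

Serial port;
PrintWriter logFile;

void setup() {
  port = new Serial(this, Serial.list()[0], 9600);  // pick the right port index
  logFile = createWriter("range_log.txt");
}

void draw() {
  while (port.available() > 0) {
    String line = port.readStringUntil('\n');
    if (line != null) {
      logFile.println(millis() + "," + trim(line)); // timestamp each reading
      logFile.flush();
    }
  }
}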