Posts Tagged ‘processing’

…of course i was logging

Wednesday, February 10th, 2010

I fractured my ankle in a hard snowboard crash a couple of weeks ago, and of course I was data logging the accelerometer forces. I was using the iPhone app developed last fall for the seismi{c}ycling project; while riding, the phone was in my jacket’s internal chest pocket.

A group from ITP was enjoying the bitter weather at Mount Snow, in West Dover, VT on our (now annual?) Snowbunnies trip. This crash was late in the day on a wide open trail. I accidentally disengaged my heelside edge for a moment, causing me to rotate slightly clockwise and slide laterally. Moments later, my heelside edge caught again, now on the downhill side, causing me to quickly flip backwards onto my head … thankfully I was wearing a helmet. After that I can’t recall what exactly happened, but I know that it involved a lot of tumbling which my right ankle just couldn’t weather. (more…)

Rest of You: Bike Forces

Monday, September 28th, 2009

(note: I’m awaiting the HR sensor; this is mostly outward forces)

I’m logging the acceleration forces at the handlebars of my bicycle while riding through New York City. The body has roughly three contact points with a bicycle: the hands at the handlebars, the “seat” at the saddle, and the feet at the pedals. The downward forces of the rider’s weight and pedaling, and the upward forces of the bicycle rolling over uneven ground, are distributed over these three points. I was interested to see just what kinds of forces are “pushing back” that I may not be aware of while lost in the act of simply keeping the bicycle upright and safely navigating through traffic.

To contextualize the raw accelerometer data, I am also tracking GPS location and will eventually geocode the raw data in software. The bicycle sensor readings are transmitted via Bluetooth to a mobile phone, and the data is logged with a custom-written (but now open-source!) Python script. Below is the first draft of the visualization. (more…)
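The logging script itself isn’t reproduced here, but the core of that kind of logger is small. A minimal Python sketch, assuming the sensor streams comma-separated ax,ay,az samples (an assumption about the protocol, not the actual format):

```python
import math

def parse_sample(line):
    """Parse one 'ax,ay,az' sample (in g) from the Bluetooth stream.
    The comma-separated format is an assumption about the protocol."""
    ax, ay, az = (float(v) for v in line.strip().split(","))
    return ax, ay, az

def magnitude(ax, ay, az):
    """Overall acceleration magnitude, independent of how the
    handlebar sensor happens to be oriented."""
    return math.sqrt(ax * ax + ay * ay + az * az)
```

Timestamping each parsed sample as it is written out is what later lets the accelerometer log be matched against the GPS track.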

Spatial Media: TrafficFlow update

Thursday, March 5th, 2009

Simple updates on the project. The previz applet has mouse control for each of the emitters. It’s interesting to see how the glowing packets influence each other when in close proximity. Here’s the applet.
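The applet’s exact rules aren’t shown here, but the mutual influence of nearby packets can be sketched as a simple pairwise attraction. A Python sketch of the idea (the radius and strength values are illustrative, not the applet’s actual parameters):

```python
def influence_step(positions, radius=50.0, strength=0.1):
    """One update step: each packet drifts toward every other packet
    that sits within `radius` of it on both axes. A toy model of
    packets influencing each other in close proximity."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        dx = dy = 0.0
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            if abs(ox - x) <= radius and abs(oy - y) <= radius:
                dx += (ox - x) * strength
                dy += (oy - y) * strength
        new_positions.append((x + dx, y + dy))
    return new_positions
```

Packets outside each other’s radius are unaffected, which is why the interesting behavior only appears when emitters are close together.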


Camera vision. Started working with openFrameworks to do the visual tracking of objects on the table. Unexposed portions of developed film negatives block most visible light and let IR light pass. Using this as a filter over the built-in iSight camera, I was able to test a rudimentary camera tracking system. It would likely be better to use a more robust library like openCV, but writing the tracking myself helped me to learn about how it works.
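The tracking code I wrote isn’t listed here, but the rudimentary approach can be sketched: threshold the IR-filtered frame and take the centroid of the bright pixels. A Python version operating on a plain 2-D list of brightness values (the actual openFrameworks code works on camera pixels in C++):

```python
def track_brightest(frame, threshold=200):
    """Rudimentary blob tracking: average the coordinates of all
    pixels at or above a brightness threshold. `frame` is a list of
    rows of 0-255 values. Returns an (x, y) centroid, or None if
    nothing is bright enough."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

A single centroid is enough for one object on the table; tracking multiple objects is where a real blob-detection library like openCV starts to pay off.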

Still have to merge these two components. It looks like I’ll have to rewrite the flocking system in C++ since Processing/Java is getting bogged down. Ameya is working on the web side of the project – messing with a Linux-based router to handle the proxy and packet sniffing, as well as a database system to manage the data.

The table was built throughout this past week; we still need to mount the camera and projector in an effective way for under projection and sensing…any advice?

Thinking Physically: brauswitch demonstration

Saturday, February 21st, 2009

Following up on the initial post about the brauswitch – the eyebrow-activated headband switch. Here is some video with a simple application demonstrating its use. There are separate switches for the left and right sides. The simple Arduino code listed below will indicate whether the left, right, or both sides have been activated. A Processing sketch reads the serial output of the device and plays a variety of sound samples.

There is something really nice about the amplification of a small facial movement into the larger audio/visual response of the sketch. It’s also nice to interact in a hands-free way. Oh! Fun. Code after the video. (more…)
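The actual Arduino and Processing code follows the video, but the left/right/both decision at the heart of it is tiny. A Python sketch of the same logic (the real device reads digital pins and sends the result over serial):

```python
def classify(left, right):
    """Map the brauswitch's two switch states to an event name.
    `left` and `right` are booleans from the headband's two switches."""
    if left and right:
        return "both"
    if left:
        return "left"
    if right:
        return "right"
    return "none"
```

The sound-playing side just maps each of these event names to a different sample.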

Toy Design: TraceBug proposal

Tuesday, February 17th, 2009

Visiting the American International Toy Fair gave me a brief overview of what is popular across many facets of the toy industry. I wouldn’t claim to have a thorough comprehension of the market, but it was certainly an informative experience.

Some of my notes from the fair:
Infrared control.
Lots of robot kits. Solar powered vs. battery powered.
Grouped into prefab and modular kits.
Sound and light sensors, too.
Example: HexPods. Overheard vendor discuss user testing: Kids want control and speed.
Slot car systems. Even here there is much licensing. (Nintendo – Mario Kart)
Figurine playsets. Thematically related. Realistically detailed. Schleich.
Glow strings and kits.
Materials: Lots of plastic (PVC), lots of wood.
Many stuffed animals.
Lots of board games and educational toys. Brain teasers.
Flying toys. Planes/helicopters.
Tents and other enclosures.
Saw remote-controlled drawing robots. Reminded me of Chris Cerrito’s project.
Pedal powered cart. Awesome. Disc brakes and 7 speed shifting. (more…)

Toy Design – Assignment 1 research

Friday, January 23rd, 2009

Some research for our first assignment in Toy Design. Paul had an idea for a drawing toy that would rotate, with extensions at the end of a string or rope also rotating and actually making the marks.

Here’s a quick sketch of the possible output. There is also an animated version with speed controls.

This is all very rough, but I wanted to mock something up before investing too much work into prototyping it.

(Signal to) Noise meter, ITP 4in4, day 4

Thursday, January 15th, 2009

For today’s project I wanted to do something with the analog decibel meter that Tymm gave me on day one. My idea is to calculate some kind of signal-to-noise ratio in my email inbox and to display the value on this physical meter. Since I already get a lot of noise in there, maybe the value won’t change very dynamically but will instead hold a steady din. Perhaps I could tie into the junk mail filter to show just how much work it’s doing, like a tachometer. Who am I kidding, really – I’m likely going to jump into what all the cool kids are doing and just come up with some type of Twitter visualization… (more…)
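The ratio-to-needle mapping could look something like this Python sketch, assuming the meter’s scale runs 0 to 100 (an assumption about the hardware; the message counts would come from the inbox and junk folders):

```python
def snr(signal_count, noise_count):
    """Signal-to-noise ratio of an inbox: wanted mail over junk."""
    if noise_count == 0:
        return float("inf")
    return signal_count / noise_count

def meter_position(signal_count, noise_count, full_scale=100):
    """Map the fraction of wanted mail to a needle position on a
    0..full_scale analog meter (the scale is an assumption)."""
    total = signal_count + noise_count
    if total == 0:
        return 0
    return full_scale * signal_count / total
```

Driving the physical needle would then just be a matter of sending `meter_position(...)` out as a PWM duty cycle.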

Meggy pixel video display. 4-in-4, day one.

Tuesday, January 13th, 2009

Spent most of the day at Tymm’s house, pretending to participate in 4-in-4, but mostly drinking coffee and watching videos. I did eventually get around to tinkering with my new Meggy Jr RGB from Evil Mad Scientist Laboratories. The Meggy is a pixel game platform built around a vivid 8×8 LED matrix and an Arduino-compatible ATmega168. EMSL has also released a simple-to-use library for managing the display, buttons, and speaker. It’s really a great kit.
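The EMSL library handles the display on the Arduino side; as a rough Python model of the idea (not the actual library API), an 8×8 RGB frame buffer looks like:

```python
class PixelGrid:
    """Toy model of an 8x8 RGB display like the Meggy Jr's matrix.
    A Python sketch of the concept, not the EMSL Arduino library."""

    def __init__(self, size=8):
        self.size = size
        self.clear()

    def clear(self):
        # Every pixel off (black).
        self.pixels = [[(0, 0, 0)] * self.size for _ in range(self.size)]

    def set_pixel(self, x, y, color):
        # Ignore writes outside the matrix, like a display driver would.
        if 0 <= x < self.size and 0 <= y < self.size:
            self.pixels[y][x] = color
```

A game loop then just clears, draws, and pushes the buffer to the LEDs each frame.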

I decided to ignore all the nice easy stuff (more…)

Touch Fingerprint

Wednesday, December 17th, 2008

An in-class exercise in trying to convey the sense of touch on-screen. Given a matrix of white points on a black field, we had to rework the project with our own “fingerprint” in about 30 minutes. (more…)

Visualizing (proxemic) space

Wednesday, December 17th, 2008

In conducting research for the CycleSense bicycle traffic proximity system, I gathered data about the amount of space behind a bicycle rider traveling through typical New York City traffic at various times and locations. The initial visualization of this data was to graph the distances over time, looking for specific events that the system would need to detect in order to be useful. I cross-referenced the space data with video taken on the same rides.

(more…)
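Detecting events in that distance-over-time data amounts to finding where the trailing gap drops below some threshold. A Python sketch, with an illustrative 1.5-meter threshold (not a value from the study):

```python
def close_passes(distances, threshold=1.5):
    """Return the indices where the trailing gap first drops below
    `threshold` - the kind of event a proximity warning system would
    need to detect. Consecutive below-threshold samples count as one
    event; a new event starts only after the gap reopens."""
    events = []
    below = False
    for i, d in enumerate(distances):
        if d < threshold and not below:
            events.append(i)
            below = True
        elif d >= threshold:
            below = False
    return events
```

Running this over each ride’s log gives the event timestamps to line up against the video.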