Archive for February, 2009

Using Arduino in Xcode

Saturday, February 28th, 2009

While the Arduino IDE (Integrated Development Environment) is a relatively simple way to program the Arduino hardware, I found myself getting frustrated by its limited features after spending time in Xcode and Eclipse. There are some simple tutorials available online that demonstrate how to set up Xcode for programming the Arduino; however, these didn’t “just work” for me, and they were targeting arduino-0.10.

I made a few changes (simple in hindsight) to the project, which have worked well for me so far. (more…)

1-2-10: Interfaces research

Thursday, February 26th, 2009

I’m running a Linux-based home theater PC (HTPC, for the uninitiated). Originally the user interface was solely MythTV, but lately I’ve been trying out XBMC and Boxee for media management, as well as various fullscreen web apps (Hulu, NY Times Reader, Netflix WatchNow, Adobe TourTracker) to see what does and doesn’t work from ten feet away.

MythTV, XBMC and Boxee work well on the TV. Each can be controlled almost completely with the arrow keys / directional pad. Understandably, the web-based interfaces are lacking. The text is generally too small to read at its default size, and simply zooming the text unpredictably alters the page flow. These interfaces also require a pointing device. A gyro mouse provides control over the pointer, but most hit areas demand a degree of precision that is difficult to attain from the couch. Reading a large volume of text at a distance is also not an ideal experience for me. This was one reason for the design of Cloud Reader – presenting a serial string of words one at a time, each occupying the entire screen. The trade-off is a temporal display for spatial limitations.

Interaction with mobile devices is the next consideration. There was an interesting thread on the IxDA mailing list asking folks about their common uses for devices [google doc]. This will demand some attention. Here is the list from the doc:

- On the bus – knowing which stop is mine
- Waiting for the bus – when is it coming? Read something while waiting, listen to music
- On the motorcycle – control music (play, stop, pause, next, prev), answer a call
- At the supermarket – compare prices with other stores, see what I need to cook a recipe
- Running (general working out) – listen to music, navigate, keep track of speed, route, etc.
- At the bank – carrying needed documents and numbers
- Lost – being able to point out which way is north and south
- Needing to get from point A to point B – find a route, look at a map, locate the closest gas station/bank/post office
- Relaxing in the park – read a book, listen to music, look at a map, take a picture
- Out and about – blog, Twitter, upload a photo
- At a store – comparing prices and availability with other vendors, looking up someone’s Amazon wishlist (or other registry)
- Watching TV – looking up facts about actors/movies/shows, looking up the next playtime for a show/movie I might have missed
- See an advertisement (print, billboard, etc.) – visit the advertiser’s site, schedule the DVR, locate the nearest movie theatre, find a gas station
- Looking for a gas station – best gas price, service station amenity list (food, car washing…)
- In a grocery store – compare prices, look up weights, cross off the grocery list, review recipes
- Charging – sync with computer, look up new items to download, apply fixes
- While reading a book – look up possible meanings, synonyms and antonyms of words through Google
- While travelling – check the ticket status of waiting-list tickets
- With one hand – can’t this apply to any of the above?
- Sleeping – alarm
- In a meeting (office) – taking notes, displaying data
- Walking from one meeting to another – see the address

Spatial Media: previz

Thursday, February 26th, 2009

Started to put together the initial visualizations of the data flowing between the “cloud” and the potential devices on the table. The central area is the router, the network conduit to the internet; each node around the periphery is a device on the table. In this rough demo the red and blue streams are outbound and inbound packets, respectively. This version uses modified flocking code from Dan Shiffman’s “Nature of Code” examples. I’ve tried another version using a particle system instead; it has a different look which I’m not quite happy with.

I’ve already hit some performance issues, even using OpenGL in Java. It’s likely that my code isn’t that efficient, but unless I find something glaringly wrong I may have to rewrite this in C++. So it goes. Next up is getting the various packet types identified by color.

Animated demo applet here.

Ameya has been working with the interface for the web side of the initialization process. We’re planning on using an open wifi network with a proxy server to handle devices on the table. This should provide some built-in constraints to make the process manageable. All devices will connect to the table’s network. They will then be walked through a brief process to spatially locate the device on the table. Afterwards, the traffic will be passed on to the internet as expected. Here are his sketches.

Spatial Media: Dining Table

Monday, February 23rd, 2009

Situational Lighting for dining table.

Dining tables in the home serve many functions and can be a central hub of activity in small living spaces. Three situations quickly come to mind: entertaining, working and eating. I propose a situational reactive illumination system for the dining room table. (more…)

Thinking Physically: brauswitch demonstration

Saturday, February 21st, 2009

Following up on the initial post about the brauswitch – the eyebrow-activated headband switch. Here is some video of a simple application demonstrating its use. There are separate switches for the left and right sides. The simple Arduino code listed below indicates whether the left, right or both sides have been activated. A Processing sketch reads the serial output of the device and plays a variety of sound samples.

There is something really nice about a small facial movement being amplified into the larger audio/visual response of the sketch. It’s also nice to interact in a hands-free way. Oh! Fun. Code after the video. (more…)

Feeling productive… iPhone glove

Thursday, February 19th, 2009

After another rough week of classes, planning and discussions about projects without actually making anything, I needed a quick productivity break. iPhone gloves.

Some quick background: the touchscreens on the iPhone and iPod Touch (as well as the click wheels on older iPods and the trackpads on MacBooks/Pros) use the capacitance of skin to track touches. Gloves generally prevent these types of sensors from registering anything (except perhaps very thin gloves). Taking off a glove to use the phone is frustrating, especially when momentarily checking something that would only take a few seconds (text messages, e-mails, etc.). (more…)

Spatial Media: TrafficFlow Implementation

Thursday, February 19th, 2009

The TrafficFlow project has changed due to comments received during the first critique.

The table is circular with a smooth surface. The user places an internet-connected device on the table (mobile phone, PDA, laptop). After visiting the project’s website with the device and performing a brief initialization, particles representing internet traffic emanate from the device and swarm off to the center of the table, where they become obscured in “the cloud”. As data flows into and out of the device, particles flow between the cloud and the device. The movement of the particles will follow some type of fluid dynamics, tracking gracefully as the user moves the device. As the device is lifted from the table, the flow ceases.

Each particle represents a packet of TCP/IP data. The particles will be color-coded by data type (web, e-mail, FTP, torrent, TCP overhead) and may be strung together to indicate groups of related packets in a data stream.

Blob / edge detection for devices on the table.
Packet sniffing (Carnivore / libpcap) to watch network traffic and power the visualization.
Ajax web application to register users (associating an IP with the physical footprint of a device).

System Diagram:
particle system visualization
projector (spec. TBD)
projection surface (allowing for rear projection)
camera with IR emitter (location of emitter TBD)
sensing (IR, WiFi traffic)

Procedural Direction:
1. user joins our public network
2. user places device on surface
3. user instructed to visit initialization site
4. site asks user to hit the connect button
5. site recognizes the user’s IP / User-Agent string
6. visual confirmation, ensuring the correct device is correlated with the IP
7. association is made
8. start sniffing packets
a. read packet header
b. determine source, destination, data type (as inferred by port number)
c. pass this data to visualization in the form of parameters
9. begin visualization
10. if the blob of an associated device is lost, the Ajax interface asks the user if they wish to reconnect

Points of Failure:
- confusion for those connected to the internet via mobile data service (GPRS, EDGE, 3G)
- simultaneous user logins
- inability to maintain tracking of devices

Thinking Physically: Go away. (gesture)

Wednesday, February 18th, 2009

Designing a gesture. “The Expressive Body” by David Alberts describes movements and gestures at length. In it he writes:

In terms of human interaction, physical behavior has five primary functions: (1) to express emotion; (2) to regulate interpersonal interactions; (3) to present one’s personality to others; (4) to convey interpersonal attitudes and relationships; (5) to replace or accompany speech.

I was interested in drawing from a fairly common position I’ve found myself in lately: deep in concentration while reading or brainstorming. Analyzing the common body positions I’ve observed in myself and others in a similar state, here is my proposed gesture for “I’m busy/tired/frustrated/overwhelmed – go away / leave me alone right now.”

Three fingers touch the face: the index finger above the outside corner of the eyebrow, the middle finger on the forehead above the inside corner of the eyebrow, and the thumb just below the cheekbone. The head can be, but need not be, downturned as if reading a book or screen.

I started with a gesture for having a headache – squeezing the temples or rubbing the forehead with the fingertips of both hands, then rubbing the forehead with fingers and thumb on opposite sides of the face. The proposed gesture is modified from those.

Folks seemed unsure of the gesture at first, but were receptive to trying it out. The finger positions varied slightly, but remained recognizable. The expression of the eyes also seems to play into the gesture greatly.


Toy Design: TraceBug proposal

Tuesday, February 17th, 2009

Visiting the American International Toy Fair gave me a brief overview of what is popular across many facets of the toy industry. I wouldn’t claim a thorough comprehension of the market, but it was certainly an informative experience.

Some of my notes from the fair:
Infrared control.
Lots of robot kits. Solar powered vs. battery powered.
Grouped into prefab and modular kits.
Sound and light sensors, too.
Example: HexPods. Overheard vendor discuss user testing: Kids want control and speed.
Slot car systems. Even here there is much licensing. (Nintendo – Mario Kart)
Figurine playsets. Thematically related. Realistically detailed. Schleich.
Glow strings and kits.
Materials: Lots of plastic (PVC), lots of wood.
Many stuffed animals.
Lots of board games and educational toys. Brain teasers.
Flying toys. Planes/helicopters.
Tents and other enclosures.
Saw remote-controlled drawing robots. Reminded me of Chris Cerrito’s project.
Pedal powered cart. Awesome. Disc brakes and 7 speed shifting. (more…)

Thinking Physically: Dance Fever

Thursday, February 12th, 2009

Under the close guidance of Anne Gridley from the Nature Theater of Oklahoma, our group (Mustafa, Andrew, DV and myself) was tasked with creating a dance of six moves whose choreography was determined by the roll of a six-sided die. Each step is a single count.

Each of the six moves we created was inspired by the wearable digital switches presented earlier in class. They are labeled in a somewhat literal way, generally after the motion or body part the switch employed. (more…)