years ago we used to joke that the small microphone at the top of some apple monitors (beige powerPC era) was a stress sensor and would allow the computer to detect when you were getting frustrated or were under a deadline and would then misbehave to spite you.
when the macbooks added the isight camera built into the display bezel it was like déjà vu, “all over again”.
i’ve always been somewhat wary of the camera, specifically that it may be on without indicating as much. last night, as i walked past the computer, my fears were confirmed…as shown in the video below:
The tracking is done in Processing, using the vanilla video library with frame differencing. The difference image is analyzed by the blobDetection library, limited to a single blob. Feeding the detector the difference image lets it key on luminosity: the brightest pixels mark the areas of greatest change and movement.
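The actual sketch isn't posted yet, but the pipeline above can be roughed out in plain Java (rather than Processing) with made-up frame sizes and a centroid-of-bright-pixels stand-in for the blobDetection library:

```java
// Minimal sketch of frame differencing plus a brightest-pixel "blob" centroid.
// Plain Java illustration, not the author's Processing code; sizes, names, and
// the threshold are assumptions for the demo.
public class FrameDiffDemo {
    // Absolute per-pixel difference between two grayscale frames (0-255).
    static int[][] difference(int[][] prev, int[][] curr) {
        int h = curr.length, w = curr[0].length;
        int[][] diff = new int[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                diff[y][x] = Math.abs(curr[y][x] - prev[y][x]);
        return diff;
    }

    // Centroid of pixels brighter than a threshold: a stand-in for a
    // single-blob detector, since the brightest region marks the most motion.
    // Returns null when nothing moved.
    static double[] brightCentroid(int[][] diff, int threshold) {
        double sx = 0, sy = 0;
        int n = 0;
        for (int y = 0; y < diff.length; y++)
            for (int x = 0; x < diff[0].length; x++)
                if (diff[y][x] > threshold) { sx += x; sy += y; n++; }
        return n == 0 ? null : new double[] { sx / n, sy / n };
    }

    public static void main(String[] args) {
        // Two tiny 4x4 "frames": a bright patch appears in the lower-right corner.
        int[][] prev = new int[4][4];
        int[][] curr = new int[4][4];
        curr[2][2] = 200; curr[2][3] = 200;
        curr[3][2] = 200; curr[3][3] = 200;
        double[] c = brightCentroid(difference(prev, curr), 128);
        System.out.println(c[0] + "," + c[1]); // prints 2.5,2.5
    }
}
```

In the real sketch the eye would then be rotated to look toward that centroid each frame.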
The eye image is a placeholder for a macro photograph of a real eye. I’m planning to have the eye center and close after a timeout, then open and track when the camera detects motion…creepy.
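That open/close behavior amounts to a two-state machine driven by the motion detector. A hedged sketch, with a hypothetical 3-second timeout and invented names:

```java
// Tiny state machine for the planned eye behavior: track while motion is seen,
// and after a quiet timeout, re-center and close. Names and the timeout value
// are assumptions, not the author's actual design.
public class EyeState {
    enum State { OPEN_TRACKING, CLOSED }

    static final long TIMEOUT_MS = 3000; // assumed quiet period before closing

    State state = State.OPEN_TRACKING;
    long lastMotionMs = 0;

    // Call once per frame with the current time and whether motion was detected.
    State update(long nowMs, boolean motionDetected) {
        if (motionDetected) {
            lastMotionMs = nowMs;
            state = State.OPEN_TRACKING;   // open (or stay open) and track
        } else if (nowMs - lastMotionMs > TIMEOUT_MS) {
            state = State.CLOSED;          // center the pupil and close the lid
        }
        return state;
    }

    public static void main(String[] args) {
        EyeState eye = new EyeState();
        System.out.println(eye.update(0, true));       // OPEN_TRACKING
        System.out.println(eye.update(4000, false));   // CLOSED after timeout
        System.out.println(eye.update(4500, true));    // OPEN_TRACKING again
    }
}
```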
Check back for the tracking code after I have a chance to comment and clean it up.
For posterity, the original video depicting the tracking and intended motion: