Archive for the ‘iphone’ Category

Calling NYC cyclists!

Thursday, March 25th, 2010

I’m looking for some beta testers for my iPhone data logger application. I’m specifically soliciting bicycle riders to record their rides around New York City in support of my thesis research on visualizing the cyclist experience. This is a proof-of-concept exploration of what a ubiquitous mobile sensor network could look like, using existing technology that we already carry to learn about ourselves and our world.

I’ve chosen to focus on cycling in the city, but the concept is far-reaching (and I’m certainly not the first to approach this). Recently, The New York Times published an article [1] revealing findings from a year of GPS-logged taxi cab data, summarizing average traffic speeds in Manhattan by day. Similarly, Cabspotting [2] visualized the taxi routes in San Francisco. Flight Patterns [3] reveals the air traffic over the United States throughout a typical day.

Projects using the bicycle as a sensing platform have emerged as well. The Copenhagen Wheel [4] packs a dense array of motion and environmental sensors into an electric-assist rear hub. While not cycling-specific, the Personal Environmental Impact Report [5] uses GPS-enabled mobile phones to infer mode of travel from speed and to calculate your carbon footprint and exposure to air pollution.

I’m primarily looking to see whether there are correlations in rider travel patterns. Are there commonalities in routes, sound levels, bumps? How are riders navigating to similar locations? What are typical trip durations and speeds? Do different types of riders (commuter, enthusiast, courier, racer, delivery rider) behave differently? When are riders on the roads? And for all of this, what could it look like as a visualization?

This application is the data collection mechanism I’ve chosen for this exploration. It records location, heading, speed, altitude, accelerometer readings, sound level, trip duration and distance to on-device storage. Each log can be viewed on a map and individual samples inspected. Logs can be exported via e-mail in CSV, JSON or Golden Cheetah format, and data can also be uploaded automatically while recording.
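
For a rough sense of how samples like these can be gathered on OS 3.x, here’s a minimal sketch using CLLocationManager and UIAccelerometer. It’s an illustration only, not the actual Mobile Logger source; the LoggerController class, keys and update rate are all made up, and sound level metering plus the export/upload paths are left out to keep it short.

    // Sketch only: not the actual Mobile Logger implementation.
    // LoggerController is a hypothetical object that collects one sample per location fix.
    #import <CoreLocation/CoreLocation.h>
    #import <UIKit/UIKit.h>

    @interface LoggerController : NSObject <CLLocationManagerDelegate, UIAccelerometerDelegate> {
        CLLocationManager *locationManager;
        UIAcceleration *lastAcceleration;   // most recent accelerometer reading (may be nil at first)
        NSMutableArray *samples;            // in-memory log, flushed to storage elsewhere
    }
    - (void)startLogging;
    @end

    @implementation LoggerController

    - (void)startLogging {
        samples = [[NSMutableArray alloc] init];

        locationManager = [[CLLocationManager alloc] init];
        locationManager.delegate = self;
        locationManager.desiredAccuracy = kCLLocationAccuracyBest;
        [locationManager startUpdatingLocation];

        [UIAccelerometer sharedAccelerometer].updateInterval = 1.0 / 25.0;
        [UIAccelerometer sharedAccelerometer].delegate = self;
    }

    // Accelerometer callback: just remember the latest reading.
    - (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
        [lastAcceleration release];
        lastAcceleration = [acceleration retain];
    }

    // Location callback (OS 3.x delegate method): assemble one sample per GPS fix.
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        NSDictionary *sample = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:newLocation.coordinate.latitude],  @"lat",
            [NSNumber numberWithDouble:newLocation.coordinate.longitude], @"lon",
            [NSNumber numberWithDouble:newLocation.altitude],             @"altitude",
            [NSNumber numberWithDouble:newLocation.speed],                @"speed",
            [NSNumber numberWithDouble:newLocation.course],               @"heading",
            [NSNumber numberWithDouble:lastAcceleration.z],               @"accel_z",
            nil];
        [samples addObject:sample];
    }

    @end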

This application will be released as open source software under the GPLv3. Source code will be available at: http://github.com/rcarlsen/Mobile-Logger

If you’d like to participate in this beta test, please e-mail me the UDID of your iPhone (3G or 3GS, OS 3.1+). It can be retrieved in iTunes by connecting the iPhone via USB cable and clicking on the Serial Number field in the device summary; once the UDID is displayed, choose Edit > Copy to copy it to the clipboard.

The basic functions of the application are described on the project page. If you’re simply interested in recording your trips and not in contributing to the project, I ask that you wait for the public release of the free app in the App Store. An Android version of the logger is also forthcoming.


[1] http://www.nytimes.com/2010/03/24/nyregion/24traffic.html?ref=nyregion
[2] http://cabspotting.org/
[3] http://www.aaronkoblin.com/work/flightpatterns/
[4] http://senseable.mit.edu/copenhagenwheel/
[5] http://peir.cens.ucla.edu/

OCR for iPhone source

Tuesday, January 12th, 2010

The source code for the Tesseract OCR for iPhone project has been published. It’s really simple: more of a skeleton, proof-of-concept project than anything else. Still, it’s neat to have nearly point-and-shoot text conversion in your pocket.

The project page is: Pocket OCR

The source code is available at github: http://github.com/rcarlsen/Pocket-OCR

There is certainly a lot of room for improvement: automatic color correction, page layout recognition, perspective correction…the list could go on. The code is there, so…fork away!

(The thumbnail is a bit tongue-in-cheek…but honest. Good conversion requires a good source image: well-lit, macro, focused and tightly cropped seems best.)
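
For anyone just poking at the code, the core engine call from an Objective-C++ (.mm) class boils down to something like the following. This is a sketch against the Tesseract 3.x C++ API rather than a copy of the Pocket OCR source; the method name, buffer format and tessdata location are all assumptions.

    // Sketch only (Tesseract 3.x API); not lifted from the Pocket OCR source.
    // Lives inside some Objective-C++ class in a .mm file.
    // pixels is assumed to be an 8-bit grayscale buffer of the cropped image.
    #include "baseapi.h"

    - (NSString *)recognizeText:(unsigned char *)pixels
                          width:(int)width
                         height:(int)height
    {
        tesseract::TessBaseAPI api;

        // assumes the "tessdata" language files are bundled in the app's resources
        NSString *dataPath = [[NSBundle mainBundle] resourcePath];
        api.Init([dataPath UTF8String], "eng");

        // one byte per pixel, rows packed tightly
        api.SetImage(pixels, width, height, 1, width);

        char *utf8 = api.GetUTF8Text();
        NSString *result = [NSString stringWithUTF8String:utf8];
        delete [] utf8;
        api.End();

        return result;
    }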

OCR on iPhone demo

Sunday, December 6th, 2009

Update: Source code for demo project released.


i finally got around to building a proof-of-concept implementation of tesseract-ocr for the iPhone. months ago, i documented the steps to get the library cross-compiled for the iPhone’s ARM processor, and how to build a fat library for use with the simulator as well. several folks have helped immensely in noting how to actually run the engine in obj-c++. thanks to everyone who has commented so far.

anyway, below is a short video of the POC in action. the basic workflow is: select an image from the photo library or camera, crop tightly to the block of text you’d like to convert, wait while it processes, then select/copy or e-mail the text. (more…)
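
for reference, getting the cropped UIImage into the raw grayscale buffer the engine expects can be done with a CGBitmapContext, roughly like this (a sketch only, not the demo project’s actual code; the helper name is made up):

    // Sketch: render a UIImage into an 8-bit grayscale buffer (caller must free() it).
    #import <UIKit/UIKit.h>
    #import <CoreGraphics/CoreGraphics.h>

    static unsigned char *CreateGrayscaleBuffer(UIImage *image, int *outWidth, int *outHeight)
    {
        CGImageRef cgImage = image.CGImage;
        size_t width  = CGImageGetWidth(cgImage);
        size_t height = CGImageGetHeight(cgImage);

        unsigned char *buffer = (unsigned char *)malloc(width * height);
        CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
        CGContextRef ctx = CGBitmapContextCreate(buffer, width, height,
                                                 8,       // bits per component
                                                 width,   // bytes per row (1 byte per pixel)
                                                 gray,
                                                 kCGImageAlphaNone);
        CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);

        CGContextRelease(ctx);
        CGColorSpaceRelease(gray);

        *outWidth  = (int)width;
        *outHeight = (int)height;
        return buffer;   // hand off to the OCR routine, then free()
    }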

No summertime fun: BeachBall rejected

Friday, August 7th, 2009

BeachBall was recently rejected by the iPhone Developer Program on the claim that it infringes an Apple trademark image. I’m not surprised by this at all, although I was taken aback by the (relatively) quick response: 7 days.

The usual clause from the Guidelines for Using Apple’s Trademarks and Copyrights was quoted:

Apple Logo and Apple-owned Graphic Symbols:

You may not use the Apple Logo or any other Apple-owned graphic symbol, logo, or icon on or in connection with web sites, products, packaging, manuals, promotional/advertising materials, or for any other purpose except pursuant to an express written trademark license from Apple, such as a reseller agreement.

What is frustrating about this clause is its opacity. Where can the ownership status of a “graphic symbol, logo, or icon” be determined? A search through the USPTO Trademark Electronic Search System (TESS) didn’t turn up anything useful. I’ll be happy to comply with Apple’s demands once I know what the specific claim is.

Several other apps have been rejected on similar grounds, often relating to the use of a rounded-rectangle image that Apple claims infringes the iPhone / iPod touch trademark.

The beach ball image in the app is “original” art (albeit very close to the OS X spinning wait cursor) in that I created this instance of the graphic. What are the bounds of the graphic symbol’s claim? A circle with a rainbow gradient? Gloss and shadow effect? Five-bladed fan rotating clockwise at “x” rpm? Would a photograph or illustration of a “beach ball” also be deemed to infringe?

Without specific information on the claim, any changes I make are likely to be a back-and-forth shot in the dark. Of course, I knew this going into it, but I welcome a challenge. Where’s the sense of humor? :)

Summer time: BeachBall

Thursday, July 30th, 2009

The BeachBall app has been submitted to the iPhone App Store. I’m pretty sure that I’ve crossed all the t’s and dotted the i’s throughout the application and am holding out hope for a smooth process. If all goes well it will be a great surprise just before the next semester kicks off.

Thanks to everyone who has been beta testing the (codename) Pinwheel app. I’ll be sure to give you all a copy of the final app when (if?) it’s approved.

In the meantime there are several other projects which need my attention. I’m committed to getting the in-progress work finished before moving on to new stuff…which is a strong motivator since I have a few things I’m really excited about working on. Onward…!

PinwheelBeta – adhoc

Thursday, July 23rd, 2009

In preparation for submitting Pinwheel to the App Store, I’ve gone and released a beta version via ad hoc distribution. It seems like it should be a straightforward prospect; however, the various device provisioning steps, code signing profiles and entitlements made it a bit of a process for a first-timer.

iTunes also seems entirely unhelpful when something goes wrong, offering only an “unknown error” message. What could it be? A code sign error? An unprovisioned device? An incorrect OS target? A mismatch between the app and the mobileprovision file? It’s all a mystery in the iTunes universe.

Despite the inconvenience of having to generate a new ad hoc mobileprovision profile (and recompile the beta app) each time I add a new beta tester’s device, it’s nice to finally get an app out to non-development devices.

The ad hoc distribution process is documented well (enough) on Apple’s dev portal and on several other blogs, but my real stumbling block was navigating the Xcode interface to ensure that the proper code signing profile was being included in the binary (hint: look at the verbose build output for “embedded.mobileprovision” and ensure that the profile ID matches the correct version).

I’ve also been having trouble with Xcode not building with the correct provisioning, despite updating the target info and cleaning out the build folder. A restart of Xcode usually gets it sorted.

The next step will be to set up a Distribution build for the App Store submission. We’ll see how that goes in the end…

cross-compiling for iPhone dev

Wednesday, July 15th, 2009

Update: Proof-of-concept demo. Also, updated the script for building with the 10.6 SDK.

Update #2: Source code for demo project released.

Update #3: script for use with tesseract v3 posted.

I recently needed to use an open-source library in an iPhone project. Recalling the earlier work necessary to compile the libraries needed for openFrameworks, I started looking for a more generic way to build for iPhone development. Thankfully, LateNiteSoft wrote a great article about using a shell script to cross-compile Linux projects, building a Universal Binary with versions for the Simulator and the device.

I adapted their provided code snippets to build tesseract-ocr for the iPhone, referring to the set-ups for freetype and freeimage to fill in some C++ gaps. Anyway, the library seems to have built correctly; I’ll know for sure when I incorporate it into a project soon.

To use it, copy the script into the project directory, next to the configure script. For a simple project which generates one monolithic library, edit the LIBFILE variable to reflect the location and name of the library. I’ve only used this for static libraries…other work may be necessary to correctly generate dynamic libraries (however, the iPhone SDK prohibits linking to dynamic libraries, so in this case it seems moot). Run ./build_fat.sh to kick off the process. Look for the compiled libraries in the “lnsout” directory. There’s no error checking, so caveat emptor. :)

Cross-compile shell script follows: (more…)

openFrameworks iPhone 3GS / OS 3.0

Tuesday, July 14th, 2009

i’ve been dealing with a performance bug in a particle + accelerometer oF app. the same project that runs very smoothly on a first-generation iPhone with OS 2.2.1 has a noticeable stutter on an iPhone 3GS with OS 3.0.

there was no improvement despite several rounds of optimizing the OpenGL drawing code and plugging several memory leaks.

finally, in frustration i bumped the explicit frame rate declaration from the default 60 fps to 120 fps. i realize that ofSetFrameRate(60) merely sets the upper limit of the frame rate, and that the hardware won’t actually go faster than it can handle; however, this immediately improved performance on the 3GS, and the first gen is about the same as it was. further improvement came with a declared frame rate of 240 fps.

i haven’t had a chance to look into the underlying issue, but i believe that oF is using NSTimer under the hood to trigger a scheduled update() and draw(). has there been some change in the SDK there?
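
for reference, the workaround is a one-liner in the app’s setup() (sketched here against the oF iPhone template; 120 is just the value that happened to help on the 3GS):

    // testApp.mm (openFrameworks iPhone template) -- sketch of the workaround described above
    void testApp::setup(){
        // ofSetFrameRate only caps the rate; asking for more than the default 60
        // seems to tighten the update/draw scheduling on the 3GS even though the
        // hardware won't actually render that fast.
        ofSetFrameRate(120);
    }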

…little help (oh, malloc)?

Monday, July 6th, 2009

i’m having a time of it tracking down the culprit behind some resource allocation bloat in an iPhone app i’m working on. i’ve narrowed it down to when i’m rapidly updating the center property of a UIImageView. a snippet from the Instruments application is below:

[Instruments screenshot: repeated malloc messages]


there are hundreds (if not thousands) of these messages, and the object allocation graph trends steadily upwards. when i disable the portion of the method that updates the center property, the allocations stop and the memory usage goes flat. this also only occurs on the device…the simulator does not exhibit this behavior.

i’m using UIImageView directly, without subclassing, and it contains data from a small PNG file. i’ve patched up several other leaks successfully today, and really wanted to nail this one too.
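
for context, here’s a reconstruction of the pattern described above (not the app’s actual code; ballView, dx and dy are hypothetical ivars). one era-typical thing worth ruling out is autoreleased objects piling up between run loop passes, hence the local pool in the sketch; i don’t yet know whether that’s actually the culprit here.

    // reconstruction of the described pattern, plus a local autorelease pool to rule out
    // autoreleased-object build-up; this is a guess at a mitigation, not a confirmed fix.
    - (void)updateBallPosition:(NSTimer *)timer   // hypothetical driver; could be any rapid callback
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        // rapidly moving a plain UIImageView (no subclass, small PNG contents)
        ballView.center = CGPointMake(ballView.center.x + dx,
                                      ballView.center.y + dy);

        [pool drain];   // pre-ARC idiom; harmless even if it turns out not to be the issue
    }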

(overly simplistic) saving state in oF for iPhone

Friday, June 19th, 2009

There was a recent comment about saving / restoring application state when using openFrameworks for iPhone, which got me thinking about how to do it. Apple’s frameworks provide a fairly thorough way to save state to disk and restore it later. There seem to be three primary ways to do this: simple plist files (usually encoded in binary on the iPhone), archived data (they like to refer to this as freeze-dried object graphs) and Core Data.

I believe that archiving objects requires methods inherited from NSObject, which we don’t have in openFrameworks’ ofSimpleApp. Core Data seems like overkill, so I looked into using plist files.

There are likely better ways to do this, but this ad hoc solution works wonderfully for a small app I’m working on, and it only requires a bit of Objective-C code that could likely be moved up into a nice wrapper class. However, since the question was asked, I’d just like to get it out there before working on a more elegant approach. (more…)
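
As a taste of the plist approach (a minimal sketch of the idea, not necessarily the exact code that follows after the jump; the key names and helper functions are made up): write an NSDictionary to a plist in the app’s Documents directory on teardown and read it back in setup().

    // minimal sketch of plist-based state saving; callable from a .mm openFrameworks app
    #import <Foundation/Foundation.h>

    static NSString *statePath(void)
    {
        NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        return [[dirs objectAtIndex:0] stringByAppendingPathComponent:@"state.plist"];
    }

    // call from the app's exit/teardown path
    void saveState(float x, float y)
    {
        NSDictionary *state = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithFloat:x], @"ballX",
            [NSNumber numberWithFloat:y], @"ballY",
            nil];
        [state writeToFile:statePath() atomically:YES];   // serialized as a plist
    }

    // call from setup(); returns NO if there is no saved state yet
    BOOL restoreState(float *x, float *y)
    {
        NSDictionary *state = [NSDictionary dictionaryWithContentsOfFile:statePath()];
        if (state == nil) return NO;
        *x = [[state objectForKey:@"ballX"] floatValue];
        *y = [[state objectForKey:@"ballY"] floatValue];
        return YES;
    }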