For a first, straightforward attempt, let’s just create a basic Window-based
iOS application. (I hear this template is gone in iOS 5.0; you can get it back from
Big Nerd Ranch here: Window-Based Application.) Drop an MKMapView
on your window and hook it up to your App Delegate. If you want, tell it to
show the user location and zoom in on whatever region.
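The setup might look something like this (a minimal sketch; the `mapView` outlet name and the coordinates are my own assumptions, not from the original project):

```objc
// App Delegate sketch, assuming an MKMapView outlet named mapView
// has been hooked up in the window-based template's nib.
#import <MapKit/MapKit.h>

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.mapView.showsUserLocation = YES;

    // Optionally zoom in on a region; these coordinates are arbitrary.
    MKCoordinateRegion region = MKCoordinateRegionMakeWithDistance(
        CLLocationCoordinate2DMake(35.4, -82.5), 1000, 1000);
    [self.mapView setRegion:region animated:YES];

    [self.window makeKeyAndVisible];
    return YES;
}
```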
If you have Big Nerd Ranch’s iOS Programming, the Whereami
application in Chapter 5 is pretty much the starting point I’m using. There’s
a link at the bottom of the post for a demo project as well.
Next we need a gesture recognizer on the map view, and the method that’ll be
called when the taps are caught. We only want single taps, so let’s try to
filter those out:
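Something along these lines should work (a sketch; the recognizer variable and the exact log format are my own, not the original post’s):

```objc
// Attach a single-tap recognizer to the map view, e.g. in
// application:didFinishLaunchingWithOptions: (pre-ARC, hence the release).
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTap:)];
tap.numberOfTapsRequired = 1;
[self.mapView addGestureRecognizer:tap];
[tap release];

// Log both the window location and the world location of the tap.
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    CGPoint windowPoint = [recognizer locationInView:nil];
    CGPoint mapPoint    = [recognizer locationInView:self.mapView];
    CLLocationCoordinate2D coord =
        [self.mapView convertPoint:mapPoint
              toCoordinateFromView:self.mapView];
    NSLog(@"Tapped window point %@ -> %f, %f",
          NSStringFromCGPoint(windowPoint), coord.latitude, coord.longitude);
}
```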
Build and run, and at first glance, this works just fine. Tap and watch the logs
and it’ll tell you both the window location and world location that your finger
dropped. You can still tap and drag to move the view around and pinch to zoom,
so we haven’t broken anything.
However, try double tapping. The map will still zoom properly, but it will also
call handleTap:, even with the check for the number of touches. Bummer.
UIGestureRecognizers can be set up to require one or more other recognizers to
fail, so let’s give that a try. We’ll add another UITapGestureRecognizer
and set its minimum number of taps to 2, and then require that it fail before
the original recognizer will work.
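That could look like this (assuming the single-tap recognizer from earlier is in a variable named `tap`; the names are my assumption):

```objc
// A do-nothing double-tap recognizer, used only so the single-tap
// recognizer can be made to wait for it to fail.
UITapGestureRecognizer *doubleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:nil action:nil];
doubleTap.numberOfTapsRequired = 2;
[self.mapView addGestureRecognizer:doubleTap];

// The single-tap recognizer now fires only after a double tap is ruled out.
[tap requireGestureRecognizerToFail:doubleTap];
[doubleTap release];
```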
Build and run, and sweet! Now we’re only catching single taps in handleTap:!
However, now double tapping to zoom is broken. It looks like we’re interfering
with MKMapView’s internal gesture recognizers now. Let’s take a look at those.
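One way to inspect them is to just log everything in the map view’s gestureRecognizers array (a sketch, assuming the `mapView` outlet name from earlier):

```objc
// Dump MKMapView's built-in gesture recognizers to the console.
for (UIGestureRecognizer *recognizer in self.mapView.gestureRecognizers) {
    NSLog(@"%@", recognizer);
}
```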
Running that lists the map view’s built-in gesture recognizers in the console:
So there are already UITapGestureRecognizers in there for both single and double taps.
I have no idea what the single tap one does, but the double tap one probably handles
zooming. Maybe we can just hijack that one.
So rip out the double tap recognizer we added, and dig through the built-in recognizers
to find the double tap one and rely on that one instead.
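Something like this might do it (again a sketch, with `tap` as the assumed name of our single-tap recognizer):

```objc
// Find MKMapView's own double-tap recognizer and defer to it,
// instead of adding a second double-tap recognizer of our own.
for (UIGestureRecognizer *recognizer in self.mapView.gestureRecognizers) {
    if ([recognizer isKindOfClass:[UITapGestureRecognizer class]] &&
        ((UITapGestureRecognizer *)recognizer).numberOfTapsRequired == 2) {
        // Let the map's built-in double-tap zoom win before ours fires.
        [tap requireGestureRecognizerToFail:recognizer];
    }
}
```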
Build and run, and bingo. A single tap on the map will report the tap
location in the Debug console, and a double tap will still zoom properly.
I have only tested this in the simulator so far, as I’m cheap and
haven’t ponied up the cash for a developer license yet. So this
might not work in the wild.
Update: I figured out what that single tap gesture recognizer is in there for.
MKMapView uses it to catch when you tap on annotations.
Unfortunately, adding your own single tap recognizer causes the built
in one to no longer get called, breaking that functionality. So if you
need annotations, this is probably not a good idea.