Reputation: 47364
I'm looking for a way to capture the (X,Y) locations of touches the user makes within my app's screens (relative to the back-most view of a view controller).
Is there a way for me to subclass something (UIResponder?) or add a category on UIView so I can intercept touches and process them, while still letting them reach the content (buttons, gesture recognizers, etc.)?
I was thinking of implementing "touchesBegan:", but in my experience that frequently messes up existing button or gesture recognizer logic.
Upvotes: 0
Views: 2078
Reputation: 11112
This is possible, within limits. There is a sample project on GitHub by Todd Reed; you can take that code and modify it for your needs.
Having a quick look at the code, it keeps a custom UIWindow on top using swizzled methods and renders the touches on that window. It also overrides the sendEvent: method of UIApplication, which forwards events down the view hierarchy.
This is a much more elegant solution than adding a UIGestureRecognizer to each view controller. Many analytics SDKs do the same thing very effectively.
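For reference, a minimal sketch of the sendEvent: approach looks like this (my own illustration, not the linked project's code; the TouchLoggingApplication class name is made up for the example):

#import <UIKit/UIKit.h>

@interface TouchLoggingApplication : UIApplication
@end

@implementation TouchLoggingApplication

- (void)sendEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeTouches) {
        for (UITouch *touch in event.allTouches) {
            if (touch.phase == UITouchPhaseBegan) {
                // Location in the touched window's coordinate space.
                CGPoint point = [touch locationInView:touch.window];
                NSLog(@"touch began at (%f, %f)", point.x, point.y);
            }
        }
    }
    // Always forward the event so buttons and gesture recognizers
    // keep receiving touches as usual.
    [super sendEvent:event];
}

@end

You would then pass the subclass name as the third argument of UIApplicationMain in main.m (e.g. NSStringFromClass([TouchLoggingApplication class])) so UIKit instantiates it instead of the plain UIApplication.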
Upvotes: 1
Reputation: 1879
You could have a base view controller that conforms to UIGestureRecognizerDelegate and adds a gesture recognizer to listen for touches; each of your view controllers then subclasses it.
-(void)viewDidLoad{
    ...
    // A plain UIGestureRecognizer never actually recognizes a gesture, so
    // gestureAction: will not fire; the touches are captured in the
    // delegate callback below instead.
    UIGestureRecognizer *gestureRecognizer = [[UIGestureRecognizer alloc] initWithTarget:self action:@selector(gestureAction:)];
    [gestureRecognizer setEnabled:YES];
    // Don't cancel or delay touches, so buttons and other recognizers
    // keep working as before.
    [gestureRecognizer setCancelsTouchesInView:NO];
    [gestureRecognizer setDelaysTouchesBegan:NO];
    [gestureRecognizer setDelaysTouchesEnded:NO];
    [gestureRecognizer setDelegate:self];
    [self.view addGestureRecognizer:gestureRecognizer];
    ...
}
and then implement the delegate method, which is called for every incoming touch:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch{
    // Log the touch location, then return YES so the touch is still
    // delivered to the views underneath.
    CGPoint location = [touch locationInView:self.view];
    NSLog(@"view touch (%f,%f)", location.x, location.y);
    return YES;
}
Upvotes: 1