Reputation: 2609
I am trying to build an app that detects when a user puts their lips on the screen. How could I get an image of their lips when they put them on the screen?
I know how to detect finger touches; would this be similar to that?
Edit: Also, wondering about doing this for Android.
Upvotes: 0
Views: 370
Reputation: 58107
As others have posted here, it is not possible to detect all of the contact points between the lips and the screen. Yes, it would be similar to finger touches, but more difficult. The iPhone screen is capacitive, so it should register lips, which behave much like fingertips. (When I'm outside in the cold and need to scroll with gloves on, I've used my chin.)
As far as the shape goes, you may be able to map multiple touches, but touches don't carry any particular shape, so you won't be able to draw the lips from them. You may want to consider preloading a generic picture of lips and then resizing it to match the touched area, based on where the touch inputs land.
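A minimal sketch of that idea, assuming a UIView subclass and a hypothetical "generic_lips" image asset: collect the active touches, compute their bounding box, and stretch the preloaded lip image over it.

```swift
import UIKit

class LipDetectionView: UIView {

    // Hypothetical overlay holding a preloaded, generic lip image.
    private let lipsImageView = UIImageView(image: UIImage(named: "generic_lips"))

    private var activeTouches = Set<UITouch>()

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true   // views are single-touch by default
        lipsImageView.isHidden = true
        addSubview(lipsImageView)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.formUnion(touches)
        updateOverlay()
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateOverlay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.subtract(touches)
        updateOverlay()
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.subtract(touches)
        updateOverlay()
    }

    private func updateOverlay() {
        // Wait for a few simultaneous contact points before guessing at a lip shape.
        guard activeTouches.count >= 3 else {
            lipsImageView.isHidden = true
            return
        }
        // Bounding box of all current touch locations.
        let points = activeTouches.map { $0.location(in: self) }
        let minX = points.map(\.x).min()!
        let maxX = points.map(\.x).max()!
        let minY = points.map(\.y).min()!
        let maxY = points.map(\.y).max()!
        // Stretch the generic lip image to cover the touched region.
        lipsImageView.frame = CGRect(x: minX, y: minY,
                                     width: maxX - minX, height: maxY - minY)
        lipsImageView.isHidden = false
    }
}
```

This only approximates the size and position of the contact area; it can't recover the actual outline of the lips.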
Upvotes: 1
Reputation: 6138
I believe the iPhone hardware supports somewhere between 10 and 20 touch points, but the SDK only gives you access to 5. To get a good outline of the lips, you'd need far more touch points than that.
Could you use the front-facing camera instead and avoid the obvious hygiene issues?!
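If the front camera route is acceptable, a rough AVFoundation sketch (assuming a camera usage description has been added to Info.plist) might look like this:

```swift
import AVFoundation
import UIKit

final class FrontCameraViewController: UIViewController {

    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Pick the front-facing camera; bail out quietly if it's unavailable.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Show a live preview so the user can line their lips up with the screen.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        // Starting the session is blocking, so do it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```

From there you could attach an AVCaptureVideoDataOutput or AVCapturePhotoOutput to grab frames of the lips rather than relying on touch input at all.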
Upvotes: 5
Reputation: 25328
You can't get the shape of the touches, so you wouldn't be able to detect lips on the screen.
Upvotes: 4