Reputation: 5955
I use a UIWebView to display an HTML page that contains some HTML image objects, and the user can interact with these objects (dragging, scaling, rotating, etc.). Now I want to implement a feature that lets the user make annotations on that page by drawing lines. When the user starts annotating, interaction with the objects in the HTML page should be disabled; it should be re-enabled once the annotations are done.
What I have considered so far:
Solution 1: Inject jQuery code into the existing HTML structure to allow drawing lines.
Solution 2: Create a UIView on top of the UIWebView and draw the lines on that UIView.
I prefer the second solution, because the drawing part and the HTML page stay independent of each other. However, I face the following challenges:
1. If I make the UIView that contains the drawn lines transparent (view.alpha = 0), all of its subviews (the lines) become transparent as well (see the snippet below).
2. How can I send the touch events of the top UIView down to the UIWebView located under it?
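To illustrate challenge 1, this is roughly the setup I mean (a minimal, hypothetical snippet; overlay is the view that will hold the line subviews):

```objc
// Overlay placed on top of the web view to hold the drawn lines.
UIView *overlay = [[UIView alloc] initWithFrame:self.webView.frame];
overlay.alpha = 0.0; // hides the whole hierarchy, including the lines
[self.view addSubview:overlay];
```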
I would appreciate any recommendations.
Upvotes: 0
Views: 757
Reputation: 17866
As you say, you have two strategies: handle the drawing part either in HTML, or in a view on top of the UIWebView on the Objective-C side. Both are valid approaches, and the decision will depend on the circumstances of your app.
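If you go the overlay route, here is a minimal sketch (the AnnotationView class name and the annotating flag are my own inventions, not an established API). It uses a clear background rather than alpha = 0, so the lines stay visible while the page shows through, and it overrides pointInside: so touches fall through to the UIWebView whenever annotation mode is off:

```objc
#import <UIKit/UIKit.h>

// Hypothetical overlay view; sits on top of the UIWebView.
@interface AnnotationView : UIView
@property (nonatomic, assign, getter=isAnnotating) BOOL annotating;
@property (nonatomic, strong) UIBezierPath *path;
@end

@implementation AnnotationView

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        // Transparent background instead of alpha = 0, so the drawn
        // lines stay visible while the web page shows through.
        self.backgroundColor = [UIColor clearColor];
        self.opaque = NO;
        _path = [UIBezierPath bezierPath];
        _path.lineWidth = 3.0;
    }
    return self;
}

// When not annotating, claim we don't contain the point; UIKit's
// hit-testing then delivers the touch to the UIWebView underneath.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return self.isAnnotating ? [super pointInside:point withEvent:event] : NO;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.path moveToPoint:[[touches anyObject] locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.path addLineToPoint:[[touches anyObject] locationInView:self]];
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    [[UIColor redColor] setStroke];
    [self.path stroke];
}

@end
```

Toggling annotating is then exactly the enable/disable switch you describe: while it is NO, every touch falls through to the UIWebView (assuming overlay and web view are siblings in the same superview).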
It is possible to communicate between UIWebView JavaScript and your Objective-C code in both directions using the UIWebView bridging capabilities; see, for example, Using JavaScript From Objective-C. To communicate the touches, invent a data structure that captures the needed info and ship the touches over the bridge from UIKit to JavaScript.
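A sketch of both directions, assuming the page defines a hypothetical handleNativeTouch(x, y, phase) function, and that a made-up myapp:// URL scheme is reserved for messages going the other way:

```objc
// Objective-C -> JavaScript: evaluate a JS call in the page.
// handleNativeTouch is a hypothetical function the HTML must define.
- (void)forwardTouchAtPoint:(CGPoint)point phase:(NSString *)phase {
    NSString *js = [NSString stringWithFormat:
        @"handleNativeTouch(%f, %f, '%@');", point.x, point.y, phase];
    [self.webView stringByEvaluatingJavaScriptFromString:js];
}

// JavaScript -> Objective-C: the page triggers a navigation to the
// custom scheme (e.g. window.location = 'myapp://lineDone') and the
// UIWebViewDelegate intercepts it before it loads.
- (BOOL)webView:(UIWebView *)webView
        shouldStartLoadWithRequest:(NSURLRequest *)request
        navigationType:(UIWebViewNavigationType)navigationType {
    if ([request.URL.scheme isEqualToString:@"myapp"]) {
        // Parse request.URL here and act on the message natively.
        return NO; // swallow the fake navigation
    }
    return YES;
}
```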
Upvotes: 0