Reputation: 13281
I am in the midst of improving the performance of my map view, which shows thousands of annotations. I'm done with the clustering, and with loading a new set of data each time the user pans the map's camera, and I added a throttle for that to avoid spamming network requests.
My next idea is to remove the existing annotations if they're outside the current camera. Question is: how can I compute whether a given coordinate/annotation is outside or inside the current camera?
Here's a piece of my code:
@objc func throttleDone() {
    let newCoordinatesOfCamera = self.mapView.camera.centerCoordinate
    let newLocationOfCamera = CLLocation(latitude: newCoordinatesOfCamera.latitude, longitude: newCoordinatesOfCamera.longitude)

    // Remove current annotations outside the camera.
    // TODO HERE: ---

    // Fetch new data inside the camera.
    self.fetchNearByEstablishments(newLocationOfCamera)
}

@objc func didDragMap(_ gestureRecognizer: UIPanGestureRecognizer) {
    self.hideBottomInfo()

    // Fetch new annotations with throttle.
    self.timer?.invalidate()
    self.timer = Timer.scheduledTimer(timeInterval: 1.5, target: self, selector: #selector(self.throttleDone), userInfo: nil, repeats: false)
}
I feel like it should be quite easy, but I just couldn't find any answers here on SO or on Google. Thank you!
Upvotes: 0
Views: 290
Reputation: 113747
MKMapView has a region property, which is the visible area of the map. You can check whether the annotation's coordinate is inside or outside that region. There will be plenty of answers for this now that you know the search term. Note that if the camera is tilted, the region will be a little bigger than the actual visible area, since region is always a rectangle while an angled camera will be showing (I think) an isosceles trapezoid.
If your background is Google and their Maps framework, the camera is the usual way of thinking about this, but on iOS the region is the usual abstraction, while the camera is more for tilting and rotating.
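A minimal sketch of the containment test. The `VisibleRegion` struct and its `contains` method are hypothetical names, and plain `Double`s stand in for MapKit types so the math is easy to follow; in real code you would read these four values from `mapView.region.center` and `mapView.region.span`:

```swift
// Sketch: a coordinate is inside the visible region if it lies within
// half a span of the center on each axis (ignoring date-line wraparound).
struct VisibleRegion {
    let centerLatitude: Double
    let centerLongitude: Double
    let latitudeDelta: Double   // full north-south span, in degrees
    let longitudeDelta: Double  // full east-west span, in degrees

    func contains(latitude: Double, longitude: Double) -> Bool {
        abs(latitude - centerLatitude) <= latitudeDelta / 2 &&
        abs(longitude - centerLongitude) <= longitudeDelta / 2
    }
}
```

In `throttleDone()` you could then build a `VisibleRegion` from `mapView.region` and do `mapView.removeAnnotations(mapView.annotations.filter { !visible.contains(latitude: $0.coordinate.latitude, longitude: $0.coordinate.longitude) })`. MapKit also has a built-in equivalent: `mapView.visibleMapRect.contains(MKMapPoint(coordinate))`.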
Upvotes: 1