Reputation: 4209
I'm trying to find out when the user scrolls to the end of an NSTableView. As usual, the table view is embedded in an NSClipView, which in turn is embedded in an NSScrollView. There are no other changes, except that I added content insets to the clip view (top: 20, left/right: 20, bottom: 0).
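(Just for completeness: how the insets are set doesn't matter for the question, but in code the same values would look roughly like this, assuming the clipView outlet mentioned below.)
clipView.automaticallyAdjustsContentInsets = false
clipView.contentInsets = NSEdgeInsets(top: 20, left: 20, bottom: 0, right: 20)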
To get notified about the scrolling, I connected an outlet to the NSClipView.
So I use this code:
self.clipView.postsBoundsChangedNotifications = true
NotificationCenter.default.addObserver(self,
                                       selector: #selector(scrollViewDidScroll(notification:)),
                                       name: NSView.boundsDidChangeNotification,
                                       object: self.clipView)
...
@objc func scrollViewDidScroll(notification: Notification) {
    print("\(self.clipView.contentInsets.bottom) - \(self.clipView.contentInsets.top)")
    print(self.clipView.bounds)
}
If I'm at the top of the scroll view, the print results look like this:
0.0 - 20.0
(-20.0, -20.0, 600.0, 556.0)
// x y width height
This seems OK to me. But if I'm at the bottom, it looks like this:
0.0 - 20.0
(-20.0, 595.0, 600.0, 556.0)
In my understanding, the y-value should be 536, not 595. Where is this difference coming from?
Upvotes: 1
Views: 475
Reputation: 4209
I found the solution on my own by observing the different values, especially after resizing the window. So:
clipView.documentVisibleRect
is what you can actually see. Of course I can't rely on the height of this rect alone; its origin tells me the current position within the scroll view.
What's also needed is the height of the total content, so I have to use clipView.documentRect.
Now it's very easy: if the visible rect's y position plus the height of the visible rect equals the height of the total document, we are at the end. This code works:
@objc func scrollViewDidScroll(notification: Notification) {
    let scrollY = self.clipView.documentVisibleRect.origin.y
    let visibleHeight = self.clipView.documentVisibleRect.size.height
    let totalHeight = self.clipView.documentRect.size.height
    if scrollY + visibleHeight >= totalHeight {
        print("end")
    }
}
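Putting both parts together (registering for the notification from the question plus the check above), a minimal self-contained version could look like this. The class name, doing the setup in viewDidLoad and the deinit cleanup are just one possible arrangement, not a requirement:
import Cocoa

class TableViewController: NSViewController {

    // Outlet to the NSClipView of the enclosing scroll view,
    // connected in Interface Builder as described in the question.
    @IBOutlet weak var clipView: NSClipView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Ask the clip view to post bounds-change notifications while scrolling.
        clipView.postsBoundsChangedNotifications = true
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(scrollViewDidScroll(notification:)),
                                               name: NSView.boundsDidChangeNotification,
                                               object: clipView)
    }

    deinit {
        // Balance the registration above.
        NotificationCenter.default.removeObserver(self)
    }

    @objc func scrollViewDidScroll(notification: Notification) {
        // documentVisibleRect is the part of the document that is currently visible,
        // documentRect is the whole document. When the bottom edge of the visible
        // rect reaches the bottom of the document, we are at the end.
        let scrollY = clipView.documentVisibleRect.origin.y
        let visibleHeight = clipView.documentVisibleRect.size.height
        let totalHeight = clipView.documentRect.size.height
        if scrollY + visibleHeight >= totalHeight {
            print("end")
        }
    }
}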
Upvotes: 1