Reputation: 1266
I am creating an iOS application with a Ruby on Rails backend using ActiveRecord and MySQL.
The application can post various types of media (images, GIFs, videos).
Any user is allowed to post, and posts whose boolean flag is_private is false will show up in a global feed visible in the application (think Facebook news feed).
For this global feed I need to build pagination and pull-to-refresh.
On the application side I have built the network services and models, but I need to pin down the logic for how to fetch data for the global feed.
My questions are:
- What is the common structure for backend/application communication? So far I have a method which gets the initial page and a method which gets the next page, starting at the last current item.
- How do I deal with the fact that more items have been inserted at the head (top) of the data source while the user has been scrolling, causing issues with continuity?
Upvotes: 1
Views: 164
Reputation: 76774
Interesting question!
The best information I have pertains to how to "receive" data from the backend in "real time". I'm not sure how you'll handle the scrolling mechanism on the client side.
--
Live
The "live" functionality of the system is basically handled by passing data through either an SSE or a WebSocket (an asynchronous connection), to make it appear as if your application is operating in "real time".
In reality, "live" applications are nothing more than ones which are constantly "listening" to the server - allowing you to take any data it sends and put it on the page with JS.
If you wanted to keep the "feed" up to date perpetually, I'd recommend using either of these technologies:
SSE's
The most elemental way of doing this is to use Server-Sent Events - an HTML5 technology which basically allows you to pass data from your server to your DOM using the text/event-stream content-type:
#app/assets/javascripts/application.js
var source = new EventSource("your_end_point");
source.onmessage = function(event) {
    // event.data contains the payload sent by the server
};
This is complemented on the Rails side with an ActionController::Live::SSE controller:
class MyController < ActionController::Base
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'text/event-stream'
    sse = SSE.new(response.stream, retry: 300, event: "event-name")
    sse.write({ name: 'John' })
    sse.write({ name: 'John' }, id: 10)
    sse.write({ name: 'John' }, id: 10, event: "other-event")
    sse.write({ name: 'John' }, id: 10, event: "other-event", retry: 500)
  ensure
    sse.close
  end
end
The problem with SSEs is that they are one-directional (server to client only), and each open stream ties up a server connection; if the connection drops, the browser reconnects automatically, which can degrade into something resembling AJAX long-polling. I don't like this
--
Websockets
Websockets are the "right" way to connect & receive data in "real time":
They basically allow you to open a perpetual connection between your front-end and your server, meaning you won't have to send constant requests to your server. I don't have much experience with websockets, but I do with Pusher
--
Pusher
I'd highly recommend Pusher - it's a third-party websocket service (I am not affiliated with them in any way).
Simply put, it allows you to send updates to the Pusher service and read them on your system. It takes all the hassle out of providing connectivity for your own websocket app.
You can read up on how it works, as well as studying the pusher gem.
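As a rough sketch (not from the original answer), publishing a new feed item through the pusher gem looks roughly like this. The channel name, event name, credentials, and payload fields are all illustrative placeholders; the gem is loaded defensively so the sketch runs even where it isn't installed:

```ruby
# Sketch of broadcasting a newly created post with the `pusher` gem.
# 'global-feed', 'new-post', the credentials, and the payload fields
# are placeholders, not real values.
begin
  require 'pusher'   # third-party gem: gem install pusher
  pusher_available = true
rescue LoadError
  pusher_available = false
end

# The data your backend would broadcast when a public post is created.
payload = { id: 20, media_type: 'image', is_private: false }

if pusher_available
  client = Pusher::Client.new(app_id: 'app-id', key: 'key', secret: 'secret')
  # Uncomment with real credentials to actually broadcast:
  # client.trigger('global-feed', 'new-post', payload)
end
```

On the client you would then subscribe to the global-feed channel and prepend incoming new-post items to the head of the feed.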
Upvotes: 1
Reputation: 1615
Pagination with write consistency is harder than blind pagination. The basics of pagination are that you want to load an initial set and then be able to go down (typically back in time) from the point of the last fetch.
Two types of pagination:
Fetch the head of the list and store the last id in the list returned from the server. On the next request (when the user scrolls to the bottom of the list), send that last-seen id and filter to the next m items after that last-seen id.
response = [12, 11, 10, 9, 8]  # initial page, newest first
last_id = response.last        # 8
response = [7, 6, 5, 4]        # next page: items after (older than) id 8
# meanwhile, new items may have arrived at the head of the feed:
# [19, 18, 17, 16, 15]
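The cursor scheme above, plus the asker's "new items at the head" problem, can be sketched in plain Ruby. The helper names (next_page, new_items) and the per_page size are my own assumptions; arrays stand in for the posts table, where a Rails backend would use queries like Post.where('id < ?', last_id):

```ruby
# Cursor ("last id") pagination over a feed sorted newest-first.
# Arrays stand in for the posts table; in Rails the equivalent filters
# would be Post.where('id < ?', last_id) and Post.where('id > ?', since_id).
old_feed = (1..12).to_a.reverse  # ids at first load: [12, 11, ..., 1]
new_feed = (1..19).to_a.reverse  # same feed after 7 new posts arrive at the head

# Next page: the per_page items strictly older than last_id
# (a nil last_id means "give me the head of the feed").
def next_page(feed, last_id: nil, per_page: 5)
  older = last_id ? feed.select { |id| id < last_id } : feed
  older.first(per_page)
end

# Pull-to-refresh: everything newer than the newest id the client has seen.
def new_items(feed, since_id:)
  feed.select { |id| id > since_id }
end

page1 = next_page(old_feed)                            # => [12, 11, 10, 9, 8]
page2 = next_page(old_feed, last_id: page1.last)       # => [7, 6, 5, 4, 3]

# Because the cursor is an id rather than an offset, inserts at the head
# don't shift the next page - continuity is preserved:
page2_after = next_page(new_feed, last_id: page1.last) # => [7, 6, 5, 4, 3]

# A refresh picks up the new head separately:
fresh = new_items(new_feed, since_id: 12)              # => [19, 18, 17, 16, 15, 14, 13]
```

This is why id-based (cursor) pagination is usually preferred over offset-based pagination for feeds: with OFFSET, items prepended while the user scrolls would shift every subsequent page and cause duplicates.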
Upvotes: 1