Rohan Panchal

Reputation: 1266

Implementing a Feed from data source

I am creating an iOS application with a Ruby on Rails backend using ActiveRecord and MySQL.

The application is able to post various types of media (images, GIFs, videos).

Any user is allowed to post, and posts that have the boolean flag is_private set to false will show up in a global feed visible in the application (think Facebook news feed).

For this global feed I need to build pagination and pull-to-refresh.

On the application side I have built the network services and models, but I need to pin down the logic for how to fetch data for the global feed.

My questions are:

- What is the common structure for backend/application communication? So far I have a method which gets the initial page and a method which gets the next page, starting at the last current item.

- How do I deal with the fact that more items have been inserted at the head (top) of the data source while the user has been scrolling, causing issues with continuity?

Upvotes: 1

Views: 164

Answers (2)

Richard Peck

Reputation: 76774

Interesting question!

The best information I have pertains to how to "receive" data from the backend in "real time". I'm not sure how you'll handle the client-side scrolling mechanism.

--

Live

The "live" functionality of the system is basically handled by passing data through either an SSE or a WebSocket (an asynchronous connection), making it appear as if your application is operating in "real time".

In reality, "live" applications are nothing more than those which are constantly "listening" to the server, allowing you to take any data it sends and put it on the page with JS.

If you wanted to keep the "feed" up to date perpetually, I'd recommend using either of these technologies:


SSEs

The most elemental way of performing this is to use Server-Sent Events - an HTML5 technology which basically allows you to pass data from your server to your DOM using the text/event-stream content type.


This is what's considered a "native" way of handling updates from the server:

// app/assets/javascripts/application.js
var source = new EventSource("your_end_point");
source.onmessage = function(event) {
    // event.data holds the payload sent by the server
};

This is complemented on the Rails side by the ActionController::Live::SSE class:

class MyController < ActionController::Base
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'text/event-stream'
    sse = SSE.new(response.stream, retry: 300, event: "event-name")
    sse.write({ name: 'John' })                                       # uses the defaults set above
    sse.write({ name: 'John' }, id: 10)                               # with an explicit event id
    sse.write({ name: 'John' }, id: 10, event: "other-event")         # overriding the event name
    sse.write({ name: 'John' }, id: 10, event: "other-event", retry: 500)
  ensure
    sse.close
  end
end
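For completeness, the streaming action above has to be reachable at the URL the EventSource points at; a hypothetical route (the path name is just a placeholder matching the JS snippet) could be:

```ruby
# config/routes.rb -- hypothetical route matching the "your_end_point"
# URL used by the EventSource on the JS side.
get "your_end_point", to: "my#index"
```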

The problem with SSEs is that the channel is one-directional (server to client only), and if the connection drops, the browser keeps re-requesting the endpoint - in practice much like AJAX long-polling. I don't like this.

--

Websockets

Websockets are the "right" way to connect & receive data in "real time":


They basically allow you to open a persistent connection between your front end and your server, meaning you won't have to send constant requests to your server. I don't have much experience with raw websockets, but I do with Pusher.

--

Pusher

I'd highly recommend Pusher - it's a third-party websocket service (I am not affiliated with them in any way).

Simply put, it allows you to send updates to the Pusher service and read them on your system. It takes all the hassle out of having to provide connectivity for your own websocket app.

You can read up on how it works, as well as study the pusher gem.
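To make that concrete, a minimal server-side sketch with the pusher gem might look like this (the credentials, channel name, and event name are all made-up placeholders, not anything from your app):

```ruby
require "pusher"  # gem "pusher" in your Gemfile

# Hypothetical credentials -- in a real app these come from your
# Pusher dashboard, ideally via environment variables.
pusher = Pusher::Client.new(
  app_id:  ENV["PUSHER_APP_ID"],
  key:     ENV["PUSHER_KEY"],
  secret:  ENV["PUSHER_SECRET"],
  cluster: ENV["PUSHER_CLUSTER"]
)

# e.g. from an after_create callback on your Post model: push the new
# post onto a "global-feed" channel that every client subscribes to.
pusher.trigger("global-feed", "new-post", { id: 42, media_type: "image" })
```

Your iOS client would then subscribe to the same channel with one of Pusher's client libraries and prepend incoming posts to the feed.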

Upvotes: 1

nort

Reputation: 1615

Pagination with write consistency is harder than blind pagination. The basics of pagination are that you want to load an initial set and then be able to go down (typically back in time) from the point of the last fetch.

Two types of pagination:

  1. Fetch the top of the data source, then fetch the next page, and then the next. The problem with this approach is that when items are inserted at the top of the data source, your definition of page 2 shifts by n items (where n is the number of inserts since the last fetch).
  2. Fetch the head of the list and store the last id in the list returned from the server. On the next request (when the user scrolls to the bottom of the list), send that last-seen id and filter to the next m items after it.

    first request (GET items.json) returns

    response = [12,11,10,9,8]

    store the last id

    last_id = response.last

    send it with the next request (GET items.json?last_id=8)

    response = [7,6,5,4]

    and so on down the list

    pull to refresh sends a request (GET items.json) to fetch the head of the list

    [19,18,17,16,15]

    then make another request (GET items.json?last_id=15) to fill in the gaps between 15 and 12
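The steps above can be sketched in plain Ruby, with an array standing in for the database; on the Rails side the equivalent filter would be something like `Item.where("id < ?", last_id).order(id: :desc).limit(m)` (model and column names assumed):

```ruby
PAGE_SIZE = 5

# Hypothetical helper: ids is the full, ever-growing data source;
# we always serve newest (highest id) first.
def fetch_page(ids, last_id: nil)
  scope = ids.sort.reverse                           # newest first
  scope = scope.select { |id| id < last_id } if last_id
  scope.first(PAGE_SIZE)
end

ids = (1..12).to_a
page1 = fetch_page(ids)               # => [12, 11, 10, 9, 8]
page2 = fetch_page(ids, last_id: 8)   # => [7, 6, 5, 4, 3]

# Writers insert at the head while the user is scrolling:
ids += (13..19).to_a

refreshed = fetch_page(ids)             # => [19, 18, 17, 16, 15]
gap = fetch_page(ids, last_id: 15)      # => [14, 13, 12, 11, 10]
# The client merges gap pages until it reaches an id it already has (12).
```

Because each page is anchored to an id rather than an offset, inserts at the head never shift the meaning of "the next page".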

Upvotes: 1
