Reputation: 1334
Building an application using Google's recommended architecture seems like a nice way to achieve separation and modularization. That said, I often stumble upon the fact that, when caching data that comes from an API, a need for different models for the remote & local data sources may arise (I found a comment here by swankjesse which states the same).
Different models look nice, but having complex models with multiple nesting levels seems to be a pain (mapping both local & remote models to a common data-layer entity).
Another issue: when requesting data from the network, the API may respond with a JSON body that wraps the list together with pagination and other metadata (which the ViewModel needs, for example, to load more data). Having a Repository with local & remote data sources then looks kind of broken: the local source responds with a List of objects, while the remote source responds with a class that *contains* the List of objects.
All the sample apps I've seen demonstrate simple POJOs (which in production code is almost never realistic). Any idea on solving this architecture puzzle?
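To make the mismatch concrete, here is a hedged sketch of the two shapes; every name in it (`UserDto`, `PagedResponse`, `Paged`, etc.) is illustrative, not taken from any sample app:

```kotlin
// Illustrative models for the mismatch described above; all names are hypothetical.

// Remote: the API wraps the list in a response object with paging metadata.
data class UserDto(val id: Long, val name: String)
data class PagedResponse(
    val page: Int,
    val totalPages: Int,
    val results: List<UserDto>
)

// Local: the cache (e.g. Room) naturally hands back a plain list.
data class CachedUser(val id: Long, val name: String)

// One option: a domain-level wrapper both sources can be mapped into,
// so the repository exposes a single shape to the ViewModel.
data class Paged<T>(val items: List<T>, val hasMore: Boolean)

fun fromRemote(response: PagedResponse): Paged<UserDto> =
    Paged(response.results, hasMore = response.page < response.totalPages)

fun fromLocal(rows: List<CachedUser>): Paged<CachedUser> =
    Paged(rows, hasMore = false) // the cache alone cannot know if more pages exist
```

With a wrapper like this, the repository decides `hasMore` from whichever source it served, and the ViewModel never sees the raw network envelope.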
Upvotes: 2
Views: 588
Reputation: 6892
I assume you have these modules, each with its own data model: `domain` (with `UserItem`) and `repository`, which contains two data sources: module `remote` (Retrofit, with `UserResponse`) and module `local` (Room, with `UserEntity`). There is also a `UserMapper` inside the `repository` module.
Why do we need 3 different data models to represent a `User`? Because in these 3 modules the data has a different format as well as different annotations. For example, `User` would have a `birthday` field:
`remote`:

```kotlin
class UserResponse(@SerializedName("birthday_date") val birthdayDate: String)
```

`local`:

```kotlin
@Entity(tableName = "users")
class UserEntity(
    @ColumnInfo(name = "birthday") val birthday: Long
)
```

`domain`:

```kotlin
class UserItem(val birthdayDate: Long)
```
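The `UserMapper` could then look something like the sketch below. The API's date format (`"yyyy-MM-dd"`) is an assumption for illustration, and annotation-free stand-ins for the three classes are included so the sketch compiles on its own:

```kotlin
import java.text.SimpleDateFormat
import java.util.Locale
import java.util.TimeZone

// Annotation-free stand-ins for the three models above, so the sketch
// is self-contained (the real classes carry Gson/Room annotations).
class UserResponse(val birthdayDate: String)
class UserEntity(val birthday: Long)
class UserItem(val birthdayDate: Long)

// Minimal UserMapper sketch; the "yyyy-MM-dd" API format is assumed.
class UserMapper {
    private val apiDateFormat = SimpleDateFormat("yyyy-MM-dd", Locale.US).apply {
        timeZone = TimeZone.getTimeZone("UTC")
    }

    // remote -> local: parse the API's string date into epoch millis for Room.
    fun toEntity(response: UserResponse): UserEntity =
        UserEntity(birthday = apiDateFormat.parse(response.birthdayDate)!!.time)

    // local -> domain: the cache already stores the canonical representation.
    fun toItem(entity: UserEntity): UserItem =
        UserItem(birthdayDate = entity.birthday)
}
```

The repository calls `toEntity` when persisting a network response and `toItem` when exposing cached rows to the domain layer, so neither `domain` nor `local` ever sees the API's string format.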
With 3 different data models, you can easily change the data model in `domain` without worrying about breaking changes in `local` or `remote`.
Upvotes: 1