Reputation: 529
!!! FULL EDIT !!!
My original question was not well defined and therefore did not address the problem properly. With the help of the already existing answers I did some more testing and edited it.

The task: store 10,000 items in two "operations" and keep track of the Realm file size and the time it takes to finish the whole task.
These items have the following structure:
open class DbObject() : RealmObject() {

    @PrimaryKey
    @Index
    lateinit var id: String
        private set

    var data: ByteArray? = null
        private set

    var downloadedAt: Long = 0L

    var lastUsed: Long? = null

    constructor(
        id: String,
        data: ByteArray? = null,
        downloadedAt: Long
    ) : this() {
        this.id = id
        this.data = data
        this.downloadedAt = downloadedAt
        this.lastUsed = downloadedAt
    }
}
The cleanups in the following code parts delete the oldest Realm entries so that at most 5000 items are kept in the Realm.
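`cleanupDb` itself is not shown in the question. For context, here is a minimal sketch of the eviction logic it would need, written against a plain-Kotlin stand-in so it runs without Realm (`CachedItem`, `idsToEvict`, and `MAX_ITEMS` are hypothetical names of mine, not from the question; the real implementation would presumably run a Realm query sorted ascending on `lastUsed` and delete the oldest results):

```kotlin
// Hypothetical stand-in for the fields cleanupDb would query on DbObject.
data class CachedItem(val id: String, val lastUsed: Long)

const val MAX_ITEMS = 5000

// Pick the ids of the least recently used items so that at most
// maxItems entries remain. A Realm version of this would sort a
// RealmQuery on lastUsed and delete the surplus results in-place.
fun idsToEvict(items: List<CachedItem>, maxItems: Int = MAX_ITEMS): List<String> {
    val excess = items.size - maxItems
    if (excess <= 0) return emptyList()
    return items.sortedBy { it.lastUsed }
        .take(excess)
        .map { it.id }
}
```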
Each item is inserted individually in its own transaction, and a cleanup runs after a set cycle (e.g. every 5 insertions).
fun storeInDb(obj: DbObject) {
    Realm.getInstance(DatabaseConfig.REALM_CONFIG).use { realmInstance ->
        realmInstance.refresh()
        realmInstance.executeTransaction {
            it.copyToRealmOrUpdate(obj)
            cleanupTick = (cleanupTick + 1) % CLEANUP_CYCLE
            if (cleanupTick == 0) {
                cleanupDb(it)
            }
        }
    }
}
All 5000 items are stored in one transaction.
fun storeListInDb(list: List<DbObject>) {
    Realm.getInstance(DatabaseConfig.REALM_CONFIG).use { realmInstance ->
        realmInstance.refresh()
        realmInstance.executeTransaction { realm ->
            list.forEach {
                realm.copyToRealmOrUpdate(it)
            }
        }
    }
}
The items are stored in chunks of 1000 items (the last chunk may be smaller).
fun storeInDb(list: List<DbObject>) {
    Realm.getInstance(DatabaseConfig.REALM_CONFIG).use { realmInstance ->
        realmInstance.refresh()
        var index = 0
        while (list.lastIndex - index > 1000) {
            storeListInDb(realmInstance, list.subList(index, index + 1000))
            index += 1000
        }
        val rest = list.lastIndex - index
        if (rest > 0) {
            storeListInDb(realmInstance, list.subList(index, index + rest + 1))
        }
        realmInstance.executeTransaction {
            cleanupDb(realmInstance)
        }
    }
}
private fun storeListInDb(realmInstance: Realm, list: List<DbObject>) {
    realmInstance.executeTransaction { realm ->
        list.forEach {
            realm.copyToRealmOrUpdate(it)
        }
    }
}
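As a side note, the manual index arithmetic above can also be expressed with the standard library's `List.chunked`, which splits a list into sublists of at most the given size (a sketch; `splitForStorage` is a hypothetical name of mine):

```kotlin
// chunked(chunkSize) yields sublists of exactly chunkSize elements,
// plus a possibly smaller final chunk, covering every element once.
fun <T> splitForStorage(list: List<T>, chunkSize: Int = 1000): List<List<T>> =
    list.chunked(chunkSize)
```

Each returned chunk would then be passed to `storeListInDb(realmInstance, chunk)` in a loop, exactly as above.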
start: 480kb, end: 832kb, timeTaken: 370.622s // 5000 individually (cleanup after 5 insertions)
start: 4608kb, end: 5120kb, timeTaken: 2.704s // 5000 in one transaction (cleanup after whole list was stored)
start: 1664kb, end: 2048kb, timeTaken: 2.519s // 5000 in chunks of 1000 (cleanUp after whole list was stored)
start: Realm size after 5000 insertions
end: Realm size after 10000 insertions
Size: more, smaller transactions keep the Realm file smaller.
Time: bigger transactions reduce the total time (in most cases).
Still, my question now is: why are single-item transactions so (god damn) slow? For 5000 items they are about 148 times slower than batched transactions.
Upvotes: 0
Views: 555
Reputation: 1610
I'm not sure there's a 'question' as such, but I'll point out a couple of things that mean that you are not testing 'time to write N objects' so much as 'doing things N times that create N objects'. There is a difference, and if this forms part of a project (rather than just a test for your interest) then there are ways to speed this up.
A few observations:

- In storeInDb you are calling Realm.getInstance(...) and then closing it again for every single item.
- You call .refresh every time you open the realm.
- Instead of .copyToRealm, just call .copyToRealmOrUpdate.
- Have you considered executeTransactionAsync instead?

As mentioned in the Realm docs, try and make sure you batch operations as much as possible. If you want to add 1000 objects, then add 1000 objects together. As much as trying to keep an abstract view of the database is worthwhile, sometimes the design may need to understand the concept of 'batching' operations.
Upvotes: 1