DaneEdw

Reputation: 446

foreach large select slowing down knockoutjs databinding

My Problem

I have a table that renders the same select, containing hundreds of options, for every data item retrieved. I retrieve the data for the select and store it in a JavaScript variable before data-binding. The problem is that because there are so many options, populating a data set of 200+ items takes 10+ seconds.

What I have done to determine this was the problem

I have found that removing the selects makes the data-binding run very quickly, so I'm certain this is my issue.

My Question

What else can I do to speed this process up?

My Implementation

<table id="reportList">
    <tbody data-bind='foreach: reportList'>
       <tr>
          <td>
              <select class="itemSelect" data-bind="options: $root.selectItemIDOptions,
                                                    value: ItemID, optionsValue: 'ItemID',
                                                    optionsText: 'SupplierItemID',
                                                    optionsCaption: 'Select Item'"></select>
          </td>
          <td data-bind="text: Description"></td>
       </tr>
    </tbody>
</table>

My View Model

function ReportViewModel(reportData) {

  var self = this;

  // GlobalItemList already has all of the select options at this point, ready for data-binding
  self.selectItemIDOptions = GlobalItemList;
  self.reportList = ko.observableArray();

  var Shrinkage = reportData.ShrinkageList;

  // Push into the underlying array directly, then notify once,
  // so subscribers aren't triggered on every push
  var rowArr = self.reportList();
  for (var i = 0; i < Shrinkage.length; i++) {
    rowArr.push(new ReportRow(Shrinkage[i].ItemID, Shrinkage[i].Description));
  }

  self.reportList.valueHasMutated();
}

Row class

function ReportRow(ItemID, Description) {

  var self = this;
  self.ItemID = ko.observable(ItemID);
  self.Description = ko.observable(Description);
}

Upvotes: 0

Views: 489

Answers (1)

ryan1234

Reputation: 7275

My recommendation would be not to try to fix or debug Knockout's handling of large data sets, but instead to refactor the data you are sending.

What if you took ~10 fields per record (instead of 100) for the first data call, and then provided a "Get Details" link so the user can retrieve the other 90 fields on a row-by-row basis?

That would load the original list very quickly and each subsequent call for more detail would also be fast.
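The summary-plus-lazy-details split described above could be sketched like this. It is plain JavaScript (no Knockout dependency so it can run standalone), and `loadDetails` is a hypothetical function you would supply to make the per-row server call:

```javascript
// Sketch: each row starts with only its summary fields; the remaining
// fields are fetched on demand and cached the first time they are asked for.
function ReportRow(summary, loadDetails) {
  this.summary = summary;       // e.g. { ItemID: 7, Description: "Widget" }
  this.details = null;          // the other ~90 fields, fetched lazily
  this._loadDetails = loadDetails;
}

// Called from a "Get Details" link; hits the server at most once per row.
ReportRow.prototype.getDetails = function () {
  if (this.details === null) {
    this.details = this._loadDetails(this.summary.ItemID);
  }
  return this.details;
};
```

In a Knockout view model, `details` would typically be an observable so the details panel updates when the call returns, but the caching idea is the same.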

Upvotes: 1
