Douglas Hahn

Reputation: 61

UKG Dimensions Number of employees in request (624) exceeds allowed limit (500)

This error occurs when you are requesting aggregated data using the URL:

POST - {{DIMENSIONSHOST}}/v1/commons/data/multi_read

The Postman body that I used was:

{
  "select": [
    {"key": "EMP_COMMON_FULL_NAME"}
  ],
  "from": {
    "view": "EMP",
    "employeeSet": {
        "hyperfind": {
            "id": -9
        },
        "dateRange": {
            "startDate": "2022-01-01",
            "endDate": "2022-04-30"
      }
    }
  },
  "index": 0,
  "count": 500
}

Notice that I requested "count": 500. Even though I asked for only 500 records, the request still fails: the hyperfind itself resolves to 624 employees, and the error is raised on that total regardless of the count requested.

Upvotes: 2

Views: 339

Answers (1)

Douglas Hahn

Reputation: 61

This looks like a bug in UKG Dimensions: the 500-record limit is applied to the size of the hyperfind population, not to the requested count. I developed a workaround:

  1. Retrieve the hyperfind by itself using /v1/commons/hyperfind/execute.
  2. Use a Postman test script (a post-response script) to split the IDs into batches of 500.
  3. Save the batches to environment variables.
  4. Use an environment variable in each aggregated data request.
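Step 2 boils down to plain array chunking. As a standalone sketch (the IDs here are made up, and `chunk` is my own helper, not part of any API):

```javascript
//split an array of IDs into batches of at most `size` elements
function chunk(ids, size) {
    var batches = [];
    for (var i = 0; i < ids.length; i += size) {
        batches.push(ids.slice(i, i + size));
    }
    return batches;
}

//624 fake IDs, mirroring the employee count in the error message
var ids = [];
for (var n = 1; n <= 624; n++) {
    ids.push(n);
}
var batches = chunk(ids, 500);
console.log(batches.length);    // 2
console.log(batches[1].length); // 124
```

With 624 employees you get two batches: one of 500 and one of 124.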

You can retrieve the hyperfind by itself using:

POST - {{DIMENSIONSHOST}}/v1/commons/hyperfind/execute

The body of the request is:

{
    "dateRange": {
        "startDate": "2022-05-01",
        "endDate": "2022-06-30"
    },
    "hyperfind": {
        "id": -9
    },
    "includeTerminatedInRangeForLocations": true
}
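The matching employees come back under result.refs; the test script below relies only on the result.refs[].id path. As an illustration with a made-up payload (the qualifier field and the ID values are invented for the example):

```javascript
//illustrative response shape; only result.refs[].id is relied on by the script
var responseBody = JSON.stringify({
    result: {
        refs: [
            { id: 101, qualifier: "10101" },
            { id: 102, qualifier: "10102" }
        ]
    }
});
var jsonData = JSON.parse(responseBody);
var allIDs = jsonData.result.refs.map(function (ref) { return ref.id; });
console.log(allIDs); // [ 101, 102 ]
```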

The test script is:

var jsonData = JSON.parse(responseBody); //the data from the response body
var allIDs = [];  //an array of all the IDs, no record count limit
var max500IDs = []; //arrays of maximum number of IDs
//
//retrieve all the IDs and put them into an array called allIDs
//
for(var i = 0; i < jsonData.result.refs.length; i++) {
    allIDs.push(jsonData.result.refs[i].id );
}
var batchCount = 1; //number of batches - default 1
var IDsInBatch = 500;  //maximum number of records in batch
//
//calculate the number of batches that you will need
//
if(allIDs.length > IDsInBatch) {
    batchCount = Math.ceil(allIDs.length / IDsInBatch); //round up so a partial batch still counts
}
//
//loop through the number of batches
//
var eeCountInOtherBatches = 0;
for(var k = 0; k < batchCount; k++) {
    //
    //loop through all the IDs and transfer them to a max 500 batch
    //
    var batch = [];
    for(var j = 0; j < IDsInBatch; j++) {
        var personID = allIDs[eeCountInOtherBatches + j];
        if(personID !== undefined) { //stop copying once past the end of allIDs
            batch.push(personID);
        }
    }
    max500IDs[k] = batch;
    eeCountInOtherBatches = eeCountInOtherBatches + IDsInBatch;
}
//
//transfer the batches to environment variable(s)
//
for(var x = 0; x < max500IDs.length; x++) {
    //store each batch as a comma-separated string, e.g. "101,102,103"
    pm.environment.set("max500IDs_" + x, max500IDs[x].join(","));
}

The environment variables will be:

max500IDs_0
max500IDs_1
etc.
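Environment variables are stored as strings, so each batch ends up as a comma-separated list of IDs. That string form is what makes the `[{{max500IDs_0}}]` substitution in the request body below produce valid JSON. A quick illustration (the IDs are made up):

```javascript
var batch = [101, 102, 103];  //made-up person IDs
var stored = batch.join(","); //what ends up in the environment variable
console.log(stored);                         // 101,102,103
console.log(JSON.parse("[" + stored + "]")); // [ 101, 102, 103 ]
```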

The employee request would be something like:

POST - {{DIMENSIONSHOST}}/v1/commons/data/multi_read

The body would be:

{
  "select": [
    {"key": "EMP_COMMON_FULL_NAME"}
  ],
  "from": {
    "view": "EMP",
    "employeeSet": {
        "employees": {
            "ids": [{{max500IDs_0}}]
        },
        "dateRange": {
            "startDate": "2022-01-01",
            "endDate": "2022-04-30"
      }
    }
  },
  "index": 0,
  "count": 500
}
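To fetch everyone, you run one multi_read per batch, substituting max500IDs_0, max500IDs_1, and so on. The body construction can be sketched as below; the buildBody helper and the sample ID string are mine for illustration, not part of the Dimensions API:

```javascript
//build a multi_read body for one comma-separated batch of IDs
function buildBody(idsCsv) {
    return {
        select: [{ key: "EMP_COMMON_FULL_NAME" }],
        from: {
            view: "EMP",
            employeeSet: {
                employees: { ids: JSON.parse("[" + idsCsv + "]") },
                dateRange: { startDate: "2022-01-01", endDate: "2022-04-30" }
            }
        },
        index: 0,
        count: 500
    };
}

//e.g. the value stored in max500IDs_0
var body = buildBody("101,102,103");
console.log(body.from.employeeSet.employees.ids); // [ 101, 102, 103 ]
```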

Upvotes: 0
