Reputation: 184
I'm trying to return an array of a custom type (struct), MyStruct[], of arbitrary length to my Chainlink consumer contract, and I'm having trouble putting together the appropriate job config (TOML) for it.
I'm aware of the gas concerns here, but would like to ignore this for now, as I'll be handling this via separate circuit breakers in the API endpoint.
I have a consumer contract that looks similar to this (pseudo-code):
//SPDX-License-Identifier: MIT
pragma solidity ^0.8.7;

import "@chainlink/contracts/src/v0.8/ChainlinkClient.sol";
import "@chainlink/contracts/src/v0.8/ConfirmedOwner.sol";

contract ConsumerContract is ChainlinkClient, ConfirmedOwner {
    . . .

    struct MyStruct {
        uint a;
        uint b;
        string x;
    }

    MyStruct[] public result;

    function request() public {
        Chainlink.Request memory req = buildOperatorRequest(jobId, this.fulfill.selector);
        // (add parameters)
        // Send the request to the Chainlink oracle
        sendOperatorRequest(req, fee);
    }

    function fulfill(bytes32 requestId, bytes[] memory _result) public recordChainlinkFulfillment(requestId) {
        emit RequestFulfilled(requestId, _result);
        // Process the response
        for (uint i = 0; i < _result.length; i++) {
            MyStruct memory myStruct = abi.decode(_result[i], (MyStruct));
            result.push(myStruct);
        }
    }

    . . .
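For clarity on what fulfill() expects: abi.decode(_result[i], (MyStruct)) only succeeds if each bytes element is the ABI encoding of the struct's tuple, (uint256, uint256, string). A minimal illustrative sketch (hypothetical values, a fragment of a function body, not part of my actual contract):

    // Hypothetical illustration only: each bytes element must be abi.encode
    // of the tuple (uint256, uint256, string) for the decode in fulfill() to work.
    bytes memory encoded = abi.encode(uint256(0), uint256(1), "myString-1");
    MyStruct memory s = abi.decode(encoded, (MyStruct)); // s.a == 0, s.b == 1, s.x == "myString-1"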
Example response from API endpoint (JSON representation of an unknown number of structs). Since I have control over the API endpoint, I can format this response however necessary:
[
[
0,
1,
"myString-1"
],
[
2,
3,
"myString-2"
]
. . .
]
Example job TOML (not working, but I took a stab at what I think it should look like):
type = "directrequest"
schemaVersion = 1
name = "my-job"
forwardingAllowed = false
maxTaskDuration = "0s"
contractAddress = "XXX"
minContractPaymentLinkJuels = "0"
observationSource = """
decode_log [type="ethabidecodelog"
abi="OracleRequest(bytes32 indexed specId, address requester, bytes32 requestId, uint256 payment, address callbackAddr, bytes4 callbackFunctionId, uint256 cancelExpiration, uint256 dataVersion, bytes data)"
data="$(jobRun.logData)"
topics="$(jobRun.logTopics)"]
decode_cbor [type="cborparse" data="$(decode_log.data)"]
fetch [type="http" method=GET url="$(decode_cbor.url)" allowunrestrictednetworkaccess="true"]
encode_data [type="ethabiencode"
abi="(bytes32 requestId, bytes[] _data)"
data="{\\"requestId\\": $(decode_log.requestId), \\"_data\\": [$(fetch)]}"
]
encode_tx [type="ethabiencode"
abi="fulfillOracleRequest2(bytes32 requestId, uint256 payment, address callbackAddress, bytes4 callbackFunctionId, uint256 expiration, bytes calldata data)"
data="{\\"requestId\\": $(decode_log.requestId), \\"payment\\": $(decode_log.payment), \\"callbackAddress\\": $(decode_log.callbackAddr), \\"callbackFunctionId\\": $(decode_log.callbackFunctionId), \\"expiration\\": $(decode_log.cancelExpiration), \\"data\\": $(encode_data)}"
]
submit_tx [type="ethtx" to="XXX" data="$(encode_tx)"]
decode_log -> decode_cbor -> fetch -> encode_data -> encode_tx -> submit_tx
"""
While the Chainlink job pipeline completes internally, the error occurs during the fulfill transaction, causing it to fail (likely because my consumer contract's fulfill() function reverts while trying to translate the bytes[] response into a struct[]).
I checked out the Chainlink documentation, but nowhere can I find any information on how to return a custom data type (i.e., a struct) using a Direct Request job.
UPD: After scouring the internet, I found a similar question, but it does not relate to an array of structs, and the struct involved is quite basic (it contains only one field): https://ethereum.stackexchange.com/questions/148499/chainlink-any-api-ethabiencode-while-parsing-abi-string-bad-abi-specification
Thank you for your help!
Upvotes: 0
Views: 64
Reputation: 1082
I don't think encode_data can be used to encode an array. Can you try to encode the array of structs with encode_large instead? To use encode_large, replace this block:
encode_data [type="ethabiencode"
abi="(bytes32 requestId, bytes[] _data)"
data="{\\"requestId\\": $(decode_log.requestId), \\"_data\\": [$(fetch)]}"
]
with this:
encode_large [type="ethabiencode"
abi="(bytes32 requestId, bytes[] _data)"
data="{\\"requestId\\": $(decode_log.requestId), \\"_data\\": $(fetch)}"
]
In addition, in your TOML file there is no parse task to parse the data you fetched from the API in fetch, and no path is provided either. You are trying to parse the entire JSON response, so please make sure that you get the correct value. You can add a parse task with the code below:
parse [type="jsonparse" path="$(decode_cbor.path)" data="$(fetch)"]
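If you add it, the parse task would also need to be wired into the pipeline DAG between fetch and the encoding task (and, if its output is what you want encoded, encode_large would reference $(parse) instead of $(fetch)). As a sketch, assuming the task names above, the last line of observationSource would become:

decode_log -> decode_cbor -> fetch -> parse -> encode_large -> encode_tx -> submit_tx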
Hope it helps
Upvotes: 0