Rai

Reputation: 136

Mulesoft Batch aggregator ImmutableRecordAwareList payload

While working with Anypoint Studio 7.12 and Mule Runtime 4.4, I have a batch flow in which, after the data transformation completes, records are aggregated by a Batch Aggregator. The resulting collection of records is of type ImmutableRecordAwareList. When I try to write that aggregated data to a file, I get the error below, which indicates that the Mule runtime fails while trying to convert the immutable collection to an InputStream.

Wondering if anyone else has faced a similar issue and knows how to resolve it.

Error:

com.mulesoft.mule.runtime.module.batch.internal.commit.ImmutableRecordAwareList could not be transformed to the desired type java.io.InputStream

Please share your comments to help me resolve it. Below is the sample Mule config file that causes this error:

<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:batch="http://www.mulesoft.org/schema/mule/batch" xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
    xmlns:file="http://www.mulesoft.org/schema/mule/file"
    xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd
http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd">
    <http:listener-config name="HTTP_Listener_config" doc:name="HTTP Listener config" doc:id="3c39b3e8-228c-4dcf-a145-2204df0a7ba6" >
        <http:listener-connection host="0.0.0.0" port="8081" />
    </http:listener-config>
    <file:config name="File_Config" doc:name="File Config" doc:id="5e47593d-5967-4dc6-ad57-6da7e2530779" >
        <file:connection workingDir="C:\workspace\TestData\Mulesoft" />
    </file:config>
    <flow name="batchfilewriterFlow" doc:id="83e23d37-3594-40a8-a8cd-805b8242d6f9" >
        <http:listener doc:name="Listener" doc:id="df99d40a-2826-4d33-9565-5a15c0a49c05" config-ref="HTTP_Listener_config" path="/writer"/>
        <file:read doc:name="Read csv" doc:id="34172ee1-f238-433d-b712-a56f79517e50" config-ref="File_Config" path="Contact.csv" outputMimeType="application/csv; header=true; separator=|"/>
        <ee:transform doc:name="TO CVS" doc:id="3e36402d-7c54-464d-a1a5-519e7bd96fac" >
            <ee:message >
                <ee:set-payload ><![CDATA[%dw 2.0
output application/csv header=true, separator=","
---
payload]]></ee:set-payload>
            </ee:message>
        </ee:transform>
        <batch:job jobName="batchfilewriterBatch_Job" doc:id="c9695d46-b534-4202-8b88-0e1142617c2a" >
            <batch:process-records >
                <batch:step name="Batch_Step" doc:id="577a43b4-262e-4dcd-aa79-8c24a466fdd2" >
                    <ee:transform doc:name="Transform Message" doc:id="4e61f76f-299d-4049-89f4-0dc3b4cc0e91" >
                        <ee:message >
                            <ee:set-payload ><![CDATA[%dw 2.0
output application/java
---
payload]]></ee:set-payload>
                        </ee:message>
                    </ee:transform>
                    <batch:aggregator doc:name="Batch Aggregator" doc:id="201a1803-91a2-46aa-bffd-998b5a03f53f" size="200">
                        <file:write doc:name="Write" doc:id="7e7318ec-e5a8-425c-adae-bc960731357a" config-ref="File_Config" path="Error.csv" />
                    </batch:aggregator>
                </batch:step>
            </batch:process-records>
        </batch:job>
    </flow>
</mule>

Upvotes: 0

Views: 1278

Answers (3)

Vishal Naik

Reputation: 1

Try this. I had ImmutableRecordAwareList data (an array of Base64 data) and was not able to access it. I later transformed the data to JSON for easier manipulation:

%dw 2.0
output application/json
---
payload map read(payload[$$], 'application/json')
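For context, here is a minimal sketch of where such a transform could sit, assuming it is placed inside the Batch Aggregator before the Write operation. The config name reuses File_Config from the question; Output.json is an illustrative path:

<batch:aggregator size="200">
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
// parse each aggregated element; assumes each element is a JSON string
payload map read(payload[$$], 'application/json')]]></ee:set-payload>
        </ee:message>
    </ee:transform>
    <file:write config-ref="File_Config" path="Output.json"/>
</batch:aggregator>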

Hope it helps!

Upvotes: 0

Rai

Reputation: 136

I found a workaround for the issue I mentioned above, so I would like to share it. I am not sure if this is the perfect solution to the problem, but it did resolve the issue.

  1. Inside the batch aggregator, if I transform the payload to a serializable custom Java object, I do not get the error. In my case I was sending the payload to the Salesforce Bulk API 2.0, and with this change I no longer see the error.

  2. If you do not want to use a custom serializable object, you can add a transformer like the one below (a fuller sketch follows this list). I am not sure what exactly this transformation does to the payload, but the error goes away:

    payload map (value, index) -> value
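For reference, a minimal DataWeave sketch of the second workaround, as it might sit in a Transform Message component inside the Batch Aggregator before the Write operation. Presumably the map produces a plain new Array, shedding the ImmutableRecordAwareList wrapper:

%dw 2.0
output application/java
---
// identity map: builds a new Array containing the same records
payload map (value, index) -> value

The first workaround could be expressed similarly by coercing each record to your own serializable class, e.g. value as Object {class: "com.example.Contact"}, where com.example.Contact is a hypothetical POJO.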

I will update this post with additional findings later.

Upvotes: 0

aled

Reputation: 25837

This usage of batch makes no sense to me.

  1. The input to the batch should be records. The flow converts the CSV read from the file to another CSV, just with a different separator. That makes little sense: since the objective of a batch job is to process records, it would be more efficient to convert the file contents to Java records.
  2. Writing files inside a batch is not a good idea. Batch jobs in Mule are executed in multiple threads, which means the file can get overwritten and/or corrupted.
  3. Just converting to Java inside the batch step means that the file write operation will not know what to do with the resulting Java objects. You would need to convert that payload to something that can be written to a file.
  4. But more important than all the previous points: there is no reason at all to use a batch job for this flow. It does not do any record-oriented processing that would take advantage of a batch job.

Instead you can remove the batch completely and just write the output of the transformed CSV directly to a file. If the file is big, you may want to set the streaming property in both the file read and the transformation to reduce memory usage; a sketch of the simplified flow follows the examples below.

Examples:

  • outputMimeType="application/csv; header=true; separator='|'; streaming=true"
  • output application/csv header=true, separator=",", streaming=true
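Putting it together, a minimal sketch of the simplified flow, reusing the connector configs and file paths from the question (the flow name and Output.csv are illustrative):

<flow name="csvRewriterFlow">
    <http:listener config-ref="HTTP_Listener_config" path="/writer"/>
    <!-- streaming=true is a reader property that avoids loading the whole CSV into memory -->
    <file:read config-ref="File_Config" path="Contact.csv"
        outputMimeType="application/csv; header=true; separator='|'; streaming=true"/>
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/csv header=true, separator=",", streaming=true
---
payload]]></ee:set-payload>
        </ee:message>
    </ee:transform>
    <!-- no batch job: the transformed CSV is written directly to the file -->
    <file:write config-ref="File_Config" path="Output.csv"/>
</flow>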

Upvotes: 1
