Reputation: 1991
I have a utility class that I created for my Spring controllers to invoke, which generates a CSV from a collection of beans using the Super CSV library (http://supercsv.sourceforge.net/).
The utility class is pretty basic:
public static void export2CSV(HttpServletResponse response,
        String[] header, String filePrefix, List<? extends Object> dataObjs) {
    try {
        response.setContentType("text/csv;charset=utf-8");
        response.setHeader("Content-Disposition",
            "attachment; filename=" + filePrefix + "_Data.csv");

        OutputStream fout = response.getOutputStream();
        OutputStream bos = new BufferedOutputStream(fout);
        OutputStreamWriter outputwriter = new OutputStreamWriter(bos);
        ICsvBeanWriter writer = new CsvBeanWriter(outputwriter, CsvPreference.EXCEL_PREFERENCE);

        // the actual writing
        writer.writeHeader(header);
        for (Object anObj : dataObjs) {
            writer.write(anObj, header);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The catch is, I'm getting different behaviors out of this operation and I don't know why. When I invoke it from one controller (we'll call it 'A'), I get the expected output of data.
When I invoke it from the other controller ('B'), I get a tiny blurb of unrecognizable binary data that cannot be opened by OO Calc. Opening it in Notepad++ yields an unreadable line of gibberish, which I can only assume is the editor's attempt to render a binary stream.
Controller 'A' invocation (the one that works)
@RequestMapping(value = "/getFullReqData.html", method = RequestMethod.GET)
public void getFullData(HttpSession session, HttpServletRequest request,
        HttpServletResponse response) throws IOException {
    logger.info("INFO: ******************************Received request for full Req data dump");
    String projName = (String) session.getAttribute("currentProject");
    int projectID = ProjectService.getProjectID(projName);
    List<Requirement> allRecords = reqService.getFullDataSet(projectID);
    final String[] header = new String[] {
        "ColumnA",
        "ColumnB",
        "ColumnC",
        "ColumnD",
        "ColumnE"
    };
    CSVExporter.export2CSV(response, header, projName + "_reqs_", allRecords);
}
...and here's the Controller 'B' invocation (the one that fails):
@RequestMapping(value = "/getFullTCData.html", method = RequestMethod.GET)
public void getFullData(HttpSession session, HttpServletRequest request,
        HttpServletResponse response) throws IOException {
    logger.info("INFO: Received request for full TCD data dump");
    String projName = (String) session.getAttribute("currentProject");
    int projectID = ProjectService.getProjectID(projName);
    List<TestCase> allRecords = testService.getFullTestCaseList(projectID);
    final String[] header = new String[] {
        "ColumnW",
        "ColumnX",
        "ColumnY",
        "ColumnZ"
    };
    CSVExporter.export2CSV(response, header, projName + "_tcs_", allRecords);
}
Observations:
Any suggestions or insights would be greatly appreciated. I'm really stumped as to how to better isolate this issue...
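For what it's worth, the "tiny blurb" symptom is consistent with buffered output that is never flushed: a small payload can sit entirely inside a BufferedWriter's default 8 KB buffer and never reach the underlying stream, while a large payload forces partial flushes along the way. A minimal JDK-only illustration (hypothetical column names, no Super CSV involved):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;

public class FlushDemo {
    public static void main(String[] args) throws IOException {
        // Never flushed or closed: a small write stays in the 8 KB buffer
        // and never reaches the underlying StringWriter.
        StringWriter lost = new StringWriter();
        BufferedWriter unflushed = new BufferedWriter(lost);
        unflushed.write("ColumnW,ColumnX\n");
        System.out.println("without close: \"" + lost + "\"");

        // Closed via try-with-resources: close() flushes the buffer
        // into the underlying writer before releasing it.
        StringWriter kept = new StringWriter();
        try (BufferedWriter w = new BufferedWriter(kept)) {
            w.write("ColumnW,ColumnX\n");
        }
        System.out.println("with close: \"" + kept + "\"");
    }
}
```

The same effect applies to the BufferedOutputStream and OutputStreamWriter layered inside export2CSV above: whatever is still buffered when the request completes is simply discarded.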
Upvotes: 2
Views: 2834
Reputation: 9868
You aren't closing the writer. Also, CsvBeanWriter will wrap the writer in a BufferedWriter, so you can probably simplify your outputwriter
as well.
public static void export2CSV(HttpServletResponse response,
        String[] header, String filePrefix, List<? extends Object> dataObjs) {
    ICsvBeanWriter writer = null; // initialize so the finally block compiles
    try {
        response.setContentType("text/csv;charset=utf-8");
        response.setHeader("Content-Disposition",
            "attachment; filename=" + filePrefix + "_Data.csv");

        OutputStreamWriter outputwriter =
            new OutputStreamWriter(response.getOutputStream());
        writer = new CsvBeanWriter(outputwriter, CsvPreference.EXCEL_PREFERENCE);

        // the actual writing
        writer.writeHeader(header);
        for (Object anObj : dataObjs) {
            writer.write(anObj, header);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (writer != null) {
                writer.close(); // closes writer and underlying stream
            }
        } catch (Exception e) {
            // nothing sensible to do if close() itself fails
        }
    }
}
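On Java 7 or later, the explicit finally block can be replaced with try-with-resources, assuming the Super CSV writer implements Closeable (it does as of Super CSV 2.x). A minimal JDK-only sketch of the pattern, using a BufferedWriter stand-in rather than CsvBeanWriter:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;

public class TryWithResourcesDemo {
    public static void main(String[] args) throws IOException {
        StringWriter out = new StringWriter();
        // try-with-resources closes (and therefore flushes) the writer
        // automatically, even if an exception is thrown in the body.
        try (BufferedWriter w = new BufferedWriter(out)) {
            w.write("header\n");
            w.write("row1\n");
        }
        System.out.println(out);
    }
}
```

The same shape works with `try (ICsvBeanWriter writer = new CsvBeanWriter(...))`, which removes the need for the null check and the nested try/catch in finally.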
Super CSV 2.0.0-beta-1 is out now! As well as adding numerous other features (including Maven support and a new Dozer extension), CSV writers now expose a flush() method as well.
Upvotes: 2