Reputation: 115
I ran into a problem when compiling Spark version 1.3.1. Compiling the original source code provided by Spark worked fine, but when I added some source files to MLlib, the build produced errors like:
Based on the information printed at the end of the build, it appears to be caused by the Scalastyle check. I can finish the build by disabling the Scalastyle validation.
But is there any other way to handle this problem? I don't think simply disabling the validation is a good solution.
Example code illustrating the errors:
Good (the long expression is wrapped):
val implicitPrefs =
  new BooleanParam(this, "implicitPrefs", "whether to use implicit preference", Some(false))
Bad (the whole definition sits on one line that exceeds the length limit):
val implicitPrefs = new BooleanParam(this, "implicitPrefs", "whether to use implicit preference", Some(false))
Upvotes: 2
Views: 5699
Reputation: 1465
I believe you should have an XML configuration file (e.g. scalastyle.xml) that sets up the Scalastyle rules for your project. There you can adjust the maxFileLength value to whatever suits you:
<scalastyle>
  <name>.....</name>
  <check level="warning" class="org.scalastyle.file.FileLengthChecker" enabled="true">
    <parameters>
      <parameter name="maxFileLength">1000</parameter>
    </parameters>
  </check>
</scalastyle>
http://www.scalastyle.org/rules-dev.html#org_scalastyle_file_FileLengthChecker
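That said, the errors in the question look like line-length violations rather than a file-length violation. A minimal sketch of the corresponding rule, assuming the standard org.scalastyle.file.FileLineLengthChecker and an illustrative 100-character limit, would be another <check> entry in the same scalastyle.xml:
<check level="error" class="org.scalastyle.file.FileLineLengthChecker" enabled="true">
  <parameters>
    <!-- flags any source line longer than 100 characters -->
    <parameter name="maxLineLength">100</parameter>
  </parameters>
</check>
Raising maxLineLength, or wrapping long lines as in the "good" example in the question, should make the check pass without disabling Scalastyle entirely.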
Upvotes: 0
Reputation: 1815
You can also read two code files side by side when their line length is limited to 80/100 characters. Scalastyle also enforces other worthwhile rules, such as requiring braces around single if-else statements.
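If you want to configure that particular rule in your own scalastyle.xml, a sketch, assuming the org.scalastyle.scalariform.IfBraceChecker rule with illustrative parameter values, could look like:
<check level="warning" class="org.scalastyle.scalariform.IfBraceChecker" enabled="true">
  <parameters>
    <!-- allow a complete if expression on a single line, but require braces otherwise -->
    <parameter name="singleLineAllowed">true</parameter>
    <parameter name="doubleLineAllowed">false</parameter>
  </parameters>
</check>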
Upvotes: 0