Reputation: 861
I'm trying to use some Scala code in Zeppelin 0.8.0 with Spark interpreter:
%spark
import scala.beans.BeanProperty
class Node(@BeanProperty val parent: Option[Node]) {
}
But the import does not seem to be taken into account:
import scala.beans.BeanProperty
<console>:14: error: not found: type BeanProperty
@BeanProperty val parent: Option[Node]) {
^
EDIT: I found out that the following code works:
class Node(@scala.beans.BeanProperty val parent: Option[Node]) {
}
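For illustration, a minimal usage sketch of that workaround (root and child are just example names); getParent is the getter generated by @BeanProperty:
val root = new Node(None)         // a node with no parent
val child = new Node(Some(root))  // a node whose parent is root
child.getParent                   // Some(root), via the generated getter
root.getParent                    // None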
This also works fine:
def loadCsv(CSVPATH: String): DataFrame = {
  import org.apache.spark.sql.types._
  // [...] some code
  val schema = StructType(
    firstRow.map(s => StructField(s, StringType))
  )
  // [...] some code again
}
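For context, here is a self-contained sketch of what such a method could look like, assuming Zeppelin's predefined spark session and comma-separated files; everything apart from the import and the schema construction is illustrative and does not reproduce the elided original code:
// Illustrative sketch: builds an all-string schema from the header row and loads the file with it
def loadCsv(CSVPATH: String): org.apache.spark.sql.DataFrame = {
  import org.apache.spark.sql.types._
  // read the header line to get the column names (assumes comma-separated values)
  val firstRow = spark.read.textFile(CSVPATH).first().split(",")
  // build a schema where every column is a StringType, as in the original snippet
  val schema = StructType(
    firstRow.map(s => StructField(s, StringType))
  )
  // load the file with the constructed schema, treating the first line as the header
  spark.read.option("header", "true").schema(schema).csv(CSVPATH)
}
Using the fully qualified org.apache.spark.sql.DataFrame as the return type also avoids needing a top-level import, in line with the workaround above.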
So I guess everything works fine if the import is placed inside braces (i.e. inside a class or function body), or if the fully qualified path.to.package.Class is written directly where it is used.
QUESTION: How do I import outside of a class/function definition?
Upvotes: 1
Views: 653
Reputation: 3021
Importing by path.to.package.Class works well in Zeppelin. You can try it by importing and using java.sql.Date:
import java.sql.Date
val date = Date.valueOf("2019-01-01")
The problem is related to the Zeppelin context. If you try the following code snippet in Zeppelin, you will see that it works fine:
object TestImport {
  import scala.beans.BeanProperty
  class Node(@BeanProperty val parent: Option[Node]) {}
}
val testObj = new TestImport.Node(None)
testObj.getParent
//prints Option[Node] = None
I hope it helps!
Upvotes: 1