laomao

Reputation: 31

Option 'schema' not specified when setting up WSO2 AM 1.10.x with DAS 3.1.0

I am trying to set up WSO2 API Manager 1.10.0 with DAS 3.1.0. DAS will use MySQL 5.7.18. I ran mysql5.7.sql from the DAS package to create the DB schema in MySQL. I also downloaded mysql-connector-java-5.1.35-bin.jar and copied it into the repository\components\lib directory.
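
As a quick sanity check (assuming the statistics database is named WSO2AM_STATS_DB, which may differ in your setup), the imported schema can be inspected from the MySQL client:

    -- Database name is an assumption for illustration
    USE WSO2AM_STATS_DB;
    SHOW TABLES;                       -- the summary tables should be listed here
    DESCRIBE API_REQUEST_SUMMARY;      -- verify the columns the analytics scripts expect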

I turned on Configure Analytics in API Manager and saved the configuration successfully. I can see that API Manager can talk to DAS without a problem.

But in the carbon log of DAS, I see exceptions like this:

TID: [-1234] [] [2017-05-26 15:30:00,368] ERROR {org.wso2.carbon.analytics.spark.core.AnalyticsTask} -  Error while executing the scheduled task for the script: APIM_STAT_SCRIPT {org.wso2.carbon.analytics.spark.core.AnalyticsTask}
org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query create temporary table APIRequestSummaryData using CarbonJDBC options (dataSource "WSO2AM_STATS_DB", tableName "API_REQUEST_SUMMARY")
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764)
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721)
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
    at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60)
    at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Option 'schema' not specified
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.getOrElse(ddl.scala:150)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider.createRelation(JDBCRelation.scala:113)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:92)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:760)

How can I resolve this? Thanks.

Upvotes: 2

Views: 106

Answers (2)

laomao

Reputation: 31

It turns out that I needed to import the correct schema declaration script, from the dbscripts/stat/sql folder of the API Manager distribution, into the DAS statistics database I was setting up here.
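
In case it helps others, the import amounts to running the MySQL variant of that script against the statistics database; the path and database name below are illustrative, not exact:

    -- Run from the MySQL client; adjust the path and database name to your setup
    USE WSO2AM_STATS_DB;
    SOURCE /path/to/wso2am-1.10.0/dbscripts/stat/sql/mysql.sql;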

Upvotes: 0

Abimaran Kugathasan

Reputation: 32468

API Manager 1.10 and DAS 3.1.0 aren't compatible with each other. Unless you customize the database scripts and CApps, it won't work.

You can use API Manager 2.1 with DAS 3.1.0, or API Manager 1.10 with DAS 3.0.x.
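
The error above is one symptom of this: the Spark script shipped with the APIM 1.10 CApps defines the table without the 'schema' option, which the CarbonJDBC provider in DAS 3.1.0 requires (that is the getOrElse in AnalyticsJDBCRelationProvider in your stack trace). Purely for illustration, a 3.1.0-style definition would pass it explicitly; the column list below is a guess, not the real table definition:

    create temporary table APIRequestSummaryData using CarbonJDBC
    options (dataSource "WSO2AM_STATS_DB",
             tableName "API_REQUEST_SUMMARY",
             -- 'schema' is mandatory in DAS 3.1.0; columns here are illustrative only
             schema "api STRING, api_version STRING, userId STRING, total_request_count INTEGER, time STRING");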

Upvotes: 1
