laomao

Reputation: 31

clustered wso2 api manager gateway does not publish statistics to DAS

I am setting up WSO2 API Manager 1.10.x with DAS 3.0.1 to publish API statistics, using MySQL. My API Manager deployment is clustered, with the gateway worker node on a separate VM.

I followed this document to enable analytics for API Manager via the UI: http://mail.wso2.org/mailarchive/dev/2016-March/060905.html

I also followed this document to manually enable analytics on the gateway worker node: http://blog.rukspot.com/2016/05/configure-wso2-apim-analytics-using-xml.html

After setup, I restarted all servers and everything seemed fine. But when I make a request to a published API, the gateway worker log shows no sign of statistics being published to the DAS receiver, and there is no data in the DAS summary tables either.

What do I need to do to make the API Manager gateway worker node publish statistics to DAS? Am I missing anything in the configuration?

I do see the following exception in DAS (which I don't think is related to the gateway worker node not publishing statistics).

[2017-05-31 17:02:46,660]  INFO {org.wso2.carbon.event.processor.manager.core.internal.CarbonEventManagementService} -  Starting polling event receivers
Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/xerial/snappy/SnappyInputStream
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:66)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
        at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
        at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
        at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
        at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
        at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1292)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:874)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:815)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:799)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1429)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1421)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.ClassNotFoundException: org.xerial.snappy.SnappyInputStream cannot be found by spark-core_2.10_1.4.2.wso2v1
        at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:501)
        at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
        at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
        at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

Configuration (api-manager.xml):

<APIUsageTracking>
    <Enabled>true</Enabled>
    <DASServerURL>{tcp://10.14.3.93:7614}</DASServerURL>
    <DASRestApiURL>10.14.3.93:9446</DASRestApiURL>
    <SkipEventReceiverConnection>false</SkipEventReceiverConnection>
    <PublisherClass>org.wso2.carbon.apimgt.usage.publisher.APIMgtUsageDataBridgeDataPublisher</PublisherClass>
    <PublishResponseMessageSize>false</PublishResponseMessageSize>
</APIUsageTracking>
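
One sanity check before digging further: confirm the gateway worker can actually reach the Thrift receiver port configured in DASServerURL above. The host and port below are taken from my config (7614 assumes a port offset of 3 on the default 7611 receiver port); the helper itself is just a generic sketch, not part of any WSO2 tooling:

```python
import socket

def can_connect(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Host/port from the DASServerURL in api-manager.xml above; run this
# from the gateway worker VM to rule out firewall or routing problems.
# can_connect("10.14.3.93", 7614)
```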

Upvotes: 1

Views: 210

Answers (1)

Chamalee De Silva
Chamalee De Silva

Reputation: 705

It seems that the org.xerial.snappy.snappy-java_1.1.1.7.jar in your plugins directory has an OSGi header issue. Please download the jar file from the Maven repository, copy it to the {DAS_HOME}/repository/components/lib directory, and restart the server.
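
For example (a sketch assuming a Linux install with DAS_HOME set; the Maven Central path shown is for snappy-java 1.1.1.7, so adjust the version to whatever your plugins directory contains):

```shell
# Fetch snappy-java from Maven Central and place it in the external lib
# directory, where Carbon converts plain jars to OSGi bundles on startup.
cd "$DAS_HOME/repository/components/lib"
wget https://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar

# Restart DAS so the new bundle is picked up.
"$DAS_HOME/bin/wso2server.sh" restart
```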

Upvotes: 0
