KillerSnail

Reputation: 3591

druid kafka indexing service setup

I followed the docs and edited:

druid-0.9.2/conf/druid/_common/common.runtime.properties

and added:

"druid-kafka-indexing-service"

to druid.extensions.loadList, then restarted all Druid services: middlemanager, overlord, coordinator, broker, historical.
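
For illustration, the edited line ends up looking something like this (the other entries here are only placeholders for whatever extensions were already in the list):

druid.extensions.loadList=["druid-histogram", "druid-datasketches", "druid-kafka-indexing-service"]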

I ran:

curl -X 'POST' -H 'Content-Type:application/json' -d @kafka_connect/script.json druid_server:8090/druid/indexer/v1/task

but got:

{"error":"Could not resolve type id 'kafka' into a subtype of [simple type, class io.druid.indexing.common.task.Task]\n at [Source: HttpInputOverHTTP@4c467f1c; line: 1, column: 4]"}

The input json has:

{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "sensors-kafka",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "timestampSpec": {
          "column": "timestamp",
          "format": "auto"
        },
        "dimensionsSpec": {
          "dimensions": ["machine", "key"],
          "dimensionExclusions": [
            "timestamp",
            "value"
          ]
        }
      }
    },
    "metricsSpec": [
      {
        "name": "count",
        "type": "count"
      },
      {
        "name": "value_sum",
        "fieldName": "value",
        "type": "doubleSum"
      },
      {
        "name": "value_min",
        "fieldName": "value",
        "type": "doubleMin"
      },
      {
        "name": "value_max",
        "fieldName": "value",
        "type": "doubleMax"
      }
    ],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "HOUR",
      "queryGranularity": "NONE"
    }
  },
  "tuningConfig": {
    "type": "kafka",
    "maxRowsPerSegment": 5000000
  },
  "ioConfig": {
    "topic": "sensor",
    "consumerProperties": {
      "bootstrap.servers": "kafka_server:2181"
    },
    "taskCount": 1,
    "replicas": 1,
    "taskDuration": "PT1H"
  }
}

Any idea what I did wrong? According to the docs (http://druid.io/docs/0.9.2-rc3/development/extensions-core/kafka-ingestion.html), the type should be kafka, shouldn't it?

Is there a way to check that the extension was loaded properly, or do I have to specify the extension in each component's runtime.properties?

Upvotes: 4

Views: 3404

Answers (3)

AjitChahal

Reputation: 271

If you are using the dockerized apache/druid image, you need to set

druid.extensions.loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]

in the file /opt/druid/conf/druid/cluster/_common/common.runtime.properties.
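
As a sketch, with the official apache/druid image the same property can also be passed as an environment variable (dots replaced by underscores) instead of editing the file inside the container; the image tag and node type below are just placeholders:

# hypothetical example: supply the load list via an environment variable
docker run -d \
  -e druid_extensions_loadList='["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]' \
  apache/druid:<version> coordinator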

Upvotes: 0

huang botao

Reputation: 416

I met a similar problem and fixed it by modifying "conf/druid/_common/common.runtime.properties": I added "druid-kafka-indexing-service" to druid.extensions.loadList, so it now looks like this:

druid.extensions.loadList=["druid-parser-route", "mysql-metadata-storage", "druid-kafka-indexing-service"]

Hope this helps anyone else.

Upvotes: 3

Pierre Lacave

Reputation: 2690

The supervisor JSON specs are to be sent to the /druid/indexer/v1/supervisor endpoint on the overlord:

curl -X POST -H 'Content-Type: application/json' -d @kafka_connect/script.json http://druid_server:8090/druid/indexer/v1/supervisor
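
Once the spec is accepted, you can confirm the supervisor is registered by listing the active supervisors on the same overlord (host and port assumed to match the example above):

# returns a JSON array of supervisor IDs, e.g. ["sensors-kafka"]
curl http://druid_server:8090/druid/indexer/v1/supervisor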

Upvotes: 2
