Allen Zhang

Reputation: 9

Spark-etl fails to load data into Accumulo

While learning to use GeoTrellis to load data into Accumulo, I ran into this problem:

Exception in thread "main" geotrellis.spark.io.package$LayerWriteError: Failed to write Layer(name = "example", zoom = 13)...... org.apache.accumulo.core.client.AccumuloException: file:/geotrellis-ingest/726314aa-5b72-4f9c-9c41-f9521a045603-O45VGIHPpi: java.io.IOException: file:/geotrellis-ingest/726314aa-5b72-4f9c-9c41-f9521a045603-O45VGIHPpi is not in a volume configured for Accumulo

Here are images of my config file: config

Upvotes: 0

Views: 144

Answers (1)

Christopher

Reputation: 2512

I'm not familiar with GeoTrellis or Spark, but the error message indicates that a bulk import into Accumulo is being attempted across filesystems (volumes), which Accumulo doesn't support.

The files you bulk import must be on a volume that is already configured for use in Accumulo. Accumulo will move the files within the same volume to its own directories, but it won't move them across volumes. To configure volumes for use within Accumulo, see the documentation for instance.volumes.
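As a rough sketch of what that configuration looks like (the property name `instance.volumes` is from the Accumulo documentation; the HDFS URI below is a placeholder for your own namenode and path):

```xml
<!-- accumulo-site.xml (or accumulo.properties in Accumulo 2.x):
     declare the volume(s) Accumulo is allowed to use.
     hdfs://namenode:8020/accumulo is a placeholder value. -->
<property>
  <name>instance.volumes</name>
  <value>hdfs://namenode:8020/accumulo</value>
</property>
```

Note that your stack trace shows the import file at `file:/geotrellis-ingest/...`, i.e. on the local filesystem. Staging the ingest directory on the same volume Accumulo is configured for (for example, copying it into HDFS with `hdfs dfs -put` and pointing the ingest path there) should let Accumulo move the files within one volume instead of across volumes.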

Upvotes: 1
