Jim Keener

Reputation: 9303

Query cannot have any inequality filters

When running a Dataflow pipeline with the DirectPipelineRunner, the following query

        DatastoreV1.Query q = DatastoreV1.Query.newBuilder()
            .addKind(DatastoreV1.KindExpression.newBuilder().setName("KIND").build())
            .setFilter(DatastoreHelper.makeFilter(
                DatastoreHelper.makeFilter(
                    "date",
                    DatastoreV1.PropertyFilter.Operator.GREATER_THAN_OR_EQUAL,
                    DatastoreHelper.makeValue(date)).build(),
                DatastoreHelper.makeFilter(
                    "date",
                    DatastoreV1.PropertyFilter.Operator.LESS_THAN,
                    DatastoreHelper.makeValue(next_date)).build()))
            .build();

works as expected. When submitting the job to the Dataflow service, however, I get the following error:

    (4e1bbdfd880a21c1): java.lang.IllegalArgumentException: Query cannot have any inequality filters.
        at com.google.api.services.datastore.client.QuerySplitterImpl.validateFilter(QuerySplitterImpl.java:109)
        at com.google.api.services.datastore.client.QuerySplitterImpl.validateFilter(QuerySplitterImpl.java:105)
        at com.google.api.services.datastore.client.QuerySplitterImpl.validateQuery(QuerySplitterImpl.java:128)
        at com.google.api.services.datastore.client.QuerySplitterImpl.getSplits(QuerySplitterImpl.java:71)
        at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.getSplitQueries(DatastoreIO.java:426)
        at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.splitIntoBundles(DatastoreIO.java:305)
        at com.google.cloud.dataflow.sdk.runners.dataflow.CustomSources.performSplit(CustomSources.java:305)
        at com.google.cloud.dataflow.sdk.runners.dataflow.CustomSources.performSourceOperation(CustomSources.java:151)
        at com.google.cloud.dataflow.sdk.runners.worker.SourceOperationExecutor.execute(SourceOperationExecutor.java:62)
        at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:254)
        at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.doWork(DataflowWorker.java:191)
        at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:144)
        at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.doWork(DataflowWorkerHarness.java:180)
        at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:161)
        at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:148)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

I have Dataflow SDK version 1.3.0 locally.

Upvotes: 1

Views: 515

Answers (1)

Davor Bonaci

Reputation: 1729

This is a code issue in the Dataflow SDK for Java, versions 1.4.0 and older. We'll track it as Issue #101 in the GitHub repository's issue tracker. We'll try to address this quickly -- please follow along there for updates.

I cannot think of any trivial workaround right now. Sorry about that!

Once the fix lands, the solution will be to update to a newer version of the Dataflow SDK.
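Editor's note: until a fixed SDK is available, one non-trivial (and less efficient) possibility is to drop the inequality filter from the Datastore query, so the QuerySplitter never sees it, and re-apply the half-open range check `date <= v < next_date` downstream, e.g. in a ParDo. This is an assumption, not something the answer endorses. The sketch below reproduces just that range predicate in plain Java (the class and method names `DateRangeFilter`/`inRange` are hypothetical), independent of the Dataflow API:

```java
import java.time.LocalDate;
import java.util.List;
import java.util.stream.Collectors;

public class DateRangeFilter {
    // Mirrors the query's two inequality filters:
    // GREATER_THAN_OR_EQUAL(date) and LESS_THAN(next_date),
    // i.e. keep v where date <= v < nextDate.
    static List<LocalDate> inRange(List<LocalDate> values,
                                   LocalDate date, LocalDate nextDate) {
        return values.stream()
            .filter(v -> !v.isBefore(date) && v.isBefore(nextDate))
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<LocalDate> values = List.of(
            LocalDate.of(2016, 1, 1),
            LocalDate.of(2016, 1, 15),
            LocalDate.of(2016, 2, 1));
        // Keep only January 2016 (the upper bound is exclusive).
        System.out.println(inRange(values,
            LocalDate.of(2016, 1, 1), LocalDate.of(2016, 2, 1)));
        // → [2016-01-01, 2016-01-15]
    }
}
```

In a pipeline this predicate would sit in a DoFn applied after an unfiltered (or equality-only) DatastoreIO read, which means every entity of the kind is read before filtering -- acceptable for small kinds, costly for large ones.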

Upvotes: 2
