Surender Raja

Reputation: 3599

Pig Simple Dump function

My input file is shown below. I am trying to dump the loaded data from a relation. I am using Pig 0.12.

a,t1,1000,100
a,t1,2000,200
b,t2,1000,200
b,t2,5000,100

I started the Grunt shell in MapReduce (HDFS) mode by entering pig, then loaded the file:

myinput = LOAD 'file' AS(a1:chararray,a2:chararray,amt:int,rate:int);

If I do dump myinput, it shows the error below; describe and illustrate work fine.

dump myinput;

As soon as I enter the dump command, I get the following error message.

ERROR org.apache.hadoop.ipc.RPC - FailoverProxy: Failing this Call: submitJob for error   (RemoteException): org.apache.hadoop.ipc.RemoteException:  org.apache.hadoop.security.AccessControlException: User 'myid' cannot perform operation SUBMIT_JOB on queue default.
Please run "hadoop queue -showacls" command to find the queues you have access to .
    at org.apache.hadoop.mapred.ACLsManager.checkAccess(ACLsManager.java:179)
    at org.apache.hadoop.mapred.ACLsManager.checkAccess(ACLsManager.java:136)
    at org.apache.hadoop.mapred.ACLsManager.checkAccess(ACLsManager.java:113)
    at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:4541)
    at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:993)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1326)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1322)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1320)



ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias myinput

Is this an access issue, i.e. some kind of privilege problem? Can someone help me?
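For what it's worth, the stack trace points at queue ACLs (SUBMIT_JOB on queue default) rather than at the LOAD statement. Assuming the cluster restricts the default queue, one sketch of a workaround is to route the job to a queue you do have access to via the mapred.job.queue.name property; the queue name below is a placeholder, and the allowed queues can be listed with hadoop queue -showacls as the error suggests:

```pig
-- In the Grunt shell: submit to a queue you have SUBMIT_JOB access on
-- ('your_queue' is hypothetical; check "hadoop queue -showacls" for real names)
SET mapred.job.queue.name 'your_queue';
dump myinput;
```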

Upvotes: 0

Views: 355

Answers (2)

Rengasamy

Reputation: 1043

If you don't specify a load function such as PigStorage('\t'), Pig reads the data with tab (\t) as the default column separator.

In your data, the column separator is a comma (,).

So try this one:

myinput = LOAD 'file' using PigStorage(',') AS(a1:chararray,a2:chararray,amt:int,rate:int);

Hope this works.
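To make the separator point concrete: with the default tab separator, each comma-separated line ends up as a single field, so the columns after the first come out null; with PigStorage(',') the four fields split as declared. A quick way to compare the two (in local mode, pig -x local, so no cluster is needed) might look like:

```pig
-- Default (tab) separator: the whole line "a,t1,1000,100" lands in a1,
-- and a2/amt/rate are null
bad = LOAD 'file' AS (a1:chararray, a2:chararray, amt:int, rate:int);

-- Comma separator: the four fields split as intended
good = LOAD 'file' USING PigStorage(',') AS (a1:chararray, a2:chararray, amt:int, rate:int);
```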

Upvotes: 1

user3387616

Reputation: 81

You could specify your input data's separator, in your case a comma. Try this code:

myinput = LOAD 'file'  USING PigStorage(',') AS (a1:chararray,a2:chararray,amt:int,rate:int);

Upvotes: 0
