[openstack-dev] [Savanna-all] How to run hadoop jobs
Ruslan Kamaldinov
rkamaldinov at mirantis.com
Sat Jul 13 18:04:26 UTC 2013
Hi Arindam,
There was a validation error during cluster creation. Can you send the contents of cluster_create.json?
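For reference, a cluster create request in the v1.0 API is shaped roughly like this; every id and name below is a placeholder, not a value from your environment:

{
    "name": "cluster-1",
    "plugin_name": "vanilla",
    "hadoop_version": "1.1.2",
    "cluster_template_id": "<cluster-template-uuid>",
    "default_image_id": "<image-uuid>",
    "user_keypair_id": "<keypair-name>"
}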
Also, please attach the output of:
$ http $SAVANNA_URL/images X-Auth-Token:$AUTH_TOKEN
$ http $SAVANNA_URL/node-group-templates X-Auth-Token:$AUTH_TOKEN
$ http $SAVANNA_URL/cluster-templates X-Auth-Token:$AUTH_TOKEN
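If those variables are not set yet: SAVANNA_URL points at the Savanna API (by default port 8386, with the tenant id in the path) and AUTH_TOKEN is a Keystone token. A sketch, with the host and tenant id as placeholders:

$ export AUTH_TOKEN=$(keystone token-get | awk '/ id / {print $4}')
$ export SAVANNA_URL=http://<savanna-host>:8386/v1.0/<tenant-id>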
Savanna debug logs would also be helpful.
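Debug output is controlled by the standard config flags; a minimal sketch, assuming the default config file location:

# /etc/savanna/savanna.conf
[DEFAULT]
debug=true
verbose=true

Then restart savanna-api and reproduce the failure.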
Please use *openstack-dev* as the mailing list for Savanna-related questions; just prefix the subject line with [Savanna].
Thanks,
Ruslan
On Saturday, July 13, 2013 at 9:24 PM, Arindam Choudhury wrote:
> Hi,
>
> I installed savanna from the git repository. I can create a cluster, but only as admin; when I try to create a cluster as a non-admin user, it gives me an error:
>
> (keystone_arindam)]# http $SAVANNA_URL/clusters X-Auth-Token:$AUTH_TOKEN < cluster_create.json
> HTTP/1.1 500 INTERNAL SERVER ERROR
> Content-Length: 111
> Content-Type: application/json
> Date: Sat, 13 Jul 2013 16:47:25 GMT
>
> {
> "error_code": 500,
> "error_message": "Error occurred during validation",
> "error_name": "INTERNAL_SERVER_ERROR"
> }
>
> The Hadoop daemons are not started automatically, so I could not run the test job.
>
> hadoop@cluster-1-master-001:/usr/share/hadoop$ hadoop jar hadoop-examples.jar pi 10 100
> Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples.jar
> at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
> Caused by: java.io.FileNotFoundException: hadoop-examples.jar (No such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:214)
> at java.util.zip.ZipFile.<init>(ZipFile.java:144)
> at java.util.jar.JarFile.<init>(JarFile.java:153)
> at java.util.jar.JarFile.<init>(JarFile.java:90)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
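A likely cause of the FileNotFoundException above: the examples jar usually carries a version suffix, so a bare hadoop-examples.jar may simply not exist in that directory. Locating it first should help; the path and version in the second command are only a guess:

$ find /usr/share/hadoop -name 'hadoop-examples*.jar'
$ hadoop jar /usr/share/hadoop/hadoop-examples-1.1.2.jar pi 10 100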
> hadoop@cluster-1-master-001:/usr/share/hadoop$ start-all.sh
> mkdir: cannot create directory `/mnt/log': Permission denied
> chown: cannot access `/mnt/log/hadoop/hadoop/': No such file or directory
> starting namenode, logging to /mnt/log/hadoop/hadoop//hadoop-hadoop-namenode-cluster-1-master-001.out
> /usr/sbin/hadoop-daemon.sh: line 136: /mnt/log/hadoop/hadoop//hadoop-hadoop-namenode-cluster-1-master-001.out: No such file or directory
> head: cannot open `/mnt/log/hadoop/hadoop//hadoop-hadoop-namenode-cluster-1-master-001.out' for reading: No such file or directory
> hadoop@localhost's password:
>
> When I try to start Hadoop manually, it asks for a password. How should I do this? Am I missing something?
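Two sketches that usually clear these particular symptoms, assuming the hadoop user and the paths shown in the output above; adjust to your image:

$ sudo mkdir -p /mnt/log/hadoop/hadoop
$ sudo chown -R hadoop:hadoop /mnt/log
$ ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa    # run as the hadoop user
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ chmod 600 ~/.ssh/authorized_keys

The first pair creates the log directory that hadoop-daemon.sh could not write to; the key setup lets start-all.sh reach localhost over SSH without prompting for a password.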
>
> Regards,
> Arindam
>
>
> --
> Mailing list: https://launchpad.net/~savanna-all
> Post to : savanna-all at lists.launchpad.net
> Unsubscribe : https://launchpad.net/~savanna-all
> More help : https://help.launchpad.net/ListHelp
>
>