If Java is installed under c:\Program Files\Java\, it is a headache to make Hadoop/Hive work under Cygwin. Hive reports an error like this:
/workplace/apps/hadoop-0.20.2-cdh3u1/bin/hadoop: line 300: /cygdrive/c/Program:
No such file or directory
Apparently the space in JAVA_HOME is the problem.
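The failure is ordinary shell word splitting: the hadoop script expands $JAVA_HOME unquoted, so a value containing a space becomes two words, and the shell tries to run "/cygdrive/c/Program". A minimal reproduction of the splitting (the path is the one from above):

```shell
#!/bin/sh
# JAVA_HOME with a space in it, as when Java lives in C:\Program Files.
JAVA_HOME="/cygdrive/c/Program Files/Java/jdk1.6.0_25"

# Unquoted expansion word-splits on the space -- this is what happens
# inside bin/hadoop, which then tries to execute "/cygdrive/c/Program".
set -- $JAVA_HOME
echo "word 1: $1"    # /cygdrive/c/Program
echo "word 2: $2"    # Files/Java/jdk1.6.0_25
```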
I tried several solutions from the Internet, but none of them worked. For example:
export JAVA_HOME=/cygdrive/c/Program\ Files/Java/jdk1.6.0_25
export JAVA_HOME=/cygdrive/c/"Program Files"/Java/jdk1.6.0_25
Here is how I solved the problem: create a soft link like this
ln -s /cygdrive/c/Program\ Files/Java/jdk1.6.0_25 /usr/java/default
then set JAVA_HOME in ${HADOOP_HOME}/conf/hadoop-env.sh like this:
export JAVA_HOME=/usr/java/default
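Putting the steps together in a self-contained sketch (a scratch directory stands in for C:\ and /usr here so the same commands can be tried anywhere; on a real Cygwin install you would use /cygdrive/c/Program\ Files/... and /usr/java as above):

```shell
#!/bin/sh
# Scratch root standing in for the real filesystem.
root=$(mktemp -d)

# A "JDK" installed under a path with a space, like C:\Program Files\Java.
mkdir -p "$root/Program Files/Java/jdk1.6.0_25/bin"

# The fix: create the parent directory first (/usr/java usually does not
# exist), then link a space-free alias to the real JDK directory.
mkdir -p "$root/usr/java"
ln -sfn "$root/Program Files/Java/jdk1.6.0_25" "$root/usr/java/default"

# hadoop-env.sh can now use a JAVA_HOME containing no spaces.
JAVA_HOME="$root/usr/java/default"
ls "$JAVA_HOME"    # resolves through the symlink to the real JDK contents
```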
Thanks for your post. I was facing the same issue, and by following your steps it is now working fine. Thanks much.
Works here too! thx
thanks man!! really appreciate the post...
It doesn't work for me. I issued the command in Cygwin installed on Windows XP SP3 and got an error:
$ ln -s /cygdrive/c/Program\ Files/Java/jdk1.7.0_25 /usr/java/default
ln: failed to create symbolic link `/usr/java/default': No such file or directory
Do I have to create the folder /usr/java/default first?
Yes. Make sure the directory '/usr/java' exists; if not, create it first, and then ln will be able to create /usr/java/default.
Hi friends, try this once, it will work:
JAVA_HOME="C:/Progra~1/Java/jdk1.7.0_51"
in hadoop-env.sh
Yes, this works! Just replace jdk1.7.0_51 with your JDK version.
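The Progra~1 form works because Windows keeps an 8.3 short name for directories whose long names contain spaces. The exact short name can vary from machine to machine, so it is safer to look it up with Cygwin's cygpath than to guess (a sketch; assumes cygpath is on PATH and the JDK path from the comment above):

```shell
# Print the DOS 8.3 (short) form of the JDK path, e.g. C:\PROGRA~1\Java\...
cygpath -d "C:/Program Files/Java/jdk1.7.0_51"

# Then use the space-free form in ${HADOOP_HOME}/conf/hadoop-env.sh:
export JAVA_HOME="C:/Progra~1/Java/jdk1.7.0_51"
```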
Tried the earlier-mentioned method but no luck!