Tuesday, August 16, 2011

If Java is installed in c:\Program Files\Java\, it is a headache to make Hadoop/Hive work under Cygwin. Hive reports an error like this:

/workplace/apps/hadoop-0.20.2-cdh3u1/bin/hadoop: line 300: /cygdrive/c/Program:
No such file or directory

Apparently the space in JAVA_HOME is the problem.

I tried several solutions from the Internet, but none of them worked. For example:

export JAVA_HOME=/cygdrive/c/Program\ Files/Java/jdk1.6.0_25
export JAVA_HOME=/cygdrive/c/"Program Files"/Java/jdk1.6.0_25
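
The escaped and quoted exports above most likely fail because the hadoop script later expands $JAVA_HOME without quotes, so the shell word-splits the value at the space. A minimal sketch of that failure mode (the JDK path is illustrative):

```shell
# Simulate an unquoted expansion of a path containing a space,
# which splits it into two words -- matching the error message above.
JAVA_HOME='/cygdrive/c/Program Files/Java/jdk1.6.0_25'
set -- $JAVA_HOME   # unquoted: the shell splits at the space
echo "$#"           # prints 2
echo "$1"           # prints /cygdrive/c/Program -- the "file" hadoop complains about
```

Escaping or quoting at export time cannot help, because the splitting happens later, inside the hadoop script itself.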

Here is how I solved the problem: create a symbolic link like this

ln -s /cygdrive/c/Program\ Files/Java/jdk1.6.0_25 /usr/java/default

and then set JAVA_HOME in ${HADOOP_HOME}/conf/hadoop-env.sh like this

export JAVA_HOME=/usr/java/default
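
The two steps above can be sketched end to end. The following is a portable demonstration of the same trick in a throwaway directory (/tmp/jdk-space-demo and the fake JDK layout are made up for illustration; on Cygwin you would use the real /cygdrive/c/... path):

```shell
# Demonstrate the space-free-symlink trick with a throwaway directory.
set -e
ROOT=/tmp/jdk-space-demo
rm -rf "$ROOT"
mkdir -p "$ROOT/Program Files/Java/jdk1.6.0_25/bin"   # fake JDK with a space in its path
mkdir -p "$ROOT/usr/java"                             # parent must exist before ln -s
ln -s "$ROOT/Program Files/Java/jdk1.6.0_25" "$ROOT/usr/java/default"
# Scripts can now use the space-free alias instead of the real path:
ls "$ROOT/usr/java/default/bin" > /dev/null && echo "symlink resolves"
```

The symlink gives every script a space-free alias for the JDK, so nothing downstream ever sees the space.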

7 comments:

  1. Thanks for your post. I was facing the same issue, and by following your steps it now works fine. Thanks much.

    ReplyDelete
  2. thanks man !! really appreciate posting...

    ReplyDelete
  3. It doesn't work for me. I issued the command in Cygwin installed on Windows XP SP3 and got an error.
    $ ln -s /cygdrive/c/Program\ Files/Java/jdk1.7.0_25 /usr/java/default
    ln: failed to create symbolic link `/usr/java/default': No such file or directory

    Do I have to create the folder /usr/java/default first?

    ReplyDelete
    Replies
    1. Yes. Make sure the directory '/usr/java' exists; if not, create it, and then ln will create /usr/java/default.

      Delete
  4. Hi friends, try this; it will work:
    JAVA_HOME="C:/Progra~1/Java/jdk1.7.0_51"
    in hadoop-env.sh

    ReplyDelete
    Replies
    1. Yes, this works! Just replace jdk1.7.0_51 with your JDK version.

      I tried the earlier-mentioned method but had no luck!

      Delete