Saturday, 31 August 2013

Increase JVM heap space while running from Hadoop on Unix

I am running a Java class, test.java, with the hadoop command:
$ hadoop test
I am using a StringBuilder, and it grows until the JVM runs out of heap space:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:572)
at java.lang.StringBuffer.append(StringBuffer.java:320)
at java.io.StringWriter.write(StringWriter.java:60)
at org.json.JSONObject.quote(JSONObject.java:1287)
at org.json.JSONObject.writeValue(JSONObject.java:1597)
at org.json.JSONObject.write(JSONObject.java:1649)
at org.json.JSONObject.writeValue(JSONObject.java:1574)
at org.json.JSONArray.write(JSONArray.java:929)
at org.json.JSONObject.writeValue(JSONObject.java:1576)
at org.json.JSONObject.write(JSONObject.java:1649)
at org.json.JSONObject.writeValue(JSONObject.java:1574)
at org.json.JSONObject.write(JSONObject.java:1632)
at org.json.JSONObject.toString(JSONObject.java:1443)
I know that in Java we can set the heap size when running a program:
$ java -Xmx4G test
How can I do this when running with hadoop? If I run it like:
$ hadoop -Xmx4G test
it throws an exception:
Caused by: java.lang.ClassNotFoundException: java
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: java. Program will exit.
Error: No command named `-Xmx4g' was found. Perhaps you meant `hadoop Xmx4g'
I would like to change the heap size permanently.
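
For reference, a minimal sketch of one common approach, assuming the standard hadoop launcher script is used (it passes HADOOP_CLIENT_OPTS / HADOOP_OPTS on to the JVM it starts); the path and heap value below are illustrative, not confirmed for this setup:

# One-off run: export the JVM options before invoking hadoop
export HADOOP_CLIENT_OPTS="-Xmx4g"
hadoop test

# Permanent change: put the same export in hadoop-env.sh
# (the path is an assumption; it varies by installation)
echo 'export HADOOP_CLIENT_OPTS="-Xmx4g"' >> /etc/hadoop/hadoop-env.sh

Alternatively, hadoop-env.sh also reads HADOOP_HEAPSIZE (a value in MB) to set the maximum heap for the Hadoop scripts.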
