
Wednesday, May 25, 2016

java.lang.ClassNotFoundException and No FileSystem for scheme: hdfs

java.lang.ClassNotFoundException and "No FileSystem for scheme: hdfs" exceptions while connecting to Hadoop from an application server



Standalone code that connects to Hadoop works fine, but when the same code is deployed in an application server (WebLogic) it fails, even though all the required jars are bundled in the EAR file.


Error 1:

java.io.IOException: No FileSystem for scheme: hdfs                                                                        
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1600)                                          
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:69)                                                  
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1637)                                        
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1619)                                                
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:255)                                                        
        at connector.HadoopService.hadoopHandler(HadoopService.java:62)                                                    
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)                                                    
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)                                  
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingM


Fix:
Edit the code like this:

        Configuration conf = new Configuration();

        // Register the FileSystem implementations explicitly so the application
        // server's class loader can resolve the hdfs:// and file:// schemes
        conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());

        FileSystem hdfs = FileSystem.get(new URI("hdfs://IP:9000"), conf);
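The returned handle can then be used for a quick sanity check, for example by listing the HDFS root directory. The class below is a hypothetical, self-contained version of that check (it is not part of the original application); the NameNode address hdfs://IP:9000 is a placeholder to replace with your own value.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsSmokeTest {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Register the implementations explicitly, as in the fix above
            conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
            conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());

            // Replace with the NameNode host and port of your cluster
            FileSystem hdfs = FileSystem.get(new URI("hdfs://IP:9000"), conf);

            // Listing the root directory confirms that the connection works
            for (FileStatus status : hdfs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
            hdfs.close();
        }
    }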




Error 2:


java.lang.ClassNotFoundException: Class  org.apache.hadoop.hdfs.DistributedFileSystem
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2290)
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2303)
org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:87)
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2342)
org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2324)
org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
org.apache.hadoop.fs.FileSystem.get(FileSystem.java:163)




Fix: add all the required jars to the <domain>/lib directory.

For Hadoop 2.7.2, the required jars (all located in the common/lib folder) are:


  1. commons-io-2.4.jar
  2. guava-11.0.2.jar
  3. hadoop-common-2.7.2.jar
  4. htrace-core-3.1.0-incubating.jar
  5. protobuf-java-2.5.0.jar
  6. slf4j-api-1.7.10.jar
  7. commons-logging-1.1.3.jar
  8. hadoop-auth-2.7.2.jar
  9. hadoop-hdfs-2.7.2.jar
  10. log4j-1.2.17.jar
For the Cloudera distribution:

  1. log4j-1.2.17.jar
  2. commons-logging-1.0.4.jar
  3. guava-r09-jarjar.jar
  4. hadoop-core-0.20.2.jar


Adding the jars to the <domain>/lib directory is the best way to fix this issue.
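Once the jars are in place and the server has been restarted, a quick way to confirm that the class reported missing in Error 2 is now visible is to look it up by name from within the deployed application (for example from a servlet or a startup class), so that the same class loader is used. This is a minimal, hypothetical check class, not part of the original application:

    public class HdfsClasspathCheck {
        public static void main(String[] args) {
            try {
                // The class reported as missing in Error 2
                Class<?> cls = Class.forName("org.apache.hadoop.hdfs.DistributedFileSystem");
                System.out.println("Found: " + cls.getName());
            } catch (ClassNotFoundException e) {
                System.out.println("Still not on the classpath: " + e.getMessage());
            }
        }
    }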
