[HADOOP] Can't get Sqoop 1.99.3 working with Apache Hadoop 2.4.0 on 64-bit CentOS 6.5
I have Apache Hadoop working, installed on a CentOS 6.5 KVM virtual server. It is installed in
/home/hduser/yarn/hadoop-2.4.0 and the config files are in /home/hduser/yarn/hadoop-2.4.0/etc/hadoop.
Hadoop complained about 32-bit native libraries (the binary distribution apparently ships with 32-bit ones), so I did a complete source build to get 64-bit libraries. But Sqoop 1.99.3 presumably only uses the Hadoop jars anyway(?).
This is the main error. It seems to be a common one, but I can't find any suggestion that works. addtowar.sh does not exist in my Sqoop installation.
**Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1680)**
Sqoop is in /home/hduser/sqoop-1.99.3-bin-hadoop200, and catalina.properties contains:
common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar,${catalina.home}/../lib/*.jar,${HADOOP_PREFIX}/share/hadoop/common/*.jar,${HADOOP_PREFIX}/share/hadoop/mapreduce/*.jar
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/conf>echo $HADOOP_PREFIX
/home/hduser/yarn/hadoop-2.4.0
Running ./sqoop.sh server start gives:
Sqoop home directory: /home/hduser/sqoop-1.99.3-bin-hadoop200
Setting SQOOP_HTTP_PORT: 12000
Setting SQOOP_ADMIN_PORT: 12001
Using CATALINA_OPTS:
Adding to CATALINA_OPTS: -Dsqoop.http.port=12000 -Dsqoop.admin.port=12001
Using CATALINA_BASE: /home/hduser/sqoop-1.99.3-bin-hadoop200/server
Using CATALINA_HOME: /home/hduser/sqoop-1.99.3-bin-hadoop200/server
Using CATALINA_TMPDIR: /home/hduser/sqoop-1.99.3-bin-hadoop200/server/temp
Using JRE_HOME: /usr/java/jdk1.7.0_15
Using CLASSPATH: /home/hduser/sqoop-1.99.3-bin-hadoop200/server/bin/bootstrap.jar
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/bin>
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/bin>netstat -aln | grep 12000
tcp 0 0 0.0.0.0:12000 0.0.0.0:* LISTEN
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/bin>
sqoop.war was deployed to webapps/sqoop.
/lib:
total 4092
-rw-r--r-- 1 hduser hadoop 160519 Oct 15 2013 commons-dbcp-1.4.jar
-rw-r--r-- 1 hduser hadoop 279193 Oct 15 2013 commons-lang-2.5.jar
-rw-r--r-- 1 hduser hadoop 96221 Oct 15 2013 commons-pool-1.5.4.jar
-rw-r--r-- 1 hduser hadoop 6734 Oct 18 2013 connector-sdk-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 2671577 Oct 15 2013 derby-10.8.2.2.jar
-rw-r--r-- 1 hduser hadoop 16046 Oct 15 2013 json-simple-1.1.jar
-rw-r--r-- 1 hduser hadoop 481535 Oct 15 2013 log4j-1.2.16.jar
-rw-r--r-- 1 hduser hadoop 130387 Oct 18 2013 sqoop-common-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 51382 Oct 18 2013 sqoop-connector-generic-jdbc-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 119652 Oct 18 2013 sqoop-core-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 70692 Oct 18 2013 sqoop-execution-mapreduce-1.99.3-hadoop200.jar
-rw-r--r-- 1 hduser hadoop 41462 Oct 18 2013 sqoop-repository-derby-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 16156 Oct 18 2013 sqoop-spi-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 16590 Oct 18 2013 sqoop-submission-mapreduce-1.99.3-hadoop200.jar
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/webapps/sqoop/WEB-INF>
Here are the logs:
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>cat localhost.2014-05-11.log
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.apache.sqoop.server.ServerInitializer
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.initialize(MapreduceSubmissionEngine.java:78)
at org.apache.sqoop.framework.JobManager.initialize(JobManager.java:215)
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>ls -l
total 24
-rw-r--r-- 1 hduser hadoop 3766 May 11 10:15 catalina.2014-05-11.log
-rw-r--r-- 1 hduser hadoop 8629 May 11 10:15 catalina.out
-rw-r--r-- 1 hduser hadoop 0 May 11 10:15 host-manager.2014-05-11.log
-rw-r--r-- 1 hduser hadoop 5032 May 11 10:15 localhost.2014-05-11.log
-rw-r--r-- 1 hduser hadoop 0 May 11 10:15 manager.2014-05-11.log
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>
-------------localhost*.log --------------
at org.apache.sqoop.core.SqoopServer.initialize(SqoopServer.java:53)
at org.apache.sqoop.server.ServerInitializer.contextInitialized(ServerInitializer.java:36)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1680)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
... 28 more
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext listenerStop
SEVERE: Exception sending context destroyed event to listener instance of class org.apache.sqoop.server.ServerInitializer
java.lang.NullPointerException
at org.apache.sqoop.framework.JobManager.destroy(JobManager.java:176)
at org.apache.sqoop.core.SqoopServer.destroy(SqoopServer.java:36)
at org.apache.sqoop.server.ServerInitializer.contextDestroyed(ServerInitializer.java:32)
at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4245)
at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4886)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4750)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
-----------------------catalina log ----------------------------
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>cat catalina.2014-05-11.log
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/lib], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/${HADOOP_PREFIX}/share/hadoop/common], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/${HADOOP_PREFIX}/share/hadoop/mapreduce], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
May 11, 2014 10:15:54 AM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:54 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 399 ms
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.36
May 11, 2014 10:15:54 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive sqoop.war
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Error listenerStart
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Context [/sqoop] startup failed due to previous errors
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/sqoop] registered the JDBC driver [org.apache.derby.jdbc.AutoloadedDriver40] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/sqoop] appears to have started a thread named [sqoop-config-file-poller] but has failed to stop it. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@6495dc5a]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@3e8a0821]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
May 11, 2014 10:15:56 AM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:56 AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 1656 ms
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>
--------------------- catalina.out -------------------------
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>cat catalina.out
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/lib], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/${HADOOP_PREFIX}/share/hadoop/common], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/${HADOOP_PREFIX}/share/hadoop/mapreduce], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
May 11, 2014 10:15:54 AM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:54 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 399 ms
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.36
May 11, 2014 10:15:54 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive sqoop.war
log4j:WARN No appenders could be found for logger (org.apache.sqoop.core.SqoopServer).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
log4j: Parsing for [root] with value=[WARN, file].
log4j: Level token is [WARN].
log4j: Category root set to WARN
log4j: Parsing appender named "file".
log4j: Parsing layout options for "file".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p %c{2} [%l] %m%n].
log4j: End of parsing for "file".
log4j: Setting property [file] to [@LOGDIR@/sqoop.log].
log4j: Setting property [maxBackupIndex] to [5].
log4j: Setting property [maxFileSize] to [25MB].
log4j: setFile called: @LOGDIR@/sqoop.log, true
log4j: setFile ended
log4j: Parsed "file" options.
log4j: Parsing for [org.apache.sqoop] with value=[DEBUG].
log4j: Level token is [DEBUG].
log4j: Category org.apache.sqoop set to DEBUG
log4j: Handling log4j.additivity.org.apache.sqoop=[null]
log4j: Parsing for [org.apache.derby] with value=[INFO].
log4j: Level token is [INFO].
log4j: Category org.apache.derby set to INFO
log4j: Handling log4j.additivity.org.apache.derby=[null]
log4j: Finished configuring.
log4j: Could not find root logger information. Is this OK?
log4j: Parsing for [default] with value=[INFO,defaultAppender].
log4j: Level token is [INFO].
log4j: Category default set to INFO
log4j: Parsing appender named "defaultAppender".
log4j: Parsing layout options for "defaultAppender".
log4j: Setting property [conversionPattern] to [%d %-5p %c: %m%n].
log4j: End of parsing for "defaultAppender".
log4j: Setting property [file] to [@LOGDIR@/default.audit].
log4j: setFile called: @LOGDIR@/default.audit, true
log4j: setFile ended
log4j: Parsed "defaultAppender" options.
log4j: Handling log4j.additivity.default=[null]
log4j: Finished configuring.
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Error listenerStart
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Context [/sqoop] startup failed due to previous errors
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/sqoop] registered the JDBC driver [org.apache.derby.jdbc.AutoloadedDriver40] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/sqoop] appears to have started a thread named [sqoop-config-file-poller] but has failed to stop it. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@6495dc5a]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@3e8a0821]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
log4j: log4j called after unloading, see http://logging.apache.org/log4j/1.2/faq.html#unload.
java.lang.IllegalStateException: Class invariant violation
at org.apache.log4j.LogManager.getLoggerRepository(LogManager.java:199)
at org.apache.log4j.LogManager.getLogger(LogManager.java:228)
at org.apache.log4j.Logger.getLogger(Logger.java:117)
at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer.<clinit>(GenericJdbcImportInitializer.java:42)
at sun.misc.Unsafe.ensureClassInitialized(Native Method)
at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:140)
at java.lang.reflect.Field.acquireFieldAccessor(Field.java:949)
at java.lang.reflect.Field.getFieldAccessor(Field.java:930)
at java.lang.reflect.Field.get(Field.java:372)
at org.apache.catalina.loader.WebappClassLoader.clearReferencesStaticFinal(WebappClassLoader.java:2066)
at org.apache.catalina.loader.WebappClassLoader.clearReferences(WebappClassLoader.java:1929)
at org.apache.catalina.loader.WebappClassLoader.stop(WebappClassLoader.java:1833)
at org.apache.catalina.loader.WebappLoader.stop(WebappLoader.java:740)
at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4920)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4750)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
May 11, 2014 10:15:56 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
May 11, 2014 10:15:56 AM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:56 AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 1656 ms
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>
Solutions
-
==============================
1. I had the same problem as you, and found that you need to include all the jars in the share/hadoop folder of your hadoop installation. For example, I had included share/hadoop/mapreduce/*.jar but missed share/hadoop/mapreduce/lib/*.jar one level deeper. Here is my working common.loader setting (just substitute your own hadoop location prefix):
common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar,${catalina.home}/../lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/common/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/common/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/hdfs/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/hdfs/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/mapreduce/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/mapreduce/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/tools/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/tools/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/yarn/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/yarn/lib/*.jar
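One thing worth noting: the WARNING lines in catalina.log above show `${HADOOP_PREFIX}` surviving as a literal path component, which suggests Tomcat is not substituting that shell variable in catalina.properties (it only expands Java system properties such as `catalina.home`). As a rough sketch, assuming `HADOOP_PREFIX` points at your Hadoop install, you could generate a fully expanded common.loader line and paste it into catalina.properties:

```shell
# Sketch: build a common.loader value with the Hadoop paths already expanded,
# since Tomcat does not substitute shell variables like ${HADOOP_PREFIX}.
# The default path below is the one from the question; adjust to your install.
HADOOP_PREFIX=${HADOOP_PREFIX:-/home/hduser/yarn/hadoop-2.4.0}

# ${catalina.base}/${catalina.home} are Java system properties Tomcat DOES
# expand, so they are kept literal (note the single quotes).
loader='${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar'
for d in common common/lib hdfs hdfs/lib mapreduce mapreduce/lib tools tools/lib yarn yarn/lib; do
  loader="$loader,$HADOOP_PREFIX/share/hadoop/$d/*.jar"
done
echo "common.loader=$loader"
```

Redirect the output into a scratch file and replace the existing common.loader line in $SQOOP_HOME/server/conf/catalina.properties with it.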
-
==============================
2. I ran into the same problem with Hadoop 2.4.0. Have you tried adding all the hadoop jar files (every jar under the share/hadoop/* folders) to the sqoop lib folder?
Eventually my sqoop server ran (though with an older version). It's not the cleanest way to solve this, but at least sqoop runs.
Maybe this will fix your problem too.
-
==============================
3. Using sqoop2 1.99.6 and Hadoop 2.7.1, I ran into the same error:

[root@some_server sqoop2]# sqoop2-tool verify
Setting conf dir: /sw/apache/sqoop2/bin/../conf
Sqoop home directory: /sw/apache/sqoop2
Sqoop tool executor:
    Version: 2.0.0-SNAPSHOT
    Revision: 81778c37a413eb64aa38ffc397af2ca695909013
    Compiled on Thu Dec 10 22:20:08 WAT 2015 by root
Running tool: class org.apache.sqoop.tools.tool.VerifyTool
0 [main] INFO org.apache.sqoop.core.SqoopServer - Initializing Sqoop server.
17 [main] INFO org.apache.sqoop.core.PropertiesConfigurationProvider - Starting config file poller thread
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at org.apache.sqoop.security.authentication.SimpleAuthenticationHandler.secureLogin(SimpleAuthenticationHandler.java:36)
    at org.apache.sqoop.security.AuthenticationManager.initialize(AuthenticationManager.java:98)
    at org.apache.sqoop.core.SqoopServer.initialize(SqoopServer.java:55)
    at org.apache.sqoop.tools.tool.VerifyTool.runTool(VerifyTool.java:36)
    at org.apache.sqoop.tools.ToolRunner.main(ToolRunner.java:72)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The solution: I tried the "common.loader" property approach, but for some reason couldn't get it right. In the end I copied all my hadoop jar files from $HADOOP_HOME/share/hadoop and $HADOOP_HOME/share/hadoop/*/lib/ into $SQOOP2_HOME/server/lib:

[root@vggdw01 sqoop2]# cp $HADOOP_HOME/share/hadoop/*.jar server/lib/
cp: overwrite ‘server/lib/commons-cli-1.2.jar’? n
cp: overwrite ‘server/lib/commons-io-2.4.jar’? n
cp: overwrite ‘server/lib/commons-logging-1.1.3.jar’? n
cp: overwrite ‘server/lib/guava-11.0.2.jar’? n
cp: overwrite ‘server/lib/jackson-core-asl-1.9.13.jar’? n
cp: overwrite ‘server/lib/jackson-mapper-asl-1.9.13.jar’? n
cp: overwrite ‘server/lib/paranamer-2.3.jar’? n
cp: overwrite ‘server/lib/zookeeper-3.4.6.jar’? n

As you can see above, I did not overwrite the jar files that were already there.
I suspected it was a Tomcat thing... so I had to resort to this tactic.
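The interactive overwrite prompts above can be avoided. A minimal sketch, assuming GNU coreutils (`cp -n`, "no clobber") and that jars may sit both directly under share/hadoop/* and one level down in */lib:

```shell
# Sketch: copy every Hadoop jar (including the */lib subdirectories) into the
# Sqoop server's lib directory without overwriting jars already present there.
# $1 = Hadoop install dir, $2 = destination lib dir.
copy_hadoop_jars() {
  # find walks all of share/hadoop recursively, so the */lib jars the
  # plain glob above missed are picked up too; cp -n skips existing files.
  find "$1/share/hadoop" -name '*.jar' -exec cp -n {} "$2/" \;
}
```

Example invocation (paths from the answer above): `copy_hadoop_jars /usr/local/hadoop /sw/apache/sqoop2/server/lib`.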
-
==============================
4. I'm using Hadoop 2.6.2 and Sqoop 1.99.6, and did the following:
Edit $SQOOP_HOME/server/conf/catalina.properties: modify common.loader and set the paths to your hadoop and hive directories.
(or) the lazy fix (as sudo):
ln -s $HADOOP_HOME/share/hadoop/common /usr/lib/hadoop
ln -s $HADOOP_HOME/share/hadoop/hdfs /usr/lib/hadoop-hdfs
ln -s $HADOOP_HOME/share/hadoop/mapreduce /usr/lib/hadoop-mapreduce
ln -s $HADOOP_HOME/share/hadoop/yarn /usr/lib/hadoop-yarn
mkdir -p /usr/lib/hive
chmod 775 /usr/lib/hive
ln -s $HIVE_HOME/lib /usr/lib/hive/lib
Also, modify the org.apache.sqoop.submission.engine.mapreduce.configuration.directory property in $SQOOP_HOME/server/conf/sqoop.properties and set it to your hadoop conf directory.
Hope this helps.
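Before restarting the server, it can be worth checking that the directory that property points at actually contains Hadoop's config files. A small sketch (the file names are the standard Hadoop site configs; adjust the list to what your setup uses):

```shell
# Sketch: verify that a candidate value for
# org.apache.sqoop.submission.engine.mapreduce.configuration.directory
# really contains the Hadoop site configs. $1 = the conf directory.
check_conf_dir() {
  for f in core-site.xml mapred-site.xml; do
    if [ ! -f "$1/$f" ]; then
      echo "missing: $1/$f"
      return 1
    fi
  done
  echo "ok: $1"
}
```

For the setup in the question that would be `check_conf_dir /home/hduser/yarn/hadoop-2.4.0/etc/hadoop`.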
-
==============================
5. This problem is related to the fact that sqoop cannot find the hadoop configuration and libraries automatically, so I pointed it at their absolute paths.
Below is my .profile; after setting it up you can check that everything works with: sqoop2-tool verify
#SQOOP settings
export SQOOP_HOME=/usr/lib/sqoop
export SQOOP_CONF_DIR=$SQOOP_HOME/conf
export PATH=$PATH:$SQOOP_HOME/bin
SQOOP_SERVER_EXTRA_LIB=/var/lib/sqoop2
#Hadoop settings
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=/usr/local/hadoop/share/hadoop/mapreduce
export HADOOP_COMMON_HOME=/usr/local/hadoop/share/hadoop/common
export HADOOP_HDFS_HOME=/usr/local/hadoop/share/hadoop/hdfs
export YARN_HOME=/usr/local/hadoop/share/hadoop/yarn
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
#export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
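Since the whole class of errors above comes down to the Hadoop jars not being where Sqoop looks, a quick sanity check before running sqoop2-tool verify is to confirm each exported directory actually holds jars. A sketch (the function takes the directories as arguments so it works with any layout):

```shell
# Sketch: check that each given directory contains at least one jar, either
# directly or in its lib/ subdirectory, as the Hadoop share/hadoop/* dirs do.
check_hadoop_dirs() {
  rc=0
  for d in "$@"; do
    if ls "$d"/*.jar >/dev/null 2>&1 || ls "$d"/lib/*.jar >/dev/null 2>&1; then
      echo "ok: $d"
    else
      echo "no jars under: $d"
      rc=1
    fi
  done
  return $rc
}
```

With the .profile above you would call it as `check_hadoop_dirs "$HADOOP_COMMON_HOME" "$HADOOP_HDFS_HOME" "$HADOOP_MAPRED_HOME" "$YARN_HOME"`.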
from https://stackoverflow.com/questions/23593818/cant-get-sqoop-1-99-3-working-with-apache-hadoop-2-4-0-on-64-bit-centos-6-5 by cc-by-sa and MIT license