
[HADOOP] Hadoop error starting ResourceManager and NodeManager

I am trying to set up Hadoop 3 (alpha3) as a single-node cluster (pseudo-distributed), following the Apache guide. Whenever I try to run the example MapReduce job, the connection is refused. After running sbin/start-all.sh, I see these exceptions in the ResourceManager log (and similar ones in the NodeManager log):

xxxx-xx-xx xx:xx:xx,xxx INFO org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property.
xxxx-xx-xx xx:xx:xx,xxx DEBUG org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Exception is:
java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
    at java.desktop/java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:696)
    at java.desktop/java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:356)
    at java.desktop/java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:142)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141)
    at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245)
    at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226)
    at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954)
    at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478)
    at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192)
    at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceInit(ResourceManager.java:678)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.createAndInitActiveServices(ResourceManager.java:1129)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:315)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1407)

And later in the file:

xxxx-xx-xx xx:xx:xx,xxx FATAL org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.ExceptionInInitializerError
    at com.google.inject.internal.cglib.reflect.$FastClassEmitter.<init>(FastClassEmitter.java:67)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.generateClass(FastClass.java:72)
    at com.google.inject.internal.cglib.core.$DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
    at com.google.inject.internal.cglib.core.$AbstractClassGenerator.create(AbstractClassGenerator.java:216)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.create(FastClass.java:64)
    at com.google.inject.internal.BytecodeGen.newFastClass(BytecodeGen.java:204)
    at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.<init>(ProviderMethod.java:256)
    at com.google.inject.internal.ProviderMethod.create(ProviderMethod.java:71)
    at com.google.inject.internal.ProviderMethodsModule.createProviderMethod(ProviderMethodsModule.java:275)
    at com.google.inject.internal.ProviderMethodsModule.getProviderMethods(ProviderMethodsModule.java:144)
    at com.google.inject.internal.ProviderMethodsModule.configure(ProviderMethodsModule.java:123)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:349)
    at com.google.inject.AbstractModule.install(AbstractModule.java:122)
    at com.google.inject.servlet.ServletModule.configure(ServletModule.java:52)
    at com.google.inject.AbstractModule.configure(AbstractModule.java:62)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements.getElements(Elements.java:110)
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138)
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at com.google.inject.Guice.createInjector(Guice.java:62)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:332)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:377)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1218)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1408)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @173f73e7
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:337)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:281)
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:197)
    at java.base/java.lang.reflect.Method.setAccessible(Method.java:191)
    at com.google.inject.internal.cglib.core.$ReflectUtils$2.run(ReflectUtils.java:56)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at com.google.inject.internal.cglib.core.$ReflectUtils.<clinit>(ReflectUtils.java:46)
    ... 29 more

Here is my core-site.xml:

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

mapred-site.xml:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

And yarn-site.xml:

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
    </property>
</configuration>

I can't tell what is causing these exceptions, so any help would be appreciated.

Edit: added hadoop-env.sh:

export JAVA_HOME=/usr/local/jdk-9
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_OS_TYPE=${HADOOP_OS_TYPE:-$(uname -s)}
case ${HADOOP_OS_TYPE} in
  Darwin*)
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.kdc= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf= "
  ;;
esac
export HADOOP_ROOT_LOGGER=DEBUG,console
export HADOOP_DAEMON_ROOT_LOGGER=DEBUG,RFA
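Note that JAVA_HOME above points at a JDK 9 install. A quick way to see which major Java version a given JAVA_HOME resolves to is to parse the first line of `"$JAVA_HOME/bin/java" -version` output, since Java 8 and earlier use the old `1.x` scheme while Java 9+ report the major version first. A minimal sketch (the `java_major_version` helper is hypothetical, not part of Hadoop):

```shell
# Extract the Java major version from the first line of `java -version` output,
# e.g. 'java version "1.8.0_181"' -> 8, 'java version "9.0.4"' -> 9.
java_major_version() {
  # $1 is the first line of `java -version` output
  ver=$(echo "$1" | sed -E 's/.*"([^"]+)".*/\1/')
  case "$ver" in
    1.*) echo "$ver" | cut -d. -f2 ;;   # old scheme: 1.8.0_181 -> 8
    *)   echo "$ver" | cut -d. -f1 ;;   # new scheme: 9.0.4 -> 9
  esac
}
```

Usage: `java_major_version "$("$JAVA_HOME/bin/java" -version 2>&1 | head -n1)"`.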

Solutions

  1. As @tk421 mentioned in the comments: Java 9 is not yet compatible with Hadoop 3 (and possibly any Hadoop version).

    https://issues.apache.org/jira/browse/HADOOP-11123

    I changed to Java 8u181 and both of them start now:

    hadoop@hadoop:/usr/local/hadoop$ sbin/start-all.sh
    WARNING: Attempting to start all Apache Hadoop daemons as hadoop in 10 seconds.
    WARNING: This is not a recommended production deployment configuration.
    WARNING: Use CTRL-C to abort.
    Starting namenodes on [localhost]
    Starting datanodes
    Starting secondary namenodes [hadoop]
    Starting resourcemanager
    Starting nodemanagers
    hadoop@hadoop:/usr/local/hadoop$ jps
    8756 SecondaryNameNode
    8389 NameNode
    9173 NodeManager
    9030 ResourceManager
    8535 DataNode
    9515 Jps
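The fix amounts to pointing hadoop-env.sh at a Java 8 install instead of JDK 9. A sketch, assuming the JDK 8 was unpacked to /usr/local/jdk1.8.0_181 (an assumed path; substitute your own):

    ```shell
    # hadoop-env.sh -- point Hadoop at a Java 8 JDK instead of /usr/local/jdk-9.
    # /usr/local/jdk1.8.0_181 is an assumed location; adjust to your install.
    export JAVA_HOME=/usr/local/jdk1.8.0_181
    export HADOOP_HOME=/usr/local/hadoop
    ```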
    
  2. My problem was that I was working with Hadoop on Java 11.

    So what I did was:

    1. rm /Library/Java/*

    2. Download Java 8 from https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

    3. Install the Java 8 JDK

    4. Fix JAVA_HOME in hadoop-env.sh

    5. stop-all.sh

    6. start-dfs.sh

    7. start-yarn.sh
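The steps above can be sketched as a shell session. This is a sketch, not tested commands: the /Library/Java path comes from step 1 (macOS), and `/usr/libexec/java_home` is the standard macOS helper for locating a JDK; double-check paths before running anything destructive:

    ```shell
    sudo rm -rf /Library/Java/*                        # 1: remove the Java 11 install
    # 2-3: download and run the JDK 8 installer from Oracle's site
    export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)  # 4: set this in hadoop-env.sh as well
    $HADOOP_HOME/sbin/stop-all.sh                      # 5: stop all daemons
    $HADOOP_HOME/sbin/start-dfs.sh                     # 6: restart HDFS
    $HADOOP_HOME/sbin/start-yarn.sh                    # 7: restart YARN
    ```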

  From https://stackoverflow.com/questions/44427653/hadoop-error-starting-resourcemanager-and-nodemanager by cc-by-sa and MIT license