[HADOOP] PySpark Install Error

I followed the instructions from several blog posts, including this one, this, and this, to install pyspark on my laptop. However, whenever I try to use pyspark from the terminal or a Jupyter notebook, I keep getting the following error.

I have installed all the required software, as shown at the bottom of this question.

I added the following to my .bashrc:

function sjupyter_init()
{
    # Set anaconda3 as python
    export PATH=~/anaconda3/bin:$PATH

    # Spark path (based on your computer)
    SPARK_HOME=/opt/spark
    export PATH=$SPARK_HOME:$PATH

    export PYTHONPATH=$SPARK_HOME/python:/home/khurram/anaconda3/bin/python3
    export PYSPARK_DRIVER_PYTHON="jupyter"
    export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
    export PYSPARK_PYTHON=python3
}

To launch a Jupyter notebook with pyspark, I run sjupyter_init in a terminal, followed by jupyter notebook.

Inside the notebook, the following runs without any errors:

import findspark
findspark.init('/opt/spark')
from pyspark.sql import SparkSession

But when I run the line below:

spark = SparkSession.builder.appName("test").getOrCreate() 

I get this error message:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/20 17:10:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/spark/python/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/opt/spark/python/pyspark/context.py", line 334, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/opt/spark/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/opt/spark/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/spark/python/pyspark/context.py", line 273, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/khurram/anaconda3/lib/python3.6/site-packages/py4j/java_gateway.py", line 1428, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/home/khurram/anaconda3/lib/python3.6/site-packages/py4j/protocol.py", line 320, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.ExceptionInInitializerError
        at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:373)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:236)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: linux-0he7: linux-0he7: Name or service not known
        at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
        at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:891)
        at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:884)
        at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:884)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.util.Utils$.localHostName(Utils.scala:941)
        at org.apache.spark.internal.config.package$.<init>(package.scala:204)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        ... 14 more
Caused by: java.net.UnknownHostException: linux-0he7: Name or service not known
        at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
        ... 23 more

Details of my setup:

OS:

OpenSuse Leap 42.2 64-bit

Java:

    khurram@linux-0he7:~> java -version
    openjdk version "1.8.0_151"

Scala:

    khurram@linux-0he7:~> scala -version
    Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.

Hadoop 3.0:

khurram@linux-0he7:~> echo $HADOOP_HOME
/opt/hadoop

Py4j:

khurram@linux-0he7:~> pip show py4j
Name: py4j
Version: 0.10.6
Summary: Enables Python programs to dynamically access arbitrary Java objects
Home-page: https://www.py4j.org/
Author: Barthelemy Dagenais
Author-email: barthelemy@infobart.com
License: BSD License
Location: /home/khurram/anaconda3/lib/python3.6/site-packages
Requires: 
khurram@linux-0he7:~> 

I have run chmod 777 on the hadoop and spark directories:

khurram@linux-0he7:~> ls -al /opt/
total 8
drwxr-xr-x 1 root    root   96 Jan 19 20:22 .
drwxr-xr-x 1 root    root  222 Jan 20 14:54 ..
lrwxrwxrwx 1 root    root   18 Jan 19 20:22 hadoop -> /opt/hadoop-3.0.0/
drwxrwxrwx 1 khurram users 126 Dec  8 19:42 hadoop-3.0.0
lrwxrwxrwx 1 root    root   30 Jan 19 19:40 spark -> /opt/spark-2.2.1-bin-hadoop2.7
drwxrwxrwx 1 khurram users 150 Jan 19 19:33 spark-2.2.1-bin-hadoop2.7
khurram@linux-0he7:~>

Contents of my hosts file:

khurram@linux-0he7:> cat /etc/hosts

127.0.0.1       localhost

# special IPv6 addresses
::1             localhost ipv6-localhost ipv6-loopback

fe00::0         ipv6-localnet

ff00::0         ipv6-mcastprefix
ff02::1         ipv6-allnodes
ff02::2         ipv6-allrouters
ff02::3         ipv6-allhosts

Solution

UnknownHostException is thrown at the very bottom of the stack trace:

    java.net.UnknownHostException: linux-0he7: Name or service not known

Given the shell prompt linux-0he7, I assume you are running in local mode, i.e. /etc/hosts does not include an entry for linux-0he7.

Adding

    127.0.0.1    linux-0he7

to /etc/hosts should fix the issue.
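
To verify the fix, a quick sanity check (a minimal sketch of my own, not part of the original answer) is to resolve the hostname from Python; socket.gethostbyname fails with the same "Name or service not known" error that Spark hits internally:

    import socket

    # Resolve the local hostname the same way Spark's driver does at startup.
    # Before the /etc/hosts fix, this raises socket.gaierror on linux-0he7.
    hostname = socket.gethostname()  # e.g. "linux-0he7"
    print(hostname, "->", socket.gethostbyname(hostname))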

You could also use spark.driver.bindAddress and spark.driver.host to make the driver use a particular host IP, as in the sketch below.
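
As an illustration (my own sketch, not from the original answer), binding the driver to the loopback address sidesteps hostname resolution entirely:

    import findspark
    findspark.init('/opt/spark')

    from pyspark.sql import SparkSession

    # spark.driver.bindAddress and spark.driver.host are standard Spark
    # configuration keys; pointing both at 127.0.0.1 keeps the driver from
    # having to resolve the machine's hostname (linux-0he7) at startup.
    spark = (SparkSession.builder
             .appName("test")
             .config("spark.driver.bindAddress", "127.0.0.1")
             .config("spark.driver.host", "127.0.0.1")
             .getOrCreate())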

On a side note, Hadoop 3.0.0 is not supported yet, so use Hadoop 2.x for the time being (your Spark build, spark-2.2.1-bin-hadoop2.7, is compiled against Hadoop 2.7, while HADOOP_HOME points at Hadoop 3.0.0).

From https://stackoverflow.com/questions/48359436/pyspark-install-error (CC BY-SA)