Suppress pyspark warnings before creating a session


Working locally on my machine, I cannot suppress the pyspark warnings. I get the following messages and would like to silence them:

WARN Utils: Your hostname, [HOSTNAME] resolves to a loopback address: 127.0.1.1; using 192.168.26.41 instead (on interface [INTERFACE])
WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Here is what I tried in order to suppress the warnings; none of it worked:

from pyspark.sql import SparkSession

method_number = 1  # also tried 2 and 3

def get_spark_object():
    return SparkSession.builder.getOrCreate()

if method_number == 1:
    # Attempt 1: redirect stdout to a scratch file while the session is created
    import sys
    save_stdout = sys.stdout
    sys.stdout = open('trash', 'w')
    spark = get_spark_object()
    sys.stdout = save_stdout

elif method_number == 2:
    # Attempt 2: suppress Python-level warnings
    import warnings
    warnings.filterwarnings('ignore')
    warnings.simplefilter('ignore')
    spark = get_spark_object()

elif method_number == 3:
    # Attempt 3: the shutup package
    import shutup
    shutup.please()
    spark = get_spark_object()
1 Answer

These messages are written by the Spark JVM to its own stderr before any Python code runs, which is why redirecting sys.stdout or filtering with the warnings module has no effect. Here is an alternative approach that may help suppress these specific warnings:

from pyspark.sql import SparkSession
import os

# Bind explicitly to localhost so Spark stops warning about the
# hostname resolving to a loopback address
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"
# Pass extra submit options; PYSPARK_SUBMIT_ARGS must end with "pyspark-shell"
# or the gateway will fail to launch
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--conf spark.ui.showConsoleProgress=false pyspark-shell"
)

# Create the Spark session
spark = SparkSession.builder.appName("yourAppName").getOrCreate()

This way you do not need to redirect or filter anything in Python; instead, you configure Spark itself so that certain warnings are never emitted.
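
The remaining startup noise (for example the NativeCodeLoader warning) is produced through Spark's log4j logging, so it can also be silenced with a log4j configuration rather than from Python. A minimal sketch, assuming Spark 3.3+ (which uses log4j2) and a standard $SPARK_HOME layout; the keys below mirror Spark's bundled log4j2.properties.template with the root level lowered:

# $SPARK_HOME/conf/log4j2.properties
# Raise the threshold so WARN-level startup messages are dropped
rootLogger.level = error
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

With the root level at error, the WARN lines from Utils and NativeCodeLoader no longer reach the console. Once a session exists, spark.sparkContext.setLogLevel("ERROR") adjusts the level per session, as the startup message itself suggests.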
