Unable to suppress PySpark warnings

Question · votes: 0 · answers: 2

I'm having some trouble suppressing PySpark warnings, specifically from the pandas API on Spark. What I currently have:

import warnings
warnings.simplefilter(action='ignore', category=Warning)
warnings.filterwarnings("ignore")
import numpy as np
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
import pyspark.pandas as ps

%%capture
spark = SparkSession.builder\
    .master("local[32]")\
    .config("spark.driver.memory", "150g")
    .config("spark.driver.maxResultSize", "40g")\
    .config("spark.python.worker.memory", "1g")\
    .config("spark.num.executors","(3x-2)")\
    .config("spark.num.executor.cores","5")\
    .config("spark.driver.cores", "5")\
    .appName("Analysis")\
    .getOrCreate()
spark.sparkContext.setLogLevel("OFF")

Next comes the actual data analysis:

spark.catalog.clearCache()
enc = ps.read_parquet("/example_path/")
enc.columns = [i.lower() for i in enc.columns]
print(enc.en_end_date.min())
print(enc.en_end_date.max())
enc['year'] = enc.en_end_date.apply(lambda x: x.strftime('%Y') if pd.notnull(x) else np.nan)
enc['month'] = enc.en_end_date.apply(lambda x: x.strftime('%m') if pd.notnull(x) else np.nan)
enc['day'] = enc.en_end_date.apply(lambda x: x.strftime('%d') if pd.notnull(x) else np.nan)
enc[(enc.year >= '2024') & (enc.month >= '01') & (enc.day >= '01')]

This is where the actual problem happens. I get absolutely bombarded with:

/example/miniconda/lib/python3.8/site-packages/pyspark/python/lib/pyspark.zip/pyspark/pandas/internal.py:1573: FutureWarning: iteritems is deprecated and will be removed in a future version. Use .items instead.
/example/miniconda/lib/python3.8/site-packages/pyspark/python/lib/pyspark.zip/pyspark/pandas/internal.py:1573: FutureWarning: iteritems is deprecated and will be removed in a future version. Use .items instead.
/example/miniconda/lib/python3.8/site-packages/pyspark/python/lib/pyspark.zip/pyspark/pandas/internal.py:1573: FutureWarning: iteritems is deprecated and will be removed in a future version. Use .items instead.
/example/miniconda/lib/python3.8/site-packages/pyspark/python/lib/pyspark.zip/pyspark/pandas/internal.py:1573: FutureWarning: iteritems is deprecated and will be removed in a future version. Use .items instead.

Hundreds of times. I just want to turn it off. Any suggestions?

python pandas apache-spark pyspark suppress-warnings
2 Answers
0
votes

For anyone running into this: roll back your pandas version until the warnings stop. Unfortunately there is no other way to suppress it.
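
To make that concrete, here is a minimal sketch of the rollback, assuming the FutureWarning comes from pandas >= 1.5, where DataFrame.iteritems was deprecated before being removed in 2.0; the exact pin may differ for your setup:

# From the shell, pin pandas below the deprecating release, e.g.:
#     pip install "pandas<1.5"
# Then confirm which version the driver actually imports:
import pandas as pd
print(pd.__version__)  # should be below the version that emits the warning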


0
votes

It can also be suppressed by using the warnings module to ignore FutureWarning:

import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)

For me this worked in Azure Synapse, but it may depend on your specific configuration.
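
If the blanket filter doesn't take, a more targeted variant using the same stdlib warnings API is sketched below. The message pattern is taken from the warning text above, and the PYTHONWARNINGS line is an assumption: Spark's Python workers run their own interpreters, so a filter installed only in the driver may not reach them, while an environment variable set before the session starts can.

import os
import warnings

# Silence only the specific pyspark.pandas deprecation seen in the question.
warnings.filterwarnings(
    "ignore",
    category=FutureWarning,
    message=r".*iteritems is deprecated.*",
)

# Export the same filter to any Python worker processes spawned later
# (set this before building the SparkSession).
os.environ["PYTHONWARNINGS"] = "ignore::FutureWarning"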
