I'm using Databricks notebooks on Azure, and I have a perfectly good PySpark notebook that ran fine all day yesterday. But at the end of the day I noticed I was getting some strange errors on code that I know worked before: org.apache.spark.SparkException: Job aborted due to stage failure: Task from application
Since it was already late, I left it until today. Today I tried creating a new cluster to run the code, and this time it just says my job was "Cancelled".
In fact, I tried running just one line of code:
filePath = "/SalesData.csv"
and even that got cancelled.
EDIT:
Here is the stderr log from Azure:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
/databricks/python/lib/python3.5/site-packages/IPython/config/loader.py:38: UserWarning: IPython.utils.traitlets has moved to a top-level traitlets package.
from IPython.utils.traitlets import HasTraits, List, Any, TraitError
Fri Jan 4 16:51:08 2019 py4j imported
Fri Jan 4 16:51:08 2019 Python shell started with PID 2543 and guid 86405138b8744987a1df085e4454bb5d
Could not launch process The 'config' trait of an IPythonShell instance must be a Config, but a value of class 'IPython.config.loader.Config' (i.e. {'HistoryManager': {'hist_file': ':memory:'}, 'HistoryAccessor': {'hist_file': ':memory:'}}) was specified. Traceback (most recent call last):
File "/tmp/1546620668035-0/PythonShell.py", line 1048, in <module>
launch_process()
File "/tmp/1546620668035-0/PythonShell.py", line 1036, in launch_process
console_buffer, error_buffer)
File "/tmp/1546620668035-0/PythonShell.py", line 508, in __init__
self.shell = self.create_shell()
File "/tmp/1546620668035-0/PythonShell.py", line 617, in create_shell
ip_shell = IPythonShell.instance(config=config, user_ns=user_ns)
File "/databricks/python/lib/python3.5/site-packages/traitlets/config/configurable.py", line 412, in instance
inst = cls(*args, **kwargs)
File "/databricks/python/lib/python3.5/site-packages/IPython/terminal/embed.py", line 159, in __init__
super(InteractiveShellEmbed,self).__init__(**kw)
File "/databricks/python/lib/python3.5/site-packages/IPython/terminal/interactiveshell.py", line 455, in __init__
super(TerminalInteractiveShell, self).__init__(*args, **kwargs)
File "/databricks/python/lib/python3.5/site-packages/IPython/core/interactiveshell.py", line 622, in __init__
super(InteractiveShell, self).__init__(**kwargs)
File "/databricks/python/lib/python3.5/site-packages/traitlets/config/configurable.py", line 84, in __init__
self.config = config
File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 583, in __set__
self.set(obj, value)
File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 557, in set
new_value = self._validate(obj, value)
File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 589, in _validate
value = self.validate(obj, value)
File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 1681, in validate
self.error(obj, value)
File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 1528, in error
raise TraitError(e)
traitlets.traitlets.TraitError: The 'config' trait of an IPythonShell instance must be a Config, but a value of class 'IPython.config.loader.Config' (i.e. {'HistoryManager': {'hist_file': ':memory:'}, 'HistoryAccessor': {'hist_file': ':memory:'}}) was specified.
My team and I ran into this problem after installing the azureml['notebooks']
Python package on our cluster. The install appeared to succeed, but we then got the "Cancelled" message when trying to run code cells.
We also got errors in our logs similar to the one in this post:
The 'config' trait of an IPythonShell instance must be a Config,
but a value of class 'IPython.config.loader.Config'...
It seems some Python packages may conflict with, or be incompatible with, this Config object. We uninstalled the library, restarted the cluster, and everything worked. Hope this helps someone :)
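When a library install causes this kind of conflict, one quick way to investigate is to ask Python where it would actually load a module from, since a stale or shadowing copy left behind by an install is a common culprit. This is just a diagnostic sketch (the stdlib module used in the example is only a stand-in; on the cluster you would query the module named in the traceback):

```python
import importlib.util

def module_origin(name):
    """Return the file a module would be imported from, or None if it is absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# On the cluster, checking "IPython.config.loader" (the deprecated shim named
# in the traceback above) would show whether a stale copy is still on the path.
print(module_origin("json"))  # stdlib example; prints the path json loads from
```

If the reported path points into an unexpected site-packages directory, that package is a good candidate for uninstalling before restarting the cluster.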
OK, I ended up creating yet another new cluster, and it now seems to work. The only thing I did differently was that on the previous cluster I set the maximum number of nodes it could scale to at 5, whereas this time I left it at the default of 8.
But I don't know whether that really made a difference, especially given that yesterday's errors occurred on a cluster that had previously been working fine, and that today's error happened while executing a very simple piece of code.
It sounds like your cluster may have gotten into a bad state and needed to be restarted. Sometimes the underlying VM services can also go bad, and you need to start a new cluster with fresh nodes. If you can't execute code, always start by restarting the cluster.
It looks like there is a problem with the installed IPython package version. What solved it for us was downgrading IPython:
Clusters (left pane) > click your cluster > Libraries > Install New > PyPI > in the Package field, enter: "ipython==3.2.3" > Install
Then restart the cluster.
In addition, Databricks seems to have a similar problem with the NumPy package, which happened to us right after fixing IPython. If it happens to you as well, try downgrading to numpy==1.15.0 the same way as with IPython.
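A lightweight way to catch these version regressions early is to compare the installed version against a known-good pin at the top of the notebook and fail fast instead of hitting a mysterious "Cancelled" later. This is a minimal sketch using the pins mentioned above as examples; it only handles simple dotted versions (for pre-release tags you would want a real parser such as packaging.version.parse):

```python
def version_tuple(v):
    """Parse a simple dotted version string like '1.15.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def at_or_below(installed, pin):
    """True if the installed version is at or below the known-good pin."""
    return version_tuple(installed) <= version_tuple(pin)

# Mirroring the downgrades above (numpy==1.15.0, ipython==3.2.3):
print(at_or_below("1.15.0", "1.15.0"))  # True  - matches the pin
print(at_or_below("1.16.1", "1.15.0"))  # False - newer than the known-good pin
```

On the cluster you would feed in the actual installed version, e.g. numpy.__version__, and raise an error (or trigger the downgrade) when the check fails.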