spark_apply Cannot run program "Rscript": in directory "C:\Users\username\AppData\Local\spark\spark-2.3.3-bin-hadoop2.7\tmp\local\spark-...\userFiles"

Problem description

Following the first instructions about spark_apply in the book Mastering Apache Spark with R, on a local cluster under Windows using RGui, running:

install.packages("sparklyr")
install.packages("pkgconfig")
library(sparklyr)
library(dplyr)   # library() loads one package per call; library(dplyr, sparklyr) does not load both
spark_install("2.3")
# Installing Spark 2.3.3 for Hadoop 2.7 or later.
spark_installed_versions()
sc <- spark_connect(master = "local", version = "2.3.3")
cars <- copy_to(sc, mtcars)
cars %>% spark_apply(~round(.x))

returns the following error:

spark_apply Cannot run program "Rscript": in directory "C:\Users\username\AppData\Local\spark\spark-2.3.3-bin-hadoop2.7\tmp\local\spark-..\userFiles-..
CreateProcess error=2, The file specified can't be found

How should sparklyr be installed correctly, and how can this error be fixed?

r apache-spark sparklyr
1 Answer

The Spark nodes need the Rscript executable on their path. For the master node, it is possible to set the path to the Rscript executable explicitly.
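A minimal sketch of one way to do this, using Spark's documented `spark.r.command` property via `spark_config()`; the Rscript path shown is an example and must be adjusted to the actual R installation on the machine:

    library(sparklyr)

    # Tell Spark where Rscript lives instead of relying on PATH.
    # The path below is an assumption -- replace it with the bin
    # directory of your own R installation.
    config <- spark_config()
    config[["spark.r.command"]] <- "C:\\Program Files\\R\\R-3.6.3\\bin\\Rscript.exe"

    sc <- spark_connect(master = "local", version = "2.3.3", config = config)

Alternatively, adding R's `bin` directory to the Windows PATH environment variable makes Rscript visible to every Spark process without any sparklyr configuration.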
