Get the table name from a Spark DataFrame

Votes: 1 · Answers: 3

If I create a DataFrame like this:

df = spark.table("tblName")

Is there any way to get tblName back from df?

apache-spark pyspark
3 Answers
0 votes

You can extract it from the logical plan:

df.logicalPlan().argString().replace("`","")
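
In PySpark, the logical plan is only reachable through the DataFrame's internal JVM handle, so a rough equivalent might look like the sketch below. Note that _jdf and the plan's string form are internal APIs that vary between Spark versions, so treat this as illustrative rather than a stable recipe:

# Illustrative only: _jdf and the plan's textual form are internal APIs.
df = spark.table("tblName")

# The logical plan's string form contains the relation name,
# e.g. something like "'UnresolvedRelation `tblName`".
plan_text = df._jdf.queryExecution().logical().toString()
print(plan_text.replace("`", ""))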

-1 votes

You can create a table or view from df. However, if the table is a local temporary view or a global temporary view, you should drop it before creating one with the same name (sqlContext.dropTempTable), or use the create-or-replace variants (df.createOrReplaceTempView or df.createOrReplaceGlobalTempView); a short create-or-replace sketch follows the session below. If the table was registered with registerTempTable, you can register it again under the same name without getting an error.

#Create data frame
>>> d = [('Alice', 1)]
>>> test_df = spark.createDataFrame(sc.parallelize(d), ['name','age'])
>>> test_df.show()
+-----+---+
| name|age|
+-----+---+
|Alice|  1|
+-----+---+

#create tables
>>> test_df.createTempView("tbl1")
>>> test_df.registerTempTable("tbl2")
>>> sqlContext.tables().show()
+--------+---------+-----------+
|database|tableName|isTemporary|
+--------+---------+-----------+
|        |     tbl1|       true|
|        |     tbl2|       true|
+--------+---------+-----------+

#create data frame from tbl1
>>> df = spark.table("tbl1")
>>> df.show()
+-----+---+
| name|age|
+-----+---+
|Alice|  1|
+-----+---+

#create tbl1 again using the df data frame. It will raise an error
>>> df.createTempView("tbl1")
    raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: "Temporary view 'tbl1' already exists;"

#drop and create again
>>> sqlContext.dropTempTable('tbl1')
>>> df.createTempView("tbl1")
>>> spark.sql('select * from tbl1').show()
+-----+---+
| name|age|
+-----+---+
|Alice|  1|
+-----+---+


#create data frame from tbl2 and replace the name value
>>> df = spark.table("tbl2")
>>> df = df.replace('Alice', 'Bob')
>>> df.show()
+----+---+
|name|age|
+----+---+
| Bob|  1|
+----+---+

#create tbl2 again using the df data frame (no error this time)
>>> df.registerTempTable("tbl2")
>>> spark.sql('select * from tbl2').show()
+----+---+
|name|age|
+----+---+
| Bob|  1|
+----+---+
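
If the goal is simply to avoid the AnalysisException shown above, the create-or-replace variants re-register a view under the same name without dropping it first. A minimal sketch, reusing the test_df frame from the session above:

# createOrReplaceTempView replaces an existing temporary view instead of
# raising AnalysisException.
test_df.createOrReplaceTempView("tbl1")

# Global temporary views live in the global_temp database.
test_df.createOrReplaceGlobalTempView("tbl1_global")
spark.sql("select * from global_temp.tbl1_global").show()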

-1 votes

You can run explain on it to retrieve the physical plan, which contains the information needed to recover the original table name:

scala> val df = sqlContext.table("testtable") 
df: org.apache.spark.sql.DataFrame = [id: bigint, name: string, ssn: string]

scala> df.explain

== Physical Plan ==
Scan ParquetRelation: default.testtable[id#0L,name#1,ssn#2] InputPaths: hdfs://user/hive/warehouse/testtable

or

== Physical Plan ==
HiveTableScan [id#0L,name#1,ssn#2], MetastoreRelation hive_sample_db, testtable, None

Once you have the physical plan as a string, it is just a matter of string manipulation to pull out the original table name.
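
In PySpark, one way to get the plan as a string is to capture what df.explain() prints and then parse it. A rough sketch, where the regular expression is only illustrative since the exact plan text depends on the Spark version and the data source:

import re
from io import StringIO
from contextlib import redirect_stdout

# Capture df.explain()'s printed output as a string.
buf = StringIO()
with redirect_stdout(buf):
    df.explain()
plan = buf.getvalue()

# Look for a "database.table" style identifier in the scan node,
# e.g. "Scan ParquetRelation: default.testtable[...]".
match = re.search(r"(\w+)\.(\w+)\[", plan)
if match:
    print("database:", match.group(1), "table:", match.group(2))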
