Sending Spark Streaming data back to the client

Problem description · Votes: 0 · Answers: 1

I am new to Apache Spark Streaming. I am building a Spark Streaming application that finds the shortest path and sends that path back to the client. I have already written the data-processing code, but I have a problem: how can I send my result back to the client? Here is my code.

import networkx as nx
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext

TCP_IP = "127.0.0.1"
TCP_PORT = 5000

# Create a Spark configuration
conf = SparkConf()
conf.setAppName('ShortestPathApp')

sc = SparkContext(conf=conf)       # pass the configuration by keyword
ssc = StreamingContext(sc, 2)      # 2-second batch interval

def shortestPath(line):
    # unpack the values from the record
    vehicleId = line[0]
    source = line[1]
    destination = line[2]
    deadline = line[3]

    # find the shortest path; G is assumed to be a pre-built networkx graph
    shortest = nx.dijkstra_path(G, source, destination)
    return shortest

# receive records from the socket
dataStream = ssc.socketTextStream(TCP_IP, TCP_PORT)
vehicle_data = dataStream.map(lambda line: line.split(" "))
vehicle_data.foreachRDD(lambda rdd: rdd.foreach(shortestPath))

ssc.start()
ssc.awaitTermination()
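
The code above relies on a graph G that is never defined in the snippet; presumably it is a networkx road graph built in advance. A minimal, purely hypothetical construction could look like this:

# Hypothetical road network; the real application would build G from actual map data
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 4.0),
    ("B", "C", 2.5),
    ("A", "C", 7.0),
])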

How can I send the data back to the client?

python python-3.x apache-spark pyspark spark-streaming
1 Answer

2 votes

To push the output data back to the destination as a stream with the StreamingContext, you can create a method like the following.

from pyspark.sql import SparkSession

# Lazily instantiated global instance of SparkSession
def getSparkSessionInstance(sparkConf):
    if "sparkSessionSingletonInstance" not in globals():
        globals()["sparkSessionSingletonInstance"] = SparkSession \
            .builder \
            .config(conf=sparkConf) \
            .getOrCreate()
    return globals()["sparkSessionSingletonInstance"]

sparkSess = getSparkSessionInstance(rdd.context.getConf())
vehicle_data_df = sparkSess.createDataFrame(vehicle_data)

# TCP_OUTPUT_IP / TCP_OUTPUT_PORT: address and port of the output socket
vehicle_data_df.writeStream \
    .format("socket") \
    .option("host", TCP_OUTPUT_IP) \
    .option("port", TCP_OUTPUT_PORT) \
    .outputMode('append') \
    .start() \
    .awaitTermination()
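
Since vehicle_data in the question is a DStream rather than a Structured Streaming DataFrame, another option is to send each batch's results back over a plain TCP connection from inside foreachRDD. Below is a minimal sketch under a few assumptions: the client listens on a hypothetical REPLY_IP/REPLY_PORT, every record has exactly four fields, the send_result helper is made up for illustration, and the networkx graph G from the question is available on the executors.

import socket

REPLY_IP = "127.0.0.1"   # hypothetical address the client listens on for results
REPLY_PORT = 6000        # hypothetical port the client listens on for results

def send_result(path):
    # open a short-lived TCP connection and send one result line back
    with socket.create_connection((REPLY_IP, REPLY_PORT)) as sock:
        sock.sendall((" ".join(path) + "\n").encode("utf-8"))

def process_partition(records):
    # runs on the executors; one connection per computed path
    for vehicleId, source, destination, deadline in records:
        path = nx.dijkstra_path(G, source, destination)
        send_result(path)

vehicle_data.foreachRDD(lambda rdd: rdd.foreachPartition(process_partition))

Using foreachPartition keeps the per-record work on the executors; if the connection overhead matters, one connection could be opened per partition and reused for all records in it.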