How to establish a region/DC-aware connection to Cassandra from Spark?


I am using spark-sql 2.4.1, spark-cassandra-connector_2.11-2.4.1.jar, and Java 8. When I try to fetch data from a table, I get:

java.io.IOException: Failed to write statements to keyspace1.model_vals. The
latest exception was
  An unexpected error occurred server side on cassandra-node1: com.google.common.util.concurrent.UncheckedExecutionException: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: org.apache.cassandra.exceptions.ReadTimeoutException: Operation timed out - received only 0 responses.

So how can I establish a region/DC-aware connection to the Cassandra DB from my Spark code?

YAML

Existing:

spring:
  data:
      cassandra:
        keyspace-name: raproduct
        contact-points:
                    - cassandra-node1
                    - cassandra-node2
        port: 9042 

Changed to:

spring:
  data:
      cassandra:
        connection:
          local_dc: southeast-1
        keyspace-name: raproduct
        contact-points:
                    - cassandra-node1
                    - cassandra-node2
        port: 9042 

Question

But the changed `local_dc` setting is not reflected/applied. How can this be done in spring-data?
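As a side note on the Spring side of the question: the `connection.local_dc` key shown above is not a property that Spring Boot's Cassandra starter recognizes, which would explain why it has no effect. In Spring Boot versions whose starter is built on the 4.x DataStax driver, the local datacenter is configured with the `spring.data.cassandra.local-datacenter` property instead. A minimal sketch, assuming such a Spring Boot version and reusing the datacenter name `southeast-1` from the config above (the name must match what Cassandra itself reports, e.g. in `nodetool status`):

```yaml
spring:
  data:
    cassandra:
      keyspace-name: raproduct
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042
      # Must exactly match the datacenter name known to Cassandra
      local-datacenter: southeast-1
```

On older starters (driver 3.x), there is no such property, and the load-balancing policy has to be set in a custom cluster/session configuration bean instead.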

apache-spark cassandra spring-data datastax-java-driver spring-data-cassandra
1 Answer
3 votes

Check the Spark Connector documentation, under Configuration Reference - Cassandra Connection Parameters. It looks like this can be done by setting the spark.cassandra.connection.local_dc property in the connection configuration:

val conf = new SparkConf(true)
        .set("spark.cassandra.connection.host", "192.168.1.10")
        .set("spark.cassandra.auth.username", "flynn")            
        .set("spark.cassandra.auth.password", "reindeerFlotilla82")
        .set("spark.cassandra.connection.local_dc", "encom_west1_dc")

val sc = new SparkContext("spark://192.168.1.133:7077", "test", conf)

Not sure what your connection configuration code looks like, but try setting the spark.cassandra.connection.local_dc property and see where that takes you.
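Since the question uses spark-sql 2.4.1, the same property can also be set through the `SparkSession` builder rather than a raw `SparkConf`. A minimal sketch, assuming the host `cassandra-node1` and datacenter name `southeast-1` from the question (the keyspace/table names are taken from the error message):

```scala
import org.apache.spark.sql.SparkSession

// Build a session whose Cassandra connections prefer the local datacenter.
// The local_dc value must exactly match the DC name Cassandra reports
// (e.g. in `nodetool status`).
val spark = SparkSession.builder()
  .appName("dc-aware-cassandra-read")
  .config("spark.cassandra.connection.host", "cassandra-node1")
  .config("spark.cassandra.connection.local_dc", "southeast-1")
  .getOrCreate()

// Read the table through the Cassandra data source; the connector picks up
// the connection settings from the session configuration above.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "keyspace1", "table" -> "model_vals"))
  .load()
```

With `local_dc` set, the connector's load-balancing policy treats nodes in that datacenter as local and routes requests to them first, which is what makes the connection DC-aware.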
