Spring Data Cassandra 2.1.4 pagination

Problem description · 1 vote · 1 answer

According to the documentation: https://docs.spring.io/spring-data/cassandra/docs/2.1.4.RELEASE/reference/html/#repositories.limit-query-result

Spring Data Cassandra is supposed to make it easy to fetch paginated results, but I cannot get it to work.

Repositories, calls, and errors:

1. Reactive call

Repository:

public interface MyRepository extends ReactiveCassandraRepository<MyClass, String> {
  @Query("select * from my_keyspace.my_table where solr_query = ?0")
  Mono<Slice<MyClass>> findMono(String solrQuery, Pageable page);
}

Call:

Mono<Slice<MyClass>> result = repository.findMono(queryString, CassandraPageRequest.first(20));

Error:

"exceptionDescription": "org.springframework.core.codec.CodecException: Type definition error: [simple type, class com.datastax.driver.core.PagingState]; nested exception is com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class com.datastax.driver.core.PagingState and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: org.springframework.data.domain.SliceImpl[\"pageable\"]->org.springframework.data.cassandra.core.query.CassandraPageRequest[\"pagingState\"])",
"lines": [
  "org.springframework.http.codec.json.AbstractJackson2Encoder.encodeValue(AbstractJackson2Encoder.java:175)",
  "org.springframework.http.codec.json.AbstractJackson2Encoder.lambda$encode$0(AbstractJackson2Encoder.java:122)",
  "reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:100)",
  "reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:67)",
  "reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:114)",
  "reactor.core.publisher.FluxDefaultIfEmpty$DefaultIfEmptySubscriber.onNext(FluxDefaultIfEmpty.java:92)",
  "reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1476)",
  "reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:241)",
  "reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:121)",
  "reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1476)",
  "reactor.core.publisher.MonoCollectList$MonoBufferAllSubscriber.onComplete(MonoCollectList.java:118)",
  "reactor.core.publisher.FluxTake$TakeFuseableSubscriber.onComplete(FluxTake.java:424)",
  "reactor.core.publisher.FluxTake$TakeFuseableSubscriber.onNext(FluxTake.java:404)",
  "reactor.core.publisher.FluxIterable$IterableSubscription.fastPath(FluxIterable.java:311)",
  "reactor.core.publisher.FluxIterable$IterableSubscription.request(FluxIterable.java:198)"
]

2. Using ReactiveSortingRepository

Repository:

public interface LocationRepository extends ReactiveSortingRepository<MyClass, String> {
}

Call:

 repository.findAll(CassandraPageRequest.first(20))

Error:

Compile error: method findAll cannot be applied to a CassandraPageRequest argument.

3. Simple call to get a Page

Repository:

public interface MyRepository extends CassandraRepository<MyClass, MyClassKey> {
  Page<MyClass> findByKeyTerminalIdAndSolrQuery(String solrQuery, Pageable page);
}

Error on startup:

Caused by: org.springframework.dao.InvalidDataAccessApiUsageException: Page queries are not supported. Use a Slice query.

4. Using PagingAndSortingRepository

Repository:

public interface MyRepository extends PagingAndSortingRepository<MyClass, MyClassKey> {

}

Call:

   Page<Vessel> vessels = repository.findAll(CassandraPageRequest.first(10));

Error:

org.springframework.data.mapping.PropertyReferenceException: No property findAll found for type MyClass!

java spring-boot cassandra pagination spring-data-cassandra
1 Answer

1 vote

Welcome to Stack Overflow.

Your first example is the right approach:

public interface MyRepository extends ReactiveCassandraRepository<MyClass, String> {
  @Query("select * from my_keyspace.my_table where solr_query = ?0")
  Mono<Slice<MyClass>> findMono(String solrQuery, Pageable page);
}

Mono<Slice<MyClass>> result = repository.findMono(queryString, CassandraPageRequest.first(20));

The problem is that Jackson cannot encode SliceImpl (the implementation of Slice), because you are passing it to WebFlux (as the stack trace shows). The query itself produces the correct result, but if you want to encode it as JSON you need to pass the Slice content, not the Slice itself.
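A minimal self-contained sketch of that fix, using hypothetical stand-in types (the real Slice lives in Spring Data, and its Pageable is what drags in the non-serializable PagingState; every name below is invented for illustration): map the Slice to a plain DTO carrying the content plus the paging state rendered as a string, before the result reaches the JSON encoder.

```java
import java.util.List;

// Hypothetical stand-in for Spring Data's Slice: the rows plus the driver's
// paging state, here already rendered as an opaque string.
record SliceStandIn<T>(List<T> content, String pagingState) {}

// A plain DTO that Jackson can serialize without trouble: the slice content
// plus a token the client can send back to request the next slice.
record PageResponse<T>(List<T> items, String nextPage) {}

public class SliceMapping {
    // Map the repository result to the DTO before handing it to WebFlux,
    // e.g. repository.findMono(q, page).map(SliceMapping::toResponse)
    static <T> PageResponse<T> toResponse(SliceStandIn<T> slice) {
        return new PageResponse<>(slice.content(), slice.pagingState());
    }

    public static void main(String[] args) {
        SliceStandIn<String> slice = new SliceStandIn<>(List.of("a", "b"), "0x0012af");
        PageResponse<String> resp = toResponse(slice);
        System.out.println(resp.items().size() + " " + resp.nextPage());
    }
}
```

The design point is simply that the response type exposes only Jackson-friendly values (a list and a string), so nothing in the reference chain reaches PagingState.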

On a related note: ReactiveCassandraRepository does not extend ReactiveSortingRepository, because Cassandra queries with a Sort parameter always require a WHERE clause. If you look at ReactiveSortingRepository, you will see a findAll(Sort) method that takes no filter criteria:

public interface ReactiveSortingRepository<T, ID> extends ReactiveCrudRepository<T, ID> {
    Flux<T> findAll(Sort sort);
}
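Regarding the third example's error ("Page queries are not supported. Use a Slice query."): a Page needs a total element count, which Cassandra cannot compute cheaply, whereas a Slice only needs the next N rows plus a resume token. A plain-Java simulation of slice-style paging, with no Spring or Cassandra involved and all names invented for illustration:

```java
import java.util.List;

public class SliceDemo {
    // A slice-style result: the rows plus an opaque token for the next fetch.
    // A null token means there is no further data.
    record SliceResult(List<String> rows, Integer nextToken) {}

    // Fetch up to 'limit' rows starting at the position encoded in 'token'.
    // No total count is ever computed - that is exactly what distinguishes
    // a Slice from a Page.
    static SliceResult fetchSlice(List<String> table, Integer token, int limit) {
        int start = (token == null) ? 0 : token;
        int end = Math.min(start + limit, table.size());
        List<String> rows = table.subList(start, end);
        Integer next = (end < table.size()) ? end : null;
        return new SliceResult(rows, next);
    }

    public static void main(String[] args) {
        List<String> table = List.of("r1", "r2", "r3", "r4", "r5");
        SliceResult first = fetchSlice(table, null, 2);               // r1, r2
        SliceResult second = fetchSlice(table, first.nextToken(), 2); // r3, r4
        SliceResult last = fetchSlice(table, second.nextToken(), 2);  // r5
        System.out.println(first.rows() + " " + second.rows() + " " + last.rows());
    }
}
```

In the real API the token role is played by the driver's PagingState inside CassandraPageRequest; declaring the repository method to return Slice<MyClass> instead of Page<MyClass> avoids the startup error.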