Can't read ByteBuffer / ByteArray or Blob in Kotlin R2DBC CoroutineCrudRepository #1726

Open
pete-setchell-kubra opened this issue Jan 26, 2024 · 1 comment
Labels: status: waiting-for-triage (An issue we've not yet triaged)

Comments

pete-setchell-kubra (Contributor) commented Jan 26, 2024

When using Kotlin and the CoroutineCrudRepository, I expect the Postgres bytea type to translate to a ByteBuffer, and in a debugger I can see it is already returned as a java.nio.HeapByteBuffer in the query response. Since HeapByteBuffer extends ByteBuffer, I don't see why this shouldn't work.

import java.nio.ByteBuffer
import org.springframework.data.r2dbc.repository.Query
import org.springframework.data.repository.kotlin.CoroutineCrudRepository

interface AssetRepository : CoroutineCrudRepository<Asset, Long> {
    @Query("SELECT '\\xDEADBEEF'::bytea")
    suspend fun getDeadBeef(): ByteBuffer
}

Test:

@Test
fun testByteBufferRead() {
    runBlocking {
        val buf = assetRepository.getDeadBeef()
    }
}

Throws:

org.springframework.data.mapping.MappingException: Expected to read Document RowDocumentAccessor[document=RowDocument{bytea=java.nio.HeapByteBuffer[pos=0 lim=4 cap=4]}] into type class java.nio.ByteBuffer but didn't find a PersistentEntity for the latter

	at _COROUTINE._BOUNDARY._(CoroutineDebugging.kt:46)
	at com.r2dbcpostgisbug.repro.repository.AssetRepositoryTest$testByteBufferRead$1.invokeSuspend(AssetRepositoryTest.kt:47)
Caused by: org.springframework.data.mapping.MappingException: Expected to read Document RowDocumentAccessor[document=RowDocument{bytea=java.nio.HeapByteBuffer[pos=0 lim=4 cap=4]}] into type class java.nio.ByteBuffer but didn't find a PersistentEntity for the latter
	at org.springframework.data.relational.core.conversion.MappingRelationalConverter.readAggregate(MappingRelationalConverter.java:344)
	at org.springframework.data.relational.core.conversion.MappingRelationalConverter.readAggregate(MappingRelationalConverter.java:311)
	at org.springframework.data.relational.core.conversion.MappingRelationalConverter.read(MappingRelationalConverter.java:298)
	at org.springframework.data.relational.core.conversion.MappingRelationalConverter.read(MappingRelationalConverter.java:294)
	at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.read(MappingR2dbcConverter.java:110)
	at org.springframework.data.r2dbc.convert.EntityRowMapper.apply(EntityRowMapper.java:42)
	at org.springframework.data.r2dbc.convert.EntityRowMapper.apply(EntityRowMapper.java:29)
	at org.springframework.data.r2dbc.core.R2dbcEntityTemplate.lambda$getRowsFetchSpec$16(R2dbcEntityTemplate.java:846)
	at io.r2dbc.postgresql.PostgresqlResult.lambda$map$2(PostgresqlResult.java:129)
	at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:179)
	at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:670)
	at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:748)
	at reactor.core.publisher.FluxWindowPredicate$WindowFlux.onNext(FluxWindowPredicate.java:790)
	at reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:268)
	at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91)
	at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107)
	at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:880)
	at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:805)
	at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:163)
	at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:684)
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:936)
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:810)
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:716)
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:129)
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854)
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
	at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:294)
	at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:403)
	at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:426)
	at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:114)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:333)
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:454)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:1583)

Repro case here: https://github.com/pete-setchell-kubra/r2dbcrepro

spring-projects-issues added the status: waiting-for-triage (An issue we've not yet triaged) label on Jan 26, 2024
pete-setchell-kubra changed the title from "Can't read ByteBuffer / ByteArray or Blob in Kotlin CoroutineCrudRepository" to "Can't read ByteBuffer / ByteArray or Blob in Kotlin R2DBC CoroutineCrudRepository" on Jan 31, 2024
pete-setchell-kubra (Contributor, Author) commented

OK, this may just be user error. ByteBuffer does translate correctly as a field in a DTO; it just cannot be returned directly from a query method the way Int, String, etc. can.

mp911de self-assigned this Feb 5, 2024