external elasticsearch does not work #563

Closed
haghverdimasoud opened this issue Dec 25, 2022 · 7 comments

@haghverdimasoud

Hi,
I got the latest version of the Moqui framework, which no longer includes moqui-elasticsearch, so I downloaded Elasticsearch OSS 7.10.2 (no JDK) and added it to runtime/elasticsearch.
Then I ran Moqui.
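
For anyone reproducing this, a quick way to see which node actually answers on port 9200 is a plain HTTP GET against its root endpoint; the name/version JSON it returns (like the one in the log below) identifies the responding node. A minimal diagnostic sketch in Groovy, using the JDK 11 HttpClient the logs show is available (not part of the original report):

import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// hypothetical diagnostic: ask whatever node is bound to 127.0.0.1:9200 to identify itself
HttpClient client = HttpClient.newHttpClient()
HttpRequest request = HttpRequest.newBuilder(URI.create('http://127.0.0.1:9200')).GET().build()
HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString())
println "HTTP ${response.statusCode()}: ${response.body()}"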

05:49:20.041 INFO main o.moqui.i.c.ElasticFacadeImpl Connected to ElasticSearch cluster default at http://127.0.0.1:9200 distribution elasticsearch version 7.10.2, ES earlier than 7.0? false
[name:MHAGHVERDI, cluster_name:elasticsearch, cluster_uuid:na, version:[number:7.10.2, build_flavor:oss, build_type:zip, build_hash:747e1cc71def077253878a59143c1f785afa92b9, build_date:2021-01-13T00:42:12.435326Z, build_snapshot:false, lucene_version:8.7.0, minimum_wire_compatibility_version:6.8.0, minimum_index_compatibility_version:6.0.0-beta1], tagline:You Know, for Search]
05:49:20.043 INFO main o.moqui.i.c.ElasticFacadeImpl Initializing ElasticSearchLogger with cluster default
[2022-12-25T17:19:23,059][INFO ][o.e.n.Node ] [MHAGHVERDI] version[7.10.2], pid[8756], build[oss/zip/747e1cc71def077253878a59143c1f785afa92b9/2021-01-13T00:42:12.435326Z], OS[Windows 10/10.0/amd64], JVM[Oracle Corporation/Java HotSpot(TM) 64-Bit Server VM/11.0.16.1/11.0.16.1+1-LTS-1]
[2022-12-25T17:19:23,070][INFO ][o.e.n.Node ] [MHAGHVERDI] JVM home [C:\Program Files\Java\jdk-11.0.16.1]
[2022-12-25T17:19:23,138][INFO ][o.e.n.Node ] [MHAGHVERDI] JVM arguments [-Des.networkaddress.cache.ttl=60, -Des.networkaddress.cache.negative.ttl=10, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dio.netty.allocator.numDirectArenas=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.locale.providers=SPI,COMPAT, -Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.io.tmpdir=C:\Users\MHAGHV~1\AppData\Local\Temp\elasticsearch, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=data, -XX:ErrorFile=logs/hs_err_pid%p.log, -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m, -XX:MaxDirectMemorySize=536870912, -Delasticsearch, -Des.path.home=C:\Users\mhaghverdi\IdeaProjects\moqui-framework\runtime\elasticsearch, -Des.path.conf=C:\Users\mhaghverdi\IdeaProjects\moqui-framework\runtime\elasticsearch\config, -Des.distribution.flavor=oss, -Des.distribution.type=zip, -Des.bundled_jdk=false]
[2022-12-25T17:19:27,555][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [aggs-matrix-stats]
[2022-12-25T17:19:27,556][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [analysis-common]
[2022-12-25T17:19:27,557][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [geo]
[2022-12-25T17:19:27,558][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [ingest-common]
[2022-12-25T17:19:27,560][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [ingest-geoip]
[2022-12-25T17:19:27,566][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [ingest-user-agent]
[2022-12-25T17:19:27,569][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [kibana]
[2022-12-25T17:19:27,570][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [lang-expression]
[2022-12-25T17:19:27,573][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [lang-mustache]
[2022-12-25T17:19:27,573][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [lang-painless]
[2022-12-25T17:19:27,574][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [mapper-extras]
[2022-12-25T17:19:27,575][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [parent-join]
[2022-12-25T17:19:27,576][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [percolator]
[2022-12-25T17:19:27,597][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [rank-eval]
[2022-12-25T17:19:27,599][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [reindex]
[2022-12-25T17:19:27,601][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [repository-url]
[2022-12-25T17:19:27,604][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] loaded module [transport-netty4]
[2022-12-25T17:19:27,606][INFO ][o.e.p.PluginsService ] [MHAGHVERDI] no plugins loaded
[2022-12-25T17:19:27,732][INFO ][o.e.e.NodeEnvironment ] [MHAGHVERDI] using [1] data paths, mounts [[(C:)]], net usable_space [25.5gb], net total_space [118.6gb], types [NTFS]
[2022-12-25T17:19:27,733][INFO ][o.e.e.NodeEnvironment ] [MHAGHVERDI] heap size [990.7mb], compressed ordinary object pointers [true]
[2022-12-25T17:19:29,425][INFO ][o.e.n.Node ] [MHAGHVERDI] node name [MHAGHVERDI], node ID [tEjoeM5ZReWLxXSozfEWlw], cluster name [elasticsearch], roles [master, remote_cluster_client, data, ingest]
[2022-12-25T17:19:38,148][INFO ][o.e.t.NettyAllocator ] [MHAGHVERDI] creating NettyAllocator with the following configs: [name=unpooled, suggested_max_allocation_size=1mb, factors={es.unsafe.use_unpooled_allocator=null, g1gc_enabled=false, g1gc_region_size=0b, heap_size=990.7mb}]
[2022-12-25T17:19:38,286][INFO ][o.e.d.DiscoveryModule ] [MHAGHVERDI] using discovery type [zen] and seed hosts providers [settings]
[2022-12-25T17:19:38,801][WARN ][o.e.g.DanglingIndicesState] [MHAGHVERDI] gateway.auto_import_dangling_indices is disabled, dangling indices will not be automatically detected or imported and must be managed manually
[2022-12-25T17:19:39,141][INFO ][o.e.n.Node ] [MHAGHVERDI] initialized
[2022-12-25T17:19:39,141][INFO ][o.e.n.Node ] [MHAGHVERDI] starting ...
[2022-12-25T17:19:41,375][INFO ][o.e.t.TransportService ] [MHAGHVERDI] publish_address {127.0.0.1:9305}, bound_addresses {127.0.0.1:9305}, {[::1]:9305}
[2022-12-25T17:19:41,703][WARN ][o.e.t.TcpTransport ] [MHAGHVERDI] exception caught on transport layer [Netty4TcpChannel{localAddress=/0:0:0:0:0:0:0:1:9305, remoteAddress=/0:0:0:0:0:0:0:1:53595}], closing connection
java.lang.IllegalStateException: transport not ready yet to handle incoming requests
at org.elasticsearch.transport.TransportService.onRequestReceived(TransportService.java:952) ~[elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.InboundHandler.handleRequest(InboundHandler.java:164) ~[elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.InboundHandler.messageReceived(InboundHandler.java:107) ~[elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.InboundHandler.inboundMessage(InboundHandler.java:89) ~[elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.TcpTransport.inboundMessage(TcpTransport.java:700) [elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.InboundPipeline.forwardFragments(InboundPipeline.java:142) [elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.InboundPipeline.doHandleBytes(InboundPipeline.java:117) [elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.InboundPipeline.handleBytes(InboundPipeline.java:82) [elasticsearch-7.10.2.jar:7.10.2]
at org.elasticsearch.transport.netty4.Netty4MessageChannelHandler.channelRead(Netty4MessageChannelHandler.java:74) [transport-netty4-client-7.10.2.jar:7.10.2]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:271) [netty-handler-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:615) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:578) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) [netty-transport-4.1.49.Final.jar:4.1.49.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) [netty-common-4.1.49.Final.jar:4.1.49.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.49.Final.jar:4.1.49.Final]
at java.lang.Thread.run(Thread.java:834) [?:?]
[2022-12-25T17:19:41,703][WARN ][o.e.t.TcpTransport ] [MHAGHVERDI] exception caught on transport layer [Netty4TcpChannel{localAddress=/127.0.0.1:9305, remoteAddress=/127.0.0.1:53596}], closing connection
[2022-12-25T17:19:41,766][WARN ][o.e.t.TcpTransport ] [MHAGHVERDI] exception caught on transport layer [Netty4TcpChannel{localAddress=/0:0:0:0:0:0:0:1:9305, remoteAddress=/0:0:0:0:0:0:0:1:53598}], closing connection
[2022-12-25T17:19:41,779][WARN ][o.e.t.TcpTransport ] [MHAGHVERDI] exception caught on transport layer [Netty4TcpChannel{localAddress=/127.0.0.1:9305, remoteAddress=/127.0.0.1:53597}], closing connection
[2022-12-25T17:19:41,812][WARN ][o.e.t.TcpTransport ] [MHAGHVERDI] exception caught on transport layer [Netty4TcpChannel{localAddress=/127.0.0.1:9305, remoteAddress=/127.0.0.1:53599}], closing connection
[2022-12-25T17:19:41,817][WARN ][o.e.t.TcpTransport ] [MHAGHVERDI] exception caught on transport layer [Netty4TcpChannel{localAddress=/0:0:0:0:0:0:0:1:9305, remoteAddress=/0:0:0:0:0:0:0:1:53600}], closing connection
(each with the same IllegalStateException and stack trace as above)
[2022-12-25T17:19:41,920][WARN ][o.e.b.BootstrapChecks ] [MHAGHVERDI] the default discovery settings are unsuitable for production use; at least one of [discovery.seed_hosts, discovery.seed_providers, cluster.initial_master_nodes] must be configured
[2022-12-25T17:19:41,939][INFO ][o.e.c.c.ClusterBootstrapService] [MHAGHVERDI] no discovery configuration found, will perform best-effort cluster bootstrapping after [3s] unless existing master is discovered
[2022-12-25T17:19:44,961][INFO ][o.e.c.c.Coordinator ] [MHAGHVERDI] setting initial configuration to VotingConfiguration{SY8xhFVoR3mSRWO4y0hsTQ,tEjoeM5ZReWLxXSozfEWlw,UhH8S58hQlKaB6t-GditxQ,60d9Cvu9SaaoWrc5K5G4LA,rPWEXxlpT6Ch6mrNeQVl-g,bXahzH1cTaGih5ZrIrlN8A}
05:49:50.076 ERROR main o.moqui.i.c.ElasticFacadeImpl Error checking and creating moqui_logs ES index, not starting ElasticSearchLogger
org.moqui.BaseException: Error calling HTTP request to http://127.0.0.1:9200/moqui_logs
at org.moqui.util.RestClient.callInternal(RestClient.java:330) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.util.RestClient.call(RestClient.java:277) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.indexExists(ElasticFacadeImpl.groovy:242) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.util.ElasticSearchLogger.init(ElasticSearchLogger.groovy:65) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.util.ElasticSearchLogger.(ElasticSearchLogger.groovy:59) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.init(ElasticFacadeImpl.groovy:116) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.(ElasticFacadeImpl.groovy:77) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ExecutionContextFactoryImpl.(ExecutionContextFactoryImpl.groovy:240) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722) ~[?:?]
at java.util.ServiceLoader$3.next(ServiceLoader.java:1395) ~[?:?]
at org.moqui.Moqui.loadData(Moqui.java:122) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at MoquiStart.main(MoquiStart.java:152) ~[moqui.war:?]
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Idle timeout 30000 ms
at org.eclipse.jetty.client.util.FutureResponseListener.getResult(FutureResponseListener.java:113) ~[moqui_temp16714313482777790755WEB-INF_lib_jetty-client-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.client.util.FutureResponseListener.get(FutureResponseListener.java:105) ~[moqui_temp16714313482777790755WEB-INF_lib_jetty-client-10.0.12.jar.:10.0.12]
at org.moqui.util.RestClient.callInternal(RestClient.java:319) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.util.RestClient.call(RestClient.java:277) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.indexExists(ElasticFacadeImpl.groovy:242) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.util.ElasticSearchLogger.init(ElasticSearchLogger.groovy:65) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.util.ElasticSearchLogger.(ElasticSearchLogger.groovy:59) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.init(ElasticFacadeImpl.groovy:116) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.(ElasticFacadeImpl.groovy:77) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ExecutionContextFactoryImpl.(ExecutionContextFactoryImpl.groovy:240) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722) ~[?:?]
at java.util.ServiceLoader$3.next(ServiceLoader.java:1395) ~[?:?]
at org.moqui.Moqui.loadData(Moqui.java:122) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
... 1 more
Caused by: java.util.concurrent.TimeoutException: Idle timeout 30000 ms
at org.eclipse.jetty.client.http.HttpConnectionOverHTTP.onIdleExpired(HttpConnectionOverHTTP.java:181) ~[moqui_temp16714313482777790755WEB-INF_lib_jetty-client-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.io.AbstractEndPoint.onIdleExpired(AbstractEndPoint.java:407) ~[moqui_temp18060177454987752830WEB-INF_lib_jetty-io-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.io.IdleTimeout.checkIdleTimeout(IdleTimeout.java:167) ~[moqui_temp18060177454987752830WEB-INF_lib_jetty-io-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.io.IdleTimeout.idleCheck(IdleTimeout.java:109) ~[moqui_temp18060177454987752830WEB-INF_lib_jetty-io-10.0.12.jar.:10.0.12]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]
[2022-12-25T17:19:51,946][WARN ][o.e.c.c.ClusterFormationFailureHelper] [MHAGHVERDI] master not discovered or elected yet, an election requires at least 4 nodes with ids from [SY8xhFVoR3mSRWO4y0hsTQ, tEjoeM5ZReWLxXSozfEWlw, UhH8S58hQlKaB6t-GditxQ, 60d9Cvu9SaaoWrc5K5G4LA, rPWEXxlpT6Ch6mrNeQVl-g, bXahzH1cTaGih5ZrIrlN8A], have discovered [{MHAGHVERDI}{tEjoeM5ZReWLxXSozfEWlw}{f2Yv4dNdSmmUbSuWNho-Vg}{127.0.0.1}{127.0.0.1:9305}{dimr}, {MHAGHVERDI}{60d9Cvu9SaaoWrc5K5G4LA}{XeEd8OgqQ8qYk5EY5ApOvA}{127.0.0.1}{127.0.0.1:9300}{dimr}, {MHAGHVERDI}{SY8xhFVoR3mSRWO4y0hsTQ}{rDwWLNmyREOG06_o5l4oVA}{127.0.0.1}{127.0.0.1:9301}{dimr}, {MHAGHVERDI}{UhH8S58hQlKaB6t-GditxQ}{jslf5SF9Tr-Wb219IIWuJQ}{127.0.0.1}{127.0.0.1:9302}{dimr}, {MHAGHVERDI}{rPWEXxlpT6Ch6mrNeQVl-g}{4xLOsftcT8CbP1qJob7uoQ}{127.0.0.1}{127.0.0.1:9303}{dimr}, {MHAGHVERDI}{bXahzH1cTaGih5ZrIrlN8A}{U0LePEgaQI6H9MX9SxlePA}{127.0.0.1}{127.0.0.1:9304}{dimr}] which is a quorum; discovery will continue using [127.0.0.1:9300, 127.0.0.1:9301, 127.0.0.1:9302, 127.0.0.1:9303, 127.0.0.1:9304, [::1]:9300, [::1]:9301, [::1]:9302, [::1]:9303, [::1]:9304] from hosts providers and [{MHAGHVERDI}{tEjoeM5ZReWLxXSozfEWlw}{f2Yv4dNdSmmUbSuWNho-Vg}{127.0.0.1}{127.0.0.1:9305}{dimr}] from last-known cluster state; node term 0, last-accepted version 0 in term 0
(the same master-not-discovered warning repeats at 17:20:01,952 and 17:20:11,969)
[2022-12-25T17:20:12,080][WARN ][o.e.n.Node ] [MHAGHVERDI] timed out while waiting for initial discovery state - timeout: 30s
[2022-12-25T17:20:13,031][INFO ][o.e.h.AbstractHttpServerTransport] [MHAGHVERDI] publish_address {127.0.0.1:9205}, bound_addresses {127.0.0.1:9205}, {[::1]:9205}
[2022-12-25T17:20:13,034][INFO ][o.e.n.Node ] [MHAGHVERDI] started
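
Two things stand out in the log above: the embedded node ended up publishing HTTP on 9205 and transport on 9305 because 9200-9204 and 9300-9304 were already taken, yet Moqui keeps calling http://127.0.0.1:9200 (see the errors below); and the voting configuration lists six node IDs, which suggests nodes from earlier runs are still alive or have left state behind in the data directory. A possible cleanup, offered as a suggestion rather than a confirmed fix: stop any stray Elasticsearch processes, clear runtime/elasticsearch/data, and for a single local development node pin the ports and disable multi-node discovery in runtime/elasticsearch/config/elasticsearch.yml:

# runtime/elasticsearch/config/elasticsearch.yml -- suggested single-node dev settings
discovery.type: single-node
http.port: 9200        # a fixed port fails fast instead of silently shifting to 9201+
transport.port: 9300

With discovery.type: single-node the node elects itself master immediately, so the "master not discovered" warnings and the 30s HTTP idle timeouts that follow from them should not occur.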
05:50:20.238 ERROR main o.moqui.i.c.ElasticFacadeImpl Error checking or indexing for all DataFeed with indexOnStartEmpty=Y
org.moqui.BaseException: Error calling HTTP request to http://127.0.0.1:9200/mantle
at org.moqui.util.RestClient.callInternal(RestClient.java:330) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.util.RestClient.call(RestClient.java:277) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.indexExists(ElasticFacadeImpl.groovy:242) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.init(ElasticFacadeImpl.groovy:139) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.(ElasticFacadeImpl.groovy:77) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ExecutionContextFactoryImpl.(ExecutionContextFactoryImpl.groovy:240) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722) ~[?:?]
at java.util.ServiceLoader$3.next(ServiceLoader.java:1395) ~[?:?]
at org.moqui.Moqui.loadData(Moqui.java:122) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at MoquiStart.main(MoquiStart.java:152) ~[moqui.war:?]
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Idle timeout 30000 ms
at org.eclipse.jetty.client.util.FutureResponseListener.getResult(FutureResponseListener.java:113) ~[moqui_temp16714313482777790755WEB-INF_lib_jetty-client-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.client.util.FutureResponseListener.get(FutureResponseListener.java:105) ~[moqui_temp16714313482777790755WEB-INF_lib_jetty-client-10.0.12.jar.:10.0.12]
at org.moqui.util.RestClient.callInternal(RestClient.java:319) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.util.RestClient.call(RestClient.java:277) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.indexExists(ElasticFacadeImpl.groovy:242) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.init(ElasticFacadeImpl.groovy:139) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl.(ElasticFacadeImpl.groovy:77) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ExecutionContextFactoryImpl.(ExecutionContextFactoryImpl.groovy:240) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722) ~[?:?]
at java.util.ServiceLoader$3.next(ServiceLoader.java:1395) ~[?:?]
at org.moqui.Moqui.loadData(Moqui.java:122) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
... 1 more
Caused by: java.util.concurrent.TimeoutException: Idle timeout 30000 ms
at org.eclipse.jetty.client.http.HttpConnectionOverHTTP.onIdleExpired(HttpConnectionOverHTTP.java:181) ~[moqui_temp16714313482777790755WEB-INF_lib_jetty-client-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.io.AbstractEndPoint.onIdleExpired(AbstractEndPoint.java:407) ~[moqui_temp18060177454987752830WEB-INF_lib_jetty-io-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.io.IdleTimeout.checkIdleTimeout(IdleTimeout.java:167) ~[moqui_temp18060177454987752830WEB-INF_lib_jetty-io-10.0.12.jar.:10.0.12]
at org.eclipse.jetty.io.IdleTimeout.idleCheck(IdleTimeout.java:109) ~[moqui_temp18060177454987752830WEB-INF_lib_jetty-io-10.0.12.jar.:10.0.12]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]

@haghverdimasoud (Author)

and .....

05:51:00.236 ERROR oquiWorker-2 o.moqui.i.a.XmlAction Error running groovy script (org.moqui.BaseException: Error calling HTTP request to http://127.0.0.1:9200/mantle):
1 : import static org.moqui.util.ObjectUtilities.*
2 : import static org.moqui.util.CollectionUtilities.*
3 : import static org.moqui.util.StringUtilities.*
4 : import java.sql.Timestamp
5 : // these are in the context by default: ExecutionContext ec, Map<String, Object> context, Map<String, Object> result
6 : elasticClient = (ec.factory.elastic.getClient(clusterName))
7 : if (elasticClient == null) {
8 : ec.message.addMessage(ec.resource.expand('''No Elastic Client found for cluster name ${clusterName}, not indexing documents''',''), "danger")
9 : return;
10 : }
11 :
12 : if (verifyIndexes) {
13 :
14 : // begin inline script
15 : elasticClient.verifyDataDocumentIndexes(documentList)
16 : // end inline script
17 : }
18 :
19 :
20 : // begin inline script
21 : elasticClient.bulkIndexDataDocument(documentList)
22 : // end inline script
23 : // make sure the last statement is not considered the return value
24 : return;
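
The failing statement is line 15 of the script above, elasticClient.verifyDataDocumentIndexes(documentList), which internally calls indexExists for the mantle index (see the checkCreateDataDocumentIndexes frames in the trace below). A minimal sketch to isolate that call, assuming a context such as a Moqui service script where the ExecutionContext ec is available (names taken from the traces in this issue):

// assumes 'ec' (org.moqui.context.ExecutionContext) is in scope, e.g. in a service script
def elasticClient = ec.factory.elastic.getClient('default')
if (elasticClient == null) {
    println 'no Elastic client configured for cluster "default"'
} else {
    // this is the call that times out with "Idle timeout 30000 ms" in the traces
    println "mantle index exists: ${elasticClient.indexExists('mantle')}"
}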

05:51:00.239 WARN oquiWorker-2 o.moqui.i.c.TransactionFacadeImpl Transaction set rollback only. The rollback was originally caused by: Error running service org.moqui.search.SearchServices.index#DataDocuments (Throwable)
org.moqui.BaseException: Error calling HTTP request to http://127.0.0.1:9200/mantle
at org.moqui.util.RestClient.callInternal(RestClient.java:330) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.util.RestClient.call(RestClient.java:277) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.indexExists(ElasticFacadeImpl.groovy:242) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.checkCreateDataDocumentIndexes(ElasticFacadeImpl.groovy:552) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.ElasticFacadeImpl$ElasticClientImpl.verifyDataDocumentIndexes(ElasticFacadeImpl.groovy:604) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org_moqui_search_SearchServices_index_DataDocuments.run(org_moqui_search_SearchServices_index_DataDocuments:15) ~[?:?]
at org.moqui.impl.actions.XmlAction.run(XmlAction.java:67) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.service.runner.InlineServiceRunner.runService(InlineServiceRunner.java:59) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.service.ServiceCallSyncImpl.callSingle(ServiceCallSyncImpl.java:322) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.service.ServiceCallSyncImpl.call(ServiceCallSyncImpl.java:125) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.entity.EntityDataFeed$FeedRunnable.feedDataDocument(EntityDataFeed.groovy:768) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.entity.EntityDataFeed$FeedRunnable.run(EntityDataFeed.groovy:543) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Idle timeout 30000 ms
at org.moqui.util.RestClient.callInternal(RestClient.java:319) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
... 14 more
Caused by: java.util.concurrent.TimeoutException: Idle timeout 30000 ms
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?]
... 3 more
05:51:00.240 WARN oquiWorker-2 o.moqui.i.c.TransactionFacadeImpl Transaction set rollback only for [Error running service org.moqui.search.SearchServices.index#DataDocuments (Throwable)]. Here is the current location:
org.moqui.BaseException: Set rollback only location
at org.moqui.impl.context.TransactionFacadeImpl.setRollbackOnly(TransactionFacadeImpl.groovy:498) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.context.TransactionFacadeImpl.rollback(TransactionFacadeImpl.groovy:449) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.service.ServiceCallSyncImpl.callSingle(ServiceCallSyncImpl.java:347) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.service.ServiceCallSyncImpl.call(ServiceCallSyncImpl.java:125) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.entity.EntityDataFeed$FeedRunnable.feedDataDocument(EntityDataFeed.groovy:768) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at org.moqui.impl.entity.EntityDataFeed$FeedRunnable.run(EntityDataFeed.groovy:543) ~[moqui_temp1069922374156913779WEB-INF_lib_moqui-framework-3.1.0-rc1.jar.:3.1.0-rc1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]
05:51:00.245 WARN oquiWorker-2 o.moqui.i.s.ServiceCallSyncImpl Error running service org.moqui.search.SearchServices.index#DataDocuments (Throwable) Artifact stack: org.moqui.search.SearchServices.index#DataDocuments
org.moqui.BaseException: Error calling HTTP request to http://127.0.0.1:9200/mantle
(same stack trace and Idle timeout causes as above)
05:51:00.248 ERROR oquiWorker-2 o.moqui.i.c.MessageFacadeImpl Error calling HTTP request to http://127.0.0.1:9200/mantle
05:51:00.248 ERROR oquiWorker-2 o.moqui.i.c.MessageFacadeImpl java.util.concurrent.TimeoutException: Idle timeout 30000 ms
05:51:00.248 ERROR oquiWorker-2 o.moqui.i.c.MessageFacadeImpl Idle timeout 30000 ms
05:51:00.249 ERROR oquiWorker-2 o.moqui.i.e.EntityDataFeed Error calling DataFeed MantleSearch service org.moqui.search.SearchServices.index#DataDocuments: Error calling HTTP request to http://127.0.0.1:9200/mantle
java.util.concurrent.TimeoutException: Idle timeout 30000 ms
Idle timeout 30000 ms

@haghverdimasoud (Author) commented:

And further errors followed:

org.moqui.BaseException: Error handling data load service call: Error calling HTTP request to http://127.0.0.1:9200/mantle
java.util.concurrent.TimeoutException: Idle timeout 30000 ms
Idle timeout 30000 ms

05:51:38.656 INFO main o.moqui.Moqui Loaded [8644] records in 78 seconds.
05:51:39.459 INFO main .moqui.i.c.ExecutionContextFactoryImpl ArtifactHitBins stored
05:51:39.459 INFO main .moqui.i.c.ExecutionContextFactoryImpl Shutting scheduled executor
05:51:39.460 INFO main .moqui.i.c.ExecutionContextFactoryImpl Shutting down worker pool
05:51:39.461 INFO main .moqui.i.c.ExecutionContextFactoryImpl Scheduled executor shut down and terminated
05:51:39.462 INFO main .moqui.i.c.ExecutionContextFactoryImpl Worker pool shut down and terminated
05:51:39.462 INFO main .moqui.i.c.ExecutionContextFactoryImpl Destroying ToolFactory: FOP
05:51:39.462 INFO main .moqui.i.c.ExecutionContextFactoryImpl Destroying ToolFactory: H2Server
05:51:39.462 INFO main .moqui.i.c.ExecutionContextFactoryImpl Destroying ToolFactory: MCache
05:51:39.528 INFO main .moqui.i.c.TransactionInternalBitronix Shutting down Bitronix
Facades destroyed
05:51:40.897 INFO main .moqui.i.c.ExecutionContextFactoryImpl Facades destroyed
Shut down H2 Server
Moqui ExecutionContextFactory Destroyed
[2022-12-25T17:21:42,132][WARN ][o.e.c.c.ClusterFormationFailureHelper] [MHAGHVERDI] master not discovered or elected yet, an election requires at least 4 nodes with ids from [SY8xhFVoR3mSRWO4y0hsTQ, tEjoeM5ZReWLxXSozfEWlw, UhH8S58hQlKaB6t-GditxQ, 60d9Cvu9SaaoWrc5K5G4LA, rPWEXxlpT6Ch6mrNeQVl-g, bXahzH1cTaGih5ZrIrlN8A], have discovered [{MHAGHVERDI}{tEjoeM5ZReWLxXSozfEWlw}{f2Yv4dNdSmmUbSuWNho-Vg}{127.0.0.1}{127.0.0.1:9305}{dimr}, {MHAGHVERDI}{60d9Cvu9SaaoWrc5K5G4LA}{XeEd8OgqQ8qYk5EY5ApOvA}{127.0.0.1}{127.0.0.1:9300}{dimr}, {MHAGHVERDI}{SY8xhFVoR3mSRWO4y0hsTQ}{rDwWLNmyREOG06_o5l4oVA}{127.0.0.1}{127.0.0.1:9301}{dimr}, {MHAGHVERDI}{UhH8S58hQlKaB6t-GditxQ}{jslf5SF9Tr-Wb219IIWuJQ}{127.0.0.1}{127.0.0.1:9302}{dimr}, {MHAGHVERDI}{rPWEXxlpT6Ch6mrNeQVl-g}{4xLOsftcT8CbP1qJob7uoQ}{127.0.0.1}{127.0.0.1:9303}{dimr}, {MHAGHVERDI}{bXahzH1cTaGih5ZrIrlN8A}{U0LePEgaQI6H9MX9SxlePA}{127.0.0.1}{127.0.0.1:9304}{dimr}] which is a quorum; discovery will continue using [127.0.0.1:9300, 127.0.0.1:9301, 127.0.0.1:9302, 127.0.0.1:9303, 127.0.0.1:9304, [::1]:9300, [::1]:9301, [::1]:9302, [::1]:9303, [::1]:9304] from hosts providers and [{MHAGHVERDI}{tEjoeM5ZReWLxXSozfEWlw}{f2Yv4dNdSmmUbSuWNho-Vg}{127.0.0.1}{127.0.0.1:9305}{dimr}] from last-known cluster state; node term 0, last-accepted version 0 in term 0
(the same ClusterFormationFailureHelper warning repeated every 10 seconds until shutdown)

@haghverdimasoud (Author) commented:

How can I solve this problem?

@jonesde (Member) commented Dec 27, 2022:

This is the main information I see about steps to reproduce, in your first post: "I get the last version of moqui framework that its not included moqui-elasticsearch. so i downoad elasticsearch oss 7.10.2 no JDK and added to runtime/elasticsearch. and run the moqui"

More information is needed to reproduce this. Either way, it looks like there is an issue with ElasticSearch given the error messages that start with: "master not discovered or elected yet, an election requires at least 4 nodes with ids from..."

The issue is likely in how ElasticSearch is configured: it looks like you have several ElasticSearch nodes running that it is able to find, however it is configured, including instances listening on ports 9300-9305 on the local machine (127.0.0.1). If you don't know how to stop those, or where they came from, a reboot will do it (unless you have something set up to auto-start ElasticSearch on boot).
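For a single local development node there is also a simpler route (these are standard ElasticSearch options, not anything Moqui-specific): check what is actually answering on the HTTP port, and force single-node discovery so no master election is needed. A minimal sketch, assuming the stock config layout:

curl 'http://127.0.0.1:9200/_cat/nodes?v'   # list the nodes the local cluster can see

# in config/elasticsearch.yml - run as a lone node, skip master election
discovery.type: single-node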

On a side note, using ElasticSearch with Moqui is no longer recommended unless you are using the commercial version. The reason for this is that 7.10.2 is the last open source version; it was released some time ago and is no longer maintained or patched for security or other issues. At this point it's better to use OpenSearch, which is explicitly supported by Moqui, including handling the query variations between ElasticSearch and OpenSearch (which are increasing now that the two follow different development and design paths).
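If an external instance runs somewhere other than the default URL, the cluster Moqui connects to can be repointed. As a sketch, assuming the stock MoquiDefaultConf.xml wiring where the default cluster URL is taken from the elasticsearch_url property (treat the property name as an assumption if your conf differs):

# environment variable picked up as a default conf property
export elasticsearch_url=http://127.0.0.1:9200
# or as a Java system property when starting Moqui
java -Delasticsearch_url=http://127.0.0.1:9200 -jar moqui.war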

@haghverdimasoud (Author) commented:

Thank you, @jonesde.
I removed ElasticSearch and replaced it with OpenSearch in the runtime directory, but now I get an error when loading and running the project.
This is the error shown:
[2022-12-28T09:59:43,992][INFO ][o.o.s.OpenSearchSecurityPlugin] [MHAGHVERDI] OpenSearch Config path is C:\Users\mhaghverdi\IdeaProjects\moqui-framework\runtime\opensearch\config
22:29:44.152 WARN main o.moqui.i.c.ElasticFacadeImpl Error connecting to ElasticSearch cluster default at http://127.0.0.1:9200, try 4 of 20: org.moqui.BaseException: Error calling HTTP request to http://127.0.0.1:9200/
[2022-12-28T09:59:44,632][INFO ][o.o.s.s.DefaultSecurityKeyStore] [MHAGHVERDI] JVM supports TLSv1.3
[2022-12-28T09:59:44,638][INFO ][o.o.s.s.DefaultSecurityKeyStore] [MHAGHVERDI] Config directory is C:\Users\mhaghverdi\IdeaProjects\moqui-framework\runtime\opensearch\config/, from there the key- and truststore files are resolved relatively
[2022-12-28T09:59:44,912][ERROR][o.o.b.OpenSearchUncaughtExceptionHandler] [MHAGHVERDI] uncaught exception in thread [main]
org.opensearch.bootstrap.StartupException: java.lang.IllegalStateException: failed to load plugin class [org.opensearch.security.OpenSearchSecurityPlugin]
at org.opensearch.bootstrap.OpenSearch.init(OpenSearch.java:184) ~[opensearch-2.4.1.jar:2.4.1]
at org.opensearch.bootstrap.OpenSearch.execute(OpenSearch.java:171) ~[opensearch-2.4.1.jar:2.4.1]
at org.opensearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:104) ~[opensearch-2.4.1.jar:2.4.1]
at org.opensearch.cli.Command.mainWithoutErrorHandling(Command.java:138) ~[opensearch-cli-2.4.1.jar:2.4.1]
at org.opensearch.cli.Command.main(Command.java:101) ~[opensearch-cli-2.4.1.jar:2.4.1]
at org.opensearch.bootstrap.OpenSearch.main(OpenSearch.java:137) ~[opensearch-2.4.1.jar:2.4.1]
at org.opensearch.bootstrap.OpenSearch.main(OpenSearch.java:103) ~[opensearch-2.4.1.jar:2.4.1]

I opened the file and found this:
extended.plugins: other plugins this plugin extends through SPI
extended.plugins=????????????????

@eigood (Contributor) commented Feb 10, 2023:

That OpenSearch error is because you have not loaded the default data, which creates a bunch of security settings. That's not a Moqui bug. Search for "failed to load plugin class [org.opensearch.security.OpenSearchSecurityPlugin]" on Google to find further info.
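For a local development setup a common workaround, per the OpenSearch docs rather than this thread, is to either run the bundled demo configuration script or disable the security plugin outright (for local development only):

# config/opensearch.yml - turn the security plugin off entirely
plugins.security.disabled: true

# or initialize the demo security config (path may vary by version; .bat on Windows)
plugins/opensearch-security/tools/install_demo_configuration.sh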

But be aware that OpenSearch upstream is currently deprecating the entire OpenSearchSecurityPlugin architecture while they refactor it all (opensearch-project/security#1755).

@jonesde (Member) commented Feb 11, 2023:

Agreed, this is not a Moqui issue and the moqui-framework issues are starting to get messy with a bunch of non-actionable issues like this. Closing.

@jonesde closed this as completed on Feb 11, 2023.