opentaps installation: Java returned: 99 (PostgreSQL)

I am installing opentaps on my system and I get the following error:
[java] 2016-03-07 11:32:37,428 (main) [ TransactionUtil.java:345:INFO ] [TransactionUtil.rollback] transaction rolled back
[java] 2016-03-07 11:32:37,428 (main) [ EntityDataLoader.java:218:ERROR]
[java] ---- exception report ----------------------------------------------------------
[java] [install.loadData]: Error loading XML Resource "file:/home/oodles/work/skulocity/custom-erp-crm/opentaps/amazon/data/AmazonDemoSetup.xml"; Error was: A transaction error occurred reading data
[java] Exception: org.xml.sax.SAXException
[java] Message: A transaction error occurred reading data
[java] ---- cause ---------------------------------------------------------------------
[java] Exception: org.ofbiz.entity.GenericDataSourceException
[java] Message: SQL Exception occurred on commit (Commit can not be set while enrolled in a transaction)
[java] ---- cause ---------------------------------------------------------------------
[java] Exception: java.sql.SQLException
[java] Message: Commit can not be set while enrolled in a transaction
[java] ---- stack trace ---------------------------------------------------------------
[java] java.sql.SQLException: Commit can not be set while enrolled in a transaction
[java] org.apache.commons.dbcp.managed.ManagedConnection.commit(ManagedConnection.java:214)
[java] org.ofbiz.entity.jdbc.SQLProcessor.commit(SQLProcessor.java:145)
[java] org.ofbiz.entity.jdbc.SQLProcessor.close(SQLProcessor.java:196)
[java] org.ofbiz.entity.datasource.GenericDAO.select(GenericDAO.java:493)
[java] org.ofbiz.entity.datasource.GenericHelperDAO.findByPrimaryKey(GenericHelperDAO.java:80)
[java] org.ofbiz.entity.GenericDelegator.storeAll(GenericDelegator.java:1424)
[java] org.ofbiz.entity.util.EntitySaxReader.writeValues(EntitySaxReader.java:286)
[java] org.ofbiz.entity.util.EntitySaxReader.parse(EntitySaxReader.java:265)
[java] org.ofbiz.entity.util.EntitySaxReader.parse(EntitySaxReader.java:222)
[java] org.ofbiz.entity.util.EntityDataLoader.loadData(EntityDataLoader.java:214)
[java] org.ofbiz.entityext.data.EntityDataLoadContainer.start(EntityDataLoadContainer.java:389)
[java] org.ofbiz.base.container.ContainerLoader.start(ContainerLoader.java:101)
[java] org.ofbiz.base.start.Start.startStartLoaders(Start.java:273)
[java] org.ofbiz.base.start.Start.startServer(Start.java:323)
[java] org.ofbiz.base.start.Start.start(Start.java:327)
[java] org.ofbiz.base.start.Start.main(Start.java:412)
[java] --------------------------------------------------------------------------------
[java]
[java] 2016-03-07 11:32:37,428 (main) [ EntitySaxReader.java:221:INFO ] Beginning import from URL: file:/home/oodles/work/skulocity/custom-erp-crm/opentaps/amazon/data/AmazonDemoData.xml
[java] 2016-03-07 11:32:37,428 (main) [ EntitySaxReader.java:259:INFO ] Transaction Timeout set to 2 hours (7200 seconds)
[java] 2016-03-07 11:32:37,466 (main) [ EntitySaxReader.java:278:INFO ] Finished 13 values from file:/home/oodles/work/skulocity/custom-erp-crm/opentaps/amazon/data/AmazonDemoData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:404:INFO ] =-=-=-=-=-=-= Here is a summary of the data load:
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00024 of 00024 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/security/data/SecurityData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00023 of 00047 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CommonSecurityData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00095 of 00142 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CommonTypeData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00724 of 00866 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CountryCodeData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00169 of 01035 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CurrencyData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00279 of 01314 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00016 of 01330 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_AU.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00056 of 01386 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_BG.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00053 of 01439 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_BR.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00066 of 01505 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_CN.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00066 of 01571 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_CO.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00032 of 01603 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_DE.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00138 of 01741 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_ES.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00428 of 02169 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_FR.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00220 of 02389 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_IT.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00070 of 02459 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_IN.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00064 of 02523 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_IRL.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00026 of 02549 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_NL.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00172 of 02721 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_UK.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00240 of 02961 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_US.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00433 of 03394 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/LanguageData.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00236 of 03630 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/UnitData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00007 of 03637 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/PeriodData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00012 of 03649 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CommonPortletData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00008 of 03657 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/service/data/ScheduledServiceData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00003 of 03660 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/service/data/ServiceSecurityData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00012 of 03672 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/entityext/data/EntityExtTypeData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00003 of 03675 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/entityext/data/EntityExtSecurityData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00001 of 03676 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/bi/data/BiTypeData.xml
BUILD FAILED
/home/oodles/work/skulocity/custom-erp-crm/build.xml:510: Java returned: 99
$java -version
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
opentaps=> select version();
version
----------------------------------------------------------------------------------------------------------------
PostgreSQL 9.2.15 on x86_64-unknown-linux-gnu, compiled by gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52), 64-bit
(1 row)
$ant -version
Apache Ant(TM) version 1.9.3 compiled on April 8 2014
I have followed all the steps mentioned in the installation doc.
Running "ant run-install" created 1015 tables but failed at the end. Can anybody help me?

In my case $JAVA_HOME was pointing to JDK 1.7 while the java -version command reported JDK 1.6, so I changed $JAVA_HOME to point to the 1.6 JDK, and it is now working fine.
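The mismatch described above can be checked with a short shell sketch: compare the java that the shell resolves from PATH against the one $JAVA_HOME points at (the export path at the end is only an example, not a path from this system):

```shell
# Compare the java resolved from PATH against the one JAVA_HOME points at;
# "ant run-install" should see the same JDK from both.
path_java="$(command -v java || true)"
echo "java on PATH:      ${path_java:-not found}"
echo "JAVA_HOME setting: ${JAVA_HOME:-not set}"

# Print the version of the JAVA_HOME JDK, if one is configured.
if [ -n "${JAVA_HOME:-}" ] && [ -x "${JAVA_HOME}/bin/java" ]; then
  "${JAVA_HOME}/bin/java" -version 2>&1 | head -n 1
fi

# If the two disagree, point JAVA_HOME at the matching JDK and re-run the
# install (the path below is only an example; adjust it for your system):
# export JAVA_HOME=/usr/lib/jvm/java-1.6.0
# export PATH="$JAVA_HOME/bin:$PATH"
```

Running it before "ant run-install" makes the kind of mismatch in this answer (PATH on 1.6, JAVA_HOME on 1.7) visible immediately.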

You should use Java 1.6.4.
Check the JAVA_HOME environment variable:
echo %JAVA_HOME% (on Windows), or echo $JAVA_HOME (on Linux)

Related

How to fix Spring Load Balancing issue with ribbon "Error choosing server for key default"?

I have been trying to familiarize myself with Spring Boot load balancing.
I have a problem when trying to access the services through the load balancer URL.
I get a long stack trace, and from the looks of it, it seems to start here:
2021-05-17 20:44:05.540 WARN 23156 --- [nio-9090-exec-1] c.netflix.loadbalancer.BaseLoadBalancer : LoadBalancer [chatbook]: Error choosing server for key default
My entire code is here: https://github.com/wizz269/springLoadBalancing
And here is the stack trace:
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.3.10.RELEASE)
2021-05-17 20:43:47.834 INFO 23156 --- [ restartedMain] com.demgo.userapp.UserAppApplication : No active profile set, falling back to default profiles: default
2021-05-17 20:43:48.598 INFO 23156 --- [ restartedMain] o.s.cloud.context.scope.GenericScope : BeanFactory id=3629af68-69dc-3f8d-91e5-bea3a5e48117
2021-05-17 20:43:49.006 INFO 23156 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 9090 (http)
2021-05-17 20:43:49.014 INFO 23156 --- [ restartedMain] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2021-05-17 20:43:49.014 INFO 23156 --- [ restartedMain] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.45]
2021-05-17 20:43:49.210 INFO 23156 --- [ restartedMain] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2021-05-17 20:43:49.211 INFO 23156 --- [ restartedMain] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 1335 ms
2021-05-17 20:43:49.284 WARN 23156 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : No URLs will be polled as dynamic configuration sources.
2021-05-17 20:43:49.284 INFO 23156 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
2021-05-17 20:43:49.288 WARN 23156 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : No URLs will be polled as dynamic configuration sources.
2021-05-17 20:43:49.288 INFO 23156 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
2021-05-17 20:43:49.431 INFO 23156 --- [ restartedMain] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor'
2021-05-17 20:43:49.601 INFO 23156 --- [ restartedMain] o.s.b.d.a.OptionalLiveReloadServer : LiveReload server is running on port 35729
2021-05-17 20:43:50.294 INFO 23156 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 9090 (http) with context path ''
2021-05-17 20:43:50.748 INFO 23156 --- [ restartedMain] com.demgo.userapp.UserAppApplication : Started UserAppApplication in 5.553 seconds (JVM running for 8.061)
2021-05-17 20:44:04.554 INFO 23156 --- [nio-9090-exec-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring DispatcherServlet 'dispatcherServlet'
2021-05-17 20:44:04.554 INFO 23156 --- [nio-9090-exec-1] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet'
2021-05-17 20:44:04.562 INFO 23156 --- [nio-9090-exec-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 8 ms
2021-05-17 20:44:05.386 INFO 23156 --- [nio-9090-exec-1] c.n.u.concurrent.ShutdownEnabledTimer : Shutdown hook installed for: NFLoadBalancer-PingTimer-chatbook
2021-05-17 20:44:05.388 INFO 23156 --- [nio-9090-exec-1] c.netflix.loadbalancer.BaseLoadBalancer : Client: chatbook instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=chatbook,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null
2021-05-17 20:44:05.397 INFO 23156 --- [nio-9090-exec-1] c.n.l.DynamicServerListLoadBalancer : Using serverListUpdater PollingServerListUpdater
2021-05-17 20:44:05.420 INFO 23156 --- [nio-9090-exec-1] c.netflix.config.ChainedDynamicProperty : Flipping property: chatbook.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2021-05-17 20:44:05.516 INFO 23156 --- [nio-9090-exec-1] c.n.l.DynamicServerListLoadBalancer : DynamicServerListLoadBalancer for client chatbook initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=chatbook,current list of Servers=[localhost:8084, localhost:8082, localhost:8083],Load balancer stats=Zone stats: {unknown=[Zone:unknown; Instance count:3; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;]
},Server stats: [[Server:localhost:8084; Zone:UNKNOWN; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 03:00:00 EAT 1970; First connection made: Thu Jan 01 03:00:00 EAT 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
, [Server:localhost:8082; Zone:UNKNOWN; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 03:00:00 EAT 1970; First connection made: Thu Jan 01 03:00:00 EAT 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
, [Server:localhost:8083; Zone:UNKNOWN; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 03:00:00 EAT 1970; First connection made: Thu Jan 01 03:00:00 EAT 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
]}ServerList:com.netflix.loadbalancer.ConfigurationBasedServerList#4f6e06f2
2021-05-17 20:44:05.536 WARN 23156 --- [nio-9090-exec-1] com.netflix.loadbalancer.RoundRobinRule : No up servers available from load balancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=chatbook,current list of Servers=[localhost:8084, localhost:8082, localhost:8083],Load balancer stats=Zone stats: {unknown=[Zone:unknown; Instance count:3; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;]
},Server stats: [[Server:localhost:8084; Zone:UNKNOWN; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 03:00:00 EAT 1970; First connection made: Thu Jan 01 03:00:00 EAT 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
, [Server:localhost:8082; Zone:UNKNOWN; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 03:00:00 EAT 1970; First connection made: Thu Jan 01 03:00:00 EAT 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
, [Server:localhost:8083; Zone:UNKNOWN; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 03:00:00 EAT 1970; First connection made: Thu Jan 01 03:00:00 EAT 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
]}ServerList:com.netflix.loadbalancer.ConfigurationBasedServerList#4f6e06f2
2021-05-17 20:44:05.540 WARN 23156 --- [nio-9090-exec-1] c.netflix.loadbalancer.BaseLoadBalancer : LoadBalancer [chatbook]: Error choosing server for key default
java.lang.NullPointerException: null
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:878) ~[guava-30.0-jre.jar:na]
at com.google.common.cache.LocalCache.get(LocalCache.java:3950) ~[guava-30.0-jre.jar:na]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974) ~[guava-30.0-jre.jar:na]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4935) ~[guava-30.0-jre.jar:na]
at com.netflix.loadbalancer.LoadBalancerStats.getServerStats(LoadBalancerStats.java:185) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.LoadBalancerStats.getSingleServerStat(LoadBalancerStats.java:372) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.AvailabilityPredicate.apply(AvailabilityPredicate.java:73) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.AvailabilityPredicate.apply(AvailabilityPredicate.java:35) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.CompositePredicate.apply(CompositePredicate.java:52) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.CompositePredicate.apply(CompositePredicate.java:40) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.AvailabilityFilteringRule.choose(AvailabilityFilteringRule.java:86) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.BaseLoadBalancer.chooseServer(BaseLoadBalancer.java:755) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.ZoneAwareLoadBalancer.chooseServer(ZoneAwareLoadBalancer.java:113) [ribbon-loadbalancer-2.3.0.jar:2.3.0]
at org.springframework.cloud.netflix.ribbon.RibbonLoadBalancerClient.getServer(RibbonLoadBalancerClient.java:189) [spring-cloud-netflix-ribbon-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.cloud.netflix.ribbon.RibbonLoadBalancerClient.execute(RibbonLoadBalancerClient.java:117) [spring-cloud-netflix-ribbon-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.cloud.netflix.ribbon.RibbonLoadBalancerClient.execute(RibbonLoadBalancerClient.java:99) [spring-cloud-netflix-ribbon-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.cloud.client.loadbalancer.LoadBalancerInterceptor.intercept(LoadBalancerInterceptor.java:58) [spring-cloud-commons-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.http.client.InterceptingClientHttpRequest$InterceptingRequestExecution.execute(InterceptingClientHttpRequest.java:93) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.http.client.InterceptingClientHttpRequest.executeInternal(InterceptingClientHttpRequest.java:77) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.http.client.AbstractBufferingClientHttpRequest.executeInternal(AbstractBufferingClientHttpRequest.java:48) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.http.client.AbstractClientHttpRequest.execute(AbstractClientHttpRequest.java:53) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:737) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:672) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.client.RestTemplate.getForObject(RestTemplate.java:313) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at com.demgo.userapp.UserAppApplication.invokeChatbook(UserAppApplication.java:29) [classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_281]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_281]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_281]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_281]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:190) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:105) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:878) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:792) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1040) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:943) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:626) [tomcat-embed-core-9.0.45.jar:4.0.FR]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) [spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:733) [tomcat-embed-core-9.0.45.jar:4.0.FR]
at
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) [spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:202) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:542) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:143) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:374) [tomcat-embed-core-9.0.45.jar:9.0.45]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_281]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_281]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.45.jar:9.0.45]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_281]
2021-05-17 20:44:05.552 ERROR 23156 --- [nio-9090-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is java.lang.IllegalStateException: No instances available for chatbook] with root cause
java.lang.IllegalStateException: No instances available for chatbook
at org.springframework.cloud.netflix.ribbon.RibbonLoadBalancerClient.execute(RibbonLoadBalancerClient.java:119) ~[spring-cloud-netflix-ribbon-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.cloud.netflix.ribbon.RibbonLoadBalancerClient.execute(RibbonLoadBalancerClient.java:99) ~[spring-cloud-netflix-ribbon-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.cloud.client.loadbalancer.LoadBalancerInterceptor.intercept(LoadBalancerInterceptor.java:58) ~[spring-cloud-commons-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at org.springframework.http.client.InterceptingClientHttpRequest$InterceptingRequestExecution.execute(InterceptingClientHttpRequest.java:93) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.client.RestTemplate.getForObject(RestTemplate.java:313) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at com.demgo.userapp.UserAppApplication.invokeChatbook(UserAppApplication.java:29) ~[classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_281]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_281]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_281]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_281]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:190) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:105) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:878) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:792) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1040) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:943) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:626) ~[tomcat-embed-core-9.0.45.jar:4.0.FR]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) ~[spring-webmvc-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:733) ~[tomcat-embed-core-9.0.45.jar:4.0.FR]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat-embed-websocket-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) ~[spring-web-5.2.14.RELEASE.jar:5.2.14.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:202) ~[tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:542) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:143) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:374) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:893) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1707) [tomcat-embed-core-9.0.45.jar:9.0.45]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-9.0.45.jar:9.0.45]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_281]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_281]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.45.jar:9.0.45]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_281]
2021-05-17 20:44:06.409 INFO 23156 --- [erListUpdater-0] c.netflix.config.ChainedDynamicProperty : Flipping property: chatbook.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
Note: each service instance works fine when I put it up on its own; only the load-balancing part fails. I will appreciate the assist.
Found the answer here:
https://github.com/spring-cloud/spring-cloud-netflix/issues/2705#issuecomment-367824353
It worked after deleting this bean definition:
@Bean
public IRule rule(IClientConfig ribbonClient) {
    return new AvailabilityFilteringRule();
}
I'm still not sure why this worked; I'd appreciate it if anyone cares to explain.
It worked after removing this code:
@Bean
public IRule ribbonRule(IClientConfig config) {
    return new AvailabilityFilteringRule();
}

Zeppelin exception with Spark Basic Features

I teach a class on Scala and Spark. I've been demonstrating Zeppelin for five years now (and using it for a bit longer than that).
For the last couple of years, whenever I demonstrate Zeppelin, using the distribution out-of-the-box, I can only show the Spark Basic Features Notebook. When I do this, all of the paragraphs come up as they should. If I try to change the age in the age field, or simply try to re-run any of the paragraphs, I get an exception.
I repeat: this is out-of-the-box. I downloaded the 0.9.0-preview2 version, started the daemon, and opened the provided notebook. Any ideas? I'm on a MacBook Pro with OS 10.15.7. I also have Spark spark-3.0.1-bin-hadoop2.7 installed.
Here's the error that I get:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.zeppelin.spark.SparkSqlInterpreter.internalInterpret(SparkSqlInterpreter.java:105)
at org.apache.zeppelin.interpreter.AbstractInterpreter.interpret(AbstractInterpreter.java:47)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:110)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:776)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:39)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.sql.AnalysisException: Table or view not found: bank; line 2 pos 5;
'Sort ['age ASC NULLS FIRST], true
+- 'Aggregate ['age], ['age, count(1) AS value#4L]
+- 'Filter ('age < 30)
+- 'UnresolvedRelation [bank]
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:106)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1$adapted(CheckAnalysis.scala:92)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:177)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1(TreeNode.scala:176)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1$adapted(TreeNode.scala:176)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:176)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1(TreeNode.scala:176)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1$adapted(TreeNode.scala:176)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:176)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1(TreeNode.scala:176)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1$adapted(TreeNode.scala:176)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:176)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:92)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:89)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:130)
at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:156)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:153)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:68)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:133)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:133)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:68)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:66)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:58)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:607)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:602)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:650)
... 15 more
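The AnalysisException at the bottom of the trace says it all: the `bank` temp view does not exist in the current Spark session. In the tutorial notebook that view is registered by an earlier data-load paragraph (I believe via something like `createOrReplaceTempView("bank")`), and an interpreter restart discards it, so re-running only the SQL paragraph fails. Temp views are session-scoped; a minimal stdlib analogy (assuming Spark temp views behave like per-connection SQLite temp views, which is only an illustration, not Spark itself):

```python
import sqlite3

# A TEMP view lives only in the session (connection) that created it.
conn_a = sqlite3.connect(":memory:")
conn_a.execute("CREATE TEMP VIEW bank AS SELECT 30 AS age")
assert conn_a.execute("SELECT age FROM bank").fetchone() == (30,)

# A fresh session (think: a restarted interpreter) no longer sees the view,
# which is the same failure mode as "Table or view not found: bank".
conn_b = sqlite3.connect(":memory:")
try:
    conn_b.execute("SELECT age FROM bank")
except sqlite3.OperationalError as e:
    print(e)  # no such table: bank
```

If that is the cause here, re-running the paragraph that loads the bank data and registers the view before re-running the SQL paragraphs should make them work again.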

optaplanner Benchmarking failed Caused by: java.lang.InstantiationError: org.drools.core.factmodel.ClassBuilderFactory

Here is part of my OptaPlanner config:
<solutionClass>com.core.domain.schedule.WorkSchedule</solutionClass>
<entityClass>com.core.domain.Arrange.IWorkArrange</entityClass>
<entityClass>com.core.domain.BasicConstruct.ArrangeUnit.IArrangeUnit</entityClass>
<entityClass>com.core.domain.Arrange.InterruptionArrange.IInterruptionArrange</entityClass>
Is it because IWorkArrange, IArrangeUnit and IInterruptionArrange are interfaces?
Exception in thread "main" org.optaplanner.benchmark.api.PlannerBenchmarkException: Benchmarking failed: failureCount (1). The exception of the firstFailureSingleBenchmarkRunner (Problem_0_local Search_0) is chained.
at org.optaplanner.benchmark.impl.DefaultPlannerBenchmark.benchmarkingEnded(DefaultPlannerBenchmark.java:335)
at org.optaplanner.benchmark.impl.DefaultPlannerBenchmark.benchmark(DefaultPlannerBenchmark.java:106)
at org.optaplanner.benchmark.impl.DefaultPlannerBenchmark.benchmarkAndShowReportInBrowser(DefaultPlannerBenchmark.java:433)
at com.ctrip.hotel.basicModel.helloWorld.main(helloWorld.java:116)
Caused by: java.lang.InstantiationError: org.drools.core.factmodel.ClassBuilderFactory
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.init(KnowledgeBuilderConfigurationImpl.java:262)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.init(KnowledgeBuilderConfigurationImpl.java:191)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.<init>(KnowledgeBuilderConfigurationImpl.java:159)
at org.drools.compiler.kie.builder.impl.AbstractKieProject.getBuilderConfiguration(AbstractKieProject.java:302)
at org.drools.compiler.kie.builder.impl.AbstractKieProject.createKnowledgeBuilder(AbstractKieProject.java:288)
at org.drools.compiler.kie.builder.impl.AbstractKieProject.buildKnowledgePackages(AbstractKieProject.java:213)
at org.drools.compiler.kie.builder.impl.AbstractKieProject.verify(AbstractKieProject.java:75)
at org.drools.compiler.kie.builder.impl.KieBuilderImpl.buildKieProject(KieBuilderImpl.java:274)
at org.drools.compiler.kie.builder.impl.KieBuilderImpl.buildAll(KieBuilderImpl.java:242)
at org.drools.compiler.kie.builder.impl.KieBuilderImpl.buildAll(KieBuilderImpl.java:199)
at org.optaplanner.core.config.score.director.ScoreDirectorFactoryConfig.buildDroolsScoreDirectorFactory(ScoreDirectorFactoryConfig.java:683)
at org.optaplanner.core.config.score.director.ScoreDirectorFactoryConfig.buildScoreDirectorFactory(ScoreDirectorFactoryConfig.java:464)
at org.optaplanner.core.config.solver.SolverConfig.buildScoreDirectorFactory(SolverConfig.java:606)
at org.optaplanner.core.config.solver.SolverConfig.buildSolver(SolverConfig.java:514)
at org.optaplanner.core.impl.solver.DefaultSolverFactory.buildSolver(DefaultSolverFactory.java:49)
at org.optaplanner.benchmark.impl.SubSingleBenchmarkRunner.call(SubSingleBenchmarkRunner.java:104)
at org.optaplanner.benchmark.impl.SubSingleBenchmarkRunner.call(SubSingleBenchmarkRunner.java:35)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
This is my Maven dependency tree; is there anything I should add to my pom? Thanks a lot.
[INFO] \- org.optaplanner:optaplanner-benchmark:jar:7.42.0.Final:compile
[INFO] +- org.kie:kie-api:jar:7.42.0.Final:compile
[INFO] | \- org.kie.soup:kie-soup-maven-support:jar:7.42.0.Final:compile
[INFO] +- org.optaplanner:optaplanner-persistence-common:jar:7.42.0.Final:compile
[INFO] +- org.optaplanner:optaplanner-persistence-xstream:jar:7.42.0.Final:compile
[INFO] +- org.optaplanner:optaplanner-persistence-jaxb:jar:7.42.0.Final:compile
[INFO] | \- org.jboss.spec.javax.xml.bind:jboss-jaxb-api_2.3_spec:jar:1.0.1.Final:compile
[INFO] +- org.drools:drools-core:jar:7.42.0.Final:compile
[INFO] | +- org.mvel:mvel2:jar:2.4.8.Final:compile
[INFO] | +- org.kie.soup:kie-soup-xstream:jar:7.42.0.Final:compile
[INFO] | +- org.drools:drools-core-reflective:jar:7.42.0.Final:compile
[INFO] | +- org.drools:drools-core-dynamic:jar:7.42.0.Final:compile
[INFO] | \- org.kie.soup:kie-soup-project-datamodel-commons:jar:7.42.0.Final:compile
[INFO] |    \- org.kie.soup:kie-soup-project-datamodel-api:jar:7.42.0.Final:compile
I added this, but it did not work:
<groupId>org.kie.kogito</groupId>
<artifactId>kogito-drools</artifactId>
<version>0.10.1</version>
Do an mvn dependency:tree - looks like you're mixing incompatible drools/kie/kogito versions.

py4j.protocol.Py4JJavaError: An error occurred while calling o59.save. : java.lang.NoClassDefFoundError: scala/runtime/LazyBoolean

I'm new to Spark and wanted to write something to MongoDB using PySpark. I tried to follow these tutorials: https://docs.mongodb.com/spark-connector/master/python-api/ and https://docs.mongodb.com/spark-connector/master/python/write-to-mongodb/. Somehow an error occurred.
Scala version: 2.13.1
Spark version: 2.4.4
Python version: 2.7.5
Java version: 1.8.0_242
[root@node2 vagrant]# pyspark --conf "spark.mongodb.input.uri=mongodb://IP/db.collection?readPreference=primaryPreferred" --conf "spark.mongodb.output.uri=mongodb://IP/db.collection" --packages org.mongodb.spark:mongo-spark-connector_2.12:2.4.1 --conf spark.ui.port=63000
Python 2.7.5 (default, Aug 7 2019, 00:51:29)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-39)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/home/vagrant/spark-2.4.4-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb.spark#mongo-spark-connector_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-08af433d-99aa-4bba-8d25-7983617c99e5;1.0
confs: [default]
found org.mongodb.spark#mongo-spark-connector_2.12;2.4.1 in central
found org.mongodb#mongo-java-driver;3.10.2 in central
[3.10.2] org.mongodb#mongo-java-driver;[3.10,3.11)
:: resolution report :: resolve 1456ms :: artifacts dl 6ms
:: modules in use:
org.mongodb#mongo-java-driver;3.10.2 from central in [default]
org.mongodb.spark#mongo-spark-connector_2.12;2.4.1 from central in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 2 | 1 | 0 | 0 || 2 | 0 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-08af433d-99aa-4bba-8d25-7983617c99e5
confs: [default]
0 artifacts copied, 2 already retrieved (0kB/19ms)
20/02/07 20:30:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/__ / .__/\_,_/_/ /_/\_\ version 2.4.4
/_/
Using Python version 2.7.5 (default, Aug 7 2019 00:51:29)
SparkSession available as 'spark'.
>>> my_spark = SparkSession.builder.appName("myApp").getOrCreate();
>>> people = spark.createDataFrame([("Bilbo Baggins", 50), ("Gandalf", 1000), ("Thorin", 195), ("Balin", 178), ("Kili", 77), ("Dwalin", 169), ("Oin", 167), ("Gloin", 158), ("Fili", 82), ("Bombur", None)], ["name", "age"]);
>>> people.write.format("mongo").mode("append").save();
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/vagrant/spark-2.4.4-bin-hadoop2.7/python/pyspark/sql/readwriter.py", line 736, in save
self._jwrite.save()
File "/home/vagrant/spark-2.4.4-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
File "/home/vagrant/spark-2.4.4-bin-hadoop2.7/python/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/home/vagrant/spark-2.4.4-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o59.save.
: java.lang.NoClassDefFoundError: scala/runtime/LazyBoolean
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:66)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: scala.runtime.LazyBoolean
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
... 32 more
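Worth noting: the prebuilt spark-2.4.4-bin-hadoop2.7 distribution ships with Scala 2.11, while the --packages coordinate above ends in _2.12, and scala.runtime.LazyBoolean appears only from Scala 2.12 onward. A Scala binary-version mismatch would fit the NoClassDefFoundError exactly. A small illustrative helper (hypothetical, stdlib-only, assuming the usual `group:artifact_2.xx:version` coordinate layout) for checking the suffix:

```python
def scala_suffix(coordinate):
    """Extract the '_2.xx' Scala binary suffix from 'group:artifact_2.12:ver'."""
    artifact = coordinate.split(":")[1]
    return artifact.rsplit("_", 1)[1]

def compatible(coordinate, spark_scala_version):
    """True when the package's Scala suffix matches Spark's Scala version."""
    return spark_scala_version.startswith(scala_suffix(coordinate))

# spark-2.4.4-bin-hadoop2.7 ships Scala 2.11.12, so the 2.12 connector mismatches:
print(compatible("org.mongodb.spark:mongo-spark-connector_2.12:2.4.1", "2.11.12"))  # False
print(compatible("org.mongodb.spark:mongo-spark-connector_2.11:2.4.1", "2.11.12"))  # True
```

If that diagnosis is right, using the `_2.11` build of the connector (or a Spark build for Scala 2.12) should resolve the error.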

Pyspark java.net.SocketTimeoutException: Accept timed out

I have the following code:
a) Generate Local Spark instance:
# Load data from local machine into dataframe
from pyspark.sql import SparkSession
spark = (SparkSession.builder
         .appName("Basic")
         .master("local[*]")
         .config("spark.network.timeout", "50s")
         .config("spark.executor.heartbeatInterval", "50s")
         .getOrCreate())
b) Generate a Pandas date range then convert it to PySpark data frame
import numpy as np
import pandas as pd
from pyspark.sql import functions as f

# Create aggregated dataset for analysis
df.registerTempTable("tmp")
sqlstr = "SELECT " + groupBy + ", sum(" + amount + ") as Amount FROM tmp GROUP BY " + groupBy
df2 = spark.sql(sqlstr)
spark.catalog.dropTempView('tmp')
display(df2.groupBy(monthlyTransactionDate).sum("Amount").orderBy(monthlyTransactionDate))

# Get the min and max dates
minDate, maxDate = df2.select(f.min("MonthlyTransactionDate"), f.max("MonthlyTransactionDate")).first()

# Build a month-start date range and convert it to a Spark dataframe
d = pd.date_range(start=minDate, end=maxDate, freq='MS')
tmp = pd.DataFrame(d)
df3 = spark.createDataFrame(tmp)
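For reference, freq='MS' in pd.date_range generates month-start timestamps, skipping any partial first month. A stdlib-only sketch of that behavior (the month_starts helper is hypothetical, just to show what the pandas call produces):

```python
from datetime import date

def month_starts(start, end):
    """Return the first day of each month in [start, end], mirroring
    what pd.date_range(start, end, freq='MS') yields as dates."""
    y, m = start.year, start.month
    # Advance to the first month-start on or after `start`.
    if start.day != 1:
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)
    out = []
    while date(y, m, 1) <= end:
        out.append(date(y, m, 1))
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)
    return out

print(month_starts(date(2020, 1, 15), date(2020, 4, 10)))
# [datetime.date(2020, 2, 1), datetime.date(2020, 3, 1), datetime.date(2020, 4, 1)]
```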
But I get what I think is a timeout error:
Py4JJavaError: An error occurred while calling o932.collectToPython.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 39.0 failed 1 times, most recent failure: Lost task 0.0 in stage 39.0 (TID 1239, localhost, executor driver): org.apache.spark.SparkException: Python worker failed to connect back.
at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:170)
at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:97)
at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:117)
at ...
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: Accept timed out
at java.net.DualStackPlainSocketImpl.waitForNewConnection(Native Method)
at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:135)
at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:199)
at java.net.ServerSocket.implAccept(ServerSocket.java:545)
at java.net.ServerSocket.accept(ServerSocket.java:513)
at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:164)
... 32 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at ...
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Python worker failed to connect back.
at ... java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more
Caused by: java.net.SocketTimeoutException: Accept timed out
at java.net.DualStackPlainSocketImpl.waitForNewConnection(Native Method)
at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:135)
at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:199)
at java.net.ServerSocket.implAccept(ServerSocket.java:545)
at java.net.ServerSocket.accept(ServerSocket.java:513)
at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:164)
... 32 more
The code works on Azure but not on my local machine. I increased the timeout parameters, as you can see in the first bit of code above, but that did not help. The dataframe I am creating isn't even all that large.
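One common cause of "Python worker failed to connect back" on a local machine (as opposed to a managed environment like Azure) is the worker processes being launched with a different Python than the driver. A minimal sketch, assuming that mismatch is the culprit here: pin both to the interpreter running the script, before the SparkSession is created.

```python
import os
import sys

# Assumption: the Accept timed out / "Python worker failed to connect back"
# failure comes from Spark spawning workers under a different interpreter
# than the driver. Setting these before building the SparkSession makes
# Spark launch workers with the same Python that is running this script.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

print(os.environ["PYSPARK_PYTHON"] == sys.executable)  # True
```

These two environment variables are standard PySpark configuration; the key detail is that they must be set before `getOrCreate()` is called, since the worker factory reads them at launch time.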