BlueStacks: failed to install bluestacks. error: value cannot be null. parameter name: path1

I am getting this error. I uninstalled the previous version of BlueStacks, but after the uninstall completed I cannot install the new version; the installer fails with:
failed to install bluestacks. error: value cannot be null. parameter name: path1
The log file shows:
2015-08-08 12:28:22.316000 BlueStacks-ThinInstaller_0.9.30.(1386: Opened log file
2015-08-08 12:28:22.322000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO CLR version 2.0.50727.8662
2015-08-08 12:28:22.323000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO IsAdministrator: False
2015-08-08 12:28:22.332000 BlueStacks-ThinInstaller_0.9.30.(1386: 3 INFO the tag File directory is C:\Users\IT\Downloads\BlueStacks-ThinInstaller (1)\
2015-08-08 12:28:22.356000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 ERROR Failed to check for version. err: System.UnauthorizedAccessException: Access to the registry key 'HKEY_LOCAL_MACHINE\Software\BlueStacks' is denied.
2015-08-08 12:28:22.356000 BlueStacks-ThinInstaller_0.9.30.(1386: at Microsoft.Win32.RegistryKey.Win32Error(Int32 errorCode, String str)
2015-08-08 12:28:22.356000 BlueStacks-ThinInstaller_0.9.30.(1386: at Microsoft.Win32.RegistryKey.CreateSubKey(String subkey, RegistryKeyPermissionCheck permissionCheck, RegistrySecurity registrySecurity)
2015-08-08 12:28:22.356000 BlueStacks-ThinInstaller_0.9.30.(1386: at Microsoft.Win32.RegistryKey.CreateSubKey(String subkey)
2015-08-08 12:28:22.356000 BlueStacks-ThinInstaller_0.9.30.(1386: at BlueStacks.hyperDroid.ThinInstaller.ThinInstaller.AbortIfAlreadyInstalled()
2015-08-08 12:28:22.365000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO HandleCreated
2015-08-08 12:28:22.366000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO Handle: 199170
2015-08-08 12:28:22.368000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO m_ClassName = WindowsForms10.Window.8.app.0.33c0d9d
2015-08-08 12:28:22.369000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO Launching in installer mode
2015-08-08 12:28:24.456000 BlueStacks-ThinInstaller_0.9.30.(1340: Opened log file
2015-08-08 12:28:24.460000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO CLR version 2.0.50727.8662
2015-08-08 12:28:24.460000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO IsAdministrator: True
2015-08-08 12:28:24.461000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO In installer mode
2015-08-08 12:28:24.461000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO className = WindowsForms10.Window.8.app.0.33c0d9d
2015-08-08 12:28:24.461000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Got handle: 199170
2015-08-08 12:28:24.464000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Checking for existing BlueStacks installation...
2015-08-08 12:28:24.464000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO arg[0] = install:WindowsForms10.Window.8.app.0.33c0d9d
2015-08-08 12:28:24.467000 BlueStacks-ThinInstaller_0.9.30.(1340: 3 INFO the tag File directory is C:\Users\IT\Downloads\BlueStacks-ThinInstaller (1)\
2015-08-08 12:28:24.487000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO s_ParentHandle = 199170
2015-08-08 12:28:24.487000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Populating Default Engilsh Strings
2015-08-08 12:28:24.488000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Successfully Stored Localized Strings
2015-08-08 12:28:24.488000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Successfully Populated English Strings
2015-08-08 12:28:24.489000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO SPAWNAPPS_APP_NAME =
2015-08-08 12:28:24.489000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Setting directory permissions
2015-08-08 12:28:24.659000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Checking for existing BlueStacks installation...
2015-08-08 12:28:26.076000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Showing next screen
2015-08-08 12:28:26.933000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Inside ShowInstallScreen
2015-08-08 12:28:26.937000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Showing Install screen
2015-08-08 12:28:26.938000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Inside Next Clicked
2015-08-08 12:28:26.938000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Showing P2DM options screen
2015-08-08 12:28:26.938000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO P2DM is currently not enabled
2015-08-08 12:28:27.861000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Starting installation...
2015-08-08 12:28:27.865000 BlueStacks-ThinInstaller_0.9.30.(1340: 1 INFO Checking for existing BlueStacks installation...
2015-08-08 12:28:27.905000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO The campaign name is empty
2015-08-08 12:28:27.906000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Starting installation with msiexecArgs: /i "C:\Users\IT\AppData\Local\Temp\BlueStacks_lai2pu14.dka\BlueStacks_HD_AppPlayerSplit_setup_0.9.30.4239_REL.msi" /qn P2DM=1 FEATURES=268435455 OEM=BlueStacks APPPLAYER=YES CAMPAIGNNAME=empty LAUNCHER=ThinInstaller COMMONDATAFOLDER=C:\ProgramData APPNOTIFICATIONS=1
2015-08-08 12:28:27.937000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Waiting for installer to complete...
2015-08-08 12:28:31.351000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Installer exit code: 0
2015-08-08 12:28:31.351000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Will try to send message 1025 to 199170
2015-08-08 12:28:31.351000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO Received message WM_USER_START_AGENT
2015-08-08 12:28:31.352000 BlueStacks-ThinInstaller_0.9.30.(1386: 5 INFO Starting Agent
2015-08-08 12:28:31.352000 BlueStacks-ThinInstaller_0.9.30.(1386: 5 ERROR System.ArgumentNullException: Value cannot be null.
2015-08-08 12:28:31.352000 BlueStacks-ThinInstaller_0.9.30.(1386: Parameter name: path1
2015-08-08 12:28:31.352000 BlueStacks-ThinInstaller_0.9.30.(1386: at System.IO.Path.Combine(String path1, String path2)
2015-08-08 12:28:31.352000 BlueStacks-ThinInstaller_0.9.30.(1386: at BlueStacks.hyperDroid.ThinInstaller.ThinInstallerUi.StartAgent()
2015-08-08 12:28:31.353000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO Processed message WM_USER_START_AGENT
2015-08-08 12:28:31.354000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO product name
2015-08-08 12:28:31.356000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Creating runtime uninstall entry
2015-08-08 12:28:31.359000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 ERROR Failed to create uninstall entry. err: System.ArgumentNullException: Value cannot be null.
2015-08-08 12:28:31.359000 BlueStacks-ThinInstaller_0.9.30.(1340: Parameter name: path1
2015-08-08 12:28:31.359000 BlueStacks-ThinInstaller_0.9.30.(1340: at System.IO.Path.Combine(String path1, String path2)
2015-08-08 12:28:31.359000 BlueStacks-ThinInstaller_0.9.30.(1340: at BlueStacks.hyperDroid.ThinInstaller.ThinInstallerUi.CreateRuntimeUninstallEntry()
2015-08-08 12:28:31.361000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Returning from DoApkToExeStuff
2015-08-08 12:28:31.363000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO Will try to send message 1030 to 199170
2015-08-08 12:28:31.363000 BlueStacks-ThinInstaller_0.9.30.(1386: 1 INFO Received message WM_USER_LAUNCH_FRONTEND
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: 6 INFO Launching frontend
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: 6 ERROR Unhandled Application Exception:
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: 6 ERROR System.ArgumentNullException: Value cannot be null.
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: Parameter name: path1
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: at System.IO.Path.Combine(String path1, String path2)
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: at BlueStacks.hyperDroid.ThinInstaller.ThinInstallerUi.LaunchFrontend()
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: at BlueStacks.hyperDroid.ThinInstaller.ThinInstallerUi.<HandleMessages>b__2c()
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: at BlueStacks.hyperDroid.ThinInstaller.ThinInstallerUi.<>c__DisplayClass35.<PerformAction>b__34()
2015-08-08 12:28:31.364000 BlueStacks-ThinInstaller_0.9.30.(1386: at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
2015-08-08 12:28:31.365000 BlueStacks-ThinInstaller_0.9.30.(1386: at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
2015-08-08 12:28:31.365000 BlueStacks-ThinInstaller_0.9.30.(1386: at System.Threading.ThreadHelper.ThreadStart()
2015-08-08 12:28:36.611000 BlueStacks-ThinInstaller_0.9.30.(1386: 7 INFO the tag File directory is C:\Users\IT\Downloads\BlueStacks-ThinInstaller (1)\
2015-08-08 12:28:36.876000 BlueStacks-ThinInstaller_0.9.30.(1340: 5 INFO In SleepAndExit with 0

Related

How to optimize EmbeddedKafka and Mongo logs in Spring Boot

How do I properly keep only the relevant logs when using MongoDB and Kafka in a Spring Boot application?
2022-08-02 11:14:58.148 INFO 363923 --- [ main] kafka.server.KafkaConfig : KafkaConfig values:
advertised.listeners = null
alter.config.policy.class.name = null
alter.log.dirs.replication.quota.window.num = 11
alter.log.dirs.replication.quota.window.size.seconds = 1
authorizer.class.name =
auto.create.topics.enable = true
auto.leader.rebalance.enable = true
background.threads = 10
broker.heartbeat.interval.ms = 2000
broker.id = 0
broker.id.generation.enable = true
broker.rack = null
broker.session.timeout.ms = 9000
client.quota.callback.class = null
compression.type = producer
connection.failed.authentication.delay.ms = 100
connections.max.idle.ms = 600000
...
2022-08-02 11:15:11.005 INFO 363923 --- [er-event-thread] state.change.logger : [Controller id=0 epoch=1] Changed partition test_cfr_prv_customeragreement_event_disbursement_ini-0 from NewPartition to OnlinePartition with state LeaderAndIsr(leader=0, leaderEpoch=0, isr=List(0), zkVersion=0)
2022-08-02 11:15:11.005 INFO 363923 --- [er-event-thread] state.change.logger : [Controller id=0 epoch=1] Changed partition test_cfr_prv_customeragreement_event_receipt_ini-0 from NewPartition to OnlinePartition with state LeaderAndIsr(leader=0, leaderEpoch=0, isr=List(0), zkVersion=0)
2022-08-02 11:15:11.017 INFO 363923 --- [er-event-thread] state.change.logger : [Controller id=0 epoch=1] Sending LeaderAndIsr request to broker 0 with 2 become-leader and 0 become-follower partitions
2022-08-02 11:15:11.024 INFO 363923 --- [er-event-thread] state.change.logger : [Controller id=0 epoch=1] Sending UpdateMetadata request to brokers HashSet(0) for 2 partitions
2022-08-02 11:15:11.026 INFO 363923 --- [er-event-thread] state.change.logger : [Controller id=0 epoch=1] Sending UpdateMetadata request to brokers HashSet() for 0 partitions
2022-08-02 11:15:11.028 INFO 363923 --- [quest-handler-0] state.change.logger : [Broker id=0] Handling LeaderAndIsr request correlationId 1 from controller 0 for 2 partitions
Example of undesired logs:
2022-08-02 11:15:04.578 INFO 363923 --- [ Thread-3] o.s.b.a.mongo.embedded.EmbeddedMongo : {"t":{"$date":"2022-08-02T11:15:04.578+02:00"},"s":"I", "c":"CONTROL", "id":51765, "ctx":"initandlisten","msg":"Operating System","attr":{"os":{"name":"Ubuntu","version":"20.04"}}}
2022-08-02 11:15:04.579 INFO 363923 --- [ Thread-3] o.s.b.a.mongo.embedded.EmbeddedMongo : {"t":{"$date":"2022-08-02T11:15:04.578+02:00"},"s":"I", "c":"CONTROL", "id":21951, "ctx":"initandlisten","msg":"Options set by command line","attr":{"options":{"net":{"bindIp":"127.0.0.1","port":34085},"replication":{"oplogSizeMB":10,"replSet":"rs0"},"security":{"authorization":"disabled"},"storage":{"dbPath":"/tmp/embedmongo-db-66eab1ce-d099-40ec-96fb-f759ef3808a4","syncPeriodSecs":0}}}}
2022-08-02 11:15:04.585 INFO 363923 --- [ Thread-3] o.s.b.a.mongo.embedded.EmbeddedMongo : {"t":{"$date":"2022-08-02T11:15:04.585+02:00"},"s":"I", "c":"STORAGE", "id":22297, "ctx":"initandlisten","msg":"Using the XFS filesystem is strongly recommended with the WiredTiger storage engine. See http://dochub.mongodb.org/core/prodnotes-filesystem","tags":["startupWarnings"]}
Please find here a link to a sample project: github.com/smaillns/springboot-mongo-kafka
If we run a test we'll get a bunch of logs! What's wrong with the current configuration?
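A common way to cut this down (a minimal sketch, assuming Spring Boot's default Logback setup and the logger names visible in the output above) is to raise the level of the noisy loggers in the test configuration, e.g. in src/test/resources/application.properties:
logging.level.kafka=WARN
logging.level.org.apache.kafka=WARN
logging.level.state.change.logger=WARN
logging.level.org.springframework.boot.autoconfigure.mongo.embedded.EmbeddedMongo=WARN
The same levels can equally be set in a logback-test.xml; which loggers to silence depends on what you still want to see.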

ActiveMQ Artemis: Zombie replica (slave) instance

We have deployed ActiveMQ Artemis v2.19.0 in an HA + cluster configuration, hosted on Kubernetes (non-cloud), and use JGroups KUBE_PING for broker discovery. During regular operations we have 2 primaries and 2 replica brokers and everything looks fine.
For testing, we now remove the replica instances (no Pods left) – and end up with a weird cluster state: 2 primaries – and 1 zombie replica connected to primary 1. The replica instances were shut down (scaling the corresponding StatefulSet to zero), i.e., no hard kill.
Restarting the replicas brings the cluster back to a normal state – sometimes.
According to the docs, the missing broker instances should be removed:
If it has not received a broadcast from a particular server for a length of time it will remove that server's entry from its list.
So the questions are: Why do we see the zombie broker (even after hours)? And how can we get back to a clean state without shutting down all instances?
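For reference, the "length of time" in that quote is the discovery group's refresh-timeout. A minimal sketch of the relevant broker.xml section (not our full configuration; channel and connector names taken from the logs below):
<broadcast-groups>
   <broadcast-group name="bg-group1">
      <jgroups-file>jgroups.xml</jgroups-file>
      <jgroups-channel>active_broadcast_channel</jgroups-channel>
      <broadcast-period>2000</broadcast-period>
      <connector-ref>artemis-tls-connector</connector-ref>
   </broadcast-group>
</broadcast-groups>
<discovery-groups>
   <discovery-group name="dg-group1">
      <jgroups-file>jgroups.xml</jgroups-file>
      <jgroups-channel>active_broadcast_channel</jgroups-channel>
      <!-- a server entry is dropped if no broadcast is received within this window (ms) -->
      <refresh-timeout>10000</refresh-timeout>
   </discovery-group>
</discovery-groups>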
Here is our jgroups.xml:
<config xmlns="urn:org:jgroups"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="urn:org:jgroups http://www.jgroups.org/schema/JGroups-3.0.xsd">
<TCP
enable_diagnostics="true"
bind_addr="match-interface:eth0,lo"
bind_port="7800"
recv_buf_size="20000000"
send_buf_size="640000"
max_bundle_size="64000"
max_bundle_timeout="30"
sock_conn_timeout="300"
thread_pool.enabled="true"
thread_pool.min_threads="2"
thread_pool.max_threads="8"
thread_pool.keep_alive_time="5000"
thread_pool.queue_enabled="true"
thread_pool.queue_max_size="10000"
thread_pool.rejection_policy="run"
oob_thread_pool.enabled="true"
oob_thread_pool.min_threads="1"
oob_thread_pool.max_threads="8"
oob_thread_pool.keep_alive_time="5000"
oob_thread_pool.queue_enabled="true"
oob_thread_pool.queue_max_size="100"
oob_thread_pool.rejection_policy="run"
/>
<TRACE/>
<org.jgroups.protocols.kubernetes.KUBE_PING
namespace="${kubernetesNamespace:default}"
labels="artemis-cluster=${clusterName:activemq-artemis}"
/>
<MERGE3 min_interval="10000" max_interval="30000"/>
<FD_SOCK/>
<FD timeout="3000" max_tries="3" />
<VERIFY_SUSPECT timeout="1500" />
<BARRIER />
<pbcast.NAKACK2 use_mcast_xmit="false" discard_delivered_msgs="true"/>
<UNICAST3
xmit_table_num_rows="100"
xmit_table_msgs_per_row="1000"
xmit_table_max_compaction_time="30000"
/>
<pbcast.STABLE stability_delay="1000" desired_avg_gossip="50000" max_bytes="400000"/>
<pbcast.GMS print_local_addr="true" join_timeout="3000" view_bundling="true"/>
<!-- <FC max_credits="2000000" min_threshold="0.10"/> -->
<MFC max_credits="2M" min_threshold="0.4"/>
<FRAG2 frag_size="60000" />
<pbcast.STATE_TRANSFER/>
<!-- <pbcast.FLUSH timeout="0"/> -->
</config>
Update
Configured logging as advised by Domenico. This time, when we shut down the replica brokers, both continue to exist as zombie instances.
Here are the logs (shutdown of the replica instances started at 2021-12-09T13:03:43Z):
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: <null>, src: <null> (1 headers), size=0 bytes, flags=OOB|INTERNAL, transient_flags=DONT_LOOPBACK] (headers=NAKACK2: [HIGHEST_SEQNO, seqno=1727])
--------------------------------------------------------
---------------- TRACE (received) ----------------------
MSG, arg=[dst: <null>, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=0 bytes, flags=DONT_BUNDLE|INTERNAL] (headers=MERGE3: INFO: view_id=[ha-asa-activemq-artemis-primary-1-53544|5], logical_name=ha-asa-activemq-artemis-primary-1-53544, physical_addr=172.30.20.216:7800, TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------
------------------- TRACE (sent) -----------------------
SET_PHYSICAL_ADDRESS, arg=ha-asa-activemq-artemis-primary-1-53544 : 172.30.20.216:7800
--------------------------------------------------------
---------------- TRACE (received) ----------------------
MSG, arg=[dst: <null>, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=606 bytes] (headers=NAKACK2: [MSG, seqno=1733], TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------
{"timestamp":"2021-12-09T13:07:21.256Z","sequence":11249,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"receiving 606","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.256Z","sequence":11248,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"receiving 606","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.256Z","sequence":11252,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"Received nodeID caec362d-58dc-11ec-9bf0-d2725171aa2d with originatingID = caad7f99-58dc-11ec-867d-ce446123ae5c","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.256Z","sequence":11254,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"Received nodeID d444f140-58dc-11ec-9bf0-d2725171aa2d with originatingID = caad7f99-58dc-11ec-867d-ce446123ae5c","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.256Z","sequence":11256,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"Received 1 discovery entry elements","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.256Z","sequence":11258,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"Received 1 discovery entry elements","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11261,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"DiscoveryEntry[nodeID=caad7f99-58dc-11ec-867d-ce446123ae5c, connector=TransportConfiguration(name=artemis-tls-connector, factory=org-apache-activemq-artemis-core-remoting-impl-netty-NettyConnectorFactory) ?trustStorePassword=****&tcpReceiveBufferSize=1048576&port=61617&sslEnabled=true&host=ha-asa-activemq-artemis-primary-1-ha-asa-activemq-artemis-default-svc-bbscluster-hemisphere-local&trustStorePath=/var/lib/artemis/certs/truststore-jks&useEpoll=true&tcpSendBufferSize=1048576, lastUpdate=1639055241256]","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11260,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"DiscoveryEntry[nodeID=caad7f99-58dc-11ec-867d-ce446123ae5c, connector=TransportConfiguration(name=artemis-tls-connector, factory=org-apache-activemq-artemis-core-remoting-impl-netty-NettyConnectorFactory) ?trustStorePassword=****&tcpReceiveBufferSize=1048576&port=61617&sslEnabled=true&host=ha-asa-activemq-artemis-primary-1-ha-asa-activemq-artemis-default-svc-bbscluster-hemisphere-local&trustStorePath=/var/lib/artemis/certs/truststore-jks&useEpoll=true&tcpSendBufferSize=1048576, lastUpdate=1639055241256]","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11264,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"changed = false","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11266,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"changed = false","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11268,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"Calling notifyAll","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11270,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.core.cluster.DiscoveryGroup","level":"DEBUG","message":"Calling notifyAll","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11272,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.api.core.JGroupsBroadcastEndpoint","level":"TRACE","message":"Receiving Broadcast: clientOpened=true, channelOPen=true","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-892093608)","threadId":78,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
{"timestamp":"2021-12-09T13:07:21.257Z","sequence":11274,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.api.core.JGroupsBroadcastEndpoint","level":"TRACE","message":"Receiving Broadcast: clientOpened=true, channelOPen=true","threadName":"activemq-discovery-group-thread-cluster-discovery-group0 (DiscoveryGroup-176376157)","threadId":91,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
---------------- TRACE (received) ----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-0-4048, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat, TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-1-53544, src: <null> (1 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat ack)
--------------------------------------------------------
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-1-53544, src: <null> (1 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat)
--------------------------------------------------------
---------------- TRACE (received) ----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-0-4048, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat ack, TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------
{"timestamp":"2021-12-09T13:07:22.908Z","sequence":11276,"loggerClassName":"java.util.logging.Logger","loggerName":"org.apache.activemq.artemis.api.core.JGroupsBroadcastEndpoint","level":"TRACE","message":"Broadcasting: BroadCastOpened=true, channelOPen=true","threadName":"Thread-1 (ActiveMQ-scheduled-threads)","threadId":85,"mdc":{},"ndc":"","hostName":"ha-asa-activemq-artemis-primary-0","processName":"Artemis","processId":303}
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: <null>, src: ha-asa-activemq-artemis-primary-0-4048 (1 headers), size=606 bytes, transient_flags=DONT_LOOPBACK] (headers=NAKACK2: [MSG, seqno=1728])
--------------------------------------------------------
---------------- TRACE (received) ----------------------
MSG, arg=[dst: <null>, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=0 bytes, flags=OOB|INTERNAL] (headers=NAKACK2: [HIGHEST_SEQNO, seqno=1733], TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: <null>, src: <null> (1 headers), size=0 bytes, flags=OOB|INTERNAL, transient_flags=DONT_LOOPBACK] (headers=NAKACK2: [HIGHEST_SEQNO, seqno=1728])
--------------------------------------------------------
---------------- TRACE (received) ----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-0-4048, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat, TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-1-53544, src: <null> (1 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat ack)
--------------------------------------------------------
------------------- TRACE (sent) -----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-1-53544, src: <null> (1 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat)
--------------------------------------------------------
---------------- TRACE (received) ----------------------
MSG, arg=[dst: ha-asa-activemq-artemis-primary-0-4048, src: ha-asa-activemq-artemis-primary-1-53544 (2 headers), size=0 bytes, flags=INTERNAL] (headers=FD: heartbeat ack, TP: [cluster_name=active_broadcast_channel])
--------------------------------------------------------

How to collate pytest logging output to console?

I'd like to collate logging output to the console such that the repeated "----ClassName.TestName----" and "-- Captured log call ---" lines are removed or limited to a single entry. The simplified example below, with its output, demonstrates the problem.
desired output:
2020-08-14 13:51:50 INFO test[test_01]
2020-08-14 13:51:50 INFO test[test_02]
2020-08-14 13:51:50 INFO test[test_03]
========= short test summary info =====================================
PASSED tests/test_logging.py::Test_Logging::test_01
PASSED tests/test_logging.py::Test_Logging::test_02
PASSED tests/test_logging.py::Test_Logging::test_03
source code:
import logging
import pytest


#pytest.mark.testing
class Test_Logging:
    _logger = None

    def setup_method(self):
        self._logger = logging.getLogger('Test Logger')

    def test_01(self, request):
        self._logger.info(f"test[{request.node.name}]")

    def test_02(self, request):
        self._logger.info(f"test[{request.node.name}]")

    def test_03(self, request):
        self._logger.info(f"test[{request.node.name}]")
current output:
_____________ Test_Logging.test_01 ____________________________________
-------- Captured log call --------------------------------------------
2020-08-14 13:51:50 INFO test[test_01]
_____________ Test_Logging.test_02 ____________________________________
-------- Captured log call --------------------------------------------
2020-08-14 13:51:50 INFO test[test_02]
_____________ Test_Logging.test_03 ____________________________________
-------- Captured log call --------------------------------------------
2020-08-14 13:51:50 INFO test[test_03]
========= short test summary info =====================================
PASSED tests/test_logging.py::Test_Logging::test_01
PASSED tests/test_logging.py::Test_Logging::test_02
PASSED tests/test_logging.py::Test_Logging::test_03
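One approach (a sketch, not the only way) is to enable pytest's live logging so each record is printed once while the tests run, and to use the lowercase -rp report flag so the short summary still lists PASSED entries without re-printing the per-test capture blocks (the repeated "Captured log call" sections typically come from -rP/-rA). Assuming a pytest.ini at the project root:
[pytest]
log_cli = true
log_cli_level = INFO
log_cli_format = %(asctime)s %(levelname)s %(message)s
log_cli_date_format = %Y-%m-%d %H:%M:%S
addopts = -rp --show-capture=no
With this, the log lines appear once, inline, in roughly the desired format, and --show-capture=no keeps the captured-log sections out of failure reports as well.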

To print output of SparkSQL to dataframe

I'm currently running the ANALYZE command for a particular table and can see the statistics being printed in the Spark console.
However, when I try to write the output to a DataFrame I cannot see the statistics.
Spark version: 1.6.3
val a : DataFrame = sqlContext.sql("ANALYZE TABLE sample PARTITION (company='aaa', market='aab', edate='2019-01-03', pdate='2019-01-10') COMPUTE STATISTICS").collect()
Output in the Spark shell:
Partition sample{company=aaa, market=aab, etdate=2019-01-03, p=2019-01-10} stats: [numFiles=1, numRows=215, totalSize=7551, rawDataSize=461390]
19/03/22 02:49:33 INFO Task: Partition sample{company=aaa, market=aab, edate=2019-01-03, pdate=2019-01-10} stats: [numFiles=1, numRows=215, totalSize=7551, rawDataSize=461390]
Output of the DataFrame:
19/03/22 02:49:33 INFO PerfLogger: </PERFLOG method=runTasks start=1553237373445 end=1553237373606 duration=161 from=org.apache.hadoop.hive.ql.Driver>
19/03/22 02:49:33 INFO PerfLogger: </PERFLOG method=Driver.execute start=1553237373445 end=1553237373606 duration=161 from=org.apache.hadoop.hive.ql.Driver>
19/03/22 02:49:33 INFO Driver: OK
19/03/22 02:49:40 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)
19/03/22 02:49:40 INFO Executor: Finished task 0.0 in stage 2.0 (TID 2). 940 bytes result sent to driver
19/03/22 02:49:40 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 4 ms on localhost (1/1)
19/03/22 02:49:40 INFO DAGScheduler: ResultStage 2 (show at <console>:47) finished in 0.004 s
19/03/22 02:49:40 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
19/03/22 02:49:40 INFO DAGScheduler: Job 2 finished: show at <console>:47, took 0.007774 s
+------+
|result|
+------+
+------+
Could you please let me know how to get the same statistics output into the DataFrame?
Thanks!
If you want to print the DataFrame you are building that way, you can use:
val a : DataFrame = sqlContext.sql("ANALYZE TABLE sample PARTITION (company='aaa', market='aab', edate='2019-01-03', pdate='2019-01-10') COMPUTE STATISTICS")
a.select("*").show()

opentaps installation - Java returned: 99

I am installing opentaps on my system and the error is:
[java] 2016-03-07 11:32:37,428 (main) [ TransactionUtil.java:345:INFO ] [TransactionUtil.rollback] transaction rolled back
[java] 2016-03-07 11:32:37,428 (main) [ EntityDataLoader.java:218:ERROR]
[java] ---- exception report ----------------------------------------------------------
[java] [install.loadData]: Error loading XML Resource "file:/home/oodles/work/skulocity/custom-erp-crm/opentaps/amazon/data/AmazonDemoSetup.xml"; Error was: A transaction error occurred reading data
[java] Exception: org.xml.sax.SAXException
[java] Message: A transaction error occurred reading data
[java] ---- cause ---------------------------------------------------------------------
[java] Exception: org.ofbiz.entity.GenericDataSourceException
[java] Message: SQL Exception occurred on commit (Commit can not be set while enrolled in a transaction)
[java] ---- cause ---------------------------------------------------------------------
[java] Exception: java.sql.SQLException
[java] Message: Commit can not be set while enrolled in a transaction
[java] ---- stack trace ---------------------------------------------------------------
[java] java.sql.SQLException: Commit can not be set while enrolled in a transaction
[java] org.apache.commons.dbcp.managed.ManagedConnection.commit(ManagedConnection.java:214)
[java] org.ofbiz.entity.jdbc.SQLProcessor.commit(SQLProcessor.java:145)
[java] org.ofbiz.entity.jdbc.SQLProcessor.close(SQLProcessor.java:196)
[java] org.ofbiz.entity.datasource.GenericDAO.select(GenericDAO.java:493)
[java] org.ofbiz.entity.datasource.GenericHelperDAO.findByPrimaryKey(GenericHelperDAO.java:80)
[java] org.ofbiz.entity.GenericDelegator.storeAll(GenericDelegator.java:1424)
[java] org.ofbiz.entity.util.EntitySaxReader.writeValues(EntitySaxReader.java:286)
[java] org.ofbiz.entity.util.EntitySaxReader.parse(EntitySaxReader.java:265)
[java] org.ofbiz.entity.util.EntitySaxReader.parse(EntitySaxReader.java:222)
[java] org.ofbiz.entity.util.EntityDataLoader.loadData(EntityDataLoader.java:214)
[java] org.ofbiz.entityext.data.EntityDataLoadContainer.start(EntityDataLoadContainer.java:389)
[java] org.ofbiz.base.container.ContainerLoader.start(ContainerLoader.java:101)
[java] org.ofbiz.base.start.Start.startStartLoaders(Start.java:273)
[java] org.ofbiz.base.start.Start.startServer(Start.java:323)
[java] org.ofbiz.base.start.Start.start(Start.java:327)
[java] org.ofbiz.base.start.Start.main(Start.java:412)
[java] --------------------------------------------------------------------------------
[java]
[java] 2016-03-07 11:32:37,428 (main) [ EntitySaxReader.java:221:INFO ] Beginning import from URL: file:/home/oodles/work/skulocity/custom-erp-crm/opentaps/amazon/data/AmazonDemoData.xml
[java] 2016-03-07 11:32:37,428 (main) [ EntitySaxReader.java:259:INFO ] Transaction Timeout set to 2 hours (7200 seconds)
[java] 2016-03-07 11:32:37,466 (main) [ EntitySaxReader.java:278:INFO ] Finished 13 values from file:/home/oodles/work/skulocity/custom-erp-crm/opentaps/amazon/data/AmazonDemoData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:404:INFO ] =-=-=-=-=-=-= Here is a summary of the data load:
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00024 of 00024 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/security/data/SecurityData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00023 of 00047 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CommonSecurityData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00095 of 00142 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CommonTypeData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00724 of 00866 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CountryCodeData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00169 of 01035 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CurrencyData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00279 of 01314 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00016 of 01330 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_AU.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00056 of 01386 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_BG.xml
[java] 2016-03-07 11:32:37,466 (main) [EntityDataLoadContainer.java:406:INFO ] 00053 of 01439 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_BR.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00066 of 01505 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_CN.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00066 of 01571 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_CO.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00032 of 01603 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_DE.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00138 of 01741 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_ES.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00428 of 02169 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_FR.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00220 of 02389 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_IT.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00070 of 02459 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_IN.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00064 of 02523 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_IRL.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00026 of 02549 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_NL.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00172 of 02721 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_UK.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00240 of 02961 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/GeoData_US.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00433 of 03394 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/LanguageData.xml
[java] 2016-03-07 11:32:37,467 (main) [EntityDataLoadContainer.java:406:INFO ] 00236 of 03630 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/UnitData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00007 of 03637 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/PeriodData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00012 of 03649 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/common/data/CommonPortletData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00008 of 03657 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/service/data/ScheduledServiceData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00003 of 03660 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/service/data/ServiceSecurityData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00012 of 03672 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/entityext/data/EntityExtTypeData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00003 of 03675 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/entityext/data/EntityExtSecurityData.xml
[java] 2016-03-07 11:32:37,468 (main) [EntityDataLoadContainer.java:406:INFO ] 00001 of 03676 from file:/home/oodles/work/skulocity/custom-erp-crm/framework/bi/data/BiTypeData.xml
BUILD FAILED
/home/oodles/work/skulocity/custom-erp-crm/build.xml:510: Java returned: 99
$java -version
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
opentaps=> select version();
version
----------------------------------------------------------------------------------------------------------------
PostgreSQL 9.2.15 on x86_64-unknown-linux-gnu, compiled by gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52), 64-bit
(1 row)
$ant -version
Apache Ant(TM) version 1.9.3 compiled on April 8 2014
I have followed all the steps mentioned in the installation doc.
On executing "ant run-install" it created 1015 tables but failed at the end. Can anybody help me?
In my case $JAVA_HOME was pointing to JDK 1.7 while java -version was showing JDK 1.6, so I changed $JAVA_HOME to 1.6 and it is working fine.
You should use Java 1.6.4.
Check the JAVA_HOME environment variable:
echo %JAVA_HOME% (on Linux: echo $JAVA_HOME)
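For completeness, a sketch of switching to the 1.6 JDK on Linux (the JDK path is only an example; adjust it to where your JDK 1.6 is installed):
export JAVA_HOME=/usr/lib/jvm/java-1.6.0   # example path
export PATH="$JAVA_HOME/bin:$PATH"
java -version                              # should now report 1.6
echo $JAVA_HOME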