Running kube-proxy on Fedora Server 23 panics: runtime error: invalid memory address or nil pointer dereference (Kubernetes)

Fedora:
[root@host3 vagrant]# cat /etc/os-release
NAME=Fedora
VERSION="23 (Twenty Three)"
ID=fedora
VERSION_ID=23
PRETTY_NAME="Fedora 23 (Twenty Three)"
ANSI_COLOR="0;34"
CPE_NAME="cpe:/o:fedoraproject:fedora:23"
HOME_URL="https://fedoraproject.org/"
BUG_REPORT_URL="https://bugzilla.redhat.com/"
REDHAT_BUGZILLA_PRODUCT="Fedora"
REDHAT_BUGZILLA_PRODUCT_VERSION=23
REDHAT_SUPPORT_PRODUCT="Fedora"
REDHAT_SUPPORT_PRODUCT_VERSION=23
PRIVACY_POLICY_URL=https://fedoraproject.org/wiki/Legal:PrivacyPolicy
kube-proxy version:
[root@host3 vagrant]# kube-proxy --version=true
Kubernetes v1.1.2
Run command and error message:
[root@host3 vagrant]# kube-proxy --logtostderr=true --v=0 --master=http://host1:8080 --proxy-mode=userspace --cleanup-iptables=true
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0xb1 pc=0x465e26]
goroutine 1 [running]:
k8s.io/kubernetes/cmd/kube-proxy/app.(*ProxyServer).Run(0xc2080d79d0, 0xc208046af0, 0x0, 0x5, 0x0, 0x0)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/cmd/kube-proxy/app/server.go:309 +0x56
main.main()
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/cmd/kube-proxy/proxy.go:53 +0x225
goroutine 7 [chan receive]:
github.com/golang/glog.(*loggingT).flushDaemon(0x12169e0)
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/golang/glog/glog.go:879 +0x78
created by github.com/golang/glog.init·1
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/golang/glog/glog.go:410 +0x2a7
goroutine 17 [syscall, locked to thread]:
runtime.goexit()
/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1
goroutine 15 [chan receive]:
github.com/godbus/dbus.(*Conn).outWorker(0xc2080daa20)
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/godbus/dbus/conn.go:367 +0x58
created by github.com/godbus/dbus.(*Conn).Auth
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/godbus/dbus/auth.go:119 +0xea1
goroutine 12 [sleep]:
k8s.io/kubernetes/pkg/util.Until(0xda66f0, 0x12a05f200, 0xc20800a7e0)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/util.go:127 +0x98
created by k8s.io/kubernetes/pkg/util.InitLogs
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/logs.go:49 +0xab
goroutine 14 [IO wait]:
net.(*pollDesc).Wait(0xc2080d6920, 0x72, 0x0, 0x0)
/usr/src/go/src/net/fd_poll_runtime.go:84 +0x47
net.(*pollDesc).WaitRead(0xc2080d6920, 0x0, 0x0)
/usr/src/go/src/net/fd_poll_runtime.go:89 +0x43
net.(*netFD).readMsg(0xc2080d68c0, 0xc2080e28c0, 0x10, 0x10, 0xc20816d220, 0x1000, 0x1000, 0xffffffffffffffff, 0x0, 0x0, ...)
/usr/src/go/src/net/fd_unix.go:296 +0x54e
net.(*UnixConn).ReadMsgUnix(0xc20803a0f0, 0xc2080e28c0, 0x10, 0x10, 0xc20816d220, 0x1000, 0x1000, 0x0, 0xc2080e276c, 0x4, ...)
/usr/src/go/src/net/unixsock_posix.go:147 +0x167
github.com/godbus/dbus.(*oobReader).Read(0xc20816d200, 0xc2080e28c0, 0x10, 0x10, 0xc20816d200, 0x0, 0x0)
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/godbus/dbus/transport_unix.go:21 +0xc5
io.ReadAtLeast(0x7f7eeae0df58, 0xc20816d200, 0xc2080e28c0, 0x10, 0x10, 0x10, 0x0, 0x0, 0x0)
/usr/src/go/src/io/io.go:298 +0xf1
io.ReadFull(0x7f7eeae0df58, 0xc20816d200, 0xc2080e28c0, 0x10, 0x10, 0x0, 0x0, 0x0)
/usr/src/go/src/io/io.go:316 +0x6d
github.com/godbus/dbus.(*unixTransport).ReadMessage(0xc2081250d0, 0xc208112660, 0x0, 0x0)
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/godbus/dbus/transport_unix.go:85 +0x1bf
github.com/godbus/dbus.(*Conn).inWorker(0xc2080daa20)
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/godbus/dbus/conn.go:241 +0x58
created by github.com/godbus/dbus.(*Conn).Auth
/go/src/k8s.io/kubernetes/Godeps/_workspace/src/github.com/godbus/dbus/auth.go:118 +0xe84
goroutine 16 [runnable]:
k8s.io/kubernetes/pkg/util/iptables.(*runner).dbusSignalHandler(0xc2080d6850, 0x7f7eeae0e028, 0xc20803a100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/iptables/iptables.go:525
created by k8s.io/kubernetes/pkg/util/iptables.(*runner).connectToFirewallD
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/iptables/iptables.go:186 +0x7a7
Can anyone help me?

This looks like a bug when using the --cleanup-iptables=true flag in the 1.1.2 release, as I can reproduce a panic when running on a GCE node. I've created kubernetes#18197 on your behalf and this bug will be fixed in the upcoming 1.1.3 release.
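As an illustration of the failure class, here is a minimal, hypothetical Go sketch (not the actual kube-proxy source): Run dereferences a field that the cleanup-only code path never initialized, which produces exactly this kind of panic.

package main

// Hypothetical reduction of the bug class; names do not match kube-proxy.
type proxier struct {
	rules []string
}

func (p *proxier) syncLoop() {
	_ = p.rules // dereferences p, so a nil receiver panics here
}

type proxyServer struct {
	proxier *proxier // assume the cleanup-only path leaves this nil
}

func (s *proxyServer) run() {
	// Panics with "invalid memory address or nil pointer dereference"
	// because s.proxier was never set on the cleanup path.
	s.proxier.syncLoop()
}

func main() {
	s := &proxyServer{} // constructed without a proxier, as on cleanup
	s.run()
}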

Related

Connection between CLI and Peer/Orderer not working properly (Kubernetes setup)

I'm running a network in a Kubernetes cluster and have a CLI, a Peer and an Orderer of the same organization, each running in its own Pod.
I can do channel creation, chaincode installation, approval and committing without problems. However, when it comes to chaincode invocation, the CLI outputs that the chaincode might not be installed, while the Peer logs a failed connection to the CLI.
So here's the CLI command (update: with -o org1-orderer:30011):
$ export CORE_PEER_MSPCONFIGPATH=/config/admin/msp
$ peer chaincode invoke -C channel1 -n cc-abac -c '{"Args":["invoke","a","b","10"]}' -o org1-orderer:30011 --clientauth --tls --cafile /config/peer/tls-msp/tlscacerts/ca-cert.pem --keyfile /config/peer/tls-msp/keystore/key.pem --certfile /config/peer/tls-msp/signcerts/cert.pem
CLI Output:
2020-07-07 16:47:20.918 UTC [msp] loadCertificateAt -> WARN 001 Failed loading ClientOU certificate at [/config/admin/msp]: [could not read file /config/admin/msp: read /config/admin/msp: is a directory]
2020-07-07 16:47:20.919 UTC [msp] loadCertificateAt -> WARN 002 Failed loading PeerOU certificate at [/config/admin/msp]: [could not read file /config/admin/msp: read /config/admin/msp: is a directory]
2020-07-07 16:47:20.919 UTC [msp] loadCertificateAt -> WARN 003 Failed loading AdminOU certificate at [/config/admin/msp]: [could not read file /config/admin/msp: read /config/admin/msp: is a directory]
2020-07-07 16:47:20.919 UTC [msp] loadCertificateAt -> WARN 004 Failed loading OrdererOU certificate at [/config/admin/msp]: [could not read file /config/admin/msp: read /config/admin/msp: is a directory]
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 005 parsed scheme: ""
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 006 scheme "" not registered, fallback to default scheme
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 007 ccResolverWrapper: sending update to cc: {[{org1-peer1:30151 <nil> 0 <nil>}] <nil> <nil>}
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 008 ClientConn switching balancer to "pick_first"
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 009 Channel switches to new LB policy "pick_first"
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 00a Subchannel Connectivity change to CONNECTING
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 00b Subchannel picks a new address "org1-peer1:30151" to connect
2020-07-07 16:47:20.928 UTC [grpc] UpdateSubConnState -> DEBU 00c pickfirstBalancer: HandleSubConnStateChange: 0xc000114450, {CONNECTING <nil>}
2020-07-07 16:47:20.928 UTC [grpc] Infof -> DEBU 00d Channel Connectivity change to CONNECTING
2020-07-07 16:47:20.935 UTC [grpc] Infof -> DEBU 00e Subchannel Connectivity change to READY
2020-07-07 16:47:20.935 UTC [grpc] UpdateSubConnState -> DEBU 00f pickfirstBalancer: HandleSubConnStateChange: 0xc000114450, {READY <nil>}
2020-07-07 16:47:20.935 UTC [grpc] Infof -> DEBU 010 Channel Connectivity change to READY
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 011 parsed scheme: ""
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 012 scheme "" not registered, fallback to default scheme
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 013 ccResolverWrapper: sending update to cc: {[{org1-peer1:30151 <nil> 0 <nil>}] <nil> <nil>}
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 014 ClientConn switching balancer to "pick_first"
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 015 Channel switches to new LB policy "pick_first"
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 016 Subchannel Connectivity change to CONNECTING
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 017 Subchannel picks a new address "org1-peer1:30151" to connect
2020-07-07 16:47:20.948 UTC [grpc] UpdateSubConnState -> DEBU 018 pickfirstBalancer: HandleSubConnStateChange: 0xc000496070, {CONNECTING <nil>}
2020-07-07 16:47:20.948 UTC [grpc] Infof -> DEBU 019 Channel Connectivity change to CONNECTING
2020-07-07 16:47:20.954 UTC [grpc] Infof -> DEBU 01a Subchannel Connectivity change to READY
2020-07-07 16:47:20.955 UTC [grpc] UpdateSubConnState -> DEBU 01b pickfirstBalancer: HandleSubConnStateChange: 0xc000496070, {READY <nil>}
2020-07-07 16:47:20.955 UTC [grpc] Infof -> DEBU 01c Channel Connectivity change to READY
2020-07-07 16:47:20.987 UTC [chaincodeCmd] InitCmdFactory -> INFO 01d Retrieved channel (channel1) orderer endpoint: org1-orderer:30011
2020-07-07 16:47:20.991 UTC [grpc] WithKeepaliveParams -> DEBU 01e Adjusting keepalive ping interval to minimum period of 10s
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 01f parsed scheme: ""
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 020 scheme "" not registered, fallback to default scheme
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 021 ccResolverWrapper: sending update to cc: {[{org1-orderer:30011 <nil> 0 <nil>}] <nil> <nil>}
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 022 ClientConn switching balancer to "pick_first"
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 023 Channel switches to new LB policy "pick_first"
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 024 Subchannel Connectivity change to CONNECTING
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 025 Subchannel picks a new address "org1-orderer:30011" to connect
2020-07-07 16:47:20.991 UTC [grpc] UpdateSubConnState -> DEBU 026 pickfirstBalancer: HandleSubConnStateChange: 0xc000205a60, {CONNECTING <nil>}
2020-07-07 16:47:20.991 UTC [grpc] Infof -> DEBU 027 Channel Connectivity change to CONNECTING
2020-07-07 16:47:21.000 UTC [grpc] Infof -> DEBU 028 Subchannel Connectivity change to READY
2020-07-07 16:47:21.000 UTC [grpc] UpdateSubConnState -> DEBU 029 pickfirstBalancer: HandleSubConnStateChange: 0xc000205a60, {READY <nil>}
2020-07-07 16:47:21.000 UTC [grpc] Infof -> DEBU 02a Channel Connectivity change to READY
Error: endorsement failure during invoke. response: status:500 message:"make sure the chaincode cc-abac has been successfully defined on channel channel1 and try again: chaincode definition for 'cc-abac' exists, but chaincode is not installed"
I'm sure it's installed on channel1 (the only channel in existence, except sys-channel):
$ peer lifecycle chaincode queryinstalled
Installed chaincodes on peer:
Package ID: cc-abac:4992a37bf5c7b48f91f5062d9700a58a4129599c53d759e8282fdeffc8836c72, Label: cc-abac
On the Peer's side, I get the following in the log (updated):
2020-07-09 06:45:55.976 UTC [gossip.discovery] periodicalSendAlive -> DEBU 194c Sleeping 5s
2020-07-09 06:45:56.182 UTC [endorser] ProcessProposal -> DEBU 194d request from 10.129.1.229:60184
2020-07-09 06:45:56.182 UTC [endorser] Validate -> DEBU 194e creator is valid channel=channel1 txID=a71312e4 mspID=Org1MSP
2020-07-09 06:45:56.182 UTC [msp.identity] Verify -> DEBU 194f Verify: digest = 00000000 87 29 a0 e5 96 b8 5f 5e 9b e0 fb e5 4d 5b 86 b2 |.)...._^....M[..|
00000010 bd 43 ee 30 59 d6 a9 55 e3 e9 77 7b fd a2 47 8f |.C.0Y..U..w{..G.|
2020-07-09 06:45:56.182 UTC [msp.identity] Verify -> DEBU 1950 Verify: sig = 00000000 30 45 02 21 00 f0 6b 23 9d f6 ec f2 29 be 64 4e |0E.!..k#....).dN|
00000010 75 69 a7 05 7e 05 71 51 64 6c 52 59 83 be ea f9 |ui..~.qQdlRY....|
00000020 08 5e 07 09 f3 02 20 7a f7 b0 6c e0 bb 32 b9 0c |.^.... z..l..2..|
00000030 8c 41 be b8 ea 39 33 91 92 0b 08 9e c6 14 39 e8 |.A...93.......9.|
00000040 46 eb a5 80 7a 7d d1 |F...z}.|
2020-07-09 06:45:56.182 UTC [endorser] Validate -> DEBU 1951 signature is valid channel=channel1 txID=a71312e4 mspID=Org1MSP
2020-07-09 06:45:56.182 UTC [fsblkstorage] retrieveTransactionByID -> DEBU 1952 retrieveTransactionByID() - txId = [a71312e411a6b417a541112e2aeac73adc8d6f7fbbb3c62ffcad2348e0c91fac]
2020-07-09 06:45:56.182 UTC [leveldbhelper] GetIterator -> DEBU 1953 Getting iterator for range [[]byte{0x63, 0x68, 0x61, 0x6e, 0x6e, 0x65, 0x6c, 0x31, 0x0, 0x74, 0x1, 0x40, 0x61, 0x37, 0x31, 0x33, 0x31, 0x32, 0x65, 0x34, 0x31, 0x31, 0x61, 0x36, 0x62, 0x34, 0x31, 0x37, 0x61, 0x35, 0x34, 0x31, 0x31, 0x31, 0x32, 0x65, 0x32, 0x61, 0x65, 0x61, 0x63, 0x37, 0x33, 0x61, 0x64, 0x63, 0x38, 0x64, 0x36, 0x66, 0x37, 0x66, 0x62, 0x62, 0x62, 0x33, 0x63, 0x36, 0x32, 0x66, 0x66, 0x63, 0x61, 0x64, 0x32, 0x33, 0x34, 0x38, 0x65, 0x30, 0x63, 0x39, 0x31, 0x66, 0x61, 0x63}] - [[]byte{0x63, 0x68, 0x61, 0x6e, 0x6e, 0x65, 0x6c, 0x31, 0x0, 0x74, 0x1, 0x40, 0x61, 0x37, 0x31, 0x33, 0x31, 0x32, 0x65, 0x34, 0x31, 0x31, 0x61, 0x36, 0x62, 0x34, 0x31, 0x37, 0x61, 0x35, 0x34, 0x31, 0x31, 0x31, 0x32, 0x65, 0x32, 0x61, 0x65, 0x61, 0x63, 0x37, 0x33, 0x61, 0x64, 0x63, 0x38, 0x64, 0x36, 0x66, 0x37, 0x66, 0x62, 0x62, 0x62, 0x33, 0x63, 0x36, 0x32, 0x66, 0x66, 0x63, 0x61, 0x64, 0x32, 0x33, 0x34, 0x38, 0x65, 0x30, 0x63, 0x39, 0x31, 0x66, 0x61, 0x63, 0xff}]
2020-07-09 06:45:56.182 UTC [aclmgmt] CheckACL -> DEBU 1954 acl policy /Channel/Application/Writers found in config for resource peer/Propose
2020-07-09 06:45:56.182 UTC [aclmgmt] CheckACL -> DEBU 1955 acl check(/Channel/Application/Writers)
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1956 == Evaluating *policies.ImplicitMetaPolicy Policy /Channel/Application/Writers ==
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1957 This is an implicit meta policy, it will trigger other policy evaluations, whose failures may be benign
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1958 == Evaluating *cauthdsl.policy Policy /Channel/Application/Org1/Writers ==
2020-07-09 06:45:56.183 UTC [msp.identity] Verify -> DEBU 1959 Verify: digest = 00000000 87 29 a0 e5 96 b8 5f 5e 9b e0 fb e5 4d 5b 86 b2 |.)...._^....M[..|
00000010 bd 43 ee 30 59 d6 a9 55 e3 e9 77 7b fd a2 47 8f |.C.0Y..U..w{..G.|
2020-07-09 06:45:56.183 UTC [msp.identity] Verify -> DEBU 195a Verify: sig = 00000000 30 45 02 21 00 f0 6b 23 9d f6 ec f2 29 be 64 4e |0E.!..k#....).dN|
00000010 75 69 a7 05 7e 05 71 51 64 6c 52 59 83 be ea f9 |ui..~.qQdlRY....|
00000020 08 5e 07 09 f3 02 20 7a f7 b0 6c e0 bb 32 b9 0c |.^.... z..l..2..|
00000030 8c 41 be b8 ea 39 33 91 92 0b 08 9e c6 14 39 e8 |.A...93.......9.|
00000040 46 eb a5 80 7a 7d d1 |F...z}.|
2020-07-09 06:45:56.183 UTC [policies] SignatureSetToValidIdentities -> DEBU 195b signature for identity 0 validated
2020-07-09 06:45:56.183 UTC [cauthdsl] func1 -> DEBU 195c 0xc0006210e0 gate 1594277156183221199 evaluation starts
2020-07-09 06:45:56.183 UTC [cauthdsl] func2 -> DEBU 195d 0xc0006210e0 signed by 0 principal evaluation starts (used [false])
2020-07-09 06:45:56.183 UTC [cauthdsl] func2 -> DEBU 195e 0xc0006210e0 processing identity 0 - &{Org1MSP 0b33fd619da73c0915b76088b0678047f834593ea6a4f22f0772b36f3c6bd68f}
2020-07-09 06:45:56.183 UTC [cauthdsl] func2 -> DEBU 195f 0xc0006210e0 principal evaluation succeeds for identity 0
2020-07-09 06:45:56.183 UTC [cauthdsl] func1 -> DEBU 1960 0xc0006210e0 gate 1594277156183221199 evaluation succeeds
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1961 Signature set satisfies policy /Channel/Application/Org1/Writers
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1962 == Done Evaluating *cauthdsl.policy Policy /Channel/Application/Org1/Writers
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1963 Signature set satisfies policy /Channel/Application/Writers
2020-07-09 06:45:56.183 UTC [policies] EvaluateSignedData -> DEBU 1964 == Done Evaluating *policies.ImplicitMetaPolicy Policy /Channel/Application/Writers
2020-07-09 06:45:56.183 UTC [lockbasedtxmgr] NewTxSimulator -> DEBU 1965 constructing new tx simulator
2020-07-09 06:45:56.183 UTC [lockbasedtxmgr] newLockBasedTxSimulator -> DEBU 1966 constructing new tx simulator txid = [a71312e411a6b417a541112e2aeac73adc8d6f7fbbb3c62ffcad2348e0c91fac]
2020-07-09 06:45:56.183 UTC [stateleveldb] GetState -> DEBU 1967 GetState(). ns=_lifecycle, key=namespaces/fields/cc-abac/Sequence
2020-07-09 06:45:56.183 UTC [lockbasedtxmgr] Done -> DEBU 1968 Done with transaction simulation / query execution [a71312e411a6b417a541112e2aeac73adc8d6f7fbbb3c62ffcad2348e0c91fac]
2020-07-09 06:45:56.183 UTC [comm.grpc.server] 1 -> INFO 1969 unary call completed grpc.service=protos.Endorser grpc.method=ProcessProposal grpc.peer_address=10.129.1.229:60184 grpc.peer_subject="CN=org1-peer1,OU=peer,O=Hyperledger,ST=North Carolina,C=US" grpc.code=OK grpc.call_duration=1.225382ms
2020-07-09 06:45:56.186 UTC [grpc] warningf -> DEBU 196a transport: http2Server.HandleStreams failed to read frame: read tcp 10.130.2.65:7051->10.129.1.229:60184: read: connection reset by peer
2020-07-09 06:45:56.186 UTC [grpc] infof -> DEBU 196b transport: loopyWriter.run returning. connection error: desc = "transport is closing"
2020-07-09 06:45:56.186 UTC [grpc] infof -> DEBU 196c transport: loopyWriter.run returning. connection error: desc = "transport is closing"
[update] The message unary call completed grpc.service=protos.Endorser grpc.method=ProcessProposal grpc.peer_address=10.129.1.229:60184 grpc.peer_subject="CN=org1-peer1,OU=peer,O=Hyperledger,ST=North Carolina,C=US" grpc.code=OK grpc.call_duration=1.225382ms indicates that the Peer considers the CLI to be another Peer, doesn't it? If so, it's clear why the connection is failing. The question then is: why does the Peer think so?
Peer: 10.130.2.65
CLI: 10.129.1.229
Kind regards
Unfortunately, all of the gRPC logs and k8s-related issues appear to be a red herring. The connection is being established correctly; the term 'peer' is simply a little confusing in the gRPC logs, as gRPC always refers to the party on the other end of the connection as a 'peer'. The term is reused with a different meaning in Fabric.
As the logs indicate, the chaincode has been successfully approved and defined on the channel.
As the peer CLI output indicates, you have installed a chaincode with package ID cc-abac:4992a37bf5c7b48f91f5062d9700a58a4129599c53d759e8282fdeffc8836c72.
But on invoke, you are seeing the error:
chaincode definition for 'cc-abac' exists, but chaincode is not installed
This means that when you did your chaincode approval, you either did not specify a package ID or specified an incorrect one.
If you are using Fabric v2.2+, you should be able to use the peer lifecycle chaincode queryapproved utility to see which package ID you selected.
You can re-run peer lifecycle chaincode approveformyorg with the correct package ID (cc-abac:4992a37bf5c7b48f91f5062d9700a58a4129599c53d759e8282fdeffc8836c72), and this should correct things.
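For example (a sketch; <version> and <sequence> are placeholders, and you would reuse the same orderer and TLS flags as in your invoke command):
$ peer lifecycle chaincode queryapproved -C channel1 -n cc-abac
$ peer lifecycle chaincode approveformyorg -C channel1 -n cc-abac -v <version> --sequence <sequence> --package-id cc-abac:4992a37bf5c7b48f91f5062d9700a58a4129599c53d759e8282fdeffc8836c72
Once the approved definition references the installed package ID, the endorsement should succeed.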

HID descriptor + report for iOS Home button?

I'm trying to use an Arduino to create a single-button Bluetooth keyboard and struggling to construct a valid HID descriptor. I've been able to send key events to my iOS device using the default generic desktop keyboard HID descriptor, but once I try using the following HID descriptor I'm unable to trigger a home button event (AC Home: 0x0223) when I send HID reports to toggle bit 0 from 0 → 1 → 0:
0x05, 0x0c, // USAGE_PAGE (Consumer Devices)
0x09, 0x01, // USAGE (Consumer Control)
0xa1, 0x01, // COLLECTION (Application)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x25, 0x01, // LOGICAL_MAXIMUM (1)
0x75, 0x01, // REPORT_SIZE (1)
0x95, 0x01, // REPORT_COUNT (1)
0x0c, 0x02, 0x23, // USAGE (AC Home)
0x81, 0x06, // INPUT (Data,Var,Rel)
0x95, 0x07, // REPORT_COUNT (7 bytes of padding)
0x81, 0x03, // INPUT (Cnst,Var,Abs)
0xc0 // END_COLLECTION
Am I missing something in the construction of my HID descriptor? Is AC Home not the correct usage ID for the home button in iOS?
Any help would be greatly appreciated!
Yes, there is a small error in your descriptor:
0x0c, 0x02, 0x23, // USAGE (AC Home)
should be:
0x0a, 0x23, 0x02, // USAGE (AC Home)
A USAGE item is a Local item whose prefix byte carries the data size in its low two bits, so 0x09 introduces a one-byte usage and 0x0a a two-byte usage, with the data stored little-endian; that is why usage 0x0223 is encoded as 0x23, 0x02.
Your current descriptor decodes as:
//--------------------------------------------------------------------------------
// Decoded Application Collection
//--------------------------------------------------------------------------------
/*
05 0C (GLOBAL) USAGE_PAGE 0x000C Consumer Device Page
09 01 (LOCAL) USAGE 0x000C0001 Consumer Control (Application Collection)
A1 01 (MAIN) COLLECTION 0x01 Application (Usage=0x000C0001: Page=Consumer Device Page, Usage=Consumer Control, Type=Application Collection)
15 00 (GLOBAL) LOGICAL_MINIMUM 0x00 (0) <-- Info: Consider replacing 15 00 with 14
25 01 (GLOBAL) LOGICAL_MAXIMUM 0x01 (1)
75 01 (GLOBAL) REPORT_SIZE 0x01 (1) Number of bits per field
95 01 (GLOBAL) REPORT_COUNT 0x01 (1) Number of fields
0C (ERROR) <-- Error: Item (0C) is not a MAIN, GLOBAL or LOCAL item
02 2381 (MAIN) <-- Error: Item (02) is not a MAIN item. Expected INPUT(8x) OUTPUT(9x) FEATURE(Bx) COLLECTION(Ax) or END_COLLECTION(Cx) (where x = 0,1,2,3).
06 9507 (GLOBAL) USAGE_PAGE 0x0795 Reserved
81 03 (MAIN) INPUT 0x00000003 (1 field x 1 bit) 1=Constant 1=Variable 0=Absolute 0=NoWrap 0=Linear 0=PrefState 0=NoNull 0=NonVolatile 0=Bitmap
C0 (MAIN) END_COLLECTION Application
*/
//--------------------------------------------------------------------------------
// Reserved inputReport (Device --> Host)
//--------------------------------------------------------------------------------
typedef struct
{
// No REPORT ID byte
// Collection: CA:ConsumerControl
uint8_t : 1; // Pad
} inputReport_t;
After the above change is implemented it looks in better shape:
//--------------------------------------------------------------------------------
// Decoded Application Collection
//--------------------------------------------------------------------------------
/*
05 0C (GLOBAL) USAGE_PAGE 0x000C Consumer Device Page
09 01 (LOCAL) USAGE 0x000C0001 Consumer Control (Application Collection)
A1 01 (MAIN) COLLECTION 0x01 Application (Usage=0x000C0001: Page=Consumer Device Page, Usage=Consumer Control, Type=Application Collection)
15 00 (GLOBAL) LOGICAL_MINIMUM 0x00 (0) <-- Info: Consider replacing 15 00 with 14
25 01 (GLOBAL) LOGICAL_MAXIMUM 0x01 (1)
75 01 (GLOBAL) REPORT_SIZE 0x01 (1) Number of bits per field
95 01 (GLOBAL) REPORT_COUNT 0x01 (1) Number of fields
0A 2302 (LOCAL) USAGE 0x000C0223 AC Home (Selector)
81 06 (MAIN) INPUT 0x00000006 (1 field x 1 bit) 0=Data 1=Variable 1=Relative 0=NoWrap 0=Linear 0=PrefState 0=NoNull 0=NonVolatile 0=Bitmap
95 07 (GLOBAL) REPORT_COUNT 0x07 (7) Number of fields
81 03 (MAIN) INPUT 0x00000003 (7 fields x 1 bit) 1=Constant 1=Variable 0=Absolute 0=NoWrap 0=Linear 0=PrefState 0=NoNull 0=NonVolatile 0=Bitmap
C0 (MAIN) END_COLLECTION Application
*/
//--------------------------------------------------------------------------------
// Consumer Device Page inputReport (Device --> Host)
//--------------------------------------------------------------------------------
typedef struct
{
// No REPORT ID byte
// Collection: CA:ConsumerControl
uint8_t CD_ConsumerControlAcHome : 1; // Usage 0x000C0223: AC Home, Value = 0 to 1
uint8_t : 1; // Pad
uint8_t : 1; // Pad
uint8_t : 1; // Pad
uint8_t : 1; // Pad
uint8_t : 1; // Pad
uint8_t : 1; // Pad
uint8_t : 1; // Pad
} inputReport_t;
The HID descriptor above was decoded with hidrdd (freeware), available on GitHub or SourceForge.
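As a cross-check of that fix, here is a small, hypothetical Go sketch (not part of the original answer) that serializes a two-byte USAGE item using the standard HID short-item encoding:

package main

import "fmt"

// usage builds a Local USAGE item (tag 0x0, type Local) for a 16-bit usage ID.
func usage(id uint16) []byte {
	const prefix = 0x0<<4 | 2<<2 | 2 // tag=USAGE, type=Local, size=2 -> 0x0A
	return []byte{prefix, byte(id), byte(id >> 8)} // data is little-endian
}

func main() {
	fmt.Printf("% X\n", usage(0x0223)) // prints: 0A 23 02 (AC Home)
}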

Heapster CrashLoopBackOff with StatsD sink configuration

I get an error with the latest Heapster version, v1.5.1. I've described it in detail in this GitHub issue: https://github.com/kubernetes/heapster/issues/1969
The error message (container log, shown oldest line first):
I0302 04:03:20.257820 1 heapster.go:78] /heapster --source=kubernetes:https://kubernetes.default --sink="statsd:udp://dd-agent-service.default:8125"
I0302 04:03:20.257857 1 heapster.go:79] Heapster version v1.5.1
I0302 04:03:20.258013 1 configs.go:61] Using Kubernetes client with master "https://kubernetes.default" and version v1
I0302 04:03:20.258025 1 configs.go:62] Using kubelet port 10255
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x28 pc=0x15f77a3]
goroutine 1 [running]:
k8s.io/heapster/metrics/sinks.(*SinkFactory).BuildAll(0xc4205e1cd8, 0xc42028f3b0, 0x1, 0x1, 0x0, 0x0, 0x7fb87afb8400, 0x0, 0x0, 0x0, ...)
/go/src/k8s.io/heapster/metrics/sinks/factory.go:90 +0x563
main.createAndInitSinksOrDie(0xc42028f3b0, 0x1, 0x1, 0x0, 0x0, 0x4a817c800, 0x0, 0x0, 0xc420154740, 0x0, ...)
/go/src/k8s.io/heapster/metrics/heapster.go:194 +0x8d
main.main()
/go/src/k8s.io/heapster/metrics/heapster.go:89 +0x458
Does anybody know how to solve this? Perhaps someone has already successfully integrated Heapster with the Datadog StatsD agent in Kubernetes?
Thanks in advance.
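A quick way to see such a panic in its original order is to pull the previous (crashed) container's log with kubectl; this assumes Heapster runs in kube-system, and the pod name is a placeholder:
$ kubectl -n kube-system logs <heapster-pod-name> --previous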

Kafka Burrow stopped after running for a while

I'm trying to monitor consumer lag in Kafka with Burrow. I can get results from the HTTP endpoint, but only for a while: after about one minute, Burrow stops responding and port 8000 is closed.
I have ZooKeeper installed on the same hosts as the Kafka instances. Here are my configuration and error logs.
burrow.cfg
[general]
logdir=log
logconfig=config/logging.cfg
pidfile=burrow.pid
client-id=burrow-lagchecker
group-blacklist=^(console-consumer-|python-kafka-consumer-).*$
[zookeeper]
hostname=kafka01
hostname=kafka02
hostname=kafka03
port=2181
timeout=6
lock-path=/burrow/notifier
[kafka "TestEnvironment"]
broker=kafka01
broker=kafka02
broker=kafka03
broker-port=6667
zookeeper=kafka01
zookeeper=kafka02
zookeeper=kafka03
zookeeper-port=2181
zookeeper-path=/kafka-cluster
offsets-topic=__consumer_offsets
[tickers]
broker-offsets=60
[lagcheck]
intervals=10
expire-group=604800
[httpserver]
server=on
port=8000
[smtp]
server=mailserver.example.com
port=25
from=burrow-noreply@example.com
template=config/default-email.tmpl
[email "bofh#example.com"]
group=local,critical-consumer-group
group=local,other-consumer-group
interval=60
[httpnotifier]
url=http://notification.server.example.com:9000/v1/alert
interval=60
extra=app=burrow
extra=tier=STG
template-post=config/default-http-post.tmpl
template-delete=config/default-http-delete.tmpl
burrow.log
2015-09-16 06:02:28 [INFO] Starting Zookeeper client
2015-09-16 06:02:28 [INFO] Starting Offsets Storage module
2015-09-16 06:02:28 [INFO] Starting HTTP server
2015-09-16 06:02:28 [INFO] Starting Zookeeper client for cluster TestEnvironment
2015-09-16 06:02:28 [INFO] Starting Kafka client for cluster TestEnvironment
2015-09-16 06:02:28 [INFO] Starting consumers for 1 partitions of __consumer_offsets in cluster TestEnvironment
2015-09-16 06:02:28 [INFO] Configuring Email notifier
2015-09-16 06:02:28 [INFO] Configuring HTTP notifier
2015-09-16 06:02:28 [INFO] Acquired Zookeeper notifier lock
2015-09-16 06:02:28 [INFO] Starting Email notifier
2015-09-16 06:02:28 [INFO] Starting HTTP notifier
burrow.out
Started Burrow at September 16, 2015 at 6:02am (UTC)
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x18 pc=0x4172e2]
goroutine 183 [running]:
main.(*OffsetStorage).evaluateGroup(0xc8201d95c0, 0xc8200e00c0, 0x5, 0xc8200e00c6, 0x17, 0xc8204545a0)
/home/dwirawan/work/src/github.com/linkedin/burrow/offsets_store.go:337 +0x182
created by main.NewOffsetStorage.func1
/home/dwirawan/work/src/github.com/linkedin/burrow/offsets_store.go:188 +0x43f
goroutine 1 [chan receive]:
main.burrowMain(0x0)
/home/dwirawan/work/src/github.com/linkedin/burrow/main.go:194 +0x1c2b
main.main()
/home/dwirawan/work/src/github.com/linkedin/burrow/main.go:200 +0x33
goroutine 17 [syscall, 1 minutes, locked to thread]:
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1696 +0x1
goroutine 5 [semacquire, 1 minutes]:
sync.runtime_Syncsemacquire(0xc820019150)
/usr/local/go/src/runtime/sema.go:237 +0x201
sync.(*Cond).Wait(0xc820019140)
/usr/local/go/src/sync/cond.go:62 +0x9b
github.com/cihub/seelog.(*asyncLoopLogger).processItem(0xc82001c600, 0x0)
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:50 +0xc7
github.com/cihub/seelog.(*asyncLoopLogger).processQueue(0xc82001c600)
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:63 +0x2a
created by github.com/cihub/seelog.newAsyncLoopLogger
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:40 +0x91
goroutine 6 [semacquire, 1 minutes]:
sync.runtime_Syncsemacquire(0xc8200192d0)
/usr/local/go/src/runtime/sema.go:237 +0x201
sync.(*Cond).Wait(0xc8200192c0)
/usr/local/go/src/sync/cond.go:62 +0x9b
github.com/cihub/seelog.(*asyncLoopLogger).processItem(0xc82001c720, 0x0)
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:50 +0xc7
github.com/cihub/seelog.(*asyncLoopLogger).processQueue(0xc82001c720)
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:63 +0x2a
created by github.com/cihub/seelog.newAsyncLoopLogger
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:40 +0x91
goroutine 7 [syscall, 1 minutes]:
os/signal.loop()
/usr/local/go/src/os/signal/signal_unix.go:22 +0x18
created by os/signal.init.1
/usr/local/go/src/os/signal/signal_unix.go:28 +0x37
goroutine 8 [semacquire]:
sync.runtime_Syncsemacquire(0xc820316710)
/usr/local/go/src/runtime/sema.go:237 +0x201
sync.(*Cond).Wait(0xc820316700)
/usr/local/go/src/sync/cond.go:62 +0x9b
github.com/cihub/seelog.(*asyncLoopLogger).processItem(0xc8200dd800, 0x0)
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:50 +0xc7
github.com/cihub/seelog.(*asyncLoopLogger).processQueue(0xc8200dd800)
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:63 +0x2a
created by github.com/cihub/seelog.newAsyncLoopLogger
/home/dwirawan/work/src/github.com/cihub/seelog/behavior_asynclooplogger.go:40 +0x91
goroutine 9 [semacquire, 1 minutes]:
sync.runtime_Semacquire(0xc82079201c)
/usr/local/go/src/runtime/sema.go:43 +0x26
sync.(*WaitGroup).Wait(0xc820792010)
/usr/local/go/src/sync/waitgroup.go:126 +0xb4
github.com/samuel/go-zookeeper/zk.(*Conn).loop(0xc820069e10)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:227 +0x671
github.com/samuel/go-zookeeper/zk.ConnectWithDialer.func1(0xc820069e10)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:145 +0x21
created by github.com/samuel/go-zookeeper/zk.ConnectWithDialer
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:149 +0x452
goroutine 10 [select]:
main.NewOffsetStorage.func1(0xc8201d95c0)
/home/dwirawan/work/src/github.com/linkedin/burrow/offsets_store.go:168 +0x4a8
created by main.NewOffsetStorage
/home/dwirawan/work/src/github.com/linkedin/burrow/offsets_store.go:199 +0x4b7
goroutine 11 [IO wait]:
net.runtime_pollWait(0x7f17dc7d8fb0, 0x72, 0xc820010190)
/usr/local/go/src/runtime/netpoll.go:157 +0x60
net.(*pollDesc).Wait(0xc8203b6060, 0x72, 0x0, 0x0)
/usr/local/go/src/net/fd_poll_runtime.go:73 +0x3a
net.(*pollDesc).WaitRead(0xc8203b6060, 0x0, 0x0)
/usr/local/go/src/net/fd_poll_runtime.go:78 +0x36
net.(*netFD).accept(0xc8203b6000, 0x0, 0x7f17dc7d90a8, 0xc8200e1c80)
/usr/local/go/src/net/fd_unix.go:408 +0x27c
net.(*TCPListener).AcceptTCP(0xc8203d8000, 0x46e890, 0x0, 0x0)
/usr/local/go/src/net/tcpsock_posix.go:254 +0x4d
net/http.tcpKeepAliveListener.Accept(0xc8203d8000, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:2135 +0x41
net/http.(*Server).Serve(0xc82038a000, 0x7f17dc7d9070, 0xc8203d8000, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:1887 +0xb3
net/http.(*Server).ListenAndServe(0xc82038a000, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:1877 +0x136
net/http.ListenAndServe(0xc820338750, 0x5, 0x7f17db9a02e8, 0xc8201d97a0, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:1967 +0x8f
created by main.NewHttpServer
/home/dwirawan/work/src/github.com/linkedin/burrow/http_server.go:49 +0x4f7
goroutine 12 [semacquire, 1 minutes]:
sync.runtime_Semacquire(0xc8204460cc)
/usr/local/go/src/runtime/sema.go:43 +0x26
sync.(*WaitGroup).Wait(0xc8204460c0)
/usr/local/go/src/sync/waitgroup.go:126 +0xb4
github.com/samuel/go-zookeeper/zk.(*Conn).loop(0xc82034a000)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:227 +0x671
github.com/samuel/go-zookeeper/zk.ConnectWithDialer.func1(0xc82034a000)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:145 +0x21
created by github.com/samuel/go-zookeeper/zk.ConnectWithDialer
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:149 +0x452
goroutine 35 [runnable]:
github.com/Shopify/sarama.decode(0xc82035e2a0, 0x8, 0x8, 0x7f17db9a4428, 0xc8200ca3d0, 0x0, 0x0)
/home/dwirawan/work/src/github.com/Shopify/sarama/encoder_decoder.go:51 +0x69
github.com/Shopify/sarama.(*Broker).responseReceiver(0xc820318930)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:354 +0x3e0
github.com/Shopify/sarama.(*Broker).(github.com/Shopify/sarama.responseReceiver)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x20
github.com/Shopify/sarama.withRecover(0xc8204220c0)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*Broker).Open.func1
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x59b
goroutine 16 [select, 1 minutes]:
github.com/Shopify/sarama.(*client).backgroundMetadataUpdater(0xc8200b1600)
/home/dwirawan/work/src/github.com/Shopify/sarama/client.go:553 +0x322
github.com/Shopify/sarama.(*client).(github.com/Shopify/sarama.backgroundMetadataUpdater)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/client.go:142 +0x20
github.com/Shopify/sarama.withRecover(0xc82041d470)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.NewClient
/home/dwirawan/work/src/github.com/Shopify/sarama/client.go:142 +0x754
goroutine 34 [chan receive, 1 minutes]:
github.com/Shopify/sarama.(*Broker).responseReceiver(0xc820318690)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:340 +0xf6
github.com/Shopify/sarama.(*Broker).(github.com/Shopify/sarama.responseReceiver)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x20
github.com/Shopify/sarama.withRecover(0xc820422050)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*Broker).Open.func1
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x59b
goroutine 50 [chan receive, 1 minutes]:
main.NewKafkaClient.func1(0xc8200166e0)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:78 +0x8f
created by main.NewKafkaClient
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:81 +0x43a
goroutine 51 [chan receive, 1 minutes]:
main.NewKafkaClient.func2(0xc8200166e0)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:84 +0x95
created by main.NewKafkaClient
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:87 +0x45c
goroutine 52 [chan receive, 1 minutes]:
main.NewKafkaClient.func3(0xc8200166e0)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:92 +0x4e
created by main.NewKafkaClient
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:95 +0x48c
goroutine 38 [select]:
github.com/Shopify/sarama.(*brokerConsumer).subscriptionManager(0xc820432550)
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:547 +0x3e7
github.com/Shopify/sarama.(*brokerConsumer).(github.com/Shopify/sarama.subscriptionManager)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:520 +0x20
github.com/Shopify/sarama.withRecover(0xc8204222c0)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*consumer).newBrokerConsumer
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:520 +0x200
goroutine 21 [select]:
github.com/samuel/go-zookeeper/zk.(*Conn).sendLoop(0xc82034a000, 0x7f17db9a43a0, 0xc8203d8008, 0xc820448300, 0x0, 0x0)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:412 +0xd8b
github.com/samuel/go-zookeeper/zk.(*Conn).loop.func1(0xc82034a000, 0xc820448300, 0xc8204460c0)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:212 +0x48
created by github.com/samuel/go-zookeeper/zk.(*Conn).loop
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:215 +0x609
goroutine 23 [semacquire]:
sync.runtime_Semacquire(0xc8202e821c)
/usr/local/go/src/runtime/sema.go:43 +0x26
sync.(*WaitGroup).Wait(0xc8202e8210)
/usr/local/go/src/sync/waitgroup.go:126 +0xb4
main.(*KafkaClient).getOffsets(0xc8200166e0, 0x0, 0x0)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:228 +0x7a5
main.NewKafkaClient.func4(0xc8200166e0)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:102 +0x75
created by main.NewKafkaClient
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:104 +0x508
goroutine 36 [chan receive, 1 minutes]:
github.com/Shopify/sarama.(*partitionConsumer).dispatcher(0xc820778000)
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:295 +0x57
github.com/Shopify/sarama.(*partitionConsumer).(github.com/Shopify/sarama.dispatcher)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:151 +0x20
github.com/Shopify/sarama.withRecover(0xc8204222a0)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*consumer).ConsumePartition
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:151 +0x454
goroutine 37 [chan receive]:
github.com/Shopify/sarama.(*partitionConsumer).responseFeeder(0xc820778000)
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:403 +0x5d
github.com/Shopify/sarama.(*partitionConsumer).(github.com/Shopify/sarama.responseFeeder)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:152 +0x20
github.com/Shopify/sarama.withRecover(0xc8204222b0)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*consumer).ConsumePartition
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:152 +0x4ab
goroutine 67 [chan receive]:
github.com/Shopify/sarama.(*Broker).responseReceiver(0xc8203188c0)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:340 +0xf6
github.com/Shopify/sarama.(*Broker).(github.com/Shopify/sarama.responseReceiver)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x20
github.com/Shopify/sarama.withRecover(0xc8203ac0a0)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*Broker).Open.func1
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x59b
goroutine 20 [chan receive]:
github.com/Shopify/sarama.(*Broker).responseReceiver(0xc820318850)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:340 +0xf6
github.com/Shopify/sarama.(*Broker).(github.com/Shopify/sarama.responseReceiver)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x20
github.com/Shopify/sarama.withRecover(0xc820446060)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*Broker).Open.func1
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:93 +0x59b
goroutine 22 [IO wait]:
net.runtime_pollWait(0x7f17dc7d8e30, 0x72, 0xc820010190)
/usr/local/go/src/runtime/netpoll.go:157 +0x60
net.(*pollDesc).Wait(0xc8203b60d0, 0x72, 0x0, 0x0)
/usr/local/go/src/net/fd_poll_runtime.go:73 +0x3a
net.(*pollDesc).WaitRead(0xc8203b60d0, 0x0, 0x0)
/usr/local/go/src/net/fd_poll_runtime.go:78 +0x36
net.(*netFD).Read(0xc8203b6070, 0xc82045e000, 0x4, 0x180000, 0x0, 0x7f17db997050, 0xc820010190)
/usr/local/go/src/net/fd_unix.go:232 +0x23a
net.(*conn).Read(0xc8203d8008, 0xc82045e000, 0x4, 0x180000, 0x0, 0x0, 0x0)
/usr/local/go/src/net/net.go:172 +0xe4
io.ReadAtLeast(0x7f17d8150160, 0xc8203d8008, 0xc82045e000, 0x4, 0x180000, 0x4, 0x0, 0x0, 0x0)
/usr/local/go/src/io/io.go:298 +0xe6
io.ReadFull(0x7f17d8150160, 0xc8203d8008, 0xc82045e000, 0x4, 0x180000, 0x0, 0x0, 0x0)
/usr/local/go/src/io/io.go:316 +0x62
github.com/samuel/go-zookeeper/zk.(*Conn).recvLoop(0xc82034a000, 0x7f17db9a43a0, 0xc8203d8008, 0x0, 0x0)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:476 +0x231
github.com/samuel/go-zookeeper/zk.(*Conn).loop.func2(0xc8203ac030, 0xc82034a000, 0xc820448300, 0xc8204460c0)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:219 +0x46
created by github.com/samuel/go-zookeeper/zk.(*Conn).loop
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:225 +0x663
goroutine 100 [chan receive]:
main.NewKafkaClient.func6(0xc8200166e0, 0x7f17d80d1000, 0xc820778000)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:130 +0x9f
created by main.NewKafkaClient
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:133 +0xae8
goroutine 99 [chan receive]:
main.NewKafkaClient.func5(0xc8200166e0, 0x7f17d80d1000, 0xc820778000)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:124 +0x9f
created by main.NewKafkaClient
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:127 +0xaac
goroutine 107 [select]:
github.com/Shopify/sarama.(*Broker).sendAndReceive(0xc820318930, 0x7f17db9a46a0, 0xc8203b8390, 0x7f17db9a46e0, 0xc8203d8270, 0x0, 0x0)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:286 +0x23f
github.com/Shopify/sarama.(*Broker).GetAvailableOffsets(0xc820318930, 0xc8203b8390, 0xc800000002, 0x0, 0x0)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:174 +0xc1
main.(*KafkaClient).getOffsets.func1(0xc800000002, 0xc8203b8390)
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:198 +0xa1
created by main.(*KafkaClient).getOffsets
/home/dwirawan/work/src/github.com/linkedin/burrow/kafka_client.go:225 +0x770
goroutine 144 [select, locked to thread]:
runtime.gopark(0x9ea3c8, 0xc8203be728, 0x913530, 0x6, 0x18, 0x2)
/usr/local/go/src/runtime/proc.go:185 +0x163
runtime.selectgoImpl(0xc8203be728, 0x0, 0x18)
/usr/local/go/src/runtime/select.go:392 +0xa64
runtime.selectgo(0xc8203be728)
/usr/local/go/src/runtime/select.go:212 +0x12
runtime.ensureSigM.func1()
/usr/local/go/src/runtime/signal1_unix.go:227 +0x353
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1696 +0x1
goroutine 145 [chan receive]:
main.(*Emailer).sendEmailNotifications(0xc820416420, 0xc8200e0080, 0x10, 0x910828, 0x7, 0xc8200e01a0, 0x2, 0x2, 0xc820796720)
/home/dwirawan/work/src/github.com/linkedin/burrow/emailer.go:116 +0x45e
created by main.(*Emailer).Start
/home/dwirawan/work/src/github.com/linkedin/burrow/emailer.go:59 +0x1da
goroutine 162 [select]:
main.(*HttpNotifier).Start.func1(0xc820316940)
/home/dwirawan/work/src/github.com/linkedin/burrow/http_notifier.go:197 +0x19b
created by main.(*HttpNotifier).Start
/home/dwirawan/work/src/github.com/linkedin/burrow/http_notifier.go:207 +0x7a
goroutine 146 [select]:
github.com/samuel/go-zookeeper/zk.(*Conn).sendLoop(0xc820069e10, 0x7f17db9a43a0, 0xc8200322b8, 0xc820796000, 0x0, 0x0)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:412 +0xd8b
github.com/samuel/go-zookeeper/zk.(*Conn).loop.func1(0xc820069e10, 0xc820796000, 0xc820792010)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:212 +0x48
created by github.com/samuel/go-zookeeper/zk.(*Conn).loop
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:215 +0x609
goroutine 147 [IO wait]:
net.runtime_pollWait(0x7f17dc7d8ef0, 0x72, 0xc820010190)
/usr/local/go/src/runtime/netpoll.go:157 +0x60
net.(*pollDesc).Wait(0xc820318840, 0x72, 0x0, 0x0)
/usr/local/go/src/net/fd_poll_runtime.go:73 +0x3a
net.(*pollDesc).WaitRead(0xc820318840, 0x0, 0x0)
/usr/local/go/src/net/fd_poll_runtime.go:78 +0x36
net.(*netFD).Read(0xc8203187e0, 0xc8207a6000, 0x4, 0x180000, 0x0, 0x7f17db997050, 0xc820010190)
/usr/local/go/src/net/fd_unix.go:232 +0x23a
net.(*conn).Read(0xc8200322b8, 0xc8207a6000, 0x4, 0x180000, 0x0, 0x0, 0x0)
/usr/local/go/src/net/net.go:172 +0xe4
io.ReadAtLeast(0x7f17d8150160, 0xc8200322b8, 0xc8207a6000, 0x4, 0x180000, 0x4, 0x0, 0x0, 0x0)
/usr/local/go/src/io/io.go:298 +0xe6
io.ReadFull(0x7f17d8150160, 0xc8200322b8, 0xc8207a6000, 0x4, 0x180000, 0x0, 0x0, 0x0)
/usr/local/go/src/io/io.go:316 +0x62
github.com/samuel/go-zookeeper/zk.(*Conn).recvLoop(0xc820069e10, 0x7f17db9a43a0, 0xc8200322b8, 0x0, 0x0)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:476 +0x231
github.com/samuel/go-zookeeper/zk.(*Conn).loop.func2(0xc82041cb80, 0xc820069e10, 0xc820796000, 0xc820792010)
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:219 +0x46
created by github.com/samuel/go-zookeeper/zk.(*Conn).loop
/home/dwirawan/work/src/github.com/samuel/go-zookeeper/zk/conn.go:225 +0x663
goroutine 39 [select]:
github.com/Shopify/sarama.(*Broker).sendAndReceive(0xc820318930, 0x7f17d8090100, 0xc820338380, 0x7f17d8090140, 0xc820382360, 0x0, 0x0)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:286 +0x23f
github.com/Shopify/sarama.(*Broker).Fetch(0xc820318930, 0xc820338380, 0xc82003fd94, 0x0, 0x0)
/home/dwirawan/work/src/github.com/Shopify/sarama/broker.go:204 +0xc1
github.com/Shopify/sarama.(*brokerConsumer).fetchNewMessages(0xc820432550, 0x0, 0x0, 0x0)
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:646 +0x34e
github.com/Shopify/sarama.(*brokerConsumer).subscriptionConsumer(0xc820432550)
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:580 +0x144
github.com/Shopify/sarama.(*brokerConsumer).(github.com/Shopify/sarama.subscriptionConsumer)-fm()
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:521 +0x20
github.com/Shopify/sarama.withRecover(0xc8204222d0)
/home/dwirawan/work/src/github.com/Shopify/sarama/utils.go:42 +0x3a
created by github.com/Shopify/sarama.(*consumer).newBrokerConsumer
/home/dwirawan/work/src/github.com/Shopify/sarama/consumer.go:521 +0x253
goroutine 184 [runnable]:
main.(*OffsetStorage).evaluateGroup(0xc8201d95c0, 0xc8200e0160, 0x5, 0xc8200e0166, 0x14, 0xc8204545a0)
/home/dwirawan/work/src/github.com/linkedin/burrow/offsets_store.go:337 +0x182
created by main.NewOffsetStorage.func1
/home/dwirawan/work/src/github.com/linkedin/burrow/offsets_store.go:188 +0x43f
Is there something wrong with my configuration?
Thanks.
This issue is due to the Kafka cluster. Check whether your Kafka cluster is running properly. The NPE occurs because Burrow is unable to start the consumers for the __consumer_offsets topic. This could be due to ACL issues, or because the topic doesn't exist yet (it's only created after the first consumer group is started up).
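As a first check, the standard Kafka topic tooling can confirm the topic exists (a sketch; the host and chroot path are taken from the configuration above, adjust as needed):
$ kafka-topics.sh --zookeeper kafka01:2181/kafka-cluster --describe --topic __consumer_offsets
If this prints nothing for the topic, it has not been created yet; it appears once the first consumer group starts committing offsets to Kafka.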

Any good alternative to the Iconv library for encoding conversion?

I was using the Iconv library in Ruby to convert encodings from UTF-8 to UTF-32, UTF-16, etc., and it was quite good.
However, I see an issue when converting from Big5 to UTF-8: an exception is thrown for an invalid sequence.
The problem goes away when converting from CP950 to UTF-8, and CP950 is essentially Big5.
So I wonder: is there another good alternative besides Iconv? Or is CP950 a better version of Big5?
Have a look at ICU; it's a library that does character conversions, among other things.
On the other hand, the other answers suggest you might need to examine your encodings more carefully.
Although Big5 and CP950 are almost the same, there are differences. On the Unicode website there are reference files for mapping different encodings to Unicode, and from those you will see that Big5 and CP950 differ. My experience of Iconv has been good; I suspect its behavior may be correct.
Some of the characters found in CP950 but not in Big5 have the hex values 0xA15A, 0xA1C3, 0xA1C5, 0xA1FE, 0xA240, 0xA3E1, 0xA2CC and 0xA2CE. If any of your input contains these values, then the file is not valid Big5.
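For a sanity check outside Iconv, here is a minimal Go sketch using the golang.org/x/text packages (a different library, shown purely as an illustration; note that its Big5 decoder follows the WHATWG mapping, which is closer to CP950 than to strict Big5, so its results can differ from Iconv's):

package main

import (
	"fmt"
	"log"

	"golang.org/x/text/encoding/traditionalchinese"
	"golang.org/x/text/transform"
)

func main() {
	// Big5 bytes for U+4E2D U+6587 ("Chinese"): A4 A4 A4 E5.
	big5 := []byte{0xA4, 0xA4, 0xA4, 0xE5}

	// Decode Big5 to UTF-8; by default the decoder substitutes
	// invalid bytes rather than raising an error.
	utf8Bytes, _, err := transform.Bytes(traditionalchinese.Big5.NewDecoder(), big5)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(utf8Bytes)) // prints: 中文
}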
There are many, many Big5 variants; CP950 is just one of them.
http://www.moztw.org/docs/big5/
For Big5, I would suggest using "big5-2003", which is the officially updated version.