
multi-nicd with the CrashLoopBackOff state #193

Open
cyclinder opened this issue May 20, 2024 · 1 comment
Labels
bug Something isn't working

@cyclinder

Describe the bug
The multi-nicd daemon pod is stuck in CrashLoopBackOff. Its log shows that fetching pci.ids from pci-ids.ucw.cz fails (the DNS lookup is refused), after which the daemon panics with a nil pointer dereference in the ghw PCI module while serving an interface request.

root@controller:~/cyclinder/multi-nic-cni/deploy# kubectl get po -n multi-nic-cni-operator
NAME                                                         READY   STATUS             RESTARTS      AGE
multi-nic-cni-operator-controller-manager-64cd6b55b5-fnmcw   2/2     Running            0             15m
multi-nicd-q5njr                                             0/1     CrashLoopBackOff   1 (13s ago)   18s
root@controller:~/cyclinder/multi-nic-cni/deploy# kubectl logs -f -n multi-nic-cni-operator multi-nicd-q5njr
2024/05/20 09:52:07 Config
W0520 09:52:07.813697      10 client_config.go:617] Neither --kubeconfig nor --master was specified.  Using the inClusterConfig.  This might not work.
2024/05/20 09:52:07 hostName=controller
2024/05/20 09:52:07 ListIPPool with selector: hostname=controller
2024/05/20 09:52:07 ListIPPool elapsed: 13792 us
2024/05/20 09:52:07 Listening @0.0.0.0:11000
2024/05/20 09:52:08 cannot get PCI info: Get "https://pci-ids.ucw.cz/v2.2/pci.ids.gz": dial tcp: lookup pci-ids.ucw.cz on [::1]:53: read udp [::1]:57298->[::1]:53: read: connection refused
2024/05/20 09:52:08 http: panic serving 10.6.212.101:39214: runtime error: invalid memory address or nil pointer dereference
goroutine 41 [running]:
net/http.(*conn).serve.func1()
	/usr/local/go/src/net/http/server.go:1801 +0xb9
panic({0x12f08a0, 0x20761e0})
	/usr/local/go/src/runtime/panic.go:1047 +0x266
github.com/jaypipes/ghw/pkg/pci.(*Info).ListDevices(0x0)
	/root/go/pkg/mod/github.com/jaypipes/[email protected]/pkg/pci/pci_linux.go:356 +0x2e
github.com/foundation-model-stack/multi-nic-cni/daemon/iface.GetTargetNetworks()
	/usr/local/app/src/iface/pci.go:166 +0x9a
github.com/foundation-model-stack/multi-nic-cni/daemon/iface.GetInterfaces()
	/usr/local/app/src/iface/iface.go:100 +0x3b
main.GetInterface({0x163d148, 0xc000228a80}, 0xc0004355f0)
	/usr/local/app/src/main.go:144 +0x3b
net/http.HandlerFunc.ServeHTTP(0xc0002ddf00, {0x163d148, 0xc000228a80}, 0xc0000729f8)
	/usr/local/go/src/net/http/server.go:2046 +0x2f
github.com/gorilla/mux.(*Router).ServeHTTP(0xc00017b140, {0x163d148, 0xc000228a80}, 0xc0002dde00)
	/root/go/pkg/mod/github.com/gorilla/[email protected]/mux.go:210 +0x1cf
net/http.serverHandler.ServeHTTP({0xc000435440}, {0x163d148, 0xc000228a80}, 0xc0002dde00)
	/usr/local/go/src/net/http/server.go:2878 +0x43b
net/http.(*conn).serve(0xc000423cc0, {0x1640cf0, 0xc000435350})
	/usr/local/go/src/net/http/server.go:1929 +0xb08
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:3033 +0x4e8

To Reproduce

I'm not sure what triggered this; it needs more investigation.

Expected behavior
The multi-nicd pod should start and remain in the Running state.

Screenshots
If applicable, add screenshots to help explain your problem.

  • manager container of controller and multi-nicd DS status:
  • multinicnetwork CR:
  • hostinterface list/CR:
  • cidr CR (multiNICIPAM: true):
  • ippools CR (multiNICIPAM: true):
  • log of manager container:
  • log of failed multi-nicd pod:

Environment (please complete the following information):

  • platform: [e.g. self-managed k8s, self-managed OpenShift, EKS, IKS, AKS]
  • node profile:
  • operator version:
  • cluster scale (number of nodes, pods, interfaces):

Additional context
Add any other context about the problem here.

cyclinder added the bug label on May 20, 2024
@sunya-ch (Collaborator)

This is a known issue in the dependent ghw module (portainer/agent#34). The dependency needs to be upgraded.
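
Until the ghw dependency is upgraded, the panic can also be avoided by guarding the call site. Below is a minimal sketch of that defensive pattern, assuming the ghw v0.8.0 API visible in the stack trace (ghw.PCI() returning (*pci.Info, error) and Info.ListDevices()); the daemon's actual call in iface/pci.go may look different.

package main

import (
	"log"

	"github.com/jaypipes/ghw"
)

func main() {
	// ghw.PCI() loads the PCI ID database; on an air-gapped node the download
	// of pci.ids can fail and the returned *pci.Info may be nil or unusable.
	info, err := ghw.PCI()
	if err != nil || info == nil {
		// Fall back to an empty device list instead of panicking on a nil receiver.
		log.Printf("cannot get PCI info: %v", err)
		return
	}
	for _, dev := range info.ListDevices() {
		log.Printf("found PCI device at %s", dev.Address)
	}
}

The key point is that the error (and a possible nil Info) from ghw.PCI() is checked before ListDevices() is called, which is exactly the call that dereferences a nil receiver in the log above.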
