
Cluster ID of 16 fails to properly identify as node0 or node1 in current_re #1135

Closed

blkangl opened this issue Oct 28, 2021 · 5 comments

@blkangl commented Oct 28, 2021

(int(device.facts["srx_cluster_id"]) & 0x000F) << 4

When we have a cluster ID of "16", the new code returns cluster_id_octet as "0", but the IP of node1 is 130.16.0.1. Did something change in Junos that breaks this new code? From what I can see, this new code will always return cluster_id_octet as '0' when the ID is 16 (or any multiple of 16), since the low four bits of any multiple of 16 are zero.
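To make the arithmetic concrete, here is a minimal standalone sketch (plain Python, not the library code) of what the mask-and-shift does for a few IDs:

    # (cluster_id & 0x000F) keeps only the low 4 bits; for any multiple
    # of 16 those bits are all zero, so shifting left by 4 still gives 0.
    for cluster_id in (1, 2, 15, 16, 17, 32, 255):
        print(cluster_id, "->", (cluster_id & 0x000F) << 4)

    # 1 -> 16, 2 -> 32, 15 -> 240, 16 -> 0, 17 -> 16, 32 -> 0, 255 -> 240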

By changing line 66 to:

cluster_id_octet = str(
    (int(device.facts["srx_cluster_id"]) & 0x000F) << 4
)
if cluster_id_octet == "0":
    cluster_id_octet = str(device.facts["srx_cluster_id"])

This allows my cluster with ID "16" to function. I am not sure whether this will work for all cases where the cluster ID's binary representation ends in 0000; see the sketch below.
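As a rough check of that concern, here is a standalone sketch (again plain Python, not the library code) of the proposed fallback across several multiples of 16. Every such ID now gets a non-zero octet, though note that the fallback value for ID 32 ("32") coincides with the computed octet for ID 2 (also "32"), which may or may not matter depending on how the octet is matched against the node's address:

    def cluster_id_octet(cluster_id):
        # Proposed fix: fall back to the raw cluster ID whenever the
        # mask-and-shift result is zero (i.e., the ID is a multiple of 16).
        octet = (int(cluster_id) & 0x000F) << 4
        return str(cluster_id) if octet == 0 else str(octet)

    for cid in (2, 16, 32, 48):
        print(cid, "->", cluster_id_octet(cid))

    # 2 -> 32, 16 -> 16, 32 -> 32, 48 -> 48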

@chidanandpujar (Collaborator)

Verified the above fix with cluster_id 2 and cluster_id 16. For both cluster IDs the fix works fine, and the current_re values are populated according to the cluster node details.

I will continue testing with a few more cluster ID values and update the results.

from jnpr.junos import Device
from pprint import pprint

device = Device(host='xx.xx.xx.xx', user='xyz', password='zyx')
device.open()

# Cluster ID as reported in the device facts
pprint(device.facts["srx_cluster_id"])

# Octet as computed by the current (unfixed) mask-and-shift
cluster_id_octet = str(
    (int(device.facts["srx_cluster_id"]) & 0x000F) << 4
)
pprint(cluster_id_octet)
pprint(device.facts["current_re"])

~/junos-pyez# python get-facts.py
'16'
'0'
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

~/junos-pyez# python get-facts.py
'16'
'0'
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

with cluster id 2:

~/junos-pyez# python get-facts.py
'2'
'32'
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

~/junos-pyez# python get-facts.py
'2'
'32'
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

@chidanandpujar (Collaborator)

Verified the fix for cluster ID values 16, 2, 17, 15, 32, and 255; it works fine and returns the current_re values for both node0 and node1.

from jnpr.junos import Device
from pprint import pprint

device = Device(host='xx.xx.xx.xx', user='xyz', password='xyz')
device.open()

print("srx cluster id value {}".format(device.facts["srx_cluster_id"]))

pprint(device.facts["current_re"])

# Fix under test (applied in the facts-gathering code):
cluster_id_octet = str(
    (int(device.facts["srx_cluster_id"]) & 0x000F) << 4
)
if cluster_id_octet == '0':
    cluster_id_octet = str(device.facts["srx_cluster_id"])
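For reference, a self-contained sketch of the patched logic (the helper name here is mine, not from the library), run over the same cluster IDs tested below. For the IDs whose runs include 'cluster_id_octet value' debug lines (17, 15, 32, 255), these results line up with the final, post-fallback values in that output:

    def octet_for(cluster_id):
        # Patched logic: mask-and-shift, falling back to the raw
        # cluster ID when the result is zero (multiples of 16).
        octet = (int(cluster_id) & 0x000F) << 4
        return str(cluster_id) if octet == 0 else str(octet)

    for cid in (16, 2, 17, 15, 32, 255):
        print("cluster id {} -> octet {}".format(cid, octet_for(cid)))

    # cluster id 16 -> octet 16
    # cluster id 2 -> octet 32
    # cluster id 17 -> octet 16
    # cluster id 15 -> octet 240
    # cluster id 32 -> octet 32
    # cluster id 255 -> octet 240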

With fix and cluster_id 16

~/junos-pyez# python get-facts.py
'16'
'0'
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

~/junos-pyez# python get-facts.py
'16'
'0'
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

with cluster id 2:

~/junos-pyez# python get-facts.py
'2'
'32'
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

~/junos-pyez# python get-facts.py
'2'
'32'
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

cluster id: 17

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 17
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 17
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
cluster_id_octet value 16
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

cluster id: 15

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 15
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 15
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

cluster id: 32

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 32
cluster_id_octet value 0
cluster_id_octet value 32
cluster_id_octet value 0
cluster_id_octet value 32
cluster_id_octet value 0
cluster_id_octet value 32
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 32
cluster_id_octet value 0
cluster_id_octet value 32
cluster_id_octet value 0
cluster_id_octet value 32
cluster_id_octet value 0
cluster_id_octet value 32
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

with cluster id: 255

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 255
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
['node0',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

(venv) root@nms5-salt-master-b:~/junos-pyez# python get-facts.py
srx cluster id value 255
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
cluster_id_octet value 240
['node1',
'master',
'fpc0',
'node',
'fwdd',
'member',
'pfem',
're0',
'fpc0.pic0']

@chidanandpujar (Collaborator)

The fix is under review: #1137

@blkangl (Author) commented Mar 14, 2022

@chidanandpujar - do you know when this will be merged and available?

@ydnath added this to the Release 2.6.4 milestone Apr 28, 2022
@ydnath (Member) commented May 6, 2022

#1178 provided the fix.
