Changes from all commits
71 commits
fc0d91e
Performance Updates
mikewiebe Sep 3, 2025
89fb871
update diff script for interfaces, underlay, vrfs, networks, vpc_peer…
ccoueffe Sep 23, 2025
a40844a
rename diff_interface to diff_compare + fix sanity
ccoueffe Sep 23, 2025
a44598a
fix pep8
ccoueffe Sep 23, 2025
f18ef9f
update sanity
ccoueffe Sep 23, 2025
8bad04b
update diff_compare
ccoueffe Sep 23, 2025
7b2427c
update underlay_ip_address for diff
ccoueffe Sep 23, 2025
531fc58
Merge branch 'develop' into performance_updates
mikewiebe Sep 23, 2025
962d2f5
update in create role underlay_ip, vpc_domain_id, vpc_peering + updat…
ccoueffe Sep 23, 2025
a5088bf
Fix interface diff bug and preserve backward compatability
mikewiebe Sep 24, 2025
2d2072c
Restore Commented out Lines
mikewiebe Sep 24, 2025
341c72c
Updates
mikewiebe Sep 24, 2025
a469a24
Re-order int delete and add deploy after remove
mikewiebe Sep 27, 2025
8fd2df1
update fabric_link, underlay_ip, vpc_domain_id, vpc_peering
ccoueffe Sep 29, 2025
4640c99
initialize changes_detected_vpc_domain_id_resource
ccoueffe Sep 29, 2025
070f6f5
Interface Remove and Deploy Updates
mikewiebe Sep 30, 2025
ff22452
Merge branch 'develop' into performance_updates
mikewiebe Oct 1, 2025
ae7c2a4
Fix actions failure
mikewiebe Oct 1, 2025
f0373de
Cleanup and add comments
mikewiebe Oct 1, 2025
10191de
More cleanup
mikewiebe Oct 1, 2025
251c9fb
Update comment
mikewiebe Oct 1, 2025
ad20024
Refactor changes_detected flags
mikewiebe Oct 6, 2025
50ca606
Update changes_detected flags
mikewiebe Oct 6, 2025
b68dcd8
Add common role tags
mikewiebe Oct 7, 2025
0a4c386
Updates
mikewiebe Oct 7, 2025
813237d
Cleanup, deploy false remove, write diff_compare to file
mikewiebe Oct 8, 2025
8d8767e
Cleanup
mikewiebe Oct 9, 2025
3bfb776
More cleanup
mikewiebe Oct 9, 2025
faff455
Fix Github Actions Failures
mikewiebe Oct 9, 2025
0c6bf40
Github Actions Fixes
mikewiebe Oct 9, 2025
be2bee8
More actions fixes
mikewiebe Oct 9, 2025
26ab291
Actions fixes
mikewiebe Oct 9, 2025
befb9d1
updates for vrfs
mtarking Oct 9, 2025
574b636
Fixes and cleanup
mikewiebe Oct 10, 2025
a7b6f49
Fix typo
mikewiebe Oct 10, 2025
4442151
MSD updates
mikewiebe Oct 10, 2025
94c118b
Minor cleanup
mikewiebe Oct 11, 2025
5658cee
New fabric_deploy_manager
mikewiebe Oct 11, 2025
3323015
github actions
mikewiebe Oct 11, 2025
25d588e
github actions
mikewiebe Oct 11, 2025
58eb931
Add task back that was removed
mikewiebe Oct 11, 2025
61a108c
MSD deploy updates
mikewiebe Oct 11, 2025
c570560
Skip unmanagable devices for sync check
mikewiebe Oct 11, 2025
26c19d6
Loop for in-sync check
mikewiebe Oct 12, 2025
1a690f4
VRF Fixes
mikewiebe Oct 13, 2025
cee39a1
Actions fixes
mikewiebe Oct 13, 2025
d921559
Fix fabric check sync
mikewiebe Oct 14, 2025
396ea7b
add missing change flag for ebgp
mikewiebe Oct 14, 2025
f939b30
Diff run create for MSD VRFs and Networks
mikewiebe Oct 14, 2025
e800f7a
Add missing ebgp vrf_diff_result
mikewiebe Oct 14, 2025
a2428e8
Normalize diff_result data for all fabrics
mikewiebe Oct 14, 2025
7341485
Debug task removal
mikewiebe Oct 14, 2025
5017195
Additional debug command cleanup
mikewiebe Oct 14, 2025
b4e4545
Remove deploy based on changes_detected
mikewiebe Oct 14, 2025
de45c44
initial work on msd diff
mtarking Oct 15, 2025
1d8e38c
Merge branch 'performance_updates' of github.com:netascode/ansible-dc…
mtarking Oct 15, 2025
fa495e5
Revert Remove deploy based on changes_detected
mikewiebe Oct 15, 2025
2714590
diff for msd vrfs and networks
mtarking Oct 15, 2025
332eedf
diff for msd vrfs and networks
mtarking Oct 15, 2025
723274a
fix lint errors
mtarking Oct 15, 2025
037ce32
ansible-lint cleanup
mtarking Oct 15, 2025
e04e05b
Move remove tasks out of import
mikewiebe Oct 15, 2025
2567b45
Merge branch 'performance_updates' of https://github.com/netascode/an…
mikewiebe Oct 15, 2025
f5473e0
Fix key error
mikewiebe Oct 15, 2025
b296e22
MSD Fixes
mikewiebe Oct 15, 2025
8363ffe
Merge branch 'develop' into performance_updates
mikewiebe Oct 15, 2025
31d5452
More MSD fixes
mikewiebe Oct 15, 2025
87b42bc
Fix comment
mikewiebe Oct 15, 2025
64202aa
Change import to include
mikewiebe Oct 15, 2025
ecf378c
Fix validate role
mikewiebe Oct 15, 2025
212c416
add tags
ccoueffe Oct 17, 2025
342 changes: 342 additions & 0 deletions plugins/action/common/change_flag_manager.py

Large diffs are not rendered by default.

11 changes: 11 additions & 0 deletions plugins/action/common/read_run_map.py
@@ -38,6 +38,7 @@ def run(self, tmp=None, task_vars=None):
# self._supports_async = True
results = super(ActionModule, self).run(tmp, task_vars)
results['diff_run'] = True
results['validate_only_run'] = False

model_data = self._task.args.get('model_data')
play_tags = self._task.args.get('play_tags')
@@ -69,8 +70,18 @@ def run(self, tmp=None, task_vars=None):
if not previous_run_map.get(role):
results['diff_run'] = False
break
# All stages of the automation must run for the diff_run framework to be enabled
if play_tags and 'all' not in play_tags:
results['diff_run'] = False
# If force_run_all is True then set the diff_run flag to false
if task_vars.get('force_run_all') is True:
results['diff_run'] = False

# If only the role_validate tag is present then set validate_only_run to true
# This is used to prevent the diff_run map from being reset when the validate role
# gets run in isolation.
if len(play_tags) == 1 and 'role_validate' in play_tags:
results['validate_only_run'] = True

# If diff_run is false display an ansible warning message
if not results['diff_run']:
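For reference, the gating logic added above can be condensed into a small standalone sketch; the decide() helper and the sample values below are illustrative only, not part of the plugin:

def decide(play_tags, force_run_all, previous_run_map, roles):
    """Mirror the diff_run / validate_only_run decisions made in read_run_map.py."""
    # diff_run requires every role to have completed in the previous run
    diff_run = all(previous_run_map.get(role) for role in roles)
    # diff_run is only allowed when every stage runs, i.e. no restrictive tags
    if play_tags and 'all' not in play_tags:
        diff_run = False
    # force_run_all always disables the diff_run optimization
    if force_run_all is True:
        diff_run = False
    # Running only the validate role must not reset the stored run map
    validate_only_run = (len(play_tags) == 1 and 'role_validate' in play_tags)
    return diff_run, validate_only_run

# Example: running only validation disables diff_run but preserves the run map
print(decide(['role_validate'], False, {'validate': True, 'create': True}, ['validate', 'create']))
# -> (False, True)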
7 changes: 7 additions & 0 deletions plugins/action/common/run_map.py
@@ -82,9 +82,16 @@ def run(self, tmp=None, task_vars=None):
updated_run_map['role_deploy_completed'] = True
elif stage == 'role_remove_completed':
updated_run_map['role_remove_completed'] = True
elif stage == 'role_all_completed':
updated_run_map['role_validate_completed'] = True
updated_run_map['role_create_completed'] = True
updated_run_map['role_deploy_completed'] = True
updated_run_map['role_remove_completed'] = True

with open(run_map_file_path, 'w') as outfile:
outfile.write("### This File Is Auto Generated, Do Not Edit ###\n")
yaml.dump(updated_run_map, outfile, default_flow_style=False)
# Add run map to results dictionary
results['updated'] = updated_run_map

return results
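For context, a run that reports the new role_all_completed stage would leave the auto-generated run map file looking roughly like this; a sketch assuming the map holds only the four stage flags handled above (yaml.dump sorts keys alphabetically by default):

### This File Is Auto Generated, Do Not Edit ###
role_create_completed: true
role_deploy_completed: true
role_remove_completed: true
role_validate_completed: true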
330 changes: 330 additions & 0 deletions plugins/action/dtc/diff_compare.py
@@ -0,0 +1,330 @@
# Copyright (c) 2025 Cisco Systems, Inc. and its affiliates
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of
# this software and associated documentation files (the "Software"), to deal in
# the Software without restriction, including without limitation the rights to
# use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
# the Software, and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
# FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
# COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
# IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# SPDX-License-Identifier: MIT

from __future__ import absolute_import, division, print_function

import yaml
import os
import datetime
from ansible.utils.display import Display
from ansible.plugins.action import ActionBase

display = Display()


class ActionModule(ActionBase):
"""
Action plugin to compare items captured from a previous run with items from the current run for a fabric (interfaces, fabric links, VRFs, networks, vPC settings, and so on).
Identifies new/modified, removed, and unchanged items.
"""
def __init__(self, *args, **kwargs):
super(ActionModule, self).__init__(*args, **kwargs)
self.old_file_path = None
self.new_file_path = None

def run(self, tmp=None, task_vars=None):
"""
Run the action plugin.

Args:
tmp: Temporary directory for file operations
task_vars: Variables available to the task

Returns:
dict: Results containing the comparison of items
"""
if task_vars is None:
task_vars = {}

results = super(ActionModule, self).run(tmp, task_vars)
results['compare'] = {}

# Validate required arguments
try:
self.old_file_path = self._task.args.get('old_file')
self.new_file_path = self._task.args.get('new_file')

if not self.old_file_path or not self.new_file_path:
raise ValueError("Both old_file and new_file arguments are required")

except (AttributeError, KeyError, ValueError) as e:
return {'failed': True, 'msg': f'Missing required argument: {str(e)}'}

old_items = []
new_items = []

try:
old_items = self.load_yaml(self.old_file_path)
except (FileNotFoundError, IOError):
display.warning(f"Old file not found: {self.old_file_path}, using empty list")

try:
new_items = self.load_yaml(self.new_file_path)
except (FileNotFoundError, IOError):
display.warning(f"New file not found: {self.new_file_path}, using empty list")

# Normalize omit placeholder strings between old and new items
old_items, new_items = self.normalize_omit_placeholders(old_items, new_items)

updated_items, removed_items, equal_items = self.compare_items(old_items, new_items)

if self.new_file_path.endswith('ndfc_interface_all.yml'):
removed_items = self.order_interface_remove(removed_items)

results['compare'] = {"updated": updated_items, "removed": removed_items, "equal": equal_items}

# Write comparison results to file
self.write_comparison_results(results['compare'])

return results['compare']

def write_comparison_results(self, compare_results):
"""
Write comparison results to a unique file in the same directory as new_file_path.

Args:
compare_results (dict): Dictionary containing 'updated', 'removed', and 'equal' lists
"""
if not self.new_file_path:
display.warning("new_file_path is not set, cannot write comparison results")
return

# Get the directory of the new_file_path
output_dir = os.path.dirname(self.new_file_path)

# Derive the comparison output filename from the source file name
base_filename = os.path.splitext(os.path.basename(self.new_file_path))[0]
output_filename = f"{base_filename}_comparison.yml"
output_path = os.path.join(output_dir, output_filename)

# Prepare the data to write
output_data = {
'comparison_summary': {
'timestamp': datetime.datetime.now().isoformat(),
'source_file': self.new_file_path,
'total_updated': len(compare_results.get('updated', [])),
'total_removed': len(compare_results.get('removed', [])),
'total_equal': len(compare_results.get('equal', []))
},
'updated_items': compare_results.get('updated', []),
'removed_items': compare_results.get('removed', []),
'equal_items': compare_results.get('equal', [])
}

try:
# Remove old file if it exists
if os.path.exists(output_path):
os.remove(output_path)

with open(output_path, 'w', encoding='utf-8') as f:
yaml.dump(output_data, f, default_flow_style=False, sort_keys=False)
except Exception as e:
display.warning(f"Failed to write comparison results to {output_path}: {str(e)}")

def load_yaml(self, filename):
"""
Load YAML data from a file.
"""
with open(filename, 'r', encoding='utf-8') as f:
return yaml.safe_load(f) or []

def normalize_omit_placeholders(self, old_items, new_items):
"""
Remove any entries containing the string '__omit_place_holder__' from both old_items and new_items.
Walks each list item and drops any dictionary key-value pair whose value contains '__omit_place_holder__'.
Returns the cleaned (normalized) old_items and new_items.
"""
def remove_omit_placeholders(items):
"""Recursively remove any entries containing '__omit_place_holder__' from data structures."""
if isinstance(items, list):
cleaned_items = []
for item in items:
cleaned_item = remove_omit_placeholders(item)
if cleaned_item is not None: # Only add non-None items
cleaned_items.append(cleaned_item)
return cleaned_items
elif isinstance(items, dict):
cleaned_dict = {}
for key, value in items.items():
# Skip any key-value pair where the value contains '__omit_place_holder__'
if isinstance(value, str) and '__omit_place_holder__' in value:
continue
# Recursively clean nested structures
cleaned_value = remove_omit_placeholders(value)
cleaned_dict[key] = cleaned_value
return cleaned_dict
else:
# For primitive types, return as-is
return items

cleaned_old = remove_omit_placeholders(old_items)
cleaned_new = remove_omit_placeholders(new_items)
display.v("Normalized old_items and new_items by removing __omit_place_holder__ entries")
return cleaned_old, cleaned_new

KEY_MAPPING = {
'ndfc_underlay_ip_address.yml': 'entity_name',
'ndfc_attach_vrfs.yml': 'vrf_name',
'ndfc_attach_networks.yml': 'net_name',
'ndfc_vpc_domain_id_resource.yml': 'entity_name',
'ndfc_vpc_peering.yml': 'peerOneId'
}

def _create_fabric_link_key(self, item):
"""
Create a unique key for fabric links from multiple attributes.

Args:
item (dict): The fabric link item containing link details

Returns:
str: A unique key for the fabric link or None if required fields are missing
"""
required_fields = ['dst_fabric', 'src_device', 'src_interface', 'dst_interface']
if not all(item.get(field) for field in required_fields):
return None

return '_'.join([item.get(field) for field in required_fields])
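# Illustrative example: a link item such as
# {'dst_fabric': 'fabric2', 'src_device': 'leaf1', 'src_interface': 'Ethernet1/5', 'dst_interface': 'Ethernet1/5'}
# yields the composite key 'fabric2_leaf1_Ethernet1/5_Ethernet1/5'.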

def _create_interface_key(self, item):
"""
Create a unique key for interfaces from multiple attributes.

Args:
item (dict): The interface item containing interface details

Returns:
str: A unique key for the interface per switch or None if required fields are missing
"""
required_fields = ['name', 'switch']
if not all(item.get(field) for field in required_fields):
return None

switch_value = item.get('switch')
# Handle both string and list types for switch field
if isinstance(switch_value, list):
if not switch_value: # Empty list check
return None
switch_id = switch_value[0]
else:
switch_id = switch_value

return f"{item.get('name')}_{switch_id}"

def dict_key(self, item):
"""
Return the unique key for an item based on its type.

Args:
item (dict): The item to generate a key for

Returns:
str: The unique key for the item, or None if no key could be generated
"""
if not isinstance(item, dict):
return None

filename = self.new_file_path

# Special handling for fabric links due to composite key
if filename.endswith('ndfc_fabric_links.yml'):
return self._create_fabric_link_key(item)

# Special handling for interfaces due to composite key
if filename.endswith('ndfc_interface_all.yml'):
return self._create_interface_key(item)

# Find matching file type and return corresponding key
for file_type, key_attr in self.KEY_MAPPING.items():
if filename.endswith(file_type):
return item.get(key_attr)

return None

def compare_items(self, old_items, new_items):
"""
Compare old and new items, returning updated, removed, and equal items.
"""

old_dict = {self.dict_key(item): item for item in old_items}
new_dict = {self.dict_key(item): item for item in new_items}

updated_items = [] # Updated items in new file
removed_items = [] # Items removed in new file
equal_items = [] # Items unchanged

for key, new_item in new_dict.items():
old_item = old_dict.get(key)
if old_item is None:
updated_items.append(new_item)
elif old_item != new_item:
updated_items.append(new_item)
else:
equal_items.append(new_item)

for key, old_item in old_dict.items():
if key not in new_dict:
removed_items.append(old_item)

return updated_items, removed_items, equal_items

def order_interface_remove(self, removed_items):
"""
Order interface removals to avoid dependency issues.
Ensures that port-channels are removed before their member ethernet interfaces.

Args:
removed_items (list): List of interface items to be removed

Returns:
list: Ordered list of interface items for removal (vPC interfaces first, then
loopbacks, port-channels, routed sub-interfaces, ethernet interfaces, and breakout interfaces last)

Note:
This ordering helps prevent dependency conflicts during interface removal.
Port-channels should be removed before their member ethernet interfaces
to avoid configuration errors.
"""
# The order in which interfaces are configured matters during removal.
# Configuration Order:
# - Breakout Interfaces (Type: breakout)
# - Trunk Interfaces (Type: eth)
# - Access Interfaces (Type: eth)
# - Access Port-Channels (Type: pc)
# - Trunk Port-Channels (Type: pc)
# - Routed Interfaces (Type: eth)
# - Routed Sub-Interfaces (Type: sub_int)
# - Routed Port-Channels (Type: pc)
# - Loopback Interfaces (Type: lo)
# - Dot1Q Sub-Interfaces (Type: eth)
# - vPC Interfaces (Type: vpc)

# Remove in the reverse order to avoid dependency issues
vpc_interfaces = [item for item in removed_items if item.get('type') == 'vpc']
loopback_interfaces = [item for item in removed_items if item.get('type') == 'lo']
port_channels = [item for item in removed_items if item.get('type') == 'pc']
routed_sub_interfaces = [item for item in removed_items if item.get('type') == 'sub_int']
ethernet_interfaces = [item for item in removed_items if item.get('type') == 'eth']
breakout_interfaces = [item for item in removed_items if item.get('type') == 'breakout']

# Return the removal list in the reverse of the configuration order above (vPC first, breakout last)
all_interfaces = vpc_interfaces + loopback_interfaces + port_channels + routed_sub_interfaces + ethernet_interfaces + breakout_interfaces
return all_interfaces
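For readers tracing the diff framework end to end, the comparison semantics can be reproduced with a minimal standalone sketch; the compare() helper and the sample VRF data below are illustrative only and use 'vrf_name' as the key, as KEY_MAPPING does for ndfc_attach_vrfs.yml:

def compare(old_items, new_items, key='vrf_name'):
    """Reproduce the compare_items() bucketing: updated, removed, equal."""
    old = {item.get(key): item for item in old_items}
    new = {item.get(key): item for item in new_items}
    updated = [i for k, i in new.items() if old.get(k) != i]   # new or modified items
    removed = [i for k, i in old.items() if k not in new]      # items deleted in the new file
    equal = [i for k, i in new.items() if old.get(k) == i]     # unchanged items
    return updated, removed, equal

old = [{'vrf_name': 'blue', 'vrf_id': 150001}, {'vrf_name': 'red', 'vrf_id': 150002}]
new = [{'vrf_name': 'blue', 'vrf_id': 150001}, {'vrf_name': 'green', 'vrf_id': 150003}]
print(compare(old, new))
# -> updated: [green], removed: [red], equal: [blue]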