cisco.dnac.backup_and_restore_playbook_config_generator module -- Generate YAML playbook for 'backup_and_restore_workflow_manager' module.
Note
This module is part of the cisco.dnac collection (version 6.49.0).
To install it, use: ansible-galaxy collection install cisco.dnac.
You need further requirements to be able to use this module,
see Requirements for details.
To use it in a playbook, specify: cisco.dnac.backup_and_restore_playbook_config_generator.
New in cisco.dnac 6.44.0
Synopsis
Generates YAML configurations compatible with the backup_and_restore_workflow_manager module, reducing manual playbook creation effort and enabling programmatic modifications.
Represents NFS server configurations and backup storage configurations for backup and restore operations on Cisco Catalyst Center.
Supports extraction of NFS configurations and of backup storage configurations with encryption and retention policies.
Generated YAML format is directly usable with backup_and_restore_workflow_manager module for infrastructure as code.
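As an illustration, a generated file might look like the sketch below. The field values are hypothetical (taken from this page's examples); the actual content comes from the Catalyst Center deployment being queried, and the exact top-level layout is assumed here to mirror the component names used by the workflow manager module.

```yaml
# Hypothetical sketch of generated output; real values are extracted
# from the target Cisco Catalyst Center.
config:
  - nfs_configuration:
      - server_ip: "172.27.17.90"
        source_path: "/home/nfsshare/backups/TB30"
  - backup_storage_configuration:
      - server_type: "NFS"
```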
Requirements
The below requirements are needed on the host that executes this module.
dnacentersdk >= 2.9.3
python >= 3.9
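A quick preflight check on the execution host can confirm both requirements before running the playbook. This is a minimal sketch using only the standard library; it assumes nothing beyond the two requirements listed above.

```python
import sys
from importlib.metadata import PackageNotFoundError, version

# The module requires Python >= 3.9 and dnacentersdk >= 2.9.3 on the
# host that executes it.
assert sys.version_info >= (3, 9), "Python 3.9 or newer is required"

try:
    # Report the installed SDK version so it can be compared against 2.9.3.
    print("dnacentersdk", version("dnacentersdk"))
except PackageNotFoundError:
    print("dnacentersdk missing; install with: pip install 'dnacentersdk>=2.9.3'")
```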
Parameters
| Parameter | Comments |
|---|---|
| `config` | A dictionary of filters for generating a YAML playbook compatible with the 'backup_and_restore_workflow_manager' module. Filters specify which components to include in the YAML configuration file. |
| `component_specific_filters` | Filters to specify which components to include in the YAML configuration file. If a component filter block is provided (for example, `nfs_configuration`), that component is automatically included. If no component filter blocks are provided, `components_list` is required. |
| `backup_storage_configuration` | Backup storage configuration filtering options, by server type only. If not specified, all backup storage configurations are included. Only `server_type` filtering is supported for backup storage; other filter parameters such as `mount_path` or `retention_period` are not supported. |
| `server_type` | Server type for filtering backup configurations. `NFS` represents network-based backup storage; `PHYSICAL_DISK` represents local disk backup storage. Choices: `"NFS"`, `"PHYSICAL_DISK"` |
| `components_list` | List of components to include in the YAML configuration file. Valid values are `"nfs_configuration"` (NFS Configuration) and `"backup_storage_configuration"` (Backup Storage Configuration). Required when no component-specific filter blocks are provided; an empty list is invalid in that case. Supports multiple filter entries for filtering multiple NFS servers. Choices: `"nfs_configuration"`, `"backup_storage_configuration"` |
| `nfs_configuration` | NFS configuration details to filter NFS servers. Both `server_ip` and `source_path` must be provided together for filtering. If not specified, all NFS configurations are included. |
| `server_ip` | Server IP address of the NFS server. Must be provided along with `source_path` for filtering. Used for exact-match filtering of NFS configurations. |
| `source_path` | Source path on the NFS server. Must be provided along with `server_ip` for filtering. Used for exact-match filtering of NFS configurations. |
| `dnac_api_task_timeout` | Defines the timeout in seconds for API calls to retrieve task details. If the task details are not received within this period, the process ends and a timeout notification is logged. Default: `1200` |
| `dnac_debug` | Indicates whether debugging is enabled in the Cisco Catalyst Center SDK. Choices: `false`, `true` |
| `dnac_host` | The hostname of the Cisco Catalyst Center. |
| `dnac_log` | Flag to enable or disable playbook execution logging. When true and `dnac_log_file_path` is provided, the log file is created at the execution location with the specified name. When true and `dnac_log_file_path` is not provided, the log file is created at the execution location with the name 'dnac.log'. When false, logging is disabled. If the log file does not exist, it is created in append or write mode based on the `dnac_log_append` flag; if it exists, it is overwritten or appended based on the same flag. Choices: `false`, `true` |
| `dnac_log_append` | Determines the mode of the log file. Set to `true` for append mode, `false` for write mode. Choices: `false`, `true` |
| `dnac_log_file_path` | Governs where logs are written. Logs are recorded only if `dnac_log` is true. If no path is specified, 'dnac.log' is generated in the current Ansible directory (appended to when `dnac_log_append` is true, overwritten when false). If a path is specified, the file opens in append mode when `dnac_log_append` is true and in write ('w') mode when false. In shared-file scenarios without append mode, content is overwritten after each module execution; for a shared log file, set append to false for the first module (to overwrite) and to true for subsequent modules. Default: `"dnac.log"` |
| `dnac_log_level` | Sets the threshold for log level; messages with a level equal to or higher than this are logged. Levels in order of severity: CRITICAL, ERROR, WARNING, INFO, DEBUG. CRITICAL indicates serious errors halting the program (displays only CRITICAL messages). ERROR indicates problems preventing a function (displays ERROR and CRITICAL). WARNING indicates potential future issues (displays WARNING, ERROR, CRITICAL). INFO tracks normal operation (displays INFO, WARNING, ERROR, CRITICAL). DEBUG provides detailed diagnostic info (displays all messages). Default: `"WARNING"` |
| `dnac_password` | The password for authentication at the Cisco Catalyst Center. |
| `dnac_port` | Specifies the port number associated with the Cisco Catalyst Center. Default: `"443"` |
| `dnac_task_poll_interval` | Specifies the interval in seconds between successive calls to the API to retrieve task details. Default: `2` |
| `dnac_username` | The username for authentication at the Cisco Catalyst Center. Default: `"admin"` |
| `dnac_verify` | Flag to enable or disable SSL certificate verification. Choices: `false`, `true` |
| `dnac_version` | Specifies the version of the Cisco Catalyst Center that the SDK should use. Default: `"2.2.3.3"` |
| `file_mode` | File write mode for the generated YAML configuration file. `overwrite` replaces existing file content with new content; `append` adds new content to the end of the existing file. Defaults to `overwrite` if not specified. Only applicable when `file_path` is provided. Choices: `"overwrite"`, `"append"` |
| `file_path` | Path where the YAML configuration file will be saved. If not provided, the file is saved in the current working directory with a default file name. Supports both absolute and relative paths. |
| `state` | Desired state of Cisco Catalyst Center after module execution. Only the `gathered` state is supported for extracting configurations. Choices: `"gathered"` |
| `validate_response_schema` | Flag for the Cisco Catalyst Center SDK to enable validation of request bodies against a JSON schema. Choices: `false`, `true` |
Notes
Note
SDK Methods used are - backup.Backup.get_all_n_f_s_configurations - backup.Backup.get_backup_configuration
Paths used are - GET /dna/system/api/v1/backupNfsConfigurations - GET /dna/system/api/v1/backupConfiguration
Requires Cisco Catalyst Center version 3.1.3.0 or higher
Only supports gathered state for extracting existing configurations
NFS filtering requires both server_ip and source_path together
Backup storage filtering only supports server_type parameter
Generated YAML file format is compatible with backup_and_restore_workflow_manager module
File path supports both absolute and relative paths
Default filename includes timestamp for uniqueness
NFS details correlation matches backup mount paths with NFS destination paths automatically
Empty configurations return success with idempotent behavior
Module does not modify Catalyst Center configuration
Does not support check_mode.
The plugin runs on the control node and does not use any Ansible connection plugins; it uses the embedded connection manager from the Cisco Catalyst Center SDK instead.
The parameters starting with dnac_ are used by the Cisco Catalyst Center Python SDK to establish the connection.
See Also
See also
- cisco.dnac.backup_and_restore_workflow_manager
Module to manage backup and restore NFS configurations in Cisco Catalyst Center.
Examples
- name: Generate YAML Configuration with both NFS and backup storage configurations
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_backup_restore_config.yaml"
    config:
      component_specific_filters:
        components_list:
          - "nfs_configuration"
          - "backup_storage_configuration"

- name: Generate YAML for NFS-type backup storage only, filtering by server_type
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_backup_storage_config.yaml"
    file_mode: "overwrite"
    config:
      component_specific_filters:
        components_list: ["backup_storage_configuration"]
        backup_storage_configuration:
          - server_type: "NFS"

- name: Generate YAML for specific NFS server using exact match on server_ip and source_path
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_specific_nfs_config.yaml"
    config:
      component_specific_filters:
        components_list: ["nfs_configuration"]
        nfs_configuration:
          - server_ip: "172.27.17.90"
            source_path: "/home/nfsshare/backups/TB30"

- name: Generate YAML for all configurations without filtering, useful for complete system documentation
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_backup_restore_config.yaml"
    config:
      component_specific_filters:
        components_list:
          - "nfs_configuration"
          - "backup_storage_configuration"

- name: Append YAML Configuration for multiple NFS servers to existing file
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_multiple_nfs_config.yaml"
    file_mode: "append"
    config:
      component_specific_filters:
        components_list: ["nfs_configuration"]
        nfs_configuration:
          - server_ip: "172.27.17.90"
            source_path: "/home/nfsshare/backups/TB30"
          - server_ip: "172.27.17.91"
            source_path: "/home/nfsshare/backups/TB31"

- name: Generate YAML Configuration for Physical Disk backup storage only
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_physical_disk_backup.yaml"
    config:
      component_specific_filters:
        components_list: ["backup_storage_configuration"]
        backup_storage_configuration:
          - server_type: "PHYSICAL_DISK"

- name: Component filter auto-adds missing component to components_list
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_component_auto_add.yaml"
    config:
      component_specific_filters:
        components_list: ["nfs_configuration"]
        backup_storage_configuration:
          - server_type: "NFS"

- name: Equivalent explicit components_list for same filter behavior
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    dnac_port: "{{dnac_port}}"
    dnac_version: "{{dnac_version}}"
    dnac_debug: "{{dnac_debug}}"
    dnac_log: true
    dnac_log_level: "{{dnac_log_level}}"
    state: gathered
    file_path: "/tmp/catc_component_explicit.yaml"
    config:
      component_specific_filters:
        components_list: ["nfs_configuration", "backup_storage_configuration"]
        backup_storage_configuration:
          - server_type: "NFS"
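The generated file can then be fed back to the workflow manager for infrastructure-as-code reuse. A minimal sketch, assuming the generated file defines a top-level `config` list (the `generated_config.config` reference below rests on that assumption, as does the `merged` state value):

```yaml
- name: Load the generated YAML configuration
  ansible.builtin.include_vars:
    file: "/tmp/catc_backup_restore_config.yaml"
    name: generated_config

# Assumption: the workflow manager accepts the list as-is; the state
# value depends on the states that module supports.
- name: Re-apply it with the workflow manager
  cisco.dnac.backup_and_restore_workflow_manager:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    state: merged
    config: "{{ generated_config.config }}"
```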
Return Values
Common return values are documented here; the following fields are unique to this module:

- Response from the YAML configuration generation operation, with execution statistics and status information. Returned: always
- Status message describing the operation outcome. Returned: always
- Detailed operation results and statistics. Returned: always
- Number of components skipped due to errors or no data available. Returned: always. Sample: `0`
- Total configuration items across all components. Returned: always. Sample: `5`
- Absolute path to the generated YAML file. Returned: on success. Sample: `"/tmp/backup_restore_config.yaml"`
- Detailed status message with operation summary. Returned: always
- Operation status indicating success or failure. Returned: always
- Overall operation status. Returned: always
- No-configurations-found scenario, treated as a successful idempotent operation. Returned: when no configurations are found
- Operation failed due to invalid parameters, API errors, or file write issues. Returned: on failure
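Because a response with statistics is always returned, a playbook can register the result and surface it for inspection. A sketch (the `result.response` access is illustrative; the exact key names depend on the module's response structure):

```yaml
- name: Generate the configuration and capture the result
  cisco.dnac.backup_and_restore_playbook_config_generator:
    dnac_host: "{{dnac_host}}"
    dnac_username: "{{dnac_username}}"
    dnac_password: "{{dnac_password}}"
    dnac_verify: "{{dnac_verify}}"
    state: gathered
    config:
      component_specific_filters:
        components_list: ["nfs_configuration"]
  register: result

# Assumption: the statistics live under a top-level `response` key.
- name: Show the full response for inspection
  ansible.builtin.debug:
    var: result.response
```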