Continuing my experiments with injecting Switch backing files, I have now been able to automatically back up flows. I have created a flow group that automatically backs up every flow on a Switch system, creating .sflow files which can easily be restored. By default, the flow is triggered every hour and archives the resulting flows to a network location, but since I used the Portal Callback pattern, you can change that without compromising the core functions.
Usage


You do not need to modify any of the 'FAB Functions' flows. The flow 'Auto Backup Program' is yours to mess around with.
You control the frequency of backups by means of the Dummy Job Clock, which by default is set to 1 hour (3600 seconds).
You then need to set up the Flow Auto Backup (FAB) dataset, which consists of a few private data keys:
- FAB Portal Callback - The Portal channel where the packed flows will be delivered once the core functions are complete. You do not have to change this.
- FAB Flows Path - The absolute path to your Switch Server/flows directory
- FAB Manifest Path - The absolute path to your generic manifest.xml file. See below for more information.
- FAB Flow Name - The name of the flow
- FAB Flow Version - The current version of the flow
- FAB Flow ID - The numerical ID of the flow
By default, the 'Auto Backup Program' flow archives the flows into a folder per flow name. The version number is part of the .sflow name, so repeated backups of the same version overwrite each other, while a new version is saved as a new copy in that folder.
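The folder-per-flow, overwrite-per-version layout can be sketched like this (a Python sketch with hypothetical names; the actual flow does this with Switch elements, not a script):

```python
import os
import shutil

def backup_sflow(sflow_path, flow_name, flow_version, archive_root):
    """Copy a packed .sflow into a per-flow folder.

    The version number is part of the file name, so backing up the same
    version again overwrites the previous copy, while a new version lands
    beside the old ones in the same folder.
    """
    dest_dir = os.path.join(archive_root, flow_name)
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, f"{flow_name} {flow_version}.sflow")
    shutil.copy2(sflow_path, dest)  # silently overwrites the same version
    return dest
```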
Here is the result (yes, this flow backs up itself; how meta):


And encrypted flows all go together in the same folder:

Manifest Setup
To reverse-engineer the .sflow file type and reproduce its behavior within Switch, we need to introduce a simple XML file called a manifest. Luckily, Switch doesn't require much information in it, so we can simply use a static file. The easiest way to get one is to export any flow, rename the .sflow to .zip, unzip it, copy the manifest to a static location, and use that absolute path as the FAB Manifest Path.
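Since an exported .sflow is just a renamed .zip, the manual rename-and-unzip step can also be done programmatically. A minimal sketch (the function name is mine; Python's zipfile opens the archive by path regardless of its extension, so no rename is even needed):

```python
import shutil
import zipfile

def extract_manifest(sflow_path, dest_path):
    """Pull manifest.xml out of an exported .sflow (a renamed .zip)."""
    with zipfile.ZipFile(sflow_path) as z:  # extension doesn't matter
        with z.open("manifest.xml") as src, open(dest_path, "wb") as dst:
            shutil.copyfileobj(src, dst)
```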
For reference, here are the contents of my manifest.xml:
<Manifest>
  <ProductInfo>Switch Version 13</ProductInfo>
  <ExportFormatVersion>1.0</ExportFormatVersion>
  <FlowFile>flow.xml</FlowFile>
  <SwitchFlavour>Switch</SwitchFlavour>
  <SwitchReleaseVersionNumber>13</SwitchReleaseVersionNumber>
  <SwitchReleaseType></SwitchReleaseType>
  <SwitchReleaseTypeNumber>0</SwitchReleaseTypeNumber>
  <SwitchUpdateVersionNumber>100</SwitchUpdateVersionNumber>
  <OperatingSystem>Mac OS X</OperatingSystem>
</Manifest>
So now that you know how to use it, here is how it works. Everything is triggered from the Auto Backup Program flow, which invokes FAB Get Flow Function via the Portal Callback pattern. The advantage is that you never have to modify any of the core FAB flows, which means that if I ever update the FAB Functions, you can simply swap them out and everything will keep working. Said another way, the core flows act as a "black box": you don't know, and don't need to know, what's going on inside because it's abstracted away from you.
Within FAB Get Flow Function, the triggering job is immediately run through an assertion to ensure the required FAB dataset is set. If it is, the flow uses those values to inject a copy of your flows folder (being careful not to delete the original after injection). The flows folder is split into individual files; a copy of the manifest is injected for each one, and the flow XML itself is consumed as a dataset. The flow and manifest are assembled into one job, the remaining fields of the FAB dataset are written, and the job is handed off to FAB Pack Flow Function.
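The fail-fast assertion at the start of the function can be thought of like this (a sketch, not Switch's actual Variable Assert API; the key list mirrors the FAB dataset described under Usage):

```python
# Keys the FAB dataset must carry before the core functions will proceed.
REQUIRED_KEYS = (
    "FAB Portal Callback",
    "FAB Flows Path",
    "FAB Manifest Path",
)

def assert_fab_dataset(private_data):
    """Fail fast if any required FAB private data key is missing or empty."""
    missing = [k for k in REQUIRED_KEYS if not private_data.get(k)]
    if missing:
        raise ValueError("FAB dataset incomplete, missing: " + ", ".join(missing))
```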
Within FAB Pack Flow Function, the job goes through another assertion to verify the callback is set. The job is ungrouped, renamed to match the .sflow file structure (flow.xml, manifest.xml), and then merged again. The job folder now needs to be archived, but the files can't sit inside a top-level folder the way the Archive configurator would produce, so I built a script called Flat Archive that archives the files at the top level. A .zip comes out, which has its suffix replaced with .sflow and is then sent to the callback.
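The essential difference from a normal archive step is the flat layout plus the suffix swap. A minimal Python sketch of that idea (my own function name; the real Flat Archive is the Switch script linked below):

```python
import os
import zipfile

def pack_sflow(folder, out_base):
    """Zip the files in `folder` at the archive's top level (no wrapping
    directory, unlike a default archive of the folder itself), then give
    the result the .sflow suffix."""
    zip_path = out_base + ".zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
        for name in os.listdir(folder):
            # arcname=name puts each file at the root of the archive
            z.write(os.path.join(folder, name), arcname=name)
    sflow_path = out_base + ".sflow"
    os.replace(zip_path, sflow_path)  # .zip -> .sflow
    return sflow_path
```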
Back in the Auto Backup Program flow, the .sflow files stream out and the user can do whatever they want with them. In the default case, they are backed up to the network, separated into folders by flow name.
What was used
Private data script: https://github.com/open-automation/Swit ... rivateData
Dummy job clock script: https://github.com/open-automation/DummyJobClock
Portals: https://github.com/open-automation/SwitchPortals
Variable Assert: https://github.com/open-automation/SwitchVariableAssert
Flat Archive: https://github.com/open-automation/switch-flat-archive
Download
All Flows: https://drive.google.com/file/d/0B9ciRz ... sp=sharing
FAB Get Flow Function Documentation: https://www.googledrive.com/host/0B9ciR ... mZRdExyekE
FAB Pack Function Documentation: https://www.googledrive.com/host/0B9ciR ... 3o5S0ZLUGc
Auto Backup Program Documentation: https://www.googledrive.com/host/0B9ciR ... jFwQ2YzWnc
Update 9/16/16
This has been updated. Download and new documentation can be found at this repo:
https://github.com/open-automation/swit ... uto-backup