This is a detailed topic in the Using Hopp series in our support portal and assumes that you have some prior knowledge or experience using Hopp.
A scheduled flow is a convenient way for an interactive user to safely run a series of jobs in the Hopp Runtime without having to monitor the job list in the Portal Operations UI and manually submit jobs as their predecessors complete.
The way to do it is to first compose a schedule and save it in a variable. The schedule can then be submitted with the Submit-HpSchedule cmdlet. A schedule is simply an array of batches created by the New-HpBatch cmdlet.
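As a minimal sketch (using job cmdlets that are described further down in this article), the pattern is to create jobs, group them into batches with New-HpBatch, and collect the batches in an array stored in a variable:

$setupBatch = @( (New-HpSetupSourceJob), (New-HpSetupTargetJob) ) | New-HpBatch
$schedule = @( $setupBatch )   # the schedule is simply an array of batches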
The Submit-HpSchedule cmdlet keeps track of the jobs in the submitted schedule. If a job in the schedule faults or is cancelled, the schedule stops. The user can then investigate and potentially fix the issue that caused the fault or cancellation. When the schedule is submitted again, it resumes from the point of termination, restarting the faulted or cancelled jobs.
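Continuing the sketch above, a typical fault-and-resume cycle looks roughly like this:

Submit-HpSchedule $schedule   # a job faults or is cancelled, so the schedule stops
# ...investigate and fix the cause of the fault or cancellation...
Submit-HpSchedule $schedule   # resubmitting the same variable resumes the schedule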
Once all the jobs in a schedule are completed successfully, the schedule is done and can be discarded. To run a schedule again, you must create a new schedule.
The Hopp Automation module comes with two cmdlets that make it easy to create schedules for full migration runs:
- New-HpImportSchedule creates a schedule to run a full migration, including
- Set up Source and Target Engine
- Load Source Tables and Source Valuesets
- Load Source Views and Target Valuesets
- Run Export
- Run Full Monty Import
- Publish to Portal and Unload Target
- New-HpExportSchedule creates a schedule to run a full export-only migration (see the sketch after this list), including
- Set up Source Engine
- Load Source Tables and Source Valuesets
- Load Source Views
- Run Export
- Publish to Portal and Unload Source
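An export-only run would look roughly like the sketch below. Note that the -unloaderID parameter on New-HpExportSchedule is an assumption here (the schedule ends with an Unload Source step), so check Get-Help New-HpExportSchedule for the actual parameters:

# Sketch of an export-only run; the -unloaderID parameter is assumed, not confirmed
$exportSchedule = New-HpExportSchedule -unloaderID "(Guid)"
Submit-HpSchedule $exportSchedule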
Apart from these two ready-to-go schedules, it is quite easy to create your own, customized schedules.
To create and run, for instance, the full import schedule, you would first open a PowerShell session and ensure that the Hopp.Automation module is imported. Then you would connect to the Portal and Track, create and store the schedule, and finally submit it.
It would look something like this:
# Connect only once after you opened PowerShell
Connect-Hp -portalUrl "(Portal Url)" -trackID "(Guid)"

# Create a new schedule (providing the unloader to use) and store it
$schedule = New-HpImportSchedule -unloaderID "(Guid)"

# Run (or rerun) the schedule
Submit-HpSchedule $schedule

# Tear down the connection or simply close the PowerShell
Disconnect-Hp
Building your own schedule
It is quite simple to build your own, customized schedule.
Let's have a look inside the New-HpImportSchedule cmdlet to see what is going on.
$setupEngines = @(
    (New-HpSetupSourceJob),
    (New-HpSetupTargetJob)
) | New-HpBatch

$loadTablesAndSourceValuesets = @(
    (Get-HpSourceTableList | New-HpLoadSourceTableJob),
    (Get-HpValuesetList -engine "Source" | New-HpLoadValuesetJob)
) | New-HpBatch

$loadViewsAndTargetValuesets = @(
    (Get-HpSourceViewList | New-HpLoadViewJob),
    (Get-HpValuesetList -engine "Target" | New-HpLoadValuesetJob)
) | New-HpBatch

$export = Get-HpEntityList | New-HpExportEntityJob -auditorID $sourceAuditorID | New-HpBatch

$import = New-HpImportFullMontyJob -auditorID $targetAuditorID | New-HpBatch

$finish = @(
    (New-HpPublishToPortalJob),
    (Get-HpEntityList | New-HpUnloadTargetJob -unloaderID $unloaderID)
) | New-HpBatch

@(
    $setupEngines,
    $loadTablesAndSourceValuesets,
    $loadViewsAndTargetValuesets,
    $export,
    $import,
    $finish
)
So, what's going on?
- $setupEngines: The batch containing the parameterized jobs to set up the Source and Target Engines
- $loadTablesAndSourceValuesets: The batch containing the table load jobs plus the jobs to load the Source Valuesets
- $loadViewsAndTargetValuesets: The batch containing the jobs to load the Source Views and the Target Valuesets
- $export: The batch containing all the export jobs. Optionally provide the ID of a source auditor to use
- $import: The batch containing the Full Monty import job. Optionally provide the ID of a target auditor to use
- $finish: The batch containing the jobs to publish to Portal and unload the target. The unload job needs the ID of the unloader extension to use
- Finally, all the batches above are combined into one array that in effect is the schedule
It really is quite straightforward. To take things a bit further, let's try to create a little custom schedule.
Custom schedule to load all Source Tables with a given alias
Once you have worked with the Hopp software for a while, you will recognize that in the beginning there is often a bit of work involved in getting the delivered source files loaded into the staging database (because the files often do not adhere to the agreed metadata).
Let's build a little custom schedule to load the source tables for a specific metadata alias "Src".
# Create the batch containing the relevant tables by filtering the source table list
# and creating a load job for each table
$srcTables = Get-HpSourceTableList |
    ForEach-Object { if ($_.Alias -eq "Src") { $_ } } |
    New-HpLoadSourceTableJob |
    New-HpBatch

$schedule = @( $srcTables )

# Submit (or re-submit) the schedule
Submit-HpSchedule $schedule
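As a side note, the ForEach-Object filter can also be written with Where-Object, which reads a little more naturally for a pure filter:

# Equivalent filtering with Where-Object
$srcTables = Get-HpSourceTableList |
    Where-Object { $_.Alias -eq "Src" } |
    New-HpLoadSourceTableJob |
    New-HpBatch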
See? Easy-peasy!
Save and reuse schedules
If you find that you often need to create the same schedule, you can put the code that builds it in a PowerShell script file (extension *.ps1). Building on the sample above, you can save this part to a script file:
$srcTables = Get-HpSourceTableList |
    ForEach-Object { if ($_.Alias -eq "Src") { $_ } } |
    New-HpLoadSourceTableJob |
    New-HpBatch

return @( $srcTables )
If you have saved this to the file New-SrcTableLoad.ps1 in the current directory of your PowerShell session, you can quickly create a new schedule like so:
# Run your ps1 file to build the schedule (note the .\ prefix to run it from the current directory)
$schedule = .\New-SrcTableLoad.ps1

# Submit (or re-submit) the schedule
Submit-HpSchedule $schedule
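If you want the same script to work for more than one metadata alias, a small variation (a sketch, not a built-in Hopp cmdlet or script) is to give the script a parameter:

# New-SrcTableLoad.ps1 - parameterized variation (sketch)
param(
    # The metadata alias to filter on
    [string] $Alias = "Src"
)

$tables = Get-HpSourceTableList |
    Where-Object { $_.Alias -eq $Alias } |
    New-HpLoadSourceTableJob |
    New-HpBatch

return @( $tables )

Calling .\New-SrcTableLoad.ps1 -Alias "Abc" would then build the same kind of schedule for a different (here hypothetical) alias.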
What not to do
Remember that the Submit-HpSchedule cmdlet keeps track of the job execution in the schedule. This is why it is important to first create the schedule and store it in a variable - and then give that variable to the Submit-HpSchedule cmdlet.
By doing this, you can keep running Submit-HpSchedule on the same variable containing the schedule until it has completed successfully.
So this works:
$schedule = New-HpImportSchedule -unloaderID "(Guid)"

# Can be resubmitted until success
Submit-HpSchedule $schedule
This will also work, but it will not be able to resume the schedule because the schedule is not stored anywhere. Re-running this command will simply create a new schedule that starts from scratch:
# Run the schedule - cannot resume
Submit-HpSchedule (New-HpImportSchedule -unloaderID "(Guid)")
Summary
We hope that this gives you a good idea of how you can make your daily life easier by creating your own schedules or using the built-in ones.
Now you can submit your schedule, go to bed while it is executing, and wake up to completion and bliss!