Exercise 4.4 - Load Dynamic and Translation Valuesets


In the previous exercise, you loaded the data from the Source System and a View into staging tables. In this exercise, you will do the same for Dynamic and Translation Valuesets. 


As you remember from Studio, Valuesets come in 3 different flavours.


  • Static: Data is provided in Studio itself. For static Valuesets, the data is part of the generated engines and is loaded into the staging tables as part of the engine setup. There is no need to do anything more.
  • Dynamic: Data has to be loaded by calling the extension that is specified on the Valueset. In this exercise, the Valuesets are loaded using the default extension delivered with Hopp. This extension reads Valuesets from Excel files.
  • Translation: A Translation Valueset is published in the Portal for users to provide the data content. The Portal stores the data content internally, and in this exercise you will load the staging table from this internal Portal store.



Load Dynamic Valueset CardTypes

The Valueset CardTypes was created in the Target map and is part of the Target engine. The Valueset uses the Valueset provider delivered with Hopp. This provider reads the Valueset data from an Excel file and expects that file at a specific location in the folder structure of the track.
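Conceptually, the provider's job is simple: read rows from the Excel file and replace the Valueset's content in the staging table. The following Python sketch illustrates the idea only; the two-column Code/Description sheet layout, the ValuesetStaging table and the use of SQLite are assumptions made for the example, not Hopp's actual implementation.

```python
import sqlite3
from openpyxl import load_workbook

def load_valueset_from_excel(xlsx_path: str, valueset: str, db_path: str) -> int:
    """Read Valueset rows from an Excel sheet and replace the staging content.

    Hypothetical sketch: assumes a sheet with a header row followed by
    Code/Description columns, and a SQLite table named ValuesetStaging.
    """
    ws = load_workbook(xlsx_path, read_only=True).active
    rows = [
        (valueset, str(row[0]), str(row[1]))
        for row in ws.iter_rows(min_row=2, values_only=True)  # skip the header row
        if row[0] is not None
    ]
    con = sqlite3.connect(db_path)
    with con:
        # Replace any previous content so the staging table mirrors the file
        con.execute("DELETE FROM ValuesetStaging WHERE Valueset = ?", (valueset,))
        con.executemany(
            "INSERT INTO ValuesetStaging (Valueset, Code, Description) VALUES (?, ?, ?)",
            rows,
        )
    con.close()
    return len(rows)
```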


This file is already in the correct location as part of the training setup:



To load the Dynamic Valueset, select Target/Valuesets in the Portal Operations menu. Portal Operations will then show all the Valuesets from the Target Engine.


You can select View... from the context menu on any Valueset to see its data content in the staging table.




The CardTypes Valueset is in the list, but of course not loaded yet (if you don't see the Valueset, try clicking the Refresh button).


To load the Valueset with the contents of the Excel file, select Update... from the context menu.


Check the job in the Job list and when done, go back to the Valuesets and click the Refresh button to update the row for CardTypes.


Load Translation Valueset TranslateCardTypes


As outlined at the beginning of this exercise, the data content for Translation Valuesets is provided by users of the Portal. Of course, there is a little hitch here: this is a training setup, and there are no real users to provide this content.


Luckily, if you have sufficient authorizations (which you do in the training setup), the Portal allows you to load the data from an Excel sheet in one go. In a real-life scenario, this functionality is often used to provide the Portal users with initial data content for them to complete. 


Upload data to the Portal

Up to now, you have been working in the Operations part of the Portal, because you have been acting as an operator. A business user will normally not have access to - and indeed not even see - the Portal Operations part of the navigation menu. Instead, business users consult the Portal State in order to investigate and process events etc. It is also in the Portal State menu that the Translation Valuesets live.


Select Translation in the State menu on the left and then make sure the Partition is set to All:



Now, you can click the upload button for the TranslateCardTypes Valueset to upload the file: 


We have provided the data content for you in the file Documents\MigFx\Runtime\Track\01\Files\Valuesets\TranslateCardTypes.xlsx. Choose this file and click the Upload button to import the data content into the Portal:



Now the data content is loaded (and visible) in the Portal. You can see for yourself by selecting a specific partition in the drop-down (30 - Bank 30 or 40 - Bank 40 instead of All) and clicking the TranslateCardTypes link:


As an aside: note the Validation column of the Translation Valueset in the Portal. Hopp provides utility functions, sketched below, to:

  • unload the data content and perform some kind of validation - in this case, for instance, validating the TargetCardType against the known card types in the Target System
  • load the result of the validation back and mark the rows in the Translation Valueset accordingly
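
To make that round trip concrete, here is a minimal Python sketch. It is an illustration only, assuming an invented set of known target card types and a simple dictionary row format; it is not Hopp's actual utility API.

```python
# Illustrative only: the field names and the set of known target card
# types are assumptions for the example, not Hopp's actual data model.
KNOWN_TARGET_CARD_TYPES = {"DEBIT", "CREDIT", "PREPAID"}

def validate_rows(rows):
    """Mark each unloaded translation row as valid or invalid."""
    results = []
    for row in rows:
        ok = row["TargetCardType"] in KNOWN_TARGET_CARD_TYPES
        results.append({
            **row,
            "Valid": ok,
            "Message": None if ok else f"Unknown TargetCardType {row['TargetCardType']!r}",
        })
    return results

# One row that validates and one that does not
unloaded = [
    {"SourceCardType": "01", "TargetCardType": "DEBIT"},
    {"SourceCardType": "02", "TargetCardType": "CHARGE"},
]
for result in validate_rows(unloaded):
    print(result)
```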


So far so good. Now the data content is stored internally in the Portal as if the (imaginary) business users had entered it in the Portal. But it is still not loaded into the Portal Operations staging table.


Load the Portal Operations Staging table

The next step is to go back to Portal Operations and load the staging table with the data content from the internal Portal store. This is pretty much the same as loading the Dynamic Valueset above.


Select Source/Valuesets in the Portal Operations menu to see the Valuesets of the Source Engine:



TranslateCardTypes is there, but not loaded yet. To load the Valueset, select Update... from the context menu of TranslateCardTypes.


Check the job in the Job list and refresh the Valueset list when finished.


What happened here?


You have loaded the staging tables for the non-static Valuesets. The static Valuesets were loaded as part of the engine setup you did in an earlier exercise. For the Translation Valueset, there was the intermediary step of simulating user data entry by importing the data content into the Portal from a spreadsheet. This extra step is related to the training setup; in a real-life scenario, there would of course be real users to enter the data.


The data content for Translation Valuesets is maintained by the users in the Portal and loaded into the Portal Operations staging table in order to ensure well-known and stable data content, even if users are updating the data content in the Portal while the migration runs. In addition, rows with validation errors in the Portal will not be loaded into the staging table.
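
That last rule can be pictured as a simple filter applied during the staging load. A minimal sketch, assuming a hypothetical ValidationError field on each Portal row:

```python
# Hypothetical sketch: the ValidationError field name is an assumption
# made for the example, not Hopp's actual schema.
def rows_for_staging(portal_rows):
    """Keep only rows without validation errors for the staging load."""
    return [row for row in portal_rows if not row.get("ValidationError")]

portal_rows = [
    {"SourceCardType": "01", "TargetCardType": "DEBIT", "ValidationError": None},
    {"SourceCardType": "02", "TargetCardType": "CHARGE", "ValidationError": "Unknown type"},
]
print(rows_for_staging(portal_rows))  # only the first row is loaded
```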


Now everything is in place and in the next exercise you will be migrating the Cards - finally!

