Mapping Semantics


Specifying the mapping rules for any data migration is complicated. The specifications tend to grow very large, there is a myriad of dependencies, and the users producing the specifications normally need deep and comprehensive knowledge of the business behind the data being migrated.


While there is no silver bullet to eliminate these factors, hopp, and especially Studio, provides comprehensive support for users to stay on top of the volume and complexity as the migration project progresses, facilitating the production and maintenance of mappings of high and durable quality.


This section is a high-level outline of how the semantics in Studio provide a foundation for highly structured specifications, while still allowing the flexibility and openness needed to absorb the peculiarities and special cases that invariably exist in any real-world data migration scenario. Studio is a rich application, and for each mapping type there is of course a myriad of details. The aim of this section is to give a basic idea of the mapping project types, their areas of responsibility and how they connect and collaborate.


Common elements

While the two project types (Source Map and Target Map) are different in nature, they use a common set of elements. Because of this, the Studio interface remains consistent across the two project types, and manual rule implementation works the same way in both.


Constants

A constant is a named value with a given data type that can be used anywhere in the mapping. Constants come in two flavours (a sketch follows the list):

  • Constant: The value for the constant is provided in Studio and this value is typically incorporated in the generated code as a literal value
  • Parameter: The value for the constant is not incorporated in the generated code, but provided through the Portal. Thus, the value may change between iterations
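
To make the difference concrete, here is a minimal sketch, assuming the generated engine is C# code; the type and member names are purely illustrative and not the actual generated hopp API:

    using System.Collections.Generic;

    // Illustrative sketch only - names are hypothetical, not the generated hopp code.
    public static class MappingConstants
    {
        // Constant: the value typed into Studio is baked into the generated code as a literal.
        public const decimal DefaultInterestRate = 2.5m;

        // Parameter: the generated code holds no literal; the value is supplied per iteration
        // (in hopp, through the Portal). A plain dictionary stands in for that lookup here.
        private static readonly IDictionary<string, string> PortalValues =
            new Dictionary<string, string> { ["TargetBranchCode"] = "B-001" };

        public static string TargetBranchCode => PortalValues["TargetBranchCode"];
    }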

Value sets

A value set is a table of data organized in columns and rows. Using Studio, the user defines the columns of the value set, giving their names and data types. Once defined, a value set is used by rules to look up values or – in the case of manually implemented rules – in any way needed.


Value sets come in three flavours (a sketch follows the list):

  • A static value set: The user populates the value set by typing values directly in Studio
  • A dynamic value set: The value set is populated by the Director Runtime using parameters specified for the value set in Studio. By default, the Runtime can read value sets from Excel worksheets, but an extension point is available to implement context specific value set providers (for instance reading values from the target system)
  • A translation value set: The value set is automatically shown in the Portal enabling external users to populate the value set
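
As an illustration, the following C# sketch shows what a value set amounts to at runtime: a small typed table that rules can look values up in. The columns and rows are invented for the example and do not reflect the hopp API:

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical value set with three columns; in hopp the columns are defined in Studio.
    public sealed record ProductCodeRow(string SourceProductCode, string TargetProductCode, bool RequiresReview);

    public static class ProductCodeValueSet
    {
        // A static value set would be filled like this from values typed directly in Studio.
        // A dynamic value set would instead be populated by the Director Runtime (for example
        // from an Excel worksheet), and a translation value set through the Portal.
        private static readonly List<ProductCodeRow> Rows = new()
        {
            new ProductCodeRow("SAV01", "S-100", RequiresReview: false),
            new ProductCodeRow("LN07",  "L-230", RequiresReview: true),
        };

        // A lookup rule based on this value set would essentially do this:
        public static ProductCodeRow? Lookup(string sourceProductCode) =>
            Rows.FirstOrDefault(r => r.SourceProductCode == sourceProductCode);
    }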

Rules and Flags

Rules are used throughout the mapping in Studio in many different contexts. No matter the context, any rule is defined in Studio by specifying the data type of its return value plus the name and data type of each parameter to be passed to the rule.


Of special interest is that zero or more flags may be defined for any rule. Each flag is a way for the rule to notify the Runtime that it encountered some relevant situation. The flags, in combination with Events (see below), are the way hopp decouples the implementation of a given rule from its invocation context. A rule may be invoked in different contexts, each reacting differently to the flags raised by the rule.
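
As a minimal illustration, assuming the rule is implemented in C#, a rule with one flag might boil down to the following; the signature and the delegate-based flag are invented for the example, not the generated hopp signature:

    using System;

    public static class AccountRules
    {
        // The rule returns a value of its declared data type and takes typed parameters.
        // Raising the flag only notifies the invocation context that a situation occurred;
        // how the Runtime reacts (reject, event, ignore) is decided where the rule is used.
        public static decimal MapBalance(decimal sourceBalance, Action raiseNegativeBalanceFlag)
        {
            if (sourceBalance < 0)
            {
                raiseNegativeBalanceFlag();
                return 0m;
            }
            return sourceBalance;
        }
    }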


Rules are used extensively throughout the mapping, performing a variety of different tasks. For instance:

  • Validation rules: Validating input fields
  • Mapping rules: Assigning values to output fields (note that in most instances field values can be assigned through other facilities in Studio without using mapping rules)
  • Condition rules: Deciding whether a certain sub-element should be processed or not
  • Exit rules: Typically, cross validating a business object at the point it is completed 
  • Etc.

Rules come in two flavours (a sketch follows the list):

  • Lookup rules: A rule that uses one or more parameters to look up and return a value from a value set (see above). Rules of this kind are automatically generated by the code generator, no manual implementation is necessary
  • Manual rules: The code generator creates a virtual method to be manually implemented in Visual Studio as described above
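
The split between the two flavours could look roughly like this in C#; the class and method names are hypothetical, and the real generated code is organised differently:

    // The code generator would emit a complete lookup rule on its own. A manual rule is
    // emitted as an abstract/virtual member and implemented by hand in Visual Studio,
    // outside the generated files, so the implementation survives regeneration.
    public abstract class GeneratedRulesBase
    {
        public abstract string DeriveCustomerSegment(decimal totalAssets, int accountCount);
    }

    public sealed class ManualRules : GeneratedRulesBase
    {
        public override string DeriveCustomerSegment(decimal totalAssets, int accountCount) =>
            totalAssets > 1_000_000m || accountCount > 10 ? "PREMIUM" : "STANDARD";
    }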

Events

Using Studio, users can define User Events to be fired when a rule raises a flag. A user event is basically an event code combined with a message text. The message text can contain placeholders for context-specific values and can be supplied in multiple languages. If the message text contains placeholders, it is possible to specify the values to be merged into the message text when the event is raised. 
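
Conceptually, a user event therefore pairs an event code with message templates per language. The sketch below is only an illustration of how placeholder values are merged in; the actual placeholder syntax used by hopp may differ:

    using System.Collections.Generic;

    public static class UserEventSketch
    {
        // Hypothetical event with a message text in two languages.
        private static readonly Dictionary<string, string> MessageByLanguage = new()
        {
            ["en"] = "Account {0}: balance {1} is negative",
            ["da"] = "Konto {0}: saldoen {1} er negativ",
        };

        public static string Format(string language, string accountNumber, decimal balance) =>
            string.Format(MessageByLanguage[language], accountNumber, balance);
    }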


The user must specify a severity for the event, causing the framework to act accordingly if the event is fired. The possible reactions are:


  • Reject: Rejects the current root business object in its entirety
  • Reject Child: Rejects the current child business object, but not the entire root business object
  • Error: Nothing is rejected, but the migration result will introduce a (non-fatal) error in the target system that must be rectified
  • Warning: Nothing is rejected, but the migration result may introduce inconsistency in the target system
  • Information: Information about an action taken by the migration. Data may be modified, or new data introduced to improve quality


Whenever a rule (as defined above) is used in Studio, for instance to provide the value for a field, the user is presented with a panel to define which value to provide for each of the rule's parameters. The user can provide a literal value, reference a Constant or provide other value types depending on the exact context.


The panel also presents the user with all the possible flags that can be raised by the rule. At this point the user decides how the framework should react to each flag. The possible reactions are:

  • Ignore: The flag is ignored; no event is fired and no action taken
  • User Event: The user can identify a User Event as described above. The severity defined on the event will decide how the framework reacts
  • System Event: The user decides that the nature of the flag in this context does not merit a User Event. The framework will then generate a standard message text if the flag is raised; the user must also specify the severity of the System Event

It is the events (User and System) fired in this manner that are collected by the Director Runtime, shown in real time when monitoring the migration and presented in the Tracking Web Application.
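
A rough sketch of how the three reactions could be handled is shown below; the enum and handler are illustrative only, since in hopp the choice is made per flag in the Studio panel and acted on by the generated engine:

    using System;

    public enum FlagReaction { Ignore, UserEvent, SystemEvent }

    public static class FlagHandlingSketch
    {
        public static void OnFlagRaised(FlagReaction reaction, string flagName, string? userEventCode)
        {
            switch (reaction)
            {
                case FlagReaction.Ignore:
                    break; // no event fired, no action taken
                case FlagReaction.UserEvent:
                    // severity comes from the referenced User Event definition
                    Console.WriteLine($"Fire user event {userEventCode} for flag '{flagName}'");
                    break;
                case FlagReaction.SystemEvent:
                    // standard message text generated by the framework; severity specified per use
                    Console.WriteLine($"Fire system event with a standard message for flag '{flagName}'");
                    break;
            }
        }
    }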


Target Map

The Target Map is the starting point for all specifications. It is the responsibility of the Target Map to ensure that the data produced by the migration is valid and can be delivered to and accepted by the target system without late-occurring errors. It is normally the more extensive of the two map types in hopp, but it is also the only map that can be reused if the same target system receives data from different source systems across separate migration projects.


Starting a Target Map completely from scratch begins with importing the specification of the target data structures that the data migration should produce. These data structures can represent anything, for instance tables in a database, parameter lists, routine calls, etc.


It is the core purpose of the Target Map:

  • to define how to produce data for these structures when the migration executes
  • to enforce runtime validations to ensure that the target data produced is acceptable by the target system

In addition, it is the Target Map that creates the business object hierarchies that serve as a mainstay for the entire specification, execution and presentation of the migration result.


Developing the Target Map involves these main tasks (leaving out an abundance of more detailed tasks; a sketch follows the list):

  • Manually define the hierarchies of business objects 
  • For a given business object, point out the target system structures that this business object will deliver data to
  • For each target data structure, determine how to assign the value for each field in the structure. Many ways exist to internally derive/calculate these values from other values inside the target specification
  • In case a given value cannot be derived or calculated in any way, it surfaces as an upstream requirement for data to be received. This is done by manually creating a so-called External Field on the business object 
  • In some cases, values can be retrieved from other, related business objects. In these cases, it is possible to create relationships between business objects and use these relationships to retrieve values. Relationships automatically evolve into execution dependencies to be respected by the Runtime
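
The concepts from the task list can be pictured as a simple data model; the C# type below is purely illustrative and not the hopp object model:

    using System.Collections.Generic;

    public sealed class BusinessObjectSketch
    {
        public string Name { get; init; } = "";

        // Target system structures (tables, parameter lists, routine calls, ...) that
        // this business object delivers data to.
        public List<string> TargetStructures { get; } = new();

        // Values that cannot be derived or calculated inside the Target Map surface here
        // as upstream data requirements to be met by the Source Map.
        public List<string> ExternalFields { get; } = new();

        // Relationships used to retrieve values from related business objects; they also
        // become execution dependencies respected by the Runtime.
        public List<BusinessObjectSketch> RelatedObjects { get; } = new();

        // Child business objects in the hierarchy.
        public List<BusinessObjectSketch> Children { get; } = new();
    }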

The generated target engine contains the code to receive the exported data, call rules, etc. as specified, and produce the target result.


Publishing the Target Map for import into a Source Map is in fact just publishing the hierarchies of business objects with their external field requirements.


Source Map

The Source Map is built on two different inputs:

  • The published data requirements from the Target Map
  • The imported data structures from the source system

It is the core purpose of the Source Map to define how to meet the data requirements of the Target Map using the data structures in the source system.


The business object hierarchies rooted in the target specification, with any alterations imposed by the transformation specification, are presented in Studio. For each business object in the hierarchy, it must be specified how the external fields of the business object will be assigned a value.


For this purpose, the Source Map contains an export-specific toolset. Source data structures can be aggregated into views, and these views, as well as the source tables themselves, can be connected to business objects in order to provide the necessary data.


The generated export engine resulting from the Source Map contains these main parts (a sketch of the overall flow follows the list):

  • A generated SQL Server database containing:
    1. A generated table for each source data structure
    2. For each view defined in the Source Map
      • A generated table to contain the data for the view
      • A generated stored procedure to populate the table with the data
    3. For each business object in the specification, stored procedures to retrieve the source data necessary to satisfy the data requirements of the business object
  • Generated code to execute the stored procedures to populate the views
  • Generated code to execute the stored procedures to retrieve source data for the business objects, call rules as specified and populate the business objects with field data to complete the export result.
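
At a very high level, the export run described above could be pictured like the hypothetical C# sketch below; the real generated engine works against the generated SQL Server database and is far more involved:

    using System;
    using System.Collections.Generic;

    public static class ExportRunSketch
    {
        public static void Run(IEnumerable<string> viewProcedures, IEnumerable<string> businessObjectProcedures)
        {
            // 1. Populate the generated view tables.
            foreach (var procedure in viewProcedures)
                Console.WriteLine($"EXEC {procedure}  -- fill the table backing this view");

            // 2. Retrieve source data per business object, call the specified rules and
            //    populate the business objects' fields to complete the export result.
            foreach (var procedure in businessObjectProcedures)
                Console.WriteLine($"EXEC {procedure}  -- fetch rows, apply rules, assign external fields");
        }
    }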



