This article introduces the main components of hopp data migration software: the core software required to execute a complex data migration project. In addition to the core components, hopp offers optional modules that support project management, scoping, and testing.
hopp data migration software is a set of linked, collaborating components. Each component is a feature-rich application, and together the suite provides a complete foundation and support for all aspects of the data migration process.
With the Business Objects at the center, the toolset consists of four separate components.
Starting at the top of the figure below is the Studio application. This is where you produce the target map and the source map.
Figure: hopp Components
Studio is a Windows application dedicated to describing how the transformation and validation requirements of the migration must function. Its highly structured interface provides all the features needed to specify even a complex data migration correctly.
Studio requires you to be precise and structured in the way you describe the mapping. At the same time, it is a well-designed Windows application focused on being user-friendly and keeping the user productive and efficient.
A team can work together to produce the maps, with each member working in Studio on, for instance, their part of the Target Map. Each user has Studio installed locally on a PC and checks items in and out of a common repository, so the team can collaborate efficiently through that shared repository.
The maps produced with Studio are structured and precise enough to be handed directly to the Engine. The Engine contains a code generator that produces C# code doing exactly what the maps describe.
Studio is meant for users who do not necessarily have a deep technical or programming background: people who know the data and the business logic being moved. In other words, Studio is aimed at the functional consultant rather than the technical consultant. The engine code, by contrast, is where the technical people work.
A migration project will have people concerned with specifying what needs to be done. In projects not using hopp software, these people typically use Word documents and Excel spreadsheets to specify how fields in the source system should be mapped and transformed to fields in the target system, which validation rules to apply, and so on.
When the specification in Studio is done and the corresponding code has been generated by the Engine, we leave the metadata world and enter the operational world, where the migration is actually executed. First, an engine is obtained from the code generator and deployed into the Runtime environment. Portal Operation remotely controls the Runtime environment where the job processes run. The Runtime can run locally on your own machine, but it can also run on a different machine in your network, so you can dedicate a powerful migration server to the task.
Portal Operation is targeted at a limited set of roles, typically one or two operators. Quite often, the same people who do the manual rule implementation also perform the operational tasks in Portal Operation. Once a migration iteration has reached a level of quality you consider presentable, you publish it to the Portal. The Portal is meant to reach the widest audience: the migration team itself, of course, but also the business users involved in the project.
Here we are talking about the real business users involved in testing and verifying the migration results. They can see what happened in the migration and compare it to what they expect, including the events created and flags raised as specified. Management, too, can benefit from looking at the tracker application to get an idea of the quality and progress of the migration. These are the components that make up our solution.
Below is a short description of each main component.
The Studio is a Windows productivity application used to create the mapping. Studio supports and enforces highly structured specifications.
In addition, Studio contains extensive cross-referencing and reporting functionality, significantly improving the overall understanding and overview of the mapping.
Finally, Studio validates the mapping, clearly reporting any errors and inconsistencies that in turn would cause incorrect or invalid data migration results.
Studio exports the entire mapping in a format usable by the Engine (described next), which then generates the program code to execute the data migration.
The Engine contains the code generators that produce the engine code, as well as base class libraries with common supporting functionality for the generated code.
While the Engine generators produce the vast majority of the code necessary to execute the data migration, certain migration rules may be implemented by hand. The generated code contains stubs for these rules, making their manual implementation straightforward.
While manual rule implementations are left alone (not overwritten) by the code generator, any modification in the mapping that changes the interface of a manual rule implementation is discovered instantly at compile time.
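The generated-stub pattern described above can be sketched as follows. This is a hypothetical illustration only, not hopp's actual generated code: the real Engine generates C#, and all class, method, and rule names here are invented.

```java
// Hypothetical sketch of the generated-stub pattern (names invented;
// the real hopp Engine generates C#, not Java).

// --- Generated by the code generator; regenerated on every mapping change ---
abstract class CustomerRules {
    // Fully generated rule, derived directly from the mapping.
    String mapCountryCode(String sourceCountry) {
        return sourceCountry == null ? "UNKNOWN" : sourceCountry.trim().toUpperCase();
    }

    // Stub for a manual rule: the generator emits only the signature.
    // If a mapping change alters this signature, every manual
    // implementation stops compiling, so the mismatch is caught
    // at compile time rather than at run time.
    abstract String deriveSegment(String customerType, int yearlyVolume);
}

// --- Written by hand; left untouched by the code generator ---
class CustomerRulesImpl extends CustomerRules {
    @Override
    String deriveSegment(String customerType, int yearlyVolume) {
        if ("CORPORATE".equals(customerType) && yearlyVolume > 1_000_000) {
            return "KEY_ACCOUNT";
        }
        return "STANDARD";
    }
}

public class StubDemo {
    public static void main(String[] args) {
        CustomerRules rules = new CustomerRulesImpl();
        System.out.println(rules.mapCountryCode(" dk "));                // DK
        System.out.println(rules.deriveSegment("CORPORATE", 2_000_000)); // KEY_ACCOUNT
    }
}
```

The key design point is the separation: regenerating the abstract class never touches the hand-written subclass, while the compiler enforces that the two stay in sync.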
The quality of the generated code is such that it is never manually retouched. When the generated code does not produce the desired migration results, the situation is invariably corrected by modifying the mapping and regenerating the code.
The Portal Operation is the execution framework that uses the generated engines to execute the data migration. Through the Portal Operation interface, the user loads source data, populates value sets, executes the data migration, and offloads the target data produced by the generated engines.
Using Portal Operation, the data migration can be iterated in a very fine-grained manner: for example, all business objects that generated a specific event during migration can be re-run, or even a single business object.
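Conceptually, such a fine-grained iteration amounts to selecting the business objects that raised a given event and restricting the next engine run to them. A minimal sketch of that selection step, with invented record and event names (this is not hopp's actual API):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch of fine-grained re-iteration (names invented;
// not hopp's actual API).
record BusinessObjectResult(String objectId, List<String> events) {}

class IterationPlanner {
    // Select only the business objects that raised the given event,
    // so the next migration run can be restricted to them.
    static List<String> objectsToRerun(List<BusinessObjectResult> results,
                                       String event) {
        return results.stream()
                .filter(r -> r.events().contains(event))
                .map(BusinessObjectResult::objectId)
                .collect(Collectors.toList());
    }
}
```

For example, if only object "C-42" raised a "MISSING_POSTAL_CODE" event, `objectsToRerun` returns just that object's id, and the subsequent iteration can be limited to it.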
In addition, the Portal Runtime supports the operation and execution of multiple data migration projects across multiple servers.
The Portal shows the results of a data migration iteration in a web-based interface.
In addition to the passive presentation of the results, the Portal contains rich workflow functionality allowing the involved users to manage responsibility, comments, and state (accepted, fixed, etc.) for all events.