Data migration is easy

Whether it is because of a completely new implementation, the transfer to a new PLM system or the upgrade of an existing one, data migration will almost always be part of the activities. Why:

  • Because there is no such thing as 'old' information
  • Because it concerns active company information, or information that is the starting point for improvements
  • Because it is a chance for quality correction
  • Because it is easy....


Each of the statements above can lead to its own discussion. But 'easy' is true, although it is a relative statement. The condition is that clear decisions are taken at each of the needed steps. Easy does not equal quick or cheap; those are the consequences of each of the decisions. The figure shows that with every decision on correction, migration or checking, a choice has to be made between quality, cost and time. A good approach for executing a migration process is to assume the ideal situation with an unlimited budget and unlimited time, and let quality be the leading factor.



The principal steps to follow in a migration are:

  • Inventory and analysis
    Which information is available? This concerns all information in all current systems. Only in a later step should it be decided what to migrate or not, and why.
  • Mapping and mapping rules
    A mapping (table) has to be created between the datamodel of the current data and the datamodel of the new PLM system. It is very likely that this will also need mapping rules to define combination, translation or splitting of values.
  • Analysis, filtering and quality verification
    Using the mapping and the rules, the quality of information can be analysed. This can be used to filter which information is necessary to migrate and which is not. If information that was filtered out turns out to be necessary after all, it is much more useful to migrate it anyway. Keeping information available in an old or offline system will in principle lead to errors and undermines the advantages of the PLM system. It is not necessary to correct information unless this leads to conflicts in the migration itself. Information can always be migrated with additional but temporary rules, combined with the approach of correcting it in the PLM system itself after migration. However, it is advised to only decide on this course of action if correction resources can be reserved before migration, or if the company has a clear interest in the availability of information over quality correction.
  • Migration
    The actual migration is - if mapping and rules are clear and consistent - a purely technical action. Points of attention are to clearly define (estimate) the throughput time, the planning and actual steps of the migration, and the priority of the information to migrate.
  • Verification
    A good approach for migration is to plan a 100% validation as a difference report between the old information and the information as stored in the PLM system after migration. The chance is very high that exceptions are found that were not part of the defined mapping and rules. Validation ensures that there is no quality difference between the information before and after migration. It still does not ensure or say anything about the quality of the information itself.
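The mapping, migration and verification steps above can be sketched in a few lines of code. This is only an illustration under assumptions: the legacy records, field names and mapping rules below are all invented, not taken from any real migration.

```python
# Hypothetical sketch of mapping, mapping rules and 100% verification.
# All field names, values and rules are invented for illustration.

LEGACY = [
    {"partno": "A-100", "rev": "b", "descr": "Bracket", "mat": "ST37"},
    {"partno": "A-200", "rev": "a", "descr": "Cover",   "mat": "AL"},
]

# Mapping table: legacy field -> field in the new PLM system
FIELD_MAP = {"partno": "number", "descr": "name"}

def migrate(record):
    """Apply the mapping table plus translation rules to one record."""
    new = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    new["revision"] = record["rev"].upper()  # translation rule: lower -> upper case
    new["material"] = {"ST37": "Steel S235", "AL": "Aluminium"}[record["mat"]]
    return new

def difference_report(old, new):
    """100% validation: report every field whose migrated value does not
    match what the mapping and rules prescribe for the old record."""
    expected = migrate(old)
    return {k: (expected[k], new.get(k)) for k in expected if new.get(k) != expected[k]}

migrated = [migrate(r) for r in LEGACY]
# A clean migration yields an empty difference report for every record.
assert all(difference_report(o, n) == {} for o, n in zip(LEGACY, migrated))
```

Note that, exactly as the text says, an empty difference report only proves the migration preserved the information; it says nothing about whether the legacy values were correct in the first place.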



Supplier Management in PLM and ERP

After reading the discussion on how to manage supplier and manufacturer information in PLM and/or ERP, I took the opportunity to document the datamodel as I see it and the way-of-working around it. A point of attention to start with is that PLM and ERP have a different approach to defining the supplier and/or manufacturer. In the lifecycle definition it is not strictly necessary to define both; it may be sufficient to define a supplier. Additionally, a manufacturer may of course also function as supplier. The resulting datamodel in PLM will look like this.
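As a minimal sketch of such a datamodel, the relationships could be expressed as below. The class and attribute names are my own invention and may differ from the figure; the point is that one company record can act as manufacturer, supplier or both, and that a supplier part can optionally link to a manufacturer part.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of a PLM supplier/manufacturer datamodel.
# Names are invented; the actual datamodel in the figure may differ.

@dataclass
class Company:
    name: str
    is_manufacturer: bool = False
    is_supplier: bool = False  # a manufacturer may also function as supplier

@dataclass
class ManufacturerPart:
    manufacturer: Company
    mpn: str                   # manufacturer part number

@dataclass
class SupplierPart:
    supplier: Company
    spn: str                   # supplier part number
    manufacturer_part: Optional[ManufacturerPart] = None  # link is optional

@dataclass
class Part:
    number: str
    manufacturer_parts: list = field(default_factory=list)
    supplier_parts: list = field(default_factory=list)

# One company acting as both manufacturer and supplier
acme = Company("Acme", is_manufacturer=True, is_supplier=True)
mp = ManufacturerPart(acme, "MPN-123")
part = Part("100234")
part.manufacturer_parts.append(mp)
part.supplier_parts.append(SupplierPart(acme, "SPN-9", manufacturer_part=mp))
```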


Customization vs. Configuration

As a result of a discussion in the LinkedIn PDM Platform group, I wanted to straighten out some things around the continuing discussion on the necessity and negative influence of customizations. Company environments grow, and processes, procedures and ways-of-working are configured according to the direct needs of specific activities of persons or departments. In most cases this leads to local and point-to-point solutions. When acquiring a (new) PLM system, the differences between the way-of-working of the software and that of the company will always surface.


File name conventions

With the introduction of PDM, where files are managed inside a database, the worries about file naming conventions are over, or so everyone thinks. But there is a multitude of reasons why we still need conventions.

The most important reason is of course that files do not stay inside your PDM system. Suppliers, co-developers, customers and other third parties in general will not have access to your database.
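Once a convention is agreed, it can be enforced automatically on files leaving the system. A small sketch, where the convention pattern itself is an invented example (`<project>-<partnumber>_<revision>.<ext>`), not a recommendation:

```python
import re

# Hypothetical convention for exported files: <project>-<partnumber>_<revision>.<ext>
# e.g. "PRJ1-100234_B.pdf". The pattern is an invented example.
CONVENTION = re.compile(r"^[A-Z0-9]+-\d{6}_[A-Z]\.(pdf|step|dwg)$")

def check_name(filename: str) -> bool:
    """Return True if an exported file name follows the convention."""
    return CONVENTION.fullmatch(filename) is not None

assert check_name("PRJ1-100234_B.pdf")
assert not check_name("final version (2).pdf")
```

A check like this could run as part of the export or release procedure, so that a third party never receives a file the company cannot trace back to its PDM record.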


Multidisciplinary Configuration Management (2)

To start a solution for managing the multidisciplinary design information, I am assuming that the functional decomposition is performed in such a way that each 'level' of decomposition is a set that is mutually exclusive and collectively exhaustive. This at least assures a neat collection of requirements.
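That assumption can be made concrete. A sketch, modelling requirements simply as sets of identifiers (the modelling choice is mine, not from the original post): a level of decomposition is valid when its children do not overlap (mutually exclusive) and together cover the parent (collectively exhaustive).

```python
# Hypothetical sketch: check that one level of a functional decomposition
# is mutually exclusive and collectively exhaustive (MECE) with respect
# to its parent. Requirements are modelled as sets of identifiers.

def is_mece(parent: set, children: list) -> bool:
    union = set()
    for child in children:
        if union & child:      # overlap -> not mutually exclusive
            return False
        union |= child
    return union == parent     # must cover the parent completely

parent = {"R1", "R2", "R3", "R4"}
assert is_mece(parent, [{"R1", "R2"}, {"R3", "R4"}])
assert not is_mece(parent, [{"R1", "R2"}, {"R2", "R3", "R4"}])  # overlap
assert not is_mece(parent, [{"R1"}, {"R3"}])                    # gap
```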


Multidisciplinary Configuration Management (1)

A PLM system is the center for all product information, and therefore consists of information from multiple disciplines. A generally used cross-section is to split up information into Mechanical, Electrical and Software. The first point may be that the Electrical information is too general and may or must be split up into Electrical layout, PCB design and Cabling.


