Discovering the Data Behind Your Applications

We’ve been providing a solution to get metadata out of ERP and CRM applications for quite a few years now, and a common question over that period has been some variation of: “It’s great what you do for SAP (PeopleSoft, Siebel, etc.) – now tell me, can you do the same for our other packaged apps?”

And we’ve normally had to say “Sorry, afraid not”. But lately, that has all changed.

What we do for SAP, SAP BW, PeopleSoft, Siebel, JD Edwards, Oracle eBusiness Suite and, more latterly, Salesforce is to get at the metadata embedded within these applications by looking at their data dictionary tables. If we simply looked at the database system catalog, we could claim to ‘reverse engineer’ anything – but that would not normally address customers’ requirement to really understand the metadata.

The key is to provide access to the proprietary data dictionary structures that describe the data underlying the application, in a way that is intelligible to mortals. And that’s why we historically said ‘No’ to the above question. For each application we address, we’ve had to go through a research phase to locate the relevant data dictionary tables and then work out how to extract that metadata and map it into our own repository.

So we started to think about how we could address the ‘other packaged applications’ requests. What we’ve come up with is a means of addressing other applications with a minimum of work.

We describe this as ‘ETL for Metadata’ because that’s pretty much what it is: we can now Extract metadata from a package, Transform it as required and Load it into our repository, so that the searching and manipulation features of Safyr are available, just as they are for the other applications we already address. Of course there is some work to be done, and a package needs certain characteristics before the ETL for Metadata process can be practically used:

  • The package needs to have a ‘data dictionary’ layer.  It might not be called that by the package, but most packages will have their own data definition features for laying a ‘logical’ view over the physical data structure.  If no such thing exists, we probably can’t add much value.
  • The data dictionary needs to reside in a relational database.  This requirement is not totally cast in stone, but it will make the job a lot easier if we can use SQL to interrogate the data dictionary.

Given that the package meets these criteria, what’s involved in getting its metadata into Safyr? First, we need to do some investigative work to locate where each of the ‘objects’ we store in our repository lives in the application: where it stores Tables, where it stores Fields (or Columns), where it stores Relationship information, and so on.
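To make this investigative step concrete, here is a minimal, hedged sketch of what the end result of that research looks like: a SQL query joining a package’s dictionary tables to list its logical tables and fields. The dictionary table and column names (`dd_tables`, `dd_fields`, etc.) are invented for illustration – every real package has its own proprietary names – and an in-memory SQLite database stands in for the application’s relational store.

```python
import sqlite3

# Toy stand-in for a packaged application's data dictionary.
# All names here are invented for illustration; a real package
# will have its own proprietary dictionary tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dd_tables (table_id INTEGER, table_name TEXT, description TEXT);
CREATE TABLE dd_fields (table_id INTEGER, field_name TEXT, data_type TEXT);
INSERT INTO dd_tables VALUES (1, 'CUST', 'Customer master');
INSERT INTO dd_fields VALUES (1, 'CUST_ID', 'NUMBER'),
                             (1, 'CUST_NAME', 'VARCHAR');
""")

# The research phase amounts to discovering queries like this one,
# which joins the dictionary's table store to its field store.
rows = conn.execute("""
    SELECT t.table_name, t.description, f.field_name, f.data_type
    FROM dd_tables t
    JOIN dd_fields f ON f.table_id = t.table_id
    ORDER BY t.table_name, f.field_name
""").fetchall()

for table_name, desc, field, dtype in rows:
    print(f"{table_name} ({desc}): {field} {dtype}")
```

Once queries of this shape have been worked out for each object type, the rest of the process can be scripted.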

Once that research work is done, a simple scripting capability within Safyr allows definition of the queries that will be used to get at this metadata layer. The scripting capability also includes the ‘Transform’ step, allowing the raw extracted metadata to be manipulated into a suitable form for loading into Safyr.
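The extract–transform–load idea can be sketched in a few lines. This is not Safyr’s actual scripting syntax – the raw rows, type map and repository structure below are all assumptions made for the sake of illustration – but it shows the shape of the work: raw dictionary rows come out of the extract queries, a transform maps the package’s native types onto generic repository types, and the load step groups the records by table.

```python
# Raw rows as a dictionary extract query might return them:
# (table, field, native type). Invented for this sketch.
RAW_FIELDS = [
    ("CUST", "CUST_ID", "NUMBER(10)"),
    ("CUST", "CUST_NAME", "VARCHAR2(80)"),
]

# Transform: map the package's native types onto generic repository types.
TYPE_MAP = {"NUMBER": "Integer", "VARCHAR2": "String"}

def transform(table, field, native_type):
    # Strip any length/precision suffix, e.g. "NUMBER(10)" -> "NUMBER".
    base = native_type.split("(")[0]
    return {"table": table, "field": field,
            "type": TYPE_MAP.get(base, native_type)}

# Load: group the transformed records by table in the repository.
repository = {}
for row in RAW_FIELDS:
    rec = transform(*row)
    repository.setdefault(rec["table"], []).append(rec)

print(repository["CUST"])
```

In practice the transform rules are where most package-specific knowledge ends up, which is why a per-package script, written once, makes subsequent extractions automatic.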

We’ve already used this new capability on a few packaged applications, and the average time from starting the research to delivering the metadata in Safyr is about five person-days. Once the scripts have been developed, the extraction process is entirely automatic, just like the extractor functions we use for our longer-established applications such as SAP.

Over the coming months and years we will be building up a library of such scripts that we can offer to customers who use these applications. We are also always delighted to explore potential applications and markets with partners, domain specialists and other software application vendors.

Please let us know if you are interested in a dialogue.
