15-01-2015 04:57 PM
What is the compression function/algorithm used to compress a 'text' attribute that is stored in the BLOB_BIN_VALUE column of the A_BLOB RDB table when BLOB_COMPRESSED = 1?
Thank you.
We are building reports directly against the Mega RDB to address the aggregate data analysis requirements which we have. We need to be able to use industry standard reporting and analysis tools against our Mega ITPM data.
We fully understand that the model is subject to change and this is unsupported.
17-05-2019 04:22 PM
Hi,
Thanks for the answer!
Unfortunately, this requires another license...
Best
Raphael
16-05-2019 09:05 PM
Hi RGenin,
I can't necessarily elaborate on the SQL/Oracle implementation our developers incorporated into the application, but I know we now offer a "DataMart" feature that better exposes the data in an environment for reference from third-party applications. Have you looked into that? If not, I would recommend looking it up in our online documentation and reaching out to your MEGA account manager.
I hope that helped!
JCAMARA
15-05-2019 09:42 PM
Hello,
I was wondering if this topic has been discussed any further?
Best regards,
Raphaël.
17-03-2015 08:56 PM
Thank you for your reply.
Two questions: can you share with us which Lempel-Ziv third-party library you are using?
If not, the name of the LZ variant?
Also, why compress at all? SQL Server and Oracle both compress BLOBs and large text fields natively.
Thank you!
10-02-2015 01:15 AM
Hi,
Thank you very much for your response!
I am sorry as I could not deduce your first name from your forum username of jftrianda.
I fully understand that Mega does not support directly accessing the RDB data. We have no issue with that.
Our reporting requirements are such that we would need to author tens of thousands of lines of Java programming to obtain the reports we need. You have some good reporting capabilities in the tool, but most are built for 'immediate' and atomic analysis - all very good and very useful!
We are considering the Talend approach as well. Does the Talend ETL ship with the tool? I noticed that the generated Java code does. Eventually, we will probably go with an ETL solution accessing the Mega objects via the Mega API. We are well versed with most of the ETL tools.
However, one of our biggest frustrations here at AIG has been the inability to access the myriad of ITPM data using industry standard reporting and analysis tools! We leverage a good number of those tools. We are also gravely concerned with performance impact of reporting on the Mega IIS server.
We were very successful in creating a "dynamic" mapping layer of SQL Server views on the Mega RDB. With some suggestions from another Mega customer who did something similar, we authored some SQL Server T-SQL to dynamically create views on all of the Mega Object tables and the joiner (association) tables using the Mega MetaClasses related tables.
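To illustrate the shape of that view-generation approach (every table, column, and schema name below is a hypothetical placeholder — the real metaclass catalog is undocumented and unsupported, so substitute whatever your own catalog query returns), the T-SQL generator boils down to a loop like this, sketched here in Python:

```python
# Sketch of dynamic view generation over MEGA object tables.
# ALL NAMES ARE HYPOTHETICAL PLACEHOLDERS: the real metaclass
# catalog tables and column names are not documented by MEGA.
def make_view_ddl(metaclass_name: str, physical_table: str) -> str:
    """Emit CREATE VIEW DDL that exposes only current, committed rows.

    The WHERE clause mirrors the filter described later in the thread:
    end_validity = INT_MAX marks a still-current row, IDTRAN = 0
    excludes rows belonging to open transactions (our reading only).
    """
    view_name = metaclass_name.replace(" ", "_") + "_O_P"
    return (
        f"CREATE VIEW [views].[{view_name}] AS\n"
        f"SELECT * FROM [{physical_table}]\n"
        f"WHERE end_validity = 2147483647 AND IDTRAN = 0;"
    )

# One generated statement per metaclass row returned by the catalog:
ddl = make_view_ddl("Application", "OBJ_12345")  # placeholder table name
print(ddl)
```

In the real script, the `(metaclass_name, physical_table)` pairs come from a query against the MetaClasses-related tables, and the generated DDL is executed against the "views" database.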
You did some interesting and some puzzling things with your implementation in the RDB. Again, very elegant, but the evolution from your proprietary RDB to SQL Server or Oracle is evident.
I included a copy of the SQL which we use to extract all Persons and their Roles associated with an application. You can deduce what we did in terms of naming convention. We combined both our "production" and "system" Mega repository views in our "views" RDB.
With respect to compression...
I played around with LZ, Huffman, and a couple of LZ variants to determine which compression algorithm was being used. Tedious. A question I will pose to you: why don't you simply leverage the RDBMS compression of large character fields? Both SQL Server and Oracle perform excellent compression on BLOBs.
Can you tell me what LZ variant you used?
Unless we can persuade Mega to expose the Mega MetaClasses (at least the ITPM ones) and MetaAssociations, our current approach will be an interim one. We would like to get it to work. Your CP3 changes to BLOB storage were quite interesting. I would suggest that you try (from a performance and compression perspective) using the native RDB compression.
Please know, I espouse the elegance of the Mega meta model implementation to all who will listen!
Below is a sample of the SQL we use to extract all Persons and their Roles who are associated with an Application object. Easier than Mega API? Probably not. The general community knows much about reporting technologies. Exposing this to them has tremendous value.
Now... Mega could automatically create these views for us (really, really easy), or supply Talend ETL for all of the ITPM objects, or implement Table-Valued Functions... basically, PLEASE expose your data!
select
a.IDABS as [Application IDABS]
,a.[EAP Global App ID]
,substring(a.[Generic Local name],1,patindex('%[[]%',a.[Generic Local name])-1) as [Application Name]
,p.Name [Application Owner Name]
,pr.Name as RoleName
,p.[E-Mail]
from Application_O_P a
join [Person Assignment to Assigned Object_L_P] paao on (paao.[Assigned Object] = a.IDABS)
join [Person Assignment_O_P] po on (po.IDABS = paao.[Person Assignment])
join [Mega Assignable Element to Person Assignment_L_P] marpa on (marpa.[Person Assignment] = paao.[Person Assignment])
join [Person_O_S] p on (p.IDABS = marpa.[Mega Assignable Element])
join [Role to Person Assignment_L_P] rpa on (rpa.[Person Assignment] = paao.[Person Assignment])
join [Person Role_O_S] pr on (pr.IDABS = rpa.Role)
All of our generated views filter on:
"where end_validity = 2147483647 and IDTRAN = 0"
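For readers puzzling over that filter: 2147483647 is the maximum 32-bit signed integer, so (by our reading of the schema — this is not an official statement) a row that is still current keeps its end_validity at this "never expires" sentinel, while IDTRAN = 0 skips rows tied to uncommitted transactions. A quick check of the constant:

```python
# 2147483647 is 2**31 - 1, the largest 32-bit signed integer.
# Our interpretation (unconfirmed by MEGA): a row that is still
# current has end_validity left at this "never expires" sentinel.
END_VALIDITY_CURRENT = (1 << 31) - 1
print(END_VALIDITY_CURRENT)  # 2147483647
assert END_VALIDITY_CURRENT == 2147483647
```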
Thanks!
05-02-2015 11:35 AM - edited 05-02-2015 05:24 PM
Hi Greg
The official answer you already know is: "The direct access to SQL tables is not supported by MEGA. We do not give information to do so, because we do not guarantee any persistence of the technical implementation. Only the APIs allow you to access the actual data."
The algorithm used is standard Lempel-Ziv compression, which MEGA applies and reads back through a third-party zip library.
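For anyone experimenting despite the unsupported status: "Lempel-Ziv via a third-party zip library" most often means DEFLATE, i.e. something zlib-compatible. The sketch below tries the common zlib framings in turn — the exact variant and any header or length-prefix bytes MEGA uses are assumptions on our part, not confirmed:

```python
import zlib
from typing import Optional

def try_decompress(blob: bytes) -> Optional[bytes]:
    """Attempt to inflate a BLOB_BIN_VALUE payload.

    Tries the three common zlib framings in turn. The actual MEGA
    format (raw deflate? zlib header? length prefix?) is NOT
    documented, so a None result may just mean a different framing.
    """
    for wbits in (zlib.MAX_WBITS,         # zlib-wrapped stream
                  -zlib.MAX_WBITS,        # raw deflate, no header
                  16 + zlib.MAX_WBITS):   # gzip-wrapped stream
        try:
            return zlib.decompress(blob, wbits)
        except zlib.error:
            continue
    return None

# Round-trip demo with a zlib-compressed payload:
payload = zlib.compress(b"some 'text' attribute value")
assert try_decompress(payload) == b"some 'text' attribute value"
```

If none of the framings work, the data may carry a proprietary prefix before the compressed stream; stripping a few leading bytes before retrying is a reasonable next experiment.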
That said, several customers who attempted to build their own extraction tool to fill an external datamart have failed (some may have succeeded, but we don't know of them).
The best and compliant way to extract data from the MEGA Repository for data analysis is to configure an ETL that loads MEGA data into an RDBMS repository using our APIs.
We still recommend that you have a look at the ETL provided for MEGA Business Objects advanced reporting, which MEGA commits to keep up to date and which is, in any case, based on Talend Open Source.
It might give you a less risky, no-coding way of getting data out of the repository. It works through the APIs, which is much safer, though admittedly slower.
Thank you.