Data lineage and the uses of information
A true end-to-end, exhaustive, and fully automated data lineage, offering multiple views according to need.
Resolving data lineage breaks
> Views: if they are stored, {openAudit} reads them, even when they are stacked (views of views of views...); a minimal sketch of this expansion follows this list.
> Dynamic SQL: if {openAudit} cannot resolve it statically, the dynamic SQL is resolved using runtime parameters or runtime logs (see the second sketch after this list).
> Others: when information is transferred by FTP, or when the database schema is not specified (as is the case in many ETLs), {openAudit} resolves these breaks by structural recognition, reading the batch or shell scripts involved.
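To illustrate the first point, here is a minimal sketch of how stacked views can be unwound down to their base tables. All names are hypothetical, and this is not {openAudit}'s actual implementation, only the general idea:

```python
# Minimal sketch of resolving stacked views down to base tables.
# All names are hypothetical; this is not {openAudit}'s actual code.

# Direct sources of each view, as extracted from the stored view definitions.
VIEW_SOURCES = {
    "v_sales_eu": ["v_sales", "dim_region"],
    "v_sales": ["v_orders_clean", "dim_product"],
    "v_orders_clean": ["raw_orders"],        # a view of a view of a view
}

def base_tables(obj: str, seen: set[str] | None = None) -> set[str]:
    """Recursively expand a view until only base tables remain."""
    seen = seen or set()
    if obj in seen:                 # guard against cyclic definitions
        return set()
    seen.add(obj)
    if obj not in VIEW_SOURCES:     # not a view: it is a base table
        return {obj}
    tables: set[str] = set()
    for source in VIEW_SOURCES[obj]:
        tables |= base_tables(source, seen)
    return tables

print(base_tables("v_sales_eu"))
# {'raw_orders', 'dim_product', 'dim_region'}
```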
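Similarly, a hedged sketch of the dynamic SQL case: a statement whose target is only known at runtime becomes parseable once parameter values recovered from execution logs are substituted in. The template, parameter values, and extraction regex below are illustrative assumptions:

```python
# Minimal sketch of resolving dynamic SQL with runtime values.
# The template, parameters and extraction are hypothetical illustrations.
import re

# A dynamic statement as found in procedural code: the target table
# is only known at runtime.
TEMPLATE = "INSERT INTO {schema}.{table} SELECT * FROM staging.orders"

# Parameter values as they might be recovered from execution logs.
RUNTIME_PARAMS = {"schema": "dwh", "table": "fact_orders_2024"}

def resolve(template: str, params: dict[str, str]) -> str:
    """Substitute runtime parameters so the statement becomes parseable."""
    return template.format(**params)

resolved = resolve(TEMPLATE, RUNTIME_PARAMS)
print(resolved)
# INSERT INTO dwh.fact_orders_2024 SELECT * FROM staging.orders

# Once resolved, source and target can be read off the statement,
# restoring the lineage edge that static analysis alone could not see.
match = re.search(r"INSERT INTO (\S+) SELECT .* FROM (\S+)", resolved)
print(match.groups())  # ('dwh.fact_orders_2024', 'staging.orders')
```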
Dynamically combining different data transformation technologies:
> Data lineage in the dataviz layer: {openAudit} presents all the transformations behind a dashboard, from the feeding layers down to the individual dashboard cell, and lets you review all the business rules implemented.
> Data lineage in the feeding layers: {openAudit} analyzes all processing technologies (object/procedural languages, ELT/ETL), on premise or in the cloud, and combines them into a single data flow at the finest level of granularity (a sketch of this merge follows this list). A drill-through gives access to the underlying code.
> The process is dynamic: it runs daily in delta mode, and therefore stays synchronized with the information system.
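The combination of technologies into a single flow can be pictured as merging lineage edges that each parser emits in one normalized form. A minimal sketch, with entirely hypothetical edge lists and job names:

```python
# Minimal sketch of merging column-level lineage extracted from different
# technologies into one flow. Names and edge lists are hypothetical.
from collections import defaultdict

# Each parser (ETL jobs, stored procedures, view definitions...) emits
# edges in the same normalized form: (source_column, target_column, via).
ETL_EDGES = [("crm.client.id", "stg.client.id", "etl_job_42")]
PROC_EDGES = [("stg.client.id", "dwh.dim_client.client_id", "sp_load_dim")]

def merge(*edge_sets):
    """Combine per-technology edges into a single adjacency map."""
    graph = defaultdict(list)
    for edges in edge_sets:
        for src, dst, via in edges:
            graph[src].append((dst, via))
    return graph

flow = merge(ETL_EDGES, PROC_EDGES)

# Walking the merged graph crosses technology boundaries transparently:
node = "crm.client.id"
while node in flow:
    dst, via = flow[node][0]
    print(f"{node} -> {dst}  (via {via})")
    node = dst
```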
Different levels of analysis for the feeding layers:
> Point cloud: this view lets you instantly see every use of a datapoint, disregarding the intermediate transformations. Conversely, starting from a use (a dashboard, or a single figure in a dashboard), you can instantly identify its operational sources (a sketch of both directions follows this list).
> Mapping: from any datapoint (field, table), this view displays a complete map of the upstream or downstream flow, i.e. from the operational sources to where the data is exposed (dataviz, query). The information actually used is highlighted, and the uses of the information are detailed on hover (who consults the data, when, how).
> Granular data lineage: starting from any datapoint, this view lets you follow, click by click, how the data propagates through the information system, or conversely trace it back to its operational sources. Each transformation (ELT/ETL job, procedural/object code) can be inspected with the drill-through. The precise details of the data's uses (who consults it, when, how, etc.) are available in a single click.
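Both the point cloud view and the trace back to operational sources reduce to reachability over the lineage graph. A minimal sketch, with assumed, hypothetical graph contents:

```python
# Minimal sketch of the "uses of a datapoint" question as graph
# reachability over lineage edges. Graph contents are hypothetical.
from collections import defaultdict, deque

EDGES = [
    ("raw.orders.amount", "dwh.fact_sales.amount"),
    ("dwh.fact_sales.amount", "dataviz.revenue_dashboard.total"),
    ("dwh.fact_sales.amount", "dataviz.finance_report.turnover"),
]

downstream = defaultdict(list)
upstream = defaultdict(list)
for src, dst in EDGES:
    downstream[src].append(dst)
    upstream[dst].append(src)

def reachable(start: str, graph: dict) -> set[str]:
    """All nodes reachable from `start`, transformations skipped over."""
    seen, queue = set(), deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Point-cloud question: where does this datapoint end up being used?
print(reachable("raw.orders.amount", downstream))
# And the reverse: what are the operational sources of a dashboard figure?
print(reachable("dataviz.revenue_dashboard.total", upstream))
```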