Data Controller’s Knowledge Base
A section dedicated to news, updates and educational pieces.
End User Computing (EUC) applications are unavoidable – the challenge is not to erase them, but to embrace automated approaches to EUC management that will identify, clean, secure, back up, and integrate EUC data with full auditability, ownership, and approval.
EUC applications such as Excel, Access databases, and locally executed programs are often targeted as the source of a myriad of risks – such as financial misstatements, internal fraud, incorrect models, and potential business process disruption. The rationale is that business-developed / owned applications are not subject to the same access controls, development & testing standards, documentation, and release management processes as can be found over the “IT Fence”. Whilst this is probably true, the inherent flexibility of EUCs – which can be quickly updated without service desk requests, project codes, or lost arms & legs – means that EUCs are, regardless, here to stay.
The challenge is to find a way to shine a light onto this “Shadow IT”, and provide a framework by which EUC data can be extracted in a simple, safe, secure, scalable, and auditable fashion.
The ‘war on EUCs’ cannot be won – it simply isn’t practical to ban them, or to migrate / redevelop every closely held and highly complex legacy VBA application. Until alternative solutions for Citizen Developers to build Enterprise Apps (such as Boemska AppFactory) become mainstream, simple measures / controls on the EUCs themselves must be implemented – such as version control, read-only attributes, embedded documentation, peer review, etc.
In the meantime, a management system for EUCs is the ideal place for capturing the requisite metadata needed to monitor, audit, and secure the data therein. Such a management system should have, as a minimum, the following attributes:
The ability to run data quality routines at the point of data upload (from EUC to secure IT environment) provides instant feedback to EUC operators that will allow them to make corrections and avoid costly post-upload investigations, re-runs, or worse – incorrect results. As part of this process, it should be easy to create and update those Data Quality rules. A longer discussion of Data Quality can be found here.
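To illustrate the idea of running data quality routines at the point of upload, here is a minimal Python sketch. The rule definitions and field names (`account_id`, `amount`, `currency`) are hypothetical examples, not part of any real Data Controller configuration – in practice such rules would be configured per target table.

```python
# Minimal sketch of point-of-upload data quality checks.
# Fields and rules are illustrative assumptions only.

def validate_row(row, rules):
    """Return a list of rule violations for a single uploaded record."""
    errors = []
    for field, check, message in rules:
        if not check(row.get(field)):
            errors.append(f"{field}: {message}")
    return errors

RULES = [
    ("account_id", lambda v: bool(v), "must not be blank"),
    ("amount", lambda v: isinstance(v, (int, float)) and v >= 0,
     "must be a non-negative number"),
    ("currency", lambda v: v in {"EUR", "GBP", "USD"},
     "must be a known currency code"),
]

upload = [
    {"account_id": "A1", "amount": 100.0, "currency": "EUR"},
    {"account_id": "", "amount": -5, "currency": "XYZ"},
]

# Give the EUC operator instant, row-level feedback before staging
for i, row in enumerate(upload):
    for err in validate_row(row, RULES):
        print(f"row {i}: {err}")
```

The key design point is that feedback arrives before the data leaves the operator's hands, so corrections happen at source rather than after a costly post-upload investigation.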
After EUC data is submitted, it should be reviewed before the target database is updated. It should be possible (but not mandatory) for this check to be performed by a different individual. When performing that check, it should only be necessary to review new / changed / deleted records. For changed records, the reviewer should also be able to see the original values. If the data is approved, the target table is updated. If rejected, the staged data can simply be archived.
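The new / changed / deleted comparison an approver would review can be sketched as a keyed diff between the staged data and the target table. The table contents and the `id` key below are illustrative, not taken from any real system.

```python
# Sketch of the staged-vs-target diff shown to an approver.
# Records are compared on a business key; table contents are illustrative.

def diff_tables(target, staged, key):
    """Classify staged records as added, updated, or deleted vs the target."""
    t = {row[key]: row for row in target}
    s = {row[key]: row for row in staged}
    added = [s[k] for k in s if k not in t]
    deleted = [t[k] for k in t if k not in s]
    # For updates, keep (original, new) pairs so the reviewer
    # can see the original values alongside the changes
    updated = [(t[k], s[k]) for k in s if k in t and s[k] != t[k]]
    return added, updated, deleted

target = [{"id": 1, "limit": 100}, {"id": 2, "limit": 200}]
staged = [{"id": 2, "limit": 250}, {"id": 3, "limit": 300}]

added, updated, deleted = diff_tables(target, staged, "id")
```

Presenting only the delta keeps the review tractable: the approver inspects three classes of change rather than the entire table.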
By capturing who is actually submitting the data, we can see who is responsible for each EUC. By reviewing who is signing off on that data, we have an indication of who is accountable. And by seeing who is being notified of changes to that data, we can deduce who is being consulted / informed. It then becomes unnecessary to conduct time-consuming interviews or audits to produce EUC ownership documentation that is instantly out of date and error-prone!
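Deriving that ownership picture from the audit trail can be sketched as a simple aggregation over submission and approval records. The audit records, field names, and user names below are hypothetical.

```python
# Sketch: deriving RACI-style ownership from the audit trail
# rather than from interviews. Records and names are hypothetical.
from collections import defaultdict

audit = [
    {"table": "fx_rates", "submitted_by": "alice", "approved_by": "bob"},
    {"table": "fx_rates", "submitted_by": "alice", "approved_by": "carol"},
    {"table": "limits",   "submitted_by": "dan",   "approved_by": "bob"},
]

# Submitters are responsible; approvers are accountable
ownership = defaultdict(lambda: {"responsible": set(), "accountable": set()})
for rec in audit:
    ownership[rec["table"]]["responsible"].add(rec["submitted_by"])
    ownership[rec["table"]]["accountable"].add(rec["approved_by"])
```

Because the report is computed from live audit data, it never goes stale the way manually maintained ownership documents do.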
EUCs are often present on network shares, with opaque access policies and few (if any) controls to prevent unintentional deletion or corruption of data. An EUC management system should ensure data protection from the point of EUC integration right through to the loading of the data to the target table(s). End users should not require write access to the target databases! Neither should individuals in IT be regularly relied upon to run manual scripts for loading business critical data. Finally, it should be possible to restrict (at least to table level) which groups are given permission to edit or approve data.
Adding new tables / EUCs to the system should be a BAU (configuration) task, and possible without needing to secure IT development resource. The process should be so well defined, that new EUC operators can safely integrate their processes with minimum (if any) engagement from IT.
Understanding the flow of data into regulatory reports is essential for ensuring the accuracy of the figures they contain. Whilst this can be done automatically in some IT systems (eg SAS Metadata or Prophet Diagram View) the lineage breaks down when data flow crosses system borders. An EUC management system therefore should keep a full history (including a backup copy of the EUC itself) to enable traceback of data items.
Any “system” worth its salt will enable easy integration and flexible workflows to ensure that subsequent processes can be triggered on relevant events (such as EUC submission, or data approval). There should be no manual steps other than the act of submitting the data, and reviewing / approving the data.
This should really go without saying, however the reality is that there are still many teams (yes, even in IT) who work without source control. For the love of god, don’t even think about building a complex data management system without solid source control and a comprehensive test harness. Not to mention automated build and deployment. When it comes to a system that is responsible for maintenance of business data, it is imperative that it is robust, performant, and filled with checks and controls.
Whilst a decent system should be intuitive enough to operate without a manual, when it comes to maintaining, extending, or using advanced features – documentation is essential, and should be updated regularly. New feature? Write the test, make the fix, build & deploy, pass the test, update the documentation, release. Documentation should be useful for users, developers, and administrators – with diagrams, screenshots, and process flows.
During month end, temperatures are high and the pressure is on. The last thing you need on BD2 is system failure, especially when it’s 4:30 on a Friday and 150 users are affected. Be sure your platform of choice is proven, supported, and highly available.
One of the biggest business benefits of an EUC Management System is the ability to trace data directly back to a locked down copy of the EUC that it came from. The system should therefore make it easy to identify and locate that copy, to see who submitted it, who signed it off, and what the precise changes were (adds, updates, deletes).
Before you go ahead and build / maintain your own ‘black box’ bespoke EUC reporting solution, take a look at what the Data Controller has to offer (in addition to everything described above):
We can also provide an on-site consultant to perform the deployment and user training. Get in touch to learn more!
When applying financial regulations in the EU (such as Solvency II, Basel III or GDPR) it is common for Member States to maintain or introduce national provisions to further specify how such rules might be applied. The National Bank of Belgium (NBB) is no stranger to this, and releases a steady stream of circulars via their website.
The circular of 12th October 2017 (NBB_2017_27, Jan Smets) is particularly interesting as it lays out a number of concrete recommendations for Belgian financial institutions with regard to Data Quality – and states that these should be applied to internal reporting processes as well as to the prudential data submitted.
This fact is well known by affected industry participants, who have already performed a self assessment for YE2017 and reviewed documentation expectations as part of the HY2018 submission.
The DQ requirements for reporting are described by the 6 dimensions (Accuracy, Reliability, Completeness, Consistency, Plausibility, Timeliness), as well as the Data Quality Framework described by Patrick Hogan here and here. There are a number of ‘hard checks’ implemented in OneGate as part of the XBRL submissions, which are kept up to date here. However, OneGate cannot be used as a validation tool – the regulators will be monitoring the reliability of submissions by comparing the magnitude of change between resubmissions! Not to mention the data plausibility (changes in submitted values over time).
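The reliability / plausibility monitoring described above – comparing the magnitude of change between resubmissions – can be sketched as a simple threshold check. The 10% threshold and the figures below are illustrative assumptions, not NBB-prescribed values.

```python
# Sketch of a resubmission plausibility check: flag figures whose
# relative change between submissions exceeds a threshold.
# Threshold and figures are illustrative, not regulatory values.

def implausible_changes(previous, current, threshold=0.10):
    """Return items whose relative change exceeds the threshold."""
    flagged = {}
    for item, old in previous.items():
        new = current.get(item)
        if new is None or old == 0:
            continue
        change = abs(new - old) / abs(old)
        if change > threshold:
            flagged[item] = round(change, 4)
    return flagged

prev = {"own_funds": 1000.0, "scr": 800.0}
curr = {"own_funds": 1005.0, "scr": 920.0}
flagged = implausible_changes(prev, curr)
```

Running the same style of check internally, before submission, means implausible movements are explained (or corrected) before the regulator asks.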
When it comes to internal processes, CROs across Belgium must now demonstrate to accredited statutory auditors that they satisfy the 3 Principles of the circular (Governance, Technical Capacities, Process). A long list of action points is detailed – it’s clear that a lot of documentation will be required to fulfil these obligations! And not only that – the documentation will need to be continually updated and maintained. It’s fair to say that automated solutions have the potential to provide significant time & cost savings in this regard.
The Data Controller is a web based solution for capturing data from users. Data Quality is applied at source, changes are routed through an approval process before being applied, and all updates are captured for subsequent audit. The tool provides evidence of compliance with NBB_2017_27 in the following ways:
Data Controller differentiates between Editors (who provide the data) and Approvers (who sign it off). Editors stage data via the web interface, or by direct file upload. Approvers are then shown the new, changed, or deleted records – and can accept or reject the update.
As an Enterprise tool, the Data Controller is as scalable and resilient as your existing SAS platform. If you are looking for a best-in-class tool for performance testing and tuning of your analytic environment (and for viewing the performance history), we recommend Boemska ESM®.
Data Controller has a number of features to ensure timely detection of Data Quality issues at source (such as cell validation, post-edit hook scripts, duplicate removal, and rejection of data with missing columns). Where errors do make it into the system, a full history is kept (logs, copies of files, etc.) for all uploads and approvals. Email alerts for such errors can be configured for follow-up.
The Data Controller can be configured to execute specific .sas programs after data validation. This enables the development of a secure and integrated workflow, and helps companies to avoid the additional documentation penalties associated with “miscellaneous unconnected computer applications” and manual information processing.
The Data Controller is actively maintained with the specific aim to reduce the cost of compliance with regulations such as NBB_2017_27. Our roadmap includes new features such as pre-canned reports, version ‘signoff’, and the ability to reinstate previous versions of data.
As a primary and dedicated tool for data corrections, Data Controller can be described once and used everywhere.
By using the Data Controller in combination with knowledge of data lineage (e.g. from SAS metadata or a manual lookup table) it becomes possible to produce an automated report to identify exactly who – and hence which division – was involved in both the preparation and the validation of all the source data per reporting table for each reporting cycle.
Data Controller can be used as a staging point for verifying the quality of data, eg when data from one department must be passed to another department for processing. The user access policy will be as per the existing policy for your SAS environment.
Whilst the circular provides valuable clarity on the expectations of the NBB, there are significant costs involved to prepare for, and maintain, compliance with the guidance. This is especially the case where reporting processes are disparate, and make use of disconnected EUCs and manual processes.
The Data Controller for SAS® addresses and automates a number of pain points as specifically described in the circular. It is a robust and easy-to-use tool, actively maintained and documented, and provides an integrated solution on a tried and trusted platform for data management.
Data Controller is a product of Macro People, a software company whose focal point is SAS® software development and education.
Visit our educational and fun SAS® software quiz Sasensei and test your knowledge of this subject.