Leveraging DataOps Suite to comply with BCBS 239, SOC 2, and other regulations


Many industries are regulated and must adhere to a defined set of guidelines and industry standards. While we are a software company, we have been learning more about these standards and about how our software helps the financial industry comply with them.

Two of the standards we have heard about most often in the recent past are:

  • System and Organization Controls (SOC) 2
  • The Basel Committee on Banking Supervision’s standard number 239 (BCBS 239)

System and Organization Controls (SOC) 2

SOC 2 is a comprehensive reporting framework set forth by the American Institute of Certified Public Accountants (AICPA). As part of this framework, organizations engage independent auditors to execute a series of assessments that test controls related to the Trust Services Criteria (TSC): Security, Availability, Processing Integrity, Confidentiality, and Privacy.

From our perspective at Datagaps, Processing Integrity is where we can add the most value, so we explain our understanding of it below.

The Processing Integrity component has six criteria (a small automation sketch follows the list):

  • Procedures exist to prevent, or detect and correct, processing errors to meet the entity’s processing integrity commitments and system requirements.
  • System inputs are measured and recorded completely, accurately, and in a timely manner to meet the entity’s processing integrity commitments and system requirements.
  • Data is processed completely, accurately, and timely as authorized to meet the entity’s processing integrity commitments and system requirements.
  • Data is stored and maintained completely, accurately, and in a timely manner for its specified life span to meet the entity’s processing integrity commitments and system requirements.
  • System output is complete, accurate, and distributed to meet the entity’s processing integrity commitments and system requirements.
  • Modification of data, other than routine transaction processing, is authorized and processed to meet the entity’s processing integrity commitments and system requirements.
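
To make the first criterion concrete, here is a minimal sketch (in Python with pandas, not DataOps Suite itself) of the kind of automated check it calls for: reconciling a source extract against its target load so that processing errors are detected rather than discovered later in a report. The function, key, and column names are our own illustrative assumptions.

```python
# Minimal sketch: detect processing errors by reconciling a source extract
# against its target load. Names are hypothetical, not part of SOC 2 or
# of any specific product.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str, amount: str) -> dict:
    """Compare row counts, amount totals, and keys dropped in flight."""
    missing_keys = set(source[key]) - set(target[key])
    return {
        "row_count_match": len(source) == len(target),
        "amount_total_match": round(source[amount].sum(), 2) == round(target[amount].sum(), 2),
        "keys_missing_in_target": sorted(missing_keys),
    }

if __name__ == "__main__":
    src = pd.DataFrame({"txn_id": [1, 2, 3], "amount": [100.00, 250.50, -75.25]})
    tgt = pd.DataFrame({"txn_id": [1, 2], "amount": [100.00, 250.50]})
    # Flags the dropped transaction (txn_id 3) so it can be corrected before reporting
    print(reconcile(src, tgt, key="txn_id", amount="amount"))
```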

After the Global Financial Crisis of 2007–2008, one of the key lessons learned was that the data management frameworks and IT architectures in place at many banks were inadequate for aggregating and reporting the risks associated with global banking. In response, the Basel Committee on Banking Supervision issued standard number 239 (BCBS 239) as a set of principles for effective risk data aggregation and risk reporting.

BCBS 239 lays out 14 principles to strengthen banks’ risk data aggregation capabilities and internal risk reporting practices, and they fall under the following categories:

  • Overarching Governance and Infrastructure – Principles 1 and 2
  • Risk Data Aggregation Capabilities – Principles 3 through 6
  • Risk Reporting Practices – Principles 7 through 11
  • Supervisory Review, Tools and Cooperation – Principles 12 through 14

In this blog, we focus on a few aspects of Principles 3 and 7, since those are the areas most relevant from our perspective at Datagaps. If you want a complete description of the principles, please refer to this link.

As we collaborate more with the financial industry, we are learning that while the guidelines are fairly clear, the actual implementation is complex, and there are no clear tools or implementation strategies to help banks achieve these principles. In this blog, we want to delve a bit deeper with a concrete example of how our software can help achieve the Processing Integrity required for audits and enable trust in enterprise data and systems.

Consider the General Ledger (GL), one of the most important elements in the accounting world. Data is entered into the GL both manually and automatically, and data is often extracted from the GL into enterprise reporting systems and other downstream systems, because the GL is a treasure trove of valuable information from Accounts Receivable, Accounts Payable, Purchasing, Payroll, and the other sub-ledgers within the enterprise, as shown in the image.

[Image: Automated data entry from other systems into the General Ledger]
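
To make the sub-ledger-to-GL flow concrete, here is a hedged sketch of a control-account tie-out, the sort of check that supports Processing Integrity before GL data moves downstream. The tables, columns, and account names are illustrative assumptions, not an actual Oracle GL schema.

```python
# Hypothetical sketch: reconcile a sub-ledger total (e.g., Accounts Receivable)
# against its GL control account before the data feeds reporting systems.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ar_subledger (invoice_id TEXT, amount REAL);
    CREATE TABLE gl_balances  (account TEXT, balance REAL);
    INSERT INTO ar_subledger VALUES ('INV-1', 1200.00), ('INV-2', 800.00);
    INSERT INTO gl_balances  VALUES ('AR_CONTROL', 2000.00);
""")

subledger_total = conn.execute("SELECT SUM(amount) FROM ar_subledger").fetchone()[0]
control_balance = conn.execute(
    "SELECT balance FROM gl_balances WHERE account = 'AR_CONTROL'").fetchone()[0]

# Any difference is an early signal that downstream reports will not tie out.
if abs(subledger_total - control_balance) > 0.01:
    print(f"Out of balance: sub-ledger {subledger_total} vs GL {control_balance}")
else:
    print("AR sub-ledger ties to the GL control account")
```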

If anything goes wrong with the GL, it can have huge implications within the enterprise: if the GL is not trustworthy, then the reporting built on it is not trustworthy either, and wrong numbers might accidentally be reported to Wall Street, for which organizations can have to pay huge fines to governing entities.

Now, the question is: how and where can something go wrong with processing integrity when it comes to the General Ledger? To answer this, let’s take a look at a potential file output from the GL system in Oracle. As mentioned in the Oracle documentation, a GL file can be imported or extracted in multiple formats through automated scheduler jobs such as AutoSys, and one such format (IXGLDN) has the columns below (a small format-validation sketch follows the table).

| Field Name | Attributes | Field Description/Contents |
| --- | --- | --- |
| Run group | Alphanumeric, 12 positions | Composed of CW + the date the file was populated, in CCYYMMDD format. For example, the run group for a record created on February 19, 2003 is CW20030219. |
| Sequence number | Numeric, 6 positions | A unique sequence number assigned by the system to each record each time you generate the download file. |
| Company | Numeric, 4 positions | The first 4 positions of the Cross reference number field in Working with Chart of Accounts (WCA). If the first 4 positions include a non-numeric character, such as a hyphen (-), the system skips this character and takes the first 4 numeric values. For example, if the Cross reference number is 100-10-12345, the Company will be 1000. If the Cross reference number field does not contain numeric characters, the Company field will not be populated consistently. |
| Company old | Alphanumeric, 31 positions | The first 11 positions of the Cross reference number field in Work with Chart of Accounts (fast path = WCA). |
| Account number old | Alphanumeric, 31 positions | Positions 12-19 of the Cross reference number field in Work with Chart of Accounts (fast path = WCA). |
| Source code | Alphanumeric, 1 position | The Journal source field from the General Ledger Interface file, indicating the type of transaction that generated the posting. See Accounts Payable Chapter 41: Displaying the General Ledger Interface (DGLI) for a list of journal source codes. |
| Transaction date | Numeric, 8 positions (CCYYMMDD format) | The date this record was downloaded to the General Ledger Download file. For example, the transaction date for a record created on February 19, 1999 is 19990219. |
| Reference | Alphanumeric, 10 positions | Composed of CW + the date the download file was created. For example, the Reference for a record created on February 19, 1999 is CW19990219. |
| Description | Alphanumeric, 30 positions | Same as the Reference field. |
| Currency code | Alphanumeric, 5 positions | Blank |
| Units amount | Numeric, 15 positions with a 2-place decimal | 0.00 |
| Units amount sign | Alphanumeric, 1 position | Blank |
| Transaction amount | Numeric, 15 positions with a 2-place decimal | The Transaction amount field from the General Ledger Interface file. |
| Transaction amount sign | Alphanumeric, 1 position | + if the transaction amount is positive. - if the transaction amount is negative. |
| Base amount | Numeric, 15 positions with a 2-place decimal | The Transaction amount field from the General Ledger Interface file. |
| Base amount sign | Alphanumeric, 1 position | + if the transaction amount is positive. - if the transaction amount is negative. |
| Base rate | Numeric, 12 positions with a 6-place decimal | .000000 |
| System | Alphanumeric, 2 positions | CW |
| Program code | Alphanumeric, 5 positions | Blank |
| Auto rev | Alphanumeric, 1 position | N |
| Posting date | Numeric, 8 positions (CCYYMMDD format) | The date of the transaction from the General Ledger Interface file, converted into 8-position format. For example, the posting date for a record from February 19, 1999 is 19990219. |
| Activity | Alphanumeric, 15 positions | Blank |
| Account category | Alphanumeric, 5 positions | Blank |
| Document number | Alphanumeric, 15 positions | CW + the date the download file was created + the sequence number for this record. For example, the document number for the first record created on February 19, 1999 is CW19990219000001. |
| To base amount | Numeric, 15 positions with a 2-place decimal | 1.00 |
| To base amount sign | Alphanumeric, 1 position | + |
| Effect date | Numeric, 8 positions (CCYYMMDD format) | 0 |
| Journal book number | Alphanumeric, 12 positions | Blank |
| MX value 1 | Alphanumeric, 20 positions | Blank |
| MX value 2 | Alphanumeric, 20 positions | Blank |
| MX value 3 | Alphanumeric, 20 positions | Blank |
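
As referenced above, here is a minimal sketch of a field-level format check for an extract laid out like this table. It validates only a few fields (the run group pattern, the transaction date, and the amount sign), assumes each record has already been parsed into a Python dictionary, and uses our own key names rather than anything defined by Oracle.

```python
# Minimal sketch of the kind of format check that catches drift in a GL
# extract. Field/key names are illustrative assumptions.
import re
from datetime import datetime

def validate_gl_record(rec: dict) -> list:
    errors = []
    # Run group: "CW" + the population date in CCYYMMDD format
    if not re.fullmatch(r"CW\d{8}", rec["run_group"]):
        errors.append(f"bad run group: {rec['run_group']!r}")
    # Transaction date: must be a valid CCYYMMDD date
    try:
        datetime.strptime(rec["transaction_date"], "%Y%m%d")
    except ValueError:
        errors.append(f"bad transaction date: {rec['transaction_date']!r}")
    # Sign flag must agree with the signed amount
    expected_sign = "+" if rec["transaction_amount"] >= 0 else "-"
    if rec["transaction_amount_sign"] != expected_sign:
        errors.append("transaction amount sign does not match amount")
    return errors

record = {
    "run_group": "CW20030219",
    "transaction_date": "20030219",
    "transaction_amount": -125.40,
    "transaction_amount_sign": "+",   # deliberately wrong to show detection
}
print(validate_gl_record(record))     # -> flags the sign mismatch
```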

Whenever the version of the scheduling, ETL, or GL software is updated, there is a possibility that the files will no longer be in sync with the expected formats, resulting in wrong data sets. And even when testing teams try to achieve the desired quality, they face various challenges, such as:

  • While the principles recommend banks have one source of data, it is impossible in large banks because the data is distributed across multiple systems.
  • Manual intervention related to all the files involved in testing is unavoidable with the current set of tools and frameworks.
  • Lack of end-to-end automation that instills trust in Quality Assurance and Reporting. 
  • Lost productivity in setting up test files and data.

Teams also rely on Excel and manual eyeballing to check values and validate rules. As an example, consider the Excel file shared by the BIS as a reference to help banks adhere to the principles. It has multiple worksheets, including one called “Checks” that supervisors and other governance body members can use to validate the risk-related metrics. All of this has to be done manually today; a scripted alternative is sketched below.

[Image: The “Checks” worksheet in the BIS reference Excel workbook]
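
As a hedged illustration of how those manual checks could be scripted, the sketch below loads a “Checks”-style worksheet and compares expected versus reported figures. The file name, sheet name, and column names are assumptions for illustration; the real BIS workbook's layout would need to be mapped first.

```python
# Hedged sketch: script the comparisons that the "Checks" worksheet asks
# reviewers to do by eye. File, sheet, and column names are hypothetical.
import pandas as pd

checks = pd.read_excel("bcbs239_template.xlsx", sheet_name="Checks")

# Flag every check row where the reported figure deviates from the expected one
checks["passed"] = (checks["Expected"] - checks["Reported"]).abs() < 0.005
failures = checks.loc[~checks["passed"], ["Check", "Expected", "Reported"]]

if failures.empty:
    print("All checks passed")
else:
    print(f"{len(failures)} check(s) failed:")
    print(failures.to_string(index=False))
```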

In future blogs, we will explain how DataOps Suite can help banks achieve the following benefits and adhere to the principles laid out by organizations such as the AICPA and the BIS:

  • Elimination of manual intervention
  • End-to-end automation of test scripts
  • Extremely high performance to test billions of records in very little time
  • Integrated scheduling

Datagaps was established in 2010 with the mission of building trust in enterprise data and reports. Datagaps provides software for ETL data automation, data synchronization, data quality, data transformation, test data generation, and BI test automation. We are an innovative company focused on providing the highest customer satisfaction, and we are passionate about data-driven test automation. Our flagship solutions, ETL Validator, DataFlow, and BI Validator, are designed to help customers automate the testing of ETL, BI, database, data lake, flat file, and XML data sources. Our tools support Snowflake, Tableau, Amazon Redshift, Oracle Analytics, Salesforce, Microsoft Power BI, Azure Synapse, SAP BusinessObjects, IBM Cognos, and other data warehousing projects and BI platforms. www.datagaps.com
