Assessment Task

Introduction

The Extract, Transform, Load (ETL) process plays a pivotal role in modern data management, especially in sectors where real-time data analysis and reporting are crucial. The banking sector, with its complex data streams and regulatory requirements, is one such domain where ETL can significantly enhance efficiency and compliance.

 

This assessment focuses on a case study involving the ETL process in the context of banking liquidity reporting in Indonesia. The case study explores how banks maintain, calculate, and monitor liquidity using the Liquidity Coverage Ratio (LCR) indicator and how the ETL process can be leveraged to automate these tasks.

 

Scenario: ETL Process in Banking Liquidity Reporting

 

Background

Financial regulators across the globe require banks to maintain specific liquidity levels to ensure financial stability and protect consumer interests. The Liquidity Coverage Ratio (LCR) is a key measure used to assess a bank’s ability to meet its short-term obligations using its most liquid assets. In the rapidly evolving financial landscape, banks must navigate complex regulatory frameworks, diverse data sources, and stringent timelines. Automation through the ETL process has emerged as a vital solution, facilitating real-time reporting, enhancing data accuracy, and ensuring timely compliance with various international and domestic regulatory bodies.
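
For context, the Basel III framework defines the LCR as the ratio of a bank's stock of high-quality liquid assets (HQLA) to its total net cash outflows over a 30-calendar-day stress horizon, with a regulatory floor of 100%. In LaTeX notation:

\text{LCR} = \frac{\text{Stock of HQLA}}{\text{Total net cash outflows over the next 30 calendar days}} \geq 100\%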

 

Problem Statement

Despite the importance of the LCR in banking regulation, many banks face challenges in efficiently calculating and reporting this critical metric. Manual processes, disparate data sources, lack of standardization, and the complexity of regulatory requirements lead to delays, inaccuracies, and inefficiencies. Banks are under pressure to adapt to dynamic regulatory environments while maintaining transparency and accuracy in reporting. The existing manual and semi-automated systems often fall short in providing a scalable, robust, and responsive solution, leading to increased risks and operational costs.

The ETL Solution

The case study proposes an innovative ETL solution tailored to the banking context in Indonesia, designed to overcome the challenges associated with LCR reporting. This comprehensive solution extracts data from various sources, including transaction logs, balance sheets, and external regulatory databases, transforms it into the required format, and loads it into a centralized reporting system. The automation not only saves time but also improves accuracy and facilitates real-time monitoring and compliance. The proposed ETL process leverages modern data integration techniques, employs advanced data validation protocols, and aligns with the specific regulatory requirements of the Indonesian banking sector. By streamlining data flow and enhancing analytical capabilities, this ETL solution contributes to informed decision-making and robust risk management within the banking industry.
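
As a rough illustration of the pattern described above, the three stages might be expressed in SQL along the following lines. The Transactions table matches the sample data set given later in this brief, while staging_transactions, lcr_staging, and lcr_report are hypothetical tables invented for this sketch; a real implementation would follow the schema designed in Part 1.

-- Extract: copy raw transaction records for the reporting period into a staging area
-- (staging_transactions is a hypothetical table used only for illustration).
INSERT INTO staging_transactions (BankID, TransactionDate, TransactionType, Amount)
SELECT BankID, TransactionDate, TransactionType, Amount
FROM Transactions
WHERE TransactionDate BETWEEN '2021-01-01' AND '2021-01-31';

-- Transform: aggregate the staged records into the monthly net-flow figures needed for reporting.
INSERT INTO lcr_staging (BankID, ReportMonth, NetFlow)
SELECT BankID,
       '2021-01',
       SUM(CASE WHEN TransactionType = 'Deposit' THEN Amount ELSE -Amount END)
FROM staging_transactions
GROUP BY BankID;

-- Load: publish the transformed figures into the centralized reporting table.
INSERT INTO lcr_report (BankID, ReportMonth, NetFlow)
SELECT BankID, ReportMonth, NetFlow
FROM lcr_staging;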

 

Part 1: Extract, Transform, and Load (ETL) Process (50 marks)

 

·       Design an ETL Process: Based on the case study and provided data set, design an ETL process that could be implemented in a real-world banking scenario.

·       Evaluate the ETL Process: Critically assess the ETL process, identify potential limitations, and propose enhancements or alternatives.

·       Using the case study provided on Moodle, discuss the relationships between entities and their cardinalities (e.g. one-to-one, one-to-many, many-to-many) and whether each relationship is optional or mandatory.

·       Use the app.diagrams.net tool to draw the conceptual model entity-relationship diagram (ER diagram).

·       Implement the ETL Process: Write SQL queries to extract, transform, and load data as per the designed process.

 

Part 2: Normalization: (30 marks)

·       Apply Normalization Techniques: Using the provided data set, apply normalization techniques to optimize the database structure.

·       Analyze Normalized Data: Interpret the normalized data and provide insights into banking liquidity trends.

 

Assessment Details

 

Part 1: Extract, Transform, and Load (ETL) Process

ER Diagram (20 points)

a.     Based on the case study, develop the Conceptual Model Enhanced Entity-Relationship diagram using the app.diagrams.net tool.

b.     Discuss the relationships between entities and their cardinalities (e.g. one-to-one, one-to-many), and state all your assumptions.

 

Data Extraction (15 points)

Write SQL queries to extract the following information:

a. List of all banks regulated by “Regulator 1.”

b. Total liquidity of “Bank A” for January 2021.
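
One possible shape for these two queries, assuming the sample Banks and Bank_Reports tables shown in Part 2, is sketched below; adapt the table and column names to the schema you design.

-- a. All banks regulated by "Regulator 1"
SELECT BankID, BankName, Location
FROM Banks
WHERE Regulator = 'Regulator 1';

-- b. Total liquidity reported by "Bank A" for January 2021
SELECT SUM(br.TotalLiquidity) AS TotalLiquidity
FROM Bank_Reports AS br
JOIN Banks AS b ON b.BankID = br.BankID
WHERE b.BankName = 'Bank A'
  AND br.ReportDate BETWEEN '2021-01-01' AND '2021-01-31';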

 

Data Transformation (20 points)

 

Write SQL queries to transform the extracted data into the following formats:

a. A summary report showing the average LCR for each regulator.

b. A monthly trend of total transactions (Deposit and Withdrawal) for each bank.
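
A possible sketch of these transformations, again assuming the sample tables from Part 2 and that dates are stored as ISO 'YYYY-MM-DD' strings (so SUBSTR can pick out the year and month; substitute your SQL dialect's date functions otherwise):

-- a. Average LCR for each regulator
SELECT b.Regulator, AVG(br.LCR) AS Average_LCR
FROM Bank_Reports AS br
JOIN Banks AS b ON b.BankID = br.BankID
GROUP BY b.Regulator;

-- b. Monthly totals of Deposit and Withdrawal transactions for each bank
SELECT BankID,
       SUBSTR(TransactionDate, 1, 7) AS TransactionMonth,
       TransactionType,
       SUM(Amount) AS TotalAmount
FROM Transactions
GROUP BY BankID, SUBSTR(TransactionDate, 1, 7), TransactionType
ORDER BY BankID, TransactionMonth;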

 

Data Loading (15 points)

 

Assume you have a new table Regulator_Summary with columns Regulator and Average_LCR. Write an SQL query to load the transformed data from Exercise 2a into this table.
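
Under the same assumptions, the load step could reuse the transformation from Exercise 2a directly, for example:

-- Load the per-regulator average LCR into the new summary table.
INSERT INTO Regulator_Summary (Regulator, Average_LCR)
SELECT b.Regulator, AVG(br.LCR)
FROM Bank_Reports AS br
JOIN Banks AS b ON b.BankID = br.BankID
GROUP BY b.Regulator;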

 

Part 2: Normalization

 

Scenario: Using the provided tables (shown below), apply normalization techniques to optimize the database structure.

Bank Reports Table

 

ReportID | BankID | ReportDate | LCR   | TotalLiquidity | TotalAssets
1        | 101    | 2021-01-31 | 105.2 | 1000000        | 1500000
2        | 102    | 2021-01-31 | 98.7  | 800000         | 1200000
3        | 101    | 2021-02-28 | 102.5 | 950000         | 1400000
4        | 103    | 2021-01-31 | 99.9  | 700000         | 1000000

Banks Table

 

BankID | BankName | Location | Regulator
101    | Bank A   | City X   | Regulator 1
102    | Bank B   | City Y   | Regulator 1
103    | Bank C   | City Z   | Regulator 2

Transactions Table

 

TransactionID | BankID | TransactionDate | TransactionType | Amount
201           | 101    | 2021-01-15      | Deposit         | 50000
202           | 101    | 2021-01-20      | Withdrawal      | 30000
203           | 102    | 2021-01-10      | Deposit         | 40000
204           | 103    | 2021-01-05      | Deposit         | 25000
205           | 101    | 2021-02-01      | Withdrawal      | 20000

 

First Normal Form (1NF) (5 points)

Convert the Bank_Reports table into First Normal Form (1NF). Explain the steps you have taken.

 

Second Normal Form (2NF) (5 points)

Further normalize the table(s) from Exercise 4 into Second Normal Form (2NF). Explain the steps you have taken.

 

 Third Normal Form (3NF) (10 points)

Further normalize the table(s) from Exercise 5 into Third Normal Form (3NF). Explain the steps you have taken.
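
As a rough guide only, a 3NF decomposition of the sample data might resemble the definitions below. Regulators is a hypothetical table introduced here to remove the transitive dependency of regulator details on BankID; your own decomposition may differ depending on the functional dependencies you identify and justify.

-- Regulator details held once, rather than repeated for every bank.
CREATE TABLE Regulators (
    RegulatorID   INT PRIMARY KEY,
    RegulatorName VARCHAR(100) NOT NULL
);

-- Each bank references its regulator instead of storing the regulator name directly.
CREATE TABLE Banks (
    BankID      INT PRIMARY KEY,
    BankName    VARCHAR(100) NOT NULL,
    Location    VARCHAR(100),
    RegulatorID INT REFERENCES Regulators(RegulatorID)
);

-- Report facts depend only on the report key and the owning bank.
CREATE TABLE Bank_Reports (
    ReportID       INT PRIMARY KEY,
    BankID         INT REFERENCES Banks(BankID),
    ReportDate     DATE NOT NULL,
    LCR            DECIMAL(6,2),
    TotalLiquidity DECIMAL(15,2),
    TotalAssets    DECIMAL(15,2)
);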

 

Interpretation and Analysis (10 points)

Interpret the normalized data and provide insights into the banks’ liquidity trends.

Propose a potential use case for the ETL process in improving banking regulations.

 

Assessment Format and Submission Guidelines

 

Format: This assessment should be presented as a formal report, focusing on the analysis and design of SQL queries related to the banking scenario. Please adhere to the following guidelines:

Structure: Include an introduction, main body (divided into appropriate sections and subsections for the ETL process and normalization), conclusion, and references.

 

SQL Queries Section: Include a dedicated section for the SQL queries:

 

·       Write SQL queries to extract the required information as specified in the assessment.

·       Provide explanations for each query, detailing how it meets the specific requirements (e.g., listing banks regulated by “Regulator 1” or calculating total liquidity for “Bank A”).

·       Include comments within the SQL code if necessary to explain complex logic.

 

Word Limit: The report should be between 1500 and 2000 words, excluding SQL queries, tables, and references (Harvard referencing).

 

Marking Rubric

 

Criteria (weight) and grade bands: HD (85-100%), D (75-84%), C (65-74%), P (50-64%), F (0-49%)

Part 1: Extract, Transform, and Load (ETL) Process

ER-Diagram (20 points)

·       HD (85-100%): The student demonstrates mastery in developing the Conceptual Model Enhanced Entity-Relationship diagram using the app.diagrams.net tool and in discussing the relationships between entities and their cardinalities.

·       D (75-84%): The student shows proficiency in developing the Conceptual Model Enhanced Entity-Relationship diagram using the app.diagrams.net tool and in discussing the relationships between entities and their cardinalities.

·       C (65-74%): The student exhibits basic skills in developing the Conceptual Model Enhanced Entity-Relationship diagram using the app.diagrams.net tool and in discussing the relationships between entities and their cardinalities.

·       P (50-64%): The student struggles with developing the Conceptual Model Enhanced Entity-Relationship diagram using the app.diagrams.net tool and with discussing the relationships between entities and their cardinalities.

·       F (0-49%): Inadequate understanding of the ER diagram and of the relationships between entities and their cardinalities.

Data Extraction (15 points)

·       HD (85-100%): The student demonstrates mastery in data extraction, accurately crafting queries and analyzing data.

·       D (75-84%): The student shows proficiency in data extraction, with minor areas for improvement in queries and analysis.

·       C (65-74%): The student exhibits basic skills in data extraction, with some errors in queries and analysis.

·       P (50-64%): The student struggles with data extraction, showing significant errors in queries and limited analysis.

·       F (0-49%): Inadequate understanding and execution of data extraction.
