Tata Steel is part of a global conglomerate with a total annual revenue of USD 113 billion. In Europe, Tata Steel is one of the largest steel producers, serving multiple markets worldwide including construction and infrastructure, automotive, packaging, and engineering. The Internal Audit & Assurance department at Tata Steel Europe has 22 auditors. Data analysis plays a central role in audit: testing and validating controls, identifying anomalies and trends, and quantifying potential risks and savings.
To improve its processes, Tata Steel Europe looked at different areas within the company where data analytics could be applied. The motivation: do more with the available data and move reporting away from labor-intensive and error-prone Microsoft Excel. Below are three examples of data analytics in action at Tata Steel Europe. In each case, a subject matter expert (often the lead auditor) worked with an in-house KNIME expert to carry out the data analysis.
1. Contract Management: Prevent Overcharging Services and Incorrect Tariffs
Challenge: employees of contractors could perform jobs at multiple locations in a single day, which made it difficult to see when an employee started and ended their working day.
Solution: timesheet data were imported and combined to identify employees with long working days, some exceeding 20 hours. Auditors also identified errors caused by incorrect surcharges for overtime, weekend/night shifts, and public holidays.
Why KNIME: easy to show and explain which anomalies had been detected in timesheets. Since the total population was reviewed, the exact impact of these errors could be calculated. Also, working with date and time fields in KNIME is much easier than in Excel.
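The core of the timesheet check is simple to express outside KNIME as well. The following is a minimal Python sketch of the same idea, combining per-location timesheet entries and flagging implausibly long working days; the field names and sample rows are invented for illustration and are not Tata Steel's actual data or workflow:

```python
from collections import defaultdict

# Hypothetical timesheet rows: (employee, date, location, hours billed).
# One employee can appear at several locations on the same day.
timesheets = [
    ("E001", "2021-03-01", "Site A", 9.5),
    ("E001", "2021-03-01", "Site B", 12.0),  # same day, second location
    ("E002", "2021-03-01", "Site A", 8.0),
]

# Combine all entries per employee per day, across locations
hours_per_day = defaultdict(float)
for emp, day, _loc, hours in timesheets:
    hours_per_day[(emp, day)] += hours

# Flag working days longer than a plausibility threshold
THRESHOLD = 20.0
anomalies = [(emp, day, total)
             for (emp, day), total in hours_per_day.items()
             if total > THRESHOLD]
print(anomalies)  # E001 billed 21.5 hours on 2021-03-01
```

Because the full population of timesheets is processed rather than a sample, the total financial impact of the flagged entries can be calculated exactly, which is what made the findings easy to explain to the contractors.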
2. Sales Price Analysis: Check and Control Invoices and Orders to Ensure Correct Pricing
Challenge: prices not always invoiced correctly, regular customer complaints, complex pricing structure, and various order fulfillment strategies. Furthermore, a legacy system made it difficult for sales staff to identify whether sales orders and prices were correctly recorded.
Solution: a sales-price analysis was performed by taking extracts of all sales orders and invoices raised within a given period. Prices between previous periods were compared, as well as prices of similar products. Price corrections and provisional pricing were also considered. For the identified potential erroneous prices, the root causes were identified by the sales staff, and controls were enhanced.
Why KNIME: running these workflows in KNIME is simple and takes only a few minutes. The majority of the time (up to a few hours) is spent extracting new data and checking for any new inconsistencies. The same analysis can be applied to other sales sectors by reusing sections of the workflow, which saves time.
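The invoice-versus-order part of this price check can be sketched as a simple reconciliation. This is an illustrative Python sketch only, with made-up product codes, prices, and a tolerance band standing in for the provisional-pricing and correction logic of the real workflow:

```python
# Hypothetical extract of agreed sales order prices per product
order_prices = {"coil-s355": 620.0, "sheet-dx51": 710.0}

# Hypothetical invoice lines: (invoice, product, invoiced price)
invoices = [
    ("INV-1001", "coil-s355", 620.0),
    ("INV-1002", "coil-s355", 665.0),   # deviates from the agreed price
    ("INV-1003", "sheet-dx51", 710.0),
]

# Flag invoice lines whose price deviates beyond a tolerance band
# (a band allows for rounding and provisional pricing)
TOLERANCE = 0.02  # 2%
mismatches = []
for inv, product, invoiced in invoices:
    agreed = order_prices[product]
    if abs(invoiced - agreed) / agreed > TOLERANCE:
        mismatches.append((inv, product, agreed, invoiced))
print(mismatches)
```

In the audit itself, each flagged line went back to the sales staff for root-cause analysis, and the controls around price entry were then enhanced.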
3. Product Master Data: Reduce Inconsistencies Between Customer Orders and Sales Order Confirmations
Challenge: the product master data consists of a large number of very different product specifications including dimensions, tolerances, and mechanical and chemical properties.
Solution: a traditional audit was performed first, in which a sample of twenty sales orders was manually tested and several inconsistencies were uncovered. The work was then extended to support the business by performing an extensive reconciliation of the product master data for all orders, not just the selected sample.
Why KNIME: KNIME was able to connect and extract data from existing and legacy systems using a single workflow. Furthermore, it was easy to train the manufacturing and sales staff to reperform these analyses themselves.
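The reconciliation step, scaled from twenty sampled orders to the full population, amounts to comparing each confirmed order specification against the master data. A minimal Python sketch, with hypothetical products, fields, and values chosen purely for illustration:

```python
# Hypothetical product master data: expected spec values per product
master = {
    "P100": {"thickness_mm": 1.5, "width_mm": 1250, "grade": "S355"},
    "P200": {"thickness_mm": 2.0, "width_mm": 1500, "grade": "DX51"},
}

# Hypothetical sales order confirmations with the specs actually confirmed
orders = [
    ("SO-1", "P100", {"thickness_mm": 1.5, "width_mm": 1250, "grade": "S355"}),
    ("SO-2", "P200", {"thickness_mm": 2.0, "width_mm": 1500, "grade": "DX52"}),
]

# Reconcile every order against the master data (full population,
# not just a sample of twenty)
inconsistencies = []
for so, product, confirmed in orders:
    for field, expected in master[product].items():
        if confirmed.get(field) != expected:
            inconsistencies.append((so, field, expected, confirmed.get(field)))
print(inconsistencies)  # SO-2 confirms grade DX52 instead of DX51
```

In KNIME the same comparison runs across data pulled from both current and legacy systems in one workflow, and the manufacturing and sales staff can rerun it themselves.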
Immediately after getting started with KNIME, while still experimenting with hobby projects, time savings of up to 95% were achieved in some use cases. Furthermore, comparing KNIME results with the existing Excel reports revealed several data issues and calculation errors in the Excel reports, which was an even stronger motivation to adopt KNIME.
KNIME’s visual programming environment makes data analytics accessible to people without coding or scripting backgrounds. In a data-driven culture this is essential, because non-experts can independently use the available data, enhance control procedures, and turn insights into business value. Learning KNIME does take time, and basic data analysis skills are required to build workflows. At Tata Steel Europe, it took approximately 10 hours to get familiar with the workbench and 100-200 hours to become confident in building workflows. The time investment pays off, however.

Using KNIME saves time. Workflows, or sections of workflows, are reusable and shareable, which removes the need to recreate them from scratch for every project. Repeated steps, such as mundane pre-processing tasks, can be captured as a component and used in other workflows or by other colleagues. Configuration changes can also be propagated across all instances of a component, guaranteeing consistency in business processes. All KNIME workflows are self-documenting: the knowledge is stored in the workflow itself rather than in the mind of an employee. Not everyone has to become a KNIME specialist, but with basic knowledge, even non-technical auditors can contribute meaningfully to data analytics in audit.
In his presentation at the KNIME Data Talks - Audit, Eddy van der Geest from Tata Steel Europe explains how Tata Steel is moving from Excel to KNIME to automate financial and internal auditing and to enable non-experts to work with data independently.
Download the free and open source KNIME Analytics Platform.
KNIME in Auditing
Learn how to replace spreadsheets with repeatable, secure KNIME workflows.
Find out how other companies are using KNIME to solve their data challenges.