

The role of data masking in application testing

Nurdianah Md Nur | Dec. 19, 2016
Huw Price, Vice President, Continuous Delivery, CA Technologies, talks about the importance of data masking in improving overall data security and how businesses can put it into practice.

How can such businesses be persuaded to stop using copies of live production data in application testing? Would regulation or growing consumer concerns over privacy change their minds?

Firstly, it is regulations: businesses are getting fined. Secondly, using live data stops them from being agile because they cannot outsource, so they cannot work as effectively with their vendors. For example, if a bank wants its systems to be leveraged by a company that creates financial services apps, it cannot hand over live data for testing; vendors need to be given false (masked or synthetic) data instead.
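As a rough illustration of the idea, here is a minimal Python sketch of masking a production record before sharing it with a vendor. The field names and masking rules are assumptions for illustration, not CA Technologies' actual tooling.

```python
# Minimal sketch: replace identifying fields with synthetic values before
# the record leaves the organisation. Field names are hypothetical.
import random
import string

def mask_record(record: dict) -> dict:
    """Return a copy of the record with personal fields replaced."""
    masked = dict(record)
    # Replace the real name with a clearly synthetic one.
    masked["name"] = "Customer-" + "".join(random.choices(string.digits, k=6))
    # Preserve the account number's format (length) but not its value.
    masked["account_no"] = "".join(
        random.choices(string.digits, k=len(record["account_no"])))
    # Non-identifying fields (e.g. balance) can pass through for realism.
    return masked

live = {"name": "Alice Tan", "account_no": "8812345670", "balance": 1520.75}
print(mask_record(live))
```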

Testing with live data exposes the business to that risk.

Today, DevOps is changing the entire landscape and speed is the key. DevOps is all about working code coming out of every short time period. This is where one can use synthetic or virtual data to create enough data around the problem to test it thoroughly, so that when the code is checked in and goes to production, it is well-tested.
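To make the "create enough data around the problem" point concrete, here is a minimal sketch of generating synthetic test data that enumerates the cases the code must handle, including boundary values rarely present in a random slice of live data. The schema and values are illustrative assumptions.

```python
# Minimal sketch: enumerate every status/balance combination instead of
# copying production rows. Schema is hypothetical.
from itertools import product

statuses = ["active", "suspended", "closed"]
balances = [-50.0, 0.0, 0.01, 999999.99]   # include boundary values

synthetic_accounts = [
    {"account_id": i, "status": s, "balance": b}
    for i, (s, b) in enumerate(product(statuses, balances))
]

# 12 rows cover all combinations -- coverage a sample of live data
# would not guarantee.
assert len(synthetic_accounts) == len(statuses) * len(balances)
```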

DevOps is actually changing the security profile of companies -- it forces them to think properly about development and testing.

Which industries have been avid adopters of data masking? Why is this so, and what kinds of challenge do they face in the process?

To be frank, no industry has been particularly active, and adoption has been very patchy. Government institutions, however, have mostly been a bit more systematic and have more processes in place.

There are challenges as well. For instance, in the healthcare industry, it is very difficult to mask data because it is personal and inter-related. HR data is very similar, given its long series of inter-related records.
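One common way to handle inter-related data like this is consistency-preserving (deterministic) masking, where the same real identifier always maps to the same pseudonym so that links between records survive masking. Below is a minimal sketch under that assumption; the field names and salt are hypothetical.

```python
# Minimal sketch: hash-based pseudonymisation keeps joins intact because
# identical inputs always yield identical pseudonyms.
import hashlib

SALT = b"keep-this-secret"   # hypothetical secret, stored apart from the data

def pseudonym(patient_id: str) -> str:
    digest = hashlib.sha256(SALT + patient_id.encode()).hexdigest()
    return "P-" + digest[:10]

visits = [{"patient_id": "S1234567A", "diagnosis": "..."},
          {"patient_id": "S1234567A", "diagnosis": "..."}]

masked = [{**v, "patient_id": pseudonym(v["patient_id"])} for v in visits]
# Both visits still share one (fake) patient ID, so relationships hold.
assert masked[0]["patient_id"] == masked[1]["patient_id"]
```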

Several factors, such as regulations, commercial theft and loss of brand, also act as hindrances.

In your whitepaper on test data management, you wrote about 'The Ideal Alternative', which comprises People, Process and Technology. You've covered the third point in earlier questions, but what about the first two?

When we talk about 'People': today, every developer, tester or user needs test data. This is difficult to get, as they have to manually create it, borrow it or look for it, and they may spend anywhere from 5 percent to 50 percent of their day doing that.

For Test Data Management (TDM), if one can create a very good centre of excellence, or a set of practices that people can adopt, people can go to one central place to find and ask for their test data, as opposed to looking everywhere. Once you have established it, you have to police it. The TDM team can then slowly change the process, because people really need to be rigorous about writing test data requirements. This practice is a structured and organised way of doing it.
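As a rough sketch of what a rigorous, structured test data requirement might look like when submitted to a central TDM team, here is an illustrative Python data structure; all field names are assumptions, not a real TDM product's schema.

```python
# Minimal sketch: a structured request the TDM team can track and fulfil,
# in place of an ad-hoc email. Fields are hypothetical.
from dataclasses import dataclass

@dataclass
class TestDataRequest:
    dataset: str       # which data the tester needs, e.g. "billing"
    release: str       # which release/schema version it must match
    row_count: int     # how much data is enough for the test
    masked: bool       # whether personal fields must be masked
    environment: str   # where the data should be provisioned

req = TestDataRequest(dataset="billing", release="2016.4",
                      row_count=500, masked=True, environment="uat-2")
print(req)
```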

Also, if you think of Agile teams as independent ships floating along, integration testing -- which requires generating data -- will be a problem. If someone in the data warehousing team needs the status of bills, they can go to the TDM team instead of the billing team to ask for the bills. But the release should match; hence, versioning is very important, as the data definition is always changing.
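The versioning point can be made concrete with a small sketch: the TDM team keys each data definition by release, so testers receive data shaped for the release they are actually testing. The definitions below are illustrative assumptions.

```python
# Minimal sketch: data definitions keyed by release, so a request for
# "bills" returns the shape that matches the requester's release.
bill_definitions = {
    "2016.3": ["bill_id", "amount", "status"],
    "2016.4": ["bill_id", "amount", "currency", "status"],  # field added
}

def bill_template(release: str) -> dict:
    """Return an empty bill matching the data definition for a release."""
    return {field: None for field in bill_definitions[release]}

# A team on release 2016.3 must not receive 2016.4-shaped data.
assert "currency" not in bill_template("2016.3")
assert "currency" in bill_template("2016.4")
```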

What we are doing is changing the way people think about data. The TDM team, as a central team, becomes a data dictionary, rather than data being dispersed across many teams in various formats -- spreadsheets, PDFs and so on.

 
