
New Rules for Data Models and Security Architectures

Author: Patricia Titus, Posted 06/18/10


It shouldn’t surprise anyone that one of the key findings of the Unisys-IDC study is that consumers (that is, our workers) want to use their own consumer technology to access corporate information. Somewhere between 35 and 45 percent of the respondents are using a home PC to access corporate data and networks, and 15 to 30 percent are using a smartphone.

The traditional approach to IT security is built around the notion of “perimeter security.” We use firewalls and other technologies to create a boundary to keep the bad guys out and our data in. But now, with the consumerization of IT in full swing, we should start with the assumption that the “bad guys” are already inside our networks and that our data is seeping out through that boundary.

As security professionals, we need to embrace these technology advancements and plan accordingly. That planning starts by asking this question: “What type of data do we want to be consumerized, made available at the touch of a button?” The correct answer will ensure the right people get the right data at the right time.

There are three ramifications of this question:

  • First, you need to be protecting your data, not just your perimeter.
  • Second, not all data needs the same level of protection.
  • Third, don’t keep obsolete data.

Instead of trying to protect all of the data all of the time, which is how most IT organizations approach the data model today, you need to think in terms of data classification and segmentation. You need to protect your critical data, your crown jewels, and provide access to it based on clearly defined limitations.

Of course, you need to know what data is critical to your organization and business, where it lives, how you are securing it, and what your control rules are around the data itself.
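
To make that concrete, here is a minimal sketch, in Python, of what a classification scheme and its control rules might look like. The tier names and rule values are hypothetical; every organization will define its own.

    from enum import Enum

    class Classification(Enum):
        """Hypothetical data tiers; yours will differ."""
        PUBLIC = 1      # flows freely to workers, customers, citizens
        INTERNAL = 2    # routine business data
        CRITICAL = 3    # the crown jewels

    # Control rules keyed by classification: how many authentication
    # factors are required, and whether every access is audited.
    CONTROLS = {
        Classification.PUBLIC:   {"auth_factors": 0, "audit": False},
        Classification.INTERNAL: {"auth_factors": 1, "audit": True},
        Classification.CRITICAL: {"auth_factors": 3, "audit": True},
    }

    def controls_for(level: Classification) -> dict:
        """Look up the minimum controls required for a data tier."""
        return CONTROLS[level]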

For example, a national database tracking bad guys shouldn’t be accessible to anybody except certain law enforcement entities. Access to that critical data can be managed through multi-factor authentication: a hardware token, a password, and one of many biometric options. That is, something that I have, something that I know, and something that I am. All three together are very hard for a rogue to acquire.
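
As a rough illustration of that “all three together” rule, here is a minimal Python sketch of a three-factor check. The in-memory USERS record and the comparison logic are hypothetical stand-ins for a real token service, credential store, and biometric matcher.

    import hashlib

    def _digest(secret: str) -> str:
        # Illustrative only; real systems use salted, slow password hashing.
        return hashlib.sha256(secret.encode()).hexdigest()

    # Hypothetical in-memory record standing in for real back ends.
    USERS = {
        "agent42": {
            "token_code": "915083",               # something you have
            "password_hash": _digest("s3cret"),   # something you know
            "fingerprint_id": "fp-7731",          # something you are
        },
    }

    def authenticate(user, token_code, password, fingerprint_id) -> bool:
        """Grant access only when all three factors check out."""
        record = USERS.get(user)
        if record is None:
            return False
        return (token_code == record["token_code"]
                and _digest(password) == record["password_hash"]
                and fingerprint_id == record["fingerprint_id"])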

In addition to putting rigorous security controls around authentication, you’re going to scan for malicious code, you’re going to validate and attest your software, and you’re going to assess the security posture more frequently than you normally would.

Maybe this used to be done every three years, but now you’re going to move to continuous monitoring and run assessments far more often. The point is, you must focus your resources on that database, because it has been deemed critical.
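
Here is a minimal sketch of what that shift from periodic assessment to a continuous loop might look like in Python. The three check functions are hypothetical stand-ins for a real malware scanner, software attestation service, and posture assessor.

    import time

    def scan_for_malware():    # hypothetical stand-in
        print("scanning for malicious code...")

    def attest_software():     # hypothetical stand-in
        print("validating and attesting software...")

    def assess_posture():      # hypothetical stand-in
        print("assessing security posture...")

    CHECKS = [scan_for_malware, attest_software, assess_posture]

    def monitor(interval_seconds: int = 3600) -> None:
        """Run the full battery of checks on a fixed cycle, indefinitely,
        instead of waiting years between assessments."""
        while True:
            for check in CHECKS:
                check()
            time.sleep(interval_seconds)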

But you also have information that you want to make publicly available. While you’re going to use basic security principles to make sure your Web servers don’t get hacked or compromised, you’re not going to require biometric access to publicly available information. You want that information to flow freely to workers, customers, partners, consumers and citizens.

Therein lies the rub. Today, many IT organizations have all of their data resources sitting in the same data center, connected to the same backbone infrastructure, the same switch, the same router, the same everything. That’s a major problem in this new era. We talk all the time about data center consolidation, but we never talk about data segmentation inside the consolidated data center! That conversation must start today. The time has come to assign different levels of security to your data, depending upon what it is and who needs to see it, down to the infrastructure that supports it.
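
As a sketch of what segmentation down to the infrastructure could mean, here is a hypothetical mapping from data tier to its own network segment, so that critical data never shares a switch or subnet with public content. The VLAN numbers and subnets are invented for illustration.

    # Hypothetical network segments, one per data tier.
    SEGMENTS = {
        "public": {
            "vlan": 10, "subnet": "10.0.10.0/24", "internet_facing": True,
        },
        "internal": {
            "vlan": 20, "subnet": "10.0.20.0/24", "internet_facing": False,
        },
        "critical": {
            "vlan": 30, "subnet": "10.0.30.0/24", "internet_facing": False,
        },
    }

    def segment_for(tier: str) -> dict:
        """Place a data store in the segment that matches its tier."""
        return SEGMENTS[tier]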

As IT consumerization takes hold, we as IT professionals need to think long and hard about what it is that we want to protect and how we want to protect it, and then focus the resources and funding we have on that data. We cannot and should not try to protect everything; there’s simply too much data, and the volume keeps growing at an unbelievable rate.

Which brings me to my last point: We need to get rid of data. Trash it. Digitally shred it. Toss it. We tend to keep data forever, because nobody knows what to get rid of or what might be important tomorrow. So we end up with volumes and volumes of data, the storage and archiving of which might actually be doing us more harm than good. Data gets stale, and stale data can cause confusion. Or it sits on tapes or drives stored in someone’s garage; trust me, it’s happening.

It’s a problem that’s only going to get worse, as unimaginable amounts of corporate data find their way into consumer devices such as external hard drives and thumb drives, social networks like Facebook and Twitter, and cloud apps like Gmail and other collaboration tools.

The U.S. government has records retention rules in place, administered by the National Archives, that guide federal agencies through the process of destroying old data. Other government and non-government organizations should take a cue from this and adopt an explicit policy on data retention and destruction.
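
In code, an explicit retention policy can be as simple as a schedule plus an expiry check. Here is a minimal Python sketch, with hypothetical record types and retention periods standing in for whatever schedule your policy defines.

    from datetime import datetime, timedelta

    # Hypothetical retention schedule, in days, by record type.
    RETENTION_DAYS = {
        "web_content": 365,
        "internal_memo": 3 * 365,
        "critical_record": 7 * 365,
    }

    def is_expired(record_type: str, created: datetime) -> bool:
        """True when a record has outlived its retention period
        and should be digitally shredded."""
        age = datetime.utcnow() - created
        return age > timedelta(days=RETENTION_DAYS[record_type])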

In the era of consumer-driven IT, we need to recalibrate our approach to data security. We need to decide what type of data we want to be consumerized and made available, transparently, at the touch of a button. For data that is not to be consumerized, we need to assign and implement different levels of security based on the nature of the data.


The statements posted on this blog are those of the writer alone, and do not necessarily reflect the views of Unisys.
