Multi-Touch Developer Journal


Data Protection and the Linux Environment

New and innovative techniques

Organizations that gather and store critical information have to protect it. While there are tried and true techniques for data protection, there are also new and innovative ones. These new practices and tools greatly enhance an organization's ability to protect mission-critical data. Linux and Open Source users are especially challenged when trying to take advantage of much of this new technology.

We asked technology analyst Tom Petrocelli about what is new and interesting in data protection. Tom is president of Technology Alignment Partners and author of the new book Data Protection and Information Lifecycle Management.

LWM: Where is data protection going? Are there changes underway in the way we protect mission-critical data?

Tom Petrocelli: This is an exciting time for people involved in data protection, and not in the bad way that things can be exciting. Many more options, techniques, and practices have become available to IT professionals. The new technology solves a great many problems.

Three major technologies or practices are rapidly changing our ability to protect mission-critical information. First, backup is changing - dramatically. The introduction of disk-to-disk backup systems is shrinking backup windows to nearly zero and bringing restore times in line with modern service levels. With disk-to-disk systems, the traditional tape backup devices are replaced with a hard drive-based system. Using a technique called virtual tape, the disk system emulates the tape system for purposes of software compatibility. Since the disk drives are much faster than tape devices, backup and restore operations are much faster.

As I point out in my book, restore operations that may have taken four or five hours can now be done in 90 minutes or less. This is significant when you consider the requirements of a 99.999% ("five nines") service level.
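A quick back-of-the-envelope calculation (a hypothetical illustration, not taken from the book) shows why restore time matters so much at that service level: five nines of availability leaves only about five minutes of downtime per year, so even a 90-minute restore consumes many years' worth of the downtime budget.

```python
# Illustrative arithmetic: annual downtime permitted at a given
# availability level.

def allowed_downtime_minutes(availability: float) -> float:
    """Minutes of downtime per year permitted at the given availability."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a (non-leap) year
    return (1.0 - availability) * minutes_per_year

# Five nines (99.999%) allows roughly 5.26 minutes of downtime a year.
print(f"{allowed_downtime_minutes(0.99999):.2f} minutes/year")
```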

Another technology that is vastly changing backup is continuous data protection, or CDP. Continuous data protection is a technology that copies data as it's created and changed rather than at set times. When coupled with disk-to-disk backup systems, it affords a higher level of protection than traditional backups. Most CDP applications are application-specific, copying application objects such as e-mails rather than entire volumes of data. Together with disk-to-disk backup systems, CDP has the potential to provide real-time backup of important application objects.
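The core idea behind CDP can be sketched in a few lines: capture a new versioned copy whenever content changes, instead of waiting for a backup window. This is a minimal toy sketch, not any vendor's product or API; the file layout and function names are invented for illustration.

```python
# Minimal sketch of the continuous-data-protection idea: copy a file
# to a versioned backup whenever its content changes, rather than at
# a scheduled backup window.
import hashlib
import shutil
from pathlib import Path

def protect_once(source: Path, backup_dir: Path, seen: dict) -> bool:
    """Copy `source` into `backup_dir` if its content changed.

    `seen` maps paths to the content hash we last backed up.
    Returns True when a new version was captured.
    """
    digest = hashlib.sha256(source.read_bytes()).hexdigest()
    if seen.get(source) == digest:
        return False  # unchanged since the last capture
    backup_dir.mkdir(parents=True, exist_ok=True)
    # Name each version by its content hash so versions never collide.
    shutil.copy2(source, backup_dir / f"{source.name}.{digest[:8]}")
    seen[source] = digest
    return True
```

A real CDP product would hook file-system or application events instead of being called explicitly, and would typically capture application objects (mailboxes, database transactions) rather than whole files.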

The second major advance in thinking about data protection is including security as part of the data protection toolbox. There has long been a separation between the data protection world and the security world. This is strictly artificial and based on IT technical skills, not good strategy. We have to stop thinking of data protection as fixing a problem after it happens. One of the best ways to protect data is to not have anything bad happen to it in the first place.

Unfortunately, security has always been the domain of the network or server. Data protection, on the other hand, has been squarely an issue for the storage community. This has caused an important area of security to be overlooked, namely storage security. Many data storage systems are relatively unsecured. Once the server security has been breached, an intruder can pretty much do what he wants to the data on the storage devices. Worse yet, with the advent of networked storage, we created a risk multiplier. In the past, a breach would only give access to the local data stores. Now, an entire network of storage devices is at risk.

The good news is that there's more focus now on securing storage systems. First, the Storage Area Network (SAN) switch vendors have added security features to their products. The addition of virtual fabrics is one example. This was natural since they come from a network point-of-view. Lately, storage systems vendors have gotten into the act. They've been adding features such as encryption to their disk arrays. We can look at the acquisition of Decru, an encryption appliances vendor, by Network Appliance, the big Network Attached Storage (NAS) house, as evidence of the serious attitude that systems companies have taken to securing storage.

Finally, the introduction of policy-driven data protection is transforming the process of data protection. We're seeing a whole new set of best practices, backed by products that analyze data protection processes and devise rules that make them more efficient. Currently, the major focus is on data lifecycle management (DLM) and its close cousin, information lifecycle management (ILM). These are all-encompassing processes that deal with managing data over its lifetime. The focus is mostly on data retention for regulatory compliance, but there's much more to it than just that. DLM and ILM help make decisions as to the importance of data and, hence, what resources are allocated to protecting it. Why give the same level of protection to a memo about the company picnic as to the third-quarter financial report? DLM and ILM make us think differently about data and what we do with it.
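The policy-driven approach boils down to mapping data classes to protection tiers. Here is a hypothetical sketch of such a policy table; the class names, tiers, and retention periods are invented for illustration and do not come from any particular ILM product.

```python
# Hypothetical policy table: a company-picnic memo and a quarterly
# financial report get different protection tiers.

POLICIES = {
    "financial-report": {"tier": "gold", "copies": 3, "retention_years": 7},
    "engineering-doc":  {"tier": "silver", "copies": 2, "retention_years": 3},
    "office-memo":      {"tier": "bronze", "copies": 1, "retention_years": 1},
}

def protection_for(data_class: str) -> dict:
    """Look up the protection policy for a data class.

    Unclassified data falls back to the cheapest tier, reflecting the
    point that not everything deserves maximum protection.
    """
    return POLICIES.get(data_class, POLICIES["office-memo"])
```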

LWM: Is all data created equal? What are the differences between data protection for enterprise systems and personal systems?

TP: It's easy to answer "not much." There's a presumption that all the information a company creates is important. This is nonsense. An awful lot of unimportant information is created and shouldn't be backed up. Deciding what's important and what's not is the hard part. Part of the attraction of ILM is that it forces you to think about the real value of the data. That said, enterprise servers will tend to have the most mission-critical information and should have the highest levels of protection.

Unfortunately, this has led to relatively poor data protection for desktop and, especially, mobile systems. Many companies rely on individuals remembering to copy files to central file servers. This isn't a technology issue, since products exist to back up desktop hard drives, even ones at someone's house. It's really an issue of cost and process. Many IT organizations focus on protecting enterprise servers and don't pay enough attention to the desktop. This is a good example of why outsourced backup services often make sense. The challenge for Linux desktop users is finding a service or tool that works in that environment.

Personal systems present a special problem because they're not a controlled environment. System administrators can't manage an individual desktop computer, especially a mobile one, to the extent they can control a server. Important data left on a desktop or laptop can easily be lost because the mechanisms aren't in place to protect it.

LWM: In what ways do security and data protection intersect? Are there some specific areas that we should consider?

TP: As I mentioned earlier, security is part of data protection. It's always preferable for data not to be harmed in the first place, rather than to have to restore it. That said, two areas matter the most when it comes to security oriented toward data protection.

The first is access control. If you can keep people away from data they're not supposed to touch, then you eliminate an entire class of data protection problems. We tend to focus on the big things that can happen, such as natural disasters or a hard drive crash. Yet, an awful lot of data loss comes from silly little things such as someone deleting an e-mail they shouldn't have. Overwriting good data with bad data is one of the biggest problems you can have. That type of error can get written to your backups before you know it, leaving you with no good copy. Backup and restore, replication, and remote copy are great for protecting against the disasters and system failures. Access control keeps the little mistakes from destroying important data. It also keeps out the people who are intentionally trying to damage or destroy your data. A DEFAULT DENY posture, while often inconvenient, is better for data protection.
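The default-deny posture described above can be made concrete in a few lines: nothing is permitted unless an explicit grant exists. This is an illustrative toy, not a real ACL implementation; the users, resources, and grant table are invented.

```python
# Illustrative DEFAULT DENY access check: an action is allowed only
# when an explicit grant exists for that (user, resource) pair.

GRANTS = {
    ("alice", "/finance/q3-report"): {"read"},
    ("bob", "/hr/handbook"): {"read", "write"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    """Allow only actions explicitly granted; everything else is denied."""
    return action in GRANTS.get((user, resource), set())
```

Note that an unknown user, an unknown resource, or an ungranted action all fall through to the same safe answer: denied.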

One thing to remember - don't rely on server security in a storage network! Networked storage has a many-to-many relationship with servers. If a single server is breached, someone has access to many storage units. By the same token, a storage device can be accessed from many different servers. Use the access control tools available for networked storage. Even the simplest NAS devices - the single disk ones for the SOHO market - have an access control system. With SANs, always zone every port and use hard zoning. Storage devices themselves provide for LUN masking and, more and more, LUN locking. These provide additional access control for the storage system.
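Conceptually, LUN masking is a lookup: each server host bus adapter, identified by its World Wide Name (WWN), is shown only the LUNs an administrator exposed to it. The sketch below is a simplified model of that idea; the WWNs and LUN numbers are invented, and real arrays enforce this in firmware, not application code.

```python
# Sketch of LUN masking as a lookup table: each initiator WWN sees
# only the LUNs explicitly exposed to it.

LUN_MASKS = {
    "10:00:00:00:c9:2b:aa:01": {0, 1},  # database server's HBA
    "10:00:00:00:c9:2b:aa:02": {2},     # backup server's HBA
}

def visible_luns(initiator_wwn: str) -> set:
    """Return the LUNs exposed to an initiator; unknown WWNs see nothing."""
    return LUN_MASKS.get(initiator_wwn, set())
```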

The second area of security that's important to data protection is authentication. Access control rarely works well without user-level authentication. There are many well-known ways of enforcing identity in networks, servers, even applications. Not so for storage networks, especially SANs. When SANs were first developed, they were really an extension of the direct-attached model, not a complete reworking of storage for the networked environment. Authentication wasn't a part of the original thinking. In a nutshell, all servers and users are considered trusted. If someone breaks server security they won't be challenged again when they access the storage resources. Even typical storage security mechanisms such as LUN masking or zoning assume a trusted server environment. These techniques are more interested in keeping accidental damage from happening (such as a volume being overwritten by the wrong server in a shared environment) than in keeping data safe from those who might hurt it intentionally.
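Authentication schemes later grafted onto storage networks (such as DH-CHAP in Fibre Channel) follow the familiar challenge-response pattern: the initiator proves knowledge of a shared secret without ever sending it. The toy below uses HMAC-SHA256 to show the pattern; it is not the actual DH-CHAP construction.

```python
# Toy challenge-response authentication in the style of CHAP:
# the secret never crosses the wire, only a keyed hash of a
# fresh random challenge.
import hashlib
import hmac
import os

def make_challenge() -> bytes:
    """Authenticator sends a fresh random challenge."""
    return os.urandom(16)

def respond(shared_secret: bytes, challenge: bytes) -> bytes:
    """Initiator proves knowledge of the secret without revealing it."""
    return hmac.new(shared_secret, challenge, hashlib.sha256).digest()

def verify(shared_secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Authenticator recomputes the response and compares in constant time."""
    expected = respond(shared_secret, challenge)
    return hmac.compare_digest(expected, response)
```

Because the challenge is random each time, a captured response cannot be replayed against a later challenge.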

More Stories By Ibrahim Haddad

Ibrahim Haddad is a member of the management team at The Linux Foundation responsible for technical, legal and compliance projects and initiatives. Prior to that, he ran the Open Source Office at Palm, the Open Source Technology Group at Motorola, and Global Telecommunications Initiatives at The Open Source Development Labs. Ibrahim started his career as a member of the research team at Ericsson Research focusing on advanced research for system architecture of 3G wireless IP networks and on the adoption of open source software in telecom. Ibrahim graduated from Concordia University (Montréal, Canada) with a Ph.D. in Computer Science. He is a Contributing Editor to the Linux Journal. Ibrahim is fluent in Arabic, English and French. He can be reached via
