The Pirates are Coming
It is believed that there has been some level of piracy since at least the 13th century BC. Ever since the “Sea Peoples,” we have been working to stay one step ahead of those determined to take what doesn’t belong to them. The modern front has largely shifted from protecting our goods from mysterious maritime invaders to protecting our data from hackers and accidental breaches. Data is critical to our everyday lives; it is generated by our daily activities and even by our lack of activity. We all entrust corporations with our information, either explicitly or implicitly, and hope that they are at minimum making an honest effort to keep our private information just that: private.
Data security is a balancing act between restriction and usability. After all, the most secure data would be completely restricted and therefore completely unusable. The goal is to push the probability of compromise as close to zero as the constraints of the system allow while keeping the data usable. While there is simply no such thing as an un-hackable system (we are fooling ourselves if we believe otherwise), there are best practices and methods that can reduce exposure and, ideally, increase the risk for those who intend to do harm.
The primary focus of this post will be physical data files within the constraints of a workflow. There are two primary tenets: 1) encryption at rest and 2) automatically cleaning up after ourselves.
Encryption at rest simply means that files are locked when not in use: if a file is not being actively accessed, or travelling through a stage of the workflow that encryption would prevent, the file should be encrypted. There are a number of ways to accomplish this, but the two most common are password protecting a file or PGP encrypting it. Password protection is better than nothing, but as ITWorld outlined in 2012 and many other studies have shown, it will only get you so far depending on the complexity of your password. PGP works a little differently. With PGP encryption, you generate a key pair consisting of a private key and a public key. You can send your public key to anyone you wish; they (or an automated step in a workflow) lock a file with the public key, and only you can unlock it with your private key.
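As a rough illustration, the public-key step can be automated in a workflow by shelling out to the GnuPG command line. This is a minimal sketch, not a hardened implementation: the file name and recipient below are hypothetical, and it assumes `gpg` is installed and the recipient’s public key has already been imported into the local keyring.

```python
import subprocess

def pgp_encrypt(path: str, recipient: str) -> str:
    """Encrypt `path` to `recipient`'s public key via the GnuPG CLI.

    Sketch only: assumes `gpg` is on the PATH and the recipient's
    public key is already in the keyring.
    """
    out_path = path + ".gpg"
    subprocess.run(
        ["gpg", "--batch", "--yes",
         "--encrypt", "--recipient", recipient,
         "--output", out_path, path],
        check=True,  # raise if gpg reports an error
    )
    return out_path

# Hypothetical usage inside a workflow step:
# locked = pgp_encrypt("claims_export.csv", "ops-team@example.com")
```

Because only the holder of the matching private key can recover the plaintext, the encrypted copy can safely sit at rest or move between workflow stages.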
Next is automatically cleaning up after ourselves. You may have read my earlier article about the overwhelming amount of data being created; that problem is compounded when workflows leave leftover files strewn about, and the more places data lives, the harder it is to protect. There are a number of tools that can help, but any working or intermediate files created as part of a workflow should be cleaned up, preferably through a wiping process. Wiping differs from merely deleting: deleting essentially makes the file invisible to the operating system until something else is written over its place on the drive, while wiping deletes the file and immediately overwrites its former location with useless data. This is especially critical when dealing with sensitive data such as healthcare or financial records.
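A minimal sketch of such a wiping step as a Python cleanup hook at the end of a workflow (the file names are hypothetical, and note that on SSDs and on journaling or copy-on-write filesystems the overwrite may land elsewhere on the media, so this is best-effort rather than forensic-grade):

```python
import os

def wipe(path: str, passes: int = 1) -> None:
    """Overwrite a file in place with random bytes, then delete it.

    Unlike a plain delete, which only unlinks the file and leaves its
    bytes on the drive until something happens to overwrite them, this
    writes junk over the file's contents before removing it.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # useless data over the old bytes
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to disk
    os.remove(path)

# Hypothetical cleanup of intermediate files at the end of a workflow:
# for tmp in ("decrypted_claims.csv", "join_scratch.tmp"):
#     wipe(tmp)
```

Running the cleanup automatically as the final workflow step, rather than relying on someone to remember it, is what keeps intermediate copies from accumulating in the first place.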
Tags: Data Security, PGP Encryption, Workflow Architecture