[lug] cloud recommendation

Rob Nagler nagler at bivio.biz
Fri Oct 29 16:15:38 MDT 2010


Chris McDermott wrote:
> Paul Nowosielski wrote:
>> The data I'm dealing with is rather sensitive as well. Do you
>> feel there are any security implications using these cloud services?

If you have very sensitive data, encrypt the data in your database
with a key that is itself stored encrypted, so the decrypted key
exists only in memory while your server is running.  You could even
transfer the key in, start the server, and then remove it from disk so
an offline attack on the key itself would be harder.
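Something like this sketch, just to show the shape of it (assuming
Python and the cryptography package; the file names are made up):

import os
from cryptography.fernet import Fernet

# The data-encryption key (DEK) sits on disk encrypted under a
# key-encryption key (KEK).  The KEK is copied in right before
# startup, used once, and removed, so the decrypted DEK only ever
# lives in the server's memory.
KEK_PATH = "/run/kek"                # transferred in just before startup
ENC_DEK_PATH = "/etc/myapp/dek.enc"  # DEK, encrypted under the KEK

def load_data_key():
    with open(KEK_PATH, "rb") as f:
        kek = Fernet(f.read().strip())
    with open(ENC_DEK_PATH, "rb") as f:
        dek = kek.decrypt(f.read())  # plaintext DEK, memory only
    os.remove(KEK_PATH)              # makes an offline attack on the key harder
    return Fernet(dek)

data_key = load_data_key()

def encrypt_field(value):
    # encrypt a sensitive column value before it goes into the database
    return data_key.encrypt(value.encode())

def decrypt_field(token):
    return data_key.decrypt(token).decode()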

> Absolutely.  First of all, you don't have physical control of the
> infrastructure, and  you have no idea where the datacenters are located and
> no right to inspect or audit them.

For all practical purposes, social engineering is very easy at most
data centers.  Worse yet, most data center cabinet locks are a joke.
You could get inside the data center, steal the disks, and be long
gone before anybody even noticed the site was down and called the data
center.  One day I brought my son, who was 11 at the time, to the data
center.  It took them over half an hour to figure out there was an 11
year-old in the room (without a badge, of course, even though all
visitors are supposed to sign in).  It was too bad, because he was
really helping me. :-(

If you don't use data centers, your data is less secure.  Most offices
aren't manned 7x24, don't have keycard locks, and don't have video
surveillance.

> As Trent said during his presentation,
> that rules out certain classes of data such as PHI (HIPAA requirements).
> PCI compliance might also be a stretch, if you're dealing with credit
> cards.  Basically, look *very* carefully through whatever standards and
> regulations you need to comply with before putting that data up there.

I think you would be covered if the data center is SAS70 compliant,
which AWS is:

http://aws.amazon.com/about-aws/whats-new/2009/11/11/aws-completes-sas70-type-ii-audit/

Well, if you believe the press site.

> Secondly, even if there are no regulatory reasons to worry, you will be
> transferring your data across the internet and that's always cause for
> concern.

I don't think this is an issue.  I don't see how you can say with a
straight face that your data never transits a 3rd party's network.  To
guarantee that, you could never plug in a USB key, CD, etc., and you
could never login to the server except from the console.  How does
data get into the system in the first place?

> Make sure you're encrypting credentials everywhere, and I
> recommend requiring SSH RSA or DSA keys for logging in remotely.

EC2 is a bit funny this way.  You get these credentials over the
internet, and the private key you download is itself unencrypted (no
passphrase), protected only by SSL in transit.  You can,
alternatively, upload your own public key, which makes more sense.
Also, when you login as ec2-user, that user can sudo without a
password.  I couldn't even set a password for that user.  How weird.
I deleted the user. :)
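Importing your own public key, by the way, is only a couple of lines
with boto (a sketch; the key pair name, region, and path are made up,
and it assumes your AWS credentials are in the environment):

import boto.ec2

conn = boto.ec2.connect_to_region("us-east-1")

# The public half of a key pair generated locally with ssh-keygen;
# the private key never leaves your machine, and Amazon never sees it.
with open("/home/me/.ssh/id_rsa.pub") as f:
    conn.import_key_pair("my-imported-key", f.read())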

There are other concerns that are quite serious.  When I'm done with a
computer, I destroy the disks.  I never sell or recycle the disks,
ever.  However, when you destroy an AWS instance, well, where does the
data go?  S3 encrypts the data before it hits the disk, but the key is
obviously everywhere (imagine if they lost it, or if there were a
single USB key that every request had to go through :-), so that
really isn't encryption.  Can you trust their techs?  Can you trust
them not to let the feds in?  Oh, the list is endless.
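One way to sidestep that is to encrypt with your own key before the
bytes ever leave your machine, so whatever happens to their disks
later, they only ever held ciphertext.  A sketch, again assuming the
cryptography package plus boto, with made-up bucket and file names:

import boto
from cryptography.fernet import Fernet

# This key lives only with you; it is never uploaded anywhere.
my_key = Fernet(open("/home/me/.keys/s3.key", "rb").read().strip())

bucket = boto.connect_s3().get_bucket("my-backups")

def put_encrypted(name, data):
    # data is bytes; only ciphertext ever reaches Amazon's disks
    bucket.new_key(name).set_contents_from_string(my_key.encrypt(data))

def get_decrypted(name):
    return my_key.decrypt(bucket.get_key(name).get_contents_as_string())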

My office has glass windows.  How do I protect against people crashing
through them, or taking a Sawzall and ripping through the walls till
they find what they want?  Can't put any secure data in there.

Getting back to PCI, which I've had the lovely pleasure of being
self-audited for, it's a joke.  They ask questions you can't possibly
answer yes to.

I've also been through an audit (hundreds of pages and several
conference calls) by a Fortune 50 company, and it was just a "nudge
nudge wink wink" exercise.  All they wanted to know was that we
encrypt the data and that we connect securely.  They wanted silly
things like warnings when someone logs into our computers.  They
didn't ask to see a single line of the code that does the encryption,
or any code for that matter.  Meanwhile, their employees were emailing
around the very data we were supposed to be protecting.

I think security comes down to your own personal ethics.  You have to
do the best job you think you can within your own conscience.  How
much do you pay people so they won't betray you?  How much time do you
spend on security vs creating features that add value?  Weinberg's
Absence of Errors Fallacy comes to mind:

     Though copious errors guarantees worthlessness, having zero errors
     guarantees nothing at all about the value of software.

Rob


