
Hadoop Security: Internal or External? Why Not Both!?

The gateway pattern is really putting a secure front on the already standardized APIs found in Hadoop

I saw a conversation on Twitter today asking why we don't just embed proper security into Hadoop instead of suggesting the API gateway approach to Hadoop security that my colleague Blake proposed.  The same could be asked of any number of applications and services, but the bottom line is that we believe a two-pronged approach is best.

Internally, we have dramatically improved Hadoop's security capabilities via Project Rhino.  This enables security best practices like encryption at rest, which cannot be implemented anywhere else.  We are also working to standardize the authorization framework and to implement token-based authentication with single sign-on.  These are all core capabilities that absolutely need to be added to Hadoop's code base.
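To give a concrete flavor of what this enables, here is a minimal sketch of setting up HDFS transparent encryption (encryption at rest) as the feature eventually landed in Hadoop. It assumes a cluster (Hadoop 2.6 or later) with a Key Management Server configured and the hadoop/hdfs CLIs on the PATH; the key name and path are illustrative, not from the original post.

```python
# Minimal sketch: create an HDFS encryption zone so that everything written
# under it is transparently encrypted on disk. The key name and path are
# hypothetical; the commands are the standard HDFS transparent-encryption
# workflow.
import subprocess

def run(cmd):
    """Echo and run one CLI command, failing loudly on error."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Create an encryption key in the Hadoop Key Management Server (KMS).
run(["hadoop", "key", "create", "demo-key"])

# 2. Create the directory that will hold sensitive data.
run(["hdfs", "dfs", "-mkdir", "-p", "/secure/finance"])

# 3. Mark the directory as an encryption zone tied to that key. From here on,
#    files written under /secure/finance are encrypted at rest with no change
#    to client code.
run(["hdfs", "crypto", "-createZone", "-keyName", "demo-key",
     "-path", "/secure/finance"])
```

The point is that this kind of protection has to live inside Hadoop itself: no external proxy can encrypt the blocks sitting on the DataNodes' disks.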

The gateway approach addresses something else – the API layer.  While I agree that any application should protect against common attacks, consider this in the bigger picture.  First, consider the number of different features that Hadoop adopters may require: tokenization, data-field encryption, integration with Active Directory, mapping to OAuth for mobile applications, and so on.  It would take a staggering number of man-hours to implement all of these features within Hadoop.  Now consider the number of enterprise applications that expose APIs, and the investment required to duplicate those features within each of these application suites.  Finally, consider the job of the poor sysadmin who has to selectively enable these features consistently across everything in their domain, and of the auditor who comes along behind to check for compliance.  Add to that the (low) probability that all of these vendors implemented the features with common configuration processes…

Our façade proxy abstracts much of this functionality into an external system with an easy-to-use graphical interface.  Implementation and inspection of common security policies can be managed across all APIs within the enterprise, and more complex, custom workflows can be created and reused as well.  Finally, the gateway complements the Project Rhino work: Rhino provides a solid security foundation that the gateway can then extend in a standard fashion.
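To make the façade idea concrete, here is a toy sketch of the pattern: a tiny reverse proxy that enforces one policy (a bearer-token check) and then forwards the otherwise untouched request to WebHDFS. Everything here is invented for illustration (the upstream address, the token store, the port); a real gateway adds policy management, auditing, and directory/OAuth integration on top.

```python
# Toy façade/gateway sketch: enforce a token check in front of WebHDFS.
# Hostnames, ports, and tokens are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import Request, urlopen

UPSTREAM = "http://namenode.example.com:50070"  # assumed WebHDFS endpoint
VALID_TOKENS = {"s3cr3t-demo-token"}            # stand-in for a real token service

class GatewayHandler(BaseHTTPRequestHandler):
    """Checks one security policy, then forwards the standard WebHDFS call."""

    def do_GET(self):
        # The policy is enforced here, outside Hadoop's code base.
        auth = self.headers.get("Authorization", "")
        token = auth[len("Bearer "):] if auth.startswith("Bearer ") else ""
        if token not in VALID_TOKENS:
            self.send_response(401)
            self.end_headers()
            self.wfile.write(b"missing or invalid token\n")
            return
        try:
            # Policy passed: forward the request unchanged -- same protocol,
            # same path, no new wire format.
            with urlopen(Request(UPSTREAM + self.path)) as upstream:
                status, body = upstream.status, upstream.read()
            self.send_response(status)
            self.end_headers()
            self.wfile.write(body)
        except HTTPError as err:
            self.send_response(err.code)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8443), GatewayHandler).serve_forever()
```

The design point is the separation of concerns: the policy lives (and is audited) in one place, while Hadoop keeps speaking its standard protocol behind it.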

Part of the objection/confusion is shown here:

[embedded tweet not reproduced in this copy of the post]

I want to clarify that we are talking about standard protocols – in fact, the gateway pattern is really putting a secure front on the already standardized APIs found in Hadoop, such as WebHDFS and Stargate (HBase's REST interface). These APIs aren't new protocols; the façade pattern simply provides separation of concerns, letting the data scientists worry about data and the security folks worry about security.
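For example, a WebHDFS directory listing is the same standard REST call whether it goes straight to the NameNode or through the gateway; only the host and the credential header change. A minimal sketch, with hostnames, port, and token invented for illustration:

```python
# List a directory via the standard WebHDFS REST API, directly and through a
# hypothetical gateway. The JSON shape (FileStatuses/FileStatus/pathSuffix)
# is the documented WebHDFS response format.
import json
from urllib.request import Request, urlopen

# Straight to the NameNode's WebHDFS endpoint (HTTP port 50070 on Hadoop 1/2,
# 9870 on Hadoop 3).
direct = "http://namenode.example.com:50070/webhdfs/v1/data?op=LISTSTATUS"

# Same protocol and path through an assumed gateway, plus a bearer token.
gated = Request("https://gateway.example.com/webhdfs/v1/data?op=LISTSTATUS",
                headers={"Authorization": "Bearer s3cr3t-demo-token"})

for target in (direct, gated):
    with urlopen(target) as resp:
        listing = json.load(resp)["FileStatuses"]["FileStatus"]
        print([entry["pathSuffix"] for entry in listing])
```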
