Immuta defends benchmark study comparing access control policy management burdens

Apache Ranger maintainers slam unflattering cloud data security comparison with Immuta

The maintainers of Apache Ranger have criticized a report that benchmarks the open source data governance platform against proprietary rival Immuta on the effort required to implement access control policies.

The report, which is sponsored by Immuta and produced by tech analyst firm GigaOm, “paints an incorrect picture on the complexities of using Apache Ranger”, wrote Madhan Neethiraj, a project management committee (PMC) member for Apache Ranger, in a blog post.

Citing “a number of errors and inconsistencies”, Neethiraj said he wanted to set the record straight “for the benefit of existing and potential users of Apache Ranger”.

‘Superior’ ABAC controls

However, Immuta, which automates data access control across an organization’s cloud data infrastructure, endorsed GigaOm’s findings.

“This analysis showed that Immuta’s ABAC controls are superior to Ranger’s controls for the scenarios tested,” Immuta PR director Joe Madden told The Daily Swig.


“The research is further validated by the GigaOm Radar Report for Data Governance Solutions published last week that recognizes Immuta as a leader and the most innovative data access control platform – ahead of all other data access control solutions.”

GigaOm said it benchmarked Apache Ranger against Immuta using “a reproducible test that included a standardized, publicly available dataset and a number of access control policy management scenarios based on real world use cases we have observed for cloud data workloads”.

Overlooking lookup tables

Apache Ranger, used to manage data security across the Hadoop platform, was tested both with and without the Apache Atlas data governance tool.

However, Neethiraj questioned the methodology that was used for gauging policy management burdens imposed in various scenarios.

For instance, he highlighted six requirements that can be fulfilled by Apache Ranger with either no policy changes, a single row-filter policy, or two policies that use lookup tables – “a common practice in enterprises”, he said.
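
To illustrate the mechanism Neethiraj describes – this is a minimal sketch, not a policy taken from the report – a single Ranger row-level filter policy can delegate per-user entitlements to a lookup table, so that later access changes become data changes rather than policy changes. The payload below is based on Ranger’s public policy REST API; the service name, database, table, the region_access lookup table, and the credentials are all hypothetical and would need adapting to a real deployment.

```python
# Minimal sketch: one Apache Ranger row-level filter policy whose Hive filter
# expression consults a lookup table, so new grants are made by inserting rows
# into that table instead of editing policies.
# Assumptions: Ranger's public policy API (POST /service/public/v2/api/policy),
# a Hive service registered as "hive_service", and a hypothetical
# sales_db.region_access lookup table mapping user names to regions.
import json
import requests

RANGER_URL = "http://ranger-admin:6080"   # assumed Ranger admin host/port
AUTH = ("admin", "admin")                 # demo credentials only

policy = {
    "service": "hive_service",
    "name": "orders_row_filter_via_lookup",
    "policyType": 2,                      # 2 = row-level filter policy
    "resources": {
        "database": {"values": ["sales_db"]},
        "table": {"values": ["orders"]},
    },
    "rowFilterPolicyItems": [
        {
            "groups": ["analysts"],
            "accesses": [{"type": "select", "isAllowed": True}],
            "rowFilterInfo": {
                # Each user sees only the regions listed against their login
                # in the lookup table.
                "filterExpr": (
                    "region IN (SELECT region FROM sales_db.region_access "
                    "WHERE user_name = current_user())"
                )
            },
        }
    ],
}

resp = requests.post(
    f"{RANGER_URL}/service/public/v2/api/policy",
    auth=AUTH,
    headers={"Content-Type": "application/json"},
    data=json.dumps(policy),
)
resp.raise_for_status()
print("Created policy id:", resp.json().get("id"))
```

With a policy shaped like this, granting a manager visibility of additional regions means inserting a row into the lookup table rather than touching Ranger policies at all – which is the crux of the counting dispute.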

However, by overlooking lookup tables, GigaOm inflated the number of policy changes required for Apache Ranger in every case, said Neethiraj, including the 102 changes cited for sharing additional data with managers.

De-identification policies “may not be feasible” at all for Apache Ranger, according to GigaOm. Not so, countered Neethiraj, who said implementation was possible via data masking policies.
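
As a rough sketch of the kind of masking policy Neethiraj points to – again with hypothetical names, and the same assumed REST endpoint as above – Ranger’s built-in mask types such as MASK_HASH can de-identify a column for a chosen group:

```python
# Minimal sketch: an Apache Ranger data-masking policy that de-identifies a
# column by hashing it for one group of users. Field names follow Ranger's
# public policy API; the service, table, and column names are hypothetical.
import json
import requests

RANGER_URL = "http://ranger-admin:6080"
AUTH = ("admin", "admin")                 # demo credentials only

masking_policy = {
    "service": "hive_service",
    "name": "mask_customer_email_for_analysts",
    "policyType": 1,                      # 1 = data-masking policy
    "resources": {
        "database": {"values": ["sales_db"]},
        "table": {"values": ["customers"]},
        "column": {"values": ["email"]},
    },
    "dataMaskPolicyItems": [
        {
            "groups": ["analysts"],
            "accesses": [{"type": "select", "isAllowed": True}],
            "dataMaskInfo": {"dataMaskType": "MASK_HASH"},  # built-in hash mask
        }
    ],
}

resp = requests.post(
    f"{RANGER_URL}/service/public/v2/api/policy",
    auth=AUTH,
    headers={"Content-Type": "application/json"},
    data=json.dumps(masking_policy),
)
resp.raise_for_status()
print("Created masking policy id:", resp.json().get("id"))
```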

Apples with oranges

The report was also criticized in a blog post published by Privacera, a data access governance vendor where Neethiraj is vice president of software architecture.

Privacera, which uses Apache Ranger as its underlying access control engine, slammed the study for apparently failing to use Apache Ranger in accordance with its design principles, for using terminology that complicates “apples-to-apples comparison”, and for applying only a single, flawed criterion to evaluate the platforms.

GigaOm concluded that Ranger and Apache Atlas required 63 times as many policy changes as Immuta across 15 access control scenarios overall.


However, Privacera claimed that “proper usage of Apache Ranger” reduced the total policy count “from GigaOm’s inflated 603 to just 28”.

It said GigaOm “inflated ownership costs based on that error” and neglected to mention that Apache Ranger is free to use, while Immuta’s proprietary software “could cost hundreds of thousands [of] dollars a year in licensing”.

GigaOm said the “study exposed the limitations of extending legacy Hadoop security components into cloud use cases”.

However, Neethiraj said Apache Ranger’s open policy model and plugin architecture enabled the extension of access control to other applications, and the platform was widely accepted among “major cloud vendors like AWS, Azure, GCP”.

The report, which was published on July 19, said that “GigaOm developed the methodology and scoring” and that “certain criteria are inherently subject to judgment”.

Communication confusion

Sally Khudairi, vice president for sponsor relations at The Apache Software Foundation, told The Daily Swig she had attempted, without success, to contact GigaOm via email and online contact form multiple times, in an effort to persuade analysts to reconsider their findings.

However, she said GigaOm contacted The Apache Software Foundation on Monday (October 4) – after The Daily Swig contacted GigaOm – “stating that they had no record of our contacting them and they wanted to conduct an investigation”.

GigaOm told The Daily Swig: “We are currently investigating Apache Foundation’s attempts to reach out to us, and will respond directly to them”.

