Data Auditing Software
Data auditing (also called database auditing or database activity monitoring) is one of the most promising technologies for datacenter compliance and security. In the next few posts, I am going to raise and answer questions around a central one: what is the best data auditing technology and architecture? Like most decisions shaped by the realities of an enterprise datacenter, it is a hard question with a nuanced answer. Enterprise CISOs and security practitioners, IT managers, risk managers, compliance and database practitioners, and data governance evangelists are some of the stakeholders who wrestle with data risk management and compliance. I hope this discussion helps them sort through their requirements and wade through a wide array of technology options.
The answer to what is the best data auditing technology and architecture is shaped by six key criteria:
- Application depth and coverage: how deep and how broad is its support for different applications?
- Data access monitoring architecture: where do we mount the surveillance camera?
- Policy language, the life-blood of data auditing: how do we define and implement data auditing policies?
- Auditing and analytics: how do we find the critical haystacks, and the needles within them?
- Scaling: what is the cost of data auditing across the datacenter?
- Risk management: how does the data auditing approach manage risk across the datacenter?
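To make the policy-language criterion concrete, here is a minimal, hypothetical sketch of what an expressive, code-level audit policy could look like, as opposed to a fixed menu of checkboxes. The event fields, the `dba` role, the table names, and the business-hours window are all invented for illustration; real products define their own policy models.

```python
from dataclasses import dataclass

# Hypothetical audit event shape; real data auditing products
# capture far richer context (client IP, SQL text, app user, etc.).
@dataclass
class AccessEvent:
    user: str
    role: str
    table: str
    operation: str
    hour: int  # hour of day, 0-23

# Assumed set of sensitive tables for this example.
SENSITIVE_TABLES = {"payroll", "customer_pii"}

def violates_policy(event: AccessEvent) -> bool:
    """Example policy: flag privileged reads of sensitive tables
    outside business hours (09:00-17:00)."""
    after_hours = not (9 <= event.hour < 17)
    return (
        event.role == "dba"
        and event.table in SENSITIVE_TABLES
        and event.operation == "SELECT"
        and after_hours
    )

events = [
    AccessEvent("alice", "dba", "payroll", "SELECT", 23),    # flagged
    AccessEvent("bob", "analyst", "payroll", "SELECT", 23),  # not privileged
    AccessEvent("carol", "dba", "payroll", "SELECT", 10),    # business hours
]
flagged = [e.user for e in events if violates_policy(e)]
print(flagged)  # -> ['alice']
```

The point of the sketch is composability: a policy combines who, what, how, and when in one readable rule, which is exactly what menu-driven or raw-XML rule sets make painful.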
Sometimes it is easier to point out the approaches that are not up to the challenge than to identify the best one. Most technologies in the data auditing industry are what I call first-generation technologies: they have been adapted from tools or products that were meant to solve different problems. Typical first-generation data auditing technologies fail my six-point test. They offer limited breadth of coverage: they are either hard-coded to certain applications or restricted to a certain type of data, such as structured data. Their monitoring options force tough choices (pick network or agent) dictated less by the actual realities of the datacenter than by the vendor's initial product. They typically lack formal, flexible policy language support; many still rely on menu-driven selections or hard-to-use XML rule sets. They are usually event matchers or search tools suited to low-volume, non-real-time audits, but lack the analytics required to find the needles in the haystack. Scaling is usually an afterthought in these technologies, because immediate compliance projects (such as privileged user monitoring) tend to be small in scope, disguising the true scaling challenge of enterprise data auditing. Finally, there is a real gap between their output and data risk management: everyone agrees that data auditing reports on who is looking at what data, but how does that translate into risk management?
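As a rough illustration of the difference between event matching and analytics, consider the simplest possible baseline comparison: flag users whose access volume far exceeds their own history. The baselines, threshold, and event stream below are all invented for the example; real analytics engines use far more sophisticated behavioral models.

```python
from collections import Counter

# Assumed historical average rows accessed per day, per user.
baseline = {"alice": 40, "bob": 500}

# One entry per row accessed today (toy event stream).
todays_events = ["alice"] * 400 + ["bob"] * 520

counts = Counter(todays_events)
THRESHOLD = 3.0  # flag anything more than 3x the user's baseline

# Users with no baseline are skipped here; a real system would
# treat unknown users as their own class of anomaly.
anomalies = {
    user: count
    for user, count in counts.items()
    if count > THRESHOLD * baseline.get(user, float("inf"))
}
print(anomalies)  # -> {'alice': 400}
```

A plain event matcher would report both users' activity identically; even this crude baseline check surfaces that alice's 400 rows are anomalous while bob's 520 are routine.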
In the next six posts, I will share my views on each of these questions. Stay tuned.