Before we start, let's agree on three fundamental principles of protecting data:

1. Data is the most valuable asset your organization has (besides the folks who work for you, anyway).
2. Data is like water – it will find the path of least resistance out of its current location.
3. Given its value and portability, data must be protected zealously.

I think we can all agree on these three principles, and understand that if data does fall into the wrong hands you basically have your choice of two outcomes: a) a publicly embarrassing and expensive breach, or b) an expensive compliance violation. In either case – whether the result of intentional activity or sloppy administrative practices – the outcome is more cost, public embarrassment and even lost jobs. You don't have to look far into recent breaches to see that.

Controls and accountability must be put in place so that only the right people can access data and the systems on which that data resides. Employing a least privilege model helps achieve that and more. This blog is the first of two in a series that discusses best practices for privileged account management to prevent data loss. To illustrate those best practices, I'll use conclusions and data from a recent report from the Ponemon Institute.

Deploy least privilege across the organization, not in pockets

In the Ponemon report, we learn that 34% of IT practitioners say their organization does not enforce least privilege, and only 20% say that it does. That leaves 80% of organizations at some level of risk, and puts more than a third at an extreme level of unnecessary risk of potentially crippling breaches or compliance violations. End users recognize this attack surface themselves, with 71% saying they have too much access to confidential corporate data. If you consider an end user – or, more precisely, an end-user machine – the last mile of security, leaving that user's access on that machine unrestricted opens the door to both outside and inside threat actors.

How did we get to this point? Failed least privilege implementations, caused by a lack of scope (i.e., focusing on too narrow a population of users) or by fear of user backlash. Only an enterprise-level least privilege deployment with flexible policies based on applications – not users – can succeed. Organizations should consider deploying least privilege across their environments – from desktop to server – with full session monitoring, file integrity monitoring, auditing and password management. Centrally managed, with complete discovery, such a deployment helps organizations get control over who is accessing what.

Understand the four W's of data access

Nearly 50% of IT practitioners surveyed in the report admit they cannot determine what happened to a file when a change occurs. This is a significant gap that can cost countless hours of forensic work across potentially hundreds of systems, and the business impact of diverting valuable IT staff to that work is tangible. Organizations should consider deploying centralized auditing solutions that pinpoint the "who, what, when and where" behind changes to critical systems such as Active Directory, Exchange Server, Windows file systems and SQL Server databases, without the operational impact of native auditing.
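To make the four W's concrete, here is a minimal sketch of normalizing a raw change event into a who/what/when/where record. It assumes a generic event feed rather than any particular product's API; the field names, the normalize_event helper and the sample Active Directory event are hypothetical, purely for illustration – a real centralized auditing solution collects and correlates these fields for you at scale.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ChangeRecord:
    """Normalized 'four W's' view of a single change event."""
    who: str        # account that made the change
    what: str       # object and attribute that changed, and the new value
    when: datetime  # time of the change, in UTC
    where: str      # system where the change originated

def normalize_event(raw: dict) -> ChangeRecord:
    """Map a raw audit event (hypothetical field names) onto the four W's."""
    return ChangeRecord(
        who=raw["subject_account"],
        what=f'{raw["object_dn"]}: {raw["attribute"]} -> {raw["new_value"]}',
        when=datetime.fromtimestamp(raw["epoch_seconds"], tz=timezone.utc),
        where=raw["origin_host"],
    )

# Example: a fabricated Active Directory group-membership change.
event = {
    "subject_account": "CORP\\jsmith",
    "object_dn": "CN=Domain Admins,CN=Users,DC=corp,DC=local",
    "attribute": "member",
    "new_value": "CN=Temp Contractor,OU=Staff,DC=corp,DC=local",
    "epoch_seconds": 1700000000,
    "origin_host": "DC01.corp.local",
}
record = normalize_event(event)
print(f"{record.when:%Y-%m-%d %H:%M} {record.who} changed {record.what} on {record.where}")
```

The point of the normalized record is that an investigator can answer "what happened to this file or object?" in one query, instead of reconstructing it from native logs scattered across hundreds of systems.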
Capabilities for continuous backup enable IT to both detect and roll back changes, minimizing the risk of business disruption (a minimal sketch of that detect-and-roll-back loop appears at the end of this post).

So, an organization-wide approach to least privilege and an auditor's focus can help. Want more? Check out next week's blog to learn how granularity – and starting from the inside and working outward – can improve data security.
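Before you go, here is the promised sketch of the detect-and-roll-back idea. It works at the file level by comparing hashes against a known-good backup copy; real continuous-backup tools operate at the block or transaction level and hook filesystem events rather than polling. The paths and interval below are illustrative assumptions, not defaults of any product.

```python
import hashlib
import shutil
import time
from pathlib import Path

WATCHED = Path("/srv/share")      # directory to protect (illustrative path)
BACKUP = Path("/srv/share.bak")   # known-good copy maintained by backup jobs

def fingerprint(root: Path) -> dict[Path, str]:
    """Hash every file under root so changes can be detected by comparison."""
    return {
        p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def detect_and_roll_back() -> None:
    """Compare the live tree to the backup and restore anything that drifted."""
    good = fingerprint(BACKUP)
    live = fingerprint(WATCHED)
    for rel, digest in good.items():
        if live.get(rel) != digest:  # file was modified or deleted since backup
            print(f"rolling back {rel}")
            target = WATCHED / rel
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(BACKUP / rel, target)

if __name__ == "__main__":
    while True:  # naive polling loop; production tools react to filesystem events
        detect_and_roll_back()
        time.sleep(60)
```

Note that the sketch silently overwrites any live file that differs from the backup – exactly the rollback behavior described above, but something to try against a test directory, not production data.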