The Mistake of Looking at Datum vs. Data in Threat Analytics
One of the strangest words in the English language is datum. It is, by definition, the singular form of data, but it is rarely used in conversation or in written documentation. It generally refers to a single point of information, or to a fixed starting point of a scale or operation. When we review security or debugging information, we often refer to single entries in a log as data when they should correctly be referred to as datum.
The term may be considered obsolete, but when it comes to security, there are many times we make critical decisions on a datum, not on data. This is where discussions of analytics and user behavior become important. It would be a mistake to base a decision about user behavior strictly on a datum. Analytics and user behavior require data, and unfortunately many technology solutions that claim to support user behavior analytics really only look at a datum, not data, when making recommendations.
Why is Data so Important in Analytics? It’s More than a Single Event
Any analytics solution that makes a recommendation based on a single piece of information is more in tune with an event monitoring solution, or SIEM, than an analytics engine. For example, a single event based on user, time, date, and location is not analytics – it's a datum. That information, correlated with other event data and processed via correlation rules, is not analytics either; that is just a correlation engine reviewing multiple events in a logical order. This technology has been around for decades.
If the events are unique and are processed via machine learning, cluster analysis, adaptive correlation engines, and similar techniques, then we could potentially have analytics. It takes more than a single event and event matching to create analytics based on variable event data. Being mindful of a vendor's analytics claims and data absorption model is key to understanding whether an analytics solution can really help you detect and resolve security anomalies.
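The distinction can be sketched with a toy example. A datum-style rule judges one event in isolation, while a data-style check compares that event against a baseline learned from many events. The login-hour history, function names, and threshold below are hypothetical illustrations, not drawn from any particular product:

```python
from statistics import mean, stdev

# Hypothetical login-hour history for one user (hours of day, 0-23).
# A single new event is a datum; the accumulated history is the data.
login_hours = [9, 9, 10, 8, 9, 11, 10, 9, 8, 10]

def single_event_rule(hour):
    """Datum-style check: a fixed rule applied to one event, no context."""
    return hour < 6 or hour > 20  # flag any off-hours login, for any user

def baseline_anomaly(hour, history, threshold=3.0):
    """Data-style check: score one event against this user's own baseline."""
    mu, sigma = mean(history), stdev(history)
    z = abs(hour - mu) / sigma  # how many standard deviations from normal
    return z > threshold

print(single_event_rule(3))              # True  - flagged blindly as off-hours
print(baseline_anomaly(3, login_hours))  # True  - far outside this user's norm
print(baseline_anomaly(11, login_hours)) # False - within normal variation
```

The fixed rule fires on any off-hours login regardless of who the user is; the baseline check only fires when an event is abnormal for that specific user's history, which is the kind of context a single datum cannot provide.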
What Good Analytics Looks Like
The central capabilities behind the PowerBroker Privileged Access Management platform, BeyondInsight, include purpose-built analytics. Clarity is an advanced threat analytics feature that enables IT and security professionals to identify the data breach threats typically missed by other security solutions. Clarity pinpoints specific, high-risk users and assets by correlating low-level privilege, vulnerability, and threat data from a variety of BeyondTrust and third-party solutions. Clarity analyzes information stored in BeyondTrust's centralized database, which contains data gathered from any or all supported solutions deployed in the customer environment, including:
- PowerBroker Password Safe: safe storage of user accounts and passwords, with automatic password rotation, session management, and workflow for securing privileged account management
- PowerBroker for Unix & Linux: user and account activity from servers
- PowerBroker Endpoint Protection Platform: IPS, IDS, anti-virus and firewall log data
- Retina CS Enterprise Vulnerability Management: vulnerability data
- Third-Party Vulnerability Scanners: imported data from Qualys, Tenable, McAfee, TripWire, and Rapid7®
Clarity then sets baselines for normal behavior, observes changes, and identifies anomalies that signal critical threats via the following steps:
- Aggregate users and asset data to centrally baseline and track behavior
- Correlate diverse asset, user and threat activity to reveal critical risks
- Measure the velocity of asset changes to flag in-progress threats
- Isolate users and assets exhibiting deviant behavior
- Generate reports to inform and align security decisions
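The baselining and velocity steps above can be illustrated with a minimal sketch. The per-asset daily change counts, function name, and z-score threshold are invented for illustration and are not Clarity's actual algorithm:

```python
from statistics import mean, stdev

# Hypothetical per-asset change counts per day over one week.
daily_changes = {
    "web01":  [2, 3, 2, 2, 3, 2, 14],  # sudden burst of changes on the last day
    "db01":   [1, 1, 2, 1, 1, 1, 1],
    "file01": [4, 5, 4, 4, 5, 4, 5],
}

def flag_velocity_anomalies(changes, threshold=3.0):
    """Flag assets whose latest change velocity deviates from their baseline."""
    flagged = []
    for asset, counts in changes.items():
        baseline, latest = counts[:-1], counts[-1]  # earlier days form the baseline
        mu, sigma = mean(baseline), stdev(baseline)
        # An asset is anomalous when its latest count is far from its own norm.
        if sigma and abs(latest - mu) / sigma > threshold:
            flagged.append(asset)
    return flagged

print(flag_velocity_anomalies(daily_changes))  # ['web01']
```

Only web01 is flagged: its burst of fourteen changes is extreme relative to its own history, while file01's consistently higher counts remain normal for that asset. This is the point of per-asset baselining – what is anomalous is defined by each asset's behavior, not by a single fixed rule.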
Because of its ability to interpret granular and diverse sets of data, Clarity enables IT and security staff to reveal previously overlooked cases of user, account, and asset risk based on real-world data – not just a single event, data point, or simple correlation. If you are looking to expand your privileged risk insights, contact us today.