The noise is real. On that, we can agree. It started way back in history – whoops, wrong topic (shout out to all of you who know that lyric). Basic packet captures – the final arbiter of proof – started all this and have continued nonstop to this very day. Every security analyst worth their salt asks for the packet captures. Why do we have all this data? Do we need it all? With IoT today, my toaster can tell me how many toast points I have burned since 2019. Do we care? Should we care? To be honest, I'm not sure I want folks to know I struggle getting my toast just right :).
Some of the blame rests squarely on the shoulders of all the security practitioners out there. How many times have we asked our partners, "Can't you just create a syslog and tell me everything I need to know?" We are part of the problem. The other part of the problem rests on our security partners (vendors). A few enterprising partners tell us, "Send me all your data. Worst case, we can store it just in case you need it someday." It's a great way to drive up licensing: the more of my data your product stores, the more I pay you to access my own data.
Today, the security vendor community is exacerbating the problem by creating point solutions to solve point problems without the big picture in mind. These point solutions lead to uncorrelated data, incomplete visibility, and terrible reporting. Last year at RSA, there were over 1,500 security vendors showing their products. Depending on which well-meaning report you read, the average number of security tools deployed to protect the enterprise is between 50 and 75. How can this be? There has got to be a better way.
What is the big picture? That's a great question. How about a security ecosystem where the underlying architecture does not care what apps (security tools) get plugged in, or who makes those apps? The architecture allows any tool to plug in, and the data gets processed correctly (reduced, de-duplicated, enriched, and normalized). The architecture shares data amongst all apps that are part of the ecosystem: not only event data, but threat data, response capabilities (think SOAR), and reporting.
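To make that processing step concrete, here is a minimal sketch of what "reduce, de-duplicate, enrich, and normalize" can look like in practice. This is not Stellar Cyber's implementation – the field names, the two hypothetical source tools ("fw" and "edr"), and the `asset_db` lookup are all illustrative assumptions – but it shows the idea: map vendor-specific fields onto one common schema, drop identical events, and attach context before anything downstream consumes the stream.

```python
# Hypothetical raw events from two different tools, each in its own shape.
raw_events = [
    {"src": "10.0.0.5", "dest_ip": "8.8.8.8", "msg": "dns query", "tool": "fw"},
    {"source_ip": "10.0.0.5", "dst": "8.8.8.8", "event": "dns query", "tool": "edr"},
    {"src": "10.0.0.5", "dest_ip": "8.8.8.8", "msg": "dns query", "tool": "fw"},  # exact repeat
]

def normalize(event):
    """Map vendor-specific field names onto one common schema."""
    return {
        "src_ip": event.get("src") or event.get("source_ip"),
        "dst_ip": event.get("dest_ip") or event.get("dst"),
        "message": event.get("msg") or event.get("event"),
        "tool": event["tool"],
    }

def deduplicate(events):
    """Drop events that are identical on the normalized key fields (reduction)."""
    seen, unique = set(), []
    for e in events:
        key = (e["src_ip"], e["dst_ip"], e["message"], e["tool"])
        if key not in seen:
            seen.add(key)
            unique.append(e)
    return unique

def enrich(event, asset_db):
    """Attach context each point tool lacks on its own (here, a hostname lookup)."""
    event["src_host"] = asset_db.get(event["src_ip"], "unknown")
    return event

# Run the pipeline: normalize -> de-duplicate -> enrich.
asset_db = {"10.0.0.5": "finance-laptop-07"}
processed = [enrich(e, asset_db) for e in deduplicate(normalize(e) for e in raw_events)]
# Three raw events collapse to two, both now in one schema with asset context.
```

The payoff is in the last line: two tools that could never compare notes now emit events a single correlation engine can reason over.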
Some of you will say that's nirvana. I suggest it exists today. If you have not explored XDR, you need to. XDR is any-data (X) detection and response. Until recently, that was not possible, despite the many claims from EDR and MDR vendors. It means getting the data right through correct data processing, applying detections (including AI and ML models) across those data streams, inspecting those streams through threat hunting and correlation, and ultimately automating responses to those detections.
Open XDR is Stellar Cyber's answer to the data problem. There are not too many alerts, nor too much data – the existing data is just not being processed correctly. The Starlight platform is our answer to too much data, too many alerts, and too many tools. The Open XDR approach solves the point-solution problem: correlated data, complete visibility, improved reporting, and ultimately faster detections and reduced dwell time. The idea of data fatigue can be put to rest – get the data right and the rest takes care of itself!
The post Myth Buster: Data Fatigue is Not Real appeared first on Cybersecurity Insiders.
January 28, 2021 at 05:57AM