Enhancing Cybersecurity: The Critical Need for Real Observability

To combat cyber threats effectively, organizations must prioritize deep observability to detect and respond to intrusions promptly.

Andrew Wallace


Professional Tech Editor

Focuses on professional-grade hardware, software, and enterprise solutions.

The most severe security breaches often begin quietly, slipping past defenses and spreading unnoticed. The time attackers spend undetected within a system can turn a minor intrusion into a significant disaster.

Despite substantial investments in preventive measures, many organizations still struggle to identify and contain attackers once they breach the perimeter.

The issue often lies not in a lack of visibility or alerts, but in a lack of clarity. In today’s hybrid environments, resilience is less about blocking every threat and more about detecting those already present.

The Growing Gap Between Detection and Real Visibility

Security teams are increasingly overwhelmed, and the statistics illustrate this challenge. Research indicates that a typical organization faces over 2,000 alerts daily, many of which are noise that provides little value.

Analysts spend more than 14 hours each week pursuing false positives, and two-thirds of leaders acknowledge their teams cannot keep pace. Missed alerts can quickly lead to missed opportunities to thwart attackers early.

Tool complexity exacerbates the issue. While most organizations utilize multiple cloud detection and response platforms, nearly all (92%) report significant capability gaps.

More data does not inherently lead to better detection; overlapping systems create fragmented visibility and conflicting information. Without meaningful context to connect these signals, defenders are left piecing together fragments rather than seeing the complete picture.

Why Lateral Movement Remains the Attacker’s Favorite Blind Spot

The challenges of distinguishing signal from noise greatly benefit threat actors, who increasingly favor low-and-slow tactics. Once inside, they often move stealthily through the network, escalating privileges and probing for sensitive systems.

This lateral movement can escalate minor breaches into major operational crises, making it one of the hardest stages of an attack to detect.

Cyber attackers are reaping the rewards, with nearly 90% of organizations reporting incidents involving lateral movement in the past year. On average, these breaches resulted in over seven hours of downtime, with ongoing disruptions extending recovery times.

These incidents persist because east-west traffic in modern hybrid environments is poorly understood. Even when organizations believe they are monitoring internal communications effectively, almost 40% of that traffic lacks the necessary context for confident analysis.

Attackers thrive when defenders are overwhelmed, uncertain, or unable to differentiate legitimate activity from early signs of an intrusion spreading through the network.

The Importance of Observability

To effectively defend against these threats, deep observability is essential. Industry bodies, such as the UK’s National Cyber Security Centre (NCSC), have been advocating for this shift.

The NCSC emphasizes that organizations cannot hunt for threats they cannot see, and traditional indicators of compromise are insufficient. Defenders need visibility across behaviors, patterns, identities, workloads, and east-west traffic to uncover subtle signals indicating an attacker is already in motion.

This aligns closely with our analysis regarding the lack of context for internal traffic, despite widespread confidence in monitoring capabilities.

Observability must extend beyond merely collecting more logs; it requires understanding how systems relate, behave, and change over time, connecting those insights before an attacker can exploit them.

Why Context, Correlation, and Containment Must Replace Alert-Hunting

For years, security programs have responded to rising attack volumes by gathering more data. However, this approach often intensifies alert fatigue rather than alleviating it.

With large portions of network traffic lacking the context needed for meaningful investigation, analysts find themselves sifting through unprioritized alerts instead of focusing on attacker behavior.

Security teams require a connected view of their environment, not isolated signals. Contextual models, such as security graphs, help map relationships among workloads, identities, devices, and data flows.

These models transform scattered indicators into a coherent picture. A low-level alert on one system can suddenly gain significance when linked to suspicious behavior elsewhere, revealing attacker intent rather than isolated anomalies.
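The kind of correlation a security graph enables can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration, not a real product's data model: entity names and alert text are invented, and the graph is a plain adjacency set rather than a production graph database. It shows how two low-level alerts gain significance once the graph reveals their entities are directly related.

```python
from collections import defaultdict

class SecurityGraph:
    """Toy security graph: nodes are workloads/identities, edges are observed relationships."""

    def __init__(self):
        self.edges = defaultdict(set)

    def relate(self, a, b):
        # Relationships (e.g. "workload uses identity") are stored bidirectionally.
        self.edges[a].add(b)
        self.edges[b].add(a)

    def correlated_alerts(self, alerts):
        """Pair up alerts whose entities are directly connected in the graph."""
        flagged = []
        for i, (entity_a, alert_a) in enumerate(alerts):
            for entity_b, alert_b in alerts[i + 1:]:
                if entity_b in self.edges[entity_a]:
                    flagged.append((alert_a, alert_b))
        return flagged

# Hypothetical environment: a frontend uses a service account that can reach a database.
g = SecurityGraph()
g.relate("web-frontend", "svc-account-42")
g.relate("svc-account-42", "billing-db")

# Three individually unremarkable alerts from different systems.
alerts = [
    ("web-frontend", "unusual outbound DNS"),
    ("billing-db", "off-hours bulk read"),
    ("svc-account-42", "privilege escalation attempt"),
]
print(g.correlated_alerts(alerts))
```

In isolation each alert might be dismissed; linked through the shared service account, the pair of correlations suggests a single unfolding intrusion rather than three anomalies.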

This shift from alert-hunting to understanding pathways is crucial for breach containment. When defenders can visualize how systems interact and identify sensitive assets, they can pinpoint the routes an attacker is likely to take.

This clarity enables teams to act decisively, slowing or halting lateral movement before it spreads.
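Identifying likely attacker routes is, at its core, a path-finding problem over the same relationship data. The sketch below assumes a hypothetical reachability map (all names invented) and uses a breadth-first search to surface the shortest route from a compromised host to a sensitive asset, the route defenders would want to cut first.

```python
from collections import deque

# Hypothetical reachability map: which systems each system can talk to.
reachability = {
    "web-frontend": ["app-server"],
    "app-server": ["cache", "svc-account-42"],
    "svc-account-42": ["billing-db"],
    "cache": [],
    "billing-db": [],
}

def shortest_attack_path(graph, start, target):
    """Breadth-first search for the shortest route from start to target."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # target unreachable: that segment is already contained

print(shortest_attack_path(reachability, "web-frontend", "billing-db"))
```

If the compromised frontend can only reach the billing database through one service account, revoking or isolating that account severs the path, which is exactly the kind of decisive containment the clarity above enables.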

AI, Automation, and Scaling Human Judgment Responsibly

As environments expand, the volume and complexity of security data have surpassed what human analysts can manage alone. This is where AI and automation become vital.

Many organizations are leveraging AI and machine learning to enhance detection accuracy and accelerate response times, viewing these capabilities as essential for identifying lateral movement earlier and alleviating alert fatigue.

However, it’s a misconception that investing in AI will resolve all issues independently. AI is most effective when it complements, rather than replaces, human expertise. Automated systems can correlate signals across hybrid environments, enrich them with context, and filter out noise, providing analysts with a more precise starting point.

For instance, AI-powered analysis can continuously plot every workload and connection in real time, highlighting behavioral patterns that would be impossible to surface manually.

This creates a force multiplier, enabling faster, more confident decisions and supporting the rapid containment that modern resilience demands.

AI-powered security graphs can connect seemingly disparate network events, establishing a cohesive narrative where busy human analysts might only see isolated alerts.

For example, a workload accessing a database it has never accessed before could be traced back to a misconfigured identity, revealing an attack path already being exploited.
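The "never accessed before" signal in that example reduces to a baseline comparison. The following sketch is a deliberately simplified illustration with invented names: it keeps a set of known (workload, target) access pairs and flags the first occurrence of any pair outside that baseline, the kind of behavioral deviation an AI-driven system would surface automatically at scale.

```python
# Hypothetical baseline of (workload, data store) pairs observed during normal operation.
known_access = {
    ("web-frontend", "sessions-db"),
    ("reporting-job", "billing-db"),
}

def flag_novel_access(events, baseline):
    """Return events where the (workload, target) pair has never been seen before."""
    novel = []
    for workload, target in events:
        if (workload, target) not in baseline:
            novel.append((workload, target))
            baseline.add((workload, target))  # learn the pair so repeats are not re-flagged
    return novel

events = [
    ("web-frontend", "sessions-db"),  # routine, matches baseline
    ("web-frontend", "billing-db"),   # first-time access: worth investigating
]
print(flag_novel_access(events, set(known_access)))
```

A real system would of course model far richer behavior than exact pairs, but the principle is the same: the deviation only stands out because a baseline of normal relationships exists to compare against.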

What This Means for Business and IT Leaders

For leaders, the path forward begins with recognizing that resilience hinges on what occurs after an attacker gains access. This necessitates investing in observability across the entire hybrid estate, encompassing not only perimeter logs but also identities, workloads, cloud services, and east-west traffic that illuminate how attacks unfold.

It also requires shifting detection strategies toward behaviors and relationships, supported by threat hunting and hypothesis-driven investigation. Automation and AI can assist, but only when grounded in high-quality, contextualized data.

Ultimately, success should be gauged not by the number of blocked threats, but by how swiftly organizations can detect, contain, and recover from an intrusion.

In today’s fast-paced, hybrid environments, breaches are an inevitable reality. What truly matters is how quickly an organization can identify when something is amiss and mitigate the impact.


This article was produced as part of our publication's Expert Insights channel, showcasing the best and brightest minds in the technology industry today. The views expressed here are those of the author and do not necessarily reflect those of our publication. If you are interested in contributing, find out more here.
