Digital transformation accelerated at breakneck speed through the internet age and exploded with mobile technology, cloud computing, microservices, AI, and IoT. These shifts have dramatically reshaped the data landscape, and today enterprises don't just struggle with compounding data volumes; they battle velocity, variety, and orchestration across hybrid environments.
Onum's co-founder and CTO, Lucas Varela, experienced these challenges firsthand managing cybersecurity operations at CaixaBank. The problem was painfully clear: "When you're at a certain stage in your data maturity journey, you realize you don't have enough money or resources—both human resources and computational—to deal with all your data. The data keeps growing because each time you contract new SaaS services, expand your data center, or implement new use cases, you generate more data that needs to be managed."
This complexity manifests in several ways:
Proliferating tool ecosystems: The average organization now uses 76+ security tools, creating a tangled web of data dependencies and integrations. That figure doesn't include the hundreds of other SaaS platforms organizations rely on across departments.
Cross-functional misalignment: Different business units require specific data formats for different outcomes, yet cross-functional coordination rarely exists, even as new data use cases proliferate constantly.
Resource constraints: Teams have to collect, store, interpret, and manage massive amounts of data without budget increases while delivering more value.
This reality creates a perfect storm. Organizations must determine which data needs immediate analysis, which data must be delivered to which systems in which formats, and how to treat data destined for storage or archive, all while ensuring they can act in real time when critical events occur.
The Limitations of First-Gen Pipeline Processing
First-generation data pipeline solutions follow a linear processing model where each step must complete before the next begins. While this approach worked adequately for smaller data ecosystems, it creates significant bottlenecks as complexity grows.
"In a traditional alerting approach, you need to put all the information in one point, and after that, you make queries or work with the data in that central point," explains Varela. "That worked fine historically, but creates problems now. You need to put all the information in one place, which means dealing with egress costs, networking challenges, and infrastructure scaling."
Sequential processing architectures suffer from cumulative latency issues (illustrated in the sketch after this list):
Each processing step adds a delay to the overall pipeline
Resource-intensive operations create bottlenecks that slow the entire system
Scaling requires increasingly complex configurations and infrastructure
Most critically, these architectures can't deliver millisecond-level response times
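To make the cumulative-latency point concrete, here is a minimal Go sketch of a strictly sequential pipeline. The stage names and per-stage delays are assumptions for illustration, not measurements from any real system; the point is simply that every event pays the sum of all stage latencies.

```go
package main

import (
	"fmt"
	"time"
)

// A deliberately simplified sequential pipeline: each stage must finish
// before the next begins, so per-stage latencies add up for every event.
// Stage names and delays are illustrative, not taken from any real product.
func collect(event string) string { time.Sleep(30 * time.Millisecond); return event }
func parse(event string) string   { time.Sleep(20 * time.Millisecond); return event + " [parsed]" }
func enrich(event string) string  { time.Sleep(40 * time.Millisecond); return event + " [enriched]" }
func route(event string) string   { time.Sleep(10 * time.Millisecond); return event + " [routed]" }

func main() {
	start := time.Now()
	out := route(enrich(parse(collect("login event"))))
	// Total latency is the sum of every stage: roughly 100 ms for one event.
	fmt.Println(out, "took", time.Since(start))
}
```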
Rethinking the Architecture of Data Processing
Concurrent processing flips the model. Instead of waiting for each step to finish, operations run in parallel across data streams, an architectural choice that cuts detection times from minutes to milliseconds, reduces latency, and increases throughput.
"Our point is to shift left the computation in terms of correlation," notes Varela. "You can reduce data using one action in our data pipelines, but the interesting point is that we want to move computation, correlation, and your way of thinking in terms of what you need to do with your data to get real-time insights."
This shift-left approach extends beyond data reduction through simple filtering and routing. Achieving it required significant architectural decisions, including the choice of the Go programming language for its native concurrency capabilities and a containerized architecture that enables both horizontal and vertical scaling.
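By contrast with the sequential sketch above, the example below uses Go's native concurrency primitives, goroutines and channels, to run the same kind of stages as a streaming pipeline. It is a generic illustration of the pattern rather than Onum's actual engine, with the same assumed stage names and delays as the sequential example.

```go
package main

import (
	"fmt"
	"time"
)

// stage runs in its own goroutine and streams events over channels, so while
// one event is being enriched the next is already being parsed. This is the
// generic Go pipeline pattern, not Onum's actual engine.
func stage(name string, delay time.Duration, in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for ev := range in {
			time.Sleep(delay) // simulated per-stage work
			out <- ev + " [" + name + "]"
		}
	}()
	return out
}

func main() {
	src := make(chan string)
	go func() {
		defer close(src)
		for i := 1; i <= 5; i++ {
			src <- fmt.Sprintf("event-%d", i)
		}
	}()

	start := time.Now()
	routed := stage("route", 10*time.Millisecond,
		stage("enrich", 40*time.Millisecond,
			stage("parse", 20*time.Millisecond, src)))

	for ev := range routed {
		fmt.Println(ev)
	}
	// Because stages overlap, total time approaches the slowest stage's share
	// rather than the full sum of all stage delays for every event.
	fmt.Println("all events processed in", time.Since(start))
}
```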
Unlike sequential architectures that might take minutes to detect critical events, concurrent processing handles the same scenarios in milliseconds—often the difference between detecting a security threat before or after damage occurs.
From Data to Decisions: The Real-Time Intelligence Opportunity
Perhaps the most transformative aspect of modern data processing architecture isn't just speed—it's intelligence.
"We're in a privileged position in the data world," explains Sergio Bellido, Onum's VP of Product & Engineering. "We're close to the source of the data, at the left of the full data stream. That puts us in a position not only to orchestrate data but also to put intelligence on top of it."
This privileged position enables Onum to:
Trigger alerts and surface intelligence at the edge before data reaches downstream analytics platforms.
Deliver the right data, in the right format, to every system, automatically.
Take full control of your data—route it anywhere, from any source, with no strings attached.
Give every team the context they need—precisely when and where they need it.
Let’s look at an example of how Onum delivers.
Use Case: Detecting SSH Brute Force Attacks with Real-Time Intelligence
A perfect example of Onum's real-time data intelligence capabilities is detecting SSH brute force attacks using VPC Flow Logs. Organizations that perform lift-and-shift migrations to the cloud often leave EC2 instances with port 22 open to the internet for legitimate business and operational needs, and those instances are prime targets for attackers.
Traditional security approaches would ingest all VPC Flow Logs into a SIEM and then run queries to detect potential brute force attempts—often minutes after attacks have already succeeded. Onum's concurrent processing architecture completely transforms this approach.
Using Onum, security teams can define a data pipeline that shifts left the analytics process, providing real-time detection and response capabilities without requiring any coding or RegEx creation.
The detection pipeline identifies suspicious patterns such as high volumes of short-lived TCP connections to port 22 (50+ connections within 5 minutes), connections using different source ports from the same source IP, and unusual spikes in connection attempts over short time windows.
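As a rough illustration of that detection logic, the sketch below keeps a five-minute sliding window of short-lived connections to port 22 per source IP and flags a source once it reaches the 50-connection threshold across multiple source ports. The flow-record fields, type names, and helper functions are simplified assumptions drawn from the description above, not Onum's pipeline configuration (which, as noted, requires no coding at all).

```go
package main

import (
	"fmt"
	"time"
)

// flowRecord is a simplified VPC Flow Log entry; real records carry many more fields.
type flowRecord struct {
	srcIP    string
	srcPort  int
	dstPort  int
	start    time.Time
	duration time.Duration
}

// detector keeps a sliding window of recent SSH connection attempts per source IP.
type detector struct {
	window    time.Duration
	threshold int
	attempts  map[string][]flowRecord
}

func newDetector() *detector {
	return &detector{window: 5 * time.Minute, threshold: 50, attempts: map[string][]flowRecord{}}
}

// observe returns true when a source IP matches the brute-force pattern described
// above: 50+ short-lived connections to port 22 within a 5-minute window, arriving
// from multiple source ports on the same source IP.
func (d *detector) observe(r flowRecord) bool {
	if r.dstPort != 22 || r.duration > 3*time.Second {
		return false // only short-lived SSH connections are of interest here
	}
	cutoff := r.start.Add(-d.window)
	recent := d.attempts[r.srcIP][:0]
	ports := map[int]bool{}
	for _, prev := range d.attempts[r.srcIP] {
		if prev.start.After(cutoff) {
			recent = append(recent, prev)
			ports[prev.srcPort] = true
		}
	}
	recent = append(recent, r)
	ports[r.srcPort] = true
	d.attempts[r.srcIP] = recent
	return len(recent) >= d.threshold && len(ports) > 1
}

func main() {
	d := newDetector()
	now := time.Now()
	for i := 0; i < 60; i++ {
		r := flowRecord{srcIP: "203.0.113.7", srcPort: 40000 + i, dstPort: 22,
			start: now.Add(time.Duration(i) * time.Second), duration: time.Second}
		if d.observe(r) {
			fmt.Printf("possible SSH brute force from %s after %d attempts\n", r.srcIP, i+1)
			break
		}
	}
}
```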
What makes Onum's approach truly differentiated is its multi-layered contextual enrichment:
Real-time GeoIP enrichment: Source IPs are immediately enriched with location data to identify connections from unexpected or anomalous locations
Dynamic threat intelligence validation: IPs are checked against known bad actors from threat intelligence feeds like AbuseIPDB, AlienVault OTX, or internal denylists
Tiered response capabilities: Based on threat severity and contextual intelligence, Onum can trigger appropriate responses (sketched in code after this list):
Level 1 (High Threat): Automatically update firewall rules via API to block the offending IP and notify the security team
Level 2 (Suspicious): Alert security teams with enriched context for review before blocking
Level 3 (Informational): Log behavior for trend analysis without immediate action
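To show how such enrichment and tiered responses fit together, here is a minimal Go sketch. The GeoIP and threat-intelligence lookups are stubbed as in-memory maps standing in for real sources such as AbuseIPDB, AlienVault OTX, or an internal denylist, and the classification rules and function names are assumptions for the example rather than Onum's actual logic.

```go
package main

import "fmt"

type threatLevel int

const (
	informational threatLevel = iota // Level 3 in the tiers above
	suspicious                       // Level 2
	highThreat                       // Level 1
)

// Stubbed enrichment sources. A real pipeline would query a GeoIP database and
// threat-intelligence feeds (AbuseIPDB, AlienVault OTX, internal denylists);
// these in-memory maps are assumptions for the example.
var geoIP = map[string]string{"203.0.113.7": "unexpected-region"}
var knownBad = map[string]bool{"203.0.113.7": true}

// classify combines the behavioral detection result with contextual enrichment
// to pick a response tier. The rules here are illustrative only.
func classify(srcIP string, bruteForceDetected bool) threatLevel {
	switch {
	case bruteForceDetected && knownBad[srcIP]:
		return highThreat
	case bruteForceDetected || geoIP[srcIP] == "unexpected-region":
		return suspicious
	default:
		return informational
	}
}

// respond maps each tier to the tiered actions listed above; a real deployment
// would call a firewall API or alerting system instead of printing.
func respond(srcIP string, level threatLevel) {
	switch level {
	case highThreat:
		fmt.Println("level 1: block", srcIP, "via firewall API and notify the security team")
	case suspicious:
		fmt.Println("level 2: alert the security team with enriched context for", srcIP)
	default:
		fmt.Println("level 3: log", srcIP, "for trend analysis")
	}
}

func main() {
	ip := "203.0.113.7"
	respond(ip, classify(ip, true))
}
```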
This approach delivers fine-tuned security without disrupting legitimate operations, avoids false positives by accounting for business needs, and uses a combination of threat intelligence and behavior analytics to drive precise actions.
Most importantly, this entire detection-to-response cycle happens in milliseconds rather than minutes, enabling security teams to prevent breaches rather than just detect them after the fact.
The Future: Real-Time Data Intelligence Without Complexity
As data volumes continue to explode, the shift from batch to real-time processing isn't just a technical improvement—it's a competitive necessity. Organizations leveraging concurrent processing architecture gain significant advantages through:
Millisecond-level detection and response capabilities
Optimal resource utilization and reduced infrastructure costs
Enhanced ability to extract immediate value from their data
Freedom from vendor lock-in through agnostic data routing
"Data is the currency that drives every business," notes Sergio. "It fuels every decision, powers every transaction, and determines the speed and agility with which organizations can operate."
By reimagining data pipeline architecture for the real-time era, organizations can transform their data from a challenge into a strategic asset—delivering the right information, to the right place, at precisely the right time.