A Problem Worth Solving
When asked about the origin story of Onum, Lucas Varela, CTO and co-founder, explains: "We were coming from the data management and data analytics space, so our goal was to make complex things easier for users like us. Real-time processing, automatic parsing — all the things that caused problems in our day-to-day work."
For Lucas, who previously led cybersecurity at CaixaBank, managing and extracting value from 50TB+ of daily logs across a sprawling ecosystem of systems was a constant battle.
Sergio Bellido, Onum's SVP of Product and Engineering, shares a similar perspective from his own experience: "The amount of data being sent to different systems has increased almost exponentially over the last 10 years," he explains. "There's now a pressing problem about what to do with the data. Which data do we keep for SecOps, ITOps, and observability, and what do we put in a different, cheaper place?"
Data reduction tools have helped mitigate part of this issue, but reduction alone doesn’t untangle the web of data, systems, and use cases, or put the right data in the right place at the right time.
As Sergio explains, "That's why orchestration tools are becoming a must, to deal with that complexity. You need a tool that puts in a little bit of order, orchestration, and logic on top of that to make that new data ecosystem happen."
This encapsulates why Onum was created and intentionally architected: to solve the challenge of modern data processing at scale, in real-time, without breaking the bank, by moving critical data processing upstream, closer to where data is generated, rather than waiting to process it further downstream, which adds latency and complexity and limits options for optimal data storage.
When they set out to build Onum, the founders knew the solution would need to be highly performant and scalable, add little to no overhead for users, and address a multitude of growing and changing use cases. That meant the architecture for the Onum Platform was no trivial decision. Existing platforms weren't optimized to solve these issues without complexity and cost. A more modern architecture was an imperative.
Architectural Foundations: Decisions That Drive Onum’s Real-time Performance
Onum is an open platform and 100% no-code, with the ability to deploy data pipelines in on-premises, cloud, or hybrid environments. To build a next-generation telemetry data management platform without creating new bottlenecks or adding unnecessarily complicated infrastructure requirements, Onum’s architecture needed to solve four fundamental challenges:
1. Custom Processing Engine for Optimal Performance
Rather than adapting existing technologies that struggled to meet the speed and flexibility required, Onum built a custom data processing engine specifically optimized for high-throughput, low-latency data operations.
"We extensively tested different tools and architectures before building our own engine, and we ran many tests to find the most resilient solution,” Lucas explains. “Data processing is our business, and our goal is to consume the least infrastructure and computation possible to be able to process millions of events per second.”
This focus on efficiency wasn't just a technical exercise; it was a strategic necessity for delivering real-time data processing at enterprise scale.
"Open source was tempting at the very beginning, but when you’re building a performant, end-to-end solution, you have to roll up your sleeves and build that yourself," said Sergio. "When you’re operating at petabyte scale, piecing together open-source components creates efficiency gaps. Our engine delivers consistent performance even with the most complex processing requirements."
Turns out, all this hard work was well worth it. In benchmark testing, the average processing time for data pipelines in Onum is around 10 milliseconds.
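The benchmark itself isn’t published, but the shape of such a measurement is easy to picture. Below is a minimal sketch using Go’s standard testing package, showing how per-event latency through a single pipeline stage might be measured; the parseEvent function is an invented stand-in for illustration, not Onum’s code:

```go
package pipeline

import (
	"strings"
	"testing"
)

// parseEvent is a stand-in for a single pipeline stage: it splits a
// raw log line into fields. Real engines do far more, but the shape
// of the measurement is the same.
func parseEvent(raw string) []string {
	return strings.Split(raw, " ")
}

// BenchmarkParseEvent reports time and allocations per event;
// `go test -bench=.` prints per-operation latency, which is how an
// end-to-end figure like "around 10 milliseconds per pipeline"
// would typically be derived across all stages.
func BenchmarkParseEvent(b *testing.B) {
	raw := `2024-01-01T00:00:00Z host=web-01 level=info msg="request served"`
	b.ReportAllocs()
	for i := 0; i < b.N; i++ {
		_ = parseEvent(raw)
	}
}
```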
2. Concurrent Processing at Scale with Go
Traditional data processing often handles events sequentially or requires substantial infrastructure to process multiple streams simultaneously. Onum addressed this by building on Go (Golang), a language designed with first-class support for highly concurrent workloads.
"We chose Go for our internal engine and backend language because it was designed for concurrency problems, and we're dealing with concurrency problems every day," Lucas explains. "We have pipelines written by our customers that are very complex in the sense that they entail different processes, processing units, different use cases, multiple destinations for sending them to, doing correlations… everything in the same pipeline."
This architectural choice enables Onum to handle millions of events concurrently across multi-dimensional pipelines, maintaining millisecond-level performance even during unexpected data spikes. This has significant advantages in scalability and efficiency.
It powers Onum’s ability to deliver real-time intelligence and identify patterns across all your data streams as they happen, enabling detection of events, issues, and scenarios that would be difficult to spot after the fact. This means:
Security teams can detect attack patterns by correlating seemingly unrelated events
Operations teams can identify cascading infrastructure problems before they cause outages
Business teams can detect transaction anomalies that indicate fraud or system issues
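Onum’s engine internals aren’t public, but the model Lucas describes maps naturally onto Go’s goroutines and channels. Here is a minimal, illustrative sketch, assuming invented Event and process types, of a single stream fanned out to concurrent workers and delivered to multiple destinations:

```go
package main

import (
	"fmt"
	"sync"
)

// Event is an illustrative stand-in for a parsed telemetry record.
type Event struct{ Raw string }

// process applies one "processing unit"; a real pipeline chains many
// of these (parsing, enrichment, correlation, filtering).
func process(e Event) Event {
	e.Raw = "processed: " + e.Raw
	return e
}

func main() {
	source := make(chan Event)
	siem := make(chan Event, 8)    // destination 1: analytics/SIEM
	storage := make(chan Event, 8) // destination 2: cheap archive

	// Fan out: several goroutines drain the same source concurrently.
	var workers sync.WaitGroup
	for w := 0; w < 4; w++ {
		workers.Add(1)
		go func() {
			defer workers.Done()
			for e := range source {
				out := process(e)
				siem <- out    // the same event can be routed
				storage <- out // to multiple destinations
			}
		}()
	}

	// Drain each destination in the background.
	var drains sync.WaitGroup
	drains.Add(2)
	go func() {
		defer drains.Done()
		for e := range siem {
			fmt.Println("siem:", e.Raw)
		}
	}()
	go func() {
		defer drains.Done()
		for e := range storage {
			fmt.Println("storage:", e.Raw)
		}
	}()

	for i := 0; i < 10; i++ {
		source <- Event{Raw: fmt.Sprintf("event-%d", i)}
	}
	close(source)
	workers.Wait() // all events processed
	close(siem)
	close(storage)
	drains.Wait() // all deliveries complete
}
```

Because goroutines are multiplexed onto a small number of OS threads and channels are cheap, this pattern scales to very high event counts without heavyweight per-stream infrastructure.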
3. API-First Design for Integration and Automation
Enterprise data environments are complex ecosystems with a variety of tools, workflows, and automation deeply embedded. Rather than requiring companies to adapt their current tooling to Onum, the platform was designed to integrate seamlessly into existing environments.
"Being API-first isn't just a technical choice—it's essential for enterprise adoption," says Sergio. “Being strict about this decision is important for us in order to integrate seamlessly with other parts of the data ecosystem or the other tools of the customer.”
This decision helps data teams because it means:
Anyone can build computational processes or data flows in real-time and connect with existing tools without disruption
Customers can easily change source or destination vendors without the pain and frustration of rip-and-replace
Users can integrate Onum into their CI/CD pipelines so they can automate testing and deployment while maintaining governance and compliance controls
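To make the CI/CD point concrete: deploying or updating a pipeline becomes a single authenticated API call that any automation tool can make. The endpoint, request schema, and token below are hypothetical stand-ins, not Onum’s documented API; they only illustrate the pattern:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// Pipeline is an illustrative request body. The real schema and
// endpoint belong to Onum's documentation; everything here is a
// hypothetical stand-in to show the CI/CD deployment pattern.
type Pipeline struct {
	Name         string   `json:"name"`
	Source       string   `json:"source"`
	Destinations []string `json:"destinations"`
}

func main() {
	body, err := json.Marshal(Pipeline{
		Name:         "firewall-logs",
		Source:       "syslog-514",
		Destinations: []string{"siem", "cold-storage"},
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	// Hypothetical endpoint and auth header, for illustration only.
	req, err := http.NewRequest(http.MethodPost,
		"https://onum.example.com/v1/pipelines", bytes.NewReader(body))
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("ONUM_API_TOKEN"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	fmt.Println("deploy status:", resp.Status)
}
```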
Onum's list of integrations continues to expand based on customer needs and ecosystem developments. Check them out here.
4. Elastic Scaling for Unpredictable Workloads
Data volumes follow both predictable patterns and unpredictable spikes. Retail operations might see 10x normal volume during peak shopping seasons like Black Friday, while security teams can face sudden surges during an attack, exactly when a lack of elasticity can leave their tools unavailable under the load.
"Data behaves in seasonal patterns with unpredictable spikes that can overwhelm ingestion systems," notes Sergio. "That was another design principle for us—to be able to scale in an agile manner so we can cope with those unexpected amounts of data coming into an industry."
Onum uses a containerized architecture to enable both vertical and horizontal scaling to account for these unpredictable demands, automatically adjusting resource allocation to maintain performance during peak loads.
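In Onum’s case that elasticity lives at the container-orchestration layer, but the feedback loop behind it can be sketched in a few lines of Go: watch the backlog, add capacity when it grows. A toy illustration under those stated assumptions, not Onum’s implementation (scale-down and orchestration details are omitted for brevity):

```go
package main

import (
	"fmt"
	"time"
)

// worker drains events from the shared queue; the Sleep stands in
// for real per-event processing cost.
func worker(queue chan int) {
	for range queue {
		time.Sleep(5 * time.Millisecond)
	}
}

// scaleUp is a toy autoscaling loop: it watches the backlog and adds
// workers while demand is high. In production this decision lives in
// the container orchestrator, but the feedback loop is the same idea.
func scaleUp(queue chan int, maxWorkers int) {
	workers := 1
	go worker(queue)
	for range time.Tick(100 * time.Millisecond) {
		if len(queue) > cap(queue)/2 && workers < maxWorkers {
			workers++
			go worker(queue)
			fmt.Printf("backlog=%d, scaled to %d workers\n", len(queue), workers)
		}
	}
}

func main() {
	queue := make(chan int, 100)
	go scaleUp(queue, 8)

	// Simulate an unexpected spike: a burst of events arrives at once.
	for i := 0; i < 100; i++ {
		queue <- i
	}
	time.Sleep(2 * time.Second) // let the grown pool drain the burst
}
```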
The Customer Impact of Onum’s Architecture
The architectural decisions that form Onum's foundation were each specifically designed to address the data challenges that plague modern enterprises. But what do these technical choices mean for customers in practice? The impact translates directly into tangible benefits that solve the problems Lucas, Sergio, and countless security, IT, and data professionals face every day.
From Data Overload to Actionable Intelligence
Onum's custom processing engine and ability to process countless pipelines concurrently reduce the cognitive burden of data management. Security teams no longer need to choose which data to analyze based on storage or processing limitations. Instead, they can identify attack patterns by correlating seemingly unrelated events in real-time—turning what was once an overwhelming flood of information into precise, actionable intelligence.
Measurable Efficiency Gains
By processing data at the source and filtering out noise before it reaches expensive analytics platforms, organizations are seeing dramatic reductions in their data management costs:
Storage costs reduced by up to 50% by intelligently routing only valuable data to premium storage
Alert response times cut from minutes to milliseconds
Data migration timelines shortened by 50%, freeing up valuable engineering resources
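The mechanism behind numbers like these is conceptually simple: decide what an event is worth before it leaves the edge, then drop, route, or archive it accordingly. A sketch of that pattern in Go; the severity rules and destination tiers here are invented for illustration, not Onum’s policies:

```go
package main

import "fmt"

// route decides where an event belongs before it ever reaches a
// downstream platform: drop noise, send high-severity events to the
// premium analytics tier, archive the rest cheaply. The rules are
// invented for illustration; real policies are defined per pipeline.
func route(severity string) string {
	switch severity {
	case "debug", "trace":
		return "drop" // noise: never pay to store it
	case "alert", "critical":
		return "siem" // premium, real-time analytics tier
	default:
		return "cold-storage" // cheap archive for compliance
	}
}

func main() {
	for _, sev := range []string{"debug", "info", "critical", "trace"} {
		fmt.Printf("%-8s -> %s\n", sev, route(sev))
	}
}
```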
Breaking Down Silos Without Breaking the Bank
The API-first design and agnostic routing capabilities mean that teams across the organization can access and activate the data they need without duplicating storage or processing. Operations teams, security analysts, and business units each get data where, when, and how they need it—transformed and formatted for their specific tools—while the organization maintains a single source of truth.
From Technical Features to Business Outcomes
Ultimately, what matters isn't the technology itself but what it enables. Onum's architectural choices have transformed how organizations relate to their data:
Instead of spending time reconciling data across disparate systems, teams now focus on extracting in-stream insights and taking real-time action
Rather than discovering problems minutes or hours after they occur, organizations can respond in real-time when milliseconds matter
Instead of making costly "store everything" decisions, businesses can make intelligent choices about data based on its actual value
By solving the fundamental architectural challenges of modern data processing, Onum has enabled organizations to transform their data from an overwhelming management burden into a strategic asset that drives real-time decision making across the enterprise.
The Future of Enterprise Data Processing
"We're in a privileged position in the data world," Sergio explains. "We're close to the source of the data — to the left of the full data stream. This puts us in a position not only to orchestrate data but also to apply intelligence before it reaches downstream systems like SIEMs or data lakes."
Onum’s architecture offers a path forward for organizations struggling with the complexity, cost, and resource requirements of traditional data processing.
"We're just beginning to see how real-time data intelligence transforms business operations," says Sergio. "Companies that continue with post-processing analytics will increasingly find themselves at a disadvantage as competitors gain the ability to act instantly on their data."
For enterprises dealing with terabyte- or petabyte-scale data processing challenges, the shift from "analyze later" to "intelligence now" represents a fundamental business advantage in efficiency, security, and operational responsiveness. Onum is at the forefront of this change.
To see Onum’s architecture in action, let us show you what we can do with your data.