Onum Architecture
Seamlessly collect any type of data, in any format, from any source, and quickly route it to any destination, across every aspect of your hybrid network.
Onum cuts through data noise. Designed for hybrid, distributed environments with edge computing as a core design principle, Onum enables you to:
- Reach all events, logs, metrics, and traces across all data sources and destinations.
- Orchestrate the right logs, traces, and metrics from any source to the right destination.
- Gain insights and guidance to determine the most appropriate place to send specific datasets, based on past behavior, use case, and value.
Delivery Methods
Onum supports all major standards, such as NetFlow, Syslog, and Kafka, to orchestrate data streams to any desired destination, including popular data analytics tools such as Splunk and Devo, as well as storage environments such as S3.
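Because these are open standards, any standards-compliant client can emit data toward a listener. As a minimal sketch (the listener address and port below are hypothetical placeholders, not documented Onum defaults), a Python application could forward its logs over Syslog like this:

```python
# Minimal sketch: forwarding application logs over standard Syslog transport to a
# collector endpoint. The host and port are hypothetical placeholders for wherever a
# listener is exposed in your environment.
import logging
from logging.handlers import SysLogHandler

LISTENER_HOST = "listener.example.internal"  # hypothetical listener address
LISTENER_PORT = 514                          # standard Syslog UDP port

logger = logging.getLogger("edge-app")
logger.setLevel(logging.INFO)
logger.addHandler(SysLogHandler(address=(LISTENER_HOST, LISTENER_PORT)))

logger.info("user=alice action=login status=success")
```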
Deployment Types
The Onum Platform supports any deployment type, including on-premises, the Onum public cloud, or your own private cloud.
In a typical SaaS-based deployment, most processing activities are conducted in the cloud. Client-side components can be deployed on a Linux machine or a Kubernetes cluster for easy, flexible deployment in any environment. Onum supports all major cloud environments, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.
Edge Observability
Listeners are placed right on the edge to collect all data as close as possible to where it’s generated. Each listener then forwards the data to the Onum SaaS Management Console to be processed.
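As a conceptual illustration only (this is not Onum's listener implementation, which is deployed as a managed component), the toy Python receiver below shows the role a listener plays at the edge: accept events on a local port, as close as possible to where they are produced, before handing them off upstream.

```python
# Conceptual sketch only: a toy UDP receiver standing in for an edge listener.
# In a real deployment, received events would be batched and forwarded upstream
# for processing; here they are simply printed to stdout.
import socketserver

class SyslogUDPHandler(socketserver.BaseRequestHandler):
    def handle(self):
        data, _sock = self.request  # UDP handlers receive (datagram, socket)
        event = data.decode("utf-8", errors="replace").strip()
        print(f"received from {self.client_address[0]}: {event}")

if __name__ == "__main__":
    # Bind an unprivileged port for demonstration; standard Syslog uses UDP 514.
    with socketserver.UDPServer(("0.0.0.0", 5514), SyslogUDPHandler) as server:
        server.serve_forever()
```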
Centralized Management
The Onum SaaS Management Console receives data from the listeners, then observes and optimizes the data from all nodes, applying machine learning algorithms to determine the most appropriate action to take with each data type. All data is then sent to the proper data sink.
All of these computations occur at the edge, to maximize speed and efficiency while minimizing the infrastructure required to run them.
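As a simplified, purely illustrative sketch of that kind of routing decision (the categories and sinks below are hypothetical, and Onum's actual optimization is driven by machine learning rather than static rules), the logic amounts to classifying each event and selecting a destination for it:

```python
# Illustrative only: choose a destination sink per event based on its type and value.
# The data types and sink names are hypothetical placeholders.
from typing import Dict

ROUTES: Dict[str, str] = {
    "security": "splunk",      # high-value security events to the SIEM
    "application": "devo",     # application telemetry to the analytics platform
    "debug": "s3-archive",     # low-value verbose logs to inexpensive object storage
}

def route(event: Dict[str, str]) -> str:
    """Return the sink an event should be delivered to."""
    return ROUTES.get(event.get("type", "debug"), "s3-archive")

print(route({"type": "security", "msg": "failed login"}))  # -> splunk
print(route({"type": "debug", "msg": "cache miss"}))       # -> s3-archive
```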
Low Level Architecture
Client-side components can be deployed on a Linux machine or on a Kubernetes cluster. This makes them easy to deploy and manage, and keeps them as close as possible to where the data is produced.
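As a rough sketch of what a Kubernetes-based rollout of a client-side component might look like (the image name, labels, and port below are hypothetical placeholders, not Onum artifacts; in practice, follow Onum's own installation instructions), the official Kubernetes Python client can create such a Deployment:

```python
# Rough sketch, not an official installation procedure: creating a Kubernetes Deployment
# for a client-side component using the official Kubernetes Python client
# (pip install kubernetes). Image name, labels, and port are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # uses the local kubeconfig; in-cluster, use load_incluster_config()

container = client.V1Container(
    name="edge-component",
    image="registry.example.com/edge-component:latest",  # hypothetical image
    ports=[client.V1ContainerPort(container_port=5514)],  # hypothetical listener port
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="edge-component"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "edge-component"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "edge-component"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```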