NGLA: Next Generation Log Analytics

Today, sophisticated technologies such as IoT, Big Data, the Cloud, and data center consolidation demand smarter IT infrastructure and operations. These systems continuously generate large volumes of logs to report their operational activities. Efficient operation and maintenance of the infrastructure depend on some of its least glamorous tasks: troubleshooting, debugging, monitoring, and detecting security breaches in real time. Logs supply the fundamental information for these tasks and are useful for diagnosing the root cause of a complex problem. Because of the high volume, velocity, and variety of log data, analyzing these logs is an overwhelming task for humans without a real-time, scalable log analysis solution. We find that existing log analysis tools and solutions do not offer enough automation to ease this burden. To address this gap we have designed "Next Generation Log Analytics" (NGLA), a streaming log analytics framework.

We have implemented NGLA using a scalable distributed streaming framework, a message broker, and unsupervised machine learning techniques. First, in a training phase, we learn the log patterns and leverage them to generate models for causality, time-series feature relationships, predictive analysis, and semantic analysis. These models are then used in a real-time analytics framework (Spark Streaming) to scale out and detect anomalies in large volumes of logs. Offline querying is currently supported through Elasticsearch.
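To make the online step concrete, below is a minimal sketch of how learned patterns might be applied in Spark Streaming, assuming (hypothetically) that the training phase exports its patterns as regular expressions. The example patterns, host, and port are illustrative placeholders, and a plain socket source stands in for the message broker:

import re

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

# Regexes standing in for patterns learned during the training phase
# (illustrative examples only, not NGLA's actual patterns).
LEARNED_PATTERNS = [
    re.compile(r"Connection from \S+ port \d+"),
    re.compile(r"Service \w+ restarted"),
]

def is_anomalous(line):
    # A line that matches no learned pattern is flagged as a candidate anomaly.
    return not any(p.fullmatch(line.strip()) for p in LEARNED_PATTERNS)

sc = SparkContext(appName="ngla-matching-sketch")
ssc = StreamingContext(sc, 5)  # 5-second micro-batches

# NGLA reads from a message broker; a plain socket source stands in here.
logs = ssc.socketTextStream("localhost", 9999)
logs.filter(is_anomalous).pprint()

ssc.start()
ssc.awaitTermination()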

As part of NGLA we have made modifications to the Spark Streaming platform, turning it into a generic solution with two additional capabilities: rebroadcasting and an efficient state-handling mechanism.
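The modifications themselves live inside Spark Streaming, but their intent can be approximated with stock PySpark APIs. The sketch below emulates rebroadcasting by unpersisting and re-creating a broadcast variable before each batch, and state handling with updateStateByKey; the model loader and keying scheme are hypothetical placeholders:

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="ngla-extensions-sketch")
ssc = StreamingContext(sc, 5)
ssc.checkpoint("/tmp/ngla-checkpoint")  # stateful operations need a checkpoint dir

def load_latest_model():
    # Hypothetical hook: fetch the most recently trained model from storage.
    return {"patterns": []}

model_bc = sc.broadcast(load_latest_model())

def refresh_model(rdd):
    # Emulated rebroadcasting: discard the stale broadcast and ship an
    # updated model to the executors before the next batch is processed.
    global model_bc
    model_bc.unpersist()
    model_bc = rdd.context.broadcast(load_latest_model())

def update_count(new_values, running_count):
    # Emulated state handling: keep a running event count per log key.
    return (running_count or 0) + sum(new_values)

lines = ssc.socketTextStream("localhost", 9999).filter(lambda l: l.strip())
lines.foreachRDD(refresh_model)
counts = lines.map(lambda l: (l.split()[0], 1)).updateStateByKey(update_count)
counts.pprint()

ssc.start()
ssc.awaitTermination()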

NGLA Architecture

Pattern Recognition

NGLA uses a novel log-pattern extraction technique that learns patterns from training logs and generates regular expressions from them.
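The extraction algorithm is not detailed here, but the idea can be illustrated with a small sketch: tokenize a training log line, generalize tokens that look like variable fields (numbers, IP addresses), and keep the remaining tokens as literals. The generalization rules below are illustrative stand-ins, not NGLA's actual rules:

import re

def line_to_regex(line):
    """Turn one training log line into a candidate regular expression by
    generalizing tokens that look like variable fields."""
    parts = []
    for token in line.split():
        if re.fullmatch(r"\d+", token):
            parts.append(r"\d+")                      # numbers -> \d+
        elif re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", token):
            parts.append(r"(?:\d{1,3}\.){3}\d{1,3}")  # IPv4 addresses
        else:
            parts.append(re.escape(token))            # keep constants literal
    return re.compile(r"\s+".join(parts))

pattern = line_to_regex("Connection from 10.0.0.1 port 22")
print(pattern.pattern)
print(bool(pattern.match("Connection from 192.168.1.5 port 8080")))  # True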