
Uncover the Secret of Latency Prediction - Part 1

Updated: Nov 12

In previous editions, we looked at the present by monitoring indicators. In this edition, we make a modest attempt to tackle the subject of prediction. We are all used to forecasts, such as weather or financial forecasts, and telecom networks are no exception to this trend. Prevention is better than cure. So in a context of increasing automation, such as remote operations in mining, healthcare or connected vehicles, prediction is no longer a luxury. The recent rise of AI suggests, all other things being equal, a revolution in how we anticipate the future. As the saying goes, "A good forecaster is no more intelligent than anyone else. He has simply organised his ignorance better." Besides, the uncertainty of a forecast is as important as, if not more important than, the forecast itself, isn't it?


So enjoy reading this first part. The next part will shed light on the Latencetech implementation (Part 2).


1. Forecast vs Prediction?


Forecasting typically focuses on predicting outcomes over a longer time frame, often involving trends and patterns that unfold over months, years or even decades. Predictions tend to be more short-term and immediate, typically estimating outcomes up to a year ahead.


However, "forecast" is the more suitable term here, since latency prediction is a time series problem.

This article gives an overview of prediction models and how Latencetech Inc integrates such technology into its solution for IP network monitoring.



2. The Challenge of Prediction

Predicting future values is a cornerstone of many fields, from finance to networking. It involves estimating an unknown quantity based on available data. To achieve this, a variety of statistical and machine learning techniques are employed.

There are two options: statistical methods and/or a neural approach.

Statistical methods provide a solid foundation for prediction. Linear regression, for instance, models the relationship between a dependent variable and one or more independent variables as a linear equation. It's simple to understand and implement but can be limited by its assumption of linearity.

  • Advantages: Simplicity, interpretability of coefficients, wide range of software support.

  • Disadvantages: Assumption of linearity, sensitivity to outliers, difficulty capturing nonlinear relationships.

Neural networks, particularly convolutional neural networks (CNNs), have revolutionized prediction, especially in image and text processing. CNNs excel at extracting complex features from unstructured data.

  • Advantages: Ability to learn complex representations, high accuracy on challenging tasks, adaptability to different data types.

  • Disadvantages: Require large amounts of data for training, often considered black boxes, computationally expensive.

Neural networks, while powerful, can be challenging to interpret and may require significant computational resources.

Mathematics provided the first tools for prediction (standard deviation, linear regression), but these classical approaches become insufficient once models involve billions of parameters, which is where AI/ML comes in.

Choosing a method depends on the forecast horizon, the linearity of the data and its dependence on time.

In the picture below, what will be the position and color of the next blue bubble?



Linear regression is a statistical model that estimates the linear relationship between a scalar response and one or more explanatory variables. The case of one explanatory variable is called simple linear regression; for more than one, the process is called multiple linear regression.
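As a minimal illustration (scikit-learn and a synthetic latency series, both our own choices, not a specific Latencetech implementation), a simple linear regression can extrapolate a latency trend:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic example: latency (ms) sampled once per second with a slight upward trend
t = np.arange(60).reshape(-1, 1)                       # time index (seconds)
latency = 20 + 0.05 * t.ravel() + np.random.normal(0, 0.5, 60)

model = LinearRegression().fit(t, latency)             # fit latency as a linear function of time
future_t = np.arange(60, 70).reshape(-1, 1)
print(model.predict(future_t))                         # extrapolated latency for the next 10 seconds
```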

Exponential Smoothing is a method for forecasting univariate time series data. It is based on the principle that a prediction is a weighted linear sum of past observations, or lags, with exponentially decreasing weights assigned to older observations. Exponential smoothing needs no training time, just 3-4 minutes of data.
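A minimal sketch with statsmodels (the smoothing level and the data are illustrative assumptions):

```python
import numpy as np
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# Hypothetical latency samples (ms), one per second over a few minutes
latency = np.array([21.0, 20.5, 22.1, 23.0, 22.4, 21.8, 22.9, 23.5, 22.7, 23.1])

# Recent observations get exponentially larger weights via the smoothing level (alpha)
fit = SimpleExpSmoothing(latency).fit(smoothing_level=0.4, optimized=False)
print(fit.forecast(5))  # forecast for the next 5 steps
```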

ARIMA (AutoRegressive Integrated Moving Average) algorithm is a statistical model used to analyze and predict time series. It is based on the idea that the value of a data item at a given time depends linearly on a) its past values (auto-regressive component), b) its past errors (moving average component) and c) any trend (integrated component).
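A minimal sketch, again with statsmodels (the (p, d, q) order and the synthetic data are illustrative choices, not recommendations):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic drifting latency series (ms)
latency = np.cumsum(np.random.normal(0, 0.3, 200)) + 20.0

# order=(p, d, q): p past values (AR), d differencing steps (I), q past errors (MA)
model = ARIMA(latency, order=(2, 1, 1)).fit()
print(model.forecast(steps=10))  # latency forecast for the next 10 samples
```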


3. AI Models for time series forecasting

In various sectors such as finance, retail, and meteorology, as well as Network Monitoring, time series forecasting is pivotal.

Traditional models often grapple with flexibility and interpretability, particularly when dealing with complex patterns.

Remember: your model will not always give you the best performance, and you can benchmark it with metrics such as MSE (Mean Squared Error) and MAE (Mean Absolute Error). MAE measures the average size of the errors in a set of predictions, regardless of their direction: the mean absolute difference between predicted and actual values, commonly used to assess a regression model.
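Both metrics are easy to compute directly; here is a minimal sketch in plain NumPy (the helper names and values are our own):

```python
import numpy as np

def mae(actual, predicted):
    """Mean Absolute Error: average magnitude of the errors, ignoring their sign."""
    return np.mean(np.abs(np.asarray(actual) - np.asarray(predicted)))

def mse(actual, predicted):
    """Mean Squared Error: penalises large errors more heavily than MAE."""
    return np.mean((np.asarray(actual) - np.asarray(predicted)) ** 2)

actual    = [22.0, 23.1, 21.8, 24.0]   # measured latency (ms)
predicted = [21.5, 23.4, 22.5, 23.2]   # model output (ms)
print(mae(actual, predicted), mse(actual, predicted))
```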

More recently, neural networks have become increasingly popular for predicting time series, as they correct some of the shortcomings of "classic" methods such as ARIMA.

Let's review some AI models below: N-BEATS, TFT, TCN, TiDE, LSTM and DeepAR.


3.1 N-BEATS (neural basis expansion analysis)

The N-BEATS (Neural Basis Expansion Analysis) model uses a series of fully connected layers organized into blocks, each with sub-networks for backcasting and forecasting. N-BEATS trains by alternating between forecasting future values and reconstructing past values (backcasts). It minimizes the error between its predictions and actual data, sharpening its forecasting ability.
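As a sketch, an N-BEATS model can be trained on a latency series in a few lines using the open-source darts library (our own choice of tool; the window lengths and epochs are illustrative):

```python
import numpy as np
from darts import TimeSeries
from darts.models import NBEATSModel

# Synthetic latency series (ms), one sample per time step
values = 20 + np.sin(np.linspace(0, 20, 300)) + np.random.normal(0, 0.2, 300)
series = TimeSeries.from_values(values)

# Backcast/forecast window lengths are hypothetical choices for this sketch
model = NBEATSModel(input_chunk_length=48, output_chunk_length=12, n_epochs=20)
model.fit(series)
print(model.predict(12).values())  # next 12 latency samples
```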


3.2 TFT (temporal fusion transformer)

Temporal Fusion Transformer (TFT) is a model for multi-horizon and multivariate time series forecasting use cases.

As an example, to predict future energy consumption in buildings, we can treat the location as a static covariate, weather data as time-dependent unknown features, and calendar data such as holidays, day of week or season as time-dependent known features.
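To make the three covariate types concrete, here is a small sketch of how they could be laid out before feeding a TFT implementation (pandas and all column names are our own illustrative choices, not a specific TFT API):

```python
import pandas as pd

# Static covariate: one value per series, constant over time
static_covariates = pd.DataFrame({"building_id": ["B1"], "location": ["Montreal"]})

# Time-dependent *known* features: values are known in advance (calendar data)
known_future = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=4, freq="h"),
    "day_of_week": [0, 0, 0, 0],
    "is_holiday": [1, 1, 1, 1],
})

# Time-dependent *unknown* features: only observed up to the present (weather data)
observed_past = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=4, freq="h"),
    "outside_temp_c": [-5.0, -4.8, -4.5, -4.1],
})
```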


3.3 TCN (temporal convolutional networks and forecasting)

A TCN, short for Temporal Convolutional Network, consists of dilated, causal 1D convolutional layers with the same input and output lengths.

Temporal Convolutional Networks (TCNs) are deep neural network architectures used, for example, in trajectory prediction tasks: trained on historical trajectory data, they can predict the future trajectory of a vehicle or pedestrian.
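Below is a minimal sketch of a single dilated, causal 1D convolution in PyTorch (channel counts, kernel size and dilation are illustrative; a full TCN stacks several such layers):

```python
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """One dilated, causal 1D convolution: output length equals input length,
    and position t only sees inputs at t and earlier."""
    def __init__(self, channels, kernel_size=3, dilation=2):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # left-padding preserves causality
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                                # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))          # pad only on the left (the past)
        return self.conv(x)

latency = torch.randn(1, 1, 100)                         # synthetic latency series
print(CausalConv1d(channels=1)(latency).shape)           # torch.Size([1, 1, 100])
```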


3.4 TiDE (time series dense encoder)

TiDE stands for Time-series Dense Encoder. It is an MLP-based (Multi-Layer Perceptron) model designed for long-horizon multivariate forecasting. It relies on residual block units to first encode covariates and historical data, then decodes the learned representation to generate the forecast.
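The sketch below shows the kind of residual MLP block that TiDE-style models build on (a schematic in PyTorch with hypothetical dimensions, not the reference implementation):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Schematic TiDE-style residual unit: a small MLP plus a skip connection."""
    def __init__(self, dim, hidden=64, dropout=0.1):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, dim), nn.Dropout(dropout),
        )
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        return self.norm(x + self.mlp(x))    # skip connection, then normalisation

x = torch.randn(8, 32)                        # batch of encoded history + covariates
print(ResidualBlock(dim=32)(x).shape)         # torch.Size([8, 32])
```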


3.5 LSTM

LSTM (Long Short-Term Memory) is a Recurrent Neural Network (RNN) capable of capturing temporal properties by remembering previously observed data.

Unlike traditional neural networks, LSTM incorporates feedback connections, allowing it to process entire sequences of data, not just individual data points. This makes it highly effective in understanding and predicting patterns in sequential data like time series, text, and speech.
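A minimal sketch of an LSTM forecaster in PyTorch (window size, hidden size and the one-step-ahead head are illustrative assumptions):

```python
import torch
import torch.nn as nn

class LatencyLSTM(nn.Module):
    """Sketch: an LSTM reads a window of past latency values and predicts the next one."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                     # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])       # use the last hidden state for the next value

window = torch.randn(4, 30, 1)                # 4 samples of 30 past latency values
print(LatencyLSTM()(window).shape)            # torch.Size([4, 1])
```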


3.6 DeepAR

DeepAR (by AWS) is a methodology for producing accurate probabilistic forecasts, based on training an autoregressive recurrent neural network on many related time series.
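DeepAR itself ships with Amazon SageMaker and GluonTS; the sketch below is only our own simplified illustration of the core idea, an autoregressive RNN that outputs a mean and standard deviation per step instead of a single point value:

```python
import torch
import torch.nn as nn

class ProbabilisticRNN(nn.Module):
    """DeepAR-style idea (simplified): predict a Gaussian (mu, sigma) per step,
    so every forecast carries its own uncertainty."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.mu = nn.Linear(hidden, 1)
        self.sigma = nn.Linear(hidden, 1)

    def forward(self, x):                              # x: (batch, time, 1)
        h, _ = self.rnn(x)
        return self.mu(h), torch.exp(self.sigma(h))    # exp keeps sigma positive

x = torch.randn(2, 50, 1)                              # two related latency series
mu, sigma = ProbabilisticRNN()(x)
print(mu.shape, sigma.shape)                           # both torch.Size([2, 50, 1])
```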


4. Platform tools for AI models

GOOGLE

Vertex AI from Google Cloud is a complete platform for building, deploying and managing machine learning models. It offers a range of services covering the entire machine learning lifecycle.

In addition, it provides built-in support for popular ML frameworks such as TensorFlow and scikit-learn, making it easy to develop and deploy models, and it exposes a set of APIs for serving predictions from trained models.


AWS

Amazon SageMaker is AWS' answer to Vertex AI. It is a fully managed platform designed to help developers and data scientists build, train and deploy large-scale machine learning models.


MICROSOFT

Azure Machine Learning is Microsoft's machine learning offering. It provides a comprehensive set of tools for building, training and deploying models, as well as for managing models in production. Its AutoML capability enables users to automate model selection and hyperparameter tuning, further streamlining the machine learning workflow.


Next?

We have briefly listed the main approaches for tackling the field of prediction. The next edition will look at how Latencetech uses them to enable operators to better anticipate the behavior of their network.
