Big data, machine learning, and artificial intelligence are no longer the stuff of science fiction – they’re coming soon to a network near you (if not there already). I ran into Brian Lavallée, Sr. Director, Portfolio Marketing at Ciena, and asked him a few questions about how these technologies are shaping networks of the future, and how they’ll lead to self-aware networks. Here’s what he said.
SF: Why do we need autonomous networks?
BL: Autonomous networks can do a lot of things humans can’t, such as examining terabytes of data, performing massive amounts of ongoing processing, and using analytics to uncover actionable insights that just aren’t possible with human-powered analysis. This includes the ability to recommend proactive maintenance on ports likely to fail, avoiding outages before they occur. The network will also be able to self-optimize, managing bandwidth more intelligently by understanding which links are under pressure and which are underutilized, and adjusting traffic accordingly. This will become much more important as 5G is deployed, dramatically increasing mobile bandwidth and making it harder to predict where spikes in demand will occur.
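To make the self-optimization idea concrete, here is a minimal sketch in Python that flags overloaded and underutilized links from utilization samples. The link names and the 80%/30% thresholds are illustrative assumptions, not Ciena’s actual logic.

```python
# Illustrative sketch: flag links for traffic rebalancing based on
# utilization. Thresholds and link data are made-up assumptions.

HOT_THRESHOLD = 0.80   # links above this are "under pressure"
COLD_THRESHOLD = 0.30  # links below this are "underutilized"

def classify_links(utilization):
    """Split links into hot and cold sets from a {link: fraction} map."""
    hot = {l: u for l, u in utilization.items() if u >= HOT_THRESHOLD}
    cold = {l: u for l, u in utilization.items() if u <= COLD_THRESHOLD}
    return hot, cold

# Hypothetical 5-minute utilization samples (fraction of capacity).
samples = {"link-a": 0.91, "link-b": 0.22, "link-c": 0.55}

hot, cold = classify_links(samples)
for link in hot:
    # In a real controller this would trigger a path recomputation;
    # here we only report the recommendation.
    print(f"{link} is under pressure; shift traffic toward {list(cold)}")
```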
SF: By allowing the network to act autonomously, how are we going to determine when a virus enters the system? How can we distinguish authorized, genuine actions from fraudulent ones?
BL: The fear of not being able to detect bad actors within the network is a big security concern. Having a virus take over an automated network has the potential for catastrophic consequences. Artificial intelligence detects trends and anomalies from past behaviors, making it the perfect technology to learn what’s “normal” and, more importantly, what’s “abnormal”, and to act accordingly.
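As a hedged illustration of the “normal vs. abnormal” idea, the sketch below flags deviations from a learned baseline using a simple 3-sigma rule. This is one of many possible anomaly-detection approaches; the traffic values and the threshold are invented for illustration.

```python
# Illustrative sketch: flag "abnormal" behavior as a deviation from a
# learned baseline. The 3-sigma rule and data are assumptions.
from statistics import mean, stdev

def is_abnormal(history, observation, sigmas=3.0):
    """Return True if observation deviates from the historical baseline."""
    mu, sd = mean(history), stdev(history)
    return abs(observation - mu) > sigmas * sd

# Hypothetical hourly flow counts learned from past behavior.
baseline = [980, 1010, 1005, 995, 1020, 990, 1000]

print(is_abnormal(baseline, 1003))  # False: within normal variation
print(is_abnormal(baseline, 4800))  # True: anomaly worth investigating
```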
SF: Does network instrumentation, which generates the big data used by machine learning algorithms, require expensive extra equipment?
BL: Most of the data required is already being generated by sensors within modern packet-optical networks. This is partly because when 40 Gbps products were first introduced, there were no commercially available test sets, so network equipment vendors had to develop methods of testing their own products. This was particularly true for coherent networking technology because it was so revolutionary compared to traditional on-off keying modems. Measurement and testing capabilities were embedded directly into coherent optical processors, for example. This allows many parameters, such as chromatic dispersion, polarization mode dispersion, attenuation, latency, and more, to be measured on an ongoing basis without external test sets. Open APIs allow this measured data to be extracted from the network and fed into offline machine learning systems.
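As a rough sketch of how such embedded telemetry might be pulled through an open API, here is an example against a hypothetical RESTful endpoint. The URL, token, and field names are invented for illustration and do not correspond to any real vendor API.

```python
# Illustrative sketch: pull embedded optical telemetry over a RESTful
# open API and collect it for offline machine learning. The endpoint,
# token, and field names are hypothetical, not a real vendor API.
import requests

BASE_URL = "https://nms.example.net/api/v1"   # hypothetical controller
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

def fetch_port_telemetry(port_id):
    """Fetch one port's measured optical parameters."""
    resp = requests.get(f"{BASE_URL}/ports/{port_id}/telemetry",
                        headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. chromatic_dispersion, pmd, attenuation

# Gather samples across ports into a training set for offline ML.
training_rows = [fetch_port_telemetry(p) for p in ("1/1/1", "1/1/2")]
```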
SF: Can service providers extract that same raw data from a network with multiple vendors?
BL: Yes! Data is data, so once a service provider establishes a data model, whether structured or unstructured, and as long as the data can be accessed via an open API, it can be pulled from network equipment from any vendor. It’s then up to the network operator to manipulate the data and decide whether the network will respond automatically or whether human intervention is required.
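A minimal sketch of that vendor-neutral idea: map each vendor’s raw record into one common data model before analysis. The vendor names, field names, and units below are invented placeholders.

```python
# Illustrative sketch: normalize telemetry from different vendors into
# one common data model. Vendor field names are invented placeholders.
from dataclasses import dataclass

@dataclass
class PortRecord:
    vendor: str
    port: str
    rx_power_dbm: float  # common unit across all vendors

def normalize(vendor, raw):
    """Map a vendor-specific record into the shared model."""
    if vendor == "vendor_a":
        return PortRecord(vendor, raw["portName"], raw["rxPower"])
    if vendor == "vendor_b":
        # Hypothetically reported in tenths of a dBm.
        return PortRecord(vendor, raw["if_id"], raw["rx_dbm_x10"] / 10.0)
    raise ValueError(f"unknown vendor: {vendor}")

records = [
    normalize("vendor_a", {"portName": "1/1/1", "rxPower": -3.2}),
    normalize("vendor_b", {"if_id": "ge-0/0/1", "rx_dbm_x10": -41}),
]
```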
SF: How far is the industry from proactive network maintenance?
BL: We already do proactive maintenance based on information derived from the network. For example, network monitoring analytics can predict when a specific network element or port is likely to fail within a given timeframe and prompt the network operator to investigate and repair or replace that equipment. Right now, there are feedback mechanisms in place in which humans ultimately decide the best course of action based on machine learning conclusions reached by manipulating vast amounts of data extracted from network sensors. Over time, the autonomous system will get more intelligent, listening to itself and acting accordingly.
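As a hedged illustration of that kind of prediction, the sketch below fits a linear trend to a degrading receive-power series and extrapolates when it will cross a failure threshold. The threshold and readings are invented, and production systems would use far richer models than a straight-line fit.

```python
# Illustrative sketch: estimate when a degrading port will cross a
# failure threshold by fitting a linear trend to recent readings.
# The threshold and data are assumptions, not a production model.

FAIL_THRESHOLD_DBM = -10.0  # assumed minimum usable receive power

def days_until_failure(readings):
    """Fit a least-squares line and extrapolate to the threshold."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
             / sum((x - x_mean) ** 2 for x in xs))
    if slope >= 0:
        return None  # signal is stable or improving
    return (FAIL_THRESHOLD_DBM - readings[-1]) / slope

# Hypothetical daily receive-power readings (dBm) showing slow decay.
history = [-6.0, -6.2, -6.5, -6.7, -7.0, -7.2]

eta = days_until_failure(history)
if eta is not None:
    print(f"Recommend replacement within ~{eta:.0f} days")
```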
Edited by Maurice Nagle