Viewpoint

Report of ITP seminar held in London on 2 February 2017

Terms such as machine learning, big data, predictive analytics and autonomic computing are being used more and more, both in the telecoms field and more widely.  Some serious claims are being made – that machine learning heralds a sea-change not only in the way the telecoms and IT industries face their own strategic and operational challenges, but also in terms of solution opportunities for their customers.  How real is this?  And what are we talking about anyway?  This was the topic of an ITP seminar hosted at BT Centre, London, earlier this month, attended by well over 100 people and addressed by expert speakers from BT, Nokia Bell Labs and Cisco.

 
What is machine learning? 
 
Dr Simon Thompson of BT gave a very simple example of the learning that takes place when playing roulette: if, for example, the ball almost always lands on red, then the individual will play red.  If there is a sign of some sort that always precedes the ball falling on black, then the punter will play black after seeing that sign.  This is learning based on actual observed data, not statistical analysis.
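
A minimal Python sketch of this kind of learning from observed outcomes alone (the spin data below are invented purely for illustration) simply counts what has been seen and backs the most frequent result:

from collections import Counter

# Invented observations of recent spins, for illustration only.
observed_spins = ["red", "red", "black", "red", "red", "green", "red", "red"]

counts = Counter(observed_spins)                  # tally each observed outcome
prediction, frequency = counts.most_common(1)[0]  # back the most frequent colour

print(f"Back {prediction}: seen in {frequency} of {len(observed_spins)} observed spins")

Nothing about the wheel itself is modelled; the prediction comes entirely from the observed data.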
 
In the telco world, machine learning can be applied to content recommendations for customers, fraud detection, intrusion detection in the network and route planning for the operations workforce.  Crucially, though, machine learning must be based on clean observed data.  Also, given the vast quantities of data involved (hundreds of millions of records per day), collecting and loading the data must be done in near real time – a big challenge.
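
As a purely illustrative sketch of the fraud-detection use case (this is not the specific approach described at the seminar; the record format and threshold are assumptions), a baseline check might flag a day's spend that deviates sharply from a customer's own clean historical data:

from statistics import mean, stdev

def flag_anomalous_spend(history, todays_spend, threshold=3.0):
    """Flag today's spend if it sits far above this customer's own history."""
    if len(history) < 2:
        return False                              # not enough clean observed data yet
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return todays_spend > mu
    return (todays_spend - mu) / sigma > threshold

# A customer who normally spends around 5 a day suddenly spends 60.
print(flag_anomalous_spend([4.8, 5.1, 5.0, 4.9, 5.3], 60.0))  # True

In practice such checks would have to run against a near-real-time feed of records, which is where the data collection and loading challenge bites.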
 
Understanding data quantities 
 
Chris White of Nokia Bell Labs emphasised that there is no shortage of data.  What enables decisions to be made is creating knowledge from that data – not just processing it, but deriving understanding and meaning from it.  He predicted that the advent of the Internet of Things will create huge quantities of data.  Machine learning should be regarded as a tool that augments our abilities in much the same way as the physical tools humans have always made.  Thinking, understanding and meaning are about making connections, often between seemingly unrelated pieces of data.  We should therefore keep humans in the loop, effectively augmenting their intelligence with machine processing of data.
 
Dealing with analytics 
 
Ray O’Hanlon of Cisco presented a practical use case of machine learning and analytics to optimise data centre operations, using a tool developed by Cisco – Tetration Analytics.  Data centres are becoming increasingly complex, with huge quantities of data traffic remaining within them, and the applications themselves are becoming more complex too.  Tetration Analytics holds out the prospect of examining every packet header within the data centre, not just a sample, to provide greater insight into the behaviour of applications, the application of policies, the analysis of data flows and so on.  This in turn enables the operators of data centres to become more nimble and agile, and faster to react to change.
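
By way of illustration only (this is not Tetration Analytics itself; the packet records and field names are invented), the sketch below shows the basic idea of rolling per-packet header records up into flow summaries keyed by the usual five-tuple:

from collections import defaultdict

# Invented per-packet header records; a real system would capture these off the wire.
packets = [
    {"src": "10.0.0.1", "dst": "10.0.0.9", "sport": 40001, "dport": 443, "proto": "tcp", "bytes": 1500},
    {"src": "10.0.0.1", "dst": "10.0.0.9", "sport": 40001, "dport": 443, "proto": "tcp", "bytes": 900},
    {"src": "10.0.0.2", "dst": "10.0.0.9", "sport": 40002, "dport": 443, "proto": "tcp", "bytes": 400},
]

flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
for p in packets:
    key = (p["src"], p["dst"], p["sport"], p["dport"], p["proto"])  # the classic 5-tuple
    flows[key]["packets"] += 1
    flows[key]["bytes"] += p["bytes"]

for key, stats in flows.items():
    print(key, stats)                             # per-flow summary, every header counted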
 
The human role
 
Mark Bond, formerly a director with Vodafone, led an insightful panel session answering questions around the role of humans and humanity in a world of machine learning.  Does this mark the beginning of the end for humanity?  How can we be sure machines will exhibit ethical learning and behaviours rather than be a force for bad?  Does machine learning herald an even greater widening of the gap between the first world and the third world?
 
It was noted that whilst intelligence services have been the biggest users of machine learning to date, this could change with the move to the Internet of Things and the massive data volumes it implies.  The UK is well positioned to develop the technology, with many UK universities at the forefront – although exploitation of machine learning will be worldwide.  There will need to be standards or regulation around the security and privacy aspects of sharing data.  As for the role of people, there will continue to be a need to understand the reasons for a recommended decision.  We wouldn’t simply take the recommendation off the dashboard – we would want some insight into how the machine has learned and reached its decision.  Future developments could include tools that expose the ‘why’ of a recommendation, not just the ‘what’.
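
As a minimal sketch of what exposing the ‘why’ could look like (the model, weights and feature names are invented for illustration), a simple scoring function can return each factor’s contribution alongside the recommendation itself:

# Invented weights for a toy content-recommendation score.
weights = {"watched_similar_titles": 2.0, "same_genre": 1.5, "trending_now": 0.5}

def recommend_with_reasons(candidate):
    """Return the overall score plus each feature's contribution – the 'why' with the 'what'."""
    contributions = {f: weights[f] * candidate.get(f, 0) for f in weights}
    return sum(contributions.values()), contributions

score, reasons = recommend_with_reasons({"watched_similar_titles": 1, "same_genre": 1, "trending_now": 0})
print(f"score = {score}")
for feature, contribution in sorted(reasons.items(), key=lambda kv: -kv[1]):
    print(f"  {feature}: +{contribution}")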
 
The ITP runs events and seminars for the telecoms industry throughout the year: www.theitp.org
 


 
