With global traffic expanding rapidly, the Internet is constantly evolving. Analyzing these data transfers in their entirety is a real challenge for researchers, yet such information could help build a more efficient network, prevent failures, and improve defenses against cyberattacks.
Using a supercomputer, a team of researchers at the Massachusetts Institute of Technology (MIT) recently succeeded in creating a tool to analyze global Internet traffic. Since 2015, they have analyzed nearly 50 billion data packets collected in Japan and California.
To do this, they first had to process this "hypersparse" data with a technique called the Dynamic Distributed Dimensional Data Model (D4M). They then built a neural network to analyze the data and find relationships within it.
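D4M represents data as sparse associative arrays, which suits network traffic well: out of all possible source/destination pairs, only a tiny fraction ever exchange packets. As a rough illustration of that idea (not the researchers' actual pipeline; the IP addresses and counts below are made up), packet records can be tallied into a sparse "matrix" keyed by string pairs:

```python
from collections import Counter

# Hypothetical packet records: (source IP, destination IP) pairs.
packets = [
    ("10.0.0.1", "10.0.0.2"),
    ("10.0.0.1", "10.0.0.2"),
    ("10.0.0.3", "10.0.0.2"),
]

# Associative-array view: a sparse "matrix" keyed by (source, destination)
# storing packet counts. Pairs that never communicate take up no space,
# which is the point of a hypersparse representation.
traffic = Counter(packets)

print(traffic[("10.0.0.1", "10.0.0.2")])  # 2
```

At the scale of tens of billions of packets, the same idea is implemented with distributed sparse-matrix libraries rather than an in-memory counter, but the data model is the same.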
The researchers likened the results to a measure of the Internet's background noise. This baseline makes it possible to detect anomalies and to extract information on file sharing, malicious IP addresses and spam, the distribution of attacks, and data traffic jams.
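The anomaly-detection idea can be sketched simply: establish a baseline of "normal" traffic, then flag measurements that deviate far from it. The following toy example (the per-minute packet counts are invented, and this is not the researchers' actual method) flags a spike that stands out from the background noise:

```python
import statistics

# Hypothetical per-minute packet counts on one network link; the last
# value simulates a spike (e.g. an attack or a data traffic jam).
counts = [100, 98, 103, 101, 97, 102, 99, 100, 480]

# Baseline "background noise" from the quiet period.
baseline = statistics.mean(counts[:-1])
spread = statistics.stdev(counts[:-1])

# Flag any minute deviating from the baseline by more than
# three standard deviations.
anomalies = [c for c in counts if abs(c - baseline) > 3 * spread]
print(anomalies)  # [480]
```

Real systems use far richer models than a z-score threshold, but the principle is the same: a well-characterized baseline turns unusual traffic into a detectable signal.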
Source: Futura Tech