Abstract: DoS and DDoS attacks have grown in size and number over the last decade, and existing mitigation solutions are generally inefficient. Compared to other types of malicious cyber attacks, DoS and DDoS attacks are particularly challenging to combat. Because they can mask themselves as legitimate traffic, developing methods to detect these attacks at the packet or flow level has proven to be a difficult task. In this paper, we explore the potential of Variational Autoencoders to serve as a component within an intelligent security solution that differentiates between normal and malicious traffic. We propose two methods based on the ability of Variational Autoencoders to learn latent representations of network traffic flows. The first method relies on a classifier built on the latent encodings obtained from Variational Autoencoders trained on traffic traces. The second method is an anomaly detection approach in which the Variational Autoencoder learns abstract feature representations of exclusively legitimate traffic; anomalies are then filtered out based on the reconstruction loss of the Variational Autoencoder. Both proposed methods have been thoroughly tested on two separate datasets with a similar feature space. The results show that both methods are promising, with the classifier-based method performing slightly better than the anomaly-based one.
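To illustrate the second approach described above, the following is a minimal, hypothetical sketch (not the authors' implementation) of reconstruction-loss-based anomaly detection with a VAE trained only on legitimate flows; the feature count, network sizes, and threshold are assumptions chosen for illustration.

```python
# Hypothetical sketch: flag flows whose VAE reconstruction error is too high.
import torch
import torch.nn as nn

class FlowVAE(nn.Module):
    def __init__(self, n_features=20, latent_dim=4):  # feature count is an assumption
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 16), nn.ReLU())
        self.to_mu = nn.Linear(16, latent_dim)
        self.to_logvar = nn.Linear(16, latent_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(),
                                     nn.Linear(16, n_features))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        return self.decoder(z), mu, logvar

def is_anomalous(model, flow, threshold=0.1):
    """Classify a flow as malicious when its reconstruction error exceeds a threshold."""
    with torch.no_grad():
        recon, _, _ = model(flow)
        return nn.functional.mse_loss(recon, flow).item() > threshold
```

The latent encodings returned by the encoder could likewise be fed to a downstream classifier, which corresponds conceptually to the first method.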
Abstract: By default, the Linux network stack is not configured for high-speed, large file transfers; the default settings are chosen to conserve memory. The Linux network stack can be tuned by increasing the network buffer sizes for high-speed networks that connect server systems, allowing more network packets to be handled. However, several other TCP/IP parameters of an Operating System (OS) can also be tuned. In this paper, we leverage Genetic Algorithms (GAs) to devise a system that learns from the history of the network traffic and uses this knowledge to optimize current performance by adjusting these parameters. On a standard Linux kernel, this can be done using sysctl or /proc. In a Virtual Machine (VM), virtually any type of OS can be installed, and an image can swiftly be compiled and deployed; since a VM is a sandboxed environment, risky configurations can be tested without the danger of harming the system. Different network parameter configurations are thoroughly tested, and an increase in throughput of up to 65% is achieved compared to the default Linux configuration.
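As a rough illustration of the idea, the sketch below shows a toy GA loop that evolves sysctl buffer settings. The sysctl keys are real Linux parameters, but the GA operators, population sizes, and the placeholder fitness function are assumptions for illustration, not the paper's system.

```python
# Illustrative sketch only: evolving Linux TCP buffer settings with a simple GA.
import random
import subprocess

PARAM_RANGES = {
    "net.core.rmem_max": (212992, 67108864),  # receive buffer upper bound (bytes)
    "net.core.wmem_max": (212992, 67108864),  # send buffer upper bound (bytes)
}

def apply_config(config):
    """Apply a candidate configuration via sysctl (requires root privileges)."""
    for key, value in config.items():
        subprocess.run(["sysctl", "-w", f"{key}={value}"], check=True)

def measure_throughput(config):
    """Placeholder fitness: apply the config, run a transfer benchmark, return MB/s."""
    apply_config(config)
    return 0.0  # e.g. parse iperf3 output here

def random_config():
    return {k: random.randint(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def evolve(generations=10, pop_size=8, mutation=0.2):
    population = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=measure_throughput, reverse=True)
        parents = ranked[: pop_size // 2]                          # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = {k: random.choice([a[k], b[k]]) for k in a}    # crossover
            if random.random() < mutation:                         # mutation
                k = random.choice(list(child))
                child[k] = random.randint(*PARAM_RANGES[k])
            children.append(child)
        population = parents + children
    return max(population, key=measure_throughput)
```

Running such a loop inside a VM, as the abstract notes, keeps risky configurations contained to a disposable sandbox.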
Abstract: Disruptions to communication infrastructure happen occasionally, and dedicated personnel are sent out to repair the damage and restore communication. However, carrying out this task can be dangerous to the personnel, for example in war situations, environmental disasters such as earthquakes or toxic spills, or fires. Human casualties can therefore be minimised by deploying autonomous robots that achieve the same outcome: establishing a communication link between two distant but previously connected sites. In this paper we investigate the deployment of mobile ad hoc robots that relay traffic between them. To get the robots to position themselves appropriately, we take inspiration from self-organisation and emergence in artificial life, where a common overall goal may be achieved if the correct local rules are invoked on the agents in the system. We integrate connectivity between two sites into the multi-robot simulation platform known as JBotEvolver. The robot swarm is composed of Thymio II robots. In addition, we compare three heuristics, one of which uses neuroevolution (evolution of neural networks), to show how self-organisation and embodied evolution can be used within this integration. Our use of embodiment in robotic controllers shows promising results and provides solid knowledge and guidelines for further investigations.
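To make the local-rule idea concrete, here is a minimal, hypothetical toy example (not JBotEvolver code and not one of the compared heuristics) in which each relay robot repeatedly moves toward the centroid of the neighbours it can sense; the positions, communication radius, and gain are invented for illustration.

```python
# Toy local rule: each robot drifts toward the centroid of its in-range neighbours.
import math

def step(position, neighbours, gain=0.1):
    """Move a fraction of the way toward the centroid of in-range neighbours."""
    if not neighbours:
        return position  # an isolated robot stays put in this toy rule
    cx = sum(p[0] for p in neighbours) / len(neighbours)
    cy = sum(p[1] for p in neighbours) / len(neighbours)
    x, y = position
    return (x + gain * (cx - x), y + gain * (cy - y))

# Two fixed sites and three relay robots scattered between them (assumed values).
sites = [(0.0, 0.0), (10.0, 0.0)]
robots = [(2.0, 3.0), (5.0, -2.0), (8.0, 1.5)]
radius = 6.0  # assumed communication range

for _ in range(50):
    robots = [step(r, [p for p in robots + sites
                       if p != r and math.dist(p, r) <= radius])
              for r in robots]
print(robots)  # the robots tend to align between the two sites, forming a relay chain
```

Simple rules of this kind, applied purely locally, can produce a global relay chain without any central coordination, which is the self-organising behaviour the paper studies with evolved controllers.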