internet routing
Recently Published Documents


TOTAL DOCUMENTS: 220 (FIVE YEARS: 25)

H-INDEX: 25 (FIVE YEARS: 2)

2021 ◽  
Vol 5 (6) ◽  
pp. 1161-1170
Author(s):  
Valen Brata Pranaya ◽  
Theophilus Wellem

The validity of the routing advertisements sent by one router to another is essential for Internet connectivity. To exchange routes between Autonomous Systems (AS) on the Internet, a protocol known as the Border Gateway Protocol (BGP) is used. One of the most common attacks on routers running BGP is prefix hijacking. This attack aims to disrupt connections between ASes and divert traffic to illegitimate destinations for criminal purposes such as fraud and data breaches. One of the methods developed to prevent prefix hijacking is the Resource Public Key Infrastructure (RPKI). RPKI is a public key infrastructure (PKI) developed to secure BGP routing on the Internet; routers can use it to validate the routing advertisements sent by their BGP peers. RPKI uses digital certificates issued by a Certification Authority (CA) to validate the prefixes in a routing advertisement. This study implements BGP and RPKI using the BIRD Internet Routing Daemon (BIRD). Simulation and implementation are carried out with the GNS3 simulator and a server that acts as the RPKI validator. The experiments used 4 ASes, 7 routers, 1 server for BIRD, and 1 server for the validator, with 26 invalid or unknown prefixes advertised by 2 routers in the simulated topology. The results show that a router can successfully validate the routing advertisements received from its BGP peers using RPKI. Invalid and unknown prefixes are not forwarded to other routers in their AS, so route hijacking is prevented.
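For orientation, here is a minimal sketch of the route-origin validation outcome (RFC 6811 semantics) that an RPKI validator supplies to a router such as BIRD over an RTR session. The ROAs, prefixes, and AS numbers below are hypothetical examples, not the paper's simulated topology.

```python
# Sketch of RFC 6811 route-origin validation; illustrative data only.
import ipaddress

# Each ROA: (authorized prefix, maximum allowed prefix length, origin ASN)
ROAS = [
    (ipaddress.ip_network("192.0.2.0/24"), 24, 64500),
    (ipaddress.ip_network("198.51.100.0/22"), 24, 64501),
]

def validate(prefix: str, origin_asn: int) -> str:
    """Classify a BGP announcement as 'valid', 'invalid', or 'unknown'."""
    net = ipaddress.ip_network(prefix)
    covered = False
    for roa_prefix, max_len, roa_asn in ROAS:
        # A ROA "covers" the announcement if the announced prefix lies inside it.
        if net.subnet_of(roa_prefix):
            covered = True
            if net.prefixlen <= max_len and origin_asn == roa_asn:
                return "valid"
    # Covered by some ROA but no matching (ASN, max-length) pair -> invalid;
    # not covered by any ROA -> unknown.
    return "invalid" if covered else "unknown"

if __name__ == "__main__":
    print(validate("192.0.2.0/25", 64500))    # invalid: exceeds max length 24
    print(validate("198.51.100.0/23", 64501)) # valid
    print(validate("203.0.113.0/24", 64502))  # unknown: no covering ROA
```

In the study, routers drop announcements that validate as invalid or unknown instead of forwarding them onward, which is what prevents the hijacked routes from propagating.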


2021 ◽  
Vol 10 (1) ◽  
pp. 8-11
Author(s):  
Michael Schapira

Combatting internet time shifters
Arguably, the internet’s biggest security hole is the Border Gateway Protocol (BGP), which establishes routes between the organisational networks that make up the internet (e.g. Google, Facebook, Bank of England, Deutsche Telekom, AT&T). The insecurity of the internet’s routing system is constantly exploited to steal, monitor, and tamper with data traffic. Yet, despite many years of Herculean efforts, internet routing security remains a distant dream. The goal of the SIREN project is to propose and investigate novel paradigms for closing this security hole.


2021 ◽  
Vol 2021 ◽  
pp. 1-16
Author(s):  
Tulat Naeem ◽  
Abdu Gumaei ◽  
Muhammad Kamran Jamil ◽  
Ahmed Alsanad ◽  
Kifayat Ullah

The connectivity index (CI) has a vital role in real-world problems, especially in Internet routing and transport network flow. Intuitionistic fuzzy graphs (IFGs) can describe two aspects of information, using membership and nonmembership degrees, under uncertainty. Keeping in view the importance of CIs in real-life problems and the expressiveness of IFGs, we aim to develop CIs in the environment of IFGs. We introduce two types of CIs, namely the CI and the average CI, in the frame of IFGs. In addition, certain kinds of nodes are introduced for IFGs: the IF connectivity-enhancing node (IFCEN), the IF connectivity-reducing node (IFCRN), and the IF neutral node. We also introduce strongest strong cycles, the θ-evaluation of vertices, cycle connectivity, and the CI of a strong cycle. The CIs are applied to two different types of networks, Internet routing and transport network flow, followed by examples showing the applicability of the proposed work.
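As background, a hedged sketch of the quantity involved: in the standard fuzzy-graph setting the connectivity index sums vertex memberships against strengths of connectedness, and the intuitionistic version presumably carries a membership and a nonmembership component. The formulas below are an assumed analogue of that definition, not the authors' exact notation.

```latex
% Sketch only: classical connectivity index of a fuzzy graph G = (V, \sigma, \mu).
\[
  CI(G) \;=\; \sum_{u, v \in V} \sigma(u)\,\sigma(v)\,\mathrm{CONN}_G(u,v),
\]
% where CONN_G(u,v) is the strength of connectedness: the maximum over all
% u--v paths of the minimum edge membership along the path.
%
% Assumed IFG analogue: a pair of values, one from membership degrees
% (max--min strength CONN^mu) and one from nonmembership degrees
% (min--max strength CONN^nu).
\[
  CI(G) \;=\; \Bigl(
    \sum_{u,v \in V} \mu(u)\,\mu(v)\,\mathrm{CONN}^{\mu}_G(u,v),\;
    \sum_{u,v \in V} \nu(u)\,\nu(v)\,\mathrm{CONN}^{\nu}_G(u,v)
  \Bigr).
\]
```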


2021 ◽  
Author(s):  
Fenwick Robert McKelvey

This dissertation develops the concept of transmissive control to explore the consequences of changes in Internet routing for communication online. Where transmission often denotes an act of exchanging information between sender and receiver, transmissive control theorizes transmission as the production and assignment of common times, or temporalities, between components of a communication system. Transmissive control functions both operationally, according to how computational algorithms route Internet data (known as packets), and systematically, according to how patterns in these operations express temporalities of coordination and control. Transmissive control questions how algorithms transmit packets and how transmission expresses valuable temporalities within the Internet. The concept of transmissive control developed as a response to advanced Internet routing algorithms that have greater awareness of packets and more capacity to intervene during transmission. The temporality of the Internet is changing due to these algorithms. Where transmissive control has been made possible by the Internet’s core asynchronous design, which allows many different temporalities to be simultaneous (such as real-time networks or time-sharing networks), this diversity has taxed the resources of the Internet infrastructure as well as the business models of most Internet Service Providers (ISPs). To bring the temporality of the Internet back under control, ISPs and other network administrators have turned to transmissive control to better manage their resources. Their activities shift the Internet from an asynchronous temporality to a polychronous temporality in which network administrators set and manage the times of the Internet. Where this turn to traffic management has often been framed as a debate over the neutrality of the Internet, the dissertation reorients the debate around transmissive control. Tactics by the anti-copyright Pirate Bay and by Internet transparency projects illustrate potential political and policy responses to transmissive control: the former seeks to elude its control, while the latter seeks to expose its operation. These components, as well as the operation of transmissive control, are developed through a series of metaphors drawn from the film Inception, the demons of Pandemonium, the novel Moby-Dick, and the film Stalker. Together, the metaphors provide a comprehensive discussion of transmissive control.


2021 ◽  
Author(s):  
Wang Liao ◽  
Dong Liu ◽  
Yun Chen ◽  
Wei Du ◽  
Yingtu Mao ◽  
...  

2021 ◽  
Vol 1 (2) ◽  
pp. 67-74
Author(s):  
Dalia Nashat ◽  
Fatma A. Hussain ◽  
Xiaohong Jiang

Computer networks are vulnerable to many types of attacks, and the Distributed Denial of Service (DDoS) attack is one of the top concerns for security professionals. The DDoS flooding attack denies service by consuming server resources, preventing legitimate users from accessing their desired services. Detecting this attack is hard because the attacker sends streams of packets to the server with spoofed source IP addresses, so the Internet routing infrastructure cannot distinguish the spoofed packets. In this work, we propose a new detection method for DDoS flooding attacks based on the odds ratio (OR) statistical measure. By using the odds ratio to determine the risk factor of any incoming traffic to the server, legitimate and attack traffic can be easily differentiated. Experimental results demonstrate the efficiency of the presented detection method in terms of detection probability and detection time.
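A minimal sketch of how an odds-ratio score can flag a traffic window as risky. The traffic attribute chosen here (packets from new versus previously seen source addresses, compared against an attack-free baseline window) and the threshold are assumptions for illustration, not the paper's actual features or parameters.

```python
# Illustrative odds-ratio (OR) test on traffic counts; assumed 2x2 table:
# rows = current window vs. baseline window, columns = new vs. known sources.

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR for a 2x2 table [[a, b], [c, d]], with a 0.5 continuity correction."""
    a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    return (a * d) / (b * c)

def is_attack(cur_new: int, cur_known: int,
              base_new: int, base_known: int,
              threshold: float = 3.0) -> bool:
    """Flag the current window if the odds of seeing packets from new sources
    are much higher than in the attack-free baseline window."""
    return odds_ratio(cur_new, cur_known, base_new, base_known) > threshold

if __name__ == "__main__":
    # Baseline: 50 packets from new sources, 950 from known sources.
    # Current window: 800 packets from new (likely spoofed) sources, 200 known.
    print(odds_ratio(800, 200, 50, 950))  # about 75, far above the threshold
    print(is_attack(800, 200, 50, 950))   # True
```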


2020 ◽  
Author(s):  
Juliao Braga

This project establishes an environment for knowledge acquisition, learning, use, and inter-agent collaboration over the Internet infrastructure. Four agent types are used in a previously applied four-tier model (A2RD), as in the use case on the Internet Routing Registry. This model, which can be implemented in each Autonomous System domain of the Internet infrastructure, is integrated into an environment that (a) captures information from unstructured databases, (b) creates and updates training sets appropriate to machine learning algorithms, and (c) creates and feeds a knowledge base. These resources become readily available to agents in each domain and to agents in all other domains, with the aim of making them autonomous. The agents collaborate and interact with each other through individual blockchain structures that also handle operational security and integration aspects. In addition, a testbed to validate the entire model, including the functionalities of the agents, is proposed and characterized.
Acknowledgement: This work is supported by CAPES, the Brazilian Federal Agency for Support and Evaluation of Graduate Education within Brazil’s Ministry of Education; by national funds through FCT with reference UID/CEC/50021/2019; and by MackenziPesquisa from Universidade Presbiteriana Mackenzie.
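A minimal sketch of one plausible reading of the "individual blockchain structures": each agent keeps an append-only hash chain of the records it shares, which peer agents can verify for tampering. The class and field names (AgentChain, payload, prev_hash) are hypothetical and not taken from the A2RD implementation.

```python
# Hypothetical per-agent hash chain; not the project's actual data structures.
import hashlib
import json
import time

class AgentChain:
    def __init__(self, agent_id: str):
        self.agent_id = agent_id
        self.blocks = [self._block(0, "0" * 64, {"event": "genesis"})]

    def _block(self, index: int, prev_hash: str, payload: dict) -> dict:
        block = {
            "index": index,
            "agent": self.agent_id,
            "timestamp": time.time(),
            "payload": payload,
            "prev_hash": prev_hash,
        }
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        return block

    def append(self, payload: dict) -> dict:
        prev = self.blocks[-1]
        block = self._block(prev["index"] + 1, prev["hash"], payload)
        self.blocks.append(block)
        return block

    def verify(self) -> bool:
        """Recompute every hash and check the prev_hash links."""
        for i, block in enumerate(self.blocks):
            body = {k: v for k, v in block.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if digest != block["hash"]:
                return False
            if i > 0 and block["prev_hash"] != self.blocks[i - 1]["hash"]:
                return False
        return True

if __name__ == "__main__":
    chain = AgentChain("as64500-agent")
    chain.append({"event": "learned", "source": "IRR", "object": "route: 192.0.2.0/24"})
    print(chain.verify())  # True
```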


2020 ◽  
Author(s):  
Akmal Khan ◽  
Hyun-chul Kim ◽  
Ted "Taekyoung" Kwon
