Efficient Algorithm for Providing Live Vulnerability Assessment in Corporate Network Environment

2020 ◽  
Vol 10 (21) ◽  
pp. 7926
Author(s):  
Michał Walkowski ◽  
Maciej Krakowiak ◽  
Jacek Oko ◽  
Sławomir Sujecki

The time gap between the public announcement of a vulnerability and its detection and reporting to stakeholders is an important factor for the cybersecurity of corporate networks. A large delay preceding the elimination of a critical vulnerability presents a significant risk to network security and increases the probability of sustained damage. Thus, accelerating the process of vulnerability identification and prioritization helps to reduce the probability of a successful cyberattack. This work introduces a flexible system that collects information about all known vulnerabilities present in the system, gathers data from the organizational inventory database, and finally integrates and processes all collected information. Thanks to the application of parallel processing and non-relational databases, the results of this process are available with negligible delay. The subsequent vulnerability prioritization is performed automatically on the basis of the calculated CVSS 2.0 and 3.1 scores for all scanned assets. The environmental CVSS vector component is evaluated accurately because the environmental data is imported directly from the organizational inventory database.
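As an illustration of this prioritization step, the sketch below orders findings by their CVSS base score weighted by a per-asset criticality factor taken from an inventory lookup. The field names and the weighting rule are assumptions for illustration only, not the paper's actual schema or scoring formula.

```python
# Minimal prioritization sketch; all names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class Finding:
    asset: str
    cve: str
    base_score: float  # CVSS base score, 0.0-10.0

# Hypothetical inventory: per-asset criticality multipliers standing in for
# the environmental CVSS component imported from the inventory database.
INVENTORY_CRITICALITY = {"db-server-01": 1.5, "workstation-17": 0.8}

def prioritize(findings):
    """Order findings by base score weighted by asset criticality."""
    def priority(f):
        weight = INVENTORY_CRITICALITY.get(f.asset, 1.0)
        return min(10.0, f.base_score * weight)  # cap at the CVSS maximum
    return sorted(findings, key=priority, reverse=True)

findings = [
    Finding("workstation-17", "CVE-2020-0601", 8.1),
    Finding("db-server-01", "CVE-2019-0708", 9.8),
]
for f in prioritize(findings):
    print(f.asset, f.cve, f.base_score)
```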

2014 ◽  
Vol 25 (4) ◽  
pp. 38-65
Author(s):  
Yongkwon Kim ◽  
Heejung Yang ◽  
Chin-Wan Chung

Modeling and simulation (M&S) are widely used for the design, analysis, and optimization of complex systems and natural phenomena in areas such as the defense industry and weather systems. In many cases, the environment is a key part of complex systems and natural phenomena: it includes the physical aspects of the real world that provide the context for a specific simulation. Increasingly, several simulation systems are integrated to work together when they need to exchange information. Interoperability of heterogeneous simulations depends heavily on sharing complex environmental data in a consistent and complete manner. SEDRIS (Synthetic Environmental Data Representation and Interchange Specification) is an ISO standard for the representation and interchange of environmental data and is widely adopted in the M&S area. As the size of a simulation increases, the volume of environmental data that must be exchanged between simulations grows accordingly, so efficient management of the environmental data is very important. In this paper, the authors propose storage and retrieval methods for SEDRIS transmittals using a relational database system, so that data can be retrieved efficiently by an environmental data server cooperating with many heterogeneous distributed simulations. By analyzing the structure and content of SEDRIS transmittals, relational database schemas are designed. To reduce query processing time, direct storage and retrieval methods that do not require type conversion of SEDRIS transmittals are proposed. Experimental analyses show the efficiency of the proposed approach: the results confirm that it greatly reduces storing time and retrieval time compared to the baseline approaches.
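A minimal sketch of the direct-storage idea, assuming a generic object/field layout rather than the actual SEDRIS DRM schema: objects are stored one row each, with their fields kept as raw values so that retrieval needs no type conversion.

```python
# Illustrative relational layout for DRM-style environment objects.
# Table and column names are assumptions, not the actual SEDRIS schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE drm_object (
    id         INTEGER PRIMARY KEY,
    class_name TEXT NOT NULL            -- DRM class, e.g. 'Polygon'
);
CREATE TABLE drm_field (
    object_id  INTEGER REFERENCES drm_object(id),
    field_name TEXT NOT NULL,
    value      BLOB                     -- stored as-is: no type conversion
);
CREATE INDEX idx_field ON drm_field(object_id, field_name);
""")

conn.execute("INSERT INTO drm_object VALUES (1, 'Polygon')")
conn.execute("INSERT INTO drm_field VALUES (1, 'vertex_count', 4)")

# Retrieve all fields of one object without converting representations.
for row in conn.execute(
        "SELECT field_name, value FROM drm_field WHERE object_id = ?", (1,)):
    print(row)
```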


2015 ◽  
Vol 21 (4) ◽  
pp. 648-651
Author(s):  
Lukas Tanutama ◽  
Gerrard Polla ◽  
Raymond Kosala ◽  
Richard Kumaradjaja

The competitive nature of the Internet access service business drives Service Providers to find innovative revenue generators within their core competencies. Internet connectivity is essential infrastructure in the current business environment: Service Providers supply Internet connections to corporate networks and process network data to enable Internet business communications and transactions. Mining the network data of a particular corporate network yields its business traffic profile or characteristics. Based on the discovered characteristics, this research proposes novel generic Value Added Services (VAS) that can act as innovative and competitive revenue generators. The VAS are competitive because only the Service Provider and its customer know the traffic profile, and this knowledge becomes a barrier to entry for competitors. To offer the VAS, a Service Provider must build a close relationship with its customer to gain acceptance.
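As a sketch of how a business traffic profile might be derived from network data, the snippet below aggregates flow records by destination port; the flow records and the port-to-service mapping are invented for illustration, not the paper's dataset.

```python
# Hypothetical flow records: (destination_port, bytes). In practice these
# would come from the Service Provider's flow or packet captures.
from collections import Counter

flows = [(443, 120_000), (443, 80_000), (25, 15_000), (3306, 60_000)]
services = {443: "HTTPS", 25: "SMTP", 3306: "MySQL"}  # assumed mapping

profile = Counter()
for port, nbytes in flows:
    profile[services.get(port, f"port {port}")] += nbytes

# The resulting distribution is a simple business traffic profile.
total = sum(profile.values())
for service, nbytes in profile.most_common():
    print(f"{service}: {100 * nbytes / total:.1f}% of traffic")
```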


Author(s):  
R. V. Kyrychok ◽  
G. V. Shuklin

The article considers the problem of determining and assessing the quality of the vulnerability validation mechanism for information systems and networks. Based on a practical analysis of the vulnerability validation process, and on the analytical dependencies of the basic characteristics of validation quality obtained using Bernstein polynomials, additional key indicators were identified and characterised that make it possible to assert with high reliability whether the vulnerability validation of the target corporate network is progressing positively. The intervals of these indicators within which the validation mechanism is of high quality were determined experimentally. In addition, a single integral indicator was derived to quantitatively assess the quality of the vulnerability validation mechanism for corporate networks, and an experimental study was carried out assessing the quality of the automatic vulnerability validation mechanism of the db_autopwn plugin, which automates the Metasploit framework vulnerability exploitation tool. As a result, a methodology was proposed for analysing the quality of the vulnerability validation mechanism in corporate networks; it quantifies the quality of the validation mechanism under study and thereby allows real-time monitoring and control of the validation progress for the identified vulnerabilities. The study also obtained the dependences of the previously determined key performance indicators of the vulnerability validation mechanism on the rational cycle time, which makes it possible to build membership functions for fuzzy sets. The construction of these sets, in particular, allows decisions to be made with minimal risk during an active analysis of the security of corporate networks.
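The abstract does not reproduce the analytical dependencies themselves, but the Bernstein construction it relies on is standard. A minimal sketch, assuming the quality characteristic is sampled at equidistant points of a normalized validation cycle (the sample values are invented):

```python
# Bernstein polynomial approximation of an empirical quality curve.
from math import comb

def bernstein_approx(samples, t):
    """Evaluate the Bernstein polynomial of f at t in [0, 1],
    where samples[k] ~ f(k/n) for k = 0..n."""
    n = len(samples) - 1
    return sum(samples[k] * comb(n, k) * t**k * (1 - t)**(n - k)
               for k in range(n + 1))

# Hypothetical share of successfully validated vulnerabilities over a cycle.
quality = [0.0, 0.35, 0.6, 0.8, 0.9]
print(round(bernstein_approx(quality, 0.5), 3))
```

Evaluating such an approximation at successive points of the cycle gives a smooth quality curve from which indicator intervals can be read off.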


2019 ◽  
pp. 497-513
Author(s):  
Ivan D. Burke ◽  
Renier P. van Heerden

Data breaches are becoming more common and more numerous every day, with huge amounts of data (corporate and personal) leaked more frequently than ever. Corporate responses to data breaches are often insufficient, and remediation is commonly minimal. This research proposes that an approach similar to that used for physical (environmental) pollution can be used to map and identify data leaks as cyber pollution, so that IT institutions become aware of their contribution to cyber pollution in a more measurable way. This article defines cyber pollution as security-vulnerable (for example, unmaintained or obsolete) devices that are visible through the Internet and corporate networks. The paper analyses the recent state of data breach disclosures worldwide by providing statistics on significant data breach disclosures from January 2014 to December 2016. The authors model security threat levels similarly to pollution within the physical environment: insignificant security openings or vulnerabilities can lead to massive exploitation of entire systems. By modelling these breaches as pollution, the aim is to establish cyber pollution as a tangible concept that IT managers can relay to staff and senior management. The model is validated using anonymised corporate network traffic and open-source penetration testing software.
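A toy illustration of how a cyber pollution level could be quantified from a device inventory; the inventory, threshold, and scoring rule below are invented, not the authors' model.

```python
# Toy cyber-pollution metric: share of Internet-visible devices that are
# unmaintained or obsolete. All data here is illustrative only.
devices = [
    {"host": "mail-gw", "visible": True,  "last_patched_days": 900},
    {"host": "web-01",  "visible": True,  "last_patched_days": 12},
    {"host": "hr-app",  "visible": False, "last_patched_days": 400},
]

OBSOLETE_AFTER_DAYS = 365  # assumed threshold for "unmaintained"

visible = [d for d in devices if d["visible"]]
polluting = [d for d in visible
             if d["last_patched_days"] > OBSOLETE_AFTER_DAYS]
print(f"cyber pollution level: {len(polluting)}/{len(visible)} "
      f"= {100 * len(polluting) / len(visible):.0f}% of visible devices")
```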


Author(s):  
Andy Luse

This chapter describes various firewall conventions and how these technologies operate when deployed on a corporate network. Terms associated with firewalls, as well as related concepts, are also discussed. Frequently neglected internal security mechanisms utilizing firewall technologies are presented, including host-based firewalls and the more novel distributed firewall implementation. Finally, a section on how to perform a cost-benefit analysis when deciding which firewall technologies to implement is included. The chapter is designed as an introductory tutorial to the underlying concepts of firewall technologies. This understanding should provide a starting point both for systems support specialists implementing network security and for researchers interested in firewall technologies.
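As a sketch of the kind of cost-benefit analysis the chapter discusses, the snippet below uses the standard annualized loss expectancy (ALE) formulation; the monetary figures and occurrence rates are invented, and the chapter's own procedure may differ.

```python
# Standard ALE-based cost-benefit check for a security control.
# All figures below are invented for illustration.
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Annualized loss expectancy = SLE x ARO."""
    return single_loss_expectancy * annual_rate_of_occurrence

ale_without = ale(50_000, 0.30)  # expected yearly loss with no firewall
ale_with = ale(50_000, 0.05)     # residual loss with the firewall in place
firewall_annual_cost = 8_000     # licensing, hardware, administration

net_benefit = ale_without - ale_with - firewall_annual_cost
print(f"net annual benefit: {net_benefit:+,.0f}")  # positive => worth deploying
```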


2021 ◽  
Vol 31 (1) ◽  
Author(s):  
Izabella Stach ◽  
Jacek Mercik

This paper discusses game-theoretical methods for measuring indirect control in complex corporate shareholding networks. The methods use power indices to estimate direct and indirect control in shareholding structures. Some of these methods only estimate the control power of investors (firms without shareholdings), and only a few measure the control power of all firms involved in shareholding networks (that is, investors and stock companies). None of them consider measuring the importance of mutual connections (edges in the networks); we therefore focus in particular on extending these methods to measure both the control power of the firms involved in complex shareholding structures (represented by nodes in networks) and the importance (power) of the linkages between the firms as elements of a whole corporate shareholding network. More precisely, we apply our approaches to a theoretical example of a corporate network. Moreover, we continue the considerations started in Mercik and Stach (Transactions on Computational Collective Intelligence XXXI, LNCS 11290: 64–79, 2018) about reasonable properties for indirect control measurement, and propose some new properties. The paper also provides a brief review of the literature concerning the topic.
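As a minimal illustration of the power-index machinery such methods build on, the sketch below computes the normalized Banzhaf index for a single stock company treated as a weighted majority game; the shareholding weights are invented, and the paper's methods for indirect control go well beyond this direct-control case.

```python
# Normalized Banzhaf index for a weighted majority game.
from itertools import combinations

def banzhaf(weights, quota):
    """Return each player's normalized Banzhaf index."""
    players = list(weights)
    swings = {p: 0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for r in range(len(others) + 1):
            for coalition in combinations(others, r):
                total = sum(weights[q] for q in coalition)
                # p is critical if the coalition wins only with p's shares
                if total < quota <= total + weights[p]:
                    swings[p] += 1
    total_swings = sum(swings.values())
    return {p: swings[p] / total_swings for p in players}

# Three shareholders of one stock company; simple majority quota of 51%.
print(banzhaf({"A": 40, "B": 35, "C": 25}, quota=51))
```

With these weights, each shareholder turns out to be critical in exactly two coalitions, so all three receive index 1/3 despite unequal holdings, which is precisely the kind of effect power indices are meant to capture.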


2019 ◽  
Vol 97 (Supplement_1) ◽  
pp. 13-13
Author(s):  
William M Sims ◽  
Lawton Stewart ◽  
Jacob R Segers ◽  
Robert W McKee ◽  
Macc Rigdon ◽  
...  

Abstract Heat stress in finishing cattle presents a significant risk to efficiency and economic viability. The project objective was to quantify the effects of long-term heat stress when finishing cattle during the summer in the southeastern United States. Forty-five Angus crossbred steers (446±23 kg) were blocked by weight and randomly assigned to environmental finishing treatments: covered with fan (CWF), covered without fan (CNF), or outside without shade (OUT). For 92 d, steers were individually fed a typical feedlot ration. Environmental data were recorded continuously, including black globe temperature (BG), heat load index (HLI), and accumulated heat load units (AHLU). Feed intake was recorded daily, and steers were weighed every 20–25 days. When the first treatment averaged 613 kg, all steers were slaughtered and carcass data were collected. Data were analyzed with a mixed model (JMP V13; SAS Inst.), and means were separated using least squares means. Average maximal BG was lower for covered finishing than OUT (P < 0.01); however, for HLI, CWF < CNF < OUT. Gains followed the reverse ordering (CWF > CNF > OUT), while G:F was similar (P = 0.22) between CWF and CNF, which were greater (P < 0.01) than OUT. Hot carcass weights were heavier for CWF than OUT (P < 0.01), and CNF was similar to both (P ≥ 0.11). There was no difference for USDA Yield Grade (2.6; P = 0.44) or marbling score (Modest20; P = 0.76). Steers finished under cover were more efficient than steers finished in open dry-lots. The addition of cooling fans further improved steer gains over those that were covered without fans.

