NIST researchers previously developed a model for predicting the minimum investment needed to achieve optimal cybersecurity for large networks. In essence, the model weighed security measures, such as monitoring and diagnostics, against the probability of a breach, the rate at which an infection spreads, and the rate of recovery. Based on the modeling results, algorithms determined the minimum investment needed for optimal cybersecurity. The model was described in Optimal Cybersecurity Investments in Large Networks Using SIS Model: Algorithm Design.
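To make the tradeoff concrete, here is a minimal sketch of an SIS (susceptible-infected-susceptible) simulation on a small network. This is not the researchers' model or algorithm; the function names, network, and parameter values are illustrative. It only shows the qualitative idea that a higher recovery rate (a proxy for greater security investment) lowers the average fraction of infected nodes.

```python
import random

def simulate_sis(adj, beta, delta, steps=200, seed=0):
    """Simulate SIS spread on a network given as an adjacency list.

    beta:  per-edge infection (spreading) probability per step
    delta: per-node recovery probability per step
    Returns the average fraction of infected nodes over the run.
    """
    rng = random.Random(seed)
    n = len(adj)
    infected = [False] * n
    infected[0] = True  # one initial breach
    total = 0.0
    for _ in range(steps):
        nxt = infected[:]
        for v in range(n):
            if infected[v]:
                if rng.random() < delta:       # node recovers
                    nxt[v] = False
                for u in adj[v]:               # node attempts to infect neighbors
                    if not infected[u] and rng.random() < beta:
                        nxt[u] = True
        infected = nxt
        total += sum(infected) / n
    return total / steps

# Ring network of 10 nodes: a higher recovery rate (more investment in
# detection and remediation) yields a lower average infection level.
ring = [[(i - 1) % 10, (i + 1) % 10] for i in range(10)]
low_investment = simulate_sis(ring, beta=0.4, delta=0.1)
high_investment = simulate_sis(ring, beta=0.4, delta=0.8)
```

In this toy setting, `high_investment` comes out well below `low_investment`, mirroring the qualitative behavior the model captures: investment raises recovery rates, which suppresses the endemic infection level.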
In their paper Optimal Cybersecurity Investments Using SIS Model: Weakly Connected Networks, presented at the IEEE Global Communications Conference 2022, NIST researchers extended this work to a situation common in large networks. The previous model assumed that large networks are composed of strongly connected networks, in which infections can spread easily. Many large networks, however, consist of weakly connected networks, and for these the original model and algorithms could not compute the optimal cybersecurity or the associated minimum investment.
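The strongly- vs weakly-connected distinction is a standard graph property: a directed network is strongly connected if every node can reach every other node along directed edges, and only weakly connected if that holds just when edge directions are ignored. A hedged sketch of the standard two-traversal check (not taken from the paper; names are illustrative):

```python
from collections import deque

def reachable_from(adj, start):
    """Return the set of nodes reachable from start via BFS on a directed graph."""
    seen = {start}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return seen

def is_strongly_connected(adj):
    """True if every node reaches every other node along directed edges."""
    n = len(adj)
    if len(reachable_from(adj, 0)) != n:
        return False
    # Repeat on the reversed graph: node 0 must also be reachable from everyone.
    rev = [[] for _ in range(n)]
    for v in range(n):
        for u in adj[v]:
            rev[u].append(v)
    return len(reachable_from(rev, 0)) == n

strong = [[1], [2], [0]]  # directed 3-cycle: infection can reach any node
weak = [[1], [2], []]     # directed path: connected only if directions are ignored
```

In the strongly connected cycle, an infection starting anywhere can spread everywhere, which is the assumption the earlier model relied on; in the weakly connected path, some nodes can never be reached, which is the case the new paper addresses.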
The researchers therefore focused on external attacks against weakly connected large networks, again with the goal of minimizing cybersecurity costs. They observed that these external attacks occur at random but, once underway, persist and intensify; this property allowed them to compute upper and lower bounds for the problem and thus solve it. In doing so, they extended the model's formulation to assess the cybersecurity needed to counter both external attacks and the secondary attacks they trigger, and adapted their algorithms to predict the minimum investment needed for cybersecurity.