Large networks are often composed of interdependent subsystems, such as information, communication, and power systems, and this interdependence can make them vulnerable to cyberattacks. A virus or malware infection in one subsystem can spread internally, attacking other subsystems and potentially degrading the network as a whole. Researchers have noted that the problem resembles the spread of disease through social networks.
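This dynamic is captured by the susceptible-infected-susceptible (SIS) epidemic model named in the paper's title: a node can be infected by its neighbors, recover, and then become susceptible again. Below is a minimal simulation sketch of that dynamic; the network, the parameter values, and the names beta and delta follow standard SIS conventions and are illustrative, not details taken from the paper.

```python
# Minimal sketch of SIS-style spread on a network (illustrative only):
# each node is Susceptible or Infected; infected nodes infect neighbors
# with probability beta per step and recover with probability delta.
import random

def simulate_sis(adj, beta=0.3, delta=0.2, steps=200, seed=0):
    """Discrete-time SIS simulation on an adjacency list `adj`.

    Returns the fraction of infected nodes at each step.
    """
    rng = random.Random(seed)
    n = len(adj)
    infected = [False] * n
    infected[0] = True  # seed one infection
    history = []
    for _ in range(steps):
        nxt = infected[:]
        for i in range(n):
            if infected[i]:
                # an infected node may recover this step...
                if rng.random() < delta:
                    nxt[i] = False
                # ...and may pass the infection to susceptible neighbors
                for j in adj[i]:
                    if not infected[j] and rng.random() < beta:
                        nxt[j] = True
        infected = nxt
        history.append(sum(infected) / n)
    return history

# Example: a small ring of interdependent subsystems
ring = [[(i - 1) % 10, (i + 1) % 10] for i in range(10)]
print(simulate_sis(ring)[-5:])  # long-run infected fraction
```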
NIST and university researchers have developed a model for minimizing the cybersecurity costs of large networks, described in "Optimal Cybersecurity Investments in Large Networks Using SIS Model: Algorithm Design," published in IEEE/ACM Transactions on Networking. The researchers sought a way to determine the optimum investments needed to minimize the costs of securing these networks, recovering from infections, and repairing their damage.
Unlike previous studies, which focused on optimum vaccination strategies for quickly reducing infection rates during epidemics, the researchers used a time-averaged aggregate security cost, based on an analysis of a network's long-term behavior, as the key performance metric for determining security investments. They then developed a model that weighed investments in security measures, such as monitoring and diagnostics, against the costs incurred when infections do occur, including recovery and repair; a sketch of this kind of objective appears below.
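In broad strokes, an objective of this kind pairs a per-node investment cost with the long-run infection cost that the investment influences. The notation below is illustrative and not taken from the paper: node i receives investment x_i at cost h_i(x_i), and p_i(x) denotes the node's long-run SIS infection probability under the investment profile x, weighted by a per-node damage cost c_i.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Hedged sketch of a time-averaged aggregate security cost objective
\[
  \min_{x \ge 0} \;
  \underbrace{\sum_{i=1}^{n} h_i(x_i)}_{\text{investment cost}}
  \; + \;
  \underbrace{\sum_{i=1}^{n} c_i \, p_i(x)}_{\text{time-averaged infection cost}}
\]
\end{document}
```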
Based on the modeling results, the researchers developed a set of efficient algorithms for determining the optimum investments that minimize security costs under given conditions.
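As an illustration only (the paper's algorithms are designed for large networks and are not reproduced here), the sketch below minimizes a toy version of this trade-off. It assumes the textbook mean-field SIS steady state on a k-regular network, where the long-run infected fraction is roughly max(0, 1 - delta / (beta * k)), further assumes that investment x suppresses the infection rate as beta0 * exp(-x), and then grid-searches a single network-wide investment level.

```python
# Toy cost-minimization sketch (assumption-laden; NOT the paper's algorithm).
import math

def aggregate_cost(x, n=100, k=4, beta0=0.3, delta=0.2,
                   invest_price=1.0, infection_price=5.0):
    beta = beta0 * math.exp(-x)                         # assumed effect of investment
    infected_frac = max(0.0, 1.0 - delta / (beta * k))  # mean-field SIS steady state
    return invest_price * x + infection_price * n * infected_frac

# Simple grid search over a uniform, network-wide investment level.
best_x = min((x * 0.01 for x in range(0, 1001)), key=aggregate_cost)
print(f"investment {best_x:.2f}, cost {aggregate_cost(best_x):.2f}")
```

A grid search suffices for a single scalar variable; optimizing per-node investments over a realistic topology is what makes efficient algorithms like those in the paper necessary.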