Ruslan Rakhmetov, Security Vision
One of the main challenges in cybersecurity today is the shortage of qualified personnel, which significantly holds back the development of the industry. Self-taught enthusiasts and professionals entering information security (IS) from related fields such as IT often feel the need to strengthen their theoretical grounding. University courses, professional retraining programmes, and training from vendors and training centres offer structured IS knowledge, but in most cases they either require lengthy (often full-time) study or cover only a narrow slice of the profession. Established professionals, students and enthusiasts interested in IS may therefore welcome a reference work with up-to-date coverage of the main modern IS topics. As part of Security Vision's educational initiatives, we are starting a series of publications dedicated to the Cybersecurity Body of Knowledge (CyBOK).
CyBOK is a project to build a body of cybersecurity knowledge spanning 21 areas of IS, developed at the University of Bristol by a team of over 115 experts from around the world, including IS practitioners and academics from various universities. The official project website hosts a PDF version of CyBOK (which will serve as the basis for this series of publications) as well as a variety of additional IS learning materials, such as webinars, podcasts and an interactive Knowledge Tree. CyBOK is notable for aiming to cover as many areas of cybersecurity as possible: the project also provides mappings of the CyBOK body of knowledge to various university curricula and professional certification programmes (such as CISSP, SSCP, CISM and CRISC).
Chapter 1: Introduction
As defined by the CyBOK authors, cyber security is the protection of information systems (software, hardware and related infrastructure), the data in them and the services they provide from unauthorised access and from accidental or deliberate harm or misuse. Information security is the preservation of the confidentiality, integrity and availability of information, as well as its authenticity, accountability, non-repudiation and reliability. The book points out that in many cases the objects of protection are not only information and services but also people and the physical environment (for example, when protecting technological processes in industrial control systems), and that confrontation in cyberspace can affect both the virtual and the physical world.
1.2 Knowledge domains in CyBOK
The CyBOK corpus is divided into 21 Knowledge Areas (KAs), which are grouped into five categories:
- Human, organisational and regulatory aspects:
1) Risk and IS management: IS management systems and organisational safeguards, including standards, best practices and approaches to risk assessment and mitigation.
2) Laws and Regulations: International and national government and regulatory requirements, compliance requirements and IS ethics, including data protection and the development of doctrines for cyber resilience.
3) Human Factors: Social and behavioural factors affecting cyber security, IS culture and awareness, and the impact of protection measures on user behaviour.
4) Privacy and Online Rights: Techniques for protecting personal data, including in communications, applications, and data storage and processing. This area also covers the security of systems supporting online rights, protection of data against interception and tampering, security of electronic voting systems, and privacy in payment and identification systems.
- Attacks and defence techniques:
5) Malware and Attack Technologies: Technical details of malware, exploits and distributed malicious systems, together with the techniques for detecting and analysing them.
6) Adversarial Behaviours: The motivation, behaviour and methods of attackers, including malware supply chains, attack vectors and money transfer schemes.
7) Security Operations and Incident Management: Configuring, operating and maintaining security systems, including solutions for detecting and responding to IS incidents and for collecting and applying cyber threat intelligence.
8) Forensics: The collection, analysis and reporting of digital evidence in support of IS incident investigations or cybercrime prosecutions.
- System Security:
9) Cryptography: Basic cryptographic primitives currently in use and evolving algorithms, techniques for analysing them, and protocols using cryptographic algorithms.
10) Operating Systems and Virtualisation Security: Operating system security mechanisms, the implementation of secure hardware abstractions and resource sharing, including isolation in multi-user systems, virtualisation security and database security.
11) Distributed Systems Security: Security mechanisms related to large-scale distributed systems, including security of consensus mechanisms, event and time-based systems, peer-to-peer systems, cloud infrastructures, multitenant data centres, distributed registries.
12) Formal Methods for Security: Formal specification, modelling and verification of cyber security measures for systems, software and protocols, covering fundamental approaches, techniques and tools.
13) Authentication, Authorisation and Accountability: Aspects of identity management, authentication technologies, architectural approaches and tools to support authorisation and accountability in isolated and distributed systems.
- Software and platform security:
14) Software Security: Known categories of programming errors that lead to vulnerabilities and techniques for avoiding such errors through secure development practices and secure language constructs, as well as tools, techniques and methods for detecting such errors in existing systems.
15) Web and Mobile Application Security: Security issues of web and mobile applications and services across the wide range of devices and frameworks on which they run, including the variety of programming paradigms and security models involved.
16) Secure Software Lifecycle: The application of security practices to software development throughout the systems development lifecycle to create secure software by default.
- Infrastructure Security:
17) Applied Cryptography: Application of cryptographic algorithms, schemes and protocols, including the problems of implementing cryptoalgorithms, managing encryption keys and their use in protocols and systems.
18) Network Security: Security aspects of network and telecommunication protocols, including routing security, network security elements, and certain cryptographic protocols for network security.
19) Hardware Security: Security in the design, implementation and deployment of general and specialised hardware, including trusted computing technologies and random number generators.
20) Security of Cyber-Physical Systems: Security challenges of cyber-physical systems such as the IoT and industrial control systems, attacker models, robust and secure architectures, and the security of large-scale infrastructures.
21) Physical layer and telecommunications security: Security issues and limitations of the physical layer, including aspects of radio frequency coding and transmission techniques, unintended emissions and interference.
1.3 Applying knowledge from CyBOK to solve IS problems
1.3.1 Cybersecurity tools and objectives
Cybersecurity involves defence not only against intruders but also against physical or accidental processes, which border on the concepts of reliability and physical safety. IS is based on modelling the actions of intruders: their motives, their capabilities and the threats they can realise. To counter threats, defence measures are implemented that affect people, processes and technologies. Defence measures detect and respond to threats and can prevent the negative consequences of their realisation. They are selected as part of the risk management process, in which human factors and a culture of cybersecurity play an important role. It is also important to analyse the vulnerabilities of the protected information systems: a hypothetical system without vulnerabilities would be resistant to all types of threats, while even a vulnerable system remains safe in the absence of threats. When implementing defence measures, questions inevitably arise about the correctness and effectiveness of their implementation; such checks are carried out as part of IS audits, which include residual risk analysis and vulnerability assessment.
1.3.2 Failures and incidents
If an attacker successfully executes an attack, the defences can be said to have failed, or to have been inadequate or ineffective. One or more defence failures can lead to a cyber incident, which can be assessed in terms of damage to information, devices, systems and networks, and, in the case of attacks on cyber-physical systems, physical damage. Cybersecurity relies heavily on models of the systems involved, and defence measures are developed on the basis of those models and their expected behaviour, so abstractions play a significant role in IS. Many attacks exploit exactly this: attackers compromise a lower level of the system than the one at which the defence measures operate, for example by injecting malware into firmware while the defences are implemented at higher levels of abstraction (the system or application level).
1.3.3 Risks
The principles of IS risk assessment and management are used to balance the resources available for implementing defence measures against the consequences of threat realisation. Risk assessment calculates the level of risk for each threat, which depends on the probability of events leading to damage and on the expected amount of damage. The probability of such events depends in turn on the presence of vulnerabilities (known or unknown at the time of assessment) and on the nature of the threat. Responding to risk may include implementing additional defence measures to reduce the damage or the likelihood of threat realisation, accepting the risk, transferring the risk (e.g. through insurance), or avoiding the risk (e.g. by refraining from certain actions and initiatives) because of unacceptable levels of potential damage. Risk management procedures should also take into account metrics for assessing the effectiveness of IS processes.
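The relationship between likelihood, expected damage and risk level described above can be sketched in a few lines of code. The threat names and numbers below are purely illustrative examples, not figures from CyBOK:

```python
# Illustrative risk scoring: risk level = likelihood x expected damage.
# All threat names and numbers are hypothetical examples.

def risk_level(likelihood: float, expected_damage: float) -> float:
    """Simple quantitative risk estimate: probability of a damaging
    event (0..1) multiplied by the expected damage if it occurs."""
    return likelihood * expected_damage

threats = {
    "phishing":       {"likelihood": 0.30, "damage": 50_000},
    "ransomware":     {"likelihood": 0.05, "damage": 400_000},
    "insider misuse": {"likelihood": 0.10, "damage": 120_000},
}

# Rank threats so that the mitigation budget goes to the highest risk first.
ranked = sorted(
    threats.items(),
    key=lambda kv: risk_level(kv[1]["likelihood"], kv[1]["damage"]),
    reverse=True,
)
for name, t in ranked:
    print(f"{name}: {risk_level(t['likelihood'], t['damage']):.0f}")
```

Ranking threats by such an estimate is one simple way to decide where additional defence measures, risk transfer or risk avoidance pay off first; real methodologies add qualitative factors and uncertainty ranges on top of this arithmetic.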
1.4 Principles of IS
1.4.1 The Saltzer and Schroeder principles
In 1975, while developing a secure multi-user OS for protecting confidential information in government and military organisations, Jerome Saltzer and Michael Schroeder formulated eight principles of secure system architecture and of implementing protection mechanisms that are still relevant today:
- Economy of mechanism: the architecture of the security system should be as simple as possible.
- Fail-safe defaults: access should be denied by default, and defence mechanisms should define a limited set of conditions under which access is allowed.
- Complete mediation: every attempt to access an object must pass through, and be verified by, the defence mechanisms.
- Open design: the security architecture should not depend on the secrecy of the design (which is difficult to achieve and would complicate auditing the system), but on secret keys and passwords (which are easier to protect).
- Separation of privilege: to access sensitive objects, the defence mechanism should require confirmation from at least two independent entities (making it impossible for one person alone to perform critical actions).
- Least privilege: each subject should operate in the system with only the minimum set of privileges necessary to perform its tasks.
- Least common mechanism: the number of defence mechanisms shared between subjects should be minimised, reducing the number of likely points of failure and of potential channels for uncontrolled information flow between subjects.
- Psychological acceptability: the user interface of the defence mechanisms should be understandable and convenient for the subjects of access (which reduces human error and simplifies working with the system).
In addition, Saltzer and Schroeder proposed two further principles concerning protective measures:
- Work factor: bypassing a robust security measure should require more resources than the attacker can afford;
- Compromise recording: reliable logging of events must be implemented so that attacks can be detected.
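Several of these principles, namely fail-safe defaults, complete mediation and least privilege, can be illustrated with a minimal sketch of a reference-monitor-style access check. The subjects, objects and actions below are hypothetical:

```python
# Sketch of fail-safe defaults and complete mediation:
# every access request goes through a single check, and anything
# not explicitly allowed is denied. All names are hypothetical.

ALLOWED = {
    ("alice", "payroll.db", "read"),
    ("alice", "payroll.db", "write"),
    ("bob",   "payroll.db", "read"),
}

def check_access(subject: str, obj: str, action: str) -> bool:
    """Reference-monitor style check: deny unless explicitly allowed."""
    return (subject, obj, action) in ALLOWED

print(check_access("alice", "payroll.db", "write"))  # explicitly allowed
print(check_access("bob", "payroll.db", "write"))    # denied by default
```

Because the allow-list enumerates only the permissions each subject needs (least privilege) and every request passes through `check_access` (complete mediation), an unlisted request such as Bob writing to the payroll database fails safely rather than succeeding by omission.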
1.4.2 NIST Principles
Thirty principles for building trustworthy secure systems are set out in Appendix E of NIST Special Publication SP 800-160 Vol. 1 Rev. 1, Engineering Trustworthy Secure Systems:
- Anomaly Detection: Any noticeable anomaly in the system or its environment is detected in time for an effective response.
- Clear Abstractions: The abstractions used to describe the system are simple, well-defined, precise, necessary and sufficient.
- Commensurate Protection: The strength and type of protection afforded to a system element is commensurate with the most significant malicious effect that results from a failure of that element.
- Commensurate Response: The system architecture enables response actions whose strength and timeliness are commensurate with what is needed to control the consequences of each loss scenario.
- Commensurate Rigour: The rigour of the system design provides the assurance needed to address the most significant potential adverse effect.
- Commensurate Trustworthiness: The reliability of a system element is at a level commensurate with the most significant adverse effects resulting from a failure of that element.
- Compositional Trustworthiness: The reliability of the architecture of the entire system is at a level corresponding to every possible aggregate composition of the interacting elements of the system.
- Continuous Protection: The protection of each element of the system must be effective and uninterrupted throughout the time it is required.
- Defence in Depth: Losses are prevented or minimised through the use of multiple interrelated defence mechanisms.
- Distributed Privilege: Multiple authorised entities must perform coordinated actions before a critical operation on the system is allowed.
- Diversity (Dynamicity): The system architecture provides the required capabilities through a variety of structural and behavioural elements, data flows and control commands.
- Domain Separation: Domains with different protection needs should be separated physically or logically.
- Hierarchical Protection: A system element does not need to be protected from elements with a higher level of trust.
- Least Functionality: Each element of the system has the ability to perform the functions it requires, but no more than that.
- Least Persistence: System elements and other resources are available and capable of performing their intended functionality only for the time they are needed.
- Least Privilege: Each element of the system is allocated the privileges necessary for it to perform certain functions, but no more.
- Least Sharing: System resources are allocated to system elements only when necessary and to as few elements as possible.
- Loss Margins: The system is designed to operate in a state sufficiently far from the threshold at which losses may occur.
- Mediated Access: Any access and all operations on system elements are mediated.
- Minimal Trusted Elements: The system has as few trusted system elements as is practical.
- Minimise Detectability: The system architecture minimises the possibility of detecting the system as far as practicable.
- Protective Defaults: The default configuration of the system provides maximum protection effectiveness.
- Protective Failure: Failure of a system element does not lead to unacceptable loss or enable another loss scenario.
- Protective Recovery: Recovery of a system element does not lead to unacceptable losses.
- Reduced Complexity: The system architecture is as simple as practicable.
- Redundancy: The system architecture provides the required capabilities by replicating system functions or elements.
- Self-Reliant Trustworthiness: The reliability of a system element is achieved with minimal dependence on other elements.
- Structured Decomposition and Composition: The complexity of a system can be managed by structured decomposition of the system and structured composition of the building blocks to provide the required capabilities.
- Substantiated Trustworthiness: Assessments of system reliability are based on evidence that reliability criteria have been met.
- Trustworthy System Control: The architecture of the system control functions corresponds to the properties of a generalised reference monitor.
1.4.3 Hidden assumptions in system design
Hidden design assumptions begin to matter when a system is used in ways its creators did not intend. For example, the architecture of many cyber-physical systems was never designed for connection to untrusted networks such as the Internet, yet such connections are increasingly common, which is particularly dangerous for legacy devices. For such devices, security by design is not always achievable, and their compromise can lead to violations of process safety.
1.4.4 The Precautionary Principle
This principle means that the consequences of a system's architecture should be carefully assessed before large-scale deployment, especially for services used by large populations of users. For example, a system designed to perform a small number of tasks may become so globally pervasive that even the smallest flaws in the original design start to play a major role. This is why information system architects must consider the security and privacy implications for a wide range of users during the conceptualisation, modelling, implementation, maintenance, evolution and decommissioning phases of systems.
1.5 Cross-cutting themes
1.5.1 The economics of cyber security
Economic feasibility should be taken into account when addressing IS challenges: we need to understand the economic principles that guide companies in selecting and implementing defence measures. Attackers, too, act according to economic feasibility when carrying out cyberattacks, for example by weighing the cost of finding vulnerabilities and developing exploits against the expected proceeds of an attack, and by investing in protecting their malicious infrastructure and the anonymity of their identities.
1.5.2 Security Architecture and Lifecycle
When developing an architecture, it is important to design the interactions between users, data and services so that risky interactions are protected by simple and self-contained IS mechanisms. To do this, the intended usage model of the future system should first be evaluated: the description of the business process the system will support should characterise the intended interactions between users, data and services. Potentially unsafe interactions between these entities should be identified during risk assessment, which should also take into account applicable legal requirements and contractual obligations. If potentially unsafe interactions are identified, the business process itself may need to be reviewed and changed. Next, data and users should be grouped into confidentiality and access levels, respectively, access control mechanisms should be established for the resulting user and data groups, and the selection of security measures should be guided by standards, guidelines and best practices from industry leaders.
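The grouping of users and data into access levels described above can be sketched as a simple level comparison. The roles, resources and level numbers here are hypothetical examples rather than a prescribed scheme:

```python
# Illustrative grouping of users into clearance levels and data into
# confidentiality levels. All labels and numbers are hypothetical.

USER_CLEARANCE = {"intern": 1, "analyst": 2, "ciso": 3}
DATA_LEVEL = {"public-report": 1, "incident-log": 2, "risk-register": 3}

def can_read(user: str, resource: str) -> bool:
    """Allow read access only when the user's clearance level meets
    or exceeds the confidentiality level of the data."""
    # Unknown users get clearance 0 and unknown resources an infinite
    # level, so anything not explicitly classified is denied.
    return USER_CLEARANCE.get(user, 0) >= DATA_LEVEL.get(resource, float("inf"))

print(can_read("ciso", "risk-register"))    # sufficient clearance
print(can_read("intern", "incident-log"))   # insufficient clearance
```

The fail-safe defaults for unknown users and resources keep the scheme consistent with the risk-driven approach above: a resource or user that has not yet been classified during risk assessment is simply inaccessible until it is.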