Cyber Engineering Community Site

  • 1.  Cyber Engineering Principles

    Posted 07-02-2023 02:12 PM
    Posted 07-02-2023 02:12 PM
    The Cyber Engineering Working Group has developed a set of Principles for Cyber Engineering, which are intended to cover all engineers involved in cyber across all disciplines. The principles are set out below. Members' views on the principles are welcomed.

    CYBER ENGINEERING PRINCIPLES

    1. Take a whole-of-system, whole-of-lifecycle approach to secure design, operation, maintenance and disposal. The whole system includes all technology, people and processes. Technology includes all hardware, firmware and software and interconnected systems; people includes human interactions, human behaviour, and the skills to operate and maintain the systems; and processes includes built-in controls and the processes required to operate and maintain the system securely.

    2. Implement an engineering management system linked to the enterprise governance framework that provides for clear accountability, authority and responsibility for secure design, operation, maintenance, and disposal of the system.

    3. Undertake continuous threat and vulnerability modelling to understand emerging security risks and to inform and build in physical and logical security measures that can readily evolve to incorporate new technologies, changes to threats, and user needs.

    4. Tailor engineering processes and standards to system complexity and security needs.

    5. Apply the engineering principles applicable to the context in which the system is to be designed, operated, maintained and disposed of.

    6. Undertake analysis of both functional and non-functional requirements (e.g. performance, confidentiality, integrity, availability, safety, human factors) to understand priorities and trade-offs of stakeholder needs and document decision trade-offs. Safety and security are addressed as co-ordinated views when determining trade-offs. Stakeholder needs include system intended capability outcomes, economic and organisational needs, customer requirements, and regulatory requirements.

    7. Design and refine the system architecture to reduce the attack surface presented to evolving threats, cognisant of strategic performance, support, and security risks.

    8. Integrate all system elements in a logical sequence and test them to confirm that the system operates as predicted.

    9. Plan and implement verification and validation processes that provide objective evidence of system performance, identify residual security risks, and determine actions required to mitigate or accept residual risks.

    10. Establish processes to rigorously manage all interfaces (physical, logical, and human), including interfaces to external systems or networks where the pedigree of the connected systems cannot be accurately determined or actively managed.

    11. Assume that there will be successful attacks on the system and design in mechanisms to aid recovery from such attacks. Changes in sophistication, diversity and vectors of attacks must be expected. Plan for and regularly exercise system recovery processes as part of the operations, support, and maintenance regimes.

    12. Anticipate and expect human error, whether intentional or unintentional, that might create vulnerabilities.

    13. Establish mechanisms to detect and eradicate counterfeit parts, components, firmware and software in the supply chain that may contain malicious or poor-quality code, e.g. through functional and physical configuration audits (a minimal illustrative sketch of one such check follows this list).

    14. Design in mechanisms to detect and report unauthorised use and unusual or unpredicted system behaviour (see the second sketch after this list).

    15. Define processes for introducing software and patches into the system to reduce the risk of unintended security impacts or impacts on system performance.

    16. Implement security measures through a layered, multi-factor approach.

    17. Design and implement maintenance and support regimes that ensure secure configuration control, approval of changes to system requirements, system performance monitoring, obsolescence management, system updates and regular patching. Plan for the management of system components (hardware, firmware, and software) that are likely to, or may, become obsolete during the planned life of the system.
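
    As a minimal, illustrative sketch only (in Python) of one mechanism that could support principle 13, the example below compares delivered software or firmware artifacts against a known-good hash manifest as one input to a functional/physical configuration audit. The manifest file name, its JSON layout, and the artifact names are assumptions made for the example, not part of the principles.

        # Minimal sketch (assumed example): verify delivered artifacts against a
        # known-good SHA-256 manifest as one input to a configuration audit.
        # The manifest layout {"firmware.bin": "<hex digest>", ...} is illustrative.
        import hashlib
        import json
        from pathlib import Path

        def sha256_of(path: Path) -> str:
            """Return the SHA-256 digest of a file, read in chunks."""
            digest = hashlib.sha256()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        def audit_artifacts(manifest_path: Path) -> list:
            """Compare each listed artifact to its expected digest; return mismatches."""
            manifest = json.loads(manifest_path.read_text())
            mismatches = []
            for name, expected in manifest.items():
                artifact = manifest_path.parent / name
                if not artifact.exists() or sha256_of(artifact) != expected:
                    mismatches.append(name)
            return mismatches

        if __name__ == "__main__":
            bad = audit_artifacts(Path("known_good_manifest.json"))  # hypothetical file name
            if bad:
                print("Audit FAILED for:", ", ".join(bad))
            else:
                print("All artifacts match the known-good manifest.")

    Such a check does not by itself prove provenance; it simply flags artifacts that differ from the configuration baseline so they can be investigated.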
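
    Similarly, for principle 14, the sketch below (also illustrative only) shows one simple way unusual behaviour, here a burst of failed logins, could be detected and reported. The threshold, time window, and reporting channel are placeholders, not a prescribed design.

        # Minimal sketch (assumed example): flag a burst of failed logins inside a
        # rolling time window and report it. All numbers and the report channel
        # are placeholders for illustration.
        from collections import deque
        from datetime import datetime, timedelta

        class FailedLoginMonitor:
            def __init__(self, threshold=5, window=timedelta(minutes=10)):
                self.threshold = threshold   # failures tolerated inside the window
                self.window = window         # rolling monitoring window
                self.events = deque()        # timestamps of recent failures

            def record_failure(self, when):
                """Record a failed login; return True if the rate looks anomalous."""
                self.events.append(when)
                # Drop events that have fallen outside the monitoring window.
                while self.events and when - self.events[0] > self.window:
                    self.events.popleft()
                return len(self.events) >= self.threshold

        def report(message):
            # Placeholder for whatever reporting channel the system design specifies.
            print(f"ALERT: {message}")

        if __name__ == "__main__":
            monitor = FailedLoginMonitor()
            start = datetime.now()
            for i in range(6):  # simulate six failed logins in quick succession
                if monitor.record_failure(start + timedelta(seconds=i)):
                    report("unusual rate of failed logins detected")
                    break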

     

     



    ------------------------------
    Shireane McKinnie
    ------------------------------


  • 2.  RE: Cyber Engineering Principles

    Posted 28-04-2023 11:07 AM

    Hello Shireane,

    Interesting (complex) topic. Who was, and is, really responsible for the chaos that has been introduced?

    Anyway (my thoughts):

    3. It is suggested we become content with what works well (higher speeds are not always necessary). Accomplish 'quality' (of life). There is no need to speed up for 'more', to add more app options (introducing more vulnerabilities), more Microsoft options hidden somewhere else (needlessly) at higher (user) costs and risks, etc.

    When laws are simple, all understand them without any possible doubt. When laws are spread over many books, only the wealthy can defend themselves (for obvious 'class system' control reasons). The same applies to cyber security principles: keep things simple that are proven to work well. Do not amend just for the sake of amending, so as to claim, supposedly ethically, that one can now charge more (for less, including less security).

    I refer to Windows (97 / XP?), which worked well, but financial greed in various places unethically forced its end. Why not return to what worked well? Cyber security is no different. If there should be one system that is most secure per individual, it should be the cyber-related one, because every user is (apparently) supposedly unique.

    4. This supports the point made above (3.): introduced system complexities are the issue that forces additional system security needs.

    6. Return to simplicity. Complex regulatory requirements (e.g. lengthy general T&Cs regarding privacy, access costs, etc.) are unreasonable for users to be expected to read completely, let alone legally understand.

    9. Detection of applicable security threats (and risks) should automatically initiate appropriate, legally enforceable punishment (anywhere). Otherwise, the criminal (threat) is protected (given a platform in which to rort) instead of the law-abiding (contributing) user.

    10. Avoid, at all times, anything that is or can become uncontrollable by any means (e.g. nuclear).

    12. Anything re-issued, and any (program) amendment, must have been proven months beforehand (e.g. in a pilot project) to work reliably well and to benefit users. Microsoft and CAD software simply relocating icons, etc. makes no sense other than forcing users to re-educate themselves and purchase such 'falsely claimed necessary improvements'.

    13. Each country must have its own fully independent manufacturing capabilities, for exactly those reasons: security, preventing dependency, and 'disaster risk mitigation'. This applies to any ethical manufacturing (non-ethical processes should not apply anywhere).

    14. This must come with speedy legal enforcement once detected anywhere. If global controls cannot reasonably be accomplished, then develop each country as an 'island' (with individually country-controlled internet inputs).

    15. This goes to the earlier points made: improvements must be necessary, desired, useful, legally compliant, and practical. There are too many useless, supposedly 'necessary' updates alongside updates that are genuinely necessary and proven to work well.

    16. Wouldn't a two-step 'authority log-in check' suffice on each device?

    17. One would expect there to be a 'computer technology specification', international and 'individually national' legal compliance requirements, and similar 'design criteria' somewhere ('internationally' and 'individually nationally').

    It sadly seems, however, that IT "engineers" and some associated with them have not applied fundamentally ethical (engineering) procedures at all, setting up their desired chaos for economic self-gain over decades (virtually criminally; see how rich some supposedly are).

    The complexities sit in the vast (international and national) quantities. That can be reduced when individual (non-colonised) countries take responsibility for their own systems, applying their IT needs culturally, respectfully, and appropriately, and being capable of securing, guaranteeing, and enforcing what happens within (full independence).

    The claim here is that computer (IT) specifications, legislation, regulations, etc. are developed unreliably, unethically, and in a legally non-compliant or disrespectful way. The core concern was, and still seems to be, the production of a mass, chaotic, vulnerable "business control system" on a let's-see-what-happens-later basis.

    Why not make 'quality things' solely and ethically: things that are simple, reliable, flexible, and simple for all to understand without any doubt, as any (politically) 'ethical engineering specification' should be?

    The fact that people still (can) receive online T&Cs of many thousands of words, which must be agreed to in order to proceed, is beyond legal logic or any ethical reason. It involves pure deceit: creating chaos by introducing vast complexities which allow manufacturers to remain 'non-accountable' while a (life-time loving) client is fully accountable, without possibly knowing what those thousands of words really involve legally anyway. That is the real cyber security problem: there are no Australian user-protecting legislations / regulations in place; otherwise, any online T&Cs would be short.

    Regards,



    ------------------------------
    Sebastian Tops
    ------------------------------