Security notes / not-so-formal notes

Security-related stuff.

Security truisms

  • "Any lock can be picked with a big enough hammer" (Sun System & Network Admin manual)


  • the harder you make something to work with, the more likely it is that people will work around it.


  • security is only as good as its weakest link.
A brilliant design can be completely undermined by not applying it correctly, by a weakness elsewhere, by making it too annoying to use (see previous point), etc.


  • The more value is being protected, the more you should consider a system insecure until proven secure
"it worked so far" is in no way proof. Nothing after years of attack is more convincing - but not proof.
In fact, in many cases a vulnerability didn't matter for years simply - because no one knew about it.
(Yes, this makes in fundamentally very hard to prove anything is secure)


  • Never bank on a system being fully secure; there is no such guarantee. As such:
    • secure distinct parts separately,
    • remember that variations in design may narrow the effect of a breach, or shift the costs and risks to more (or less) sensible places,
    • design a system to be robust to the most likely threats. However...
    • ...whether each countermeasure is effective against a threat usually matters less than whether it is a good tradeoff in the real world.
  • Security is a property of a whole design
something you duct-tape on later is often less secure and/or more annoying


  • The more complex a security system, the less likely it is to be secure.
In part because fewer people will be able to understand why it is or isn't secure.
  • Introducing or changing a security system should only be done by someone who understands its function, its details, and its implications - for the system both before and after the change.
This goes for physical as well as computer security.



  • Security tests are almost by definition incomplete,
so they are very easily overvalued.
Arguably more so when standardized, because attackers know where the focus was.


  • security through obscurity does not deter or prevent.
It often does slow attackers down, but by an inherently unknown amount of time.
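As a rough illustration of how small that delay can be (a back-of-the-envelope Python sketch; the scan rate is an assumed round number, not a measurement): hiding a service on a nonstandard port adds at most a 65,535-way search, which a scanner exhausts in seconds.

 # How much time does port obscurity buy? All numbers are hypothetical.
 ports = 65_535              # the full TCP port space an attacker must sweep
 probes_per_second = 10_000  # an assumed, fairly modest scanning rate

 worst_case_seconds = ports / probes_per_second
 print(f"Full port sweep: ~{worst_case_seconds:.0f} seconds")  # ~7 seconds
 # The delay is real but small - and you cannot know in advance how long
 # the obscurity actually holds, which is the point of the note above.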


Psychology of security

  • The feeling of security has little to do with the reality of security.
    • this feeling mostly consists of psychological reactions to both risks and countermeasures, and is often quite biased away from real information, e.g. by generic fear, recent events, etc.
    • the reality can be approximated with probabilities (decently, never perfectly)


  • Insecurity is the norm.
Sure, this only matters when you are a target at all,
...but once you are or may be, downplaying problems is wishful thinking, misinformed, and regularly stupidly risky.


  • People have problems estimating risks; intuition often overrules logic, and in fairly predictable ways:
    • spectacular but rare risks are exaggerated, common but boring risks are downplayed
    • popularly mentioned risks are exaggerated (those talked about in society and the media)
    • personal risks are exaggerated, anonymous risks are downplayed
    • offensive things are exaggerated, pleasant things downplayed
    • lack of knowledge of a risk makes it likely to be exaggerated (for example with new diseases or new technologies); familiarity with a risk makes it likelier to be downplayed
    • man-made risks are exaggerated relative to naturally occurring risks (varies)
    • active/passive roles:
      • passive risks (things taken to 'just happen') are downplayed
      • ...unless there is a sense that you could be in more control (consider being a passenger in a car or plane)
      • actively taken risks are often downplayed (e.g. being the driver)
      • imposed risks often exaggerated (e.g. passenger again)
    • estimated risk is based on basic trust in the actor involved,
      • even if this is a very large institution/government/such (for which trust and actions may be hard to estimate or model)
      • ...although we do often trust people over governments, governments over corporations, etc.
    • a protective approach often leads to risk exaggeration, and occasionally leads to pointless tradeoffs (consider overprotective parents)
    • framing something as a gain or as a loss may make people more risk-averse or more risk-seeking (though exactly which, and how much, can be complex)


Many of these are regularly exploited in some way - certainly by various media (we're looking at you, pop news TV).

Development, complexity, weakest links

  • There is a good correlation between more complex software and more holes. The more complex, the harder it is to make security guarantees (perhaps exponentially so with size/complexity).
  • There is a good correlation between simply more software on a system and more holes. On systems that are targets, stripping down to the bare basics may reduce the likely number of security holes better than patching does (see the sketch after this list).
  • Decent systems usually have security considered in their first designs, not patched on later
  • Security usually lessens as a system expands, as it often expands around the security measures in place.
  • It is easy to severely reduce security through a single weak link. Common weak links include:
    • Humans who introduce exceptions to policies.
    • Note that good but very inconvenient security systems will simply be circumvented by their own users, through time/money/risk assessments or, perhaps more commonly, intuition.
    • Intuitive decisions always have a chance of being ill-informed.
    • Blindly following a policy without considering its reason often reduces security, as it makes the policy easier to circumvent.
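One practical reading of the "less software, fewer holes" point above is to regularly audit what is actually listening on a machine. A minimal sketch, assuming the third-party psutil package (resolving the owning process may need elevated privileges on some platforms):

 # Crude attack-surface inventory: list listening TCP sockets and their owners.
 import psutil

 for conn in psutil.net_connections(kind="tcp"):
     if conn.status != psutil.CONN_LISTEN:
         continue
     try:
         name = psutil.Process(conn.pid).name() if conn.pid else "?"
     except psutil.Error:
         name = "?"
     print(f"{conn.laddr.ip}:{conn.laddr.port}  {name}")

Every line it prints is software that could have a hole; anything you cannot explain is a candidate for removal.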


Business and security

  • Way too many businesses try to duct-tape it on
    • There is little correlation between money thrown at security and the resulting security
    • There is a good correlation between infrastructure considerations and the resulting security
  • Regulatory compliance
    • is good for a basic level of security in commonly used/applied things (cash registers, etc.),
    • but only works as well as said regulations apply to your case,
    • and only if they are updated to stay applicable (which is hard work if you actually want it done well),
    • so it is frequently misapplied
  • Confidence in security is usually biased, and regularly based on the wrong things (such as regulations, regardless of how good those are).



  • Given that there is no perfect security, security often ends up being a question of economics, risk assessment, and uptime considerations (see the sketch below)
  • In a system in which problems are to be expected, the weaknesses should be designed into places where they can be contained and controlled.
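One common way to make that economics concrete is annualized loss expectancy: ALE = (incidents per year) × (loss per incident). A countermeasure pays off when the expected loss it removes exceeds what it costs. A toy Python sketch, every figure invented for illustration:

 # Toy risk-economics comparison. All figures are hypothetical.
 def ale(incidents_per_year: float, loss_per_incident: float) -> float:
     """Annualized loss expectancy: expected yearly loss from one threat."""
     return incidents_per_year * loss_per_incident

 baseline     = ale(0.5, 200_000)   # expected yearly loss without the control
 with_control = ale(0.1, 200_000)   # expected yearly loss with the control
 control_cost = 30_000              # yearly cost of the countermeasure

 savings = baseline - with_control  # 80_000/year here
 print("worth it" if savings > control_cost else "not worth it")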


  • Layering is usually only useful combined with warning systems, for the simple reason that additional layers primarily buy additional time - time which only helps if someone is alerted and can respond within it.
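A toy timeline shows why: a layer's delay only counts if defenders can detect the breach and respond before the attacker is through. A small Python sketch, timings invented:

 # Layered defense as a timeline (minutes; all numbers invented).
 layer_delays = [2, 5, 10]   # time the attacker spends on each successive layer
 response_time = 12          # time defenders need once alerted

 # If an alarm fires when the first layer is breached, the remaining
 # layers' delay is the window in which defenders can respond.
 window = sum(layer_delays[1:])
 print("defense wins" if window >= response_time else "attacker wins")
 # With no alarm, the window is effectively zero: the extra layers then
 # merely make the inevitable take a little longer.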

Perpetual arguments

Password requirements

People reusing passwords

Writing down passwords


Password systems

Why making a secure login system is hard
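One of the many reasons: naive implementations compare passwords with ordinary string equality (which leaks timing) and store them unsalted or with a fast hash (which invites offline cracking). A minimal sketch of handling just those two parts, using only Python's standard library - the scrypt parameters are illustrative, not a vetted configuration:

 # Minimal password storage/verification sketch (standard library only).
 import hashlib, hmac, secrets

 def hash_password(password: str) -> tuple[bytes, bytes]:
     salt = secrets.token_bytes(16)                 # unique salt per password
     digest = hashlib.scrypt(password.encode(), salt=salt,
                             n=2**14, r=8, p=1)     # memory-hard KDF
     return salt, digest

 def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
     candidate = hashlib.scrypt(password.encode(), salt=salt,
                                n=2**14, r=8, p=1)
     return hmac.compare_digest(candidate, digest)  # constant-time compare

 salt, digest = hash_password("correct horse battery staple")
 assert verify_password("correct horse battery staple", salt, digest)
 assert not verify_password("guess", salt, digest)

And this is only the storage side - rate limiting, password resets, account enumeration, session handling, and lockout policy are all separate problems, which is rather the point.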

Password managing software

Forcibly changing passwords
