Security notes / not-so-formal notes

Security related stuff.

Security truisms

  • "Any lock can be picked with a big enough hammer" (Sun System & Network Admin manual)
  • the harder you make it to work with, the more likely it is that people will work around it.
  • security is only as good as its weakest link. A brilliant design can be completely undermined by not applying it correctly, by a weakness elsewhere, by making it too annoying to use (see the previous point), etc.


  • The more value is being protected, the more you should consider a system insecure until proven secure
"it has worked so far" is in no way proof. Having had no incidents after years of attack is more convincing - but still not proof.
(Yes, it is extremely hard to prove that something is secure)


  • Never count on a system being fully secure; there is no such guarantee. As such,
a design can often localize the effect of a breach, and shift the costs and risks to more (or less) sensible places.
Design a system to be robust against the most likely threats. However...
whether each countermeasure is effective against a threat is usually less important than asking whether it is a good tradeoff in the real world.
  • Security is a part of the whole. Security that you band-aid on tends not to work well, particularly for software.
  • The more complex a security system, the less likely it is to be secure - in part because fewer people will be able to understand why it is or isn't secure.
    • ...there is of course a difference between real complexity and apparent complexity


  • introduction of, and changes to, security systems should only be done by someone who understands the implications -- ideally both the details of the security system and of the thing it is there for.
    • The math is often no harder than the application. When applied to a real system, though, the devil is in the practical reality of the details. A good cryptosystem can be made worthless by a bad application.
    • at a large enough scale (e.g. many doors to a building, many connections into a network), either a complete understanding becomes impossible, or the necessary design becomes hopelessly impractical or draconian. At this point, damage control becomes at least as important.


  • Security tests are almost by definition incomplete, so they are very easily overvalued
  • security through obscurity does not deter or prevent; it slows attackers down, but by an unknown amount of time.


Psychology of security

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)
  • The feeling of security has little to do with the reality of security.
    • the feeling mostly consists of psychological reactions to both risks and countermeasures - and of focus on such aspects (usually in reaction to recent events) - and is rarely informed by much real information
    • the reality can be approximated with probabilities (decently, never perfectly)


  • Insecurity is the norm. Downplaying problems is usually misinformed, and regularly stupidly risky.


  • People have problems estimating risks; intuition often overrules logic, often in a predictable way
    • non-usual situations are often badly estimated in general
    • spectacular but rare risks are exaggerated, common risks are downplayed (consider e.g. )
    • popular risks are exaggerated (those talked about in society and the media)
    • offensive things are exaggerated, pleasant things downplayed
    • personal risks are exaggerated, anonymous risks are downplayed
    • lack of knowledge of a risk makes it likely to be exaggerated, familiarity with a risk makes it likelier to be downplayed (for example in new diseases, new technologies)
    • man-made risks are exaggerated relative to naturally occurring risks
    • active/passive roles:
      • passive risks (things taken to 'just happen') are downplayed
      • ...unless there is a sense that you could be in more control (consider being a passenger in a car or plane)
      • actively taken risks are often downplayed (e.g. being the driver)
      • imposed risks often exaggerated
    • estimated risk is based on the basic trust in an actor,
      • even if this is a very large institution/government/such (for which trust and actions may be hard to estimate or model)
      • ...although we do often trust people over governments, governments over corporations, etc.
    • a protective approach often leads to risk exaggeration, and occasionally leads to pointless tradeoffs (consider protective parents)
    • Framing something as a gain or as a loss may make people more risk-averse or more risk-seeking (though exactly which, and how much, may be complex)


Many of these are regularly exploited in some way - certainly by various media (we're looking at you, pop news TV).

Development, complexity, weakest links

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)
  • There is a good correlation between more complex software and more holes. The more complex the software, the harder it is to make security guarantees (perhaps exponentially so with size/complexity).
  • There is also a good correlation between simply having more software on a system and more holes - on systems that are targets, stripping down to the bare basics may reduce the likely number of security holes better than patching does.
  • Decent systems usually have security considered in their first designs, not patched on later
  • Security usually lessens as a system expands, as it often expands around the security measures in place.
  • It is easy to severely reduce security through a single weak link. Common weak links include:
    • Humans who introduce exceptions to policies.
    • Note that good but very inconvenient security systems will simply be circumvented by their own users, through time/money/risk assessments or, perhaps more commonly, intuitions.
    • Intuitive decisions always have a chance of being ill-informed.
    • Blindly following a policy without considering its reason often reduces security, as it makes the policy easier to circumvent.


Business and security

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)
  • Way too many businesses think in the wrong direction
    • There is good correlation between infrastructure considerations and resulting security
    • There is much less correlation between money spent and resulting security
  • Regulatory compliance
    • regularly does a good job of providing basic security in contexts where no one would otherwise apply it
    • but it only works when those regulations are good and sensible (considering your business),
    • and are kept updated to stay so (which is hard, and a lot of work, if you actually want it done well)
    • so regulations are quite often misapplied
  • Confidence in security is usually biased, and regularly based on the wrong things (such as regulations, regardless of how good they are).



  • Given that there is no perfect security, security often ends up being a question of economics, risk assessment, and uptime considerations (see the small sketch after this list).
  • In a system in which problems are to be presumed, the weaknesses should be designed into places where they can be controlled.
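
As a loose illustration of that economics angle: the sketch below is a toy annualized-loss comparison, not a real methodology - every threat name, probability, and cost is made up for the example, and expected_annual_loss is just a hypothetical helper. It only shows the shape of the reasoning: expected loss (probability times impact) versus what the countermeasure costs you.

 # Toy annualized-loss comparison - every figure here is hypothetical.
 # Expected loss = estimated probability of the event per year * estimated cost per incident.
 def expected_annual_loss(yearly_probability, cost_per_incident):
     return yearly_probability * cost_per_incident
 
 threats = {
     # name: (estimated yearly probability, estimated cost per incident)
     "laptop theft":        (0.20,   3_000),
     "ransomware outbreak": (0.05, 250_000),
 }
 
 countermeasure_cost_per_year = {
     "laptop theft":        8_000,   # e.g. tracking and encryption rollout
     "ransomware outbreak": 10_000,  # e.g. offline backups plus monitoring
 }
 
 for name, (probability, impact) in threats.items():
     loss = expected_annual_loss(probability, impact)
     verdict = "worth it" if loss > countermeasure_cost_per_year[name] else "hard to justify on cost alone"
     print(f"{name}: expected loss {loss:,.0f}/yr vs countermeasure "
           f"{countermeasure_cost_per_year[name]:,}/yr -> {verdict}")

The point is not the arithmetic but the framing: a countermeasure can be perfectly effective against its threat and still be a bad tradeoff - and the estimates going in are themselves subject to all the estimation biases noted above.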


  • Layering is usually only useful combined with warning systems, for the simple reason that additional layers primarily buy additional time.


Perpetual arguments

Password requirements

People reusing passwords

Writing down passwords

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)

Password systems

Why making a secure login system is hard

Password managing software

Forcedly changing passwords

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)