Security notes / not-so-formal notes

Security related stuff.


Security truisms

  • "Any lock can be picked with a big enough hammer" (Sun System & Network Admin manual)


  • the harder you make it to work with, the more likely it is that people will work around it.
For example, if a keycarded door (or the bureaucracy behind it) regularly blocks people from doing their job, they will have less and less of a problem wedging it open.


  • security is only as good as its weakest link.
A brilliant design can be completely undermined by not applying it correctly, by a weakness elsewhere, by making it too annoying to use (see previous point), etc.


  • The more value is being protected, the more you should consider a system insecure until proven secure
"it worked so far" is in no way proof. Nothing is more convincing than years of withstood attack - but even that is not proof.
In fact, in many cases a vulnerability didn't matter for years simply because no one knew about it.
(Yes, this makes it fundamentally very hard to prove anything is secure)


  • Never bank on a system being fully secure; there is no such guarantee. As such:
secure distinct parts separately, so that a breach of one does not open up everything
variations in design may narrow the effect of a breach, or shift the costs and risks to more (or less) sensible places
design a system to be robust to the most likely threats. However...
whether each countermeasure is effective against a threat is usually less important than whether it is a good tradeoff in the real world
  • Security is a property of a whole design
something you duct-tape on later is often less secure and/or more annoying


  • The more complex a security system, the less likely it is to be secure.
In part because fewer people will be able to understand why it is or isn't secure.
(There is of course a difference between real complexity and apparent complexity.)
  • introduction of, and changes to, security systems should only be done by someone who understands the implications - ideally the details of both the security system and of the thing it protects, before and after the change.
for physical as well as computer security
  • in computer security, secure the data, not the means of access
  • The math is often no harder than the application; when applied to a real system, the devil is in the practical reality of the details. A good cryptosystem can be made worthless by bad application.
  • at a large enough scale (e.g. many doors to a building, many connections into a network), either a complete understanding becomes impossible, or the necessary design becomes hopelessly impractical or draconian. At that point, damage control becomes at least as important.



  • Security tests are almost by definition incomplete
so they are very easily overvalued (particularly by management that values metrics)
arguably even more so when standardized, because attackers then know where the testing focus was


  • security through obscurity does not deter or prevent.
It often does slow attacks down, but by an inherently unknown amount of time.


  • The concept of 'encryption strength' has no strong mathematical backing.
The oft-quoted probability is often one-in-keyspace, which only applies to the very dumbest attack: brute force (see the sketch below).
Real-world strength is perhaps best described in terms of currently known attacks.
It is not certain that there are better attacks, but it is common enough that you should assume there are.
Consider WEP: brute forcing takes pretty long, but cryptanalysis revealed structural weaknesses that meant it could be cracked within minutes.
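
A quick back-of-the-envelope sketch of that WEP point, in Python; the guess rate is a made-up assumption, only the 104-bit key size is real:

    # By the one-in-keyspace measure, WEP's 104-bit key looks very strong...
    keyspace = 2 ** 104       # all possible 104-bit keys, ~2e31
    guesses_per_second = 1e9  # assumed offline guessing rate
    seconds_per_year = 3600 * 24 * 365

    expected_years = (keyspace / 2) / guesses_per_second / seconds_per_year
    print(f"expected brute-force time: {expected_years:.2e} years")  # ~3.2e14

    # ...yet weaknesses in how WEP used RC4 (predictable IVs) meant it could
    # be cracked within minutes. Strength against brute force says nothing
    # about strength against cryptanalysis.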


Psychology of security

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)
  • The feeling of security has little to do with the reality of security.
    • this feeling mostly consists of psychological reactions to both risks and countermeasures, and is often quite biased away from real information, e.g. by generic fear, recent events, etc.
    • the reality can be approximated with probabilities (decently, never perfectly)


  • Insecurity is the norm.
Sure, this only matters when you are a target at all
...but once you are or may be, downplaying problems is hopeful, misinformed, and regularly stupidly risky.


  • People have problems estimating risks; intuition often overrules logic, often in a predictable way
    • spectacular but rare risks are exaggerated, common but boring risks are downplayed
    • popularly mentioned risks are exaggerated (those talked about in society and the media)
    • personal risks are exaggerated, anonymous risks are downplayed
    • offensive things are exaggerated, pleasant things downplayed
    • lack of knowledge of a risk makes it likely to be exaggerated (for example with new diseases, new technologies); familiarity with a risk makes it likelier to be downplayed
    • man-made risks are exaggerated relative to naturally occurring risks (varies)
    • active/passive roles:
      • passive risks (things taken to 'just happen') are downplayed
      • ...unless there is a sense that you could be in more control (consider being a passenger in a car or plane)
      • actively taken risks are often downplayed (e.g. being the driver)
      • imposed risks are often exaggerated (e.g. being the passenger again)
    • estimated risk is based on the basic trust in an actor,
      • even if this is a very large institution/government/such (for which trust and actions may be hard to estimate or model)
      • ...although we do often trust people over governments, governments over corporations, etc.
    • a protective approach often leads to risk exaggeration, and occasionally leads to pointless tradeoffs (consider protective parents)
    • Something framed as a gain or loss may lead to being more risk averse or risk seeking (though exactly which and how may be complex)


Many of these are regularly exploited in some way - certainly by various media (we're looking at you, pop news TV).

Development, complexity, weakest links

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)
  • There is a good correlation between more complex software and more holes. The more complex, the harder it is to make security guarantees (perhaps exponentially so with size/complexity).
  • There is a good correlation between simply having more software on a system and more holes - on systems that are targets, stripping down to the bare basics may reduce the likely number of security holes more than patching does.
  • Decent systems usually have security considered in their first designs, not patched on later
  • Security usually lessens as a system expands, as it often expands around the security measures in place.
  • It is easy to severely reduce security through a single weak link. Common weak links include:
    • Humans who introduce exceptions to policies.
    • Good but very inconvenient security systems, which will simply be circumvented by their own users through time/money/risk assessments or, perhaps more commonly, intuitions.
    • Intuitive decisions, which always have a chance of being ill-informed.
    • Blindly following a policy without considering its reason, which often reduces security as it makes the policy easier to circumvent.


Business and security

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)
  • Way too many businesses try to duct-tape security on
    • There is good correlation between infrastructure considerations and resulting security
    • There is much less correlation between money thrown at security and resulting security
  • Regulatory compliance
    • is good for a basic level of security of commonly used/applied things (cash registers, etc.)
    • but only works as well as said regulations apply to your case,
    • and only if they are updated to stay applicable (which is hard work if you actually want it done well),
    • so it is frequently misapplied
  • Confidence in security is usually biased, and regularly based on the wrong things (such as regulations regardless of how good they are).



  • Given there is no perfect security, security often ends up being a question of economics, risk assessment, and uptime considerations (see the sketch below)
  • In a system in which problems must be presumed, weaknesses should be designed into the places where they can best be controlled.
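
One common way to make the economics point concrete is an expected-loss comparison. The formula (annualized loss expectancy = yearly incident rate times cost per incident) is standard; every number below is made up purely for illustration:

    # Compare expected yearly loss with and without a countermeasure.
    incidents_per_year = 0.2       # assume a breach roughly once per five years
    cost_per_incident = 250_000    # assumed cleanup, downtime, reputation cost
    ale_without = incidents_per_year * cost_per_incident  # expected yearly loss

    countermeasure_cost = 30_000   # assumed yearly cost of the measure
    risk_reduction = 0.8           # assumed fraction of incidents it prevents
    ale_with = ale_without * (1 - risk_reduction)

    print(f"without: {ale_without:,.0f}/yr")
    print(f"with:    {ale_with + countermeasure_cost:,.0f}/yr (incl. the measure)")
    # The measure is worth it only while ale_with + its cost < ale_without.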


  • Layering is usually only useful in combination with warning systems, for the simple reason that additional layers primarily buy additional time (see the sketch below).
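
A minimal sketch of that layering point, again with made-up numbers:

    # Layers buy delay; delay only helps if someone notices and responds in time.
    layer_delays_minutes = [5, 15, 30]   # assumed delay each layer adds
    response_time_minutes = 45           # assumed detection + response time

    time_to_breach = sum(layer_delays_minutes)   # 50 minutes in this sketch
    if response_time_minutes < time_to_breach:
        print("defenders can intervene before the attacker is through")
    else:
        print("without fast enough detection, layers only delay the inevitable")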

Perpetual arguments

Password requirements

People reusing passwords

Writing down passwords

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)

Password systems

Using variations of one password has the upside that you avoid reuse of exact passwords, meaning machine attempts at using these elsewhere will fail.

The downside is that a person who sees one may be able to guess your variations.

Why making a secure login system is hard

  • Online brute force
alleviations: account locking, or exponential backoff of the delay between attempts (sketched below)

  • Distributed online brute force
alleviations: ?

  • Offline brute force of stolen hashes
alleviations: salt (see the hashing sketch below)

  • Host security protecting the hashes
alleviations: attention, probably never quite enough

  • insecure transport
alleviations: SSL/TLS and more
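
On the salt point: a minimal sketch of salted password hashing using only the Python standard library. The hash choice and iteration count are illustrative assumptions; a real system should follow current recommendations (or use a vetted argon2/bcrypt library).

    import hashlib
    import hmac
    import secrets

    ITERATIONS = 600_000  # assumed work factor; tune to hardware/threat model

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """Return (salt, hash); the per-user salt defeats precomputed tables."""
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        # compare_digest avoids leaking matches through timing differences
        return hmac.compare_digest(digest, stored)

    salt, stored = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, stored))  # True
    print(verify_password("guess", salt, stored))                         # False

And on the backoff point, a sketch of exponentially growing delays between failed logins. Purely illustrative and in-memory; a real system would persist this per account and/or per source address:

    import time

    BASE_DELAY = 0.5   # seconds; assumed starting penalty
    MAX_DELAY = 300    # cap, so legitimate users are not locked out forever
    failures: dict[str, int] = {}

    def throttled_login(user: str, password_correct: bool) -> bool:
        # every attempt pays the current penalty before being processed
        time.sleep(min(BASE_DELAY * 2 ** failures.get(user, 0), MAX_DELAY))
        if password_correct:
            failures.pop(user, None)   # success resets the counter
            return True
        failures[user] = failures.get(user, 0) + 1
        return False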

Password managing software

Forcibly changing passwords

This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)