Security notes / not-so-formal notes
Security related stuff.
Security truisms
- security is only as good as its weakest link.
- A brilliant design can be completely undermined by not applying it correctly, by a weakness elsewhere, by making it too annoying to use, by believing it is so secure that you stop thinking about how it's used, etc.
- The harder you make it to work with, the more likely it is that people will work around it.
- The more value is being protected, the more you should consider a system insecure until proven secure
- "it worked so far" is proof of nothing beyond "no one ever tried breaking in before"
- "nothing found after years of active attack" is more convincing - but how much more varies with what it is.
- ...yes, this (and more) makes it fundamentally very hard to prove anything is secure.
- "Any lock can be picked with a big enough hammer" (Sun System & Network Admin manual)
- Security is a property of a design, not something you add afterwards
- something you duct-tape on later is easily both less secure and more annoying to use
- The more complex a security system, the less likely it is to be secure.
- Changes to secure systems should only be made by someone who understands the implications, the function, and the details of the system both before and after the change.
- ideally, changes to secure systems should be treated as an entirely new system
- for physical as well as computer security
- yes, this feels overbearing, and regularly is
- ...but there are also many cases where common sense was so absent it will make you headdesk
- Security tests are almost by definition incomplete
- so very easily overvalued
- arguably more so if standardized, because attackers know where the testing focus was
- security through obscurity does not deter or prevent
- It often does slow attackers down, but by an inherently unknown amount of time.
- Never store anything secret online, because data providers are not necessarily allowed to keep your data private from the government.
- nor are they necessarily allowed to tell you even that they have been subpoenaed
- Never store anything secret online, because you should assume every site gets hacked
- and companies are not necessarily required to tell you that happened
- if something secret must leave your machine at all, encrypt it client-side first (see the sketch after this list)
- assume the bad guys have more time than you
- while they generally work from a similar cost-benefit analysis as you, sometimes their perceived benefit is higher, and they'll find flaws you weren't given time to find
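As a concrete counter to the "never store anything secret online" problem above: if data has to sit on someone else's infrastructure, encrypt it before it leaves your machine, so the provider (and anyone who hacks or subpoenas them) only ever holds ciphertext. A minimal Python sketch using the third-party cryptography package's Fernet recipe; the upload step is a hypothetical placeholder, not a real API.

 # Encrypt locally; the storage provider only ever sees ciphertext.
 # Requires the third-party 'cryptography' package (pip install cryptography).
 from cryptography.fernet import Fernet
 key = Fernet.generate_key()        # keep this offline, never next to the uploaded data
 fernet = Fernet(key)
 secret = b"my tax records"         # data you would otherwise upload in plaintext
 token = fernet.encrypt(secret)     # authenticated encryption (AES-CBC + HMAC)
 # upload_to_provider("backup.bin", token)   # hypothetical upload step
 assert fernet.decrypt(token) == secret      # only a holder of 'key' can recover it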
Psychology of security
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
- The feeling of security has little to do with the reality of security.
- this feeling mostly consists of psychological reactions to both risks and countermeasures, and is often biased away from real information, e.g. by generic fear, recent events, etc.
- the reality can be approximated with probabilities (decently, never perfectly); see the expected-loss sketch below
- Insecurity is the norm.
- Sure, this only matters when you are a target at all
- ...but once you are or may be, downplaying problems is hopeful, misinformed, and regularly stupidly risky.
- People have problems estimating risks; intuition often overrules logic, often in a predictable way
- spectacular but rare risks are exaggerated, common but boring risks are downplayed
- popularly mentioned risks are exaggerated (those talked about in society and the media)
- personal risks are exaggerated, anonymous risks are downplayed
- offensive things are exaggerated, pleasant things downplayed
- lack of knowledge of a risk makes it likely to be exaggerated (for example with new diseases or new technologies); familiarity with a risk makes it likelier to be downplayed
- man-made risks are exaggerated relative to naturally occurring risks (varies)
- active/passive roles:
- passive risks (things taken to 'just happen') are downplayed
- ...unless there is a sense that you could be in more control (consider being a passenger in a car or plane)
- actively taken risks are often downplayed (e.g. being the driver)
- imposed risks often exaggerated (e.g. passenger again)
- estimated risk is based on the basic trust in an actor,
- even if this is a very large institution/government/such (for which trust and actions may be hard to estimate or model)
- ...although we do often trust people over governments, governments over corporations, etc.
- a protective approach often leads to risk exaggeration, and occasionally leads to pointless tradeoffs (consider protective parents)
- Framing something as a gain or as a loss may make people more risk-averse or more risk-seeking (though exactly which, and how, may be complex)
Many of these are regularly exploited in some way - certainly by various media (we're looking at you, pop news TV).
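To make the "probabilities, not feelings" point concrete, here is a back-of-the-envelope expected-loss comparison in Python. All figures are invented for illustration: a spectacular, rare risk can have a much smaller expected loss than a boring, common one, even though intuition usually ranks them the other way around.

 # Expected (annualized) loss = probability per year * cost per incident.
 # All numbers below are invented purely for illustration.
 risks = {
     "spectacular, rare (e.g. targeted break-in)": (0.001, 500_000),
     "boring, common (e.g. phished employee)":     (0.30,   20_000),
 }
 for name, (p, cost) in risks.items():
     print(f"{name}: expected loss ~{p * cost:,.0f}/year")
 # -> the 'boring' risk dominates (6,000 vs 500), despite feeling less scary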
Development, complexity, weakest links
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
- There is a good correlation between more complex software and more holes. The more complex the software, the harder it is to make security guarantees (perhaps exponentially so with size/complexity).
- There is a good correlation between simply having more software on a system and more holes. On systems that are targets, stripping down to the bare basics may reduce the likely number of security holes more effectively than patching (a back-of-the-envelope model follows this list).
- Decent systems usually have security considered in their first designs, not patched on later
- Security usually lessens as a system expands, as it often expands around the security measures in place.
- It is easy to severely reduce security through a single weak link. Common weak links include:
- Humans who introduce exceptions to policies.
- Note that good but very inconvenient security systems will simply be circumvented by their own users, based on time/money/risk assessments or, perhaps more commonly, intuition.
- Intuitive decisions always have a chance of being ill-informed
- Blindly following a policy without considering its reason often reduces security, as it makes the policy easier to circumvent.
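A crude model of why stripping down helps, assuming (purely for illustration) that exploitable bugs scale roughly linearly with the amount of code installed. The defect rates and code sizes below are invented assumptions, not measured values.

 # Crude model: expected exploitable bugs ~ KLOC * defect rate * exploitable fraction.
 # All rates and sizes are illustrative assumptions.
 DEFECTS_PER_KLOC = 1.0        # assumed residual defects per 1000 lines of code
 EXPLOITABLE_FRACTION = 0.05   # assumed fraction of defects with security impact
 def expected_holes(kloc: float) -> float:
     return kloc * DEFECTS_PER_KLOC * EXPLOITABLE_FRACTION
 print(expected_holes(5_000))   # full install, ~5M lines: ~250 expected holes
 print(expected_holes(300))     # stripped-down server, ~300k lines: ~15
 # Removing software eliminates whole classes of holes; patching fixes them one at a time.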
Business and security
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
- Way too many businesses try to duct-tape security on afterwards
- There is less correlation between money thrown at security and resulting security than you might hope
- There is good correlation between infrastructure considerations and resulting security
- Regulatory compliance
- is good for a basic level of security in commonly used/applied things (cash registers, etc.)
- but only works as well as said regulations actually apply to your case
- and only as long as they are updated to stay applicable (which is hard work if you actually want it done well)
- and is therefore frequently misapplied
- Confidence in security is usually biased, and regularly based on the wrong things (such as regulations regardless of how good they are).
- Given there is no perfect security, security often ends up being a question of economics, risk assessment, and uptime considerations.
- In a system in which problems are to be expected, the weaknesses should be designed into places where they can be contained and controlled.
- Layering is usually only useful combined with warning systems, for the simple reason that additional layers primarily buy additional time (a sketch of that idea follows).
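A sketch of the "layers buy time" idea, in the spirit of time-based security: layered defenses hold only if the total time an attacker needs to get through the layers exceeds the time it takes to detect the intrusion plus the time it takes to react. All the timings below are invented for illustration.

 # Time-based view of layered defense. Example timings (minutes) are invented.
 layer_penetration_times = [10, 30, 15]   # attacker's time cost per layer
 detection_time = 20                      # time until an alarm fires
 reaction_time = 25                       # time until responders act on it
 protection = sum(layer_penetration_times)      # 55 minutes of delay
 exposure = detection_time + reaction_time      # 45 minutes to respond
 print("holds" if protection > exposure else "breached")
 # With no detection at all, extra layers only delay the inevitable,
 # which is why layering without warning systems buys little.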