Risk is the potential that a chosen action or activity (including the choice of inaction) will lead to a loss (an undesirable outcome). The notion implies that a choice having an influence on the outcome exists (or existed). Potential losses themselves may also be called "risks". Almost any human endeavor carries some risk, but some are much riskier than others.
The Oxford English Dictionary cites the earliest use of the word in English (in the spelling of risque) as from 1621, and the spelling as risk from 1655. It defines risk as:
(Exposure to) the possibility of loss, injury, or other adverse or unwelcome circumstance; a chance or situation involving such a possibility.
For the sociologist Niklas Luhmann the term 'risk' is a neologism that appeared with the transition from traditional to modern society. "In the Middle Ages the term risicum was used in highly specific contexts, above all sea trade and its ensuing legal problems of loss and damage." In the vernacular languages of the 16th century the words rischio and riezgo were used, having been introduced to continental Europe through interaction with Middle Eastern and North African Arab traders. In the English language the term risk appeared only in the 17th century, and "seems to be imported from continental Europe." When the terminology of risk took hold, it replaced the older notion that thought "in terms of good and bad fortune." Niklas Luhmann (1996) seeks to explain this transition: "Perhaps, this was simply a loss of plausibility of the old rhetorics of Fortuna as an allegorical figure of religious content and of prudentia as a (noble) virtue in the emerging commercial society."
Scenario analysis matured during Cold War confrontations between major powers, notably the United States and the Soviet Union. It became widespread in insurance circles in the 1970s when major oil tanker disasters forced a more comprehensive foresight. The scientific approach to risk entered finance in the 1960s with the advent of the capital asset pricing model and became increasingly important in the 1980s when financial derivatives proliferated. It reached general professions in the 1990s when the power of personal computing allowed for widespread data collection and number crunching.
Governments use these methods, for example, to set standards for environmental regulation, such as the "pathway analysis" practiced by the United States Environmental Protection Agency.
Definitions of risk
The many inconsistent and ambiguous meanings attached to "risk" lead to widespread confusion and also mean that very different approaches to risk management are taken in different fields. For example:
- The ISO 31000 (2009)/ISO Guide 73 definition of risk is the 'effect of uncertainty on objectives'. In this definition, uncertainties include events (which may or may not happen) and uncertainties caused by a lack of information or ambiguity. This definition also includes both negative and positive impacts on objectives.
- Another definition is that risks are future problems that can be avoided or mitigated, rather than current ones that must be immediately addressed.
- Risk can be seen as relating to the probability of uncertain future events. For example, according to Factor Analysis of Information Risk, risk is "the probable frequency and probable magnitude of future loss". In computer science this definition is used by The Open Group.
- OHSAS (Occupational Health & Safety Advisory Services) defines risk as the product of the probability of a hazard resulting in an adverse event, times the severity of the event.
- In information security, risk is defined as "the potential that a given threat will exploit vulnerabilities of an asset or group of assets and thereby cause harm to the organization".
- Financial risk is often defined as the unexpected variability or volatility of returns and thus includes both potential worse-than-expected as well as better-than-expected returns. References to negative risk below should be read as applying to positive impacts or opportunity (e.g., for "loss" read "loss or gain") unless the context precludes this interpretation.
- The related term "hazard" is used to mean something that could cause harm.
As risk carries so many different meanings, there are many formal methods used to assess or to "measure" risk. Some of the quantitative definitions of risk are well grounded in statistical theory and lead naturally to statistical estimates, but some are more subjective. For example, in many cases a critical factor is human decision making.
Even when statistical estimates are available, in many cases risk is associated with rare failures of some kind, and data may be sparse. Often, the probability of a negative event is estimated by using the frequency of past similar events or by event tree methods, but probabilities for rare failures may be difficult to estimate if an event tree cannot be formulated. This makes risk assessment difficult in hazardous industries (for example nuclear energy) where the frequency of failures is rare and harmful consequences of failure are very high.
Statistical methods may also require the use of a cost function, which in turn often requires calculating the cost of the loss of a human life, a difficult problem. One approach is to ask what people are willing to pay to insure against death or radiological release (e.g., GBq of radio-iodine), but as the answers depend very strongly on the circumstances it is not clear that this approach is effective.
In statistics, the notion of risk is often modelled as the expected value of some outcome seen as undesirable. This combines the probabilities of various possible events and some assessment of the corresponding harms into a single value. (See also Expected utility.) In a formula that can be used in the simple case of a binary possibility (accident or no accident), risk is then:
Risk = (probability of the accident occurring) × (expected loss in case of the accident)
For example: if activity X may suffer an accident of type A with probability 0.01 and a loss of 1000, the total risk is a loss of 10, since that is the product of 0.01 and 1000.
In case there are several possible accidents, risk is the sum of the risks for the different accidents, provided that the outcomes are comparable:
For example: if activity X may suffer an accident of type A with probability 0.01 and a loss of 1000, and an accident of type B with probability 0.000001 and a loss of 2,000,000, the total risk is a loss of 12: 10 from accidents of type A and 2 from accidents of type B.
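The arithmetic above is easy to check in code; a minimal sketch of the expected-loss calculation, using the accident probabilities and losses from the example:

```python
def expected_loss(accidents):
    """Total risk as the sum of probability x loss over all accident types."""
    return sum(p * loss for p, loss in accidents)

# Type A: probability 0.01, loss 1000; type B: probability 0.000001, loss 2,000,000
accidents = [(0.01, 1000), (0.000001, 2_000_000)]
print(round(expected_loss(accidents), 6))  # 12.0
```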
One of the first major uses of this concept was at the planning of the Delta Works in 1953, a flood protection program in the Netherlands, with the aid of the mathematician David van Dantzig. The kind of risk analysis pioneered here has become common today in fields like nuclear power, aerospace and the chemical industry.
Risk versus uncertainty
Risk: Combination of the likelihood of an occurrence of a hazardous event or exposure(s) and the severity of injury or ill health that can be caused by the event or exposure(s)
“... Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated. The term "risk," as loosely used in everyday speech and in economic discussion, really covers two things which, functionally at least, in their causal relations to the phenomena of economic organization, are categorically different. ... The essential fact is that "risk" means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomenon depending on which of the two is really present and operating. ... It will appear that a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We ... accordingly restrict the term "uncertainty" to cases of the non-quantitative type.”
Thus, Knightian uncertainty is immeasurable and cannot be calculated, while risk in the Knightian sense is measurable.
Another distinction between risk and uncertainty is proposed in How to Measure Anything: Finding the Value of Intangibles in Business and The Failure of Risk Management: Why It's Broken and How to Fix It by Doug Hubbard:
- Uncertainty: The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.
- Measurement of uncertainty: A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years."
- Risk: A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.
- Measurement of risk: A set of possibilities each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry with a loss of $12 million in exploratory drilling costs".
In this sense, Hubbard uses the terms so that one may have uncertainty without risk but not risk without uncertainty. We can be uncertain about the winner of a contest, but unless we have some personal stake in it, we have no risk. If we bet money on the outcome of the contest, then we have a risk. In both cases there is more than one possible outcome. The measure of uncertainty refers only to the probabilities assigned to outcomes, while the measure of risk requires both probabilities for outcomes and losses quantified for outcomes.
Risk as a vector quantity
Hubbard also argues that defining risk as the product of impact and probability presumes (probably incorrectly) that the decision makers are risk neutral. Only for a risk neutral person is the "certain monetary equivalent" exactly equal to the probability of the loss times the amount of the loss. For example, a risk neutral person would consider 20% chance of winning $1 million exactly equal to $200,000 (or a 20% chance of losing $1 million to be exactly equal to losing $200,000). However, most decision makers are not actually risk neutral and would not consider these equivalent choices. This gave rise to Prospect theory and Cumulative prospect theory. Hubbard proposes instead that risk is a kind of "vector quantity" that does not collapse the probability and magnitude of a risk by presuming anything about the risk tolerance of the decision maker. Risks are simply described as a set or function of possible loss amounts each associated with specific probabilities. How this array is collapsed into a single value cannot be done until the risk tolerance of the decision maker is quantified.
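A small sketch can make the distinction concrete. The risk is kept as a vector of probability/outcome pairs, and collapsing it into a single number requires a quantified risk tolerance; the exponential utility function below is a common textbook choice used here for illustration, not a method Hubbard prescribes:

```python
import math

# A risk kept as a vector of (probability, outcome) pairs: a 20% chance
# of winning $1,000,000, otherwise nothing (amounts in dollars).
gamble = [(0.2, 1_000_000), (0.8, 0)]

def expected_value(outcomes):
    """Risk-neutral collapse: probability-weighted average outcome."""
    return sum(p * x for p, x in outcomes)

def certain_equivalent(outcomes, risk_tolerance):
    """Collapse the vector using exponential utility u(x) = -exp(-x/R).

    R (risk_tolerance, in dollars) quantifies the decision maker's attitude;
    as R grows large this recovers the risk-neutral expected value.
    """
    eu = sum(p * -math.exp(-x / risk_tolerance) for p, x in outcomes)
    return -risk_tolerance * math.log(-eu)

ev = expected_value(gamble)               # $200,000 for a risk-neutral decision maker
ce = certain_equivalent(gamble, 500_000)  # noticeably less for a risk-averse one
```

The point of the sketch is that `ev` and `ce` differ: the single-number value of the same probability/loss vector depends on the decision maker's risk tolerance.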
Risk can be both negative and positive, but people tend to focus on the negative side, because some activities are dangerous and can put their own or someone else's life at risk. Risks concern people because they may have a negative effect on the future.
Insurance and health risk
Insurance is a risk-reducing investment in which the buyer pays a small fixed amount to be protected from a potential large loss. Gambling is a risk-increasing investment, wherein money on hand is risked for a possible large return, but with the possibility of losing it all. Purchasing a lottery ticket is a very risky investment with a high chance of no return and a small chance of a very high return. In contrast, putting money in a bank at a defined rate of interest is a risk-averse action that gives a guaranteed return of a small gain and precludes other investments with possibly higher gain.
Risks in personal health may be reduced by primary prevention actions that decrease early causes of illness or by secondary prevention actions after a person has clearly measured clinical signs or symptoms recognized as risk factors. Tertiary prevention reduces the negative impact of an already established disease by restoring function and reducing disease-related complications. Ethical medical practice requires careful discussion of risk factors with individual patients to obtain informed consent for secondary and tertiary prevention efforts, whereas public health efforts in primary prevention require education of the entire population at risk. In each case, careful communication about risk factors, likely outcomes and certainty must distinguish between causal events that must be decreased and associated events that may be merely consequences rather than causes.
Information technology risk
Information technology risk, IT risk, or IT-related risk is risk related to information technology. This relatively new term reflects an increasing awareness that information security is only one facet of a multitude of risks that are relevant to IT and the real-world processes it supports.
The increasing dependence of modern society on information and computer networks (in both the private and public sectors, including the military) has led to new terms such as IT risk and cyberwarfare.
Information security risk
Information security means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction.
Information security grew out of practices and procedures of computer security.
Information security has grown into information assurance (IA), the practice of managing risks related to the use, processing, storage, and transmission of information or data and the systems and processes used for those purposes.
While focused dominantly on information in digital form, the full range of IA encompasses not only digital but also analog or physical form.
Information assurance is interdisciplinary and draws from multiple fields, including accounting, fraud examination, forensic science, management science, systems engineering, security engineering, and criminology, in addition to computer science.
So, IT risk is narrowly focused on computer security, while information security extends to risks related to other forms of information (paper, microfilm). Information assurance risks include those related to the consistency of business information stored in IT systems with information stored by other means, and the relevant business consequences.
Economic risks can be manifested in lower incomes or higher expenditures than expected. The causes can be many, for instance, the hike in the price for raw materials, the lapsing of deadlines for construction of a new operating facility, disruptions in a production process, emergence of a serious competitor on the market, the loss of key personnel, the change of a political regime, or natural disasters. Reference class forecasting was developed to eliminate or reduce economic risk.
Means of assessing risk vary widely between professions. Indeed, they may define these professions; for example, a doctor manages medical risk, while a civil engineer manages risk of structural failure. A professional code of ethics is usually focused on risk assessment and mitigation (by the professional on behalf of client, public, society or life in general).
In the workplace, incidental and inherent risks exist. Incidental risks are those that occur naturally in the business but are not part of the core of the business. Inherent risks have a negative effect on the operating profit of the business.
Some industries manage risk in a highly quantified and enumerated way. These include the nuclear power and aircraft industries, where the possible failure of a complex series of engineered systems could result in highly undesirable outcomes. The usual measure of risk for a class of events is then:
- R = (probability of the event occurring) × C, where C is the consequence of the event
The total risk is then the sum of the individual class-risks.
In the nuclear industry, consequence is often measured in terms of off-site radiological release, and this is often banded into five or six decade-wide bands.
The risks are evaluated using fault tree/event tree techniques (see safety engineering). Where these risks are low, they are normally considered to be "Broadly Acceptable". A higher level of risk (typically up to 10 to 100 times what is considered Broadly Acceptable) has to be justified against the costs of reducing it further and the possible benefits that make it tolerable—these risks are described as "Tolerable if ALARP". Risks beyond this level are classified as "Intolerable".
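As an illustration of how such banding might look in code (the Broadly Acceptable threshold and the 100× tolerable band below are assumptions for the sketch, not regulatory values):

```python
def classify(probability, consequence, broadly_acceptable=1e-6):
    """Band a class-risk R = probability x consequence.

    The broadly_acceptable threshold and the 100x Tolerable-if-ALARP band
    are illustrative assumptions; real limits are set by regulators.
    """
    r = probability * consequence
    if r <= broadly_acceptable:
        return r, "Broadly Acceptable"
    if r <= 100 * broadly_acceptable:
        return r, "Tolerable if ALARP"
    return r, "Intolerable"

print(classify(1e-7, 1.0)[1])  # Broadly Acceptable
print(classify(1e-5, 5.0)[1])  # Tolerable if ALARP
print(classify(1e-3, 1.0)[1])  # Intolerable
```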
The level of risk deemed Broadly Acceptable has been considered by regulatory bodies in various countries—an early attempt by UK government regulator and academic F. R. Farmer used the example of hill-walking and similar activities, which have definable risks that people appear to find acceptable. This resulted in the so-called Farmer Curve of acceptable probability of an event versus its consequence.
The technique as a whole is usually referred to as Probabilistic Risk Assessment (PRA) (or Probabilistic Safety Assessment, PSA). See WASH-1400 for an example of this approach.
In finance, risk is the probability that an investment's actual return will differ from the expected return. This includes the possibility of losing some or all of the original investment. In a view advocated by Damodaran, risk includes not only "downside risk" but also "upside risk" (returns that exceed expectations). Some regard the standard deviation of the historical or average returns of a specific investment as providing some historical measure of risk; see modern portfolio theory. Financial risk may be market-dependent, determined by numerous market factors, or operational, resulting from fraudulent behavior (e.g. Bernard Madoff). Recent studies suggest that testosterone levels play a major role in risk taking during financial decisions.
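The standard-deviation measure mentioned above can be computed directly; a minimal sketch with an invented return series:

```python
import statistics

# Hypothetical historical returns of an investment (0.05 means +5%)
returns = [0.05, -0.02, 0.08, 0.01, -0.04]

mean_return = statistics.mean(returns)  # average historical return
volatility = statistics.stdev(returns)  # sample standard deviation as a risk proxy
print(f"mean {mean_return:.3f}, volatility {volatility:.3f}")
```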
In finance, risk has no one definition, but some theorists, notably Ron Dembo, have defined quite general methods to assess risk as an expected after-the-fact level of regret. Such methods have been uniquely successful in limiting interest rate risk in financial markets. Financial markets are considered to be a proving ground for general methods of risk assessment. However, these methods are also hard to understand. The mathematical difficulties interfere with other social goods such as disclosure, valuation and transparency. In particular, it is not always obvious if such financial instruments are "hedging" (purchasing/selling a financial instrument specifically to reduce or cancel out the risk in another investment) or "speculation" (increasing measurable risk and exposing the investor to catastrophic loss in pursuit of very high windfalls that increase expected value).
As regret measures rarely reflect actual human risk-aversion, it is difficult to determine if the outcomes of such transactions will be satisfactory. Risk seeking describes an individual whose utility function's second derivative is positive. Such an individual would willingly (actually pay a premium to) assume all risk in the economy and is hence not likely to exist.
In financial markets, one may need to measure credit risk, information timing and source risk, probability model risk, and legal risk if there are regulatory or civil actions taken as a result of some "investor's regret". Knowing one's risk appetite in conjunction with one's financial well-being is crucial.
A fundamental idea in finance is the relationship between risk and return (see modern portfolio theory). The greater the potential return one might seek, the greater the risk that one generally assumes. A free market reflects this principle in the pricing of an instrument: strong demand for a safer instrument drives its price higher (and its return proportionately lower), while weak demand for a riskier instrument drives its price lower (and its potential return thereby higher).
"For example, a US Treasury bond is considered to be one of the safest investments and, when compared to a corporate bond, provides a lower rate of return. The reason for this is that a corporation is much more likely to go bankrupt than the U.S. government. Because the risk of investing in a corporate bond is higher, investors are offered a higher rate of return."
The most popular, and lately also the most vilified, risk measure is value at risk (VaR). There are different types of VaR: long-term VaR, marginal VaR, factor VaR and shock VaR. The last is used for measuring risk under extreme market stress conditions.
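One simple way to estimate VaR is historical simulation: sort the observed returns and read off the loss at the chosen confidence level. A minimal sketch (the daily return series is invented):

```python
def historical_var(returns, confidence=0.95):
    """Value-at-Risk by historical simulation: the loss exceeded with
    probability (1 - confidence), read from the sorted sample."""
    ordered = sorted(returns)                      # worst returns first
    cutoff = int((1 - confidence) * len(ordered))  # index of the tail quantile
    return -ordered[cutoff]                        # report the loss as a positive number

# Twenty hypothetical daily returns
daily = [-0.061, -0.045, -0.030, -0.021, -0.015, -0.010, -0.006, -0.002,
         0.000, 0.003, 0.005, 0.008, 0.011, 0.014, 0.018, 0.022,
         0.027, 0.033, 0.040, 0.052]
print(historical_var(daily))  # 0.045, i.e. on 95% of days the loss should not exceed 4.5%
```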
In public works
In a peer reviewed study of risk in public works projects located in twenty nations on five continents, Flyvbjerg, Holm, and Buhl (2002, 2005) documented high risks for such ventures for both costs and demand. Actual costs of projects were typically higher than estimated costs; cost overruns of 50% were common, overruns above 100% not uncommon. Actual demand was often lower than estimated; demand shortfalls of 25% were common, of 50% not uncommon.
Due to such cost and demand risks, cost-benefit analyses of public works projects have proved to be highly uncertain.
The main causes of cost and demand risks were found to be optimism bias and strategic misrepresentation. Measures identified to mitigate this type of risk are better governance through incentive alignment and the use of reference class forecasting.
In human services
Huge ethical and political issues arise when human beings themselves are seen or treated as 'risks', or when the risk decision making of people who use human services might have an impact on that service. The experience of many people who rely on human services for support is that 'risk' is often used as a reason to prevent them from gaining further independence or fully accessing the community, and that these services are often unnecessarily risk averse.
Risk in psychology
Framing is a fundamental problem with all forms of risk assessment. In particular, because of bounded rationality (our brains get overloaded, so we take mental shortcuts), the risk of extreme events is discounted because the probability is too low to evaluate intuitively. As an example, one of the leading causes of death is road accidents caused by drunk driving—partly because any given driver frames the problem by largely or totally ignoring the risk of a serious or fatal accident.
For instance, an extremely disturbing event (an attack by hijacking, or moral hazards) may be ignored in analysis despite the fact it has occurred and has a nonzero probability. Or, an event that everyone agrees is inevitable may be ruled out of analysis due to greed or an unwillingness to admit that it is believed to be inevitable. These human tendencies for error and wishful thinking often affect even the most rigorous applications of the scientific method and are a major concern of the philosophy of science.
All decision-making under uncertainty must consider cognitive bias, cultural bias, and notational bias: no group of people assessing risk is immune to "groupthink", the acceptance of obviously wrong answers simply because it is socially painful to disagree or because conflicts of interest exist. One proposed way to address framing problems in risk assessment or measurement (although some argue that risk cannot be measured, only assessed) is to elicit others' fears and personal ideals for the sake of completeness.
Neurobiology of framing
Framing involves other information that affects the outcome of a risky decision. The right prefrontal cortex has been shown to take a more global perspective, while greater left prefrontal activity relates to local or focal processing.
Based on the theory of leaky modules, McElroy and Seta proposed that they could predictably alter the framing effect by selectively manipulating regional prefrontal activity with finger tapping or monaural listening. The result was as expected: rightward tapping or listening narrowed attention such that the frame was ignored. This is a practical way of manipulating regional cortical activation to affect risky decisions, especially because directed tapping or listening is easily done.
Fear as intuitive risk assessment
For the time being, people rely on their fear and hesitation to keep them out of the most profoundly unknown circumstances.
In The Gift of Fear, Gavin de Becker argues that "True fear is a gift. It is a survival signal that sounds only in the presence of danger. Yet unwarranted fear has assumed a power over us that it holds over no other creature on Earth. It need not be this way."
Risk could be said to be the way we collectively measure and share this "true fear"—a fusion of rational doubt, irrational fear, and a set of unquantified biases from our own experience.
The field of behavioral finance focuses on human risk-aversion, asymmetric regret, and other ways that human financial behavior varies from what analysts call "rational". Risk in that case is the degree of uncertainty associated with a return on an asset.
Recognizing and respecting the irrational influences on human decision making may do much to reduce disasters caused by naive risk assessments that pretend to rationality but in fact merely fuse many shared biases together.
Risk assessment and management
Since risk assessment and management are essential in security management, the two are tightly related. Security assessment methodologies like CRAMM contain risk assessment modules as an important part of the first steps of the methodology. On the other hand, risk assessment methodologies like Mehari evolved to become security assessment methodologies. An ISO standard on risk management (principles and guidelines on implementation) was published as ISO 31000 on 13 November 2009.
Risk and size
In the book Megaprojects and Risk, Professor Bent Flyvbjerg (with Nils Bruzelius and Werner Rothengatter) demonstrates that big ventures (big construction projects, big capital investments, etc.) are highly risky. For instance, such ventures typically have high cost overruns, benefit shortfalls, and schedule delays, plus negative and unanticipated social and environmental impacts.
Risk in auditing
The audit risk model expresses the risk of an auditor providing an inappropriate opinion of a commercial entity's financial statements. It can be analytically expressed as:
- AR = IR × CR × DR
Where AR is audit risk, IR is inherent risk, CR is control risk and DR is detection risk.
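In practice the model is often rearranged to find the detection risk an auditor can accept given assessed inherent and control risk; a minimal sketch with illustrative figures:

```python
def audit_risk(inherent, control, detection):
    """AR = IR x CR x DR, all expressed as probabilities in [0, 1]."""
    return inherent * control * detection

def acceptable_detection_risk(target_ar, inherent, control):
    """Rearranged model: the detection risk that keeps audit risk at the target."""
    return target_ar / (inherent * control)

# Illustrative assessments: inherent risk 80%, control risk 50%, target AR 5%
dr = acceptable_detection_risk(0.05, 0.80, 0.50)
print(dr)  # 0.125, so substantive procedures must hold detection risk to 12.5%
```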
See also
- Applied information economics
- Ambiguity aversion
- Benefit shortfall
- Civil defense
- Country risk
- Cost overrun
- Credit risk
- Cultural Theory of risk
- Early case assessment
- Event chain methodology
- Financial risk
- Fuel price risk management
- Fuzzy-trace theory
- Global Risk Forum
- Hazard prevention
- Identity resolution
- Information Assurance
- Information Security
- Inherent risk
- Insurance industry
- Interest rate risk
- International Risk Governance Council
- IT risk
- ISO 31000
- ISO 28000
- Legal risk
- Life-critical system
- Liquidity risk
- List of books about risk
- Loss aversion
- Market risk
- Operational risk
- Optimism bias
- Political risk
- Preventive maintenance
- Preventive medicine
- Probabilistic risk assessment
- Reference class forecasting
- Reinvestment risk
- Reputational risk
- Risk analysis
- Risk aversion
- Risk factor (finance)
- Risk homeostasis
- Risk management
- Risk-neutral measure
- Risk perception
- Risk register
- Sampling risk
- Security risk
- Systemic risk
- Value at risk
References
- ^ Oxford English Dictionary
- ^ a b c d e Luhmann 1996:3.
- ^ James Franklin, 2001: The Science of Conjecture: Evidence and Probability Before Pascal, Baltimore: Johns Hopkins University Press, 274.
- ^ Luhmann 1996:4.
- ^ Douglas Hubbard The Failure of Risk Management: Why It's Broken and How to Fix It, John Wiley & Sons, 2009.
- ^ E.g. "Risk is the unwanted subset of a set of uncertain outcomes." (Cornelius Keating).
- ^ a b "An Introduction to Factor Analysis of Information Risk (FAIR)", Risk Management Insight LLC, November 2006.
- ^ Technical Standard Risk Taxonomy ISBN 1-931624-77-1 Document Number: C081 Published by The Open Group, January 2009.
- ^ "Risk is a combination of the likelihood of an occurrence of a hazardous event or exposure(s) and the severity of injury or ill health that can be caused by the event or exposure(s)" (OHSAS 18001:2007).
- ^ ISO/IEC 27005:2008.
- ^ Landsburg, Steven (2003-03-03). "Is your life worth $10 million?". Everyday Economics (Slate). http://www.slate.com/id/2079475/. Retrieved 2008-03-17.
- ^ Wired Magazine, Before the levees break, page 3.
- ^ Frank Hyneman Knight "Risk, uncertainty and profit" pg. 19, Hart, Schaffner, and Marx Prize Essays, no. 31. Boston and New York: Houghton Mifflin. 1921.
- ^ Douglas Hubbard "How to Measure Anything: Finding the Value of Intangibles in Business" pg. 46, John Wiley & Sons, 2007.
- ^ a b Douglas Hubbard, "The Failure of Risk Management: Why It's Broken and How to Fix It", John Wiley & Sons, 2009.
- ^ Rychetnik L, Hawe P, Waters E, Barratt A, Frommer M (2004 July). "A glossary for evidence based public health". J Epidemiol Community Health 58: 538–45. doi:10.1136/jech.2003.011585. PMC 1732833. PMID 15194712. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1732833.
- ^ Cortada, James W. (2003-12-04) The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries USA: Oxford University Press pp. 512 ISBN 0195165888 .
- ^ Cortada, James W. (2005-11-03) The Digital Hand: Volume II: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries USA: Oxford University Press ISBN 978-0195165876 .
- ^ Cortada, James W. (2007-11-06) The Digital Hand, Vol 3: How Computers Changed the Work of American Public Sector Industries USA: Oxford University Press pp. 496 ISBN 978-0195165869 .
- ^ 44 U.S.C. § 3542(b)(1).
- ^ Flyvbjerg, B., 2008, "Curbing Optimism Bias and Strategic Misrepresentation in Planning: Reference Class Forecasting in Practice." European Planning Studies, vol. 16, no. 1, January, pp. 3-21.
- ^ Damodaran, Aswath (2003). Investment Philosophies: Successful Investment Philosophies and the Greatest Investors Who Made Them Work. Wiley. p. 15. ISBN 0-471-34503-2.
- ^ Sapienza P., Zingales L. and Maestripieri D. 2009. Gender differences in financial risk aversion and career choices are affected by testosterone. Proceedings of the National Academy of Sciences.
- ^ Apicella, C. L. et al. "Testosterone and financial risk preferences". Evolution and Human Behavior, vol. 29, issue 6, pp. 384-390.
- ^ Value at risk.
- ^ http://flyvbjerg.plan.aau.dk/JAPAASPUBLISHED.pdf.
- ^ http://flyvbjerg.plan.aau.dk/Traffic91PRINTJAPA.pdf.
- ^ http://flyvbjerg.plan.aau.dk/0406DfT-UK%20OptBiasASPUBL.pdf.
- ^ A Positive Approach To Risk Requires Person Centred Thinking, Neill et al, Tizard Learning Disability Review http://pierprofessional.metapress.com/content/vr700311x66j0125/
- ^ Amos Tversky / Daniel Kahneman, 1981. "The Framing of Decisions and the Psychology of Choice."
- ^ Schatz, J., Craft, S., Koby, M., & DeBaun, M. R. (2004). Asymmetries in visual-spatial processing following childhood stroke. Neuropsychology, 18, 340-352.
- ^ Volberg, G., & Hubner, R. (2004). On the role of response conflicts and stimulus position for hemispheric differences in global/local processing: An ERP study. Neuropsychologia, 42, 1805-1813.
- ^ Drake, R. A. (2004). Selective potentiation of proximal processes: Neurobiological mechanisms for spread of activation. Medical Science Monitor, 10, 231-234.
- ^ McElroy, T., & Seta, J. J. (2004). On the other hand am I rational? Hemisphere activation and the framing effect. Brain and Cognition, 55, 572-580.
- ^ Flyvbjerg 2006.
- ^ Bent Flyvbjerg, Nils Bruzelius, and Werner Rothengatter, 2003, Megaprojects and Risk: An Anatomy of Ambition (Cambridge University Press).
Bibliography
- Bent Flyvbjerg, 2006: From Nobel Prize to Project Management: Getting Risks Right. Project Management Journal, vol. 37, no. 3, August, pp. 5–15. Available at homepage of author.
- James Franklin, 2001: The Science of Conjecture: Evidence and Probability Before Pascal, Baltimore: Johns Hopkins University Press.
- Niklas Luhmann, 1996: Modern Society Shocked by its Risks (= University of Hong Kong, Department of Sociology Occasional Papers 17), Hong Kong, available via HKU Scholars HUB.
- Moss, David A. When All Else Fails. Explains the U.S. government's historical role as risk manager of last resort.
- Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. ISBN 0-471-29563-9. Traces the understanding and appreciation of risk from the earliest times through the major mathematical figures of each age.
- Rescher, Nicholas (1983). A Philosophical Introduction to the Theory of Risk Evaluation and Measurement. University Press of America.
- Porteous, Bruce T.; Pradip Tapadar (December 2005). Economic Capital and Financial Risk Management for Financial Services Firms and Conglomerates. Palgrave Macmillan. ISBN 1-4039-3608-0.
- Tom Kendrick (2003). Identifying and Managing Project Risk: Essential Tools for Failure-Proofing Your Project. AMACOM/American Management Association. ISBN 978-0814407615.
- David Hillson (2007). Practical Project Risk Management: The Atom Methodology. Management Concepts. ISBN 978-1567262025.
- Kim Heldman (2005). Project Manager's Spotlight on Risk Management. Jossey-Bass. ISBN 978-0782144116.
- Dirk Proske (2008). Catalogue of Risks – Natural, Technical, Social and Health Risks. Springer. ISBN 978-3540795544.
- Gardner, Dan, Risk: The Science and Politics of Fear, Random House, Inc., 2008. ISBN 0771032994.
Articles and papers
- Clark, L., Manes, F., Antoun, N., Sahakian, B. J., & Robbins, T. W. (2003). "The contributions of lesion laterality and lesion volume to decision-making impairment following frontal lobe damage." Neuropsychologia, 41, 1474-1483.
- Drake, R. A. (1985). "Decision making and risk taking: Neurological manipulation with a proposed consistency mediation." Contemporary Social Psychology, 11, 149-152.
- Drake, R. A. (1985). "Lateral asymmetry of risky recommendations." Personality and Social Psychology Bulletin, 11, 409-417.
- Gregory, Kent J., Bibbo, Giovanni and Pattison, John E. (2005), "A Standard Approach to Measurement Uncertainties for Scientists and Engineers in Medicine", Australasian Physical and Engineering Sciences in Medicine 28(2):131-139.
- Hansson, Sven Ove. (2007). "Risk", The Stanford Encyclopedia of Philosophy (Summer 2007 Edition), Edward N. Zalta (ed.), forthcoming.
- Holton, Glyn A. (2004). "Defining Risk", Financial Analysts Journal, 60 (6), 19–25. A paper exploring the foundations of risk. (PDF file).
- Knight, F. H. (1921) Risk, Uncertainty and Profit, Chicago: Houghton Mifflin Company. (Cited at: , § I.I.26.).
- Kruger, Daniel J., Wang, X.T., & Wilke, Andreas (2007) "Towards the development of an evolutionarily valid domain-specific risk-taking scale" Evolutionary Psychology (PDF file).
- Metzner-Szigeth, A. (2009). "Contradictory Approaches? – On Realism and Constructivism in the Social Sciences Research on Risk, Technology and the Environment." Futures, Vol. 41, No. 2, March 2009, pp. 156–170.
- Miller, L. (1985). "Cognitive risk taking after frontal or temporal lobectomy I. The synthesis of fragmented visual information." Neuropsychologia, 23, 359–369.
- Miller, L., & Milner, B. (1985). "Cognitive risk taking after frontal or temporal lobectomy II. The synthesis of phonemic and semantic information." Neuropsychologia, 23, 371–379.
- Neill, M., Allen, J., Woodhead, N., Reid, S., Irwin, L., & Sanderson, H. (2008). "A Positive Approach to Risk Requires Person Centred Thinking". London: CSIP Personalisation Network, Department of Health. Available from: http://networks.csip.org.uk/Personalisation/Topics/Browse/Risk/ [Accessed 21 July 2008].
- Risk – entry in the Stanford Encyclopedia of Philosophy
- Risk Management magazine, a publication of the Risk and Insurance Management Society.
- The Institute of Risk Management (IRM), an international professional education and training body for risk management.
- Risk and Insurance
- StrategicRISK, a risk management journal
- "Risk preference and religiosity" article from the Institute for the Biocultural Study of Religion