Bibliography


{% §2.4 is on first-order risk aversion
P. 18 writes on violation of forgone-branch independence: “Such dependence is not irrational, ... disappointment or relief”
P. 25 top points out (for recursivity) that verification of a condition at the individual level does not imply the same condition at the aggregate level. Then writes: “We are left with the familiar “excuse” for representative agent modeling, namely the current lack of a superior alternative.”
P. 44 suggests that Hammond (1988) and Machina (1989) use the term consequentialism in the same sense (which they don’t).
P. 50: quasi-concave so deliberate randomization %}

Epstein, Larry G. (1992) “Behavior under Risk: Recent Developments in Theory and Applications.” In Jean-Jacques Laffont (ed.) Advances in Economic Theory II, 1–63, Cambridge University Press, Cambridge.


{% equilibrium under nonEU: discusses rationalizability and equilibrium for some nonEU theories. %}

Epstein, Larry G. (1997) “Preference, Rationalizability and Equilibrium,” Journal of Economic Theory 73, 1–29.


{% This paper explains the author’s views on ambiguity. It reflects impressive, deep, and consistent thinking. I disagree, however, with many intuitive directions chosen. The “finishing touch” for the author’s aims, endogenizing the definition of unambiguous events, will be given in Epstein & Zhang (2001, Econometrica), but most of the ideas, concepts, and interpretations are here. I will use the term ambiguity hereafter for what the author often calls uncertainty.
The author considers it desirable to endogenize many things such as probabilities. P. 583, Eq. 2.2, describes the “standard” definition of risk neutrality (and, hence, risk aversion) and the expectation involved therein not with respect to given objective probabilities as is common, but with respect to endogenous probabilities (Eq. 2.2), because there is a “for some” quantifier for the probability measure m. m is called subjective. In several places, for instance p. 585 below Eq. 2.5, the author equates “risk” with subjective rather than objective probabilities (SEU = risk). There are many economists who have done so since Savage (1954), including prominent ones. However, I think that this is an unfortunate and still minority terminology, and that risk better be related to objective probabilities only. By the way, in the latter way it was also defined by the author himself in Epstein (1992, p. 1)!
P. 584 1st para equates indifference-to-ambiguity with absence-of-ambiguity.
P. 584, Eq. 2.3, defines between-person more ambiguity averse as less favorable comparisons of ambiguous acts to unambiguous ones. If for every act a certainty equivalent exists, then the condition amounts to the same certainty equivalents for unambiguous acts and lower certainty equivalents for other acts. So, the comparison is defined only for people with the same preferences over unambiguous acts.
Ambiguity neutrality is defined as probabilistic sophistication. I argued in Wakker (2001, Econometrica, pp. 1051-1052) that such endogenous definitions are not tractable; they are hard to observe empirically. The same criticism holds for the definition by Ghirardato & Marinacci, where not probabilistic sophistication but subjective expected utility is taken as ambiguity neutrality. They have the same basic problem as Epstein, but take the other of the two then only available wrong ways to go. Epstein argues on p. 585 that his definition is consistent with common practice, but I think that common practice is only for exogenously given (so directly observable!) probabilities, until some authors (still a minority) started changing it since, say, 1990 (see my key word SEU = risk). Dean & Ortoleva (2017, Theoretical Economics) will nicely and properly point out that ambiguity neutrality means probabilistic sophistication when also objective probabilities are present. The implied agreement with objective probabilities (an exogenous concept) is the bigger half of it, and prob. soph. the smaller half.
Epstein then goes on to define ambiguity aversion as EXISTENCE of a hypothetical ambiguity neutral (= probabilistic sophistication) person who is less ambiguity averse than the real decision maker considered. Again, this existence clause makes the concept hard to observe. The probability measure of probabilistic sophistication is interpreted as index of belief. In general this need not be unique. It worries me that ambiguity aversion is a necessary prerequisite for defining beliefs. It also assumes that beliefs must still be quantifiable through Bayesian objective probabilities.
The author is well aware of the desirability of making ambiguity aversion observable. He provides impressively deep results on event differentiability to mitigate this problem. If a person satisfies event-wise differentiability of preferences, then eventwise local linear approximations of the preferences exist, which are probabilistically sophisticated. If this derivative is the same at every event (“coherence”), then ambiguity aversion holds if and only if it holds with respect to the derivative mentioned (Theorem 4.3, p. 599). Given the difficulty of observing probabilistic sophistication, and the depth of the ideas, this is an admirable achievement. It is, however, not a complete solution to the observability problem because deriving event derivatives from preferences is hard work, and the requirement that this derivative be the same at every event is very restrictive.
Another difficulty with the definition of belief is that it is completely ordinally driven. I think that in many situations there is more-than-ordinal information on beliefs, such as if we know that Choquet expected utility holds and we know the capacity at a more-than-ordinal level. Then we want to use that non-ordinal info for beliefs, rather than confine ourselves to the model-free ordinal info.
Under Choquet expected utility, a person is ambiguity averse if and only if the CORE of the capacity is nonempty, and each element of the CORE can serve the purpose of index of belief in the probabilistically sophisticated model. In the multiple priors model, any prior in the set of priors can serve this purpose. This shows that under these models, the indexes of belief and ambiguity neutrality are not unique.
Nonuniqueness will give conceptual problems when endogenizing unambiguous (as in Epstein & Zhang 2001). If there are two sources of uncertainty (say urns), and the decision maker is probabilistically sophisticated with respect to both, then which is to be taken as ambiguity neutral? It may matter for what we designate as ambiguity averse or not. This issue is discussed more in Epstein & Zhang (2001, Econometrica), pp. 281-282. %}

Epstein, Larry G. (1999) “A Definition of Uncertainty Aversion,” Review of Economic Studies 66, 579–608.
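The nonuniqueness point above (any element of a nonempty core can serve as the index of belief) can be made concrete with a small sketch. The following Python fragment is illustrative only: it checks core nonemptiness for a capacity on a three-state space by a grid search, where an exact check would solve a linear feasibility problem; the function names and the example capacities are hypothetical.

```python
from itertools import product, chain, combinations

def subsets(states):
    """All nonempty proper subsets of the state space."""
    return chain.from_iterable(combinations(states, r) for r in range(1, len(states)))

def core_members(capacity, states, grid=21):
    """Grid search for probability vectors p with p(E) >= W(E) for every event E.
    A coarse sketch only; grid=21 is divisible by 3 so the uniform prior is
    representable on the grid."""
    members = []
    for ticks in product(range(grid + 1), repeat=len(states)):
        if sum(ticks) != grid:
            continue
        p = {s: t / grid for s, t in zip(states, ticks)}
        if all(sum(p[s] for s in E) >= capacity[frozenset(E)] - 1e-9
               for E in subsets(states)):
            members.append(p)
    return members

states = ["R", "B", "Y"]
# Convex transform of the uniform measure: W(E) = (|E|/3)^2; its core is
# nonempty (it contains the uniform prior, among many others).
W = {frozenset(E): (len(E) / 3) ** 2 for E in subsets(states)}
core = core_members(W, states)
uniform = {s: 1 / 3 for s in states}
print(len(core) > 0)    # True: ambiguity averse in Epstein's sense
print(uniform in core)  # True: the uniform prior is one admissible belief index
# A concave transform W(E) = (|E|/3)^(1/2) gives an empty core: each singleton
# would need probability at least 0.577, impossible for three states.
W2 = {frozenset(E): (len(E) / 3) ** 0.5 for E in subsets(states)}
print(len(core_members(W2, states)) == 0)  # True
```

The many grid points in `core` for the convex capacity illustrate why the belief index is far from unique.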


{% Shows the logical possibility of falsifying probabilistic sophistication from consumer choices: if the asset demand contingent on s exceeds that on t even though the price at s exceeds that at t also, then s must be more probable than t. No contradictions should result from such observations. An obvious research question is whether there exists empirical evidence of this kind. %}

Epstein, Larry G. (2000) “Are Probabilities Used in Markets?,” Journal of Economic Theory 91, 86–90.
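The falsification test described above can be sketched in code: reveal "s more probable than t" whenever the contingent demand at s exceeds that at t even though s is also the more expensive state, and declare probabilistic sophistication falsified when these revelations cycle. This is a simplified, hypothetical reading of the test (in practice the revelations would be pooled across several budget sets); all data and function names below are made up for illustration.

```python
def revealed_likelihood(demand, price):
    """Pairs (s, t) with s revealed more probable than t: demand at s exceeds
    demand at t although the state price at s also exceeds that at t."""
    states = demand.keys()
    return [(s, t) for s in states for t in states
            if s != t and demand[s] > demand[t] and price[s] > price[t]]

def has_cycle(pairs):
    """Probabilistic sophistication is falsified iff the revealed relation cycles."""
    graph = {}
    for s, t in pairs:
        graph.setdefault(s, set()).add(t)
    def reachable(a, b, seen=()):
        return b in graph.get(a, ()) or any(
            reachable(c, b, seen + (a,)) for c in graph.get(a, ()) if c not in seen)
    return any(reachable(t, s) for s, t in pairs)

# Hypothetical observations of state-contingent demand and state prices
demand = {"s1": 5.0, "s2": 3.0, "s3": 1.0}
price  = {"s1": 0.5, "s2": 0.3, "s3": 0.2}
pairs = revealed_likelihood(demand, price)
print(pairs)             # s1 revealed more probable than s2 and s3; s2 than s3
print(has_cycle(pairs))  # False: no contradiction in this data set
```

A data set in which the revealed relation contained both (s, t) and a directed path from t back to s would be the kind of empirical evidence the annotation asks about.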


{% Uses the multiple priors model à la Gilboa & Schmeidler (1989) in a two-period two-consumer model. Is positive about the model, mentions tractability and potential fruitfulness. %}

Epstein, Larry G. (2001) “Sharing Ambiguity,” American Economic Review, Papers and Proceedings 91, 45–50.


{% dynamic consistency: AA model where both prior probabilities and method of updating are chosen subjectively. It builds on Gul & Pesendorfer. %}

Epstein, Larry G. (2006) “An Axiomatic Model of Non-Bayesian Updating,” Review of Economic Studies 73, 413–436.


{% Three-period model with anxiety and so on generated by past consumption, axiomatized. It can lead to information-aversion (information aversion). Considers RDU and the relative shape of probability weighting at different time points. %}

Epstein, Larry G. (2008) “Living with Risk,” Review of Economic Studies 75, 1121–1141.


{% A short, critical, summary is in Baillon, Driesen, & Wakker (2012) p. 486: “Epstein (2010) started by criticizing the problematic empirical status of the endogenous two-stage decomposition of KMM. His first example shows that KMM is not able to model ambiguity within a stage, which is related to our criticism of KMM’s use of expected utility within each stage. Epstein’s second example shows that KMM is not able to model different degrees of ambiguity within a stage, which naturally follows from his first example. His §3 criticizes KMM for deviating from multiple priors.”
I next give details:
This paper criticizes the famous KMM model (Klibanoff, Marinacci, & Mukerji 2005, Econometrica) of smooth ambiguity. I first list some weak points of the KMM model:
1. The status of the two-stage decomposition.
1.1. If it is endogenous, as suggested by most of the KMM paper and needed for its interpretations, then it is almost impossible to observe, one reason being that it requires too much richness.
1.2. If it is exogenous (not derived from preference but just imposed by the experimenter, often explicitly to subjects or otherwise imposed when analyzing), as assumed in virtually all applications, then it is simply a two-stage model with a Kreps & Porteus’ (1978) representation.
2. It assumes EU within each stage, which surely for empirical applications is subject to EU violations such as Allais’ paradox.
3. It models ambiguity attitude through (utility of) outcomes, but ambiguity attitude should primarily depend on the events considered, and not on the outcomes.
4a. It commits to violation of reduction of compound lotteries, which is controversial.
4b. It commits to the dynamic principles of backward induction for nonexpected utility, as do all models that use the Anscombe-Aumann model. This is however controversial for nonEU with, for instance, Machina (1989, JEL) strongly arguing against it. KMM do not discuss this point.
5. Their condition of smooth ambiguity aversion is not directly observable and is not a preference condition because it takes !!subjective!! probabilities as input, which is the same as regards observability as taking utility as input.
6. Their whole model is targeted towards aversion to ambiguity, as are most models today, but it does not consider the empirically important likelihood insensitivity. It cannot do the latter because one then has to distinguish likely from unlikely events, which one cannot do if going by outcomes rather than by events.
Epstein targets the first two points explicitly, and the 3rd somewhat implicitly (in a lecture at HEC, April 2009, Paris, he once explicitly stated the 3rd point, so he also agrees with it). He does not discuss the other points.
So I agree that these points deserve criticism. But I do not agree with the way in which Epstein’s paper delivers it.
The paper starts with an example of an exogenous two-stage case where the 2nd stage has Ellsberg events, making the EU model there questionable (hence, also, that second-order acts are evaluated by EU, which cannot capture the ambiguity within that stage, which Epstein then contrasts with the modeling of ambiguity for Savagean acts depending on the 1st stage). I think that in essence Epstein is right here, and there is no reason for KMM to assume EU for the 2nd stage. But KMM can try a defense, namely that they can handle Ellsberg in the 2nd stage as they do it everywhere: by adding a stage on top, which here would lead to 3 stages. (So, for descriptive purposes, Allais would be better to criticize EU within a stage.) But then Epstein, replying to this defense in §2.3, goes on to argue that they then take their model endogenous, making it unobservable. He could have made this point immediately, skipping the path through Ellsberg’s example. The presence of the Ellsberg example in his paper can be further explained by the history of this paper:
HISTORY. A first version of this paper (July 6, 2009) reacted to the defense mentioned by saying that Epstein could then assume Ellsberg events in a 3rd level; that, whenever KMM resort to an n-level model, Epstein could assume Ellsberg events in the nth level; that KMM thus always have to add one level; and that, continuing this way, it could become very complex with many levels. Epstein would then consider that complexity to be an argument against KMM. I would say that it is an argument against Epstein’s example. In the published version of Epstein’s paper this discussion has been dropped, but the Ellsberg paradox has remained as a left-over.
Obviously, if KMM cannot handle ambiguity within the 2nd stage, then they can neither distinguish between different degrees of ambiguity in the 2nd stage. This is the topic of Epstein’s example in §2.4. I don’t see what it adds to the first example.
In the reply of Klibanoff, Marinacci, & Mukerji (2012, Econometrica), KMM12 henceforth, KMM12 indeed defend by adding the extra, I would say 3rd, stage. They next collapse what I would call the 1st and 2nd stage into what usually is their 1st stage state space. A weak point in their defense, at this point, is: how can users of the KMM model know whether to remodel or not? KMM12 argue, p. 1307, citing Marschak & Radner, that “all relevant info” should be incorporated into the (1st order) state space. I think that KMM12 interpret this requirement too strictly. Meta-info about what the proper probabilities over the state space are, for instance, should not be part of the state space. (What I write here is often violated, for instance, by Aumann, who took Savage’s unfortunate requirement of the state space specifying all info too literally, leading to circular definitions.) KMM12 use similar reasonings to reply to Epstein’s §2.4. Their footnote 8, p. 1309, again shows this overly strict interpretation of the Marschak & Radner citation, as does their final sentence in §2.3.
§2.5 presents a nice thought experiment: imagine we have the KMM model with the two-stage decomposition and the second-order measure endogenous. Then the subject is informed that there is a, now exogenous, two-stage decomposition with the same second-order measure, but now objectively given. Would the subject change behavior? I think that KMM would say “yes” because it has now changed into a regular two-stage model with no ambiguity perceived at all. But Epstein argues that behavior then should not change.
§3 is strange. It presents a thought experiment with two indifferent Anscombe-Aumann acts f1 and f2 generated by mutually independent ambiguous events. It argues that then the probabilistic mix ½f1 + ½f2 should be indifferent to f1 and f2. I expect that most readers will find ½f1 + ½f2 on p. 2095 less ambiguous and less aversive than f1 and f2, in agreement with the intuition of KMM cited by Epstein. (KMM12 also argue for this, and cite an experiment where it is apparently shown.) Epstein disagrees. Very strangely, the only argument he puts forward is that, apparently, “the” multiple priors model (MP) (and its restricted way of modeling hedging) implies his claimed indifference. Epstein here and throughout seems to assume as self-evident that the MP model is the gold standard. This is also suggested by the citation of Epstein & Schneider (2010) on pp. 2096-2097 who survey a “growing” literature on “fruitful” applications of MP in finance. So KMM are being criticized here for not being MP ...
§4, with concluding remarks, suggests that MP is “tighter” than KMM, but it only shows that the MP model uses fewer parameters than KMM, not that it is a subset. %}

Epstein, Larry G. (2010) “A Paradox for the “Smooth Ambiguity” Model of Preference,” Econometrica 78, 2085–2099.
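The hedging intuition at stake in §3 can be illustrated numerically. The sketch below uses the classic complementary Ellsberg bets (not Epstein’s independent-events setup, where the calculus differs) with linear utility; the prior set, second-order weights, and the concave phi are all hypothetical choices made only so that the numbers are easy to verify by hand.

```python
# Evaluate Anscombe-Aumann acts under multiple priors (maxmin) and under a
# KMM-style smooth model. Linear utility throughout; illustrative toy only.

priors = [0.3, 0.5, 0.7]   # candidate probabilities of Red (hypothetical)
mu = [1/3, 1/3, 1/3]       # KMM second-order weights (hypothetical)

def eu(act, p):            # act = (payoff on Red, payoff on Black)
    return p * act[0] + (1 - p) * act[1]

def maxmin(act):           # Gilboa & Schmeidler multiple priors value
    return min(eu(act, p) for p in priors)

def kmm(act, phi):         # smooth-model value; phi increasing, so comparing
    return sum(m * phi(eu(act, p)) for m, p in zip(mu, priors))  # sums suffices

f1 = (1.0, 0.0)            # bet on Red
f2 = (0.0, 1.0)            # bet on Black
mix = (0.5, 0.5)           # the probabilistic 50-50 mixture of f1 and f2

print(maxmin(f1), maxmin(f2))  # 0.3 0.3
print(maxmin(mix))             # 0.5: strict preference for hedging
phi = lambda x: -((1 - x) ** 2)  # a concave phi: smooth ambiguity aversion
print(kmm(mix, phi) > kmm(f1, phi))  # True: the smooth model also rewards the hedge
```

In this complementary-bets case both models strictly prefer the mixture, in line with the intuition that most readers share; Epstein’s §3 argument concerns independent ambiguous events, where he takes multiple-priors indifference as the benchmark.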


{% Do an Epstein & Zin type quantitative assessment.
information aversion: seem to discuss it. %}

Epstein, Larry G., Emmanuel Farhi, & Tomasz Strzalecki (2014) “How Much Would You Pay to Resolve Long-Run Risk?,” American Economic Review 104, 2680–2697.


{% dynamic consistency: favors abandoning forgone-event independence, so, favors resolute choice.
information aversion (p. 11/12); they propose the term “independence from unrealized alternatives” for forgone-branch independence (often called consequentialism).
foundations of statistics: p. 4 suggests that choice-time independence (p. 11) and collapse independence (p. 12) are natural in statistics, and that forgone-event independence should be abandoned. %}

Epstein, Larry G. & Michel Le Breton (1993) “Dynamically Consistent Beliefs Must Be Bayesian,” Journal of Economic Theory 61, 1–22.


{% They consider an assumption such as an event A existing with W(A) + W(S−A) = 1 (S the universal event; this is a symmetry-of-capacity condition for A), so that under RDU for this event we have SEU. %}

Epstein, Larry & Massimo Marinacci (2001) “The Core of Large Differentiable TU Games,” Journal of Economic Theory 100, 235–273.
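The symmetry condition above can be checked in a few lines: if W(A) + W(S−A) = 1, the rank-dependent decision weights on the partition {A, S−A} are the same whichever event carries the better outcome, so RDU coincides with SEU with p(A) = W(A) for acts measurable with respect to this partition. The sketch below assumes linear utility for simplicity; function names are hypothetical.

```python
def rdu_two(W_A, W_Ac, x_A, x_Ac):
    """RDU of an act paying x_A on A and x_Ac on S-A (linear utility): the
    better-ranked event gets its own capacity value, the other the remainder."""
    if x_A >= x_Ac:
        return W_A * x_A + (1 - W_A) * x_Ac
    return (1 - W_Ac) * x_A + W_Ac * x_Ac

# Symmetric capacity on {A, S-A}: W(A) + W(S-A) = 1.
W_A, W_Ac = 0.4, 0.6
def seu(x_A, x_Ac):               # SEU with p(A) = W(A)
    return W_A * x_A + (1 - W_A) * x_Ac

for act in [(10, 2), (2, 10), (5, 5)]:
    assert abs(rdu_two(W_A, W_Ac, *act) - seu(*act)) < 1e-12
print("RDU agrees with SEU on this partition")

# Without symmetry (W(A) + W(S-A) < 1, as under ambiguity aversion) they differ:
print(rdu_two(0.4, 0.4, 2, 10))   # 5.2, whereas seu(2, 10) = 6.8
```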


{% %}

Epstein, Larry G. & Massimo Marinacci (2007) “Mutual Absolute Continuity of Multiple Priors,” Journal of Economic Theory 137, 716–720.


{% Continuing on the Kreps idea of demand for flexibility and choices from menus. The state space is then derived endogenously. %}

Epstein, Larry G., Massimo Marinacci, & Kyoungwon Seo (2007) “Coarse Contingencies and Ambiguity,” Theoretical Economics 2, 355–394.


{% dynamic consistency. Non-EU & dynamic principles by restricting domain of acts
Strongly argue that dynamic consistency is normative. Give up reduction of compound lotteries. Their recursive multiple priors was considered before by Sarin & Wakker (1998, JRU, pp. 87–119), Theorem 2.1. Sarin & Wakker also used what Epstein & Schneider call rectangular, calling it the reduced family. Hansen, Sargent, Turmuhambetova, & Williams (2006, p. 78) argued that this family is too restrictive. A mathematical mistake is pointed out and corrected by Wakai (2007). %}

Epstein, Larry & Martin Schneider (2003) “Recursive Multiple Priors,” Journal of Economic Theory 113, 1–31.


{% %}

Epstein, Larry & Martin Schneider (2007) “Learning under Ambiguity,” Review of Economic Studies 74, 1275–1303.


{% Use multiple priors model, focusing on ambiguity generated during the processing of new information that may be of low quality. Then derive implications for topics of interest in finance. %}

Epstein, Larry G. & Martin Schneider (2008) “Ambiguity, Information Quality, and Asset Pricing,” Journal of Finance 63, 197–228.


{% %}

Epstein, Larry G. & Martin Schneider (2010) “Ambiguity and Asset Markets,” Annual Review of Financial Economics 2, 315–346.


{% PT, applications: nonadditive measures, excess volatility in security markets;
dynamic consistency %}

Epstein, Larry G. & Tan Wang (1994) “Intertemporal Asset Pricing under Knightian Uncertainty,” Econometrica 62, 283–322.


{% PT, applications: nonadditive measures, excess volatility in security markets %}

Epstein, Larry G. & Tan Wang (1995) “Uncertainty, Risk-Neutral Measures and Security Price Booms and Crashes,” Journal of Economic Theory 67, 40–82.


{% Games with incomplete information; do Mertens & Zamir (1985) for general nonEU where there need not even be a separable component reflecting belief. Hence, a hierarchy of preferences, instead of a hierarchy of beliefs, results.
P. 1344: “In Savage’s model, states of the world logically precede the specification of axioms.” They specify complications about assuming common knowledge etc. in the description of states of the world, and refer then to Aumann (1987), in a formulation that is not explicit about whether they criticize Aumann for it or not.
P. 1345: preferences need not even have a separable component that can be thought of as “beliefs;”
P. 1351: not only first-order uncertainty but also how DM feels about that, and feels about that feeling, etc., is incorporated in states of nature. So, an infinite hierarchy. (Compare to conservation of influence) %}

Epstein, Larry G. & Tan Wang (1996) “ “Beliefs about Beliefs” without Probabilities,” Econometrica 64, 1343–1374.


{% %}

Epstein, Larry G. & Jiankang Zhang (1995) “Expected Utility with Inner Measures,” Dept. of Economics, University of Toronto, Canada. Rewritten as Zhang, Jiankang (1997) “Subjective Ambiguity, Probability and Capacity,” Dept. of Economics, University of Toronto, Canada.


{% Propose the least convex transform of the capacity as index of belief, the least concave function representing riskless preference as riskless attitude, and the rest as “willingness to bet.” Define a likelihood relation over events through bets on events. %}

Epstein, Larry G. & Jiankang Zhang (1999) “Least Convex Capacities,” Economic Theory 13, 263–286.


{% The set of unambiguous events is a lambda system. There they impose qualitative probability and probabilistic sophistication à la Machina & Schmeidler (1992). Event T is, therefore, unambiguous if the more-likely-than relation, conditioned on Tᶜ, between two subsets A and B of Tᶜ, with the act on Tᶜ\(A∪B) a fixed act h, does not depend on the fixed outcome at T. More likely is defined from bets on, not against, events (e.g., pp. 273, 280), as in Sarin & Wakker (1992), and as criticized by Nehring (1994).
They emphasize strongly that their definition of unambiguous is not exogenous but endogenous; i.e., derived from preference.
If there are two sources of uncertainty, say two different urns, and we find probabilistic sophistication for both in isolation but, say, a different probability transformation for one than for the other (so no probabilistic sophistication when joining the two), then it is not clear which source is to be taken as ambiguity neutral. Maybe one is ambiguity averse and the other ambiguity neutral, but maybe the one is ambiguity neutral and the other ambiguity seeking. This paper discusses this issue on pp. 281-282 and 295. Ambiguity of an event depends on the other events available (also visible in the role of A and B in the definition of unambiguous on p. 273). This point, defended by the authors on p. 295, is different, for example, for risk, where risk neutrality (EV maximization) w.r.t. a partition is determined by gambles on that partition and is not affected by the presence of other events.
The authors confound absence of ambiguity and neutrality towards ambiguity. They discuss this issue on p. 283 penultimate paragraph, comparing their treatment of ambiguity with risk. But the comparison is not proper: for risk it IS possible that I really perceive of risk and risk is not absent, but yet I am risk neutral. In the approach of Epstein & Zhang it is not possible that I really perceive of ambiguity but yet am neutral with respect to it. Absence of ambiguity, and neutrality towards it, are really confounded.
The second part of Corollary 7.4(a) on p. 287, claiming a characterization of risk aversion under rank-dependent utility, is not correct. Concavity of u is not necessary. This was pointed out by Chateauneuf & Cohen (1994, JRU, Corollary 2 on p. 86).
Footnote 18, p. 279 is also incorrect because the capacity need only be a transformation of an additive measure, and need not be additive, as one readily verifies. The transformation may very well be nonlinear, making the capacity nonadditive on the algebra mentioned. It implies that, contrary to the authors’ claim in 2nd para of p. 279, Axiom 6 need not be satisfied by CEU (Choquet expected utility). Also contrary to the authors’ claim, this axiom is rather restrictive, capturing a considerable part of probabilistic sophistication in addition to their unambiguity axiom.
For example, assume that S = [0,1] × [0,1], and let f and g be probability transformation functions. For all events A × [0,1], the capacity W is the f-transform of the Lebesgue measure (the usual additive measure assigning to each interval its length, so that W([a,b] × [0,1]) = f(b−a)); and for all events [0,1] × B, W is the g-transform of the Lebesgue measure.
Let A1 = [0,1/n] × [0,1], …, Ai = ((i−1)/n, i/n] × [0,1], …, An = ((n−1)/n, 1] × [0,1].
Let B1 = [0,1] × [0,1/n], …, Bi = [0,1] × ((i−1)/n, i/n], …, Bn = [0,1] × ((n−1)/n, 1].
If f(1/n) = g(1/n), then W(Ai) = W(Bj) for all i,j, and strong-partition-neutrality implies that W of the union of j A-events agrees with W of the union of j B-events, so that f(j/n) = g(j/n) for all j. This is very restrictive. Assume next, for some ε > 0, that f and g coincide on [0,ε). It then easily follows, first for all rational numbers and then by continuity throughout the domain, that f and g coincide throughout their domain [0,1]. Because of the erroneous footnote 18, the authors apparently were not aware of the existence of examples as above with nonlinear and different f and g. At the RUD 2006 conference in Paris, Epstein explained in public during my lecture that this paper had been developed to handle the three-color Ellsberg urn and not the two-color urn. %}

Epstein, Larry G. & Jiankang Zhang (2001) “Subjective Probabilities on Subjectively Unambiguous Events,” Econometrica 69, 265–306.
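The restrictiveness of the example can be verified numerically. The sketch below takes n = 4, f(x) = x², and a hypothetical piecewise-linear g chosen so that f(1/4) = g(1/4): all n cell capacities then match, yet f(1/2) ≠ g(1/2), so the union of two A-cells and the union of two B-cells receive different capacities, confirming that the axiom fails for this CEU capacity.

```python
# Product-capacity example: f transforms Lebesgue measure along the first
# coordinate, g along the second. g is a hypothetical piecewise-linear
# transform chosen only for this illustration.

def f(x):
    return x ** 2

def g(x):
    if x <= 0.25:
        return 0.25 * x                              # g(1/4) = 1/16 = f(1/4)
    return 1/16 + (x - 0.25) * (1 - 1/16) / 0.75     # linear up to g(1) = 1

n = 4
# Cell capacities W(Ai) = f(1/n) and W(Bj) = g(1/n): all equal.
assert abs(f(1 / n) - g(1 / n)) < 1e-12
# But the two-cell unions differ: strong-partition-neutrality would force
# f(2/n) = g(2/n), which fails here.
print(f(2 / n), g(2 / n))   # 0.25 0.375
```

Both f and g are increasing with f(0) = g(0) = 0 and f(1) = g(1) = 1, so both are legitimate probability transformations, which is what makes the example a genuine counterexample to the footnote.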
