How to Save Pascal (and Ourselves) From the Mugger

In this article, we re-examine Pascal's Mugging and argue that it is a deeper problem than the St. Petersburg paradox. We offer a way out that is consistent with classical decision theory. Specifically, we propose a "many muggers" response analogous to the "many gods" objection to Pascal's Wager. When a very tiny probability of a great reward becomes a salient outcome of a choice, such as in the offer of the Mugger, it can be discounted on the condition that there are many other symmetric, non-salient rewards that one may receive if one chooses otherwise.

We pick up the scene from Nick Bostrom (2009): Pascal is approached in a dark alley by a "Mugger" who insists that he hand over his wallet (and the 10 livres in it). Though the Mugger has no gun or other means of threat, he promises to return the next day with double the amount in Pascal's wallet. Pascal declines the offer, at which point the Mugger offers 10 times the amount. When Pascal declines again, and says that he thinks the chances that the Mugger delivers on the promise are one in 1,000, the Mugger offers to return with 2,000 times the value, and Pascal remains sceptical. This goes on for a while, with the Mugger ultimately claiming that he is from the Seventh Dimension and has magical powers to deliver on any promise he might make.

Pascal:
The problem is that, given that I have already agreed that there is no temporal discounting of possible rewards,4 there is a possible infinity of rewards that you might promise me. If your powers are really strictly finite, as you claim, would you mind telling me where they give out? What is the largest reward that you can deliver?

Mugger:
In claiming that my powers are strictly finite, I meant that I can only deliver a finite amount, though I can deliver any finite amount you might want, without limit.5

Pascal:
But, for whatever odds I give you as a proposed lower bound, you might promise me a reward that is so extravagant that it would be more unlikely than those odds. That isn't a problem with my predictive capacities or an irrationality of mine; that's just how a series of decreasing probabilities works. It's like playing a game of who can say a higher number: the person who goes second can always win! The core of your trick is just that, whatever low odds I give, you can promise to match them with a correspondingly higher value to make it seem to be worth giving you my wallet. This isn't a question of "infinite values" (whatever those are!); it's just a question of the limitless nature of our mathematical imaginations. So I'm not going to tell you any odds in advance of hearing the specific reward to which the odds directly apply. That seems only fair.

Mugger:
Fine. I can understand your hesitancy to tell me the odds before I tell you the reward.

Pascal:
Good! So, I'm keeping my wallet.

Mugger:
Not so fast.

* * *

Part II. In Which Pascal Agrees to, and Gives, a Decreasing Probability Function, and Believes That He Is Now Off the Hook

Mugger:
How about this instead: you seem to be willing to tell me odds after I've told you what the reward would be. So, how about giving me, in advance, just a probability distribution of the chances of my fulfilling different possible rewards that I might promise you?

Pascal:
Well, that seems incredibly difficult! I have no clue what you might promise!

Mugger:
I'll make it easier on you. We in our Dimension are trained to give people many days of bliss. So, how about you give me a function, for each x, going from an input of x days of bliss to an output of a probability of my being able to deliver that many days of bliss to you?

Pascal:
[Thinks for a moment.] Well, I think that the chance of you giving me even one day of bliss is about one in a billion, and it's going to get lower from there. How about a reciprocal function? Say:6

1 / (10^6 × (x + 1,000))

It's just a simple function, and maybe you will allow me to add some minor tweaks later, but that should do, since it decreases just as fast as the days of bliss increase. And so, no matter how many days x of bliss you promise me, the odds of you fulfilling that promise are less than one in x million. And, at those odds, it would never be worth handing over my wallet, given that I have some livres in there to lose. It's (writing 1/2^x for the odds that you promise me x days):

∑ (from x = 1 to ∞) of x × (1/2^x) × 1 / (10^6 × (x + 1,000))

That sum is equal to approximately 1 × 10^−9. This is obviously much lower than the expected value of keeping the 10 livres! So, it is definitely not worth me giving you my wallet.
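Pascal's arithmetic here can be checked with a short numerical sketch. This is our reconstruction, not code from the article: it assumes, as Pascal proposes elsewhere in the dialogue, that the odds of the Mugger promising x days of bliss are 1/2^x and that his odds of fulfilling such a promise are 1/(10^6 × (x + 1,000)):

```python
# Sketch of Pascal's expected-value sum (our reconstruction, not the authors' code).
def p_fulfil(x):
    # Odds that the Mugger delivers x promised days of bliss.
    return 1 / (1e6 * (x + 1000))

def p_promise(x):
    # Odds that the Mugger promises exactly x days of bliss.
    return 0.5 ** x

# Expected days of bliss from accepting: sum over x of x * p_promise(x) * p_fulfil(x).
# Terms beyond x = 200 are negligible because of the 1/2^x factor.
expected_days = sum(x * p_promise(x) * p_fulfil(x) for x in range(1, 200))

print(expected_days)  # on the order of 10^-9
```

The sum settles on the order of 10^−9 days of bliss, which is why handing over a wallet containing 10 livres comes out as a clear loss.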
Mugger: [Motions with hand; another person appears in the alleyway.]

Pascal:
Who's that? She is looking at me … menacingly.
Mugger: Oh, that's my Boss, also from the Seventh Dimension. You see, we have a system set up.

Pascal:
A system? This is not sounding good. Yes, actually, I did check, and they showed me some stacks of cash.

Mugger:
But how could you be really sure that they could give you any amount of rubles with no limit, if your coin flips came out right? What we have is a different, ummm … a different opportunity for you.

Pascal:
Hmm …

Mugger:
We all know that money has diminishing marginal value, and you have a limited amount of time on the planet, things that might in practice have swayed intuitions in St. Petersburg not to give them even more money up front.9 We from the Seventh Dimension don't have nearly the same limitations in our negotiations; we can offer you anything you'd like. From your perspective, the only risk is the chance that I might not come through for you. Of course, you think that that chance is high, but it's your one and only risk, and that risk is already accounted for in your utility function.10 In St. Petersburg, you had to risk both getting an unlucky roll (which you did) plus another risk that the house might not come through for you if you did hit it big.11

Pascal:
That sounds right.

Mugger:
And the other big difference is that (and I don't mean to sound like a stalker) our bargain is ever-present in a way that the St. Petersburg game is not.

Pascal:
What do you mean?

Mugger:
Honestly, I could even make it an ongoing offer. I could tell you that, at any point in time, you could just wire us 1,000 livres and we'd promise to give you some fantastic number of days of bliss. You don't even need to be in a place with a coin flip system set up.

Pascal:
Hmm … Something like this. If you want to play the game of exponentially increasing unlikely-to-occur rewards, I can imagine any kind of odds function, including one with an exponent in the reciprocal, like this. The odds of you coming through with x days of bliss, in the context of you also telling me that there are lots (2^x) of others who are also going to be promising me the same, is this:

10 [Footnote, continued:] … may not be rational, and for the purposes of this article, we can stipulate that Pascal, as an adherent of classical decision theory, is not loss averse.
11 See Jeffrey (1983, §10.2), who writes that "it is essential that we avoid the St. Petersburg paradox." He continues, "our rebuttal … consists in the remark that anyone who offers to let the agent play the St. Petersburg game is a liar, for he is pretending to have an indefinitely large bank." At least from Pascal's perspective, the Mugger in the present dialogue does not have that restriction, since all that matters is the fact that there is a chance that he might come through. He does not need to give any other evidence up front. Thus, the Mugger paradox is in this way a deeper paradox than St. Petersburg.

1 / (10^6 × (2^x + 1,000))

And, if I do the utility calculation on that, even including all your friends (but dividing now by one in 10 quadrillion), the utility will come out as something very tiny, approximately 1 × 10^−25. Obviously, that would be a huge mistake, but I can imagine it. Humans often make mistakes like that!

Stranger:
Well, not so fast. Bear with me. Let's say that, after you walk away dejected, your next "friend" there [points] comes up to Pascal. And your "friend" tells him that, precisely because he declined your offer, they are just going to give him 1,000 quadrillion days of bliss.
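The effect of the Boss's move, and of Pascal's exponential repair, can be sketched numerically. This is our reconstruction under one simplifying assumption: with 2^x muggers each as likely to deliver as the original one, the promise odds 1/2^x and the mugger count 2^x cancel, so each term of the expected-value sum is x × p(x) for a per-mugger fulfilment function p:

```python
# Our sketch (not the authors' code) of why Pascal moves the exponent
# into the denominator once the Boss conjures up 2^x fellow muggers.
def partial_sum(p, n_terms):
    # Each term: (promise odds 1/2^x) * (x days) * (2^x muggers * p(x)) = x * p(x).
    return sum(x * p(x) for x in range(1, n_terms + 1))

original = lambda x: 1 / (1e6 * (x + 1000))          # Pascal's first reciprocal function
revised  = lambda x: 1 / (1e6 * (2.0 ** x + 1000))   # exponent moved into the reciprocal

# With the original function, the partial sums grow without bound
# (each term tends to 1/10^6), so the expected value diverges.
assert partial_sum(original, 100_000) > 10 * partial_sum(original, 1_000)

# With the revised function, the series converges to something tiny.
tamed = partial_sum(revised, 500)
assert tamed < 1e-6   # far below the value of the 10 livres in the wallet
```

Nothing here vouches for the article's exact figure of 1 × 10^−25 (which also folds in the one-in-10-quadrillion odds); the point is only the divergence/convergence contrast between the two functions.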

Mugger:
Outlandish! That won't happen.

Pascal:
[Irritated, chimes in.] Really? I thought we are loath to give zero probability to metaphysically possible events.

Stranger:
Indeed! So tell me: what are the chances, in your mind, that that might happen?

Mugger:
[Pauses to think.]

Stranger:
And I see your other friends. They all might have their own agendas, and perhaps their own offers for Pascal. There could be many muggers out here in this alley!12

Mugger:
Hmm.

Stranger:
And is that low bound any lower than the one Pascal might've given for the likelihood that you'll come through for him?

Mugger:
Well, I am here, and he already has my word that I will give him all of those days of bliss. Those other possibilities about my friends are just idle speculations.

Pascal:
[Picks up on Stranger's line of thought.] You have indeed made the possibility of your giving me 1,000 quadrillion days of bliss decision-theoretically salient to me …

Mugger:
What does that mean?

Pascal:
A possibility is decision-theoretically salient when one has taken account of the possibility in one's decision structure. And it doesn't even mean that I brought the possibility to mind individually. It just means that it has entered my utility calculations, perhaps as one of a very large series of possibilities to which I have given a probability distribution function.
Of course, if a possibility (or a set of possibilities) is not psychologically salient to one, that could affect whether one takes account of it in one's decision making; it could affect whether it is decision-theoretically salient.

Pascal:
Now, I confess, I just started speculating about how your friends might help me out a minute ago when this Mysterious Stranger came over. But I don't think that the possibility of them coming through with a great deal for me (conditional on my declining your offer) is any less likely than you coming through for me with a great deal (conditional on my accepting your offer).

Mugger:
Well, that's harsh! Doesn't my giving you my word give you some greater credence in me, Pascal?

Pascal:
Earlier, I said that I was torn between giving an always-decreasing probability function and one that had a low bound. What might justify having a low bound? Maybe because the story about you coming from the Seventh Dimension and giving me something really great has about as much chance as any scenario that undermines what I take to be the laws of nature and my basic experience. And your being from the Seventh Dimension is like that.

Mugger:
But it wouldn't be rational to give the same low probability to all of these scenarios. Some of them are …

Stranger:
Yes, conjunctions of others, and thus should have even lower odds, equal to the product of the two, assuming that they are independent.

Pascal:
But bending the laws of the universe for two scenarios means that they are not independent of each other, even if the laws of the universe are bent in different ways for the two scenarios …

Stranger:
[Interrupts.] Pascal, let me help you out. You seem inclined to say that there is a set of possible outcomes such that each outcome in the set is equally likely to occur no matter what you decide to do. And I can see how that would be motivated by the fact that, for many scenarios, the probability hits some kind of lower limit. But our Mugger friend is correct that that would violate principles of aggregating the probabilities of outcomes.

Mugger:
Aha!

Stranger:
But Pascal can put things differently. Pascal can say that, for any decision, there is a set of unlikely outcomes such that, for each of them, there is a member of another set of unlikely outcomes with the same probability and the same reward. And because of that, we can ignore all of the outcomes in both sets.

Pascal:
Are they not the same set?

Stranger:
Well, if you give the Mugger your wallet, he might take it home, and then it might grow wings, flap them, and land in his sink. The chances of it landing in his sink will be higher if you give it to him than if you keep it. But the chances that it would fly into your sink if you keep it are, for all we can tell, the same as the chances of it flying into his sink if he took it home.

Mugger:
I think his sink is probably bigger; business has been slow for me the last couple of years!

Stranger:
That may be. But all in all, the total expected value of all of these unlikely but possible things happening, in one of the sets, is going to be equivalent to the total of the other set, whatever you do. And, if not, then maybe he should take your offer. But that's not the case here.

Mugger:
I'm not sure of that!

Stranger:
Perhaps, in some objective sense, some of these possibilities are more likely than others. But, from Pascal's own position, there is no reason for him to assign a greater overall expected value to the members of one set (collectively) rather than to the other.

Mugger:
Hmm …

Pascal:
That sounds right. So, no deal.

Mugger:
[Scowls.]

* * *

Part VIII. In Which the Stranger Clarifies That We Should Not Discount in General and That Classical Decision Theory Is Preserved

Mugger:
So, what you're saying is, you are going to discount the expected value of my coming through on my offer to you down to zero?13 In other words, you are not even going to include, at all, the value of the possibility of my coming through for you in your utility calculation of accepting my offer, because you take the probability to be very small? For several reasons, discounting is irrational.14

Pascal:
No, no, I agree; discounting in most cases is wrong. That's why I was so interested in your offer in the first place.

Stranger:
You should have been more wary, Pascal!

Pascal:
If I get in my car and emit some greenhouse gases, there is a slight risk that that will make the world worse. (There is also a slight chance that it will make the world better, but there is more of a chance that it'll make things worse.) And I shouldn't discount that slight marginal risk. That's why there are reasons not to go for frivolous drives.15

Mugger:
Nice to hear that you care about the environment. But I saw you driving earlier today!

Pascal:
I only do it because other benefits from my driving (like getting to my job) outweigh the expected harms. (Or so I hope. Maybe I'm a bit selfish in considering my own goods above the harms to others. But that's a topic for another day.)

Mugger:
OK, I'll grant you that. But let me get this straight. You are now discounting the small chance that my offer will be a good one, but you don't think discounting in general is acceptable?

Pascal:
Yes. But perhaps we can distinguish the tiny probabilities we can rationally discount from the ones we can't.

Stranger:
The reason for discounting is not the tininess of the probability or the tininess of the expected value (which might not actually be tiny!) itself. Instead, when a new possibility becomes salient, one can discount it if one realizes that there exists a symmetrical, previously non-salient possibility that cancels out the expected value/disvalue of the one you are considering.
Or, if it's a series of tiny possibilities that become salient, then one can discount them if one notices that there is a previously unrecognized, symmetrical series with which one can form a one-to-one function to the newly salient one. And, with a tiny probability like yours, there are basically an unlimited number of symmetrical possibilities. Your trick was that you made the remote possibilities on one side of the choice salient, but in a way that kept the symmetrical remote possibilities on the other side non-salient. Discounting, when rational, has just as much to do with the tiny probabilities that are salient parts of the option being considered in the choice as it does with previously non-salient tiny probabilities that are parts of other options in the choice.

Mugger:
OK …

16 See Schwitzgebel (2017, p. 283) on symmetry: "In general, I'm not inclined to think that my prospects will be particularly better or worse due to their influence on extremely unlikely deities, considered as a group, if I [perform some action] than if I do not."

17 See Barrington (ms., §2), Hiller (2011, p. 355).
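The Stranger's cancellation move can be put in miniature. In this toy sketch (all the probabilities and rewards are invented for illustration), each salient remote outcome of handing over the wallet is paired one-to-one with an equally likely, equally large non-salient outcome of keeping it, so the remote terms drop out of the comparison:

```python
# Toy illustration (our invention) of the symmetry argument: the same matched
# series of (probability, reward) pairs appears on BOTH sides of the choice.
exotic = [(10.0 ** -(9 + k), 10.0 ** (6 + k)) for k in range(50)]
exotic_ev = sum(p * r for p, r in exotic)

ev_hand_over = -10 + exotic_ev   # lose the 10 livres, gain the salient remote prospects
ev_keep      =   0 + exotic_ev   # keep the livres, gain the symmetric non-salient ones

# The remote prospects cancel: the choice turns entirely on the 10 livres.
assert ev_keep > ev_hand_over
```

If the two series failed to match one-to-one, the cancellation would fail, which is precisely the Stranger's proviso that the two sets carry the same total expected value.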
18 Balfour (2021) argues that Pascal's Mugging-type considerations show that, according to expected utility theory, one ought to dedicate all one's decisions, including very minute ones, to long-term human wellbeing. (This is Yudkowsky's ultimate concern as well.) The considerations in this article rebut Balfour's argument only insofar as the tiny risks to future people are indeed symmetrical to hitherto non-salient scenarios. For reasons that are beyond the scope of this article, we, in fact, think that at least some of the risks discussed by Balfour are symmetrical in that way, and thus we do not accept Balfour's conclusion. Furthermore, Balfour's (2021, pp. 123-124) Mugger suggests that if we must change our lives so significantly due to these longtermist concerns, it would instead be a reason to abandon expected utility theory. Again, we disagree, but will reserve discussion for another occasion.

time 10+n quadrillionths of a second) is lower if you move away than if you stay close to him. That makes it different from the consideration of possibilities that grandly violate what we take to be the causal order of things. There is no symmetrical pair of knock-out possibilities depending on which choice you make (of moving away or standing still).

Pascal:
OK.

Mugger:
So, is this the only case where you are going to discount the tiny probability? Do you really think that little of my honesty?

Pascal:
Well, there are pragmatic constraints built into how we apply decision theory in the real world that need to be looked at on a case-by-case basis. In the case of you and your friends, if you (or anyone else) is, according to you, always likely to give people days of bliss in exchange for their wallets, it gets me thinking, with our Stranger, that there is an equal, potentially ever-present possibility that everyone can hold onto their wallets while still getting the rewards. So, I admit that I think little of your honesty. But the point here is not that your giving me quadrillions of days of bliss has a tiny chance of some hard-to-determine amount; the point is that, now that you mention that possibility, it seems just as likely to me that I will get that bliss whether or not I give you my wallet. [Breaks the fourth wall and addresses reader.] No one needs to give away their wallet to the next person they see, because even if they were going to get some days of bliss from doing so, they most likely wouldn't be hearing about it from an imagined Mugger in the pages of a philosophy journal. They're just as likely (or unlikely) to get bliss from going about their ordinary business. And so, there is nothing wrong with how classical decision theory handles any of these cases. No. But it was a … an informative time with you both.

Stranger:
The beauty and danger of your grift consists in the fact that anyone at any time can promise, in the shadiest of ways, an unlimited possible reward in exchange for a finite up-front expense, and the product of this utility will look amazing. Pascal was a perfect mark for you, given his attraction to this kind of argument. But your argument applies even more broadly than just to people who are enamoured by thoughts of an ever-powerful God. It could appeal to anyone with any imagination plus a knowledge of the rudiments
of expected value theory. The central trick/flaw in the strategy is that it focuses on the option that is salient, and not on all of the other options that are not salient but ought to be.

Pascal:
I think I might head back to St. Petersburg; I have a few things to say to those folks about their game. The more I think about some of the tiny probabilities and huge rewards that they mentioned, the more I wonder whether, at a certain point, I am just as likely to get the reward regardless of how the coins flip. If I were to flip a thousand or even just fifty heads in a row, I would be more worried that I was crazy or that I was in a simulation with at least one glitch (and likely more to follow!) than excited about my winnings. Well, that's something I'll have to talk over with them.
Part IX. In Which Pascal and the Stranger Take Their Leave

Mugger:
So, you're really not going to give me your wallet after all?

Pascal:
Well, first try doing a utility calculation with your reciprocal function.
Pascal:
[Thinks for a moment.] Now that I'm thinking about it, it's hard to figure out the utility function for my agreeing to your offer. The odds I just gave you are the odds that, for any given promised number of days x, if you promised me x days of bliss, then you would fulfil that promise. But I can't calculate the utility function for me giving you my wallet unless I know the odds, for each number of days, that you might promise me that many days of bliss.

Mugger:
You'll have to guess, then.

Pascal:
Any help?

Mugger:
Well, it'll have to be a function that sums to one as it heads to infinity, because I am going to promise you something.

Pascal:
OK. How about the odds of you promising me x days of bliss are 1/2^x?

Mugger:
Isn't that rather pessimistic about what I'm going to promise you? I'm going to promise you a lot of days of bliss! But you know what? Fair enough. Try putting that into your utility function.

Pascal:
…

Mugger:
No, no, you will like this. It is good news for you! For problem cases like you (and actually, we've never had one quite like you before!) we are prepared to offer you a special deal. My Boss has powers much greater than mine. If I were going to offer you x days of bliss, she has assured me [Mugger and Boss nod to each other] that there will instead be 2^x others just like me who are now silently promising to give you that same number of days of bliss. [Several others now walk into the alleyway.] Since presumably the probability of any of them coming through on the promise isn't any less than the probability of me coming through, you'll need to change your utility function to the following: …

Pascal:
Hmm …

Mugger:
[Smiles.] In fact, [breaks the fourth wall and addresses reader] anyone can simply imagine that I am secretly the next person that they will talk to, and they should give away their wallet to that person in exchange for a possible, unspoken offer from my Boss and me that they can imagine might come true.
* * *

Part V. In Which Pascal Revises His Probability Function

Pascal:
Well, you might think you have me, but there is actually an easy fix. Sure, your offer is tough because you folks from the Seventh Dimension can offer me fantastical rewards at any point in my life, unlike St. Petersburg. But I can also re-imagine my own probability function, also unlike the coin flips in St. Petersburg. Your Boss adding all of those other people from your dimension threw me for a loop. At first, I suggested that I lower my probability by a fraction. But I can resolve this if I just change my whole probability function for you.

Mugger:
OK, what does that look like?

Pascal:
… approximately 1 × 10^−25. So no, you are definitely not getting my wallet.

* * *

Part VI. In Which the Mugger Points Out That This Is Ad Hoc, and Pascal Sums Up the Quadrilemma He Is In

Mugger:
So, you think you are so smart? You might think that more math will save you, and formally, I confess that I think you can do it. At least, rather than ever handing over your wallet, you perhaps can keep engaging me in conversation by constantly updating your probability functions as I keep telling you about more and more of my Boss's exponentially more impressive powers. But what would independently motivate using the new probability function rather than your original function? Why not just give up on your intuition that our offer is a bad one? I started making these, umm … deals with people in public back almost 15 years ago, and no one before you has ever objected to the idea that there is some non-zero lower bound on the probability of my fulfilling any promise that I might make. And now, you first wanted to give me a simple reciprocal function, and then you put the number of days as an exponent in the denominator?

Pascal:
OK, OK, let me think. I confess that I don't have a clear idea, independent of thinking about avoiding the utility function diverging, of what the probability function should look like.

Mugger:
Aha!
So, you admit that you are just trying to rationalize your resistance to giving me your wallet!

Pascal:
Well, it's just not clear to me that what is happening here with your Boss and your fellow dimension-mates is not a more dressed-up version of what we were talking about earlier in our conversation. I thought that there was an implicit assumption, when you specified that you would restrict your rewards to days of bliss, that there weren't these other muggers or other crazy business. That's why I assented to giving you the probability distribution that I did. If I had known about your Boss, I wouldn't have.

Mugger:
Now you're being unfair to me. First of all, nothing I have said about my Boss violates my initial deal with you. And you didn't really know what else was going to come along, did you? You gave me a probability distribution for some really great rewards. You seemed satisfied with that. But now that I told you about my Boss (whose possibility you might wish you had considered beforehand) you want to change your probability function.

Pascal:
This is the kind of thing that I wanted to avoid getting into in the first place! Maybe I should have refused to give you a probability function, just as I refused to answer your first question.

Mugger:
But that is also ad hoc. A probability distribution is just an assessment of the chances of a range of possible states of affairs. You should not be turned off from giving one just because you are worried about some promises some people in the street might make afterwards. What other states of affairs might you refuse to give probabilities for, and for what reasons? That is not a stable strategy.

Pascal:
Hmm. Let me think about this. [Thinks.] I've tried four different kinds of responses to you, and to each, I must admit, you've had a good reply. Since I am an ardent devotee of classical decision theory, it seems that you have put me in the following quadrilemma. Either (1) I give you a function of the odds of you coming through with any promise
that you might make that has a non-zero lower bound, in which case you and your associates will promise me such a big reward that, according to expected value theory, I should give you my wallet; or (2) there is some probability function that starts out lower than the possible rewards you might give me and decreases at the same rate as the rewards increase, in which case we have something similar to a St. Petersburg game (except … you a probability function, but this refusal also doesn't have any independent motivation, and will make me wonder whether I can give probability functions for anything, and that of course is going to have some bad consequences.

* * *

Pascal:
Maybe I would do that, or more likely I would just walk away!

Mugger:
Well, let me ask you this. Why are you trying so hard to resist us this way? If I may say so, it is rather ad hoc to now posit this new function when earlier you seemed content with a different function. All I wanted at the start was an answer to the question of the odds of my coming through with the best offer that I might give you. You didn't want to answer that, and instead you gave me a probability function. But now you are giving me a different probability function. Maybe you should have thought of some of these possibilities earlier. What would motivate choosing one function rather than another?

Pascal:
Well, I have the intuition that all your offers, even the one you made along with your Boss, aren't worth it, and that's how to make sense of that intuition.

Mugger:
Also, you might not want to hear about more ways that my Boss and my friends might be able to give you even more rewards. What are you going to say next? We might be here a while, and I'm getting impatient. … a less risky, and omnipresent one), and I should give you my wallet; or (3) I can propose a decreasing probability function that decreases exponentially faster than the possible rewards that you might give me increase, but then change my probability function depending on
what you or your Boss say in response, but that would be deeply ad hoc; or (4) I can refuse to give

[A Mysterious Stranger emerges from the shadows of the alley.]

Stranger:
Hold on just a moment, everyone. I've been observing what has transpired here. M. Pascal, look carefully. The "Boss" and the others there: how do we know for certain that they are working with the Mugger, and not at cross-purposes?

Pascal:
What do you mean?

Stranger:
[Addresses Mugger.] Let's say that Pascal does try to sincerely answer your original question and gives you a non-zero probability. Let's say that he tells you that the probability of your coming through with the best offer that you might give him is one in 10 quadrillion, and then you tell him that you are offering him 1,000 quadrillion days of bliss in exchange for his wallet. And imagine that he then declines your offer.

Mugger:
I resent that characterization! I was just trying to make a deal with M. Pascal that benefits both of us!

Stranger:
Yes, I hear you. And I understand you not wanting to give me a single answer to my question. If you don't want to give us a probability distribution, can you give us a lowest bound on the odds of some really wonderful other things that Pascal might hope for and might happen if he declines your offer?

Mugger:
… but can't one partition any large probability into a bunch of tiny ones?
17 For instance, you might reasonably worry that I am about to knock you out in a few seconds to steal your wallet, but maybe you shouldn't worry: it might happen 10 plus one quadrillionth of a second from now, 10 plus two quadrillionths of a second from now, etc., and each of those has a vanishingly small chance. Perhaps even smaller than what you think the chances are of me coming through on any promise I might make.

Yes, Pascal, our colleague here is right. But what you could have said is that the value of each of the possibilities of you knocking him out (at

Everything that happens has a minuscule subjective chance of happening, if it is described in fine enough detail. But not everything that happens is symmetrical to scenarios like what you are talking about. And about your Boss: as Bayes tells us, the probability of the hypothesis that your Boss is limited in power just by the laws of metaphysics, given the evidence that both (A) you are telling Pascal this, and (B) you and your Boss are here in this alleyway trying to get Pascal to give you his wallet, shouldn't just be considered on its own, but varies in proportion to the chances that (A) and (B) would happen given that your Boss really is limited in power just by the laws of metaphysics. And the chances of that are, again I'm afraid, no greater than the chances of Pascal's getting bliss from not giving up his wallet. Most likely, a Boss with such expansive powers wouldn't need to bother with a trifle like Pascal's wallet!
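The partitioning point in this footnote is plain arithmetic: any ordinary-sized probability can be sliced into as many vanishingly small pieces as one likes, and the pieces jointly recover the original. A minimal sketch, with made-up numbers (a 1-in-100 chance split over a million instants rather than quadrillionths of a second):

```python
import math

total_p = 0.01          # hypothetical chance of being knocked out around t = 10 s
n_slices = 10 ** 6      # partition the window into a million instants
per_slice = total_p / n_slices

# Each instant taken alone is vanishingly unlikely...
assert per_slice < 1e-7

# ...yet the pieces sum straight back to the ordinary-sized probability.
recovered = math.fsum(per_slice for _ in range(n_slices))
assert abs(recovered - total_p) < 1e-12
```

So the tininess of a probability cannot by itself license discounting; what matters, as the Stranger argues, is whether the tiny possibilities have symmetric counterparts on the other side of the choice.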
19 So, in addition to the prior probability of the Mugger having transdimensional superpowers being very small, the probability of the Mugger trying to mug Pascal conditional on the Mugger possessing transdimensional superpowers is intuitively much lower than the probability of the Mugger trying to mug Pascal conditional on his lacking superpowers. In other words, the very fact that the Mugger is saying both that he has superpowers and wants Pascal's wallet is evidence that the Mugger does not have superpowers. If so, then, contrary to what Bostrom (2009, pp. 444-445) and Yudkowsky (2007) suggest, the Mugger's testimony