
16th August 2018, 03:00 AM  #281 
Illuminator
Join Date: Jan 2006
Location: 22, Acacia Avenue
Posts: 3,083


__________________
Just drive. 

16th August 2018, 04:31 AM  #282 
Critical Thinker
Join Date: Apr 2012
Posts: 327

post deleted

16th August 2018, 09:01 AM  #283 
Hyperthetical
Join Date: Nov 2006
Location: A pocket paradise between the sewage treatment plant and the railroad
Posts: 14,165

Consider the algorithm I mentioned before. Select 1 as the candidate; then with probability 1/2 change the candidate to 2, then with probability 1/3 change the candidate to 3, and so forth. At each step, just before candidate n is considered, the current candidate is one of the integers from 1 to n-1, each with probability 1/(n-1). After the 1/n chance of switching to candidate n, each of the integers from 1 to n is the current candidate with probability 1/n.
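A minimal sketch of that switching procedure in Python, truncated at a finite cutoff since the full supertask never terminates (the function name and cutoff are mine, not from the thread):

```python
import random

def uniform_by_switching(n_max, rng):
    """Run the switching procedure up to candidate n_max.

    Start with candidate 1; at each step n, switch the candidate to n
    with probability 1/n.  After candidate n has been considered, the
    current candidate is uniformly distributed on 1..n.
    """
    candidate = 1
    for n in range(2, n_max + 1):
        if rng.random() < 1.0 / n:
            candidate = n
    return candidate

# Empirical check: each of 1..5 should appear about 1/5 of the time.
trials = 100_000
rng = random.Random(42)
counts = {k: 0 for k in range(1, 6)}
for _ in range(trials):
    counts[uniform_by_switching(5, rng)] += 1
```

The finite version is just reservoir sampling with a reservoir of size one; the infinite version is the same loop with no upper bound on n.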
There's a "more efficient" version that starts with candidate 1, then for all n = {0, 1, 2, ...} gives a 50% chance of adding 2^n to the candidate. This has some advantages: it doubles the number of possible candidates considered at each step, and thus takes only logarithmic time (which of course is still infinite), and it needs only a 50/50 random generator instead of one that can correctly roll arbitrarily small probabilities such as 1/n. It amounts to randomly generating a binary integer one bit at a time from right to left.
You could also do the same with, say, right-to-left base-10 digits. Start with 1, then for all n = {0, 1, 2, ...} add a uniformly distributed random selection from {0, 10^n, 2*10^n, ..., 9*10^n}. That needs only a slightly more capable RNG (a d10 instead of a coin flip), and is another 3.3 times faster.
With all these procedures, you can show that any given positive integer has, at some specific iteration, a fair chance to become the candidate, and then to remain the candidate in all subsequent iterations. However, as with many other supertasks, there's no way to characterize what the end result would be. Consider this: replace the random chance in either of the above algorithms with a certainty (always selecting 9*10^n in the latter case), and it becomes a supposed method for finding the highest integer, which is pretty much meaningless.
On the other hand, the second procedure appears to tell you right away whether the eventual result will be odd or even. After the first (n=0) step, you know what the least significant bit must be, regardless of what happens subsequently. Likewise, the third procedure appears to tell you right away whether or not the eventual random integer will be a multiple of 5, as well as odd or even, since only multiples of 10 can be added to it after the n=0 step. And it tells you whether or not the eventual random number will be a multiple of a billion after the n=8 step. 
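Truncated sketches of the binary and base-10 versions (again cut off after k steps, since the infinite process never halts; function names are mine):

```python
import random

def random_bits_candidate(k, rng):
    """Binary version, truncated after k coin flips.

    Start with 1; for n = 0..k-1, add 2**n with probability 1/2.
    This builds a random k-bit integer from right to left, so the
    result is uniform on 1 .. 2**k.  Note the n=0 flip fixes the
    least significant bit: every later addition is even.
    """
    candidate = 1
    for n in range(k):
        if rng.random() < 0.5:
            candidate += 2 ** n
    return candidate

def random_digits_candidate(k, rng):
    """Base-10 version, truncated after k d10 rolls.

    For n = 0..k-1, add d * 10**n with d uniform on 0..9.
    The result is uniform on 1 .. 10**k, and the n=0 roll fixes
    the final digit (hence parity and divisibility by 5).
    """
    candidate = 1
    for n in range(k):
        candidate += rng.randrange(10) * 10 ** n
    return candidate
```

After k steps the binary version has considered 2^k candidates versus the first algorithm's k, which is the "logarithmic time" advantage mentioned above.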
So provided the infinite procedure is trusted to yield an integer at all, rather than, say, a giraffe or a black hole, can't you go ahead and make use of the determination of oddness or evenness or multiple-of-a-billion-ness etc. while you're waiting forever for the final result? And can't you examine the probabilities of those finite initial steps to determine the probability of those early-emerging outcomes? It's tempting to think, as Bogomolny does, that you can. That is, IF your necessarily imperfect definition of "a randomly selected integer" includes the "as fair as possible" requirement outlined above: that for a given positive integer, at some specific iteration of the selection process, that integer has a fair chance (equal to that of all previous or co-candidates as of that iteration) to be the candidate, and then to remain the candidate in all subsequent iterations.
Of course, there are other orderings and other "random" selection algorithms that could be used that would alter the expectations. For instance, consider one corresponding to the "corridor A/corridor B" arrangement frequently mentioned upthread, where all the multiple-of-a-billion-numbered hotel rooms and only those (those also being the supposedly exceptional "opulent" or "dingy" rooms) are located on one corridor, and the selection process begins with a coin flip to choose which corridor. (That corresponds to a modification of the base-10-digit selection algorithm, where at the outset, with a 50% chance, you set the candidate at one billion and then start iterating with n=9.) That appears to yield an equal chance of selecting a multiple-of-a-billion integer as otherwise, but it clearly fails the as-fair-as-possible condition.
Note that the first algorithm I outlined, which iterates through the set of integers by 1's, meets the as-fair-as-possible condition, but makes no early (in any finite number of steps) determination of whether the eventual selection will be odd or even, etc. 
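The "early determination" claim can be checked directly: in the base-10 digit process, the residue mod 10 freezes after the n=0 step, and the residue mod a billion freezes after the n=8 step, because every later addition is a multiple of that modulus. A small sketch (the function is my own illustration, not from the thread):

```python
import random

def digit_process_residues(k_steps, modulus, rng):
    """Track candidate % modulus through the base-10 digit process.

    Returns the residue after each of the first k_steps steps.  Once
    every later addition d * 10**n is a multiple of `modulus`, the
    residue can no longer change, no matter how the supertask ends.
    """
    candidate = 1
    residues = []
    for n in range(k_steps):
        candidate += rng.randrange(10) * 10 ** n
        residues.append(candidate % modulus)
    return residues
```

Running this shows the mod-10 residue is constant from the very first step, and the mod-10^9 residue is constant from the ninth step (index 8) onward.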
So to sum up, is it possible to salvage the intuitive notion of a 1 in a billion chance of selecting a multiple of a billion from the infinite set of positive integers (i.e. being assigned an exceptional hotel room), even if we don't know exactly what the selection process is, provided that (a) the selection process yields an unalterable decision on whether or not the eventual outcome is a multiple of a billion within a finite number of steps, and (b) the selection process meets the as-fair-as-possible condition? 
__________________
A zømbie once bit my sister... 

16th August 2018, 06:08 PM  #284 
Philosopher
Join Date: Feb 2015
Posts: 5,918

Note that the results can be changed at will by reordering the candidates. ETA2: Interpreting statements such as "select 1 as the candidate" as "select the first candidate".
Quote:
Originally Posted by Myriad
My guess on (b) is: any such procedure can be made to give any result we want by reordering the set, and the procedure will still meet this condition.
ETA: My guess on the larger question, "is it possible to salvage the intuitive notion of a 1 in a billion chance": it's only intuitive because you're used to thinking in terms of natural numbers, which have an intuitive natural ordering (note how the concept of natural density can be extended to a general density with any well-order, natural density being the special case where the order is the natural order). The reason you're having this intuition in the first place is that you're letting structure "leak" from (N, ≤) to X, where we only have X = N and have nothing about any order on X. The intuition is wrong and results from a failure to clearly delineate between things like X and (X, ≤); it shouldn't be salvaged, it should be removed. 
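The reordering point is easy to demonstrate: the evens have natural density 1/2 under the usual order, but if you enumerate the same set of positive integers as odd, odd, even, odd, odd, even, ..., the evens now have density 1/3. A sketch (the helper names are mine):

```python
from itertools import count, islice

def reordered_naturals():
    """Enumerate the positive integers as odd, odd, even, odd, odd, even, ...

    Every positive integer still appears exactly once; only the order
    of enumeration changes.
    """
    odds = count(1, 2)    # 1, 3, 5, ...
    evens = count(2, 2)   # 2, 4, 6, ...
    while True:
        yield next(odds)
        yield next(odds)
        yield next(evens)

def partial_density(pred, ordering, n):
    """Fraction of the first n terms of `ordering` satisfying `pred`."""
    return sum(1 for x in islice(ordering, n) if pred(x)) / n

is_even = lambda x: x % 2 == 0
# Under the natural order, the first 3000 terms contain exactly 1500 evens
# (density 1/2); under the reordering, the first 3000 terms are 1000 groups
# of (odd, odd, even), so exactly 1000 evens (density 1/3).
```

The same set, with no element added or removed, gets a different density purely from the choice of enumeration order, which is the sense in which the "1 in a billion" intuition depends on structure that the bare set X doesn't carry.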
__________________
"Ideas are also weapons."  Subcomandante Marcos "We must devastate the avenues where the wealthy live."  Lucy Parsons "Let us therefore trust the eternal Spirit which destroys and annihilates only because it is the unfathomable and eternal source of all life. The passion for destruction is a creative passion, too!"  Mikhail Bakunin 
