|
|
|
|
petsfed
Jul 7, 2010, 4:16 AM
Post #1 of 32
(2051 views)
Shortcut
Registered: Sep 25, 2002
Posts: 8599
|
I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)?
(This post was edited by petsfed on Jul 7, 2010, 4:25 AM)
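The left-associative tower described above is easy to sketch in a few lines of Python (the helper name S is mine, not from the thread; note that ((n^n)^n)^... collapses to n^(n^(n-1)), which is why this version stays tractable much longer than a right-associated tower would):

```python
# Left-associative power tower: S_n = ((n^n)^n)^...^n, with n copies of n.
# Collapsing the tower gives S_n = n^(n^(n-1)).
def S(n):
    result = n
    for _ in range(n - 1):
        result = result ** n
    return result

print(S(2))  # 4
print(S(3))  # 19683
print(S(4))  # 340282366920938463463374607431768211456, i.e. ~3.40282366e38
```

Since Python integers are arbitrary precision, even S(5) = 5^(5^4) = 5^625, a 437-digit number, is instant; the tower only becomes unmanageable once it is re-associated to the right.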
|
|
|
|
|
spikeddem
Jul 7, 2010, 11:54 AM
Post #2 of 32
(2022 views)
Shortcut
Registered: Aug 27, 2007
Posts: 6319
|
petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? I hope you find what you're looking for so I can finally start grading my routes.
|
|
|
|
|
johnwesely
Jul 7, 2010, 12:20 PM
Post #3 of 32
(2018 views)
Shortcut
Registered: Jun 13, 2006
Posts: 5360
|
I don't think you would have to go that high for scientific notation to become a cumbersome way to express those numbers.
|
|
|
|
|
bill413
Jul 7, 2010, 12:27 PM
Post #4 of 32
(2016 views)
Shortcut
Registered: Oct 19, 2004
Posts: 5674
|
spikeddem wrote: petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? I hope you find what you're looking for so I can finally start grading my routes. He's not talking about fractions, but about numbers much greater than 1.
|
|
|
|
|
airscape
Jul 7, 2010, 12:39 PM
Post #5 of 32
(2012 views)
Shortcut
Registered: Feb 26, 2001
Posts: 4240
|
I'm trying to get a number recognised by science, but it seems there is some antagonism against me and I don't know why. Airscape's constant (A): If a number > 0 is divided by Airscape's constant then it is halved. A = 2
(This post was edited by airscape on Jul 7, 2010, 12:50 PM)
|
|
|
|
|
bill413
Jul 7, 2010, 1:12 PM
Post #6 of 32
(2006 views)
Shortcut
Registered: Oct 19, 2004
Posts: 5674
|
airscape wrote: I'm trying to get a number recognised by science, but it seems there is some antagonism against me and I don't know why. Airscape's constant (A): if a number > 0 is divided by Airscape's constant, then it is halved. A = 2 Sounds like it might be useful in some specialized applications. Maybe bring it in through an applied field (e.g. Avogadro's number, the Planck constant) rather than a strictly mathematical one (π, e)?
|
|
|
|
|
jt512
Jul 7, 2010, 6:09 PM
Post #7 of 32
(1978 views)
Shortcut
Registered: Apr 12, 2001
Posts: 21904
|
petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? Just change the order of association in your series. That is,

S'_2 = 2^2 = 4
S'_3 = 3^(3^3) ≈ 7.6e12
S'_4 = 4^(4^(4^4)) = ?

Jay
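The right-associated version can be sketched the same way (S_prime is my name for jt512's S'); S'_4 itself is out of reach, but logarithms give its size directly:

```python
import math

# Right-associative tower: S'_n = n^(n^(n^...)), with n copies of n.
def S_prime(n):
    result = n
    for _ in range(n - 1):
        result = n ** result
    return result

print(S_prime(3))  # 3^27 = 7625597484987, i.e. ~7.6e12

# S'_4 = 4^(4^256) is far too large to evaluate directly, but
# log10(S'_4) = 4^256 * log10(4) fits comfortably in a float:
digits = 4 ** 256 * math.log10(4)
print(f"S'_4 has roughly 10^{math.log10(digits):.1f} digits")
```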
|
|
|
|
|
petsfed
Jul 7, 2010, 6:49 PM
Post #8 of 32
(1962 views)
Shortcut
Registered: Sep 25, 2002
Posts: 8599
|
jt512 wrote: petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? Just change the order of association in your series. That is,

S'_2 = 2^2 = 4
S'_3 = 3^(3^3) ≈ 7.6e12
S'_4 = 4^(4^(4^4)) = ?

Jay

I didn't even think to look at that. You can tell how often nested exponentials come up in my day-to-day life. One thing I learned from all of this is that it is very easy to produce a number that would take more than the entire observable universe to write out in full, even with one base-10 digit placed in each Planck volume.
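The Planck-volume remark above is easy to make quantitative. A rough back-of-envelope sketch, where the Planck length (~1.6e-35 m) and observable-universe radius (~4.4e26 m) are approximate figures I'm supplying, not values from the thread:

```python
import math

# How many base-10 digits fit in the observable universe at
# one digit per Planck volume? (Both physical inputs are rough.)
planck_length = 1.616e-35        # metres, approximate
universe_radius = 4.4e26         # metres, observable universe, approximate

planck_volume = planck_length ** 3
universe_volume = (4 / 3) * math.pi * universe_radius ** 3
slots = universe_volume / planck_volume
print(f"~10^{math.log10(slots):.0f} digit slots")
```

By this estimate there are only about 10^185 slots, so any integer with more than ~10^185 digits (S'_5 easily qualifies) cannot be written out this way, while S'_4, at ~10^154 digits, technically still could be.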
|
|
|
|
|
jt512
Jul 7, 2010, 7:12 PM
Post #9 of 32
(1951 views)
Shortcut
Registered: Apr 12, 2001
Posts: 21904
|
petsfed wrote: jt512 wrote: petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? Just change the order of association in your series. That is,

S'_2 = 2^2 = 4
S'_3 = 3^(3^3) ≈ 7.6e12
S'_4 = 4^(4^(4^4)) = ?

Jay

I didn't even think to look at that. You can tell how often nested exponentials come up in my day-to-day life. One thing I learned from all of this is that it is very easy to produce a number that would take more than the entire observable universe to write out in full, even with one base-10 digit placed in each Planck volume. I have no intuitive feel for how big S'_4 is. After an hour, my laptop hasn't finished computing it. I emailed my girlfriend to ask if she can compute it at work. I'm curious to know how big it is. Jay
|
|
|
|
|
truello
Jul 7, 2010, 7:44 PM
Post #10 of 32
(1943 views)
Shortcut
Registered: Oct 1, 2006
Posts: 737
|
According to Java it is infinity :) However, the Java datatype I used (double) only fits values up to 1.7976931348623157e+308, so once the result blows past that, it comes out as Infinity.
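Python floats are the same IEEE 754 doubles as Java's double, so the overflow is easy to reproduce, and arbitrary-precision integers sidestep it entirely (a small sketch):

```python
import sys

# IEEE 754 double overflow, same behaviour as Java's double:
print(sys.float_info.max)      # 1.7976931348623157e+308
print(sys.float_info.max * 2)  # inf

# Arbitrary-precision integers have no such ceiling:
x = 4 ** (4 ** 4)              # exact value of 4^256
print(len(str(x)))             # 155 digits
```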
|
|
|
|
|
petsfed
Jul 7, 2010, 8:26 PM
Post #11 of 32
(1935 views)
Shortcut
Registered: Sep 25, 2002
Posts: 8599
|
Well, a quick check led to this:

S'_4 = 4^(4^(4^4)) = 4^(4^256) = 4^(1.3407807929942597×10^154)

Which is pretty fuckoff big. If you had an i7 (about 100 gigaflops/second) processor running only this calculation, it would still take about 10^143 seconds, or roughly 4×10^135 years, to multiply it out, assuming the computer does one "times four" operation per flop. It might attack it logarithmically, by repeated squaring, which cuts the multiplication count to a few hundred, but each of those multiplications then involves numbers with ~10^154 digits, so you're not actually saved. So you may as well just stop now, Jay.
(This post was edited by petsfed on Jul 7, 2010, 8:27 PM)
|
|
|
|
|
jt512
Jul 7, 2010, 8:32 PM
Post #12 of 32
(1928 views)
Shortcut
Registered: Apr 12, 2001
Posts: 21904
|
petsfed wrote: Well, a quick check led to this:

S'_4 = 4^(4^(4^4)) = 4^(4^256) = 4^(1.3407807929942597×10^154)

Which is pretty fuckoff big. If you had an i7 (about 100 gigaflops/second) processor running only this calculation, it would still take about 10^143 seconds, or roughly 4×10^135 years, to multiply it out, assuming the computer does one "times four" operation per flop. It might attack it logarithmically, by repeated squaring, which cuts the multiplication count to a few hundred, but each of those multiplications then involves numbers with ~10^154 digits, so you're not actually saved. So you may as well just stop now, Jay.

I had already given up on it. Anna says she'll think about the problem. Jay
|
|
|
|
|
spikeddem
Jul 7, 2010, 10:26 PM
Post #13 of 32
(1915 views)
Shortcut
Registered: Aug 27, 2007
Posts: 6319
|
bill413 wrote: spikeddem wrote: petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? I hope you find what you're looking for so I can finally start grading my routes. He's not talking about fractions, but about numbers much greater than 1. Damn you!!!
|
|
|
|
|
hafilax
Jul 7, 2010, 11:17 PM
Post #14 of 32
(1907 views)
Shortcut
Registered: Dec 12, 2007
Posts: 3025
|
WolframAlpha spits out:

power of ten representation: 10^(10^153.9069975479678)

Number length: 8072304726028225379382630397085399030071367921738743031867082828418414481568309149198911814701229483451981557574771156496457238535299087481244990261351117 decimal digits, i.e. ~10^154 digits
(This post was edited by hafilax on Jul 8, 2010, 5:01 PM)
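That digit count checks out by hand: the number of digits of S'_4 = 4^(4^256) is about 4^256 · log10(4), and taking log10 of that should land on the 10^153.9069975479678 figure reported above:

```python
import math

exponent = 4 ** 256                  # exact: a 155-digit integer
log10_S4 = exponent * math.log10(4)  # number of digits of S'_4, ~8.07e153
print(f"log10(log10(S'_4)) = {math.log10(log10_S4):.13f}")
```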
|
|
|
|
|
blondgecko
Moderator
Jul 8, 2010, 12:07 AM
Post #15 of 32
(1903 views)
Shortcut
Registered: Jul 2, 2004
Posts: 7666
|
petsfed wrote: about 100 gigaflops/second Memo from the pedantry department to the department of redundancy department: FLOPS = Floating Point Operations Per Second.
|
|
|
|
|
bill413
Jul 8, 2010, 2:49 AM
Post #16 of 32
(1895 views)
Shortcut
Registered: Oct 19, 2004
Posts: 5674
|
spikeddem wrote: bill413 wrote: spikeddem wrote: petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? I hope you find what you're looking for so I can finally start grading my routes. He's not talking about fractions, but about numbers much greater than 1. Damn you!!!
|
|
|
|
|
skiclimb
Jul 8, 2010, 6:02 AM
Post #17 of 32
(1876 views)
Shortcut
Registered: Jan 11, 2004
Posts: 1938
|
petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? When in doubt and wanting the BIG, just go factorial: n!
|
|
|
|
|
petsfed
Jul 8, 2010, 6:18 AM
Post #18 of 32
(1871 views)
Shortcut
Registered: Sep 25, 2002
Posts: 8599
|
By comparison, the factorial grows at a snail's pace. And I am aware of the definition of FLOPS, Tristan. There's just a difference between FLOPs (FLoating-point OPerations) and FLOPS (FLoating-point Operations Per Second), which creates some ambiguity.
(This post was edited by petsfed on Jul 8, 2010, 6:23 AM)
|
|
|
|
|
airscape
Jul 8, 2010, 6:23 AM
Post #19 of 32
(1868 views)
Shortcut
Registered: Feb 26, 2001
Posts: 4240
|
skiclimb wrote: petsfed wrote: I think, partly out of my amateur fascination with statistics, I geek out a lot over big numbers. One I learned about some time ago is Graham's number, which I can say, with neither reluctance nor embarrassment, I don't fully understand. But it did give me an interesting idea for a somewhat easier-to-calculate function that grows at a startling rate. Suppose you have a number S_n. The subscript n is both an index and an initial value: S_n is defined as n to the nth power, to the nth power, to the nth power, and so on, n times. For instance, S_2 = 2^2 = 4, but S_3 = (3^3)^3 = 19683, S_4 = ((4^4)^4)^4 ≈ 3.40282366e38, and so on. It becomes quite apparent that S_n grows much faster than an exponential as n increases. I read, just now, that there are ways to manufacture even faster-growing functions using a similar idea in more ... dimensions than this. Any other big numbers that are easily expressed (other than the googol and googolplex)? When in doubt and wanting the BIG, just go factorial: n! If you go fictional n!, you start getting imaginary numbers.
|
|
|
|
|
skiclimb
Jul 8, 2010, 6:31 AM
Post #20 of 32
(1862 views)
Shortcut
Registered: Jan 11, 2004
Posts: 1938
|
petsfed wrote: By comparison, the factorial grows at a snail's pace. And I am aware of the definition of FLOPS, Tristan. There's just a difference between FLOPs (FLoating-point OPerations) and FLOPS (FLoating-point Operations Per Second), which creates some ambiguity. Perhaps... hell, it's been years since I really had to use statistics...
(This post was edited by skiclimb on Jul 8, 2010, 1:12 PM)
|
|
|
|
|
Toast_in_the_Machine
Jul 8, 2010, 12:12 PM
Post #21 of 32
(1835 views)
Shortcut
Registered: Sep 12, 2008
Posts: 5208
|
blondgecko wrote: petsfed wrote: about 100 gigaflops/second Memo from the pedantry department to the department of redundancy department: FLOPS = Floating Point Operations Per Second. If you are going to get pedantic, get your bolds right. Where does the L come from?
|
|
|
|
|
airscape
Jul 8, 2010, 2:19 PM
Post #22 of 32
(1828 views)
Shortcut
Registered: Feb 26, 2001
Posts: 4240
|
FLoating point Operations Per Second
|
|
|
|
|
airscape
Jul 8, 2010, 2:22 PM
Post #23 of 32
(1826 views)
Shortcut
Registered: Feb 26, 2001
Posts: 4240
|
Very interesting stat (Wiki): hardware cost per unit of computing power. In 1961 it was US$1,100,000,000,000 ($1.1 trillion) per GFLOPS, i.e. US$1,100 per FLOPS; by 2009 it was down to roughly $0.13/GFLOPS.
|
|
|
|
|
truello
Jul 8, 2010, 5:07 PM
Post #24 of 32
(1803 views)
Shortcut
Registered: Oct 1, 2006
Posts: 737
|
skiclimb wrote: petsfed wrote: By comparison, the factorial grows at a snail's pace. And I am aware of the definition of FLOPS, Tristan. There's just a difference between FLOPs (FLoating-point OPerations) and FLOPS (FLoating-point Operations Per Second), which creates some ambiguity. Perhaps... hell, it's been years since I really had to use statistics... Look at it this way:

S_3 = 19683
3! = 6
|
|
|
|
|
hafilax
Jul 8, 2010, 5:15 PM
Post #25 of 32
(1800 views)
Shortcut
Registered: Dec 12, 2007
Posts: 3025
|
petsfed wrote: By comparison, the factorial grows at a snail's pace. And I am aware of the definition of FLOPS, Tristan. There's just a difference between FLOPs (FLoating-point OPerations) and FLOPS (FLoating-point Operations Per Second), which creates some ambiguity. You could do a similar thing with the factorial function:

F_1 = 1! = 1
F_2 = (2!)! = 2
F_3 = ((3!)!)! = 720! ≈ 2.601218944e+1746

You could continue with a recursive definition, FF_n = F_n!, so that

FF_1 = F_1! = 1
FF_2 = F_2! = 2
FF_3 = F_3! ≈ (2.601218944e+1746)!

Keep doing that to get arbitrarily fast-growing functions. Speaking of factorials, I learned a new notation from my students this year while working with Bessel functions: the double factorial, n!! = n(n-2)(n-4)···. Didn't know that one before and almost marked them wrong, thinking it meant (n!)!. I knew the student was smarter than me, so I looked it up. He did better than the prof, who kept giving me wrong answers in the solution guides.
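The iterated factorial above is also a few lines of Python (math.factorial does the work, and even F_3 = 720! is computed exactly):

```python
import math

# F_n applies the factorial n times, starting from n:
# F_1 = 1!, F_2 = (2!)!, F_3 = ((3!)!)!, ...
def F(n):
    result = n
    for _ in range(n):
        result = math.factorial(result)
    return result

print(F(1))          # 1
print(F(2))          # 2
f3 = F(3)            # ((3!)!)! = 720!
print(len(str(f3)))  # 1747 digits, matching ~2.601218944e+1746
```

F_4 = (((4!)!)!)! is already hopeless: 24! ≈ 6.2e23, and taking the factorial of that dwarfs everything else in this thread.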
|
|
|
|
|
|
|
|
|