My game server has a job leveling system. There is a base wage -- say, $10.00 -- and every time you level up it increases by 5%, then that new number increases by another 5%, and so on (example: level 0 = $10.00, level 1 = $10.50, level 2 = $11.025, level 3 = $11.57625). Of course, since it's in dollars and cents it would be rounded to 2 decimal places, and there are 90 levels. I want to find out how much the wage would be by level 90, so I can figure out what to change the percentage to so that it will be $20.00 at level 90.
How would I calculate this? I am using the Windows 7 calculator.
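In case it helps to see the arithmetic spelled out, here is a minimal Python sketch of the compounding described above (it ignores the per-level rounding to cents, and the printed figures are approximate):

```python
base = 10.00      # level 0 wage
levels = 90

# Wage at level 90 if each level is a straight 5% raise on the previous one:
wage_90 = base * 1.05 ** levels
print(round(wage_90, 2))             # ~807.30, far past the $20.00 target

# Per-level raise needed so the wage exactly doubles by level 90:
needed_rate = 2 ** (1 / levels) - 1
print(round(needed_rate * 100, 4))   # ~0.7731 (% per level)
```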
-
Surprisingly low. There's a rule of thumb called the "Rule of 72" when it comes to doubling: at 4% it takes about 18 years to double your money, at 6% about 12 years, and in each case the rate multiplied by the number of periods is roughly 72. Here that gives 72 ÷ 90 ≈ 0.8% per level.
Try in the neighborhood of 0.7-0.8%.
You could also use logarithms: find the log of 2, divide it by 90, and then take the inverse log; the result is the per-level multiplier. You have to be in Scientific view.
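A quick sketch of that logarithm recipe, written in Python rather than on the calculator and assuming the base-10 log button:

```python
import math

# The recipe above: take log of 2, divide by 90, then inverse log (10^x).
x = math.log10(2) / 90
multiplier = 10 ** x             # ~1.00773, the per-level wage multiplier
print((multiplier - 1) * 100)    # ~0.773, i.e. about a 0.773% raise per level

# Sanity check: 90 levels at that rate doubles the $10.00 base wage.
print(10 * multiplier ** 90)     # ~20.0
```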
-
Your best bet is to create a Microsoft Excel workbook. Put your initial value in cell A1, make cell A2 the formula =A1*(some value), and then drag it down for however many levels you want. Try different values.
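As a rough equivalent of that drag-down column, here is a small Python loop; the 0.773% rate and the rounding to whole cents at every level are my own assumptions, not part of the answer above:

```python
# Hypothetical per-level raise to test; round to cents at each level.
wage = 10.00
rate = 0.00773
for level in range(1, 91):
    wage = round(wage * (1 + rate), 2)
print(wage)    # lands close to $20.00 for rates near 0.773%
```

Changing the rate and re-running plays the same role as trying different values in the A2 formula.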