In: Computer Science
Write a program that calculates the amount a person would earn over a period of time if his or her salary is one penny the first day, two pennies the second day, and continues to double each day. The program should display a table showing the salary for each day, and then show the total pay at the end of the period. The output should be displayed in a dollar amount, not the number of pennies. Input Validation: Do not accept a number less than 1 for the number of days worked.
Basic Java language please
Have a look at the code below. I have added comments wherever required for better understanding.
import java.util.*;

class Main {
    public static void main(String[] args) {
        // create a Scanner object to read user input
        Scanner sc = new Scanner(System.in);
        int days;
        System.out.println("Enter the total number of days:");
        // input validation: keep asking until the user enters a number >= 1
        while (true) {
            days = sc.nextInt();
            if (days < 1) {
                System.out.println("Please enter a number greater than or equal to 1");
            } else {
                break;
            }
        }
        // use long so the penny count does not overflow int for longer periods
        long pennies = 1;   // day 1 salary is one penny
        double total = 0;
        System.out.println("Day\tSalary");
        for (int i = 1; i <= days; i++) {
            // convert pennies to dollars
            double salary = pennies / 100.0;
            total += salary;
            System.out.printf("%d\t$%,.2f%n", i, salary);
            // double the salary for the next day
            pennies *= 2;
        }
        // show the total pay at the end of the period
        System.out.printf("Total pay: $%,.2f%n", total);
    }
}
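As a quick sanity check (this is an extra note, not part of the assignment), the daily salaries form a geometric series, so the total for n days is 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1 pennies. A minimal sketch of that closed form, using an example period of 30 days:

```java
public class TotalCheck {
    public static void main(String[] args) {
        int days = 30; // example period, chosen for illustration
        // sum of the geometric series 1 + 2 + ... + 2^(days-1), in pennies
        long totalPennies = (1L << days) - 1;
        // convert pennies to dollars for display
        System.out.printf("Total for %d days: $%,.2f%n", days, totalPennies / 100.0);
    }
}
```

Comparing this value against the running total printed by the main program is an easy way to verify the loop is correct.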
Happy Learning!