Each time Kevin re-reads his Python book (which happens every month), he learns 10% of whatever material he didn’t know before. He needs to score at least 95% on his final to get an A in the class. When Kevin started, he knew nothing about Python. Write a function getLearningTime(rate, target) that simulates Kevin’s learning progress and returns the number of months it will take him to get ready.
Python code:
# defining getLearningTime function
def getLearningTime(rate, target):
    # percentage of the material Kevin has not yet learned
    notLearned = 100
    # number of months elapsed
    months = 0
    # looping until he has learned at least the target percentage
    while (100 - notLearned) < target:
        # one more month of re-reading
        months += 1
        # each month he learns rate% of whatever he didn't know before
        notLearned = notLearned - notLearned * rate / 100
    # returning the number of months
    return months
# calling getLearningTime with rate 10 and target 95 and printing the result
print(getLearningTime(10, 95))
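As a sanity check, the month count can also be derived in closed form: each month the unlearned fraction shrinks by a factor of (1 - rate/100), so we need the smallest n with (1 - rate/100)^n <= 1 - target/100. The helper name `months_closed_form` below is illustrative, not part of the assignment.

```python
import math

def months_closed_form(rate, target):
    # the unlearned fraction after n months is (1 - rate/100)**n;
    # solve (1 - rate/100)**n <= 1 - target/100 with logarithms
    return math.ceil(math.log(1 - target / 100) / math.log(1 - rate / 100))

print(months_closed_form(10, 95))  # 29, matching the simulation
```

Both approaches agree because the simulation is just iterating the same geometric decay one month at a time.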
Output:
29