In: Psychology
Instrumental Learning: Design an experiment to test how modifying the differentiability of a behavior in fixed and variable interval schedules can change how resistant each schedule is to extinction.
In classical conditioning, the term reinforcement refers to the paired presentation of the unconditioned stimulus and the conditioned stimulus. In operant conditioning, reinforcement refers to the occurrence of an event, such as providing food or water, following the desired response. Reinforcement is therefore an event whose occurrence increases the probability that a stimulus will, on subsequent occasions, evoke a response.
Schedules of reinforcement
When an animal's surroundings are controlled, its behaviour patterns after reinforcement become predictable, even for very complex behaviours. A schedule of reinforcement is a rule or program that determines how and when the occurrence of a response will be followed by delivery of the reinforcer; in extinction, by contrast, no response is reinforced. Schedules of reinforcement influence both how an instrumental response is learned and how it is maintained by reinforcement.
The two broad categories of schedules of reinforcement are ratio schedules and interval schedules.
We are going to focus on interval schedules, and on how resistant each type (fixed interval and variable interval) is to extinction.
For a variable interval (VI) schedule of reinforcement, a certain period of time, varying randomly about some mean, must elapse after one reward before a single response produces the next. VI schedules are widely used because they produce a very consistent, steady rate of responding, which accelerates only slightly as time without a reward passes. The variable interval schedule presents the animal with a series of intervals in an unpredictable order. For example, on a VI 23-minute schedule, the individual intervals must average 23 minutes: the first reward might come after 15 minutes and the next after 31 minutes. The intervals need not occur in any particular order, so long as the average of 23 minutes is maintained.
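As a minimal sketch of the idea above (the function name, the 50–150% spread, and the seed are illustrative assumptions, not part of the original text), a VI 23-minute schedule can be generated by drawing unpredictable intervals and then rescaling them so their average equals the scheduled value:

```python
import random

def variable_intervals(mean_minutes, n, seed=None):
    """Generate n inter-reinforcement intervals (in minutes) whose
    average is exactly mean_minutes, in an unpredictable order."""
    rng = random.Random(seed)
    # Draw each interval somewhere between 50% and 150% of the mean...
    raw = [rng.uniform(0.5 * mean_minutes, 1.5 * mean_minutes) for _ in range(n)]
    # ...then rescale so the average comes out to the scheduled mean.
    scale = mean_minutes * n / sum(raw)
    return [r * scale for r in raw]

schedule = variable_intervals(23, 5, seed=1)
print(schedule)                       # five unpredictable positive intervals
print(sum(schedule) / len(schedule))  # their average is 23 minutes
```

Because the subject cannot tell a short interval from a long one in advance, steady responding is the best strategy, which is what produces the characteristic constant response rate.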
As an example of a VI schedule, imagine you want to buy a phone on Amazon, and the website announces an upcoming big sale without giving the exact date. The discount also varies with the remaining stock: with plenty of phones left it can reach 80%, with few left only 5%. In this situation a person will keep checking the website repeatedly, hoping to catch the phone while it is still in stock and the discount is large. In doing so, the person responds many times, and that is essentially what this type of schedule produces: a very high rate of response coupled with high resistance to extinction.
For a fixed interval (FI) schedule of reinforcement, the minimum period between successive rewards is always the same. Taking the previous example of 23 minutes, the interval will always be 23 minutes, and the reward becomes available only once 23 minutes have elapsed. The typical performance of a rat or pigeon, depending on the length of the interval (say 23 minutes), is a pause after each reinforcement followed by a gradual increase in the rate of responding, so that the highest response rates occur just before the next reinforcement becomes due.
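To make the FI rule concrete, here is a minimal sketch (the response times and function name are hypothetical, not from the text) of how an FI 23-minute schedule decides which responses earn a reinforcer: only the first response at or after each elapse of the fixed interval is reinforced.

```python
def reinforced_responses(response_times, interval):
    """Return the response times (in minutes) that earn a reinforcer
    under a fixed interval schedule: the first response at or after
    each elapse of the interval since the previous reinforcer."""
    reinforced = []
    next_available = interval
    for t in sorted(response_times):
        if t >= next_available:
            reinforced.append(t)
            # The clock restarts from the moment of reinforcement.
            next_available = t + interval
    return reinforced

# Hypothetical responses clustering just before the 23-minute mark
# (the classic "FI scallop"); early responses earn nothing.
print(reinforced_responses([5, 20, 22, 24, 30, 45, 47], interval=23))
# → [24, 47]
```

Responding before the interval elapses is never reinforced, which is why animals learn to pause after each reward and accelerate as the next one becomes due.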
An everyday example of a fixed interval schedule is a dental checkup. If your appointment is every 6 months, then just before the appointment you tend to start taking extra care of your teeth, since you would not want the dentist to find anything wrong, whereas you may not be as diligent on a day-to-day basis during the rest of the six months between checkups.
Longer intervals lead to better resistance to extinction. So a fixed interval of 20 minutes would be a better schedule to use than a fixed interval of 3 minutes if you want the learning to last while not relying too heavily on always providing reinforcers.
A comparison of fixed and variable intervals reveals that resistance to extinction is stronger under VI schedules, all other things (such as the average length of the interval) being equal.