Second, to be effective the reinforcement must be desired by the person carrying out the behavior. Whereas a lollipop might be an effective reinforcer for Little Albert, it will not be an effective reinforcer for the pompous engineer. Although some consequences, such as praise and other positive attention, are desired by most people, it is a serious mistake to assume that any one reinforcer will be desired by all employees. What constitutes a desired consequence varies considerably from individual to individual. Thus, identifying a desired consequence and putting it to use is an essential step in developing a successful behavior-change strategy.
The power of a reinforcer to establish and maintain a behavior is determined by when and how often it is delivered. The timing of reinforcement refers to how long after the manifestation of the target behavior the reinforcement is delivered. The sooner the reinforcement is delivered, the more effective it will be. Common sense tells us that to say "thank you" for a favor immediately has more impact than to say "thank you" a month later. Obviously, in a work situation it is not always possible to deliver a reinforcement immediately after the performance of each desired behavior. As a general principle, to establish or teach a behavior, the reinforcement should be delivered each time the behavior is carried out, but once the behavior is learned it can be maintained most efficiently by a different schedule or frequency of reinforcement. There are five different schedules of reinforcement, and each one has a different effect on the target behavior.
Continuous Reinforcement
A schedule of continuous reinforcement is one in which a positive consequence follows each enactment of the target behavior. This produces a steady, high rate of responding as long as the reinforcement continues to follow every response. The schedule is appropriate for establishing or teaching new behaviors and for increasing the frequency of existing behaviors. Continuous reinforcement has two drawbacks. First, the behavior weakens, or undergoes extinction, rapidly when the reinforcement is discontinued. For example, assistants who are accustomed to being thanked each time they get coffee for their supervisors are likely to stop getting coffee after going unthanked only a few times. Second, the reinforcer may lose its desirability, and with it its power to reinforce. If a supervisor says "terrific" in response to every suggestion a subordinate makes, "terrific" may lose its power to reinforce suggestion giving. To maintain a behavior once it has been established, the schedule of reinforcement should be gradually changed from continuous to intermittent.
Intermittent Reinforcement
When reinforcement follows some but not all manifestations of the behavior, it is said to be intermittent. There are two kinds of intermittent reinforcement: the ratio schedule and the interval schedule. On a ratio schedule, reinforcement is delivered after a certain number of responses; on an interval schedule, reinforcement becomes available only after a certain amount of time has elapsed. Both ratio and interval schedules may be either fixed or variable.
A fixed schedule is one in which a specific length of time must pass or a specific number of responses must be made before the reinforcement is delivered. Piecework is an example of a fixed-ratio schedule; the workweek is an example of a fixed-interval schedule. With piecework the worker is paid for completing a specific number of items; with the workweek the worker is paid for spending a specific amount of time on the job. With both the fixed-ratio and the fixed-interval schedules, not every response is reinforced; reinforcement is delivered intermittently but predictably.
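The difference between the two fixed schedules can be made concrete with a short sketch. The Python classes below are illustrative only: the names FixedRatio and FixedInterval, and the rule that the first response made after the interval has elapsed is the one reinforced, are assumptions of the sketch rather than anything specified in the text. FixedRatio delivers a reinforcer after every nth response, as in piecework; FixedInterval makes the reinforcer available only once a set amount of time has passed, as with the workweek.

```python
class FixedRatio:
    """Deliver a reinforcer after every nth response (e.g., piecework)."""
    def __init__(self, n):
        self.n = n
        self.responses = 0

    def record_response(self):
        self.responses += 1
        if self.responses >= self.n:
            self.responses = 0
            return True   # reinforcement delivered
        return False      # response goes unreinforced


class FixedInterval:
    """Deliver a reinforcer for the first response made after a fixed
    amount of time has elapsed since the last reinforcement
    (a simplified stand-in for the workweek)."""
    def __init__(self, interval):
        self.interval = interval
        self.last_reinforced = 0.0

    def record_response(self, now):
        if now - self.last_reinforced >= self.interval:
            self.last_reinforced = now
            return True
        return False
```

Notice that neither class reinforces every response, yet both deliver reinforcement on a perfectly predictable rule, which is exactly what produces the post-reinforcement pause described next.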
Fixed schedules have one major drawback: The rate of responding usually drops immediately after the reinforcement is delivered. This is because the responses that occur immediately after the reinforcement are never reinforced. Therefore, in a fixed schedule we find that the frequency of responding is highest just before reinforcement and lowest just after reinforcement. The longer the fixed time or the larger the fixed number of responses required, the longer will be the pause after reinforcement before responding is resumed.
Although pay is, in theory, supposed to reinforce work, it is actually contingent on the amount of time (the number of days) one spends on the job. Thus, we would expect the rate of absences among salaried workers to be highest at the beginning of a pay period (the day after a paycheck is received) and lowest at the end of the pay period (the day paychecks are issued). Likewise, those working on a task or piecework basis are likely to work harder when the task is near completion, which explains why many pieceworkers find it difficult to begin a new project. The pieceworker who must complete many items before being paid is likely to pause longer at the beginning than one who must complete fewer items before being paid. Similarly, we would expect more absences at the beginning of a monthly pay period than at the beginning of a weekly pay period.
A variable schedule is one in which the length of time or the number of responses required to obtain the reinforcement varies randomly. Because the actual delivery of reinforcement is unpredictable, variable schedules do not usually produce the pause in responding that follows reinforcement on fixed schedules. Owing to the unpredictability of a variable schedule, the first response after a reinforcement might itself be reinforced, whereas on a fixed schedule there is always a fixed gap between reinforcements; that is, on a fixed schedule the first response after reinforcement is never reinforced. Consequently, that first response is subject to extinction, which results in a short pause.
If, for example, paychecks were issued on a variable-interval schedule, we would predict that the rate of absenteeism at the beginning of the pay period would drop, since the workers would not know on which day they would receive their pay. Likewise, slot machine players are likely to continue playing at a high rate because sometimes they will win twice in a row and sometimes they will win only after many turns (variable-ratio schedule). Making sales is another example of a variable-ratio schedule: The salesperson never knows how many calls will be required to secure a sale. (This could explain the perseverance of many salespeople.)
There is an exception to the general principle that variable schedules sustain high rates of responding. When the average interval or ratio required to secure reinforcement is very large, the person may pause after reinforcement. That is, although the first response or first time segment is sometimes reinforced, the probability of this decreases as the interval or ratio increases. Hence, the person learns that another reinforcement will usually not be forthcoming for a long time or until many responses have been made. A quality control inspector in a company that produces very few defective items is more likely to become lax immediately after finding a defective item than is an inspector who works for a company that produces many defective items. In general, when the average intervals or ratios are moderate, responding remains steady and high. Thus, behavior is most efficiently maintained by variable or random schedules of reinforcement.
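For contrast, a variable schedule can be sketched the same way. In the toy classes below (again illustrative, not taken from the text), the number of responses or the amount of time required for the next reinforcement is drawn at random around a chosen average, so the responder can never predict when the next reinforcement will come; a uniform draw is used purely for simplicity.

```python
import random

class VariableRatio:
    """Reinforce after a randomly varying number of responses that
    averages mean_n (e.g., slot machines, sales calls)."""
    def __init__(self, mean_n, seed=None):
        self.rng = random.Random(seed)
        self.mean_n = mean_n
        self.responses = 0
        self.required = self._draw()

    def _draw(self):
        # The requirement varies unpredictably around the mean.
        return self.rng.randint(1, 2 * self.mean_n - 1)

    def record_response(self):
        self.responses += 1
        if self.responses >= self.required:
            self.responses = 0
            self.required = self._draw()
            return True
        return False


class VariableInterval:
    """Reinforce the first response made after a randomly varying
    amount of time, averaging mean_t, has elapsed."""
    def __init__(self, mean_t, seed=None):
        self.rng = random.Random(seed)
        self.mean_t = mean_t
        self.last_reinforced = 0.0
        self.required = self._draw()

    def _draw(self):
        return self.rng.uniform(0, 2 * self.mean_t)

    def record_response(self, now):
        if now - self.last_reinforced >= self.required:
            self.last_reinforced = now
            self.required = self._draw()
            return True
        return False
```

Because the very next response after a reinforcement can itself satisfy a newly drawn requirement, no stretch of responses is ever guaranteed to go unreinforced, which is why the post-reinforcement pause largely disappears.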
Resistance to Extinction
A behavior is said to be resistant to extinction when it continues to occur after reinforcement is discontinued. Behavior that has been maintained on an intermittent schedule will be more resistant to extinction than behavior that has been maintained on a schedule of continuous reinforcement. Suppose the call button for the elevator in your building is defective and you have discovered that you must push it three times to call the elevator. You are likely to keep using that button as long as the elevator always comes after three pushes. However, if one day you push the button three times and the elevator does not come, you may push it three, six, or even nine times more; but if it still does not come, you will probably give up and use the stairs. If for a few days in a row the elevator does not come even after you push the button three or six times, you are likely to stop approaching the elevator altogether and begin going directly to the stairs each day.
On the other hand, the manager who is trying to stop the engineer's temper tantrums may occasionally give in to them, without realizing that giving in intermittently will make the tantrums extremely resistant to extinction. But if the manager has always given in and then abruptly stops, it is probable that the tantrums will be extinguished rapidly (unless they are being reinforced by other workers' attention).
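The elevator example can be turned into a deliberately crude simulation. The toy model below is an assumption made for illustration, not a result from the text: it supposes that a responder will tolerate an unreinforced run up to some multiple of the longest unreinforced run it experienced while reinforcement was still being delivered, and then gives up.

```python
import random

def responses_before_giving_up(training_history, patience_factor=3):
    """Toy model of resistance to extinction.  training_history is a list
    of booleans, one per response, indicating whether that response was
    reinforced.  The give-up rule (tolerate runs up to patience_factor
    times the longest run seen in training) is an illustrative assumption."""
    longest_run = run = 0
    for reinforced in training_history:
        run = 0 if reinforced else run + 1
        longest_run = max(longest_run, run)
    # Once reinforcement stops entirely, responding continues until the
    # tolerated run length is exceeded.
    return patience_factor * max(longest_run, 1)

rng = random.Random(0)
continuous = [True] * 100                                 # thanked every time
intermittent = [rng.random() < 0.3 for _ in range(100)]   # thanked unpredictably

print(responses_before_giving_up(continuous))    # few responses: rapid extinction
print(responses_before_giving_up(intermittent))  # many responses: resistant to extinction
```

Under this assumption the continuously reinforced history extinguishes almost immediately, while the intermittently reinforced history sustains many more unreinforced responses, mirroring the unthanked assistant, the defective elevator button, and the engineer's tantrums.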