Schedules of Reinforcement
Continuous reinforcement is the exception rather than the rule; behavior is more often reinforced intermittently.
Continuous: every response is reinforced
Intermittent: only some instances reinforced
Schedules of reinforcement:
Primarily concerned with intermittent relations (between behavior and its consequences)
Reinforcement schedules: rules for arranging consequences (i.e., they describe how consequences are arranged)
Defined by contingencies:
Different schedules = different conditions under which responses produce reinforcement.
Importance of schedules lies mainly in their ability to produce orderly and predictable patterns of behavior.
E.g., on average the schedule requires thirty lever presses for every reinforcer.
FI – Fixed Interval Schedule
Generates a "scallop" pattern
Initial pause followed by an increase in rate as the interval times out (positive acceleration)
PRP (post-reinforcement pause) increases with interval duration.
Fixed interval = FI
A period of time must elapse before a response produces the reinforcer.
FI 30 sec or FI 30″
Example: looking at the clock more and more often as the end of class approaches.
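The FI contingency can be sketched as a simple rule: the first response after the interval elapses is reinforced, and the timer then restarts. This is an illustrative Python sketch; the function names are assumptions, not standard terminology.

```python
def make_fi(interval):
    # Fixed-interval contingency: the first response occurring at least
    # `interval` seconds after the last reinforcer is reinforced; earlier
    # responses have no programmed consequence.
    state = {"last_reinforcer": 0.0}

    def respond(t):
        # t = time (in seconds) at which the response is emitted
        if t - state["last_reinforcer"] >= interval:
            state["last_reinforcer"] = t
            return True   # reinforcer delivered
        return False      # response goes unreinforced

    return respond

fi30 = make_fi(30.0)          # FI 30 sec
print(fi30(10.0))  # False: interval has not yet elapsed
print(fi30(31.0))  # True: first response after 30 s is reinforced
print(fi30(45.0))  # False: the interval restarts after each reinforcer
```

Note that the rule says nothing about responses before the interval ends; the scallop pattern emerges from the organism's behavior, not from the schedule itself.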
VI – Variable Interval Schedule
Generates steady constant response rate (lower than VR)
Useful in generating a behavioral baseline for studying other variables.
Like VR, little or no PRP
Example: waiting for a big wave.
Average period of time that must elapse before a response produces the reinforcer.
E.g., on average the schedule requires thirty seconds to pass before a lever press will produce a reinforcer.
VI 30 sec or VI 30″
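The VI contingency differs from FI only in that a new required interval, averaging the scheduled value, is drawn after each reinforcer. In this illustrative sketch the interval is drawn from an exponential distribution, which is one common way to program VI schedules; the names and the distribution choice are assumptions.

```python
import random

def make_vi(mean_interval, rng=None):
    # Variable-interval contingency: after each reinforcer a new interval
    # is drawn (here from an exponential distribution with the given mean);
    # the first response after it elapses is reinforced.
    rng = rng or random.Random(0)
    state = {"last": 0.0, "wait": rng.expovariate(1.0 / mean_interval)}

    def respond(t):
        if t - state["last"] >= state["wait"]:
            state["last"] = t
            state["wait"] = rng.expovariate(1.0 / mean_interval)
            return True
        return False

    return respond

# Responding once per second on VI 30 for 3000 s earns reinforcers at
# roughly one per ~30 s, i.e., on the order of 3000/30 ≈ 100 in total.
vi30 = make_vi(30.0)
earned = sum(vi30(t) for t in range(1, 3001))
```

Because reinforcer availability depends on elapsed time rather than response count, steady moderate responding collects nearly all the reinforcers the schedule makes available.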
Response rate is higher in ratio schedules compared to interval schedules
Longer interresponse times (IRTs; the time between responses) are reinforced in interval schedules.
Ratio schedules: reinforcer delivery rate is directly proportional to response rate.
Faster responding = more reinforcers
Not the case in interval schedules: Reinforcer delivery rate is relatively independent of response rate
Faster responding may only slightly increase reinforcer rate if at all.
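This rate difference can be shown in a small simulation: doubling the response rate roughly doubles reinforcers earned on a ratio schedule but leaves the interval schedule's payoff nearly unchanged. The schedule implementations below are illustrative assumptions, not a standard laboratory procedure.

```python
import random

def make_vr(mean_ratio, rng):
    # Variable-ratio: a drawn number of responses (averaging mean_ratio)
    # is required for each reinforcer.
    state = {"need": rng.randint(1, 2 * mean_ratio - 1), "count": 0}

    def respond(t):
        state["count"] += 1
        if state["count"] >= state["need"]:
            state["count"] = 0
            state["need"] = rng.randint(1, 2 * mean_ratio - 1)
            return True
        return False

    return respond

def make_vi(mean_interval, rng):
    # Variable-interval: a drawn period (averaging mean_interval) must
    # elapse before the next response is reinforced.
    state = {"last": 0.0, "wait": rng.expovariate(1.0 / mean_interval)}

    def respond(t):
        if t - state["last"] >= state["wait"]:
            state["last"] = t
            state["wait"] = rng.expovariate(1.0 / mean_interval)
            return True
        return False

    return respond

def earned(schedule, responses_per_sec, duration=3000):
    # Count reinforcers when responding at a steady rate for `duration` s.
    spacing = 1.0 / responses_per_sec
    return sum(schedule(i * spacing) for i in range(1, int(duration / spacing) + 1))

# VR 30: doubling the response rate roughly doubles reinforcers earned
vr_slow = earned(make_vr(30, random.Random(1)), 1)   # ~3000/30 ≈ 100
vr_fast = earned(make_vr(30, random.Random(1)), 2)   # ~6000/30 ≈ 200
# VI 30: doubling the response rate leaves reinforcer rate nearly unchanged
vi_slow = earned(make_vi(30.0, random.Random(2)), 1)  # ~100
vi_fast = earned(make_vi(30.0, random.Random(2)), 2)  # still ~100
```

The asymmetry is the structural reason ratio schedules sustain higher response rates: on VR, every extra response moves the organism closer to the next reinforcer; on VI, extra responses mostly just shave a fraction of a second off the collection delay.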
Pausing occurs on fixed schedules (responding pauses after each reinforcer is delivered, producing the scallop), but not on variable schedules.
Longer interreinforcement intervals (IRI) = Longer pauses
FI: the longer pause may be a matter of "timing" the interval
FR: the upcoming ratio seems to exert most control
Ratio strain: longer pauses with higher ratios may break down performance.
These pauses have a regenerative nature.
Differential Reinforcement of Low Rates (DRL)
Low rate= long interresponse time (IRT)
Response is reinforced if IRT > some amount of time
Used to generate behavior that is acceptable at low rates.
Differential Reinforcement of High Rates (DRH)
High rate= short interresponse time (IRT)
Response is reinforced if IRT < some amount of time.
Differential Reinforcement of Zero Rate / Other Behavior (DRO)
Reinforcer delivered if no response emitted in some amount of time
Alternative to extinction (a reinforcer is still delivered, just not for the target response)
Differential Reinforcement of Alternative Behavior (DRA)
Reinforcer delivered for some particular behavior and withheld from undesired behavior.
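The four differential-reinforcement contingencies reduce to simple rules over interresponse times (or over which behavior occurred). A minimal sketch, with function names and the behavior label chosen for illustration:

```python
def drl(irt, criterion):
    # DRL: reinforce only responses whose interresponse time (IRT)
    # exceeds the criterion, so low response rates pay off.
    return irt > criterion

def drh(irt, criterion):
    # DRH: reinforce only responses whose IRT is shorter than the
    # criterion, so high response rates pay off.
    return irt < criterion

def dro(time_since_last_response, criterion):
    # DRO: deliver the reinforcer when no target response has been
    # emitted for the criterion period ("other" behavior is reinforced).
    return time_since_last_response >= criterion

def dra(response, target="alternative_behavior"):
    # DRA: reinforce a particular alternative behavior and withhold
    # reinforcement from the undesired one (label is hypothetical).
    return response == target

print(drl(12.0, 10.0))  # True: a 12-s IRT exceeds a DRL 10-s criterion
print(drh(12.0, 10.0))  # False: too slow to meet a DRH 10-s criterion
```

Note the symmetry: DRL and DRH apply opposite inequalities to the same IRT measure, while DRO and DRA shift the contingency off the target response entirely.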
Variables that Influence Choice
The more reinforcers the better.
The quicker the reinforcement the better.
The bigger the reinforcer the better.
But they also compete (Delay vs. Amount)
Smaller sooner vs. larger later
E.g. small reinforcer now vs big reinforcer later
Classic self control problem.
Common terms: "impulsive" vs. "has self-control"
Represents choice between small payoff now and a...