This term refers to the density of reinforcement delivered for a predetermined behavior. It can be thought of as a continuum along time and instances of the behavior. A continuous schedule of reinforcement is often used to teach a new, replacement behavior. Once the new behavior has been established, switching to a variable schedule of reinforcement helps maintain it. A fixed schedule of reinforcement may be used to begin teaching an individual to accept delayed reinforcement.
- Fixed schedule = target behavior is reinforced after a specified number of responses (e.g., every 3 responses) or after a specified time interval (e.g., 5 minutes).
- Continuous schedule = every occurrence of the target behavior is reinforced.
- Variable schedule = the behavior is reinforced on a less predictable basis (e.g., after an average of every 5 responses, which could range from 1 to 9 responses, or after an average of every 5 minutes, which could range from 1 to 9 minutes).
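The three schedules above can be expressed as simple decision rules. The sketch below is a minimal illustration, not standard ABA tooling; the function and class names are my own, and each call is assumed to represent one occurrence of the target behavior:

```python
import random

def continuous(response_count):
    """Continuous schedule (CRF): every occurrence is reinforced."""
    return True

def fixed_ratio(response_count, n=3):
    """Fixed-ratio schedule (e.g., FR-3): reinforce every nth response."""
    return response_count % n == 0

class VariableRatio:
    """Variable-ratio schedule (e.g., VR-5): reinforce after an
    unpredictable number of responses that averages `mean` (here drawn
    uniformly from mean - spread to mean + spread, i.e., 1 to 9)."""

    def __init__(self, mean=5, spread=4, seed=None):
        self.rng = random.Random(seed)
        self.mean, self.spread = mean, spread
        self.since_last = 0
        self._next = self._draw()

    def _draw(self):
        return self.rng.randint(self.mean - self.spread,
                                self.mean + self.spread)

    def reinforce(self):
        """Record one response; return True if it earns reinforcement."""
        self.since_last += 1
        if self.since_last >= self._next:
            self.since_last = 0
            self._next = self._draw()
            return True
        return False
```

Over many responses the variable-ratio rule still averages one reinforcer per 5 responses, but the learner cannot predict which response will be reinforced, which is what makes the behavior resistant to extinction.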
The quantity, intensity, or duration of the reinforcer provided for responding (Hoch, McComas, Johnson, Faranda, & Guenther, 2002).
Example: A person is reinforced with a small sip of soda for following directions to fold her laundry. An increase in the quantity of soda (e.g., from 2 oz to 6 oz) is an increase in the magnitude of the reinforcer.
Refers to how soon reinforcement is delivered after a predetermined behavior is produced. It is important to deliver the reinforcer immediately after the desired behavior so that the desired behavior is reinforced and another behavior is not inadvertently reinforced.
Example: A person is learning to request a break with a nonverbal cue rather than running out of the room. Whenever the person requests a break, the interventionist immediately releases him from the task he is working on and lets him have a short break. This is an example of immediate reinforcement. If the person requests a break and the interventionist does not immediately release him from the task, then the reinforcement is delayed (i.e., not immediate).
In behavior analysis, a contingency describes the relationship among the antecedent, the behavior, and the consequence.
Example: A person is presented with a task she finds difficult, such as making her bed. The directive “make your bed, please” is the antecedent. Whenever the person is given this directive, she begins to swear at the person talking to her. This is the behavior. Once the individual begins to swear, she is removed from her room and doesn’t end up making her bed. This is the consequence for the problem behavior.
Also known as a setting event.
An antecedent event that alters the effectiveness of a reinforcer.
Example: Having just eaten a large meal will diminish the effectiveness of edible reinforcers. Conversely, deprivation will increase the effectiveness of reinforcers.
An individual’s ranking of items, activities, and people along a continuum from ‘preferred’ to ‘non-preferred’.
Example: A person may avoid men with beards because of prior abuse by an individual with that appearance, thus preferring women or men without beards.