Schedules of Reinforcement

Question | Answer |
---|---|
A _____ of reinforcement is the _____ requirement that must be met in order to obtain reinforcement | schedule; response |
On a _____ reinforcement schedule (abbreviated _____), each response is reinforced, whereas on an _____ reinforcement schedule, only some responses are reinforced. The latter is also called a(n) _____ reinforcement schedule | continuous; CRF; intermittent; partial |
Each time you flick the light switch, the light comes on. The behavior of flicking the light switch is on a(n) _____ schedule of reinforcement | continuous |
When the weather is very cold, you are sometimes unable to start your car. The behavior of starting your car in very cold weather is on a(n) _____ schedule of reinforcement | intermittent (partial) |
_____ are the different effects on behavior produced by different response requirements. These are the stable patterns of behavior that emerge once the organism has had sufficient exposure to the schedule. Such stable patterns are known as _____ behaviors | schedule effects; steady-state |
On a(n) _____ schedule, reinforcement is contingent upon a fixed number of responses | fixed ratio |
A schedule in which 15 responses are required for each reinforcer is abbreviated _____ | FR 15 |
A mother finds that she always has to make the same request three times before her child complies. The mother's behavior of making requests is on an _____ schedule of reinforcement | FR 3 |
An FR 1 schedule of reinforcement can also be called a _____ schedule | continuous (CRF) |
A fixed ratio schedule tends to produce a _____ rate of response, along with a _____ | high; post-reinforcement pause |
An FR 200 schedule of reinforcement will result in a _____ pause than an FR 50 schedule | longer |
The typical FR pattern is sometimes called a _____ pattern, with a _____ pause that is followed immediately by a _____ rate of response | break-and-run; post-reinforcement pause; high |
An FR 12 schedule of reinforcement is _____ than an FR 75 schedule | denser |
A very dense schedule of reinforcement can also be referred to as a very _____ schedule | rich |
Over a period of months, Aaron changed from complying with each of his mother's requests to complying with every other request, then with every third request, and so on. The mother's behavior of making requests has been subjected to a procedure known as _____ | stretching the ratio |
On a variable ratio schedule, reinforcement is contingent upon a _____ of responses | varying, unpredictable number |
A variable ratio schedule typically produces a _____ rate of behavior _____ a postreinforcement pause | high; without |
An average of 1 in 10 people approached by a panhandler actually gives him money. His behavior of panhandling is on a _____ schedule of reinforcement | VR 10 |
As with an FR schedule, an extremely lean VR schedule can result in _____ | ratio strain |
On a fixed interval schedule, reinforcement is contingent upon the _____ response following a _____ period of _____ | first; fixed, predictable; time |
Responding on an FI schedule is often characterized by a _____ pattern of responding consisting of a _____ followed by a gradually _____ rate of behavior as the interval draws to a close | scalloped; post-reinforcement pause; increasing |
On a pure FI schedule, any response that occurs _____ the interval is irrelevant | during |
On a variable interval schedule, reinforcement is contingent upon the _____ response following a _____ period of _____ | first; varying, unpredictable; time |
You find that by frequently switching stations on your radio, you are able to hear your favorite song an average of once every 20 minutes. Your behavior of switching stations is thus being reinforced on a _____ schedule | VI 20-min |
In general, variable interval schedules produce a _____ and _____ rate of response with little or no _____ | moderate; steady; post-reinforcement pause |
In general, _____ tend to produce a high rate of response. This is because the reinforcer in such schedules is entirely _____ contingent, meaning that the rapidity with which responses are emitted _____ greatly affect how soon the reinforcer is obtained | ratio schedules; response; does |
On _____ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has _____ effect on how quickly the reinforcer is obtained (see the simulation sketch following this table) | interval; little |
In general, _____ schedules produce little or no postreinforcement pausing because such schedules often provide the possibility of relatively _____ reinforcement, even if one has just obtained a reinforcer | variable; immediate |
In general, _____ schedules produce postreinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite _____ | fixed; distant |
On a _____ schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an _____ schedule, reinforcement is contingent upon the first response after a fixed period of time | variable duration; fixed interval |
As Tessa sits quietly, her mother occasionally gives her a hug as a reward. This is an example of a _____ schedule | variable duration |
In practicing the slow-motion form of exercise known as tai chi, Tung noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of _____ reinforcement of _____ behavior (abbreviated _____) | differential; low rates of; DRL |
On a video game, the faster you destroy all the targets, the more bonus points you obtain. This is an example of _____ reinforcement of _____ behavior (abbreviated _____) | differential; high rates of; DRH |
Frank discovers that his golf shots are much more accurate when he swings the club with a nice, even rhythm that is neither too fast nor too slow. This is an example of _____ reinforcement of _____ behavior (abbreviated _____) | differential; paced; DRP |
On a _____ schedule of reinforcement, a response is not required to obtain a reinforcer. Such a schedule is also called a response _____ schedule of reinforcement | noncontingent; independent |
Every morning at 7:00 a.m. a robin perches outside Marilyn's bedroom window and begins singing. Given that Marilyn very much enjoys the robin's song, this is an example of a _____ 24-hour schedule of reinforcement (abbreviated _____) | fixed time; FT 24-hour |
For farmers, rainfall is an example of a noncontingent reinforcer that is typically delivered on a _____ schedule (abbreviated _____) | variable time; VT |
When noncontingent reinforcement happens to follow a particular behavior, that behavior may _____ in strength. Such behavior is referred to as _____ behavior | increase; superstitious |
Herrnstein (1966) noted that superstitious behaviors can sometimes develop as a by-product of _____ reinforcement for some other behavior | contingent |
As shown by the kinds of situations in which superstitious behaviors develop in humans, such behaviors seem most likely to develop on a(n) _____ schedule of reinforcement | VT |
During the time that a rat is responding on a VR 100 schedule, we begin delivering additional food on a VT 60-second schedule. As a result, the rate of response on the VR schedule is likely to _____ | decrease |
A child who is often hugged during the course of the day, regardless of what he is doing, is in humanistic terms receiving unconditional positive regard. In behavioral terms, he is receiving a form of _____ social reinforcement | noncontingent |
As a result, this child may be _____ likely to act out in order to receive attention | less |
A complex schedule is one that consists of _____ | two or more simple schedules |
In a(n) _____ schedule, the response requirement changes as a function of the organism's performance while responding for the previous reinforcer, | adjusting |
while in a(n) _____ schedule, the requirements of two or more simple schedules must be met before the reinforcer is delivered | conjunctive |
To the extent that a gymnast is trying to improve his performance, he is likely on a(n) _____ schedule of reinforcement; to the extent that his performance is judged according to both the form and quickness of his moves, he is on a(n) _____ schedule | adjusting; conjunctive |
A chained schedule consists of a sequence of two or more simple schedules, each of which has its own _____ and the last of which results in a _____ | discriminative stimulus; terminal reinforcer |
Within a chain, completion of each of the early links ends in a(n) _____ reinforcer, which also functions as the _____ for the next link of the chain | secondary; discriminative stimulus |
Responding tends to be weaker in the _____ links of a chain. This is an example of the _____ effect in which the strength and/or efficiency of responding _____ as the organism approaches the goal | earlier; goal gradient; increases |
An efficient way to train a complex chain, especially in animals, is through _____ chaining, in which the _____ link of the chain is trained first. | backward; last |
However, this type of procedure usually is not required with verbally proficient humans, with whom behavior chains can be quickly established through _____ | instructions |
One suggestion for enhancing our behavior in the early part of a long response chain is to make the completion of each link more _____, thereby enhancing its value as a _____ reinforcer | salient; secondary |
According to drive reduction theory, an event is reinforcing if it is associated with a reduction in some type of _____ drive | physiological |
According to this theory (drive reduction), a _____ reinforcer is one that has been associated with a _____ reinforcer | secondary; primary |
A major problem with drive reduction theory is that _____ | some behaviors do not seem to be related to a physiological drive |
The motivation that is derived from some property of the reinforcer is called _____ motivation | incentive |
The Premack principle holds that reinforcers can often be viewed as _____ rather than stimuli. For example, rather than saying that the rat's lever pressing was reinforced with food, we could say that it was reinforced with _____ food | behaviors; eating |
The Premack principle states that a _____ behavior can be used as a reinforcer for a _____ behavior | high probability; low probability |
According to the Premack principle, if you crack your knuckles 3 times per hour and burp 20 times per hour, the opportunity to _____ can probably be used as a reinforcer for _____ | burp; cracking your knuckles |
If you drink five soda pops each day and only one glass of orange juice, then the opportunity to drink _____ can probably be used as a reinforcer for drinking _____ | soda; orange juice |
If "Chew bubble gum --> Play video games" is a diagram of a reinforcement procedure based on the Premack principle, then chewing bubble gum must be a _____ probability behavior than playing video games | lower |
What is Grandma's rule, and how does it relate to the Premack principle? | First you work (LPB), then you play (HPB); it is an everyday application of the Premack principle |
According to the response deprivation hypothesis, a response can serve as a reinforcer if free access to the response is _____ and its frequency then falls _____ its baseline level of occurrence | restricted; below |
If a child normally watches 4 hours of television per night, we can make television watching a reinforcer if we restrict free access to the television to _____ than 4 hours per night | less |
The response deprivation hypothesis differs from the Premack principle in that we need only know the baseline frequency of the _____ behavior | reinforcing |
Kaily typically watches television for 4 hours per day and reads comic books for 1 hour per day. You then set up a contingency whereby Kaily must watch 4.5 hours of television each day in order to have access to her comic books. | (continued in the next card) |
According to the Premack principle, this will likely be an _____ contingency. According to the response deprivation hypothesis, this would be an _____ contingency (a worked sketch follows the glossary) | ineffective; effective |
According to the behavioral _____ approach, an organism that _____ engage in alternative activities will distribute its behavior in such a way as to _____ the available reinforcement | bliss point; can freely; optimize |
Contingencies of reinforcement often _____ the distribution of behavior such that it is _____ to obtain the optimal amount of reinforcement | disrupt; impossible |
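
The ratio/interval contrast in the cards above can be made concrete with a small simulation. The following is a minimal Python sketch, not part of the original deck; the function names and parameter values (FR 15, VR 10, FI 60 s) are illustrative assumptions. Each function simply answers whether the current response produces the reinforcer; a noncontingent (FT/VT) schedule would drop the response check entirely and deliver the reinforcer on the basis of time alone.

```python
import random

# A response-by-response view of the four basic schedules. Each function answers:
# "does this response produce the reinforcer?" (names and values are illustrative).

def fixed_ratio(responses_since_reinforcer, ratio=15):
    """FR: reinforcement after a fixed, predictable number of responses (e.g., FR 15)."""
    return responses_since_reinforcer >= ratio

def variable_ratio(rng, mean_ratio=10):
    """VR: each response pays off with probability 1/mean_ratio, so the required
    number of responses varies unpredictably around the mean (e.g., VR 10)."""
    return rng.random() < 1.0 / mean_ratio

def fixed_interval(seconds_since_reinforcer, interval=60):
    """FI: only the first response AFTER a fixed interval is reinforced;
    responses made during the interval are irrelevant."""
    return seconds_since_reinforcer >= interval

def variable_interval(seconds_since_reinforcer, current_interval):
    """VI: like FI, but the required interval varies unpredictably around a mean
    (e.g., VI 20-min averages 20 minutes)."""
    return seconds_since_reinforcer >= current_interval

if __name__ == "__main__":
    rng = random.Random(0)
    print(fixed_ratio(15))        # True: the 15th response is reinforced
    print(variable_ratio(rng))    # True about 1 time in 10
    print(fixed_interval(45))     # False: the 60-s interval has not yet elapsed
    print(variable_interval(70, rng.uniform(30, 90)))  # depends on the drawn interval
```

Note that the ratio functions depend only on the response count (response contingent), so responding faster brings the reinforcer sooner, whereas the interval functions depend on elapsed time (largely time contingent), which matches the rate differences described in the cards above.
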
Glossary

Term | Definition |
---|---|
adjusting schedule | A schedule in which the response requirement changes as a function of the organism's performance while responding for the previous reinforcer |
behavioral bliss point approach | The theory that an organism with free access to alternative activities will distribute its behavior in such a way as to maximize overall reinforcement |
Chained schedule | A schedule consisting of a sequence of two or more simple schedules, each with its own discriminative stimulus and the last of which results in a terminal reinforcer |
complex schedule | A schedule consisting of a combination of two or more simple schedules |
conjunctive schedule | A type of complex schedule in which the requirements of two or more simple schedules must be met before a reinforcer is delivered |
continuous reinforcement schedule | A schedule in which each specified response is reinforced |
differential reinforcement of high rates (DRH) | A schedule in which reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time--or, more generally, reinforcement is provided for responding at a fast rate |
differential reinforcement of low rates (DRL) | A schedule in which a minimum amount of time must pass between each response before the reinforcer will be delivered--or, more generally, reinforcement is provided for responding at a slow rate |
differential reinforcement of paced responding (DRP) | A schedule in which reinforcement is contingent upon emitting a series of responses at a set rate--or, more generally, reinforcement is provided for responding neither too fast nor too slow |
drive reduction theory | According to this theory, an event is reinforcing to the extent that it is associated with a reduction in some type of physiological drive |
fixed duration (FD) schedule | A schedule in which reinforcement is contingent upon continuous performance of a behavior for a fixed, predictable period of time |
fixed interval (FI) schedule | A schedule in which reinforcement is contingent upon the first response after a fixed, predictable period of time |
fixed ratio (FR) schedule | A schedule in which reinforcement is contingent upon a fixed, predictable number of responses |
fixed time (FT) schedule | A schedule in which the reinforcer is delivered following a fixed, predictable period of time, regardless of the organism's behavior |
goal gradient effect | An increase in the strength and/or efficiency of responding as one draws near to the goal |
incentive motivation | Motivation derived from some property of the reinforcer, as opposed to an internal drive state |
intermittent (or partial) reinforcement schedule | A schedule in which only some responses are reinforced |
noncontingent schedule of reinforcement | A schedule in which the reinforcer is delivered independently of any response |
Premack principle | The notion that a high-probability behavior can be used to reinforce a low-probability behavior |
ratio strain | A disruption in responding due to an overly demanding response requirement |
response deprivation hypothesis | The notion that a behavior can serve as reinforcer when: (1) access to the behavior is restricted and (2) its frequency thereby falls below its preferred level of occurrence |
response-rate schedule | A schedule in which reinforcement is contingent upon the organism's rate of response |
schedule of reinforcement | The response requirement that must be met to obtain reinforcement |
variable duration (VD) schedule | A schedule in which reinforcement is contingent upon continuous performance of a behavior for a varying, unpredictable period of time |
variable interval (VI) schedule | A schedule in which reinforcement is contingent upon the first response after a varying, unpredictable period of time |
variable ratio (VR) schedule | A schedule in which reinforcement is contingent upon a varying, unpredictable number of responses |
variable time (VT) schedule | A schedule in which the reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior |
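
As a worked illustration of the Kaily cards above, the short sketch below compares the two predictions. It is a hedged sketch, not from the source: the numbers come from the cards, while the variable names and the simple "effective if access falls below baseline" test are illustrative assumptions.

```python
# Baseline (free-access) levels from the Kaily example, in hours per day
baseline_tv = 4.0      # instrumental behavior: what the contingency requires
baseline_comics = 1.0  # reinforcing behavior: what access is granted to
required_tv = 4.5      # the contingency: 4.5 h of TV earns access to the comics

# Premack principle: a behavior can reinforce another only if it is the
# higher-probability (here, longer-duration) behavior at baseline.
premack_effective = baseline_comics > baseline_tv
print("Premack prediction - effective?", premack_effective)  # False (ineffective)

# Response deprivation hypothesis: the contingency works if performing the
# instrumental behavior at its baseline level would leave access to the
# reinforcing behavior below its own baseline level.
comics_if_tv_stays_at_baseline = baseline_comics if baseline_tv >= required_tv else 0.0
deprivation_effective = comics_if_tv_stays_at_baseline < baseline_comics
print("Response deprivation prediction - effective?", deprivation_effective)  # True (effective)
```
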