Name: 

Learning

Multiple Choice
Identify the letter of the choice that best completes the statement or answers the question.

1. In __________ reinforcement, the reinforcer follows every correct response.
   a. intermittent
   b. partial
   c. negative
   d. continuous

2. In Pavlov's experiments with dogs, salivation was the
   a. conditioned response.
   b. unconditioned stimulus.
   c. conditioned stimulus.
   d. unconditioned response.

3. The presentation of an aversive stimulus or the removal of a positive stimulus are both examples of
   a. negative reinforcement.
   b. punishment.
   c. positive reinforcement.
   d. secondary reinforcement.

4. In classical conditioning, learning is evident when a
   a. stimulus automatically produces a response without a prior history of experience.
   b. stimulus which did not initially produce a response now elicits that response.
   c. spontaneously emitted response increases in frequency as a result of its consequences.
   d. subject repeats an action he or she has observed in another and is praised for it.

5. In Thorndike's law of effect, events critical for conditioning
   a. occur after the response.
   b. occur before the response.
   c. occur simultaneously with the response.
   d. are unrelated to the response except during extinction.

6. If you have a snake phobia because you once heard a loud noise while looking at a snake, for you a snake is a(n)
   a. US.
   b. CS.
   c. UR.
   d. CR.

7. A series of responses that gradually approach a desired pattern of behavior are called
   a. adaptations.
   b. gradients.
   c. successive approximations.
   d. conditioning trials.

8. If the conditioned stimulus is presented many times without reinforcement, we can expect
   a. an increase in stimulus generalization.
   b. the strength of the UR to increase.
   c. an increase in response generalization.
   d. extinction to occur.

9. A child has learned to avoid a furry, black cat. However, she still plays with her grandmother's short-haired tabby. Her response demonstrates
   a. negative transfer.
   b. extinction.
   c. discrimination.
   d. successive approximation.

10. Punishment is most effective in suppressing behavior when it is
   a. immediate, consistent, and intense.
   b. delayed, consistent, and mild.
   c. immediate, consistent, and mild.
   d. delayed, inconsistent, and intense.

11. The greatest degree of resistance to extinction is typically caused by a __________ schedule of reinforcement.
   a. variable interval
   b. variable ratio
   c. fixed interval
   d. fixed ratio

12. In Pavlov's experiments with dogs, the conditioned stimulus was the
   a. food.
   b. bell.
   c. salivation to the food.
   d. salivation to the bell.

13. Ivan Pavlov has been credited with the initial discovery of
   a. operant conditioning.
   b. reinforcement.
   c. classical conditioning.
   d. vicarious conditioning.

14. In Pavlov's experiments with dogs, the bell (prior to conditioning) was the
   a. neutral stimulus.
   b. unconditioned stimulus.
   c. conditioned stimulus.
   d. unconditioned response.

15. Reinforcement in operant conditioning is most effective when it is
   a. response contingent.
   b. stimulus contingent.
   c. US-CS contingent.
   d. NS-CS contingent.

16. After pairing the CS and US in a series of conditioning trials, the organism learns to respond to the CS alone. This response is then called the
   a. unconditioned stimulus.
   b. conditioned stimulus.
   c. unconditioned response.
   d. conditioned response.

17. Jimmy helps his father put away the dishes after dinner. Jimmy's father wants to increase the probability of this behavior and will be most successful by praising Jimmy
   a. after all the dishes are put away.
   b. at bedtime.
   c. the next morning at breakfast.
   d. the next time they are putting away dishes.

18. After a response has been extinguished, it will often reappear after a short time has passed. This is called
   a. adaptiveness.
   b. expectation checking.
   c. extinction recovery.
   d. spontaneous recovery.

19. A first-grade teacher gives students stickers when they perform well. If they earn five stickers in one day, they are exempt from homework. The stickers in this example could also be called
   a. tokens.
   b. primary reinforcers.
   c. generalized reinforcers.
   d. prepotent responses.

20. A student does a good job on math problems for homework, and the teacher awards a sticker. This demonstrates the use of
   a. extinction.
   b. reinforcement.
   c. spontaneous recovery.
   d. antecedents.

21. In Pavlov's experiments with dogs, the bell (during and after conditioning) was the
   a. conditioned response.
   b. unconditioned stimulus.
   c. conditioned stimulus.
   d. unconditioned response.

22. One difference between classical and operant conditioning is that
   a. animals learn only by operant conditioning.
   b. operant conditioning involves learning in which antecedent events are associated with one another.
   c. classical conditioning involves learning in which antecedent events are associated with one another.
   d. operant conditioning occurs when a response is not affected by consequences.

23. In classical conditioning, events critical to the learning occur __________ the response.
   a. before
   b. after
   c. simultaneously with
   d. in a manner unrelated to

24. A punisher can be
   a. the onset of an unpleasant event.
   b. the removal of a positive state of affairs.
   c. any consequence that reduces the occurrence of a behavior.
   d. a positive reinforcer.

25. In a study of punishment, shock is administered to a hamster through a wire grid on the bottom of the cage. To the researcher's surprise, the hamster learns to roll on its back when shocked so that its fur insulates it from the shock. The hamster's response demonstrates
   a. positive reinforcement.
   b. negative reinforcement.
   c. discovery learning.
   d. cognitive learning.

26. Becoming addicted to gambling is related to the effects of
   a. shaping.
   b. vicarious classical conditioning.
   c. unconditioned emotional reflexes.
   d. partial reinforcement.

27. The technique of desensitization involves
   a. flooding the person with images of the feared stimulus.
   b. gradually exposing the person to the feared stimulus.
   c. gradually exposing the person to the feared stimulus only when they are fully relaxed.
   d. systematically increasing the stimulus intensity up to the breaking point.

28. A dog that gets rewarded for the first bark it makes in each ten-minute period is being reinforced on a __________ schedule of reinforcement.
   a. continuous
   b. fixed interval
   c. variable interval
   d. fixed ratio

29. In Pavlov's experiments with dogs, food was the
   a. conditioned response.
   b. unconditioned stimulus.
   c. conditioned stimulus.
   d. unconditioned response.

30. To strengthen the connection between the CS and the CR, the CS must
   a. generalize to the UR.
   b. precede the US.
   c. be identical to the US.
   d. be followed by the UR.

31. Which of the following statements about punishment is FALSE?
   a. Punishment teaches new responses.
   b. Punishment temporarily suppresses a response.
   c. Punishment may permanently suppress a response.
   d. Punishment applies an aversive event.

32. Operant conditioning was studied by
   a. Pavlov.
   b. Maslow.
   c. Freud.
   d. Skinner.

33. Using poker chips to reinforce mental patients for healthy behavior would be an example of using
   a. negative reinforcement.
   b. classical conditioning.
   c. extinction.
   d. tokens.

34. Advertisers often try to use higher order conditioning by
   a. pairing images that evoke good feelings with pictures of their products.
   b. sounding loud tones at key points in the advertisement.
   c. reducing fear or anxiety as they repeatedly show the same commercial.
   d. associating the unconditioned stimulus with a cognitive response.

35. A neutral stimulus is one that
   a. leads to an increase of the UR.
   b. leads to a decrease of the UR.
   c. depends on the size of the UR.
   d. does not evoke the UR.

36. Which of the following might serve as a secondary reinforcer?
   a. sex
   b. grades
   c. food
   d. a pain-relieving drug

37. For conditioning to occur, the proper order of events is
   a. US-UR-CR.
   b. CS-CR-UR.
   c. CS-US-UR.
   d. UR-US-CR.

38. A team coach who benches a player for poor performance is using
   a. aversive conditioning.
   b. modeling.
   c. negative reinforcement.
   d. punishment.

39. For the connection between the CS and the CR to be strengthened, the CS must
   a. generalize to the UR.
   b. precede the US.
   c. be identical to the US.
   d. be followed by the UR.

40. In operant conditioning, what is the relationship between events critical to learning and the response to be learned?
   a. They occur before the response.
   b. They occur after the response.
   c. They occur simultaneously with the response.
   d. They are unrelated to the response.

41. The unconditioned stimulus, by definition, leads to a(n)
   a. conditioned response.
   b. conditioned stimulus.
   c. unconditioned response.
   d. classic response.

42. In a classic experiment, "Little Albert," a very young boy, was conditioned to be afraid of a rat. He also became fearful of white furry rabbits and bearded men. This is an example of
   a. spontaneous recovery.
   b. higher order conditioning.
   c. extinction.
   d. stimulus generalization.

43. When a stimulus acquires the power to elicit a response as a result of being paired with a stimulus that already produces the response,
   a. classical conditioning has occurred.
   b. spontaneous recovery has occurred.
   c. operant conditioning has occurred.
   d. aversive conditioning has occurred.

44. Which of the following describes the state of affairs after conditioning?
   a. CS-CR
   b. US-CR
   c. CS-UR
   d. US-UR

45. The most basic form of learning that is not heavily dependent on higher order intellectual processes is known as
   a. symbolic interaction.
   b. information processing.
   c. reductionism.
   d. conditioning.

46. Which of the following best describes punishment?
   a. addition of a positive event
   b. addition of an aversive event
   c. declining response frequency
   d. withdrawal of a negative event

47. Negative reinforcement __________ responding; punishment __________ responding.
   a. increases; increases
   b. decreases; decreases
   c. increases; decreases
   d. decreases; increases

48. Two schedules of reinforcement that produce the highest rates of response are
   a. continuous and fixed interval.
   b. fixed interval and variable interval.
   c. variable interval and variable ratio.
   d. fixed ratio and variable ratio.

49. __________ occurs when making a response removes an unpleasant event.
   a. Positive reinforcement
   b. Negative reinforcement
   c. Extinction
   d. Punishment

50. To shape the behavior of their students, teachers employ
   a. tertiary reinforcers.
   b. secondary reinforcers.
   c. negative reinforcers.
   d. vicarious conditioning.

51. If you give a child her favorite licorice candy for doing well in school and she continues to do well in school, the licorice candy is
   a. a reward and a reinforcer.
   b. a reward, but not a reinforcer.
   c. a reinforcer, but not a reward.
   d. neither a reinforcer nor a reward.

52. Increased feedback
   a. sometimes improves learning and performance.
   b. has no effect on learning and performance.
   c. almost always improves learning and performance.
   d. is not as effective as computer-assisted learning.

53. A child is conditioned to fear a furry, black cat. Soon she becomes fearful of any black, furry object. Her new response demonstrates
   a. spontaneous recovery.
   b. negative transfer.
   c. stimulus generalization.
   d. operant conditioning.

54. The schedule of reinforcement in which a set number of responses must be made for each reward is called
   a. fixed ratio.
   b. fixed interval.
   c. variable ratio.
   d. variable interval.

55. The fact that responses become more resistant to extinction after partial reinforcement is called
   a. the stimulus generalization effect.
   b. the partial reinforcement effect.
   c. the Skinner effect.
   d. the resistance effect.

56. If you slow down every time you see a police car, your slowing down is probably due to
   a. positive reinforcement.
   b. negative reinforcement.
   c. punishment.
   d. extinction.

57. Your niece has a temper tantrum while shopping in the store. If you buy her a toy, you are
   a. being practical.
   b. being kind.
   c. encouraging more tantrums.
   d. discouraging more destructive behaviors.

58. Negative reinforcement and punishment
   a. have opposite effects on behavior.
   b. are different terms for the same procedure.
   c. have the same effect on behavior.
   d. are not very effective in changing behavior.

59. A puppy has begun to cry and bark in order to be let into the house. To extinguish this response, you would
   a. let the puppy in the house.
   b. ignore the crying, letting the puppy in when quiet.
   c. swat the puppy with a newspaper whenever it cried.
   d. let the puppy in when it cries, then swat it with a newspaper.

60. Acquiring a fear of a light because you saw someone else getting shocked when the light came on is an example of
   a. vicarious conditioning.
   b. instrumental conditioning.
   c. classical extinction.
   d. vicarious withdrawal.

61. Secondary reinforcers are
   a. almost never effective.
   b. much more effective than primary reinforcers.
   c. innate.
   d. learned.

62. Which form of learning would most likely be studied in a Skinner box?
   a. classical conditioning
   b. vicarious conditioning
   c. operant conditioning
   d. conditioned emotional responses

63. A child is frightened by a loud noise while playing with a cat. If the child learns to fear the cat, it can be said that the cat was
   a. the UR.
   b. a generalization gradient.
   c. the US.
   d. a CS.

64. A child bitten by a white dog is not afraid of black dogs. This is an example of
   a. discrimination.
   b. spontaneous recovery.
   c. shaping.
   d. generalization.

65. Presenting the conditioned stimulus without the unconditioned stimulus will result in
   a. reinforcement.
   b. generalization.
   c. spontaneous recovery.
   d. extinction.

66. A corporate pay policy comparable to a fixed ratio schedule of reinforcement is
   a. paying employees a fixed salary.
   b. payment of employees on a piece-work basis.
   c. salary adjustments based on the quality of work performed.
   d. paying employees at the end of each day.

67. By a continuous reinforcement schedule, we mean that
   a. reinforcements occur continuously regardless of the subject's behavior.
   b. responding without pausing is the requirement for reinforcement.
   c. each correct response is reinforced.
   d. reinforcement continues even when errors are made.

68. When you are first learning golf, you may hit one or two great shots in an entire round. You are being reinforced on what kind of partial reinforcement schedule?
   a. fixed interval
   b. fixed ratio
   c. variable ratio
   d. variable interval

69. To teach a child to eat spaghetti, you would reinforce initial responses, such as holding the fork, and then increasingly closer approximations to the final response, a procedure known as
   a. counter conditioning.
   b. secondary conditioning.
   c. desensitization.
   d. shaping.

70. A rat learns to push a button in order to turn on a tone previously associated with food. The button pushing has been rewarded by a(n) __________ reinforcer.
   a. unconditioned
   b. primary
   c. secondary
   d. generalized

71. Your handsome, successful boyfriend winks at you each time before he tells you "I love you." Your expectation when he winks is a(n)
   a. unconditioned stimulus.
   b. conditioned response.
   c. conditioned stimulus.
   d. unconditioned response.

72. The process through which a response is taught by rewarding successive approximations to the final desired response is
   a. extinction.
   b. fading.
   c. shaping.
   d. secondary reinforcement.

73. A child who occasionally gets rewarded with candy after asking her grandmother for a "treat" is being rewarded on a __________ schedule.
   a. fixed ratio
   b. variable ratio
   c. fixed interval
   d. partial interval

74. Teaching your cat to turn on the living room lights would best be accomplished by
   a. spontaneous recovery.
   b. shaping.
   c. classical conditioning.
   d. extinction.

75. In operant conditioning, the reinforcer occurs __________ the response, and in classical conditioning, it occurs __________.
   a. after; before
   b. before; after
   c. before; before
   d. after; after

76. Two principles of conditioning that have aided our learning and improved our adaptability as a species are
   a. stimulus generalization and stimulus discrimination.
   b. spontaneous recovery and extinction.
   c. lower order and higher order conditioning.
   d. extinction and inhibition.

77. Learning is best defined as
   a. any change in behavior.
   b. a relatively permanent change in behavior due to past experience.
   c. a permanent change in behavior due to physical development.
   d. any change in behavior caused by punishment.

78. Responses that are reinforced and tend to be repeated illustrate
   a. stimulus control.
   b. operant conditioning.
   c. generalization.
   d. discrimination.

79. To be effective, punishment should be
   a. delivered late in the day.
   b. immediate and severe.
   c. explained in detail to the child.
   d. paired with reinforcement.

80. Which of the following best describes the unusual events that occurred in Pavlov's laboratory, leading him to the discovery of classical conditioning?
   a. Dogs salivated after meat powder was placed in their mouths.
   b. Dogs sometimes salivated before meat powder was placed in their mouths.
   c. Salivation existed in dogs as an unlearned reflex.
   d. Dogs salivated if and only if they were given a reward.

81. After weeks of successful extinction trials, your pet dachshund suddenly resumes burying bones in the front yard. Your pet therapist advises you that the dog's behavior is an example of
   a. token reinforcement.
   b. stimulus generalization.
   c. spontaneous recovery.
   d. satiation.

82. Operant conditioning is to Skinner as classical conditioning is to
   a. Pavlov.
   b. Thorndike.
   c. Miller.
   d. Freud.

83. Classical conditioning is most often used to condition
   a. reflexes.
   b. short-term behavior.
   c. negative behavior.
   d. positive behavior.


 