A. A simple definition: Learning is a change in behavior resulting from experience; in evolutionary terms, learning is an adaptive change in behavior that results from experience
B. The difference between maturation and learning: Some behavior change (walking, talking, adult sexual behavior) requires biological development as well as experience
C. Simple vs. complex kinds of learning
1. Relatively simple forms of learning: habituation, classical conditioning, operant conditioning
2. More complex kinds of learning: learning to talk, learning calculus, learning the history of the Civil War
II. Classical conditioning
A. Pavlov’s dogs: Pavlov originally studied the physiology of salivation, for which he won the Nobel Prize. In the course of this research, he became aware of a kind of learning that today is called “classical conditioning”; sometimes it is also referred to as “Pavlovian conditioning”
B. The basic paradigm of classical conditioning: A formerly neutral stimulus (the conditioned stimulus; a bell, for example) is paired with another stimulus (the unconditioned stimulus; food, for example) that automatically produces a response (the unconditioned response; for example, salivation). After repeated pairing, the neutral stimulus (the bell) will elicit a response similar to the unconditioned response (i.e., the bell will produce a response of salivation; this learned response to the bell is called the conditioned response)
Abbreviations:
US = unconditioned stimulus
CS = conditioned stimulus
UR = unconditioned response
CR = conditioned response
C. Some examples of classical conditioning:
1) Learning to feel upset at the sight of flashing police lights in your rearview mirror
2) Learning to feel anxiety when you hear the sounds at the dentist’s office
3) Learning sexual arousal to objects that have been associated with sexual arousal in the past (e.g., items of clothing)
4) Feeling tender emotions when you hear a song that was associated with your first romance
5) A new mother whose breasts start to produce milk when she hears her baby’s cry
6) Learning to feel emotional arousal to certain words (4-letter words, bigoted labels)
7) The famous case of “little Albert” – learning fear
D. Traditionally, psychologists believed that responses that can be classically conditioned are involuntary responses (examples: heart rate changes, gastric motility, sweating, eye blinks, sexual arousal). This is in contrast to operant conditioning, in which voluntary responses are molded through their rewarding and punishing consequences
E. What is the evolutionary “purpose” of classical conditioning? One answer: It helps the body prepare itself for an expected or likely event. For example, if food is likely, salivation aids the digestive process. If a painful shock is likely, the body prepares itself for this stressor.
F. Some important terms and concepts in CC:
1. Extinction: A weakening of the conditioned response when there ceases to be a pairing between the CS and the US
2. Spontaneous recovery: The tendency for a conditioned response to reappear after extinction takes place
3. Generalization: The tendency for an animal or person to condition not only to the exact CS used during conditioning trials, but also to similar stimuli; for example, if a dog is conditioned to salivate to a particular bell, it may also salivate to other bells
4. Semantic generalization: a kind of generalization that occurs only in people; when people learn conditioned responses to words, they may generalize the responses to the objects or concepts that the words refer to. For example, if you learn prejudiced feelings to a bigoted label, you may generalize them to the people referred to by the label. Also, when people learn conditioned responses to words, they may generalize the responses to words with similar meanings. For example, if the word “white” is paired with electric shocks and you learn to be afraid of the word, you may also show fear responses to the word “light,” which is semantically related to the word “white”
G. Some factors that influence classical conditioning
1. Time delay between CS and US: Usually conditioning is strongest if the delay is between 250 and 700 milliseconds
2. Time arrangements of CS and US
a. forward, trace, simultaneous, and backward conditioning:
forward – the CS comes first, and while it is still going, the US occurs
trace – the CS comes first, and after it stops, the US occurs
simultaneous – the CS and US occur at the same time
backward – the CS occurs after the US has started
H. The contiguity vs. contingency views of classical conditioning
1. contiguity (Pavlov’s view): CC occurs when the CS and US occur together in time and space (note the similarity to British associationist views)
2. contingency: CC occurs only when the CS provides some information ahead of time about the likelihood of the US occurring
3. Some evidence:
a. effects of CS, US time arrangements
b. Blocking experiments – What happens if animals are first conditioned to blink (the CR) to one CS (a sound) paired with an air burst (the US) for 8 trials, and then a light and a sound (two CSs) are paired with the air burst for another 8 trials? Will the animal show a CR to the newly added CS? (See the sketch below.)
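Not part of the original notes, but one way to make the contingency view concrete is a simple error-correction learning rule (in the spirit of the Rescorla-Wagner model). The sketch below is illustrative only; the cue names, learning rate, and trial counts are assumptions chosen to mirror the blocking question above.

```python
# Illustrative sketch only (not from the notes): an error-correction rule in which
# a CS gains associative strength only to the extent that the US is not already
# predicted by the other cues present on that trial.

def train(trials, strengths, learning_rate=0.3, us_magnitude=1.0):
    """Run conditioning trials; each trial is the set of CS names presented with the US."""
    for cues in trials:
        prediction = sum(strengths[cs] for cs in cues)   # what the animal already expects
        error = us_magnitude - prediction                # the "surprise" left to learn
        for cs in cues:
            strengths[cs] += learning_rate * error       # all present cues share the update
    return strengths

strengths = {"sound": 0.0, "light": 0.0}
train([{"sound"}] * 8, strengths)             # phase 1: sound alone signals the air burst
train([{"sound", "light"}] * 8, strengths)    # phase 2: sound + light compound
print(strengths)  # the light gains almost no strength: the sound already predicts the US ("blocking")
```

On this kind of account, the light acquires little strength because the sound already predicts the US, which is what the contingency view (but not a pure contiguity view) expects.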
I. Is classical conditioning a kind of stimulus-response learning (the CR is “hooked to” the CS), or is it a kind of stimulus-stimulus learning (the animal learns that the CS “signals” the US)?
1. the “response-prevention paradigm” – What happens if we prevent an animal from making the CR by paralyzing the muscle? When the paralysis is removed, will the animal show the CR?
2. the “US devaluation” paradigm – In Pavlov’s original experiments, the dogs were hungry. What happens if we condition dogs to salivate to a bell, and then allow the dogs to eat until they’re stuffed? Will they then salivate to the bell?
J. Biological preparedness and classical conditioning
1. the unusual case of learned taste aversions: conditioning can occur in one trial; the time delay between CS and US can be long
2. animals can learn some kinds of CR (food aversions) more readily to some kinds of CS (smell, taste) than to others (visual cues)
III. Operant conditioning
A. Edward Thorndike’s (1898) cat puzzle box: Hungry cats locked in a box, which could be opened only if the cats pulled an unlatching device (a loop of wire); at first the cats randomly moved, meowed, and clawed, but gradually they became better (quicker) at getting out of the box with successive trials
B. Thorndike’s “law of effect”: rewards or reinforcers strengthen stimulus-response connections; a mechanistic, unthinking view of the effects of reward
C. Is reward necessary for learning to take place? Tolman’s notion of “latent learning” – a rat allowed to roam freely through a maze still seems to learn its layout, even when the rat is not rewarded
D. B. F. Skinner’s (1904-1990) view of operant (or instrumental) conditioning
1. Animals emit behaviors freely, called “operants”; for example, rats in a “Skinner box” might press a lever sticking out of the wall of the box; a reinforcer is anything that increases the probability of a response when it follows the response (Examples: food is reinforcing to a hungry animal; water is reinforcing to a thirsty animal; sex can be reinforcing to a sexually mature animal)
E. Some important concepts and terms in operant conditioning:
1. Positive vs. negative reinforcement: Both increase the probability of a response; however, positive reinforcement is the presentation of a desired stimulus (food, money), whereas negative reinforcement is the termination of an aversive or unpleasant stimulus (electric shock, pain, anxiety). Example to think about: Do people drink alcohol or take drugs like cocaine because of positive or negative reinforcement?
2. Primary vs. secondary reinforcers: Primary reinforcers are unlearned and “wired in” to the organism (examples: food, water, sex); secondary reinforcers are learned reinforcers (examples: money, school grades, tokens that monkeys work for to get treats)
3. Schedules of reinforcement: interval (based on time) or ratio (based on number of responses); fixed (occurring after set intervals of time or fixed numbers of responses) or variable (occurring after variable time intervals or a variable number of responses); different schedules of reinforcement produce different patterns of response in animals and people (a minimal sketch appears after this list)
4. Partial (some of the time) vs. continuous (every response reinforced) schedules of reinforcement; the partial reinforcement effect: partial reinforcement produces responses that are more resistant to extinction
5. Punishment: an aversive stimulus delivered after a behavior, which decreases the probability of (or even eliminates) a response
Skinner’s early view of punishment: It is often ineffective because it may lead only to a temporary, situation-specific suppression of a response; furthermore, punishment is at best a partial strategy: it may eliminate an undesired response, but it doesn’t necessarily establish a desired response in its place; also, physical punishment may produce anger and modeling of aggressive behavior
When is punishment effective?
1) In animals, punishment must be delivered soon after a response to be most effective
2) Punishment must be strong (as severe as is ethically or practically acceptable) to be effective
3) Punishment must be delivered consistently (compare this with the partial reinforcement effect)
4) Punishment should start out strong; it should not start out weak and build up with repeated “infractions”
5) Punishment is less effective if the animal has earlier experienced random and noncontingent punishment (example: a child is randomly abused, and then punished for a specific “bad” behavior)
6) Punishment is more effective if the animal is offered an alternative response to the punished response
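As promised in item 3 above, here is a hedged sketch (not from the notes) of how the four classic schedules of reinforcement can be written down as simple rules for when a reinforcer is delivered. The specific parameters (every 5th response, roughly 30-second intervals) are illustrative assumptions.

```python
import random

# Illustrative sketch only: four classic reinforcement schedules expressed as
# simple "should this response be reinforced?" rules. Parameter values are
# arbitrary assumptions, not values from the notes.

def fixed_ratio(n_responses, ratio=5):
    # Reinforce after every 5th response (FR-5).
    return n_responses % ratio == 0

def variable_ratio(ratio=5):
    # Reinforce after a variable number of responses, averaging 5 (VR-5).
    return random.random() < 1.0 / ratio

def fixed_interval(seconds_since_last_reinforcer, interval=30):
    # Reinforce the first response made after a fixed 30-second interval (FI-30).
    return seconds_since_last_reinforcer >= interval

def variable_interval(seconds_since_last_reinforcer, mean_interval=30):
    # Reinforce the first response made after an interval that varies around 30 s (VI-30).
    return seconds_since_last_reinforcer >= random.expovariate(1.0 / mean_interval)

# Example: count reinforcers earned over 100 lever presses on a fixed-ratio schedule.
print(sum(fixed_ratio(press) for press in range(1, 101)))  # 20 reinforcers on FR-5
```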
IV. Language and conditioning: Can language learning be explained using classical and operant conditioning?
A. Classical conditioning of emotional reactions to words
B. Operant conditioning of early verbal utterances: Are some words reinforced?
C. Limitations to the conditioning approach: Chomsky’s (1959) critique of conditioning approaches
I. Communication in infra-human species and characteristics of human language
A. Some examples of communication in lower animals
1. the dance of the honeybee
2. communication in jackdaws (a European member of the crow family)
3. communication in apes and attempts to teach chimps and gorillas to “talk”
D. Some characteristics of human language
1. Natural human languages are based on relatively small sets of speech sounds called phonemes
2. Syntax or grammar: All human languages have complex structural rules; note, “grammar” here does not refer to the “good grammar” you learn in school
3. Use of arbitrary, non-representative meaningful symbols: words and morphemes
4. Generativity: All human languages can generate an infinite number of meaningful statements or sentences
5. Learning: All human languages are learned; however, the human capacity for language and language learning may have a strong biological basis
6. Cultural transmission of information: Human languages permit the storage and transmission of complex cultural information from generation to generation
E. Language and thought
1. Does thought require language? – the case of Helen Keller
2. The Whorfian hypothesis (also known as the Sapir-Whorf hypothesis or the linguistic relativity hypothesis): the language that people speak influences the way they think
a. Strong vs. weak form
b. Lexical vs. grammatical form
3. Some psychological research relevant to the linguistic relativity hypothesis
a. color labels and color perception in various cultures
b. linguistic labels and their effects on memory
I. Early attempts to study intelligence
A. Galton (a cousin of Charles Darwin) and his book, Hereditary Genius; tried to use simple measures (like grip strength, pain sensitivity, memory for dictated consonants) to assess “intelligence”
B. In the United States, Cattell (1901) conducted a study which suggested little relationship between simple sensory measures and Columbia college students’ test scores
C. Modern research has returned, using more sophisticated methods, to the question of reaction time and intelligence. Note, common-sense views relate speed of thinking and intelligence, as is illustrated by phrases such as “quick witted” and “slow minded”
II. Binet’s seminal work on intelligence
A. A practical problem: The Paris school system sought Binet’s help in “objectively” identifying “dull” school children
B. Binet was pragmatic, and he tried many different methods to measure “intelligence,” including digit recall, measuring the size of the cranium, assessing moral judgments, graphology, and even palmistry
C. Defining intelligence: Binet eventually came to see intelligence as involving many processes related to 1) “the tendency to take and maintain a definite direction” in thought, 2) the capacity “to make adaptations,” and 3) the power of “auto-criticism”; in simple terms, according to Binet, intelligence involves purposeful, directed thought, which successfully achieves goals and which is self-critical and self-correcting; according to Binet and Simon (1916), the core of intelligence is “judgment, otherwise called good sense, practical sense, initiative, the faculty of adapting one’s self to circumstances. …to judge well, to comprehend well, to reason well….”
D. Modern conceptions of intelligence hold on to Binet’s notions; for example, Sternberg’s triarchic (three-part) model of intelligence describes three aspects: 1) analytic intelligence (“school smarts”; the ability to analyze and solve academic problems), 2) practical intelligence (“street smarts”; the ability to apply knowledge, solve problems, and achieve success in everyday life), and 3) creativity (the ability to invent, to come up with new solutions and views; divergent thought).
E. The first intelligence tests: The Binet-Simon scale was published in 1905; revisions were published in 1908 and 1911; the “difficulty” or “level” of questions (items) was determined by age-related changes in performance
F. The notion of “general intelligence”: Binet observed that children’s performance on various questions tended to be positively correlated; that is, children who tended to do well on one set of items also tended to do well on other items, whereas children who tended to do poorly on one set also tended to do poorly on other items
G. Stanford-Binet: the American version of the Binet test; these are individual, not group tests
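Not stated in the notes, but as a point of context for how age-graded items were turned into a score: early revisions of the Stanford-Binet summarized performance as a ratio IQ, comparing the “mental age” implied by the items a child passed to the child’s chronological age:

\[ \mathrm{IQ} = \frac{\text{mental age}}{\text{chronological age}} \times 100 \]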
H. Modern IQ tests, such as the Wechsler scales
III. Characteristics of good IQ tests
A. Reliability: Is the test measuring something consistently? There are various kinds of reliability: test-retest reliability and internal consistency are two important kinds
1. In general, IQ tests are quite reliable; for example, test-retest reliabilities are on the order of .9 (a minimal sketch of a test-retest correlation appears below)
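A hedged sketch (not from the notes) of what a test-retest reliability coefficient is operationally: the correlation between the same people’s scores on two administrations of the same test. The scores below are made-up numbers used only for illustration.

```python
from statistics import correlation  # Pearson correlation; requires Python 3.10+

# Illustrative sketch only: test-retest reliability as the correlation between
# scores from two administrations of the same test to the same eight people.
scores_time_1 = [95, 110, 102, 120, 88, 131, 99, 115]
scores_time_2 = [97, 108, 105, 118, 90, 128, 101, 117]

reliability = correlation(scores_time_1, scores_time_2)
print(round(reliability, 2))  # a value near 1.0 indicates highly consistent measurement
```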
B. Validity: Is the test measuring what it’s supposed to measure? Do test scores predict what you would expect them to predict? In the case of IQ scores, you might expect that they would predict school grades, job success, and intellectual accomplishments (e.g., scientific discoveries, patents, creative accomplishments such as publishing books)
1. Cox’s estimates of the IQs of famous people like Mozart, Goethe, and Mill
2. Terman’s “gifted children” study
3. Research on IQ and school grades
4. Research on IQ and job success
IV. Nature, nurture, and intelligence
A. The concept of “heritability”: Heritability is the computed proportion of population variability in a trait (e.g., variability in height or in intelligence) that is due to genetic factors; note, heritability applies only to populations, not to individuals
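The definition above can be written as a simple ratio of variances (this formalization follows directly from the wording in the notes):

\[ h^2 = \frac{V_{\text{genetic}}}{V_{\text{phenotypic}}} = \frac{\text{trait variance due to genetic differences}}{\text{total trait variance in the population}} \]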
B. Ways of assessing heritability: Behavior genetic studies look at patterns of trait correlations in identical and fraternal twins and in families with adopted children. Two clear examples: trait correlations between identical twins reared apart, and trait correlations between adopted children and members of their genetically unrelated adoptive family
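For context (these estimators are not given in the notes, but they are standard in behavior genetics): with the twin designs just described, heritability can be estimated directly from the trait correlations, for example

\[ h^2 \approx r_{\text{MZ apart}} \qquad \text{or} \qquad h^2 \approx 2\,(r_{\text{MZ}} - r_{\text{DZ}}) \]

where \(r_{\text{MZ}}\) and \(r_{\text{DZ}}\) are the trait correlations for identical and fraternal twins reared together, and \(r_{\text{MZ apart}}\) is the correlation for identical twins reared apart.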
C. Behavior genetic studies often investigate two kinds of environmental influences: 1) common environmental influences, which affect all children in a family the same way, and 2) unique environmental influences, which affect various children in a family differently. In common-sense terms, common environmental influences tend to make siblings more similar to one another, and unique environmental influences tend to make siblings different
D. Behavior genetic evidence on intelligence:
1. Heritability of IQ tends to be in the range of .50 to .80 (i.e., 50% to 80% of the variation in adults’ IQ scores is due to genetic factors); heritability is lower for children and higher for adults
2. Common environmental effects account for about 20-25% of the variability of children’s IQ scores, but close to 0% of the variability of adults’ scores
3. Thus, in adults most of the non-genetic variability in IQ seems to be due to unique environmental effects
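One way to summarize points 1-3 above (the variance decomposition is standard; the percentage ranges are the ones given in the notes):

\[ V_{\text{IQ}} = V_{\text{genetic}} + V_{\text{common environment}} + V_{\text{unique environment}} \]

For adults, roughly 50%-80% of the variance is genetic, about 0% is common environment, and the remainder is unique environment; for children, heritability is lower and common environment accounts for roughly 20%-25%.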
E. Behavior genetic statistics on IQ do not give us useful information about group differences in IQ