Classical and operant conditioning
Classical conditioning
Classical conditioning is learning through association. In simple terms, two stimuli are linked together to produce a new learned response in an animal or human.
During the 1890s, the Russian physiologist Ivan Pavlov was researching salivation in dogs in response to being fed. He found that objects or events could trigger a conditioned response.
The experiments began with Pavlov observing that presenting food (the unconditioned stimulus) triggered an unconditioned response (salivation). He then noticed that the dogs started to salivate at the sight of the lab assistant who brought the food: they had come to associate the assistant with food, creating a learned, conditioned response. Pavlov then designed an experiment using a bell as a neutral stimulus. As he gave food to the dogs, he rang the bell. After repeating this procedure a number of times, he tried ringing the bell without providing food. The bell alone now produced an increase in salivation: the result of the experiment was a new conditioned response in the dogs.

Pavlov’s work later developed into the theory of classical conditioning, which refers to learning that associates an unconditioned stimulus that already produces a response (such as a reflex) with a new, conditioned stimulus. As a result, the new stimulus brings about the same response.
Key terms for understanding classical conditioning:
- Neutral stimulus – A stimulus that produces no specific response.
- Unconditioned stimulus – A stimulus that naturally and automatically produces a response.
- Conditioned stimulus – A previously neutral stimulus that, after being associated with the unconditioned stimulus, triggers a conditioned response.
- Conditioned response – The learned response to the previously neutral stimulus, which has become a conditioned stimulus.
Little Albert
Another well-known experiment in classical conditioning is that of ‘Little Albert’. The scientists Watson and Rayner demonstrated that classical conditioning could be used to create phobias. They introduced Little Albert to stimuli such as rats, rabbits and dogs, things that he initially did not fear. These neutral stimuli were then paired with a loud noise that frightened Little Albert, who then began to fear the stimuli he had previously been unafraid of. Little Albert had been conditioned to fear the previously neutral stimuli. https://www.youtube.com/watch?v=9hBfnXACsOI
Experiments like those of Pavlov and of Watson and Rayner illustrate how phobias and fears in our horses can be the result of classical conditioning. Little Albert’s fears were conditioned after multiple pairings, but it is also possible for a fear to develop after just one presentation. For example, a horse might only have to be bitten by a dog once to then fear dogs.
The most important thing to remember is that classical conditioning involves automatic or reflexive responses, and not voluntary behaviour.
Many undesired behaviours the horse gives us are formed from involuntary pairings that we don’t want the horse to learn; the horse is learning through association. For example, a horse has an accident in the trailer. The next time the horse is led towards the trailer he is afraid and refuses to go in. Through this experience he has become fearful (the response) of the trailer (the stimulus).
It’s important to remember that when we own or care for horses, we don’t always know their history. What many label as ‘bad behaviour’ could be a fear response to a conditioned stimulus that we may be unaware of.
Can you think of a behaviour you have seen that is a result of a conditioned response?
Second-order conditioning
Rather than pairing a neutral stimulus with something that naturally produces a response (the unconditioned stimulus), second-order conditioning extends the learning one level further by pairing a new stimulus with a stimulus that has already been conditioned to produce a response.
Here is an example –
The horse is not afraid of the whip, as he has never been hurt by one before. He is not afraid of his owner, as she has never caused him harm. One day the owner starts to train the horse using the whip, causing pain; as a result, the horse is conditioned to fear the whip. Second-order conditioning takes place when the horse begins to fear the owner, because she is the one using the whip. The horse has learnt that the owner with the whip causes pain and has therefore been conditioned to see the owner as something to fear.
How do you think second order conditioning in this example affects the relationship between horse and human?
Operant conditioning
Operant conditioning is a method of learning that occurs through rewards and punishments for behaviour. In simple terms, an association is made between a behaviour and a consequence (whether negative or positive) for that behaviour. There are four types of operant conditioning: positive reinforcement, positive punishment, negative reinforcement, and negative punishment.
Positive reinforcement (+R):
The addition of a pleasurable stimulus following a behaviour, making the behaviour more likely to occur again.
In the 1930s, the American psychologist B.F. Skinner developed the theory of operant conditioning by conducting various experiments on animals. He used a special box for his experiments on rats, which became known as the Skinner box.
The first step of the experiment was to place a hungry rat inside the box. Initially the rat barely moved, but as it adapted to the environment it began to move around. Eventually it discovered a lever that, when pressed, released food. After eating the food it explored the box again, and after a while it pressed the lever a second time; again food was immediately dispensed. This continued on the third, fourth and fifth presses, until the hungry rat would press the lever immediately upon being placed in the box. The rat had now been conditioned to press the lever to get food.
This is an example of positive reinforcement: pressing the lever added something pleasurable (food), making the rat more likely to press the lever again.
Another example: the horse ball. The horse is interested in the ball and gives it a nudge, and food comes out. He learns that moving the ball rewards him with food.
Negative reinforcement (-R):
The removal of an aversive stimulus following a behaviour, making it more likely for the behaviour to occur again.
In Skinner’s second experiment, he placed the rat inside the box and sent an electric current through the box floor. As the rat moved around the box it would accidentally knock the lever and the electric current would stop. The rat soon learned that the lever stopped the current, and when placed in the box it would immediately press the lever so as not to get a shock. This is an example of negative reinforcement: the aversive stimulus (the electric current) is removed after a particular behaviour (pressing the lever) is exhibited.
Another example: the rider applies pressure to the horse’s sides with their legs. When the horse responds by moving forward, the pressure is removed. He learns that the leg pressure stops when he walks forward.
Most traditional riding methods use negative reinforcement to train the horse, achieving the desired behaviour by removing something unpleasant.
Positive punishment (+P):
The addition of an aversive stimulus following a behaviour, making the behaviour less likely to occur again.
For example: striking the horse with a whip when he bucks. The buck is the undesired behaviour; the strike with the whip is the aversive stimulus (the punishment).
Negative punishment (-P):
The removal of something desirable following a behaviour, making it less likely for the behaviour to occur again.
Because negative punishment decreases the likelihood of the behaviour occurring again by removing a stimulus, the stimulus removed must be something pleasant or valued.
For example: a hungry horse is barging into you to get to the feed bucket (the stimulus). You take the bucket away and he gets no food (the punishment). Negative punishment is not often used when working with horses, as it is difficult to remove something the horse values while effectively linking the removal to the behaviour. Timing is critical for the horse to make the association between the behaviour and the punishment (within 1 or 2 seconds).
Both operant and classical conditioning are considered associative learning – learning the relationship between two things. In classical conditioning the horse learns a relationship between two stimuli (the conditioned stimulus and the unconditioned stimulus). In operant conditioning the horse learns a relationship between his behaviour and its consequence (such as a reinforcer).
Counter conditioning
Counter conditioning is a conditioning process used on both humans and animals. It can be used to change the response to a stimulus by associating that stimulus with one of the opposite value. For example: if the horse is fearful of dogs because he has been bitten by one, his handler could give him a treat every time he sees a dog. The treat produces a pleasurable emotional response.
Before counter conditioning
Dog = fear
Treat = feel good
During counter conditioning
Dog + treat = fear reduced
After counter conditioning
Dog = feel good
Extinction
Extinction refers to the gradual weakening of a learned response, so that the behaviour becomes less frequent or disappears completely. It occurs in both classical and operant conditioning.
In operant conditioning, when a behaviour is no longer reinforced (negatively or positively), extinction will occur. For example: if a horse has learnt that when he puts his nose in his owner’s pockets he gets a treat, and then one day the treats stop, he will eventually stop putting his nose in the pocket, as he has learnt that the behaviour no longer achieves anything.
In classical conditioning, when the conditioned stimulus is no longer paired with the unconditioned stimulus, the conditioned response will fade. For example: if the bell is no longer paired with food, the dog will eventually stop salivating when hearing the bell.
Often the behaviour will get worse before it gets better. This is called an extinction burst: the behaviour may increase in intensity as the horse tries to make it work. For example: a horse that is no longer getting treats when he puts his nose in the pocket may start to nip at the pocket. It is important that the behaviour is not reinforced at this point, as that would reward the behaviour at this increased intensity.
Extinction is a gradual process, and there are often episodes of spontaneous recovery where the behaviour reoccurs, though usually at a lower intensity.