Have you ever gone to a circus or other event and watched animals perform amazing tricks? Do you ever wonder exactly how the trainers got these animals to do things they would never do on their own? The key to getting these creatures to engage in such behaviors is a behavioral technique known as shaping.
The concept was introduced by psychologist B. F. Skinner as part of his theory of operant conditioning. In behavioral psychology, shaping refers to establishing an operant behavior through a series of successive approximations toward the desired response: closer and closer responses are reinforced until the desired response itself is achieved.
How Does Shaping Work?
So how exactly does this shaping process work in psychology? It can be helpful to look at how Skinner himself described the process. In 1953, he wrote:
“By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. … The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.”
Shaping can be a powerful tool when teaching a new behavior. As the learner exhibits behaviors that are increasingly close to the desired behavior, the teacher progressively reinforces each successively closer approximation. Initially, this might involve reinforcing any behavior that comes even remotely close to the desired response. The learner’s behavior moves closer and closer to the desired outcome until the correct response is achieved and then further reinforced.
Imagine, for example, that you want to use shaping to teach your dog to fetch a stick. You might begin by rewarding your dog if he even walks near the stick. Next, you might reward him for picking the stick up in his mouth. Later approximations might include carrying the stick toward you, giving you the stick, and eventually chasing after the stick when you throw it and bringing it back. By rewarding each step, you shape the behavior toward the desired outcome.
When to Use Shaping to Teach Behavior
So when might someone choose to use shaping to teach a behavior? It is particularly useful when a person or animal does not produce the desired response on their own. Teaching a rat to press a lever inside a Skinner box is a good example. When first placed in the training box, the rat probably will not press the lever at all. To get the rat to engage in the desired action, the experimenter might begin by giving the animal a food pellet whenever it walks near the lever.
Once that action becomes more frequent, the experimenter might require the animal to get closer to the lever before providing reinforcement. Eventually, the rat will actually have to touch the lever in order to obtain the food. At some point, the rat might accidentally press the lever, which will lead to the release of food.
Now that the desired response has occurred, the experimenter can deliver reinforcement only when the animal deliberately presses the lever, and can then vary the schedule of reinforcement to influence the rate of response.
Examples of Shaping in Action
“Molly would like to teach her 1-year-old daughter Sally to call her mommy. Sally currently does not say mommy, but she does say ‘ma’ occasionally. Molly begins to pay extra attention (e.g., says ‘Great job, Sally’ and smiles) when Sally says ‘ma.’ Soon Sally says ‘ma’ often. Molly decides it is now time for Sally to say something more similar to mommy, so she no longer praises her for simply saying ‘ma.’ Instead, Molly reinforces Sally for saying ‘ma’ twice in a row, which sounds like ‘moma.’ This process continues until moma is shaped into mommy.” (Hatfield & Wallace, 2004)
“Using shaping techniques, the United States Coast Guard successfully trained pigeons to find people lost at sea who are wearing bright orange life jackets. Pigeons have much better eyesight than humans. They are first trained in the laboratory to search for an orange disk and then peck a button with their beaks. After training, the pigeons are taken on rescue missions to search for orange vests in the water. While helicopter pilots notice bobbing orange vests in the water only 35 percent of the time, pigeons’ success rate is closer to 90 percent.” (Franzoi, 2011)
How to Use Shaping
Shaping is an important tool for clinicians, teachers, and parents. According to Martin and Pear (1999), there are four key considerations when using shaping:
- It is essential to specify the final behavior. By having a specific behavioral target, the “trainer” will be better able to apply reinforcement consistently.
- Select a starting behavior. The trainer must choose a starting behavior that is likely to occur in order to receive reinforcement.
- Establish shaping steps. Before shaping begins, the trainer should spend some time considering the likely behaviors that may occur between the starting behavior and the target behavior. By deciding which approximations should be reinforced, the trainer will be more likely to deliver consistent reinforcement.
- Move at the correct pace. If the individual fails to show progress, the steps may be too large; try simpler ones. If progress is moving too rapidly, try raising the criteria for reinforcement.
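The four considerations above can be sketched as a toy simulation. This is a hypothetical illustration, not an algorithm from Martin and Pear: the learner’s behavior is reduced to a single number, and the `shape` function (a name chosen here for illustration) tightens or relaxes the reinforcement criterion depending on the learner’s recent success rate.

```python
import random

def shape(target, start, noise=1.0, seed=0, max_blocks=500):
    """Toy simulation of shaping by successive approximation.

    A learner emits a one-dimensional "behavior" value; any response
    within the current criterion of the target is reinforced, which
    shifts the learner's typical behavior toward the reinforced
    response. All numbers here are illustrative assumptions.
    """
    rng = random.Random(seed)
    typical = start                           # the learner's current typical behavior
    criterion = abs(target - start) + noise   # at first, almost anything counts
    for _ in range(max_blocks):
        hits = 0
        for _ in range(20):                   # one block of trials at this criterion
            response = rng.gauss(typical, noise)
            if abs(response - target) < criterion:
                hits += 1
                # Reinforcement makes the reinforced response more typical.
                typical += 0.5 * (response - typical)
        if criterion <= noise and hits >= 10:
            break                             # final behavior achieved and still reinforced
        if hits >= 10:
            criterion *= 0.7                  # good progress: require a closer approximation
        else:
            criterion /= 0.7                  # learner is stuck: go back to a simpler step
    return typical

print(round(shape(target=10.0, start=0.0), 1))
```

The pacing rule is the interesting part: the criterion tightens only while the learner keeps earning reinforcement, and relaxes back to a simpler step when reinforcement becomes too rare, mirroring the advice to move at the correct pace.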
References
DeLamater, J., & Myers, D. (2011). Social psychology. Belmont, CA: Wadsworth Cengage Learning.
Franzoi, S. L. (2011). Psychology: A discovery experience. Mason, OH: South-Western Cengage Learning.
Hatfield, D. B., & Wallace, D. C. (2004). Successive approximation (shaping). In W. E. Craighead & C. B. Nemeroff (Eds.), The concise Corsini encyclopedia of psychology and behavioral science. Hoboken, NJ: John Wiley & Sons.
Martin, G., & Pear, J. (1999). Behavior modification: What it is and how to do it (6th ed.). Englewood Cliffs, NJ: Prentice Hall.
Skinner, B. F. (1953). Science and human behavior (pp. 92–93). Oxford, England: Macmillan.
Weiten, W. (2010). Psychology: Themes and variations. Belmont, CA: Wadsworth.