B. F. Skinner was born on March 20, 1904. He went on to become an influential psychologist who first described the learning process known as operant conditioning. Skinner played a pivotal role in behaviorism, a school of thought that suggested that all behavior was learned through conditioning processes.
Skinner referred to himself as a radical behaviorist because he believed that psychology should focus only on the study of observable behavior.
In a survey of psychologists, B. F. Skinner was ranked first on a list of the most eminent psychologists of the 20th century.
This article takes a closer look at his life, his work, and the powerful impact he had on psychology and our understanding of how people learn.
B. F. Skinner’s Life
Born Burrhus Frederic Skinner, he grew up in a small, rural Pennsylvania town as one of two children. He enjoyed building things as a boy and started to develop a strong interest in science when he was in high school.
Despite this interest, he went on to earn a degree in English literature, graduating from Hamilton College in 1926. He originally set out to become a novelist, but soon grew disillusioned with his prospects as a writer.
After discovering the works of Ivan Pavlov and John B. Watson, two thinkers important in the discovery and advancement of behaviorism, Skinner enrolled at Harvard University to study psychology. After earning his Ph.D. in 1931, he stayed on to work at the university. It was during this period that he began developing his theory of operant conditioning.
B. F. Skinner’s Research
In order to study human behavior in a systematic, scientific manner, Skinner put his building and inventing skills to work to create tools to conduct his research. His creations included:
- The Skinner box: This was a chamber in which an animal subject could press a key in order to receive some type of reward.
- The cumulative recorder: This tool recorded the animal’s response rate on a piece of paper as a sloping line.
As Skinner used these devices in his research, he observed something interesting. Where classical conditioning held that a response depended on the stimulus that came before it, Skinner realized that behavior could also hinge on the events that come after it.
In other words, the consequences of the behavior affect how well and how quickly that behavior is learned.
After leaving Harvard to take a position at the University of Minnesota, Skinner began working on a project to support the war effort. His goal was to teach pigeons to guide missiles. The development of missile radar meant that Skinner’s project fell by the wayside, but the work fed into his continued investigation of operant conditioning.
During this time, Skinner applied his skills to developing a crib/playpen of sorts that he dubbed “the baby tender.” Meant to serve as a safer alternative to the available cribs of the time, it became the subject of psychology lore when an urban legend suggested it was used in experiments involving Skinner’s own children.
Skinner’s Operant Conditioning
Skinner’s theory of operant conditioning was based on Edward Thorndike’s law of effect.
The law of effect states that behaviors followed by desirable or enjoyable consequences are more likely to occur again, while behaviors followed by negative consequences are less likely to occur again.
Based on his experiments with animals, Skinner described two important concepts that can influence learning and behavior:
Reinforcement is anything that increases or strengthens a behavior. Reinforcements can either involve the addition of something (known as a positive reinforcer) or the removal of something (known as a negative reinforcer).
For example, giving a child a cookie for cleaning their room is an example of positive reinforcement. Canceling a test if a student does all of their homework on time is an example of negative reinforcement.
Punishment is anything that decreases or weakens a behavior. Punishments can involve adding something (positive punishment) or removing something (negative punishment).
For example, assigning a child chores for not doing their homework is an example of positive punishment. Taking away their iPad for hitting their sibling is an example of negative punishment.
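The four categories above form a simple two-by-two grid: whether the goal is to increase or decrease a behavior, crossed with whether a stimulus is added or removed. As a minimal sketch, the hypothetical function below (the names and mapping are illustrative, not from Skinner's own writings) classifies an intervention along those two dimensions:

```python
# Hypothetical sketch of the reinforcement/punishment taxonomy.
# Function and parameter names are illustrative assumptions.

def classify(goal, action):
    """Classify an intervention by its goal and its action.

    goal:   "increase" (reinforcement) or "decrease" (punishment)
    action: "add" (positive) or "remove" (negative) a stimulus
    """
    kind = "reinforcement" if goal == "increase" else "punishment"
    sign = "positive" if action == "add" else "negative"
    return f"{sign} {kind}"

# The four examples from the text:
print(classify("increase", "add"))     # giving a cookie -> positive reinforcement
print(classify("increase", "remove"))  # canceling a test -> negative reinforcement
print(classify("decrease", "add"))     # assigning chores -> positive punishment
print(classify("decrease", "remove"))  # taking the iPad -> negative punishment
```

The point of the sketch is only that "positive" and "negative" describe adding versus removing a stimulus, not whether the experience is pleasant.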
Skinner also discovered that the timing and frequency of reinforcement play a role in how behaviors are learned. Some schedules of reinforcement deliver the reinforcement every time a behavior occurs; others deliver it only after a set period of time has passed or after a set number of responses.
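The schedule names below (continuous, fixed-ratio, fixed-interval) are standard behaviorist terms; the simulation itself is a hypothetical sketch showing which responses each schedule would reinforce:

```python
# Illustrative sketch of reinforcement schedules; the helper
# functions are assumptions, not Skinner's actual apparatus logic.

def fixed_ratio(n, responses):
    """Reinforce every n-th response (n=1 is continuous reinforcement)."""
    return [i % n == 0 for i in range(1, responses + 1)]

def fixed_interval(interval, response_times):
    """Reinforce the first response after each `interval` of time elapses."""
    reinforced, next_available = [], interval
    for t in response_times:
        if t >= next_available:
            reinforced.append(True)
            next_available = t + interval
        else:
            reinforced.append(False)
    return reinforced

# Continuous reinforcement: every response is rewarded.
print(fixed_ratio(1, 4))   # [True, True, True, True]
# Fixed ratio 3: every third response is rewarded.
print(fixed_ratio(3, 6))   # [False, False, True, False, False, True]
# Fixed interval 10: only the first response after each 10-unit wait is rewarded.
print(fixed_interval(10, [3, 11, 12, 25]))  # [False, True, False, True]
```

Comparing the output lists makes the difference concrete: ratio schedules count responses, while interval schedules count elapsed time.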
What Impact Did B. F. Skinner Have?
Skinner’s work made him one of the most influential figures in the field of psychology. His ideas had a powerful impact within psychology as well as in other fields including therapy and education.
He was diagnosed with leukemia in 1989. In 1990, the American Psychological Association (APA) honored him with its Lifetime Achievement Award. He died on August 18, 1990, eight days after accepting the award.
During his storied career in psychology, he published more than 20 books and close to 200 articles. For much of the 20th century, he was one of the figures most associated with the field of psychology, and his work continues to have an impact today.
Behaviorism is no longer the force it once was, but mental health professionals, therapists, parents, and many others continue to use the principles of operant conditioning to teach and change behaviors in order to help people learn, adapt, and grow.
B. F. Skinner was an advocate for behaviorism and believed that psychology should be the science of observable behavior. His work contributed to our understanding of operant conditioning and how reinforcement and punishment can be used to teach and modify behaviors.
Sources
Bjork DW. B.F. Skinner: A Life. Washington, D.C.: American Psychological Association; 1997.
Haggbloom SJ. The 100 most eminent psychologists of the twentieth century. PsycEXTRA Dataset. 2001. doi:10.1037/e413802005-787
The B.F. Skinner Foundation. Biographical Information.
Skinner BF. The Behavior of Organisms: An Experimental Analysis. New York: Appleton-Century; 1938.