B. F. Skinner: 4 Interesting Experiments from the Father of Operant Conditioning

B. F. Skinner Sure Did Love His Pigeons. Photo by sanjiv nayak on Unsplash

There are few names in psychology more well-known than B. F. Skinner. First-year psychology students scribble notes as their professors introduce him and his work to the class, and doctoral candidates cite his work in their dissertations as they test whether rats’ behavior can be used to predict human behavior.

Skinner is one of the most well-known psychologists of our time. Still, like many larger-than-life figures, for many he has become little more than a meme of himself, reduced to the two paragraphs of notes dedicated to him in the notebooks of those bright-eyed freshmen. “Oh, yes. The father of operant conditioning!” we say at dinner parties, hoping the topic changes before our limited knowledge becomes apparent.

But how did he become such a central figure in these Intro to Psych courses, and how did he develop the theories and methodologies cited by those sleep-deprived Ph.D. students?

B. F. Skinner’s Famous Works & Contributions to Psychology

Skinner, born in Pennsylvania in 1904, spent his life studying the way we behave and act, and how this behavior can be modified.

B. F. Skinner viewed the classical conditioning model championed by Ivan Pavlov, another mainstay of modern psychological study, as too simplistic to fully explain the complexities of human (and animal) behavior and learning, so he began looking for a better way to explain why we do what we do.

Basing his early work on Edward Thorndike’s 1898 Law of Effect, Skinner went on to expand on the idea that the prevalence of a given behavior is directly related to the consequences that follow it. His expanded model of behavioral learning, known as operant conditioning, is centered on two concepts: behaviors, the actions an organism exhibits, and operants, the environmental responses that directly follow a behavior.

These responses are often referred to as consequences, though that term is somewhat misleading, since there need not be a causal relationship between the behavior and the operant. They come in three forms. The first is reinforcers, which present the organism with a desirable stimulus and serve to increase the frequency of the behavior. On the other end of the spectrum are punishers, environmental responses that present an undesirable stimulus and serve to reduce the frequency of the behavior. Finally, there are neutral operants, which, as the name suggests, present stimuli that neither increase nor decrease the prevalence of the behavior in question.

Throughout his long and storied career, Skinner performed a number of strange experiments trying to test the limits of how punishment and reinforcement affect behavior.

4 Interesting Experiments from B. F. Skinner

Though Skinner was a professional through and through, he was also quite a quirky person… and his unique ways of thinking are readily apparent in the strange and interesting experiments he performed while researching the properties of operant conditioning.

Here are four of the most famous experiments from throughout his career:

Experiment #1: The Operant Conditioning Chamber

The Operant Conditioning Chamber, better known as the Skinner Box, is a device that B. F. Skinner used in many of his experiments. At its most basic, the Skinner Box is a chamber where a test subject, such as a rat or a pigeon, can be placed and must ‘learn’ the desired behavior through trial and error.

B. F. Skinner used this device for several different experiments. One such experiment involves placing a hungry rat into a chamber with a lever and a slot where food is dispensed when the lever is pressed. Another variation involves placing a rat into an enclosure whose floor is wired with a slight electric current. When the current is turned on, the rat must turn a wheel to shut it off.

Though this is the most basic experiment in operant conditioning research, there is an infinite number of variations that can be created based on this simple idea.
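
To make the mechanics a little more concrete, here is a minimal sketch in Python of how a reinforcement loop like the lever-pressing setup could be simulated. Everything in it is an illustrative assumption (the starting probability, the size of the increase after each reward, the class and function names); it is not a model Skinner himself used.

```python
import random

# A toy simulation of reinforcement in a Skinner-box-style lever task.
# All numbers and names are illustrative assumptions, not Skinner's data.

class SimulatedRat:
    def __init__(self):
        # The rat starts out pressing the lever only occasionally, by chance.
        self.press_probability = 0.05

    def presses_lever(self):
        """Behavior: return True if the rat presses the lever on this trial."""
        return random.random() < self.press_probability

    def receive_food(self):
        """Reinforcer: a desirable stimulus that makes the behavior more frequent."""
        self.press_probability = min(1.0, self.press_probability + 0.10)

def run_session(trials=50):
    rat = SimulatedRat()
    presses = 0
    for _ in range(trials):
        if rat.presses_lever():   # the behavior occurs...
            rat.receive_food()    # ...and food follows, reinforcing it
            presses += 1
    return presses, rat.press_probability

if __name__ == "__main__":
    presses, final_probability = run_session()
    print(f"Lever presses: {presses}, final press probability: {final_probability:.2f}")
```

Running it a few times shows the core pattern of operant conditioning: once the first accidental press is rewarded, pressing becomes steadily more likely. A punisher could be sketched the same way, simply by decreasing the probability instead of increasing it.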


Experiment #2: A Pigeon That Can Read

Building on the basic ideas from his work with the Operant Conditioning Chamber, B. F. Skinner eventually began designing more and more complex experiments.

One of these experiments involved teaching a pigeon to ‘read’ words presented to it in order to receive food. Skinner began by teaching the pigeon a simple task, namely pecking a colored disk, in order to receive a reward. He then introduced additional environmental cues (in this case, written words), each paired with a specific behavior the pigeon had to perform in order to receive the reward.

Through this evolving process, Skinner was able to teach the pigeon to ‘read’ and respond to several unique commands.

Though the pigeon couldn’t actually read English, the fact that Skinner was able to teach a bird multiple behaviors, each linked to a specific stimulus, using operant conditioning shows that this form of behavioral learning can be a powerful tool for teaching both animals and humans complex behaviors based on environmental cues.
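
As a rough, hypothetical illustration of this kind of cue-and-reward pairing, the sketch below rewards a simulated bird only when its response matches the behavior paired with the displayed word. The cue words, response names, and numbers are invented for the example and are not taken from Skinner’s actual procedure.

```python
import random

# Hypothetical cue set: each displayed word is paired with one required behavior.
CUE_TO_BEHAVIOR = {"PECK": "peck_disk", "TURN": "turn_circle"}

# Learned association strength between each cue and each possible response.
strengths = {cue: {b: 1.0 for b in CUE_TO_BEHAVIOR.values()} for cue in CUE_TO_BEHAVIOR}

def respond(cue):
    """Pick a response with probability proportional to its learned strength."""
    options = list(strengths[cue])
    weights = [strengths[cue][b] for b in options]
    return random.choices(options, weights=weights)[0]

for _ in range(500):
    cue = random.choice(list(CUE_TO_BEHAVIOR))   # a word is shown
    response = respond(cue)                      # the bird does something
    if response == CUE_TO_BEHAVIOR[cue]:
        strengths[cue][response] += 0.5          # food reinforces the correct pairing

# After training, the reinforced response dominates for each cue.
for cue in CUE_TO_BEHAVIOR:
    print(cue, "->", max(strengths[cue], key=strengths[cue].get))
```

The simulated bird isn’t reading in any meaningful sense; responses followed by food simply become more likely in the presence of the cue that preceded them.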


Experiment #3: Pigeon Ping-Pong

But Skinner wasn’t only concerned with teaching pigeons how to read. It seems he also made sure they had time to play games as well. In one of his more whimsical experiments, B. F. Skinner taught a pair of common pigeons how to play a simplified version of table tennis.

The pigeons in this experiment were placed on either side of a box and were taught to peck the ball toward the other bird’s side. If a pigeon managed to peck the ball across the table and past its opponent, it was rewarded with a small amount of food. This reward served to reinforce the behavior of pecking the ball past the opponent.

Though this may seem like a silly thing to teach a bird, the ping-pong experiment shows that operant conditioning can be used not only to train a specific, robot-like action but also to teach dynamic, goal-based behaviors.


Experiment #4: Pigeon-Guided Missiles

Thought pigeons playing ping-pong was as strange as things could get? Skinner pushed the envelope even further with his work on pigeon-guided missiles.

While this may sound like the scheme of a deluded mad scientist, B. F. Skinner really did work on training pigeons to control the flight paths of missiles for the U.S. military during the Second World War.

Skinner began by training the pigeons to peck at shapes on a screen. Once the pigeons reliably tracked these shapes, sensors could detect whether a pigeon’s beak was at the center of the screen, off to one side, or toward the top or bottom. Based on the relative position of the beak, the tracking system could steer the missile toward the target.
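
As a purely illustrative sketch of the guidance logic described above (the coordinate convention, screen size, and gain value are assumptions, not details of Skinner’s actual device), the offset of a peck from the center of the screen can be turned into a steering correction like this:

```python
def steering_correction(peck_x, peck_y, screen_width=100.0, screen_height=100.0, gain=0.1):
    """Convert a peck position on the screen into (yaw, pitch) corrections.

    A peck at the center means the target is dead ahead, so no correction
    is needed. Pecks to the left or right produce a horizontal correction;
    pecks above or below center produce a vertical one.
    """
    center_x, center_y = screen_width / 2, screen_height / 2
    yaw = gain * (peck_x - center_x)    # horizontal offset -> left/right steering
    pitch = gain * (center_y - peck_y)  # vertical offset -> up/down steering
    return yaw, pitch

# Example: a peck toward the upper-right corner of the screen
print(steering_correction(peck_x=80, peck_y=20))  # (3.0, 3.0): steer right and up
```

The wartime prototypes did this with hardware sensing rather than software, but the feedback principle is the same: the pigeon’s trained pecking keeps the target centered, and any drift produces a correcting signal.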

Though the system was never used in the field, due in part to advances in other scientific areas, it highlights the unusual applications that operant training of animal behavior can make possible.


How B. F. Skinner’s Work Continues to Impact Psychology and Beyond

B. F. Skinner is one of the most recognizable names in modern psychology, and with good reason. Though many of his experiments seem outlandish, the science behind them continues to impact us in ways we rarely think about.

The most prominent example is in the way we train animals for tasks such as search and rescue and companion services for the blind and disabled, and even in how we train our furry friends at home. But the benefits of his research go far beyond teaching Fido how to roll over.

Operant conditioning research has shaped the way schools motivate and discipline students, how prisons rehabilitate inmates, and even how governments handle geopolitical relationships.