Whereas classical conditioning depends on developing associations between events, operant conditioning involves learning from the consequences of our behavior. Skinner wasn’t the first psychologist to study learning by consequences. Indeed, Skinner's theory of operant conditioning is built on the ideas of Edward Thorndike.
Thorndike (1898) studied learning in animals (usually cats). He devised a classic experiment in which he used a puzzle box (see fig. 1) to empirically test the laws of learning.
Thorndike would place a cat in the puzzle box and time how long it took to escape; the cat was encouraged to escape in order to reach a scrap of fish placed outside the box. The cats experimented with different ways to escape the puzzle box and reach the fish.
Eventually they would stumble upon the lever that opened the cage. Once a cat had escaped, it was put back in, and the time it took to escape was again recorded. Over successive trials the cats learned that pressing the lever had favorable consequences, and they adopted this behavior, becoming increasingly quick at pressing the lever.
From these experiments, Thorndike put forward the "law of effect," which states that any behavior followed by pleasant consequences is likely to be repeated, and any behavior followed by unpleasant consequences is likely to be stopped.
Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals. Psychological Monographs: General and Applied, 2(4), i-109.
McLeod, S. A. (2007). Edward Thorndike. Retrieved from www.simplypsychology.org/edward-thorndike.html