Fixed-Interval Schedule and Operant Conditioning


In operant conditioning, a fixed-interval schedule is a schedule of reinforcement in which the first response is rewarded only after a specified amount of time has elapsed. This schedule produces a high rate of responding near the end of the interval but a much slower rate immediately after the reinforcer is delivered.

As you may remember, operant conditioning relies on either reinforcement or punishment to strengthen or weaken a response. This process of learning involves forming an association between a behavior and the consequences of that behavior.

Behaviors that are followed by desirable outcomes become stronger and therefore more likely to occur again in the future. Actions that are followed by unfavorable outcomes become less likely to occur again in the future. 

It was the noted psychologist B.F. Skinner who first described this operant conditioning process. By reinforcing actions, he observed, those actions became stronger. By punishing behaviors, those actions became weaker. In addition to this basic process, he noted that the schedule on which behaviors were reinforced or punished also played a role in how quickly a response was acquired and in the strength of that response.

How Does a Fixed-Interval Schedule Work?

To better understand how a fixed-interval schedule works, let's begin by taking a closer look at the term itself. A schedule refers to the rate at which reinforcement is delivered, or how frequently a response is reinforced. An interval refers to a period of time, which means that the rate of delivery depends on how much time has elapsed. Finally, fixed means that the timing of delivery follows a predictable, unchanging schedule.

For example, imagine that you are training a pigeon to peck at a key. You put the animal on a fixed-interval 30 schedule (FI-30), which means that the first key peck after each 30-second interval has elapsed earns a food pellet. The pigeon can continue to peck the key during the interval but will only receive reinforcement for the first peck after the fixed 30-second interval is up.
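For readers who think in code, the FI rule above can be sketched as a short simulation. This is a minimal illustration, not part of any standard library: the function name, the session starting at time zero, and the example peck times are all assumptions made for the sketch. Given a list of response times, it returns which responses would be reinforced under an FI schedule, where only the first response after the interval elapses is rewarded and the clock restarts at each reinforcement.

```python
from typing import List

def fi_reinforced(response_times: List[float], interval: float = 30.0) -> List[float]:
    """Return the response times that earn reinforcement under a
    fixed-interval (FI) schedule: only the first response made after
    `interval` seconds have elapsed since the last reinforcer is rewarded."""
    reinforced = []
    last_reinforcement = 0.0  # assume the interval timer starts at session start
    for t in sorted(response_times):
        if t - last_reinforcement >= interval:
            reinforced.append(t)       # first response after the interval: rewarded
            last_reinforcement = t     # the interval clock restarts here
    return reinforced

# Hypothetical pecks at 10 s, 25 s, 31 s, 40 s, 59 s, and 62 s under FI-30:
print(fi_reinforced([10, 25, 31, 40, 59, 62]))  # [31, 62]
```

Note that the pecks at 10 s, 25 s, 40 s, and 59 s earn nothing, which is exactly why responding tends to cluster near the end of each interval.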


There are a few characteristics of the fixed-interval schedule that make it distinctive. Some of these can be seen as benefits, while some might be considered drawbacks.

  • Results in a fairly significant post-reinforcement pause in responding
  • Responses tend to increase gradually as the reinforcement time draws closer

The big problem with this type of schedule is that the behavior tends to occur only right before the reinforcement is delivered. If a student knows that there will be an exam every Friday, he might only begin studying on Thursday night. If a child knows she gets her allowance on Sunday as long as her bedroom is clean, she probably won't clean up her room until Saturday night.

The response rate in a fixed-interval reinforcement schedule is fairly predictable; it increases as the reinforcement time arrives and then drops off precipitously immediately after reinforcement.


It can be helpful to look at a few different examples of the fixed-interval schedule in order to better understand how this reinforcement schedule works and what impact it might have on behavior.

Fixed-Interval Schedules in a Lab Setting

  • Imagine that you are training a rat to press a lever, but you only reinforce the first response after a ten-minute interval. The rat does not press the bar much during the first five minutes after reinforcement but begins to press the lever more and more often the closer you get to the ten-minute mark.

Fixed-Interval Schedules in the Real World

  • A weekly paycheck is a good example of a fixed-interval schedule. The employee receives reinforcement every seven days, which may result in a higher response rate as payday approaches.
  • Dental exams also take place on a fixed-interval schedule. People who go in for their regular six-month checkup and cleaning often take extra care to clean their teeth right before the exam, yet they may not be as diligent on a day-to-day basis during the six months prior to the exam.

A Word From Verywell

Fixed-interval schedules can be an important tool when teaching new behaviors. Sometimes these schedules occur naturally, while other times they are artificially created and controlled by rewards systems. If you are planning to utilize some sort of reinforcement schedule to teach a behavior, it is important to consider how the fixed-interval schedule might influence the speed of learning as well as the rate of response.

3 Sources
Verywell Mind uses only high-quality sources, including peer-reviewed studies, to support the facts within our articles. Read our editorial process to learn more about how we fact-check and keep our content accurate, reliable, and trustworthy.
  1. Overskeid G. Do we need the environment to explain operant behavior? Front Psychol. 2018;9:373. doi:10.3389/fpsyg.2018.00373

  2. Sproatt D, Navab A. Operant conditioning. In: Volkmar FR, ed. Encyclopedia of Autism Spectrum Disorders. New York, NY: Springer; 2013.

  3. Watson ST, Griffes C. Fixed interval schedule. In: Goldstein S, Naglieri JA, eds. Encyclopedia of Child Behavior and Development. Boston, MA: Springer; 2011.

By Kendra Cherry
Kendra Cherry, MS, is the author of the "Everything Psychology Book (2nd Edition)" and has written thousands of articles on diverse psychology topics. Kendra holds a Master of Science degree in education from Boise State University with a primary research interest in educational psychology and a Bachelor of Science in psychology from Idaho State University with additional coursework in substance use and case management.