
Applied Behavior Analysis

Your source for materials and information supporting a "Functional Approach" to behavior.

Glossary & Abbreviations

Encyclopedic Glossary of Terms and Abbreviations in the Technology and Principles of Behavior


Please cite any use of definitions in this Glossary as follows:

O'Heare, J. (2011). Encyclopedic glossary of terms and abbreviations in the technology and principles of behavior. Retrieved March 20, 2012, from http://www.associationofanimalbehaviorprofessionals.com/glossary.html.

Encyclopedic Glossary of Terms


Definitions, and any additions or alterations of definitions, that differ from the aforementioned citation will be colored the same as this text. This page is a work in progress.

ABC. Antecedent, Behavior, Consequence. See Three-Term Contingency.

Abolishing Operations (AOs). Abolishing operations temporarily decrease the effectiveness of consequences. See Motivating Operations and Function Altering Stimulus.

Alternating Treatment Design. The alternating treatment design in the single subject experiment is characterized by rapidly alternating at least two distinct treatments (independent variables) and observing their effect on a single behavior (dependent variable) (Cooper et al., 1987). Rather than waiting for responding to stabilize under each condition, as in the reversal design, the alternating treatment design alternates interventions right from the start.
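
To make the comparison concrete, here is a minimal Python sketch (the treatments, session order and rates are invented for illustration, not drawn from any study) that summarizes rapidly alternated sessions by treatment:

```python
# Illustrative sketch only: summarizing data from a hypothetical alternating treatment
# design in which two treatments, "A" and "B", are rapidly alternated across sessions
# and the rate of a single behavior is recorded in each session.
from statistics import mean

# Hypothetical session data: (treatment, responses per minute)
sessions = [("A", 12.0), ("B", 7.5), ("A", 11.0), ("B", 6.0), ("A", 13.5), ("B", 5.5)]

def summarize(sessions):
    """Group session measures by treatment and report the mean rate for each."""
    by_treatment = {}
    for treatment, rate in sessions:
        by_treatment.setdefault(treatment, []).append(rate)
    return {t: round(mean(rates), 2) for t, rates in by_treatment.items()}

print(summarize(sessions))  # {'A': 12.17, 'B': 6.33} -> treatment effects compared directly
```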

Antecedent Stimulus. A stimulus present prior to the behavior in question. Many stimuli may be present in the environment prior to the behavior, but not all of them will have functional control over it. Once an antecedent stimulus is confirmed to have functional control over a behavior, it is called a discriminative stimulus (SD). See Discriminative Stimulus for details.

Anxiety. General term referring to emotional arousal. Miltenberger (2004): "A term used to describe respondent behavior involving the activation of the autonomic nervous system (including rapid heart rate, shallow rapid breathing, and increased muscle tension)." Often used more specifically to refer to an anticipatory foreboding: the awareness-related behaviors, or private experience, of certain emotional responses. Certain conditioned stimuli elicit emotional responses because they have been associated with the aversive stimuli that follow them, which we might term anticipation. This "worrying" can be thought of as anxiety.

Applied Behavior Analysis. "Behavior analysis is a science concerned with the behavior of people, what people do and say, and the behavior of animals. It attempts to understand, explain, describe and predict behavior." (Source) "The use of behavior principles and methods to solve practical problems." (Source)

Autoshaping. A respondent conditioning procedure that produces skeletal muscle responses, more typical of operant behavior. “For example, a key light is turned on a few seconds before grain is presented to a pigeon. After several pairings of key light and grain, the bird begins to peck the key” (Pierce & Cheney, 2004, p. 420).

Aversive Stimulus (SAVE). A stimulus that an organism acts to escape. Aversive stimulation can result in some problematic secondary effects, including escape behavior (which may be manifest in aggression, elopement, etc.), response depression / learned helplessness, and problematic respondent conditioning such that environmental stimuli come to elicit similarly unpleasant emotional arousal. Aversive stimulation involves fear- or pain-eliciting stimuli and is the opposite of appetitive stimulation, which is pleasure eliciting. An averser is sometimes equated with a punisher. (Fraley, General Behaviorology: The Natural Science of Human Behavior, 2008) See Appetitive Stimulation.

Avoidance. Organisms can be expected to attempt to escape aversive stimuli. Via aversive conditioning, animals can learn to anticipate aversive stimulation; that is, stimuli that precede it come to elicit aversive experiences and evoke escape behaviors themselves. When this occurs, we call it avoidance. Note, though, that avoidance is really always escape: the organism avoids the original aversive stimulus only because some preceding stimulus has become associated with it and has taken on stimulus control over the escape behavior. See Negative Reinforcement.

Backward Chaining. "A method used to train a chained performance. The basic idea is to first train behavior that is closest to primary reinforcement; once responding is established, links in the chain that are farther and farther from primary reinforcement are added. Each link in the chain is reinforced by the SD (which is also a conditioned reinforcer) that signals the next component in the sequence." (Source) "A sequence of responses in which each response produces a stimulus change that functions as a conditioned reinforcement for that response and as a discriminative stimulus for the next response in the chain; reinforcement for the last response in a chain maintains the reinforcing effectiveness of the stimulus changes produced by all previous responses in the chain." (Cooper, Heron and Heward, 2007, p. 690) Notice the added hypothesis that the SD also serves as a conditioned reinforcer. So, you have a series of behaviors in a chain. Completion of each behavior serves a dual function: it serves as a conditioned reinforcer for the behavior the learner just performed, and it acts as the discriminative stimulus (SD) for the next behavior in the chain. The opportunity to perform the next behavior in the chain reinforces the current behavior, and this occurs for each link in the chain until the final or terminal behavior, which produces the primary reinforcer from the trainer. This final reinforcement maintains the chain and the conditioned reinforcers that make it up. There are no in-chain or interjected cues from the trainer in a behavior chain. See Sequencing.
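
As an illustration only (the chain and link names are hypothetical), the following sketch prints the order in which links would be introduced under backward chaining, starting with the link closest to primary reinforcement:

```python
# Hypothetical sketch of the training order implied by backward chaining: links are
# introduced starting with the one closest to primary reinforcement, then progressively
# earlier links are added in front of the already-trained segment.
chain = ["pick up dumbbell", "carry dumbbell", "sit in front", "release to hand"]

def backward_training_order(chain):
    """Yield progressively longer terminal segments of the chain, last link first."""
    for start in range(len(chain) - 1, -1, -1):
        yield chain[start:]

for step, links in enumerate(backward_training_order(chain), 1):
    print(f"Training step {step}: {' -> '.join(links)}")
```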

Baseline. The strength of a behavior (measured via latency, duration, rate of responding, relative frequency or intensity) prior to a behavior change procedure or some other intervention. The strength of the behavior during and after intervention is compared with the baseline in order to objectively identify changes. Differences in level and trend are used to determine whether the intervention was responsible for the change and whether the intervention can be considered successful. "The phase of an experiment or intervention in which the behavior is measured in the absence of an intervention." (Source)
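
A minimal sketch, with made-up session counts, of comparing baseline and intervention phases by level (mean); in practice, trend and variability would also be inspected:

```python
# Illustrative comparison of a baseline phase with an intervention phase by level.
from statistics import mean

baseline_rates = [9, 10, 8, 11, 10]    # hypothetical responses per session before intervention
intervention_rates = [6, 5, 4, 4, 3]   # hypothetical responses per session during intervention

change_in_level = mean(intervention_rates) - mean(baseline_rates)
print(f"Baseline mean: {mean(baseline_rates):.1f}, intervention mean: {mean(intervention_rates):.1f}")
print(f"Change in level: {change_in_level:+.1f} responses per session")
```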

Behavior. See also Response. Behavior is anything that an organism does that can be measured. The term behavior is a general term, whereas response refers to a particular instance of a behavior. Behavior can include operants or respondents. Covert verbal behaviors (also known as thinking) are behaviors, as are the release of chemicals into the bloodstream by glands and neural awareness behaviors in the nervous system.

Behavior Chain. "A sequence of related behaviors, each of which provides the cue for the next, and the last of which produces a reinforcer." (Source) See Backward Chaining.

Behavior Change Procedure. The procedural description of how a change in behavior is to be produced as a function of a contrived change in the environment (Fraley, 2008). Antecedents and consequences are manipulated in order to change the behavior.

Behavior Change Project / Program / Plan. Systematic and comprehensive plan involving behavior change procedures for changing specific behaviors of an individual. Usually includes both antecedent control procedures and manipulation of consequences or respondent associations.

Behaviorism. Originally a branch of psychology, behaviorism is the natural science of behavior and quite distinct from psychology. It has philosophical and theoretical foundations emphasizing natural science assumptions and avoids speculation and theoretical constructs for explaining behavior. Behaviorism has two main branches: the experimental analysis of behavior, which identifies basic principles of behavior, and applied behavior analysis, which applies basic principles of behavior to changing problem behaviors in real-life settings.

Behavior Maintenance. How long a behavior persists after the original contingencies are discontinued. Often refers to the stable performance of behavior after the acquisition stage. Not to be confused with the maintenance stage in a behavior change program. See Steady-State Responding.

Behavioral Economics. The application of economic concepts such as demand, price and elasticity to the analysis of behavior; in particular, the study of how consumption of a reinforcer changes as the cost of obtaining it (for example, the response requirement) increases.

Behavioral Momentum. The tendency of behavior to persist in the face of disruption (such as extinction, distraction or satiation) as a function of its reinforcement history; behavior maintained by richer reinforcement tends to be more resistant to change, by analogy with the momentum of a moving object.

Capturing. Reinforcing a behavior when it occurs without contrived trainer prompting or cueing. In free-shaping, the behavior is captured when it occurs.

Chaining. See Behavior Chain.

Coercion. The “use of punishment and the threat of punishment to get others to act as we would like, and to our practice of rewarding people just by letting them escape from our punishments and threats” (Sidman, 2001, p. 1). See Aversive Stimulus.

Competing Behavior Model. The competing behavior model emphasizes replacing problematic behaviors with more acceptable behaviors. This is consistent with the constructional approach. It usually involves diagramming the ABCs of a problem, including identification of replacement behaviors. See Constructional Approach.

Compound Stimuli. Two conditioned stimuli presented together in respondent conditioning, such that both come to elicit the same conditioned response.

Conditioned Aversive Stimulus (CSAVE). An aversive stimulus that acquires its aversive effect through conditioning, as opposed to an unconditioned aversive stimulus, such as being shocked, struck or otherwise hurt, offended or traumatized. Often, trainers and the aversive tools they use can become conditioned aversive stimuli. Cues that predict aversive stimulation may also become conditioned aversive stimuli. See also Aversive Stimulus.

Conditioned Inhibition. “In respondent conditioning, when a [conditioned stimulus] is presented repeatedly without the [unconditioned stimulus] (extinction), the conditioned stimulus is said to acquire increasing amounts of inhibition, in the sense that its presentation suppresses the response.” (Pierce & Cheney, 2004)

Conditioned Reinforcer. A previously neutral stimulus that has been paired with an unconditioned reinforcer and has acquired effectiveness to increase the frequency of an operant. Generally used in the context of positive reinforcement rather than negative reinforcement. See also Unconditioned Reinforcer.

Conditioned Response (CR). Response elicited by a conditioned stimulus. Often, but not always, similar to the unconditioned response. For example, a click comes to elicit a similar response to the food it has been associated with. There are instances, however, in which the CR is quite dissimilar to the unconditioned response.

Conditioned Stimulus (CS). A previously neutral stimulus that has been paired with an unconditioned stimulus and now elicits reflexive behavior.

Conditioned Suppression. A conditioned stimulus is paired with an aversive unconditioned stimulus. Once it becomes a conditioned aversive stimulus (CSAVE), its presentation will suppress ongoing operant behavior.

Conditioning. Any change in behavior due to experience. Same as learning.

Consequence. That part of the postcedent environment that is functionally related to the behavior in question. The consequence will be reinforcing or punishing. Postcedent stimuli are referred to as a consequence once they have been confirmed as functionally related to the behavior of concern. See Postcedent.

Constructional Approach. As opposed to the eliminative approach. In 1974, Israel Goldiamond proposed and outlined a basic strategic approach to changing behavior that provided a paradigm shift from the popular eliminative approaches practiced at the time. In the eliminative approach, behavior is commonly thought of as abnormal, pathological and excessive. The focus of “treatment” is on decreasing the excessive behavior (via extinction or punishment, for example). Goldiamond (1974; 2002) and Delprato (1981) agree that a view of behavior as pathological or abnormal fosters unnecessary acceptance of the eliminative behavioral methods of behavior change. In the constructional approach, rather than reducing an organism’s repertoire of behaviors, the trainer increases it. In the eliminative approach, an organism is shown what not to do, whereas in the constructional approach, an organism is shown what else to do. Contrast with Eliminative Approach. See my blog post on this topic and its implications.

Context for Conditioning. The ontogenetic and phylogenetic history and current anatomic and physiologic condition of the organism, as well as the environmental conditions present when a given learning process is occurring. The influence of history and environment on conditioning. Constraints and influences on conditioning.

Contingency. A description of the functional relationship between behavior and the environment. It includes the behavior itself as well as the antecedent that evokes it and the consequence that influences the behavior’s strength. As a criterion for effective reinforcement or punishment it often refers to a relationship between an operant class and a consequence, in which the consequence occurs if, and only if, the operant occurs. In respondent conditioning it refers to a positive correlation between the conditioned stimulus and the unconditioned stimulus. See Functional Relationship.

Contingency Analysis. The analysis of a particular situation in order to identify the variables in the contingency or contingencies that constitute the situation. This includes the ABCs for each contingency involved.

Contingency Statement. A concise statement of a particular behavior problem, identifying the behavior in question as well as its antecedents and maintaining consequences. It is constructed based on a functional assessment and typically supplies the hypothesized function of the behavior. See Functional Assessment and Contingency Analysis.

Continuous Reinforcement (CRF). A schedule of reinforcement in which every response results in reinforcement. This is equivalent to a fixed ratio 1 (FR 1) schedule. See Fixed Ratio.

Controlling Stimulus. A stimulus that changes the likelihood of an operant across subsequent occasions. An SD (discriminative stimulus) makes the operant more likely and an S∆ (extinction stimulus) makes it less likely. SAVE (conditioned aversive stimulus) can increase or decrease the likelihood, depending on the particular contingency in operation.

Coprophagia. Eating of feces. Also see Pica.

Counterconditioning. A respondent conditioning process in which the learner's previous conditioned response to a conditioned stimulus is changed. Counterconditioning is used to change a conditioned emotional response from fearful to joyful, or anxiety to relaxation. It may play a role in systematic desensitization procedures. A term that has been used in place of counterconditioning is reciprocal inhibition. This term was presented to describe a situation in which a relaxed response was created in the presence of an anxiety-eliciting stimulus at a low level of intensity; the relaxation inhibits the anxiety response. See also Systematic Desensitization.

Countercontrol. Operant behavior that functions to oppose aversive stimulation. When an individual is coerced, they will behave in order to work around or against this contingency in order to maintain access to reinforcement. Often misinterpreted as “dominance.”

Cycle of Reciprocal Countercontrol. Term coined by O’Heare (2007). Here is how the cycle of countercontrol works: The guardian finds some particular dog behavior irritating. The guardian's behavior (usually punitive countercontrol, such as “correcting” the dog with leash pops, hitting or yelling) is negatively reinforced as a quick-fix tactic, which then produces an irritation for the dog, who in turn resorts to countercontrol. This is also negatively reinforced in many cases, and the cycle of countercontrol continues. All the while, fallout from the lose–lose encounters compounds to degrade the relationship and produce further problematic behaviors. See also Countercontrol.

Delay Conditioning. A respondent conditioning procedure in which the conditioned stimulus is presented prior to the unconditioned stimulus and remains on until the US is presented, so that the two stimuli overlap.

Dependent Variable. In experimentation, the dependent variable is the variable that is measured. The experimenter controls for variables other than an independent variable. The independent variable is the only variable changed between subjects, or with a single subject through time. The dependent variable is measured in order to determine if the independent variable affected it.

Deprivation. An establishing operation in which the reinforcer is withheld for a period of time in order to temporarily increase its effectiveness. Contrast with satiation, an abolishing operation in which ongoing access to the stimulus temporarily decreases its effectiveness. See Motivating Operations.

Diagnosis (Dx). A term used in the medical-model approach to behavior cases in which the “patient” is labeled with a “disorder.” This is as opposed to a behavioral approach, in which a functional assessment is carried out and a contingency statement is hypothesized. Diagnoses may act as weak descriptions but in no way explain behavior. Only a contingency statement of the functional relationship between behavior and the environment can explain behavior. See Contingency Statement.

Differential Reinforcement (DR). A procedure in which a target behavior is reinforced while another target behavior is extinguished. It can also refer to targeting a specific behavior for reinforcement in the presence of a particular discriminative stimulus, and targeting that same behavior for extinction in the presence of a different discriminative stimulus. This would be discrimination training.  See Positive Reinforcement and Extinction.

Differential Reinforcement of Alternative Behaviors (DRA). A differential reinforcement procedure in which a specific target behavior that is not necessarily incompatible with the undesirable behavior is reinforced while the undesirable target behavior is extinguished.

Differential Reinforcement of Communication (DRC). A differential reinforcement procedure in which reinforcement is delivered only if a targeted communicative response is emitted (e.g., “more” as a mand for additional tangible items).

Differential Reinforcement of High Rate (DRH). A differential reinforcement procedure in which a behavior is reinforced only if it is performed at least a specific number of times in a given time frame.

Differential Reinforcement of Incompatible Behaviors (DRI). A differential reinforcement procedure in which a target behavior that is incompatible with (mutually exclusive of) the undesirable behavior is reinforced while the undesirable target behavior is extinguished.

Differential Reinforcement of Low Rate (DRL). A differential reinforcement procedure in which a behavior is reinforced only if it occurs no more than a specific number of times in a given time frame.

Differential Reinforcement of Other Behaviors (DRO); aka Differential Reinforcement of Zero Responding (DR0). A differential reinforcement procedure wherein reinforcement is delivered contingent on the absence of the target behavior within a specified period of time. Stated another way, any behavior other than the undesirable behavior is reinforced while the undesirable target behavior is extinguished.
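
The sketch below illustrates one simple, whole-interval variant of the DRO rule described above (the interval length and behavior times are invented for illustration); other DRO arrangements, such as resetting the timer whenever the behavior occurs, are also used:

```python
# Illustrative DRO rule: at the end of each interval, reinforcement is delivered only
# if the target behavior did not occur at any point during that interval.
def dro_reinforcement_intervals(behavior_times, session_length, interval=10.0):
    """For consecutive intervals, return (interval_end, reinforce?) decisions."""
    decisions = []
    start = 0.0
    while start + interval <= session_length:
        end = start + interval
        occurred = any(start <= t < end for t in behavior_times)
        decisions.append((end, not occurred))  # reinforcer delivered only if behavior was absent
        start = end
    return decisions

# Hypothetical session: target behavior occurs at 3 s and 27 s during a 40 s session
print(dro_reinforcement_intervals([3.0, 27.0], session_length=40.0))
# -> [(10.0, False), (20.0, True), (30.0, False), (40.0, True)]
```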

Differential Reinforcement of Successive Approximations of a Target Behavior (aka shaping or behavior shaping). A differential reinforcement procedure in which successive approximations to a target behavior are reinforced incrementally while other behaviors are extinguished in order to achieve the target behavior. See Shaping.

Dimensional Approach. The dimensional approach to understanding behavior assumes that typical behavior is variable (presenting at various levels along several definable dimensions). In the dimensional system, various distinct dimensions or characteristics would be identified theoretically and empirically, as would psychological instruments to measure them. Each would have a scale from, say, 1 to 100 and tests for each characteristic. A person or dog would then be tested with reliable and valid tests, and their points would be graphically presented on the scales. The scales and tests would be arranged such that adaptive scores would be found close to the 50-point mark. The further a point lies from this mark, the greater the indication of a problem with that particular characteristic. In this way, the diagnosis (psychological profile) would identify the actual problem traits.

Direct Observation. Part of a functional assessment, direct observation involves observing and measuring a particular behavior in order to establish its operant level (baseline), and to produce an accurate contingency statement. See Functional Assessment.

Discrimination. "Discrimination is the process of behaving, or coming to behave, differently in the respective presence of different stimuli (instead of behaving in the same way in the respective presence of different stimuli)." (Fraley, 2008) Refers to an organism responding differently to two or more different stimuli. The organism discriminates between the stimuli.

Discrimination Training. Promoting discrimination in a training context by establishing a very specific discriminative stimulus and countering generalization to similar stimuli: the behavior is differentially reinforced when performed after the specific discriminative stimulus, while responding evoked by similar stimuli is targeted for extinction.

Discriminative Stimulus (SD). An immediate antecedent stimulus that evokes a behavior as part of a contingency. Saying “sit” evokes sitting behavior if the stimulus has a history of reinforcement associated with it.

Distance-Decreasing Behaviors. Behaviors that function to decrease distance between an organism and another social being. Approach behaviors.

Distance-Increasing Behaviors. Behaviors that function to increase distance between an organism and another social being. Escape behaviors.

Distant Antecedents. Older term referring to stimuli other than the discriminative stimulus that come before the behavior and influence it. See Function Altering Stimulus and Setting Events.

Ecological Factors. Setting events and other function altering stimuli are sometimes referred to as ecological factors. They are aspects of the environment that contribute indirectly to the contingencies. See Setting Events, Antecedent and Function Altering Stimulus.

Elicited. Respondents are elicited. They are caused by the presentation of a stimulus. Respondents are never evoked or emitted; they are elicited.

Eliminative Approach. In the eliminative approach, behavior is commonly thought of as abnormal, pathological and excessive. The focus of “treatment” is on decreasing the excessive behavior (via extinction or punishment, for example). Goldiamond (2002) and Delprato (1981) agree that a view of behavior as pathological or abnormal fosters unnecessary acceptance of the eliminative behavioral methods of behavior change. The alternative proposed by Goldiamond was a constructional approach. Rather than reducing the organism’s repertoire of behaviors, the trainer increases it. In the eliminative approach, the organism is shown what not to do, whereas, in the constructional approach, the organism is shown what to do. Contrast with Constructional Approach.

Emitted. Operant behavior is emitted. Its likelihood is controlled by the discriminative stimulus that evokes it and its associated history of reinforcement in the presence of the discriminative stimulus. Operant behaviors are never elicited.

Emotional Behavior. Physiological behaviors including the release of hormones into the bloodstream by glands and the neural behaviors occurring in the nervous system as well as the aftereffect experience of the learner regarding those behaviors (feelings). The behaviors are the respondents involved, including the neurophysiology and the awareness behaviors of their effects. Emotional behaviors are respondently conditioned and changed only via respondent conditioning. They may act as motivating operations or antecedent conditions for operants.

Environment. All stimuli and conditions that may influence the behavior of an organism, including some internal environments such as hormonal conditions, thinking and the experience of pain. "The natural domain defined by the existence of theoretically measurable independent variables in behavior-controlling relations. The environment occurs on both sides of the skin of the behaving organism. The concept of behavior-controlling environment excludes all non-natural events." (Fraley, 2008) See Natural Event.

Escape Behavior. Behaviors that function to allow an organism to stop or diminish an aversive experience that has already commenced. See Negative Reinforcement (Sr-).

Establishing Operations (EOs). Operations that temporarily increase the effectiveness of consequences. See Motivating Operations and Function Altering Stimulus.

Evoke. Operants are evoked, as opposed to respondents, which are elicited. Operant behaviors are never elicited.

Experimental Analysis of Behavior. The branch of the science of behavior that applies scientific methods to identifying and elaborating basic principles of behavior. Commonly focuses on replicated single subject experimental designs.

Extinction. Withholding or preventing reinforcement for a behavior (procedure), and the resulting decline in the frequency of that behavior (effect) across subsequent occasions in operant conditioning. Presentation of the conditioned stimulus without the unconditioned stimulus after conditioning has occurred, and the resulting decline in the strength of the association and response, in respondent conditioning. Extinction represents no postcedent environmental change (as opposed to reinforcement and punishment, which both involve postcedent environmental changes). Extinction has commonly been described in terms of abrupt cessation of reinforcement, but it can also be framed in terms of gradual decline. Furthermore, extinction might be defined more broadly than is common as "the process of decreasing difference between the antecedent and postcedent environmental conditions in a three-term contingency featuring a behavior that was previously effective in the setting (or generalized to it) and that was maintained at its previous rate by reinforcement alone or by a combination of reinforcement and punishment." (Fraley, 2008) In this broader definition, extinction would apply logically to punishment as well as reinforcement. We might refer to this extinction with regard to a punished behavior as an extinction-like process, since the word extinction would be confusing in reference to an increase rather than a decrease in the rate or frequency of the behavior. This increase in the behavior has historically been called recovery, but it can be thought of as extinction.

Extinction Burst. A temporary increase in the frequency of a specific behavior being extinguished, immediately after an extinction procedure is instated.

Extinction Stimulus (S∆). Pronounced “S-Delta,” an antecedent stimulus indicating that an extinction contingency is in place for a particular behavior; its presence results in a decline in that operant. It can refer to a stimulus that once was a discriminative stimulus but now does not evoke the behavior it once did.

Fading (or Fading the Prompt). Broadly, a procedure in which stimulus control is gradually transferred from one value of a stimulus to another by making it seem less and less like the old stimulus and more and more like the new one. If we are fading an established discriminative stimulus (SD), then we are transferring stimulus control from that SD to another. Where we are fading a prompt, we are establishing stimulus control with some other stimulus and thereby eliminating the prompt from the sequence before it becomes an SD for the behavior. Often, we fade a visual-olfactory prompt (e.g., a lure) to a hand signal by gradually changing the appearance of the prompt toward looking more like the hand signal we want to use. "the procedure by which an added stimulus (prompt) is gradually withdrawn. Fading is used to help establish a simple discrimination." (Source) "Disregarding the common usage of the term, fading does not always refer to the disappearance of a stimulus. Sometimes in a fading procedure, a stimulus begins at a low value and is increased in magnitude." (Source)

Fixed Duration (FD). This schedule of positive reinforcement makes the rule that reinforcement will be delivered after a behavior has been occurring for a specified fixed amount of time.

Fixed Interval (FI). This schedule of positive reinforcement makes the rule that reinforcement is provided for the first response that occurs after a specific interval of time has passed.

Fixed Ratio (FR). This schedule of positive reinforcement makes the rule that responses will be reinforced after a specific and fixed number of responses has been performed.
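
For illustration, here are minimal sketches of the fixed ratio and fixed interval rules just defined, expressed as simple counters and timers (the schedule values and response times are invented):

```python
# Illustrative implementations of FR and FI rules; not a training tool, just the logic.

class FixedRatio:
    """FR n: deliver a reinforcer for every nth response."""
    def __init__(self, n):
        self.n = n
        self.count = 0
    def record_response(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True   # deliver reinforcer
        return False

class FixedInterval:
    """FI t: reinforce the first response after t seconds have elapsed since the last reinforcer."""
    def __init__(self, t):
        self.t = t
        self.last_reinforcer_time = 0.0
    def record_response(self, now):
        if now - self.last_reinforcer_time >= self.t:
            self.last_reinforcer_time = now
            return True
        return False

fr3 = FixedRatio(3)
print([fr3.record_response() for _ in range(6)])  # [False, False, True, False, False, True]

fi10 = FixedInterval(10.0)
print([fi10.record_response(now) for now in (4.0, 9.0, 12.0, 15.0, 23.0)])  # [False, False, True, False, True]
```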

Flooding. In a flooding procedure, the organism is exposed to the full intensity of the conditioned stimulus (i.e. flooded with the conditioned stimulus) without the unconditioned stimulus. Exposure continues until the conditioned response is extinguished, and escape attempt behavior declines. Escape is prevented. Flooding is a procedure intended to produce respondent extinction.

Four-Term Contingency. A term that may be used in place of the three-term contingency model (ABC) to include function altering stimuli (setting events and motivating operations) as a term before the A in the three term contingency. In certain cases, the four-term contingency may allow for a fuller explanation of the functional sequence. see Three-Term Contingency for further discussion.

Free-Shaping (procedural). A type of shaping wherein approximations are captured rather than prompted. See Shaping. That is, the trainer does not prompt responses in a contrived manner, but rather waits for the approximation and provides reinforcement when it occurs. Once the approximation is stable, it is put on operant extinction. The behavioral variability created by extinction provides different behaviors from which to capture the next approximation. Use of this term has been criticized because shaping is a postcedent intervention and prompts are antecedent interventions. This criticism is perfectly accurate, as far as that goes. Shaping is the term well established in behavior analysis and behaviorology, and free-shaping seems to be a term invented by animal trainers. Shaping involves reinforcement of successive approximations of a target behavior. It is a special kind of differential reinforcement. Both are compound procedures involving reinforcement and extinction, but while differential reinforcement changes only the rate or frequency (quantity) of a behavior, shaping changes the form (quality) of the behavior. They are both postcedent interventions in that they involve reinforcement and extinction, both postcedent behavior change processes. The antecedent conditions are not specified in the procedure. The term free-shaping was conceived in order to provide a term for shaping wherein no contrived, trainer-provided prompts are used antecedently to evoke the approximations. There are occasions when one is operating under contingencies to avoid prompts and having to fade them, and so it would seem there is a use for the term. There are reinforcers available for using the term. Free-shaping connotes something specific and useful and does not represent an error or misunderstanding of any kind. It is not conflating antecedent and postcedent interventions. Those who use the term can fully recognize what shaping is and is not. We can think of the "free-" part as specifying an antecedent intervention within the procedure, and we attach it to the word "shaping" because this addition applies only to shaping projects.

Frequency of Responding. "The quotient derived from the following: (fulfilled opportunities to respond) / (total opportunities to respond). The result may be expressed as a percentage." (Fraley, 2008)
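
As a worked illustration (with made-up counts), the quotient can be computed directly:

```python
# Frequency of responding as a percentage: fulfilled opportunities divided by total opportunities.
def frequency_of_responding(fulfilled, total_opportunities):
    if total_opportunities == 0:
        raise ValueError("total_opportunities must be greater than zero")
    return 100.0 * fulfilled / total_opportunities

print(frequency_of_responding(18, 24))  # 75.0 (%), using hypothetical counts
```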

Frustration. Emotional behavior resulting from obstructed access to reinforcers. Frustration can precipitate aggressive responses.

Functional Analysis (FA). A part of a functional assessment in which the observer manipulates antecedents and/or consequences in order to test specific hypotheses regarding the controlling variables influencing a behavior. It is an experimental approach to evaluating behavioral contingencies.

Function Altering Stimulus (SFA). An antecedent stimulus that alters the evocative capacity of the SD; such stimuli influence the likelihood of the SD evoking the behavior (Fraley, 2008, pp. 509-533). This general term includes motivating operations, sensitization, habituation and other more specific terms. For instance, the presence of a fire alarm lever will evoke lever-pulling operants, but not always. In many instances, it is a neutral stimulus (SN) rather than an SD. Consideration of context, or function altering stimuli, will help us achieve a higher degree of explanatory power in our contingency description. The presence of flames or smoke (SFA) alters the capacity of the lever (the SD) to evoke the lever-pulling operant. Without the presence of the SFA, the maintaining consequences would not occur (merely pressing the lever any time you see one would not likely be reinforced, or else a punitive consequence would suppress it) (Fraley, 2008, p. 512).

Functional Assessment. Term used to describe a range of evaluation strategies and techniques, including the informant method, direct observation and functional analysis.

Functional Relationship. A relationship between behavior and the environmental stimuli that control it. A relationship between a dependent variable and an independent variable. In the natural science of behavior, the environment acts as the independent variable and the behavior acts as the dependent variable. We study the effects of the independent variable on the dependent variable, as in other natural sciences. See Contingency.

General Behavior Trait. Tendencies (as opposed to specific behaviors) strongly influenced by genes. Includes activity level, aggression, introversion and anxiety. More flexible than fixed action patterns (Chance, 2003, p. 17).

Generalization. Process whereby a behavior occurs in the presence of antecedent stimuli that are similar, but not identical, to the discriminative stimuli used in the original conditioning. When a behavior that has been reinforced in the presence of a particular discriminative stimulus also comes to be controlled by similar stimuli, we call this stimulus generalization. Generalization is inversely related to discrimination: generalization widens the range of stimuli that will evoke the behavior, whereas discrimination narrows it. Discrimination training makes it so that the behavior is evoked by a very specific stimulus and not similar stimuli, whereas generalization training makes it so that the behavior is evoked by similar stimuli. As one increases, the other decreases. Differentiated from "setting generalization" or "generalization of behavior change," which refers to a target behavior being emitted in the presence of stimulus conditions other than those in which it was trained, as part of maintaining the results of a training program. The terms are similar but distinct.

Generalized Conditioned Reinforcer. A conditioned reinforcer that has been associated with a variety of unconditioned reinforcers. Praise often achieves this standard.

Graded Exposure. Incrementally exposing the learner to a stimulus, first at a low level of intensity and gradually increasing the exposure or intensity through repeated trials. The learner is exposed only at an intensity that does not elicit or evoke the problem behavior.

Habituation. "Habituation occurs when an unconditioned stimulus (US) repeatedly elicits an unconditioned response (UR). The frequent presentation of the US produces a gradual decline in the magnitude of the UR. When the UR is repeatedly elicited it may eventually fail to occur at all." (Pierce & Cheney, 2008, Behavior Analysis and Learning 4th ed. p. 350) Some responses do not habituate, although many do. Eyelid-blinking for instance does not tend to habituate. Some habituation effects can seem to be permanent, while many are temporary. If you present the stimulus repeatedly with a very small interval of time between each presentation, habituation will likely occur quickly, but it is likely it will take less time for the response to return to pre-exposure levels. If on the other hand you present the stimulus over a longer overall time frame with a greater interval between each presentation then it will likely take a bit longer to achieve habituation but it will tend to last longer. To illustrate, pop a balloon and the dog startles. The dog will likely be sensitized also, meaning, for a brief period of time, the dog is more likely to respond to other startling things more strongly, and that will wear off over time. If, on the other hand, you pop a balloon every several seconds for an hour, habituation will come quickly, but pop another one 2 hours later and the dog startles again. If, on the other hand, you pop a balloon every hour for a couple days, it will take a little longer to achieve habituation but then that effect will last a little longer.

Hierarchy of Stimulus Intensity. A breakdown of a stimulus from an exposure that elicits the least responsiveness through to the exposure that elicits the greatest responsiveness.

History of Conditioning. The history of all the times the learner has participated in a given contingency back through time. At some point, the basic ABC sequence occurred for the first time, and with repeated trials it has generally strengthened or weakened the behavior involved.

Impulsive Behavior. Behaving to access a maladaptive low-valued but immediate reinforcer, rather than a delayed but far higher valued reinforcer.

Independent Variable. In experimentation, the independent variable is the variable that is manipulated. The experimenter maintains other variables stable and changes the independent variable among groups of individuals or through time in one individual. The dependent variable is measured in order to determine if the independent variable affected it.

Informant Method. One approach in functional assessments to gain information on the contingencies involved in the target behavior, involving questioning people about the behavior and events surrounding it. 

Intermittent Schedule of Positive Reinforcement. Any schedule of reinforcement other than continuous reinforcement or extinction. The positive reinforcer is delivered sometimes, but not always.

Keep Going Signal (KGS). A conditioned positive reinforcer used during the performance of a behavior that is not necessarily followed immediately by an unconditioned positive reinforcer, but where completion of the behavior results in delivery of the "terminal" conditioned positive reinforcer and the unconditioned positive reinforcer. There may be a series of KGSs distributed during the behavior. Presumably, the KGS is intended to preemptively keep the learner responding through a long-duration behavior where the unconditioned reinforcer may not be sufficient to maintain the behavior.

Latency. A dimension or measure of behavior, usually indicating the time between the presentation of the discriminative stimulus and the performance of the behavior it evokes.

Law of Effect. Responses that produce a satisfying effect are more likely to occur again in that situation. Conversely, responses that produce an annoying effect are less likely to occur again in that situation. Note that we no longer define principles of behavior by how satisfying or annoying the stimuli are. Instead, we define them by their actual effect on the behavior: whether they strengthen or weaken the rate or frequency of the behavior on subsequent occasions.

Learned Helplessness. Refers to ceasing to even try to escape in the face of inescapable, severe, aversive stimulation. If a learner cannot effectively escape punishers, they will often cease trying—they simply resign themselves to it.

Learning. A "change in behavior due to experience." (Chance, 2009, p. 392) Also called conditioning.

Least Intrusive Effective Behavior Intervention. Model for decision making regarding the appropriate use of aversive stimulation in training and behavior consulting. See the article on the source website.

Limited Hold. A rule added to a schedule of reinforcement specifying that reinforcement is available within the context of the schedule only for a limited period of time. This rule is particularly helpful when you intend to train a behavior to occur quickly upon presentation of the discriminative stimulus.

Lure. A prompt wherein the trainer directs the subject's attention with a stimulus and uses it to get a behavior performed. Like the carrot in front of the donkey, we can use treats that the dog will act to smell, which we can then use to encourage the dog to move wherever we move the treat.

Medical Model Approach. A theoretical and procedural orientation to behavior change that tends to explain and change behavior similarly to how medical professionals treat physical disease. Behavior is classified as normal or disordered, and disordered behavior is sorted into various classifications. This model refers to diagnosing and treating behavior problems and often, but not always, takes a biological approach to viewing and changing behavior. One important limitation of this approach is that diagnostic labels do not explain behavior; a label barely describes the behavior and does little to suggest the necessary intervention, partly because it fails to identify the cause of the behavior. See http://www.associationofanimalbehaviorprofessionals.com/theoreticorientation.html. See Biological Approach and contrast with Applied Behavior Analysis, which addresses observable behaviors and how they relate adaptively to the environment.

Modal Action Pattern. A sequence of behaviors that is relatively invariant, considered relatively innate, and activated by a specific environmental stimulus. Formerly called a fixed action pattern, a term implying that the response is fixed and unchangeable.

 

Motivating Operations (MOs). A type of antecedent. Briefly, MOs alter the effectiveness of reinforcers or punishers and the frequency of operant response classes maintained by those reinforcers and punishers (Laraway et al., 2003). Abolishing operations (AOs) temporarily decrease the effectiveness of consequences, whereas establishing operations (EOs) temporarily increase the effectiveness of consequences. The term MO encompasses all four quadrants in the contingency table, with EOs for both reinforcers and punishers, and AOs for both reinforcers and punishers. Usually, satiation and deprivation are used as MOs. For reinforcers, deprivation tends to be an EO, while satiation tends to be an AO. See Function Altering Stimulus for the more general term.

Natural Event. "An event that is defined in terms of time, distance, mass, temperature, charge, and/or perhaps a few other more esoteric properties taken into account by theoretical physicists. A natural event is defined by measurable physical properties and occurs only as the culmination of a sequential history of similarly definable events. Thus, natural events cannot occur spontaneously." (Fraley, 2008)

Natural Selection. “The mechanism of evolution by which the environment acts on populations to enhance the adaptive ability and reproductive success of individuals possessing desirable [effective] genetic variants, increasing the chance that those beneficial [effective] traits will predominate in succeeding generations” (Horvitz, 2002, p. 303). Traits are variable within a population, and many traits are heritable. Individuals reproduce at differential rates and levels (some individuals reproduce more than others). Those traits (adaptive traits) that contribute to reproductive success tend to increase in frequency within the population because they are passed on more successfully to progeny. Traits that do not contribute to reproductive success tend to become less frequent in future generations. The environment selects for adaptive traits and selects against maladaptive traits. Natural selection is about changes in gene frequencies due to differentially successful reproduction. It is not intentional, directional or progressive; it simply describes the outcome of differential reproductive success and mechanisms by which that occurs.

Negative Punishment. A behavior change process in which a decrease or subtraction of a stimulus from the environment during or immediately following a response results in a decrease in the strength of the behavior across subsequent occasions. (Fraley, 2008)

Negative Reinforcement. A behavior change process in which a decrease or subtraction of a stimulus from the environment during or immediately following a response results in an increase in the strength of the behavior across subsequent occasions. (Fraley, 2008)

Negative Reinforcer. Any stimulus that, when removed following a behavior, results in an increase in the strength of that behavior.

Neutral Stimulus. A stimulus that does not elicit a response. See Respondent Conditioning.

Nothing in Life is Free (NILIF). Originating with Dr. Victoria Voith, the Nothing In Life Is Free program involves requiring dogs to perform specific behaviors in return for everyday activities and resources. For example, the dog is required to sit before being let outside or being allowed to eat. This program is commonly interpreted as a boot-camp-style regimen, and, because of that, may promote adversarial relationships between people and their dogs. The most beneficial aspect of this program is the principle that one should take advantage of everyday opportunities to train the organism because more training gets done that way, the training generalizes well, and learning takes place all the time. See Premack Principle.

One-Zero Sampling. A sampling procedure in which the observer records whether or not a behavior occurs within each interval of a given length (e.g., whether or not the behavior occurred in a predetermined 30-second interval).
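
A minimal sketch, assuming invented event times and a 30-second interval, of scoring intervals under one-zero sampling:

```python
# One-zero sampling: each fixed interval is scored 1 if the behavior occurred at any
# point during it, and 0 otherwise.
def one_zero_sample(event_times, session_length, interval=30.0):
    scores = []
    start = 0.0
    while start < session_length:
        end = start + interval
        scores.append(1 if any(start <= t < end for t in event_times) else 0)
        start = end
    return scores

print(one_zero_sample([12.0, 14.0, 95.0], session_length=120.0))  # -> [1, 0, 0, 1]
```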

Operant Behavior. Behavior that is maintained by the consequences that it has historically generated. Consequences influence these behaviors by strengthening or weakening the discriminative stimulus' capacity to evoke the behavior on subsequent occasions. Note that some behavior analysts use "operant" as a noun for what this glossary calls an Operant Set, reserving the adjective (as in "operant behavior") for consequence-driven behaviors. See Operant Set.

Operant Conditioning. A change (increase or decrease) in the strength of an operant behavior across subsequent presentations of the discriminative stimulus as a function of its historic consequences.

Operant Level. The rate of an operant prior to specific conditioning procedures. 

Operant Set. A class of operants that may differ topographically but function to produce the same consequence. Some behavior analysts use the term operant (as a noun) to refer to operant sets, whereas others use the term simply to refer to consequence-driven behaviors. See Operant Behavior.

Overshadowing. In respondent conditioning, if two neutral or conditioned stimuli are used simultaneously in conditioning an association with an unconditioned stimulus and only one becomes conditioned while the other does not, we would say that the successfully conditioned stimulus overshadowed the unsuccessful stimulus. Which stimulus overshadows the other is probably determined by prior exposure to the stimuli, salience and perhaps preparedness. 

Phobia. An extreme fear and the intense escape behaviors motivated by it.

Pica. Tendency to eat nonfood items. Some dogs will eat rocks for example if they are allowed access to them.

Positive Punishment. A behavior change process in which an increase or addition of a stimulus to the environment during or immediately following a response results in a decrease in the strength of the behavior across subsequent occasions. (Fraley, 2008)

Positive Reinforcement. A behavior change process in which an increase or addition of a stimulus to the environment during or immediately following a response results in an increase in the strength of the behavior across subsequent occasions. (Fraley, 2008)

Positive Reinforcer. Any stimulus that, when presented following a behavior, results in an increase in the strength of that behavior across subsequent occasions.

Postcedent. "The environment as it exists beginning immediately after a response." (Fraley, 2008) Once we know which components of this environment are actually functionally related to the antecedent-behavior sequence, we call those components the consequence.

Potentiation. “[A]n increase, over repeated presentations, in the respondent behavior elicited by a stimulus (especially, an aversive stimulus)” (Catania, 1998, p. 402). This is what people commonly, but mistakenly, think of as sensitization. See Sensitization.

Premack Principle. A behavior with higher frequency or probability can act as reinforcement for a less frequent or less probable behavior. In practice, we can use everyday opportunities to train dogs. Not only treats and toys can act as reinforcers, so too can other behaviors, such as running or playing.

Preparedness. “Some relations between stimuli, and between stimuli and responses, are more likely because of phylogenetic history. This phenomenon has been called preparedness. For example, a bird that relies on sight for food selection would be expected to associate the appearance of a food item with illness, but rats that select food on the basis of taste quickly make a flavor-illness association” (Pierce & Cheney, 2004, p. 438). See Biological Context.

Primary Reinforcer. See Unconditioned reinforcer.

Principle of Behavior. A description of a relationship between behavior and its controlling variables (Cooper et al., 1987).

Prompt. A prompt is anything that increases the likelihood of the behavior being performed that is not a discriminative stimulus or cue. It is a way of generating the behavior so that it may be reinforced and, eventually, a new cue-response relationship can be established. If a prompt takes on stimulus control over the behavior, it becomes a discriminative stimulus or cue for that behavior. Trainer-performed prompts are referred to as response prompts, while non-trainer-performed, environmental prompts are referred to as stimulus prompts. Prompts can be visual or olfactory (as in luring), auditory (as in making a noise to have the learner orient), or physical (as in physically manipulating the learner's body, or a part of it, into the desired position). Training without prompts involves "capturing" the behaviors as they occur, and is the process used in free-shaping, which refers to shaping without prompts.

 

Prototypal Approach. In classifying behaviors, a conceptual entity depicting an idealized combination of characteristics that more or less regularly occur together in a less than perfect manner at the scale of actual observation. The prototypal approach does not necessarily assume distinct categories and, in that regard, provides a far lower level of validity than a categorical approach. See also Categorical Approach for a contrasting system.

 

Punisher. A stimulus that, when presented or removed contingent on a behavior, decreases the future strength of that behavior across subsequent occasions.

 

Punishment. A behavior change process in which a stimulus change during or immediately following a response results in a decrease in the strength of that behavior across subsequent occasions. (Fraley, 2008) Punishment weakens the evocative power of the SD.

 

Rate of Response. "The quotient when a count of responses is divided by a count of the time units across which the count of responses occurred. (The count of responses is the dividend; the count of the time units is the divisor; and the rate of responding is the quotient.)" (Fraley, 2008)
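
A direct rendering of the quotient, with made-up counts and an assumed time unit of minutes:

```python
# Rate of response: count of responses divided by the count of time units observed.
def rate_of_response(response_count, time_units):
    if time_units <= 0:
        raise ValueError("time_units must be positive")
    return response_count / time_units

print(rate_of_response(45, 15))  # 3.0 responses per minute if 45 responses were counted over 15 minutes
```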

 

Ratio Schedules. Schedules of positive reinforcement that make the rule that responses will be reinforced after a specific number (fixed or variable) of responses has been performed.

 

Ratio Strain. Disruption of operant responding when a ratio schedule is increased rapidly.

 

Reciprocal Inhibition. See Counterconditioning.

 

Reflex. The elicitation of an unconditioned response (UR) with an unconditioned stimulus (US). A US–UR relationship.

 

Reinforcer. A stimulus that, when presented or removed contingent on a behavior, increases the future strength of that behavior across subsequent occasions.

 

Reinforcement. A behavior change process in which a stimulus change immediately following or during a response results in an increase in the strength of that behavior across subsequent occasions. (Fraley, 2008) Reinforcement strengthens the evocative power of the SD.

 

Relaxation. Relative minimal arousal; calmness; lack of anxiety or stress. See Systematic Desensitization.

 

Resistance to Extinction. The persistence of an operant behavior after it is put on an extinction schedule. More prominent when the behavior was maintained on an intermittent schedule as opposed to a continuous reinforcement schedule.

 

Respondent. An unconditioned response (reflex) or conditioned response that is elicited by a stimulus (unconditioned or conditioned).

Respondent Aggression. This term is misleading and somewhat dated at this point. It refers to lashing out: a pain-evoked aggressive response. The aggressive behaviors are operants and not respondents, so the term is a misnomer. It is used to characterize the almost reflexive nature of the response and the heavy influence of reflexes in setting the occasion for the operants. Contrast with Aggression.

 

Respondent Conditioning. Occurs when a neutral stimulus is paired with an unconditioned stimulus (that elicits an unconditioned response), and after conditioning has occurred, the neutral stimulus itself elicits what we call a conditioned response, and the neutral stimulus has become a conditioned stimulus.

 

Respondent Generalization. A behavior change process that occurs when an organism performs a conditioned response to values of the conditioned stimulus not previously trained.

Respondent Level. The magnitude of the to-be-conditioned response before conditioning has taken place; that is, the magnitude of the response to the still-neutral stimulus.

 

Response. "Any covert or overt innervated muscular movement of all or part of an organism resulting from energy transformations occurring within the organism and initiated by energy inputs from beyond the affected body part. Also, a particular innervated pattern of neural activity that relies on a particular molecular configuration." (Frayley, General Behaviorology: The Natural Science of Human Behavior, 2008) A particular instance of behavior (See Behavior). Typing is behavior, for example. An individual keystroke is a particular instance of the behavior, and we call it a response.

 

Response Class. See Operant Class.

 

Response Cost. Form of negative punishment in which a specified amount of reinforcer is removed or lost contingent on performance of a specific behavior, and the behavior decreases in frequency as a result.

 

Response Effort. The amount of effort required to execute a particular behavior. All else being equal, the probability of a behavior decreases as the effort it requires increases, particularly when a functionally equivalent alternative behavior is available. Organisms will generally choose the behavior that fulfills a specific function with less effort than other behaviors that would achieve the same function.

 

Response Prevention. Usually used in conjunction with flooding. Flooding and response prevention is a procedure based on the principle of respondent extinction. It is the opposite of systematic desensitization (based on counterconditioning). In a flooding and response prevention procedure, the organism is exposed to the full intensity of the conditioned stimulus (i.e., flooded with the CS) without the unconditioned stimulus, and escape is prevented (i.e., response prevention). Exposure continues until the conditioned response is extinguished, and escape behavior declines. This procedure is susceptible to problematic secondary effects.

 

Resurgence. “The increase in topographic variability during extinction after a period of reinforcement…” (Pierce & Cheney, 2004). To put the term in context, an extinction burst is an initial increase in the frequency of the specific behavior being extinguished, whereas resurgence involves different behaviors being offered once extinction is in place. Resurgence is the basis for the variability needed in shaping. While Pierce and Cheney clearly describe resurgence as the topographic variability in responding during extinction, some sources refer to resurgence as the appearance of other behaviors from the organism's repertoire with a reinforcement history, that organisms perform after extinction is put in place. In this use of the term, it is the organism running through their repertoire of behaviors in order to access reinforcers as opposed to the simple increase in the topographic variability during extinction.

 

Reversal Design. The reversal-design single subject experiment typically involves at least two phases. The first phase (the A phase) involves establishing a baseline for the frequency or magnitude of the behavior (the dependent variable) in question. Following the A phase, you introduce the independent variable (that is, the consequence or antecedent you want to know about) and continue to measure the frequency or magnitude of the behavior. This second phase is called the B phase. Usually, there is at least one more A phase. See also Alternating Treatment Design.
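
The phase structure can be summarized informally as in the following sketch, which uses entirely hypothetical session data (the labels and numbers are illustrative, not from any actual experiment):

```python
from statistics import mean

# Phases listed in the order they were run: A (baseline), B (intervention), A (return to baseline).
phases = [
    ("A (baseline)", [8.0, 7.5, 8.2]),
    ("B (independent variable in place)", [3.1, 2.4, 2.0]),
    ("A (return to baseline)", [7.0, 7.8]),
]

for label, rates in phases:
    print(f"{label}: mean rate = {mean(rates):.1f} responses/min across {len(rates)} sessions")
```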

 

Safety Signal. A salient stimulus that is presented immediately before or during conditioning trials wherein an aversive stimulus will not be presented and is not present for trials wherein the aversive stimulus will be presented.

 

Salience. A stimulus is salient to the extent that it is noticeable. The more noticeable and prominent a stimulus is, the more salient it is, and the more readily it acquires associative strength as a conditioned stimulus.

 

Satiation. Decline in the effectiveness of a reinforcing stimulus due to excess exposure to it or repeated presentation of it. If an organism is satiated with a particular reinforcer, its value declines and it is not as powerful a reinforcer as a result. Satiation can also be used in the context of punishers, and in that regard the behavior would increase in frequency. See Motivating Operations.

 

Schedule of Positive Reinforcement. Rules specifying which target responses are followed by positive reinforcers.

 

S-Delta (S∆). See Extinction Stimulus.

 

Secondary Reinforcer. See Conditioned Reinforcer.

 

Self-Controlled Behavior. Choosing a larger but more delayed reinforcer over a smaller, more immediate one; the opposite of impulsive behavior. Contrast with Impulsive Behavior.

 

Sensitive Periods. Narrow windows of time in early development when organisms are particularly susceptible to particular forms of learning or learning specific classes of associations. Previously referred to as “critical periods.”

 

Sensitization. “In sensitization, the eliciting effects of one stimulus are enhanced as a result of presentation of some other stimulus; one stimulus amplifies the eliciting effect of another stimulus” (Catania, 1998, p. 50). And: “The tendency to be more responsive to the environment following an arousing experience” (Hergenhahn & Olson, 2001, p. 469). Catania offers the example that an animal who is shocked and then shortly thereafter is exposed to a loud noise is more likely to have their startle response elicited. The shock sensitized the organism to the noise. Chance (2003, p. 454) offers this definition: “An increase in the intensity or probability of a reflex response resulting from earlier exposure to a stimulus that elicits that response.” It is common to confuse the notion of sensitization with the notion of potentiation. Potentiation, explains Catania (p. 50), involves “an increase, over repeated presentations, in the respondent behavior elicited by a stimulus (especially, an aversive stimulus).”

 

Sensory Preconditioning. “In respondent conditioning, two stimuli such as light and tone are repeatedly presented together (light + tone) without the occurrence of a US (preconditioning). Later, one of these stimuli (CS1) is paired with an unconditioned stimulus (US) and the other stimulus (CS2) is tested for conditioning. Even though the second stimulus (CS2) has never been directly associated with the US, it comes to elicit the conditioned response (CR)” (Pierce & Cheney, 2004, p. 443).

 

Separation Distress. Distress-related behaviors (physiologically and anatomically panic- and pain-related, although sometimes anxiety- and fear-related) elicited by social (or place-attachment) isolation, or by conditioned stimuli predicting such isolation, and the operants they motivate. (O'Heare, Separation Distress and Dogs, 2009)

 

Sequencing. "A sequence is a series of multiple, individually cued behaviors performed consecutively and usually without added reinforcement between them. An agility course is a long sequence made up of not only the obstacles, but also the directional cues given between the obstacles." (Source) Notice that this differs from chaining in that it can involve interjected cues from the trainer. Sequencing is often confused with chaining, but chaining as conventionally defined does not involve interjected verbal or physical cues from the trainer. This has also been called "flexible chaining," but again, that is not what chaining is as conventionally defined.

 

Setting Events. Environmental events or conditions, not including motivating operations, that do not typically occur immediately prior to the behavior in question but that set the occasion, or form the context, for a particular behavior, making the behavior more or less likely. These can often be thought of as more distant motivating operations. See Function Altering Stimulation.

 

Shaping (procedural)."A procedure in which differential reinforcement is applied to a series of successive approximations of a final specified form of a behavior." (Frayley, General Behaviorology: The Natural Science of Human Behavior, 2008) Shaping is a postcedent intervention and a compound procedure in that it utilizes two basic principles of behavior (both postcedent changes): extinction and reinforcement. It usually involves positive reinforcement, but utilizing negative reinforcement can legitimately be called shaping. It is a special kind of differential reinforcement. Both are compound procedures involving reinforcement and extinction, but while differential reinforcement changes only the rate or frequency (quantity) of a behavior, shaping changes the form (quality) of the behavior. They are both postcedent interventions in that they involve reinforcement and extinction, both postcedent behavior change processes. See Free-Shaping for further discussion.

 

Social behavior. Also called communication in psychology and ethology, behavior that influences the behavior of others. Social behavior or "communication" is just like any other behavior and operates on the same principles of behavior. It is not the transmission of information from mind to mind.

 

Social Disruption. One form of the problematic secondary effects of aversive stimulation, in which the person presenting the aversive stimulation and the context in which it is delivered become conditioned aversive stimuli. This is related to a decline in the social bond.

 

Socialization. The process of exposing an animal to stimuli in a sensitive manner while the organism is in (or approximately in) a sensitive period of development and particularly amenable to acclimating to those stimuli, with the aim of establishing nonaversive respondent emotional responses to them and a history of reinforcement for contacting them.

 

Spontaneous Recovery (operant). After operant extinction, when the behavior is at the operant level, if the organism is put back into the context that previously set the occasion for that behavior, the behavior may be performed again. It is thought that extraneous discriminative stimuli (contextual stimuli) not fully extinguished evoke the behavior. With repeated exposure and continued extinction, the behavior becomes less and less likely to spontaneously recover. The word “spontaneous” is unfortunate and misleading because the responding is not actually spontaneous at all.

 

Spontaneous Recovery (respondent). After a respondent behavior has been extinguished and the conditioned stimulus is presented, the conditioned response may return or increase in magnitude. Continued extinction results in a decline of the response.

 

Steady-State Responding. Behavior that is stable in rate over time. Once a behavior is past the acquisition stage and into “maintenance,” it should reach steady state.

 

Startle Response. Reflex in which the organism rapidly activates in a frightened manner; rapid activation of the nervous system, preparing for energy expenditure. Perhaps a rapid, surprise version of the orienting response.

 

Stimulus. Any thing or event that is capable of influencing behavior. 

 

Stimulus Class. “Stimuli that vary across physical dimensions but have a common effect on behavior belong to the same stimulus class” (Pierce & Cheney, 2004, p. 444).

 

Stimulus Control. "Stimulus control is the functional control that stimuli in the environment acquire over the behaviors exhibited in their presence. These stimuli set the occasion for the behavior that reliably follows them." (Frayley, General Behaviorology: The Natural Science of Human Behavior, 2008)

 

Structural Approach. An approach to classifying behaviors in which behaviors that share topographies are clumped into the same classification.

Submission. See Appeasement Behaviors.

 

Submissive Urination. Urination performed by a dog as part of its greeting ritual, along with other appeasement signals; more prominent when the dog is greeted by an assertive or hostile individual.

 

Superstitious Behavior. Accidentally or unintentionally reinforced behavior; behavior that is strengthened because reinforcement happened to follow it by chance rather than in accordance with a specific contingency. For example, if a rat in a Skinner box receives food at random intervals, not contingent on any particular behavior, the rat is likely to be performing some common behavior, such as sniffing, turning, or standing on its hind feet, when the food arrives. Even though the food delivery is not contingent on any particular behavior, the frequency of that behavior may increase, and this is called accidental or superstitious reinforcement. In this example, after a few sessions the rat might be spinning in circles well above its previous operant rate.

 

Systematic Desensitization. Systematic desensitization is demonstrably effective, but exactly why it works remains unclear. In 1920, John Watson and Rosalie Rayner published a classic paper on how emotional responses are conditioned via respondent conditioning, detailing how a child named Albert was conditioned to fear specific stimuli. In 1924, Mary Cover Jones published a classic follow-up article on how emotional responses, fear in particular, could be changed via respondent conditioning, outlining how fear responses were counterconditioned in a child named Peter. These classic works provided the foundation for Joseph Wolpe's 1954 seminal article in which he proposed the procedure known as systematic desensitization. Systematic desensitization was proposed to change problem emotional responses including fears, anxieties and phobias. The idea was to coach the client in relaxation exercises, construct a fear hierarchy, and then incrementally and gradually expose the client, in imagination or in actuality, to each level in the hierarchy, starting with the least intense, promoting relaxation at each step and working through the entire hierarchy, level by level. Wolpe proposed reciprocal inhibition as the mechanism by which the learner desensitizes to the feared stimulus; this is similar to counterconditioning. The idea is that the learner cannot engage in two contradictory or mutually exclusive emotional/physiological responses at the same time, so the relaxation was said to inhibit the fear or anxiety (or countercondition it). Since then, the reciprocal inhibition and counterconditioning hypotheses have been called into question. Others have proposed that habituation or respondent extinction is responsible for the desensitization effects, but these explanations have also been called into question. In recent years, many behavior technologists have proposed that much of the beneficial effect is actually the result of operant conditioning rather than respondent conditioning. Complex cognitive (covert verbal behavior) explanations and expectancy/placebo effects have also been proposed. Systematic desensitization is the term often used to describe any procedure involving relaxation and graded exposure through a hierarchy of stimulus intensity. Within that framework, there is in vitro systematic desensitization, wherein the learner imagines the exposure, and in vivo systematic desensitization, wherein the learner is actually exposed to the stimulus. The latter approach is sometimes called contact desensitization, and sometimes the in vitro version is referred to simply as systematic desensitization while the contact version is referred to as in vivo desensitization. The entire process is sometimes referred to simply as exposure therapy, though that term usually covers systematic desensitization as well as other exposure procedures. (O'Heare, Separation Distress in Dogs, 2009)

 

Temperament. In behavior analysis, this would likely reflect the present net effect of all contingencies in place. From a more behavioral perspective, the argument from those who recognize "temperament" as a useful construct generally goes as follows: there is variation within a species in general behavior traits, and these tendencies in an individual often remain stable for extended periods of time. The particular quantity or quality of an individual's general behavior traits, along with their conditioning history up to that point and all of the behavior-controlling environments/stimuli, may be the basis for temperament. See General Behavior Traits.

 

Tertiary Reinforcer. A conditioned reinforcer that was established by pairing with another conditioned reinforcer as opposed to an unconditioned reinforcer. "A stimulus that functions as a reinforcer because of its contingent relation to another reinforcer. Such stimuli have also been called secondary reinforcers, but this designation is best reserved for cases in which the modifier specifies how many stimuli separate the conditioned reinforcer from a primary reinforcer (e.g., a secondary reinforcer is followed directly by a primary reinforcer, a tertiary by a secondary, etc.)." (Source)

 

Tethering. Not to be confused with chaining a dog out. Tethering involves having a dog on a leash or line attached to a person in some contexts, or to a stationary object such as a wall stud or tree, while supervised during behavior change protocols. Tethering a dog is sometimes used as a safety measure: it prevents the dog from accessing their victim and allows the victim to escape. Tethering does remove escape options, and a tightened tether will elicit the opposition reflex and cause frustration, so it is vital that it be used only in situations where the dog will not be stimulated to escape or tighten the line. The dog must be maintained subthreshold if a tether is used. It is only a backup safety measure. This is not an endorsement.

 

Three-Term Contingency. The three-term contingency describes the controlling variables for a behavior and the functional relationship between behavior and the environment, in terms of what occurs before the behavior (antecedents) and immediately after the behavior (postcedents) that influences it. Once we have determined that an antecedent stimulus controls a behavior, we refer to it as the discriminative stimulus (SD). Once we have determined that a particular postcedent stimulus reinforces or punishes the behavior and is therefore functionally related to the behavior, we call it the consequence. Under some circumstances, it can be useful to include a fourth term in the contingency, a function altering stimulus. This might include function altering stimulation (or context) such as general setting events and specific motivating operations. For instance, a fire alarm lever will not always evoke pulling behavior, but in the presence of smoke or flames, the alarm lever evokes the pulling behavior (barring other opposing contingencies). The flames act as a function altering stimulus, and the lever then becomes the SD. Without the smoke or flames, pulling the lever will not likely result in reinforcement; in fact, it may result in punishment. Emotional arousal, if the organism enters the contingency in question in that "mood," can also act as a function altering stimulus. Function altering stimulus is a general term that includes all of these context terms, such as setting events and motivating operations.
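
As an illustrative aside (not part of the source definition), a contingency with the optional fourth term can be recorded as a simple data structure; the class and field names below are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Contingency:
    antecedent: str                 # the discriminative stimulus (SD), once functional control is confirmed
    behavior: str
    consequence: str
    context: Optional[str] = None   # optional fourth term: function altering stimulation
                                    # (setting events, motivating operations, emotional arousal)

# The fire-alarm example from the entry above, expressed as a record:
alarm_pull = Contingency(
    antecedent="fire alarm lever",
    behavior="pulling the lever",
    consequence="alarm sounds; escape from danger (reinforcement)",
    context="smoke or flames present (function altering stimulus)",
)
print(alarm_pull)
```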

 

Threshold Model. A model for conceptualizing aggressive behavior that identifies in graphical form the stimulus that controls an aggressive response and the intensities that stimulate escalated aggressive responses to it.

Thunderstorm Phobia. Extreme distress and panic related behavior associated with thunderstorms.

 

Trace Conditioning. A respondent conditioning procedure whereby the conditioned stimulus is presented and then removed, followed shortly by the presentation of the unconditioned stimulus. Establishing a "clicker" as a conditioned positive reinforcer utilizes trace conditioning. For effective conditioning, the US ought to follow the CS within a few seconds to achieve satisfactory contiguity.

 

Unconditioned Reinforcer. A stimulus that acts as a reinforcer but not as a result of conditioning. Related to biological needs such as food, optimal temperature, etc. The body develops structurally, under genetic control, such that the organism is reinforced automatically by the property that is said to be a primary or unconditioned reinforcer. (Fraley, 2008, General Behaviorology, p. 125)

 

Unconditioned Response (UR). Response elicited by a stimulus related to biological adaptations. For example, eye blinking is a UR elicited by a puff of air on the eye. It is adaptive because it protects the eye and hence contributes to biological/reproductive fitness. See Reflex.

 

Unconditioned Stimulus (US). Stimulus that elicits an unconditioned response. For example, a puff of air is a US that elicits blinking. See Reflex.

 

Variable Duration (VD). This schedule of positive reinforcement specifies the rule that reinforcement will be delivered after a behavior has been occurring for a variable amount of time. As with other variable schedules, reinforcement appears to be delivered on a random schedule but actually varies around a specified mean duration.

 

Variable Interval (VI). This schedule of positive reinforcement specifies the rule that reinforcement is delivered for the first occurrence of the target behavior after a variable interval of time has passed. As with the fixed interval schedule, the first response after the interval has elapsed is reinforced, but here the interval varies around a mean rather than being fixed.
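
A minimal sketch of how such a rule might be simulated (the class name, parameters, and the uniform draw around the mean are assumptions made for illustration, not a standard implementation):

```python
import random

class VariableInterval:
    """Delivers reinforcement for the first response after a variable interval has elapsed."""

    def __init__(self, mean_seconds: float):
        self.mean_seconds = mean_seconds
        self._start_new_interval()

    def _start_new_interval(self):
        # Draw the next interval around the mean (a uniform draw, purely for illustration).
        self.current_interval = random.uniform(0.5 * self.mean_seconds, 1.5 * self.mean_seconds)
        self.elapsed = 0.0

    def response(self, seconds_since_last_response: float) -> bool:
        """Record a response after the given delay; return True if it earns reinforcement."""
        self.elapsed += seconds_since_last_response
        if self.elapsed >= self.current_interval:
            self._start_new_interval()
            return True
        return False

# Hypothetical usage: a VI 30 s schedule.
vi30 = VariableInterval(mean_seconds=30)
print(vi30.response(seconds_since_last_response=40))  # likely True: 40 s usually exceeds the drawn interval
```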

 

Variable Ratio (VR). This schedule of positive reinforcement specifies the rule that the target response will be reinforced after an apparently random but specified average number of responses. The VR schedule is similar to the fixed ratio schedule, but rather than reinforcing after every, say, 4 responses, you provide reinforcement after an average of 4 responses.
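
A comparable sketch for the VR rule, again with hypothetical names and an arbitrary way of varying the requirement around the mean:

```python
import random

class VariableRatio:
    """Delivers reinforcement when a variable response requirement (averaging a set mean) is met."""

    def __init__(self, mean_responses: int):
        self.mean_responses = mean_responses
        self._start_new_requirement()

    def _start_new_requirement(self):
        # Draw the next requirement around the mean (a uniform integer draw, purely for illustration).
        self.requirement = random.randint(1, 2 * self.mean_responses - 1)
        self.count = 0

    def response(self) -> bool:
        """Record one response; return True if the current requirement has been met."""
        self.count += 1
        if self.count >= self.requirement:
            self._start_new_requirement()
            return True
        return False

# Hypothetical usage: a VR 4 schedule; roughly one in four responses earns reinforcement.
vr4 = VariableRatio(mean_responses=4)
outcomes = [vr4.response() for _ in range(20)]
print(sum(outcomes), "reinforcers earned across 20 responses")
```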

 

Glossary of Abbreviations

 

See the glossary above for definitions.

 

ABC. Antecedent Behavior Consequence

 

AO. Abolishing Operation

 

CER. Conditioned Emotional Response

 

CR. Conditioned Response

 

CRF. Continuous Reinforcement

 

CS. Conditioned Stimulus

 

CSAVE. Conditioned Aversive Stimulus

 

DR. Differential Reinforcement

 

DRA. Differential Reinforcement of Alternative Behavior

 

DRE. Differential Reinforcement of Excellent Behavior

 

DRH. Differential Reinforcement of High Rate of Responding

 

DRI. Differential Reinforcement of Incompatible Behavior

 

DRL. Differential Reinforcement of Low Rate of Responding

 

DRO / DR0. Differential Reinforcement of Other Behavior / Differential Reinforcement of Zero Responding

 

EO. Establishing Operations

 

SFA. Function Altering Stimulus

 

GAS. General Adaptation Syndrome

 

KGS. Keep Going Signal

 

LIEBI. Least Intrusive Effective Behavior Intervention.

 

MO. Motivating Operations

 

NS. Neutral Stimulus

 

SD. Discriminative Stimulus

 

S∆. Extinction Stimulus

 

UR. Unconditioned Response

 

US. Unconditioned Stimulus

 

VD. Variable Duration Schedule

 

VI. Variable Interval Schedule

 

VR. Variable Ratio Schedule