By Christan Griffin, M.Ed., BCBA, LBA

Direct Instruction

Direct instruction (DI) is an evidence-based teaching intervention included under the umbrella of Applied Behavior Analysis (ABA). Some practitioners believe DI is a powerful and effective intervention strategy that can improve all learners’ educational and behavioral success.

“Direct Instruction, originating from the work of Engelmann and colleagues in the 1960s, is a systematic model of teaching focusing on a sequenced and incremental mastery of curriculum-based competence and a capacity to apply generalizable skills to tackle other similar questions/problems” (Liem & Martin, 2013).  

Principles of Direct Instruction

Burton et al. (2005) discuss the components of DI that distinguish it from other direct teaching methods, which may not be empirically validated within the field of ABA. The authors emphasize that DI is not a lecture-based approach, but rather an intervention designed to focus heavily on the interactions between a learner and their teacher. Specifically, they identify the key components of DI as “modeling, reinforcement, feedback, and successive approximations” (Burton et al., 2005, p. 41). As described by Liem & Martin (2013), the basic principles and assumptions of DI include:

  • All learners are able to be taught and have the potential for progress.  
  • Learners with special needs or a history of low performance should be taught at an accelerated rate to help them reach the performance levels of their peers.
  • All educational and behavioral practitioners are capable of learning how to implement the DI model.  
  • The DI model is most effective in a structured, standardized environment, which reduces the probability that learners will misunderstand the materials or learning exercises and can increase the fluency with which they learn new material.

The implementation of DI, as shared by Liem & Martin (2013), requires thoroughly planned and detailed lesson plans that initially incorporate a greater amount of guidance and structure, but that also include strategies for fading prompts to promote independence. Additionally, learners are given more frequent opportunities to respond than in a traditional classroom structure, or at least as many as their peers receive. DI also allows learners to be placed in small groups according to their skill level relative to their peers. This flexibility lets learners access lessons appropriate to their current levels of performance (Liem & Martin, 2013). As described by Colvin & Engelmann (2006, p. 8), “DI programs attempt to control all the variables that influence student performance within the context of a packaged program. If these variables are adequately controlled, more is taught in less time, and students’ learning is accelerated.”
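
To make the placement idea concrete, here is a minimal sketch, assuming a hypothetical placement-test score for each learner, of how small skill-based groups might be formed. The names, score bands, and group-size cap below are illustrative assumptions and are not drawn from any published DI curriculum.

```python
# Minimal sketch: placing learners into small instructional groups by
# current performance level, as DI placement testing suggests.
# All names, score bands, and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class Learner:
    name: str
    placement_score: int  # hypothetical placement-test score

def group_by_placement(learners, band_size=10, max_group=4):
    """Group learners whose placement scores fall in the same band,
    capping each instructional group at a small size."""
    bands = {}
    for learner in sorted(learners, key=lambda l: l.placement_score):
        band = learner.placement_score // band_size
        bands.setdefault(band, []).append(learner)

    groups = []
    for band, members in sorted(bands.items()):
        # Split any large band into small groups of at most `max_group`.
        for i in range(0, len(members), max_group):
            groups.append(members[i:i + max_group])
    return groups

if __name__ == "__main__":
    roster = [Learner("A", 42), Learner("B", 45), Learner("C", 71),
              Learner("D", 48), Learner("E", 39), Learner("F", 74)]
    for group in group_by_placement(roster):
        print([(m.name, m.placement_score) for m in group])
```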

Colvin & Engelmann Rubric for Direct Instruction  

There are four empirical outcomes that can be monitored to assess whether a DI program is being implemented correctly and, in turn, is increasing the learner’s performance. Meeting these outcomes indicates that students were appropriately assessed and placed in the correct DI program, and attests to the successful organization of the DI instructional materials (Colvin & Engelmann, 2006). The four main criteria to monitor during DI programming, as explained by Colvin & Engelmann (2006, p. 10), are listed below, followed by a brief illustrative sketch of how these checks might be expressed.

  1. Difficulty of Inducing Mastery: Create programs based on the level of difficulty appropriate for each individual learner. For example, a DI program with 8 total units would include lessons ordered sequentially, with the level of difficulty increasing only slightly with each new unit introduced. The information presented in each new unit is more complex, but the change is barely noticeable to the student. If the student is able to master the material within a day, the program meets the criterion of not being too difficult.
  2. Presentation Time per Track: Each unit in a DI program should include 3-4 tracks (e.g., mini-lessons or sequences), which can be thought of as “small-step intervals” (Colvin & Engelmann, 2006, p. 10). Each track should consist of 3 to 12 minutes of subject material and serve as one of the steps leading up to the next unit.
  3. Proper Pattern for Learning Related Content: A properly implemented and well-organized DI program will meet the criteria related to lesson difficulty and time spent learning material, which produces a downward trend, or curve, when displayed visually. The idea behind this curve is that the first unit should be organized so that subsequent units are less time-consuming and easier for the learner to acquire than if each were taught in isolation. If learning tracks are arranged so that their relationship to one another is capitalized on, then “the student does not have to learn later units from scratch, but simply learn what is unique about each” (Colvin & Engelmann, 2006, p. 13).
  4. Steps Required to Teach a Universe of Examples: The authors provide an example of a DI program that taught the same content as a non-DI program, for comparison. They analyzed the data to compare how many steps each program, DI and non-DI, required to reach mastery of the same content. Assuming each step required about the same amount of time, the DI program required far less time for the learner to master the content: the DI program broke the content into 12 steps, whereas the non-DI program had created 20 steps. Although the non-DI program met criteria 1 through 3, including not being too difficult, sectioning content into 3- to 12-minute segments, and achieving a curve showing each unit was relatively easier than the last, it was not considered efficient. “Instead it was parceled out too slowly, without organizing the context so that it could be communicated faster” (Colvin & Engelmann, 2006, p. 18).
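
As a rough illustration, the sketch below expresses the quantitative checks from criteria 2-4 as simple functions. The function names and sample values are illustrative assumptions (apart from the 3- to 12-minute track range and the 12- versus 20-step comparison taken from the article) and are not part of Colvin & Engelmann’s rubric.

```python
# Minimal sketch: expressing three of the rubric's quantitative checks as data.
# Sample track lengths and unit times below are illustrative only.

def tracks_within_range(track_minutes, low=3, high=12):
    """Criterion 2: each track should present 3-12 minutes of material."""
    return all(low <= minutes <= high for minutes in track_minutes)

def units_show_downward_curve(minutes_per_unit):
    """Criterion 3: later units should take no more time than earlier ones."""
    return all(later <= earlier
               for earlier, later in zip(minutes_per_unit, minutes_per_unit[1:]))

def relative_efficiency(di_steps, non_di_steps):
    """Criterion 4: fewer steps to cover the same universe of examples implies
    less total time to mastery (assuming roughly equal time per step)."""
    return non_di_steps / di_steps

if __name__ == "__main__":
    print(tracks_within_range([4, 6, 11, 9]))                 # True
    print(units_show_downward_curve([50, 45, 40, 40]))        # True
    print(relative_efficiency(di_steps=12, non_di_steps=20))  # ~1.67
```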

Colvin & Engelmann Rubric for Features of Direct Instruction

There are seven main features that make up a well-organized and efficient Direct Instruction program.

Stockard (2015) shared the thoughts of several researchers who examined variables that may directly influence why some DI programs have been more successful than others.

“The clear conclusion appears to be that DI works best when it is implemented as described, in other words, [practitioners] carefully follow the procedures and programs and when students fully master each element of the lesson and make regular progress through the programs” (Stockard, 2015, p. 8).

Colvin & Engelmann (2006) provide guidance on the seven components that make up a foundationally sound DI program (a brief data-model sketch follows the list):

  1. Presentation of Information: The practitioner provides the learner with information related to the subject matter, followed by a task to assess whether the learner has comprehended the information. The information should be clear, concise, and simple enough that the learner could repeat it after several attempts. 
  2. Tasks: The tasks should include questions or instructions that will require the learner to respond. The tasks will directly relate to the information just reviewed, or information the learner has previously acquired. 
  3. Task Chaining: This step requires the teacher to provide three or more tasks in an uninterrupted chain to analyze the learner’s acquisition of the information recently learned. 
  4. Exercises: An exercise may be larger than a task chain. If something is being taught for the first time, the exercise needs to include the presentation of instructions and information before one or more task chains are presented. If the exercise presents something the learner has already completed, then it is more of a review of previous information and may consist only of a task chain. 
  5. Sequences (Tracks): Within a sequence, exercises on the same subject matter are presented across successive lessons. The objective is for the easier terms and discriminations to be learned first and then integrated into more complex sequences.  
  6. Lessons: A lesson is a unit consisting of a sequence of tracks that ideally is completed on the same day. Lessons run about 30 minutes for the primary grades and 50 minutes for junior high and above. Lessons need to be organized and structured the same way across units; for example, the terms and examples must be concise, clear, and consistent across units. 
  7. Organization of Content: The organization of content is essential, as the main objective for each unit is to have learners acquire skills and knowledge that promote correct responding across any examples “within a specific universe of examples” (Colvin & Engelmann, 2006, p. 28). 
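
To tie the hierarchy together, here is a minimal data-model sketch, assuming hypothetical class and field names, of how the tasks, task chains, exercises, tracks, and lessons described above might be represented. The duration check simply mirrors the 30- and 50-minute lesson guidance and the 3- to 12-minute track range mentioned earlier; it is not part of the published rubric.

```python
# Minimal sketch of the DI hierarchy described above:
# tasks -> task chains -> exercises -> tracks (sequences) -> lessons.
# Class and field names are illustrative, not from the published rubric.
from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    prompt: str                 # a question or instruction requiring a response

@dataclass
class Exercise:
    information: str            # information presented before the tasks
    task_chain: List[Task]      # three or more tasks presented without interruption

@dataclass
class Track:
    subject: str
    exercises: List[Exercise]
    minutes: int                # each track covers roughly 3-12 minutes of material

@dataclass
class Lesson:
    tracks: List[Track]         # ideally completed on the same day

    def total_minutes(self) -> int:
        return sum(track.minutes for track in self.tracks)

    def fits_schedule(self, primary_grades: bool = True) -> bool:
        # Article guideline: about 30 minutes for primary grades,
        # about 50 minutes for junior high and above.
        limit = 30 if primary_grades else 50
        return self.total_minutes() <= limit
```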

Conclusion

Direct Instruction is an effective and efficient method for ABA practitioners, as it relies on data-driven systems whose outcomes can be visually displayed and analyzed.

“It is important to emphasize that a student does not passively absorb knowledge from the world around him but must play an active role, and also that action is not simply taking. To know is to act effectively, both verbally and nonverbally” (Burton et al., 2005, p. 41).

DI is an evidence-based and empirically validated instructional model with research demonstrating its success and effectiveness when the core components are followed with fidelity. Engelmann shared the philosophy that if a student is unable to learn and acquire new information, it is not the fault of the student but rather a direct reflection of the instruction provided. In conclusion, when ABA practitioners are confronted with a learner who is failing to acquire new skills, it may be beneficial to assess the interventions in place and consider revisiting the use of the DI model.

Christan Griffin, M.Ed., BCBA, LBA

Christan Griffin has worked closely with neurodiverse learners in a variety of settings for over a decade. Christan began her career in 2009 working 1:1 with a child diagnosed with Autism. This experience sparked a passion that ultimately led her to pursue her master’s degree in Special Education and certification as a Board Certified Behavior Analyst (BCBA). Christan has experience with curriculum development and program implementation for individuals diagnosed with Autism, as well as many of the common comorbid conditions, ranging in age from 18 months to 30 years old. Christan currently serves as the Interim Director of Training, Clinical Supervisor, and a Senior Clinician at Behavior Change Institute. Her responsibilities include the development of BCBA supervision and training content, providing direct support and consultation for BCBAs, and case management for the adult population. Outside of her daily clinical responsibilities, she is currently serving as a stakeholder on a committee conducting research through the Patient-Centered Outcomes Research Institute (PCORI), and has recently published on telehealth implementation of ABA treatment in the Journal of Applied Behavior Analysis. Christan has a 12-year-old son who was diagnosed with Autism at the age of 2. This personal experience, coupled with her clinical experience, continues to fuel her motivation to invest time and increase knowledge in the field of Applied Behavior Analysis.

References

  1. Anderson, D. M., & Keel, M. C. (2002, Winter). Using Reasoning and Writing to Teach Writing Skills to Students with Learning Disabilities and Behavioral Disorders. Journal of Direct Instruction, 2(1), 49-55. 
  2. Becker, W. C. (2001, November). Teaching Reading and Language to the Disadvantaged: What We Have Learned from Research. Journal of Direct Instruction, 47(4), 31-52. 
  3. Engelmann, S., & Colvin, G. (2006, February 7). Rubric for Identifying Authentic DI Programs. National Institute for Direct Instruction. Retrieved December 11, 2020, from https://www.zigsite.com/PDFs/rubric.pdf 
  4. Liem, G. A. D., & Martin, A. J. (2013). Direct instruction and academic achievement. In J. Hattie & E. Anderman (Eds.), International Guide to Student Achievement. Oxford: Routledge. 
  5. Magliaro, S., Lockee, B., & Burton, J. (2005). Direct instruction revisited: A key model for instructional technology. Educational Technology Research and Development, 53, 41-55. https://doi.org/10.1007/BF02504684 
  6. Stockard, J. (2015, January 18). A Brief Summary of Research on Direct Instruction. National Institute for Direct Instruction. Retrieved December 12, 2020, from https://www.nifdi.org/research/recent-research/whitepapers/1352-a-brief-summary-of-research-on-direct-instruction-january-2015/file.html 

This post is for informational purposes only and is not meant to be used in lieu of practitioners’ own due diligence, state and federal regulations, and funders’ policies. 
