As a new student, you probably have lots of questions about where to go to find the resources and information you need about living and learning at NC State. The New Student Checklist will help you start on the path to becoming the newest member of the Wolfpack family.
Note: If you are a first-year student accepted into the Spring Connection program, click here to view the Spring Connection New Student Checklist.
Confirm Your Enrollment
We can’t wait for you to come to campus and start enjoying the Wolfpack life!
- Log into your wolfPAW account to respond to our offer of admission under your To-Do List.
- The deadline to confirm your acceptance is:
Fall Semester: May 1 or within 10 days of receipt of your acceptance
Spring Semester: December 1 or within 10 days of receipt of your acceptance
Spring Connection: June 1
If you choose not to attend, please let Undergraduate Admissions know via your wolfPAW account.
Phone: 919.515.2434 (Undergraduate Admissions)
Access Your Unity Account
- Your Unity ID is an electronic identification key that allows you to access all of your records and services while associated with NC State. You need your Unity ID and password to access your email and campus services (e.g., eBilling, financial aid, Registration and Records) available via MyPack Portal at https://mypack.ncsu.edu.
- To obtain your Unity ID, log in to https://wolfpaw.ncsu.edu/ with your wolfPAW username and password. You will see your Unity ID Credentials (ID and Password) listed on the main page.
- Your Unity ID password is initially set to a default value: an 8-digit number composed of the last four digits of your ID number (found on the wolfPAW page) followed by your two-digit birth month and two-digit birth day (mmdd).
- Before you can log in to anything, you must create a password for your account. Please follow these instructions:
- Go to ncsu.edu/UIA to set up your security questions and answers.
- Once you have updated your security questions and answers, go to ncsu.edu/password to change your password to something only you know. (Enter your Unity Username, then your current password (default value). Create a new password and enter it twice.)
- Keep your password private. It is against university policy to share this information with anyone, including a parent or legal guardian.
Phone: 919.515.HELP (4357) (NC State Help Desk)
Complete the University Housing Application (*required for first-year students, optional for transfer students)
Living on campus is an essential part of the Wolfpack experience; that’s why students are required to live on campus their first year. However, transfer students are not required to live on campus, and other exceptions may apply for first-year students.
Once you have been offered acceptance at NC State, you should complete a University Housing application. This process is done through MyPack Portal using your student ID number (found on your acceptance letter from Undergraduate Admissions).
Select a Preferred Roommate and/or Residence Hall
The University Housing application gives students the option to select a preferred roommate and a preferred residence hall. If no roommate or residence hall is entered, University Housing will assign you based on the lifestyle questions submitted in your application.
Check out our 16 Living and Learning Villages!
Living and Learning Villages provide on-campus residents with an additional community based on interests, goals, and lifestyles. There is no additional cost to join a Village, but the benefits are endless. If you do plan to join a Village, there is a separate application in MyPack Portal.
Phone: 919.515.2440 (University Housing)
Get Connected to NC State
Now that you are part of the Pack, you need to get to know us better.
Apply for Scholarships and Financial Aid
The Office of Scholarships and Financial Aid (OSFA) assists students with applying for and securing financial assistance. You will be offered the best funding options available to you depending upon the timing of your application for aid, your level of financial need, and the availability of funding.
(*not applicable to international students)
The Free Application for Federal Student Aid (FAFSA) is available each October 1. Students who wish to be considered for federal, state, and institutional financial aid must submit a FAFSA. NC State University has a priority filing deadline of March 1; however, because grant funding is awarded on a first-come, first-served basis, it is important to submit the FAFSA and any other requested information as soon as possible. Complete the FAFSA at https://fafsa.gov.
Once you are offered acceptance to NC State University, you can begin applying for scholarships via our scholarship portal, PackASSIST. For more information, including deadlines, and to log on using your unity credentials, visit: https://go.ncsu.edu/PackASSIST
Phone: 919.515.NCSU (6278) (Scholarships and Financial Aid)
Provide Admissions with an Official Final Transcript
You must submit an official final transcript from all institutions you attended prior to enrolling at NC State. Final acceptance is contingent on your maintaining a level of academic performance that is equal to or better than your previous academic record.
- Failure to submit an official final transcript will result in the cancellation of your enrollment
- Mail transcripts to: NC State Office of Undergraduate Admissions, Campus Box 7103, Raleigh, NC 27695-7103
Phone: 919.515.2434 (Undergraduate Admissions)
Provide Proof of Health Insurance
Per state law, students must show evidence of existing creditable insurance coverage (coverage under another health plan), purchase a private insurance plan or enroll in the NC State-sponsored plan (Student Blue). Established students will be able to enroll or waive the insurance beginning June 1st (for fall semester) or November 1st (for spring semester).
Phone: 919.515.2563 (Student Health Services)
Complete the Medical and Immunization Forms
Before arriving at NC State University, students are required to submit their medical histories and proof of required immunizations.
North Carolina Law requires that NC State students submit proof of immunizations to Student Health Services within 30 days of acceptance at NC State University. For more information on the requirements, exceptions, exemptions and how to submit records visit https://healthypack.dasa.ncsu.edu/health-records/immunizationtb-compliance/
In addition to submitting immunization records, students are required to complete a Health History Form and TB Questionnaire. Both can be found on the HealthyPack Portal (https://sso.medicatconnect.com)
Phone: 919.515.2563 (Student Health Services)
Access Your Tuition and Fees Information
You will use MyPack Portal to view your eBILLs and detailed student account information. Bills will not be mailed.
- It is the responsibility of each student, whether or not they receive a statement, to make full payment or complete financial aid information prior to the deadline. Approved and accepted financial aid awards and sponsorships will be noted on the account. Only the amounts exceeding the award must be paid.
- View your bill and make an online payment via MyPack Portal. From the Student Homepage, select the Student Accounts tile. Click Billing Statements from the left menu to view your bill. Detailed account information is under Account History. Click the “Make a Payment” button to pay your charges. “What I Owe” gives you a current snapshot of your account.
- Tuition and Fees must be paid by the date posted on your billing statement to avoid class schedule cancellation.
- Fall billing is in July
- Spring billing is in November
- Summer I billing is in April
- Summer II billing is in May
- Grant parental access to student accounts through the MyPack Portal. Details are at go.ncsu.edu/parentaccess
- Sign up for direct deposit through the MyPack Portal to receive refunds. Details are at go.ncsu.edu/directdeposit
- Learn more about billing and payment options by visiting go.ncsu.edu/billing
- Reduce your debt or “payment pain” by enrolling in a monthly payment plan – go.ncsu.edu/mpp
- Waive or enroll in the NC State Insurance plan (if you don’t waive, you will automatically be enrolled and billed). Learn more at go.ncsu.edu/insurance
- If you decide not to attend NC State University you must notify Undergraduate Admissions (firstname.lastname@example.org) no later than the first day of classes (include your full name, student ID and date of birth in the email). You must also cancel your housing and dining plan if applicable. You will be responsible for all charges on your account including any charges for late cancellations.
Phone: 919.515.NCSU (6278) (Cashier’s Office)
Take Placement/Skills Assessments
First-year students: To ensure you are registered in the appropriate courses, placement/skills assessments are offered – and some required – in certain subjects such as math, English, chemistry, and foreign languages. To learn more about these assessments, click here.
- A math placement test is not required if you have transferable math credit; otherwise, you may be required to complete one
- Transfer students should review the transfer policies of the First-Year Writing Program
- Based on your needs, you may be required to complete placement tests in chemistry and/or foreign languages
Register to Attend New Student Orientation
You must register for Orientation through your wolfPAW account via https://wolfpaw.ncsu.edu/.
- Click on the “Register for Orientation” link under the “My To-Do List” section on the Dashboard.
- Read the instructions, select, and submit the Orientation you wish to attend.
- Only the session(s) available to you are offered as options. Review your Application Status screen in wolfPAW or your acceptance letter from Undergraduate Admissions if you are unsure of your college.
- Contact New Student Programs at 919.515.1234 immediately if you have a conflict with attending your designated Orientation session.
Parents and family members are also highly encouraged to attend Family Orientation. The program runs in conjunction with your Orientation session and is designed to introduce families to the academic programs and services available at NC State. Click here to learn how to register your family or call 919.515.1234 for more information.
Phone: 919.515.1234 (New Student Programs)
Review Your College’s Common Reading Program Assignment
(First-year students whose first semester is Summer/Fall only – does not apply to transfer students or first-year students whose first semester is Spring)
This year’s Common Reading was selected by a committee of faculty, staff, and students, as a way of introducing new students to institutional and academic values and expectations. You will receive a copy of the book when you come to your orientation (so do not buy it), and there will be activities related to the book throughout your first semester. For information about your college’s expectations and possible related assignments regarding the Common Reading click here.
Phone: 919.515.1234 (New Student Programs)
Sign up for Your Meal Plan (*required for first-year students living on campus, optional for transfer students)
NC State’s award-winning dining program offers a variety of food options that are convenient, fresh, diverse and delicious. From traditional all-you-care-to-eat dining halls to a mix of restaurants, cafes and convenience stores, our program is designed to meet the unique needs and taste of our student body. Whether you live on campus or commute, NC State Dining offers a variety of meal plans that can be tailored to your specific needs.
All first-year students (students entering college for the first time) living in on-campus housing are required to have a meal plan. To compare meal plan options and sign up, visit dining.ncsu.edu. You can make changes to your fall semester meal plan through September 30 and to your spring semester plan through January 31.
Phone: 1.800.701.4940 (NC State Dining)
Complete Alcohol and Hazing Prevention Online Education
All incoming first-year and transfer students under the age of 21 are required to complete an alcohol and other drug education program online. Transfer students who completed such a course at their previous institution may send these results to email@example.com in lieu of retaking the course. Students turning 21 on/before September 1, 2018 are exempt.
Additionally, all incoming first-year and transfer students, regardless of age, are required to complete a hazing prevention program online.
Those failing to complete the programs will have a hold placed on their records, and they will not be able to register for next semester classes. Instructions for completing the online programs will be available soon.
Website: Coming Soon
Phone: 919.513.3295 (Alcohol and Other Drug Prevention Education)
Review the Office of International Services (OIS) Pre-Arrival Guide for Immigration Information and Arrival Support (*required for all international students)
The Pre-Arrival Guide is available online here: https://internationalservices.ncsu.edu/pre-arrival-information/
Phone: 919.515.2961 (Office of International Services)
Review the Code of Student Conduct
The Code of Student Conduct serves as the basis for student behavioral expectations at NC State. It contains information related to the university’s jurisdiction over student behavior, academic and non-academic violations, sanctions available when a violation occurs, and information regarding interim suspension.
Academic violations include cheating, plagiarism, or aiding another to cheat or plagiarize. Non-academic violations vary widely and include rules concerning alcohol, drugs, infliction or threat of bodily harm, vandalism, disorderly conduct, sexual and racial harassment, sexual assault, and more.
All students are strongly encouraged to read and understand the Code of Student Conduct.
Phone: 919.515.2963 (Office of Student Conduct)
Request All AP/IB Test Scores Be Sent to NC State
A full list of the AP/IB credit policies at NC State is available online: https://admissions.ncsu.edu/credit-opportunities
- Test scores must be submitted to NC State before New Student Orientation
- All scores must be received directly from the testing agency. Undergraduate Admissions does not accept scores from high school or college transcripts.
- NC State’s AP Code is: 5496
Phone: 919.515.2434 (Undergraduate Admissions)
Update Your Home Address and Phone Number
It is very important that your address and phone information are kept current with the university.
Contact Disability Services
Contact the Disability Services Office (DSO) should you require academic accommodations during the year due to physical, mental health and/or learning disabilities.
Phone: 919.515.7653 (Disability Services Office)
Grant Parent/Guardian Access to Your Records
You can give your parent or guardian online access to view your financial (billing/payment) and academic information (schedule/grades) through the MyPack Portal. Details are at go.ncsu.edu/parentaccess
Request a College or Curriculum Change
(Does not apply to students whose first semester is Spring)
Requests to change the program into which you were accepted must be submitted to and approved by Undergraduate Admissions prior to June 1.
- Email your request to firstname.lastname@example.org and provide your complete name and date of birth.
- You cannot change your program after June 1 or during Orientation
Phone: 919.515.2434 (Undergraduate Admissions)
Review Options for Bringing a Computer
Most students bring a laptop or other computer when they come to NC State. Be sure to review the computer specifications at https://go.ncsu.edu/comp-specs and check with your individual department or college before purchasing your computer.
The Bookstore offers laptops from Apple, Lenovo, and Dell. Each model is custom configured to meet or exceed the University’s minimum requirements for student laptop purchases. Each model qualifies for education discounts, technical support, and warranty repair from the OIT Walk-In Center.
Phone: 919.515.2161 (NC State Bookstores)
Review Campus Parking/Alternative Transportation Information
Parking on campus requires a valid NC State Parking Permit. Permits are sold online beginning in early July for fall semester and late November for spring semester. Eligibility to purchase a permit is based on a student’s credit hours on file with Registration and Records. Freshmen may purchase a parking permit; however, freshmen are only eligible to park in the RS-Resident Storage lot, which has approximately 380 spaces.
Alternative transportation options include:
- Wolfline campus bus system
- GoPass access card to ride city/regional transit
- Zipcar car share service
- LimeBike Bicycle share program
- Personal Bicycle (registration mandatory)
Phone: 919.515.3424 (Transportation)
Keep Your Computer and Information Safe
Be careful about what information you share on all of your devices and never share your password with anyone, even your parents.
- Phishing attacks are common on campus; the NC State Help Desk will never ask you for your password.
- If you receive a suspicious email or other notice requesting your password, treat it as spam and delete it immediately.
- All of your technology devices, including smartphones and tablets, should be password-protected and kept up-to-date with security patches and antivirus protection, if available.
- You are strongly encouraged to enroll in Two-Factor Authentication to protect your data. NC State supports two services: Google 2-Step Verification for Google accounts, and Duo for Shibboleth-connected services like MyPack Portal and Wolfware. Visit go.ncsu.edu/2fa for more information.
Phone: 919.515.HELP (4357) (NC State Help Desk)
Purchase Your Textbooks
After you have registered for your classes, go to MyPack Portal https://mypack.ncsu.edu or to the NC State Bookstore’s website and order your textbooks. All you will need is your student ID to view your textbook list and order your books. The Bookstore offers a robust online ordering system that includes online price comparison, textbook rental, ebooks and textbook buyback. Students eligible for financial aid may also order their books online and the bill will be sent directly to the University Cashier’s Office.
Phone: 919.515.2161 (NC State Bookstores)
Get WolfTV for Your On-Campus Room
The Office of Information Technology provides an all-digital, comprehensive cable television package (WolfTV) for students living on campus.
Phone: 919.515.3153 (Office of Information Technology)
Consider Participating in One of NC State’s Additional Summer Programs
(Does not apply to students whose first semester is Spring)
There are a number of optional supplemental programs offered to assist students in their transition to the university. Consider participating in one (or more) of the following:
Wolf Camp & Wolfpack Bound
Open to all first-year and transfer students
Wolf Camp and Wolfpack Bound are outdoor adventure programs, offered by University Recreation, for incoming students. Wolf Camp is a three-day trip with approximately 50 other new students. For this trip, you’ll ascend the NC State challenge course and zipline, then head to a campground for campfires with s’mores, games to get to know other new students, and activities like kayaking, biking, and hiking.
Wolfpack Bound is a five-day trip with just 11 other new students and ventures to Hammocks Beach State Park on the NC coast for sea kayaking, relaxing on the beach and plenty of time to get to know other students and get valuable information on what college will be like.
Both trips are beginner-friendly and no outdoor experience is required! Your registration fee covers transportation, food, equipment, and experienced leadership by University Recreation Outdoor Adventures staff.
Symposium for Multicultural Scholars
Open to all first-year and transfer students
The purpose of the Symposium is to maximize the academic success of incoming African American, Hispanic/Latino, Asian American, Pacific Islander and Native American/Indigenous first-year students by providing information about opportunities that enhance your academic experience and knowledge of campus resources, multicultural faculty and staff, cultural heritage, networking, and other strategies for success. The Symposium fosters a sense of community and provides a foundation to ensure academic success on our college campus.
GLBT Symposium
Open to all first-year and transfer students
The GLBT Symposium is a half-day welcome and community-building experience designed to provide incoming students with information about the GLBT Center, annual events, and ways to get involved. Students will learn about GLBT student organizations on campus, opportunities to volunteer through the GLBT Center, and GLBT resources in the local community. Incoming students will get a chance to connect with each other and hear from returning students about what it’s like to be GLBT at NC State. This annual event is a great way for students to make new friends, enjoy free food, and have some fun. Lunch will be provided.
Summer Start
Open to all first-year and transfer students
Summer Start assists new first-year and transfer students with the transition to NC State through five weeks of college courses and campus involvement during Summer Session II. Summer Start works closely with each college to ensure students will be enrolled in academic courses towards their specific major to get them on the accelerated path to graduation. In addition to up to eight credit hours of university coursework, many optional social, academic, leadership, and service programs are planned throughout the week and on weekends. Summer Start will provide a strong introduction to the culture of NC State and a jump start on your academic degree requirements. Take advantage of this unique opportunity to get ahead with a smaller community of your peers.
First Year Inquiry in Prague
Open to all first-year students
FYI in Prague, hosted at the NC State European Center in Prague, is a two-week program that merges the high-impact experiences of a First Year Inquiry course and study abroad. Students earn 3 GEP credits that count toward their degree before the start of their first semester on campus. Students will be immersed in inquiry-guided learning, using the historic and beautiful city of Prague as their classroom.
Children with Attention-Deficit/Hyperactivity Disorder (ADHD) typically experience clinically significant impairment in the school setting as evidenced by lower school grades and achievement scores and higher rates of school dropout in comparison to their peers (DuPaul & Stoner, 2003; Frazier et al., 2007). Emerging evidence suggests that organizational skills problems characteristic of children with ADHD are strongly associated with academic impairment. Organizational skills is a broad term that encompasses both the ability to manage materials and belongings (e.g., transfer of homework assignments to and from school) and time (e.g., planning ahead to ensure adequate time is spent studying). Parent and teacher ratings of materials management and planning behaviors have been shown to predict school grades, with materials management behaviors predicting grades above and beyond the impact of intelligence (Langberg, Epstein et al., 2011). Further, parent ratings of homework materials management in elementary school have been shown to predict grade point average (GPA) in high school (Langberg, Molina et al., 2011). The association between homework materials management and academic performance is present even after controlling for stimulant medication use and receipt of school services (Langberg, Molina et al., 2011).
Problems with organization tend to increase in severity as children progress through school (Booster, DuPaul, Eiraldi, & Power, 2010; Langberg et al., 2010). In particular, problems with organization often escalate following the transition to middle school (Evans, Serpell, & White, 2005). The transition to middle school is marked with numerous environmental changes and represents a significant challenge for children with externalizing behavior problems (Langberg, Epstein, Altaye et al., 2008; Moilanen, Shaw, & Maxwell, 2010). A greater number of teachers, increased demands for independence, and larger workloads make the transition to middle school difficult (Evans, Langberg, Raggi, Allen & Buvinger, 2005; Evans, Serpell et al., 2005). Middle school children with ADHD frequently lose homework assignments or fail to turn them in on-time, misplace school materials such as books, pencils, and classwork, and procrastinate and fail to adequately prepare for tests (Evans et al., 2009; Langberg, Epstein et al., 2011).
Given the association between organizational skills and academic performance (Langberg, Vaughn et al., 2011), and the fact that medication does not normalize these problems (Abikoff et al., 2009), psychosocial interventions have been developed. Organizational skills interventions have typically focused on academic aspects of organization, such as classroom preparation, homework management, and managing time during and after school, in addition to the physical organization of school materials. Strategy and skills training are typically the core features of organizational interventions for children with ADHD. Behavioral therapeutic techniques such as rehearsal, prompting, shaping and contingency management are used to teach and promote skills use and their generalization.
Most organizational skills interventions include point systems or token economies to monitor and reward adherence to a structured organizational skills system (see Langberg, Epstein, & Graham, 2008 for a review). On a periodic basis, children are awarded points for meeting operationalized goals. Points are typically applied towards purchasing rewards. The ultimate goal of all programs is to reduce the frequency of monitoring and overt reward and/or to transfer monitoring and reward responsibilities from the clinician to school staff or to a parent/guardian. To this end, many organizational skills programs for children include intervention with parents/guardians or school mental health (SMH) providers (e.g., Gureasko-Moore, DuPaul & White, 2006; 2007; Pfiffner et al., 2007). Parents or school staff are trained to take over the monitoring of organization and application of rewards in an effort to promote skills generalization.
Organizational skills training has been included as part of a number of multicomponent interventions for children with ADHD (e.g., Evans, Langberg et al., 2005; Evans, Serpell, Schultz, & Pastor, 2007; Evans et al., 2009; Hechtman et al., 2004; Pfiffner et al., 2007; Power et al., under review). These interventions are multicomponent because in addition to targeting organization and time management, they often target behavior problems, social skills, and other educational skills (e.g., study skills). These multicomponent interventions have been shown to lead to significant improvements in interpersonal functioning and organizational skills (Evans et al., 2009; Pfiffner et al., 2007; Pfiffner et al., 2011) as well as decreases in parent and teacher ratings of overall academic impairment (Evans, Langberg et al., 2005). Given that it is difficult to disentangle the specific impact of organization skills training versus other interventions in multicomponent studies, the literature review below focuses on interventions designed specifically to target organizational skills.
Gureasko-Moore, DuPaul, and White (2006, 2007) used a multiple baseline design to examine the efficacy of self-management training for improving the organizational skills of young adolescents with ADHD. Participants were taught to monitor and record their own classroom preparation and homework behaviors daily on checklists. Participants reviewed the checklists with an SMH provider and operationalized goals for improvement. The efficacy of this intervention was evaluated across two studies using three and six middle school students (mean age = 12), respectively. Post-intervention, all participants were completing classroom preparation behaviors nearly 100% of the time. Similarly, participants exhibited low percentages of homework behaviors at baseline (range = 18–66%) and improved to nearly 100% by completion of the 6-week intervention.
Abikoff and Gallagher (2008) pilot-tested a 10-week, 20-session clinic-based individual intervention designed to improve physical organization of materials, time-management, assignment tracking and planning skills. Twenty children in grades 3–5 diagnosed with ADHD received the intervention delivered by clinical psychologists. This pilot study focused on evaluating feasibility, acceptability and effectiveness, and so no comparison group was included. Participants made significant improvements on parent and teacher ratings of organizational skills and on parent ratings of homework performance. Parents and teachers were highly satisfied with the intervention and attendance was high with no children dropping out of the intervention. In addition, the investigators recently completed a large randomized trial of the organizational skills intervention. Participants (N = 158) in grades 3–5 were randomly assigned to three conditions, including one of two different organizational skills interventions or a waitlist comparison. Preliminary results show that children with ADHD in both of the organizational skills intervention groups made significant gains according to parent and teacher rated organizational skills, homework problems and academic proficiency (Abikoff et al., 2011).
Langberg, Epstein, Urbanowicz, Simon, and Graham (2008) evaluated the efficacy of an 8-week intervention called the Homework, Organization, and Planning Skills (HOPS) intervention for middle school age students with ADHD. Thirty-seven students (mean age = 11) were randomly assigned to receive the HOPS intervention (n = 24) or to a waitlist comparison (n = 13). The intervention focused on improving participants’ physical organization (i.e., bookbag, binder, and locker) and homework management (i.e., accurate homework and test recording and planning) and was delivered by undergraduate college students as a school-based after-school program. The intervention included two parent training sessions that focused on transferring behavior monitoring responsibilities and contingency management to the home setting. According to parent ratings, intervention participants in this study made large gains in materials organization and homework management relative to the comparison, and these improvements were largely maintained at an 8-week follow-up. Further, participants in the intervention group made small to moderate improvements in overall GPA. Teachers rated minimal improvements in academic performance that were not statistically significant.
In summary, organizational skills interventions appear to be highly effective at improving organization and time management skills and homework problems in children and young adolescents with ADHD. Youth with both ADHD-Inattentive Type and ADHD-Combined Type have been included in prior studies and, to date, there is no evidence for differential intervention effectiveness. There is also some evidence that these improvements translate into gains in overall academic performance as measured by teacher ratings and school grades (e.g., Abikoff et al., 2011; Langberg et al., 2008). However, the primary limitation of the organizational skills intervention work completed to date is that the interventions have been implemented by trained research staff under controlled conditions. For example, in the Langberg et al. (2008) study, research staff received in-depth training and daily observation and supervision to promote high levels of treatment fidelity. Failure to evaluate interventions as implemented in their intended settings by community providers has been identified as one of the primary barriers to successfully disseminating evidence-based treatments (Chorpita, 2003; Weisz, Jensen, & McLeod, 2004). If organizational skills interventions are to be widely disseminated, they must be feasible for clinicians/schools to implement using existing infrastructure (e.g., staff and time; Kataoka, Rowan, & Hoagwood, 2009). Weisz and colleagues (Weisz, 2000; Weisz et al., 2004) proposed the Deployment Focused Model (DFM) as a method of developing treatments that can overcome the research to practice gap. This model suggests that effectiveness research should take place early in the intervention development process with intervention protocols piloted in their intended settings. As part of this process, feedback should be gathered from stakeholders regarding feasibility of implementation and modifications made to the protocol to increase the potential for widespread dissemination. 
The intervention is then tested, typically using randomized trial methodology, to determine if the modified protocol can be implemented in the intended setting with fidelity and produce clinically significant improvements in participant functioning. Assessment of treatment fidelity is a critical component of effectiveness research in order to gauge the amount of training and supervision that will be necessary when the intervention is disseminated.
With this goal in mind, Langberg, Vaughn et al. (2011) modified and refined the HOPS intervention for young adolescents with ADHD so that it could be feasibly implemented by SMH providers during the school day. Using an open trial design, SMH providers (N = 10) from three separate school districts implemented the HOPS intervention, each with one middle school student with ADHD. SMH providers and teachers participated in focus groups and provided feedback on ways to improve the feasibility and usability of the HOPS intervention. These qualitative data, along with a review of audio-recorded HOPS sessions, were used to systematically refine the HOPS intervention protocol. A number of substantial changes were made, including adding scripts for SMH providers to use to engage students in session, devoting additional sessions to troubleshooting, increasing the frequency of rewards provided for skills implementation, and moving parent sessions earlier in the intervention (see Langberg, Vaughn et al., 2011 for further detail).
The purpose of the present study is to complete an evaluation of the refined HOPS intervention using a randomized controlled design. As in the two previous studies of HOPS, the primary dependent measures were ratings of homework problems and organizational skills. It was hypothesized that participants in the HOPS intervention group would demonstrate significantly greater improvements in homework problems and organizational skills in comparison to participants in a waitlist comparison group. It is also critical that studies of organizational skills interventions evaluate change in more distal outcomes, in order to demonstrate that improvements in organizational skills impact academic performance. Accordingly, school grades were also examined in this study. In keeping with a focus on feasibility and potential for dissemination, SMH providers working in local school districts were recruited to implement the refined HOPS intervention. SMH providers were provided with the HOPS treatment manual but did not receive formal consultation from research staff during intervention implementation. Therefore, another important aspect of this study is to preliminarily evaluate SMH providers’ ability to implement the HOPS intervention with fidelity.
Schools and SMH Providers
Seventeen SMH providers (seven school psychologists and ten school counselors) from five school districts and twelve distinct schools were recruited to participate in this study. The school districts involved in the study were diverse, with urban, suburban, and rural districts represented. The three urban schools in this study each had a >90% minority student body, with >85% of students receiving free or reduced lunch. SMH providers were recruited through a series of face-to-face meetings with the first author. At these meetings, SMH providers were told that they would receive a copy of the HOPS treatment manual (Langberg, 2011) and a $100 honorarium for their participation. In addition, SMH providers were told that they would receive new school materials for each participant they provided intervention to (e.g., school binder, folders, and paper) and that incentives earned by participants would be provided by the study. The SMH provider participation rate was 100% at three of the five districts where presentations were made. Specifically, at those three districts, all middle school counselors and school psychologists in the district participated. In the fourth district, there were two middle schools, and the SMH providers at one of the two schools agreed to participate. The fifth district was a large urban district, and a single, 10-minute presentation was made to all 36 school psychologists who served middle school students. Interested school psychologists were asked to follow up by calling the first author, and 4 of 36 called and signed consent to participate (11%). As a condition of participation, SMH providers each had to agree to work with a minimum of two students at their school. This was to allow random assignment of participants to occur at the SMH provider level. For example, if an SMH provider worked with two study participants, one was randomly assigned to intervention and the other to waitlist comparison.
All of the SMH providers who participated were female and Caucasian. The SMH providers were diverse in terms of age (M = 39; SD = 12.7; Range = 27–66), educational background (N = 7 Ed.S.; N = 7 M.A.; N = 3 M.Ed.), and years of service (M = 10.1; SD = 7.8; Range = 1–26).
All student participants (N = 47) were in grades 6–8 with an age range of 11–14 (see Table 1 for additional student demographics). Students were referred to the study by the SMH providers. Specifically, SMH providers were provided with recruitment flyers which described the study and stated that students in grades 6–8 with attention problems and academic difficulties and/or students with a diagnosis of ADHD were eligible to participate. SMH providers then contacted the parents/guardians of students they thought would be a good fit for the study. Parents who called study staff to express interest in participation were scheduled for an inclusion/exclusion evaluation if their child met the phone screen criteria (≥4 of 9 symptoms of inattention endorsed over the phone or a previous diagnosis of ADHD). Sixty-three families completed an inclusion/exclusion evaluation, and 47 met full inclusion/exclusion criteria and were enrolled. To be included in the study, students had to meet DSM-IV criteria for a diagnosis of ADHD-Inattentive Type or Combined Type and have an estimated full scale IQ > 75. Diagnosis was determined using a combination of a structured interview administered to the parent, the Diagnostic Interview Schedule for Children – IV (DISC-IV; Shaffer, Fischer, Lucas, Dulcan, & Schwab-Stone, 2000), and teacher ratings on a DSM-based scale, the Vanderbilt ADHD Teacher Rating Scale (VATRS; Wolraich, Feurer, Hannah, Baumgaertel, & Pinnock, 1998). To be eligible for participation, students had to meet criteria for ADHD on the DISC-IV and have at least four symptoms in one domain endorsed as often or very often on the VATRS. Children with comorbid conditions were included in the study (see Table 1) unless they met criteria for Bipolar Disorder, Psychotic Disorder, or Substance Dependence. Full scale IQ was estimated using four subtests from the Wechsler Intelligence Scale for Children – 4th Edition (WISC-IV; Wechsler, 2003).
Participants were randomized at the SMH provider level to receive the intervention immediately (at the beginning of the school year) or to a waitlist comparison condition that would receive the intervention as soon as the SMH provider finished working with intervention participants. The interventions that participants on the waitlist received were determined by the SMH provider in collaboration with the family. Specifically, the SMH provider and family could decide to implement all of HOPS, parts of HOPS, or to try a different intervention or accommodation. To ensure that equivalent numbers of students in the intervention and comparison groups were on ADHD medication, random assignment was completed blocking on ADHD medication status (see Table 1). For example, if an SMH provider was working with four students and two of them were taking ADHD medications, random assignment was blocked to ensure that only one of the two medicated students was assigned to the intervention condition. The median number of participants assigned to each SMH provider was three (M = 2.76; Range = 2–5). The study was approved by the IRB, and SMH providers and parents consented, and children assented, to participate in the study.
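The blocking procedure described above can be sketched briefly in code. This is an illustrative sketch only, not the study's actual randomization protocol; the function and variable names are hypothetical.

```python
import random

def blocked_assignment(caseload, seed=None):
    """Assign one SMH provider's students to conditions, blocking on
    ADHD medication status so medicated students are split as evenly
    as possible between intervention and waitlist (illustrative)."""
    rng = random.Random(seed)
    assignments = {}
    for medicated in (True, False):
        # Form a block of students sharing the same medication status.
        block = [s for s, status in caseload.items() if status == medicated]
        rng.shuffle(block)
        # Alternate conditions within the shuffled block to balance groups.
        for i, student in enumerate(block):
            assignments[student] = "intervention" if i % 2 == 0 else "waitlist"
    return assignments
```

Because conditions alternate within each medication-status block, a provider with two medicated and two non-medicated students always ends up with one medicated student per condition.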
Parents and teachers completed ratings for both the intervention and comparison groups pre- and post-intervention. Parents and teachers also completed a 3-month follow-up for students in the intervention group. Two teachers, Math and Language Arts, completed ratings for each participant.
Homework Problems Checklist (HPC; Anesko, Schoiock, Ramirez, & Levine, 1987)
Homework completion and homework materials management behaviors were assessed using the 20-item parent-completed HPC. For each item, parents rate the frequency of a specific homework problem on a 4-point Likert scale (0 = never, 1 = at times, 2 = often, 3 = very often). Higher scores on the measure indicate more severe problems. The measure has excellent internal consistency, with alpha coefficients ranging from .90 to .92 and corrected item-total correlations ranging from .31 to .72 (Anesko et al., 1987). Factor analyses indicate that the HPC has two distinct factors (Langberg et al., 2010; Power et al., 2006) measuring homework completion behaviors (HPC Factor I) and homework materials management behaviors (HPC Factor II). These factors are consistent across general education and clinical samples. Example items from Factor I (Homework Completion) include: a) Must be reminded to sit down and start homework; b) Daydreams during homework; c) Doesn’t complete work unless someone does it with him/her; and d) Takes an unusually long time to complete homework. Example items from Factor II (Homework Materials Management) include: a) Fails to bring home assignments and materials; b) Forgets to bring assignments back to class; and c) Doesn’t know exactly what has been assigned. In the present study, internal consistencies were high (Factor I α = .87, Factor II α = .88).
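The internal consistency values reported above are Cronbach's alpha coefficients, which can be computed directly from the item-level Likert ratings. The sketch below is illustrative (the function name and data layout are mine, not from the study); rows are respondents and columns are the 0–3 item ratings.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances
    divided by variance of total scores). Rows = respondents,
    columns = items. Illustrative sketch only."""
    k = len(item_scores[0])  # number of items

    def variance(values):
        # Sample variance (n - 1 denominator).
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[j] for row in item_scores]) for j in range(k)]
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

When items are perfectly correlated, alpha approaches 1.0; values of .87–.88 like those reported for the HPC factors indicate that item responses covary strongly.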
Children’s Organizational Skills Scale (COSS; Abikoff & Gallagher, 2008)
The COSS is a measure of organization, planning, and time-management skills that has parent, teacher, and child versions. The COSS yields three subscale scores that have been validated through factor analysis: Task Planning, Organized Actions, and Memory and Materials Management. Items on the Task Planning subscale relate to children’s proficiency with planning out the steps needed to complete tasks in order to meet deadlines. Items on the Organized Actions subscale relate to children’s use of tools (e.g., planners and calendars) and strategies (e.g., lists) to accomplish tasks. Items on the Memory and Materials Management subscale relate to whether children lose items and how well they manage their materials (e.g., bookbags, binders, and supplies). The items from these subscales can be combined to generate a COSS Total Score. There are also two additional subscales, Life Interference and Family Conflict, which assess for the presence of functional impairment due to organizational skills problems. Scoring the COSS generates raw scores for each subscale, and these raw scores were used in the analyses. The raw scores can be converted to T-scores, with scores > 60 indicating a clinically significant problem. T-scores between 60 and 69 are considered elevated (more problems than typical) and scores > 70 are considered very elevated (many more concerns than typical). Internal consistency for the items included in the COSS Total Score, as reported in the COSS Technical Manual (Abikoff & Gallagher, 2008), is high for the parent version (.98) and teacher version (.97). Test-retest reliability for the three COSS subscales is also high for the parent (.94–.99) and teacher (.88–.93) versions. In the present study, each participant’s parent/guardian and Math and Language Arts teachers completed the COSS. The COSS subscales had adequate internal consistencies in the present study (parent αs = .74–.93; Language Arts teacher αs = .89–.96; Math teacher αs = .82–.94).
Vanderbilt ADHD Diagnostic Parent Rating Scale (VADPRS)
The VADPRS is a DSM-IV-based scale that includes all 18 DSM-IV symptoms of ADHD. Parents rate how frequently each symptom occurs on a 4-point Likert scale (0 = never, 1 = occasionally, 2 = often, 3 = very often). The VADPRS produces an Inattention score (sum of the nine inattention items), a Hyperactivity/Impulsivity score (sum of the nine hyperactive/impulsive items), and a Total score. The VADPRS has excellent psychometric properties (Wolraich et al., 2003), and internal consistencies were high in the present study (Inattention α = .92, Hyperactivity/Impulsivity α = .96, Total ADHD α = .94).
At the end of the school year, report cards containing school grades were collected for all study participants. All of the districts involved in the study used the same scale for grades where A = 4.0, A− = 3.7, B+ = 3.3, B = 3.0, B− = 2.7, etc. Grade point average (GPA) was calculated as the average of participants’ core class grades (math, science, history, language arts). Participants’ overall GPA served as the criterion variable in the analyses.
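The GPA computation follows directly from the grading scale described above. In the sketch below, the grade-point mapping extends the stated A through B− values in the conventional way; the extension below B− and the function name are illustrative assumptions, not taken from the study.

```python
# Grade-point mapping matching the districts' common scale (A = 4.0,
# A- = 3.7, B+ = 3.3, ...); values below B- are a conventional extension.
GRADE_POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7,
                "C+": 2.3, "C": 2.0, "C-": 1.7, "D+": 1.3, "D": 1.0,
                "F": 0.0}

def core_gpa(grades):
    """Average grade points across the core classes
    (math, science, history, language arts)."""
    return sum(GRADE_POINTS[g] for g in grades) / len(grades)
```

For example, core grades of A, B, B+, and A− yield a GPA of 3.5.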
Parent Skills Implementation Questionnaire
At the 3-month follow-up, parents of intervention participants completed a brief questionnaire asking them to indicate if they continued to monitor and reward their child’s use of the HOPS skills. Specifically, parents were asked if they had been: (1) monitoring their child’s homework assignment completion; (2) checking their child’s planner for homework recording accuracy; and (3) monitoring their child’s materials organization using the HOPS organization checklist. If parents answered yes to any of the above questions, they were asked to indicate how often per week they were monitoring and if rewards and/or consequences were being provided.
Parent Satisfaction
A nine-item satisfaction questionnaire was modified and used in this study (Langberg, Vaughn et al., 2011). The majority of items assessed parent satisfaction with specific components of the HOPS intervention. For example, parents were asked to rate the level of communication between the SMH provider and the parent and how well the binder organization system worked for their child. In addition, parents responded to more general questions about overall satisfaction with the intervention. Parents indicated their agreement with each statement on a 5-point Likert scale (0 = strongly disagree, 1 = disagree, 2 = neutral, 3 = agree, 4 = strongly agree). Statements were phrased so that higher scores represented greater satisfaction (e.g., “I found the two parent meetings with my child’s school counselor/psychologist to be helpful”).
SMH Provider Satisfaction
The SMH providers completed a satisfaction questionnaire after implementing the intervention. All items were Likert-type items, and SMH providers were asked to indicate whether they strongly disagreed, disagreed, were neutral, agreed, or strongly agreed with ten statements. Example items included: “The HOPS interventions were feasible to implement in the school setting,” “The HOPS treatment manual was user friendly and easy to follow,” and “I am likely to use this intervention again for students in the future.” All items were scored from 0 (strongly disagree) to 4 (strongly agree), with higher scores representing greater satisfaction.
Organizational Skills Checklist
The Organizational Skills Checklist has been utilized in a number of treatment outcome studies with adolescents with ADHD (e.g., Evans et al., 2009; Langberg et al., 2008). This checklist consists of 14 operationalized criteria for binder (7 criteria), bookbag (4 criteria), and locker (3 criteria) organization. Example items include: 1) There are no loose papers in the bookbag; and 2) All papers in the binder are filed in the appropriate class section. SMH providers completed the organizational skills checklist at the beginning of every HOPS session and recorded either “Yes” or “No” to indicate whether participants met each criterion. The organizational skills checklist was used to assess treatment fidelity in this study. Specifically, research staff completed the checklist independently from the SMH provider and agreement was calculated. In addition, SMH provider checklists from all HOPS sessions were collected at the end of the intervention period and reviewed for accuracy.
HOPS Components Checklist
The HOPS Components Checklist was developed for this study. Each of the 16 sessions in the HOPS treatment manual was reviewed by research staff and the first author. A separate checklist was created for each HOPS session, as some sessions included more steps than others (e.g., time-management is introduced in later sessions). The number of criteria on each checklist ranged from 8 to 11, depending on the session. Example items include: 1) SMH provider completed the time-management checklist; 2) SMH provider reviewed the evening schedule completed last session with the student; 3) SMH provider spent time helping the student troubleshoot difficulties with the organization system; and 4) SMH provider introduced and explained the self-management checklist. The checklist also asked research staff to record how long the session took. SMH providers did not have access to the HOPS Components Checklists, as they were not included in the treatment manual. When research staff observed HOPS sessions, the HOPS Components Checklist was completed as a measure of fidelity.
SMH providers received the HOPS intervention manual (Langberg, 2011) to review at the beginning of the school year and began implementing the HOPS intervention with children assigned to the intervention group in September 2010. The first author met individually with each of the SMH providers for 1 hour prior to intervention implementation. Half of this meeting was spent reviewing study procedures. Example issues discussed included: when ratings would be administered, when SMH providers could start working with students in the waitlist group, and how treatment fidelity observations would be scheduled. During the second half of this meeting, the first author provided an overview of the HOPS intervention treatment manual and procedures. Specifically, the first author outlined when each particular skill would be introduced (e.g., organization versus time management) and demonstrated how to complete the progress monitoring checklists provided in the HOPS manual. These are the checklists SMH providers use to monitor participant progress with organization, teacher initials, and time management skills and also to track the number of points earned. SMH providers were informed during the consent process that, in order to test the feasibility and usability of the HOPS manual, the first author would not provide any ongoing consultation while they were implementing the interventions.
The HOPS intervention delivered in this study was an individual (i.e., 1:1), 16-session intervention, delivered during the school day, with each session designed to last no longer than 20 minutes. Initial sessions occurred twice weekly and then moved to once a week for the last six sessions. As a result, the 16 sessions can be completed over an 11-week period. The specific skill areas targeted with the intervention did not change from the Langberg, Vaughn et al. (2011) study to the current study. Three main skill areas were covered: school materials organization, homework recording and management, and planning/time-management. Materials organization and homework recording and management skills were introduced first, and time-management/planning was introduced second.
For materials organization, the SMH provider taught the student a specific system of bookbag, school binder, and locker organization. The student also was taught to implement an organization system for transferring homework materials to and from school. For homework recording and management, the SMH provider taught the student how to accurately and consistently record homework assignments, projects and tests in a planner. In the planning/time-management portion of the program, SMH providers taught students how to break projects and studying for tests down into small, manageable pieces, and how to plan for the timely completion of each piece. Participants were also taught how to plan out after school activities using an evening schedule to balance extracurricular activities and school responsibilities. Skills instruction was completed by session 10, after which the SMH providers met with students once per week and focused on problem-solving difficulties and self-monitoring and maintaining skills (for further details about the HOPS intervention see Langberg, 2011).
The HOPS intervention included a point system. SMH providers completed skills tracking checklists at every intervention session that included operationalized definitions of materials organization and homework management. At each HOPS session, students’ materials (e.g., binder, bookbag, and planner) were visually inspected by the SMH provider. Students received points for each criterion they met on the skills tracking checklists (e.g., no loose papers in bookbag = 1 point). In later sessions, the SMH providers also completed a checklist containing operationalized definitions of time-management, and the student earned points for effectively planning and studying for tests and projects (e.g., recorded a test in the planner = 1 point; designated a time to study for the test = 1 point). These points accumulated and students traded in the points for gift card rewards.
The HOPS intervention included two 1-hr parent meetings. These meetings were held at the school and included the SMH provider, the student, and one or both parents. The first meeting took place early in the intervention and was designed to orient the parent/guardian to the program. The second meeting took place near the completion of the intervention. The goal of the second parent meeting was to teach the parent how to manage the HOPS checklist completion and reward responsibilities once the intervention period ended. Parents learned about the point system and worked with the SMH provider to establish a plan for providing home-based rewards.
All SMH providers consented to having one randomly selected HOPS session observed and audio-taped. SMH providers were not told which sessions would be observed until the week the session was held. Study staff spread out the fidelity observations to ensure that HOPS sessions 2–15 were each observed at least once. There were three separate processes for evaluating fidelity to the intervention procedures outlined in the HOPS manual. First, HOPS intervention component checklists were developed that listed the specific topics to be covered by the SMH provider in each intervention session. Study staff completed these checklists during the observed sessions to evaluate SMH providers’ fidelity to the intervention procedures. Second, during session observations, study staff completed the relevant skills checklists (e.g., organizational skills checklist) independent of the SMH provider. Each checklist contains a number of operationalized criteria (e.g., the organization checklist contains 14 criteria), and the SMH provider indicates yes/no whether the student met each criterion. Agreement between the study staff checklists and the SMH provider checklists was examined. Third, all SMH provider-completed checklists were photocopied at the end of the intervention. This allowed study staff to evaluate SMH providers’ fidelity to completing the checklists to monitor and reward progress with organizational skills at all intervention sessions, as specified in the HOPS manual.
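The agreement check in the second fidelity process reduces to a simple proportion of matching yes/no ratings across checklist criteria. A minimal sketch, with an illustrative function name (the study does not specify its exact agreement formula):

```python
def percent_agreement(provider_ratings, staff_ratings):
    """Proportion of checklist criteria (e.g., the 14 organization
    criteria) on which the SMH provider and study staff recorded
    the same yes/no rating. Illustrative sketch."""
    assert len(provider_ratings) == len(staff_ratings)
    matches = sum(p == s for p, s in zip(provider_ratings, staff_ratings))
    return matches / len(provider_ratings)
```

For instance, two raters agreeing on 3 of 4 criteria yields .75 agreement.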
Prior to intervention effect analyses, baseline equivalence between groups was examined using independent sample t-tests and chi-square analyses. Next, repeated measures multivariate analyses of variance (MANOVAs) were conducted to examine main effects of group (intervention and comparison), time (pre- and post-intervention), and group × time interactions for the primary dependent measures (i.e., COSS and HPC). Four MANOVAs were conducted for parent ratings and one MANOVA was conducted for teacher ratings. The four MANOVAs for parent ratings included: (1) the two HPC factors (Homework Completion and Materials Management); (2) the three COSS factors that make up the COSS Total Score (Task Planning, Organized Actions, Memory and Materials Management); (3) the two COSS impairment factors (Life Interference and Family Conflict); and (4) the two subscales from the Vanderbilt ADHD Rating Scale (Inattention Total Score and Hyperactivity/Impulsivity Total Score). The MANOVA for teacher ratings included the three COSS factors (Task Planning, Organized Actions, Memory and Materials Management). The Math and Language Arts teachers’ ratings were entered simultaneously into the teacher MANOVA. For all MANOVAs, when group × time interactions were significant, effects at the subscale level were examined using repeated measures ANOVAs with Bonferroni corrections. When two follow-up tests were conducted (e.g., for the two subscales on the HPC), statistical significance was set to .025, and when three follow-up tests were conducted (e.g., the three subscales on the COSS), statistical significance was set to .017. Eta-squared (η²) effect sizes were calculated to represent the magnitude of the group × time interactions, and Cohen’s d effect sizes were calculated using standardized mean difference scores to examine the magnitude of between-group differences (Kline, 2004).
For Cohen’s d effect sizes, .20 is considered small but likely meaningful, .50 medium, and .80 large (Cohen, 1988). For η² effect sizes, .01 is considered small, .06 medium, and .14 large (Cohen, 1988). We also conducted paired sample t-tests to examine whether intervention participants’ gains evident at post-intervention were maintained at the 3-month follow-up.
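For concreteness, the pooled-standard-deviation form of Cohen's d is sketched below. Note that this is the textbook between-groups formula; the study computed d from standardized mean difference scores (Kline, 2004), so its exact computation may differ.

```python
import math

def cohens_d(group1, group2):
    """Standardized mean difference between two independent groups,
    using the pooled standard deviation (textbook form; illustrative)."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Sample variances (n - 1 denominators).
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd
```

A group whose mean sits one pooled standard deviation below a comparison group's mean yields d = −1.0, a large effect by Cohen's benchmarks.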
In addition to the primary analyses, we wanted to explore the impact of the intervention on more distal measures of functioning. Therefore, in secondary analyses we also examined the impact of the intervention on school grades. Independent sample t-tests were used to compare overall GPA between the intervention and comparison groups for the 1st and 2nd quarters of the school year (i.e., the intervention period) and Cohen’s d effect sizes were calculated. GPA during the 3rd and 4th quarters was also examined as part of the follow-up analyses. Independent sample t-tests were used to examine GPA because no baseline grade data were available. Specifically, participants had received more than a full month of intervention prior to the end of Quarter 1.
This study evaluated the effectiveness of the HOPS intervention for young adolescents with ADHD as implemented by SMH providers during the school day. Forty-seven middle school students with ADHD were randomly assigned to receive the HOPS intervention or to a waitlist comparison group. Intervention participants demonstrated significant improvements relative to comparison participants across parent-rated materials management and planning skills, life interference due to organizational skills problems, and homework problems outcomes. These effects were largely maintained at a 3-month follow-up assessment relative to the post-intervention timepoint. Intervention participants also had significantly higher GPAs than the comparison group during the intervention period and their GPAs did not decline during the post-intervention period. In contrast to parent ratings, significant effects on organizational skills were not observed on teacher ratings. Preliminary evidence also suggests that SMH providers were able to implement the intervention with fidelity despite the fact that no formal ongoing consultation was provided.
These findings further support the effectiveness of the HOPS intervention as implemented by SMH providers (Langberg, Vaughn et al., 2011). Similar to previous work, effects in this study were found on parent ratings but not on teacher ratings (Langberg et al., 2008, 2011). Further, the magnitude of between group effects on homework problems in this study as implemented by SMH providers (HPC Total Score; d = .83) was similar to the effects found in the previous randomized trial implemented by trained and supervised research staff (d = .71; Langberg et al., 2008). This study adds to previous work evaluating the efficacy of the HOPS intervention for young adolescents with ADHD by using a randomized controlled design along with SMH provider implementation.
The findings that SMH providers were able to implement the HOPS intervention without formal ongoing supervision or consultation, and that SMH providers found the intervention feasible to implement during the school day, are perhaps the two most important findings from this study. Typical randomized controlled trials use research staff to implement the intervention. Interventionists often receive weekly supervision to ensure that protocols are followed closely. Further, psychosocial interventions tested in randomized trials are often time and resource intensive (Chorpita, 2003; Weisz et al., 2004). As a result, evidence-based psychosocial interventions are rarely disseminated into community settings (Kataoka et al., 2009). When they are disseminated, fidelity is often an issue, either because interventions need to be modified so that they are feasible to implement, or because the community providers do not have the training, supervision, and/or infrastructure necessary to implement the procedures (Frazier, Formoso, Birman, & Atkins, 2008; Schoenwald & Hoagwood, 2001; Weisz, Donenberg, Han, & Kauneckis, 1995).
The HOPS intervention was specifically designed and refined with these dissemination concerns in mind. For example, during the development of the HOPS intervention, SMH providers indicated that it would not be feasible to have parents attend more than two sessions. Therefore, while it might be ideal to include more parent sessions, only two sessions were included. It is important to note that attendance at the parent meetings in this study was 100%, with at least one parent/guardian attending both sessions for all intervention participants. This finding lends credence to SMH provider input regarding parent involvement in school-based interventions.
During intervention development, SMH providers also indicated that sessions needed to be fewer than 20 minutes in length if the intervention was to be implemented during the school day. While longer sessions would allow the intervention to be delivered over a shorter period of time, based on this input, the manual was written so that each session should take no longer than 20 minutes to implement. In this study, the mean session length was 22.5 min, with some sessions taking as few as 10 min. The fact that SMH providers were able to implement the intervention during the school day is promising from a dissemination perspective. These findings also demonstrate the value of involving community-based providers in intervention development, and of conducting effectiveness work under real-world conditions, prior to completing large-scale efficacy trials. Such a model runs counter to the traditional efficacy-first approach to intervention development but may result in more evidence-based interventions reaching the community.
In terms of resources, implementation of HOPS requires SMH provider time, space to implement the intervention, and a source of rewards for students. In the current study, students were provided with gift cards as rewards for consistently implementing materials organization and planning skills. Outside of the context of a research study, SMH providers may not have access to funds for gift cards and may need to use other types of rewards. The HOPS manual suggests that the SMH provider create a rewards menu listing multiple reward options that do not cost money, such as playing a game with the SMH provider, a “get out of homework free” pass, or time on a computer or video game system (Langberg, 2011). SMH providers also received a 1-hr meeting with the first author prior to implementing the intervention, and approximately 30 minutes of that time was spent orienting the SMH provider to the treatment manual and checklists. It is currently unclear whether the 1-hr meeting or the provision of gift cards are critical components of the HOPS intervention, and future research will need to examine these questions.
The finding that intervention participants had significantly higher school grades than comparison participants strengthens the evidence supporting the efficacy of the HOPS intervention because school grades are less subject to rater biases. Further, the fact that no significant effects were found on teacher ratings, yet intervention participants had higher school grades and parent-rated improvements in functional impairment, supports the assertion that middle school teachers may not be able to accurately rate the constructs of organization and time management (Evans, Allen, Moore, & Strauss, 2005; Langberg, Vaughn et al., 2011). Specifically, middle school teachers may not have sufficient opportunity to observe what students record in their planners or how they organize their backpacks and lockers, given the brief amount of time students spend in each class and the large number of students per class. Alternatively, it may be that the effects generated by the HOPS intervention are not large enough to be noticed by teachers or did not meet teacher expectations. Additional research is needed to determine what types of behaviors middle school teachers are able to accurately rate, perhaps by comparing teacher ratings to objective skills observations, or by providing “don’t know” options on rating scales.
Intervention-related improvements in parent-rated materials management, organized actions, and homework completion during the intervention were largely maintained at the 3-month follow-up (see Figures 1 & 2), and school grades did not decline during the follow-up period. It is possible that this maintenance of gains was due to the fact that many parents continued to monitor and reward the HOPS skills. At the 3-month follow-up assessment, 80% of intervention group parents indicated that they continued to monitor their child’s assignment completion and homework assignment recording accuracy on a frequent basis. Fifty-five percent of parents also indicated that they were monitoring their child’s use of organizational skills by completing the HOPS organizational skills checklist multiple times each week. Many of the parents in the sample also reported that they were providing rewards and consequences for their child’s use of the homework and/or organizational skills. Another possible explanation for the maintenance of effects over time is that the HOPS manual encourages SMH providers to add frequent monitoring of organization and time management skills, using the checklists, to students’ IEPs and 504 plans. However, this hypothesis cannot be tested with the data collected in this study.
In this study, randomization was completed blocking on ADHD medication status to ensure that equal numbers of students on and off medication were assigned to the HOPS and comparison groups. Medication changes made during the intervention period were also tracked (see Table 1). A stronger design would have been to control for the impact of ADHD medication in the analyses or to evaluate whether ADHD medication status moderated outcomes; however, the sample size in this study was not sufficient for these types of analyses. Similarly, it would be important to control for other types of school or therapeutic services that students may have received (see Table 2).
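The blocked randomization procedure described above can be sketched as follows. This is a minimal illustrative sketch, not the study's actual procedure; the function name, the arm labels, and the example sample sizes are assumptions introduced for illustration.

```python
import random

def blocked_randomization(participants, block_on, seed=0):
    """Randomly assign participants to two arms, blocking on a
    covariate so each arm receives an equal share of each level.

    Illustrative only: names and arm labels are hypothetical."""
    rng = random.Random(seed)
    arms = {"HOPS": [], "waitlist": []}
    # Form one block per level of the blocking covariate
    # (here, ADHD medication status), shuffle each block,
    # then alternate assignment within the block so the two
    # arms end up balanced on that covariate.
    for level in {p[block_on] for p in participants}:
        block = [p for p in participants if p[block_on] == level]
        rng.shuffle(block)
        for i, p in enumerate(block):
            arm = "HOPS" if i % 2 == 0 else "waitlist"
            arms[arm].append(p)
    return arms

# Hypothetical sample: 48 students, 24 on medication.
sample = [{"id": i, "medicated": i < 24} for i in range(48)]
groups = blocked_randomization(sample, "medicated")
# Each arm now contains 24 students, 12 of whom are medicated.
```

Within each block the assignment is random (because the block is shuffled first), but the alternation guarantees exact balance on the blocking covariate whenever a block has an even number of participants.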
Parents and teachers were involved in the intervention and therefore could not be blind to condition; accordingly, rater biases may be present. Further, because the comparison condition was a waitlist, the potential impact of nonspecific therapeutic effects (e.g., the SMH provider/student relationship) cannot be accounted for. It will be important for future studies to compare the HOPS intervention to an active comparison group in which students receive the same amount of therapist attention. An active comparison group may also reduce rater bias, as both groups of parents would be expecting to see improvements. Further, group differences on school grades must be interpreted with caution because baseline equivalence could not be established, although it is also worth noting that the intervention and waitlist control groups did not differ on standardized measures of IQ and academic achievement.
Another important limitation is that the SMH providers volunteered to participate in this study and therefore may represent a uniquely motivated group of school practitioners; the results may not generalize to SMH providers as a group. Further, SMH providers took part in the process of selecting students to participate in the intervention. As such, the findings may not generalize to all middle school students with ADHD. It could be that the middle school students in this study were selected because they had particular difficulties with organizational skills or fewer difficulties in other areas (e.g., learning problems). However, it should be noted that the participants in this study were recruited from a diverse group of schools and were relatively diverse in terms of race, comorbid mental health disorders, and parent education level and income (see Table 1).
Finally, treatment fidelity was assessed through live observation of randomly selected sessions for each SMH provider. Although SMH providers were given short notice that they were going to be observed, having an observer present may have changed their behavior (i.e., the Hawthorne effect). A stronger method of assessing fidelity would have been to audio-record all sessions and to complete component checklists based upon those recordings. However, that approach would not have permitted assessment of checklist completion accuracy, which requires that an observer complete checklists independently. Future research with the HOPS intervention needs to assess fidelity as a multidimensional construct (Sanetti & Kratochwill, 2009), including examining session length as a potential predictor of outcomes (Nock & Ferriter, 2005).
The HOPS intervention appears to have considerable promise as an effective school-based intervention for improving the organizational skills of adolescents with ADHD. Larger studies of the HOPS intervention are necessary to answer questions about moderators and mediators of treatment response. It may be that the HOPS intervention works well for some students but less well for others. For example, it may be that students with severe oppositional defiant behaviors or with comorbid learning disorders respond less well to the intervention or need a higher intervention dose to achieve a clinically meaningful response. In this study, while participants made large improvements in homework problems according to parent ratings, there was still additional room for improvement, and a longer intervention may be necessary in some cases. It is also possible that the HOPS intervention could be applied to a broader group of students than those with ADHD, and could potentially have a larger impact. In terms of mediation, it will be important to evaluate mechanisms of change within the HOPS intervention. For example, it may be that students’ use of certain skills (e.g., time management) drives improvements in overall school performance. It is also possible that student perception of the SMH provider or satisfaction with the intervention plays an important role in predicting outcomes. A limitation of this study is that we only assessed SMH provider and parent satisfaction and did not evaluate satisfaction from the students’ perspective. It will also be critically important for future research to compare the HOPS intervention to an active comparison group to account for potential nonspecific therapeutic effects. It would be useful to compare the HOPS intervention to the types of services typically provided in school settings to address problems with homework and organization (e.g., a homework tutoring condition).
The HOPS intervention has now undergone a systematic process of evaluation and refinement during which stakeholder input was gathered at multiple points. The hope is that, by focusing on the feasibility of intervention delivery with treatment fidelity up front, the HOPS intervention will be able to overcome the oft-cited research-to-practice gap following proof of efficacy. The HOPS intervention appears promising for improving the organizational skills and academic performance of students with ADHD. Additional research comparing HOPS to an active control group, and with a stronger evaluation of fidelity, is needed before efficacy can be firmly established.
Joshua M. Langberg, Virginia Commonwealth University and Cincinnati Children’s Hospital Medical Center.
Jeffery N. Epstein, University of Cincinnati School of Medicine and Cincinnati Children’s Hospital Medical Center.
Stephen P. Becker, Miami University.
Erin Girio-Herrera, Cincinnati Children’s Hospital Medical Center.
Aaron J. Vaughn, Cincinnati Children’s Hospital Medical Center.
- Abikoff H, Gallagher R. Assessment and remediation of organizational skills deficits in children with ADHD. In: McBurnett K, Pfiffner L, Elliott G, Schachar R, Nigg J, editors. Attention Deficit/Hyperactivity Disorder: 21st Century Perspective. New York: Marcel Dekker, Inc; 2008. pp. 137–152.
- Abikoff H, Gallagher R. Children’s organizational skills scales: Technical manual. North Tonawanda, NY: Multi-Health Systems Inc; 2008.
- Abikoff H, Gallagher R, Wells KC, Murray DW, Petkova E, Shook SE, Stotter R. Improving organizational functioning in ADHD children: Evaluation of skills- and performance-based interventions. Poster presented at the biennial meeting of the International Society for Research in Child and Adolescent Psychopathology (ISRCAP); Chicago, IL. 2011.
- Abikoff H, Nissley-Tsiopinis J, Gallagher R, Zambenedetti M, Seyffert M, Boorady R, McCarthy J. Effects of MPH-OROS on the organizational, time management, and planning behaviors of children with ADHD. Journal of the American Academy of Child and Adolescent Psychiatry. 2009;48:166–175.
- Anesko KM, Schoiock G, Ramirez R, Levine FM. The Homework Problem Checklist: Assessing children’s homework problems. Behavioral Assessment. 1987;9:179–185.
- Booster GD, DuPaul GJ, Eiraldi R, Power TJ. Functional impairments in children with ADHD: Unique effects of age and comorbid status. Journal of Attention Disorders. 2010 doi: 10.1177/1087054710383239. Advance online publication.
- Chorpita BF. The frontier of evidence-based practice. In: Kazdin AE, Weisz JR, editors. Evidence-based psychotherapies for children and adolescents. New York: Guilford Press; 2003. pp. 42–59.
- Cohen J. Statistical power analysis for the behavioral sciences. 2. Hillsdale, NJ: Lawrence Erlbaum; 1988.
- DuPaul GJ, Stoner G. ADHD in the schools: Assessment and intervention strategies. 2. New York: Guilford Press; 2003.
- Evans SW, Allen J, Moore S, Strauss V. Measuring symptoms and functioning of youth with ADHD in middle schools. Journal of Abnormal Child Psychology. 2005;33(6):695–706.
- Evans SW, Langberg J, Raggi V, Allen J, Buvinger E. Development of a school-based treatment program for middle school youth with ADHD. Journal of Attention Disorders. 2005;9:343–353.
- Evans SW, Schultz BK, White LC, Brady C, Sibley MH, Van Eck K. A school-based organization intervention for young adolescents with Attention-Deficit/Hyperactivity Disorder. School Mental Health. 2009;1:78–88.
- Evans SW, Serpell ZN, Schultz B, Pastor D. Cumulative benefits of secondary school-based treatment of students with ADHD. School Psychology Review. 2007;36:256–273.
- Evans SW, Serpell Z, White C. Attention! (CHADD) 2005. Jun, The transition to middle school: Preparing for challenge and success; pp. 29–31.
- Frazier SL, Formoso D, Birman D, Atkins MS. Closing the research to practice gap: Redefining feasibility. Clinical Psychology: Science & Practice. 2008;15:125–129.
- Frazier TW, Youngstrom EA, Glutting JJ, Watkins MW. ADHD and achievement: Meta-analysis of the child, adolescent, and adult literatures and a concomitant study with college students. Journal of Learning Disabilities. 2007;40:49–65.
- Gureasko-Moore S, DuPaul GJ, White GP. The effects of self-management in general education classrooms on the organizational skills of adolescents with ADHD. Behavior Modification. 2006;30:159–183.
- Gureasko-Moore S, DuPaul GJ, White GP. Self-management of classroom preparedness and homework: effects on school functioning of adolescents with attention deficit hyperactivity disorder. School Psychology Review. 2007;36:647–664.
- Hechtman L, Abikoff H, Klein RG, Weiss G, Respitz C, Kouri J, Pollack S. Academic achievement and emotional status of children with ADHD treated with long-term methylphenidate and multimodal psychosocial treatment. Journal of the American Academy of Child and Adolescent Psychiatry. 2004;43:812–819.
- Kataoka SH, Rowan B, Hoagwood KE. Bridging the divide: In search of common ground in mental health and education research and policy. Psychiatric Services. 2009;60:1510–1515.
- Kline RB. Beyond significance testing. Washington, DC: American Psychological Association; 2004.
- Langberg JM. Homework, Organization and Planning Skills (HOPS) Interventions: A Treatment Manual. Bethesda, MD: National Association of School Psychologists (NASP) Publications; 2011.
- Langberg JM, Arnold LE, Flowers AM, Epstein JN, Altaye M, Hinshaw SP, Hechtman L. Parent-reported homework problems in the MTA study: Evidence for sustained improvement with behavioral treatment.