
Letter from the OKLA Chair
Rebecca Marie Farley, Chair, Oklahoma Literacy Association

Hello, OKLA!!!
My name is Rebecca Marie Farley. I am an assistant professor of education at Oklahoma Baptist University in Shawnee, Oklahoma, and cherish the opportunity to be Chair of OKLA for 2021-2022. This year is number six for me in higher education, where I teach P3 Reading Methods, 4-8 Reading Methods, Reading Assessment, Cornerstone for Teacher Education, Reading Clinical, and Reading Practicum. Prior to teaching in higher education, I taught 4th grade, kindergarten, and first grade, and served as a Reading Specialist/Title I/Literacy Resource Specialist for PK-6th grade for 22 years in a public school here in Oklahoma. I am a former National Board Certified Teacher in Literacy: Reading/Language Arts/Early and Middle Childhood.
I earned my bachelor's degree in Elementary Education from Oklahoma Baptist University, my master's degree as a Reading Specialist from the University of Central Oklahoma, and my doctoral degree in Early Childhood with an emphasis in Literacy from North Central University. My husband, Tracy, and I will celebrate 39 years of marriage on Christmas Eve 2022. We have three daughters, three sons-in-law, and seven grandchildren whom we dearly love. I look forward to getting to know many of you better.
I am excited to serve alongside you in our common endeavor of promoting literacy across our great state. I strongly believe in the power of literacy. It opens doors for
our children that cannot be opened in any other way: doors of nurturing, love-filled moments of sharing books with family in various forms, such as snuggled up on someone's lap or as part of a bedtime routine. Literacy nourishes our children's minds, grows their knowledge, and sets the foundation for their academic success in all areas. Literacy empowers adults to explore and expand their life opportunities and experiences through employment options and community involvement, and it supports well-informed, literate citizens in actively participating in political arenas at all levels, from voting to holding office. Literacy contributes to the economy by forging a productive, critically thinking workforce that is innovative and creative. Literacy empowers people spiritually by giving them firsthand access to read scriptures, seek truth, and develop beliefs. Literacy allows people to improve their mental and physical health as they learn new things such as healthy eating, recipes, workout routines, and medicinal information. It is a high calling and responsibility that we share: supporting readers of all ages to success and triumph.
It is with this strong belief in literacy that I encourage and challenge you to keep doing the stellar job of promoting literacy that has been the tradition of OKLA. Encourage your local and state members to become active participants in OKLA in the post-COVID era. I challenge you to become actively involved in your local and state chapters at a new level to stretch and grow yourself as a professional and literacy advocate. Serve on committees, run for office, sponsor local events that engage your community in literacy, promote the awards, scholarship, and grant opportunities provided by OKLA, or submit a proposal for a breakout session at our state conference coming up in April. If you do not have a local in your area, gather a group of literacy advocates and form a new local, or partner with an existing local that can connect with you virtually if the distance is too great to attend meetings in person.
Mark your calendar now to attend our conference, currently planned as an in-person event, with keynote speaker Dr. Tim Rasinski on April 2. Visit the OKLA website at www.oklahomaliteracy.org regularly for updates and information to share with your locals. Join and follow the Oklahoma Literacy Association Facebook group
to network with other OKLA members.
If you are serving in higher education, I encourage you not only to be active in your local literacy organization, but also join the Oklahoma Higher Education Reading Council (OHERC). Reach out to current chair, Dana Oliver, dana.oliver@swosu.edu, current treasurer, Martie Young, MLYoung@nwosu.edu, or myself, rebecca.farley@okbu.edu, and we will get you connected. Also search OHERC on Facebook to join that Facebook group.
Let us come together and make it an outstanding year of literacy promotion and advocacy for our state. If you have questions or ideas, please contact me at rebecca.farley@okbu.edu.

Dr. Tim Rasinski, OKLA Keynote Speaker, April 2, 2022
Teddy D. Roop and Kathleen S. Howe
Using the Active View of Reading to Inform RtI and Diverse Reader Profiles
Diverse learner profiles exist for students who experience challenges learning to read. Students vary in their unique characteristics and require tailored, targeted interventions and teachers who are equipped to assess and design instruction that meets their individual needs. This reality calls for a more complete model, such as the Active View of Reading (AVR) proposed by Duke and Cartwright (2021), versus a more limited one, such as the Simple View of Reading (SVR; Gough & Tunmer, 1986). Yet the SVR (Gough & Tunmer, 1986) is included in recent dyslexia legislation and drives teachers' understanding of literacy assessment and instruction for diverse learners. Use of an incomplete model of reading is problematic for several reasons. First, research consensus does not exist for a single definition of dyslexia (Johnston & Scanlon, 2020). Second, a diverse range of reader profiles does exist (Valencia & Riddle Buly, 2004). However, the International Dyslexia Association's (IDA; 2021) definition of dyslexia has shaped recent legislation across the country and is driving decisions at district and state levels about literacy and reading instruction for all students. Such mandates can result in over- or misidentification of students as "at risk" for or as having dyslexia. In addition, the literacy assessments and prescribed instructional approaches are not necessarily the best match to identify and address diverse learners (Johnston & Scanlon, 2020). Teachers need to know and understand a more active view of reading and how to select from a range of assessment tools and instructional practices to design targeted instruction for diverse reader profiles.
This article proposes use of a broader view of reading, the Active View of Reading (AVR; Duke & Cartwright, 2021). In addition, it discusses identifying and assessing diverse reader profiles through the lens of the AVR and Response to Intervention (RtI). Lastly, an activity used with preservice teachers is shared that provides them with the opportunity to assess a diverse reader profile and design targeted instruction from a broader, more complex view of reading.
Going Beyond a Limited Model of Reading
Distinctive reader profiles have primarily been documented based on a collection of studies across the science of reading that reflect a reader's proficiency with the code and ability to comprehend. The Simple View of Reading (SVR; Gough & Tunmer, 1986) describes and represents reading comprehension linearly as the product of two separate components: decoding and listening comprehension. As noted by Duke and Cartwright (2021), the reading research field has expanded its understanding of these terms and their contribution to reading comprehension since their earlier introduction, and now recognizes the importance of using broader constructs (i.e., word recognition and language comprehension) (see Cervetti et al., 2020; Hoover & Tunmer, 2020; Scarborough, 2001). The narrow constructs within the SVR often result in an overemphasis on decoding and on teaching reading skills in isolation. Furthermore, this model does not address the interactive role of the reader, such as executive function, other factors involved in the act of reading, such as culture, or the interplay between word recognition and language comprehension (Duke & Cartwright, 2021; Rosenblatt, 1993).
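Stated as an equation (the abbreviations below are the shorthand commonly used in discussions of the SVR, not notation introduced by this article), the model can be summarized as

RC = D × LC

where RC is reading comprehension, D is decoding, and LC is linguistic (listening) comprehension. Because the relationship is a product rather than a sum, a reader with essentially no decoding skill or no language comprehension is predicted to show no reading comprehension at all (Gough & Tunmer, 1986).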
A more recent model by Duke and Cartwright (2021), the Active View of Reading (AVR), expands on the SVR through consideration of the more broadly recognized constructs of word recognition and language comprehension and their role in reading comprehension. More specifically, Duke and Cartwright's model unpacks each of these concepts to highlight three key understandings about reading that are known from the broader science of reading but not included within the SVR. These understandings include the need to go beyond decoding, the interaction between word recognition and language comprehension, and the role of the reader within the complex act of reading comprehension.
First, Duke and Cartwright (2021) note that decoding skills and phonics instruction are necessary parts of word recognition but in and of themselves are not sufficient for reading comprehension. The field's knowledge and the science of reading have expanded and now recognize the role of a broader construct known as word recognition that includes phonological awareness/phonemic awareness, alphabetic principle, phonics knowledge, decoding skills, and sight word recognition. Each of these plays an important role in automatic word recognition, which is necessary to build and support reading fluency. It is critical for classroom teachers to understand this and avoid a singular emphasis on phonics instruction. In addition, they must know how to assess all word recognition skills.
Second, Duke and Cartwright (2021) highlight the importance of the ways in which word recognition connects to language comprehension to support reading comprehension. The researchers note multiple studies that identify shared variance, or overlap, between word recognition and language comprehension in the prediction of reading comprehension (Duke & Cartwright, 2021). The fact that shared variance exists challenges the SVR model (Gough & Tunmer, 1986), which is linear and suggests that word recognition (decoding) must occur before language comprehension and that there is no interplay between the two. Duke and Cartwright (2021) argue that the reported shared variance suggests a great deal of interaction between the two constructs and therefore lends support to the need for their Active View of Reading, and that it is imperative for practitioners to be able to teach "within" and "across" word recognition and language comprehension (Duke & Cartwright, 2021, p. 4). Duke and Cartwright (2021) label the connection between word recognition and language comprehension as "bridging factors," which include print concepts, reading fluency, vocabulary knowledge, morphological awareness, and graphophonological-semantic cognitive flexibility, or the ability to consider letter-sound-meaning simultaneously. It is these "bridging factors," as explained within Duke and Cartwright's AVR (2021), that allow for the negotiation between the simple word-calling of print and the meaning-making from print. This model allows practitioners to differentiate instruction that meets the varying skill levels of diverse readers within and across constructs, rather than to assume all readers' difficulties lie within one or the other and are primarily rooted in decoding.
Third, as noted by Duke and Cartwright (2021), the AVR model includes the widely accepted understanding from the science of reading research that readers and what they bring to the reading of text play an important role in reading comprehension, an understanding that the SVR model ignores.
Duke and Cartwright (2021) propose that the reader brings unique levels of motivation and engagement, executive functioning skills, and strategy use that impact word recognition and language comprehension. In addition, the AVR model includes cultural and other content knowledge, reading-specific background knowledge, verbal reasoning, language structure, and theory of mind as part of the language comprehension construct. Teachers need to be aware of these factors within Duke and Cartwright’s (2021) expanded understanding of language comprehension and recognize that they impact what individual readers bring to the act
of reading, as well as how they relate to and perceive messages within the text. Today's teachers work with an increasingly diverse learner population (e.g., cultural, linguistic, socioeconomic) (Hanson, 2021), and this diversity shapes the lens through which teachers plan and deliver reading instruction. Fourth, and most importantly, Duke and Cartwright (2021) note that components of the AVR model are "instructionally malleable" (p. 10), meaning practitioners play an important role by using the model to assess and design their teaching to meet the individual needs of diverse learners. Together, these four points result in a more complex view of reading and provide an essential pathway for identification and instruction of diverse readers.
Identifying Diverse Reader Profiles
The reading process is complex, and the reader is dynamic, resulting in the need for diverse profiles. Literature over several decades describes readers as struggling with word recognition, reading comprehension, or both. The AVR (Duke & Cartwright, 2021) includes the role of reading fluency as a key bridging component between word recognition and reading comprehension (Pikulski & Chard, 2005; Rasinski & Samuels, 2011). Too often, readers are profiled as experiencing the same struggles according to the three most common categories (word recognition, reading comprehension, or both), yet they are assigned separate labels by different researchers. Readers whose struggles lie in the word recognition realm are typically profiled with Specific Word Reading Difficulties (SWRD; Spear-Swerling, 2016) or dyslexia (Kilpatrick, 2016). The category of those who struggle to comprehend but decode proficiently is labeled Specific Reading Comprehension Difficulties (SRCD) or hyperlexia (Kilpatrick, 2016). For students struggling with both, the profiles are labeled Mixed Reading Difficulties (MRD) or compensator type (Kilpatrick, 2016). Labeling the same profile with different terminology does not foster effective communication and collaboration among professionals (i.e., speech-language pathologists, special education teachers, classroom teachers, administrators, reading specialists) working across various fields but providing services to the same child. Additionally, these profiles are too broad to be useful, especially if we accept the tenets of the AVR (Duke & Cartwright, 2021). However, the fluency component and underlying reading skills are further delineated by Valencia and Riddle Buly (2004), who add the dimensions of accuracy and automaticity to the word recognition level, thus providing more nuanced diverse reader profiles (see Figure 1).
The International Dyslexia Association's (IDA; 2021) definition of dyslexia has shaped recent legislation across the country and is driving decisions at district and state levels about literacy and reading instruction for all students. This narrow definition of dyslexia and the equally narrow definition of the science of reading result in more students identified as "at risk" for or as having dyslexia, requiring them to receive a specific type of reading instruction (Johnston & Scanlon, 2020). This is problematic for several reasons. First, research consensus does not exist for a single definition of dyslexia (Johnston & Scanlon, 2020). Second, a documented and well-explained diverse range of reader profiles does exist (Valencia & Riddle Buly, 2004). The diverse learner profiles for students who experience challenges learning to read vary in their unique characteristics and require tailored, targeted interventions, not a one-size-fits-all phonics program. The term dyslexia appears only in Kilpatrick's (2016) reader profiles. Spear-Swerling (2016) and Valencia and Riddle Buly (2004) do not use the term dyslexia, but only the latter specifies unique characteristics in reader performance related to poor word recognition.
Figure 1
Diverse Reader Profiles
Cluster: Need
Automatic Word Callers: Decoding quickly but not reading for meaning.
Struggling Word Callers: Decoding and word meaning.
Word Stumblers: Slow reading rate and difficulty decoding but good comprehension.
Slow Comprehenders: Slow reading rate but accurate decoding and good comprehension.
Slow Word Callers: Accurate but slow decoding and difficulty with comprehension.
Disabled Readers: Difficulties in all three areas: word identification, fluency, and comprehension.
Note: The six reader profile clusters define the targeted area for reading development (from Valencia & Riddle Buly, 2004).
Stanovich (1988) points out that there is a difference between dyslexia, a phonological core deficit, and "garden variety" poor readers, who show a global deficit in a variety of domains, typically associated with language (p. 602). The challenge in using the dyslexia label, especially in light of IDA's vague definition, is the lack of an "empirical basis for the use of the term to distinguish a group of children who are different from others experiencing difficulty acquiring literacy" (International Literacy Association, ILA; 2016, p. 8). While early identification of reading difficulties is crucial for determining appropriate and effective interventions, early identification of dyslexia "contributes nothing beyond that awareness" (Johnston & Scanlon, 2020, p. 18). In addition, a narrow view of reading with an over-emphasis on phonics instruction leaves several reader profiles without the targeted instruction they need. The National Reading Panel (NRP; 2000) recommended "that systematic phonics instruction should be integrated with other reading instruction to create a balanced reading program. Phonics instruction is never a total reading program" (pp. 2-96 to 2-97). A more helpful approach is to focus on the unique characteristics of readers through a more broadly defined view of reading and reading instruction that addresses readers' strengths and needs as determined by assessment data. This approach will result in more effective instruction for all, including students with dyslexia.
Assessing Diverse Reader Profiles
Since the process of reading is complex and readers’ profiles are diverse, a sophisticated level and depth of professional knowledge is required to assess and instruct in a way that best
meets a student's individual reading needs. It is important to consider how assessment informs instruction in light of these complexities. The way in which assessment is used to inform teaching and learning directly links it to achievement (Valencia, 2011). One approach to addressing reading difficulties is already in place: the RtI framework (Scanlon, 2019). It provides guidance for implementation following either a standard or a problem-solving protocol that establishes "appropriate use of evidence-based instruction across tiers, [and] it should in principle decrease the numbers of children incorrectly identified as disabled" (Fuchs & Fuchs, 2006, p. 96). RtI is structured in tiers that include: (a) Tier 1, which is high-quality classroom instruction, screening, and small groups for all students; (b) Tier 2, which is interventions for students in need of targeted instruction; and (c) Tier 3, which is intensive interventions and comprehensive evaluation for those students still not responding to instruction provided within Tiers 1 and 2 (RTI Action Network, 2021). The RtI cycle begins with administration of screening assessments in Tier 1, followed by diagnostic and progress-monitoring assessments in Tier 2 that allow teachers and reading specialists to navigate appropriate targeted instruction tailored to specific reader profiles.
In line with the AVR (Duke & Cartwright, 2021), engagement plays an important role when providing effective interventions. One-size-fits-all and scripted programs call to light the role of student and teacher engagement within reading instruction. Harn and colleagues (2017) found that teachers who solely focused on fidelity of the intervention (e.g., "sticking" to the script) "became less engaged with the students and delivered less responsive instruction" (p. 298). These findings support the need for a more complex reading model such as the AVR (Duke & Cartwright, 2021) that addresses individual learner motivation and engagement in addition to word recognition and language comprehension.
Types & Purpose of Assessment
Prior to engaging in the RtI process, it is important for teachers to know and be able to use a range of assessments. This includes understanding the different types of assessments and their purposes. Both formative and summative assessments exist. Summative assessments can be either outcome-based or standardized and provide lagging data. They can be informative but do not provide timely information that can be used to design responsive interventions for students at risk for or identified with dyslexia. Summative assessments confirm that there is a problem but fail to provide guidance for a solution. On the other hand, formative assessments provide data that enable teachers to make instructional adjustments in a timely manner. Formative assessment tools are more diagnostic than they are evaluative. These assessments include screeners, diagnostics, progress monitoring tools, and other more informal data points that provide useful instructional guidance, such as teacher observations, checklists, quizzes, writing samples, and other portfolio artifacts.
Literacy screeners are used to proactively determine or predict which students are at risk for not meeting grade-level learning goals. Diagnostic assessments help prevent failure before it happens by providing specific performance-related data that can be immediately addressed through targeted instruction. Progress monitoring is a way to look at the ongoing academic progress of a child and to determine the effectiveness of specific instruction. As noted in the Oklahoma State Department of Education's RtI guidance document (2010), students in Tier 2 are monitored for progress every other week, and students in Tier 3 are monitored weekly. Progress monitoring
can be conducted using different forms of diagnostic assessments, which are typically selected at the school or district levels.
Screener Assessments
Timely identification of students who are at risk for future reading failure is crucial. Early identification provides the opportunity for early intervention, given that the intervention is evidence-based and targets the students' needs (Chard et al., 2008; Harn et al., 2008; Johnston & Scanlon, 2020; Scanlon, 2019). Screening assessments should be "practical" and "accurate" (Johnson et al., 2009, p. 175) and utilize a baseline or benchmark to compare a student's performance in reading to a norm group. As part of the screening battery, it is important to consider information provided by parents/guardians and vision and hearing screenings (Rose, 2009). However, screening for dyslexia is not a reliable measure to predict later reading difficulties and can produce false positives and false negatives (Rose, 2009). Measures such as letter knowledge and phonological processing are better predictors (Rose, 2009), and that is where diagnostic assessments enter the cycle.
Diagnostic Assessments
Once a screening assessment has been administered and data indicate at-risk status when compared to peers on an established benchmark, the RtI cycle continues with diagnostic assessments. These are typically administered to students who are identified with "at-risk" status and are not intended for all students. Instruction is designed to target areas of need based on data from diagnostic assessment. Kibby (2009) describes targeted instruction as "diagnostic teaching," the effect of which is measured by the progress monitoring assessments discussed in the previous section. Data used to design targeted instruction should be derived from multiple diagnostic assessments measuring the level of proficiency on different reading skills, such as phonological and phonemic awareness, phonics, fluency, vocabulary, and comprehension, as well as motivation. To put it in a more familiar context, doctors diagnose before they prescribe medication or treatment, and what they prescribe is specific to the ailment and takes the patient's overall health history into account. Multiple diagnostic assessments are available for districts or building teams to choose from, and several are mentioned later in the article. Diagnostic assessment data allows the "design [of] what seems to be an appropriate next level of instruction, select[ing] or creat[ing] materials for that instruction, implement[ing] the instruction, evaluat[ing] the child's learning from that instruction, and if necessary, modify[ing]" (Kibby, 2009, p. 252).
Preparing Pre-Service Teachers to Assess Diverse Reader Profiles
The following is an example of an assignment used with pre-service teachers in an advanced reading methods course that can also be used by in-service teachers. It allows individuals to engage in the practice of responsive teaching. It utilizes the assessment-instruction cycle and focuses on individuals and their unique reader profiles. The resources mentioned are suggestions, and substitutions can be made as appropriate. The pre-service teachers completed a portfolio with a primary (K-2) or intermediate (3-6) student that consists of the following five main categories:
1. Getting to Know the Student
2. Foundational Reading Skills
3. Reading Connected Text and Reading Behaviors
4. Data Analysis, Summary, and Recommendations
5. Evidence-Based Instruction
Sections one through three are followed by a reflection discussing the results from each assessment and preliminarily identifying areas of strength and need. In addition, pre-service teachers provided an analysis of any correlations among scores, observed behaviors, and other relevant reported information. Most sessions are video- or audio-recorded to provide further artifacts for reflection for the pre-service teacher as an administrator of assessments.
Getting to Know the Student
Getting to Know the Student is a set of assessments that provide preservice teachers with important information about what may impact a student's learning and affect reading ability. This may include parent-provided information about health, such as vision and hearing, an interest inventory completed by the student, as well as their motivation and purposes for reading different types of text and topics. Suggested materials for this section:
• Parent Information Survey (Dobler, Mann, & Roop, 2021a)
• News About Me or Inventory Experiences (Johns & Lenski, 2019)
• Elementary Reading Attitude Survey (McKenna & Kear, 1990, as cited in Johns & Lenski, 2019)
Foundational Reading Skills
Foundational Reading Skills addresses a range of word recognition skills (Duke & Cartwright, 2021; Gough & Tunmer, 1986). In addition, concepts of print and text features are addressed (AVR; Duke & Cartwright, 2021). This is a section where assessments are matched to developmentally and grade-level appropriate skills. The three subsections in the Foundational Reading Skills category include Basic Literacy, Phonemic Awareness, and Phonics assessments. The Basic Literacy Assessments focus on (a) concepts of print (K-2) or text features (3-6); (b) letter and sound identification (K-2) or nonsense word reading (3-6); and (c) reading words in isolation. The Phonemic Awareness assessment examines the student's ability on the tasks of phoneme isolation, blending, segmenting, deletion, and substitution, as well as some phonological components such as rhyming identification and production, onset and rime, and syllabication of spoken words. Knowledge of print patterns in mono- and multisyllabic words is assessed by the Phonics assessment. Suggestions for assessment materials include:
• Basic Literacy Assessments
o Concepts About Print, K-2 (Clay, 2005; see also Reading Rockets, 2021) or Concepts of Text Features, 3-6 (Dobler, Mann, & Roop, 2021b)
o Letter Identification and Sound Identification, K-2 (Blevins, 2017) or Nonsense Words, 3-6 (Blevins, 2017)
o San Diego Quick Assessment (Blevins, 2017)
• Phonemic Awareness
o Phonemic Awareness Assessment (Blevins, 2017)
o Phonological Awareness Screening Test (PAST; Kilpatrick, 2016)
• Phonics
o Early Names Test, K-2 (Mather et al., 2006) or Names Test, 3-6 (Duffelmeyer et al., 1994)
Reading Connected Text and Reading Behaviors
This category examines the strategies and observable behaviors of the reader through analysis of their errors and self-corrections. A running record is an assessment of oral text reading providing data on how well a student recognizes words in running text. Clay (2005) suggests looking at the interaction among three dimensions in regard to errors and self-corrections leading to comprehension: meaning, syntax, and visual information, or MSV. As an assessment, running records show what cueing system(s) the child uses by analyzing their errors and self-corrections. This section gives the pre-service teacher additional data on how the reader is applying word recognition strategies while reading connected text. In addition, it allows them to consider errors and self-corrections in the overall analysis and preparation to guide future instruction, such as matching students to appropriate leveled readers. Oral reading fluency data are gathered by timing the oral reading of 100-200 words and determining overall percent accuracy. In addition, words correct per minute is calculated and compared to grade-level norms (a brief worked example follows the list below). A fluency rubric is also used to evaluate prosody. Suggested forms for assessing reading of continuous text are:
● Observational Survey (Clay, 2005)
● Basic Reading Inventory (Johns et al., 2017)
● Reading A-Z assessment passages (Reading A-Z, 2021a) and decodable and leveled books (Reading A-Z, 2021b)
● Multidimensional Fluency Scale (Zutell & Rasinski, 1991)
● Oral Reading Norms (Hasbrouck & Tindal, 2017)
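To make the fluency calculation above concrete, here is a brief worked example with hypothetical numbers (they are illustrative only, not drawn from any of the assessments listed): suppose a student reads a 120-word passage in 90 seconds and makes 6 uncorrected errors. Then

words read correctly = 120 - 6 = 114
percent accuracy = 114 / 120 = 95%
words correct per minute (WCPM) = 114 / 1.5 minutes = 76 WCPM

The resulting rate can be compared to the Hasbrouck and Tindal (2017) norms for the student's grade level, and the accuracy percentage helps in judging whether the passage was at an appropriate level of difficulty for the reader.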
Data Analysis, Summary, and Recommendations
After data from all of the assessments are gathered, pre-service teachers summarize the results and analyze the data. They are encouraged to look for patterns and reading behavior trends across assessments, compare against existing reader profiles, and then come to conclusions about the student's strengths and areas of need. Research-based recommendations from credible sources are aligned to the identified areas of need. Those recommendations are then implemented in two lesson plans that the pre-service teacher teaches to their practicum student. Suggested evidence-based resources are:
● Reading Rockets (2021)
● Florida Reading Research Center (2021)
● Improving Reading: Strategies, Resources and Common Core Connections (Johns & Lenski, 2019)
● Phonics from A-Z (Blevins, 2017)
Evidence-Based Instruction
Evidence-based instruction consists of two lesson plans (Dobler, Mann, & Roop, 2021c): a phonics or word recognition lesson plan and a needs-based lesson plan. The lesson plan requirements for explicit instruction follow a direct instruction format and the gradual release of responsibility model
(Pearson & Gallagher, 1983) that includes: (a) Modeling (I do), (b) Guided Practice (We do), and (c) Independent Practice (You do). Preservice teachers are required to explain how the lessons support the targeted skill or their practicum student's needs. Preservice teachers are also required to explain what data from the portfolio assessment support their instructional decisions. They complete a reflection after each implemented lesson plan that includes prompts on what went well, what they would adjust or change, what follow-up lessons would be included based on the assessment for each lesson, and what they learned about themselves as a teacher of reading.
The preservice teacher practicum example provides the opportunity for teachers and teacher educators to compare the suggested assessments against the constructs within the AVR (Duke & Cartwright, 2021) to make a plan for assessing and determining reader profiles as explained in the Data Analysis, Summary, and Recommendations section. Students can make this determination on their own, in a collaborative group, or in consultation with their instructor. We encourage inservice teachers as well to use this information to identify which skills within each construct (word recognition, language comprehension, bridging processes, and active self-regulation) noted in Duke and Cartwright's (2021) AVR model may need to be addressed. In addition, we recommend teachers determine which tools they have available to assess the various skills. We suggest beginning by asking questions related to the various skills within the constructs that will need to be answered through an assessment. For example, within the word recognition construct, for determining phonological awareness, teachers may ask whether a student has adequate phonemic awareness skills. More specifically, can the child identify individual sounds? Can they segment and blend? The teacher then determines what assessment tool(s) can likely provide data to answer those questions. For any skills within a construct for which an assessment is not listed within the practicum example, brainstorm a list of known tools used in your school or district and have a discussion with colleagues about which ones may work.
Teachers should repeat the above process for each of the key areas identified within the word recognition construct: phonological awareness, print concepts, decoding, sight vocabulary, and fluency in context. Suggested assessments for phonological awareness, print concepts, and phonics (decoding skills) are included above within the preservice teacher practicum example. A suggestion for assessing sight vocabulary is not included in the practicum example; however, teachers can easily access online and use the Dolch Inventory or Fry Inventory. The Multidimensional Fluency Scale (Zutell & Rasinski, 1991), the Oral Reading Norms (Hasbrouck & Tindal, 2017), or any informal reading inventory such as the Basic Reading Inventory (Johns et al., 2017) can be used to assess fluency in context and are included within the list of assessments in the practicum example. Other tools for determining fluency in context can be selected and may be readily available within schools and districts.
It is essential that inservice teachers not only know and have access to a wide variety of assessments to screen, diagnose, and progress monitor for specific reading skills, but also are able to use assessment results to determine the diverse reader profiles of their students. This is imperative because, just as one size does not fit all, no two struggling readers experience exactly the same reading challenges. By following the assessment process of screening, diagnosing, and progress monitoring to determine the diverse profiles of the readers among their students, teachers are able to provide and adjust instruction appropriately for specific reader profiles. Teachers should utilize screeners to identify students who are struggling to read on grade level, but then diagnostic assessment should be used to determine the construct with which the student struggles most in order to target instruction. In addition,
through ongoing progress monitoring, teachers will know whether a student is making gains and whether the instruction is the best match for the intended goal.
Conclusion
What is known from the science of reading is that assessment is the basis for designing evidence-based instruction depending on the reader profile (Valencia & Riddle Buly, 2004). Targeted instruction focuses on various needs and leads to improved reading performance (Riddle Buly & Valencia, 2003). It is imperative that administrators, scholars, and advocates who may not be involved in daily reading instruction understand: (a) reading is a dynamic process which is best understood through a more complete model such as AVR (Duke & Cartwright, 2021); (b) readers are multifaceted and should be analyzed according to their unique reader profiles; and (c) the existing framework (RtI) is best positioned to help educators continuously assess and design targeted instruction. Teacher educators are devoted to preparing pre-service teachers through the development of a professional knowledge base and opportunities to assess readers and design targeted instruction that meets the needs of today’s diverse learners. Inservice teachers are dedicated to using this information to help all readers achieve proficiency and beyond.
References
Blevins, W. (2017). Phonics from A-Z: A practical guide (3rd ed.). Scholastic.
Cervetti, G. N., Pearson, P. D., Palincsar, A. S., Afflerbach, P., Kendeou, P., Biancarosa, G., … Berman, A. (2020). How the Reading for Understanding initiative's research complicates the simple view of reading invoked in the science of reading. Reading Research Quarterly, 55(S1), S161–S172. https://doi.org/10.1002/rrq.343
Chard, D. J., Stoolmiller, M., Harn, B. A., Wanzek, J., Vaughn, S., Linan-Thompson, S., & Kame'enui, E. J. (2008). Predicting reading success in a multilevel schoolwide reading model. Journal of Learning Disabilities, 41(2), 174-188. doi: 10.1177/0022219407313588
Clay, M. M. (2005). An observational survey of early literacy achievement. Heinemann.
Dobler, E., Mann, L., & Roop, T. (2021a). Background information form [class material]. Emporia State University Canvas.
Dobler, E., Mann, L., & Roop, T. (2021b). Concepts of text features [class material]. Emporia State University Canvas.
Dobler, E., Mann, L., & Roop, T. (2021c). Evidence-based instruction—lesson plans and reflections [class material]. Emporia State University Canvas.
Duffelmeyer, F. A., Kruse, A. E., Merkley, D. J., & Fyfe, S. A. (1994). Further validation and enhancement of the Names Test. The Reading Teacher, 48(2), 118-128.
Duke, N. K., & Cartwright, K. B. (2021). Communicating advances beyond the Simple View of Reading. Reading Research Quarterly, 56(S1), S25-S44. doi: 10.1002/rrq.411
Fuchs, D., & Fuchs, L. S. (2006). Introduction to response to intervention: What, why, and how valid is it? Reading Research Quarterly, 41(1), 93-99. doi: 10.1598/RRQ.41.1.4
Gough, P. B., & Tunmer, W. E. (1986). Decoding, reading and reading disability. Remedial and Special Education, 7(1), 6-10.
Hanson, M. (2021, September 19). K-12 school enrollment & student population statistics. EducationData.org. https://educationdata.org/k12-enrollment-statistics
Harn, B. A., Stoolmiller, M., & Chard, D. J. (2008). Measuring the dimensions of alphabetic principle on the reading development of first graders. Journal of Learning Disabilities, 41(2), 143-157. doi: 10.1177/0022219407313585
Harn, B. A., Damico, D. P., & Stoolmiller, M. (2017). Examining the variation of fidelity across an intervention: Implications for measuring and evaluating student learning. Preventing School Failure, 61(4), 289-302. doi: 10.1080/1045988X.2016.1275504
Hasbrouck, J., & Tindal, G. (2017). Hasbrouck & Tindal oral reading fluency data 2017. Read Naturally. https://www.readnaturally.com/knowledgebase/documents-andresources/26/616
Hoover, W. A., & Tunmer, W. E. (2020). The cognitive foundations of reading and its acquisition. Springer.
International Dyslexia Association. (2021, April 20). Definition of dyslexia. https://dyslexiaida.org/definition-of-dyslexia
International Literacy Association. (2016). Dyslexia: A response to the International Dyslexia Association [Research Advisory Addendum]. Author.
Johns, J., Elish-Piper, L., & Johns, B. (2017). Basic reading inventory: Kindergarten through grade twelve and early literacy assessments (12th ed.). Kendall Hunt Publishing Company.
Johns, J., & Lenski, S. D. (2019). Improving reading: Strategies, resources and Common Core connections (7th ed.). Kendall Hunt Publishing Company.
Johnson, E. S., Jenkins, J. R., Petscher, Y., & Catts, H. W. (2009). How can we improve the accuracy of screening instruments? Learning Disabilities Research & Practice, 24(4), 174-185.
Johnston, P., & Scanlon, D. (2020). An examination of dyslexia research and instruction, with policy implications [Literacy Research Report]. Literacy Research Association.
Kibby, M. W. (2009). Why is the school psychologist involved in the evaluation of struggling readers? Journal of Educational and Psychological Consultation, 19, 248-258. doi: 10.1080/10474410903128988
Kilpatrick, D. A. (2016). Equipped for reading success: A comprehensive, step-by-step program for developing phonemic awareness and fluent word recognition. Casey & Kirsch Publishers.
Mather, N., Sammons, J., & Schwartz, J. A. (2006). Adaptations of the Names Test: Easy-to-use phonics assessments. The Reading Teacher, 60(2), 114-122. doi: 10.1598/RT.60.2.2
McKenna, M. C., & Kear, D. J. (1990). Measuring attitude toward reading: A new tool for teachers. The Reading Teacher, 43(9), 626-639.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. https://www.nichd.nih.gov/sites/default/files/publications/pubs/nrp/Documents/report.pdf
Oklahoma State Department of Education. (2010). Response to Intervention (RtI) guidance document. Author. https://sde.ok.gov/sites/ok.gov.sde/files/RtIGuidanceDoc.pdf
Pearson, P. D., & Gallagher, G. (1983). The gradual release of responsibility model of instruction. Contemporary Educational Psychology, 8, 112-123.
Pikulski, J. J., & Chard, D. J. (2005). Fluency: Bridge between decoding and reading comprehension. The Reading Teacher, 58(6), 510-519. doi: 10.1598/RT.58.6.2
Rasinski, T. V., & Samuels, S. J. (2011). Reading fluency: What it is and what it is not. In S. J. Samuels & A. E. Farstrup (Eds.), What research has to say about reading instruction (4th ed., pp. 94-114). International Reading Association.
Reading A-Z. (2021a, April 21). Benchmark books and running records. Reading A-Z.com. https://www.readinga-z.com/assessments/benchmark-books/
Reading A-Z. (2021b, April 21). Benchmark passages and running records. Reading A-Z.com. https://www.readinga-z.com/assessments/benchmark-passages/
Reading Rockets. (2021, April). Concepts of print assessments. ReadingRockets.org. https://www.readingrockets.org/article/concepts-print-assessment
Riddle Buly, M. R., & Valencia, S. (2003). Meeting the needs of failing readers: Cautions and considerations for state policy. University of Washington, Center for the Study of Teaching and Policy. https://www.education.uw.edu/ctp/sites/default/files/ctpmail/PDFs/Reading-MRBSV-04-2003.pdf
Rose, J. (2009). Identifying and teaching children and young people with dyslexia and literacy difficulties: An independent report from Sir Jim Rose to the Secretary of State for Children, Schools and Families, June 2009. http://www.thedyslexiaspldtrust.org.uk/media/downloads/inline/the-rose-report.1294933674.pdf
Rosenblatt, L. (1993). The transactional theory: Against dualisms. College English, 55, 377-386. doi: 10.2307/378648
RTI Action Network. (2021). What is RTI? National Center for Learning Disabilities. http://www.rtinetwork.org/learn/what/whatisrti#:~:text=Response%20to%20Intervention%20(RTI)%20is,in%20the%20general%20education%20classroom
Scanlon, D. (2019, November). Dyslexia/reading difficulties and approaches to intervention [PowerPoint presentation]. Presented at the 2019 New York State Reading Association Annual Conference, Albany, NY.
Scarborough, H. S. (2001). Connecting early language and literacy to later reading (dis)abilities: Evidence, theory, and practice. In S. Neuman & D. Dickinson (Eds.), Handbook for research in early literacy. Guilford Press.
Spear-Swerling, L. (2016). Common types of reading problems and how to help children who have them. The Reading Teacher, 69(5), 513-522. https://doi.org/10.1002/trtr.1410
Stanovich, K. E. (1988). Explaining the difference between the dyslexic and garden-variety poor reader: The phonological-core variable-difference model. Journal of Learning Disabilities, 21(10), 590-612.
Valencia, S. W. (2011). Using assessment to improve teaching and learning. In S. J. Samuels & A. E. Farstrup (Eds.), What research has to say about reading instruction (4th ed., pp. 379-405). International Reading Association.
Valencia, S. W., & Riddle Buly, M. (2004). What struggling readers really need. The Reading Teacher, 57(6), 520-531.
Zutell, J., & Rasinski, T. V. (1991). Training teachers to attend to their students' oral reading fluency. Theory Into Practice, 30, 211-217.
Teddy D. Roop teaches in the Department of Elementary Education/Early Childhood Education/Special Education at Emporia State University, Emporia, Kansas. She can be reached at troop@emporia.edu.
Kathleen S. Howe teaches in the School of Education, Park University, Parkville, Missouri. She can be reached at Kathleen.howe@park.edu.

Chelsea K. Bradley
Online Learning Communities: Generated by Communication
Introduction
As online learning continues to gain popularity (Allen & Seaman, 2010/2016; Yuan & Kim, 2014), institutions of higher education, as well as K-12 educators, are tasked with creating spaces where learning can flourish. As students engage in online learning environments, the tools available to them through an online learning management system (LMS) are digital in nature. These digital tools and technologies afford students new modes to communicate and learn. Due to the important role of learning communities, the increase of online learning and its impact on higher education, and the important role of learners' perceptions, the author designed a phenomenological study to gain insight into common lived experiences of learning communities among pre-service teachers within online undergraduate college courses. A phenomenological study (Moustakas, 1994) focuses on people's common experiences of a phenomenon and how those experiences are relied on to make meaning in the world. Since the author sought to understand, through in-depth interviews, how undergraduate pre-service teachers experienced learning community in their online courses, a phenomenological approach was used. The data were derived from a broader phenomenological study examining undergraduate pre-service teachers' perceptions of learning community. The data identified three sources from which learning communities were generated within online settings. This article describes the first identified source: online learning communities are generated by communication. For the purpose of this article, learning community refers to a supportive learning environment generated by dialogue and collaboration (Yuan & Kim, 2014).
Literature Review
Research indicates that community is pivotal to human experiences. For example, Maslow (1954) examined community as he developed his theory of human motivation. His work was influential in describing how best to construct the foundation that properly fuels the development of human beings. Maslow's work defined five basic needs that all human beings strive to acquire: physiological needs, safety and security, the need to belong and experience affection, respect and self-respect, and self-actualization (pp. 35-47). This hierarchy of needs is applicable to community formation in both offline and online spaces. For the purpose of the broader study, the focus remained on Maslow's third defined need, the need to belong and experience affection, which speaks to the importance of belonging to a group. Maslow stated the following about the third need: "Any good society must satisfy this need, one way or another, if it is to survive and be healthy" (p. 44). When considering community formation, if a person does not feel like he or she belongs, his or her participation in the community will falter. Additionally, if other participants experience the same lack of belongingness, the community could experience failure. Doolittle and MacDonald (1978) developed the Sense of Community Scale, which focuses on community in relationship to communication. Additionally, Conrad (2005), in her
work regarding community and an online cohort of students, defined community as “a general sense of connection, belonging, and comfort that develop over time among members of a group who share purpose and commitment to a common goal” (p. 1). Not only does community formation enhance collaboration, it has also been found to increase engagement and feelings of satisfaction within a group atmosphere (Darby et al., 2013). When group members feel satisfaction within their group, they may become more motivated to contribute and work with colleagues. In their work examining academic service-learning, researchers Darby et al. (2013), found that students’ motivation increased when they enjoyed their experiences, formed relationships, and felt a sense of responsibility to their learning community (p. 188). Numerous studies show that learning communities play a vital role in online educational spaces (Cleugh, 2013; Jeong & Hmelo-Silver, 2016; Kozlov & Große, 2016). Learning communities provide a space for collaboration to occur, which positively impacts student learning (Cleugh, 2013; Luo et al., 2017). This collaboration among members in a learning community occurs in various ways, relying on multiple modes and tools. When learning communities are successfully generated, there is an increase in the effectiveness of the learning environment (Kucuk & Sahin, 2013). In an educational setting, the experience of learning communities is important to students’ success as well as feelings of satisfaction toward courses (Lear et al., 2010; Vlachopoulos & Cowan, 2010). The development of learning communities remains an essential feature of the classroom, either offline or online. Within the classroom setting, learning communities serve as support systems (Greene & Mitcham, 2012), safe spaces for experimentation and exploration (Greene & Mitcham, 2012), relationship builders (Murray et al., 2011), spaces to create (Murray et al., 2011), and places to engage in dialogue (Murray et al., 2011). To gain an understanding of undergraduate pre-service teachers’ common lived experiences of learning communities in online college courses, this study was guided by the following research question: What were the lived experiences of learning communities in online courses among undergraduate pre-service teachers?
Methods
The author's phenomenological study was largely based on work by Moustakas (1994), Vagle (2014), and van Manen (1990), and relied on data collection and synthesis to generate meaning about the phenomenon of participants' experiences of online learning communities. The participants in this study were undergraduate pre-service teachers attending one university in the Midwest. At the time, the author was an adjunct professor at the university. The author relied on a snowball sampling approach (Creswell, 2007), in which current students passed on her contact information to other undergraduate pre-service teachers who might be interested in participating in the study. Measures were taken to ensure that none of the participants would have the author as an instructor: she taught courses offered at the beginning of the teacher education program, and participants were advanced in their program of study. Participants were gathered from a wide variety of courses and backgrounds, but all were undergraduate pre-service teachers. The study included four phases: Orientation and Review, Connecting with Participants, Emergent Codes and Themes, and Interpretations and Conclusions. The purpose of the first phase, Orientation and Review, was to gain information about and become familiar with possible
participants, the phenomenon being studied, and the site in which the study took place. Phase Two, Connecting with Participants, consisted of communicating with participants. During this time, participants signed and returned consent forms and filled out a brief information sheet. The author also began scheduling interviews with participants and building rapport with each of them through email or phone conversations. The third phase, Emergent Codes and Themes, consisted of initial data analysis. Each interview was transcribed in its entirety by the author. Once an interview was transcribed, initial codes and themes were noted. These notes were used as a guide in future interviews and were also reviewed by two colleagues who served as reviewers throughout the entirety of the study. During Phase Four, Interpretations and Conclusions, emergent themes and codes were organized. The author conducted four rounds of analysis, as explained below. There was overlap between phases of this study as data collection and analysis occurred simultaneously.
Data collection for the study included three in-depth interviews. The first interview focused on the life history of the participant. The second interview focused on the details of the experience. The third and final interview provided a space for participants to reflect on their experiences with the phenomenon and make meaning. The interviews were structured and conducted synchronously over the phone, as preferred by each participant. Each interview was recorded and later transcribed by the author. The author relied on transcripts and a research journal to collect data.
Data analysis of the phenomenology consisted of constant comparative analysis (Vagle, 2014), an approach in which data collection and analysis occur simultaneously. Conducting analysis in this way assisted in reaching redundancy. After conducting and transcribing eight interviews, common themes regarding lived experiences of learning communities began to take shape. As coding of common themes continued, three main findings surfaced: learning communities are relationship-based, generated by communication, and technologically bound. The author's main goal for data analysis was to reach redundancy. Findings from this study were confirmed after 10 participant interviews and reached redundancy with the completion of interviews 11 and 12.
Results
As mentioned, while data analyses indicated three sources for undergraduate pre-service teachers' perceptions of learning community, for brevity only one source will be discussed in this report. Data analyses indicated that experiences of online learning communities were related to effective communication. The concept of effective communication included conversations, questioning, and responses. Whether participants divulged information about discussion boards, emails, group work, or video-recorded lectures, it became apparent that the experience of effective communication was a common thread within the online learning communities experienced by study participants. Without effective communication, there would be limited interaction occurring in an online course. Participants claimed they were not only more involved in courses when communication occurred frequently, but they also found themselves able to retain more information, both of which positively influenced experiences of learning communities in their online learning spaces. Evidence for these findings is described below.
Every participant who was interviewed discussed the use of discussion boards within their online course.
Sarah (all names are pseudonyms), a participant from the study, said, "I keep going back to discussion boards, but I think that's really what online courses are about, using those discussion boards to discuss with one another and learn from one another. You are more involved when talking through these [discussion] boards" (Sarah, interview 3, December 22, 2017). Participants also found enjoyment in Parking Lot type discussion boards, where they were not required to post, yet if they had general questions or concerns, they could post in the Parking Lot and the instructor or another classmate could respond. One participant said, "It's nice to be able to post and not have to worry about requirements from the instructor and making sure you're going to get your points. Sometimes you just want to talk to your classmates. That's why I like Parking Lot discussions" (Adam, interview 3, January 25, 2018).
In addition to discussion board posts, participants appreciated variety in the prompts to which they were required to respond. By incorporating diverse prompts each week, instructors were able to add a sense of variety to course discussions, which participants found engaging and important for building learning communities. One participant took an online course where the instructor still gave a weekly reading assignment, but instead of supplying a single prompt or question to answer, students were provided with two or more prompts or questions. From those prompts or questions, students were able to choose one to answer and post for the rest of the class to read. Heath, the participant who experienced this type of questioning, shared that he enjoyed those conversations because the prompts and questioning were different than just "regurgitating what the text said and you got to know your classmates better and stuff" (Heath, interview 3, December 30, 2018). He considered single-prompt questions less beneficial for students, since everyone had already read the same assignment.
In continuing with the communication theme, participants spoke about being heard by their instructors and classmates. When participants deemed they were truly being listened to, they felt more connected and welcomed into the course, which positively affected their common understandings of learning communities. As participants completed assignments and posted on discussion boards, their experiences of learning communities increased as classmates and instructors responded to their work. Skyler, Sarah, and Rory specifically mentioned that when they knew their work was being read and taken seriously, they wanted to work harder. When their classmates took the time to read their work and genuinely responded to it, they were motivated to do the same, which ultimately made them more involved in their courses. Sarah said, "I know that what I'm posting is being received and not just being posted to receive a grade. It's more for collaborating; you can ask questions and stuff" (Sarah, interview 3, December 22, 2017). In doing so, she was able to communicate more frequently with her classmates, which Sarah felt positively influenced her sense of learning community within her online courses. When participants read in-depth replies to their own work, they experienced validation and the sense that their original post was accepted and provided their classmates with a means to hold a genuine conversation.
Email was another form of communication that participants found important for experiencing learning communities and staying motivated. Participants acknowledged the important role email played when communicating with members of an online course. Sarah shared an experience in which a classmate emailed her after she posted a response on a discussion board. The classmate expressed to Sarah that she was struggling with the content, and it was apparent to her that Sarah understood it. Sarah felt pride in her knowledge of the course content as well as in her ability to help another student be successful. Sarah expressed how her sense of learning
community was heightened by her classmate contacting her “outside” of the online classroom: “you have to feel comfortable with someone in order to do that” (Sarah, interview 2, December 15, 2017). Lilly, who described herself as extremely social, shared that if she was unable to communicate with people, she felt like she was missing something. While Lilly did not experience the same connection to classmates that Sarah did, she divulged that she constantly stayed in contact with instructors through email. She appreciated being able to email them questions and receive feedback right away. Overall, the use of email was perceived as a means to communicate and grow learning community. These findings align with work from Rigelman and Ruben (2012), Carlen and Jobring (2005), Beins (2016), and Ouyang and Scharber (2017), all of whom described how communication is directly linked to participation and facilitates the development of community. However, this work extends the field’s understanding of participants’ desire for frequent communication among both peers and instructors. This frequent and effective communication positively impacted participants’ experiences of learning community in their online spaces, and it can occur through discussion board posts, email, or feedback on assignments.
Implications
Implications from this study address the enhanced communication afforded by the availability of different modes. The modes of communication participants relied on within their online courses to converse, collaborate, and produce work with peers became activities that fostered connection within their learning communities. As participants described their experiences of learning communities in online courses, the theme of effective communication became increasingly apparent. While participants described exposure to a variety of communicative tools, one concept emerged: the execution of effective communication was vital to students’ success in an online learning environment. Participants elucidated three main affordances of effective communication: within their online courses, study participants had positive communication experiences through the use of discussion boards, through email, and through the realization that they were being heard by their peers and instructors. Every participant who was interviewed discussed the use of discussion boards within their online course. Since discussion boards are an important component of communication within online courses, it is vital that instructors design discussion boards in an accessible manner. This accessibility includes clear guidelines and expectations for posts as well as variety in the types of prompts presented to students. Implications of prompt variety suggest instructors also integrate prompts that encourage students to consider their own experiences. These types of questions are beneficial and could generate thoughtful conversations among students because such prompts urge students to explore their own understandings due to the reflective nature of questions that focus on applicability (Lohr & Haley, 2018). These types of prompts could create a space for learning communities to flourish as students share personal experiences with classmates (Dailey-Hebert, 2018). Participants also appreciated the optional Parking Lot discussion board, where they could ask general questions and engage in conversation with their peers. These Parking Lot discussions appeared to lead to experiences of learning community, as participants were able to converse with one another about topics of their choosing. Implications of this finding suggest instructors
consider including an optional discussion board space, such as the Parking Lot, within their online courses. Communication through email was another tool that positively impacted participants’ experiences of learning community. Participants felt personally connected to peers when communicating via email about various aspects of their course. Implications for email communication also apply to instructors: as instructors read through posts and notice a misconception, they can email the student directly. This small gesture can make students feel connected to their instructors while also validating the work they are putting forth in their online courses (Al-Asfour, 2014; Dailey-Hebert, 2018; Parenti, 2013). Additionally, if an insight is introduced that might be beneficial to the rest of the class, instructors should share it, as other students may have questions about the same topic. A sense of being heard was another major theme that emerged within the importance of communication and learning communities. Participants felt encouraged and supported in their learning when they received thoughtful responses from classmates and instructors. Study participants described thoughtful responses as responses that built upon their own posts. This occurred when classmates posed questions or prompts that elicited deeper responses and created a space for conversation to continue; this encouraged more consistent involvement and also produced positive experiences of learning communities. These in-depth responses also created a sense of support and connection to the course because participants felt their hard work was being taken seriously and their classmates and instructors were reading their posts for content.
Conclusion
When the previously described forms of communication occurred, participants claimed they were more involved in their courses. Online courses need to be developed in ways that promote accessible, effective communication among classmates and instructors. Participants desired clear guidelines and expectations while also having opportunities to engage in authentic conversations with their classmates and instructors. By providing a variety of modes with which to communicate, instructors may assist in developing communication-rich learning environments, which, according to the findings of this study, help establish relationships and positively influence experiences of learning communities. While the broader study focused on institutions of higher education, the affordances of effective communication and the implications established in the study hold true for all levels of learning.

Table 1
Best Practices for Educators: High School & Institutions of Higher Education
How to Build Community: Discussion Board Posts
High School Educators: Implement a variety of prompts; encourage personal experience sharing; model all types of discussion board posts (initial posts, responses, asking a clarifying question, etc.); post clear guidelines and expectations for posts; explain guidelines and expectations; offer an optional Parking Lot thread.
Educators at Institutions of Higher Education: Implement a variety of prompts; encourage personal experience sharing; model the types of posts the course will utilize; post clear guidelines and expectations for posts; offer an optional Parking Lot thread.

How to Build Community: Sense of Being Heard
High School Educators: Model what different responses look like, placing emphasis on detailed, in-depth responses; encourage students to ask questions that elicit a deeper response from the author of a post.
Educators at Institutions of Higher Education: Model what different responses look like, placing emphasis on detailed, in-depth responses; encourage students to ask questions that elicit a deeper response from the author of a post.

How to Build Community: Sending Email
High School Educators: Encourage peer-to-peer communication, perhaps using whatever LMS is in place, which can build learning community in online spaces; educators can use email or messages via the LMS to provide personalized feedback or insight about a student misconception.
Educators at Institutions of Higher Education: Encourage peer-to-peer email, which can build learning community in online spaces; instructors can use email to provide personalized feedback or insight about a student misconception.
References
Al-Asfour, A. (2014). Improving motivation and persistence of online human resource students through the use of e-mail communication: A study employing a single case study design. Journal of Learning in Higher Education, 10(2), 1-7. https://files.eric.ed.gov/fulltext/EJ1143340.pdf
Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United States. Babson Survey Research Group. http://files.eric.ed.gov/fulltext/ED529931.pdf
Allen, I. E., Seaman, J., Poulin, R., & Straught, T. T. (2016). Online report card: Tracking online education in the United States, 2015. Online Learning Consortium. https://onlinelearningconsortium.org/read/online-report-card-tracking-online-education-united-states-2015/
Beins, A. (2016). Small talk and chit chat: Using informal communication to build a learning community online. Transformations: The Journal of Inclusive Scholarship & Pedagogy, 26(2), 157-175. doi:10.5325/trajincschped.26.2.0157
Carlen, U., & Jobring, O. (2005). The rationale of online learning communities. International Journal of Web Based Communities, 1, 272-295. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.599.7386&rep=rep1&type=pdf
Cleugh, C. (2013). Sense of community in post-secondary online blended courses: Importance of, opportunities and implications for course development (Unpublished doctoral dissertation).
Conrad, D. (2005). Building and maintaining community in cohort-based online learning. Journal of Distance Education, 20(1), 1-21. https://files.eric.ed.gov/fulltext/EJ807822.pdf
Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five traditions (2nd ed.).
Dailey-Hebert, A. (2018). Maximizing interactivity in online learning: Moving beyond discussion boards. Journal of Educators Online, 15(3).
Darby, A., Longmire-Avital, B., Chenault, J., & Haglund, M. (2013). Students’ motivation in academic service-learning over the course of the semester. College Student Journal, 47(1), 185-191.
Doolittle, R. J., & Macdonald, D. (1978). Communication and a sense of community in a metropolitan neighborhood: A factor analytic examination. Communication Quarterly, 26(3), 2-7. http://dx.doi.org/10.1080/01463377809369297
Greene, K., & Mitcham, K. C. (2012). Community in the classroom. English Journal, 101(4), 13-15. http://www.jstor.org.proxy.mul.missouri.edu/stable/41415466
Jeong, H., & Hmelo-Silver, C. E. (2016). Seven affordances of computer-supported collaborative learning: How to support collaborative learning? How can technologies help? Educational Psychologist, 51(2), 247-265. doi:10.1080/00461520.2016.1158654
Kozlov, M., & Große, C. S. (2016). Online collaborative learning in dyads: Effects of knowledge distribution and awareness. Computers in Human Behavior, 59, 389-401. http://dx.doi.org/10.1016/j.chb.2016.01.043
Kucuk, S., & Sahin, I. (2013). From the perspective of community of inquiry framework: An examination of Facebook uses by pre-service teachers as a learning environment. Turkish Online Journal of Educational Technology - TOJET, 12(2), 142-156.
Lear, J. L., Ansourge, C., & Steckelberg, A. (2010). Interactivity/community process model for the online education environment. Journal of Online Learning and Teaching, 6(1), 71-77.
Lohr, K. D., & Haley, K. J. (2018). Using biographical prompts to build community in an online graduate course: An adult learning perspective. Adult Learning, 29(1), 11-19.
Luo, N., Zhang, M., & Qi, D. (2017). Effects of different interactions on students’ sense of community in e-learning environment. Computers & Education, 115, 153-160. doi:10.1016/j.compedu.2017.08.006
Maslow, A. H. (1954). Motivation and personality. Harper. http://s-f-walker.org.uk/pubsebooks/pdfs/Motivation_and_Personality-Maslow.pdf
Moustakas, C. (1994). Phenomenological research methods. Sage.
Murray, T. A., Higgins, P., Minderhout, V., & Loertscher, J. (2011). Sustaining the development and implementation of student-centered teaching nationally: The importance of a community of practice. Biochemistry & Molecular Biology Education, 39(6), 405-411. http://dx.doi.org.proxy.mul.missouri.edu/10.1002/bmb.20537
Ouyang, F., & Scharber, C. (2017). The influences of an experienced instructor’s discussion design and facilitation on an online learning community development: A social network analysis study. The Internet and Higher Education, 35, 34-47. https://doi.org/10.1016/j.iheduc.2017.07.002
Parenti, M. A. (2013). Student perceptions of asynchronous and synchronous web based tools and perceived attainment of academic outcomes. Journal of Educational Technology, 9(4), 8-14.
Rigelman, N. M., & Ruben, B. (2012). Creating foundations for collaboration in schools: Utilizing professional learning communities to support teacher candidate learning and visions of teaching. Teaching and Teacher Education, 28, 979-989. http://essentialconditionswiki.pbworks.com/w/file/fetch/61128690/Professional%20Learning%20Communities%20Rigelman%202012.pdf
Vagle, M. D. (2014). Crafting phenomenological research. Left Coast Press, Inc.
van Manen, M. (1990). Researching lived experience: Human science for an action sensitive pedagogy. University of Western Ontario.
Vlachopoulos, P., & Cowan, J. (2010). Reconceptualising moderation in asynchronous online discussions using grounded theory. Distance Education, 31(1), 23-36.
Yuan, J., & Kim, C. (2014). Guidelines for facilitating the development of learning communities in online courses. Journal of Computer Assisted Learning, 30(3), 220-232. doi:10.1111/jcal.12042

Dr. Chelsea K. Bradley is Assistant Professor of Reading at the University of Arkansas at Little Rock. She can be reached at ckbradley@uair.edu.
Teacher to Teacher
Karen B. Coucke
OKLA Easy Grant Is as Easy as Sleeping and Helps Build a Classroom Library!
In the spring of 2019, I applied for and was awarded the OKLA (Oklahoma Literacy Association) Easy Grant. The purpose of the OKLA Easy Grant is to provide books for classroom teachers’ personal libraries. At the time I applied for the grant, I taught kindergarten in a diverse classroom. I incorporated a writing workshop model and wanted more inspiration to motivate my budding authors, so I needed a way to acquire more books for my personal classroom library. The OKLA Easy Grant seemed like my best option since the word easy was in the title! After I reviewed the requirements and submitted my application, I concluded that the application process was, indeed, easy. Of course, people have different interpretations of what counts as easy, so I asked kindergartners for their perspective on which activities they thought were easy. Here are a few activities kindergartners believe are easy: “Math is easy because I know so many math. I’ve been in school a lot, so that’s why I’ve been so good at math.” “Making worms with playdough.” “Playing easy games.” “Sleeping is easy. When I’m tired, I just fall right to sleep, and my dogs don’t wake me up.” “Swimming because you move your arms and legs, and you go somewhere.” If you agree with these statements, then you will understand how easy it is to apply for the OKLA Easy Grant. All you have to do is be a member of OKLA and submit a bibliography of the books you plan to purchase along with the application. I applied for the grant and was awarded $100 to purchase books for the personal library in my kindergarten classroom. The purpose of this article is twofold: first, I will provide some basic information about the grant writing process; second, I will explain how the books I received from the OKLA Easy Grant were used in my diverse kindergarten classroom.
Writing a Grant
Writing a grant can seem like a daunting process since organizations have different guidelines to follow, and there are many different types of grants to apply for. The process and strategies outlined here for the OKLA Easy Grant can be incorporated when applying for other grants. The OKLA Easy Grant is a simple way to become familiar with grant writing.
General Information
The first step in writing a grant is to identify a need in your classroom or school. At the time, my kindergarten classroom consisted of a diverse group of learners with varying abilities. My personal classroom library lacked texts that I could incorporate with our writing workshop
when teaching specific writing processes. I decided to look for opportunities to add mentor texts to my classroom library. The second step in grant writing is to find a grant that matches your identified need. Many opportunities come up in a Google search for literacy grants; as a member of OKLA, I decided to begin my grant search with the OKLA website. The purpose of the Easy Grant is to provide teachers with $100 to help them build their personal classroom libraries, so I found it to be one way to address my identified need. Details about the OKLA Easy Grant are available on the OKLA website.
Parts of the Easy Grant
Once I identified my classroom need and reviewed the grant submission guidelines, I decided to apply. This grant has a few simple submission guidelines. To be considered, membership in the organization is required. Other submission requirements include a cover page with the applicant’s contact information as well as that of the school district, a one-to-two-page description of the project, and a bibliography of the book titles requested. This was my first time writing a grant, and these submission guidelines seemed easy to accomplish. I began my search for a literacy grant in the middle of December, and the submission deadline for the OKLA Easy Grant is February 15. This gave me two months to prepare my description and bibliography and have a colleague edit my work before submission.
Tips for Writing the Grant
There are two main tips for writing a grant. First, review the submission requirements and make sure there is enough time to meet the deadline. Second, have a colleague edit the submission to provide feedback and suggestions before you submit. Being aware of all the requirements and meeting the deadline helps ensure the application qualifies for review by the organization. Having a colleague edit the submission ensures there are no overlooked grammar or mechanics mistakes and that the description meets the submission guidelines.
From Grant Writing to Teaching
My goal in writing the grant was to acquire mentor texts to support the writing workshop process in my classroom. My students were eager writers, and I needed to encourage their enthusiasm by engaging them with mentor texts paired with our writing workshop. I needed mentor texts that showcased different writing processes such as narrative writing, expository writing, and argument/opinion writing. I consulted The Writing Thief: Using Mentor Texts to Teach the Craft of Writing (Culham, 2014) and found inspiration. Mentor texts are examples or models of a particular writing skill or trait being used by an author. Using mentor texts is a research-based strategy that provides students with real-world examples of an identified writing trait or process (Culham, 2014). These examples help students create writing that connects to their own lives. Having a mentor text as a model that focuses on a process or a trait incorporates reading along with writing and, as Culham (2014)
says, “A deep, thoughtful understanding of how texts works creates an understanding of what good writers do” (p. 32). The book lists over 90 mentor text examples and recommends the mentor texts best suited to each writing process as well as to components of the 6+1 Traits of Writing (Culham, 2005). My grant proposal included nine of the recommended texts. I tried to select books I was not able to find in my school library or through Scholastic. Each of the nine books I selected was between 8 and 15 dollars. See Table 1 for the list of books purchased with the OKLA Easy Grant.
Table 1
Books received through application of OKLA Easy Grant
Narrative Writing
Shannon, D. (2012). Jangles: A big fish story. Blue Sky Press.
Messner, K. (2011). Over and under the snow. Chronicle Books.
Rubin, A. (2012). Dragons love tacos. Dial Books for Young Readers.
Expository Writing
Davies, N. (2004). Poop: A natural history of the unmentionable. Candlewick.
Perdomo, W. (2010). Clemente! Henry Holt and Company.
Schafer, L. M. (2016). Lifetime: The amazing numbers in animal lives. Chronicle Books.
Argument Writing
Buzzeo, T. (2013). Just like my papa. Hyperion.
Palatini, M. (2003). The perfect pet. Katherine Tegen Books.
Willems, M. (2013). That is not a good idea! Balzer + Bray.
Once I received the books, I decided which of the mentor texts to use for our writing workshop. I decided to focus on the voice trait through narrative writing. Voice is the trait that recognizes “...the writer's unique way of looking at the world and interpreting it” (Culham, 2014, p. 108). In my experience with kindergartners, they have unique ways of looking at many things. I just needed to guide their creativity during the writing process.
The mentor text I chose was Dragons Love Tacos (Rubin, 2012). This book describes a boy who plans a party for dragons by making lots of tacos, and what happens when he accidentally serves spicy salsa to dragons who don’t like spicy food. My class loved this book! We had just finished an author study on Mo Willems, so the class connected the taco party to the Pigeon character and his hot dog party. They made connections to themselves and the boy in the book, and they made connections to food, including a discussion of spicy and not spicy foods. Their connections were the foundation of our mini-lesson on using voice in narrative writing. As a group, they decided their narratives could describe a party they would like to have for people or animals and include some favorite foods they like to eat. We also decided as a class that two sentences with illustrations would be a good starting point for our narratives. I had many emergent bilinguals in my class, so we created sentence starters for those who chose to use them. During independent writing time, there were many conversations and much sharing of ideas. The book was available for students to use for ideas, such as illustrations and different ways to use words to grab the reader's attention. However, the main focus of this narrative writing piece was using our voice to create a narrative text. This workshop on voice lasted two to three days. During this time, I held mini conferences with students about their writing, and students worked with partners to share ideas as well. On the third day, we made time for those who chose to read their texts to the class. Share Time is an exciting time in kindergarten! Most students like to share their writing, and I always give them the option of having me or a friend share their work if they are too uncomfortable to talk in front of the class. Some of their narrative pieces described parties for fairies, parties for their pets, and parties for their families, along with the favorite foods they would serve. Pizza was a big class favorite!
Conclusion
Writing a grant is one way teachers can acquire materials for their classrooms. Identifying a specific need and finding a grant that supports that need is an important first step. Ensuring that the submission guidelines and deadlines are understood is a critical part of grant writing. Before submitting any grant application, have a colleague proofread for editing issues. The OKLA Easy Grant offers a simple application process teachers can use to help build their classroom libraries. If you are interested in applying for the grant, check out the OKLA Literacy webpage for the grant application process and deadlines. It is as easy as sleeping and helps build a classroom library!
References
Culham, R. (2005). 6 + 1 traits of writing: The complete guide for the primary grades. Scholastic.
Culham, R. (2014). The writing thief: Using mentor texts to teach the craft of writing. International Reading Association.
Rubin, A. (2012). Dragons love tacos (D. Salmieri, Illus.). Dial Books for Young Readers.
Karen B. Coucke currently teaches English Language Learners in grades 1-8 at John Rex Charter Elementary. She is also working on a Ph.D. in Reading at the University of Oklahoma. She can be reached at kbcoucke@ou.edu.

