
Get Your Geek On: Critical Thinking Practice

Critical Thinking Practice by Tracy Wilson

This post discusses an article written by Mirriahi, Alonzo, and Fox (2015).  The main idea of the article will be explored and analyzed, and problems with the methodology and the results of the study will be explained.  Finally, errors, disagreements, and areas for further research will be covered.

Compelling Points

An article featured in Research in Learning Technology (Mirriahi, Alonzo, & Fox, 2015) proposes a framework for blended learning curriculum design.  The major points of the article are made in a somewhat compelling manner; however, the type of methodology the researchers used leaves the study short of a fully convincing tone.  The methodology will be discussed later in this post, but it directly impacts the persuasiveness of the article.

RASE Model

The RASE model was the main point of the article (Mirriahi, Alonzo, & Fox, 2015).  The model supports a student-centered approach to blended learning (Mirriahi, Alonzo, & Fox, 2015).  According to Mirriahi, Alonzo, and Fox (2015), RASE stands for resources, activities, support, and evaluation.  Essentially, the model provides resources for students, activities that promote the acquisition of multiple skills, support for learners, and structured assessments allowing educators to monitor progress (Mirriahi, Alonzo, & Fox, 2015).

Mirriahi, Alonzo, and Fox (2015) provided a clear breakdown of the model, and each section of the article then expounded upon it.  The researchers argued that using the RASE model provides an unambiguous structure for blended learning implementation (Mirriahi, Alonzo, & Fox, 2015).  The authors presented the model in an authoritative, believable tone.

Problems with Methodology

The methodology used by Mirriahi, Alonzo, and Fox (2015) is problematic.  A simple online database search served as the primary foundation for the tools proposed by the researchers.  Thereafter, they chose only eight undergraduate participants for the study, dividing them into two separate groups.  Although the students came from varying disciplines, the sample size is too small to apply the results to the general population.

The findings of the research study are based on qualitative measures.  The article did not offer a discussion section, but rather a single paragraph merely restating the research, and it did not offer in-depth solutions.  The summary was supported with interviews as well as literature from various databases, so the conclusion was as authentic as it could be given those circumstances.

Errors and Disagreement

Using Ruscio’s (2006) book as a best-practice guide, the decisions and conclusions presented in the article resemble the clinical approach described in Critical Thinking in Psychology.  According to Ruscio (2006), the clinical approach to research offers “nothing more sophisticated than using unaided human judgment to evaluate available information and arrive at a decision” (p. 171).  Because the researchers used database searches and the research of others to formulate their assertions, their work appears wholly opinion-based.

The methodology is the primary problem with the study.  Comparing and contrasting the work of others does not lend valid solutions.  It presents an argument with no foundational evidence.  Furthermore, the small sample size makes it impossible to apply any of the findings to the general population.  The research does not account for cultural and gender differences, nor does it account for faculty involvement.

Ruscio (2006) points out that a statistical approach to research involves mathematical calculations.  Without the use of quantitative methods, the article falls short.  While the article does an adequate job of presenting the opinions of eight students, it does not provide much more than that.

Mirriahi, Alonzo, and Fox (2015) failed to offer any new information that could be useful to administrators or faculty members striving to implement blended learning.  The authors indicated that blended learning differs from online learning by virtue of design and delivery (Mirriahi, Alonzo, & Fox, 2015), which is a moot point: no argument exists regarding the definition.  Most researchers agree that blended learning is supplemental to face-to-face delivery and in-class interactions (Porter, Graham, Spring, & Welch, 2014).  There is no reason to include such a distinction.

Mirriahi, Alonzo, and Fox (2015) felt that by providing their framework for blended learning, teachers could formulate activities.  However, the information presented is rooted in the students’ perspective.  There are no guidelines or noteworthy suggestions for educators.  Once again, the teacher is left to his or her own devices, perpetuating the inconsistency of blended learning.

Unanswered Questions

Mirriahi, Alonzo, and Fox (2015) explained that there were deficits in their research, leaving many questions unanswered.  The research should have included faculty members (Mirriahi, Alonzo, & Fox, 2015) as well as a larger number of student participants.  Had faculty members been included, the stages of blended learning could have been thoroughly explored, general practices could have been identified, and possible improvements could have been found.

Future studies should include professional development resources (Mirriahi, Alonzo, & Fox, 2015).  Still, without a reliable framework, that may not be possible.  Nonetheless, the gap allows for further exploration of how faculty development can improve the implementation of blended learning.

Conclusion

The article written by Mirriahi, Alonzo, and Fox (2015) strives to provide a clear model of blended learning.  Nonetheless, their methodology for data collection affects the reliability of their findings.  Without a larger sample size and the inclusion of educators, the research appears skewed.  On the other hand, there are areas for further exploration, which may improve the overall outcome for blended learning implementation.


References

Mirriahi, N., Alonzo, D., & Fox, B. (2015). A blended learning framework for curriculum design and professional development. Research in Learning Technology, 23(1), 1-14. doi: 10.3402/rlt.v23.28451

Porter, W., Graham, C., Spring, K., & Welch, K. (2014). Blended learning in higher education: Institutional adoption and implementation. Computers and Education, 75, 185-195. doi: 10.1016/j.compedu.2014.02.011

Ruscio, J. (2006). Critical thinking in psychology: Separating sense from nonsense (2nd ed.). Belmont, CA: Wadsworth.
