
Analysis of and Solutions for Disengagement in Massive Open Online Courses (MOOCs): Literature Review

Abstract

Despite their promise of democratising learning and revolutionising higher education, Massive Open Online Courses (MOOCs) continue to suffer from many drawbacks, the most concerning of which are high disengagement rates. Currently, upwards of 90% of students fail to graduate from most enrolled courses7. In order to investigate the reasons for such high attrition rates and to propose possible solutions, a review of the literature was performed that accounted for both quantitative and qualitative data points. The review indicated that the most prevalent reasons for course incompletion were a lack of academic support and/or guidance, ill-suited course pacing, and ill-suited course assessments. These results prompted the suggestion of two potential solutions to improve retention rates: first, an increased emphasis on the personalisation of education through learning pathways generated by genetic algorithms, explained in detail below; second, the use of intelligent tutoring systems to offer personalised learning support and guidance.

Acknowledgements

This paper was made possible through the support and guidance of two amazingly kindhearted individuals. Mr. Christopher Walleck, project lead at Microsoft, both expressed his perspective as a MOOC student and extended constructive criticism to improve the workings of this paper. He dedicated his time to suggesting elaborations, clarifications, and further content development that played an essential role in making this literature review possible. Dr. Casey Roehrig, HarvardX project lead and Harvard preceptor, offered valuable insights through a personal interview into the difficulties and successes that current MOOC teachers face.

Introduction

Since they burst onto the scene in 2012, Massive Open Online Courses (MOOCs) have quickly become one of the most promising and disruptive frontiers in the wave of education technology1. These MOOCs are differentiated into cMOOCs and xMOOCs: cMOOCs take a connectivism-based approach to learning while xMOOCs follow the more traditional instructive and individualist path2. This literature review addresses the disengagement rates concerning the more popular and scalable xMOOCs (Figure 1)3.

Figure 1 – Illustration of the cMOOC and xMOOC branches of Massive Open Online Courses. Adapted from Phil Hill, 2012

MOOCs are virtual courses that anyone with internet access can take. They are free of charge, require no formal entry prerequisites, and are not restricted by participation limits. While these courses do not offer credits, certificates of completion can often be purchased upon course fulfillment5. Massive Open Online Courses hold great power as drivers of social mobility: they break down the financial and geographical barriers that many families traditionally face. MOOCs have the potential to revolutionise higher education and executive learning and to empower life-long learners6. Despite their promise, though, MOOCs still face a severe challenge: critically high disengagement rates.

Anywhere upwards of 90% of enrolled students leave MOOC courses incomplete7. In this context, incompletion refers to a failure to graduate from the online course, which occurs when a student does not complete either the video lectures or the assigned assessments. This review refers to a failure to complete course assessments and/or lectures as disengagement, and treats the failure to complete both as a subset termed dropout. With such potential to democratise education, it is of paramount importance that these courses be adjusted to optimise outcomes for all students.

Figure 2 – Disengaging students versus Dropout Students. Original

While this high dropout rate is undoubtedly influenced by the lower financial commitment compared to traditional college courses, it is important to review current studies in order to optimise the engagement and pedagogical techniques that are within the preceptor’s realm of influence. From there, suggestions can be made to address the factors associated with MOOC disengagement. Therefore, the goal of this literature review is to answer the following research questions: first, what are the types and causes of student disengagement? Second, what are the potential strategies that will decrease MOOC disengagement?

Materials and Methodology

This literature review aims first to identify the reasons for high MOOC student disengagement, and then to suggest solutions for increased student retention. The Google Scholar search engine was used to examine credible and relevant research on these topics, with search terms including “MOOCs,” “Disengagement,” “Student Testimonies,” and “Pedagogical Techniques.” This causal analysis was performed on two fronts.

In the reviewed literature, k-means clustering was applied to a summary of the quantitative data collected. K-means clustering is an iterative unsupervised learning algorithm that partitions data into distinct, non-overlapping clusters. This literature review only included papers in which k-means clustering was performed more than one hundred times, to account for the algorithm’s dependence on random initialisation. The sample upon which the classification research was performed consisted of a computer science class divided into High School (HS), Undergraduate (UG), and Graduate School (GS) levels (Figure 3). The resulting engagement prototypes proved robust under perturbations in methodology, such as changing data labels, and showed a strong goodness of fit in silhouette cluster-validation tests.
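The multiple-restart protocol can be sketched as follows: a minimal, self-contained k-means with repeated random starts, keeping the run with the lowest within-cluster distance. The feature encoding here (each student as a pair of completion fractions) is an illustrative assumption, not the feature set of the cited studies.

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two feature tuples."""
    return math.dist(a, b)

def mean(cluster):
    """Component-wise mean of a non-empty cluster."""
    return tuple(sum(x) / len(cluster) for x in zip(*cluster))

def kmeans(points, k, iters=50):
    """One k-means run: random initial centroids, then alternating
    assignment and update steps until the centroids stop moving."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda c: dist(p, centroids[c]))].append(p)
        new = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    inertia = sum(min(dist(p, c) for c in centroids) ** 2 for p in points)
    return centroids, inertia

def best_clustering(points, k, restarts=100):
    """Repeat k-means many times (the reviewed studies ran it over one
    hundred times) and keep the run with the lowest within-cluster
    distance, compensating for the dependence on random starts."""
    return min((kmeans(points, k) for _ in range(restarts)),
               key=lambda result: result[1])
```

In a real study, each point could encode, e.g., the fraction of lectures watched and assessments submitted, so that clusters correspond to engagement prototypes such as samplers, auditors, and completers.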

Figure 3 – Course demographics of computer science courses on which k-means clustering was performed. Human Development Index (HDI) is a combined indicator of life expectancy, education, and income3.

This literature review complemented the quantitative data with user testimonies. This emphasis on student learning experience was achieved through direct quotations and surveys8.

A comparison of these quantitative and qualitative results served as the guideline for suggestions of possible solutions.

Results

High Numbers of Course Samplers:

These are students who watch only one or two videos with no real intention of completing the course, but rather to gauge its scope “out of curiosity” or simply to “learn more about MOOCs”9. In a particular study conducted on students who enrolled in three levels of computer science courses, 53% of HS-level course attendees, 80% of UG-level course attendees, and 84% of GS-level course attendees were samplers10. This subpopulation also includes students who pick courses at inappropriate difficulty levels without fulfilling the prerequisites to a reasonable extent. This behavioural pattern of disengagement, however, is a sign of positive student exploration; as such, this paper emphasises creating learning solutions for the students who have fulfilled the prerequisites to an acceptable extent and have a genuine desire to complete the MOOC.

Lack of Personalisation:

Ill-Suited Assessments:

Currently, MOOCs implement a unique combination of short video quizzes, weekly assessments, larger course projects, final exams, and peer-reviewed work. This testing combination is entirely determined by instructors, without any student input or personalisation11. Students have indicated that the course assessments can be unhelpful and that “courseware didn’t meet [their] needs well”12. These ill-suited assessments often lead to students auditing or dropping out of a MOOC altogether because they were “unable to make the transition from theoretical learning to the practical application required for the assessments”12.

Ill-Suited Pacing:

Students’ failure to keep up with course pacing can be attributed to subjectively illogical assignment sequences, students’ previous experience (or lack thereof) with course content, and independent changes in students’ schedules13, 14. Feeling as though the coursework “was set-up in a confusing manner,” was “too fast” or “too slow,” or that there was “not enough time to complete the course” has been shown to cause disengagement even when students had strong intentions to complete the course15, 16. This likely led students to leave their enrolled MOOCs incomplete out of frustration14.

Lack of Academic Support and Guidance:

Currently, MOOCs offer a few modes of communication between instructors, moderators, and students, such as live chats and project-based learning. However, these methods of collaboration become largely inefficient and unhelpful as class sizes increase17. Because of the largely independent nature of MOOCs, many students have cited inadequate peer support and a lack of instructor aid when working through challenging concepts as reasons for exiting a course out of frustration. Some students have identified that the reason they drop out is that they “cannot understand the issues being discussed any more” or they “do not receive enough feedback from assessments”16, 18.

Discussion

The following solutions are proposed to address the resulting reasons for MOOC disengagement.

Personalised Learning Pathways

By their very definition, ill-suited course assessments and pacing can be improved through personalisation. While deviations from traditional one-size-fits-all models are costly in traditional classrooms, MOOCs’ economies of scale make this emphasis on personalised learning pathways realisable19. As such, personalised learning pathways may serve as a vehicle to address issues like ill-suited assessments, illogical assignment sequences, students’ prior knowledge scope, and independent schedule changes.

As in a traditional classroom, MOOCs currently have a rigid schedule of video classes and assessments that students follow in chronological order to complete the course. While this standardised model of content sequencing maximises concept continuity for subjects with clearly pre-defined learning sequences, such as high school mathematics, it fails to consider a specific learner’s past familiarity with course concepts. No single fixed learning path will fit all learners, so the current model results in varying degrees of disorientation and cognitive overload that lead to disengagement20. Furthermore, for MOOCs that teach higher-level concepts with more abstract learning path sequences, such as graduate-level complex analysis, previous pedagogical research indicates that greater emphasis should be placed on the student’s learning style than on a predefined path21. Thus, this paper proposes the adoption of personalised learning paths that optimise both concept continuity and courseware personalisation in terms of pace and content. Such adaptations will lead to a superior MOOC learning experience.

One particularly advantageous mode of conducting this personalisation is the use of genetic algorithms. Genetic algorithms are function optimisers inspired by evolution, a powerful tool that can be leveraged in everything from medical treatment planning to neural network training22. They are characterised by sensitivity to input changes: a slight change in the population can produce markedly different optimised outcomes23. Given the significance of a child’s education, this responsive model is well suited to considering courseware difficulty and concept continuity simultaneously19. In all, this can be done by splitting MOOCs into Micro-MOOCs and applying genetic algorithms for optimisation24.

Implementation: Micro-MOOCs

In order to utilise this approach, MOOC lessons must first be distilled into Micro-MOOC sessions. These sessions split lengthy classes into more granular modules, each lasting no more than five minutes. The Micro-MOOCs are then rearranged in the manner deemed optimal by the genetic algorithm to suit the student’s learning needs. As such, the more abridged these Micro-MOOCs are, the greater the personalisation of the learning pathway will be.
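A minimal sketch of this distillation step, assuming lecture segments have already been annotated with running times; the segment names and durations below are hypothetical.

```python
def to_micro_moocs(segments, cap_minutes=5.0):
    """Greedily pack consecutive lecture segments into Micro-MOOC
    modules whose total running time stays within the cap."""
    modules, current, total = [], [], 0.0
    for name, minutes in segments:
        # Close the current module once adding this segment would exceed the cap.
        if current and total + minutes > cap_minutes:
            modules.append(current)
            current, total = [], 0.0
        current.append(name)
        total += minutes
    if current:
        modules.append(current)
    return modules

lecture = [("intro", 2), ("definitions", 3), ("worked example", 4), ("proof", 5)]
print(to_micro_moocs(lecture))
# → [['intro', 'definitions'], ['worked example'], ['proof']]
```

Each resulting module becomes one gene for the genetic algorithm; a finer cap yields more genes and therefore a more personalisable pathway, at the cost of a larger search space.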

Implementation: Pre-Course Diagnostic Test

MOOC instructors will then administer to all students a pre-course assessment gauging their familiarity with the course material, as well as their preferred learning style, difficulty, and pace. This data will help the genetic algorithm construct the optimal learning path by informing the scoring system of the fitness function.
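One simple way the diagnostic results could feed the fitness function’s scoring system is sketched below; the linear weighting and the floor/boost parameters are assumptions for illustration, not part of the cited work.

```python
def concept_weights(pretest_scores, floor=1.0, boost=3.0):
    """Turn diagnostic pretest scores (0.0 = unfamiliar, 1.0 = mastered)
    into fitness-function weights: the weaker the concept, the larger
    its weight, so the optimiser prioritises it in the pathway."""
    return {concept: floor + boost * (1.0 - score)
            for concept, score in pretest_scores.items()}
```

For example, a mastered concept (score 1.0) keeps the floor weight of 1.0, while a completely unfamiliar one (score 0.0) is weighted 4.0.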

Implementation: Genetic Algorithms

With these preliminary elements in place, the genetic algorithm (GA) will have the necessary components to generate the personal learning path. Genetic algorithms, much like neural networks, mimic a biological process: they are optimisation techniques modelled after Darwin’s theory of evolution, and they are uniquely efficient and easily programmable25. A GA iterates on a population of candidate solutions through crossovers and mutations, creating a next generation of offspring that fits the objective better than the parent population.

The seven-step process is completed as follows, as shown in Figure 4:

  1. Definition of Genes and Chromosomes: Every Micro-MOOC is coded as a string of integers, called a gene. These genes are then randomly combined with other genes in order to form the whole individual, the chromosome. In essence, every chromosome represents a potential learning pathway.
  2. Definition of Initial Population Size: The initial population size is influenced by the number of genes, in other words, the granularity of the Micro-MOOCs. While a higher initial population size increases the likelihood of optimising the personalised pathway, it eventually gives way to diminishing marginal returns because of the additional search time required.
  3. Definition of the Fitness Function: The fitness of every chromosome is assessed to determine its probability of reproduction. This performance index simultaneously maximises concept-relation degrees, prioritises concepts the pretest identified as challenging to the student, and honours the student’s pacing preferences.
  4. Parent Selection: the most “fit” individuals, in this case, single chromosomes, are then given a proportionally higher probability to reproduce through methods such as the weighted roulette selection. This will optimise the fitness of the next generation of personalised learning pathways.
  5. Crossovers: Segments of the fittest parents are recombined, generating a “fitter” next generation of personalised learning pathways.
  6. Mutation: Random mutations ensure that the optimisation of the learning pathway does not get stuck at a local maximum, that is, a value that is maximal only within a limited region of the search space rather than over the entire domain. However, a mutation frequency that is too high leads to slower convergence.
  7. Convergence: Once the fitness of succeeding generations shows no significant improvement, reproduction stops.

Figure 4 – Illustration of the seven-step Genetic Algorithm optimisation process. Original
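The seven-step process can be condensed into a small sketch. The module names, prerequisite pairs, and pretest scores below are hypothetical illustrations; weighted roulette selection, order crossover, swap mutation, and a stalled-improvement stopping rule stand in for the components described above.

```python
import random

# Hypothetical Micro-MOOC modules and prerequisite pairs (illustrative only).
MODULES = ["limits", "derivatives", "chain_rule", "integrals", "series"]
PREREQS = {("limits", "derivatives"), ("derivatives", "chain_rule"),
           ("derivatives", "integrals"), ("integrals", "series")}
# Hypothetical pretest familiarity: lower score = weaker concept, schedule earlier.
PRETEST = {"limits": 0.9, "derivatives": 0.4, "chain_rule": 0.2,
           "integrals": 0.6, "series": 0.8}

def fitness(path):
    """Step 3: reward prerequisite-respecting order (concept continuity)
    and early placement of concepts the pretest flagged as weak."""
    score = 0.0
    for before, after in PREREQS:
        if path.index(before) < path.index(after):
            score += 10.0
    for pos, concept in enumerate(path):
        score += (1.0 - PRETEST[concept]) * (len(path) - pos)
    return score

def roulette(population, fits):
    """Step 4: weighted roulette selection, fitter pathways reproduce more."""
    spin, acc = random.uniform(0, sum(fits)), 0.0
    for individual, f in zip(population, fits):
        acc += f
        if acc >= spin:
            return individual
    return population[-1]

def crossover(a, b):
    """Step 5: order crossover, keep a slice of one parent and fill the
    remaining modules in the order they appear in the other parent."""
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [g for g in b if g not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(path, rate=0.1):
    """Step 6: an occasional swap keeps the search off local maxima."""
    path = path[:]
    if random.random() < rate:
        i, j = random.sample(range(len(path)), 2)
        path[i], path[j] = path[j], path[i]
    return path

def evolve(pop_size=30, patience=15):
    """Steps 1, 2, and 7: a random initial population of candidate
    pathways, evolved until fitness stops improving."""
    population = [random.sample(MODULES, len(MODULES)) for _ in range(pop_size)]
    best, best_fit, stale = population[0], -1.0, 0
    while stale < patience:
        fits = [fitness(p) for p in population]
        gen_best = max(fits)
        if gen_best > best_fit:
            best_fit, best, stale = gen_best, population[fits.index(gen_best)], 0
        else:
            stale += 1
        population = [mutate(crossover(roulette(population, fits),
                                       roulette(population, fits)))
                      for _ in range(pop_size)]
    return best
```

Because crossover and mutation both preserve permutations, every candidate remains a valid ordering of all modules; the fitness weights (here 10.0 per respected prerequisite) set the trade-off between continuity and remediation.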

In this manner, the genetic algorithm will have successfully personalised learning paths by maximising the connection between the domain concepts and catering to individuals’ learning needs. As such, this strategy will help decrease the number of students who disengage due to ill-tailored course pacing, disorienting sequencing, or overwhelming selection of assignments. Assessment modules can be optimised in a similar manner, adapting to users’ assessment style preferences to best engage the students who currently resort to auditing. This amelioration thus assuages disengagement caused by lack of personalisation.

Intelligent Tutoring Systems

In order to remedy the lack of academic support and guidance found in modern MOOCs, this paper proposes the use of artificial intelligence in the form of intelligent tutoring systems (ITS). This conclusion was drawn by first analysing the distinct advantages that human tutors possess over non-intelligent Computer-Aided Instruction (CAI). CAI is the instructional method currently used by most MOOCs. It is characterised by non-personalised hints and an inability to provide student-specific solutions, recommendations, or feedback. In other words, CAI systems are answer-based teaching systems.

Relative Effectiveness: Human Tutor Versus CAI support

Extensive previous research has put forth the following hypotheses to explain why human tutoring has demonstrated superior effectiveness at raising test scores relative to CAI:

1) Feedback: Human tutors are able to intervene more quickly when students make an error in their thought processes, whereas CAI can only indicate that the student came to an incorrect final conclusion. This opacity prevents the student from homing in on his/her exact logical flaw, leading to a less effective learning session26.

2) Scaffolding: Human Tutors are able to invoke a method of “guided response” that helps nudge tutees in the right direction27. This technique is illustrated in the following dialogue:

Student: “I’m not sure how to find the number of atoms of the Oxygen in 1 mole of water”

Teacher: “What are your knowns?”

Student: “The moles of water, the molar ratio of Oxygen atoms to water molecules, and the number of atoms in a mole.”

Teacher: “Great, how do you convert the number of moles of water to the moles of Oxygen?”

Student: “Oh, can I use the molar ratio?”

Teacher: “Right. Now once you have the moles of Oxygen, how can you convert that to the number of atoms?”

Student: “I can multiply by the number of atoms in a mole!”

In this dialogue, steps 2, 4, and 6 are all examples of human tutors’ ability to scaffold students’ responses and consequently create a more effective tutoring method than simple CAI.

Relative Effectiveness: Human Tutor Versus ITS support

With these considerations in mind, intelligent tutoring systems can be designed to imitate human tutors’ advantageous qualities. Intelligent tutoring systems lie at the intersection of subject knowledge, as exhibited by CAI, AI-powered computer science, and psychological cognitive techniques. The general ITS structure, as shown in Figure 5, is composed of an expert knowledge module, a student knowledge module, a tutoring module, and a user interface. When the student answers a question, the response passes through the user interface to the tutoring module. The tutoring module then updates the student model module with this incoming data. To verify the student’s responses, this updated student model is compared with the static expert model. From there, results are sent to the tutoring module, which employs pedagogical techniques through the user interface module to educate the user on the validity of their response and possible improvements28.
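The interaction between these modules might be skeletonised as follows; the class names, the mastery-update rule, and the 0.4 review threshold are illustrative assumptions, not a reference ITS implementation.

```python
class ExpertModel:
    """Static domain knowledge: the reference answer for each step."""
    def __init__(self, solutions):
        self.solutions = solutions

    def check(self, step, answer):
        return answer == self.solutions[step]

class StudentModel:
    """Evolving estimate of the learner's mastery of each concept."""
    def __init__(self):
        self.mastery = {}

    def update(self, concept, correct, rate=0.3):
        # Nudge the mastery estimate toward 1.0 on success, 0.0 on failure.
        prior = self.mastery.get(concept, 0.5)
        target = 1.0 if correct else 0.0
        self.mastery[concept] = prior + rate * (target - prior)

class TutoringModule:
    """Compares the response against the expert model, updates the student
    model, and picks a pedagogical action to send back through the UI."""
    def __init__(self, expert, student):
        self.expert, self.student = expert, student

    def respond(self, step, concept, answer):
        correct = self.expert.check(step, answer)
        self.student.update(concept, correct)
        if correct:
            return "Correct, moving on."
        if self.student.mastery[concept] < 0.4:
            return "Let's review this concept with a worked example."
        return "Not quite; re-check this step."
```

The user interface would simply relay answers into `TutoringModule.respond` and display the returned message, completing the loop in Figure 5.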

Figure 5 – Basic Architecture of Intelligent Tutoring Systems. Original

In order to imitate the first advantage of human tutoring, feedback, an ITS can break down problem solutions into more digestible sub-steps that require only limited reasoning per step. In this manner, should the tutee make an error, he or she can easily track down the flawed reasoning in the last step. As for the second advantage, scaffolding, an ITS can employ sub-step “hint” buttons that the student can use as prompts to jog their memory and reap the same benefits as human tutor scaffolding. These ITS approaches are certainly a meaningful improvement over current answer-based CAI, which is largely incapable of giving sufficient feedback or scaffolding. It is of note that for both ITS and human tutors, the marginal benefit of increased feedback and scaffolding sub-step granularity eventually plateaus; therefore, in the construction of an ITS, an upper boundary on the number of feedback or scaffolding sub-steps may prove to be a meaningful advantage over human tutors25.
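A sketch of capped sub-step scaffolding, assuming hint texts are authored per sub-step (the hints and answer below are hypothetical, echoing the mole dialogue); the cap reflects the plateauing marginal benefit of finer granularity.

```python
class ScaffoldedStep:
    """One problem sub-step with a capped ladder of increasingly
    specific hints, mimicking a human tutor's guided responses."""
    def __init__(self, hints, answer, max_hints=3):
        self.hints = hints[:max_hints]  # cap granularity: benefit plateaus
        self.answer = answer
        self.used = 0

    def hint(self):
        """Return the next hint, or the worked solution once exhausted."""
        if self.used < len(self.hints):
            self.used += 1
            return self.hints[self.used - 1]
        return "No more hints; here is the worked solution."

    def check(self, response):
        return response == self.answer
```

A student pressing the “hint” button repeatedly walks down the ladder, mirroring the teacher’s nudges in the dialogue above, and an incorrect answer at any sub-step localises the flawed reasoning to that step.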

With these implementations, intelligent tutoring systems have been shown to be “just as effective” as one-on-one human tutors in STEM fields and very effective in enhancing student learning in a wide range of domains29. As such, they are a suitable approach and scalable option to provide MOOCs with increased student support and guidance30.

Technical Integration of Intelligent Tutoring Systems

Previous research has indicated the potential difficulty of integrating intelligent tutoring systems built with the popular Cognitive Tutor Authoring Tools (CTAT) as they may not be compatible with the majority of MOOC browsers. A recent study, however, successfully made CTAT tutors runnable on all popular web browsers by supporting HTML as the tutor interface technology31. This is a promising step forward in the future of synergistic relationships between ITS and MOOCs. As such, the integration of Intelligent Tutoring Systems shows promise to assuage reasons for student disengagement.

Conclusion

MOOCs hold great promise for the future of education technology. However, they still currently face outrageously high attrition rates. Through analysis of literature, this paper has identified that the principal causes of MOOC disengagement include a lack of personalisation as well as a lack of academic support and/or guidance. This paper proposes to implement personalised learning pathways through genetic algorithm optimisation. More tailored and scalable aid can be offered to students through Intelligent Tutoring Systems. These proposed solutions aim to not only decrease disengagement but also ameliorate the quality of the MOOC learning experience for all students. In future research, it is important to be aware of data biases and ensure these learning tools focus on helping students develop their intelligence, not just develop their test-taking aptitude. Moreover, it is important to further investigate the sustainability and quality of dividing complex courses into micro-MOOCs, ultimately providing instructors with the relevant tools to do so.

Abbreviations

Abbreviation – Full Form

MOOCs – Massive Open Online Courses
HS – High School
UG – Undergraduate
GS – Graduate School
HDI – Human Development Index
CAI – Computer-Aided Instruction
ITS – Intelligent Tutoring System
GA – Genetic Algorithm

References

  1. Gaebel, Michael. MOOCs: Massive open online courses. EUA, 2014. https://eua.eu/downloads/publications/moocs%20-%20massive%20open%20online%20courses.pdf. (Accessed April 3, 2020)
  2. Fidalgo-Blanco, Ángel, María Luisa Sein-Echaluce, and Francisco José García-Peñalvo. “From massive access to cooperation: lessons learned and proven results of a hybrid xMOOC/cMOOC pedagogical approach to MOOCs.” International Journal of Educational Technology in Higher Education 13, no. 1 (2016): 24. https://doi.org/10.1186/s41239-016-0024-z. (Accessed April 3, 2020)
  3. Kizilcec, René F., Chris Piech, and Emily Schneider. “Deconstructing disengagement: analyzing learner subpopulations in massive open online courses.” In Proceedings of the third international conference on learning analytics and knowledge, pp. 170-179. 2013. https://dl.acm.org/doi/pdf/10.1145/2460296.2460330. (Accessed April 3, 2020)
  4. Belfield, Clive. “Center for Benefit-Cost Studies in Education Teachers College, Columbia University.” (2013). https://files.eric.ed.gov/fulltext/ED547237.pdf. (Accessed April 4, 2020)
  5. Hollands, Fiona M., and Devayani Tirthali. “Why Do Institutions Offer MOOCs?.” Online Learning 18, no. 3 (2014): n3. https://files.eric.ed.gov/fulltext/EJ1043160.pdf. (Accessed April 4, 2020)
  6. Khalil, Hanan, and Martin Ebner. “MOOCs completion rates and possible methods to improve retention-A literature review.” In EdMedia+ innovate learning, pp. 1305-1313. Association for the Advancement of Computing in Education (AACE), 2014. https://www.researchgate.net/profile/Martin_Ebner2/publication/306127713_MOOCs_completion_rates_and_possible_methods_to_improve_retention-A_literature_review/links/57bb349c08aefea8f0f44ce9.pdf. (Accessed April 4, 2020)
  7. Liyanagunawardena, Tharindu R., Pat Parslow, and Shirley Williams. “Dropout: MOOC participants’ perspective.” (2014): 95-100. http://centaur.reading.ac.uk/36002/2/MOOC%20Dropout%20Participants%20Perspective.pp95-100.pdf. (Accessed April 4, 2020)
  8. Onah, Daniel FO, Jane Sinclair, and Russell Boyatt. “Dropout rates of massive open online courses: behavioural patterns.” EDULEARN14 proceedings 1 (2014): 5825-5834. https://warwick.ac.uk/fac/sci/dcs/people/research/csrmaj/daniel_onah_edulearn14.pdf. (Accessed April 4, 2020)
  9. de Vries, P. “Online learning and higher engineering education.” In 41st SEFI Conference, 16-20 September 2013, Leuven. 2013. https://www.researchgate.net/publication/257251734_Online_Learning_and_Higher_Engineering_Education_the_MOOC_Phenomenon. (Accessed April 4, 2020)
  10. Kizilcec, René F., Chris Piech, and Emily Schneider. “Deconstructing disengagement: analyzing learner subpopulations in massive open online courses.” In Proceedings of the third international conference on learning analytics and knowledge, pp. 170-179. 2013. https://dl.acm.org/doi/pdf/10.1145/2460296.2460330. (Accessed April 4, 2020)
  11. Jordan, Katy. “Massive open online course completion rates revisited: Assessment, length and attrition.” International Review of Research in Open and Distributed Learning 16, no. 3 (2015): 341-358. https://www.erudit.org/en/journals/irrodl/1900-v1-n1-irrodl04980/1065985ar.pdf. (Accessed April 4, 2020)
  12. BEZERRA, Luis N., and Márcia T. SILVA. “A review of literature on the reasons that cause the high dropout rates in the MOOCS.” Revista Espacios 38, no. 05 (2017). https://www.revistaespacios.com/a17v38n05/a17v38n05p11.pdf. (Accessed April 7, 2020)
  13. Onah, D. F., J. Sinclair, R. Boyatt, and J. Foss. “Massive open online courses: learner participation.” In Proceeding of the 7th International Conference of Education, Research and Innovation, pp. 2348-2356. 2014. https://warwick.ac.uk/fac/sci/dcs/people/research/csrmaj/daniel_onah_iceri14.pdf. (Accessed April 7, 2020)
  14. de Vries, P. “Online learning and higher engineering education.” In 41st SEFI Conference, 16-20 September 2013, Leuven. 2013. https://www.researchgate.net/publication/257251734_Online_Learning_and_Higher_Engineering_Education_the_MOOC_Phenomenon. (Accessed April 7, 2020)
  15. Kulik, James A., and J. D. Fletcher. “Effectiveness of intelligent tutoring systems: a meta-analytic review.” Review of educational research 86, no. 1 (2016): 42-78. https://apps.dtic.mil/dtic/tr/fulltext/u2/1030353.pdf. (Accessed April 7, 2020)
  16. Boyatt, Russell, Mike Joy, Claire Rocks, and Jane Sinclair. “What (Use) is a MOOC?.” In The 2nd international workshop on learning technology for education in cloud, pp. 133-145. Springer, Dordrecht, 2014. https://www.researchgate.net/publication/291257649_What_Use_is_a_MOOC. (Accessed April 10, 2020)
  17. Gütl, Christian, Rocael Hernández Rizzardini, Vanessa Chang, and Miguel Morales. “Attrition in MOOC: Lessons learned from drop-out students.” In International workshop on learning technology for education in cloud, pp. 37-48. Springer, Cham, 2014. https://link.springer.com/chapter/10.1007/978-3-319-10671-7_4. (Accessed April 10, 2020)
  18. Huisman, Bart, Wilfried Admiraal, Olga Pilli, Maarten van de Ven, and Nadira Saab. “Peer assessment in MOOCs: The relationship between peer reviewers’ ability and authors’ essay performance.” British Journal of Educational Technology 49, no. 1 (2018): 101-110. https://www.researchgate.net/publication/322385452_Peer_assessment_in_MOOCs_The_relationship_between_peer_reviewers’_ability_and_authors’_essay_performance. (Accessed April 10, 2020)
  19. Mackness, Jenny, Sui Mak, and Roy Williams. “The ideals and reality of participating in a MOOC.” In Proceedings of the 7th international conference on networked learning 2010, pp. 266-275. University of Lancaster, 2010. https://www.researchgate.net/publication/235886519_The_Ideals_and_Reality_of_Participating_in_a_MOOC. (Accessed April 15, 2020)
  20. Pardos, Zachary A., Steven Tang, Daniel Davis, and Christopher Vu Le. “Enabling real-time adaptivity in MOOCs with a personalized next-step recommendation framework.” In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale, pp. 23-32. 2017. http://people.ischool.berkeley.edu/~zp/papers/LAS_realtime_adaptive.pdf. (Accessed April 15, 2020)
  21. Chen, Chih-Ming. “Intelligent web-based learning system with personalized learning path guidance.” Computers & Education 51, no. 2 (2008): 787-814. https://nccur.lib.nccu.edu.tw/bitstream/140.119/57638/1/787-814.pdf. (Accessed April 15, 2020)
  22. Ghaheri, Ali, Saeed Shoar, Mohammad Naderan, and Sayed Shahabuddin Hoseini. “The applications of genetic algorithms in medicine.” Oman medical journal 30, no. 6 (2015): 406. https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=The+Applications+of+Genetic+Algorithms+in+Medicine&btnG=. (Accessed May 23, 2020)
  23. Pane, John F., Elizabeth D. Steiner, Matthew D. Baird, and Laura S. Hamilton. “Continued Progress: Promising Evidence on Personalized Learning.” Rand Corporation (2015). https://www.rand.org/pubs/research_reports/RR1365.html. (Accessed April 15, 2020)
  24. Kumar, Manoj, Mohamed Husain, Naveen Upreti, and Deepti Gupta. “Genetic algorithm: Review and application.” Available at SSRN 3529843 (2010). http://www.csjournals.com/IJITKM/PDF%203-1/55.pdf (Accessed May 2, 2020)
  25. Zhang, Xiaoyan, Lufeng Cao, and Yipeng Yin. “Individualized learning through MOOC: online automatic test system based on genetic algorithm.” In Proceedings of the 2016 International Conference on Intelligent Information Processing, pp. 1-6. 2016. https://www.researchgate.net/publication/314106711_Individualized_learning_through_MOOC_online_automatic_test_system_based_on_genetic_algorithm. (Accessed April 15, 2020)
  26. Janikow, Cezary Z., and Zbigniew Michalewicz. “An experimental comparison of binary and floating point representations in genetic algorithms.” In ICGA, vol. 1991, pp. 31-36. 1991. https://www.researchgate.net/profile/Abul_Beg/publication/309614123_Advantages_and_limitations_of_genetic_algorithms_for_clustering_records/links/59ed853aaca272cddde06776/Advantages-and-limitations-of-genetic-algorithms-for-clustering-records.pdf. (Accessed April 15, 2020)
  27. VanLehn, Kurt. “The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems.” Educational Psychologist 46, no. 4 (2011): 197-221. http://www.public.asu.edu/~kvanlehn/Stringent/PDF/EffectivenessOfTutoring_Vanlehn.pdf. (Accessed April 15, 2020)
  28. Boetzer, Marten, Christiaan V. Henkel, Hans J. Jansen, Derek Butler, and Walter Pirovano. “Scaffolding pre-assembled contigs using SSPACE.” Bioinformatics 27, no. 4 (2011): 578-579. https://watermark.silverchair.com/btq683.pdf. (Accessed April 15, 2020)
  29. Ma, Wenting, Olusola O. Adesope, John C. Nesbit, and Qing Liu. “Intelligent tutoring systems and learning outcomes: A meta-analysis.” Journal of educational psychology 106, no. 4 (2014): 901. https://www.apa.org/pubs/journals/features/edu-a0037123.pdf. (Accessed April 15, 2020)
  30. Steenbergen-Hu, Saiying, and Harris Cooper. “A meta-analysis of the effectiveness of intelligent tutoring systems on college students’ academic learning.” Journal of Educational Psychology 106, no. 2 (2014): 331. https://core.ac.uk/download/pdf/37750661.pdf. (Accessed April 25, 2020)
  31. Aleven, Vincent, Jonathan Sewall, Octav Popescu, Michael Ringenberg, Martin Van Velsen, and Sandra Demi. “Embedding intelligent tutoring systems in MOOCs and e-learning platforms.” In International Conference on Intelligent Tutoring Systems, pp. 409-415. Springer, Cham, 2016. https://link.springer.com/chapter/10.1007/978-3-319-39583-8_49. (Accessed April 25, 2020)

Biography

Grace is a 16-year-old high school student who is fascinated by the world of Education Technology. She is constantly looking for new ways to learn and explore the world around her through the use of cutting-edge technologies. In her free time, she loves watching Ted Talks, listening to audiobooks, and French baking.
