This portal is dedicated to showcasing the final products of the research project INevalCO (EA2010-0052), INnovación en la EVALuación de COmpetencias (Innovation in Competency Assessment): the design and development of procedures and instruments for assessing competencies in blended/virtual learning environments, with student participation, in undergraduate degree programs. Funded by the Secretaría General de Universidades (Ministerio de Educación).
The study focused on an issue central to university teaching today: the design and development of procedures and instruments for assessing competencies in undergraduate degree programs, fostering student participation and adapting these instruments to blended and virtual learning environments.
In CARS, we think of the assessment process as an ongoing cycle, in which assessment of student learning outcomes is used to improve programming. Student learning outcomes are things we want students (or program participants) to know, think, or do upon completing the offered program. For more information on the assessment cycle, see the resources below. The following sections include resources for each step of the cycle.
The Laboratory for Innovation Science at Harvard (LISH) is spurring the development of a science of innovation through a systematic program of solving real-world innovation challenges while simultaneously conducting rigorous scientific research and analysis.
Seven New England colleges and universities formed the Learning Assessment Research Consortium (LARC) and developed online modules on assessment for use in professional development at colleges and universities nationally. NILOA is pleased to house the great work of the Consortium on our website. All LARC-developed materials and module content are under a Creative Commons license.
Lumi is a desktop application that lets you create, edit, view, and share interactive content with dozens of different content types. It is free and open source.
There are many excellent guides and toolkits online that introduce you to Monitoring and Evaluation (M&E). While they are all useful, most work on the assumption that you already have a pretty good grasp of M&E. This toolkit assumes that you may be starting from scratch or that you really need a refresher.
The Mapping Lab develops technologies for the future of digital education. We design and create scalable methods and tools for data modeling, data visualization and digital analytics. These tools are key to unlocking the value of educational data, to creating personalized pathways and adaptive learning systems, and to achieving data-driven decisions.
In this post, "missing data" does not mean the absence of whole categories of data, which is a common enough problem in itself, but missing data values within a given data set.
While this is a common problem in almost all spheres of research and evaluation, it seems particularly common in more qualitative and participatory inquiry, where the same questions may not be asked of all participants or respondents. It is also likely to be a problem when data are extracted from documentary sources produced by different parties, e.g. project completion reports.
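The distinction above, missing values within records rather than missing whole categories, can be sketched with a small, hypothetical example. The field names and records below are illustrative only and are not drawn from the post; the sketch simply shows how per-field missing-value counts might be tallied when not every respondent was asked every question.

```python
# Hypothetical survey-style data set: some values were never recorded,
# and some questions were never asked of a given respondent.
records = [
    {"id": 1, "age": 34, "satisfaction": "high"},
    {"id": 2, "age": None, "satisfaction": "low"},   # value not recorded
    {"id": 3, "satisfaction": "medium"},             # "age" never asked
    {"id": 4, "age": 29},                            # "satisfaction" never asked
]

FIELDS = ["age", "satisfaction"]

def missing_counts(rows, fields):
    """Count, per field, how many rows lack a usable value
    (either the key is absent or the value is None)."""
    counts = {f: 0 for f in fields}
    for row in rows:
        for f in fields:
            if row.get(f) is None:
                counts[f] += 1
    return counts

print(missing_counts(records, FIELDS))  # {'age': 2, 'satisfaction': 1}
```

A per-field tally like this is often a useful first step before deciding how to handle the gaps, whether by excluding incomplete records, imputing values, or analyzing each question on the subset of respondents who answered it.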
Established in 2008, the mission of the National Institute for Learning Outcomes Assessment (NILOA) is to discover and disseminate ways that academic programs and institutions can productively use assessment data internally to inform and strengthen undergraduate education, and externally to communicate with policy makers, families and other stakeholders.
NILOA assists institutions and others in discovering and adopting promising practices in the assessment of college student learning outcomes. Documenting what students learn, know and can do is of growing interest to colleges and universities, accrediting groups, higher education associations, foundations and others beyond campus, including students, their families, employers, and policy makers.
To help campuses ensure that their online courses are learner centered and well designed, a team of Open SUNY staff and campus stakeholders has designed the OSCQR rubric, a customizable and flexible tool for online course quality review.
The OSCQR rubric specifically targets online course design and differs from other online course quality rubrics in several ways. It is not restricted to mature online courses: the rubric can be used formatively with new online faculty to help guide, inform, and influence the design of their new online courses. It is also non-evaluative.
Conceptually, the rubric and the online course review and refresh process are implemented as a professional development exercise, designed to guide online faculty in using research-based effective practices and standards to improve the quality, effectiveness, and efficiency of their online course design, rather than as an online course evaluation or quality assurance procedure.