The Mapping Lab develops technologies for the future of digital education. We design and build scalable methods and tools for data modeling, data visualization, and digital analytics. These tools are key to unlocking the value of educational data, to creating personalized pathways and adaptive learning systems, and to supporting data-driven decision-making.
Established in 2008, the mission of the National Institute for Learning Outcomes Assessment (NILOA) is to discover and disseminate ways that academic programs and institutions can productively use assessment data internally to inform and strengthen undergraduate education, and externally to communicate with policy makers, families and other stakeholders.
NILOA assists institutions and others in discovering and adopting promising practices in the assessment of college student learning outcomes. Documenting what students learn, know and can do is of growing interest to colleges and universities, accrediting groups, higher education associations, foundations and others beyond campus, including students, their families, employers, and policy makers.
The Transparency in Learning and Teaching in Higher Education project (TILT Higher Ed) is an award-winning national educational development and research project that helps faculty to implement a transparent teaching framework that promotes college students' success. The Project's activities include:
workshops for both faculty and students that promote students' conscious understanding of how they learn;
online surveys that help faculty gather, share, and promptly benefit from current data about students' learning by coordinating their efforts across disciplines, institutions, and countries;
confidential reporting of survey results to faculty;
collaborative research on students' learning experiences.
Since its inception at the University of Illinois at Urbana-Champaign in 2009-2010, the project has involved over twenty-five thousand students in hundreds of courses at more than forty institutions in seven countries. Now housed at UNLV, the project invites participants from all institutions of higher education in the US and abroad. In 2014-2015, the Transparency Project began partnering with the Association of American Colleges and Universities to focus on advancing underserved students' success in higher education.
In CARS, we think of the assessment process as an ongoing cycle, in which assessment of student learning outcomes is used to improve programming. Student learning outcomes are things we want students (or program participants) to know, think, or do upon completing the offered program. For more information on the assessment cycle, see the resources below. The following sections include resources for each step of the cycle.
ISRIA was designed to give participants the opportunity to enhance their skills in planning and developing assessment studies, to better understand how best to report and implement research impact assessments, and to learn how to use those tools and techniques within their own organisations. Most importantly, participants were given the chance to create lasting connections with the people they met and to become part of a growing global community of practice.
The Laboratory for Innovation Science at Harvard (LISH) is spurring the development of a science of innovation through a systematic program of solving real-world innovation challenges while simultaneously conducting rigorous scientific research and analysis.
This list is a companion to our curated list on technical topics. It puts together our posts on issues of measurement, survey design, sampling, survey checks, managing survey teams, reducing attrition, and all the behind-the-scenes work needed to get the data needed for impact evaluations.
This is a curated list of our technical postings, to serve as a one-stop shop for your technical reading. I’ve focused here on our posts on methodological issues in impact evaluation.
We deliver actionable, meaningful research and development that advances the field of educational assessment and evaluation, promotes effective and equitable education policy, and improves evidence-based inferences.
Updated through October 23, 2018.