It is often said that peer review is one of the pillars of scientific research. It is also well known that peer review doesn't actually do its job very well, and, every few years, people like me start writing articles about alternatives to peer review. This isn't one of those rants. Instead, I'm going to focus on something that is probably less well known: peer review actually has two jobs. It's used to provide minimal scrutiny for new scientific results, and to act as a gatekeeper for funding agencies.
What I would like to do here is outline some of the differences between peer review in these two jobs, and the strengths and weaknesses it shows in each case. This is not a rant against peer review, nor should it be: I have been pretty successful in both publications and grant applications over the last couple of years. But I think it's worth exploring the idea that peer review functions much better when judging the value of scientific research than it does when acting as a gatekeeper for scientific funding.
Drawing on sociocultural theory, the present study investigated how children in an intensive elementary-level Grade 6 class for English as a second language (ESL) scaffolded each other while carrying out cooperative learning tasks.
Dekita advocates participatory uses of Web applications in EFL/ESL teaching; we favor open approaches to language learning in which students engage with the public Web instead of being locked into narrowly circumscribed online spaces.
Naboj is a dynamic website that lets you review scientific articles online. At present, the only articles available for review are those that have been posted on the Los Alamos arXiv.
The aim is to create communities of researchers who review and label papers in their field. These communities would be named "Peer Community in xxx", e.g. Peer Community in Evolution or Peer Community in Microbiology.
Just a small number of bad referees can significantly undermine the ability of the peer-review system to select the best scientific papers. That is according to a pair of complex-systems researchers in Austria who have modelled an academic publishing system and shown that human foibles can have a dramatic effect on the quality of published science.
Anyone who thinks peer review is a process of nudges and winks from your mates has never faced the harsh reality of having your work pulled apart, says Jenny Rohn (who has).
If you think research and knowledge are as vital to humanity as air, water, bread and freedom, then you probably know what Peer Evaluation is about.
Peer Evaluation is about providing open access to your primary data, working papers, articles and media, and having them all reviewed and discussed by your peers. Peer Evaluation strongly supports qualified peer reviewing and is, in that respect, a valuable supplement, inspiration and hub for peer-reviewed journals and publications. Finally, Peer Evaluation is an independent, community-interest project.
6Search User Study: Understanding 6Search Users' Behavior and Emerging Network in a Realistic Setting
Sixearch is a collaborative peer-network application that aims to address the scalability and context limitations of centralized search engines while providing a complementary approach to Web search.
D. Taraborelli (2008), Soft peer review. Social software and distributed scientific evaluation, Proceedings of the 8th International Conference on the Design of Cooperative Systems (COOP 08), Carry-Le-Rouet, France, May 20-23, 2008
A. Ghumare, N. Patil, C. Holkar, and V. Badgujar. International Journal on Recent and Innovation Trends in Computing and Communication, 3(1): 175--180 (January 2015).
M. Redekopp, Y. Simmhan, and V. Prasanna. IEEE Workshop on Parallel Algorithms and Software for Analysis of Massive Graphs (ParGraph), pages 1--8 (2011).
Y. Simmhan, B. Plale, and D. Gannon. International Provenance and Annotation Workshop (IPAW), Volume 4145 of Lecture Notes in Computer Science (LNCS), pages 222--236. Springer Berlin / Heidelberg (2006).
A. Kumbhare, Y. Simmhan, and V. Prasanna. IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid), pages 344--353 (2014). CORE A.
J. Zhao, Y. Simmhan, and V. Prasanna. International Provenance and Annotation Workshop, Volume 7525 of Lecture Notes in Computer Science, pages 250--253. Springer (2012). Poster.