TheEvidenceDoc wants to help you better understand evidence-based methods. The series is called "Common Myths About Evidence." The first title in the series addresses a common mistake made by new learners: confusing a high-quality evidence review with high-quality evidence. The process is not the same as the product. In a free download, we explain the difference using a timely example. For you city folks who may not know, the time to make hay is while the sun shines! Download the free PDF. 7/11/14



Resources of Interest

Periodically TheEvidenceDoc will share an expanded description of a specific website with exceptional free resources. You can also follow @TheEvidenceDoc on Twitter for links to lots of free resources.

1. Here's a site to help your organization improve the quality of its services. The AHRQ Innovations Exchange is one of the most comprehensive sites on evidence-based healthcare quality improvement, offering instructional reports and videos, innovations, tools, and guides for beginners through experts. Don't miss the guide to systematic evaluation for evidence-based selection of quality improvement efforts, which will help you identify innovations likely to have an important impact in your organization. There are other resources for learning in print, audio, and video formats, as well as a searchable database of quality improvement innovations. These innovations must meet minimum criteria to be posted, including evidence of effectiveness, so you can assess the likelihood that an innovation may work in your setting.
This may be the best website you've never heard of.

2. This time let's head to Twitter - #PRC7 - to access the record of the Seventh International Congress on Peer Review and Biomedical Publication. The event took place in early September last year, but you will still find reviews and commentary on it - like this one published in JAMA online, which calls for implementing existing legislation and enacting additions to close the loopholes in what exists.

In addition to some great Twitter coverage, there are several good blog summaries of the event. You can read summaries by Hilda Bastian: "Bad Research Rising," "Academic Spin," and "Opening a Can of Data Sharing Worms." You can also read the blogs of Richard Smith, former editor of the BMJ: "Time for Science to be about Truth, rather than Careers" and "Medical Journals, a Colossal Problem of Quality."

The Congress also shares abstracts of the plenary and poster sessions for those who weren't able to attend.

From all this data, we get a peek at the challenges facing editors - unbelievably prolific authors, incomplete conflict-of-interest declarations, duplicated papers, and spin. We also see the challenges facing readers: outdated and ill-defined peer review processes, publication of poor-quality and biased studies, pursuit of impact factors, and editorial spin by journals.

Some themes for change are old ones - training for editors (one presenter asked how many of the 500 in attendance had received training before becoming editors; six raised a hand), better and more open peer review processes, registration of trials, and standardized methods for study conduct and reporting.

But did the attendees also miss some opportunities to consider real change?

Just as printed textbooks are fast becoming obsolete sources of clinical knowledge, have traditional medical journals also become dinosaurs? Individual primary studies have value only in the context of the body of research. Repeated admonitions by Doug Altman, Iain Chalmers, Mike Clarke, and others have yet to be implemented by most journals. "Clinical Trials Should Begin and End with Systematic Reviews of Relevant Evidence: 12 Years and Waiting" was published in 2010 to remind us, and yet where is the model for adopting this practice even today? Rejecting all studies that lack this step would go a long way toward reducing the journal pollution that keeps clinicians from finding the evidence they need to help patients make care decisions. Instead, journals are wasting resources refining their search processes to sift through a vast study wasteland, unable to identify the one quarter or fewer of studies that are valid - and the even smaller fraction that have been systematically integrated with other valid, relevant studies.

It's time to rethink the entire story-based journal publication process. We need access to valid data, critically appraised and systematically aggregated. And we need it now. 10-14-13



And here's a starter list of links to a wide variety of general healthcare resources, including evidence reviews, evidence-based recommendations, policy, and research.