On 2018 Jan 06, Ivan Buljan commented:
Dear Hilda,
Thank you for your comments. You have made several excellent observations and we are glad to clarify those issues.
As addressed in the limitations, a high dropout rate was present in the trials with consumers and physicians, while there was none in the student trial. Unlike the student trial, where we tested the efficacy of the formats, in the consumer and physician trials our intention was to test their effectiveness (real-world application). We believe that our results are representative because, in the real world, a significant proportion of patients and physicians will not want to read the CSR. It is true that in the student trial there was no difference in reading experience between the PLS and the infographic. Although there was a significant difference between the infographic and the PLS in the consumer and physician trials, it has to be admitted (as we did in the article) that the difference was very small (only a couple of points on the reading-experience scale), which raises the question of whether it is clinically significant.
There were differences between the infographic and the PLS in the number of numerical expressions, but it cannot be claimed that this was the only reason for dropout from the trials. Participants may have refused to participate even before reading the text, or before answering the questions about the summary. Also, despite the different numbers of numerical expressions, numeracy was a predictor of results for all formats, meaning that participants with higher numeracy levels were better at understanding the results in a summary format even when fewer numerical expressions were used.
Although there were differences between the infographic and the PLS in how the quality of the evidence was presented, we accounted for this when scoring the results. We scored an answer as correct if it described the studies with terms such as “low quality of evidence”, “the quality of the studies differed greatly”, “the strength of the evidence from the studies differed”, or “the studies were too small”, whereas incorrect answers included “the quality of evidence was very good”, “there were no differences in the quality of the studies”, or “all studies produced evidence of equal strength”. The result was that there was no difference between the formats in the number of correct answers to that question in any of the three trials.
As for the meta-analysis, we had individual-level data from all three trials, so it was not necessary to use a meta-analysis to estimate the effect. We presented the pooled results from all three trials.
The ways of presenting evidence to the public using infographics are still not well explored. We do not know whether the symbols used in this research were appropriate for consumers for their intended purposes. To explore this, we would need to ask participants their opinions about the symbols used in the infographic, which requires a qualitative approach; such a study is underway.
This comment, imported by Hypothesis from PubMed Commons, is licensed under CC BY.