Pandemic Publishing: Speed vs Quality Control

Robert A. Harrington, MD; Clyde W. Yancy, MD, MSc; Brahmajee J. Nallamothu, MD, MPH


August 27, 2020

This transcript has been edited for clarity.

Robert A. Harrington, MD: Hi. I'm Bob Harrington from Stanford University, here on theheart.org | Medscape Cardiology.

Over the past couple of months of the COVID-19 pandemic, there have been many issues for the clinical cardiovascular community to discuss. One that's at the top of many people's lists is how to get reliable information out to the clinical and public health communities in a timely way.

I have been thinking about this often, as there have been a few issues with papers being retracted and with preprints not looking anything like the subsequent publications.

I thought it would be worthwhile to bring in two well-known senior investigators, clinicians, and editors from the cardiovascular journal world to have this discussion about how to get this information out in a reliable but timely way, what they think about preprints these days, how they work with their editorial staff and their reviewers to ensure data integrity, and a variety of other topics.

I'm really pleased to be joined by two friends and colleagues. First off, Brahmajee Nallamothu is from the University of Michigan, where he's a professor of internal medicine in the Division of Cardiovascular Medicine. Brahmajee is with us today in his role as editor-in-chief of Circulation: Cardiovascular Quality and Outcomes. Brahmajee, thanks for joining us.

Brahmajee J. Nallamothu, MD, MPH: Thanks, Bob.

Harrington: Also with us today is Clyde Yancy from Northwestern University. Clyde is the chief of the Division of Cardiology at Northwestern Feinberg School of Medicine, where he's also a professor of medicine and vice dean for diversity and inclusion. Clyde, thanks for joining us.

Clyde W. Yancy, MD, MSc: Thanks, Bob. Happy to be here.

Harrington: I should note that Clyde's with us in his editorial role as the deputy editor of JAMA Cardiology. That's the hat that I'm asking you to wear today, Clyde, among your many.

Yancy: I'm twice happy to be here, then.

Harrington: Let's start with the question from the introduction. Brahmajee, I'll start with you. How do you balance speed and reliability in a pandemic?

As a clinician and as a scientist, I'm anxious to hear what works. What are they observing in New York? What's happening in Italy? Do we have some early reports out of China?

We're all having to make decisions about a disease we didn't know a whole lot about, and we're relying, at least in part, on the scientific publishing process. Let's go back to March or April, when you first started seeing papers come in. How did you think about those issues as editor-in-chief?

Nallamothu: We struggled with many of the same things that you highlighted. Very early on, around late February, as each of our health systems and schools were thinking about this challenge, we took it upon ourselves and the editorial team to start to think about this as well.

We knew that this was going to be with us for a while and that we were going to have an important role in terms of getting the science out there. Three things came to mind. We sat down as a team and talked, and we came to the conclusion that, first, do no harm.

With the role that scientific journals play in terms of disseminating high-quality information, we had to be very careful about what we would push out there. There's a great line that we often came back to as an editorial team, which is that there are no emergencies in pandemics. Like the old John Wooden quote, "Be quick but don't hurry." I think we took that to heart.

The second thing we realized was that we didn't have a system that could operate with the volume that we were starting to see. Mike Ho (the deputy editor) and I triage the manuscripts, and we quickly started to think about the types of papers we would or would not be interested in so as not to burden our associate editors and reviewers, who are an important part of the scientific community. I'm interested to hear what Dr Yancy thought and what they came up with as well.

The third thing was that we really wanted to identify where we could make the most impact. Each of us has different expertise in scientific disciplines, so we were interested in trying to inform the outcomes research community in particular and trying to play a proactive role. We wanted to seek out those types of papers, and then once we got them, we wanted to expedite those through the process.

Harrington: I'm glad you brought up your last comment about impact, because the two of you represent two different types of journals. Brahmajee, you're going after a subspecialty investigator community, the outcomes research community, and the health services research community. JAMA Cardiology is going for the broader clinical community. Clyde, how did you start thinking about reliability vs speed, and what kind of conversations did you and the other editors of JAMA Cardiology have?

Yancy: Bob, thank you for having this conversation, because the community is really curious about this and there have to be lessons learned that we can carry forward. Brahmajee, your comments are very familiar because in every editorial room, virtual or otherwise, these same thoughts were being discussed.

The single guiding ethos that affected us as senior editors and as part of our brand equity was, can we be the trusted voice and can we maintain that trust? Truth is very important to us. The litmus test when we would see papers come in was, "Is this true?"

That seems like a simple question. But as you know, Bob, sometimes there is a complexity in answering that question. No matter what the pace of information is, no matter how interesting, no matter how startling it is, we have to maintain that trust because it takes so much to get it and very little to lose it. That was our guiding ethos.

After we recognized that we would not vary from our ethos, then we had to think operationally, how would we do this? This is the point in time when having experience among your senior leaders is so very important, particularly leaders who have experience internationally and, specifically, Bob Bonow.

Bob was in a unique position in that he already knew a lot of the people in China executing the research. Much like Clyde Yancy and Bob Harrington are on a first-name basis, Bob Bonow and a number of investigators in China are on a first-name basis. That gave us a little bit of an inside edge, if you will.

It took us one step closer to appreciating the integrity and executing the trust. For us, it started with maintaining our commitment to being the trusted voice and searching for the truth and then really relying on the most precious resource we have, which is the experience that comes from people who have been in the business for 20-30 years.

Harrington: You are converging on some common themes here. I'll go to you first, Brahmajee. One of the issues that has certainly arisen in the last few months is, how does one balance science and politics? The hydroxychloroquine story is an interesting one in that regard.

Both of you have used the word "trust," and sometimes people trust nonscientists to deliver scientific information. Did you talk about trying to separate science from politics?

Nallamothu: We tried not to directly think about politics, particularly when we were evaluating science itself. As you know, most journals publish things beyond original articles. We tried to publish cardiovascular perspectives and other pieces that could influence the community. Very early on, actually, we published a perspective that was a warning call to health systems to start thinking about COVID-19 and their responses to it.

In terms of evaluating the science, we tried to separate the two. It's very challenging. In preparing for this discussion, you referenced Ioannidis and colleagues' paper out of Stanford University. There were many thoughts about how we would have handled that. We tried, as Clyde said, to learn from others as everything moved into this uncertain environment.

In response to some of the retractions and controversies over the past several months, we have tried to institute a more redundant system of review to be extra careful.

One last thing is that I really appreciated this job before, but in the past 3-4 months, I have appreciated it even more. With all the noise that's out there, it is unbelievable how important it is for scientific journals to maintain integrity and trust, because they really are an important voice.

Peer review, for all its faults — and sometimes peer review doesn't live up to the status that it's gotten in our communities — has helped us shape a lot of science that we published in the past few months and made it better.

Harrington: It's interesting. I often reference Winston Churchill's comment about democracy: "[It's] the worst form of government except for all the others." I think peer review might fall into that same category as the worst form of review, except for all the alternatives — at least thus far — that we've discovered.

Clyde, a topic that you think about often is data integrity. Those issues can be easy or really difficult to pick up. I think of the whole peer review publication process as clinical research. There's this trust trail: you have to trust that each and every person doing the work did the right thing. You must have talked about this at JAMA Cardiology.

Yancy: Absolutely. I'm smiling because Brahmajee said something that was so very tactful: "a greater appreciation." Buddy, you worked your tail off. Let's just put it out there. It was a labor of love, and we approached it as a duty.

Back to your question. Thankfully, there is triple redundancy within the walls of our organization, meaning that we have senior editors, associate editors, and statistical reviewers. That is a pretty robust clearinghouse to make it through to begin with.

Recognize that we didn't have the time to send everything out for peer review. Some COVID-19 papers came exclusively to the associate editors and editors, and we were turning decisions around within 24 hours. We were constantly getting new papers in our queue, exchanging ideas, and then doing our own fact-checking and our own peer review of each other.

I really hope the community gives every journal a bit of license, because the processes that were in place were established in a very different era. Sending a paper out for peer review, even expecting it to come back within 72 hours, and then vetting it internally took days to weeks at your fastest speed.

Some papers that came in needed a much faster throughput, which meant working at what is fundamentally warp speed for editors and peer reviewers. It's remarkable, Bob, that more mistakes weren't made, to be perfectly frank.

Harrington: Yeah, it really is. The journal publishing process, in some regards, relies on the slowness of the process to find the untruths and the problems.

Brahmajee, I'm going to go back to you. You're an outcomes investigator. I suspect a large amount of nonrandomized data was submitted to the journal. It's important to make early observations, in particular, and it may be important to understand patterns of care. You must have a special view toward thinking about data integrity, particularly in large data sets.

How do you think about that? What processes have you employed to try to understand whether these data are reliable?

Nallamothu: That's a great question. I'd like to make one comment on what Clyde said before tackling that, just to give people raw numbers.

Typically, in March, April, and May, the American Heart Association (AHA) journals overall get about 5500 submissions. To put this in perspective, in those same months of 2020 there were 8500 or so, a relative increase of more than 50%.

The second thing is that we haven't seen a drop in submissions overall. About half of the increased submissions are COVID-19 related, but the other half are other types of science. It's been really interesting to watch that.

To answer your question about data integrity, it's critical to go back to Clyde's question, "Is it true?" It's funny because in most editorial offices, there are two types of balances people will sometimes talk about. They'll say, "Is it true?" But we all like to publish new and exciting findings, so the next question is, "Is it new?"

Sometimes the tension that can exist there leads you to start to really think about what's important and what might not be important. I'll say that, obviously, many studies have been out there, but the most prominent have been the Surgisphere-related ones that were recently retracted.

We did a little thought exercise internally: "If we'd gotten that paper, what would we have done?" We have nowhere near the magnitude of submissions that The Lancet and The New England Journal of Medicine get, but there were so many things that just popped up about those papers that seemed inconsistent.

We see many papers that rely on observational data. Data integrity is one key aspect, but the bigger problem that we tend to run into in our field involves overstated claims about causal inferences.

Again, we try to be very careful about that because when these papers do get published, they go out to a broader audience. If we can't police ourselves in terms of those issues, how can we expect the media to understand the nuances? Those are really critical things.

The last thing I'll say is that one of the papers that we were super proud to publish early on highlighted some of the opportunities to do better science. The AHA put together a COVID-19 registry rather quickly as part of the Get With The Guidelines program. We highlighted that, and we've actively sought out papers from groups that we think have the ability and access to these types of studies, similar to what Clyde had mentioned about Dr Bonow's connections.

Harrington: Just to give an idea of how fast the AHA's COVID registry moved, we got a request on a Friday from a group of investigators who were thinking about it, we had a phone call on Saturday, and on Monday, the team was working on it. Pandemics do cause organizations to move quickly.

Yancy: Bob, you just made a very good point, because the lessons we've learned through this pandemic tell us that we can move at a quicker pace with conventional publishing. There are opportunities that we can capture to become more efficient and more timely in the dissemination of news. That has not always been the trademark of the scientific publishing community.

We've also learned how valuable peer review is. It's a precious resource. Many of us are thinking about whether there is another model by which we can ascribe more value to the peer review process, so that we have more people capable of executing excellent peer review and engaging with journals. I don't know what that model would be right now, but it's a very important thought process to go through.

Harrington: People underestimate both how hard and how rewarding being a peer reviewer can be. I actually enjoy reviewing articles. It gives me insight into what people are thinking and a chance to reflect on what a group outside of my own bubble of investigators is thinking about. I hope that we can encourage more of it.

I'm going to pivot to something that Brahmajee brought up, which is that although the volume that they have seen has gone up, it's not all COVID-related; there's a lot of non-COVID–related research.

Is that because everybody has more time sitting at home to finally write all the papers they wanted to write? I suspect there is some truth to that. How is JAMA Cardiology dealing with the fact that we need to learn things other than COVID-related issues?

Yancy: The first thing we had to do was to increase the number of entry points. Traditionally, there was one entry point where Dr Bonow would do all of the primary triage. That clearly became a nonstarter, because we had the conventional papers coming in and then the rush of COVID-19 papers. We had to rotate and elevate the triage responsibility among the group, but still the internal review process remained intact.

When you have very adroit senior editors, and several of the people with whom I have the privilege to work are incredibly well read, a very quick study can happen when a paper comes in: Is this novel? Is this new? Is this redundant? Then we can make an immediate editorial decision.

This is one of those circumstances where you want to be highly specific, not highly sensitive. You'd rather turn away something good than risk bringing something into the fold that you did not have the time necessary to make certain it was true. I am certain that we missed out on some papers we should've kept, but our bar was deliberately set high because we knew we were operating under stress and we had to be careful.

Harrington: Two questions I want to finish with. One is your current view of preprints, which have exploded during the past few months. The second is a very serious topic: the pandemic has certainly revealed great health inequities, and I want to hear how your individual journals are thinking about that, because we have seen a flurry of papers addressing some important topics in that regard.

Brahmajee, let me start with you for the first question about preprints.

Nallamothu: Preprints have been around now for a few years, mostly in the hard and physical sciences until recently. medRxiv, which launched last year, was timed perfectly and has just been inundated. Most of the preprints that are out there around COVID-19 have been hitting medRxiv.

I want to make two points about that. One is that medRxiv has a more complicated system than the preprint servers in other sciences, like physics, where you can just post it and it's immediately available. It goes through a review process. It's not what we would think of as a peer review process, but there are some quality checks.

What they've done that's really interesting is to check and make sure that they aren't posting things that are highly suspect or really questionable. The last thing that medRxiv wants to do is post a preprint that makes a causal claim about a new treatment, leading to conspiracy theories and things like that.

That's really been a model that's taken off. I don't know how they're able to do that because of the volume of submissions they get, but I think they've tried really hard to do that.

The second thing that is really interesting about preprints is there's a group that's just come out — I think it's published by the Nature family — called Rapid Reviews: COVID-19, which is also using the preprint model [Editor's Note: RR:C19 is published by the MIT Press and edited by a team from UC Berkeley]. They scour all the preprints that are posted using AI technology to find those papers that would be of most interest to the community. Then it's pulled together by a group of volunteers who are willing to review these papers and post reviews within a week or so.

I've gone through some of these, and it's quite interesting because the reviews are of fairly good quality. It gives folks an idea about the quality of science as a first pass. What's really interesting to me is that journals are struggling a bit. Do we want to be newspapers? Do we want to be textbooks? Do we want to live somewhere in the middle?

Even if we publish something that might be true today, science has never been about getting into the end zone and spiking the ball. One advance just leads to another. We need to have the humility to know that things that we publish prospectively, knowing it's the right thing to do, might not turn out in the long term to be the right way in which science advances. There's a lot out there.

Harrington: I have tried to explain this to some of my friends outside of medicine. They say, "Well, you published that and it turned out to be wrong." I said, "Well, that's science."

Things turn out to be wrong all the time. To your point, we're moving the ball down the field, but not necessarily scoring.

Preprints from your perspective, Clyde.

Yancy: I'll make one comment about preprints. Bob, you and I have appreciated the fact, as have many others, that nothing beats the power of randomization. The next sentence is that nothing beats the power of peer review. I think, right now, that's the highest bar.

There is a role for preprint services and we're wrestling with that. Brahmajee is right that particularly for the basic and physical sciences, where it really is a repository for preliminary data that other investigators can use in fundamental discovery, it's very appropriate. We have to work through the rest of this process and understand whether that's where clinical data should go that might be actionable.

The second question that you raise, though, is a very important one. I applaud you for bringing this up, because many in medicine wrestle with the intersection of society and medicine. There is this vernacular phrase, "That's not in our lane."

As senior editors, we have to have enough judgment to understand that biology doesn't explain every human condition that we treat and every circumstance we have to address in medicine. There is an intersectionality of the life and living experience and the biological understanding that we promulgate.

If we don't appreciate that intersectionality, if we don't comment on that, and if we don't inform our community as carefully and as truthfully as we can about these personal existences that affect health and disease, then we haven't executed our job. If our job is to maintain the flow of information about health and disease, we have to appreciate all of the tethers that influence health and disease.

It takes courage to go into this space because it very quickly gets into societal and public policy arguments. I want to close with this thought. It was a brilliant line that was published by William Owen in JAMA, talking about the healthcare disparities that were exposed by COVID-19.

He made it very clear: We've learned from COVID-19 that all policy is health policy, because there is an impact on life and living that ultimately becomes manifest as a clinical circumstance. These kinds of statements are very powerful. I'm pleased that the editing community has accepted the challenge to foray into this intersectionality of the life and living experience in science.

Harrington: One of the great tragedies in the 20th century was the separation of schools of medicine and schools of public health because, in fact, they're so intertwined. Yes, there is the clinical medicine piece, but that's part of a bigger public health set of issues. We need to do more to bring those two together to try to answer questions.

You know well, Clyde, that the increasing focus on the social determinants of health is critically important. We're starting to move beyond that into such things as systemic and structural racism and the effect that they have on public health.

Clyde, one of the best papers I've read in the past few months was your viewpoint in JAMA. Stay out of your lane, because it helps inform the rest of us.

Brahmajee, I want to ask the same question around health equity and racism. What have you thought about with your editorial group? Have you had the conversation about what is going to be your lane for your journal? I would think that an outcomes journal is perfect for some of these things.

Nallamothu: Absolutely. Maybe it's a huge land grab, but we've always thought of those issues as being well within the role of the journal. I love this historical discussion about the schools of medicine and the schools of public health. You could go back to Rudolf Virchow and the ideas that he had about the connections between politics and medicine.

We absolutely consider many of these things. What's been really sad about the whole issue with COVID-19 is how it's shown the stresses and break points within our healthcare systems and how they have affected vulnerable populations. We are all in with these things. We just published a recent statement on housing and its impact on health.

If we really want to make our patients better, we have to think more broadly than just the moment when they're in front of us in clinic. Many of these are, as you alluded to, structural issues. They're tough problems with tough answers, and really, things that we have to come together to address.

I can tell you absolutely that Circulation: Cardiovascular Quality and Outcomes wants to be a voice in that discussion, and we want to be a voice that can hopefully influence it in a way that's positive.

Harrington: Great. Thank you both. I can't imagine having had a better discussion about publishing during a pandemic with my two colleagues, Dr Nallamothu and Dr Yancy, from the University of Michigan and Northwestern University, respectively.

Thank you to the listeners for joining us here on Medscape Cardiology and theheart.org. Again, terrific job, Clyde and Brahmajee. Thanks for joining us.

Nallamothu: Thank you.

Yancy: Thanks, Bob.

Bob Harrington, MD, is chair of medicine at Stanford University and current president of the American Heart Association. (The opinions expressed here are his and not those of the American Heart Association.) He cares deeply about the generation of evidence to guide clinical practice. He's also an over-the-top Boston Red Sox fan.

