
The Conversing Nurse podcast
Are you a nurse curious about the experiences of other nurses? For 36 years, I have only known the Peds/NICU realm but I am intrigued by the roles of nurse researchers, educators, and entrepreneurs. Through conversations with nurses from various specialties, I aim to bring you valuable insights into their lives. At the end of each episode we play the five-minute snippet, just five minutes of fun as we peek into the 'off-duty' lives of my guests! Listen as we explore the nursing profession, one conversation at a time.
Summer Series: Research 101- Part THREE
Welcome back to Part Three of our Summer Series on Research 101! I'm once again joined by Chris Patty here in my recording closet. In Part One, we explored the history of research, Chris defined what research truly means, and we had some insightful discussions about nursing literature. An important takeaway from that episode was Chris's enthusiasm for AI and its potential in our field.
In Part Two, we shifted our focus to developing the research protocol, particularly before presenting it to the Institutional Review Board (IRB). Chris helped demystify the IRB process, detailing its structure and what’s needed for protocol approval. He also introduced the hierarchy of evidence, emphasizing its importance in the research landscape and reiterating how AI is influencing healthcare and nursing practices. If you missed those episodes, I highly recommend going back and giving them a listen!
In this concluding episode, we tackled four crucial aspects of the research process: data collection, analysis, publication, and dissemination. After all, why invest so much effort in formulating your PICO question, facing the IRB, conducting your study, and collecting data, only to keep your findings to yourself? That would be quite insane!
Before we wrapped up, I asked Chris why conducting research is so important, and his answer hit home, so be sure to listen for that.
I'm a bit sad that our summer series is coming to an end. I've had such a rewarding time learning from Chris, and I hope you have too. Your feedback would be greatly appreciated, so feel free to share your thoughts! And don’t forget to check out my CE Library at RNegade.Pro, because great news: this series qualifies for CE credits! The link is in the show notes.
Contact The Conversing Nurse podcast
Instagram: https://www.instagram.com/theconversingnursepodcast/
Website: https://theconversingnursepodcast.com
Your review is so important to this Indie podcaster! You can leave one here! https://theconversingnursepodcast.com/leave-me-a-review
Would you like to be a guest on my podcast? Pitch me! https://theconversingnursepodcast.com/intake-form
Check out my guests' book recommendations! https://bookshop.org/shop/theconversingnursepodcast
I've partnered with RNegade.pro! You can earn CEs just by listening to my podcast episodes! Check out my CE library here: https://rnegade.thinkific.com/collections/conversing-nurse-podcast
Thanks for listening!
[00:01] Michelle: Welcome back to Part Three of our Summer Series on Research 101.
[00:05] I'm once again joined by Chris Patty here in my recording closet.
[00:10] In Part One, we explored the history of research.
[00:13] Chris defined what research truly means, and we had some insightful discussions about nursing literature.
[00:19] An important takeaway from that episode was Chris's enthusiasm for AI and its potential in our field.
[00:27] In Part Two, we shifted our focus to developing the research protocol, particularly before presenting it to the Institutional Review Board.
[00:37] Chris helped demystify the IRB process,
[00:40] detailing its structure and what's needed for protocol approval.
[00:44] He also introduced the hierarchy of evidence,
[00:48] emphasizing its importance in the research landscape and reiterating how AI is influencing healthcare and nursing practices.
[00:58] If you missed those episodes, I highly recommend going back and giving them a listen.
[01:04] In this concluding episode, we tackled four crucial aspects of the research process:
[01:10] Data collection, analysis,
[01:13] publication and dissemination.
[01:16] Because after all,
[01:17] why invest so much effort in formulating your PICO question,
[01:22] facing the IRB,
[01:24] conducting your study and collecting data only to keep your findings to yourself?
[01:30] That would be quite insane.
[01:32] Before we wrapped up, I asked Chris why conducting research is so important,
[01:37] and his answer hit home.
[01:40] I'm a bit sad that our Summer Series is coming to an end.
[01:43] I've had such a rewarding time learning from Chris and I hope you have too.
[01:48] Your feedback would be greatly appreciated. So feel free to share your thoughts.
[01:52] And don't forget to check out my CE library at RNegade.Pro,
[01:58] because, great news, this series qualifies for CE credits.
[02:03] The link is in the show notes.
[02:22] Michelle: Good morning, Chris. Welcome back to my closet.
[02:25] Chris: Good morning. It seems very comfortable here today.
[02:29] Michelle: It is very comfortable.
[02:31] So today we're going to talk about part three. And in review of Research 101, part one was basically just an introduction to research and study design.
[02:43] And then in part two, we talked a lot about the IRB.
[02:49] Chris: Yes, we talked about the plan for your study, which we call the protocol. And that's the document that's submitted to this ethics review board called the IRB. We talked a little bit about their composition and their role,
[03:06] which is to evaluate the ethics of a proposed study. We always say their role is to protect the research subject from the investigator. And that really is the role exactly.
[03:21] Michelle: And so this is part three today. And what are we going to talk about today?
[03:26] Chris: Well, today I'd like to talk about what happens after you get your approval letter from the IRB that says you may begin interacting with your living human participants. It's always my temptation to call them subjects. That's still the federal language, much like the federal language is "alien" for a person, you know, in the country, not a citizen. So we're trying to talk in terms of, you know, participants, and even, in some research designs, partners.
[04:08] We probably don't have time to discuss it much, but for the last 25 years or so, there is a style of research called community based participatory research.
[04:21] And actually, we've done several studies locally using this model, where you actually make the participant an active investigator in the study. That could be through sitting on an advisory board to inform the lead investigators about where you should find the subjects, what kinds of questions to ask them, how you approach them, all that sort of thing, or even going out and being trained to collect the data themselves. Sometimes in community settings these researchers, and they can be lay people, are at the elbow of, or sitting with, the subject, if you will, and we actually have them participate in the data collection.
[05:13] So we're going to talk about what happens after you get your license from the IRB to go out and start collecting data or interacting with your living human beings, their identifiable data or biospecimens, which is the definition of human subjects research.
[05:34] So you get your letter, you're going to be all happy and you're going to, you know, feel like I did it and you did it. But in fact, this is really another, I won't say the beginning because the beginning is to have that idea,
[05:49] that problem, that need to find out, right?
[05:55] And then you turn that into the research question, the PICO or the SPIDER.
[05:59] And then you write that plan and present it to the IRB, and they say, yes, you've adequately protected your subjects, go and do your thing. And I'm making the sign of the cross for your viewers there.
[06:12] Michelle: But I mean, you're celebrating because this is a major milestone.
[06:18] Chris: This is a milestone.
[06:21] So this means that you're really on the journey now to answering your research question.
[06:29] Remember we talked about, at a very high level,
[06:34] research is a process of discovery or investigation and it begins with a question that's answerable with data.
[06:44] So you have your question protocol flows from that. Now you're ready to go out and collect your data.
[06:52] How do you answer the research question with the data? Well, that's what your analysis plan is.
[07:00] And it might be a very simple plan. Depends on your question.
[07:03] Depends on how much and what type of data you're collecting.
[07:08] It may be nothing but, you know, taking the temperature of everyone in the room and averaging it, right? Or maybe something at the other end of the spectrum, you know, the 80-year Framingham Heart Study, right?
[07:26] We're collecting many variables over many decades.
[07:30] And then we'll talk a little bit about, once you've answered your question, writing the manuscript: how do you sit down and put the ink on the paper? It'll most likely be electronic ink. How do you put the electrons on the paper, as it were, and how do you then reap the fruit of your labor, which is to get the results of your study seen by other people? Right.
[08:00] Because, part of it, you know, it's not just knowledge for knowledge's sake,
[08:03] it's knowledge for people's sake, for improving the life of human beings present and future.
[08:14] So data collection is also an aspect of the study that is governed by the IRB. So in order to get an approval, you have submitted a plan to collect the data.
[08:30] And so that plan lists everything you want to collect. Generally we call these variables in quantitative research. So if you're doing a study,
[08:40] for example, a retrospective study where you're not going to interact with a living human being, but you're going to get a report from an electronic health record, and it's going to contain information about every patient who's had a heart attack and has had a certain drug, and you're going to look at the difference in some outcome, like length of stay or mortality, as a function of drug A versus drug B. Okay? You have submitted, and the IRB has approved, every variable that you're going to collect for the date range that you plan to collect it.
[09:17] So let's say you're going to collect,
[09:19] you know, the standard things like, you know, age, gender, race, ethnicity,
[09:27] you know, admitting source,
[09:30] height, weight, previous procedures, medications,
[09:34] all those kind of things.
[09:35] You may have 20 or 30 or 40 variables. Each one of those is listed, and the IRB should consider the value of each variable to answering your research question in the context of ethics: do you really need to know someone's address or have their driver's license photo?
[09:58] Some studies you do,
[10:01] right? Some studies you do.
[10:03] So you go out and you collect all these variables, and you've already got approval to get a set of data that's either de-identified or has identifiers in it.
[10:17] And you've already specified where you're going to get the data from.
[10:22] And the date range is very important too.
[10:24] You're going to get approval to collect, let's say, three years worth of data.
[10:29] You know, whole years. Let's say 2024, 2023, and 2022. So you're going to bracket those dates.
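To make the idea of an approved variable list and a bracketed date range concrete, here is a small, purely illustrative Python sketch of that kind of disciplined pull. The file name and column names are hypothetical, not from any real study or protocol.

# Illustrative sketch only: trimming a hypothetical EHR export down to the
# variables and date range the IRB actually approved.
import pandas as pd

APPROVED_VARIABLES = [
    "age", "gender", "race", "ethnicity", "admit_source",
    "height_cm", "weight_kg", "drug_given", "length_of_stay_days",
]
DATE_RANGE = ("2022-01-01", "2024-12-31")  # the bracketed years in the protocol

def build_analysis_set(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["admit_date"])
    # Keep only admissions inside the approved date bracket.
    start, end = pd.Timestamp(DATE_RANGE[0]), pd.Timestamp(DATE_RANGE[1])
    df = df[(df["admit_date"] >= start) & (df["admit_date"] <= end)]
    # Keep only the variables listed in the approved protocol; anything else
    # (addresses, photos, extra identifiers) never enters the analysis file.
    return df[APPROVED_VARIABLES]

# analysis_df = build_analysis_set("ehr_export.csv")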
[10:39] Anytime you need to change the data collection, if you say, oh, I was reading a study and they're collecting this other variable that I didn't get approval for, or, you know, looking back on what was going on in our community, I think I'd better collect the 2021 data also.
[10:58] How do you handle that? Well, you go back to the IRB.
[11:03] And you submit a modification request, and you tell them what you want to do differently. And you can't do that until they approve the modification.
[11:14] Now, the modifications are usually quick and pretty easy,
[11:18] so that gives you a little hope there.
[11:20] But you may need the assistance of people in an information technology capacity, or administrators of some database, whether you own it or whether it's a, you know, publicly owned or privately owned database, and you may have to, again, modify the protocol: I need to add Joe Smith as a research assistant. Or, I'm going to have a biostatistician that's also going to look over the data, and he's going to see the PHI, so I need to add him.
[12:01] So it would be nice.
[12:03] And there are many studies that never need a modification.
[12:06] Michelle: That's one of my questions.
[12:08] Chris: I would say three-quarters of studies that are not more than minimal risk don't need a modification. And if they do, it's something like extending the period of data collection because you haven't found enough subjects.
[12:24] So easy things. Okay, but we've had studies where we had 20 or 30 modifications,
[12:32] right? So there's a process for that. Every IRB knows how to do modification.
[12:38] They know how to do a renewal,
[12:41] they know how to do a closure,
[12:43] right? So let's talk about closure. I'll just throw that right out here.
[12:48] So, you know, the process of,
[12:51] you know, launching a study, collecting the data, analyzing it,
[12:55] you know, writing up a manuscript, publishing it, even in a simple study that's a survey of nurses, good days versus bad days, might take a year or two, because there are many, you know, nodes in the chain from the beginning to the end. The oversight of the IRB in most of these, certainly all the retrospectives and most of the observationals, most of these low-risk studies, ends when the data are collected, right?
[13:24] Because they've said you can go out, interact with humans, and you can weigh them or you can draw their blood. Once all the blood's drawn, once all the weights are done, once all the retrospective data are pulled out of the EMR, that ends the oversight of the IRB. You can close the study, and you can work on publishing it for the next 10 years.
[13:46] Michelle: Okay, so let's say you originally
[13:50] had gone to the IRB and said, this study is going to go for three years,
[13:56] but you're, you're two years in and you have all your data.
[14:01] Do you have to go back to the IRB to close it early?
[14:04] Chris: You should.
[14:05] Michelle: Okay.
[14:05] Chris: Yeah, you should. And there was a large rewrite of the federal research protection rules in 2021, which, unlike most federal regulations, actually rolled back a lot of bureaucracy.
[14:25] And so for many of the low-risk studies and the not-more-than-minimal-risk studies, the regulations changed and said you don't even need to close the study. You don't need to do an annual review or renew it.
[14:40] You can just walk away from it.
[14:42] And these are a lot of the studies that you find, for example, in university settings, where everyone getting a master's degree has to submit something to an IRB and do a little piece of research.
[14:56] And maybe at a school like Harvard, let's say, there may be, you know, 5,000 of these a year, right? So it's a lot of work for the research administration and the IRB to just close, close, close, close, close. So administratively, someone just pushes a button and walks away from them, or just leaves them hanging, because there's no requirement now to review these things, you know, little surveys and little retrospective studies. But in the main, anytime you change something, you're going to need a modification. And you can close a study, generally speaking, when your data are collected, which is nice, because then you don't have any more interaction with the IRB.
[15:42] Sometimes when you close a study,
[15:46] they will have a series of easy questions. They're going to ask you, how many subjects did you enroll? Did you have any untoward events, you know, those kinds of things.
[16:02] Michelle: And then does the IRB ever want to see the published results of your study?
[16:08] Chris: They do and they will ask you.
[16:11] We have a template for our IRB that says, is there a publication associated with this study?
[16:18] And if there is, then you should just attach it.
[16:21] And that's not really so much for the protection of human subjects, because by the time your study's published, if there's harm, it's been done already.
[16:30] Michelle: Exactly.
[16:33] Chris: That is usually because, you know, IRBs are part of institutions that also value publication and, you know, dissemination, which we're going to talk about in a little bit.
[16:47] I will tell you that there are basically two good ways to analyze your data, and the short answer is: hire a biostatistician, which is not as difficult as it sounds.
[17:01] So you can, you know, put an ad on Facebook or something. There are many good Internet, you know, remote-work biostatisticians. And actually, a lot of them work cheap.
[17:15] Because they happen to be located in low wage countries. For example, India has many well trained scientists that will work for $15 an hour.
[17:30] And so it actually is not that difficult to hire somebody to work on an analysis for you. If you hire a US-based biostatistician, the one I use from the University of California charges $140 an hour.
[17:50] They usually have a minimum of about, you know, 10 hours or so. Right. So it can get a little bit pricey. We pay between $3,000 and $4,000 for a, you know, medium-sized one-year project to get data analysis. When you hire a biostatistician, or when you decide on option number two, which is much easier now with AI, ChatGPT for example, to analyze your own data,
[18:23] one of the ways you can succeed in that, I have found, and this actually goes along with writing the protocol for your study, is this: you have to introduce a little bit of background about what you're going to study to the IRB, because they're not all scientists, and they may not know anything about what you do. So you go out to your literature and you find other studies on the same theme, on the same research question.
[18:54] Now if you're lucky, especially if you're looking for it,
[18:58] when you go, let's say you're going to do this good day, bad day qualitative study. Well, you're going to look for a good day, bad day nursing qualitative study in a peer-reviewed journal, by a high-class author from a high-class organization, published in a high-class journal. Quality work. And if you're lucky, you'll find their analytic plan for answering their research question with the data, and the plan will be written in the method section of their manuscript in enough detail so that you can replicate it. And in a lot of studies, you'll actually find the whole data set available as supplemental digital content.
[19:50] They call it SDC. So another, I wouldn't call it a trend, but it's a direction that research wants to move in, which is transparency, putting your data out there.
[20:02] And the only people who don't do this are, you know, drug companies and device companies, right? Because there are trade secrets in the data.
[20:11] But you put your data out there too, and then that helps to, I should say, solve the reproducibility problem.
[20:19] So if other people analyze your data and get the same results, right? Or if other people collect data similarly, analyze it according to your plan, and get similar results, that strengthens your original plan and your results.
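As a rough illustration of what that kind of re-analysis can look like in practice, here is a minimal Python sketch of re-checking one reported number against a shared supplemental data set. The file name, column name, and reported value are invented for the example, not taken from any real paper.

# Illustrative sketch: re-checking a simple reported result from shared data.
import pandas as pd

df = pd.read_csv("supplemental_digital_content.csv")  # the authors' shared data set (hypothetical)

reported_mean_los = 4.2                               # mean length of stay claimed in the paper (hypothetical)
recomputed = df["length_of_stay_days"].mean()

print(f"Reported: {reported_mean_los}, recomputed: {recomputed:.1f}")
if abs(recomputed - reported_mean_los) < 0.05:
    print("Matches the published result; the analytic plan reproduces.")
else:
    print("A discrepancy worth asking the corresponding author about.")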
[20:33] Michelle: Is that like the, it strengthens the validity.
[20:36] Chris: Yes, it would strengthen the validity. Of course, you've heard of peer review, right? Journal peer review. And there are many sayings about research, right? But one of the sayings about research is that peer review begins at publication. Real peer review. Because journals have their own peer review staff. They may be paid, they may be volunteers. There is some evidence that they don't really do their job that well.
[21:13] There have been studies done on that where, you know, a manuscript is submitted with little, you know, thought bombs built into it by the editor, "call this number for a hundred dollars" buried in the method section, and the peer reviewers don't find it. So the peer reviewers are like a lot of other people who read research. They read the abstract and the conclusion, and then they go, okay, you know, that's great. What institution is affiliated with this? Who's the author? Okay, Anthony Fauci, NIH. Okay, it's good, right?
[21:53] And so true peer review, the most rigorous peer review begins at publication when now all the smart brains in the world can get their eyes on that thing and go let me tell you what the peer reviewers missed on this.
[22:10] And if you actually put your data set out there and your analytic plan in enough detail that someone can replicate it, or you may have a BS data analysis plan that you got off ChatGPT or whatever, I'm not saying that's bad, but a real, rigorous peer reviewer would reanalyze your data. And most peer reviewers aren't going to do that. They're going to read your plan, they're going to look at your results, and they're going to assume that what was between the plan and the results was legit.
[22:46] Michelle: So is it possible when you,
[22:49] when your study is done and you're disseminating and you're publishing and someone else, some other brilliant mind reads your publication and has already done that exact study and found different results,
[23:07] like they can challenge you?
[23:09] Chris: Yeah. That's why part of publication ethics and part of qualifying yourself as an author is that you will designate a person as a corresponding author, they call it. And you will stand by your results and you will be open to questions and comments, criticism, and you leave your little email address on the paper.
[23:41] And this is made easier. And I'll just segue for a moment here and give an ad and an endorsement. It's a nonprofit, and it doesn't cost anything: the ORCID ID, O-R-C-I-D. And so many of your listeners who read research will see an electronic publication, and there'll be 10 authors, and five of them will have a little green dot, a little round green dot, next to their name. It says ID on it.
[24:16] Just a round, little lime green dot. That is the ORCID ID identifier.
[24:22] And so the ORCID ID is a lifetime, 16-digit alphanumeric code or identifier that's assigned to you,
[24:32] and your identity is verified. And so if your name's John Smith. There may be other researchers named John Smith.
[24:43] The ORCID ID lives with you for your whole lifetime, and it maps to a repository of all of your publications.
[24:52] Michelle: Oh, that's really cool.
[24:54] Chris: Yeah. And so NIH, for example, now requires an ORCID ID.
[24:59] There are many journals that require the ORCID ID because they don't want to mix up John Smith and John Smith and John Smith and John Smith.
[25:08] And they want their readers to be able to click that green dot and see the whole pedigree of that researcher.
[25:17] Michelle: Oh, got it.
[25:18] Chris: And then it's also nice when you're doing things like, it's kind of like LinkedIn.
[25:23] You can also put, where you work and you can put awards you've won or whatever.
[25:27] Michelle: It's like your bio kind of.
[25:29] Chris: It is. It is a bio. And it makes it easy when you do things like change jobs or you apply for a job instead of, you know, especially if you're a prolific publisher or presenter or something like that,
[25:43] then, you know, instead of updating that section of your CV every five minutes,
[25:48] right. You can just say,
[25:50] refer to my ORCID ID. And it's got the whole thing. One minute to sign up.
[25:57] Michelle: Wow. I was imagining when you were talking about it that,
[26:01] you know, you're like, oh, you're reading this and, and it has a little green dot by it.
[26:06] And I'm like, yeah, if I had a physical copy, obviously I couldn't, like, access that ID. It's only for online.
[26:13] Chris: That's right.
[26:13] Michelle: You probably just hover over it.
[26:15] Chris: That's all you have to do is hover over it, click it, and then that person's bio comes up.
[26:20] Michelle: I can't wait to look for that next time I'm reading research.
[26:23] Chris: And I would say if your listeners are going to publish even one study, it takes two minutes, literally, to get your ORCID ID. And it always looks good on a CV, too. And you can put a link to your ORCID ID in your email signature, and so people can click it and go, oh, this person published all this trash, in my case.
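A short technical aside on the 16-character identifier Chris describes: the final character is a checksum computed over the first fifteen digits (ORCID describes it as an ISO 7064 MOD 11-2 check), with "X" standing in for a value of ten. A minimal Python sketch, using sample digits rather than any particular researcher's record:

# Illustrative sketch of deriving the final character of an ORCID ID.
def orcid_check_character(first_15_digits: str) -> str:
    total = 0
    for ch in first_15_digits:
        total = (total + int(ch)) * 2
    result = (12 - (total % 11)) % 11
    return "X" if result == 10 else str(result)

base = "000000021825009"                      # 15 sample digits
full_id = base + orcid_check_character(base)  # 16th character is the checksum
print("-".join(full_id[i:i + 4] for i in range(0, 16, 4)))  # prints 0000-0002-1825-0097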
[26:52] Michelle: That's very, very interesting.
[26:56] Chris: It's a good thing. And in my current position, I work with medical residents, and they all have to, as a graduation requirement in their either one, three, four, or five years with us, produce a manuscript of publishable quality. There's only one program that requires publication, but a manuscript of publishable quality. So I'll tell you what that is right now, and we can circle back around to it, because I was going to get into this. So the resident always says to me, well, how do you define publishability? Let me back up one more notch. I require the residents to get the ORCID ID when they start, because they're young, they may publish over their lifetime, and this will be something that'll be good. It follows them around, and unlike LinkedIn, you don't get a bunch of junk mail messages: congratulations on your 45th work anniversary, you know, and then you go, yay, it's an AI button.
[28:05] Michelle: It has the comments all picked out for you.
[28:10] Chris: Congrats, Chris.
[28:11] Chris: Yeah, yeah, right. So anyway, I make them get the ORCID ID. At the end of their tenure with us, they submit a publication. Well, they submit a manuscript, something they wrote by hand.
[28:23] That's the definition of a manuscript. So if ChatGPT wrote it, technically it's not a manuscript. Well, it's technically a manuscript, but it's ethically not a manuscript. Okay? Because it wasn't written by hand. But publishability in a manuscript we define as an intersection of two concepts. Okay? So readers are familiar with the intersectionality concept. They love those Venn diagrams.
[28:53] So this is a simple Venn diagram. Imagine two circles with a little bit of overlap in the middle. In the middle is publishable quality, and the two circles are novelty and quality. This is what publishers want. They want as many citations of your study in their journal as possible, by their readers and other researchers, because that gives them what's called an impact factor that is higher, because what they publish is having an impact on other research. They live and die by this number. And so what they want is things at the intersection of novelty and quality. That's where the biggest impact lies.
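For listeners who want the arithmetic behind that number: the widely used two-year journal impact factor is simply the citations a journal's articles from the previous two years received this year, divided by the number of citable items it published in those two years. A toy calculation with invented counts:

# Toy impact factor calculation; all counts are invented for illustration.
citations_this_year_to_prev_two_years = 1200   # citations in 2024 to the journal's 2022-2023 articles
citable_items_prev_two_years = 400             # articles and reviews published in 2022-2023

impact_factor = citations_this_year_to_prev_two_years / citable_items_prev_two_years
print(f"Two-year impact factor: {impact_factor:.1f}")  # prints 3.0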
[29:43] So novelty means that it brings something new to the body of knowledge in that field of endeavor. Doesn't mean that it's the, you know, the very first thing that's ever been written and no one else had a thought that they would ever study it. We had a novel paper we published a couple years ago. First study of long Covid in the farm worker population.
[30:14] So that's novel, but studying long Covid, not novel. Studying the farm worker, not novel. Studying long Covid in the farm worker population. We think novel.
[30:25] Michelle: Your circles intersected.
[30:29] Chris: Yeah. So I could have another Venn diagram. So something new. New means either it hasn't been seen before or it extends previous findings. So someone went to a paper and they saw the last paragraph that says, we recommend that these few future research directions be pursued. And they actually got one of those and pursued it. So they extended knowledge or they refuted previous knowledge. That's also novel. So extend, refute or create something new.
[31:06] So novelty, and then what the publisher wants, again because it boosts the impact factor, is something intersectional with quality. Now, quality is a very fuzzy concept, and it's somewhat subjective. It's hard to define, but you know it when you see it. Like the Supreme Court justice said back in the '60s or '70s about pornography, I don't know how to define pornography, but I know it when I see it. Because there was a great debate on, you know, is Renaissance art pornography, because it shows a naked woman, you know, in a bathtub or whatever. He says, well, I can't define pornography, but I know it when I see it.
[31:55] Michelle: Yeah.
[31:55] Chris: So quality is a similar thing. It's got some subjectivity to it. You know, one man's floor is another man's ceiling. Exactly right. However, to take some of that subjectivity out of it, the journals have a pretty good definition of what they want when they say, I want quality.
[32:15] And quality means that it conforms to some sort of writing guideline or author instruction. And every journal lays their author instructions right out on their website and says, okay, this is JAMA Surgery, and when you write up the results of a randomized controlled trial, you will use the CONSORT writing guideline, which is available free from the EQUATOR Network website, which has about 700 writing guidelines.
[32:52] If you write up an observational study, you will use the STROBE writing guideline, available from the EQUATOR Network website.
[33:03] Michelle: So they want a lot of like,
[33:07] what's the word I'm looking for? Not conformity,
[33:10] but they want anybody reading the study, they want it set up in such a way that it just makes it easier.
[33:20] Chris: Yes, they want, and conformity. I'm searching for the word.
[33:25] Michelle: I know. It's not conformity.
[33:27] Chris: Standardization, something like that. They don't want the reader to have to decode the structure and the formatting of every publication. Yeah. When the reader reads a thing and it says it's a systematic review, then they want to know that it's going to follow a guideline for systematic reviews.
[33:53] Michelle: Yeah.
[33:53] Chris: Which is called PRISMA. Okay. And the two key elements of the PRISMA guideline for the systematic review are the checklist for the elements to be reported and the flow diagram, which is a graphic. It starts with how many hits did you get in your literature search and ends with how many studies were included in this review.
[34:21] Chris: And so when I see a systematic review, I know on about page three or four, I'm going to find the flow diagram. And I know that I'm going to find it if I look for a systematic review.
[34:34] Because checklist item number one in the PRISMA checklist is identify this study as a systematic review with those words in the title.
[34:46] And so that way, when you go to PubMed and you search for heart failure, rural, whatever, and you click the filter for systematic reviews, because you're trying to get evidence at the top of the pyramid, the top of the evidence hierarchy, then it's going to pick up all those words in the title and give you all the systematic reviews. So the authors, you're the author, and the editors of journals are pretty much sticklers about quality.
[35:23] And their definition of quality is industrial conformance to a specification. And there are websites that you can go to to find this out. One is called SCImago, S-C-I-M-A-G-O. It's got every journal you want, gives you all their rankings, their impact factor, how many references per article, all that kind of stuff.
[35:50] Michelle: So is it possible that the authors of the study,
[35:56] once they have everything and they take it to get published,
[36:01] the publisher rejects it because it doesn't have quality?
[36:05] Chris: It's not only possible, it's about an 85 to 95% probability.
[36:12] Michelle: Okay, that's so sad because you've gone through all of the asking the question, you've gone to the IRB, you've done the study,
[36:22] now you're trying to get it published and the publisher says, nope, doesn't meet ours.
[36:30] Chris: Yeah. So on average, journals will accept about 15% of the submissions.
[36:40] Michelle: So knowing this, researchers know these demands from the publisher. Why aren't they using these demands from the publisher when they go to set up their study?
[36:57] Chris: Well, good ones do. I mean, if you want to keep publishing, if you want to publish, then you'll have a relationship with a journal editor. I published, I think, three times in the Journal of Nursing Care Quality, and, you know, I knew Marilyn Oermann, you know, on somewhat of a friendly basis. She was the previous editor. And the authors I work with at the University of California, they have, you know, journals where they know the editorial staff or the editorial manager, and they will do things like send an inquiry letter before they get started on a study.
[37:43] They'll say, okay, hi, Jim, hope things are going well over at the Journal of Rural Health. You know, I'm Chris. I've submitted there, you've published three of my papers. I'm writing another one, and it's going to be on this. Does that sound like something of interest to you?
[38:03] And the editor will write back and say, absolutely, love to have another one of your publications. And you can do that even before you go through it all; you probably should do that. Or they may say, you know what, I've got seven in the pipeline just like that, so I don't need that. However, here's what I do need. And they might tell you what they want, too. Absolutely right. And most novice authors don't know that.
[38:32] They just take their chances, sort of roll the dice, and say, well, that sounds good. But, you know, as you get a little more sophisticated in choosing a journal, there are journal-choosing tools out there, and the one that probably gets the most play is called Jane, J-A-N-E, the Journal/Author Name Estimator.
[38:58] And the website is jane.biosemantics.org, and it's a simple piece of primitive AI where there's a little search box and you enter your research question or your abstract, and it pulls up a ranked list of journals that have published on that topic, and it tells you which journals are Medline indexed, which you want, because, remember, you don't want to put your publication into a black-hole predatory journal.
[39:32] And it tells you whether or not the journal publishes in open access or not.
[39:39] Have we discussed open access? And so, it gives you a little bit of information.
[39:46] Now I have also found that probably an even better tool is ChatGPT. And so you say, I want to write a paper on this. Give it as much of the paper as you can, maybe it's only the research question, and say, give me some ideas on journals, and it will give you a list of journals that will be pretty close to the topic.
[40:11] And of course, you know, as you get more sophisticated as an author and as a researcher, you will get to know journals and you will have favorite journals.
[40:23] And when you're doing your literature review for the study that you plan to do way back when you're writing the protocol from the research question, you're going to look and see where your literature review information comes from. What journals was it published in?
[40:42] And that'll start to give you an idea. But it's a good idea to send that letter of inquiry, because you can write it with ChatGPT in 10 seconds and it'll be very professional. And you submit it and you'll get an answer back pretty quick.
[41:01] And generally the answer you know is going to be, yeah, let's take a look at it. Now one of the rules of journal manuscript submission is you can only submit to one journal at a time, so you have to be rejected by that journal before you can submit to another journal.
[41:21] Michelle: Oh, got it.
[41:22] Chris: So this is not like Tinder, where you can throw out a line and have 50 fish on the hook and see who reels it in. No, you have to go one at a time. But experienced authors are, you know, probably going to hit it on the first or second try, because they have a relationship with the journal. But they're also going to have a sort of ranked hierarchy, and I know that my collaborators at the UC have this. They go, well, this is going to be our first choice.
[41:54] If we don't get that, we're going to go here. If we don't get that, we're going to go here. If we don't get that, we're going to go here.
[42:00] And at the end of the day, if we don't get that, then... And I just want to mention this journal to your listeners, because I'm getting to really like this journal a lot. It's called Cureus. I don't know if you've ever heard of it.
[42:17] Michelle: I have not.
[42:17] Chris: It's C-U-R-E-U-S. And it is an online journal, so no print, it's online only. It's open to any healthcare discipline, it is free open access, it is Medline indexed, and it is owned by Springer Nature.
[42:43] And I would say at my hospital, Kaweah Health in Visalia, I think we have our 342nd publication. I put out a little newsletter every month internally that says, look who's published, and I give a little recognition to that.
[43:05] Michelle: Who does that go to?
[43:07] Chris: It's internal. It goes to everybody in the organization.
[43:14] Michelle: Not just like director level?
[43:16] Chris: No.
[43:18] Michelle: Do you get feedback from that?
[43:19] Chris: I do, yeah. From both positive and negative.
[43:23] Michelle: Really?
[43:24] Chris: Yeah. How dare you expose what I wrote and share it with everybody.
[43:30] Michelle: Oh, you get it from the authors?
[43:32] Chris: Oh, I have.
[43:33] Michelle: Don't you tell them you're going to see it?
[43:34] Chris: I told them, you know, this is published in a Medline-indexed, peer-reviewed journal.
[43:40] Michelle: That's the point that they want to get.
[43:41] Chris: Isn't that the point? Exactly. That is the point. But that's been a very low-frequency thing. Usually what I get is, thank you, you know, for recognizing me.
[44:00] Michelle: Do you get feedback from like any of the nursing staff or like, oh, I can't wait to read this or.
[44:05] Chris: Yeah, it's 99% positive feedback, and the biggest part of the feedback is, oh, thank you, I didn't know that one of my residents, you know, got something published. And I go, well, you know, you're supposed to know as the program director, but let's leave that alone for a while, you know, since I don't want to tell those guys how to do their job. So there is that tool, Jane, and there is ChatGPT, where you could ask the same thing. I'm going to get back to Cureus in a minute.
[44:44] Michelle: So is it Cureus because of, like, the word cure?
[44:49] Chris: I think it's a play on curiosity. Yeah, because it's for healthcare individuals. Yeah, so check it out. So here's what I like about Cureus.
[45:03] Michelle: I'm going to put that in the show notes.
[45:05] Chris: So what I like about Cureus is, I've told you about a couple of the reasons why I think it's a legit thing, and it is a legit thing. It's free. And the only instance in which it's not free is when you send them a manuscript that needs so much copy editing that they have to massage it with their AI, and maybe a human in the loop somewhere.
[45:30] And they do accept AI-assisted manuscripts. You just have to identify the content that came out of ChatGPT or whatever, and they have instructions for that. But I like it because it's pretty legit. It's Medline indexed, so you get that coin of the realm of healthcare publication, the PubMed ID, the PMID. It's open access, which means when you publish in that journal, anyone in the world can read that article for free. So they don't have to have a subscription. So they will like that.
[46:07] And the other thing that they do that for me is a little bit novel, it's probably not really novel in the world of publishing, but they have a fairly unique peer review system where they have their own bank of peer reviewers. I'm actually a journal peer reviewer for Cureus.
[46:26] And then they ask you to suggest three to five peer reviewers for your study or your article that you publish there. And so I like that for our residency programs, and it would be good for nurse residency or nursing career development, because being a journal article peer reviewer is also a piece of scholarship on its own. So we're always trying to, you know, get the residents, and nurses have the same issue, to think a little bit beyond their clinical care and to recognize that, and both nurses and physicians fall into this category, they're humanistic scientists. And this is the term that their organizations actually use. I didn't make it up.
[47:23] So they're humanistic scientists. So, you know, they do know how to sit with people and, you know, talk with them and support them and that sort of thing. They also have some scholarship ability, and they have to demonstrate it. For example, physicians in medical residencies have to demonstrate what is a three-legged stool of scholarship. Okay, you know about these three-legged stools?
[47:52] Michelle: Yes, sir.
[47:54] Chris: Their stool has three legs: research, quality improvement, and teaching. So it is expected that when a physician is launched into the world and they practice independently, they will know something about research. They will know how it's conducted, how it's regulated, how it's shared with the research community and with patients.
[48:22] They will know how to do quality improvement because what healthcare system or organization or even individual practice doesn't need quality improvement? And so they will have to do a quality improvement project.
[48:35] And they need to know something about teaching because it is part of their role to educate not only patients and families, but also other providers and, you know, other physicians. So, we help them prepare for that. And nursing. I don't know that they've articulated it the same way, but I think those would hold up for nursing too.
[48:59] You know, I think the complete nurse, you know, is also a humanistic scientist.
[49:05] And so, yeah, they know their clinical stuff and they can, you know, talk to patients about bad things happening, good things happening and all that, but they also should know a little bit about,
[49:18] you know, how the knowledge base of the profession is extended through research, you know, how quality is improved, and how teaching is done.
[49:29] And I know in my advanced nursing education, particularly in my doctoral education, we took two semester-long courses. One of them was curricular development, the other one was curricular evaluation.
[49:47] So we know how to write a syllabus and we know how to write a test and determine where the curve is and all that kind of stuff.
[49:58] And we did take courses in quality improvement and we did take courses in evidence based practice and research.
[50:06] So it's the same for all the advanced practice providers in any healthcare discipline.
[50:19] So, what questions do you think your listeners might be harboring at this point?
[50:29] Michelle: I think, you know, I'm not sure. I hope we get some feedback.
[50:34] I'm trying to think of what questions. I'm trying to think as a listener.
Okay, so we have gone to the publisher with our manuscript and we're saying, I'm trying to get published. And I went to this journal, and they're going to publish my study, my research.
[51:03] Chris: Buy your bottle of champagne, put it in the fridge and about eight months from now you can pop it.
[51:09] Michelle: Okay. So that's how long it takes.
[51:12] Chris: Yeah, it can. Now, the good news on that is that with open access publishing, the author pays. It's not a bribe, but it does help the journal to hire editorial staff and review staff and all that. Right.
[51:36] So the journals are getting more transparent, and some of them, at least, will publish their production stats. Cureus does this, for example, their metrics for how fast their production is. They'll say,
[52:04] here's what we're dealing with. Okay.
[52:06] We get, you know, 1 million submissions a month. We accept 150,000, so our acceptance rate is 15%. Our time from, you know, receipt of submission to decision is 14 days. Our time from decision to publication is 36 days. And you can get that from journal websites if they're transparent. Not enough of them are, because there's no law that they have to be transparent. You get it from ChatGPT, you get it from SCImago.
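Taking those illustrative numbers at face value, the arithmetic a prospective author would do is simple; the figures below come straight from the example in the conversation and are not real statistics for any particular journal.

# Back-of-the-envelope journal math using the illustrative figures from the conversation.
submissions_per_month = 1_000_000
accepted_per_month = 150_000
days_submission_to_decision = 14
days_decision_to_publication = 36

acceptance_rate = accepted_per_month / submissions_per_month                         # 0.15, i.e., 15%
total_days_to_publication = days_submission_to_decision + days_decision_to_publication

print(f"Acceptance rate: {acceptance_rate:.0%}")                                     # 15%
print(f"Typical days from submission to publication: {total_days_to_publication}")   # 50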
[52:45] Michelle: I would think it would benefit them to be transparent.
[52:48] Chris: It would if their metrics are good. Yes, that's true. Because the person looking to publish is going to look at the journal's metrics, and they're going to go, well, this one takes 36 days and this one takes 86 days, and I really need to get my stuff out there. The reality of publication, however, is that it's kind of like real estate, with buyer's and seller's markets. So when the journals have way more submissions than they can publish, then they can be very choosy.
[53:28] They can also be very obscure as opposed to transparent. And they don't have to act like there's a competition for their services. I mean there is a lot of competition. So they can afford to sit back and you know, now it would be different if the journals were all scratching for submissions. Then transparency would probably be enhanced and they would put it right out there. And like I said, Cureus does that and that's one of the reasons I like Cureus. But other journals are starting to do that now.
[54:02] Michelle: Okay, so my study has been published and put out into the world. What happens now?
[54:12] Chris: What is the hope of the principal investigator who thought of the question, who went to the IRB, who went through all of the, you know, the long period of collecting all the data? That is a very good question.
[54:30] Michelle: What is it that they're hoping? Is it just my name and lights? What are they hoping to come from their work being published?
[54:41] Chris: So, yes, they're hoping for, you know, because we're human beings, we like the little strokes, you know, so we're hoping for some recognition somewhere. But what you're really hoping for is to get those results put into practice in some way, or at least used as a basis for other research. So you're hoping for citations.
[55:10] Michelle: So people citing your research.
[55:12] Chris: Yeah. So if you sign up for something like ResearchGate.net, okay, which I have, ResearchGate, my publications have maybe 150 citations. So, you know, people will say they're doing a study on missed nursing care, and they go and they read some of the studies where I've published on missed nursing care, and they reference that. So citations are one thing you want. You also want other authors or other researchers to contact you and say, hey, I read your study and I'm really interested in knowing how you did this. Or, would you be interested in participating in my study? Because I'm going to study a similar topic. Would you be interested in helping me to write the introduction and background? Because I see that you've published four times on this topic. So, an opportunity to network and mentorship. And, you know, sometimes people will contact you and say, you know, would you be willing to come to a conference that we're putting on and share this information? So I've had that done several times with my work in medication safety, particularly with my work in discontinuing the practice of double verification of subQ insulin at the bedside in hospitals. So I presented in, you know, Baltimore, Los Angeles, Florida, Texas, Iowa on this topic, because people contacted me and said, hey, we're doing a nursing conference, would you come and talk about this?
[57:11] Michelle: So it's also an opportunity to get yourself out there and be known for your research, and to be able to disseminate it more widely to the people that might use your research, and also, maybe too, as a mode of financial gain. It's possible. It hasn't happened for me, but that doesn't mean anything, and I think it's possible to get paid.
[57:41] Chris: Oh, no. Speakers get paid a little bit.
[57:44] Michelle: All expenses paid.
[57:45] Chris: You get paid. Something like that. Or we'll get a $750 honorarium or something like that. You know, Fauci will get more. But I would say that that's a consideration if that's something that you, that you want to do.
[58:04] And then I think another consideration, one that everyone should be open to, talking about answering the question of what you expect when you get that publication out there: I would love to have challenges to it. And I've had some, where people read it and they say, well, you know, this doesn't make any sense. Are you aware of the results of this other investigator, who has 25 papers on this thing? And you came to a different conclusion. So, you know, just for curiosity's sake, can you explain that?
[58:39] Michelle: Right, Like a slap down?
[58:41] Chris: It's not a slap down, it's a challenge. It could be a slap down. But I think those are valuable for your, you know, growth as an investigator.
[58:54] Michelle: Yeah. I mean, you are proud of this baby. You have put a lot of blood, sweat and tears.
[59:00] Chris: Right. And in fact, this is codified, you know, in graduate education, particularly at the doctoral level. This is the point of defending one's thesis: you know, publishing something or writing the manuscript, answering the research question with some data, and then having a bunch of your professors gather around and say, why did you choose to use, you know, the Cohen's kappa statistic?
[59:36] Michelle: And I mean, I think that would make you, for the next time, maybe a better researcher.
[59:43] Michelle: Because you, maybe it's something that you hadn't taken into account.
[59:49] Chris: Say ChatGPT suggested it, you'll have an answer for that. Because, you know, part of the analytical plan, for sure if you're a student, or even just submitting a manuscript, is to be able to defend how you chose to examine the data. And so what I do nowadays when I want to write an analytical plan is, in my literature review, I've identified analytical plans built on the same research question and the same data, if I can, and then I'll upload my data and I'll tell ChatGPT, look at these five papers that I've also uploaded, look at their analytical plans,
[01:00:41] because I want something similar for the data I just submitted.
[01:00:45] And then in two minutes it will tell me, okay, I have taken your Excel data set and I have offered it up to these legacy statistical analysis platforms, SPSS, R, which would take you a year to learn how to use. But you don't have to do that anymore. And it'll say, here are the results of the analysis of your data, and here's the plan we've used.
[01:01:21] And then you can say, well, tell me a little bit more about that plan. Why didn't you include, you know, this statistic? And it'll say, well, because the data that you gave me are at the ordinal level and they're not at the ratio level. So the assumptions underlying the use of that statistic are not valid.
[01:01:44] So then I learned something. So, great tool. And we can actually, for another episode of the Conversing Nurse podcast, I hope, come back and just talk about AI.
[01:02:02] Michelle: We're going to.
[01:02:03] Chris: And the large language model generative platforms like ChatGPT, which is my favorite platform because it's old, like me.
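The exchange about ordinal versus ratio data is the kind of check you can also run yourself. As a rough sketch, and nothing more than that, assuming a simple two-group comparison with hypothetical column names (this is not Chris's data set or plan): interval or ratio data that meet the usual assumptions might get a t-test, while ordinal scores usually call for a rank-based test such as Mann-Whitney U.

# Minimal sketch of matching the statistical test to the measurement level of the data.
import pandas as pd
from scipy import stats

df = pd.read_csv("survey_scores.csv")    # hypothetical columns: group ("A"/"B"), score
group_a = df.loc[df["group"] == "A", "score"]
group_b = df.loc[df["group"] == "B", "score"]

data_level = "ordinal"    # e.g., Likert-style scores

if data_level in ("interval", "ratio"):
    # Parametric comparison of means (assumes roughly normal, interval-scale data).
    stat, p = stats.ttest_ind(group_a, group_b, equal_var=False)
else:
    # Rank-based comparison; makes no interval-scale assumption.
    stat, p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"statistic={stat:.3f}, p={p:.4f}")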
[01:02:15] Michelle: Okay. So the other thing that the researcher hopes will happen is that medical professionals, whether it's directed towards nurses, physicians, whatever, might change their practice for this better way that was found through research.
[01:02:36] And so an example of this is, when I became the developmental nurse for the NICU, which was back in 2010, I started reading lots of research about the way we feed babies. And there was research from the 1990s, some from the late 1980s, talking about infant-driven feeding or cue-based feeding.
[01:03:05] And they did lots of comparisons as opposed to the way we were traditionally feeding babies, which was on a very rigid schedule with a rigid volume that they had to take in a rigid amount of time.
[01:03:19] Chris: You're like a plant, you know, you're fertilizing a crop.
[01:03:23] Michelle: Exactly.
[01:03:24] Chris: Well, it's Tuesday. We need to put the gypsum on.
[01:03:27] Michelle: Yes. And so other developmental specialists in the field had done research on the way that they fed these babies and it was better for the babies. They had less apnea, bradycardia, desaturation episodes during feeding.
[01:03:45] They retained more of the feeding, they were less stressed, they grew better.
[01:03:51] Chris: But what if the baby wants to eat at change of shift? What then?
[01:03:56] Michelle: What the heck are you doing? Nothing happens at change of shift. Then what are you gonna do?
[01:04:00] And so I'm taking all this data from many people that have done this and I'm bringing it to the NICU and I'm saying, look at all this data.
[01:04:12] It's undeniably in favor of infant-driven feeding or cue-based feeding.
[01:04:20] And I had a lot of convincing to do with a lot of neonatologists who were still practicing from back when we weren't doing that, and everyone was scared to do it.
[01:04:32] And we implemented this project which took about three years and now that's what they use.
[01:04:43] So I was taking studies from the 1980s, 90s, 2000s, because it had been replicated so many times with the hopes that people are going to take this and now implement it in their own NICU and everyone will be better for it.
[01:05:04] So there's the length of time that that takes, because I remember sitting with my manager saying, naively I guess, if this is the best way to do it, why aren't we all doing it this way?
[01:05:25] Chris: A mentor of mine, and I'll even mention his name here because he probably doesn't mind. He's still kind of a mentor, Leland Beggs. Remember him? Great ICU intensivist physician. I used to collect a little booklet of his sayings, I called them Beggsisms. He was my medical director for about 11 years in quality. And he had a little phrase, a little saying, that I wrote down.
[01:05:53] He said, results don't matter, cost doesn't matter, only politics matter.
[01:06:03] Right. So, you know, why do things? And we've talked about this statistic: 17 years. That's the lag time between the information getting out there in the public and it getting into practice. So keep watching. You know, it could happen anytime now.
[01:06:20] Michelle: Well, you take smoking.
[01:06:22] How much research has been done on smoking? Yeah, we all know it's bad for you. It's not going to improve your life in any way but people still smoke.
[01:06:34] Chris: Yeah, they do.
[01:06:35] Michelle: They haven't read the research.
[01:06:37] Chris: 20% or so of people don't wear seatbelts when they're, when they're driving.
[01:06:41] Michelle: Lots of research on that.
[01:06:42] Chris: Yeah, yeah, yeah. But you know the Mark Twainism about lies, damned lies, and statistics. You may know, or you could assume this and you would be correct, that most of the people who are killed in car accidents are wearing a seat belt.
[01:07:07] Michelle: I didn't know that. Yeah, but probably since most people wear their seatbelt.
[01:07:11] Chris: So there's that 20% out there that say, well, you know, if, if all these people in these car accidents wearing these seatbelts are dying. I'm not sure I want to be in that crowd.
[01:07:24] Michelle: I don't want to wear my seatbelt in case my car goes off the road. It's going to trap me in my car.
[01:07:31] Chris: And I'm upside down and I can't get myself out of the seat belt.
[01:07:35] Michelle: That's why someone invented the seat belt cutter.
[01:07:37] Chris: Yeah. And that could happen, but much more likely.
[01:07:41] Michelle: And the car window breaks.
[01:07:42] Chris: Their head just goes through the windshield.
[01:07:50] Michelle: So we've come to the end of our research series. Where we are now, all of our research is disseminated.
[01:08:01] And we sit back and we're waiting for people to challenge us,
[01:08:06] people to say, I'm taking this to my NICU. You have inspired me. I'm going to replicate your study. Please be my mentor. Please come and speak at our conference.
[01:08:21] Chris: And the most gratifying, really, is when I've worked with people who, for example, have successfully led the practice of discontinuing the bedside subQ insulin double checks, where 40% of the doses of insulin in their hospital, given subQ at the bedside, are one unit. And they're in the middle of their medication pass, and they're thinking the whole time, where am I going to get that extra nurse for the double check? So they're distracted. And then they have to go and interrupt the other nurse, who may be doing something more important, like giving a medication. Maybe she's giving 50 units of insulin. Who knows?
[01:09:12] You know, and so then you realize that you're actually inducing, you know, distractions and interruptions. And then if you actually observe and talk to people, you realize no one's doing it right anyway. So it's a lie on top of a lie, with some distraction and interruption on top of it.
[01:09:34] Michelle: And we wonder why medical errors happen.
[01:09:38] Chris: Now, of course, what we really need are fewer diabetics.
[01:09:41] Michelle: There you go.
[01:09:42] Chris: Now we're in quality improvement land. We're, you know, draining the swamp as opposed to swatting the mosquitoes. But maybe that's for another day.
[01:09:52] Michelle: Yes. Well, as we near the end, I'm going to go back to you with this question of why do we do research, and why do we even need research?
[01:10:06] Chris: Research is just answering hard questions in a repeatable, methodical way in a way that can be shared with others who may have the same or similar questions so that they can answer their questions. We need research because clinical practice is rapid and complicated.
[01:10:32] And there is often not time in our clinical practice to make sense of what's going on and to tease out all the possible influences and variables that may cause something to happen that we don't want to have happen, or that we would like to have happen more often. So research is a tool. It's a process for backing away from what's really important, which is the service to our patients and clients, and saying, whoa, we have a question, we have a problem, we have a desire to do something differently, to do something better. Let's back off a couple of rings, let's take a hard look at the process and what's going on, and let's get to a better result.
[01:11:25] Michelle: I love it. All right, next time, AI, okay? This has been a wonderful discussion.
[01:11:34] Chris: It's been fun. Thank you.
[01:11:35] Michelle: A three-part discussion.
[01:11:37] Chris: I really appreciate your probing questions.
[01:11:41] Michelle: Yes, I have a lot of those. And I appreciate your thoughtfulness.
Chris: You're a true expert.
[01:11:47] Michelle: Thank you so much. I am not.
[01:11:49] Michelle: And are you on your way to work?
[01:11:51] Chris: I am. I'm on my way to work.
[01:11:53] Michelle: Wonderful.
[01:11:54] Chris: To solve more problems.
[01:11:55] Michelle: Yes. Through research.