Development and Validation of a New Tool to Measure Performance Knowledge and Communication Skill of Multidisciplinary Health Science Learners on Radiation Emergency Preparedness and Response Management

Abstract Objective: The purpose of this study was to design, develop, and validate a new tool on radiation emergency preparedness and response (the RadEM-PREM IPE tool) to measure the knowledge, performance skills, and communication skills of multidisciplinary health science learners. Methods: The study was a prospective, single-center pilot study. Five subject experts designed, analyzed, and selected the items of the instrument for relevant content and domain. The psychometric properties assessed were content validity, internal consistency, test-retest reliability, and the intraclass correlation coefficient. Twenty-eight participants completed test-retest reliability testing for validation of the 21 retained items, for which the percentage of agreement (>70%), the I-CVI/UA (item content validity index with universal agreement), and the S-CVI/UA (scale content validity index with universal agreement) were calculated. Results: Items with a percentage agreement >70% and an I-CVI over 0.80 were kept, those ranging from 0.70 to 0.78 were revised, and those below 0.70 were rejected. Items with kappa values ranging from 0.04 to 0.59 were revised, and those with values ≥0.74 were retained. Internal consistency assessed using Cronbach's alpha was 0.449. Positive correlations between attitude and communication (r = 0.448) and between performance and communication (r = 0.443) were statistically significant at the 0.01 level. The overall intraclass correlation coefficient for all measures was 0.646, which was statistically significant at the 0.05 level (P < 0.05). Conclusions: The study concludes that the RadEM-PREM IPE tool can serve as a new measuring tool for assessing the knowledge, performance, and communication skills of interprofessional radiation emergency response team learners.

Emergency preparedness and response (EPR) for handling radioactive materials is a core concern of occupational health and safety in the workplace. Enriching a high-standard, quality work culture by adopting good radiation safety practices in a nuclear medicine setup is a vital step in minimizing hazards and protecting people's health. The lack of knowledge, attitude (behavior), and practiced procedures for handling difficult radiation emergency situations is alarming among all stakeholders; it creates panic and consequences that are hazardous to professionals, the public, patients, and the environment. To assess and evaluate learners' pre-existing level of learning, the learning acquired over time, and continuous improvement in content delivery and modes of training administration, as per Kirkpatrick level 2 evaluation, 1,2 an appropriate assessment tool is needed; at present, no such tool exists for radiation emergency response team evaluation. The design, formatting, validation, and reliability testing of close-ended questionnaires for analyzing psychometric parameters is a systematic process that is vital for evaluating educational programs. Along the same lines, it is vital to design and develop a reliable, valid, evidence-based radiation emergency response assessment tool for the hospital setup.

Aims and Objectives
The aim of the present study was to develop a valid and reliable tool on radiation emergency preparedness and response management (RadEM-PREM) and to explore the psychometric measures of the instrument. Once this instrument is ready for intervention, it will help to evaluate the net learning and the learning gaps of future-ready multidisciplinary health science graduate students. The objective of the study was to design, develop, validate, and check the reliability of the items of the "Radiation Emergency Preparedness and Response Management (RadEM-PREM) Interprofessional Evaluation (IPE) assessment tool," which is lacking at present. This RadEM-PREM IPE evaluation tool would be significantly helpful for assessing the readiness of health science interprofessional learners in team building for radiation emergency preparedness and response.
The process adopted in this work comprises designing the item content, content validation, face validation, and test-retest analysis of the items to check their reliability for use as an assessment tool, as recommended by Considine. [3][4][5] In the long run, this work will be effective and productive for assessing and evaluating the pre-existing learning level, net learning attributes, and learning lag, thereby bridging continuous improvements into future training and checking interprofessional learners' readiness for building a multidisciplinary radiation emergency response team.

Ethical Committee Approval
This work was duly approved by the KMC and KH Institutional Ethics Committee vide IEC: 1017/2019 in December 2019.

Study Design
The study is a prospective, cross-sectional, single-center pilot study conducted in India.

Participants
Items were generated by 5 subject experts who designed the instrument for relevant content and domain specifications, following the stages of a systematic good-practice process for questionnaire design and development. The tool is hypothesized to initiate teamwork capacity building in the area of radiation emergency preparedness and response in a hospital setup. The psychometrics of the tool were assessed in terms of content validity, internal consistency, and test-retest reliability.
For the 3 domains of knowledge, performance skill, and communication skill, items were selected based on the high percentage relevance of content-domain compatibility. In the next step of validation, the 5 subject experts validated the item contents: substituting wording to remove ambiguity, modifying and re-modifying items, and screening them for the levels of learning in the radiation emergency response area. Content validity was assessed through review of the scale by a panel of 5 content experts with professional expertise in radiation protection. These 5 content experts were experienced teaching faculty, professional members of the Nuclear Medicine Physicists Association of India (NMPAI) or the Association of Medical Physicists of India (AMPI), or professionals with more than 10 y of practical experience in radiation safety and protection in the field of medical radioisotope handling. All the item content experts and validators hold at least a postgraduate degree in science, along with a postgraduate degree/diploma in medical physics or nuclear medicine or relevant professional qualifications and training in radiation protection.
Twenty-eight radiation safety expert professionals completed test-retest reliability of the items for validation of the 21-item radiation emergency preparedness response management questionnaire (RadEM-PREM tool), designed and developed for multidisciplinary interprofessional education (IPE) health science learners.

Study Size
In the present study, 92 items were screened down to 42 items in the first stage. In the next stage, 21 items were selected across 3 domains, namely knowledge learning attributes (KS), performance skills (PS), and communication skills (CS), based on the content validity index and high relevance within their respective domains. All 21 selected items had a percentage of agreement greater than 70%, and each item's content validity index with universal agreement (I-CVI/UA) and the scale content validity index with universal agreement (S-CVI/UA) were calculated. Of the 21 items, 7 belong to the knowledge domain, 7 to the performance domain, and 7 to the communication domain.
The inclusion criterion for the present study was working radiation professional experts in India. Entry-level, future-ready radiation professionals were excluded.

Measurement Tool Used
The measurement tool comprises close-ended questionnaire items with 5 options each, which were curated and sent to the subject experts. The experts were then asked to rate each item on relevance, clarity, grammar/spelling, ambiguity, and sentence structure. Substitution of content, editing, and correction were adopted to ensure the domain-wise relevance of the items. The item-level content validity index (I-CVI), the scale-level content validity index with universal agreement (S-CVI/UA), Cronbach's alpha coefficient, Pearson's correlation coefficient, the intraclass correlation coefficient, and the percentage relevance of the items are either discussed or tabulated.

Results
The high percentage relevance of content-domain compatibility on the 4-point rater scale, used to initiate the instrument design process by the 5 validators, is shown in Table 1. In this table, the validators' percentage agreement on the relevancy of the 21 items is tabulated as percentage relevance. The I-CVI, S-CVI, and modified kappa were analyzed for all the questions using Microsoft Excel (Microsoft Corp., Redmond, WA) (Table 2).
Lynn 6 recommended that I-CVIs should be no lower than 0.78 for an item's content to be accepted as valid. The researchers used I-CVI information in revising, deleting, or substituting items. Items with a CVI over 0.80 were kept, and those ranging from 0.70 to 0.78 were revised. Items with a CVI score of less than 0.70 were rejected and not considered for further inclusion or revision. The S-CVI with universal agreement (S-CVI/UA) was calculated by dividing the number of items with an I-CVI equal to 1 by the total number of items. The S-CVI/UA of the 21 items ranged between 0.8 and 1.0, indicating that the items had excellent content validity. The modified kappa was calculated using the formula kappa = (I-CVI − Pc)/(1 − Pc), where Pc = [N!/(A!(N − A)!)] × 0.5^N. In this formula, Pc is the probability of chance agreement, N is the number of experts, and A is the number of experts who agree that the item is relevant. Reliability testing was done; items with a kappa value ≥0.74 were retained, and those with values ranging from 0.04 to 0.59 were revised.
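As an illustrative sketch of the calculations described above (the study itself used Microsoft Excel), the I-CVI, S-CVI/UA, and modified kappa can be computed as follows; the expert ratings in the example are hypothetical, not the study's data.

```python
from math import comb

def i_cvi(ratings, relevant=(3, 4)):
    """Item-level CVI: proportion of experts rating the item relevant
    (a 3 or 4 on the 4-point relevance scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

def modified_kappa(icvi, n_experts):
    """Chance-adjusted item validity: kappa = (I-CVI - Pc) / (1 - Pc),
    where Pc = C(N, A) * 0.5**N and A is the number of agreeing experts."""
    a = round(icvi * n_experts)          # experts who rated the item relevant
    pc = comb(n_experts, a) * 0.5 ** n_experts
    return (icvi - pc) / (1 - pc)

def s_cvi_ua(item_icvis):
    """Scale-level CVI, universal agreement: share of items with I-CVI == 1."""
    return sum(v == 1.0 for v in item_icvis) / len(item_icvis)

# Hypothetical ratings from 5 experts for 3 items (4-point relevance scale)
items = [[4, 4, 3, 4, 4], [4, 3, 4, 2, 4], [4, 4, 4, 4, 4]]
icvis = [i_cvi(r) for r in items]
print(icvis)                              # → [1.0, 0.8, 1.0]
print(round(s_cvi_ua(icvis), 2))          # → 0.67
print(round(modified_kappa(1.0, 5), 2))   # → 1.0
```

With 5 experts, an item endorsed by all 5 has Pc = 0.5^5 ≈ 0.031, so its modified kappa equals 1.0; an item endorsed by 4 of 5 (I-CVI = 0.80) yields a lower kappa, which is why items in that band were revised rather than retained.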
The reliability of the developed tool was assessed by a test-retest of the questionnaire items and by measuring Cronbach's alpha coefficient.
The test-retest of all 21 items was administered online through Microsoft Forms to 28 radiation safety professionals working in India within a 4-week interval. The internal consistency of the subscale was assessed using IBM SPSS-16 software, with Cronbach's alpha reported as 0.449.
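The internal consistency statistic reported above can be reproduced with a short script; this is a minimal sketch of the standard Cronbach's alpha formula (the study used SPSS), and the response matrix here is hypothetical.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                      # number of items
    item_vars = scores.var(axis=0, ddof=1)   # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 4 items on a 5-point scale
data = [[3, 4, 3, 4],
        [2, 2, 3, 2],
        [4, 4, 4, 5],
        [1, 2, 1, 2],
        [3, 3, 4, 3],
        [2, 3, 2, 3]]
print(round(cronbach_alpha(data), 3))  # → 0.935
```

A value of 0.449, as obtained in this study, indicates weak internal consistency, consistent with the small retest sample discussed in the Limitations and Discussion sections.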
Pearson's product-moment correlation was used to assess the correlations between the subscales of the questionnaire for item discrimination analysis, as advocated by Haladyna. 7 A highly positive correlation was observed between communication and knowledge (r = 0.728). Moderate correlations were observed between performance and knowledge (r = 0.544) and between performance and communication (r = 0.443). All 3 correlations among knowledge, communication, and performance were statistically significant at the 0.01 level (Table 3).
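A minimal sketch of this subscale correlation step follows; the per-respondent subscale totals are hypothetical, invented only to show the shape of the calculation.

```python
import numpy as np

# Hypothetical per-respondent subscale totals (knowledge, performance, communication)
knowledge = [21, 18, 25, 15, 22, 19]
performance = [20, 17, 24, 16, 21, 18]
communication = [22, 16, 26, 14, 23, 18]

# Pairwise Pearson product-moment correlations; r[i, j] is the
# correlation between subscale i and subscale j
r = np.corrcoef([knowledge, performance, communication])
print(np.round(r, 3))
```

The off-diagonal entries of the resulting symmetric matrix correspond to the pairwise r values reported in Table 3; significance testing of each r (the 0.01 level cited above) would be done separately, e.g., with a t-test on n − 2 degrees of freedom.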
Overall, the intraclass correlation coefficient for all the measures was 0.646, which was statistically significant at the 0.05 level (P < 0.05).
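The article does not state which ICC model was used; as an illustrative sketch only, assuming a two-way mixed, consistency, single-measure model (ICC(3,1)), the coefficient can be computed from the ANOVA mean squares as below, with hypothetical test-retest scores.

```python
import numpy as np

def icc_3_1(scores):
    """Two-way mixed, consistency, single-measure ICC(3,1) for an
    (n_subjects, k_raters) score matrix, from the ANOVA mean squares."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    ss_total = ((scores - grand) ** 2).sum()
    ss_subjects = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_raters = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_error = ss_total - ss_subjects - ss_raters
    ms_subjects = ss_subjects / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)

# Hypothetical test (column 1) and retest (column 2) scores for 5 respondents
ratings = [[4, 4], [2, 3], [5, 5], [1, 2], [3, 3]]
print(round(icc_3_1(ratings), 3))  # → 0.921
```

Other ICC variants (e.g., absolute agreement, average measures) use the same mean squares with different combinations, so the model choice should always be reported alongside the coefficient.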

Limitations
This tool was developed and validated in consultation with content experts and peers in radiation protection. Furthermore, the tool was checked for reliability and internal consistency on another group of radiation safety professionals working in India. However, the tool has not been tested on an interprofessional learner sample group from the multidisciplinary health sciences, including medical students.

Discussion
The present study describes a new tool to assess the knowledge, performance, and communication skills of interprofessional radiation emergency team learners, following the recommendation of Lynn 6 to use 5 subject experts. In the design and development of this tool, all the recommended steps (Polit and Beck, 8 Polit et al. 9 ) were followed to obtain content validity evidence.
The example provided was taken from a construction process rather than an adaptation process. Thorndike 10 advocates that both reliability coefficients and alternative-form correlations be reported, and this study follows that practice. The low reliability coefficient may indicate that the sample of 28 professionals was small. Another possible reason for the low coefficient, as Rattray and Jones 11 reported, may be the close-ended questions, which restrict the depth of participants' responses and hence diminish, or render incomplete, the quality of the data. A further possible reason is that the professionals who participated in the retest did not pay the required attention to the items, were disinterested, or were distracted. A solution would be to conduct item discrimination analysis with a larger panel of experts. To compensate, an alternative-form correlation was also calculated in this study following the guidelines, and it is greater than the reliability coefficient (r = 0.66). Peer review practices using guidelines led to improved psychometric characteristics of the items. 12 This instrument has been constructed and has undergone testing by a limited number of experts as pilot data collection, and the results are promising; test-retest reliability still needs to be checked in either a larger expert panel or a multidisciplinary health science learner group after a sufficient span of time.
The new "RadEM-PREM IPE assessment tool" would be very useful for the assessment of multidisciplinary team building in the area of radiation emergency preparedness and response for entry-level and intermediate-level emergency response team members.

Conclusions
This study concludes that the 21-item "RadEM-PREM IPE assessment tool" is a valid and reliable new measuring tool for assessing the knowledge, performance, and communication skills of interprofessional radiation emergency response team learners in hospital settings.

Appendix
Radiation emergency preparedness response management interprofessional evaluation (RadEM-PREM IPE) assessment tool for multidisciplinary health science learners. Preamble: This 21-item tool will be used to survey awareness of, and readiness for, radiation emergency preparedness and response management in case of any mishap, among future-ready interprofessional learners (IPL), because the eligible participants will coordinate radiation work in the course of their profession in a hospital workplace sometime in the future. The questionnaire therefore assesses 3 main domains, knowledge (KS), communication skill (CS), and performance skill (PS), among multidisciplinary health science students from medical undergraduate, nursing midwifery, nursing graduate, dental graduate, allied health professional graduate, and similar courses. The allied health science graduate students will be from nuclear medicine technology, medical imaging technology, and emergency medical technology, and the postgraduate allied health science students will be from nuclear medicine and medical radiation physics. This tool will be used on IPE learner groups both before and after delivery of interprofessional educational awareness teaching-learning materials, at a sufficient time interval. After responses are collected online/offline, the data will be analyzed to assess the change in readiness for learning together and bringing it into collaborative practice in the hospital.

Table 3. Pearson correlation coefficient for the questionnaire.

https://doi.org/10.1017/dmp.2023.80 Published online by Cambridge University Press. S Bhushan et al.

This developed tool is part of the M-FIILIPE FAIMER-USA project study titled "Capacity Building Initiative for Radiation Emergency Preparedness and Response in Nuclear Medicine: An Interprofessional Education Approach."

[Appendix residue: only the answer options of several questionnaire items survive here (e.g., "Wipe the area in single outward movements or circular motion with absorbent sheet or cotton using tongs or forceps"); the question stems were lost in extraction, and the complete items appear in the published appendix.]

Claimer: (copyright) Permission is required from the author before using this tool for academic or research purposes.