In recent years, information technology increasingly has been used to assist and guide nursing practice. Consequently, both government and professional organizations have recognized the need to incorporate technology into healthcare delivery.1 While adopting this technology has provided opportunities to improve practice and patient outcomes,2 more studies are needed to document the technology's actual effects on patient outcomes3 and how the data these systems generate can be used to assess and predict care.4 While the increased use of technology also has resulted in the need for nurses to demonstrate competence in informatics,1 many nurses may not be prepared to use the technology fully.5 Doctor of Nursing Practice (DNP) providers are in a position to use information technology to monitor and improve patient outcomes and assess practice improvements.6 Indeed, the American Association of Colleges of Nursing and the National Organization of Nurse Practitioner Faculties recommend that DNP graduates be competent in information technology use.7,8 While considerable literature explains the need for nurses to be competent in informatics, minimal literature addresses the course content6 and competencies9,10 needed to ensure such mastery.
BACKGROUND
One difficulty in promoting informatics mastery among graduate-level nurses could be simply a lack of informatics course offerings. A survey of 36 graduate nursing leadership programs in the western United States and Canada revealed that only 33.4% required a course with informatics content and only 16.6% required an elective informatics course.5 In a review of the top 24 online nursing schools identified by US News & World Report in 2012, only four offered informatics content at the doctoral level.1 The results from these two studies indicate that the lack of informatics course content could result in inadequate preparation of advanced practitioners. Given the growing importance of informatics in healthcare, it is likely that the number of doctoral programs with an informatics course has increased since 2012. Even so, doctoral programs, whether through specific informatics course offerings or informatics competencies spread throughout the curriculum, may not contain enough informatics content to produce graduates with full mastery of informatics.
Another difficulty in promoting informatics mastery could be the fact that students enter nursing programs with different informatics competence levels, complicating course design and delivery. In one study, students in BSN, RN-to-BSN, and DNP programs assessed themselves on informatics competencies using the Self-Assessment of Nursing Informatics Competencies Scale.11 While both undergraduate and graduate students were competent in basic computer knowledge and skills, graduate students displayed higher scores on two subscales: clinical informatics role and clinical informatics attitudes.11
In a study in which post-BSN and post-MSN students enrolled in an online DNP informatics course self-reported their competencies, post-BSN students were more competent in four of 18 areas (computer skills in communication, systems, documentation, and informatics knowledge about information management).12 In comparison, post-MSN students were competent in only one of the 18 areas (computer skills in communication).12 Of note, both groups scored lowest on decision support and data access. While these studies' authors suggested that informatics course content needs to be strengthened in several areas, they evaluated only students' self-reported competencies11,12 and did so only during the first week of the course.12 To address these potential study limitations, the current study's authors assessed competencies after each unit of an informatics course was completed.
In a previous study, we examined whether the highest academic degree obtained predicted students' mastery of some informatics competencies in an online DNP informatics course.13 Results showed that post-BSN students performed better in informatics skills, while post-MSN students performed better in analysis and informatics concept application.13 In other words, a greater percentage of post-BSN students than post-MSN students (92% vs 64%, respectively) mastered the competencies of working with datasets and databases. However, the opposite was true for the informatics analysis and application competencies, for which students needed to apply Internet resources to the learning needs of vulnerable patients or analyze the challenges of achieving meaningful use (MU) objectives. On these competencies, post-MSN students performed better than most post-BSN students. Because these conflicting results about the link between education level and competency attainment could result from students' prior clinical experience with informatics, rather than their education level, we concluded that this area needed further examination.
Because technology is increasingly used in practice, it is likely that students enter DNP programs with varying levels of informatics experience,11 a variable that could have a greater effect on students' ability to master content than education alone. In our previous study, we recommended that "informatics course content be tailored to reflect variation in students' educational and clinical backgrounds."13(p366) In the current study, we extended our investigation to examine the effect students' baseline clinical experiences had on their mastery of informatics competencies. We did so because we suspected that such experience could override educational background in attaining competencies in a doctoral-level course. Therefore, the aim of the current study was to extend the knowledge gained from the previous study by evaluating DNP students' prior experience with information technology and assessing whether that experience predicted their ability to master competencies in an informatics course.
METHODS
Design and Ethics Approval
A retrospective descriptive design was used to examine the effects of informatics experience and education on the mastery of informatics competencies. The university's institutional review board designated this study as exempt.
Participants and Setting
A convenience sample of DNP students enrolled in an online informatics course at a state university in Michigan completed a self-assessment of their informatics experience when they completed a self-introduction exercise during the first week of the course.
Data Collection
Information on participants' past experience with informatics competencies was collected by having them answer "yes" or "no" to questions such as "Have you had experience working with MU, datasets, or e-health systems?" Students who responded "yes" were asked to describe their experience for each category.
The course content and competencies measured (analysis of challenges related to MU objectives; datasets; and e-health, ie, applying Internet resources to vulnerable patients' learning needs) have been described previously13 and were selected because they represented basic competencies described in the DNP Essentials.8 The MU and e-health competencies address DNP Essentials III (Clinical Scholarship and Analytical Methods for Evidence-Based Practice), IV (Information Systems/Technology and Patient Care Technology for the Improvement and Transformation of Health Care), VII (Clinical Prevention and Population Health for Improving the Nation's Health), and VIII (Advanced Nursing Practice), while the datasets competency addresses Essentials III, IV, and VII. The e-health competency also addresses requirements in Essential V (Health Care Policy for Advocacy in Health Care). Another competency was added to the current study to assess students' ability to use clinical support systems (CSS) data. In the CSS competency, students were presented with patient scenarios relevant to advanced practice nurses and asked to evaluate evidence-based information using health sciences databases (Clinical Queries within PubMed and DynaMed Plus) to make recommendations. This competency addresses Essentials III and IV.
Faculty assessed participants' competency mastery via their performance on competency assignments given at the completion of each relevant unit. Values based on competency scores were assigned as 1, mastered; 2, competent; or 3, did not master.
Data Analysis
Participants' self-assessment of informatics experience was compared to their competency mastery scores using the Pearson χ2 test. Logistic regression was performed to assess the effect experience and highest degree obtained had on competency mastery, with data collapsed into two categories (mastered or did not master). P values lower than .05 were considered statistically significant. Participants' specific experiences with each competency area also were summarized.
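The analysis described above can be sketched in a few lines of Python: a Pearson chi-square test on an experience-by-mastery contingency table, and an odds ratio with a Wald 95% confidence interval computed from the 2 × 2 table (for a single binary predictor, this is equivalent to the odds ratio a logistic regression would yield). The cell counts below are hypothetical, for illustration only; they are not the study's data.

```python
# Illustrative sketch of the study's statistical approach, using scipy.
# The contingency table counts are HYPOTHETICAL, not the reported data.
import math
from scipy.stats import chi2_contingency

# Rows: MU experience (yes, no); columns: mastery (mastered, did not master).
table = [[20, 9],   # hypothetical: participants with MU experience
         [8, 18]]   # hypothetical: participants without MU experience

# Pearson chi-square test of independence (no continuity correction,
# matching the plain "Pearson chi-square" reported in the text).
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, df = {dof}, P = {p:.3f}")

# Odds ratio and Wald 95% CI from the 2x2 table.
a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```

In the full analysis, a multivariable logistic regression (eg, statsmodels' `Logit`) would be used instead of the 2 × 2 shortcut, so that highest degree obtained could be entered as a covariate when estimating the experience odds ratio.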
RESULTS
Demographics
Data showed that 91% of participants (n = 50) were female and 9% (n = 5) were male; 80% (n = 44) were admitted with a BSN, and 20% (n = 11) were admitted with an MSN. The largest proportion of participants was in the Family Nurse Practitioner track (34.5%), with others in the Adult Geriatric Acute Care (20%), Psychiatric (14.5%), and Adult Geriatric Primary Care (18.2%) tracks.
Mastery of Informatics Competencies
Meaningful Use
Analysis revealed that a significantly higher proportion of participants with MU experience mastered the MU competency (Pearson, P = .004) (Table 1). After controlling for highest degree obtained, MU experience remained a significant predictor of mastery (logistic regression, P = .001). Participants with MU experience had 7.00 (95% confidence interval [CI], 1.90-25.8) times the odds of MU mastery compared to participants without MU experience.
Participants with previous MU experience (n = 29, 53%) identified their experiences as participating in hospital or clinic efforts to address MU (n = 12), electronic charting (n = 8), electronically ordering laboratory tests or medications (n = 5), and receiving education (n = 4).
Datasets and Databases
The following competencies focused on working with datasets (mastery of data entry and data analysis) and databases (mastery of database exploration). For the dataset competency, experience did not predict mastery (Table 1). Interestingly, a greater percentage of participants without dataset experience mastered this competency (Pearson, P = .06). After controlling for highest degree obtained, dataset experience did not predict dataset mastery (logistic regression; odds ratio [OR], 0.64; 95% CI, 0.21-1.97; P = .43).
For the database competency, both participants with and without experience were able to master the competency (Pearson, P = .19) (Table 1). After controlling for highest degree obtained, database experience did not predict mastery of the database competency (logistic regression; OR, 0.87; 95% CI, 0.22-3.50; P = .84).
Participants with dataset and database experience (n = 27, 49%) identified their experiences as working with datasets in a statistics course (n = 22), analyzing patient census or outcome trends and quality improvement efforts (n = 5), completing scholarly projects (n = 2), working with specific order sets (n = 2), and accessing library databases such as PubMed (n = 2).
Clinical Support Systems
Analysis revealed that, while not statistically significant (Pearson, P = .42), a higher proportion of participants with CSS experience mastered the CSS competency (Table 1). After controlling for highest degree obtained, experience with CSS was not a predictor of CSS mastery (logistic regression; OR, 2.39; 95% CI, 0.64-8.87; P = .18).
Participants with CSS experience (n = 24, 44%) had worked with computerized programs that provide alerts for medication interactions, recommendations for condition-specific order sets, and reminders to assess the need for continuing interventions such as catheters (n = 16), facilitation of electronic documentation (n = 3), alerts to CSS staff for discussion (n = 2), and information from UpToDate through Epic (n = 2).
e-Health
A significantly higher percentage of participants with e-health experience mastered this competency (Pearson, P = .037) (Table 1). After controlling for highest degree obtained, experience with e-health remained a significant predictor of mastery (logistic regression, P = .05). Participants with e-health experience had 4.00 (95% CI, 0.96-14.59) times the odds of e-health mastery. Participants with experience in e-health systems (n = 37, 67%) reported using patient portals (n = 15), electronic documentation (n = 15), and nurse references to improve care (n = 8).
DISCUSSION
To ensure adequate preparation of DNP graduates in informatics competencies, faculty must review and revise their curricula to meet the changing needs of the workplace.10 According to Hunter and colleagues,1(p112) "Faculty need to evaluate curricula and their graduates closely to determine if they have been successful in imparting the informatics competencies required by every nurse, where gaps occur in the curricula, and determine strategies to successfully develop these essential skill sets. However, this status report suggests that the integration of informatics content through nursing education is not yet where we need to be." Vottero14 noted that, because students enter programs with varying levels of technology experience, content might need to be leveled and outcomes need to be assessed. In response to these challenges, we have endeavored to extend our previous study13 by assessing other factors influencing student mastery of informatics competencies to provide further information for revising course content.
Previously, we found that education level was a factor in the attainment of some informatics competencies.13 In the current study, we found that prior experience with MU analysis, CSS analysis, and e-health application also affected mastery. This finding underscores the need to perform a thorough baseline assessment of students as a prerequisite for including appropriate informatics course content.
Given the availability and diversity of healthcare technology, nurse educators need to include content on "electronic health records (EHRs), wearable technologies, big data and data analytics, and increased patient engagement as key areas for curriculum development."15(p89) In the current study, we found that prior experience predicted mastery in competencies focused on analysis or application of informatics content. In other words, a greater percentage of participants with experience mastered the competencies focused on MU analysis and applying e-health resources to vulnerable patients. Although not statistically significant, a greater percentage of participants with CSS experience mastered CSS analysis. The descriptive data collected on participant experience with informatics included some very specific examples of their work with MU and CSS. While experience with e-health also predicted mastery of this competency, participant experiences were limited primarily to the use of patient portals and electronic documentation. Based on this study's results, faculty should conduct a baseline assessment of students' experience in all informatics competencies. The survey questions used to determine past informatics experience and the study methods used to gather students' experiences could be adapted to develop a baseline assessment for informatics courses at other institutions.
According to Westra and colleagues, "Nursing leaders need to be knowledgeable, proactive, and engaged in establishing efficient ways to capture, integrate, and use Big Data."9(p305) While we found in our previous study that dataset and database skills were mastered by more students with post-BSN degrees than those with post-MSN degrees (92% and 64%, respectively),13 in the current study, prior experience working with datasets or databases did not affect competency mastery in these areas. However, because participants identified their experiences in these areas as primarily occurring in statistics courses with minimal to no clinical experience, it is possible that student experiences in the current sample were too basic to serve as a primer to produce mastery. Consequently, because nurses need to possess these higher-level skills (managing and interpreting data, and accessing information through public databases), our results indicate that this area requires faculty to undertake careful course planning and possible curriculum redesign. It is likely that nursing students will need several building-block assignments to attain an appropriate level of mastery. Assignments and activities similar to those listed in Table 2 (using and evaluating a minimal dataset in the classroom and then a clinical database to improve care in a clinical experience) could help students develop the skills needed for today's practice environment.
Relevance to Nursing Practice and Education
Graduate nursing schools educate nurses to use evidence-based practice to deliver safe, efficient, and high-quality care. Information technology is used extensively in nursing practice to meet this same goal. The American Association of Colleges of Nursing's Essentials document outlined core curricular elements and competencies for DNP programs, including the requirement that nurses understand and use information systems with specialized content in advanced practice nursing.8 To support their organizations in transforming healthcare using information systems, DNP students must be skilled in MU, dataset and database analysis, CSS, and e-health. Therefore, faculty should assess students' experiences in these fundamental skills to determine their specific, initial learning needs. According to Choi and De Martinis,11(p1975) faculty can use these results "to determine the specific areas of informatics content that need greater focus and inclusion in the design of better nursing educational programmes, so that they ensure all graduates are competent with the fundamentals of informatics in clinical nursing practice." Based on the students' reported baseline experiences, additional learning activities should be designed to improve content mastery. The results of this study provide information that faculty could use to design informatics competencies that prepare nurse leaders by helping them build these fundamental skills (Table 2).
The DNP graduate should be prepared to design, select, use, and evaluate information systems and technology that focus on patient care outcomes. Students lacking basic skills could benefit from a module focusing on minimal MU requirements. Advanced course objectives could be met by such activities as shadowing a hospital nurse informaticist or billing department staff and focusing on performance categories, quality measure reporting, and care coordination strategies related to improving communication and sharing patient information (Table 2).
It also may be necessary to design or update curriculum to ensure that course content helps students develop their ability to evaluate data extracted from clinical information databases. Courses could include working with de-identified patient data via a partnership with a medical center or clinic. The initial learning experience could teach students to identify available databases that support evidence-based practice and quality outcomes (eg, acuity systems, severity of illness systems, and nurse scheduling systems). It is important that such initial learning be followed by a clinical experience embedded in a nursing informatics department. This strategy requires that faculty designing clinical courses coordinate with informatics faculty to understand students' prior learning in this area. Another helpful experience for students would be analyzing a real dataset that measures patient outcomes. For example, students could use data to compare clinical results such as falls and staffing ratios when the falls occurred. The initial focus of the students' learning should be dataset elements, terminology, and evaluation tools. Related classroom activities would allow students to "download and manipulate the data into reports and graphs, draw logical conclusions about the data, and synthesize the data to make decisions."14(pS25) For a subsequent course, a follow-up clinical experience should be designed to include working with real data to improve patient outcomes.
This study supports a curriculum design that focuses on capturing practice data, assembling and analyzing data, interpreting results, and designing interventions to improve these systems and take appropriate actions. Such a design must necessarily be based on an initial assessment of students' experiences in informatics followed by classroom assignments to build informatics knowledge and skills and concluding with informatics clinical experiences. The outcome of this type of curriculum is enhanced student understanding of evidence-based systems and data that can be used to improve quality outcomes and reduce patient risks (Table 2).
This study's results indicate a need to provide learning on CSS, MU, and e-health. Clinical support systems content should focus on tools that support decisions clinicians make to improve patient care. Students should be given an experience in which they evaluate available EHR rules, alerts, best practice alerts, and reports that can help clinicians with decision making. Finally, students need to be able to evaluate consumer health information for accuracy, timeliness, and appropriateness. Students could be asked to assess and critique e-health applications for best practices, barriers, and areas for improvement. These systems could include telehealth, Web pages, phone applications, emails to clinicians, and blogging. In subsequent clinical courses, students should be expected to integrate these systems into the care they provide.
This study's results underscore the need to assess students' prior informatics experiences in order to guide the redesign of course content and clinical experiences to ensure mastery of informatics competencies. While it is beyond the scope of this article to include a detailed instructional design focus, several course examples have been provided to help achieve this outcome. Healthcare organizations' chief nurses are looking for DNP-prepared nurses who have mastered these critical competencies to help them transform their organizations to deliver safe, high-quality care and healthcare services.
Limitations
While this study yielded interesting results, it used a small sample size and assessed competencies at a single point in time. Further research with larger samples and assessments at multiple time points is necessary.
CONCLUSION
These results indicate that students with prior experience were more likely to master competencies focused on analysis or application of informatics content (MU or CSS analysis and applying e-health resources to vulnerable patients). However, experience did not affect competency mastery for competencies focused on skills such as working with datasets or databases. These findings suggest that faculty should consider tailoring courses for novice and experienced learners to improve mastery of informatics competencies.
References