Read the attached:
How Can Technology Support Quality Improvement? Lessons Learned from the Adoption of an Analytics Tool for Advanced Performance Measurement in a Hospital Unit
A successful quality improvement (QI) initiative is essential to help a health care organization boost efficiency and improve its business model. It is critically important that the health care organization measures and monitors whether the anticipated outcomes are achieved. Health information technology (HIT) is often used to support systematic QI efforts by providing timely and valuable feedback on performance.
For your Quality Improvement Initiative final project, you must complete an eight- to 12-page paper that details your University of Iowa Hospital & Clinics QI initiative for the quality improvement measure "Patients who 'Strongly Agree' they understand their care when they left the hospital," found under the "Survey of Patients' Experiences" category.
- Summarize your Quality Improvement Initiative: Part 1 and Part 2.
- Assess an appropriate advanced HIT that can be used to support your QI initiative.
- Be sure to provide the rationales for your selection.
- Describe how your QI initiative can be incorporated into the organization’s overall strategic plan.
- Determine how to evaluate the effectiveness of your QI initiative.
Must use at least eight scholarly or peer-reviewed sources published in the past five years in APA Style.
RESEARCH ARTICLE Open Access
How can technology support quality improvement? Lessons learned from the adoption of an analytics tool for advanced performance measurement in a hospital unit
Sara Tolf1*, Johan Mesterton1,2, Daniel Söderberg1, Isis Amer-Wåhlin1,3 and Pamela Mazzocato1,4
Abstract
Background: Technology for timely feedback of data has the potential to support quality improvement (QI) in health care. However, such technology may pose difficulties stemming from its complex interaction with the setting in which it is implemented. To enable professionals to use data in QI, there is a need to better understand how to handle this complexity. This study aims to explore factors that influence the adoption of a technology-supported QI programme in an obstetric unit through a complexity-informed framework.
Methods: This qualitative study, based on focus group interviews, was conducted at a Swedish university hospital's obstetric unit, which used an analytics tool for advanced performance measurement that provided timely and case mix adjusted feedback of performance data to support QI. Data were collected through three focus group interviews conducted with 16 managers and staff. The Nonadoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework guided the data collection and analysis.
Results: Staff and managers deemed the technology to effectively support ongoing QI efforts by providing timely access to reliable data. The value of the technology was associated with a clear need to make better use of existing data in QI. The data and the methodology in the analytics tool reflected the complexity of the clinical conditions treated but were presented through an interface that was easy to access and user friendly. However, a prior understanding of statistics was helpful for fully grasping the presented data. The tool was adapted to the needs and organizational conditions of the local setting through a collaborative approach between the technology supplier and the adopters.
Conclusions: Technology has the potential to enable systematic QI by motivating professionals through timely and adequate feedback on performance. The adoption of such technology is complex and requires openness to gradual learning and improvement.
Keywords: Obstetrics, Quality improvement, Technology, Complexity, Performance measurement
* Correspondence: [email protected] 1Department of Learning, Informatics, Management and Ethics, Medical Management Centre, Karolinska Institutet, Stockholm, Sweden Full list of author information is available at the end of the article
Tolf et al. BMC Health Services Research (2020) 20:816. https://doi.org/10.1186/s12913-020-05622-7
Background
Quality improvement (QI) in health care has been described as the combined and continuous actions that lead to better patient outcomes, better system performance, and better professional development [1]. For QI to be effective in health care, performance measurement is required to provide feedback to professionals and organizations on the quality of care provided [2]. Feedback on performance may increase health care professionals' and managers' learning in ways that can result in changes in practice that can be retained, modified, or rejected [1–4]. However, despite many health care organizations engaging in feedback and QI, few manage to consistently improve quality and sustain results over time [4, 5].
In part, this may be explained by the challenges involved in providing relevant feedback to professionals. Analyses of interventions involving audit and feedback show mixed results, with a number of different factors related to the type of feedback affecting its effectiveness [6]. One hindrance relates to the requirement that performance data must be fed back continuously and in a timely manner [7, 8]. A systematic review reveals that the QI tool referred to as Plan-Do-Study-Act (PDSA) is often used without access to data at weekly or even monthly intervals [4]. Research even shows that time lags in data feedback can be as long as three years [8]. Consequently, health care professionals and managers are unable to continuously evaluate changes [4]. Lack of trust in the underlying data also affects the use of feedback, and studies show that case mix adjustment, i.e. adjustment of data for differences in patient characteristics, is an important factor for trust among professionals [7, 9].
Improvements in technology have advanced the ability to capture, process, analyze, and present data [10]. While performance measurement used to be a largely manual process, technical solutions can now incorporate data from different databases (e.g., claims data, quality registers, and electronic medical records [EMRs]), adjust for differences in patient characteristics, and quickly analyze data.
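As an aside on what case mix adjustment involves in practice, the following is a minimal illustrative sketch of one common approach, indirect standardization: a risk model estimated from patient characteristics yields an expected outcome probability per patient, and a unit's observed rate is compared with the rate expected for its own patient mix. The example is not taken from the article; the simulated data, variable names, and the choice of logistic regression are all illustrative assumptions.

```python
# Hypothetical sketch of case mix adjustment via indirect standardization.
# All data and variable names are invented for illustration; the article's
# actual analytics tool (Era) is not shown here.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
# Simulated patient-level data: characteristics, a binary outcome
# (e.g., caesarean delivery), and the treating unit.
df = pd.DataFrame({
    "age": rng.normal(30, 5, n),
    "previous_caesarean": rng.integers(0, 2, n),
    "comorbidity": rng.integers(0, 2, n),
    "unit": rng.choice(["A", "B", "C"], n),
})
logit = -2.0 + 0.03 * df["age"] + 1.2 * df["previous_caesarean"] + 0.8 * df["comorbidity"]
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# 1) Fit a risk model on patient characteristics only (not on the unit).
X = df[["age", "previous_caesarean", "comorbidity"]]
model = LogisticRegression().fit(X, df["outcome"])
df["expected"] = model.predict_proba(X)[:, 1]

# 2) Compare observed and expected rates per unit: an O/E ratio above 1
#    means worse-than-expected performance given that unit's case mix.
summary = df.groupby("unit").agg(observed=("outcome", "mean"),
                                 expected=("expected", "mean"))
summary["oe_ratio"] = summary["observed"] / summary["expected"]
print(summary.round(3))
```

In real systems the risk model would be estimated on large reference populations (as in the cross-regional Sveus benchmarking described below) rather than on the local unit's own data.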
However, despite the potential for technology to support QI, the implementation of such programmes is often challenging and may yield less than satisfactory results [11, 12]. A wide body of evidence indicates that several attributes of a technology itself, such as its potential benefits, user-friendliness, compatibility with organizational values, and its complexity, influence the innovation's adoption into health care organizations [13, 14]. During the last decades, increasing academic attention has also been given to contextual factors, such as adopters, organizational aspects, and leadership, as well as political and economic influences, that affect the adoption of technologies [11–16]. Greenhalgh et al. suggest that factors influencing the adoption of innovations in health care can be categorized into seven different domains pertaining to the technology itself as well as the system into which it is being introduced [13, 17]. Furthermore, their research shows that it is not individual factors themselves, but rather the dynamic interaction between them, that determines the adoption of technological innovations in health care. Health care organizations tend to underestimate this complexity [17]. Technologies tend to be over-simplified, poorly prototyped, and inappropriately customized, which can result in early rejection and abandonment [12]. A deeper understanding of the dynamic interaction between the technology and the context in which it is implemented can guide successful adoption [13, 18].
Thus, this study aims to explore factors that influence the adoption of a technology-supported QI programme in a hospital unit through a complexity-informed framework.
Methods
This qualitative study, based on focus group interviews, was conducted at the obstetric unit of an obstetrics and gynecology (OB/GYN) department at a Swedish university hospital. We selected the unit because of its work with an innovative, technology-supported QI programme.
Theoretical framework
This study explores the adoption of a technology for advanced performance measurement to support a QI programme. The study was guided by a theoretical framework that was specifically developed to understand how complexity influences the adoption of technology-supported programmes, i.e. the Nonadoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework [13].
The framework has previously been used empirically to explain the success or failure of technology adoption in health care [13, 19]. Seven domains (D) are included in the framework: clinical condition, technology, value proposition, adopter system, organization, wider system, and embedding and adaptation over time (Table 1). Building on complexity theory [18, 20], the framework suggests that each domain can be classified as simple ("straightforward, predictable, few components"), complicated ("multiple interacting components or issues"), or complex ("dynamic, unpredictable, not easily disaggregated into constituent components"). The framework helps to understand how the complexity inherent in each domain and the interactions between the domains can influence the success or failure of a technology-supported programme [13]. The more complex technology-supported programmes are, the more these interactions can be expected to be constantly changing, unpredictable, and non-linear [18, 21].
Setting
The obstetric unit in the OB/GYN department provides birth care for 4,200–4,300 women annually, as well as related pathology and recovery care. Beyond its regional catchment area, the university hospital in which the unit is located also receives referrals from northern Sweden for fetal medicine.
In 2013, the Swedish government approved financing for a national, cross-regional research and development project called Sveus, which provided the foundation for the clinic's QI programme. Sveus aimed to develop methodologies for continuous measurement of case mix adjusted performance. This research resulted in the launch of a cross-regional analytics tool for case mix adjusted benchmarking of outcomes, resource use, and care processes. Birth care was one of the six patient groups initially addressed in the project [22–24]. Moreover, the government decided to put improvement of birth care and women's health on the political agenda and pledged significant national funds to the regions between 2015 and 2022 [25].
The technology-supported QI programme
In 2017, the obstetric unit launched a technology-supported QI programme as part of a hospital-wide Value Based Health Care (VBHC) effort. The programme involved the use of an analytics tool, Era (Ivbar Institute AB, Stockholm, Sweden), for advanced performance measurement; the unit paid a license fee for use of the tool. The tool included data from several sources, one of which was the cross-regional benchmarking tool used by the regions participating in Sveus. This data was primarily based on information from patient administrative systems and included algorithms for case mix adjustment [23]. This data was combined with clinical data from local EMRs to enable tracking of local performance on a wide array of indicators. Different dashboards for performance measurement were available to the clinic through web interfaces, and all were updated weekly to ensure timely feedback of performance. The dashboards included:
1) An overview dashboard for the manager group with granular information on volumes, indicators of care process, resource use, outcomes, and patient satisfaction. The dashboard included information on performance in different subgroups of patients, tracked ongoing improvement projects, and presented information on indicators where the clinic performed better or worse than expected, based on case mix-adjusted performance information from the cross-regional tool.
2) Dashboards available to managers and staff actively engaged in the QI programme, with detailed information on the development of indicators related to ongoing improvement projects, including analyses of different subgroups of patients.
3) A dashboard made available to the entire staff, with information about selected important performance indicators, a list of indicators where the clinic performed better or worse than expected, and information on ongoing improvement projects (an illustrative sketch of such an observed-versus-expected comparison follows this list).
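The article does not detail the tool's internal computations, so the following is only a rough, hypothetical sketch of how a weekly "better/worse than expected" flag for dashboard indicators might be derived from observed counts and case-mix-adjusted expected rates. All indicator names, numbers, and the z-score threshold are invented for illustration.

```python
# Hypothetical sketch of a weekly "better/worse than expected" flag for
# dashboard indicators. The expected rates stand in for case-mix-adjusted
# benchmarks; all names and numbers are invented.
import numpy as np
import pandas as pd

weekly = pd.DataFrame({
    "indicator": ["caesarean_delivery", "labour_induction", "postpartum_infection"],
    "cases": [18, 25, 4],                 # observed events this period (local EMR data)
    "volume": [85, 85, 85],               # deliveries this period
    "expected_rate": [0.17, 0.24, 0.06],  # case-mix-adjusted benchmark
})

weekly["observed_rate"] = weekly["cases"] / weekly["volume"]
# Normal approximation to the binomial: how far is the observed rate from
# the expected rate, in standard errors?
se = np.sqrt(weekly["expected_rate"] * (1 - weekly["expected_rate"]) / weekly["volume"])
weekly["z"] = (weekly["observed_rate"] - weekly["expected_rate"]) / se

def flag(z: float, threshold: float = 2.0) -> str:
    """Label an adverse-event indicator relative to its expected rate."""
    if z > threshold:
        return "worse than expected"
    if z < -threshold:
        return "better than expected"
    return "as expected"

weekly["status"] = weekly["z"].apply(flag)
print(weekly[["indicator", "observed_rate", "expected_rate", "status"]].round(3))
```

In practice, a tool like the one described would draw the expected rates from case mix adjustment models and the observed counts from local EMR feeds, rather than from hard-coded values as in this sketch.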
The adoption of the tool in 2017 led to the launch of five improvement initiatives organized into multidisciplinary QI teams consisting of physicians, midwives, and assistant nurses. These teams focused on reductions in the rates of caesarean deliveries, labour inductions, post-partum infections, newborns with low Apgar scores, and urinary retention. The choice of improvement initiatives was largely based on improvement potential identified in the cross-regional benchmarking performed within Sveus.
Table 1 Description of the seven domains in the NASSS framework
Condition (D1): The nature or characteristics of the condition or diagnoses that the technological innovation addresses, as well as relevant co-morbidities and sociocultural aspects.
Technology (D2): The technological features of the innovation, such as its design and perceived usability, the quality and reliability of the knowledge generated, and the skill and support needed to use the technology. It also concerns the long-term sustainability of the technology, such as the possibility of adaptations and potential market dynamics that may affect the future availability of the product.
Value proposition (D3): The expected value of the technological innovation, both from a supply-side business-model view and from the perspective of the health provider, weighing potential benefits for patients against costs of procurement.
Adopter system (D4): Changes in staff roles or responsibilities that threaten professional identities are factors that add complexity and may impede implementation of new innovations. The domain also includes expectations of patients' or their caregivers' knowledge and involvement in innovation adoption.
Organization (D5): The organisation's readiness to adopt new technology, how the decision to implement the technology into the organisation was made, and how that decision was motivated and funded. Disruptions to established work routines and the amount of work required to adopt the new technology may also affect the organisational response.
Wider system (D6): The political, financial, regulatory/legal, and social context that may influence the adoption and success of the technology in the organisation.
Embedding and adaptation over time (D7): The possibility of "coevolving" the technology with the changing context within the organisation, and the resilience of the organisation in adapting to unforeseen events, both of which can affect the ability of the organisation to retain and further develop the technology over time.
Data collection
Data were collected in three focus group interviews (16 informants, with 4–6 participants in each group) in September and October of 2018. We chose to conduct focus group interviews because QI at the unit was conducted in multi-professional teams, and we therefore wanted to promote group discussions around the potential benefits of using technology to support QI [26]. A semi-structured interview guide with open-ended questions (Additional file 1) was used that addressed the seven domains in the NASSS framework (Table 1), and thus included questions concerning: D1) what characterizes the patient group treated in the unit; D2) how the tool was used to support QI; D3) the perceived value of the tool; D4) changes in the adopter system needed for the use of the tool; D5) organizational aspects related to the adoption; D6) aspects of the wider system that influenced the adoption; and D7) the embedding and adaptation of the tool over time. The Consolidated Criteria for Reporting Qualitative Research (COREQ), a 32-item checklist, was used to enhance the reporting of the findings [27].
The multidisciplinary research team consisted of a health economist (JM), two physicians (IAW, DS), a sociologist (ST), and a senior researcher in medical management (PM). Collectively, the team has experience in the theory and practice of QI and organizational change (PM, IAW, ST), obstetrics (IAW), performance measurement (IAW, JM), health economics (JM), and medical management (PM, IAW, ST).
Purposive sampling was used to select informants who could provide rich and diverse perspectives on the potential benefits and challenges of technology-supported QI [28, 29]. Based on this, we included managers, staff actively engaged in the QI programme, and staff not actively engaged in the QI programme. The latter were included because we expected them to have experienced the technology-supported QI programme even if they were not directly involved. The diversity among the informants was intended to provide varied experiences and perceptions [26].
The head of the obstetric unit was tasked with identifying possible informants who met the selection criteria, as she had the main responsibility for the QI programme and hence knew which personnel were involved in the programme. She recruited informants by speaking with them directly or emailing them; no informant drop-out was recorded during the recruitment process. The participants were divided into three groups: managers, staff involved in QI, and staff not involved in QI. The informants included are summarized in Table 2.
The interviews lasted from 75 to 90 minutes each and were audio-recorded and transcribed. A facilitator (PM or ST) led the interviews, and one or two researchers (DS and/or ST) observed them. Before the interviews, participants were verbally informed about the educational background and field of interest of the interviewers. Interviews were conducted at the participants' workplace.
Data analysis
We analyzed the data through directed content analysis, i.e. a deductive approach to analysis [30, 31]. We chose a deductive approach because the NASSS framework had previously identified key domains important to consider in the adoption of technology-driven programmes. Therefore, we developed an a priori code book based on the definitions of the domains in the NASSS framework [13].
The content analysis process followed seven steps. First, two researchers (ST and DS) read the transcribed interviews to get a sense of the material. Second, these two researchers condensed the data into condensed meaning units, i.e. reduced the meaning units into shorter text. One interview was condensed by both researchers independently, and they then compared their results to ensure consistency in the level of condensation. Thereafter, DS condensed the text of the second interview into condensed meaning units and ST the third interview. Third, the condensed meaning units were printed and placed randomly on a table; ST, DS, and JM sorted the condensed meaning units independently and in silence into the a priori defined categories (D1–D7) based on the NASSS framework and developed subcategories. The subcategories were identified by grouping condensed meaning units with related meanings. Fourth, all authors revised the sorting of the condensed meaning units into the a priori defined categories and together further developed the subcategories, informed by negotiated consensus [32]. Fifth, the authors developed synthesized descriptions of the empirical data based on the subcategories. Sixth, all authors read through the descriptions of each domain and independently categorized the domains as simple, complicated, or complex. All authors articulated their reasoning, and discrepancies were identified, discussed, and resolved. Seventh, validation with participants from the focus groups as well as other employees was performed by PM and ST by presenting the results in a workshop. All participants in the validation session were asked to independently mark the emerged categories within the a priori domains with agree or disagree. Informants confirmed that the findings mirrored their experience. Microsoft Word and NVivo 12.0 were used to manage the data. The datasets generated and/or analysed in this study are not publicly available in order to maintain confidentiality, but de-identified data are available from the corresponding author on reasonable request.
Table 2 Participants of focus groups
Group               Staff not involved   Staff involved   Managers   Total
Physicians          2                    2                2          6
Midwives            2                    2                2          6
Assistant nurses    2                    2                0          4
Total               6                    6                4          16
Results
The results section first presents the empirical findings, based on the subcategories that were identified (Table 3) and linked to each of the seven domains in the NASSS framework, followed by an analysis of how the complexity inherent in each domain and the interactions between the domains influenced the QI programme.
Condition: pregnancy spans from simple to complex (D1)
Representatives from all focus groups described that the unit treated a broad patient population: both emergency and elective care took place at the unit, and some patients were low-risk women in normal labour while others were high-risk patients with complex conditions such as premature delivery and maternal co-morbidity.
Technology: a practical but not trivial analytics tool (D2)
Representatives of the managers and staff involved in QI described that the case mix adjustment feature made the data more relevant compared to unadjusted data and counteracted the practice of justifying poor performance outcomes with misconceptions about patient complexity.
(One informant) – And the case mix adjustment has made a difference. Before we blamed a lot on the fact that our patients are so special. (Another informant) – Yes, absolutely, [we said] “We have so difficult pa- tients” and “It’s a little bit special here”. [Managers]
Staff involved in QI teams expressed that the level of data detail was generally high, which was considered important for its usage, although in some cases it was too coarse. Moreover, both managers and staff involved in QI perceived the timeliness of data feedback to be relevant and useful.
(One informant) – Now it is more easily accessible. (Another informant) – Quickly look at recent data that are divided into different focus areas so that you can quickly get an overview, as you say: "The last month something has happened, it is suddenly 30% caesarean sections, what should we do?" [Staff involved in QI teams]
The managers said the dashboards were easily accessible on devices such as computers, smartphones, and tablets. The graphic presentation of the data via the web interface was perceived as understandable, user-friendly, and clear by managers and staff involved in QI teams. The staff who were involved in QI teams reflected on their different preferences concerning the visual presentation of data, and they suggested that further improvements in the interface would increase data accessibility even more. The two staff groups said more guidance and support was needed for identifying, selecting, extracting, and understanding relevant data. Involved staff also described that a prior understanding of statistics was helpful for fully grasping the presented data.
Representatives from the manager group and staff involved in QI said that the relationship with the supplier made it possible to customize the analytics tool to local needs and conditions. For example, in addition to indicators established through the cross-regional benchmarking, the supplier facilitated measurement of indicators specifically requested by the unit, incorporated local data available in the department's EMR, and adapted to the hospital's system for data transfer. Involved staff described how they contributed their clinical knowledge to identify what data was needed to guide QI. The supplier was in turn able to translate these needs into data requests to the hospital IT department.
Value proposition: timely and reliable data (D3)
According to informants from all focus groups, the analytics tool was needed because of existing QI challenges. The managers described that a previous cross-regional report, from the Sveus project, suggested there was room for improvement in a number of areas. This report was said to have motivated the unit to request more data in order to better understand performance and initiate improvement activities. It became clear that there were areas of underperformance in the unit and that the patient mix was not the cause of the variations. The managers described that even before the introduction of the technology-supported QI programme, data were seen as essential to QI in the unit. However, prior to the adoption of the technological innovation, data were often difficult to access and often out-of-date.
(One informant)- Statistics has been our weak spot. We’ve been able to measure but it has been difficult. And really difficult sometimes when we wondered: “How many women with diabetes do we have?” or “How many complications do we have?”. So it was very difficult to get that data (Another informant) –
Table 3 Description of subcategories linked to each domain of the NASSS framework
Condition (D1):
- Broad patient population including both high- and low-risk patients with diverse backgrounds.
- Large birth clinic also accepting patients from other regions.
Technology (D2):
- The platform enabled staff to easily understand data and to gain new knowledge; prior understanding of statistics was helpful.
- The tool was easily accessible.
- Timely feedback of data made it more relevant for QI than the historical data previously used.
- Case mix adjustment made the data more relevant for QI.
- Highly detailed data was important for its use in QI.
- Lack o