
Development process and patient usability preferences for a touch screen tablet–based questionnaire

Abstract

We sought to design a touch tablet asthma questionnaire while identifying patient preferences for usability features of such questionnaires.
We created an evidence-based prototype and employed rapid-cycle design (semi-structured focus group testing, analysis, corresponding modifications, re-testing) with asthma patients aged ⩾16 years. We analyzed transcripts using deductive and inductive content analysis. Quantitative measures included Likert-type-scale responses, the System Usability Scale, and questionnaire completion times.
There were 20 participants across five focus groups (15/20 female, age 49.1 ± 15.6 years). Usability-related themes included (1) “Touch Technology” (hygiene, touch technology familiarity, ease of use) and (2) “Questionnaire Design” (visual characteristics, navigation). Completion time was 11.7 ± 5.9 min. Summative Likert-type scale responses suggested high system usability, as did a System Usability Scale score of 84.2 ± 14.7.
In summary, attention to specific technology- and design-related preferences can result in a highly usable patient-facing touch tablet questionnaire. Our findings can inform touch questionnaire design across other diseases.

Background and significance

Consumer technologies are increasingly being adapted for clinical use, with a goal of facilitating and enhancing health care. Specifically, administration of patient questionnaires through tablet devices has been found to be both feasible and efficient.1,2 Tablet-based patient questionnaires have been used for osteoporosis risk assessment and for pain management and quality of life assessment in chronic diseases such as rheumatoid arthritis, lupus, spondyloarthritis, chronic back pain, and various cancers.3–11
Usability is a major determinant of the uptake of technology-based systems, particularly in complex healthcare settings.12 Yet despite their growing use, few studies have addressed patient usability preferences for touch tablet questionnaires.13–15 We sought to systematically design a highly usable waiting room touch tablet questionnaire for patients with asthma that could be completed within approximately 10 min (about 85% of patients wait 10 min or more in a primary care waiting room)16 and to describe patient preferences for usability features of such questionnaires.

Materials and methods

To achieve our objectives, we employed rapid-cycle design—a process by which practical problems are identified and addressed using incremental analysis.17,18 We collected qualitative data regarding patient preferences and concerns in semi-structured focus groups. We also included the following quantitative measures: Likert-type scale questions, the System Usability Scale (SUS), and questionnaire completion times.19

Study setting and population

The study was approved by the St. Michael’s Hospital Research Ethics Board. We recruited patients aged ⩾16 years with a self-reported physician diagnosis of asthma through clinician referrals and clinic posters in the Greater Toronto Area. We employed purposive sampling to achieve sample heterogeneity with respect to age and electronic technology and touch device experience (from novice to expert).20

Prototype questionnaire design

The purpose of the questionnaire was to collect information about asthma symptoms and medication use. The questionnaire was designed as an electronic tool to be administered in a clinic waiting room. Questionnaire data were later integrated into a point-of-care computerized clinical decision support system (eAMS—the Electronic Asthma Management System)21 based on adult asthma guidelines (targeting patients aged ⩾16 years).22 The questionnaire was delivered through a web-based application on a second-generation Apple iPad (Apple Inc., Cupertino, CA, USA). We chose this device over smaller-screened devices based on prior literature suggesting that patients prefer a device that more closely approximates a standard piece of paper.23
A prototype was first developed by asthma and knowledge translation experts (S.G., L.P.B., A.K.). Where possible, we applied evidence-based electronic questionnaire design principles and included touch device usability features shown to be broadly favorable to patients. The principles included (1) suitability across ages and genders, (2) ability to complete the questionnaire efficiently without prior training, (3) all responses entered by user action (as opposed to default answers), and (4) information necessary to answer each question available on the question screen (without requiring scrolling).24 Specific features included (1) large radio buttons for answer selection, (2) a scrolling system for integral increases or decreases in numerical entry values, and (3) minimal text entry requirements.25 Movement through the questionnaire was enabled by arrow buttons on either side of the screen or by swiping. The questionnaire was viewable in portrait or landscape mode, and screen zooming was enabled.24,26 Italic script was avoided and a uniform color scheme was adopted.26
This prototype (with response-dependent paths yielding anywhere from 7 to 30 question slides) was pilot tested by a convenience sample of 12 users (clinicians and students), who provided unstructured individual feedback on content and usability. The corresponding improvements were made before focus group testing.
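Because the questionnaire was delivered as a web application, the features above map naturally onto standard browser markup. The sketch below is illustrative only and is not the study's code; the Question type and renderQuestion helper are assumptions made for this example. It shows one question per screen with a legible font and large, well-spaced radio buttons.

```typescript
// Illustrative sketch only (not the study's actual implementation): rendering a
// single question slide with large, well-spaced radio buttons in plain
// DOM-manipulating TypeScript.

interface Question {
  id: string;
  text: string;
  options: string[];
}

function renderQuestion(
  q: Question,
  container: HTMLElement,
  onAnswer: (choice: string) => void
): void {
  container.innerHTML = ""; // clear the previous question slide

  const prompt = document.createElement("p");
  prompt.textContent = q.text;
  prompt.style.fontSize = "12pt"; // legible font size (illustrative value)
  container.appendChild(prompt);

  for (const option of q.options) {
    const label = document.createElement("label");
    label.style.display = "block";
    label.style.padding = "12px 0"; // spacing between options reduces mis-taps

    const radio = document.createElement("input");
    radio.type = "radio";
    radio.name = q.id; // one radio group per question
    radio.value = option;
    radio.style.transform = "scale(2)"; // enlarge the touch target
    radio.addEventListener("change", () => onAnswer(option));

    label.appendChild(radio);
    label.appendChild(document.createTextNode(" " + option));
    container.appendChild(label);
  }
}
```

Arrow-button or swipe navigation would then amount to calling such a rendering function with the previous or next question in the response-dependent path.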

Questionnaire testing and outcome assessment

The rapid-cycle design process involved the following (in sequence): (1) questionnaire testing in a focus group, (2) analysis of focus group findings, (3) corresponding modification of the questionnaire, and (4) re-testing of the modified questionnaire in a subsequent focus group.4,27 We performed serial focus group testing until no new categories of usability-related concerns were identified.
Each focus group was facilitated by a moderator and attended by the study coordinator. Sessions lasted 2 h, included 3–5 participants, and were audio-recorded, with recordings transcribed verbatim. Each session was scripted and began with a slide presentation providing background on asthma, the study’s purpose, and session instructions. Participants then used the device to complete a survey collecting background information. Next, they completed the questionnaire on individual devices while being observed by a moderator and a study investigator, who recorded in field notes any questions or features that appeared to be problematic. The moderator then debriefed participants as a group, eliciting opinions on each of two pre-set overall themes: usability-related issues and content-related issues (data from the latter theme are presented separately).28 Finally, participants completed a Likert-type scale–based exit questionnaire that included the validated SUS, which measures effectiveness (ability to complete tasks), satisfaction (subjective reactions to using the system), and system efficiency.19,29

Analysis

Focus group transcripts and field notes were independently analyzed by two team members with qualitative research expertise. Analysts independently generated lists of usability and content issues (and solutions, where applicable) and compared these to generate a final list of issues. Investigators used this list to devise the corresponding system revisions, which were then made before the next session. After the last focus group, analysts coded all transcripts, applying both deductive and inductive content analysis approaches. Consensus on coding was reached through comparison and discussion between analysts.30,31 Demonstrative quotations were collected for each identified sub-theme. Quantitative summary statistics included means and standard deviations, or proportions, as appropriate. The SUS was calculated in accordance with questionnaire instructions.19
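For reference, the SUS scoring rule referenced above is a fixed arithmetic procedure:19 odd-numbered (positively worded) items contribute their rating minus 1, even-numbered (negatively worded) items contribute 5 minus their rating, and the summed contributions are multiplied by 2.5 to yield a score from 0 to 100. A minimal sketch of this calculation (illustrative code, not the study's analysis script) follows.

```typescript
/**
 * Standard SUS scoring (Brooke, 1996): 10 items rated 1–5.
 * Odd-numbered items (1, 3, 5, 7, 9) contribute (rating − 1);
 * even-numbered items (2, 4, 6, 8, 10) contribute (5 − rating).
 * The summed contributions (0–40) are multiplied by 2.5 to give a 0–100 score.
 */
function susScore(ratings: number[]): number {
  if (ratings.length !== 10 || ratings.some((r) => r < 1 || r > 5)) {
    throw new Error("SUS requires exactly 10 ratings, each between 1 and 5");
  }
  const sum = ratings.reduce((acc, rating, i) => {
    // Items 1, 3, 5, 7, 9 sit at even array indices.
    const contribution = i % 2 === 0 ? rating - 1 : 5 - rating;
    return acc + contribution;
  }, 0);
  return sum * 2.5;
}

// Example: a strongly favorable response pattern yields a score in the range of
// the study's reported mean of 84.2.
console.log(susScore([5, 2, 5, 1, 4, 2, 5, 1, 4, 2])); // 87.5
```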

Results

We conducted five focus groups with 20 participants in total (Table 1).
Table 1. Background and demographics (n = 20). Values are number (%).
Gender
  Male: 5 (25)
  Female: 15 (75)
Age (years)
  <30: 4 (20)
  30–39: 5 (25)
  40–49: 5 (25)
  50–59: 0 (0)
  ⩾60: 6 (30)
Highest level of education completed
  Elementary school: 0 (0)
  High school: 1 (5)
  College/trade school/other: 6 (30)
  University: 13 (65)
Internet access at work or home
  Yes: 19 (95)
  No: 1 (5)
Hours spent on the Internet in an average week at home
  <3 h: 6 (30)
  4–9 h: 4 (20)
  10–15 h: 3 (15)
  >15 h: 7 (35)
Hours spent on the Internet in an average week at work
  <3 h: 10 (50)
  4–9 h: 5 (25)
  10–15 h: 1 (5)
  >15 h: 4 (20)
Use a tablet device with a touch screen at work or at home
  Yes: 9 (45)
  No: 11 (55)
Ever use of a tablet device with a touch screen
  Yes: 15 (75)
  No: 5 (25)

Quantitative results

Participants took a mean of 11.7 ± 5.9 min to complete the questionnaire (range: 5.4–26.9 min). The mean SUS score was 84.2 ± 14.7 (maximum possible score: 100). Exit questionnaire responses were highly favorable (Figure 1).
Figure 1. Participant responses to exit questionnaire. Responses were entered on a five-point Likert-type scale labeled 1 (disagree), 3 (neutral), and 5 (agree). For the purposes of this figure, scores of 1 and 2 were considered “disagree” and scores of 4 and 5 were considered “agree.” Each bar shows the proportion of the 20 participants giving each response to each statement.
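The dichotomization described in the figure caption is a simple binning step. The sketch below is illustrative only, not the study's analysis code, and assumes each response is an integer rating from 1 to 5.

```typescript
// Illustrative sketch (not the study's analysis code): collapsing five-point
// Likert-type ratings into the "disagree"/"neutral"/"agree" bins used in Figure 1.

type LikertBin = "disagree" | "neutral" | "agree";

function binRating(rating: number): LikertBin {
  if (rating <= 2) return "disagree"; // ratings 1 and 2
  if (rating === 3) return "neutral";
  return "agree";                     // ratings 4 and 5
}

// Proportion of participants falling into each bin for one statement.
function proportions(ratings: number[]): Record<LikertBin, number> {
  const counts: Record<LikertBin, number> = { disagree: 0, neutral: 0, agree: 0 };
  for (const r of ratings) counts[binRating(r)] += 1;
  return {
    disagree: counts.disagree / ratings.length,
    neutral: counts.neutral / ratings.length,
    agree: counts.agree / ratings.length,
  };
}
```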

Qualitative results

Issues identified in each focus group were grouped into subthemes under the pre-set overall usability theme. Subthemes are described below and further representative quotations are provided in Table 2. Specific issues raised in each subtheme and the corresponding modifications to the questionnaire are shown in Table 3. Figure 2 provides visual representations of some of these changes.
Table 2. Themes with representative quotations from focus groups.

Usability
(a) Touch technology
Hygiene: Some participants indicated that screen hygiene was a concern:
“… as I get older my awareness of how many germs there are is increasing … if they handed me that I might need an antibacterial wipe before I used it.” (DEV02)
Familiarity with touch screen technology: Regarding launching the keyboard, one participant asked:
“Will the patients be told hit this to get the keyboard, numbers?” (DEV05)
Similarly, concerns were raised about navigation through the questionnaire:
“And then I saw ‘next’ and I did that. But yeah you could have a little instruction on how to navigate.” (DEV05)
Ease of use of touch screen: Some participants indicated that touch screen use might be easier with a stylus, especially for users with reduced dexterity:
“I wish I had brought my stylus with me … I feel more comfortable using the stylus than my finger …” (DEV01)

(b) Questionnaire design
Visual characteristics (fonts, images): Several participants raised the importance of a large font size:
“… because I’m old, I have bad eyes.” (DEV01)
“… I think the font could have been a bit bigger or give us an option.” (DEV05)
And in some places, a need for font emphasis:
“Um, I had an asthma action plan … You might want to capitalize that …” (DEV11)
Similarly, larger images were preferred:
“Just so that there is less information there … especially because the images are so small …” (DEV15)
“Yeah that’s the one, I’m like … the Ventolin … but it’s like … small …” (DEV17)
The importance of using true-to-life colors in inhaler images was emphasized:
“Yeah that’s the one, I’m like … the ventolin is so dark. I know I recognize my medications, but … I thought it was dark, like the Advair is so dark.” (DEV17)
Navigation: Participants indicated the need to format pop-ups in a user-friendly way:
“That thing terrified me when it popped up.” (DEV15) (in reference to a name and date of birth confirmation message)
And to ensure that the questionnaire’s “Stop” button (intended to save all existing data and to end the questionnaire) was more clearly indicated and explained:
“No I didn’t [notice it] …” (DEV06)
“I saw it, but I didn’t know … when I should have used it …” (DEV08)
Similarly, participants sought to ensure the ease of pressing radio buttons:
“I think if you have to touch it with your finger, for the majority of people, it needs to be bigger.” (DEV02)
And clarity in the process of selecting medications by clicking device images:
“Well, there was nothing that showed that you have actually selected something. You just keep pressing it and you are waiting for it to do something …” (DEV07)
Participants also recommended an explicit indication that the emergency contact screen in particular could be skipped without completing (given the personal nature of the information being requested):
“But is there an option that if someone doesn’t want to provide it they can just go to the next page?” (DEV08)
One participant accidentally entered his or her full name in the first name box:
“I accidently put my first and last name into the same bar because I didn’t read what each of the bars was for …” (DEV15)
“Maybe if it had like first name and last name on the same line, as 2 bars side by side it would be less likely for that to be a problem.” (DEV15)
Table 3. Usability issues raised during focus groups and the corresponding system modifications.

Usability
(a) Touch technology
Hygiene
  Issue raised: Maintaining tablet hygiene: fingerprints noted on screen, concern about “germs” (FG1)
  System modification: Provided disinfectant wipes to clean the screen between uses
Familiarity with touch screen technology
  Issue raised: Opening the keyboard: concern that iPad-naïve users would not know how to pull up the keyboard (FG1)
  System modification: Added instructions to open the keyboard: “To show the keyboard, tap in the text boxes below”
  Issue raised: Navigating for those who have never used a tablet: concern that touch device–naïve users might not know how to navigate from screen to screen and how to scroll (FG1)
  System modification: Added basic navigation instructions to the home screen
Ease of use of touch screen
  Issue raised: Need for a stylus: concern about usability of the device for users with physical limitations (e.g. arthritis) (FG1)
  System modification: Attached a stylus to the tablet case

(b) Questionnaire design
Visual characteristics (fonts, images)
  Issue raised: Legibility: noted that font was too small (FG1)
  System modification: Increased font size from 10 to 12 points throughout the questionnaire and eliminated font changes between screens
  Issue raised: Medication image visibility: noted that medication images were too small (FG4)
  System modification: Enlarged medication images, increasing the surface area of each image by 56% (from 2 × 2 cm to 2.5 × 2.5 cm)
  Issue raised: Need for text emphasis: noted that “Electronic Asthma Action Plan” would not be noticed if not emphasized to users (FG3)
  System modification: Changed to a bold typeface for “Electronic Asthma Action Plan”
Navigation
  Issue raised: Closing the questionnaire: noted that the “Stop” button was difficult to find and to interpret (FG2)
  System modification: Enhanced the “Stop” button and added a hover message, “You can stop this questionnaire at any time by pressing the Stop button,” along with an image of the button (Figure 2(a) and (b))
  Issue raised: Pressing radio buttons: noted that radio buttons were too small and close together, causing errors (FG1)
  System modification: Enlarged radio buttons and added space between them
  Issue raised: Selecting medication images: noted lack of clarity as to whether a medication image had been successfully selected (originally users were expected to select by clicking on an image only, which then became highlighted) (FG2)
  System modification: Added a check box with each medication image and allowed users to click on either the medication image itself or on the check box to indicate a choice (the check box was checked automatically if the image was selected)
  Issue raised: Name and date of birth confirmation message appearance: noted that the confirmation message included an error message and a URL (FG4)
  System modification: Removed the error message and URL, creating a more conventional-looking pop-up window (Figure 2(c) and (d))
  Issue raised: Participants unaware that they could skip certain screens (FG2)
  System modification: Added instructions for skipping screens: “If you are unsure about a question, you can skip it and continue to the next one” (in bold typeface)
  Issue raised: Name entry error: the “last name” box sat below the “first name” box and one participant did not see it, entering both first and last names in the “first name” box (FG4)
  System modification: Placed the boxes for first and last names on the same line
Figure 2. Questionnaire design modifications: (a) original introduction screen with small stop button (top right corner), (b) improved introduction screen (with larger stop button), (c) original name and date of birth confirmation message, and (d) improved image of name and date of birth confirmation message (without error message).

Usability

Touch technology
Hygiene
Participants raised concerns about the hygiene of a device used by multiple patients, some of whom could be seeking care for an infectious illness (“the little kids with the runny noses” (DEV02)): a cough or sneeze could directly contaminate device surfaces, or pathogens could spread from the mouth or eyes to the hands and then to the device (participants could “… see the fingerprints” (DEV04)). A proposed solution was to use an antimicrobial screen wipe. Even those participants who did not share this particular concern conceded that “it’s something that others would appreciate …” (DEV07).
Familiarity with touch screen technology
Certain users expressed concerns that patients without touch screen technology familiarity could have difficulties with tasks such as launching the keyboard for text entries and advancing between screens. They recommended navigation cues for novice users: “[Add] a memo with how to bring up the keyboard, maybe with a picture for people who don’t understand” (DEV02).
Ease of use of touch screen
Ease of use of the touch screen itself was a concern, and using fingers in particular was believed to be less comfortable and efficient than a stylus:
“It looks like a pen but it has a little tip at the bottom that is soft and you just touch everything with that. It’s very efficient.” (DEV01)
Although it was acknowledged that most users would be habituated to using fingers on a touch screen, it was believed that a stylus might enable faster questionnaire completion and reduce inadvertent touches and juxtaposition errors (i.e. when two options are close together and the wrong one is accidentally chosen).
Questionnaire design
Visual characteristics (fonts, images)
Participants expressed a desire for “truer” image colors and larger image and font sizes:
“… this [medication image] could be a bit bigger.” (DEV17)
“It doesn’t need to be bold, maybe bigger so that you can understand right away.” (DEV05)
However, no further concerns were raised, even by subjects with visual impairment, after we increased the font size to 12 points:
“Well, as I said, I am diabetic and I have bad sight … And I have no problem with the fonts …” (DEV08)
Navigation
Design factors influenced the ease of navigating through the questionnaire. For example, participants in the first focus group had difficulties pressing radio buttons (an issue that did not recur in subsequent sessions once the buttons were enlarged). Choosing medication images, which were initially designed as “press-to-select,” was not intuitive to participants:
“I didn’t know what to do, because I was looking for the checkmarks boxes … I didn’t know if I had to choose the picture or the word.” (DEV07)
Similarly, a “Stop” button, which enabled users to save existing answers and terminate the questionnaire, was initially missed and required enlargement and emphasis (Figure 2(a) and (b)), while a pop-up to confirm a user’s name and date of birth was misinterpreted as an error message, therefore requiring reformatting (Figure 2(c) and (d)).

Discussion

We used an iterative process to design a highly usable touch screen tablet questionnaire for use by asthma patients in the clinic waiting room. Although clinical use of patient e-questionnaires is growing rapidly,32 there are few prior reports of systematic questionnaire development processes and limited data on patient usability preferences for touch screen questionnaires.15
Our rapid-cycle design approach enabled us to achieve a summative SUS score of 84.2. This score is well above the reported mean of 68 across web-based systems and corresponds to a subjective usability rating of “excellent” and a percentile rank of 95 (i.e. higher perceived usability than 95 percent of other systems).33,34 This was supported by favorable user ratings of design features including layout, fonts, colors, and navigation features such as drop-downs, arrows, scroll-bars, and text data entry (Figure 1). In exit questionnaires, half of our users provided a neutral response when asked whether they were able to zoom in and out of the screen. Although zooming was enabled, we believe that most users may not have even attempted to zoom, given that font size was optimized after the first focus group and 84.2 percent of all users found the font easy to read (Figure 1).
Our analysis uncovered usability findings that may have broad relevance for design and implementation of touch device questionnaires. The first main theme was “touch technology.” Here, users raised concerns that the device might act as a vector for infectious disease transmission (“hygiene”). Although transmissibility of infections through patient-facing touch devices has not been studied, mobile devices used by healthcare workers have been shown to carry nosocomial bacteria.35 Given that these organisms are comparable to those isolated from workers’ hands, it is reasonable to assume that a waiting room touch device would similarly become contaminated with pathogens on patients’ hands.36 Previous studies have shown that even magazine covers in general practice waiting rooms can carry pathogenic bacteria. Irregular surfaces near tablet speakers and connecting ports allow for fluid ingress and debris buildup, increasing the magnitude and persistence of contamination.35,36 We addressed this by providing disinfectant wipes for use before and after device manipulation (as has been recommended for clinician-facing devices).35 Previous studies have shown that most conventional wipes do effectively disinfect iPad screens without causing screen damage, and some have prolonged efficacy after a single use.37 Additional strategies would be to set a device alarm to remind staff to periodically disinfect the device (e.g. hourly), to reinforce hand hygiene maneuvers before and after device use, and to set screen saver reminders.35
It is also of interest that certain users recommended providing a stylus to simplify use (“ease of use of touch screen”). Although a previous study did not demonstrate usability constraints in patients with arthritis, others have also suggested that certain patients prefer a stylus for touch screen questionnaires, and this option should be made available to any user.15,38
Participants also indicated that functions such as accessing the device keyboard and swiping to advance the screen might not be intuitive to all users (“familiarity with touch screen technology”). Accordingly, we added simple instructions regarding these features on the home screen. Previous studies have demonstrated user difficulties with small keyboards on personal digital assistants. However, modern touch screen keyboards have been shown to be well accepted and easy to use across patient types, and we did not identify any usability concerns with the keyboard itself.39
The next main area was “questionnaire design.” In relation to “visual characteristics,” we originally used a 10-point font size and relatively small medication images in order to eliminate the need for scrolling, which may prolong completion time and lead to missed content.24 However, feedback indicated a need to increase font and image sizes. Font sizes of 11 or 12 points have previously been recommended to maximize legibility (particularly for older users).40,41 Given that legibility is a primary usability consideration, we compromised by increasing the image sizes and the font size (to 12 points), splitting some pages in two, and occasionally requiring a small amount of scrolling.
Participants completed our questionnaire in a mean of 11.7 ± 5.9 min. The large observed standard deviation was attributable not only to widely ranging tablet device familiarity but also to the clinical diversity of participants’ asthma, which introduced variability in the length and complexity of the questionnaire. We believe that a vast majority of patients would be able to complete this questionnaire during a typical medical waiting room interval. Primary care waiting room time averages 28.4 ± 18.9 min in the United States, with 62 percent of patients waiting at least 15 min and 85 percent at least 9.5 min.16,42 Given that a typical primary care physician visit lasts around 10–15 min, utilizing this waiting room time effectively doubles the available time for information collection.43 Furthermore, it has been shown that perception of waiting room time is inversely correlated with patient satisfaction, and that being occupied during the wait markedly increases satisfaction independent of duration.43
Integrating waiting room touch screen questionnaires into the process of care can also improve quality of care. First, they enable use of standardized and validated disease-specific questionnaires and minimize missing data.44 This is particularly important because physicians often forget or lack knowledge regarding appropriate disease-specific questions. Second, repeated use of such validated tools enables objective monitoring of disease progression and/or treatment effectiveness over time. Third, questionnaires may increase data accuracy, particularly in sensitive areas where patients may be reluctant to answer honestly in face-to-face interviews, such as substance abuse or medication adherence.44,45 Fourth, electronic questionnaires reduce clinician time required for data collection, particularly when tablet data are directly integrated into the electronic medical record.43 Data may also be processed through a clinical decision support system, enabling real-time decision support,13 as occurs in our tool.21 Questionnaires also present an opportunity to deliver education and healthcare advice to patients and to enable patients to set their goals prior to seeing the doctor, which increases patient satisfaction.43,46,47 Finally, questionnaires can be used to coach patients to deliver physician prompts—a quality improvement strategy called patient-mediated knowledge translation.43,48 It is also of note that, although paper questionnaires can offer some of these advantages, electronic questionnaires have been found to be more efficient, potentially less expensive over the long term, less prone to missing data and erroneous data entry, and perceived more favorably by patients across age groups.1,13,15,38,49–51
The overall implication of our study is that by addressing patient preferences, it is feasible to design a highly usable touch tablet questionnaire for collection of data in the clinical waiting room. Specifically, developers should consider providing antimicrobial wipes and a stylus; providing basic written instructions on how to type into and navigate through the questionnaire; using bolding for emphasis and large image and font sizes (at least 11 or 12 points); and using features such as radio buttons, check boxes, and explanatory information in a hover dialogue for unintuitive terms and icons.
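As a concrete illustration of the last recommendation, explanatory information for an unintuitive control (such as the questionnaire’s “Stop” button) can be surfaced on touch rather than on mouse hover. The sketch below is an assumed example rather than the study’s implementation; the attachExplanation helper and the element selector are hypothetical.

```typescript
// Illustrative sketch (assumed, not the study's implementation) of an
// explanatory message for an unintuitive control, shown while the control is
// touched rather than on mouse hover.

function attachExplanation(button: HTMLButtonElement, explanation: string): void {
  const tip = document.createElement("div");
  tip.textContent = explanation;
  tip.style.display = "none";
  tip.style.fontSize = "12pt";
  button.insertAdjacentElement("afterend", tip);

  // Show the explanation while the button is being touched; hide it on release.
  button.addEventListener("touchstart", () => { tip.style.display = "block"; });
  button.addEventListener("touchend", () => { tip.style.display = "none"; });
}

// Hypothetical usage with an assumed element id:
// attachExplanation(
//   document.querySelector<HTMLButtonElement>("#stop-button")!,
//   "You can stop this questionnaire at any time by pressing the Stop button."
// );
```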
Our study has several limitations. We had a small sample size (20 participants). However, studies have demonstrated that up to 80 percent of usability issues can be identified with 5–8 participants, and we had pre-set stopping criteria, which were met after five focus group sessions.27,52 We also recruited participants of varied ages, levels of education, and tablet device familiarity, suggesting that our findings may be broadly applicable. Similarly, the Apple iPad is among the tablet devices most widely used by healthcare professionals and represents about 70 percent of the nearly 100 million tablets in use worldwide.35,53 Furthermore, our usability findings relate to features that are common to tablets and smartphones and may therefore also be applicable to the development of smartphone-based questionnaires. However, the small size of individual focus groups did not allow us to analyze and compare quantitative data (SUS scores, Likert-type scale questions) iteratively, and these data are thus presented only in summative form.

Conclusion

The use of patient-facing electronic questionnaires has many potential advantages and is gaining popularity across diseases and settings, particularly as the importance of patient-reported outcomes has grown.54 We successfully designed a highly usable touch tablet–based patient questionnaire. Given the importance of interface and application design in determining system uptake across a wide range of user types and in complex settings, we believe that our usability-related findings can be applied to optimize the future design and implementation of touch screen questionnaires across diseases. Our successful rapid-cycle design process can also be adopted by others. Future studies should measure factors that influence the real-world uptake of such tools and use these to develop and test strategies to optimize clinical workflow integration in various settings.

Acknowledgments

We would like to acknowledge Susan Hall and Lucy Frankel for their contributions to this work.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Funding was provided by the Canadian Institutes of Health Research (CIHR; 234365 and 322013), the Ontario Lung Association, and the Keenan Research Summer Student Program. Dr. S.G. was supported by The Michael Locke Term Chair in Knowledge Translation and Rare Lung Disease Research. Dr. M.K. was funded by a CIHR New Investigator Award. Dr. S.S. was supported by a Tier 1 Canada Research Chair in Knowledge Translation and the Mary Trimmer Chair in Geriatric Medicine.

References

1. Abernethy AP, Herndon JE 2nd, Wheeler JL, et al. Improving health care efficiency and quality using tablet personal computers to collect research-quality, patient-reported data. Health Serv Res 2008; 43(6): 1975–1991.
2. VanDenKerkhof EG, Goldstein DH, Blaine WC, et al. A comparison of paper with electronic patient-completed questionnaires in a preoperative clinic. Anesth Analg 2005; 101(4): 1075–1080.
3. Freynhagen R, Baron R, Tolle T, et al. Screening of neuropathic pain components in patients with chronic back pain associated with nerve root compression: a prospective observational pilot study (MIPORT). Curr Med Res Opin 2006; 22(3): 529–537.
4. Kastner M, Li J, Lottridge D, et al. Development of a prototype clinical decision support tool for osteoporosis disease management: a qualitative study of focus groups. BMC Med Inform Decis Mak 2010; 10: 40.
5. Motulsky A, Wong J, Cordeau JP, et al. Using mobile devices for inpatient rounding and handoffs: an innovative application developed and rapidly adopted by clinicians in a pediatric hospital. J Am Med Inform Assoc 2017; 24(e1): e69–e78.
6. O’Leary KJ, Lohman ME, Culver E, et al. The effect of tablet computers with a mobile patient portal application on hospitalized patients’ knowledge and activation. J Am Med Inform Assoc 2016; 23(1): 159–165.
7. Prgomet M, Georgiou A, Westbrook JI. The impact of mobile handheld technology on hospital physicians’ work practices and patient care: a systematic review. J Am Med Inform Assoc 2009; 16(6): 792–801.
8. Richter JG, Becker A, Koch T, et al. Self-assessments of patients via Tablet PC in routine patient care: comparison with standardised paper questionnaires. Ann Rheum Dis 2008; 67(12): 1739–1741.
9. Salaffi F, Gasparini S, Ciapetti A, et al. Usability of an innovative and interactive electronic system for collection of patient-reported data in axial spondyloarthritis: comparison with the traditional paper-administered format. Rheumatology 2013; 52(11): 2062–2070.
10. Kim H, Park HC, Yoon SM, et al. Evaluation of quality of life using a tablet PC-based survey in cancer patients treated with radiotherapy: a multi-institutional prospective randomized crossover comparison of paper and tablet PC-based questionnaires (KROG 12–01). Support Care Cancer 2016; 24(10): 4399–4406.
11. Suzuki E, Mackenzie L, Sanson-Fisher R, et al. Acceptability of a touch screen tablet psychosocial survey administered to radiation therapy patients in Japan. Int J Behav Med 2016; 23(4): 485–491.
12. Xiao Y. Artifacts and collaborative work in healthcare: methodological, theoretical, and technological implications of the tangible. J Biomed Inform 2005; 38(1): 26–33.
13. Fritz F, Balhorn S, Riek M, et al. Qualitative and quantitative evaluation of EHR-integrated mobile patient questionnaires regarding usability and cost-efficiency. Int J Med Inform 2012; 81(5): 303–313.
14. Kastner M, Lottridge D, Marquez C, et al. Usability evaluation of a clinical decision support tool for osteoporosis disease management. Implement Sci 2010; 5: 96.
15. Khurana L, Durand EM, Gary ST, et al. Subjects with osteoarthritis can easily use a handheld touch screen electronic device to report medication use: qualitative results from a usability study. Patient Prefer Adherence 2016; 10: 2171–2179.
16. Michael M, Schaffer SD, Egan PL, et al. Improving wait times and patient satisfaction in primary care. J Healthc Qual 2013; 35(2): 50–59; quiz 59–60.
17. Kitzinger J. Qualitative research. Introducing focus groups. BMJ 1995; 311: 299–302.
18. Johnson KGD, Ewigman B, Ewigman B, et al. Using rapid-cycle research to reach goals: awareness, assessment, adaptation, acceleration. AHRQ Publication No. 15-0036, June 2015. Rockville, MD: AHRQ Publication.
19. Brooke J. SUS—a quick and dirty usability scale. In: Jordan PW, Thomas B, Weerdmeester BA, et al. (eds) Usability evaluation in industry. London: Taylor and Francis, 1996, pp. 189–194.
20. Kuzel A. Sampling in qualitative inquiry. In: Crabtree B, Miller W (eds) Doing qualitative research. Newbury Park, CA: SAGE, 1992, pp. 31–44.
21. Kouri A, Boulet LP, Kaplan A, et al. An evidence-based, point-of-care tool to guide completion of asthma action plans in practice. Eur Respir J 2017; 49(5): 1602238.
22. Lougheed MD, Lemiere C, Dell SD, et al. Canadian Thoracic Society Asthma Management Continuum – 2010 Consensus Summary for children six years of age and over, and adults. Can Respir J 2010; 17(1): 15–24.
23. Silvey GM, Macri JM, Lee PP, et al. Direct comparison of a tablet computer and a personal digital assistant for point-of-care documentation in eye care. AMIA Annu Symp Proc 2005; 2005: 689–693.
24. Palmblad M, Tiplady B. Electronic diaries and questionnaires: designing user interfaces that are easy for all patients to use. Qual Life Res 2004; 13(7): 1199–1207.
25. Robinson TJ, DuVall S, Wiggins R III. Creation and usability testing of a web-based pre-scanning radiology patient safety and history questionnaire set. J Digit Imaging 2009; 22(6): 641–647.
26. Bhatia SK, Samal A, Rajan N, et al. Effect of font size, italics, and colour count on web usability. Int J Comput Vis Robot 2011; 2(2): 156–179.
27. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform 2004; 37(1): 56–76.
28. Gupta S, Lam Shin Cheung V, Kastner M, et al. Patient preferences for a touch screen tablet-based asthma questionnaire. J Asthma 2019; 56(7): 771–781.
29. Albert W, Tullis T. Measuring the user experience: collecting, analyzing, and presenting usability metrics. Burlington, MA: Morgan Kaufmann, 2013.
30. Elo S, Kyngas H. The qualitative content analysis process. J Adv Nurs 2008; 62(1): 107–115.
31. Richards L, Morse JM. Readme first for a user’s guide to qualitative methods. Thousand Oaks, CA: SAGE, 2012.
32. Bleustein C, Rothschild DB, Valen A, et al. Wait times, patient satisfaction scores, and the perception of care. Am J Manag Care 2014; 20(5): 393–400.
33. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud 2009; 4: 114–123.
34. Sauro J. Measuring usability with the System Usability Scale (SUS), 2011, https://www.userfocus.co.uk/articles/measuring-usability-with-the-SUS.html
35. Manning ML, Davis J, Sparnon E, et al. iPads, droids, and bugs: infection prevention for mobile handheld devices at the point of care. Am J Infect Control 2013; 41(11): 1073–1076.
36. Ulger F, Esen S, Dilek A, et al. Are we aware how contaminated our mobile phones with nosocomial pathogens? Ann Clin Microbiol Antimicrob 2009; 8: 7.
37. Howell V, Thoppil A, Mariyaselvam M, et al. Disinfecting the iPad: evaluating effective methods. J Hosp Infect 2014; 87(2): 77–83.
38. Abernethy AP, Herndon JE 2nd, Wheeler JL, et al. Feasibility and acceptability to patients of a longitudinal system for evaluating cancer-related symptoms and quality of life: pilot study of an e/Tablet data-collection system in academic oncology. J Pain Symptom Manage 2009; 37(6): 1027–1038.
39. Dexheimer JW, Borycki EM. Use of mobile devices in the emergency department: a scoping review. Health Informatics J 2015; 21(4): 306–315.
40. Edwards P. Questionnaires in clinical trials: guidelines for optimal design and administration. Trials 2010; 11: 2.
41. Mayberry JF. The design and application of effective written instructional material: a review of published work. Postgrad Med J 2007; 83(983): 596–598.
42. Anderson RT, Camacho FT, Balkrishnan R. Willing to wait?: The influence of patient wait time on satisfaction with primary care. BMC Health Serv Res 2007; 7: 31.
43. Sherwin HN, McKeown M, Evans MF, et al. The waiting room “wait”: from annoyance to opportunity. Can Fam Physician 2013; 59(5): 479–481.
44. Schick-Makaroff K, Molzahn A. Strategies to use tablet computers for collection of electronic patient-reported outcomes. Health Qual Life Outcomes 2015; 13: 2.
45. Smith PH, Homish GG, Barrick C, et al. Using touch-screen technology to assess smoking in a low-income primary care clinic: a pilot study. Subst Use Misuse 2011; 46(14): 1750–1754.
46. Kinnersley P, Edwards A, Hood K, et al. Interventions before consultations for helping patients address their information needs. Cochrane Database Syst Rev 2007; 3: CD004565.
47. Yoong SL, Carey ML, Sanson-Fisher RW, et al. Touch screen computer health assessment in Australian general practice patients: a cross-sectional study protocol. BMJ Open 2012; 2(4): 1–7.
48. Gagliardi AR, Legare F, Brouwers MC, et al. Patient-mediated knowledge translation (PKT) interventions for clinical encounters: a systematic review. Implement Sci 2016; 11: 26.
49. Matthew AG, Currie KL, Irvine J, et al. Serial personal digital assistant data capture of health-related quality of life: a randomized controlled trial in a prostate cancer clinic. Health Qual Life Outcomes 2007; 5: 38.
50. Rogausch A, Sigle J, Seibert A, et al. Feasibility and acceptance of electronic quality of life assessment in general practice: an implementation study. Health Qual Life Outcomes 2009; 7: 51.
51. Payne M, Janzen S, Earl E, et al. Feasibility testing of smart tablet questionnaires compared to paper questionnaires in an amputee rehabilitation clinic. Prosthet Orthot Int 2016; 41(4): 420–425.
52. Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. In: Proceedings of the INTERACT’93 and CHI’93 conference on human factors in computing systems, Amsterdam, 24–29 April 1993, pp. 206–213. New York: ACM.
53. Sclafani J, Tirrell TF, Franko OI. Mobile tablet use among academic physicians and trainees. J Med Syst 2013; 37(1): 9903.
54. Bottomley A, Jones D, Claassens L. Patient-reported outcomes: assessment and current perspectives of the guidelines of the Food and Drug Administration and the reflection paper of the European Medicines Agency. Eur J Cancer 2009; 45(3): 347–353.

Published In

Article first published online: January 23, 2019
Issue published: March 2020

Keywords

  1. asthma
  2. eHealth
  3. human factors engineering
  4. quality improvement
  5. surveys and questionnaires
  6. touch devices
  7. usability

Rights and permissions

© The Author(s) 2019.
PubMed: 30672358

Authors

Affiliations

Victor Lam Shin Cheung
Monika Kastner
North York General Hospital, Canada
Sharon Straus
University of Toronto, Canada; St. Michael’s Hospital, Canada
Alan Kaplan
University of Toronto, Canada; Family Physician Airways Group of Canada, Canada
Louis-Philippe Boulet
Université Laval, Canada
Samir Gupta
University of Toronto, Canada; St. Michael’s Hospital, Canada

Notes

Samir Gupta, Keenan Research Centre, Li Ka Shing Knowledge Institute, St. Michael’s Hospital, Suite 6042, Bond Wing, 30 Bond Street, Toronto, ON M5B 1W8, Canada. Email: [email protected]
