
Design and Evaluation of an Accessible Academic Course Search Portal
  • Young Mi Choi : School of Industrial Design, Georgia Institute of Technology, Atlanta, USA
  • Omid Elliyoun Sardroud : School of Industrial Design, Georgia Institute of Technology, Atlanta, USA

Background This paper presents the design and development of an accessible academic course search portal. In particular, accessibility features related to the capabilities of the device used to access the portal (desktop vs. mobile) as well as the capabilities of the end users (such as disabled users) are considered. The continued growth of the population of students with disabilities and the increasing use of mobile devices make both of these important considerations in providing equitable access to critical services for a diverse student population.

Methods An alternative academic course search interface was designed that incorporates a mobile-first design approach combined with current universal design principles and best practices. Participants with varying levels of ability/disability evaluated both portals in terms of effectiveness, efficiency, subjective satisfaction and learnability using the System Usability Scale. Task times, efficiency and errors were also recorded.

Results Within-subjects paired t-tests comparing the current portal with the new design showed significant improvements in task performance time and accuracy. Overall usability scores for the new portal design were also significantly better.

Conclusions The results indicate that the application of these principles can lead to significant improvements in overall usability, effectiveness and task efficiency within a single interface that is designed to be appropriate for both mobile and desktop access.

Universal Design, Usability, User Interface Design, Higher Education, Web Accessibility.
pISSN: 1226-8046
eISSN: 2288-2987
Publisher: Korean Society of Design Science
Received: 13 May, 2016
Revised: 11 Aug, 2016
Accepted: 16 Sep, 2016
Printed: Nov, 2016
Volume: 29 Issue: 4
Page: 21 ~ 37
DOI: https://doi.org/10.15187/adr.2016.
Corresponding Author: Young Mi Choi (christina.choi@gatech.edu)

Citation : Choi, Y., & Sardroud, O. (2016). Design and Evaluation of an Accessible Academic Course Search Portal. Archives of Design Research, 29 (4), 21-37.

Copyright : This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted educational and non-commercial use, provided the original work is properly cited.

1. Background

The motivation for this project was born from the Vertically Integrated Project program, which brings together faculty, graduate and undergraduate students in multidisciplinary teams to tackle large-scale design/discovery projects. The team was given a broad task to survey the various educational resources utilized by the student body and to identify specific opportunities where usability and accessibility of these tools might be improved.

Through interviews with students, the course registration system was consistently cited as one with which many students encountered one problem or another. For many students, this system is mainly used only a couple of times per year, during course registration. But even though use is relatively infrequent, there are no alternatives, and so it is used by everyone. General issues mentioned at this stage were difficulties comparing various scheduling options and difficulties using the system via mobile devices, along with accessibility challenges such as reading fonts and use with screen readers.

The aim of this project was to design, build and assess the usability and accessibility of a new class registration portal. The portal in its current form has been in use for ten years. While the main task (registering for classes) has remained the same, the way that students may go about this has evolved. Viewing or registering for classes via a browser running on a mobile phone or tablet would not have been possible when the current system was put in place. Though most students may still utilize a desktop environment for registration, it was important to find enhancements to meet the changing methods by which the system is accessed.

To this end, a mobile-first approach was chosen for all design work. A page designed using a mobile-first approach takes a minimalist view and implements the necessary features/functions of a site from the very beginning in a usable way, to ensure that they work and can be viewed normally on a mobile device (Wroblewski, 2012). This is in contrast to an approach known as Responsive Web Design, in which a complex, function-heavy site might be progressively degraded based on the screen size and functions supported by the device (Firtman, 2010).

There are accessibility challenges related to the capabilities of the types of devices utilized by a user to access a page. The raw computing power of mobile devices is ever increasing. For example, in some tests the performance of an Apple iPhone 6s, which was introduced in 2015, matches the performance of some Apple MacBook laptops from the same year (Napier, 2015). However, it is not hard to find examples of people who would much prefer to view web pages through a laptop/desktop (Patel, 2015). Though mobile device performance may be similar, the experience of viewing pages through mobile browsers/apps is that pages can be ugly, less usable and/or have broken features. This may be due to things like smaller screen size or technical features (like JavaScript) that a mobile browser simply does not know how to handle. Similarly, it is not hard to find examples of the desktop browser experience being negatively affected by design aimed primarily at mobile (Archer, 2015).

This can result in a desktop experience where pages are missing information, where navigation and search elements are hidden, or where images are excessively large (Budiu, 2015). One often-employed solution to this issue is developing two versions of a site: a basic one that is presented when visited by a mobile device, and another "full site" that is presented when visited by a non-mobile device (Nielsen, 2012). Of course, the drawback to this approach is that two sites must be maintained.

There are also accessibility challenges related to the capabilities of the users themselves. The number of disabled students entering postsecondary education has been increasing (Case & Davidson, 2011; Ofiesh, Rice, Long, Merchant, & Gajar, 2002; Providenti, 2004; Raskind & Higgins, 1998). In 2008, disabled students represented nearly 11 percent of the entire postsecondary students in the US (Raue & Lewis, 2011). Academic tools that must be used by students are increasingly being placed online (Case & Davidson, 2011; Fichten, Asuncion, Barile, Ferraro, & Wolforth, 2009), and the growing number of students with disabilities makes the accessibility of these systems an increasing concern (Bradbard, Peters, & Caneva, 2010).

There has been an increasing interest among researchers in accessibility and universal usability aimed at end users with disabilities (Gubbels & Kemppainen, 2002). As a result, numerous guidelines for accessibility have been put into place (Katsiyannis, Zhang, Landmark, & Reber, 2009). The 1990 Americans with Disabilities Act was one of the pieces of legislation enacted to prohibit discrimination based on disability (Morin, 1990). This updated the Rehabilitation Act of 1973, which prohibited discrimination based on disability by any program or activity that receives federal funding (De Fabrique, 2011, p. 54), and included guidelines on web site accessibility (McLawhorn, 2001). In addition to legislation, the World Wide Web Consortium offers the most detailed guidelines and recommendations for accessible web sites (Consortium, 2008).

Despite the efforts of legislators and standards bodies, web accessibility practices have not significantly improved (Hollins, 2012; Lazar et al., 2012). University web sites have been shown to suffer from severe accessibility issues (Kane, Shulman, Shockley, & Ladner, 2007). Researchers analyzing the underlying issues have pointed to factors such as lack of training, limited resources, and lack of interest among practitioners (Horton & Quesenbery, 2014; Raue & Lewis, 2011).

Principles outlined by universal design have been suggested as a potential solution to these web accessibility problems (Ye, 2014). Universal design (UD) is “the design of products and environments to be usable by all people, to the greatest extent possible, without the need for adaptation or specialized design” (Connell et al., 1997). The definition is closely related to the broader definition of accessibility. UD establishes a set of principles (Anders & Fechtner, 1993), which can be applied in the design of physical environments, products and services in order to make them usable by people with a variety of characteristics including age, gender, language, and levels of ability to hear, see, move, and speak.

These principles include: Equitable Use, Flexibility in Use, Simple and Intuitive Use, Perceptible Information, Tolerance for Error, Low Physical Effort, and Size and Space for Approach and Use. Successful implementation of these universal design principles has been shown to result in more usable and accessible interfaces for all users regardless of their individual characteristics (Stephanidis & Savidis, 2001). While conforming to accessibility guidelines such as WCAG 2.0 would ensure accessibility, UD principles can help designers ensure equal access and usability (Burgstahler, 2002).

For this project, accessibility is defined as the extent to which products, systems, services, environments and facilities are able to be used by a population with the widest range of characteristics and capabilities (e.g. physical, cognitive, financial, social and cultural, etc.), to achieve a specified goal in a specified context (Persson et al., 2015). This definition covers the cases related to the differing capabilities of devices that may be used to visit an online educational resource as well as the actual capabilities of the end user.

The main question for the design team became: can existing design guidelines and best practices be applied to redesign the registration portal so that a single interface can be used more easily on mobile devices while simultaneously improving accessibility and usability when accessed from a desktop environment?

1. 1. Assessing the Current System

Students at Georgia Institute of Technology use a central online registration portal for searching and registering for classes. Since it is used by all students, an important goal is that it have a high level of usability and that it ideally implement best practices for design to help improve accessibility. The first step in the project was an evaluation of the current registration portal. To do this, a series of semi-structured interviews was conducted to understand in detail how students utilize the registration system. Five current students, ranging from a first-year undergraduate to a postgraduate research scientist, participated. The goal of the interviews was to identify the most common usability issues and frustrations encountered by students. The students selected were purposively sampled to be as broadly representative as possible. This sampling method was used due to time and resource constraints.

The interviews were recorded and transcribed. The interview transcriptions were used to create affinity diagrams which identified users’ goals and assumptions as they navigated through the system. The user experience flow of the system (to identify each step and decision point of the registration process) was mapped out. The affinity diagram and system flow were combined to identify points in the current registration system which might violate universal design principles. A number of universal design principles (Connell et al., 1997) were used as guidelines to identify opportunities for improvement and to drive design decisions.

UD principle violated: low physical effort. Upon logging in to the system, users are presented with a drop down menu which asks them to select a term to proceed. By default this value is set to none (Figure 1 A ). Students are generally interested in looking up the course offerings for the current term, and it is rarely the case that they would be looking for past offerings.

Figure 1 Select term drop down menu (A), and the search box (B)

An obvious improvement to reduce steps/effort would be to set the default to the current term. A better solution would be to remove this step altogether, while making the previous course offerings available through a secondary menu.

UD principle violated: Simple and Intuitive Use. Instead of yielding results related to the course content, the search box at the top of the page seems to return results related to the sitemap and different sections of the system (Figure 1 B).

The second page after selecting the term presents a list of 72 subjects with two call to action buttons labeled course search and advanced search. In order to proceed with the “course search” button, users must select at least one subject from the list. However, selecting a subject is not required for the “advanced search” option. In fact, if users select a subject before proceeding to the advanced search section, they will need to select the subject again, since the previous list only applies to “course search” (Figure 2). The interface does not provide any means of communicating the differences between these two call to action buttons to the users, and it does not conform to their mental models.

Figure 2 Users are presented with two options: Course Search and Advanced Search

In the advanced search page, users can narrow down their search. One problem on this page that could go against users’ expectations is the field for selecting the hours (Figure 3). While the system asks for the “start time” and “end time”, the results are actually a range between these two values.

Figure 3 Selecting the hours in the advanced search form

Another violation of this principle was found in the “new search” button at the bottom of the search results page. Instead of taking the users to the advanced search form, the new search button takes them back to the very first page, where they need to select the term once again.

UD principle violated: Perceptible Information. While the “course search” page presents only two pieces of information for each course (Figure 4), namely their titles and their given numbers, the result of “advanced search” is a table with 20 columns, each with a unique piece of information, some of which are not of interest to the majority of users (Figure 5). If the search query returns only 10 courses, students need to parse a table with 200 data cells. There is a lack of information in the course search page, and there is information overload in the advanced search section. Thus, the solution might lie in finding a balance between these two sections.

Figure 4 The course search page presents only two pieces of information

Figure 5 Twenty columns of information can be overwhelming for the users to comprehend

In addition, each search results page starts with a guide, which attempts to explain the results table with an image in which some of the information in the table is highlighted in red (Figure 6). While this might have been an attempt to alleviate the confusion and enhance the perceivability of the results, it acts as an obstacle between the users and the search results, and it forces the users to scroll down a page to see the information they are looking for.

Figure 6 The help image takes a considerable amount of screen real estate

The lack of borders in the table showing the advanced search results and the tight space between table cells makes it difficult to distinguish the information in two neighboring cells. This hinders the readability of the presented information.

The table headers are only present at the very top of the table, and the only way to figure out what each data cell represents is to either scroll up to the very top of the page, or to remember what each column stands for. Users would forget which data cell they were looking at, if they had to scroll up and down, and it would demand a high cognitive load to remember all the column headers. Moreover, the use of acronyms and abbreviations for the mentioned headers (e.g. WL Cap for waitlist capacity) presents another hurdle for perceiving the information. Users have to hover over each header and wait for the tooltip to see the unabbreviated version of the corresponding term.

UD principle violated: flexibility in use. The system does not provide any means of enabling the user to control the interface, even though there are several opportunities for doing so. For example, the number of columns in the advanced search section could be an opportunity to give users the ability to choose what information they want to see.

An alternative course search website was designed to address the identified problems and make the system equally usable by all students regardless of their different characteristics. The objective was to improve the usability of the system for all users.

1. 2. Site Redesign

The principle of equitable use was employed first. Instead of segregating a certain group and designing special tools for them, one objective of this design is to provide a single online tool that could be equally usable by people with or without disabilities. Regardless of ability, all users should be equally efficient when using it. To achieve this, the search and navigation functionalities were re-thought. The course search and registration functions were separated from lesser-used functions and moved to their own page. This decision greatly simplifies the navigation, making it easier to understand for typical users and easier to support assistive devices (Figure 7).

Figure 7 The suggested design is not nested inside another service, and is a stand-alone service

A new registration interface should be flexible to use and be able to work effectively on a wide range of devices such as smart phones and tablet computers. In the new interface, the select subject button is kept visible, even if a mobile view is zoomed into the results, so that it is usable without the need of horizontal scrolling (Figure 8).

Figure 8 The new design works across devices without sacrificing usability

The interface adds controls to allow users to easily adjust aspects of the interface. It includes controls for font size and page contrast (Figure 9 A).

Figure 9 Giving users control over appearance and presented information

Settings are also provided to allow customization of what type of information is presented in the search results. For example, location of the course, which is initially hidden, can be added to the presented information if users are looking for that information (Figure 9 B).

In addition, the subject search drop-down menu provides two ways of selecting a course: scrolling and clicking on the subject, or typing part of its name in the search box, making it flexible for both keyboard and mouse users (Figure 10).

Figure 10 Multi-modal and searchable dropdown menu

The new design attempts to make the interface more simple and intuitive to use by removing the advanced search present on the original page. Instead of clicking on the advanced search button and being directed to a new page, users are able to refine their results within the same page, without losing the context of what they were looking for, by simply typing filter text (Figure 11).

Figure 11 Search filters provide an easier way of refining results

Default settings for font size and contrast (Darroch, Goodman, Brewster, & Gray, 2005) help to ensure perceptibility of information. Redundant visual cues such as color and icons are used where appropriate.

The button for selecting the subject, which is usually the first step and the most important, is in a different color, and an icon helps to distinguish it from the rest of the layout. The course titles, which are links to course descriptions, are in the internationally recognizable blue link color (Figure 7).

Error tolerance is improved by prompting users with helpful and actionable information in case a search does not yield any results. The number of results found will be presented to them at the top of the page. There are also hidden signifiers for screen-readers that will inform the users when they have reached the end of the results lists. The cost of potential errors is minimized by performing all actions on the same page.

Redundant steps and unnecessary form fields are removed to lower the amount of effort required to navigate the system. Users will not have to select the term, since the current term is initially selected by default. Actions, like selecting the subject, and search results are presented in the same page, which removes the need of navigating away from the page for a change of subject.

In the current system, users have to deal with a significant amount of horizontal scanning in order to find the relevant information (Figure 12).

Figure 12 The current system requires a significant amount of horizontal eye movement

Horizontal eye movement lowers efficiency, especially for the visually impaired (Goldberg & Kotval, 1999). In the new design the horizontal span of the information has been restricted in order to minimize the need for horizontal eye movements. The tabular display of information has also been changed to remove the need for going back to the table headers for each piece of information. This makes it easier for users to find the information they are looking for, and it also provides context for screen-readers. (Figure 13)

Figure 13 The suggested design minimized horizontal scanning, and exploits the ease of vertical eye movement
2. Methods

An evaluation of the new interface was conducted to compare the new design with the existing system. The objectives were:

  • 1. to measure the speed and accuracy of registration tasks performed using the new design
  • 2. to verify that the new design does not reduce usability for users with disabilities
  • 3. to measure the usability and subjective satisfaction for both systems

Sixteen participants between the ages of 21 and 48 were recruited (7 male, 9 female) to evaluate each of the interfaces. Fourteen participants were undergraduate or graduate students who were familiar with the existing system, while two were not students. Three of the participants had a low vision impairment, defined here as visual acuity of 20/70 or lower but not worse than 20/200 (the standard threshold for legal blindness). The visually impaired participants indicated that text enlargement is the primary assistive technology they use when interacting with websites. The text-enlargement feature built into the web browser (Internet Explorer and Google Chrome) was used while interacting with both systems to perform tasks during the evaluations.

Participants were given three tasks to complete within each of the interfaces. The tasks were chosen to represent common use case scenarios that students encounter during the registration process. They were also chosen to ensure that the various interaction updates to the interface would be utilized. The tasks were:

  • Task 1 - Find the list of courses for the computer science subject for the spring 2016 semester.
  • Task 2 - Find the list of architecture courses with two credits for the spring 2016 semester.
  • Task 3 - Find out who is the instructor for the design methods course in industrial design department in the spring 2016 semester.

Task one is the simplest and one of the most common ones students perform during registration and course search. In addition to the reasons mentioned above, another reason for selecting such a simple task was to ensure that the new design does not add complexity to an already simple task. Task two involves more steps than either of the other two tasks, in which users have to filter down the results by both the subject and credits. Task three was designed as an open-ended task that could be achieved in different ways and includes scanning the page for specific information.

Time to completion and number of errors for each task were measured. Completion time for each task was measured in seconds with a digital stopwatch. Participants were asked to indicate the start and end of the task verbally. The measurement started when the participant started interacting with the system (first mouse or trackpad movement); the timing of the task was stopped when participants indicated the end of the task or when the researcher observed that the task was complete, whichever came first. Accuracy was measured by dividing the number of correct trials by the total number of trials for each task, thus yielding a percentage. Each time a user finished the task without encountering any errors was counted as a correct trial. Any action that required the participant to redo one or several steps to achieve the goals of the task was counted as an error.
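The accuracy measure described above is simple enough to express directly. The following is an illustrative sketch only; the trial records shown are hypothetical and the study's raw data are not reproduced here:

```python
def accuracy(trials):
    """Accuracy as defined above: error-free (correct) trials divided by
    the total number of trials, expressed as a percentage."""
    correct = sum(1 for t in trials if t["errors"] == 0)
    return 100.0 * correct / len(trials)

# Hypothetical records for one task: one entry per trial, recording how
# many redo-step errors were observed during that trial.
task_trials = [{"errors": 0}, {"errors": 1}, {"errors": 0}, {"errors": 0}]
print(accuracy(task_trials))  # 3 of 4 trials error-free -> 75.0
```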

Immediately after completing the tasks within one of the systems, participants completed the System Usability Scale (SUS) in order to provide a measurement of usability (Bangor, Kortum, & Miller, 2008).
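For reference, SUS yields a 0-100 score from ten 5-point items using a standard formula: odd-numbered items contribute their rating minus 1, even-numbered items contribute 5 minus their rating, and the total is multiplied by 2.5. A sketch of that standard scoring, independent of this study's data:

```python
def sus_score(responses):
    """Standard SUS scoring for ten items rated 1 (strongly disagree)
    to 5 (strongly agree). Returns a score on a 0-100 scale."""
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for item, rating in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (rating - 1) if item % 2 == 1 else (5 - rating)
    return total * 2.5

print(sus_score([3] * 10))  # neutral answers throughout -> 50.0
```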

Before beginning an evaluation, participants were asked to interact with a web page (example.com) in order to set their preferred touchpad and scrolling settings. After the settings were adjusted to individual preferences, the pages for both interfaces were opened in separate web browser tabs.

The order in which each system was evaluated was determined by the assigned participant ID number. Participants with even numbers started with the current system, while the ones with odd numbers evaluated the new design first. The order in which the tasks were performed within each interface was randomly assigned.
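The counterbalancing scheme above can be sketched as follows (the function names are illustrative and not part of the study materials):

```python
import random

def system_order(participant_id):
    """Even participant IDs start with the current system; odd IDs start
    with the new design, per the counterbalancing scheme above."""
    return ("current", "new") if participant_id % 2 == 0 else ("new", "current")

def task_order(rng):
    """Randomize the order of the three tasks within an interface."""
    tasks = ["task1", "task2", "task3"]
    rng.shuffle(tasks)
    return tasks

print(system_order(6))  # ('current', 'new')
print(system_order(7))  # ('new', 'current')
print(task_order(random.Random(42)))
```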

3. Results

The Kolmogorov-Smirnov test confirmed the normal distribution of the completion time data both for the existing system (M=41.73, SD=17.66, Skewness=0.94) and the alternative design (M=11.94, SD=4.85, Skewness=0.85). Within-subjects paired t-tests were used to analyze the differences between the two systems for completion times, accuracy and SUS scores.

There was a significant difference in the completion times (Figure 14) for the existing system (M=41.73, SD=17.66) and the alternative design (M=11.94, SD=4.85) conditions; t = 8.22, p = 0.0001.
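The within-subjects paired t statistic used in these comparisons reduces to the mean of the per-participant differences divided by their standard error. A minimal sketch with hypothetical completion times (in practice a statistics package such as SciPy's `ttest_rel` would also report the p-value):

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic for within-subjects data:
    t = mean(d) / (sd(d) / sqrt(n)), where d holds each participant's
    difference between conditions and df = n - 1."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), n - 1

# Hypothetical per-participant mean completion times (seconds).
existing = [40.0, 45.0, 38.0, 44.0]
new_design = [12.0, 15.0, 10.0, 13.0]
t, df = paired_t(existing, new_design)
print(round(t, 2), df)  # 39.0 3
```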

Figure 14 The alternative design resulted in significant reduction in average completion time in all three tasks

Breaking down the results and comparing the average completion times for individual tasks across the two systems also showed statistically significant reductions in completion time for each task (Figure 15).

Figure 15 Task two demonstrated the most significant difference in terms of average accuracy between the two systems

There was a significant difference in the accuracy for the existing system (M=77.95, SD=14.07) and the alternative design (M=92.36, SD=9.04) conditions; t = 4.15, p = 0.0008.

There was a significant difference in SUS scores for the existing system (M=40.46, SD=17.08) and the alternative design (M=86.25, SD=16.83) conditions; t = 8.16, p = 0.0001. The average SUS score for the existing system was 37.34, while the alternative design was given a score of 89.37 on a 0-100 SUS scale.

4. Discussion

The completion time for the alternative design was significantly lower than the existing system. The accuracy in performing registration tasks within the new interface was also significantly better. The overall rated usability of the new interface was also significantly higher as shown in the higher SUS scores.

Task 2, which was the most complex task among the three, showed the largest difference in completion time between the two interfaces. This task generated a high number of errors and, because of the interface organization, required more steps to complete in the existing interface than in the new one.

Not being able to find the advanced search option was the most common error encountered by participants using the existing interface. They would select the subject, miss the advanced search button, and go directly to the course search section. Unlike the new design, the existing interface does not provide a direct way of filtering down results, so users had to start over in order to navigate to the advanced search options. Another mistake that several participants made when using the existing system was in defining the range for credits in the advanced search forms. In order to filter results by credit range, the system asks for a minimum and a maximum value. Given that the task was to find courses with only two credits, some users did not provide the second (maximum) value. The returned results were courses with 2 credits and more. Participants encountering this error had to go back to the advanced search page and enter the maximum value.

Although the alternative design’s performance was significantly better than the existing system, participants encountered some errors on the second task. The most common error was setting the filters before selecting the subject. A defect in the system caused it to ignore the filter and present the entire course list of the chosen subject. Participants noticed the removal of the filter and were able to recover from the error, but in order to improve the alternative design, filters should work regardless of the order in which users select them.

In the existing interface, users have to navigate to a new page each time they need to change their search criteria. Choosing the semester, changing the subject and applying search filters are all different sections of the system. In the new interface, all of these options are presented in one page, making it possible for users to change their search criteria without navigating away from the results list, which resulted in shorter completion times. The effect of having a limited horizontal span was particularly noticeable in task 3, in which participants were looking for a specific piece of information. Using the existing interface, participants would scan the page and the table, in some cases several times, before locating the instructor’s name, which was in the far right corner of the results table. Visually impaired users reported physical discomfort and eye strain when looking at the results table on the existing interface. With the significantly reduced horizontal span of the results section, participants did not report any discomfort using the alternative design, and they were able to locate the information faster. The multi-modal dropdown menu in the alternative design offered participants two choices: scrolling down the list or searching for the name of the subject. Although this did not translate into significantly faster completion times for task 1, users expressed satisfaction at having such an option.

Even with only three visually impaired participants, the task performance time data provides encouraging results related to the accessibility of the new interface. There was a bigger performance gap between the visually impaired and normal vision groups when using the existing interface than the new design. On average, visually impaired users completed the tasks 37.8 seconds slower than the rest of the participants when using the existing interface. When using the new interface, the difference in average task completion time was reduced to 9 seconds.

A problem was encountered by visually impaired participants when using the new interface design. There was a lack of salient feedback when changing the filters on the sidebar. When changing search filters on the sidebar, the results get updated immediately on the right panel without the need to apply any settings. This data update was not as obvious to the visually impaired users as it was to normal vision group. Having the option to apply the settings, instead of automatically updating the results could solve this problem for the visually impaired users. This would result in an extra action (clicking on the apply button) to the task, but considering its trivial effect on performance time and its potential benefit to the visually impaired group, it would be a reasonable change.

The study was limited in two important ways. The number of participants was small and unlikely to be fully representative of the overall user group, and the number of visually impaired participants in particular was very small. This limits the generalizability of the results. However, the application of universal design principles led to strongly significant results for all users (abled and disabled), even among a small population. Further investigation would be needed to confirm that these results hold up with a larger participant population or with other, less common types of registration tasks. While accessibility (particularly for low-vision users) was not directly measured, the results do imply that improving usability contributes to improved accessibility by making an interface more effective for disabled users performing tasks.

5. Conclusion

Though the test was small, it does provide some indication that a single interface can be developed with mobile devices in mind that not only preserves but improves the desktop experience. This is one of the scenarios presented in the background. A mobile-first approach is a viable option so long as it is not assumed to be mobile-only. Currently defined best practices for web design and usability can be employed to ensure that all the needed features are present, functional, and appropriate for mobile devices while working properly on desktops. Desktop access is still the primary method used to access the registration system. While every effort was made to design equal usability through a mobile device interface, further testing would be needed to confirm this.

  1. Anders, R., & Fechtner, D. (1993). Universal design. Pratt Institute Department of Industrial Design & Pratt Center for Advance Design Research.
  2. Archer, J. (2015). Why is it So Easy to Get "Mobile First" Wrong? Retrieved November, 2015 from http://jamesarcher.me/mobile-first/.
  3. Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. Intl. Journal of Human-Computer Interaction, 24 (6), 574-594. [https://doi.org/10.1080/10447310802205776]
  4. Bradbard, D. A., Peters, C., & Caneva, Y. (2010). Web accessibility policies at land-grant universities. The Internet and Higher Education, 13 (4), 258-266. [https://doi.org/10.1016/j.iheduc.2010.05.007]
  5. Budiu, R. (2015). The State of Mobile User Experience. Retrieved November, 2015 from https://www.nngroup.com/articles/mobile-usability-update/.
  6. Burgstahler, S. (2002). Distance learning: Universal design, universal access. Educational Technology Review, 10 (1), 32-61.
  7. Case, D. E., & Davidson, R. C. (2011). Accessible online learning. New Directions for Student Services, 2011 (134), 47-58. [https://doi.org/10.1002/ss.394]
  8. Connell, B. R., Jones, M., Mace, R., Mueller, J., Mullick, A., Ostroff, E., Sanford, J., Steinfeld, E. D., Story, M., & Vanderheiden, G. (1997). The principles of universal design.
  9. World Wide Web Consortium. (2008). Web content accessibility guidelines (WCAG) 2.0.
  10. Darroch, I., Goodman, J., Brewster, S., & Gray, P. (2005). The effect of age and font size on reading text on handheld computers. In IFIP Conference on Human-Computer Interaction (pp. 253-266). Springer Berlin Heidelberg. [https://doi.org/10.1007/11555261_23]
  11. De Fabrique, N. (2011). Section 504 of the Rehabilitation Act of 1973. In Encyclopedia of Clinical Neuropsychology (pp. 2227-2228). Springer. [https://doi.org/10.1007/978-0-387-79948-3_1024]
  12. Fichten, C. S., Asuncion, J. V., Barile, M., Ferraro, V., & Wolforth, J. (2009). Accessibility of e-learning and computer and information technologies for students with visual impairments in postsecondary education. Journal of Visual Impairment & Blindness, 103 (9), 543-557.
  13. Firtman, M. (2010). Programming the mobile web. O'Reilly Media, Inc.
  14. Goldberg, J. H., & Kotval, X. P. (1999). Computer interface evaluation using eye movements: methods and constructs. International Journal of Industrial Ergonomics, 24 (6), 631-645. [https://doi.org/10.1016/S0169-8141(98)00068-7]
  15. Gubbels, A., & Kemppainen, E. (2002). A review of legislation relevant to accessibility in Europe. eEurope 2002 Action Plan.
  16. Hollins, N. L. (2012). Learning disabilities and the virtual college campus: A grounded theory of accessibility.
  17. Horton, S., & Quesenbery, W. (2014). A web for everyone: designing accessible user experiences. Rosenfeld Media.
  18. Johnstone, C. (2003). Improving Validity of Large-Scale Tests: Universal Design and Student Performance (NCEO Technical Report).
  19. Kane, S. K., Shulman, J. A., Shockley, T. J., & Ladner, R. E. (2007). A web accessibility report card for top international university web sites. In Proceedings of the 2007 international cross-disciplinary conference on Web accessibility (W4A) (pp. 148-156). ACM. [https://doi.org/10.1145/1243441.1243472]
  20. Katsiyannis, A., Zhang, D., Landmark, L., & Reber, A. (2009). Postsecondary education for individuals with disabilities legal and practice considerations. Journal of Disability Policy Studies, 20 (1), 35-45. [https://doi.org/10.1177/1044207308324896]
  21. Lazar, J., Wentz, B., Akeley, C., Almuhim, M., Barmoy, S., Beavan, P., & Bradley, B. (2012). Equal access to information? Evaluating the accessibility of public library web sites in the State of Maryland. In Designing Inclusive Systems (pp. 185-194). Springer. [https://doi.org/10.1007/978-1-4471-2867-0_19]
  22. Lopez, N. (2015). The iPhone 6s outperforms the 2015 MacBook in some tests, which says a lot about the iPad Pro. Retrieved November, 2015 from http://thenextweb.com/apple/2015/09/24/the-iphone-6s-outperforms-the-2015-macbook-in-some-tests-which-says-a-lotabout-the-ipad-pro/.
  23. McLawhorn, L. (2001). Leveling the accessibility playing field: Section 508 of the Rehabilitation Act. HeinOnline.
  24. Morin, E. C. (1990). Americans with Disabilities Act of 1990: Social Integration Through Employment. Cath. UL Rev., 40, 189.
  25. Nielsen, J. (2012). Mobile Site vs. Full Site. Retrieved November, 2015 from https://www.nngroup.com/articles/mobile-site-vs-full-site/.
  26. Ofiesh, N. S., Rice, C. J., Long, E. M., Merchant, D. C., & Gajar, A. H. (2002). Service delivery for postsecondary students with disabilities: A survey of assistive technology use across disabilities. College Student Journal, 36 (1), 94.
  27. Patel, N. (2015). The mobile web sucks. Retrieved November, 2015 from http://www.theverge.com/2015/7/20/9002721/the-mobile-web-sucks/.
  28. Persson, H., Åhman, H., Yngling, A. A., & Gulliksen, J. (2015). Universal design, inclusive design, accessible design, design for all: different concepts-one goal? On the concept of accessibility-historical, methodological and philosophical aspects. Universal Access in the Information Society, 14 (4), 505-526. [https://doi.org/10.1007/s10209-014-0358-z]
  29. Pisha, B., & Coyne, P. (2001). Smart from the start: The promise of universal design for learning. Remedial and Special Education, 22 (4), 197-203. [https://doi.org/10.1177/074193250102200402]
  30. Providenti, M. (2004). Library Web Accessibility at Kentucky's 4-year Degree Granting Colleges and Universities. D-Lib Magazine, 10 (9). [https://doi.org/10.1045/september2004-providenti]
  31. Raskind, M. H., & Higgins, E. L. (1998). Technology and learning disabilities: What do we know and where should we go. Perspectives, 24 (2), 1.
  32. Raue, K., & Lewis, L. (2011). Students with Disabilities at Degree-Granting Postsecondary Institutions. First Look. NCES 2011-018. National Center for Education Statistics.
  33. Stephanidis, C., & Savidis, A. (2001). Universal access in the information society: methods, tools, and interaction technologies. Universal Access in the Information Society, 1 (1), 40-55.
  34. Tognazzini, B. (2014). First Principles of Interaction Design (Revised & Expanded). Retrieved November, 2015 from http://asktog.com/atc/principles-of-interaction-design/.
  35. Wroblewski, L. (2012). Mobile First. New York: A Book Apart.
  36. Ye, H. (2014). Universal Design for Learning in an Online Teacher Education Course: Enhancing Learners' Confidence to Teach Online. MERLOT Journal of Online Learning and Teaching, 10.