OpenAIR@RGU: The Open Access Institutional Repository at Robert Gordon University

This is an author-produced version of a paper published in the IEEE Multi-disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (ISBN 9781467303453). This version may not include final proof corrections and does not include the published layout or pagination.

Citation for the publisher's version: NWIABU, N., ALLISON, I., HOLT, P., LOWIT, P. and OYENEYIN, B., 2012. User interface design for situation-aware decision support systems. In: IEEE Multi-disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support. 6-8 March 2012. Piscataway, New Jersey: IEEE. Pp. 332-339.

© 2012 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
User Interface Design for Situation-aware Decision Support Systems

Nuka Nwiabu, Ian Allison, Patrik Holt, Peter Lowit, Babs Oyeneyin
School of Computing, IDEAS Research Institute, Robert Gordon University, Aberdeen, UK

Abstract—Information recall about general situations incurs memory and cognitive loads on operators. Recognition of information for specific situations, identified from users' context and the state of the world, is helpful to operators in performing tasks in complex environments. The emergence of ubiquitous, ambient, and pervasive technologies is increasingly providing methods to help operators perform their tasks in smart and intelligent ways. Existing user interface design does not solve the problem of drawing together the information required for situation-aware decision support systems in a way that minimises cognitive load. This paper discusses a framework for the user interface design of situation-aware systems that exploit inputs from users and the environment to provide information tailored to the user's tasks in specific situations. The user interface can reconfigure automatically in order to adapt to the current situation. The adaptation of the user interface to the current situation, and the presentation of a reusable sequence of tasks in that situation, reduces memory loads on operators. Hierarchical Task Analysis (HTA) is used to describe tasks for various types of situations. HTA is supplemented with scenarios to stimulate design ideas, and requirements analysis is used to represent interrelationships between tasks.

Keywords: Situation awareness; Context awareness; User interface design; Cognition; Scenarios; Task Analysis; Requirements analysis.

I. INTRODUCTION

User interfaces (UIs) represent the point of contact between systems and human users. The emergence of ubiquitous, ambient, and pervasive technologies has resulted in methods to assist users in smart and intelligent ways.
One such way is context-aware computing, a trend whereby computing devices and systems serve their users beyond the traditional desktop in diverse environments [6]. Dey [10] defines context as any information that can be used to characterize the situation of an entity. A system is said to be context aware if it uses context to provide relevant information and services to the user [29]. Context awareness was introduced by Schilit [32] to develop an application that adapts to the location of use, nearby people and objects, and the change of those objects over time. With technological advancement and the growth of mobile computing in recent times, context awareness has attracted greater research attention [16]. Context-aware user interfaces allow systems to dynamically adapt to changes in a user's task domain by updating relevant information and service provision.

A related concept to context awareness is the notion of situation awareness. Situation awareness (SA) is a cognitive process in decision-making and is defined as "the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future" [14]. The Endsley SA model [14] has three layers comprising perception, comprehension, and projection. The perception layer recognises all the necessary information about the environment. The comprehension layer interprets the perceived information in order to understand the current state of the environment. The projection layer uses knowledge of the current state of the environment to predict its future state. Situation awareness and context awareness both focus on information about the state of the environment in which tasks are carried out [16]. Situation-aware systems exploit explicit and implicit inputs to provide information tailored to users' tasks in different situations.
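The three layers of the Endsley model described above can be sketched as a simple processing pipeline. This is a minimal illustration, not the paper's implementation; the class names, functions, and threshold are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Situation:
    """Interpreted state of the environment at one point in time."""
    label: str          # e.g. "normal", "warning", "danger"
    confidence: float

def perceive(sensors: dict) -> dict:
    """Perception layer: gather the raw cues about the environment."""
    return {name: read() for name, read in sensors.items()}

def comprehend(cues: dict) -> Situation:
    """Comprehension layer: interpret the cues into a current situation.
    A single made-up threshold stands in for real interpretation rules."""
    if cues.get("pressure", 0.0) > 0.8:
        return Situation("warning", 0.9)
    return Situation("normal", 0.9)

def project(current: Situation, trend: float) -> Situation:
    """Projection layer: predict the near-future situation from the
    current state and an observed trend."""
    if current.label == "warning" and trend > 0:
        return Situation("danger", 0.6)
    return current

# Example run with stubbed sensor readings
cues = perceive({"pressure": lambda: 0.85, "temperature": lambda: 0.4})
now = comprehend(cues)
soon = project(now, trend=0.1)
print(now.label, soon.label)  # warning danger
```

Each layer consumes only the output of the layer below it, mirroring the perception-comprehension-projection flow of the model.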
A system that can adjust to a range of user abilities addresses the problem of variations in user expertise, and offers greater speed of performance, reduced operator workload, more consistency, greater flexibility in behaviour, and less training time [23]. But it is simplistic to assume that adaptive user modelling will solve all human-computer interaction problems. A growing body of research has examined the characteristics of human-operator interaction with adaptive displays and described the human performance costs, such as miscalibrated trust, complacency, skill and performance degradation, and decreased user acceptance, that can occur in such interaction [26], [31]. Designers of UIs for situation-aware systems must know which changes from users or environments are related to the tasks that users perform to achieve goals, by drawing up a task model using a notation that allows them to describe tasks for various types of situations [9]. There appears to be no existing framework with a notation to support designers in building UIs for situation-aware systems from situation-based tasks.

This paper describes a framework for the design of situation-aware interfaces in which input information (context and environmental cues) can be explicitly taken into account in the task specification. In order to achieve a concrete user interface (UI), it is assumed that the designer adds abstract UI components to the task model. This information is platform-independent so that the rendering back-end can ultimately use it to construct a concrete UI for various platforms. The next step consists of creating the dialogue model. Designers can be supported by automatically generating the states of, and transitions between, the various individual dialogues, so as to simplify their work. The tool includes an algorithm to calculate the different dialogues, and the transitions between dialogues, from the task specification.
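The paper does not spell out that algorithm. One common way to derive dialogues from a task specification is to group tasks that are enabled together into dialogue states, with a transition wherever completing a task enables the next group; the sketch below follows that assumption, and the task names are invented for illustration:

```python
# Toy task specification: each task maps to the set of tasks
# it enables once completed (stand-in for a real task model).
TASK_SPEC = {
    "monitor_pressure": ["inspect_readings"],
    "inspect_readings": ["acknowledge_alarm", "log_event"],
    "acknowledge_alarm": [],
    "log_event": [],
}

def derive_dialogues(spec, start):
    """Group tasks into dialogue states (sets of tasks enabled together)
    and record a transition each time completing a task enables a new set."""
    states, transitions = [], []
    frontier, seen = [frozenset([start])], set()
    while frontier:
        state = frontier.pop()
        if state in seen or not state:
            continue
        seen.add(state)
        states.append(state)
        for task in state:
            nxt = frozenset(spec[task])
            if nxt:
                transitions.append((state, task, nxt))
                frontier.append(nxt)
    return states, transitions

states, transitions = derive_dialogues(TASK_SPEC, "monitor_pressure")
for src, task, dst in transitions:
    print(sorted(src), "--", task, "->", sorted(dst))
```

The generated transitions are only a starting point; as the text goes on to note, designers then adjust, add, or remove them by hand.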
Designers can adjust, add or remove these transitions according to the results of a previous testing stage or the designers' experience. In this way situation-aware UI designers can manipulate the transitions that would be triggered by situation changes, and thus have control over the influence of situation on the usability of the UIs.

The case study for the paper is the design of a situation-aware UI for hydrate formation prediction in subsea oil and gas pipelines. There are three transition states: Normal, Warning, and Danger [25]. Normal situations represent situations where there is no problem in the domain. A warning situation represents a situation that is not normal but not yet dangerous. A danger situation is a crisis situation, meaning there are already problems in the domain. The UI executes a reconfiguration after input variation so as to stay adapted to whichever of these situations depicts the current situation in the domain. Warning situations cause the presentation of a preventive sequence of tasks, while danger situations cause the presentation of remediation or repair sequences of tasks. Hierarchical Task Analysis (HTA) is used to describe tasks for these situations. HTA is supplemented with scenarios to stimulate design ideas. Each scenario has a setting that explicitly describes the starting state of the current and future situations, and implicitly depicts the characters that take part in the situations in the scenario. Each scenario has actors who perform tasks to achieve goals in different situations. Requirements analysis is used to supplement our scenario-based HTA in representing the interrelationships between tasks. Dialogues, and the transitions between dialogues, are calculated from the task specifications.

The remainder of the paper is as follows. The following section provides a short overview of related work. Then the design process and the task model of the approach are successively presented, followed by a prototype architectural design for a situation-aware UI.
Finally, the design is evaluated, and the work is summarised and concluded.

II. RELATED WORK

The emergence of ubiquitous, ambient, and pervasive technologies has triggered research in context-aware UI design. Limbourg et al. [21] developed a language, UsiXML, to describe context-aware UIs. The tool support provided, however, concentrates on transformations between models in order to transform abstract descriptions into concrete ones, with no recognition of the fact that there could be unexpected changes to the UI when a context change occurs. Clerckx and Coninx [7] provided a mechanism to avoid these unexpected changes by incorporating context in UI development using transformations between models [8], but the integration with the context model is done by the designer. Mori et al. [24] describe the TERESA tool for designing UIs for mobile devices. Abstract models are used in order to deploy concrete UIs on several platforms. The approach is task-centred, implying that a lot of effort has been put into visualizing the task model. A reconsideration of the visual representation of task models was recently carried out by [27]. Techniques like semantic zoom (hiding information outside the point of focus) and fish-eye views (increasing the size of elements in focus) are introduced in order to improve the effectiveness of viewing and constructing task models.

To express the solution for identified UI patterns in an abstract way, [3] provided a modelling tool for model-based UI design with two different levels of abstraction: the Wisdom presentation model and canonical abstract prototypes. The tool applies the Wisdom model to UI patterns, easily expressing containment relationships, while the Canonical prototype is much closer to the concrete representation of the identified pattern. However, support for context-aware and multi-device UIs using the Canonical notation is not obvious and is therefore not considered by the approach.

Calvary et al.
[2] describe a development process to create context-sensitive UIs. The development process consists of four steps: creation of a task-oriented specification, creation of the abstract interface, creation of the concrete interface, and finally creation of the context-sensitive interactive system. The focus, however, is on a mechanism for context detection and on how context information can be used to adapt the UI, captured in three stages: recognizing the current situation, calculating the reaction, and executing the reaction.

Wu et al. [35] used HTA combined with scenario-based design to develop a UI for context-aware indoor navigation applications. The approach used the HTA method to identify user, user-application, and application tasks. The work provided a framework of command interfaces for executing the interaction between application tasks and user tasks. These command interfaces link user, user-application, and application tasks. The work did not look at how human variability influences usability. Also, no mention was made of the method of interaction between objects.

In a similar hybrid approach, Lewis [20] combined HTA with requirements analysis by replacing the abstract and partial task elements of requirements analysis with real tasks from the task analysis. The approach does not, however, consider the possibility of losing detail in the process of generalisation.

Kim et al. [18] and Liu [22] combined metadata definition with scenarios to build task knowledge structures in their work on sentence ends and interruption points in speech. Metadata is created within a specific context and for a specific purpose, and different purposes and different contexts have different metadata requirements. Metadata is the information and documentation associated with objects which makes data understandable and shareable for users over time, relieving them of having to have full advance knowledge of the data's existence or characteristics.

III. DESIGNING SITUATION-AWARE INTERFACES

This section provides an overview of the design process (Fig. 1). The design process supports the design of declarative abstract models describing the situation-aware user interface. The aggregate of the models can be serialized in order to export them to a runtime. To test the result of these models, the corresponding UI can be generated in the shape of a prototype to check the usability of the system.

Fig. 1. Situation-aware User Interface Design Process

Considering the prototype, some changes to the models in the design process can be applied to alter, for instance, the presentation of the UI or how situation changes may affect the UI.

Situation-based Task Model: First, a task model is specified describing the tasks users and the application may encounter when interaction with the system is taking place. Because we want to develop situation-aware UIs, tasks also depend on the current situation. This is why tasks in the task model are drawn up for specific situations. In this way the designer can describe different tasks for different situations.

Input Model: When the task model is specified, the designer has to denote what kind of input can influence the interaction, i.e. the tasks. This can be done by selecting objects for input gathering (Perception Objects, or POs). These objects can be aggregated by aggregation objects (AOs) and interpreted by interpretation objects (IOs). The designer can do this by linking AOs to POs and selecting, from a set of predefined interpretation rules, how the input has to be interpreted. The IOs represent the interpreted information at the comprehension layer. When the input model is specified, the designer has to link the IOs to task model nodes (an inter-model connection). In this way, the designer can denote which tasks can be performed in which situation.

Situation-Specific Dialogue Models: Next, the tool automatically extracts a dialogue model from the task model for each situation.
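Tying this to the case study, the per-situation dialogue models and the links between them can be pictured as follows. The situation names Normal, Warning, and Danger come from the paper; the dialogue state names and link table are invented for illustration:

```python
# One dialogue model (a set of UI states) per situation.
DIALOGUES = {
    "normal":  ["overview"],
    "warning": ["overview", "preventive_tasks"],
    "danger":  ["overview", "repair_tasks"],
}

# Links between dialogue models: for a (from, to) situation change,
# the state the UI reconfigures into.
SITUATION_LINKS = {
    ("normal", "warning"):  "preventive_tasks",
    ("warning", "danger"):  "repair_tasks",
    ("warning", "normal"):  "overview",
    ("danger", "normal"):   "overview",
}

def reconfigure(situation_from, situation_to, current_state):
    """Return the dialogue state after a situation change; stay in the
    current state if the situation is unchanged or no link is defined."""
    if situation_from == situation_to:
        return current_state
    return SITUATION_LINKS.get((situation_from, situation_to), current_state)

print(reconfigure("normal", "warning", "overview"))           # preventive_tasks
print(reconfigure("warning", "danger", "preventive_tasks"))   # repair_tasks
```

In the framework these links are derived by the tool and then refined by the designer, rather than written out by hand as here.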
Afterwards, inter-model connections are added automatically between the states of the dialogue model and the tasks of the task model that are enabled for each particular state. The dialogue model nodes (states) of the different dialogue models are linked to denote between which states situation changes may occur.

Presentation Model: To provide the interface model with information about how the interaction should be presented to the user, designers have to compose abstract UI components and link these to the relevant tasks for each presentation model node. The presentation model nodes can be structured hierarchically in order to group presentation components for layout purposes. The designer can choose from several abstract UI components such as static, input, choice, navigation control, hierarchy, and custom widget. Finally, the UI components can be grouped and structured hierarchically.

Situation-aware Interface Model: The aggregate of all the models results in a situation-aware interface model.

Usability evaluations: Usability tests are then carried out to test and improve the usability of the graphical interface with the models.

IV. SITUATION-BASED TASK DESIGN

The first step in the situation-aware user interface (SAUI) design process, just like every other interface design, is to draw up the task model: a hierarchic structure and a way of establishing temporal relationships between the various (sub)tasks. Task analysis can help designers understand what needs to be accomplished by the user, the environment, and the system, and break down the major task into its simplest component parts. Designers need to know what user tasks are necessary to operate the system, and also which parts of user input can be transferred to the system task in order to increase the level of context awareness of the system. Hierarchical Task Analysis (HTA) focuses on the way a task is decomposed into subtasks and the order and conditions in which these are executed.
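A fragment of such a hierarchical decomposition can be represented as a small tree with a plan attached to each parent task. The task names below are loosely inspired by the paper's hydrate case study but the decomposition itself is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One HTA node: a task, its subtasks, and a plan stating the
    order and conditions in which the subtasks are executed."""
    name: str
    plan: str = ""
    subtasks: list = field(default_factory=list)

hta = Task(
    "0. Prevent hydrate formation",
    plan="Do 1, then 2; do 3 only in a warning situation",
    subtasks=[
        Task("1. Monitor pipeline pressure and temperature"),
        Task("2. Compare readings against the hydrate formation curve"),
        Task("3. Apply preventive measures",
             plan="Do 3.1 or 3.2",
             subtasks=[Task("3.1 Inject inhibitor"),
                       Task("3.2 Reduce pressure")]),
    ],
)

def outline(task, depth=0):
    """Print the hierarchy top-down, echoing the HTA structure."""
    suffix = f"  [plan: {task.plan}]" if task.plan else ""
    print("  " * depth + task.name + suffix)
    for sub in task.subtasks:
        outline(sub, depth + 1)

outline(hta)
```

The plan strings carry the ordering and conditions that distinguish HTA from a plain task list.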
Tasks are represented as a hierarchy of tasks, subtasks and plans. HTA provides a brief picture of the user tasks and a basic functional specification of the proposed application. The top-down structure of HTA ensures completeness and is easy to comprehend [4], but cannot adequately address human factors and social issues, for example emotion [6]. Such issues may be elicited from a scenario.

Scenarios, according to Carroll [5], are examples of specific experience that exist to stimulate designers' creative imagination. Scenarios and claims are lightweight instruments that guide thought and support reasoning in the design process [5]. But scenarios also have their downsides. According to Diaper [11], scenarios can lead to errors, as a scenario, or even a set of scenarios, does not explicitly guide a designer towards a correct model of the required system. Both scenarios and task analysis are criticised for omitting the explicit representation of communication between agents engaged in collaborative tasks, and for not capturing the richness of interaction that occurs in the real world compared with other methods such as requirements analysis [19].

This paper designs a task model based on situations, using a hybrid technique combining scenarios, HTA, and requirements analysis. Designers use the set of tasks that can be identified in the task specification as a basis for the different dialogues the user interface will need to complete its tasks.

A. Problem Scenario

We commence design with scenarios, using the Robertson model (Fig. 2) below: