Tutorial Screen User Testing
This part focuses on evaluating the new tutorial pages for the player, enemies and map content respectively, which introduce the basic game rules and game elements. The tutorial pages were improved based on the last sprint's results for the earlier text-only tutorial pages, so this round of evaluation checks whether the new tutorial pages conform to human-computer interaction principles. For this evaluation we invited two anonymous testers from outside the class. Applying interaction principles, these testers can help us identify the strengths and weaknesses of the design and determine whether it has problems. The evaluation methods we used include:
- Technology Acceptance Model (TAM)
- Cognitive Walkthrough
The evaluation will be carried out according to the protocol described in the evaluation approaches. We will explain the improved protocol to users and walk them through the evaluation process so that every participant is evaluated consistently. The invited testers will give different types of feedback for the different evaluation methods. These results will be used for analysis and discussion, but will not be shown during the testing session itself. Finally, we will draw basic conclusions and possible solutions from the analysis.
The Technology Acceptance Model (TAM) aims to measure the adoption of new technologies based on the attitudes of users (Allen, 2020). Users' attitudes towards new technologies are divided into four categories: perceived usefulness (PU), perceived ease of use (PEOU), attitude towards technology (ATT) and intention to use (ITU). Perceived usefulness measures the degree to which users find a particular system useful when working with it. Perceived ease of use measures whether users think the system is easy for them to use. Attitude towards technology builds on perceived usefulness and perceived ease of use; it mainly captures users' attitudes towards the new technology, that is, whether they accept it. Finally, intention to use draws on all of the above to measure how likely users are to use the technology in the future.
Before starting the evaluation, we introduced the TAM procedure and the purpose of our tutorial pages to the users, and let them interact with the pages briefly before filling out the questionnaire; this initial interaction is kept simple. Users then complete the questionnaire. Each question is scored on a 4-point scale, and the scores given by users are aggregated and analyzed at the end.
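As a rough illustration of how these 4-point questionnaire scores might be aggregated, the sketch below averages the responses per TAM construct. The item codes, tester keys and example scores are hypothetical placeholders, not the actual questionnaire data from this test.

```python
# A minimal sketch of aggregating 4-point TAM responses per construct.
# The item codes (e.g. "PU1", "PEOU1") and the example scores are placeholders,
# not the questionnaire actually used in this test.
from statistics import mean

responses = {
    "tester_1": {"PU1": 4, "PU2": 3, "PEOU1": 4, "ATT1": 3, "ITU1": 3},
    "tester_2": {"PU1": 3, "PU2": 4, "PEOU1": 3, "ATT1": 4, "ITU1": 3},
}

def construct_of(item: str) -> str:
    """Map an item code such as 'PEOU1' to its construct name 'PEOU'."""
    return item.rstrip("0123456789")

# Group every score by its construct, then report the average per construct.
scores_by_construct: dict[str, list[int]] = {}
for answers in responses.values():
    for item, score in answers.items():
        scores_by_construct.setdefault(construct_of(item), []).append(score)

for construct, scores in sorted(scores_by_construct.items()):
    print(f"{construct}: average {mean(scores):.2f} out of 4")
```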
We chose this evaluation method because the project has reached a stage where all tutorial pages are in a relatively complete state. At this point, testing whether users accept this kind of tutorial guide reflects, through their attitudes, how well our way of presenting the tutorial guidelines is accepted. In addition, the questionnaire format allows us to analyze the problems more systematically.
The lack of a fully consistent process during testing can easily lead to details being missed, and users' ratings are based on their own impressions rather than being completely objective. Finally, since TAM only provides quantitative data, additional information is needed.
For the testing session, a plan was made for collecting data from users' feedback, covering both further design advice and future changes. Given the tutorial pages' goals, the selected evaluation methods are TAM, a cognitive walkthrough that works together with the TAM form, observations, and follow-up questions, so that both qualitative and quantitative data are generated. Observations serve as one type of feedback collected while users complete pre-designed tasks related to their interactions, checking whether users understand the goals and purpose of the tutorial pages and can extract information from them. At the same time, TAM is conducted with a specified survey form. All results are collected and visualized as tables and average marks, which gives a straightforward view for analysis and shows the advantages and disadvantages of both the UI design (such as layouts) and the content through yes-or-no questions.

TAM covers the first level of feedback, which aims to capture the simplest "good or bad" opinions, and is used to collect data on many aspects based on individual attitudes. The differences between individuals across these aspects are combined into average marks, which act as a multi-dimensional model for evaluating users' opinions. However, TAM can hardly tell what exactly the problems are. To pinpoint specific problems, walkthrough tasks with follow-up questions and related observations are needed to reveal users' reactions, their experience of use, and the tutorial pages' usability and accessibility.

The follow-up questions are designed as a semi-structured interview, a session in which the interviewer does not strictly follow formal questionnaire steps or a direct question-and-answer model, but instead discusses the topic with the interviewees. Before asking questions, the interviewer gains a general understanding of the already known information and preferences through the tasks and the TAM session. During the questioning session, the interviewer expands on aspects of the theme with open-ended questions from the question list, whose answers are not simply "yes" or "no". The semi-structured format makes two-way communication and information sharing about the topic easy. During the evaluation, the walkthrough also helps the target users understand the tutorial pages as a whole.
Before the evaluation, the tasks to be evaluated are designed first. When testing starts, no guideline is provided for any task; users are asked to read the tutorial pages before completing the tasks. While users complete the tasks, their reactions are observed at the same time, which can be regarded as a kind of direct feeling from users. Some pages may cause reactions such as hesitation, confusion or dizziness, so the design elements responsible can be noticed easily and given more attention. After the pre-designed tasks are completed, follow-up questions are asked in a semi-structured interview to confirm which aspects need to be improved. It is always hard to figure out what to improve on each page because there are so many elements; TAM helps narrow the range and find out where the problems are. TAM is also useful statistically, since it shows how many users give positive or negative opinions on specific tasks (see the sketch below). The interviews are then used to obtain detailed feedback about what to improve in those interactions and how. Finally, the survey and TAM results, interview scripts and observations are analyzed for possible further improvements in the design iterations.
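To make that analysis step concrete, the sketch below counts, for each questionnaire item, how many testers gave a positive rating (3 or above on the 4-point scale) versus a negative one, so that low-rated items can be flagged for the follow-up interview. The item labels, sample scores and the positivity threshold are assumptions for illustration only.

```python
# A rough sketch of counting positive vs. negative opinions per questionnaire item,
# so that low-rated items can be singled out for the follow-up interview.
# Item labels, sample scores and the >= 3 "positive" threshold are assumptions.
from collections import Counter

ratings = {
    "PU item: player tutorial": [4, 4],
    "PU item: enemy tutorial": [3, 2],
    "PU item: map content tutorial": [3, 3],
}

POSITIVE_THRESHOLD = 3  # on the 4-point scale used in the questionnaire

for item, scores in ratings.items():
    counts = Counter("positive" if s >= POSITIVE_THRESHOLD else "negative" for s in scores)
    flag = "  <- discuss in interview" if counts["negative"] else ""
    print(f"{item}: {counts['positive']} positive, {counts['negative']} negative{flag}")
```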
- Please tell me how to move up, move down, jump up, jump down and attack. (Checks whether users can successfully get the correct information from the player tutorial page.)
- Please describe the types of enemies: what do they look like? (Checks whether users can successfully get the correct information from the enemy tutorial page.)
- Please tell me the differences between the map contents and describe how each of them works. (Checks whether users can successfully get the correct information from the map content tutorial page.)
- What do you think about browsing those three tutorial pages? Could any improvements be made? Was anything confusing? (This question relates to the design components. Layout and UI design are tightly connected with the user experience, so it is important to ask users for feedback on how the UI design feels.)
- Please tell me how to move up, move down, jump up, jump down and attack. (Checks whether users can successfully get the correct information from the player tutorial page.)
  - Noar: Finished the task without any hesitation and described every control correctly.
  - Mark: Finished the task successfully and could describe the use of the keyboard controls correctly.
- Please describe the types of enemies: what do they look like? (Checks whether users can successfully get the correct information from the enemy tutorial page.)
  - Noar: Could say what the enemies look like, but could not describe them in detail.
  - Mark: Could only describe the enemies' appearance, without further detail.
- Please tell me the differences between the map contents and describe how each of them works. (Checks whether users can successfully get the correct information from the map content tutorial page.)
  - Noar: Could only describe a few things on the map.
  - Mark: Described most things, but not specifically.
- What do you think about browsing those three tutorial pages? Could any improvements be made? Was anything confusing?
  - Noar: "Overall, the three tutorial pages seem good. But I am confused about how enemies walk and attack, and I also don't know the map contents well. Maybe a more vivid way should be used to introduce enemies and map contents."
  - Mark: "I felt that the enemy and map content pages were not very detailed about functionality. I know their appearance but not their functions."
From the results and the diagrams we can see that the highest average grade is 4 ("strongly agree") on PU5, which concerns the player tutorial page, showing that the player tutorial page is well designed. All results are over 3, which means the tutorial pages meet their purpose well, with a layout that every user finds easy to accept as a guideline. The results also show that the current pages do not need to change much; only some small adjustments are needed to polish them.
The enemy and map content pages do not present information vividly or in detail. They are not straightforward enough for first-time players to understand easily and precisely.
- Make the enemy and map content tutorial pages use animations, like the player tutorial page.
- The enemy and map content tutorial pages could include more information about functionality and how things work, instead of only names and appearance.
Through this user testing session we found that the player tutorial, enemy tutorial and map content tutorial, our three main tutorial pages, work well in both functionality and design. However, the descriptions and introductions of the enemies and map contents could be more detailed and specific, which would help first-time players get into the game and understand it better.