Week Twenty-Three: More Wireframes!

This time they are for the app itself. As I stated in my Mark 1 presentation, I have abandoned the buttons in favour of making each section of the screen an active link. Tapping an area once plays a tone that serves as a kind of audible icon; tapping it twice tells the user which option is assigned to that area, much like the reminder text that appears when hovering over an icon. To select an option, the user swipes the area towards them, as if picking up a card; to exit back to the previous screen, the user swipes up and away from them, as if discarding a card. This mapping of interaction with the system onto interaction with game elements is something I am very keen to implement, as I have been concerned that bringing digital technology into the experience of playing a tabletop game undermines the inherent physicality of gameplay.

[Wireframe: the app’s card-reading screen]
Swipe down on the ‘blue’ section to read a tag. Tap or swipe left on the ‘yellow’ section to hear previously scanned cards; their associated audio files are read one at a time, starting with the most recent. Swipe up during the reading of any card to remove it from the list – mirroring the actual discard from hand.
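
To keep this mapping straight in my own head, here is a minimal Kotlin sketch of how each gesture might translate into an action. It is purely illustrative: the enum names and the actionFor function are placeholders of my own, and the actual gesture detection (touch handling, swipe thresholds) is left out.

```kotlin
// Illustrative sketch only: maps the gestures described above to app actions.
// Gesture detection itself (touch handling, swipe thresholds) is omitted.

enum class Gesture { SINGLE_TAP, DOUBLE_TAP, SWIPE_TOWARDS, SWIPE_AWAY }

enum class Action {
    PLAY_TONE,    // single tap: the audible "icon" for that area of the screen
    SPEAK_LABEL,  // double tap: spoken reminder of the option assigned to the area
    SELECT,       // swipe towards the player: pick the option up, like lifting a card
    BACK          // swipe away from the player: discard, return to the previous screen
}

fun actionFor(gesture: Gesture): Action = when (gesture) {
    Gesture.SINGLE_TAP    -> Action.PLAY_TONE
    Gesture.DOUBLE_TAP    -> Action.SPEAK_LABEL
    Gesture.SWIPE_TOWARDS -> Action.SELECT
    Gesture.SWIPE_AWAY    -> Action.BACK
}

fun main() {
    // A double tap on any region should remind the user what that region does.
    println(actionFor(Gesture.DOUBLE_TAP))  // SPEAK_LABEL
}
```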

Week Twenty-Two: Casual Conversations #4


[Photo: Robert Gordon University, Aberdeen]


Yesterday I took a trip up to Aberdeen to speak with Michael Heron, a lecturer at the School of Computing Science and Digital Media at Robert Gordon University and founder of Meeple Like Us, a blog in which board games are reviewed from an accessibility perspective using heuristic analysis. As visual accessibility is only one element on which a game is assessed and scored, the blog had not appeared in any of my previous search results, but the instant I widened my search to ‘accessibility + board + gaming’, with no specific mention of visual impairment, it became the top result. While the research phase officially ended before Christmas, I couldn’t not contact Michael, and, realising how close by he is, I had to try to set up a meeting with him to discuss my project.

It was a very interesting and, in a way, encouraging conversation: he has been considering ways to make board games more accessible to a broad range of people and has reached conclusions similar to my own regarding the need to involve the wider gaming community. It also provided an opportunity to explore an issue I have become increasingly concerned about: at what point do modifications or assistive technology detract from the experience of playing a particular board game, transforming it from a fundamentally tactile experience into something less tangible, an increasingly electronic game which may no longer be recognisable as the original? Michael described a session he ran in which students built paper prototypes of board games. Some were very successful and fun, even at such low fidelity, and some were not. In particular, the group found that when custom dice were prototyped using standard dice and a lookup table assigning standard dice values to game-specific symbols, the process of checking the table was far less emotionally engaging than rolling the game-specific dice themselves. This might not matter during the rapid, inexpensive prototyping of a new game with unique dice, but it suggests that part of the experience of tabletop gaming is lost to visually impaired people who use the same technique, differentiating between special symbols with standard but tactile dice, and that is something to bear in mind in the design of my app. It should not simply encode game information into an accessible format; it should also retain or replicate the emotional experience embedded in the discovery or generation of that information and in the interactions between player and game. Playing a game with the aid of my app must not differ from unassisted play as radically as hearing the audio description of a television programme differs from watching it; as Rachel pointed out, audio description is often clinical and lacking in emotional content.

Michael also requested that I update him on the progress of my work. As I am hoping that I will be able to take my project further, it is reassuring to know that other people are interested in this specific area of accessibility.

Finally, the conversation revealed an oversight in my persona spectrum: the app could also be helpful for people with cognitive impairments, as it could play back complex game text in simplified vocabulary.

21.2 Audio Tags

When a card is scanned by the player, an audible version of its rules text is played. While I have established that these rules should be read by a human voice rather than a simple text-to-speech system, over the past few weeks I have been considering exactly where these audio files should come from. It would be inefficient to demand that they be recorded by relatives, friends, or carers specifically for each individual user; I believe this is a task best crowdsourced to the gaming community in general. Friends, relatives and carers of visually impaired people could still record audio tags through the system, but tags for the card and token components of each game would only need to be created once, as they would then be made available to all users. This also creates the potential for special customised editions – imagine, for example, the rules for cards in the Game of Thrones games being read by the actors from the show. This would both create publicity for the system and raise awareness of accessibility issues within gaming.
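
To think through how the crowdsourcing might hold together, here is a rough Kotlin sketch of the tag library I have in mind: community recordings are made once per card and shared with everyone, while a personal recording by a friend, relative or carer takes priority for that user. The class names and in-memory maps are hypothetical placeholders; a real version would sit behind a web service.

```kotlin
// Hypothetical sketch of a shared audio tag library; not a final design.

data class CardId(val game: String, val card: String)

data class AudioTag(
    val cardId: CardId,
    val recordedBy: String,  // community contributor, friend, relative or carer
    val audioUrl: String     // location of the recorded audio file
)

class AudioTagLibrary {
    private val communityTags = mutableMapOf<CardId, AudioTag>()
    private val personalTags = mutableMapOf<CardId, AudioTag>()

    // A community tag only needs to be recorded once per card; it is then shared by all users.
    fun contribute(tag: AudioTag) { communityTags[tag.cardId] = tag }

    // A personal recording, e.g. by a carer, takes priority for that user.
    fun recordPersonal(tag: AudioTag) { personalTags[tag.cardId] = tag }

    fun tagFor(cardId: CardId): AudioTag? =
        personalTags[cardId] ?: communityTags[cardId]
}
```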

In preparation for creating a digital prototype of the key pages of this social site, I have been drawing wireframes and working out the site architecture.

21.1 Kate Saunderson, User Researcher

Today Kate Saunderson, a user researcher with the Scottish Government and former Social Digital lecturer, returned to DJCAD to talk about the importance and benefits of user-centred research in designing new technology-based public services. The lecture began with a surprise timed challenge: working in pairs, we had to devise a series of questions to determine the user experience of applying for university and the problems the process presented. As the other project I seriously considered undertaking for fourth year was centred on supporting mature students through applying for and settling into university, I immediately suggested to my partner, a stranger from another discipline, that we should ask participants how old they were when they applied for university, and discovered that he too had been a mature student at the time of application. His current degree is his second, so he was able to describe the difference between applying at eighteen and at twenty-eight.

We were the first pair to volunteer our questions, and Kate commented that it is good to consider what might be termed non-traditional users (in this context, the UCAS system is primarily used by school leavers). However, one of the reasons I had decided not to design for mature students was that I was part of the intended user base and did not want my own experiences to bias my research and subsequent design work. The second part of the exercise involved the members of each pair interviewing one another, and I quickly discovered that my partner had encountered some of the same issues I had when he applied as a mature student, as well as some that were unique to his situation of applying for a second undergraduate degree several years after graduating from his first.

I learned three things from this session. One: I can conduct research with a user group of which I am part without either focussing on experiences which match my own at the expense of those which do not, or disregarding my own potentially shared experiences in favour of those which are unknown to me; however, I would need to be aware of my potential biases in the first place to have a chance of guarding against them, and I would need to trust colleagues to challenge me even more on such a project than on any other. Two: one of the other reasons I chose not to work on the project for mature students is that I would want not just to prototype it but to actually build and launch it. While I am enjoying working on visually impaired gaming – and in fact have plans to continue working on it beyond the degree show – I also need to return to the research I began in the summer between second and third year on how mature students could be better assisted in their applications and studies, and design a solution. Three: Kate has an amazing job, speaking to people about their experiences, analysing their stories, then working with colleagues first to generate concepts and then to create services which will benefit specific groups within the community, improving society a little more each time.

I want to do that too.

Week Twenty-One: Planning

At the start of this week I took some time to consider and list all the tasks that need to be completed in order both to finish my project and to fulfil the module requirements. Then I allocated those tasks to the remaining weeks of my undergraduate studies. Thankfully, on paper at least, everything seems intense but manageable as long as I am able to keep to this schedule. From here on out, I’m going to treat each week as a design sprint with a small number of targets to reach by Saturday evening, with Sunday set aside to reflect on the progress made by writing entries for this blog and to prepare for the new challenges of the week ahead.

[Image: tasks allocated to the remaining weeks of the project]

Week Twenty: Mark 1 Presentations


My Mark 1 prototype demonstrates some of the functionality of my proposed app, including the key function of recognising a specific card from a game. The exact wording of the audio tag combines the rules printed on the card itself with the more flavourful description given in the game’s rules booklet. The reason for this is that the setting, conceit and narrative of a game are integral to its design and are widely appreciated by players. The information relayed to a visually impaired player should therefore contain not only details of the card’s mechanical effect but also what those mechanics signify in the fictional world of the game.
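
To illustrate the point, the snippet below sketches how a card’s spoken script might be assembled from its flavour description plus its mechanical rules. The Kotlin names and the card text are rough placeholders of my own, not the wording used in the prototype.

```kotlin
// Sketch: an audio tag's script combines flavour (setting/narrative) with rules (mechanics).

data class CardText(val name: String, val flavour: String, val rules: String)

fun audioScript(card: CardText): String =
    "${card.name}. ${card.flavour} ${card.rules}"

fun main() {
    val guard = CardText(
        name = "Guard",
        flavour = "The guard stands watch at the palace gates, questioning every visitor.",
        rules = "Name a card other than Guard and choose another player; " +
                "if they hold that card, they are out of the round."
    )
    println(audioScript(guard))
}
```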

Week Nineteen: Prototyping

I’ve jumped straight into exploring possible technologies for identifying game cards, as for my Mark 1 presentation I would like to demonstrate the concept rather than the experience of interacting with it.
At the moment I am using NFC tags to identify Love Letter cards and building the prototype in MIT’s App Inventor. This is essentially Scratch for prototyping native Android apps – it is frustratingly limiting in some respects, and while assembling code from draggable snippets is helpful for a beginner, it’s a little annoying if you’ve got any coding experience. However, if you’ve never tried to write an app that requires direct access to the sensors on an Android phone, it’s an extremely easy way to rapidly prototype anything that does just that.
[Screenshot: MIT App Inventor code screen. Easy, but not without bugs.]
I have started with NFC because it is a very reliable technology and thus well suited to demonstrating the idea of visually impaired players using their mobile phones to ‘read’ information on cards. However, I am not convinced that it is the best choice for the app I will ultimately develop, because a) Apple severely restrict access to the onboard NFC module on their phones, meaning an iPhone version is out of the question at the current time, and b) while the initial costs of NFC are lower than those of the PenFriend, after the first few hundred tags the PenFriend system would actually be cheaper. This may not be a problem for games with few cards, but for games such as Dominion, NFC would ultimately become more expensive than PenFriend tagging, which is already prohibitively expensive for hobbyist gamers.
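
For later reference, the same card-reading idea could also be sketched natively, outside App Inventor. The Kotlin snippet below assumes each card carries an NFC tag whose NDEF text record holds a card identifier, which is then looked up against an audio file; the identifiers, file names and the playAudio callback are placeholders of my own, and error handling is omitted.

```kotlin
import android.app.Activity
import android.nfc.NdefMessage
import android.nfc.NfcAdapter
import android.nfc.Tag
import android.nfc.tech.Ndef

// Rough sketch of native Android NFC card reading; assumes NDEF text records, no error handling.
class CardScanner(private val activity: Activity, private val playAudio: (String) -> Unit) {

    // Hypothetical mapping from tag text to audio file, e.g. for Love Letter.
    private val cardAudio = mapOf(
        "loveletter/guard" to "guard.mp3",
        "loveletter/priest" to "priest.mp3"
    )

    fun start() {
        val adapter = NfcAdapter.getDefaultAdapter(activity) ?: return  // device has no NFC
        adapter.enableReaderMode(
            activity,
            { tag -> onTagDiscovered(tag) },
            NfcAdapter.FLAG_READER_NFC_A or NfcAdapter.FLAG_READER_SKIP_NDEF_CHECK,
            null
        )
    }

    private fun onTagDiscovered(tag: Tag) {
        val ndef = Ndef.get(tag) ?: return
        ndef.use {
            it.connect()
            val message: NdefMessage = it.ndefMessage ?: return
            // An NDEF text record's payload begins with a status byte and a language code.
            val payload = message.records.first().payload
            val langLength = payload[0].toInt() and 0x3F
            val cardId = String(payload, 1 + langLength, payload.size - 1 - langLength, Charsets.UTF_8)
            cardAudio[cardId]?.let(playAudio)
        }
    }
}
```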

18.1 Inclusive Design and the Persona Spectrum

Solve for one, extend to many

Everyone has abilities, and limits to those abilities. Designing for people with permanent disabilities actually results in designs that benefit people universally. Constraints are a beautiful thing.

Inclusive Design Principles, Microsoft

While further investigating some of the projects Richard Banks discussed with us earlier this week, I came across Microsoft’s inclusive design methodology and toolkit. At the heart of this methodology is the study of human diversity in order to understand how individuals and groups are excluded from certain activities or services. Microsoft propose that disability arises ‘at the points of interaction between a person and society’, causing ‘mismatched interactions’ which result in ‘physical, cognitive and social exclusion’, and that exploring these ‘points of exclusion’ is the key to generating solutions not only for someone who is physically disabled but also for the many who may be situationally disabled; for example, a design which is accessible to an amputee with one functional hand is likely also to suit a new mother struggling to complete a task one-handed while holding her baby with the other arm.

The toolkit contains an exercise to generate a persona spectrum which maps a permanent disability to temporary and situational impairments, reflecting the wide range of people a solution originally proposed to assist a person with a permanent limitation could potentially benefit.

IMG_20170428_125454.jpg

For this project, the persona spectrum extends beyond the intended audience of visually impaired people: to groups who would have difficulty reading text under certain circumstances, such as exchange students playing games in languages other than their native tongue; and to people who are capable of understanding the concepts of a game but not the terms in which they are described, such as less well-educated people who may not previously have encountered some of the archaic language used in games with a fantasy or historical setting, or those of a different culture or generation from that of the designers. It is also potentially useful for players of customisable card games such as Magic: The Gathering, as a means of identifying foreign-language cards, which are often obtained by players buying cards in bulk on the secondary market and are frequently found for sale in games stores.

[Image: updated persona spectrum]
Persona Spectrum, updated March 11th 2017, in light of conversation with Michael Heron, to include people with cognitive impairments, who can see but not interpret the information on a game card.


Week Eighteen: Richard Banks

Today our studio welcomed Richard Banks, from the Human Experience and Design Lab at Microsoft Research, Cambridge, as a guest speaker and one-off project adviser. As I am ultimately looking to work in research, it was very interesting to hear more about real-world, industry-funded research, and fantastic to have the opportunity to speak one-to-one with him about both my project and my future plans.

We had a chat about how he found his way into his current job through experience, whereas most of his colleagues had taken the more traditional route into research by undertaking a PhD. This is currently the path I intend to take, but I will probably need to take time away from academia after completing my Masters in order to work and save enough to support myself through a PhD.

With regard to my project, we discussed how the visually impaired gaming group I met last year actually added an extra dimension to their gameplay which sighted players do not experience. This is an area he advised me to investigate further, suggesting that I should aim to create an experience for visually impaired people that their sighted opponents would be envious of.