Hacker News

If the author of the article is reading this, any chance of getting a stream or a video of you playing Far Cry? I'm sure a lot of developers, designers, etc. would love to see how you interact and overcome some of the obstacles that are typical of software and games alike, and would love to see (in action) how Far Cry goes about fixing them.

Yes of course the article is rich with detail - thank you so much for writing this - but to be able to watch a video and hear in real-time "oh, that blood just popped on the screen as a visual effect, but it really makes it more difficult to perceive my environment" would be absolutely priceless commentary.

I think a lot of the lack of accessibility isn't due to unwillingness but instead due to being unsure if what you're doing is actually helpful. I think it's way too easy to try to do something clever and instead make your app/game/whatever harder to use for people both with and without disabilities.

Having that direct insight and the ability to observe firsthand how differently-abled individuals interact with software - both good and bad software - would be monumentally helpful.

I would imagine this already exists to some degree but, at least for me, it's quite hard to find. If anyone has any good recs for this, please let me know.



I have an eye condition. I am not fully blind, but I have a lot of trouble with text (I use the zoom feature, TTS, and big fonts on my desktop and in apps).

What would be super helpful for me would be things like:

1 allow changing fonts and font size; some games use weird "fantasy" or handwriting fonts that are very hard to read, so give us the option to use a plain font like Arial.

2 implement UI scaling so fonts and icons can be larger

3 don't make important game objects hard to find; maybe add an option to outline things when you press a key. When you have a disability there is always a doubt in your mind that there is something on the screen you can't see. This is super annoying in point-and-click adventure games where you need to spot a 1-pixel-sized item or some very well hidden object.

4 for games with a lot of unvoiced text (dialog, books, in-game computers, other stuff to read) I would love more games to support TTS (text to speech). I don't mean including a TTS engine in your game; just send the text to an application or local port that the user specifies, and the user can use their own TTS program with their preferred settings to listen to the text.

5 for dialog boxes, give us the option to set a fully opaque background and to set the text font family, color, and size. For some games I used OCR (Optical Character Recognition) to read the dialog windows, but OCR can fail or take a long time if the font is a weird gaming font or the dialog background has transparency.

6 if you make a game because you want to make the game (and not because you are learning some cool language or how game engines work, which is very good if you want to learn), try to use an existing game engine. These engines already have some accessibility support, or there is enough community experience on how to mod things in. So if you want to make a text adventure game, use the best tool for the job rather than creating your own engine.

I am sure some of this would also help a lot of people with milder eye issues, and they would appreciate some of these options.
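Item 4 above is cheap to support on the game side. A minimal sketch, assuming the player has entered a local port number in the game's options (the port value and `/speak` path here are illustrative, not any standard):

```python
import urllib.request

# Hypothetical accessibility setting: the player supplies this port in the
# game's options menu; if unset, the game never attempts to send text.
TTS_PORT = 5050

def speak(text: str, port: int = TTS_PORT) -> None:
    """Forward dialog text to the player's local TTS listener.

    Failures are swallowed so a missing or crashed listener never
    breaks the game.
    """
    try:
        req = urllib.request.Request(
            f"http://127.0.0.1:{port}/speak",
            data=text.encode("utf-8"),
            headers={"Content-Type": "text/plain; charset=utf-8"},
        )
        urllib.request.urlopen(req, timeout=0.5)
    except OSError:
        pass  # no listener running; the game carries on silently
```

The short timeout and silent failure mean the game stays responsive whether or not the player runs a listener.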


I would love if all games let you change the font and font size. I loved Fire Emblem: Three Houses, but god, I’m too nearsighted to deal with text that small.


> I don't mean including a TTS engine in your game; just send the text to an application or local port that the user specifies, and the user can use their own TTS program with their preferred settings to listen to the text.

How does this kind of thing work? I'm only (passingly) familiar with the “official” accessibility APIs, e.g. IAccessible, AT-SPI2, but these are clearly inadequate; I'm very curious to know how real people use computers.


I am not a regular user; what I have done is edit open source engines and hack them to call my TTS program/script. For HTML engines I make a request to localhost on a specific port where I have a script listening. For a C++ engine I modified, I used the run-process functions to call my program directly; I think I first put the text to be spoken in a text file.

One reason this is so hacky is that because these engines were not meant to do TTS, I can get a lot of extra garbage or duplicates, so the text first has to go through a game-specific script to clean it up.
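The listener half of the localhost approach is a tiny script on the player's side. A sketch, assuming `espeak` as the TTS command (any command-line TTS program the player prefers can be swapped in):

```python
import http.server
import subprocess

PORT = 5050  # the port the modded engine sends text to
TTS_COMMAND = ["espeak"]  # illustrative; substitute the player's preferred program

class SpeakHandler(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        text = self.rfile.read(length).decode("utf-8", errors="replace")
        # Hand the text to the player's TTS program without blocking
        # the next incoming request.
        subprocess.Popen(TTS_COMMAND + [text])
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the console quiet
        pass

def run(port: int = PORT) -> None:
    """Blocks forever; run this script in a terminal before launching the game."""
    http.server.HTTPServer(("127.0.0.1", port), SpeakHandler).serve_forever()
```

This is also the natural place to hook in the game-specific cleanup script mentioned above, before the text reaches the TTS program.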

So this could be implemented as:

1 the user inputs a path to a script/program

2 from the game engine, call that script and pass it the text as an argument; there might be limitations (e.g. argument length), so there might be better alternatives.

My TTS program implements a queue, so it is fine if you just dump a lot of text into it; I have keyboard shortcuts to handle skipping/pausing.

The Ren'Py game engine (Python) supports TTS, but I edited the TTS plugin and replaced their Linux default with my script since I get more features and flexibility.

P.S. I am tempted to also try to get the text from DirectX games, but I am not sure where to start. I am thinking I could intercept some DrawText function and replace it with my own, but I am not sure what terms to Google for, whether it can be done, or if there is a simpler way to detect the code that draws the dialog boxes in games and intercept that function.


For DirectX, I'd be tempted to intercept stuff from the text layout engine. (USP10.DLL, DirectWrite, or HarfBuzz.) Pretty much nothing lays out its own text.


"I think a lot of the lack of accessibility isn't due to unwillingness"

In most cases it is because it requires time, effort, and knowledge. Your average game dev studio does not have any of that.


> Your average game dev studio does not have any of that.

Reading this, I sort of wonder what kind of accessibility regulations Canada (FC6 being developed by Ubisoft Toronto) has for gaming, and how they compare to regulations for other technology providers, solutions, or products.


The main one is the Accessibility for Ontarians with Disabilities Act, which applies to “goods, services, facilities, accommodation, employment, buildings, structures and premises”.


It's easy to get first-hand experience: get a remote mouse and keyboard and then try to perform tasks on your laptop from 20 meters away.


Hire disabled people to use your app, listen to what they say, integrate their feedback, iterate. That's really what it comes down to. There are tools like aXe for testing obvious accessibility issues in websites, but they aren't a perfect replacement for hiring testers.


This is exactly my point though - hiring disabled people is simply not an option for most people. If information on meaningful accessibility was as readily available and consumable as learning how to code, I'd bet a lot of money people would be more inclined to make accessible software.



