
Devin Prater's blog

Over the last few months, I've been focusing a lot on Braille. Much of it is because the Bluetooth earbuds I have (Galaxy Buds Pro, LinkBuds S, Razer Hammerheads) either have poor battery life or have audio lag that's just annoying enough to make me not want to use them for regular screen reading. So, grabbing a Focus 14, I began to use Braille a lot. I've now spent a good two weeks using Android TalkBack's new Braille support, and two weeks with VoiceOver's Braille support.

In this article, I'll overview Android's past support for Braille, and talk about how its current support works. I'll also compare it to Apple's implementation. Then, I'll discuss how things could be better on both systems. Since there have probably been many posts on other sites about iOS' Braille support, I don't feel like I need to write much about that, but if people want it, I can write a post from that angle as well.

BrailleBack and BRLTTY

When Google first got into making accessibility a priority of Android, back around Android 2.3, it created a few stand-alone apps. Well, they were kind of standalone: TalkBack, the screen reader; KickBack, the vibration feature for accessibility focus events; and BrailleBack, for Braille support. There may have been more, but we'll focus on BrailleBack here. BrailleBack connected to Braille displays over Bluetooth, and used drivers to communicate with them. It started out well for a first version, but wasn't updated much. In the years that followed, the biggest update was support for a new, less expensive Braille display. This has been Google's problem for a while now: having great ideas, but not giving them the attention they need to thrive. Luckily, TalkBack is still being worked on, and hasn't been killed by Google. At least now, Braille support is built in. BrailleBack wasn't even preinstalled on phones while it was being developed, but TalkBack is. So, things may improve starting now.

BRLTTY started out as a Linux program. It connects to Braille displays using USB, serial, or Bluetooth, and supports a huge variety of displays. It tries to give the Braille user as much control over the Linux console from the display as possible, using as many buttons as a display has. It came to Android and offered a better experience for some use cases, but the fact that you can't type in contracted Braille, a sort of shorthand that is standardized into the Braille system, may be off-putting to some. Another issue is that it tries to bring the Linux console reading experience to an Android phone, which takes a bit of getting used to.

So, here, we've got two competing apps. BRLTTY gets updated frequently and has many more commands, but has a higher barrier to entry. BrailleBack is stale and supports few displays, but allows for writing in contracted Braille, and has more standardized commands. So, you'd think Deaf-Blind users would have choices, enough to use an Android phone, right?

App support matters

Let's take something that Braille excels at: reading. On Android, Braille support from Google has been poor up to this point, and because that support wasn't preinstalled, Deaf-Blind users couldn't easily set up their phones without knowing about a separate app, and without sighted assistance to install it. As a result, third-party apps, like Kindle, and even first-party apps, like Google Play Books, didn't take Braille into account during development. The Kindle app, for example, just has the user double tap a button, and the system text-to-speech engine begins reading the book. The Play Books app does similar, with the option to use the high-quality, online Google speech engine instead of the offline one.

This is how things are today, too. In Kindle, we can now read a page of text and swipe, on the screen, to turn the page. In Play Books, though, focus jumps around too much to even read a page of text. It's easier to just put on headphones and let the TTS read for you, so Braille literacy, for Android users, is too frustrating to cultivate.

So, if you want to read a book on your phone, using popular apps like Kindle, you have to use the system text-to-speech engine. This means that Braille users are cut out from this experience, the one thing Braille is really good at. There are a few apps, like Speech Central, which do display the text in a scrolling web view, so that Braille users can now read anything they can import into that app, but this is a workaround that a third-party developer shouldn't have to make. This is something that Google should have had working well about five years ago.


With the release of iOS 8, eight years ago, Apple gave Braille users the ability to “Turn Pages while Panning.” This feature allowed Braille users to read a book without having to turn pages themselves. Even before that, unlike Android even now, Braille users could use a command on their Braille display to turn the page. Eight years ago, they no longer had to even do that.

A year later, the Library of Congress released an app called BARD Mobile, allowing blind users to access books from the service for the blind and print disabled on their phones. Along with audio books, Braille books were available. Using a Braille display, readers could read through a book, which was just a scrolling list of Braille lines, without any semblance of print pages. Android's version of BARD Mobile got this feature about a year ago. And now, the new Braille support doesn't support showing text in Computer Braille, which is required to show the contracted Braille of the book correctly. I'd chalk this up to a tight schedule from Google, and to the team not having worked on this for long. Perhaps version 14 of TalkBack will include this feature, allowing Braille readers to read even Braille books.

Now in Android... Braille

With the release of TalkBack 13, Braille support was finally included. Beforehand, though, we got a bit of a shock when we found out that HID Braille wouldn't be supported. This, again, I can chalk up to the Braille support being very new, and the Android team responsible for Bluetooth not knowing that that's something they'd need to implement. Still, it soured what could have been a great announcement. Now, instead of supporting “all” displays, they support... “most” displays. So much for Android users being able to use their brand new NLS EReader, right? Technically, they can use it through BRLTTY, but only if it's plugged into the USB-C port. Yeah, very mobile.

The Braille support does, however, have a few things going for it. For one, it's very stable. I found nothing that could slow it down. I typed as fast as I could, but never found that the driver couldn't keep up with me. Compare that to iOS, where even on a stable build, there are times when I have to coax the translation engine into finishing translating what I've written. There's also this nice feature where, if you disconnect the display, speech automatically comes back on. Although, now that I think about it, that may only be useful for hearing blind people; Deaf-Blind people wouldn't even know until a sighted person told them that they now know all about that chat with the neighbor about the birthday party they were planning, and that it's no longer a surprise. Ah well, so much for the customizability of Android. In contrast, when speech is muted on iOS, it stays muted.

iOS doesn't sit still

In the years after iOS 8's release, Braille support has continued to improve. Braille users can now read Emoji, for better or worse, have their display automatically scroll forward for easier reading, and customize commands on most displays. New displays are now supported, and iOS has been future-proofed by supporting multi-line or tactile graphics displays.

iOS now also mostly supports displays that use the Braille HID standard, and work continues on finishing that support. This is pretty big, because the National Library Service for the Blind in the US, the same one that offers the BARD service, is teaming up with Humanware to provide an EReader which, while allowing one to download and read books from BARD, Bookshare, and NFB Newsline, also allows one to connect it to a phone or PC to be used as a Braille display. This means, effectively, that whoever wants Braille can get Braille. The program is still in its pilot phase, but will be launched sooner or later. And Apple will be ready.


No, Android doesn't support these new displays that use the Braille HID standard. It also doesn't support multi-line or graphics displays, nor does it support showing math equations in the special Nemeth Braille code, automatically scrolling the display, changing Braille commands, and so on. You may then say, “Well, this is just version one of a new Braille support. They've not had time to make all that.” Part of that is true. It is version one of TalkBack's new Braille subsystem. But they've had the same amount of time to build out both Braille support, and TalkBack as a whole, that Apple has. In fact, they've had the same eight years since iOS 8 to both learn from using Apple's accessibility tools, and to implement them themselves.

So, let's say that Google has been seriously working on TalkBack for the last three years, since new management took the wheel and, thankfully, steered it well. Google may now need at least four years to catch up to where Apple is today. Apple, however, isn't sitting still. They put AI into their screen reader years before the AI-first company, Google, did. How much longer will it take Google to add things like auto-scroll to their screen reader, to serve an even smaller subset of their small pool of blind users?

Neither system is perfect

While Apple's Braille support is fantastic, if rather rusty with age, both systems could be using Braille a bit better, to really show off why Braille is better than just having a voice read everything to you. One example that I keep coming back to is formatting. For example, a Braille user won't be able to tell what type of formatting I'm using here on either system, even though Braille has formatting symbols for what I just used. And no, status cells don't count; they can't tell a reader what part of a line was formatted, and the “format markers” used in Humanware displays are a lazy way of getting around... I don't even know what. If BrailleBlaster, using LibLouis and its supporting libraries, can show formatting just fine, I don't see why screen readers in expensive phones can't.

Both systems could really take a page from the early Braille notetakers. The BrailleNote Apex not only showed formatting, but showed things like links by enclosing them in Braille symbols, meaning that not only could a user tell where the link started and ended, just like sighted people can, they could do so in a way that needed no abbreviated word based on speech. BRLTTY on Android shows switches and checkboxes in a similar way, using Braille to build a nonvisual, tactile interface that uses Braille pictograms, for lack of a better term, to make screen reading a more delightful, interesting experience, while also shortening the Braille needed to comprehend what an interface item is. This kind of thing isn't done by anyone besides people who really understand Braille, read Braille, and want Braille to be as efficient, and enjoyable, as possible.

Another thing both companies should be doing is testing Braille rigorously. There is no reason why Braille users shouldn't be able to read a book, from start to end, using Google Play Books. There's also no reason why notifications should continue to appear on the display when they were just cleared. Of course, one issue is much more important than the other, but small issues do add up, and if not fixed, can drag down the experience. I really hope that, in the future, companies can show as much appreciation for Braille as they do for visuals, audio, haptics, and even screen readers.

Until then, I'll probably use iOS for Braille, image descriptions, and an overall smoother experience, and use Android for its raw power and its honestly better TTS engine, well, if you have a Samsung, that is. With the ability to customize Braille commands, iOS has given me an almost computer-like experience when browsing the web. Android has some of that, but not the ability to customize it.

Conclusion

I hope you all have enjoyed this article, and learned something from my diving into the Braille side of both systems. If so, be sure to share it with your friends or coworkers, and remember that speech isn't the only output modality that blind, and especially Deaf-Blind, people use. As Apple says on their accessibility site, Braille is required for Deaf-Blind users. Thank you all for reading.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

This post is to reflect on what can be gained by using Android, as opposed to iOS. My previous post, My Dear Android, talked a little about this, but I wanted to go into further detail here.

USB-C has won

I have a lot of accessories, for computers and phones. I have a game controller, which uses Micro USB, but if you buy it now, it'll likely come with USB-C. I have a game controller for a phone, which uses USB-C. I have a Braille display, which uses USB-C. In fact, I'd say just about every modern Braille display uses USB-C. I have USB-C earbuds. All of these technologies use USB-C or can be made to use it with a dongle.

When I use an iPhone, any iPhone today, I have to put all these accessories through a dongle. I don't have a USB-C to Lightning dongle yet, but I do have a Lightning to USB-A one. So, whenever I want to plug in, say, a USB-C flash drive, I can't. I can't plug my USB-C earbuds into the iPhone. Now, are there dongles for this? Sure. But why deal with that? USB-C has won, soundly, over Lightning. Lightning was always going to be a closed, Apple-only system. No one likes non-standard junk.

Audio and standards

As mentioned in another article, I have a pair of Sony LinkBuds S. These are truly wireless earbuds that have noise canceling and transparency mode, integrate with Google Assistant, Alexa, Spotify, and Endel, and sound fantastic. When I used them with my iPhone, which has Bluetooth 5.0 (which the newest iPhone SE 2022 also has), the lag was just too much to deal with. When I use them with Android, the lag is noticeable, yes, but much less, and much easier to deal with. This really pushed me back to Android. With iPhone, I would have to get all Apple products to have the best experience. I would need to get the new AirPods or AirPods Pro. I would need an Apple Watch. I would need a Mac. With Android, interoperability means I can get any Android-supported accessories, and they'll work just fine.

Another difference between the two ecosystems is that Google Assistant readily works with the LinkBuds S. Assistant reads incoming messages, reads notifications, and does just about everything one can do with the Pixel Buds Pro. On iPhone, there is no way to get Siri to automatically read new notifications unless you have a pair of AirPods. As this shows, Android works with many more accessory types, not just in a basic way, but supporting them to their fullest potential.

Also, did I mention the Bluetooth codecs? On Android, several phones have three or more different codecs, to support the widest range of audio types. On iPhone, there's just SBC, the lowest-quality codec that must be supported, and AAC, the codec Apple favors. No aptX, no LDAC, no LC3. So, even if you get an expensive pair of headphones that supports aptX low-latency audio, you won't get that support on an iPhone. To be fair, some Android phones don't support aptX either, but on Android, you have that choice of phones. On iPhone, you don't.

Works with Windows and Chromebooks

If you use your computer a lot, you may want to text from it. If you're blind, chances are you have a Windows PC. Well, iPhone works exclusively with Mac computers, so you can't text from your PC, or make calls from your PC, or control your phone from your PC. Oh yeah, you can't control your iPhone from your Mac either. With Android, you can do all this from a Windows computer. If you use Google Messages, you can even read and send texts from the web, using your phone and phone number. As an added bonus, the Messages for Web page is very accessible, and has keyboard commands for navigating to the conversations list or messages list.

This gives me the freedom to do what I want, from whatever device I'm on. I don't have to switch contexts from my computer, to my phone, just to send a text, or read a text. I can just open Messages for Web, and do everything there.

Are you a Developer?

How much do you think you'd save if you didn't have to pay $100 per year? That's how much it costs to have an app on the Apple App Store. If you're a blind developer, you may be paying for JAWS every year too, so that's $200 a year, just to make great apps for iPhone. Along with all that, you have to deal with the sometimes frustrating experience of not only using a Mac, but developing on it, in Xcode. Now, you may be using a framework like React Native, or BeeWare for Python, where you don't have to code in Swift, or touch Xcode all that much. If so, that probably cuts down on a lot of stress. But you still have to spend money just to keep your app on the App Store.

On Android, all a developer needs to do is pay $25, once. That's it. There is the 15% service fee on in-app purchases, but if your app is free, you don't have to worry about any of that. Also, you aren't limited to one language. You can use Java, Kotlin, some C++, C#, Python, JavaScript, Dart, and Corona (Lua). Of course, a few of these, like JavaScript (React Native and such) can be used to create iPhone apps too. But with Android, you can use your superior Windows platform, VS Code, and NVDA or JAWS to develop Android apps easily. Also, Android Studio is accessible on Windows too.

Accessibility, the mixed bag

Now we get into the thing I'm all about: accessibility. If you use apps like Telegram, DoorDash, Messenger, YouTube, and others, you may find that they don't work as well as they should on iPhones. YouTube, just recently, gained a bug where you can't go past the third or so item in the home tab. Android doesn't have that problem. DoorDash has reviews in the middle of their menus, and tells you the time the delivery will reach you, not the estimated time in minutes as it does on Android. In Telegram on the iPhone, if you have a message that covers more than the screen height, VoiceOver will not navigate to the next message until you scroll forward. On Android, TalkBack will eventually reach the next message, and will not get stuck.

This shows, to me at least, a slice of something strange. Android seems to have a more flexible accessibility framework, allowing for code to tell more of the story than visuals. On iPhone, VoiceOver doesn't look past the current screen of content, or the cross-platform framework doesn't tell VoiceOver about it, but does tell Android and lets TalkBack navigate to it. However the code works, it results in a worse experience on iPhone, and a better one on Android. I can't argue with results.

Now, for image descriptions. I do miss them, being on Android. But I'm sure Google is working on them, with its testers. After all, TalkBack can describe text, and icons, now, and it does that very well. So I'm sure they'll get image descriptions down in maybe a year. In the meantime, I still have an iPhone, Lookout, Bixby Vision, and Envision to hold me over.

I'm also hoping Google works on audible graphs, as that's pretty helpful. I could see them integrating that with image descriptions to describe graphical graphs, which iOS doesn't do yet.

Now, for Braille, things have improved. I grabbed a Focus 14 to work with, and find that I can use my phone with Braille for about 30 minutes without growing tired of it. One really nice thing that TalkBack does is focus management. If you leave an app, then come back to it, focus will remain at the spot where you left it. So, if you're reading a Reddit thread in RedReader, and you go to Messenger to read and reply to a message, when you come back to RedReader, your focus will be on the exact comment you left it on. I don't recall that ever happening on the iPhone.

Mostly, it's a very good start for Android's new Braille implementation. One that, even though it's new, is very stable, and all commands work fine. There isn't the issue HID-based displays have on iOS, where you cannot assign the “enable autoscroll” command and such. Input works great, and there is no time when the input process gets stuck and you have to press the “translate” command several times to plunge it out.

Conclusion

After spending a week with iPhone, I'm back on Android. Yes, I'm looking forward to greater accessibility, like image descriptions, Braille improvements, and audible graphs and charts, but I also love what Android is right now. Android is open, allows for greater innovation by developers, allows accessory manufacturers to create great, integrated experiences, and in quite a few cases, is even more accessible than the iPhone.

Android also allows one to use many of its services from a Windows computer, which is more popular in the blind community than Macs. This allows the user to stay in the same context, without needing to pull out a phone just to check a text. One can also make calls and control their Android phone from a PC.

In closing, thanks for reading this article on my journey with Android and iPhone. I know I'm not done with this, and as the two operating systems grow and age, things will change, on either Android's or iOS' side. Feel free to subscribe to my blog, or leave comments.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

Last night, I turned on my Galaxy S20 FE (5G) again, to update apps and to compare a week away with the iPhone to how Android feels now. And, I must say, Android is still charming to me.

Telegram works better than on iOS. On iOS, VoiceOver gets stuck on a long message, not moving to the next message at all until you scroll forward. I've not tried WhatsApp yet, but I wouldn't be surprised if it worked better there too. Also, DoorDash is a lot easier to use on Android, without all the reviews and junk getting in my way like on iOS.

But the big thing was my earbuds. I have a pair of Sony LinkBuds S, which sound great, work with either Google Assistant or Alexa directly (not through the framework where you hold down the home button and use that sort of voice control interface), and have all the good stuff like noise canceling and transparency mode, which can also be changed through Google Assistant.

So, I can use them with my iPhone XR. They work pretty well, and I can use Alexa through them. But the latency is awful, and it took a few tries before I could get them set up. On Android, though, the latency is mild enough that I can deal with it, and setup was quick and easy. This is a symptom of Apple's issue of wanting control. I don't have the AirPods. I don't have the AirPods Pro. I do have the Sony LinkBuds S, which probably blow all AirPods out of the water with their pretty literally chest-thumping bass (at least for me and my hearing). The AirPods Pro, first generation, didn't have that. I have little hope that the second generation, or the regular AirPods third generation (that can get confusing really fast), would have it. Plus, there's one cord to rule them all.

That's right, USB-C. I love it! It's everywhere, used on just about everything, and I can connect my phone to my dock at work and use it with a keyboard and wired headphones. Speaking of wired headphones, there are actually USB-C headphones. There aren't many Lightning headphones. Yes, I can get Apple's wired headphones. But those are $30. What if I want a pair of $250 cans I can rock out to?

Lastly, TalkBack is on a good path. They've added basic Braille support, which they'll hopefully be improving throughout the coming year; Android 13 has audio description APIs; and hopefully the next update to TalkBack will add image descriptions, so I can see my cat that I had to give up recently. Poor little Ollie. On the iPhone, while image descriptions are bright and vibrant, Braille is starting to suffer a good many pesky bugs that make me not even want to use it. Maybe one or two will be fixed when iOS 16 is released, but they've got a week to do it, and I don't see them spending that much time on a minority of a minority. However, TalkBack's Braille support, while new, is pretty solid, a good base to build upon.
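
For the curious, the audio description piece is a real, public API in Android 13. Here's a minimal Kotlin sketch of how an app might check it; the AccessibilityManager call is the actual API 33 method, while the function around it is just my own naming:

    import android.content.Context
    import android.view.accessibility.AccessibilityManager

    // Returns true when the user has asked the system for audio
    // description tracks (Settings > Accessibility, Android 13/API 33+).
    fun audioDescriptionsWanted(context: Context): Boolean {
        val manager = context.getSystemService(AccessibilityManager::class.java)
        return manager?.isAudioDescriptionRequested ?: false
    }

A video player could check this once and pick the described audio track automatically, instead of making the user dig through its own settings.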

So, I wanted to post this to balance things out from my other post, when I went from Android to Apple. My journey is definitely not over, and neither are the two operating systems in question. While we know what iOS 16 brings (new voices, door detection, and probably other stuff), the TalkBack team has been pretty tight-lipped about what they've been working on. I miss the days when we had more open dialog with them. But at least they have a blind person on the team who does interact with the community some.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

During the last six months, I’ve used Android exclusively as my primary device. I made calls with it, texted with it, read emails with it, browsed the web, Reddit, Twitter, and Facebook with it, and played games on it. Now, I’m going to express my thoughts on it, its advancements, and its issues.

This will contain mostly opinions, or entirely opinions, depending on whether you really love Android or not. But whatever your stance, these are my experiences with the operating system. My issues may not be your issues, and so on.

Daily Usage

To put things into perspective, I’ve used my phone, the Samsung Galaxy S20 FE 5G, for the following, with the apps I’ve used:

  • Email: Gmail, Aquamail, DeltaChat
  • Messaging: Google Messages
  • RSS: Feeder
  • Podcasts: Podcast Addict, PocketCasts
  • Terminal: Termux
  • Home screen: One UI
  • Screen reader: Google TalkBack
  • Speech Engine: Samsung TTS and Speech Services by Google
  • Text Recognition/object detection: Google Lookout and Envision AI
  • Gaming: PPSSPP, Dolphin, AetherSX2
  • Reddit: RedReader

I’m sure I’m forgetting a few apps, but that’s basically what I used most often. For Facebook, Twitter, YouTube and other popular services, I used their default apps, with no modifications. I used all the Google services that I could, and rarely used Samsung’s apps. So, this is to show that I was deep into the Android ecosystem, with Galaxy Buds Pro, a ChromeBook, and a TickWatch E3.

The good

I want to start off the comparison with what worked well. First, Samsung TTS voices are really nice, sounding even smoother, sometimes, than Alex on iOS, and much more so than the Siri voices. I still love the Lisa voice, which, to me, sounds as close to Alex as possible with her cadence and professional-sounding tone. Yes, the voices could be sluggish if fed lots of text at once, but I rarely ran into that.

I also love the wide variety of choice. Apple only includes the AAC Bluetooth codec on their iPhones, beyond the baseline SBC. So if your headphones use aptX, or Samsung's variable codec, or some other codec, it won't matter, and you'll fall back to SBC, which sounds the worst of all of them. If your headphones have AAC, of course, it'll get used on the iPhone. But if not, you're stuck with SBC. Android phones, though, usually come with a few different codecs for headphones to choose from, and in the developer settings, you can choose the codec to use.

Another great feature of all modern Android phones is USB-C. Everything else uses USB-C now, including Braille displays, computers, watches, and even iPads. With Android, you can plug all these things into your phone with the same cable. If your flash drive has USB-C, you can even plug that in! With iPhone, though, you have to deal with Lightning, which is just another cable, and one you'll likely have fewer of, since less stuff uses it.

The last thing is that Android phone makers typically try out new technology before Apple does, leading to bigger cameras, folding phones, or faster Wi-Fi or cellular data. Now that the new iPhone SE has 5G, and probably the latest Wi-Fi, though, that's most likely less of an issue. Still, if you like folding phones, Android is your only choice right now.

Starting on the software, it's pretty close to Linux, so if you plug in a game controller, keyboard, or other accessory, it'll probably just work. If you have an app for playing music with a MIDI keyboard, and you plug one in, it'll likely work. On iPhone, though, you need dedicated apps for more things, like headphones and such.

Another nice thing, beginning in the accessibility arena, is that the interface is simple. Buttons are on the screen at all times, not hidden behind menus or long-press options like they are a lot of the time on iOS. If you can feel around the screen with one finger, or swipe, you’ll find everything there is to find. This is really useful for beginners.

Another pleasant feature is the tutorial. The TalkBack tutorial guides Android users through just about every feature they’ll need, and then shows them where they can learn more. VoiceOver has nothing like that.

On Android, things are a lot more open. Thanks to that, we have the ability for multiple screen readers, or entirely new accessibility tools, to be made for Android. This allows BRLTTY to add, at least, USB support for HID Braille displays, and Commentary to OCR the screen and display it in a window. This is one of the things that really shines on Android.
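
To give a sense of how low that barrier is, here's a minimal Kotlin sketch of the entry point all of those tools build on. The class name and log line are mine, but AccessibilityService is the real hook Android exposes:

    import android.accessibilityservice.AccessibilityService
    import android.util.Log
    import android.view.accessibility.AccessibilityEvent

    class MinimalAccessibilityTool : AccessibilityService() {
        override fun onAccessibilityEvent(event: AccessibilityEvent) {
            // Every focus change, click, and text update on screen arrives
            // here; a screen reader routes these to speech or to Braille.
            Log.d("MinimalTool", "event ${event.eventType} from ${event.packageName}")
        }

        override fun onInterrupt() {
            // Called when the system wants any feedback stopped.
        }
    }

A service like this still has to be declared in the app's manifest with the BIND_ACCESSIBILITY_SERVICE permission and switched on by the user, but past that, it receives the same event stream TalkBack does. That's the openness that makes things like BRLTTY and Commentary possible.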

The bad

Those Bluetooth headphones I was talking about? The Galaxy Buds Pro are very unresponsive with TalkBack, making them almost useless for daily use. The TicWatch has its own health apps, so it doesn't always sync with Google Fit, and doesn't sync at all with Samsung Health. Otherwise, the watch is a nice one for Android users. On iPhone, though, it doesn't even share its health data with the Health app, just with Google Fit, which doesn't sync with the Health app either.

A few days ago, a few things happened that brought the entire Android experience into focus for me. I was using the new Braille support built into TalkBack with an older Orbit Reader Braille display, since my NLS EReader doesn't work with TalkBack, there being no Bluetooth Braille HID support. I found that reading in Braille using the Google Play Books app is really not a great experience. Then I found a workaround, which I'll talk about soon, but the fact remains that it's not a great experience on Google Play Books.

So, I got confirmation that someone else could reproduce the issue: the display is put back at the beginning of the page before even reading the next-page button, so one cannot easily move to the next page. I then contacted Google Disability Support. Since their email address was no longer their preferred means of contact, I used their form. On Twitter, they always just refer you to the form.

The form was messy. With every letter typed into the form, my screen reader, NVDA on Windows, repeated the character count, and other information. It’s like no blind person has ever tested the form that blind people are going to use to submit issues. “No matter,” I thought. I just turned my speech off and continued typing, in silence.

When the support person emailed me back, I was asked some general info, and to take a video of the issue. This would require, for me, a good bit of setup. I’d need three devices: the phone, the Braille display, and something to take the video with, probably a laptop. Then I’d need to get everything in the frame, show the bug, and hope the support person can read Braille enough to verify it.

This was a bit much for me. I have a hard job, and I have little energy afterwards. I can’t just pop open a camera app and go. So, I asked around, and found a workaround. If you use the Speech Central app, you can stop the speech, touch the middle of the screen, and then read continuously. But why?


This really brought home to me the issues of Android. It’s not a very well put together system. The Google Play Books team still uses the system speech engine, not TalkBack, to read books. The Kindle app does the same thing. There is barely a choice, since TalkBack reads the books so poorly. This is Google’s operating system, Google’s screen reader, and Google’s book-reading app. There is little excuse for them to not work well together.

Then, either that night or the night after, I got a message on Messenger. It was a photo message. So, naturally, I shared it with Lookout, by pressing Share, then finding Lookout in the long list of apps, double tapping, waiting for the image recognition, and reading the results. And then I grabbed the iPhone, opened Messenger, opened the conversation, double-tapped the photo, put focus on the photo, and heard a good description. And I thought, “Why am I depriving myself of better accessibility?”

And there’s the big issue. On iOS, Braille works well, supports the NLS EReader, and even allows you to customize commands, on touch, Braille, and keyboard. Well, there are still bugs in the HID Braille implementation that I’ve reported, but at least the defaults work. That’s more than I can say for TalkBack and Android.

And then the big thing, image descriptions, and by extension, screen recognition. TalkBack finally has text detection, and icon detection. That’s pretty nice. But why has it taken this long? Why has it taken this long to add Braille support? Why do we still have robotic Google TTS voices when we use TalkBack? After all these years, with Google’s AI and knowledge, Android should be high above iOS on that front. And maybe, one day, it will be. But right now, Android’s accessibility team is reacting to what Apple has done. Braille, image descriptions, all that. And if there’s a central point to what I’ve learned, it’s this: do not buy a product based on what it could be, but what it currently is.

Then, I started using the iPhone more, noticing the really enjoyable, little things. The different vibration effects for different VoiceOver actions. Not just one for the “gesture begin”, “gesture end,” “interactable object reached”, and “text object reached”. No, there are haptics for alerts, reaching the boundary of the screen or text field, moving in a text field, using Face ID, and even turning the rotor. And you can turn each of these on or off. What’s that about Android being so customizable?

Then there’s the onscreen Braille keyboard. On Android, to calibrate the dots, you hold down all six fingers, hold it a little longer, just a bit longer… Ah, good, it detected it this time. Okay, now hold down for two more seconds. Now you’re ready to type! Yes, it takes just about that long.

On iOS, you quickly tap the three fingers of your left hand, then the fingers of your right hand, and you’re ready! Fast, isn’t it? These kinds of things were jarring with their simplicity, coming from Android, where I wasn’t even sure if calibration would work this time. I do miss the way typing on the Android Braille keyboard would vibrate the phone, letting you know that you’d actually entered that character. However, the iPhone’s version is good enough that I usually don’t have to worry about that.

I want to talk a bit more about image descriptions. While I was on Android, I learned to ignore images. Sure, I wanted to know what they were, but I couldn’t easily get that info, not in like a few seconds, so I left them alone. On iOS, it’s like a window was opened to me. Sure, it’s not as clear as actually having sight, and yes, it gets things wrong occasionally. But it’s there, and it works enough that I love using it. Now, I go on Reddit just to find more pictures!

And for the last thing, audio charts. Google has nothing like this. They try to narrate the charts, but it’s nothing like hearing the chart, and realizing things about it yourself. Hearing the chart is also much faster than hearing your phone reading out numbers and labels and such.

The ugly

Here, I’ll detail some ugly accessibility issues on Android, that really make iOS look as smooth as glass in comparison. Some people may not deal with this, but I did. Maybe, by the time you read this article, they’ll be fixed in Android 13, or a TalkBack update or something.

First, text objects can’t be too long, or TalkBack struggles to move on to the next one. This can be seen best in the Feeder app, which, for accessibility reasons, uses the Android native text view for articles. This is nice, unless a section of an article spans one screen of text. Take the Rhythm of War rereads on Tor. Some of those sections are pretty long, and it’s all in one text element. So TalkBack will speak that element as you swipe to the next element, until it finally reaches the next one. This can take one swipe, or three, or five. This happens a lot in Telegram too, where messages can be quite long.

Another issue is clutter. A lot of the time, every button a user needs is on the screen. For example, the YouTube app has the “Go to channel” and “Actions” buttons beside every video. This means you have to swipe three times per video. On iOS, each video is one item, and the actions are in the actions rotor. TalkBack has an actions menu, but apps rarely use it. Gmail does, for example, but YouTube doesn't. This makes things even more tricky for beginners, who then have to remember which app uses it and which doesn't, and how to get to it and such.
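
For any app developers reading this, putting actions into that menu is a small amount of code. Here's a hedged Kotlin sketch of what a list row like YouTube's could do, using the androidx ViewCompat API; the row view and callbacks are hypothetical stand-ins:

    import android.view.View
    import androidx.core.view.ViewCompat

    fun exposeRowActions(videoRow: View, openChannel: () -> Unit, showMenu: () -> Unit) {
        // These appear in TalkBack's actions menu instead of being
        // separate stops in the swipe order.
        ViewCompat.addAccessibilityAction(videoRow, "Go to channel") { _, _ ->
            openChannel()
            true
        }
        ViewCompat.addAccessibilityAction(videoRow, "More actions") { _, _ ->
            showMenu()
            true
        }
    }

With that, a TalkBack user swipes once per video and pulls the extra actions from the menu, much like the actions rotor on iOS.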

When an Android phone wakes up, it reads the lock status, which is usually something like “Swipe with two fingers or use your fingerprint to unlock.” Then, it may read the time. That's a lot of words just to check what time it is. An iPhone dependably reads the time, and then the number of notifications. Apple's approach is a breath of fresh air, laced with the scent of common sense and testing by actual blind people. This may seem like a small thing, but those seconds listening to something you've heard a hundred times before add up.

If you buy a Pixel, you get Google TTS as your speech engine. It sucks pretty badly. They're improving it, but TalkBack can't use the improvements yet, even though other screen readers and TTS apps can. Crazy, right? However, with the Pixel, you get Google's Android, software updates right at launch, and the new Tensor processor, voice typing, and so on. If you get a Samsung, you get a good TTS engine, for English and Korean at least. You also get a pretty good set of addons to Android, but a six-month-old screen reader and an OS that won't be updated for about six months either. This is pretty bad mostly because of TalkBack. You see, there are two main versions of TalkBack: Google's version, and Samsung's version. Samsung's TalkBack is practically the same as Google's, but at least one major version behind, all the time.

With iPhone, you get voices aplenty, from Eloquence (starting next month) to Alex, the Siri voices, and the Vocalizer voices, with rumors that third-party TTS engines can be put on the store soon. You get a phone that, even as old as the 8, can get the latest version of iOS, and get it on the day it's released. And there is no older version of VoiceOver just floating around out there.

Further thoughts

I still love Android. I love what it stands for in mobile computing: a nice, open source, flexible operating system that can be customized by phone makers, carriers, and users to be whatever is needed. But there really isn't that kind of drive for accessibility. TalkBack languished for years and years, and is only just now hurrying to catch up to VoiceOver. Will they succeed? Maybe. However, VoiceOver isn't going to sit still either. They now have the voice that many in the blind community can't do without. On Android, that voice, Eloquence, is now abandoned, and can't be bought by new users. And when Android goes 64-bit only, who knows whether Eloquence will work at all. iOS, on the other hand, officially supports Eloquence, the Vocalizer voices, and even the novelty voices from the Mac. They won't be abandoned just because a company can't find it within themselves to maintain such a niche product. Furthermore, all these voices are free. Of course, when a blind person buys an $800 phone, they'd better be free.

I’m also not saying iOS is perfect. There are bugs in VoiceOver, iOS, and the accessibility stack. Braille in particular suffers from some bugs. But nothing is bug-free. And no accessibility department will be big, well-staffed, well-funded, or well-appreciated. That’s how it is everywhere. The CEO or president or whoever is up top will thank you for such a great job, but when you need more staff, better tools, or just want appreciation, you’ll often be gently, but firmly, declined. Of course, the smaller the company, the less that may happen, but the disability community can never yell louder than everyone else. Suddenly, the money, the trillions, or billions of dollars, just isn’t there anymore when people with disabilities kindly ask.

But, the difference I see is what the two accessibility teams focus on. Apple focuses on an overall experience. When they added haptics to VoiceOver, they didn’t just make a few for extra feedback, they added plenty, for a feedback experience that can even be used in place of VoiceOver’s sounds. When they added them, they used the full force of the iPhone’s impressive haptic motor. Just feel that thump when you reach the boundary of the screen, or the playful tip tick of an alert popping up, or the short bumps as you feel around an empty screen. All that gives the iPhone more life, more expression in a world of boring speech and even more boring plain Braille.

The iPhone has also been tested in the field, for even professional work like writing a book. One person wrote an entire book on his iPhone, about how a blind person can use the iPhone. That is what I look for in a product, that level of possibility and productivity. As far as I know, a blind person has never written a book using Android, preferring the “much more powerful” computer. I must say that the chips powering today's phones are sometimes even more powerful than laptop chips, especially from older laptops. No, it’s the interface, and how TalkBack presents it, that gets in the way of productivity.

Lastly, I’m not saying that a blind person cannot use Android. There are hundreds of blind people that use Android, and love it. But if you rely on Braille, or love image descriptions, or the nice, integrated system of iOS, you may find Android less productive. If you don’t rely on these things, and don’t use your phone for too much, then Android may be a cheaper, and easier, option for you. I encourage everyone to try both operating systems out, on a well-supported phone, for themselves. I’ll probably keep my Android phone, since I never know when a student will come in with one. But I most likely won’t be using it that much. After all, iOS and VoiceOver, offer so much more.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

This is going to be a more emotional post, which mirrors my mental state as of now. I just have to write this down somewhere, and my blog should be a good place to put it. It may also be helpful for others who struggle with this.

I've used just about every operating system out there, from Windows to Mac to Linux, ChromeOS, Android, and iOS. I've still not found one I can be completely happy with. I know I may never find an OS that fits me perfectly, but so many others have found Linux to be all they ever need. I wish I could find that: the feeling of not needing to switch to another laptop just to get a good terminal with a screen reader that will always read output, or the ability to use Linux GUI apps, like GPodder or Emacs with Emacspeak.

There are times when Windows is great: reading on the web, games, and programs made by blind people to help with Twitter, Telegram, Braille embossing, and countless screen reader scripts. Other times, I want a power-user setup. I want GPodder for podcasts, or to explore Linux command-line apps. I asked the Microsoft accessibility folks about Linux GUI accessibility, and they just said to use Orca. I've never gotten Orca to run reliably on WSL2. It's always been reliable on ChromeOS with Crostini.

Whenever I get enough money, I'll get 16 GB RAM, so maybe I can run a Linux VM. But still, that's not bare metal. And if I switch to Linux, I would have to run a Windows VM, for the few things that run better on Windows, like some games, and probably the Telegram and Twitter support. It's all just kind of hard to have both. Dual booting may work, but I've also heard that Windows gets greedy and messes with the bootloader.

But, with there now being a blind person working on Linux accessibility at Red Hat, I hope that, soon, I won't need Windows anymore. I can hope, at least. Still, with there being a few in the hardcore Linux community who have the mindset that I must fix everything myself, I must remain cautious, and unexcited, about this development, lest the little joy that a full-time Linux accessibility hire gives me is taken away by their inflexibility and cold, overly logical mindset.

But, I'm not done yet. With the little energy taking vitamins has given me, I've made a community for FOSS accessibility people on Matrix, bridged to IRC. I continue to study books on Linux, although I've not gotten up the energy to continue learning to program and practice. Maybe I'll try that today.

Mostly, I don't want newcomers to Linux to feel as alone in their wrestling with all this as I do. All other blind people are already so far ahead. Running Arch Linux, able to code, or at least happy with what they have and use. I don't want future technically inclined blind people to feel so alone. Kids who are just learning to code, who are just getting into GitHub, who are just now learning about open source. And they're like “so what about a whole open source operating system?”

And then they look, find Linux, and find so few resources for it for them. Nothing that they can identify with. Well shoot, there it is. Documentation, I guess. I do want to wait until Linux, and GNOME or whatever we ultimately land on, is better. Marco (in MATE) shouldn't be confused whenever a Qt or Electron-based app closes and focus is left out in space somewhere. An update shouldn't break Electron apps' ability to show a web view to Orca. And we definitely shouldn't be teaching kids a programming language, Quorum, made pretty much specifically for blind people. But I'm glad we're progressing. Slowly, yes, but it's happening at least.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

Why tools made by the blind are the best for the blind.

Introduction

For the past few hundred years, blind people have been creating amazing technology and ways of dealing with the primarily sighted world. From Braille to screen readers to canes and training guide dogs, we've often shown that if we work together as a community, as a culture, we can create things that work better than what sighted people alone give to us.

In this post, I aim to celebrate what we've made, primarily through a free and open source filter. This is because, firstly, that part of what we've made is almost always overlooked and undervalued, even by us. And secondly, it fits with what I'll talk about at the end of the article.

Braille is Vital

In the 1800s, Louis Braille created a system of writing made up of six dots, configured in two columns of three, which formed letters. This followed the languages of print, but in a different writing form. The system, called Braille after its inventor, became the writing and reading system of the blind. Most countries, even today, use the same configurations created by Louis, but with some new symbols for each language's needs. Even Japanese Braille uses something resembling that system.
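
Digital Braille keeps that six-dot structure intact. As an illustration, Unicode reserves a whole block for Braille cells, where each of the six dots is one bit; this little Kotlin sketch (my own, purely for demonstration) builds cells from dot numbers:

    // Each Unicode Braille cell is U+2800 plus a six-bit mask, where
    // bit 0 is dot 1 (top left) and bit 5 is dot 6 (bottom right).
    fun cell(vararg dots: Int): Char {
        require(dots.all { it in 1..6 }) { "six-dot Braille only" }
        val mask = dots.fold(0) { acc, d -> acc or (1 shl (d - 1)) }
        return '\u2800' + mask
    }

    fun main() {
        println(cell(1))       // ⠁, the letter a: dot 1
        println(cell(1, 2))    // ⠃, the letter b: dots 1 and 2
        println(cell(1, 4, 5)) // ⠙, the letter d: dots 1, 4, and 5
    }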

Now, Braille displays are becoming something that the 20 or 30 percent of blind people who are employed can afford, and something that the US government is creating a program to give to those who cannot afford one. Thus, digital Braille is becoming something that all screen reader creators, yes, even Microsoft, Apple, and Google, should be working with heavily. Yet Microsoft doesn't even support the new HID Braille standard, and neither does Google. Apple supports much of it, but not all of it. As an aside, I've not even been able to find the standards document, besides this technical notes document from the NVDA developers.

However, there is a group of people who have taken Braille seriously since 1995: the developers of BRLTTY, of which you can read some history. This program basically makes Braille a first-class citizen in the Linux console. It can also be controlled by other programs, like Orca, the Linux graphical interface screen reader.

BRLTTY has gone through the hands of a few amazing blind hackers (as in incredibly competent programmers) to land at https://brltty.app, where you can download it not only for Linux, its original home, but for Windows, and even Android. BRLTTY not only supports the Braille HID standard, but is the only screen reader that supports the Canute 360, a multi-line Braille display.

BRLTTY, and its spin-off project of many Braille tables (called LibLouis), have proven so reliable and effective that they've been adopted by proprietary screen readers, like JAWS, Narrator, and VoiceOver. VoiceOver and JAWS use LibLouis, while Narrator uses both. This proves that the open source tools that blind people create are undeniably good.

But what about printing to Braille embossers? That is important too. Digital Braille may fail to work for whatever reason, and we should never forget hardcopy Braille. Oh hey, lookie! Here's a driver for the Index line of Braille embossers. The CUPS (Common Unix Printing System) program has support, through the cups-filters package, for embossers! This means that Linux, that impenetrable, unknowable system for geeks and computer experts, contains, even out of the box on some systems, support for printing directly to a Braille embosser. To be clear, not even Windows, macOS, or iOS has this. Yes, Apple created CUPS, but they've not added the drivers for Braille embossers.

Let that sink in for a moment. All you have to do is set up your embosser, set the Braille code you want to emboss from and the paper size, and you're good. If you have a network printer, just put in the IP address, just like you'd do in Windows. Once that's sunk in, I have another surprise for you.

You ready? You sure? Okay then. With CUPS, you can emboss graphics on your embosser! Granted, I only have an Index D V5 to test with, but I was able to print an image of a cat, and at least recognize its cute little feet. I looked hard for a way to do this on Windows, and only found an expensive tactile graphics program. With CUPS, through its connections to other Linux programs like ImageMagick, you can get embossed images, for free. You don't even have to buy extra hardware, like embossers made especially for embossing graphics!
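
If you'd rather script your embossing than click through dialogs, CUPS queues show up as ordinary system printers, so even the JVM's standard printing API can reach an embosser. A minimal desktop Kotlin sketch, assuming a queue that's already set up (named “index-d-v5” here) and a ready-made BRF file; both names are just examples:

    import java.io.File
    import javax.print.DocFlavor
    import javax.print.PrintServiceLookup
    import javax.print.SimpleDoc

    fun emboss(queueName: String, path: String) {
        // CUPS queues are visible to the JVM as regular print services.
        val service = PrintServiceLookup.lookupPrintServices(null, null)
            .firstOrNull { it.name == queueName }
            ?: error("No print queue named $queueName")
        // The queue's driver and filters do the Braille-specific work,
        // so the job is handed over as a plain byte stream.
        val doc = SimpleDoc(File(path).inputStream(), DocFlavor.INPUT_STREAM.AUTOSENSE, null)
        service.createPrintJob().print(doc, null)
    }

    fun main() = emboss("index-d-v5", "letter.brf")

In principle, the same call with an image file would go through the graphics filters described above, though I'd test with a simple, high-contrast picture first.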


Through both of these examples, we see that Braille is vital. Braille isn't an afterthought. Braille isn't just a mere echo of what a screen reader speaks aloud. Braille isn't a drab, text-only deluge of whatever a sighted person thinks is not enough or too much verbiage. Braille is a finely crafted, versatile, and customizable system which the blind create, so that other blind people can be productive and happy with their tools, and thus lessen the already immense burden of living without sight in a sighted world. And if electronic Braille fails, or if one just wants to use printed material like everyone else can, that is available, and ready for use, both to print text and pictures.

Speech matters too

If a blind person isn't a fast Braille reader, was never taught Braille, or just prefers speech, then that option should not just be available for them, but be as reliable, enjoyable, and productive an experience as possible. After all, wouldn't a sighted person get the best experience possible? Free and open source tools may not sound the best, but work is being done to make screen readers as good as possible.

In the Linux console, there are three options: Speakup, Fenrir, and TDSR. On the desktop, the screen reader has been Orca, but another, called Odilia, is being written by two blind people in the Rust programming language.

If one uses the Emacs text editor, one can also take advantage of Emacspeak. This takes information not from accessibility interfaces, but Emacs itself, so it can provide things like aural syntax highlighting, or showing bold and italics through changes in speech.

Community

There are several communities for blind Linux and open source users: Blinux, the Orca mailing list, the LibreOffice Accessibility mailing list, and the Debian Accessibility mailing list.

Recently, however, there is a new way for all these groups, and sighted developers, to join together with, hopefully, more blind people, more people with other disabilities, and other supporters: the Fossability group. This is, for now, a Git repository, mailing list, and Matrix space. It's where we can all make free and open source software, like Linux, LibreOffice, Orca, Odilia, desktop environments, and countless other projects, as useful and accessible as possible.

Blind people should own the technology they use. We should not have to grovel at the feet of sighted people, who have little to no idea what it's like to be blind, for the changes, fixes, and support we need. We should not have to wait months for big corporations (corpses) to gather their few accessibility programmers to add HID Braille support to a screen reader. We should not have to wait years for our file manager to be as responsive as the rest of the system. We should not have to wait a decade for our screen reader to get a basic tutorial, so that new users can learn how to use it. We should not have to beg for our text editor to not just support accessibility, but support choices as to how we want information conveyed. This kind of radical community support requires that blind people be able to contribute up the entire stack, from the kernel to the screen reader. And with Linux, this is entirely possible.

Now, I'm not saying that sighted people cannot be helpful; it's the exact opposite. Sighted people designed the GUI that we all use today. Sighted people practically designed all forms of computing. Sighted developers can help because they know graphical toolkits, and can help us fix any accessibility issues with them. And I'm not trying to demean the ongoing, hard, thankless job of maintaining the Orca screen reader. Again, that's not even the job the maintainer gets paid for. However, I do think that if more blind people start using and contributing to Linux and other FOSS projects, even with just support or bug reports, a lot of work will be lifted from sighted people's shoulders.

So, let's own our technologies. Let's take back our digital sovereignty! We should be building our own futures, not huge companies with overworked, underpaid and underappreciated, burnt-out and understaffed accessibility engineers. Because while they work on proprietary, closed-off, non-standard solutions, we can build on the shoulders of the giants that have gone before us, like BRLTTY, the CUPS embosser drivers, and so many other projects by the blind, for the blind. And with that, we can make the future of Assistive Technology open, inviting, welcoming, and free!


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

A few positive things for a change

In the past, I've mostly written articles about problems with operating systems, products, services, and general technology. But, in this article, I want to shed a little light on what good things are going on. This doesn't really negate all the bad, but it helps to think about the good things that are going on more than just the bad.

Google

Lately, Google has been putting a lot more effort into Android accessibility than in previous years. A few years ago, Google added multi-finger commands to TalkBack. This means that complex two-part gestures, like swiping up then right, or right then down, which are more like moves you'd perform on a video game joystick than on a phone, don't have to be used. Instead, one can use two-, three-, or four-finger taps or swipes. These are also pretty customizable.

Then, in Android 12, Google brought those commands, which were previously only for Pixel and Samsung devices, out of (beta, I guess?) exclusivity and onto every Android device. Oh, and in Android 11 or so, they added an onscreen Braille keyboard, which I now can't live without, and couldn't live without on iOS before either. That's the one thing that gave me a good enough excuse to jump to Android.

Now, they're adding Braille display support, so if a blind person owns a refreshable Braille display, they can connect it through Bluetooth to Android. This will be coming out in Android 13 later this year. And if Samsung doesn't hurry it up, I won't be very happy if I have to wait until next year to get 13. Ah well, Dolby Atmos is pretty worth it.

I hope they keep improving their AI stuff. Right now, they can detect text in images, but I'd love to be able to go through my photo library and hear descriptions of images, like I can on iOS. No, having to send the image to another app isn't the same thing. But they're getting closer!

Apple

Apple still leads the way on adding new features to their accessibility settings, at least on mobile. Okay, text checking on Mac is pretty cool. Anyway, this year was really interesting, as they've added lots of new voices (basically fonts for blind people, except they're all monochrome and sometimes sound awful depending on who's listening). Other than that, they added support for door detection and ... I can't really think of much else. The really big thing is voices, since they've added one that the blind community has been using for about 25 years, Eloquence, which I'm sure took a lot of engineering, 32-bit library compatibility work, and spaghetti code to get running on Apple silicon. Still, there's nothing that makes basically the whole blind community want to beta test like some new voices!

Microsoft

So, modernizing a whole OS is probably really hard. They still want to be backward compatible, but they also want to move things forward. So, they're still trying to push towards using UI Automation, even though File Explorer can be really sluggish, even on this new PC, and screen readers don't really have anything like the VoiceOver rotor, which is invisible and instantly available. Windows is still the OS of choice for blind people. Microsoft has outlived the Mac hype, and still chugs along even with phones taking over the computing world.

Lately, they kind of seem to repeat themselves a lot. They continually talk about their new voices, only available to Narrator and no other screen reader, because Narrator has to be the premier screen reader experience. But, from a positive point of view, it could just mean they're planning something really nice for the next Windows release. I'd love to see offline image recognition that all screen readers could tap into, like the already-included text recognition.

ChromeOS

Crostini is really great. It lets me use Linux command-line apps through TDSR, or even GUI apps through Orca, with a nice window manager and notification system, while ChromeOS provides the web support and Android apps. And Emacspeak isn't sluggish as crap like it is in WSL2.
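For anyone who wants to try the same setup, here's a rough sketch of how TDSR tends to get installed in a Debian-based Crostini container. The repository is Tyler Spivey's; the dependency list here is my assumption and may have drifted:

# TDSR is a Python terminal screen reader; speech-dispatcher is assumed as the output layer
sudo apt install git python3-pip speech-dispatcher
git clone https://github.com/tspivey/tdsr
cd tdsr
# assumes the repo still ships a requirements file
pip3 install -r requirements.txt
./tdsr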

Linux

At least a lot of blind Linux users like either Mint or Arch. And there's Emacspeak. And GPodder, and Thunderbird is kinda nice when it wants to be, and lshw gives loads of info on hardware, and Bash is far, far better than PowerShell. Like, “Stop-Computer”? Who wants to type all that?
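That quip is about PowerShell's Stop-Computer cmdlet. For comparison, here's the Bash side, with the PowerShell spelling as a comment:

# Bash: short and to the point
sudo poweroff
# PowerShell: Stop-Computer (verb-noun, capitals and all)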

Braille

I've recently started reading, thanks to my HumanWare NLS eReader, and I'm really starting to enjoy it. Thanks to practice and, I think, my vitamins, I'm finding that I'm able to think ahead of the current reading point, predict the rest of the sentence, and, if the prediction is right, skim past it. It's kinda cool. I'm not sure if I was able to do that before, but I'm definitely noticing it now.

Conclusion

In this blog post, I talked about how stuff still mainly works, Google's starting to give a crap, Apple still blazes ahead in some areas, and Microsoft still talks a lot. Oh and ChromeBook is still a nice Linux system lol, and Braille is good.

Discuss...

You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

It's not just for people who are disabled

Introduction

While reading this article on how much Windows phones home to Microsoft, I thought about just how little control we really have over our data when running Windows. Who knows what's being sent in all those network transmissions? I mean, when your cryptographic services contact other services over the Internet, like, why? On the Hacker News post about this, a commenter asked why, after all this, someone would still use Windows. I responded with the usual “because accessibility, unfortunately.”

So, in this article, I'll talk about why accessibility should be the first thing every contributor to free and open source software thinks about. People with disabilities are some of the most disadvantaged, undervalued, discarded, and underrepresented people on Earth. Abled people don't want to think about us, because they don't want to imagine what it'd be like to be one of us. They fear going blind, going deaf, or losing mental faculties, even though they know it'll happen eventually. So, supporting us is the right thing to do, provides an alternative for a disadvantaged population, and supports you when you need it most.

Got morals?

If you practice any kind of moral system, you probably know that you should help the poor. Some moral systems explicitly include people with disabilities, as we're often among the poorest, especially in non-western countries. If you practice a religion, you may have seen a verse insisting that you not put a stumbling block in front of the blind, or other such admonitions. This should be the case in software as well.

We're all human, except for the bots crawling through this for search engine keywords and such. We are all born with different traits. Some of us were smaller babies. Some of us were smarter babies. And some of us were disabled babies, or born prematurely, or survived even when the odds were low. So, shouldn't we account for these things? Shouldn't you prepare, in advance, for, say, a deaf person using your chat program, or a blind person trying your audio editor?

Supporting people with disabilities is the right thing to do. It's the human thing to do. You don't want to look like those soulless corporations, do you? And even the corporations make an effort to support disabled people, even if to prop up their image. Can the open source community not do better than an uncaring, unfeeling money-printing machine? Surely, humans are better than the corporate machine!

And yet, in open source communities, people with disabilities are often ignored, or told they'll have to be a developer to make things better, or told to “be the change you want to see,” which is just plain demoralizing to a non-developer. Developers, and communities in general, must learn to empathize with all users, before they themselves become the ones needing empathy.

We are Everywhere

Have you ever called a bank, a hospital, a non-profit organization, the Internal Revenue Service, or your phone company? Yes? Then chances are, you could have been speaking with a person with disabilities. Blind people work in many call centers, and at many phone network providers, like Verizon, AT&T, and others. Do you know what operating system they're more than likely using? That's right, Windows. Why? Because accessibility on Windows, using Windows screen readers, and Chrome or Edge, is top-notch. Now, they may not be using the latest version of Windows, and hopefully it's all patched up, but we don't know that. The only company that does is Microsoft, and it sure isn't going to talk about its weaknesses.

So, how about the developers of free and open source desktop environments, web browsers, and operating systems be the stronger party and ensure that no one ever has to run Windows? After all, it's your data that's being stored on Windows computers and Windows servers, spoken, more than likely, by $1099 closed-source screen readers that could be doing anything with your data. If it sounds like I'm trying to scare you, you're right. We have asked nicely for the last decade to be taken seriously. All we've gotten is a shrug, a few nice words, and a “don't bother me, I'm engineering” kind of vibe after that. Well, you might as well start engineering for us before it's too late for you.

Where do you want to be in forty years?

It's no secret that we're all getting older. We age every second of every day. And, as we age, our bodies and minds start to fail.

Our eyes grow dim, our ears don't hear the birds outside anymore, and our minds tick slower and slower. But our hobbies, or our jobs, never quite leave us. Some developers can just climb the chain until they're high enough to not need to code anymore, thus bypassing the need to confront their failing eyesight, on the job at least. Some developers just retire and quit coding, choosing to give the wheel to younger, and hopefully brighter, generations. But why? You know so much! You still have those ideas! You still want to see freedom win!

Let's try another problem: those who become disabled younger in life. There are many genetic conditions, diseases (like COVID-19), and so on that may cause even a younger person to become disabled. You may lose your vision, have a car accident, lose some hearing from listening to loud music, or maybe you just don't have the energy you used to have. But you still want to code! You still want to create! And you have unfinished projects that need fixing!

In both cases, helping people who are disabled will help you when you need it most. We, people who are disabled, simply started out with what you'll be getting in the future. So why not start now? Help make desktop environments a joy to use for blind people, so when your eyes start to hurt after hours at the screen, you can just close them, turn on accessibility features, and keep working with your eyes closed! Or, if you make things easy for people with mobility issues, you can work one-handed when the other hand cramps up. Or, if you work on spell checking, autocorrection, and word suggestions, you can take advantage of that when a word just won't come to you, or when you forget how to spell one.

So how can I help?

We need people, not companies. Companies, like Canonical, will sit there and work on their installer's accessibility, while the real issue is the desktop environment. The System76 folks will only need accessibility help when they get to the GUI of the desktop environment they're building. The Gnome folks say they need coders, not users. So I have little faith in corporate-backed open source. They're just another machine.

So, community support is where it's at, I think. But it can't just be one person. It has to be everyone. Everyone should be invested in how they're going to use computers in the future. Everyone should care about themselves enough to consider what they'll do when, not if, they go blind, lose hearing, lose energy, lose memory, lose mental sharpness. Everyone should be into this, for their own sake.

There are many Linux desktop environments besides Gnome. KDE is what I'd be using if I could see. There's also Mate, Cinnamon, LXDE, XFCE, and others. Why mainstream distributions of Linux choose to stick with Gnome is beyond me. Below are some ideas to get the community started.

  • Use Linux with a screen reader. If you don't like it, we probably won't either.
  • Add accessibility labels to whatever you're making.
  • Look at the code of the Orca screen reader.
  • Gather people with disabilities to get feedback on your desktop environment or distribution.
  • Have either your entire team focus on accessibility, or, if you must, make an accessibility team.
  • Spread the word about your accessibility fixes, put them front and center!
  • See how much your image improves, and how loyal disabled people are!

Yeah, it looks a bit selfish. But I've grown to expect people to be selfish, to care about how they're seen, and about getting more users and such. That's just how we disabled people have to think most of the time. So, prove us wrong. Show us that communities, democracies, and people of high ideals care about the disadvantaged, about their own security and the security of others, and about their future selves. Let's make open source really open to everyone. Let's make freedom free for everyone. Why not?

Discuss...

You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

Imagine, for a moment, that there is a ring. It's dim and gray, lifeless. There are many rings encircling it, but these are darker, more foreboding than the central ring. You hold in your hand a light, which can only shine inward. How will you proceed?

Let's try putting the light in the central ring, and see what happens. You place the light on the rim of the ring, and step back, watching the light fill the central ring with vibrant, lively color.

Despair

Darkness is all I know. I stand in the center of my ring, one of the outermost rings in the system. It's cold here, dark, no one wants to come near us, for fear of catching our Darkness. I don't blame them. We fight so much. Just thinking of battle makes me feel… better. Like I have something to blame. Someone to hate.

I sigh, looking up. Up at the Light ring. The central ring. No one wants us anywhere near there.

But I wanted it. Or to bring them out. Yes, I would destroy their Light, make them feel our anguish, our despair, our hate. I tell my tired and beaten-down body to move. To climb the rings, to seek that Light, and snuff it out. I would find whoever put that light there, and make them feel my pain, my agony.


Well that wasn't such a good story, now was it? Let's hope that guy doesn't find you, right? Here, let's reset and try again. This time, let's put the Light on the outermost ring, and see what happens.

Peace

Light reassures me as I stand on the rim of my ring. I feel its heat, and people from other rings say it allows them to sense things from a distance, to know what's around without making sounds. That's alright. We have machines that use the light, like Investiture or a power source, to tell us what's around. In my free time, I enjoy exploring the rings, helping people, and doing anything else I can for our system. I look to the next ring, where some of my family live. I jump there, and spend a while searching for anything new in the ring. I scan the describer device upward, to further rings, and to the central ring, which, I'm told, glows with the cast light of all other rings. I lift my face, and feel the warmth of the Light.

That night, I dream of another world, where the Light has chosen to selfishly glow only on the central ring. I wake up sobbing at the idea, seeing the people, filled with fear and hurt and pain, just barely surviving, and I beg Elyon to have mercy on those people, if they do exist.


Ah, that's better. Since the central ring already has some light, putting the big Light on the outer ring allows the light to move inward, giving all rings light, not just one. Yet, software and web developers selfishly think of the central ring, which stands for “the 99%” or “majority” of people, and disregard those who need their services the most. Thus, people with disabilities, neurodivergent people, people who don't speak English, people who have trouble reading, people who have trouble processing images, Autistic people, and so many others live in a world that slaps them in the face every moment of every day.

Technology doesn't care. Bits and bytes can be used to help people with disabilities in so many ways. Yet they are used, in so many ways, by ignorant abled people to bar access to so many things, from playing video games to COVID tests. And we can't even move towards the Light, as it were. With Linux, the free and open source system, made by abled people, almost every desktop environment has huge accessibility issues. In fact, even if you find a good one, you still have to enable Assistive Technology Support. And the Mate desktop, which has been the most accessible one we have, only because it's based on Gnome 2 from 10+ years ago, is starting to show its age. Chrome-based apps, like VS Code and Google Chrome, crash out of nowhere. Pidgin crashes while writing a long message. And if a Chrome-based app crashes, Orca is lost, and one has to immediately set focus to the desktop, or it'll be totally lost until it sees a dialog it creates itself.
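For what it's worth, enabling that Assistive Technology Support from a terminal usually comes down to a couple of settings keys. These are the ones commonly suggested for MATE and GNOME-style desktops; treat the exact key names as assumptions, since they vary by version:

# tell MATE to enable assistive technologies (takes effect at next login)
gsettings set org.mate.interface accessibility true
# tell GNOME-style desktops to start the screen reader
gsettings set org.gnome.desktop.a11y.applications screen-reader-enabled true

Even with those flipped on, though, the experience is what it is.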

So, that means we can't even get into a great position to learn to make our own stuff through some of the best courses, like The Odin Project, which requires that you use either Linux, macOS, or the Linux system on ChromeOS. Windows, the most accessible system, which is supported by a large community of blind developers, and which is made by a company that, in recent times, is getting more into accessibility, isn't allowed.

So think on this, when inspiration strikes for a new site, a new app, a new package. If you help the least of us, you'll help the best of us too!

Discuss...

You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

About a week ago, I got a new laptop. It's an HP, with an AMD 5500 processor. With 8 gigs of RAM, 512 GB SSD storage, and a modern processor, I think it'll last a good while. I do hope I can swap out the RAM for two 8 GB sticks instead of 4 GB sticks.

After using Windows 11 for a while, I got the Linux itch again. Windows was... slower than I expected. Along with some games being more frustrating than fun, I decided to just do it.

So I installed Fedora. I chose the Fedora 35 Mate spin for this. Well, first I tried the regular Gnome version, but Orca couldn't read the installer, so that's great. After getting it installed, turning on Orca at the login screen and on the desktop, setting Orca to always start after login, and turning on assistive technology support, I was ready to go. Except...

Bumps in the Road

I mainly use Google Chrome for browsing. After getting that installed, I opened it, prepared to sync my stuff and get to browsing. But upon opening it, there was nothing there. Orca read absolutely nothing. Baffled, I installed VS Code. Still the same: nothing.

So, I hunted down the accessibility cheat codes I used to magically make things work:

export ACCESSIBILITY_ENABLED=1
export GTK_MODULES=gail:atk-bridge
export GNOME_ACCESSIBILITY=1
export QT_ACCESSIBILITY=1
export QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1
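
For these to stick across logins, they need to live in a file the session reads; ~/.profile is one common choice, assuming your display manager sources it:

# append the accessibility variables to ~/.profile so every session gets them
cat >> ~/.profile <<'EOF'
export ACCESSIBILITY_ENABLED=1
export GTK_MODULES=gail:atk-bridge
export GNOME_ACCESSIBILITY=1
export QT_ACCESSIBILITY=1
export QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1
EOF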

Successes

After restarting the computer, things worked. I could use Chrome and VS Code. Then, I set up Emacs with Emacspeak. After a lot of looking around, I discovered I needed lots of ALSA stuff: alsa-utils, mplayer, sox, and all that sound stuff. Oh, and I had to replace serve-auditory-icons with play-auditory-icons so all icons play.
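Concretely, that boiled down to something like the following. Package names are from memory (mplayer may need RPM Fusion on Fedora), and the variable name is my best guess at the Emacspeak symbol involved:

# sound utilities that Emacspeak's auditory icons rely on
sudo dnf install alsa-utils mplayer sox
# switch Emacspeak from the icon server to direct playback so all icons play
cat >> ~/.emacs <<'EOF'
(setq emacspeak-auditory-icon-function 'emacspeak-play-auditory-icon)
EOF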

It was during my setup of Emacs that I found one of the joys of Linux: dotfiles. I copied the .emacs files from my ChromeBook to the new Linux PC, and it was like I'd simply opened the Emacs on my ChromeBook. Everything was there: my plugins, settings, even my open files.
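If you want to do the same, a single scp run covers it; “chromebook” here is a placeholder for however you reach the old machine over SSH:

# pull the Emacs dotfiles from the old machine (hypothetical hostname)
scp -r chromebook:.emacs chromebook:.emacs.d ~/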

Linux is really snappy. Like, I can open the run dialog, type “google” to get google-chrome, press Enter, and there's Chrome, ready almost before I am. Pressing keys yields instant results, even faster than on Windows.

Nothing's Perfect

Even with all this: fast computing, Emacs, an updated system, and the freedom to learn about computing, there are still some rough edges. If you close a Chrome-based app, like VS Code and such, you have to move to the desktop immediately, or Orca will get stuck on nothing. If that happens, you have to press Insert + h for help, then F2, to bring up some kind of dialog for Orca to get onto. It seems Mate's window manager doesn't put focus on the next window. Also, the top panel in Mate has lots of unlabeled items. And there are very few natively accessible games for Linux, but with Audiogame Manager, there are plenty of Windows games I can play.

Discuss...

You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!