Six Months with Android: A journey from a blind perspective
During the last six months, I’ve used Android exclusively as my primary device. I made calls with it, texted with it, read emails with it, browsed the web, Reddit, Twitter, and Facebook with it, and played games on it. Now, I’m going to express my thoughts on it, its advancements, and its issues.
This will contain mostly opinions, or entirely opinions, depending on whether you really love Android or not. But whatever your stance, these are my experiences with the operating system. My issues may not be your issues, and so on.
To put things into perspective, I’ve used my phone, the Samsung Galaxy S20 FE 5G, for the following, with the apps I’ve used:
- Email: Gmail, Aquamail, DeltaChat
- Messaging: Google Messages
- RSS: Feeder
- Podcasts: Podcast Addict, Pocket Casts
- Terminal: Termux
- Home screen: One UI Home
- Screen reader: Google TalkBack
- Speech Engine: Samsung TTS and Speech Services by Google
- Text Recognition/object detection: Google Lookout and Envision AI
- Gaming: PPSSPP, Dolphin, AetherSX2
- Reddit: RedReader
I’m sure I’m forgetting a few apps, but that’s basically what I used most often. For Facebook, Twitter, YouTube, and other popular services, I used their default apps, with no modifications. I used all the Google services that I could, and rarely used Samsung’s apps. All this is to show that I was deep in the Android ecosystem, with Galaxy Buds Pro, a Chromebook, and a TicWatch E3.
I want to start off the comparison with what worked well. First, the Samsung TTS voices are really nice, sometimes sounding even smoother than Alex on iOS, and much smoother than the Siri voices. I still love the Lisa voice, which, to me, sounds as close to Alex as possible in her cadence and professional-sounding tone. Yes, the voices could be sluggish when fed lots of text at once, but I rarely ran into that.
I also love the wide variety of choice. Apple supports only the AAC and SBC Bluetooth codecs on the iPhone. So if your headphones offer aptX, Samsung’s Scalable codec, or any other codec, it won’t matter; you’ll fall back to SBC, which sounds the worst of them all. If your headphones support AAC, of course, the iPhone will use it. But if not, you’re stuck with SBC. Android phones, though, usually ship with several codecs for headphones to choose from, and in the developer settings, you can pick which codec to use.
Another great feature of all modern Android phones is USB-C. Everything else uses USB-C now, including Braille displays, computers, watches, and even iPads. With Android, you can plug all of these into your phone with the same cable. If your flash drive has a USB-C connector, you can plug that in too! With an iPhone, though, you have to deal with Lightning, which is just another cable, and one you’ll likely have fewer of, since fewer devices use it.
The last thing is that Android phone makers typically try out new technology before Apple does, leading to bigger cameras, folding phones, or faster Wi-Fi and cellular data. Now that the new iPhone SE has 5G, and probably the latest Wi-Fi, that’s likely less of an issue. Still, if you like folding phones, Android is your only choice right now.
Moving on to the software: Android is pretty close to Linux, so if you plug in a game controller, keyboard, or other accessory, it’ll probably just work. If you have an app for playing music with a MIDI keyboard, and you plug one in, it’ll likely work. On the iPhone, though, you need dedicated apps for more things, even accessories like headphones.
Another nice thing, beginning in the accessibility arena, is that the interface is simple. Buttons are on the screen at all times, not hidden behind menus or long-press options like they are a lot of the time on iOS. If you can feel around the screen with one finger, or swipe, you’ll find everything there is to find. This is really useful for beginners.
Another pleasant feature is the tutorial. The TalkBack tutorial guides Android users through just about every feature they’ll need, and then shows them where they can learn more. VoiceOver has nothing like that.
On Android, things are a lot more open. Thanks to that, multiple screen readers, or entirely new accessibility tools, can be made for Android. This is what allows BRLTTY to add USB support for HID Braille displays, at least, and Commentary to OCR the screen and display the results in a window. This openness is one of the things that really shines on Android.
Those Bluetooth headphones I was talking about? The Galaxy Buds Pro are very unresponsive with TalkBack, making them almost useless for daily use. The TicWatch has its own health apps, so it doesn’t always sync with Google Fit, and doesn’t sync at all with Samsung Health. Otherwise, it’s a nice watch for Android users. Paired with an iPhone, though, it doesn’t share health data with Apple’s Health app at all, only with Google Fit, which doesn’t sync with the Health app either.
A few days ago, a few things happened that brought the entire Android experience into focus for me. I was using the new Braille support built into TalkBack with an older Orbit Reader Braille display, since my NLS eReader doesn’t work with TalkBack at all; there is no Bluetooth HID Braille support. I found that reading in the Google Play Books app is really not a great experience in Braille. I later found a workaround, which I’ll talk about soon, but the fact remains that Braille reading in Google Play Books is poor.
I then got confirmation that someone else could reproduce the issue: the display is sent back to the beginning of the page before you can even read the next-page button, so you cannot easily move to the next page. So I contacted Google Disability Support. Since their email address was no longer their preferred means of contact, I used their form. On Twitter, they always just refer you to the form.
The form was messy. With every letter typed into it, my screen reader, NVDA on Windows, repeated the character count and other information. It’s like no blind person has ever tested the form that blind people are going to use to submit issues. “No matter,” I thought. I just turned my speech off and continued typing, in silence.
When the support person emailed me back, I was asked for some general info, and to take a video of the issue. This would require, for me, a good bit of setup. I’d need three devices: the phone, the Braille display, and something to take the video with, probably a laptop. Then I’d need to get everything in the frame, show the bug, and hope the support person can read Braille well enough to verify it.
This was a bit much for me. I have a hard job, and I have little energy afterwards. I can’t just pop open a camera app and go. So, I asked around, and found a workaround. If you use the Speech Central app, you can stop the speech, touch the middle of the screen, and then read continuously. But why?
This really brought home to me the issues of Android. It’s not a very well put together system. The Google Play Books team still uses the system speech engine, not TalkBack, to read books. The Kindle app does the same thing. There is barely a choice, since TalkBack reads the books so poorly. This is Google’s operating system, Google’s screen reader, and Google’s book-reading app. There is little excuse for them to not work well together.
Then, either that night or the night after, I got a message on Messenger. It was a photo message. So, naturally, I shared it with Lookout: by pressing Share, finding Lookout in the long list of apps, double-tapping, waiting for the image recognition, and reading the results. And then I grabbed the iPhone, opened Messenger, opened the conversation, put focus on the photo, and heard a good description. And I thought, “Why am I depriving myself of better accessibility?”
And there’s the big issue. On iOS, Braille works well, supports the NLS eReader, and even lets you customize commands for touch, Braille, and keyboard. Well, there are still bugs in the HID Braille implementation that I’ve reported, but at least the defaults work. That’s more than I can say for TalkBack and Android.
And then the big thing: image descriptions, and by extension, screen recognition. TalkBack finally has text detection and icon detection. That’s pretty nice. But why has it taken this long? Why has it taken this long to add Braille support? Why do we still have robotic Google TTS voices when we use TalkBack? After all these years, with Google’s AI and knowledge, Android should be high above iOS on that front. And maybe, one day, it will be. But right now, Android’s accessibility team is reacting to what Apple has done. Braille, image descriptions, all of it. And if there’s a central point to what I’ve learned, it’s this: do not buy a product based on what it could be, but on what it currently is.
Then I started using the iPhone more, noticing the really enjoyable little things. The different vibration effects for different VoiceOver actions: not just one each for “gesture begin,” “gesture end,” “interactable object reached,” and “text object reached.” No, there are haptics for alerts, for reaching the boundary of the screen or a text field, for moving in a text field, for using Face ID, even for turning the rotor. And you can turn each of these on or off. What’s that about Android being so customizable?
Then there’s the onscreen Braille keyboard. On Android, to calibrate the dots, you hold down all six fingers, hold it a little longer, just a bit longer… Ah, good, it detected it this time. Okay, now hold down for two more seconds. Now you’re ready to type! Yes, it takes just about that long.
On iOS, you quickly tap the three fingers of your left hand, then the fingers of your right hand, and you’re ready! Fast, isn’t it? These kinds of things were jarring with their simplicity, coming from Android, where I wasn’t even sure if calibration would work this time. I do miss the way typing on the Android Braille keyboard would vibrate the phone, letting you know that you’d actually entered that character. However, the iPhone’s version is good enough that I usually don’t have to worry about that.
I want to talk a bit more about image descriptions. While I was on Android, I learned to ignore images. Sure, I wanted to know what they were, but I couldn’t easily get that info, not within a few seconds anyway, so I left them alone. On iOS, it’s like a window was opened for me. Sure, it’s not as clear as actually having sight, and yes, it gets things wrong occasionally. But it’s there, and it works well enough that I love using it. Now, I go on Reddit just to find more pictures!
And for the last thing, audio charts. Google has nothing like this. They try to narrate the charts, but it’s nothing like hearing the chart and realizing things about it yourself. Hearing a chart is also much faster than hearing your phone read out numbers and labels and such.
Here, I’ll detail some ugly accessibility issues on Android that really make iOS look as smooth as glass in comparison. Some people may not run into these, but I did. Maybe, by the time you read this article, they’ll be fixed in Android 13, or a TalkBack update or something.
First, text objects can’t be too long, or TalkBack struggles to move on to the next one. This can be seen best in the Feeder app, which, for accessibility reasons, uses the native Android text view for articles. This is nice, unless a section of an article spans a whole screen of text. Take the Rhythm of War rereads on Tor: some of those sections are pretty long, and each is all in one text element. So TalkBack keeps speaking that element as you swipe, until it finally reaches the next one. This can take one swipe, or three, or five. This happens a lot in Telegram too, where messages can be quite long.
Another issue is clutter. A lot of the time, every button a user might need is on the screen. For example, the YouTube app has the “Go to channel” and “actions” buttons beside every video. This means you have to swipe three times per video. On iOS, each video is a single item, and the actions are in the actions rotor. TalkBack has an actions menu, but apps rarely use it. Gmail does, for example, but YouTube doesn’t. This makes things even trickier for beginners, who then have to remember which apps use it, which don’t, and how to get to it.
When an Android phone wakes up, it reads the lock status, which is usually something like “Swipe with two fingers or use your fingerprint to unlock.” Then, it may read the time. That’s a lot of words just to check what time it is. An iPhone, dependably, reads the time and then the number of notifications. Apple’s approach is a breath of fresh air, laced with the scent of common sense and testing by actual blind people. This may seem like a small thing, but those seconds listening to something you’ve heard a hundred times before add up.
If you buy a Pixel, you get Google TTS as your speech engine, and it sucks pretty badly. Google is improving it, but TalkBack can’t use the improvements yet, even though other screen readers and TTS apps can. Crazy, right? With the Pixel, though, you get Google’s Android, software updates right at launch, the new Tensor processor, voice typing, and so on. If you get a Samsung, you get a good TTS engine, for English and Korean at least. You also get a pretty good set of additions to Android, but a six-month-old screen reader and an OS whose updates arrive about six months late as well. This is bad mostly because of TalkBack. You see, there are two main versions of TalkBack: Google’s and Samsung’s. Samsung’s TalkBack is practically the same as Google’s, but at least one major version behind, all the time. With the iPhone, you get voices aplenty: Eloquence (starting next month), Alex, the Siri voices, and the Vocalizer voices, with rumors that third-party TTS engines may come to the store soon. You get a phone that, even for models as old as the iPhone 8, can run the latest version of iOS, available the day it’s released. And there is no older version of VoiceOver just floating around out there.
I still love Android. I love what it stands for in mobile computing: a nice, open-source, flexible operating system that can be customized by phone makers, carriers, and users to be whatever is needed. But there just isn’t that kind of drive for accessibility. TalkBack languished for years and years, and is only now hurrying to catch up to VoiceOver. Will it succeed? Maybe. But VoiceOver isn’t going to sit still either. Apple now has the voice that many in the blind community can’t do without. On Android, that voice, Eloquence, is abandoned, and can no longer be bought by new users. And when Android goes 64-bit only, who knows whether Eloquence will keep working. iOS, on the other hand, officially supports Eloquence, the Vocalizer voices, and even the novelty voices from the Mac. They won’t be abandoned just because a company can’t find it within itself to maintain such a niche product. Furthermore, all these voices are free. Of course, when a blind person buys an $800 phone, they’d better be free.
I’m also not saying iOS is perfect. There are bugs in VoiceOver, iOS, and the accessibility stack; Braille in particular suffers from some of them. But nothing is bug-free. And no accessibility department will ever be big, well-staffed, well-funded, or well-appreciated. That’s how it is everywhere. The CEO or president or whoever is up top will thank you for such a great job, but when you need more staff, better tools, or just some appreciation, you’ll often be gently, but firmly, declined. Of course, the smaller the company, the less that may happen, but the disability community can never yell louder than everyone else. Suddenly, the money, the billions or trillions of dollars, just isn’t there anymore when people with disabilities kindly ask.
But the difference I see is in what the two accessibility teams focus on. Apple focuses on an overall experience. When they added haptics to VoiceOver, they didn’t just make a few for extra feedback; they added plenty, for a feedback experience that can even be used in place of VoiceOver’s sounds. And they used the full force of the iPhone’s impressive haptic motor. Just feel that thump when you reach the boundary of the screen, or the playful tip-tick of an alert popping up, or the short bumps as you feel around an empty screen. All that gives the iPhone more life, more expression, in a world of boring speech and even more boring plain Braille.
The iPhone has also been tested in the field, even for professional work like writing a book. One person wrote an entire book on his iPhone, about how a blind person can use the iPhone. That is what I look for in a product: that level of possibility and productivity. As far as I know, no blind person has ever written a book using Android, preferring the “much more powerful” computer instead. And I must say, the chips powering today’s phones are sometimes even more powerful than laptop chips, especially chips from older laptops. No, it’s the interface, and how TalkBack presents it, that gets in the way of productivity.
Lastly, I’m not saying that a blind person cannot use Android. There are hundreds of blind people who use Android, and love it. But if you rely on Braille, or love image descriptions, or the nice, integrated system of iOS, you may find Android less productive. If you don’t rely on these things, and don’t use your phone for too much, then Android may be a cheaper, and easier, option for you. I encourage everyone to try both operating systems out, on a well-supported phone, for themselves. I’ll probably keep my Android phone, since I never know when a student will come in with one. But I most likely won’t be using it that much. After all, iOS and VoiceOver offer so much more.
You can always subscribe to my posts through email or Mastodon. Have a great day, and thanks for reading!