Devin Prater's blog

This is going to be a more emotional post, which mirrors my mental state as of now. I just have to write this down somewhere, and my blog should be a good place to put it. It may also be helpful for others who struggle with this.

I've used just about every operating system out there, from Windows to Mac to Linux, ChromeOS, Android, and iOS. I've still not found one I can be completely happy with. I know I may never find an OS that fits me perfectly, but so many others have found Linux to be all they ever need. I wish I could find that: the feeling of not needing to switch to another laptop just to get a good terminal with a screen reader that always reads output, or to use Linux GUI apps, like GPodder or Emacs with Emacspeak.

There are times when Windows is great: reading on the web, games, and programs made by blind people to help with Twitter, Telegram, Braille embossing, and countless screen reader scripts. Other times, I want a power-user setup. I want GPodder for podcasts, or to explore Linux command-line apps. I asked the Microsoft accessibility folks about Linux GUI accessibility, and they just said to use Orca. I've never gotten Orca to run reliably on WSL2. It's always been reliable on ChromeOS with Crostini.

Whenever I get enough money, I'll get 16 GB of RAM, so maybe I can run a Linux VM. But still, that's not bare metal. And if I switched to Linux, I'd have to run a Windows VM for the few things that run better on Windows, like some games, and probably the Telegram and Twitter support. It's just hard to have both. Dual booting may work, but I've also heard that Windows gets greedy and messes with the bootloader.

But, with a blind person now working on Linux accessibility at Red Hat, I hope that, soon, I won't need Windows anymore. I can hope, at least. Still, with a few in the hardcore Linux community holding the mindset that I must fix everything myself, I have to remain cautious, and unexcited, about this development, lest the little joy that a full-time Linux accessibility hire gives me be taken away by their inflexibility and cold, overly logical mindset.

But, I'm not done yet. With the little energy taking vitamins has given me, I've made a community for FOSS accessibility people on Matrix, bridged to IRC. I continue to study books on Linux, although I've not gotten up the energy to continue learning to program and practice. Maybe I'll try that today.

Mostly, I don't want newcomers to Linux to feel as alone in their wrestling with all this as I do. All other blind people are already so far ahead. Running Arch Linux, able to code, or at least happy with what they have and use. I don't want future technically inclined blind people to feel so alone. Kids who are just learning to code, who are just getting into GitHub, who are just now learning about open source. And they're like “so what about a whole open source operating system?”

And then they look, find Linux, and find so few resources for it, for them. Nothing they can identify with. Well shoot, there it is. Documentation, I guess. I do want to wait until Linux, and Gnome or whatever we ultimately land on, is better. Marco (in Mate) shouldn't be confused whenever a Qt or Electron-based app closes and focus is left out in space somewhere. An update shouldn't break Electron apps' ability to show a web view to Orca. And we definitely shouldn't be teaching kids a programming language, Quorum, made pretty much specifically for blind people. But I'm glad we're progressing. Slowly, yes, but it's happening at least.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

Why tools made by the blind are the best for the blind


For the past few hundred years, blind people have been creating amazing technology and ways of dealing with the primarily sighted world. From Braille to screen readers to canes and training guide dogs, we've often shown that if we work together as a community, as a culture, we can create things that work better than what sighted people alone give to us.

In this post, I aim to celebrate what we've made, primarily through a free and open source filter. This is because, firstly, that part of what we've made is almost always overlooked and undervalued, even by us. And secondly, it fits with what I'll talk about at the end of the article.

Braille is Vital

In the 1800s, Louis Braille created a system of writing made up of six dots, configured in two columns of three, which formed letters. It followed the languages of print, but in a different written form. This system, called Braille after its inventor, became the reading and writing system of the blind. Most countries, even today, use the same configurations Louis created, with some new symbols for each language's needs. Even Japanese Braille uses something resembling that system.

Now, Braille displays are becoming something that the 20 or 30 percent of blind people who are employed can afford, and something the US government is creating a program to give to those who cannot afford one. Thus, digital Braille is becoming something that all screen reader creators, yes, even Microsoft, Apple, and Google, should be working with heavily. Yet Microsoft doesn't even support the new HID Braille standard, and neither does Google. Apple supports much of it, but not all of it. As an aside, I've not even been able to find the standards document, besides this technical notes document from the NVDA developers.

However, there is a group of people who have taken Braille seriously since 1995: the developers of BRLTTY, whose history you can read. This program basically makes Braille a first-class citizen in the Linux console. It can also be controlled by other programs, like Orca, the Linux graphical interface screen reader.

BRLTTY has gone through the hands of a few amazing blind hackers (as in incredibly competent programmers) to land where it is today, where you can download it not only for Linux, its original home, but for Windows, and even Android. BRLTTY not only supports the Braille HID standard, but is the only screen reader that supports the Canute 360, a multi-line Braille display.

BRLTTY, and its spin-off project of many Braille tables (called LibLouis), have proven so reliable and effective that they've been adopted by proprietary screen readers, like JAWS, Narrator, and VoiceOver. VoiceOver and JAWS use LibLouis, while Narrator uses them both. This proves that the open source tools that blind people create are undeniably good.

But what about printing to Braille embossers? That's important too. Digital Braille may fail to work for whatever reason, and we should never forget hardcopy Braille. Oh hey, lookie! Here's a driver for the Index line of Braille embossers. The CUPS (Common Unix Printing System) program has support, through the cups-filters package, for embossers! This means that Linux, that impenetrable, unknowable system for geeks and computer experts, contains, even out of the box on some systems, support for printing directly to a Braille embosser. To be clear, not even Windows, nor macOS, nor iOS, has this. Yes, Apple created CUPS, but they've not added the drivers for Braille embossers.

Let that sink in for a moment. All you have to do is set up your embosser, set the Braille code you want to emboss from, the paper size, and you're good. If you have a network printer, just put in the IP address, just like you'd do in Windows. Once that's sunk in, I have another surprise for you.
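For the command-line inclined, that setup can be sketched out too. This is a hypothetical example: the queue name, IP address, and driver path below are placeholders, and the actual Index driver names shipped on your system should be checked with lpinfo first.

```shell
# Hypothetical sketch of adding a networked Index embosser to CUPS.
# The queue name and IP are placeholders; list the real Index
# drivers provided by cups-filters with: lpinfo -m | grep -i index
lpadmin -p braille-embosser -E \
    -v socket://192.168.1.50 \
    -m drv:///indexv4.drv/indexv4.ppd

# Embossing a text file is then just printing to that queue.
lp -d braille-embosser notes.txt
```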

You ready? You sure? Okay then. With CUPS, you can emboss graphics on your embosser! Granted, I only have an Index D V5 to test with, but I was able to print an image of a cat, and at least recognize its cute little feet. I looked hard for a way to do this on Windows, and only found an expensive tactile graphics program. With CUPS, by connecting to other Linux programs like ImageMagick, you can get embossed images for free. You don't even have to buy extra hardware, like embossers specially made for embossing graphics!
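As a rough sketch of how that might look, assuming you already have a CUPS queue set up for your embosser (called braille-embosser here, a placeholder name), with ImageMagick flags chosen purely as an illustration:

```shell
# A sketch of embossing an image, assuming an existing CUPS queue
# named braille-embosser (placeholder name). ImageMagick simplifies
# the photo first, so the embossed dots trace cleaner outlines.
convert cat.jpg -resize 120x -colorspace Gray -edge 2 cat-outline.png
lp -d braille-embosser cat-outline.png
```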

Through both of these examples, we see that Braille is vital. Braille isn't an afterthought. Braille isn't just a mere echo of what a screen reader speaks aloud. Braille isn't a drab, text-only deluge of whatever a sighted person thinks is not enough or too much verbiage. Braille is a finely crafted, versatile, and customizable system which the blind create, so that other blind people can be productive and happy with their tools, and thus lessen the already immense burden of living without sight in a sighted world. And if electronic Braille fails, or if one just wants to use printed material like everyone else can, that is available, and ready for use, both to print text and pictures.

Speech matters too

If a blind person isn't a fast Braille reader, was never taught Braille, or just prefers speech, then that option should not just be available for them, but be as reliable, enjoyable, and productive an experience as possible. After all, wouldn't a sighted person get the best experience possible? Free and open source tools may not sound the best, but work is being done to make screen readers as good as possible.

In the Linux console, there are three options: Speakup, Fenrir, and TDSR. On the desktop, the screen reader has been Orca, but another, called Odilia, is being written by two blind people in the Rust programming language.

If one uses the Emacs text editor, one can also take advantage of Emacspeak. This takes information not from accessibility interfaces, but from Emacs itself, so it can provide things like aural syntax highlighting, or showing bold and italics through changes in speech.


There are several communities for blind Linux and open source users: the Blinux mailing list, the Orca mailing list, the LibreOffice Accessibility mailing list, and the Debian Accessibility mailing list.

Recently, however, there is a new way for all these groups, and sighted developers, to join together with, hopefully, more blind people, more people with other disabilities, and other supporters. This is the Fossability group. This is, for now, a Git repository, mailing list, and Matrix space. It's where we can all make free and open source software, like Linux, LibreOffice, Orca, Odilia, desktop environments, and countless other projects, as useful and accessible as possible.

Blind people should own the technology they use. We should not have to grovel at the feet of sighted people, who have little to no idea what it's like to be blind, for the changes, fixes, and support we need. We should not have to wait months for big corporations (corpses) to gather their few accessibility programmers to add HID Braille support to a screen reader. We should not have to wait years for our file manager to be as responsive as the rest of the system. We should not have to wait a decade for our screen reader to get a basic tutorial, so that new users can learn how to use it. We should not have to beg for our text editor to not just support accessibility, but support choices as to how we want information conveyed. This kind of radical community support requires that blind people are able to contribute up the entire stack, from the kernel to the screen reader. And with Linux, this is entirely possible.

Now, I'm not saying that sighted people cannot be helpful; it's the exact opposite. Sighted people designed the GUI that we all use today. Sighted people practically designed all forms of computing. Sighted developers can help because they know graphical toolkits, so they can help us fix accessibility issues there. And I'm not trying to demean the ongoing, hard, thankless job of maintaining the Orca screen reader. Again, that's not even the job the maintainer gets paid for. However, I do think that if more blind people start using and contributing to Linux and other FOSS projects, even with just support or bug reports, a lot of work will be lifted from sighted people's shoulders.

So, let's own our technologies. Let's take back our digital sovereignty! We should be building our own futures, not huge companies with overworked, underpaid and underappreciated, burnt-out and understaffed accessibility engineers. Because while they work on proprietary, closed-off, non-standard solutions, we can build on the shoulders of the giants that have gone before us, like BRLTTY, the CUPS embosser drivers, and so many other projects by the blind, for the blind. And with that, we can make the future of Assistive Technology open, inviting, welcoming, and free!


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

A few positive things for a change

In the past, I've mostly written articles about problems with operating systems, products, services, and general technology. But, in this article, I want to shed a little light on what good things are going on. This doesn't really negate all the bad, but it helps to think about the good things that are going on more than just the bad.


Lately, Google has been putting a lot more effort into Android accessibility than in previous years. A few years ago, Google added commands to TalkBack that could use more than one finger. This means that complex two-part commands, like swiping up then right, or right then down, which are more like commands you'd perform on a video game joystick than a phone, don't have to be used. Instead, one can use two, three, or four finger taps or swipes instead. These are also pretty customizable.

Then, in Android 12, Google brought those commands, which were previously exclusive to Pixel and Samsung devices, to every Android device. Oh, and in Android 11 or so, they added an onscreen Braille keyboard, which I now can't live without, and couldn't live without on iOS before, either. That's the one thing that gave me a good enough excuse to jump to Android.

Now, they're adding Braille display support, so if a blind person owns a refreshable Braille display, they can connect it through Bluetooth to Android. This will be coming out in Android 13 later this year. And if Samsung doesn't hurry it up, I won't be very happy if I have to wait until next year to get 13. Ah well, Dolby Atmos is pretty worth it.

I hope they keep improving their AI stuff. Right now, they can detect text in images, but I'd love to be able to go through my photo library and hear descriptions of images, like I can on iOS. No, having to send the image to another app isn't the same thing. But they're getting closer!


Apple still leads the way on adding new features to their accessibility settings, at least on mobile. (Okay, text checking on Mac is pretty cool.) Anyway, this year was really interesting, as they've added lots of new voices (basically fonts for blind people, except they're all monochrome and sometimes look awful depending on who's listening). Other than that, they added support for door detection and ... I can't really think of much else. The really big thing is voices, since they've added one that the blind community has been using for about 25 years, Eloquence, which I'm sure took a lot of engineering, compatibility with 32-bit libraries, and spaghetti code to get working with Apple silicon. Still, there's nothing that makes basically the whole blind community want to beta test like some new voices!


So, modernizing a whole OS is probably really hard. They still want to be backward compatible, but they also want to move things forward. So, they're still trying to push towards using UI Automation. Even though File Explorer can be really sluggish, even on this new PC, and screen readers don't really have anything like the VoiceOver rotor, which is invisible and instantly available, Windows is still the OS of choice for blind people. Microsoft has outlived the Mac hype, and still chugs along even with phones taking over the computing world.

Lately, they kind of seem to repeat themselves a lot. They continually talk about their new voices, only available to Narrator and no other screen reader, because Narrator has to be the premier screen reader experience. But, from a positive point of view, it could just mean they're planning something really nice for the next Windows release. I'd love to see offline image recognition that all screen readers could tap into, like the already-included text recognition.


Crostini is really great. It lets me use Linux command-line apps, through TDSR, or even GUI apps, through Orca, but with a nice window manager and notification system, while ChromeOS provides the web support and Android apps. And Emacspeak isn't sluggish as crap like it is in WSL2.


At least a lot of blind Linux users like either Mint or Arch. And there's Emacspeak. And GPodder, and Thunderbird is kinda nice when it wants to be, and lshw gives loads of info on hardware, and Bash is far, far better than PowerShell. Like, "Stop-Computer"? Who wants to type all that?


I've recently started reading, thanks to my Humanware NLS eReader, and I'm really starting to enjoy it. Thanks to, I think, my vitamins, and practice, I'm finding that I'm able to think ahead of the current reading point, to predict the rest of the sentence, and if the prediction is right, skim past that. It's kinda cool. I'm not sure if I was able to do that before, but I'm definitely noticing it now.


In this blog post, I talked about how stuff still mainly works, Google's starting to give a crap, Apple still blazes ahead in some areas, and Microsoft still talks a lot. Oh and ChromeBook is still a nice Linux system lol, and Braille is good.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

It's not just for people who are disabled


While reading this article on how much Windows phones home to Microsoft, I thought about just how much we don't really have control over our data when running Windows. Who knows what all is being sent over all those network transmissions. I mean, when your cryptographic services contact other services over the Internet, like, why? On the Hacker News posting about this, a commenter asked why, after all this, someone would still use Windows. I responded with the usual "because accessibility, unfortunately."

So, in this article, I'll talk about why accessibility should be the first thing every contributor to free and open source software thinks about. People with disabilities are some of the most disadvantaged, unvalued, discarded, and underrepresented people on Earth. Abled people don't want to think about us, because they don't want to imagine what it'd be like to be one of us. They fear going blind, deaf, or losing mental faculties, even though they know it'll happen eventually. So, supporting us is the right thing to do, provides an alternative to a disadvantaged population, and supports yourself when you need it most.

Got morals?

If you practice any kind of moral system, you probably know that you should help the poor. Some moral systems include people with disabilities, as we're often some of the most poor, especially in non-western countries. If you practice religion, you may or may not have seen a verse insisting that you not put a stumbling block in front of the blind, or other such admonitions. This should be the case in software as well.

We're all human, except for the bots crawling through this for keywords for search engines and such. We all are born with different traits. Some of us were smaller babies. Some of us were smarter babies. And some of us were disabled babies, or born prematurely, or survived even though the hope for such was low. So, shouldn't we account for these things? Shouldn't we prepare, in advance, for, say, a deaf person to use your chat program, or a blind person to try your audio editor?

Supporting people with disabilities is the right thing to do. It's the human thing to do. You don't want to look like those soulless corporations, do you? And even the corporations make an effort to support disabled people, even if to prop up their image. Can the open source community not do better than an uncaring, unfeeling money-printing machine? Surely, humans are better than the corporate machine!

And yet, in open source communities, people with disabilities are often ignored, or told they'll have to be a developer to make things better, or told to “be the change you want to see,” which is just plain demoralizing to a non-developer. Developers, and communities in general, must learn to empathize with all users, before they themselves become the ones needing empathy.

We are Everywhere

Have you ever called a bank, a hospital, a non-profit organization, the Internal Revenue Service, or your phone company? Yes? Then chances are, you could have been speaking with a person with disabilities. Blind people work in many call centers, and at many phone network providers, like Verizon, AT&T, and others. Do you know what operating system they're more than likely using? That's right, Windows. Why? Because accessibility on Windows, using Windows screen readers, and Chrome or Edge, is top-notch. Now, they may not be using the latest version of Windows, and hopefully it's all patched up, but we don't know that. The only company that does is Microsoft, and it sure isn't going to talk about its weaknesses.

So, how about the developers of free and open source desktop environments, web browsers, and operating systems be the stronger party and ensure that no one has to ever run Windows? After all, it's your data that's being stored on Windows computers, in Windows servers, spoken by, more than likely, $1099 closed source screen readers that could be doing anything with your data. If it sounds like I'm trying to scare you, you're right. We have asked nicely for the last decade to be taken seriously. All we've gotten is a shrug, a few nice words, and a “don't bother me I'm engineering,” kind of vibe after that. Well, you might as well start engineering for us before it's too late for you.

Where do you want to be in forty years?

It's no secret that we're all getting older. We age every second of every day. And, as we age, our bodies and minds start to fail.

Our eyes grow dim, our ears don't hear the birds outside anymore, and our minds tick slower and slower. But our hobbies, or our jobs, never quite leave us. Some developers can just climb the chain until they're high enough to not need to code anymore, thus bypassing the need to confront their failing eyesight, on the job at least. Some developers just retire and quit coding, choosing to give the wheel to younger, and hopefully brighter, generations. But why? You know so much! You still have those ideas! You still want to see freedom win!

Let's try another problem, those who become disabled younger in life. There are many genetic issues, diseases (like COVID-19), and so on that may cause even a younger person to become disabled. You may lose your vision, have a car accident, lose some hearing from listening to loud music, or maybe you just don't have the energy that you used to have. But you still want to code! You still want to create! And you have unfinished projects that need fixing!

In both cases, helping people that are disabled will help you when you need it most. We, people who are disabled, simply came with what you'll be getting in the future. So why not start now? Help make desktop environments a joy to use for blind people, so when your eyes start to hurt after a while of using them, you can just close them, turn on accessibility features, and continue working with your eyes closed! Or, if you make things easy for people with mobility issues, you can work one-handed when the other cramps up. Or, if you work on spell checking, autocorrection, and word suggestions, you can take advantage of that when a word just won't come to you, or when you forget how to spell a word.

So how can I help?

We need people, not companies. Companies, like Canonical, will sit there and work on their installer accessibility, while the real issue is the desktop environment. The System76 folks only need accessibility help when they get to the GUI of the desktop environment that they're building. The Gnome folks say that they need coders, not users. So I have little faith in corporate-backed open source. They're just another machine.

So, community support is where it's at, I think. But it can't just be one person. It has to be everyone. Everyone should be invested in how they're going to use computers in the future. Everyone should care about themselves enough to consider what they'll do when, not if, they go blind, lose hearing, lose energy, lose memory, lose mental sharpness. Everyone should be into this, for their own sake.

There are many Linux desktop environments besides Gnome. KDE is what I'd be using if I could see. There's also Mate, Cinnamon, LXDE, XFCE, and others. Why mainstream distributions of Linux choose to stick with Gnome is beyond me. Below are some ideas to get the community started.

  • Use Linux with a screen reader. If you don't like it, we probably won't either.
  • Add accessibility labels to whatever you're making.
  • Look at the code of the Orca screen reader.
  • Gather people with disabilities to get feedback on your desktop environment or distribution.
  • Have either your entire team focus on accessibility, or, if you must, make an accessibility team.
  • Spread the word about your accessibility fixes, put them front and center!
  • See how much your image improves, and how loyal disabled people are!

Yeah, it looks a bit selfish. But I've grown to expect people to be selfish, and to care about how they're seen, and about getting more users and such. That's just how we disabled people have to be most of the time. So, prove us wrong. Show us that the world of communities, democracies, and people of high ideals cares about the disadvantaged, about their own security and the security of others, and about their future selves. Let's make open source really open to everyone. Let's make freedom free for everyone. Why not?


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

Imagine, for a moment, that there is a ring. It's dim and gray, lifeless. There are many rings encircling it, but these are darker, more foreboding than the central ring. You hold in your hand a light, which can only shine inward. How will you proceed?

Let's try putting the light in the central ring, and see what happens. You place the light on the rim of the ring, and step back, watching the light fill the central ring with vibrant, lively color.


Darkness is all I know. I stand in the center of my ring, one of the outermost rings in the system. It's cold here, dark, no one wants to come near us, for fear of catching our Darkness. I don't blame them. We fight so much. Just thinking of battle makes me feel… better. Like I have something to blame. Someone to hate.

I sigh, looking up. Up at the Light ring. The central ring. No one wants us anywhere near there.

But I wanted it. Or to bring them out. Yes, I would destroy their Light, make them feel our anguish, our despair, our hate. I tell my tired and beaten-down body to move, to climb the rings, to seek that Light, and snuff it out. I would find whoever put that light there, and make them feel my pain, my agony.

Well, that wasn't such a good story, now was it? Let's hope that guy doesn't find you, right? Here, let's reset and try again. This time, let's put the Light on the outermost ring, and see what happens.


Light reassures me as I stand on the rim of my ring. I feel its heat, and people from other rings say it allows them to sense things from a distance, to know what's around without them making sounds. That's alright. We have machines that use the light, like Investiture or a power source, to tell us what's around. In my free time, I enjoy exploring the rings, helping people, and anything else I can do for our system. I look to the next ring, where some of my family live. I jump there, and spend a while searching for new things in the ring. I scan the describer device upward, to further rings, and to the central ring, which, I'm told, glows with the cast light of all the other rings. I lift my face, and feel the warmth of the Light.

That night, I dream of another world, where the Light has chosen to selfishly glow only on the central ring. I wake up sobbing at the idea, seeing the people, filled with fear and hurt and pain, just barely surviving, and beg Elyon to have mercy on those people, if they do exist.

Ah, that's better. Since the central ring already has some light, putting the big Light on the outer ring allows the light to move inward, giving all rings light, not just one. Yet, software and web developers selfishly think of the central ring, which stands for “the 99%” or “majority” of people, and disregard those who need their services the most. Thus, people with disabilities, neurodivergent people, people who don't speak English, people who have trouble reading, people who have trouble processing images, Autistic people, and so many others live in a world that slaps them in the face every moment of every day.

Technology doesn't care. Bits and bytes can be used to help people with disabilities in so many ways. Yet they are used, in so many ways, by ignorant abled people to bar access to so many things, from playing video games to COVID tests. And we can't even move towards the Light, as it were. With Linux, the free and open source system made by abled people, almost every desktop environment has huge accessibility issues. In fact, even if you find a good one, you still have to enable assistive technology support. And the Mate desktop, which has been the most accessible we have, only because it's based on Gnome 2 from 10+ years ago, is starting to show its age. Chrome-based apps, like VS Code and Google Chrome, crash out of nowhere. Pidgin crashes while writing a long message. And if a Chrome-based app crashes, Orca is lost, and one has to immediately set focus to the desktop, or it'll be totally lost until it sees a dialog it creates.

So that means we can't even get into a great position to learn to make our own stuff, from some of the best courses, like The Odin Project, which requires that you use Linux, macOS, or the Linux system on ChromeOS. Windows, the most accessible system, which is supported by a large community of blind developers, and is created by a company which, in recent times, is getting more into accessibility, isn't allowed.

So think on this, when inspiration strikes for a new site, a new app, a new package. If you help the least of us, you'll help the best of us too!


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!

About a week ago, I got a new laptop. It's an HP, with an AMD 5500 processor. With 8 gigs of RAM, 512 GB SSD storage, and a modern processor, I think it'll last a good while. I do hope I can swap out the RAM for two 8 GB sticks instead of 4 GB sticks.

After using Windows 11 for a while, I got the Linux itch again. Windows was... slower than I expected. Along with some games being more frustrating than fun, I decided to just do it.

So I installed Fedora. I chose the Fedora 35 Mate spin for this. Well, first I tried the regular Gnome version, but Orca couldn't read the installer, so that's great. After getting it installed, turning on Orca at the login screen and on the desktop, setting Orca to always start after login, and turning on assistive technology support, I was ready to go. Except...
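For anyone following along, those same switches can also be flipped from a terminal. This is a sketch under assumptions: the schema keys below are the ones I believe Mate and Orca check, so verify them on your own system before relying on them.

```shell
# Hypothetical sketch of enabling the same settings from a terminal.
# Verify the key names on your system first with:
#   gsettings list-recursively | grep -i accessib

# Assistive technology support in Mate:
gsettings set org.mate.interface accessibility true

# Ask the session to start the screen reader (the key Orca checks):
gsettings set org.gnome.desktop.a11y.applications screen-reader-enabled true
```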

Bumps in the Road

I mainly use Google Chrome for browsing. After getting that installed, I opened it, prepared to sync my stuff and get to browsing. But upon its opening, there was nothing there. Orca read absolutely nothing. Baffled, I installed VS Code. Still the same, nothing.

So, I hunted down the accessibility cheat codes I used to magically make things work:

export GTK_MODULES=gail:atk-bridge


After restarting the computer, things worked. I could use Chrome and VS Code. Then, I set up Emacs with Emacspeak. After a lot of looking around, I discovered I needed lots of ALSA stuff: alsa-utils, mplayer, sox, and all that sound stuff. Oh, and replace serve-auditory-icons with play-auditory-icons so all the icons play.
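For anyone retracing this, the sound dependencies can be pulled in with dnf in one go. The package names here are my best guess at the Fedora equivalents, and mplayer in particular lives in the third-party RPM Fusion repos rather than in Fedora proper:

```shell
# Sound stack that Emacspeak's auditory icons lean on. Names are
# assumed Fedora package names; mplayer needs RPM Fusion enabled first.
sudo dnf install alsa-utils sox mplayer
```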

It was during my setup of Emacs that I found one of the joys of Linux: dotfiles. I copied the .emacs files from my ChromeBook to the new Linux PC, and it was like I'd simply opened the Emacs on my ChromeBook. Everything was there: my plugins, settings, even my open files.

Linux is really snappy. Like, I can open the run dialog, type google for google-chrome, press Enter, and there's Chrome, ready almost before I am. Pressing keys yields instant results, even faster than Windows.

Nothing's Perfect

Even with all this: fast computing, Emacs, an up-to-date system, and the freedom to learn about computing, there are some rough edges. If you close a Chrome-based app, like VS Code and such, you have to move to the desktop immediately, or Orca will get stuck on nothing. If that happens, you have to press Insert + H for help, then F2, to bring up some kind of dialog for Orca to land on. It seems Mate's window manager doesn't put focus on the next window. Also, the top panel in Mate has lots of unlabeled items. And there are very few natively accessible games for Linux, but with Audiogame Manager, there are plenty of Windows games I can play.



Today, I began trying to make the most of Emacs. Mainly, this just means activating the packages that I’ve already installed. I’ve noticed that, even when using the Emacs package manager with MELPA added, packages don’t always get “enabled” and configured to work. Some of them require that you put (require 'package-name) in your .emacs.el file. So, I went through the list of packages and, one by one, read their GitHub pages to see how to configure them. It’s slightly annoying, yes, but I’ve gotten a bit out of it.

First, I found out a lot more about the extra Org-mode blocks and links added with a package. I don’t remember the name now. And then, I found a few packages that I didn’t need upon further inspection, so I got rid of those. And then, I started hitting some big gold mines.

LSP Mode

LSP (Language Server Protocol) is basically an IDE-like bundle of code checkers, refactoring mechanisms, and documentation tools that brings IDE features to your text editor. Or something. All I really care about is that it brings VS Code-like functionality to Emacs. And there’s a Grammarly extension! The only problem is that after I load LSP mode, Emacspeak reads a little extra info each time I switch buffers, like there’s still another frame there or something.

So, I plan on using that mainly with Markdown files, although Grammarly doesn’t seem to like filled paragraphs, and I hate unfilled paragraphs, although I can deal with it when working with Gemini. Ah well, maybe I’ll just turn on visual-line-mode everywhere. I don’t know. At least it’s not like VS Code, where the screen reader cannot read visual lines and only reads logical lines. Emacspeak handles visual lines by playing a sound when the next logical line is reached, but speaks each visual line as I arrow or use C-n or C-p.


Helm

Helm is a completion package. It’s really great how Emacspeak works with it: I can just type to narrow things down, without needing to type, then tab, then type again, all that. And, unlike the mini-buffer, I can perform many actions on the thing at point, not just press Enter to activate it. It’s really great, and I’ll definitely incorporate it into my workflow.



I tried to write this on Mastodon, but 1,000 characters just isn’t enough. Since I am blind, and do not have any other visible disability, I don’t know what it’s like to not have the use of my legs. Therefore, if I’ve misrepresented anything in the following section, let me know.

You’ve always hated your legs. They flop uselessly at the end of your body; there, but just to show off that you’re different. That you can’t actually use them. Like a blind person’s eyes, just rolling around in the head, without use. You particularly hate your legs today, as you sit in front of a set of stairs with a helpful “accessibility controller.” At the top of the stairs. You could pull the lever on the controller box, and a ramp would lower to the ground. If only your legs worked.

You remember when these things were invented. It was a bill at first, written after a video of a person in a wheelchair suffering severe brain trauma after falling down stairs while attempting to get medical help. The media ran the video nonstop until the people boiled with anger, and so the government did as little as possible, as usual. So now these things exist. After another video was made of a person falling down stairs while trying to activate one, stairs leading to public buildings were altered so that, if a wheelchair is pushed up them backwards at a certain angle, the rider can reach the top, and the lever. Hopefully.

So, you take a deep breath, turn the wheelchair around, and prepare to try to reach your appointment.

The moral of the story: accessibility switches are bad. The UI of software, or anything really, should be accessible from the beginning, and if a user has to go in and manually put accessibility enablement statements in .xinitrc and .profile, your crap is broken.

When I have to go into the Mate desktop’s menu, then “System,” then “Personal,” then “Assistive Technology,” and enable the use of assistive technologies, that tells me that, without this, Linux would be far, far less accessible. And what if a user doesn’t know about this “trick” to enable them to use their system? Well, they’d think Linux was far less accessible than it is. And even with full accessibility settings on, I can barely use Zoom, which is a pretty important program these days. Google Docs is another thing I struggle with, in Firefox and in Chromium. And yes, Google Docs is another piece of junk that requires an accessibility switch. Even with all this in my .xinitrc and .profile:

export GTK_MODULES=gail:atk-bridge

exec mate-session

stuff is still hard to use, like Zoom and Google Docs. And just how much of this is even still needed? Do we still need “export QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1” when we have “export QT_ACCESSIBILITY=1”? Am I missing yet another flag that has to be enabled?
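Spelled out, covering both bases in a .profile would look like this. This is a belt-and-braces sketch of my own, not a known-good minimum, since it’s unclear which Qt versions still check the older flag:

```shell
# Qt accessibility toggles, old and new. Harmless to set both
# until someone confirms which Qt versions still read which.
export QT_ACCESSIBILITY=1
export QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1
```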

Meanwhile, Mac and Windows are accessible by default. No need to turn on any flags, or check a box that tells the system, and probably all apps, that hey, this guy is blind. Funny how privacy goes out the window when you’re freaking disabled, huh? Funny how closed-source, proprietary systems are more accessible, and more privacy-friendly in that regard, than a system made by the people, for the 95%. But that’s what I get for being a nerd.



So, I’ve been looking for more Gemini clients. Not that Elpher is bad, but I’m not always on my computer, as much as I’d love to just be able to sit at the computer, or more specifically, on my bed with my USB keyboard in front of me, plugged into my laptop, twenty-four seven. Unfortunately, there are times when I need to just suck it up and use my phone: when I’m outside sitting on the porch on a warm day or evening, when I’m on the way to or from work, or when I’m in my rocking chair.

So, I looked through the list of clients on Gemini’s circumlunar site and found Elaho, a client for iOS. I liked it. It was simple and displayed things fine. After a slightly long discussion on the Gemini mailing list, however, it got even better!

Today, I got an update that basically puts preformatted blocks into an image item type, with the alt text as the image name. Something like that. And VoiceOver works amazingly well with that! So now, I don’t even have to deal with most ASCII art! I can just relax and read gemlogs with my braille display, and everything is simple, luscious, plain text! Well, plain as in readable, with headings and links and such.



So, I just got this all figured out, so I thought I’d write it down here before I forget. This is about me tweaking my Arch Linux setup to be a bit more productive a little faster.

The non-problem

So, I use Arch (BTW), so I naturally have full control over what my computer does. Well, besides the firmware, the loose ports, and all the software I have no idea how to work or use (yet). But I’m getting there. I still need to figure out how to make this LADSPA sink in PulseAudio go to whatever card is plugged in, not just the Intel built-in speakers/headphone-jack thing.

Anyway, I had ESpeakup (the Speakup screen reader speaking through ESpeak) starting at boot, giving me a good indication that the system was ready. I could then type in my username and password, then race to type `startx` before the network manager connected to Wi-Fi, because it was kinda fun getting that notification that I’m connected to Wi-Fi.

Then, I needed to type my password again, because some login keyring wasn’t authenticated via a mere shell login. Ah well. But all that wasn’t very productive. For one thing, I almost never used the console for anything. So why log in using it? I just used it as a jumping-off point for startx and mate-session.

So, I tried a few display managers. My first choice was LightDM, as I wasn’t sure GDM would let me start Mate, or if it was tied to Gnome. Well, LightDM didn’t seem to have Orca support. Or, if it did, getting it working was more work than I was willing to expend. So, I went back to no DM and just using startx.

So, then, I tried GDM, the Gnome display manager. This worked well; I was able to start Orca within it. The settings were just the default Orca settings, with slow speech and such, but I could deal with that. I just needed to hit Enter, type in my password, and hit Enter again. But then, I started Emacs. The DTK_PROGRAM environment variable wasn’t set to “outloud” anymore, so Emacspeak used ESpeak, which it doesn’t support very well. Then I tried other programs: some QT apps weren’t accessible, and neither was Chromium. My environment variables weren’t being loaded. So, I went back to no DM and just using startx.

So, today, I can’t remember why I wanted to try this again. Ah yes, it was .bash_profile versus .bashrc. Also, I need to find a new Aspell dictionary with more computer/Linux terms and such. But anyway, I wanted to see if .bashrc worked to get environment variables loaded when using GDM. So, I enabled GDM, but found that Emacs (with Emacspeak) still loaded ESpeak. That was kind of disappointing.

So, after a few restarts, I determined that it wasn’t me, that the .bash_profile was made right, and that when loading GDM, it simply wasn’t being taken into account. So, I looked it up, and found that most modern Linux distros load from .profile, not .bashrc or .bash_profile. Well, that makes sense.

So, I found that, yes, I do have a .profile, and that it was practically empty. I filled in everything from my .xinitrc, .bashrc, and .bash_profile that I’ve added over the months I’ve used Linux, and restarted. And it works! Emacs loads with Outloud, Chromium is accessible, and all is better, needing one login, not basically two with the keyring authentication. So, here is my .profile:

export DTK_PROGRAM=outloud
export LADSPA_PATH=/usr/lib/ladspa
export PATH="$HOME/.gem/ruby/2.7.0/bin:$PATH"
export SAL_USE_VCLPLUGIN=gtk3 GTK_MODULES="gail:atk-bridge"
export GTK_MODULES=gail:atk-bridge
export EDITOR="emacsclient"

alias git=git-smb

Yeah, it could use a little cleaning, but the extra stuff about GTK3 was for LibreOffice, and I ain’t messing with that.
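One way to check whether a session actually picked up a .profile like the one above is to grep the environment from a terminal inside the desktop. This little check is a sketch of my own, not part of the original setup:

```shell
# Print any accessibility-related variables the session actually has.
# If this prints the fallback line, the display manager never sourced
# ~/.profile, and the apps launched from it won't see these settings.
env | grep -E '^(DTK_PROGRAM|GTK_MODULES|QT_ACCESSIBILITY|LADSPA_PATH)=' \
    || echo "no accessibility variables set"
```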

