As you can see, the primary obstacle for blind users on Mastodon is the lack of a fully accessible client for the service.
[Mastodon Client Accessibility Comparison :: Fire Wally Dot Net](https://firewally.net/post/mastodon-client-accessibility-comparison/)
I think my best bet right now for getting all my content across platforms is to use IFTTT. It seems to require the least amount of struggle for the pay-off.
Just got an e-mail in my inbox saying the sender enjoyed my article on peaches… I haven’t written an article on peaches… I suspect shenanigans on the part of the humans.
Last week, I posted about my experience with HIMS Inc. and the nightmare I went through to get my SmartBeetle braille display repaired, only to have it break again and discover it was effectively totaled. You can go back and find that post if you like, but the summation is that it was a horrible experience, HIMS Inc. is the Yellow Eyes to my Winchester Brothers, and there’s no changing my mind. The morning after I posted that, I ordered the Orbit20™, priced at $449 before tax and shipping. As I’m drafting this, I’ve had the device for just under three hours, which is long enough for me to have formed an opinion and feel comfortable sharing it.

Why This Over Other Reviews?

The other reviews I’ve seen go through the unboxing of the Orbit20™, call it a nifty device, then complain about the lack of cursor routing keys. Since I’d seen those reviews, I was well warned before I purchased; the missing keys do impact some features, but more on that later. Also, I’m not going to spend time on the unboxing, because that’s information you can find on the Orbit Reader 20™ product page. In fact, if you don’t get all the items in the box, you’re supposed to call the seller. What you’ll find here are my impressions from using the device.

At the time of this writing, I’ve only tested the device as a Bluetooth display for my iPhone running iOS 12.1.2. I figure that’s what most of my readers will be interested in, and I don’t imagine the stand-alone reader and notetaking capabilities would be much different from this experience.

About The Layout

For purposes of this post, the important thing to understand about the layout of the Orbit20™ is that the six keys for entering braille are at the very top of the unit; there is a circle of arrow keys with a select button in the center, positioned between and slightly below the six input keys; the spacebar is below the circle of arrows; and dots 7 and 8 are on the left and right sides of the spacebar respectively. This means that dots 7 and 8 are well below the main six input keys, and I’ll get to exactly why that’s important shortly. The device’s exterior is made entirely of plastic, and let’s skip the Greek that is actual dimensions and just say that the unit is a little chunky in the hand by the standards of a lot of electronics you see these days. The positive is that it actually feels durable, which is a nice contrast to a lot of today’s electronics.

Reading and Interacting

Since I’m using this to interact with my iPhone, the way I use the Orbit20™ is probably different from the way someone who uses it to interact with a PC might experience this product. If you are doing that, you’re welcome to share your experiences in the comments; just be respectful.

14 Vs. 20 Cells

I noticed this difference right away between the Orbit20™ and my SmartBeetle. Basically, there’s practically portable, and then there’s sacrificing functionality in the name of portability. At 20 cells, you can read most app names without panning the screen, unless there’s a notification badge. With 14 cells, I often had to pan at least twice to read many app names. This difference really comes into play when I’m reading email, messages, or social media posts, especially when the content has multiple emojis, the descriptions of which can end up feeling like their own paragraph.

Basic Navigation

Most of the navigation can be performed using the circle of arrows. Push the right arrow to move to the next item on screen, the left arrow for the previous one, and the select button to perform the double-tap. You can use the up and down arrows to move between items based on the setting of the rotor. You can also perform these actions with the six main keys and the spacebar, but you’ll be switching between your left and right hands a lot so the other hand can do the reading.

Commands Using Dots 7 or 8

Dots 7 and 8 sit to the left and right of the spacebar, which is below the six main input keys and the circle of arrows. This means there’s a good two to three inches your fingers have to cover to input a keyboard sequence like Command + Enter, which you do by pushing dot 1 + dot 7 + spacebar, releasing, and then pushing either dot 8 + spacebar or dot 1 + dot 5 + spacebar. This is a keyboard shortcut commonly used in apps like WhatsApp and Facebook Messenger. It’s perfectly doable, but I find it easier to just find and press the actual send button.

Writing

Just Text Input

Regular text (braille) input is one of the best parts of the display. The keys fit the fingers well and press with minimal effort. You can do most of your typing without moving your fingers away from these keys.

Editing

This part isn’t so nice. I said I wouldn’t go on about the lack of cursor routing keys, but the fact is this absence impacts the process of proofreading what you’ve written. With that said, iOS lets you customize many commands for a braille display, so this can be worked around with a little time and patience.

All in All, a Good Device

Overall, the Orbit Reader is a good device. Its high points for me are the 20 braille cells and the sharpness of the braille. It’s also extremely responsive to screen changes and button presses. Its low points for me are the layout of some of the buttons and the noise it makes when the braille refreshes. Out of all the displays I’ve owned, this one is extremely loud; it sounds a bit like a fly trapped in a window screen on a summer evening. This is, however, a worthwhile buy if you use braille or know someone who does.

Introduction

Face ID for iPhones has been available for over a year now. Because I learned my lesson about early adopting from the Apple Watch, I decided to stick with a device that still let me unlock my phone with my fingerprint. With the release of this year’s line of iPhones, one thing was made very clear: Face ID isn’t going away in the near future. So, feeling secure thanks to Apple’s excellent return policy, I figured it couldn’t hurt to try it out.

Okay, you’re blind, but you’re into photography. What’s the big deal?

As it turns out, there’s really no big deal at all. When I take a picture, my goal is to get the object or objects centered in the frame and take the picture. Face ID works just like that once it’s set up. Jonathan Mosen published a getting started guide whose directions and security advisories are still current, so I don’t see a need to rehash them here. There’s really only one point on which I disagree with him, and we’ll get to that in a minute. What you’ll find here are some considerations for setting this feature up if you’re totally blind, followed by a description of a training method for using the phone’s attention detection feature.

Roadblocks to setting it Up

The getting started guide I’ve just linked to details the setting up of Face ID with VoiceOver. While it is a very simple process, almost simpler than setting up Touch ID, there are some potential barriers that could impact one’s user experience and first impressions of this new feature. The first one has to do with the attention detection feature.

It’s Disabled by Default for a Reason

When you set up Face ID with VoiceOver on, you get a dialog that tells you that attention detection is disabled, and that you can enable it in Face ID and Passcode settings if you wish. Since I made the mistake of enabling it without any sort of training and had to deal with the results, I feel comfortable telling you that the best thing to do is leave attention detection disabled until you’ve finished setting up your iPhone, which includes but is not limited to the setting up of all accounts, as well as two-factor authentication apps and password managers. Some of these apps make you verify your identity after attention detection is enabled, but trust me when I say that extra bit of effort is a lot easier to swallow than the frustration you’ll experience otherwise. Once you’ve read the training method section of this post, you may wish to consider enabling attention detection if for no other reason than leaving it disabled has security implications. The next issue has to do with lighting.

Finding light

I’ve been dealing with facial recognition apps for a while now, and proper lighting is important. One of the implications of the condition that causes me to be totally blind is that I have light perception on some days and am completely without it on others. The result is that I need a reliable way to find light. Seeing AI has a light detection feature that lets me do just that. It operates on a C scale, playing a higher note when brighter light is detected; the note becomes extremely high and atonal if the light is too bright. For the record, the best light for facial recognition is indicated by an E on the scale. For those of you who are unfamiliar with musical scales, this is the first note you sing in many songs, including but not limited to “Mary Had a Little Lamb,” which many people tend to sing in the key of C for some reason. Since I had an iPhone before, I was able to map out my apartment to find the best lighting prior to the new arrival, but you can do this any time before entering the setup screen. The final barrier has to do with just how to move your face.

Like clockwork? Not exactly.

I said earlier that I disagreed with Mr. Mosen on one point in his getting started guide, and here it is. In his guide, Mr. Mosen says,

Imagine that your nose is a hand of an analogue clock. Your nose needs to reach as many points on the clock as possible. So, after double-tapping the “get started” button, and waiting for confirmation that your head is positioned correctly, point your nose up to 12 o’clock, then move it around to 3 or 4. Point it down to six o’clock. Move your head in the opposite direction, so it reaches 9 or 8. Then conclude by moving it up to 12 again.

Here’s my problem, and I realize it may be a personal one. A clock is a two-dimensional surface, but the circle in which you need to move your head to set up Face ID is actually three-dimensional. There are lots of blind people, myself included, who have trouble interpreting two-dimensional representations of three-dimensional space and objects. This makes maps and diagrams especially useless for me. Here, when I tried to follow those directions and get my nose to 6 o’clock, my head ran into my right shoulder, and I got stuck at four or five o’clock. With some help from the VoiceOver prompts, as well as relating it to my own experiences, I came up with the following:

Imagine that your head is a joystick on a game controller or old-style arcade machine. A joystick moves in a total of nine directions: Center, forward, back, left, right, forward and left, forward and right, backward and left, and backward and right. Start with your head in the center, then move it through the remaining eight positions to capture your face, making sure you don’t move outside the phone’s field of vision. If you do, VoiceOver will let you know, and you’ll just have to reposition your head to continue. Once you’ve completed the process and finished setting up the rest of your iPhone, it’s time to train yourself to use attention detection.

How I Trained My Unruly Eyes

Another implication of my visual condition is that I have nystagmus, which for purposes of this discussion means I have absolutely no control over my eye movements. This is what the eye doctors have always told me, this is what I told anyone who asked, and this is what we all believed. Aside from people getting upset because they think I’m rolling my eyes at them, it hasn’t caused me too much trouble. If my experience with Face ID and attention detection shows anything, it’s that I may have more control over it than I thought. Here’s the process I went through, and I’m betting some of you will be able to do this too.

Taking Selfies to Find the Camera

You might not have realized it, but the iPhone’s front camera has an extremely bright flash. It’s so bright that even though I didn’t have light perception yesterday, I could feel the heat from it. In my case, I still have my eyes rather than prosthetics, so all the nerves are still intact. I spent a good half hour taking selfie after selfie until I could consistently get the heat of the flash in one or the other of my eyes. You can double-check this by going through photos with VoiceOver, and it will notify you if there are blinking eyes, which tends to happen when a bright light hits them. The next step was to enable attention detection, and go through the same process until I could consistently unlock the phone.

Making my eyes move where I want when I want

Here’s the thing to remember: Eyes, regardless of whether or not they are performing their intended function, are a body part. This means, at least for me, that I can make my eyes move in conjunction with another body part, my hands and arms in this case. By holding my phone in both hands at or around the center of my body, I was able to make my eyes go toward the middle of the phone to first find the flash, and to then get that satisfying click sound that means my phone is unlocked. I then had to keep doing it until I could unlock my phone in an amount of time comparable to the time it takes me to use Touch ID.

Conclusion

This post described three barriers I encountered while setting up Face ID on my iPhone, and how I worked around them. I then explained how I trained myself to use the attention detection feature to allow myself the most security possible from the device. At this point, I can unlock the phone consistently with Face ID and the attention feature turned on. I still get failures now and then, but I used to get them all the time with Touch ID, too. I still haven’t made up my mind on whether or not I like Face ID, but I still have thirteen days. Most telling, though, is the fact that I haven’t brought myself to wipe my old iPhone just yet.

Background

For those of you who have been following along, I’ve decided to make it a goal to make emojis a bigger part of my self-expression. The biggest reason for this is that they seem to be more universally understood than regular words, even though my screen reader has an assigned verbal expression for each emoji. How do I know? I’ve never seen a social networking post or text message that was read to me as, “Going to a funeral today. Face with tears of joy.” To those of you who are visual, this message would look like, “Going to a funeral today. 😂”

When I first proposed the emoji goal, the response I got was something like, “Why would you want to use those? They take so long to type.” The truth is that if you’re using a touch screen device with a screen reader like VoiceOver on iPhone, using emojis can be a lengthy process. This post describes two shortcuts you can use to type emojis on your iPhone more quickly and efficiently, without installing third-party tools.

The Simplest Solution is Sometimes the Best

Most people have forgotten, but emoticons were the first emojis. At a glance, they are made by combining two or more punctuation marks. Here is a complete list for you. On iPhone, when you type an emoticon and insert a space, it is automatically replaced with an appropriate emoji, assuming the device’s autocorrect feature, which comes up again in the next section, is enabled. You can make a lot of the most common emojis this way. If you’re looking to use more complex emojis, continue to the next section.

How to Use Text Replacement to Quickly Type Emojis

What is Text Replacement?

Here is an article that explains what text replacement is and how to use it. You may wish to read it before proceeding to the steps below, if for no other reason than that it provides an alternative to my explanation style. Let me just say that before the days of third-party keyboards and Braille Screen Input for VoiceOver users, this was one of the quickest ways of typing on a touch screen. Now… the fun stuff.

What to Do

Have you read the above link on text replacement yet? If not, this is me strongly recommending that you take the time to go back and do so. … … … Okay, I can see forward is the only direction you’re interested in going, so here we go.

For this tutorial, we’ll be telling our iPhone that when we type “lcf” (without quotes) followed by a space, it will be replaced with 😭. Once you do that, you’ll be able to create as many text shortcuts as you like for your own favorite emojis.

  1. Go to Settings➡️General➡️Keyboards. If you do it right, you should get the screen shown here. The keyboards section of the iOS settings. Available options are keyboards, hardware keyboard, text replacement, and options for autocorrect.
  2. Next, tap the text replacement option in the middle right of the screen. You should get a screen like this. The main text replacement screen, displaying the add and edit buttons, as well as the keyboard shortcuts added so far.
  3. You then need to tap the add button (+) in the top right. You should now have this screen. The add shortcut screen with blank fields. The software keyboard is showing, and the phrase field is currently being edited.
  4. Fill out the fields as shown here. The screen has 😭 in the phrase field and lcf in the shortcut field.
  5. Finally, tap the save button in the top right. Now, the next time you type “lcf” followed by a space, you should get 😭.

Now It’s Your Turn

You should now be able to make your own shortcuts. You can use them to type one emoji like 💩, or a series of emojis like 🦂▶️🐸. The only limits are those of your own creativity. The best part: these are backed up in iCloud, so your shortcuts follow you from device to device.