People who use #ScreenReaders, imagine a feature on your instance where you can automatically hide any post that contains images/videos without alt text.
If you reply with an opinion and YOU PERSONALLY don't need alt text for accessibility reasons, I will probably block you.
Hello there. So, I never thought I'd ever be using Mastodon for this, and it's a long shot, but I'm looking for a paid job. I'm a senior in college and I'm going to be graduating with a degree in business management in May, and I want to be ready. As for my skills, I'm well-versed in Microsoft products, particularly Microsoft Excel and Microsoft Word, though Excel is my preferred application since I want to work with spreadsheets. Yes, I'm totally blind, but that shouldn't be an issue because of #ScreenReaders, #ADA #accommodations, and #RemoteWork. So, if anyone is looking for a dedicated person who genuinely enjoys helping others and working with functions from #statistics to #financial functions using #AppliedMath, I am willing to work for you, so help me #GetFediHired. Resume will be sent upon request through DMs. Thank you, and boosts are absolutely encouraged.
Not sure whether this has already been brought up here, but someone has created an #Accessibility assistant app for the VoiceMeeter application for Windows. VoiceMeeter is a bit like Audio Hijack for Mac: you can chain different inputs on a virtual sound card, run them through different outputs, and route the audio of other programs to it, just like you could with a physical mixing board. Currently I have it set up so that my microphone runs through it, along with any other app I route to it, and everything is played both on my headset and to the world wherever I decide to stream it. The microphone is muted on my end so that I don't have to hear myself, but others can still hear me. It is early days and many options aren't supported by the accessibility add-on yet, but the fact that I could create an optimal config for myself is already great. Let's tell the dev how important their work is; send feedback and suggestions. Happy testing! https://github.com/onyx-and-iris/nvda-voicemeeter #Blind #ScreenReaders #A11y #Audio #Radio
Help! I have a student in my computer science class whose JAWS screen reader won’t read any output from the Command Prompt or PowerShell. (It works in my instance of JAWS, as well as other students’.) The relevant speech history is ‘Unavailable’ ‘0’ ‘0’ instead of ‘hello world’. Has anyone else experienced this, or do you have any tips? Is the answer to reinstall JAWS? They’re so frustrated, and I don’t want them to hate computers 😢
A couple of weeks ago, a colleague told me they were too young to have used Nuance Talks. Which made me feel both old and sad that they'd never experienced one of the best #ScreenReaders ever made. #accessibility
Capitalizing every word in a run-on set of words like a hashtag is called CamelCase. Using CamelCase helps #screenreaders figure out where one word ends and the next begins: if you write QuitEmail, the screen reader understands you meant Quit Email instead of Quite Mail.
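To make the word-boundary idea concrete, here's a minimal sketch (not from any real screen reader, just an illustration) of how software might split a CamelCase hashtag back into words using the capital letters as boundaries:

```python
import re

def split_camel_case(tag: str) -> str:
    """Insert a space before each capital letter that starts a new word."""
    # Every uppercase letter except the first marks a word boundary,
    # e.g. "QuitEmail" -> "Quit Email", "ScreenReaders" -> "Screen Readers".
    return re.sub(r'(?<!^)(?=[A-Z])', ' ', tag)

print(split_camel_case("QuitEmail"))      # Quit Email
print(split_camel_case("ScreenReaders"))  # Screen Readers
```

A lowercase run-on like "quitemail" gives the software nothing to work with, which is exactly why the capitals matter.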
The case for AI in #accessibility is a hotly contested one, but I do feel the baby's being tossed out with the bathwater just a tiny bit. Yes, it is bad that AI is being used to phase out hoomans in all sorts of pursuits. And yes, it is also true that, at least at the moment, AI-generated anything is generally lower quality than hooman-generated stuff. And yes, it is also true that we're seeing AI in places we really shouldn't be seeing it (MDN, anyone?) and that people, just like always with a new toy, are going absolutely nuts with it and putting it front and center like it's Cthulhu's new miracle to end all toilet paper shortages. But it CAN, at times, actually be an enabler. It CAN, OCCASIONALLY, actually be used for good, and I don't think people who find this out and do this should be vilified.
If image generation gets good enough to make me Twitch banners, or blog featured images, or make small video edits for me, or write me CSS that actually looks good, you bet your gluteal hemispheres I'm going to make use of it. Purely a matter of efficiency. We haven't had a decent video editor that's #accessible for as long as I've been alive; I think I've waited long enough for hoomans to step up. If AI can do it, AI can do it. Screen readers can't tell me if CSS is doing what I want, so if AI can do it, that just means I can be more productive as a developer. Again, I've waited for this situation to improve in #screenreaders for my entire professional career, so if what is for all intents and purposes a new tool can enable me to do this, I will happily make use of it. Summarizing articles in between all the newsletter popups, ads, images without alt text, and silent videos? Absolutely. Just another #accessibility tool at this point. So yeah. Bad, but not all bad.
Finally I got a good catch on just how wack AI language model hallucinations can get and how believable they'll seem to folks who don't know any better.
I asked #chatGPT to list some third party #screenreaders for #Android besides the default one. If you know anything about screen readers, you'll be like "Hell naw!" when you see the model's output.
Output below:
1. VoiceOver: VoiceOver is a screen reader developed by Apple for its iOS devices, but it's also available as a third-party option for Android. It provides spoken feedback and enables users to navigate, interact, and control their Android device using gestures and voice commands.
2. JAWS: JAWS (Job Access With Speech) is a popular screen reader primarily designed for Windows computers. However, there is an Android version of JAWS available as well, allowing blind and visually impaired users to access Android devices and applications.
3. NVDA: NVDA (NonVisual Desktop Access) is another well-known screen reader for Windows, and it also has an experimental version for Android. NVDA provides speech output and braille support, empowering users with visual impairments to access Android devices.
Items 4 through 6 were Shine Plus, Spiel, and Voice Assistant, which are valid Android screen readers, so I'll at least give it that.
Most of the issues I see people bring up about #screenreaders are honest to goodness #accessibility issues with the tech and Unicode.
Sure, you can (try to) change people's behavior, but in some cases it's just straight up pitting marginalized groups against each other.
Sometimes, you might think that previous #accessibility wisdom has been superseded by new "facts". Maybe someone told you that #screenReaders don't work well with a particular design pattern, but you tested #ScreenReader X and it seemed to work fine. Perhaps you heard that an interactive HTML input doesn't persist with forced colours styling, but you tried a High Contrast mode in Microsoft Edge and it seemed to be there.
There are three considerations usually missing here:
How are you defining and evaluating the working state? Do you have a functional, accurate understanding of the #accessTechnology or accessibility feature you are asserting things about?
You tested one thing in relation to a statement about multiple things, e.g. a statement is made about screen readers, plural, and you only tested with #VoiceOver (it's always VoiceOver). Beyond posting on the web-a11y Slack, how do you propose testing more broadly, if you plan to at all?
Possibly the most critical of all: is this question worth its overheads? If answering it conclusively would require me to test ten screen readers with 45 speech engines, or seven browsers with 52 permutations of CSS properties, maybe following the advice is "cheaper" than determining whether the advice is still completely relevant.
Important disclaimer: this relates specifically to cases where following the advice would not actively make things worse for users.
TL;DR: when you know doing a thing won't make things bad, doing the thing is usually quicker than evaluating whether not doing the thing is also bad.
"I wrote an article in 2007 called Fieldsets, Legends and Screen Readers. It was my first post on the TPGi blog. I have been meaning to provide an update to it, for the last 15 years…"
I love it when people mention the fact that we have an alt4me hashtag, but please, when you talk about it rather than posting an image to it, put the hashtag in quotes without the number sign "alt4me" or say "the alt4me hashtag". Otherwise, people who watch that tag could start ignoring it since at least half the posts there aren't images now. Thank you! And please keep spreading the good news! #ScreenReaders #blind #AltText #ImageDescriptions #accessibility #a11y #access #disability
Asking people who use #screenReaders: what is the best way to add #AltText to a QR code? Are screen readers clever enough to recognise it as such and offer to read the contents or follow the link? Does putting the content of the QR code in the alt text help, or would it make it worse, especially if it is a URL?
Boosts are welcome.
(The whole point of the post that prompted this question was posting a QR code, so no, I couldn't have just posted the URL.)
"AI can help by providing mostly accurate descriptions of images on web pages. This can be especially helpful when the image has not been provided with a text alternative, but is visible on the page."