Ok so so far #talkback 14.2 is fine, nothing really major broken, except they somehow fucked up proofreading, so much of the time it can't find spelling/grammar errors at all. @mastoblind
Re my last boost: looks like HID support for Braille displays over Bluetooth is coming in Android 15. This should let devices like the Brailliant BI 20X and 40X, the Mantis, and the Humanware NLS eReaders connect to devices running Android 15 via Bluetooth. Fantastic news.
Clarification: these devices can be connected to Android devices via USB cables right now. The new support is for HID over Bluetooth, enabling us to use our displays wirelessly.
@changeling@mastoblind I mean, the swipe actions were not on #TalkBack until last year; we had menu actions similar to rotor ones for ages, and some of those have worked in Discord for a while now, mostly things like opening the server options from the server list to mark as read and such. They have rebuilt the mobile apps three times in two years, so I'm not surprised about the delay.
@objectinspace@amir If this is the Android 15 beta, it's still in the developer phase, and the actual release of Accessibility Suite 15.0 won't be for at least another 5 months or so. Also, what's strange is that #TalkBack is not a separate app on the Pixel side, so when it crashes, that is not the message you would get. What type of device were you running this on?
@objectinspace@amir I was worried about that being it… That one broke with #TalkBack in every browser I tried, including Chrome and Firefox; the select-your-country drop-down would never accept the entry and allow me to continue…
@spaciath Most work via USB. Brands like HIMS, Focus, and Orbit generally work because they retain a backup Bluetooth protocol; Humanware and Help Tech do not offer any backup method for devices that cannot use HID, wired or wireless, which is actually not hard and could be added with an update. The Bluetooth HID support for Android is out of the #TalkBack team's hands, but people could pester #google about this to try to get the people who should be on it to get on it. @mastoblind
Does anybody use Google Docs? Since I'm cutting Apple off, I will no longer have access to the program I used to write everything in, and I need a new one pronto.
@evilcookies98 Ok so for both Word and Docs, do not use the #android apps; they are both beyond inaccessible with #TalkBack. You can use the browser versions, however, and for a lot of things that's what I use.
Google seems to be fixing the Send button not appearing for TalkBack users. It appears to be a server-side change, as some TalkBack users have the Send button and some don't.
Short PSA to #WebDev: do not, do not, enable #Accessibility overlays! I was just on a website where the AccessiBe toolbar actually made text completely disappear, somehow turning plain text paragraphs into a DIV and a hidden element. I'm telling you, seek out an accessibility-ready theme instead. #Wordpress has hundreds, and Shopify has them now too. You. Are. Wasting. Your. Money. On these toolbars. #Internet
Hello, fellow #blind people. Somebody asked me about resources for learning how to use #talkback on #android. I am not an Android user and have no idea about any of that, so I am looking for guides, tutorials, videos, anything like that. Any resources anybody can offer are appreciated.
Morning Mastodoners! Up playing with my teensy Jelly Star Android phone, which arrived earlier this week and which my wonderful friends Glenn and Megan helped me set up last night. So far my first foray into Android's been an interesting, surprising, and occasionally frustrating experience. And yes, if the name Jelly Star has you thinking of at least one Spongebob character every time you hear it, be assured you're not the only one.
I'd never used #Talkback extensively before yesterday, and while typing and some of the gestures are taking some getting used to, I love all the sound effects and haptic feedback. I'm sure I'll be spending a lot of time on @accessibleandroid's site over the next few weeks. #Blind #Android users, got any newbie tips or app suggestions for me?
Hey, I've talked before about the major thing slowing down Android screen readers: double taps! To recap: when you tap the screen, the screen reader waits a moment to see if you'll tap again, so it can register a double tap. Only after that delay does it register a single tap and tell you what's under your finger. This makes tapping slow! In normal navigation that's fine, but when typing it slows us down a lot: not only do we have to wait a fraction of a second for each single tap to register, we also can't touch type quickly, because fast taps start registering as double taps!
The fix for that is really easy. Just make screen readers skip the double-tap logic in the keyboard area while the keyboard is up (if you've selected any typing mode other than double-tap typing). There's no need to handle double taps in that area: you just put your finger down, it instantly registers a single tap and tells you what's under your finger, and when you lift, it types. That would be great, and I'm very sure it's easy to implement. (Oh, how I wish I knew enough Java and the Android API to implement that in TalkBack...)
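For the curious, the idea can be sketched in plain Java. This is a hypothetical illustration, not actual TalkBack code: the class name, method, and return strings are made up, and 300 ms is just a typical Android double-tap timeout, not TalkBack's real value.

```java
// Hypothetical sketch of double-tap detection with a keyboard-area bypass.
public class TapDetector {
    static final long DOUBLE_TAP_TIMEOUT_MS = 300; // assumed typical timeout

    private long lastTapTimeMs = -1; // -1 means no pending first tap

    /**
     * Decides what a tap at time nowMs means.
     * Returns "double-tap" if it pairs with the previous tap,
     * "single-tap (instant)" for taps on the keyboard while it's up
     * (the proposed fix: no waiting at all), or
     * "single-tap (after timeout)" otherwise.
     */
    public String onTap(long nowMs, boolean inKeyboardArea, boolean keyboardUp) {
        // The fix: over the on-screen keyboard, don't wait for a possible
        // second tap; announce the key under the finger immediately.
        if (inKeyboardArea && keyboardUp) {
            return "single-tap (instant)";
        }
        // Elsewhere, a tap within the window pairs with the previous one.
        if (lastTapTimeMs >= 0 && nowMs - lastTapTimeMs <= DOUBLE_TAP_TIMEOUT_MS) {
            lastTapTimeMs = -1; // consumed by the double tap
            return "double-tap";
        }
        // First tap: in a real screen reader, the announcement would be a
        // delayed callback that only fires once the timeout expires.
        lastTapTimeMs = nowMs;
        return "single-tap (after timeout)";
    }
}
```

The point of the sketch is the first branch: inside the keyboard area there is simply no timer, so the per-key feedback has zero artificial latency, while normal navigation keeps double taps working as before.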
Because Android itself is not slow at all! In fact, it's instant as far as I can tell. And you can test that, even with your screen reader, just to see that the reader is artificially slowing itself down.
First, focus on an item. Now double-tap, but make your second tap pretty late: just before the timer ends, not much before it, and not after it. You will notice the double tap registers right after your second tap, instantly if you get the timing right, and that's the frustrating part. The screen reader is intentionally slowing itself down, without giving us an option to change the preset timer or implementing the easy keyboard fix that would make touch typing possible and fast! #Android #AndroidAccessibility #ScreenReader #UserExperience #AccessibilityIssues #Accessibility #Talkback
You can now have images described, check spelling in Braille, have Braille automatically scroll at a set interval, and ... that's about all I got before Windows Subsystem for Android broke.
Oh, I just discovered that since version 1.14 of /e/OS (de-Googled Android), we have TalkBack FOSS (the Android screen reader, but de-Googled)!!!
It's so great!!!
🥳🎉🎊
So one of the things for me about Android is the TTS engines. Apple ships, basically, three speech engines in every one of their operating systems: Vocalizer, which is what VoiceOver starts off using; MacinTalk, which gives you Alex, Fred, and all that; and Eloquence. Three different ways of speaking, different pronunciation sets, all that. Actual choice.
On Android, though, a Pixel comes with Google TTS. And if you've ever been somewhere with no Internet and heard a low-quality, robotic voice from Google Maps, you've heard what we have to deal with on a Pixel every single day unless we install something different. So on Android, there are a few more options: RHVoice, which honestly doesn't sound so good to me in English; eSpeak, which is as robotic as you can get and was last updated on Oct 23, 2022 (almost a year ago); and Vocalizer, which had its last update on Oct 30, 2021 (better than I thought, but it still feels unmaintained). That's about all I know of.

On Samsung phones, you get Samsung TTS out of the box, and it's pretty good. Of course, then you get Samsung's TalkBack, Samsung's version of everything, but also all the goodies that come with Samsung phones. Oh, and Samsung TTS has a longer pause between everything because it was made for reading content, not for screen reading, so everything feels slower than it is.
So it's really sad. Eloquence is still a 32-bit app, so it will not work on the newest Pixels. Google TTS's newer local models are sluggish with TalkBack and cannot speak quickly, as many have found out when working around the fact that TalkBack doesn't use the newer model natively. It's also sluggish when reading long pieces of text, like this one. And there iOS is, with tons of voices to choose from. And I get it, I should be thankful that we even have eSpeak, but when you come home from a stressful day at work, what do you want to hear?
I'm not gonna lie, using YouTube Music, at least the audio player, is really nice. When a song starts playing, TalkBack tells me what the song is. I know some people won't like that, and I hope it becomes configurable. But when I go to the next-song button, then lock the screen, then unlock it and get back into the app, TalkBack focus is still on the next button, ready for me to just double tap. I can also use the Good Lock Sound Assistant module to set a long press of the volume buttons to go to the previous or next track. Ugh, I just love that kind of stuff. I can't wait for TalkBack to catch up to VoiceOver.