devinprater

@devinprater@tweesecake.social

I am blind. Because of course I have to mention it in my profile. Accessibility drives me. I use all the major operating systems in some way, and find great things about all of them. I also enjoy reading, eating, relaxing, eating more, and chatting. I want to be a cat when I grow up.

My opinions are my own, and definitely do not reflect those of my employer.


devinprater, to windows

So for blind Windows users, what mail client do you use? I'm kinda getting tired of Gmail: having to hear lots of stuff before each message, having to turn on browse mode and arrow through all of it just to get to the message body, all that. And Thunderbird is messy in 115. I might have to just get used to Outlook.

:boosts-OK:

devinprater, to opensource

Please boost for reach: One of the reasons that I pay $20 per month to @podcast is that Jonathan doesn't back down from saying what is right, and what a lot of us, I feel, are afraid to say. On the last episode, he talked about how he wasn't able to get Living Blindfully Plus onto Apple Podcasts because of accessibility issues. After contacting Apple, they replied with something like "These things take time." And Jonathan said something that I'd not put much thought into: that if a sighted podcaster had encountered this issue, it would have been fixed in a day. He'd been waiting close to a week for something to happen. And I think that some blind people, particularly in the Android communities, don't want to push these big companies, or these foundations and communities, so they justify to themselves and their communities that they should just wait, that programming takes time, that there are only so many people on the accessibility teams. And that's usually where the discussion ends. But I want to push it further. Why is it that we should have to accept rampant discrimination? Why should we have to wait until everyone else's issues are solved first? Why are there only a few people at companies, foundations, and in communities willing to help us? I want you all to think about that. It's taken Google 3 years to support a new Braille HID standard that was released around 2018. Even now, there are no intentionally accessible Apple Arcade games. It'll take Fedora and KDE 5 years or so to make their desktops/OS accessible to blind people. We've yet to hear an apology from the Gnome Foundation for how a blind, nonbinary person was accused of spreading misinformation about their personal experiences with the Linux desktop. I ask you all, do these things really take so much time? Or is time simply used as an excuse to not do the work? Anti-ableism isn't a badge you wear, it's work you do.

devinprater, to random

You know, I'm honestly glad to see people looking at the disability community, especially blind people in my case, and going "You know, I had no idea it was this bad." Like that thread on learning to code for kids? Yeah. Best way is to hit the ground learning through typing in code and hearing the response. It sucks. It really, really does. And the kid, or even teenager or adult, would have to first know how to use their device; iPad, computer, phone, notetaker. And then they'd need to learn to code. Like, as blind people, we cannot just pick up a device and automatically know how to use it. We have to learn how to use the screen reader on the device, practice with the device, and then start using it for whatever we picked it up for in the first place. And I know I'm a terrible person for saying this, but I think more people need to see these kinds of failings of technology before anything substantial will be done about it. Because it's so easy to scroll on past yet another blind person mad at tech, yeah some screen reader update, yeah foss bad Apple good and all that. But it's another thing to see a blind person, right in front of you, trying to learn to use an iPad, and learn to code, or do whatever on it, at the same time. It's really sad. But honestly, I think more people need to be sad about it before we can move any further as a society on disability issues.

devinprater, to accessibility

On Twitter, it felt like people added image descriptions in a kind of "Oh fine, I'll do it" mood. Here, it's almost hard to believe: people do it gladly, kindly, in a spirit of helping others, not just because Twitter kept pushing it.

And if this attitude of actually wanting to help, wanting to work together, spreads from Mastodon to the rest of the computing industry, I think things will be a lot better. Because this is a lot of what anti-ableism is: not doing something begrudgingly, like Twitter, but doing it because it's good, because it helps, because if it were you on the other end, you'd want someone to do it for you.

devinprater, to accessibility

So, I was thinking about this video on Chrome/Edge accessibility. Basically, it shows that the memory and CPU hog isn't the screen reader, it's the browser/accessibility pipeline. And honestly, it shows how, once again, being disabled is expensive. Not only are we told from a young age that we'll need to work twice as hard as abled people in order to get to the same level as an abled person, and then get a job that pays about $9 an hour for factory-type work, or $20 per hour for highly specialized tech knowledge work, we also have to buy expensive technology that allows us to give that 200% of ourselves to our jobs and lives. Like, you see us buying an iPhone 15 or 14? In almost all circumstances, that's not a status symbol. That's because accessibility is so unoptimized that it slows our devices down. Because we have to have modern devices, a good 8 GB of RAM (minimum) on a laptop, or a modern A-series chip, or, hell, probably an M-series Mac, just to keep up. I'm not sure about the M-series Mac part, as I've not spent enough time with one to see how it compares to my 2019 Intel Mac.

We see this a lot with Android phones. A sighted person may be able to get by with a $250 Samsung phone. But when I worked with one, I felt the lag with TalkBack acutely. We may be able to blame the phones, or the PCs, or Intel Macs. But then why does an Intel PC run just fine with NVDA? No, digital accessibility is all about software. If it can run well for a sighted person, it should run well for a blind person. And that's why I always say that the OS is at the root of all digital accessibility, followed of course by whatever you run above it in the stack, like the browser. So we have to work 200% harder, buy hardware that's a good 200% more expensive, and we're still slowed down. Do not praise us for overcoming these things. Help us by eliminating the need to overcome them in the first place!!

Video: https://www.youtube.com/watch?v=yyN7HvwZj18

devinprater, to accessibility

Looks like this month is disability pride month. And I'm not sure what that really means. How can I be proud to be blind? How can I be proud that people have to do extra work to support me in digital spaces? How can I be proud that I cannot drive? How can I be proud that the only way I can enjoy pictures is through others' hard work, or a blurry AI that might add things to or take things away from the description? How can I be proud of being locked into proprietary operating systems due to many developers not even knowing about FOSS accessibility, let alone considering it? How can I be proud that, for most blind people, to read Braille digitally costs $700 or more, outside the US? How can I be proud of the fact that my mobility and orientation skills aren't very good and that I have to be shown a route from my home to the mailbox across the street? Or that if I'm not using Braille, I have headphones in all the time, signaling to sighted people that I am not to be disturbed and thus halting interaction? Having so few apps, out of the millions of them, that I can use? That so many blind people feel that they have to just be grateful for what we have, and not push forward? I don't know. I am not proud to be blind. I would accept vision in a heartbeat. I'd look at all the beautiful art and user interfaces and animations, drive around seeing the world, use Linux and Android, read books, even their images, see all around me and know where to go, understand people better through body language and facial expression, and play all the video games, old or new, that I want.

devinprater, to random

So a few posts ago, I gave the other side of the little “oh M actual G, people that don’t use de-Googled phones are evil” junk that happened on Fedi today. But now, I’m gonna give the good side of FOSS, because the slightly positive part of me still believes in it.

While Linux, LibreOffice, and Android have not changed accessibility for blind people, there is one piece of FOSS that has: NVDA. It has an amazing community around it that reports bugs, fixes them, talks about new features, supports add-ons with a store, supports localization, all that, all on GitHub. All in the open. NVDA is so good that the VS Code team primarily tests accessibility with it. Not VoiceOver, not JAWS (as far as I know), but a free and open source screen reader.

You know what else is cool? LibLouis. This tiny little Braille translation library is quite literally everywhere. NVDA and Orca use it, yeah. That’s to be expected. But JAWS also uses it. VoiceOver for Mac and iOS even uses it. It’s literally becoming a standard, if it’s not already. That’s the power of open source in the blind community, to say nothing of Orca itself, which is getting more contributions and contributors lately, and BRLTTY, which is still going strong.
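
(If you’ve never poked at LibLouis directly, here’s a minimal sketch of what it does, assuming the liblouis Python bindings, the louis module that ships with liblouis, and its standard table files are installed; the table names here are just the usual Unicode display and UEB grade-2 ones.)

```python
# Minimal sketch: contracted braille translation with the liblouis Python
# bindings. Assumes liblouis and its "louis" module plus the standard
# table files are installed; table names may differ on your system.
import louis

# "unicode.dis" maps dot patterns to Unicode braille characters;
# "en-ueb-g2.ctb" is the UEB grade-2 (contracted) English table.
tables = ["unicode.dis", "en-ueb-g2.ctb"]

print(louis.translateString(tables, "Hello, world!"))
```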

So FOSS isn’t all bad. Just like people that use out-of-the-box Android, Windows, Mac, or proprietary Braille products aren’t bad because they use, or have no other option but to use, these things. Blind people have tried hard to get into the FOSS world, and have forced their way into a spot where they can at least use some programs, and the Mate desktop. We should celebrate that, and push harder for better access to all the other tools. Because when even huge corporations use our tools, like NVDA and LibLouis, you know they’re good. But Linux is a huge tower of stuff. We literally cannot climb it alone. And being stuck at the bottom makes some people very angry.

devinprater, to accessibility

New idea: a browser extension that not only shows people how a screen reader "reads" the page structure, but also how the speech engine attached to the screen reader pronounces it (a rough sketch of that second part follows the list below).

  • tl. doctor
  • l imfao
  • im hoe
  • init ramfs
  • img10092837744 dot jpeg
  • 3999 characters remaining. 3998 characters remaining. 3997 characters remaining.
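
(If you want to actually hear that second half for yourself, here’s a rough sketch; pyttsx3 is just one off-the-shelf offline TTS wrapper, not any particular screen reader's engine, and every engine mangles these strings a little differently.)

```python
# Rough sketch: feed raw strings like the ones above to a speech engine
# and listen to how it renders them. Assumes the pyttsx3 package.
import pyttsx3

samples = ["tl;dr", "lmfao", "imho", "initramfs", "IMG10092837744.jpeg"]

engine = pyttsx3.init()
for text in samples:
    engine.say(text)      # queue each raw string as-is, no preprocessing
engine.runAndWait()       # speak the queue and block until it finishes
```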

devinprater, (edited) to Futurology

It's 2027. LLMs are built into Systems on Chips. Everyone sees their own personalized worlds. Their computers show things in a way the user likes. Or the manufacturers like. Or the ad agencies like. Who knows. Apple helps us all write calm, understandable texts, posts, and books. Google shows us, in AR, "only what we need to see." A map for the walk we take to decompress. No, there are no homeless people in the street. Just follow the lines on the map. Yeah, like that. Hear that soft music. Your own personalized playlist, all made by AI. You like Mooncake, right? Well, here's something that sounds like them. A little. But it's 24/7. More, more, more.

Some people make mistakes in their work to show that they're human. That wrong note? That's a mark of humanity. That misspelled word? They're one of us. That blotch of ink? A soul made that. Perfection is of the machines. To err is human.

The blind can see now. But at what cost? The machines know us all now. They see our faces. They see them, pick out details from what they see and what they know. Then they feed that to blind people, who eagerly gulp it down like a dry sponge. But the AI doesn't mention how fake the smile is, on the person who sees the camera that sees them. Wave for the camera, for the machine. But for the blind person, who only wants to have what sighted people were born with? Well.

Our computers then correct all that input. That misspelling? Surely the human didn't mean to do that. The blotch of ink is gone. All distilled into blandness. People begin writing on paper again. Blind people get what the AI gives, just as before. People are angry that their analog becomes digital again. Cycles and cycles. Dim and light. Gifts and hooks. Humanity and the seeking and the taking.

devinprater, to workersrights

So, please boost for reach if you can. I have to do something about all the stress from work. Our workplace has what appears to be an employee assistance program/service, like counselling and such. Does anyone have any experience with EAPs, or an EAS? Is their confidentiality for real, or have any of you found them leaking info to your bosses and such? I just can't keep going like this.

devinprater, (edited) to accessibility

You know why some blind people are really leaning into AI to fix accessibility issues? No, not like overlays that probably barely have any if/else statements in them, let alone AI, but stuff like Be My Eyes, and, gasp, screen recognition in VoiceOver for iOS? Because shit sucks, and it's sucked for the last 40 years of computing history for blind people. That's why whenever we get even a bit more light, even if 20% of what an AI says is fake, that other 80% gives us 80% more info than we had before. And yeah, we should all, every single one of us, know that AI can give false info by now. Hell, Mastodon folks have been shoving that into our ears with an oversized Q-tip since the day ChatGPT came out. We get it. But hot damn, being able to point my phone out the bus window and take pictures as I'm going to work, hearing about a fire station, or a house with a dog in the yard, or that it's a sunny, clear, nice day outside even, is really freaking nice. And sure, maybe it's not a fire station. Maybe it's a courthouse, or a post office, or something else. But it's something that I would never have known before. Because I don't have some sighted person telling me about what's around, and I wouldn't want any other human to have to do that for me. Like, this is the thing. In order to get 100% perfect info, I'd have to hire another human whose whole job is to look around and tell me, in extreme detail, what's around me. Now, sighted people of Fedi, would you want that job? Maybe for a day. Maybe for a week. But months of that? I doubt it. And that is where AI comes in. No, it ain't perfect. And the more you deviate from its training data, the less accurate it gets. And maybe eventually we'll get to a point somewhere in the middle of what VoiceOver Recognition is and what LLMs are. But I'm just getting tired of this OMG AI is the end of the world rhetoric. It's really getting old.

devinprater, to foss

So, OS. Tried it. No, it's not blind friendly. Alt + Tab doesn't speak with Orca most of the time. In setup, the full name, username, password fields aren't labeled, so tabbing with Orca says nothing. After setup, the welcome screen is full of "GTK button checkbox" and other mislabeled stuff. It's not ready for blind people to use.

devinprater, to accessibility

Boosts welcome: Someone I've known for quite some time, and who has used Linux for, like, pretty much their whole life, wrote a thread this morning on their experiences with using Linux as a blind person, particularly the GUI. I'd like anyone in FOSS and Linux to read it, and consider where you stand on this. Are you comfortable with using tech that excludes the most marginalized and vulnerable of us? Are you comfortable with being in a community that sidelines us, calls our heartfelt feedback "very frank", and allows for sentiments like this? There comes a point where saying "Linux is for everyone" rings hollow. When you say Linux is "your OS", and I can barely use it, how do you think that makes me feel? When a community leader comes at someone from a marginalized community, and now we have another person, one who has used Linux for even longer than I have, who has great CLI skills but now relies on Windows for web browsing, saying a lot of what I've said, with more technical details, what does that say about the collectivist model of free software? What does that show for any kind of collective community? And thankfully someone else in the Mastodon thread I'll link to below shed some light on how Microsoft blackmailed AFB into getting rid of its free software/hardware initiatives, threatening to pull funding if AFB didn't do what Microsoft demanded. I remember this from Twitter, but never wanted to say anything because I didn't want to get the facts wrong. You may then say "Well then Microsoft needs to pay for what they did." And you're right. But then, that means that we need a corporation to do what hundreds of developers will not. So basically we'd be saying that the collectivist model and non-corporate stance that FOSS has had can't even help the most marginalized of us.

Mastodon Thread:

https://tweesecake.social/@xogium@tech.lgbt/110507457734423536

#accessibility #Linux #foss #blind

devinprater, to accessibility

You know, I hear some talk of getting rid of techno-utopianism, of getting rid of the corporations, but what else is there? Linux, where no screen reader works with touch screens, so no mobile Linux for us; where the best desktop for us blind people is Mate, with no notification center; where, if you have Braille enabled and quit any app that isn't GTK based, Orca gets lost in the middle of nowhere? Maybe that bug has been fixed, but for like a year or more, it wasn't. And yes, I know the Orca maintainer has been doing tons more work on Orca lately, and Gnome now has a blind person working on a new way of doing accessibility on the desktop. But I keep coming back to that Fedora meeting, where the tools were so hard to use that I could barely participate, as one of the few blind people there with even a finger on the wheel to steer a thing that should be about us. And I don't feel like a group of sighted, abled people can make this new vision of computing, a sort of community-led thing, any better than Linux. I mean, we have communities. And there are still images without alt text. There are still Linux live images that don't even have Orca on them. These distro communities still expect blind people, who have been thrust aside for the past 20 years, to come to them with their... feedback. Such a clinical word that's become.

Look, even if it's a community from the bottom up, who's at the bottom? I assure you, it won't be blind people. And if you splinter up that community and tell us to make our own distros, well, we've tried that. Vinux, Sonar, F123, Blinux. All gone. You know which distros support us the most? Debian/Ubuntu, Mint, and Arch. With Fedora, you still have to enable accessibility variables yourself, at least as of the last time I checked, around version 37 or 38.

When you say "everyone," what do you mean by that? Your group? Your group and adjacent groups? All people? Do you know about all people? Do you know about blind people? Or Deaf people?

devinprater, to internet

For those who help out blind people by only boosting images with helpful alt text descriptions, thank you so much. Y'all are the reason, along with Mastodon being based on an open API, that Mastodon is not just accessible to blind people, but somewhere that we're flocking to. Like, blind people aren't just using this space to talk to non-blind people about our issues anymore. It's a place where we can talk amongst ourselves too! And when you get to a point where you can comfortably, on many operating systems and apps, not even remember that this isn't a space just for yourself and people who share your experiences, even for a minute, then you've moved in the right direction! The API allows for accessibility, because blind people can make interfaces for blind people. Alt text, #HashTagsWrittenLikeThis, and general acceptance, though, mean blind people are going to appreciate, and thus stay on, Mastodon and the Fediverse at large. And that is far more important than money, or the users on Twitter who, if they're blind, will have to deal with Twitter's website and apps, where before they could pop open a compose window with a single key command, or read their tweets from anywhere on Windows, like glancing at an open Twitter tab. All that freedom and flexibility is now on Mastodon, Telegram, and GitHub, and not Twitter. Not Facebook. Not Discord, not Reddit, and definitely not TikTok. Of course, website accessibility is still very important. If I'm not mistaken, sighted people generally interact with Mastodon through the site and first-party app. But having that open API means a heck of a lot.

#accessibility #blind #Twitter #Mastodon #fedi
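
(A tiny sketch of what that open API makes possible, a plain-text timeline reader in a dozen lines; the endpoint is Mastodon's standard public-timeline API, the instance is just the one from this profile, and the HTML stripping is deliberately crude.)

```python
# Tiny sketch of a plain-text Mastodon client built on the open API.
# Assumes the "requests" package; no login needed for a public timeline.
import re
import requests

resp = requests.get(
    "https://tweesecake.social/api/v1/timelines/public",
    params={"limit": 5},
    timeout=10,
)
resp.raise_for_status()

for status in resp.json():
    # Statuses come back as HTML; strip tags crudely for one clean,
    # screen-reader-friendly line per post.
    text = re.sub(r"<[^>]+>", " ", status["content"])
    text = re.sub(r"\s+", " ", text).strip()
    print(f"{status['account']['acct']}: {text}")
```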

devinprater, to accessibility

So I know it’s only been like two or three people so far, but sighted developers asking how to use Orca on Linux to test for accessibility shows that the time is fast approaching when accessibility is going to matter on Linux. I don’t know if it’s gonna be the explosion of the Steam Deck, or FOSS communities, or the rising interest in non-corporate computing and social networking and stuff. But this is already trickling down. Developers are needing to know about accessibility. They use Linux. I’m so glad to see KDE working on accessibility, and I hope that, one day, blind people can boot up a Linux installer, be told how to enable a screen reader, and be guided through setup and onboarding after installation (with no unlabeled buttons please, Gnome, even though it’s just a slideshow, and with an accessible process for adding Internet accounts). Shoot, I’d even be glad to write tutorial information for it, like I did for QuickDocs in Fedora. But the time to focus on it is now, not when accessibility hits the FOSS community like a train. And I appreciate every single developer that comes to me wanting to learn and understand. It’s miles, miles better than not giving a damn.

devinprater, to ai

So, y'all thought ChatGPT was bad? Well look at what Bard spat out for someone.

User: "If 1+1=2, then what is 1+2?"
Bard: "If 1+1=2 then1+2=4.
This is because 1+ 1 = 2 is a false statement. In
mathematics, 1+ 1= 2 is not a valid equation. It is not
possible to add 1and 1 and get 2."

#ai #bard

devinprater, to random

Introducing the next generation of audio captchas! Do you hear the... bees? Lol, no, but I ran into this captcha where you have to enter the number of the bee sound. So I think they're at the "isolate this sound" stage of machine learning.

devinprater, to accessibility

You know, it's not just the "new" AI image recognition stuff that gets things wrong. Remember the old VoiceOver Recognition of like a year or two ago? Yeah, that gets things wrong too. We shouldn't be surprised that AI gets things wrong. The only reason why abled people are so shocked about it is because they've not had to use it for the past 30 or more years. You know what OCR is? Yep, AI. Or, AL as some OCR programs call it. Why? Cause it got it wrong. Yeah, it's a different kind of wrong, but AL does not equal AI. And yes, screen readers know the difference.

Now, some may say that a letter here or a symbol there doesn't make OCR bad. But imagine having to use that on a PDF for work purposes. Yeah, you may want it to be a bit more precise in those contexts. You want it to always get numbers correct, for example. 500 is a bit more than 300, or 100.

Now, we blind people have used OCR for a long time. It has improved majorly since even the first KNFB Reader phone. Yeah, actual hardware. That was better than the reading machine, a huge device built just for that purpose. So when we see AI make stuff up, or get things wrong, we're used to it. It is what it is, and we cope just fine. We either accept the info as junk, like some OCR results, or ask someone else who knows better, like a sighted person that can look at the OCR'd text. Oh, that word "Canada" is actually the Canadian flag? Yeah.

So when sighted people are all in deep, bright, red rage about OpenAL (see that?) and AI, I kinda am like, "Yeah welcome to our world." We get incorrect or incomplete info all the time. You think our canes tell us about that cat sitting up there in a tree to the left? Or that wasp just waiting to sting something? Haha no. We have to adjust to things all, the, time. And these models are just getting started, like OCR in the 80's. They will get better. But yeah welcome to AI, abled people.

devinprater, to accessibility

Please boost for reach: If you're an Android developer, this page is really good for giving overviews of accessibility tips for developers, including Accessibility Actions, which are far easier to perform with TalkBack lately, since users can swipe up or down, just like iPhone VoiceOver users can. https://developer.android.com/guide/topics/ui/accessibility/principles

devinprater, to accessibility

I visited a meeting today. Came home with a printed piece of paper that I cannot read. I won’t be joining. So tired. But it’s whatever. I’m glad I found out before joining.

devinprater, to accessibility

Gosh y'all, some of these Android users are just, like... It's like they have a hate boner for Apple. Like, wow. Not even I feel that strongly about Google, or Apple, or Microsoft really. They're companies full of people, with their own trajectories, with tiny accessibility teams, and their own strengths and weaknesses. Apple manages to have enough internal communication to make things work well together. Google runs just about everything an everyday user uses. And Microsoft has kept Windows going for decades.

I mean, I rarely hear any iPhone users just plain have an anger-orgasm over Google. At most, it's apathetic "Yeah I wish they were better but there's not much I can do about that, so I'll keep writing my novel in Ulysses with my Braille display," or something. I mean, maybe I've just not seen this secret group of Google-hating blind people. I mean, my choice to use, and lightly suggest, an iPhone is from my experiences with Android. I've used this Samsung phone for like a year and a half now. I went 6 months without even touching my iPhone, especially after I got Google TalkBack on there instead of Samsung's out-of-date version. And still I've come back to iPhone. And I still don't hate Google. I don't hate anyone at Google. I think they could work together more, especially the TalkBack, Assistant, dictation, and Android UI teams to give TalkBack a way to shut up while a core part of the phone is recording, no matter what phone you're on. But I'm not cumming in rage at them. In fact, I'd still gladly work with them. Honestly I'd work with all three of the OS companies, and the FOSS orgs, to make things more accessible or enjoyable to use. Because there is no perfect OS for blind people right now. Every single one has their good and bad parts. A sighted person may be able to just move between them all, able to do anything on each one. I want that for us blind people.

devinprater, to accessibility

After spending about a year and a half with both iPhone and Android, I think I understand both operating systems very well. iPhone can do so, so much. A blind person even wrote a book about using the iPhone, on the iPhone with Ulysses and a Bluetooth keyboard. But as iOS ages, it looks more and more tattered, with some parts thinning, and others having large holes in it at times. Yes, those get patched up, but how long must we wait until then?

Android, on the other hand, is much, much smaller, but is smooth and mostly clean, with very few thin patches, and if there are holes, they’re generally not as noticeable, or can be worked around easily. So, the question is: do you want something that can do a lot but has, well, issues that will make you want to just throw it away sometimes, or do you want a mostly good experience that you can’t do that much with?

And no, the analogy doesn't work very well, since on Android, even using voice assistants is a pain, because you have to silence TalkBack or Google will be listening while TalkBack is speaking. Really annoying bullcrap. But other apps, like Element and Telegram, work better on Android than they do on iOS.

But what really gets to me sometimes is that none of our issues at either company are handled equitably. If a sighted person opened their notification center, only to have their screen go blank, that wouldn't even have made it into a beta release, let alone production. If a popular keyboard and mouse only worked through USB and not Bluetooth, which is the main use case, sighted people would be all over the Android team, asking why, and when. Instead, even though news orgs are on Mastodon, blind voices are still small, swept away, and drowned out.

devinprater, to apple

> iOS 17.1 is coming this week to fix everything wrong with your iPhone, MacWorld

I hate it so, so much. Will it fix the problem where non-HID Braille displays don't work with VoiceOver? Will it fix the Eloquence voice being quiet as crap? Will it fix the issue where sometimes even a double tap doesn't register correctly, leading students and people new to VoiceOver to think they're the problem? Will it freaking fix any VoiceOver bugs? Oh fuck no, cause we blind people aren't a part of everyone, so our bugs aren't a part of everything that's wrong with "your" iPhone. Fuck you too, MacWorld, fuck you too. I hate this type of "we know everyone's issues and we'll just downplay them all at once" kind of reporting. I hate this kind of normal-washing of everything. Fuck that shit with the pointiest thing available. Ugh!

devinprater, to accessibility

It's 2025. We finally have 7 Assistive Technology Trainers. Five of them are AI. One human is for screen reader users, and one human is for magnification users. We even have a course writer. Oh, it's an AI too. We have a media editor trainer. Yeah, that's an AI too. We even have a web and document accessibility analyst. Yep, AI.

It's so nice now, not having to do so much on my own. I mean, well, we humans still here joke that the AI staff are taking a day off when the OpenAI servers falter or crash. But the coolest part is, our organization only has to pay the enterprise fee, and we get as much staff as we want.

Oh hey, hang on, I need to take this call. One of our students needs help with something. "Assistive Technology, this is Devin. Yeah, I have time; what's going on? Okay, lesson 10? The AI said what now? No, no, just arrow to the word and hit the applications key; you don't need to do anything complicated yet. Yeah, I know. But we don't have the budget for any real people, so we use GPT as kinda the front line. Yeah, that's how it is, I'm sorry. Okay, let me know if you have anything else you need. Yeah, you have a great day too, and we'll see you tomorrow."

Yeah, that kind of thing happens sometimes. But like I told them, we can't hire anyone else. And we have way too much we need to do. So in order for me to not break down and quit, we made some GPTs and hooked them up to the AI Worker system, with memory and a phone line, that just came out a few months ago. It's easier now. But sometimes I wonder what wrong info students get from these bots. I mean, I know in 2024 the NFB made OpenAI make their site accessible and train on blindness texts, but you never know. I always have to tell them about the new NVDA settings panel released in 2024.3, or the new features in JAWS 2025. But I finally have time to do other stuff, like learn new AT, or go to college, and our 500 AT students do fine for the most part. I hope.
