[SLED] Re: Blind people in SL - Idle speculation
jeffrey.hiles at wright.edu
Wed Aug 2 14:36:46 PDT 2006
Mike, I like your thinking--that people who can't see would be able to
navigate SL on their own, by sound perhaps. Or that there would be built-in
aids for people with low vision. Very progressive ideas.
Unfortunately, as it is, Second Life is about as far as you can get from
accommodating people with visual disabilities. And that's why I was
particularly interested in hearing how someone was making that
accommodation, assuming the story is true.
As Danielle said, right now you would have to pair a blind student with
another student or with an assistant who could navigate, read, and describe
what's on the screen. That's not unique to people with visual disabilities,
though. I think some of the folks with cerebral palsy who use Second Life
have helpers type for them.
The visually impaired could participate more directly, though, if the SL
client were accessible to screen readers. I know blind people who have
embraced instant messaging with clients that work with JAWS. So, in theory,
it would be possible for people who can't see to carry on their own text
conversations in Second Life. That degree of independence, I think, would
make the experience more immediate and immersive.
However, the Second Life client doesn't currently give screen reader access
to chat or IM text. In fact, you can't even read the menus with JAWS. If the
client did have that most basic accessibility--chat, IM and menus--blind
users would still need some assistance getting around. So, I guess I can't
be too down on the Lindens about it.
Still, as Ellie said, it's "a really big issue that will not be pushed to
the back seat" when we have Section 508 to abide by in the U.S., and the UK
and Australia have similar standards. The important thing is to have that
contingency worked out before you have a blind student enrolled in your class.
Mike Reddy writes:
> Well, I am often full of ideas and enthusiasm, then my time dries up
> with FL. So, please consider this as idle speculation and run with the
> ideas if you like.
> Blind people and SL
> This would be easier with a client SDK that could trap text, use text
> to speech and allow keyboard macros. But given the existing client,
> could we not have a HUD or head-mounted scripted object that 'spoke'
> information: location, people's names as they came and went, object
> IDs? Within the current system, these would probably have to be
> pre-recorded and linked to specific text, say in a notecard.
> Alternatively, objects in an 'accessible' area could self-report, say
> if someone approached them within a certain distance for a certain
> time. This area could be made the home location for participants. We
> could even run a competition to design accessible vending machines
> that used sound instead of, or as well as, text.
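The dwell-then-announce logic behind that self-reporting idea is easy to sketch. In an actual build it would live in an LSL sensor script on the object; the Python below is only an illustration of the logic, and the class name, range, and dwell threshold are all invented for the example:

```python
import math
import time

REPORT_RANGE_M = 5.0   # illustrative: announce only within this distance
DWELL_SECONDS = 2.0    # illustrative: ...and only after lingering this long

class SelfReportingObject:
    """Sketch of an 'accessible area' object that speaks its
    description once an avatar has lingered nearby."""

    def __init__(self, name, position, description):
        self.name = name
        self.position = position        # (x, y, z) region coordinates
        self.description = description  # text to speak or play as audio
        self._entered_at = {}           # avatar id -> time it came in range

    def poll(self, avatar_id, avatar_position, now=None):
        """Call periodically (as an LSL sensor repeat would fire).
        Returns the announcement once the avatar has dwelled in
        range long enough, otherwise None."""
        now = time.monotonic() if now is None else now
        if math.dist(self.position, avatar_position) > REPORT_RANGE_M:
            # Avatar left (or never entered) range: forget its timer.
            self._entered_at.pop(avatar_id, None)
            return None
        entered = self._entered_at.setdefault(avatar_id, now)
        if now - entered >= DWELL_SECONDS:
            self._entered_at.pop(avatar_id)  # report once per visit
            return f"{self.name}: {self.description}"
        return None
```

Keeping a per-avatar timer means a passer-by who merely walks through the area never triggers the announcement, which avoids turning an accessible region into a wall of noise.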
> To aid people with visual impairments - most people who are blind
> actually retain some vision - it would be great to have control over
> field of view in the client: narrowing the mouselook view to a small
> angle would effectively give a magnified image, much as PC screen
> magnification software allows the whole screen to be enlarged. Sadly,
> this would not easily include text. However, if we had a HUD object
> repeating any 'heard' text in the mouselook view, then even this might
> be possible. This would require chat in the mouselook view...
> Ah well, maybe when I have a PhD student to throw at it...