Handicap Access/RSI & Linux

Eric S. Johansson sent us a discussion document he's written about Linux and handicap access. I've included the full text below (with permission). Given the way my wrists have felt lately, this is something that really makes me think. What do you folks think? What work is being done for handicap access? For everything from RSI to vision impairment, I think this will become more important.
There has been a significant discussion in the Linux community about making Linux available to ordinary users, but there remains one group of users who are not able to use Linux in any form. I am referring to handicapped users. Whenever there has been discussion about handicapped users, it has been most frequently in the context of blind users. There are far more handicaps than simple blindness. There are a multitude of partial-vision handicaps, mobility handicaps, and aural handicaps. All of these handicaps need accommodation for Linux to reach the mainstream.

Why handicap accessibility is important

Microsoft is making a major push in the area of handicap accessibility. They are under tremendous pressure from handicap accessibility advocates and governmental regulators to make their applications handicap friendly. To their credit, they have a far greater range of accessibility aids and adaptability than is currently present in the Linux/X11 environment. Alternative input devices such as tablets, speech recognition, and special keyboards are available to users. You can use alternative output devices such as text-to-speech screen readers. Contrast this to the Linux world. While there are a few notable exceptions, Linux is "unfriendly" to people with disabilities.

For me, the two most compelling reasons for building handicapped accessibility into any operating system are 1) it's the law and 2) more importantly, it's the right thing to do.

Love it or hate it, the 'Americans with Disabilities Act' is the law. In a nutshell, it requires reasonable accommodation for handicapped people in public buildings and in the workplace. One of the major positive effects of ADA has been the creation of the design concept known as "universal access" or "universal design". The original goal was to create a design philosophy for constructing buildings that would be equally usable by handicapped and non-handicapped people. One of the unexpected side effects was that many of the design features that made it easy for handicapped people also were better for non-handicapped people. The reason for the side effect is that many of the physical tools (kitchen utensils, door knobs, sink faucets, etc.) that we use are kept out of habit and tradition. When examined for usability, better designs become quickly apparent and usually cost no more than the traditional design.

On the negative side, ADA creates liabilities for public and private sector employers. The reason this liability is particularly insidious is that when it comes to computers, an employer may have no choice but to discriminate on the basis of a handicap because there is no way to accommodate the handicapped user. Employers are damned if they do and damned if they don't. This liability in turn becomes a problem for the operating system manufacturer and the application vendor. Vendors like Sun and Microsoft recognize the liability and have research groups working on addressing the problem.

More importantly, building handicapped accessibility is also the right thing to do. Depending on which statistics you believe, there are between 45 and 54 million disabled Americans. Workplace injuries continue to generate hundreds of thousands of disabled people each year. The software development field alone generates some 50,000 handicapped people a year. By the time we reach roughly 60 years of age, about half of us will be disabled as a side effect of living on this planet. Any way you look at it, that's an awful lot of people to exclude from a computer-intensive society.

By not making computers handicapped accessible, you are telling some 20 percent of our population that they cannot hold a high-paying job, get additional education, or take advantage of the benefits of the Internet.

I know all of this first-hand. I was disabled in 1994 because of too many hours at the keyboard. I have recovered from my injury to the point where I can drive a reasonable amount, prepare food, and yes, use a keyboard to some extent. I'm still disabled because there are many things I can't do without causing myself a tremendous amount of pain. I have experienced job discrimination and public embarrassment because of my handicap. Yet I've been lucky. I've been able to reinvent myself and develop a work life with computers thanks to speech recognition systems. But far too many of my peers have just fallen off the economic ladder when they became injured/disabled. When talking with them about what's gone wrong and how to fix things, it comes up over and over again that computer handicap accessibility isn't good enough yet for most jobs.

Today, computer handicap accessibility is very primitive. Accessibility efforts have focused on very select communities such as the visually handicapped and the profoundly physically handicapped. Accessibility aids such as screen readers, unicorn sticks, sticky keys, and paddle switches are useful only if you have no other place to turn. Whenever I read articles or see programs showing some profoundly disabled person laboriously keying in characters so they can send email, I think people are astonished and pleased not because the accessibility aid worked well but because it worked at all!

The closest thing we have to a general accessibility aid for mildly handicapped people is speech recognition, but even that falls short, because it is optimized for a specific task. Speech recognition is tuned beautifully to English language dictation. Try to write code, however, and you'll soon find your throat feeling like you've been gargling broken glass.

When you step back and look at the entire handicap accessibility repertoire, it is apparent that accessibility aids are primarily point solutions that need to be done and redone with every revision of every operating system and application. A case in point is T. V. Raman's Emacspeak work. It's a wonderful example of adapting an application to a specific handicap. It's also a wonderful example of what's wrong with the state of computer handicap accessibility.

Emacspeak works extremely well as a tool for blind users using Emacs. It is extremely flexible, configurable and works with a wide range of Emacs Lisp based applications. The downside is it only works with Emacs and it only works with blind users using text-to-speech. It is a well done single point solution for a very targeted population.

Flaws with current computer handicap accessibility

There are many flaws in the current approaches to handicap accessibility. Some of the major flaws are:

  1. Additional development requirements for application developers
  2. Narrowly targeted accessibility aids
  3. No general model for accessibility
  4. Accessibility aid and application bound together on same machine
Application developers today need to go through extra effort to enable their applications for both visually and physically handicapped users. This is a flawed approach for a couple of reasons. First, if you do not live with a disability, it is very difficult to catch all of the usability flaws. Second, based on personal experience, anything requiring extensive additional work tends to get dropped as deadlines loom. That means extra work for handicap accessibility is the first thing that gets dropped from a project.

Most accessibility aids, by their very nature, are targeted to a specific handicap. For example, a unicorn stick does absolutely nothing for a blind user and a screen reader is overkill for a color blind user. Without a general model for accessibility, developers would need to build multiple user interfaces for a given application. It is my belief that a general model for accessibility would allow developers to add handicap accessibility to their applications with little or no effort.

Another flaw in the available computer handicap accessibility tools is that the adaptation and the application are bound together on the same machine. This forces the handicapped user into using a single machine which has been adapted for them. If your job requires you to use multiple machines, then you have to enable every single one of those machines with the same accessibility aids and configuration files and then try to keep those configurations in sync as your use evolves. If your enabled machine fails, you cannot work until the machine is repaired or replaced and all accessibility aids are restored.

With barriers like these, is it any wonder that handicapped people continue to have double-digit unemployment rates, even in times like these?

Considerations for handicap accessibility

There are better ways to solve the computer handicap accessibility issue than those provided by current solutions on the marketplace. The general case solution for handicap accessibility is a "hard problem" and I can't do it full justice here in this forum. The only aspect of handicap accessibility that I can speak to is that of a person with sore hands. However, with biases declared and out in the open, I will try to describe some of the requirements for building handicapped accessibility infrastructure.

A user should be able to

  • Modify or customize the interface to meet their handicap needs and thought processes.
  • Create incidental scripts easily to aid in task automation.
  • Rely upon applications to have all functionality, data, and state available for query by a common accessibility scripting environment.
  • Have the ability to store, retrieve, and act on state and context specific information.
  • Count on applications having the capability to invoke actions in the common accessibility scripting environment.


The need for a changeable interface comes from the different characteristics of each person's handicap. For example, someone using speech recognition has a different user interface experience from someone using text-to-speech. Incidental scripts are important because limited mobility or vocal capacity should be conserved when performing repetitive tasks. The next two items come out of the need for an accessibility aid to drive an application. Speech recognition needs to know context in order to improve recognition accuracy. Text-to-speech needs to be able to find out what's on the screen and then read it back to the user. Data persistence is important in order to remember what is valid to hear or say in a given context. The last item comes from the need of an application to signal a user. When email arrives, a deaf user who cannot hear a voice saying "you've got mail" could instead have a light flash on their desktop, triggered by the application calling a handicap accessibility script.
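
To make that last example concrete, here is a rough sketch of how an application might hand a notification off to a per-user accessibility hook instead of hard-coding a sound. It is hypothetical: the config file name, the event names, and the helper commands mentioned in the comments are assumptions, not an existing API.

    #!/usr/bin/env python
    # Hypothetical sketch: an application hands notification off to a per-user
    # accessibility hook instead of hard-coding a sound.  The config file name
    # (~/.accessibility-hooks) and the event names are invented for this example.
    import os
    import subprocess

    CONFIG = os.path.expanduser("~/.accessibility-hooks")

    def load_hooks(path=CONFIG):
        """Read lines of the form 'event: command' into a dictionary."""
        hooks = {}
        try:
            with open(path) as f:
                for line in f:
                    line = line.strip()
                    if line and not line.startswith("#") and ":" in line:
                        event, command = line.split(":", 1)
                        hooks[event.strip()] = command.strip()
        except OSError:
            pass  # no per-user hooks configured; fall back to normal behaviour
        return hooks

    def notify(event, message):
        """Run the user's hook for this event, passing the message text along."""
        command = load_hooks().get(event)
        if command is None:
            print(message)  # ordinary output for users without special needs
        else:
            subprocess.call(command, shell=True,
                            env=dict(os.environ, ACCESS_MESSAGE=message))

    if __name__ == "__main__":
        # A mail client might call this when new mail arrives.  A deaf user's
        # hook for "new-mail" could flash the screen (e.g. xrefresh); a blind
        # user's hook could pipe $ACCESS_MESSAGE to a text-to-speech program.
        notify("new-mail", "You have new mail.")

A deaf user and a blind user could then share the same application and differ only in their hook files.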

A long-term goal for handicap accessibility is to provide enough information so that an agent could explore an application via the interfaces listed above and generate an accessible user interface matching the person's handicap.

Conclusion

Many of these problems became apparent to me from personal experience. There are many others that we could add from the blind, deaf, and severely physically handicapped communities. Once such an accessibility infrastructure is implemented, handicapped users would be empowered to adapt their corner of the online world to their needs. Given the right boot-strapping tools, we can pull ourselves up the rest of the way.

I wasn't kidding when I said this was a difficult problem to solve. It will require coordination and communication between multiple project groups. It will cause serious changes to GUI toolkits. Each of the requirements carries potential security risks. But it will provide the means to make a positive change in the lives of millions.

Are you up for it?

Write me at esj@inguide.com

Comments
  • KDE has kikbd, the International Keyboard tool. One can use it to switch to alternative keyboard layouts (like Dvorak) with a hotkey. I'm learning Dvorak now and it is quite handy.

    KDE 1.1 might not include a Dvorak layout. If you need one, email me and I'll get one to you.

  • You certainly didn't offer much of a way to email you... Your email address is not available with the message, at your web page, nor is there a reference to the project along with an email on lilo's agalmics page..

    Either way, I'm interested.

  • Posted by Andrew Swaine:

    Have you ever tried to control a command line interface by speech recognition? It's not easy, and it's very slow, as command lines are optimised for minimum keystrokes and are generally totally unpronounceable. GUIs on the other hand can be controlled nearly as fast by speech as with a mouse if the speech recognition system can be made aware of the names of the menus, buttons, tabs and other controls available on the screen. Dragon Dictate managed to find these and add them to the vocabulary dynamically so you could click on items by just saying their name. Where this failed (ActiveX changed the interface) accelerator keys could be used.

    My point is that it is not impossible to do, it just needs some thought and standard interfaces. (A toy sketch of the vocabulary-building step follows this comment.)
    I used to be unable to type due to RSI, but I found Windows significantly quicker and easier to operate by speech than any command line (and faster than the traditional mouse/keyboard approach for some tasks).
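
To make the mechanism described above concrete, here is a toy sketch of the vocabulary-building step: collect the labels of the controls currently on screen and register each as a speakable command. The Control class and function names are stand-ins, not a real speech or GUI API.

    # Hypothetical sketch: register the labels of on-screen controls as
    # speakable commands, so saying a control's name "clicks" it.
    class Control:
        def __init__(self, label, action):
            self.label = label
            self.action = action

    def build_vocabulary(controls):
        """Map each visible control's name to the action that activates it."""
        return {c.label.lower(): c.action for c in controls}

    def on_recognized(utterance, vocabulary):
        """Called by the (imaginary) recognizer with the word it heard."""
        action = vocabulary.get(utterance.lower())
        if action is not None:
            action()
        else:
            print("no control named", repr(utterance), "on screen")

    # Example: two controls currently visible in some window.
    controls = [
        Control("File", lambda: print("File menu opened")),
        Control("Save", lambda: print("Document saved")),
    ]
    vocabulary = build_vocabulary(controls)
    on_recognized("save", vocabulary)   # prints: Document saved
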
  • I'm sorry if this is a small tangent, but I think this needs to be said.

    Large corporations have a single advantage that the open source model does not have at all. The advantage is that they can quickly focus all development on a single topic. They can also issue an edict to all development teams ordering them to use a single interface in order to focus on an issue such as handicap accessibility.

    How do we compete with such resources? We have two options and three outcomes:
    1) We hope that everyone else does nothing and also do nothing.
    2) We hope that everyone else does nothing, but find out later that they did so and get left in the dust.
    3) We do something about the problem, wowing everyone in the world as we do so.

    Let's assume #3 is the best answer. To do something about handicap accessibility, we probably need, as suggested in the article, a common messaging interface. This interface would have signals and data output. Based on the handicap settings, either the program would use its own interface to display the data or signal the user, or the handicap facility would do so.

    Part of the problem with the current system is that we have condensed output in order to reduce information for the user. In order to solve the handicap issue, programs must give context information that is normally hidden to the user, but is revealed to the handicap facility in order to help presentation.

    For instance, a word processor with this functionality would work normally without the handicap facility, but when installed, the word processor would behave differently. Its interface might change, it might read the output and menus, it might use sounds or visual cues to signal users. (A rough sketch of what such an interface might look like follows this comment.)

    What I'm saying is that someone, not me, should design such an interface. It needs to be robust enough to handle things from programming and word processing all the way to reading packets and perhaps even using the Gimp. It would be difficult, but feasible with the right design. Good luck.

    -Ben

    ps. For speech recognition users, someone might want to invent an easier language to speak than c or c++:
    for(int i=0;i<10;i++){printf("%d\n",i);}

    pps. hmm... a compact phonetic language in place of a compact character language...
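
Here is a rough sketch of the "common messaging interface" idea from the comment above: the program emits structured messages, and a presenter chosen by the user's accessibility settings decides how to render them. The class names, the settings dictionary, and the speech command are invented for illustration, not an existing toolkit API.

    import subprocess

    class VisualPresenter:
        def show(self, kind, text):
            print("[%s] %s" % (kind, text))

    class SpokenPresenter:
        # Assumes some text-to-speech command that reads text on standard input
        # (festival --tts is used here as a plausible example).
        def __init__(self, speech_cmd=("festival", "--tts")):
            self.speech_cmd = list(speech_cmd)

        def show(self, kind, text):
            proc = subprocess.Popen(self.speech_cmd, stdin=subprocess.PIPE)
            proc.communicate(("%s. %s\n" % (kind, text)).encode())

    def presenter_for(settings):
        """Pick a presenter based on the user's accessibility settings."""
        if settings.get("output") == "speech":
            return SpokenPresenter()
        return VisualPresenter()

    # A word processor could route status and menu text through one presenter
    # and behave normally or speak, depending on the user's settings.
    ui = presenter_for({"output": "visual"})
    ui.show("status", "Document saved")
    ui.show("menu", "File, Edit, View, Help")
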
  • Also, run the pages by CAST Bobby [cast.org].

    It will scan for common accessibility problems, point them out, suggest fixes, and give an overall score. Very helpful.

  • emacspeak [cornell.edu] is rather well-regarded (and particularly handy for people that already use the One True Editor [alt.religion.emacs]...). It's more targeted towards people that can't see, although I'm sure it would be useful for people that can't speak as well. There's another stand-alone project, festival [ed.ac.uk], if you don't dig that M-x stuff (what, you don't have pedals on your terminal?).


    --
    W.A.S.T.E.
  • If you have access to a good newsagent or a University library (or any reasonably sized metropolitan library I suppose) track down their New Scientist collection. There is an article in the 10th of April issue [newscientist.com] covering RSI from a slightly different angle than normal. There's an interesting theory that RSI is not caused by physical damage to the wrists/hands, but rather caused by blurring of the brain's distinctions between the fine motor control areas for the hands.

    The strain is in the brain
    Too much typing can leave you in agony. But rather than damaged muscles or tendons being to blame for RSI, says Bob Holmes, things might be going wrong in the brain

    It seems that when you spend a lot of time moving your fingers in very precise, accurate ways your brain can blur them together: you lose fine control over time. This effect has been shown to take place in monkeys made to earn food by `typing' (aside: presumably they put them on Usenet...), and there seems to be some evidence of it occurring in humans. Particularly susceptible, as you might expect, are keyboard operators and musicians.

    There is a tiny tidbit [newscientist.com] from some while ago on the New Scientist web site -- unfortunately they don't appear to have put up the article I'm talking about. If anyone's sufficiently interested I daresay I could type in a couple of short extracts for review.

    Here are a couple of links which you might find interesting (tracked down from the broken NS links...):

    • Repetitive Motion Injuries [annualreviews.org]--Annual Reviews of Medicine (1995):
      Repetitive motion injuries have presented clinicians with a significant challenge over the past two and a half decades. Acceptable treatment of inflammatory disorders is well established, but compressive neuropathies and nonspecific complaints of numbness, tingling, and discomfort in the upper extremity present vexing dilemmas. Current research and experience point to multilevel problems, including posturally induced muscular imbalance. Although surgical solutions to these problems are sometimes indicated, conservative approaches successfully treat many individuals and have narrowed the scope and indications for surgical intervention. These approaches include ergonomic changes at the workstation, postural changes, and muscle stretching and strengthening to correct imbalance.
    • Stretching and Flexibility [enteract.com] -- Everything you never wanted to know. Apparently this is a frequently recommended treatment.
    • The Lancet [thelancet.com] also has a couple of hits for `RSI' and `repetitive strain': the usual username/password will get you the `free' version of the site.

    --
    W.A.S.T.E.
  • "Maybe Lisa is right and America is the land of freedom and opportunity, and maybe Adeel is right and the machinery of capitalism is oiled with the blood of the workers."

    Just typing a long sentence like that makes my wrists hurt.

    But seriously, I wonder how many handicapped programmers are out there. Since Linux is (primarily) a project by and for programmers, I think a strong commitment from uhhhmmmm, uh, differently abled programmers is what it would take to make Linux a world class OS in terms of accessibility. After all, people in the OSS community tend to write what they want to use, and more importantly, not write what they would never use.

  • Sorehand [ucsf.edu] has a bunch of information about RSI which I found quite useful as I was going through these problems a few years ago. Especially useful to me was the mailing list and The Typing Injury FAQ [tifaq.com]
  • The reasons why blind support under Linux (that's where I have experience) is falling behind compared to other developments:

    Open source projects are normally done by people who are interested in seeing the project succeed. Users who are blind form a very small subgroup of Linux users, and many of the blind users cannot code. Without a user base that is big enough, the development/debug/feedback process doesn't take off. But who else should have an interest in writing this code than the blind community itself?

    And another annotation: forget the "Linux is so good because it is command line" theme. Linux is so good because applications are cleanly separated into application core and user interface. That is the big advantage over another OS where applications are huge monolithic code monsters which cannot run without an open window (no brands here :).

    Now it is possible to rewrite a user interface which serves your special needs (speech, touch...). Your goal is _not_ to do all your work on the command line. Your goal is to have an _intelligent_ user interface which is adapted to your needs. The command line may be fast, but it's not intelligent.

    And now we are back where we started: who should be interested in writing these user interfaces other than the people who will profit from them?

    Blinux == Blind support under Linux
    http://www.leb.net/blinux
  • I would beg to differ too :)

    First: my annotations came under the title 'Blind support and Linux', so the argument "moving to GUIs for those things they do better, graphics and netscape" sounds somewhat strange.

    Second: nobody will dispute the importance of a CLI. The point is: soon a blind power user will find out that the command line interface isn't enough to fulfill his/her needs. Think about working with a spreadsheet and a screen reader. Now an intelligent speech-enabled user interface like Emacspeak enters the arena.

    And finally: I didn't promote a personalized user interface but an intelligent user interface. A pretty different beast. An IUI works as a transmitter between your box and the user. This could be a speech-enabled interface which 'knows' that this application uses windows. Think about a news reader where thread info is stored in one window, header info in a second, and the message in a third window.
  • Does anyone else have any other solutions to this that won't set me back $1k? My wrists have started really aching lately, and I'd love to get an ergo keyboard or something, but I want to get the best I can find. So, if you slashdotters could direct me in any way, I'd appreciate it.


    -mike kania
  • Yes, it's called rsynth. It's kinda primitive, and the voice isn't very configurable (about all you can do is choose whether it has an American or English accent), and it's optimized for slow systems (i.e. its quality peaks at 8KHz, anything higher and it sounds like it's talking through a strainer), but it works, and it's a well-behaved UNIX program. I don't know of a simple way to get it to work as a speaking interface to a terminal, though.
    ---
    "'Is not a quine' is not a quine" is a quine.
  • datahand [datahand.com]

    I can't help but evangelize this device. It is wonderful. My wrists were at the point that they hurt constantly, painfully; I lost sleep over it. I couldn't stop using the computer, though. After about a month of using nothing but the datahand, my wrists are SO much better. I still have occasional pain and numbness, but that's usually after using normal keyboards (like in the computer lab).

    Datahand + fvwm2 go together to make a WONDERFUL team in terms of customizability. Even when I'm not using my Datahand, I don't have to use the mouse much (ctrl+shift+HKJL to move coarse, ctrl+shift+alt+HKJL to move fine-grained), and when I am, the fact I can customize the interface to be exactly right for the Datahand is so wonderful...

    The personal edition cost me $900 after a 10% student discount. They're not cheap. But they're worth every penny. Consider that RSI surgery is generally $10k and only treats the symptoms (not the cause)...


    ---
    "'Is not a quine' is not a quine" is a quine.
  • Yes, I am in a wheelchair and a geek too! While I do not need any special software/hardware to do my job/hobby, I have been good friends with many visually impaired people who have had a great deal of difficulty in the modern GUI age, many of whom still depend on DOS & WP5.1. I have heard quite a few complaints about M$'s support of adaptive equipment and use of non-standard key-sequences in its software. This of course goes against all of the PR we see/hear (not that this is anything new to /. users).

    I think this would be a great area that Linux could excel in. And when it comes to web pages like mentioned before, can be a help for those Lynx users out there as well.

    And just a personal note about the 'uhhhmmmm, uh, differently abled programmers', I am just one person, but I don't buy into all this PC BS. I could care less that the world's best graphics program is called the GIMP.
  • Interesting that you should say this, because there's some work being done to look at the causes of RSI in the University of California.

    It was reported in the New Scientist in the UK, and on their WWW edition at: http://www.newscientist.com/

    However, the article needs a subscription to view it.

    Essentially, it says that repeated, mindless physical action using fingers ends up screwing up your brain, causing RSI - not the actual physical strain.

  • by jaffray ( 6665 ) on Monday April 19, 1999 @07:24PM (#1926247)
    >My wrists have started really aching lately, and I'd love to get an ergo keyboard or something,

    This is not the correct response to the situation. The correct response is to SEE A DOCTOR, IMMEDIATELY. Every day you spend on self-treatment with a particular solution which may or may not be appropriate is another day in which you could be doing irreversible damage to your body. You do NOT want to do this.

    (Trust me; I spent nine months largely incapacitated and in great pain thanks to delaying proper treatment, and while I can work again, there are still a lot of things I can't do with my hands and arms. This is not something to screw around with.)

    Which particular keyboard you use is only a tiny aspect of your behavior which is causing this damage. Posture at *and away from* the keyboard, work habits and breaks, typing in non-strenuous ways, ergonomic workstation setup, and so forth are all important.

    Furthermore, if you're already experiencing pain, it's quite possible that you can't type normally without causing more damage. In my case, by the time I saw a doctor, repeated microtears, scarring, and healing had shortened my extensor tendons to the point where I no longer had anywhere near a normal range of motion. Even if I'd adopted perfect ergonomics, I still wouldn't have been able to type without pain and worsening my condition. I needed a lot of physical therapy to get back to normal.

    In short, don't assume you can treat this yourself. See a doctor.

    Recommended reading: Pascarelli and Quilter, Repetitive Strain Injury.
  • Thank you. This is a fascinating site and it pointed out a lot of shortcomings in my own pages.
  • It's probably the other way around: disabled people do not become programmers because of the accessibility difficulties.

  • Are there any programs out there for people that cannot speak? For instance while the person is typing...it will read it out? Just wondering :)
    NaTaS


    Yes, there are, but try to think outside of the box about the solution. How could you build it using the Unix component philosophy? Obviously, you would need a text-to-speech component. You would obviously need some sort of keyboard reader. But what about in the middle? How about some word prediction software? What kind of editing facilities would a person like this need? Do they have other handicaps that also need accommodating, like a mobility problem?

    It's not a simple point solution. It's a systemwide solution! (A rough sketch of such a pipeline follows this comment.)
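
As a toy illustration of that component pipeline, the sketch below reads what the user types, runs it through a very crude word-prediction step, and writes the result out so it could be piped on to a separate text-to-speech component. The prediction table is made up; a real component would learn from the user's own text.

    import sys

    # Toy prediction table; a real word-prediction component would be learned.
    PREDICTIONS = {
        "tha": "thank you",
        "how": "how are you",
        "see": "see you later",
    }

    def predict(fragment):
        """Expand a short typed fragment into a likely phrase, if one is known."""
        return PREDICTIONS.get(fragment.lower(), fragment)

    if __name__ == "__main__":
        # Each line the user types is expanded and written out; the output can
        # be piped into a text-to-speech program as a separate Unix component.
        for line in sys.stdin:
            print(predict(line.strip()))
            sys.stdout.flush()
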
  • there aren't a whole lot of handicapped programmers out there. There are a whole lot of handicapped ex-programmers out there. Remember, for most developers, RSI is a career ending injury.

    The best estimates I've seen say that approximately 50,000 developers are injured per year. Think about how many people make comments about how their wrists hurt and what kind of adaptations that made. These kinds of adaptations are a very simple form of handicapped accessibility so I take gentle issue with your assertion that handicapped accessibility would be something that they (OSS developers) would never use. ;-)

    I'm currently working on a project with some people in the national research council of Canada. It is aimed specifically at alternative tools for programming. We are aiming at the speech recognition user first and everyone else second because that's our bias. For some rather crufty slides on the user interface and philosophy, take a look at:

    http://www.connact.com/~esj/voice_coding

    it was a presentation I gave over a year ago but the concepts are still valid.

    --- eric
  • Hear, hear.

    I've had a couple of friends who developed carpal tunnel syndrome writing their PhD theses. And another who was a pianist.

    I think some of the reason that you might think hackers don't get it is that so many are young. Wait till your 30s, kids. Staying up all night - whether partying or coding - is just no longer an option. Bits of you ache if you overuse them.

    love,
    Wench

  • "so you want to help software? spend a few years studying communication theory, then generalize your program's output so that it can instantly
    switch to any channel (braille, moon, aural, vibration, god knows what) and any human language. this means decoupling specific input and output from the program and gaining a basic knowledge of the reality of humanity, not something engineering people care about."

    I believe there is some facility to do just this that Sun has been working on as part of Java. I don't know much about it, but a friend and I have been thinking about looking into it.
  • What's the state of accessibility in GNOME and KDE? Has any developer scratched this itch?
  • Check out the BAT Chording Keyboard. http://www.callamer.com/infogrip/bat.html
    I got one and I love it. I'm a righty, and I BAT with one hand and mouse with the other. I actually type faster with my left hand on a BAT than I do with both hands on a qwerty Keyboard.
  • If you notice the people who get carpal tunnel and similar wrist/finger injuries, they're almost always people in jobs they don't enjoy -- typists, data entry, etc. Rarely do you see hackers getting these injuries (at least when they are doing hacking at work :).. I think that's because when you're at a stressful job, your muscles tighten up and such injuries happen. And those who are enjoying their time at the keyboard rarely get these injuries, because they are comfortable and relaxed. Anyone else notice this?

  • Professional music is _very_ stressful. Just like programming professionally is often boring and stressful. I'd just like to see how many people who really only 'excessively' use the keyboard while hacking at home for fun get RSIs, or how many amateur musicians who play at night for fun get RSIs...
  • Speech recognition on Linux [leb.net] is going to be a difficult problem to solve, alone. There are tools out there to do it, but it is going to require massive coordination.
  • Are there any programs out there for people that cannot speak? For instance while the person is typing...it will read it out? Just wondering :)
    NaTaS
  • I wanted to get sticky keys to work on X windows under FreeBSD. It was very difficult to obtain this info. After a week of extensive web searching, I found out that it was already built in under the name accessx. After another week of extensive web searching, I found out how to switch it on.

    My findings are at
    http://math.missouri.edu/~stephen/software/#accessx

    I hope that this information will become more widespread.
  • For myself, I have found that vitamin B supplements are particularly effective at dealing with any RSI I get. I pop a few of them, and a day or two later, the problem is gone. It is one of the B's, maybe B6 or B12, that actually does the job. But I was told that a vitamin B complex is more effective.

    Don't overdose on them for any consistent period of time - apparently that can cause nerve problems in its own right.

    These are just my experiences - YMMV.
  • PLEASE don't attempt to treat yourself. While I'm well aware that most Americans do not get enough vitamins in their diets, vitamins alone will not prevent, and will certainly not reverse, severe RSI. I cannot say this strongly enough: PAIN IS NOT NORMAL WHEN TYPING. If you experience it, SEE A DOCTOR.

    I speak from experience. I went many years with occasional pain in my hands and forearms. It was only after I could not type at all that I went to see a doctor, and I now have an injury that is disabling and very painful. It will take several months of physical therapy before I will even get past the pain. I may never be able to use the keyboard again extensively.

    Trust me, you don't want this to happen to you. Don't attempt to self-treat. See a doctor, now.
  • I'm only 24, and seven years of hacking and system administration -- two jobs I love dearly -- have totally ruined my hands.

    I have extensive, disabling, and very painful forearm tendinitis. Months of physical therapy will be required just in order to get me back to where I can drive, clothe myself, and work a doorknob without pain. Using a keyboard will probably be months down the road.

    And, you know what's the worst thing? I'm totally cut off from Linux, because DragonDictate, NaturallySpeaking, and other similar dictation products only run under Windows. If that's not a reason to adjust your behavior, I don't know what is. :-)
  • Check out the BAT Chording Keyboard. http://www.callamer.com/infogrip/bat.html

    I got one and I love it. I'm a righty, and I BAT with one hand and mouse with the other. I actually type faster with my left hand on a BAT than I do with both hands on a qwerty Keyboard.


    I notice from the picture on the Web site that this device seems to put the hand into both dorsiflexion and ulnar deviation -- a dangerous combination for RSI.

    Also, you mention that you "actually type faster". This is not necessarily a good thing. One of the key reasons that RSI is so much worse of a public health problem now is the drastic increase in production expected by knowledge workers. A keyboard, unlike the typewriter before it, allows one to move at a pace fast enough for injury.
  • Check back "Ask-slashdot"s. There was one of ergo keyboards about a month ago. It was very informative. Bottom line was, if you don't know Dvorak, learn it; if you do, get a--darn, I can't remember the name.
  • The Festival Speech Synthesis [ed.ac.uk] System is available. It has British and American English (and I believe Spanish also) male voices. I think it may have female voices available, but I'm not sure.

    It works well under Linux. It works best (obviously) with faster processors, but a Pentium 90 or higher should be sufficient for most uses.

    A simple shell or perl script could be devised to speak sentences as they were typed. (A minimal sketch follows this comment.)

    Doc Technical
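
A minimal sketch of such a script, assuming the festival binary is on the PATH and that "festival --tts" reads text from standard input when given no file (check your installation):

    #!/usr/bin/env python
    # Speak each sentence as it is typed, using festival's text-to-speech mode.
    import subprocess
    import sys

    for line in sys.stdin:
        sentence = line.strip()
        if not sentence:
            continue
        proc = subprocess.Popen(["festival", "--tts"], stdin=subprocess.PIPE)
        proc.communicate((sentence + "\n").encode())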

  • I don't think people should be obsessing over making X more accessible to those with visual or mouse-related difficulties, because graphics and cursors are what X is about. When I started using Linux, probably the most attractive point about it was that everything could be done from a command line. Command line programs are by nature adaptable to almost any disability, since all one need do is find a way of getting text in and out. If you have problems seeing, use big text or a braille terminal; if you have problems typing, find something you can type with and get/write drivers. I always thought the purpose of X was to provide graphical wrappers for programs when someone wanted them; if X isn't your thing, don't use it, or modify it so you can use it. The problem as I see it is programs like RealAudio or half the Windows apps in existence that run only under GUIs when they don't need to. For this reason, I say Linux is *WAY* ahead of Microsoft on disability access: let's see you do anything in NT or 95 with a braille terminal or just plain text, whereas Linux proves that the GUI really isn't necessary for a web or DNS server, or for reading email, or a hundred other tasks that MS can't do in text mode.
  • >And another annotation:
    >Forget the Linux is so good because it is command
    >line. theme. Linux is so good because
    >applications are cleanly separated into
    >application core and user interface. That is the
    >big advantage over another OS where applications
    >are huge monolithic code monsters which cannot
    >run without an open window (no brands here :).

    While I would agree that the separation of program core from user interface is the primary benefit, I would not be so quick to downplay the advantages of a command line interface. While it may be slightly cryptic, so is programming. Therefore my assertion would be that anyone who codes, i.e. someone who has found a method of getting characters in and out of the system one at a time (since this is what coding presently relies upon), would prefer to have a CLI to accommodate their particular disability. There are no circumstances I can see where a usable UI can be set up for the completely blind, i.e. those who use braille terminals, without a CLI. And I will continue to support my theory that CLIs are the most adaptable system in common use today. A GUI like X is stuck as a GUI; one cannot take its graphical representation and effectively convey the information through sound or touch. A CLI, on the other hand, can more easily be portrayed in a tactile fashion, and can at least be partially transmitted as auditory info. A series of keystrokes is a more adaptable solution than a button that says 'click here' because while the button is more easily understood from a visual representation, it is virtually impossible to convey in tactile or auditory fashion.


    >Now it is possible to rewrite an user interface
    >which serves your special needs. (Speech,
    >touch...) Your goal is _not_ to do all your work
    >on the command line. Your goal is to have an
    >_intelligent_ user interface which is adapted to
    >your needs. The Command line may be fast but its
    >not intelligent.

    I would beg to differ. While the ability to create one's own personalized interface is without doubt a great thing, and one I would fight to keep, I would say that until someone shows me another UI I find as appealing, I will strive to do all my work from a CLI, and move to GUIs for only those things they do better, graphics and netscape.
  • God, yes! I use Lynx exclusively (although my vision is normal), for reasons not important here. Two things really bug me: Totally-blank pages, and forms (even on /.!) that have Submit buttons that reply with the error message, "No form action defined". Think what a disappointment *that* is to someone who's spent two hours filling out a form with a mouth stick and a keyboard. (Thanks be, my hands are nearly normal, so far.)
  • How about that wonderful collection of Victor Borge-like words (many monosyllabic) in (iirc) Eric S. Raymond's Jargon File? We know about things like . = "dot", ! = "bang", * = "star", etc. Again, iirc, probably almost all characters used in programming have monosyllabic equivalents. Humorless people would be in deep trouble, though!
    Maybe another language? (Probably not COBOL!) There ought to be a decent programming language somewhere that is easier to speak than C or C++.
    (Yes, / = "slash". I finally remembered to include that. :) A toy mapping along these lines follows this comment.
  • I helped bring these devices to disabled users over a decade ago by writing drivers for CP/M and later DOS PCs that used the Quinkey or Microwriter keyboards.

    They were cheap, and allowed chord typing for people with brittle bones or weak sight. We could modify the technology to fit harnesses so that anyone capable of hitting 6 independent keys simultaneously with any available appendage could use one.

    They went out of business due to lack of demand. What more can you say?

    The guts of the device were six diodes and six microswitches (cheap, cheap, cheap). They ran off a standard serial port. If anyone wants to know how to do it and is willing to start an agalmic (http://agalmics.nu/) Linux Chord Keyboard project, mail me.

    Vik :v)
  • If I wrote boring text web pages, *I'd* be the one getting fired

    And ALT="descriptor" in your IMG tags would be so difficult? lazy, lazy lazy...


    Mike
    --

  • the trouble with #3 (and other non-popular projects) is that--

    they'll be shoved aside for more popular things that the coder wanted to put in anyway.

    I suggest that we get GNOME and KDE accessibility libraries (or something to that effect) in order to get these things out to everybody. This will help new projects out as

    - the development team won't have to deal with another thing that should be done

    - a standard "look and feel" provided by these libraries would go a long way toward migrating Win9x/Office97/IE4 users (that is, of course, if anybody wants people like me in the linux club...)

    dagnabbit, there's no red wavy lines and F7 doesn't do anything. excuse my typos.

    BTW, what does the c statement at the end mean?
  • You should be able to explain your reasoning behind Good Web Design to your boss. If your boss won't listen to a cogent argument about usability, thinks flash is all that's necessary, and would fire you for trying to do your job well, do you really want to work there?
  • I might caution: Make sure you see someone who knows what the hell they are talking about (ie, probably not a GP). I saw a GP at my high school clinic (private school, hence no visit to a local specialist). It was back in 1995, so RSI was still not as much in the spotlight.

    I was given a hand brace (big metal thing you'd get if you broke your wrist) - which only served to move the injury from my one hand to both.

    Make sure that you see someone that is recommended, and that is a specialist, not a GP. In addition, I strongly recommend checking some books out of the library so that you know specifics.
  • I'm near legally disabled. Not from some horrible car accident, not from a drunk driver (not to downplay people who have gone through it), but from what I do. Day in and day out. 17+ hours a day, I am on a computer or doing computer-related work.

    I work in the telecom sector. I crimp lots of cables, for my patch panels, for 10/100bT hookups, for FDDI hookups. I'm on what my coworkers jokingly call the 'DL' (disabled list) for the next three weeks, after my visit to the doctor on Friday.

    If I keep up what I've been doing the past three weeks, I stand to lose more than 50% of what little motor control I have left in my right hand. It's become increasingly hard for me to type over the years, and it only gets worse.

    Ergonomic keyboards help, but they aren't a solution. You *will* get Repetitive Motion Syndrome. You'll get it on a BAT. You'll get it on a chorder. You'll get it on a standard keyboard. Typing is just plain bad. Very VERY bad. Period.

    You hit the same keys, with the same fingers, over and over again. Which will lead to repetitive motion syndrome. It's not as bad as carpal tunnel syndrome, but it's just as disabling in the long run.

    Simply placing my hands on any keyboard produces horrible pain at this point. I'm having a hard time typing this, even. And I'm on a Keytronics FlexPro, (they're available from Javanco at http://www.javanco.com/ for $25 currently. VERY few left in stock.) which is considered to be a true-ergonomic keyboard.

    It didn't help. I have carpal tunnel from the time before. I have repetitive motion syndrome from all the time. I can barely hold a cigarette now. The answer isn't better keyboards. The answer isn't concentrating on helping the people who are already disabled. They're in their situation for whatever reason, and there is nothing we can do at this point to cure or prevent it. My heart really does go out to them; I have friends who are legally disabled for various reasons. But there is nothing we can do to prevent them at this point.

    It's been said that an ounce of prevention is worth a pound of cure. I'm a strong believer in this at this point. If I had known that I would be in this situation long ago, I would have abandoned computers.

    We need to concentrate on PREVENTING disabilities. Microsoft doesn't do that. They try to help those who are already disabled, but do nothing to help those who may become disabled down the line. (I'm sorry, but the M$ Natural Keyboard does NOT count. It hurts me just to put my hands on it.) Who's best known for voice recognition software? Dragon Systems, then IBM. Microsoft doesn't have any voice recognition product I'm aware of.

    I'm not trying to downplay those who are unfortunate enough to be disabled. I'm going to be in the same situation myself before too long, at this rate. But I made my choice, and I knew the risks after the second year, when I was diagnosed with chronic tendonitis as a direct result from constant typing. I made my choice long ago. Some people aren't aware of the risks of typing. Others, with other disabilities, weren't given a chance or choice. But now...

    We need to concentrate on prevention. Not just Linux people, not just Solaris people, not just Windows people. Everyone. Anyone and everyone who uses a computer is at risk of my situation. I'm not saying that carpal tunnel syndrome, or repetitive motion syndrome are more important than blindness or paralysis. However, what's important is that they can be PREVENTED. An ounce of prevention is worth a pound of cure!

    IMHO, I think what needs to be concentrated on is INTELLIGENT alternatives to a keyboard and mouse. Anything involving repetitive motion is just plain bad. It's more the repeated motion than the positioning that does the damage.

    I honestly can't say I have any bright ideas, beyond voice recognition. But either way, I think it's a lot better to try and educate as well as prevent disabilities when possible, rather than to ignore them till it's too late.

    Just my $0.02USD.

    -RISCy Business | Head Unix Guru, Unicent Telecom
  • The other day, playing with various synths, MP3 players, and the eSound daemon, I had an idea that could be very useful in this area.
    I wanted to "mix" two MP3 streams in two emusic players and couldn't, because I have only one mouse pointer. At that point I thought it would be really cool to have two real potentiometers "linked" to the GTK widgets. That could be extended to VU meters, progress bars, etc., all linked in some easy way to "virtual" widgets on the screen.
    Imagine a combo box linked to a one-line braille reader, or a scrollbar linked to a big potentiometer like the "stereo shuttle" ones. All of this could be placed on a big plate, connected somehow to the computer (USB?), linked to the GTK library (a "real" GTK theme?), and configured by some easy tool...
    Call me a fool, but I think that's the kind of thing that could be useful for all of us, handicapped and less handicapped (because, who's not handicapped at all???)

    ===============================
  • >It was back in 1995, so RSI was still not as much in the spotlight.

    What?? I was diagnosed with RSI in 1984. The solutions have been stated - rest, expert advice, posture.
