Accessibility Support Request #1568
Replies: 19 comments 23 replies
-
Maybe opening an additional window without bitmaps, but with standard wxWidgets controls (like checkboxes) for the stops and switches, would help screen readers recognise them? @larspalo @rousseldenis, what do you think about this idea?
-
What if we had an environment where the UI worked as a series of tables containing the appropriate functions, say, developing a touch-view environment that can also be used natively with a screen reader? Here's what I thought of…
I can only describe this, so please have some patience with me… :)
Console View (three-manual example):
Swell Division: furthest left
Great Division: left
Choir Division: right
Pedal Division: furthest right
(Solo Division, if a four-manual spec: furthest left)
Include couplers to each division within the stop tables appropriate to the respective division.
Include tremulant(s) in the respective tables for each division.
Below these tables, include the sequencer as a table with divisional and general pistons.
I notice that you use panels for various functions as separate windows. Could we create a window as a console-edit environment, with a table- or window-class-based environment with editable values, etc.?
If not that, could we look at a way of keeping the panels themselves as they are visually, while making each window interactive, with its controls recognised and editable by either data entry or keyboard?
Here's an example of how VoiceOver works with navigation. The primary keystrokes are:
Navigation: Control-Option + Left/Right/Up/Down arrow keys.
Interact with window contents/objects: Control-Option-Shift-Down Arrow.
Stop interacting: Control-Option-Shift-Up Arrow.
Engage a button or object: Control-Option-Space.
Navigate a word or function letter by letter: Control-Option-Shift + Left/Right arrow keys (after interacting using Control-Option-Shift-Down Arrow).
In an environment I work on with a particular developer, I use these keystrokes to navigate successfully throughout the entire environment. Before I came to this company, as a user of their software, it was completely inaccessible, and I spent time working on solutions.
lew
… On 8 Jun 2023, at 09:58, Oleg Samarin ***@***.***> wrote:
Maybe opening an additional window without bitmaps but with standard wxWidgets controls (like checkboxes) with stops and switches would help screen readers to recognise them? @larspalo @rousseldenis what do you think about this idea?
-
If the UI were slightly redesigned, based either on an expanded resource or a different one from what's currently used, with the appropriate containers, object support, window-class support, etc., you wouldn't need to create extra panels. Embedding a resource that allows a screen reader to navigate windows, menus and buttons and edit parameters should be a viable solution if the right tools could be implemented. That way, if a new or existing organ library had its stop and other function labels assigned, and the right window classes and elements in place, any organ library could work fine. I'm not sure, but it would be interesting to see what we could do.
The Aeolus app for macOS, as an example, once ported and reworked, allows a blind user to navigate each row and column, have the stop name and state spoken, etc.
The ideal end result would be:
Console View to speak with VoiceOver navigation, navigated with the keyboard, etc., to manually enable/disable stops, couplers and trems, activate divisional/general pistons, control the sequencer for pistons, and so on.
Via MIDI control: stops, etc., to be announced when a MIDI-assigned message is received (for example: "16' Principal on", "16' Principal off", "General Piston 1 selected", "Choir Divisional Piston 3 selected"), if you get the idea of where I'm coming from.
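As an illustration only (plain Python, not GrandOrgue code; the control names and IDs are invented), the MIDI-triggered announcements described above could be sketched like this:

```python
# Hypothetical registry mapping a MIDI-assigned control id to a speakable label.
CONTROLS = {
    1: "16' Principal",
    2: "General Piston 1",
    3: "Choir Divisional Piston 3",
}

def announcement(control_id: int, engaged: bool) -> str:
    """Build the text to hand to the speech layer when a MIDI message arrives."""
    label = CONTROLS.get(control_id)
    if label is None:
        return "Unknown control"
    return f"{label} {'on' if engaged else 'off'}"

# When a MIDI message arrives, speak the string instead of only updating the bitmap.
print(announcement(1, True))   # 16' Principal on
print(announcement(2, False))  # General Piston 1 off
```

The point of the sketch is that the announcement is generated at the same place the MIDI event is handled, so the spoken state can never drift from the visual state.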
lew
… On 8 Jun 2023, at 10:18, larspalo ***@***.***> wrote:
Would it be possible to add additional "accessibility" panels that would replicate all other existing panels (perhaps neatly grouped together in a single panel menu item, exactly the same way as the standard panel menu does) but with simple standard controls? This would of course need to cover both the built-in panels and also the ones dynamically created from loaded sample sets to be fully usable. I'm absolutely supporting this idea.
Otherwise, reading @lewisalexander2020, I feel that he's a prime candidate for providing the Orca screen reader project itself with feedback and help to improve it to a usable state!
-
I was only giving an example of a concept I'm working on; that doesn't mean it's an expected standard.
What I'm trying to illustrate here is how a screen reader such as VoiceOver would behave when working with any console view. The key element within any console view would be recognising the window class and its elements. So, as an example, if you design a console view based on the position of stops, couplers, etc., they could be seen as elements within a window class, recognised as button classes with tags, so that the screen reader cursor, while navigating, recognises the buttons and the window itself, and can then navigate to groups.
As an example, the left jamb would contain one, two or more divisions. That jamb could be constructed just by recognition that it is a window class containing button classes with tags; their state would be declared upon interaction with a given button or while navigating the window class. So if the environment for any given organ library contained a series of group or window classes, you could move between them to navigate each division's stops and functions, or the general and divisional pistons, etc.
I'm not asking for a console view to be enforced at all; I presented a viable option. My concern and request is purely to make GO workable with a screen reader environment like VoiceOver, so that I, as an organ developer/builder, can integrate a system such as GO effectively, teach it to other blind users and create MIDI setup environments to support other users.
lew
… On 8 Jun 2023, at 10:32, larspalo ***@***.***> wrote:
There is no standard enforced (or limitations imposed) on how the layout of an organ can be presented. It can be done in whatever fashion the sample set producer desires.
If you think that you're not really helped by having an exact replica of the panel menu (which should be accessible), then the only option would be to parse all the panels for controls that are possible to interact with (i.e. they are not read-only). This would create a composite window/panel that could easily contain a lot of controls. With anything less, you won't be able to interact with GO the way it's supposed to be used.
-
If it's a bitmap image, it would then be classed as a button. If it's a button, is there a way of exposing it as an actual button for a screen reader to understand? One of the major problems is that each pane, and the main pane, are known as windows, but as soon as you move past the window toolbar itself, there's no content; the actual window doesn't have a known or identifiable window class to interact with. It's like a blank page.
How would I go about uploading a video demonstrating this for the team to see?
lew
-
So that I understand the construct of a given organ library: how is it designed? What methods are used in the creation of a given library? It would be interesting to understand this because, from what I'm trying to find out, it seems that the environment uses image-only methods, and as such, window classes aren't inherently available or recognised as visible.
As a starting point, I'd look at creating the workspace as visible, known panels built as window classes, with containers available where needed. The advantage of window classes and container/object classes is that, with the right code and the right implementations applied, any screen reader could navigate the environment.
I understand just from the demo version file that there are several windows available, each with their own functions. The problem is that each window, although known as a window, isn't actually a window, on the principle that you can't interact with the window or its contents at all; it's like a dead space.
If we could find a solution with wxWidgets, as an example, to find a way into each window and interact with the given objects, that should be part of the environment coding regardless of organ libraries. Yes, each library would need definitions for each stop, piston, coupler, etc., so that each button, control, slider, etc. can be identified by a descriptor tag. A bitmap can be set up as a functional button, as long as the button has parameters available that state not only that it's a button, but also its descriptor or tags, including function state (active/inactive), etc.
The JUCE developer toolkit might be something to consider, as the accessibility model there is very usable.
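To illustrate the descriptor-tag idea above, here is a hypothetical sketch (plain Python, not GrandOrgue's or JUCE's actual API): a control carries a role, a label and a state, and toggling it yields exactly the text a screen reader would announce.

```python
from dataclasses import dataclass

@dataclass
class AccessibleStop:
    label: str          # descriptor tag, e.g. "Swell 8' Gedackt" (invented name)
    role: str = "button"
    active: bool = False

    def toggle(self) -> str:
        """Flip the stop and return what the screen reader should announce."""
        self.active = not self.active
        state = "active" if self.active else "inactive"
        return f"{self.label}, {self.role}, {state}"

stop = AccessibleStop("Swell 8' Gedackt")
print(stop.toggle())  # Swell 8' Gedackt, button, active
print(stop.toggle())  # Swell 8' Gedackt, button, inactive
```

However the bitmap is drawn, as long as the control exposes these three pieces of information (role, label, state), the screen reader has something to speak.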
lew
… On 8 Jun 2023, at 10:58, larspalo ***@***.***> wrote:
Aeolus, as I remember it, is a fixed organ application that only has one layout (manuals and stops), which is very predictable and thus easy to configure. GrandOrgue, on the other hand, as software, has no specific organ in itself. It relies on external sample sets for everything: how an organ will look, sound and be possible to interact with. Some standard mapping is possible to specify, but there's no guarantee that it will work with all possible sample sets; it's expected that the user interacts with the loaded organ and makes the connections to suit the MIDI console used.
Please understand that there's no way to predict what would even be present on the main panel of a loaded organ! It can be so different from one sample set to another!
But it's here that the possibility to dynamically "collect" all the items possible to interact with into a structured list would indeed be useful. How that list is later presented to the user to interact with is another thing entirely; that should of course be done in the fashion that's most usable and beneficial for, for example, blind users.
-
Sounds like a good idea.
If there were a way to simplify the setup, so that rather than having several separate windows with all sorts of blah floating on them, why not make something more universal?
lew
… On 8 Jun 2023, at 15:15, Oleg Samarin ***@***.***> wrote:
Would it be possible to add additional "accessibility" panels that would replicate all other existing panels?
My idea is to make this new window as simple as possible. It should contain only the General combination elements.
-
If I'm barking up the wrong tree, somewhere near Catford, then why is it wrong of me to suggest a universal concept?
If an organ library has x number of stops per division, x couplers, x trems, x general and divisional pistons, regardless of the visual construct of the GUI, then why not create a user interface that can register the stops, couplers, etc. in a universal format within GO? That way, when someone creates an organ library, regardless of the GUI design, it would still have a two-window native concept, or an accessible native concept, where window 1 contains all the elements needed in an accessible, "cell-like" navigation method and window 2 contains the necessary extra elements, such as configuration for the console.
For me, I'd be thinking straight away that any configurations such as couplers, trem assigns, etc. should be part of the setup panel, so rather than being part of a changing GUI, they're part of the prefs.
Could we examine how to make GO not only accessible but streamlined? Or am I walking further in the direction of Catford, which is about 10 miles off Barking?
lew
… On 8 Jun 2023, at 15:23, larspalo ***@***.***> wrote:
why not make something more universal?
Sorry, you're very quickly starting to bark up the wrong tree here.
-
Stupid question… why?
Surely the whole art of programming and development is about solving problems and developing solutions? That's why I became an accessibility developer in the first place, within the music industry. When Apple moved from Mac OS 9.2 to OS X, accessibility was completely dead: four OS builds and nothing. Then we came out with VoiceOver as part of the Universal Access and accessibility support system, and it slowly evolved into what it is now. I'm proud of the legacy I leave behind from 13 years of work.
Surely, as developers of a really nice piece of organ software, there are ways to make your software support blind users? Or do we just accept the basic principle of the likes of Milan Digital Audio, who couldn't give a mouse's left nut and only want to sell the software, never mind supporting the legal needs of individuals with sight impairment? Bear in mind that when producing software for macOS, it's advised that accessibility is incorporated into apps, but a large number of devs don't achieve it because they either lack experience or don't see the point.
Forgive me for sounding somewhat miffed, but to be told that my comments are leading me to bark up the wrong tree, when I'm examining a particular build scenario and looking at how the GUI and functions could be streamlined and also made accessible depending on resource tools and developer infrastructure… then I apologise most profusely for any suggestions made, and will let GO just sit there inaccessible, lacking support from accessibility advocates in the organ industry like myself.
all the best
lew
… On 8 Jun 2023, at 15:37, larspalo ***@***.***> wrote:
@oleg68 Ok, so the manuals can be found in the yaml, but they are listed rather strangely in the general section, as not all of them might be present. But a complete list done in the same way could be a good starting point.
-
@lewisalexander2020 How much of the menus and opened dialogs of GrandOrgue can you currently interact with? In my tests with Orca, nothing at all is giving any response.
-
Same here within macOS: all panels are unusable. The main menus are accessible because they are part of the main system. The setup prefs window is usable, although it does announce "Titel Window" a heck of a lot, so there's a label string that's a bit dodgy, but tolerable.
Usually, in an instance like this where the windows don't exist even though they do, the worst we think as blind users is "Oh poo, Java!", because the Java environment is absolutely shocking with screen readers.
Orca is the worst possible implementation of a screen reader, like Narrator for Windows. It's when you move to systems like JAWS for Windows, GW Micro's Window-Eyes or Dolphin Hal, or to the Mac with VoiceOver embedded in the OS, which is powerful and doesn't cost a penny (unlike products for Windows, where you're looking at a minimum of £500 to over £1,000, plus maintenance contracts, etc.), that things improve; the Mac is the best environment for musicians with sight loss. Why? Avid's Pro Tools, Apple's Logic Pro X, Arturia's instrument suite, Native Instruments' Komplete Kontrol, etc.
For the organ world, other than Aeolus, which itself is limited and needs a lot to make it work right, that's the nearest the end user gets, until the launch of a product I am involved with.
This very same case presents itself with Hauptwerk: same window problems, etc.
If I do a screen recording to demonstrate this, where can I upload it?
lew
-
One of the headaches of Linux is the fact that Orca doesn't understand interfacing with third-party audio interfaces.
The trick is to route the screen reader's audio to the system audio only and set the audio for GO to an external interface. The problem, as I've found, is that sometimes Orca just has a brain fart.
lew
… On 8 Jun 2023, at 16:04, larspalo ***@***.***> wrote:
@oleg68 Yes, that should be reasonable to do, and start with the lowest manual index just like in the . At the moment I'm having issues with GrandOrgue hogging the audio device as soon as I start it, which totally cancels the screen reader from the whole system. But, yes, with using pulse it seems to be working!
-
Yes, the settings window is usable because it's part of the main shell of macOS; it's using the same elements.
VoiceOver navigates fine with the menus and the settings window, other than blurting out "Titel Window", but that seems like a window header descriptor error.
lew
-
Keyboard mappings would conflict with certain screen readers, but I think you're close on this.
lew
… On 8 Jun 2023, at 20:28, larspalo ***@***.***> wrote:
@oleg68 I believe one of the main issues with the panels not giving any feedback to a screen reader is how they currently handle focus (events). It's currently not possible to navigate with, for instance, Tab between the elements (Tab is one of the available shortcut keys that can be bound to any element). Nor are the elements responding with focus on any mouse-over, or even on left click (which obviously triggers other actions). It's only a right click, which triggers the MIDI connection dialog, that gives any feedback at all on the panels.
What if we removed Tab (and maybe Enter and a few other keyboard keys) from the possible keyboard shortcuts and instead assigned them to other tasks, like shifting focus (as with the wxTAB_TRAVERSAL style)? Presently I think the wxWANTS_CHARS is stealing right about everything.
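As a framework-free sketch of the traversal idea in the quoted message (plain Python, purely conceptual; wxTAB_TRAVERSAL and wxWANTS_CHARS themselves are wxWidgets window styles, and the element names here are invented):

```python
class Panel:
    """Models the two behaviours: Tab swallowed as a shortcut vs Tab moving focus."""

    def __init__(self, elements, tab_traversal: bool):
        self.elements = elements
        self.tab_traversal = tab_traversal
        self.focus = 0

    def press_tab(self) -> str:
        if not self.tab_traversal:
            # Tab consumed as a bindable organ shortcut (wxWANTS_CHARS-like):
            # focus never moves, so the screen reader has nothing to announce.
            return ""
        # Traversal behaviour: focus advances, giving the reader something to speak.
        self.focus = (self.focus + 1) % len(self.elements)
        return self.elements[self.focus]

panel = Panel(["Swell 8' Gedackt", "Swell to Great", "Tutti"], tab_traversal=True)
print(panel.press_tab())  # Swell to Great
```

The trade-off is exactly as described: keys reassigned to traversal are no longer available as user-bindable organ shortcuts.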
-
Is there an intermediate build for macOS I can test, please, to see if this method works?
lew
… On 10 Jun 2023, at 12:47, larspalo ***@***.***> wrote:
Well, after some more testing, keyboard navigation works fine - but the wxListView currently doesn't give any screen reader feedback (tested with Orca under Ubuntu).
-
Hi,
I just ran a test on this, and there is a flaw straight away. The UI for the MIDI Object window does not give VoiceOver access to the table of contents; it just tells me there is a vertical scroll bar, a horizontal scroll bar and buttons at the bottom, including "Configure", so I can't navigate the contents of that particular dialogue's scroll box.
lew
… On 10 Jun 2023, at 15:46, larspalo ***@***.***> wrote:
Just use the latest release to test this way of interacting with GrandOrgue. Open or load an organ first, then go to the second menu, called "Audio/Midi", where its third item will be called "Midi Object". This won't be available until an organ is opened or loaded. (Remember that GrandOrgue is not an organ in itself; the software relies on an external sample set to provide the actual organ to render.)
I can easily change the focused item from the different event buttons to the list (wxListView) that contains all elements with any MIDI capability, both of the organ and of what GrandOrgue itself provides, by using the Tab key. While in the object list I can navigate it with the up and down arrows just fine and manipulate the selected item with the buttons. But when an item in the list gets selected, I don't receive any feedback from the screen reader to let me know which type and which item is currently selected! That's a big problem… But that could perhaps be solved, with specific shortcuts bound to the different buttons to manipulate the currently selected object (there are buttons to activate, configure = connect to a MIDI signal, status = on or off, and lastly one to close the dialog).
If one could get feedback from the wxListView, this could be a way to interact and connect the MIDI keyboard/console to the software and its rendered organ, with all its manuals and drawstops and whatever exists in that organ. A possible drawback of using the MIDI Object dialog could be that absolutely everything is available in it, which perhaps makes it a bit overwhelming to use initially. However, after the initial configuration of an organ, the thought is that almost every interaction with the software and its rendered organ should be done by MIDI messages from an organ console/keyboard, just as if one were sitting at a real pipe organ console.
-
The same problem exists in other windows, such as Organ Settings, where I can't interact with the list of components. I can only find the main edit points, but even then the VoiceOver cursor does not move in a fluid way; it's erratic.
What I'll try to do is make a Mac screen recording and walk through the navigation issues. Tables containing functions, such as those in the MIDI Objects and Organ Settings menus, are unusable.
lew
-
One of the problems identified, besides the inability to interact with and find the contents of the main panel or other panels within an organ library, is in a number of dialogues, namely the MIDI Objects and Organ Settings dialogues. The windows themselves are navigated by VoiceOver, but the method is erratic; there isn't a common flow. There is a table with a scroll element, but the table doesn't exist to VoiceOver, so it can't be interacted with. This applies to both MIDI Objects and Organ Settings, so even if I wanted to MIDI-configure an organ library, I couldn't.
Could we look at a more user-friendly MIDI Objects dialogue? I only make this comment because of the recent description of it containing every element including the kitchen sink, which would be heavily confusing. We only need access to elements such as stops, general and divisional pistons, manuals, couplers, tremulants, expression/crescendo, and the necessary objects including sequencer resources for divisional/general piston navigation, recall and setting. What could we do to make that environment easier to understand and interact with? Any thoughts would be great. My own thought would be this…
MIDI OBJECTS WINDOW:
Set the window up so that it has multiple tabs.
Each tab defines a function, such as:
Tab 1: Stops. Tab 2: Couplers. Tab 3: Tremulants. Tab 4: General Pistons. Tab 5: Divisional Pistons. Tab 6: Sequencer. Tab 7: Manuals. Tab 8: Global.
Each tab can then show, underneath it, the available MIDI objects and edit parameters.
In the case of the Global tab, that would include functions like Tutti, Set, General Cancel, Divisional Cancel (per division), etc.
The above would be a more logical, easier-to-navigate and easier-to-understand environment.
I would apply the same formula to the Organ Settings dialogue, but tailored to suit it better.
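As a purely illustrative sketch of the tab grouping proposed above (plain Python; the object names and their categories are invented examples, not a real GrandOrgue object list):

```python
# The proposed tabs, in order.
TABS = ["Stops", "Couplers", "Tremulants", "General Pistons",
        "Divisional Pistons", "Sequencer", "Manuals", "Global"]

# A flat list of MIDI objects, each tagged with the tab it belongs under.
OBJECTS = [
    ("Swell 8' Gedackt", "Stops"),
    ("Swell to Great", "Couplers"),
    ("General Piston 1", "General Pistons"),
    ("Tutti", "Global"),
]

def objects_for_tab(tab: str) -> list:
    """Return the MIDI objects that would appear under one tab."""
    return [name for name, category in OBJECTS if category == tab]

print(objects_for_tab("Stops"))  # ["Swell 8' Gedackt"]
```

The gain over one flat list is that a screen reader user lands on a tab with a handful of related objects, rather than having to arrow through every object in the organ.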
So you get a good idea of user interaction with a screen reader, I'll share a brief outline of how an organ software interface should, and can, work, based on a system I'm working on with a developer.
A screen reader's basic approach is for its virtual cursor to navigate using keyboard functions assigned as layers to the screen reader, so that the user can speed-navigate through a user interface from the keyboard. Screen readers can also support mouse/trackpad use, where the mouse cursor is wrapped to the screen reader's virtual cursor.
Provided the application exposes the system-wide UI menus, contextual menus and standard windows (such as settings dialogue windows using the universal or system-dependent window classes, or a window skin that's still accessible), the screen reader can interact with the menus, windows, toolbars, functions, etc. Each window has to be interactable, with its contents known.
Example:
GrandOrgue, ideal scenario…
Launch GrandOrgue.
The Settings dialogue launches if appropriate; the screen reader navigates the settings environment sequentially, clearly identifies tables and their contents, interacts with tables, navigates checkboxes in sequence (currently erratic), and applies settings.
The dialogue for loading an organ library appears and should speak its status; the window then closes.
The organ library loads the main panel.
The screen reader identifies the window and its window tools as per normal.
The screen reader interacts with the window content, identifies all buttons, controls, texts, etc., and interacts with them to control each button and so on.
Process example:
Using VoiceOver:…
With a window open such as any organ library main panel, VoiceOver should start by interacting with the Window contents. It currently identifies the window only with the Close, Minimise and full screen buttons which are part of the system-wide class, the title window element. It then brings the VoiceOver cursor to it’s end and cannot identify the window contents or class(es). The same applies throughout the GUI for any panel.
The navigation method is as follows to achieve basic navigation and control of a window, a toolbar, any given element of an application, etc.
Run VoiceOver: Cmd + F5 (the same keystroke stops VoiceOver).
Navigation: Ctrl + Option + Left / Right / Up / Down arrow keys.
Interactions:
Ctrl + Option + Down Arrow: interact with the object under the VoiceOver cursor.
Ctrl + Option + Space: engage / disengage / open / select whatever is under the VoiceOver cursor.
Ctrl + Option + Shift + Down Arrow: interact with an object to edit it (change to Up Arrow to stop interacting).
The above are the basic fundamentals you need, but if you're running a Mac and want to understand how VoiceOver behaves in more depth, here is the shortcuts list:
https://support.apple.com/en-gb/guide/voiceover/cpvokys01/10/mac/12.0 - written for macOS 12.6.x but a general guide; different macOS builds have other features, so shortcuts sometimes change.
https://support.apple.com/en-gb/guide/voiceover/unac048/10/mac/12.0 - VoiceOver Modifier Key reference
https://support.apple.com/en-gb/guide/voiceover/mchlp2683/10/mac/12.0 - VoiceOver Keyboard Help
https://support.apple.com/en-gb/guide/voiceover/vo28018/10/mac/12.0 - VoiceOver Commands Menu
https://support.apple.com/en-gb/guide/voiceover/vo14111/10/mac/12.0 - VoiceOver Commands and Gestures: Overview
To get a better understanding, please spend a little time going through these pages; it really does help developers who don't know screen readers well to see what's involved, which keystrokes do what, and so on. At the end of the day, the screen reader is a layer on top of the system UI, whether built in like VoiceOver or Windows Narrator (I'd rather lick Marmite than touch Narrator), or third-party software you purchase and install, such as Freedom Scientific's JAWS for Windows, GW Micro's Window-Eyes, Dolphin HAL, or NVDA for Windows (which is free, as I remember). They all have slightly different behaviours, such as different modifier keys, and certain screen readers for the Windows environment differ from others in how they interact, what functions are available, and so on.
Here’s a few articles of interest in how to make applications accessible:
https://arctouch.com/blog/accessible-app-design
https://medium.com/oberonamsterdam/how-to-create-an-accessible-app-and-why-you-should-5493f41f8bdb
https://www.perkins.org/resource/how-check-app-accessibility/
https://www.dallascollege.edu/about/accessibility/guidelines/pages/screen-reader.aspx
I hope some of this helps. Yes, it's a minefield, I know, but believe me: if you lost your sight, these tools would become your eyes; they're vital to you. From a speaking watch, to a mobile phone with a screen reader installed, to your computer with a screen reader running, to document translation / OCR software that converts print into spoken media or braille output, even products like the OrCam MyEye and Read series: they're all there to give a blind person a level of independence and equality.
From an organist's position, I've been fighting for years to find a way to make particular organ systems blind-friendly, so that whether you're a student learning the organ or a professional using organ software for recording work, you have a blind-friendly solution. That battle has been lost thanks to the likes of Milan Digital Audio, and it's really sad.
Bringing any degree of accessibility to GrandOrgue would be a positive step forward. So far, the only accessible organ currently available is the Aeolus app, which you can find here: https://github.com/Archie3d/aeolus_plugin
All the best and here’s hoping we can find a creative way forward.
lew
… On 11 Jun 2023, at 21:31, larspalo ***@***.***> wrote:
So I think that a new window with a lot of wxCheckboxes (tab-traversible) is a better option.
Ok, might be, since they at least can give some response to the screen reader. However, MIDI assignment is essential to using GrandOrgue in any serious way. Would normal buttons for each manual (and perhaps for each drawstop/piston) be an option for triggering the MIDI configuration dialog, as they also seem to give the screen reader something to work with? Also, I agree that only the elements that are interactable (not read-only) should be included, to make the list a little more user-friendly. But in the long run I think we should provide similar access to all the other built-in controls of GO.
-
GrandOrgue 3.15.0 has a new Stops panel that can be read with a screen reader.
-
Hi all,
I initially raised a support request for GO to support VoiceOver on the Mac platform, for those of us who, like myself, are blind organists who'd love to use GO in conjunction with screen readers.
I think my previous attempt, which I sent as a support ticket that was moved to "discussions", didn't go so well. So, with some time on my hands, I'd like to try to lay the ground to demonstrate the problem, examine potential solutions, and seek some wisdom from the developers who know this package and its architecture better than I do.
As a little clarification: I'm a former Apple internal developer, one of the people behind VoiceOver for macOS and iOS, where my work happily resides in various pages of code and has become part of watchOS and tvOS accessibility for VoiceOver. However, I left development work a while ago due to ill health. A stroke means reading braille isn't an option these days, due to weakened sensation in the right hand, so programming is a real headache and proofing code a nightmare with a screen reader speaking out tons of code.
OK, so, now to lay the groundwork for the request / proposal...
I'm a classically trained church organist, and I am fully blind, so all I know around me is "black", if you care to describe it that way :)
I've been working on a number of potential solutions to produce an accessible digital organ system to support people like myself. The reason for this is the owners / developers behind Milan Digital Audio's Hauptwerk. Over many years, I and many others have reached out to them, and to be honest they have either promised and never delivered, or simply ignored the requests and proposals for assistive-technology support that would allow screen readers to work with Hauptwerk. In the end, it's been a tragic and painful experience.
In the last three years or so, I've been trying to build a setup to replace a console I owned. Sadly, that console had to be sold to pay legal and funeral costs after my dad passed away, so my business as a musician and developer closed and its assets were sold.
Linux was recommended to me to build an organ system known as Aeolus, but... here's the horrible thing: Linux accessibility is absolutely sickening. I thought Windows was bad, but this... I'd rather lick Marmite. After some time torturing myself, trying to find ways around the problem, and finding and speaking with the original developer behind Aeolus, it became evident that this was a car crash waiting to happen. Oh well ;)
I now work for a particular organ software developer (I can't say who) in a test / development support role. Luckily I don't need to waste time delving into code; it's a different direction for me and, to be honest, more enjoyable. The results so far are stunning for accessibility: when the product finally rolls out, we will be able to say for the first time that we have an accessible organ solution. I've been working for that company for six months.
Recently, I got a tip-off about Aeolus as an app on here. I tried it on the Mac and wow, I was stunned: a spoken environment. OK, there's a lot missing, but if that could be expanded upon, then great; it's an option available to blind and sight-impaired organists who need assistive tech to support them.
So, the above groundwork gives some explanation as to why... now begins how and what...
I openly apologise for not being familiar with GitHub, or with the developer resources used to build GrandOrgue, so I'm entrusting myself to your knowledge, skills and resources and, hopefully, your faith.
Screen readers are a combination of a blessing and a curse. They provide both synthetic speech output and, if you use a braille display, braille output. Most screen readers give you a reasonable degree of navigation around the OS in question. Take VoiceOver for macOS and iOS as an example: VoiceOver relies not only on keyboard navigation but also on trackpad / mouse navigation. If, like me, you spend a lot of time reconfiguring and testing apps, demonstrating flaws and solving problems with VoiceOver support, the trackpad helps determine whether a window state exists, by using the VoiceOver cursor to detect objects, window classes, functions, etc. The screen reader is literally the eyes of a blind person.
Screen readers like Orca for Linux are so badly designed that their usability and stability are a joke; it puts people off using the platform altogether. Having spent three years trying, and banging my head against a workstation case, I gave in.
I hear so much about GrandOrgue as a direct competitor to Hauptwerk, and about time too. I wanted to start using it, and at first I was so hopeful that VoiceOver could navigate around the UI. My findings were...
Setup windows and menus are accessible as per the standard UI framework.
Windows / panels: no degree of accessibility, no objects found to interact with or speak out; it's as if nothing's there.
This is based on loading either the demo unit or other libraries.
I'm told that wxWidgets is part of the environment and that it has an accessibility element, but it only seems to be supported through the Microsoft Active Accessibility model.
I wonder if there's any way we can explore either making the GUI accessible through other resources, etc?
I hear talk of the ODF editor; there's nothing there for the Mac, but maybe some time in the future? If the ODF editor could be ported to the Mac environment, would that allow us to embed accessibility components into the organ files, since each GUI seems unique?
I'd love to hear thoughts and maybe encourage our amazing team here to investigate this further. If I can be of help then great, I'll try my best.
Lew