
UI Sounds and focus - consider earcons for Voiced experience #168

Open
terracoda opened this issue Sep 1, 2022 · 3 comments

@terracoda

The screen reader experience contains spoken document and interaction semantics, and some screen readers even have non-speech sounds that indicate certain things, such as that an element is interactive. These non-speech sounds may be customizable in some way; I am not up to speed on how they work.

Our Voicing feature does not directly target blind learners, but it does not necessarily exclude blind learners from a reasonably good experience, especially in collaborative contexts where learners with and without vision are learning together.

I am wondering if we have considered the use of UI sounds that could fire on focus events and thus communicate something about the focused object: something as general as the fact that it is interactive, or something more specific, like whether it's a button, checkbox, slider, or custom draggable object.

We have a library of UI sounds for activation events: pressing a button, checking a checkbox, picking up a draggable object. I am wondering if we have considered using any non-speech sounds (i.e., earcons) that could similarly fire on focus events (a minimal sketch follows at the end of this comment).

The reason I am asking is that, as the Voicing feature becomes more popular, it may be used by more learners who are accustomed to hearing spoken semantics. Focus-based earcons might be a helpful addition to the Voicing experience.
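
For concreteness, here is a minimal sketch of the idea using plain DOM focus events and Web Audio, not our actual scenery/tambo stack; the `playEarcon` helper, the gain values, and the 880 Hz tone are all illustrative placeholders:

```ts
// Minimal sketch: play a short sine-tone earcon whenever any element gains focus.
const audioContext = new AudioContext();

function playEarcon( frequency: number, durationSeconds: number = 0.08 ): void {
  const oscillator = audioContext.createOscillator();
  const gain = audioContext.createGain();
  oscillator.frequency.value = frequency;
  oscillator.connect( gain );
  gain.connect( audioContext.destination );

  // Brief exponential fade-out so the tone doesn't click when it stops.
  gain.gain.setValueAtTime( 0.2, audioContext.currentTime );
  gain.gain.exponentialRampToValueAtTime( 0.001, audioContext.currentTime + durationSeconds );

  oscillator.start();
  oscillator.stop( audioContext.currentTime + durationSeconds );
}

// 'focusin' bubbles (plain 'focus' does not), so a single document-level
// listener covers every focusable element without per-node wiring.
document.addEventListener( 'focusin', () => playEarcon( 880 ) );
```

(One real-world caveat: browsers suspend an AudioContext until a user gesture occurs, so a production version would need to resume it on first interaction.)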

@emily-phet (Contributor)

@terracoda I'm not aware of this coming up before.
A few follow-up questions:

  • Would the sounds you're imagining be on by default when Voicing is turned on?
  • How many different sounds would you imagine being needed? It seems like everything that has focus is interactive (though I guess reading blocks and disabled objects are not...), and having a different sound for each sim component type seems like too much for a person to distinguish (even with great sound design, of course!). So maybe there's a small set of categories we could flesh out pretty quickly, e.g., a standard sound as focus moves, plus a few custom variants for specialty items (like reading blocks, etc.).

Since most sims are intentionally designed to be "shallow" with respect to depth (we want people to feel comfortable pushing buttons and seeing what happens, not worrying about 'messing up' anything), my initial thought is that interactive vs. non-interactive sounds would be helpful, but that distinguishing much further than that would have diminishing returns for users in a situation where the focus is best kept on "play with it all" rather than spending energy and attention figuring out what everything is semantically. Maybe interactive, non-interactive, and then a variant of the interactive sound that plays when focus lands on a custom object, since those objects are quite unique (three sounds total; see the sketch below).
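
A hedged sketch of what that three-category mapping could look like, building on the `playEarcon` helper from the sketch above; the category names, the `data-custom-draggable` marker, and the tag-name heuristic are all assumptions for illustration, not existing PhET code:

```ts
// Three proposed earcon categories: standard interactive, focusable but
// non-interactive (e.g. reading blocks), and a variant for custom draggables.
type EarconCategory = 'interactive' | 'nonInteractive' | 'customInteractive';

// Placeholder frequencies; real sound design would substitute authored clips.
const EARCON_FREQUENCIES: Record<EarconCategory, number> = {
  interactive: 880,
  nonInteractive: 440,
  customInteractive: 660
};

function categorize( element: HTMLElement ): EarconCategory {
  // Hypothetical marker for custom draggable objects.
  if ( element.dataset.customDraggable !== undefined ) {
    return 'customInteractive';
  }
  const interactiveTags = [ 'BUTTON', 'INPUT', 'SELECT', 'TEXTAREA', 'A' ];
  return interactiveTags.includes( element.tagName ) ? 'interactive' : 'nonInteractive';
}

// Reuses playEarcon from the earlier sketch to voice the chosen category.
document.addEventListener( 'focusin', event => {
  playEarcon( EARCON_FREQUENCIES[ categorize( event.target as HTMLElement ) ] );
} );
```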

@emily-phet emily-phet removed their assignment Sep 8, 2022
@brettfiedler brettfiedler removed their assignment Dec 12, 2022
@jbphet (Contributor) commented Dec 12, 2022

The general idea of focus-based earcons seems quite interesting to me, and I'd be happy to build prototypes and try things out. I'm not sure where this would land in terms of my priorities, though; at the moment the time I have available for a11y work in general is quite limited (20%).

Assigning to @terracoda to answer @emily-phet's questions above, and to @emily-phet to set the priority of any sound design and/or development effort to investigate this idea.

@jbphet jbphet assigned terracoda and emily-phet and unassigned jbphet and Ashton-Morris Dec 12, 2022
@emily-phet emily-phet removed their assignment Dec 18, 2022
@emily-phet (Contributor)

This would be great, but there's no time for it now. Marking as deferred.
