Accessibility review checklist #1129

Closed
Manishearth opened this issue Sep 17, 2020 · 1 comment
Labels
a11y-tracker Group bringing to attention of a11y, or tracked by the a11y Group but not needing response.

Comments

@Manishearth (Contributor) commented Sep 17, 2020

Going through the accessibility review checklist as a part of pre-CR Wide Review (#1114).

WebXR as a technology is inherently a way of rendering WebGL to XR devices, so it mostly boils down to the relatively poor accessibility properties of WebGL, a technology designed to give full author control over rendering.
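
For concreteness, here is a minimal, illustrative sketch (not taken from the spec) of that rendering path: the page requests an immersive session, attaches an XRWebGLLayer backed by a WebGL context, and then submits author-drawn pixels every frame. The canvas selector and the drawScene() helper are hypothetical.

```js
// Illustrative sketch only. requestSession() must be called from a user
// activation (e.g. a click handler); drawScene() is a hypothetical author helper.
async function startImmersive() {
  const gl = document.querySelector('canvas')
                     .getContext('webgl', { xrCompatible: true });

  const session = await navigator.xr.requestSession('immersive-vr');
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace('local');

  session.requestAnimationFrame(function onFrame(time, frame) {
    session.requestAnimationFrame(onFrame);
    const pose = frame.getViewerPose(refSpace);
    if (!pose) return;

    const layer = session.renderState.baseLayer;
    gl.bindFramebuffer(gl.FRAMEBUFFER, layer.framebuffer);

    for (const view of pose.views) {
      const vp = layer.getViewport(view);
      gl.viewport(vp.x, vp.y, vp.width, vp.height);
      // Everything the user sees is raw author-submitted WebGL output:
      // there is no DOM, text, or object tree for the UA or AT to inspect.
      drawScene(view.projectionMatrix, view.transform.inverse.matrix);
    }
  });
}
```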


  • If technology allows visual rendering of content (Most of these properties reduce to the properties of WebGL)
  • There is a defined way for a non-visual rendering to be created.
    • The technology is specifically for visual rendering of content. Immersive aural experiences are already possible with the WebAudio API (see the WebAudio sketch after this checklist)
  • Content can be resized.
    • Not relevant: Content sizing is dictated by the device the user owns
  • Luminosity and hue contrast can adapt to user requirements.
  • Text presentation attributes can be changed.
    • Not relevant: There is no text. Text rendered in WebGL unfortunately cannot be modified.
  • Visual presentation of pointers and cursors can be adjusted.
    • There are no cursors. Authors have control over pointer rendering.
  • Changing content presentation does not render it unreadable.
    • Not relevant
  • Technology does not allow blinking or flashing of content, or provides a feature for users to quickly turn it off or permanently disable it.
    • Unfortunately, as this is directly WebGL, there is no such control, but users are able to exit immersive sessions immediately (see the session-ending sketch after this checklist)
  • It is possible to make navigation order correspond to the visual presentation.
    • There is no navigation in this spec
  • If technology provides author control over color (Most of these properties reduce to the properties of WebGL)
    • There is a mechanism for users to override colors of text and user interface components.
      • Not relevant: There is no text, and immersive interface components are author-decided
    • There is a feature for authors to define semantically available "color classes" that users can easily map to custom colors, and give preference to this vs. coloring objects individually.
      • No, but the author may support such things
    • There is a feature for users to choose color schemata that work for them.
      • No, but the author may support such things
    • The foreground and background color of an object can be reported to the user via AT.
      • Not relevant: There are no "objects".
    • There are ways to set foreground and background colors separately for all objects.
      • Not relevant: There are no "objects".
    • Compositing rules for foreground and background colors are well defined.
      • Not relevant: There is no platform concept of foreground or background color
  • If technology provides features to accept user input
    • There is a mechanism to label user input controls in an unambiguous and clear manner.
      • Yes, authors can do this via the input profiles library (see the input-handling sketch after this checklist)
    • Authors can associate extended help information with a control.
      • Yes, authors can render additional help overlays if they wish
    • If there is an input error, it is possible to associate the error message clearly with the specific control that is in error.
      • Yes, authors can modify input rendering to show errors if they wish
    • There is a mechanism to report and set the state or value of controls programmatically.
      • No, the input signals come from the device. It is, however, possible to trigger custom select events on the session and input controls.
    • Authors can address multiple types of input hardware (keyboard, pointing device, touch screen, voice recognition, etc.), or the technology supports hardware-agnostic input methods.
      • Yes, the technology is capable of supporting all XR input devices
    • User input does not require specific physical characteristics (e.g., fingerprint readers).
      • No specific physical characteristics are required, but input devices do need to be XR input devices
    • Authors can ensure a "meaningful" order of controls exists regardless of presentation.
      • Not relevant: There is no relevant concept of control order for XR
  • If technology provides user interaction features: This section is largely about "interface objects", which do not exist in immersive mode.
  • If technology defines document semantics: No
  • If technology provides time-based visual media: No
  • If technology provides audio: No; however, WebAudio can be used
  • If technology allows time limits: No
  • If technology allows text content: No
  • If technology creates objects that don't have an inherent text representation: No "objects" are created
  • If technology provides content fallback mechanisms, whether text or other formats: No
  • If technology provides visual graphics: Yes, but there are no checklist items
  • If technology provides internationalization support: No
  • If technology defines accessible alternative features: No
  • If technology provides content directly for end-users: No
  • If technology defines an API
    • If the API can be used for structured content, it provides features to represent all aspects of the content including hidden accessibility features.
      • Not relevant: we do not provide for structured content
    • If the API relies on user agents to generate a user interface, the specification provides guidance about accessibility requirements needed to enable full interaction with the API.
      • There is no user-agent-provided user interface aside from permission dialogs, which are covered by other specifications
  • If technology defines a transmission protocol: No
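
Regarding the notes above that immersive aural experiences are possible with WebAudio: a rough, illustrative sketch of spatializing a sound with a PannerNode. The file name, coordinates, and function name are made-up example values, not anything defined by the spec.

```js
// Illustrative sketch only; 'footsteps.ogg' and the coordinates are example values.
// Note that an AudioContext generally needs a user gesture before it will play.
async function playSpatialSound() {
  const audioCtx = new AudioContext();

  const response = await fetch('footsteps.ogg');
  const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());

  const source = audioCtx.createBufferSource();
  source.buffer = buffer;

  const panner = new PannerNode(audioCtx, {
    panningModel: 'HRTF',      // binaural rendering over headphones
    distanceModel: 'inverse',
    positionX: 1, positionY: 0, positionZ: -2,   // metres, in listener space
  });

  source.connect(panner).connect(audioCtx.destination);
  source.start();

  // The AudioListener can be kept in sync with the XR viewer pose each frame, e.g.
  //   audioCtx.listener.positionX.value = pose.transform.position.x;
}
```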
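
On the flashing/blinking point: the escape hatch is that an immersive session can be ended at any time, by the author, the user, or the platform. A short sketch, assuming session is an active XRSession as obtained in the earlier sketch:

```js
// Ending an immersive session immediately returns control to the 2D page.
session.addEventListener('end', () => {
  // The 'end' event fires whether the page, the user, or the UA ended the session,
  // so platform-level exit gestures keep working regardless of page content.
});

session.end();   // returns a promise that resolves once shutdown completes
```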
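
And on input labelling and state: each XRInputSource carries an ordered profiles list that the (separate) input-profiles library uses to look up renderable models and component metadata, and the primary action arrives as select events on the session. The sketch below shows only the platform-level pieces; announceSelection is a hypothetical author helper and the profile ids are illustrative.

```js
// Enumerating XR input sources and reacting to primary-action ('select') events.
session.addEventListener('inputsourceschange', (event) => {
  for (const inputSource of event.added) {
    // Ordered most-specific-first, e.g.
    // ["oculus-touch-v3", "oculus-touch", "generic-trigger-squeeze-thumbstick"];
    // the input-profiles library keys its asset/component lookups off this list.
    console.log(inputSource.handedness, inputSource.profiles);
  }
});

session.addEventListener('select', (event) => {
  // Authors decide how to surface this, since all rendering is author-controlled.
  announceSelection(event.inputSource);
});
```
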
@Yonet added the a11y-tracker label on Nov 2, 2020
@himorin (Member) commented Mar 23, 2022

@AdaRoseCannon shall we close this, now that we have completed the a11y review?
