Allow manual adding of issues for any test #7397
Hello @anphira, could you please provide additional information about the components that are failing but not being fetched by the tool? With those details, we can reproduce the issue on our end and investigate it.
Assessment_20240724_Maryland-tax-connect.json As far as I was able to tell, the tool was not able to find the content in the modal windows. A modal is triggered when clicking "File a form as a guest". The Close button is color #000 BUT has an opacity of 0.2, which yields a contrast ratio of about 1.6 and therefore fails contrast requirements. Another example of the automated checker passing an element it shouldn't: the label for username is "" -- that should be failed as a missing label, but is instead given a pass. Automated checks will always pass some things that they should not, which is why I'm requesting the ability to add manual failures to all modules. HTML files are not supported as uploads, but here's the link to the report as HTML: https://www.dropbox.com/scl/fi/f6amhewvxd306omrn88x4/Assessment_20240724_Maryland-tax-connect.html?rlkey=b8a02nyim9zp4p58rvol1pcti&dl=0
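For reference, the 1.6 figure above can be reproduced with the standard WCAG 2.x math: alpha-composite the #000 button at 0.2 opacity over its backdrop, then compute the contrast ratio of the result against that backdrop. This is a minimal sketch, not the Accessibility Insights implementation; the white (#FFF) backdrop is an assumption based on the report.

```python
def srgb_channel_to_linear(c8: int) -> float:
    """Linearize one 8-bit sRGB channel per the WCAG relative-luminance formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an opaque sRGB color."""
    r, g, b = (srgb_channel_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def composite_over(fg, bg, alpha):
    """Alpha-composite a foreground color over an opaque background."""
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio, always >= 1."""
    l1, l2 = sorted((relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

black, white = (0, 0, 0), (255, 255, 255)
# #000 at 0.2 opacity over white renders as (204, 204, 204), i.e. #CCCCCC
effective = composite_over(black, white, 0.2)
print(f"effective color: {effective}")
print(f"contrast ratio: {contrast_ratio(effective, white):.1f}")  # 1.6, fails the 3:1 minimum for UI components
```

This is exactly why such a case lands in "needs review" rather than an automatic fail: the automated check sees the declared #000 color, while the effective rendered contrast depends on opacity and whatever is composited underneath.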
As per our analysis, the color-contrast issue for the Close button in the "File a form as a guest" modal is captured in the "needs review" section of the FastPass automated checks. Needs review surfaces instances that must be reviewed by a human to determine whether they pass or fail. Regarding the username label being "": per the tool documentation, if a title or placeholder is available for an input, it is given a pass. Please refer to Accessibility Insights - label for more details on the label rule. We will discuss the feature for allowing manual addition of issues for any module in our triage meeting.
Thank you for the response. For the labels, many testers would fail them. From the Trusted Tester process, "Test ID 5.A (3.3.2-label-provided) requires that all form fields have visual labels or instructions." I always fail any form field without a correct label element. This is an area where allowing manual addition of issues would be helpful, as not all testers and testing processes are the same. Thank you for your assistance.
This issue requires additional investigation by the Accessibility Insights team. When the issue is ready to be triaged again, we will update the issue with the investigation result and add "status: ready for triage". Thank you for contributing to Accessibility Insights! |
Is your feature request related to a problem? Please describe.
There are regularly items that the scanner does not detect. For example, a modal dialog can be opened on the page, yet none of the modal's components are detected by the tool. There are contrast failures on those modal components, but I have no way of adding those failures to Contrast > UI components.
Describe the desired outcome
I need to manually add failures to any test. The tool can't find everything, so I need to be able to add a manual failure.
Describe alternatives you've considered
Right now I have to keep a separate file of noted manual failures.
Additional context
Under Adaptable content > Contrast, on some sites the tool detects nothing, even though there are color contrast failures. It just says "No matching instances" under Instances. Right now I have to keep a separate document noting these failures. I want to be able to include them in the nice report this tool makes.