Digital Accessibility Objectives and Key Results (OKRs)
Creating a metrics-oriented hierarchy for tracking accessibility progress
If you don’t know what an OKR is or why it is important to have them, please read this article first. If you are only going to skim it or don’t have time, just focus on the section called “Don’t half-ass OKRs.” This accessibility OKR article assumes that you have a basic knowledge of OKRs. It contains a very short OKR terminology refresher course and will help you apply the OKR knowledge you already have to the accessibility realm.
OKRs (Objectives and Key Results) are a popular management framework used by many tech companies, as well as technology divisions within non-tech organizations. It is a relatively simple tool that helps create alignment and engagement around measurable and verifiable goals. OKRs are not new; they are attributed to Andy Grove at Intel roughly 40 years ago. If you are working on an organized digital accessibility effort, chances are you are in the type of organization that is either already using OKRs or could benefit from OKRs centered around accessibility.
OKRs generally consist of a list of three to five high-level Objectives, each with three to five measurable Key Results. Each Key Result is scored from 0–100% based on how well it has been accomplished. After creation and approval, OKRs should be revisited and updated by their owners frequently. I have a calendar reminder for the first work day of every month to prod me to review and update mine in our OKR tracking tool.
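If it helps to picture the hierarchy, here is a minimal sketch in Python. It assumes the common convention of rolling Key Result scores up into an average for their parent Objective; the class names, sample Key Results, and scores are purely illustrative and not the data model of any particular OKR tool.

```python
# Illustrative only: a minimal model of the Objective/Key Result hierarchy,
# assuming the common convention of scoring an Objective as the average of
# its Key Result scores. Names and sample numbers are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyResult:
    description: str
    score: float = 0.0  # 0-100, how completely this Key Result has been achieved

@dataclass
class Objective:
    title: str
    key_results: List[KeyResult] = field(default_factory=list)

    def score(self) -> float:
        """Roll the 0-100 Key Result scores up into an average for the Objective."""
        if not self.key_results:
            return 0.0
        return sum(kr.score for kr in self.key_results) / len(self.key_results)

accessibility = Objective(
    title="Improve <Product Name> product accessibility",
    key_results=[
        KeyResult("Zero backlog of Priority 1 accessibility defects by March", score=70),
        KeyResult("Keyboard-only support for 15 most common workflows by May", score=40),
        KeyResult("Train 10 new individuals on WCAG 2.1 before June", score=100),
    ],
)
print(f"{accessibility.title}: {accessibility.score():.0f}%")  # -> 70%
```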
If your OKR system allows you to delete Key Results that are not 100% achieved, it is important to track why and how they became obsolete. You do not want to consciously or subconsciously undermine the value of OKRs in your accessibility organization by removing Objectives or Key Results solely because they are difficult and you might not be making much progress on them, and then substituting less important Objectives or Key Results where it is easier to demonstrate forward progress.
Potential Accessibility Objectives
Objective 1: Increase the accessibility of <Product Name/URL> by 20%
I am not a big fan of Objective #1. I do know companies that use statistics like “81% accessible” (or some other random number). But think about the following concerns:
What does 81% accessible actually mean if one of the remaining issues in the “19% inaccessible” blocks all screen reader users? To screen reader users, the 81% the product does correctly means absolutely zero, because the app/website never works with their assistive technology of choice, and there is usually no workaround other than having sighted assistance.
There are wild variations in how this number gets calculated. I had one tool report a home page as 42% compliant, and another report the exact same home page code the same day as 79% compliant. The difference came from the different methodologies the two tools used to calculate that number. There is no industry-accepted standard for “percent accessible” whatsoever.
Be careful to define what “by 20%” means. If you are at 79%, are you aiming for 99% (adding 20 percentage points), or to improve the 79% by 20% of itself (i.e., adding 20% of 79%, which lands at roughly 94.8%)? The sketch below works through both readings.
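A quick illustration of the two interpretations, using the hypothetical 79% baseline from the example above:

```python
# Illustrative arithmetic only, using the hypothetical 79% baseline above.
baseline = 79.0

# Interpretation 1: "by 20%" means adding 20 percentage points.
absolute_target = baseline + 20            # 99.0%

# Interpretation 2: "by 20%" means improving the baseline by 20% of itself.
relative_target = baseline * 1.20          # 94.8%

print(f"Absolute target: {absolute_target}%, relative target: {relative_target:.1f}%")
```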
Those three caveats aside, Objective #1 is a valid OKR, provided that you know your baseline and you know how you are going to measure whether or not you have achieved the desired level of improvement outlined in the Objective.
Objective 2: Improve <Product Name> product accessibility
I like Objective #2, as long as its Key Results can be objectively measured (see the next section).
Objective 3: Promote accessibility within <organization>
Objective 4: Promote accessibility efforts outside of <organization>
I think Objectives 3 and 4 should be part of all mature accessibility programs, or of programs that are aiming for that level of maturity. Accessibility objectives that do not include internal communications goals will have a very hard time being completely successful.
Key Results
Here are some Key Results that might be valid for Objectives 1 and 2 above. The list below is not meant to be cumulative; for example, the first two Key Results probably contradict each other, since lack of keyboard support should always be a very high-priority defect.
Key Result: Zero backlog of <product> Priority 1 accessibility defects by March
Key Result: Full keyboard-only support for the 15 most common <Product> user workflows by May
Key Result: Train [Y] new individuals on WCAG 2.1 update before June
Key Result: [Z] QA engineers pass IAAP CPACC exam before August
Key Result: Integrate accessibility into go/no go decision process
Key Result: Close loop between accessibility audit and JIRA Backlog
Key Result: Prioritize existing known accessibility issues in <Product 1>, <Product 2> for remediation
Key Result: Identify accessibility gaps in <product list: wherever there is a gap> and prioritize for remediation
Key Result: Add accessibility specific commentary to style guide
Regardless of how your organization measures accessibility success, some variation on the following Key Results is generally necessary in an organization attempting to mature its accessibility effort. Which Objectives you associate them with depends on how your organization is structured.
Key Result: Identify/Create one-stop in-house shop for accessibility resources on <platform> (we use Confluence)
Key Result: Present on accessibility at <event: webinar, meetup, corporate UI/design opportunity>
Key Result: Create formal accessibility training program for <Designers, Developers, Content Managers>
Key Result: Staff open accessibility positions and make additional requests according to the required level of effort (LOE)
Key Result: Establish accessibility feedback <channels: Slack, Office 365 Teams, corporate accessibility e-mail address>
Key Result: Create Global Accessibility Awareness Day campaign
Key Result: Present at/Attend [X] accessibility symposiums/conferences
Key Result: Write [Y] Medium posts on accessibility
Key Result: Establish disability-specific UX research strategy including user interviews and focus groups
Key Result: Complete <some type of accessibility/diversity education>
Key Result: Align accessibility effort with <other part of the company: diversity, reasonable accommodations, ERGs, procurement>
Conclusion
Even if you know what you want your accessibility effort to accomplish for a given time period, systematically going through an OKR process for your accessibility program may shed light on areas that were previously confusing to either you or others. Applying critical analysis while defining Objectives or Key Results may actually help you realize that something you thought was important, is not (or vice versa: something you thought wasn’t important, is).
Improving an accessibility program that is not yet fully mature really is like the adage “How do you eat an elephant? One spoonful at a time.” It can seem like an overwhelming task and can actually paralyze decision making, because it’s hard to tell where to start, and progress 100 spoonfuls in might not be visible. Because OKRs are visited frequently and improvement can be tracked in small increments, I find they help remind managers that their accessibility programs are improving even when it feels like they might be standing still.