Review stages let you create, organize, prioritize, and manage your document review. Review stage metrics further assist in the management process by giving you a deeper understanding of the pace of review, tag rates, and overall findings. This information can help not only with staffing and budgetary considerations, but also with identifying and correcting anomalies early in the review process.
To view metrics, in the DISCO main menu, click Manage Review Stages.
Review metrics are unique to each review stage. To see the metrics for a particular review stage, click the review stage card on the left, and then click the Metrics tab.
The metrics are broken down into three areas:
- Pace – Displays the team’s overall review pace along with the median pace of active reviewers. DISCO will estimate how long the review will take based on current review pace.
- People – DISCO provides charts that show reviewer pace by day and tagging rate by reviewer. These charts provide insight into the overall review, and highlight outliers among your review team.
- Findings – The percentage distribution of tags applied within the stage. Double-click a tag to review the associated documents.
The Pace tab provides insight into the rate of review, as well as the projected date of completion based on your review team’s recent performance. With this estimate, you can assess whether the current number of hours and/or reviewers is adequate to meet your deadlines.
Estimated Completion date – This is calculated based on the recent Active Team Pace and the number of unreviewed documents remaining in the review stage. By default, the system assumes that the team will not be working on weekends or holidays; however, you can select or deselect the Working Weekends and Working Holidays checkboxes, and DISCO Review will adjust the completion date accordingly. Note: The holidays are preset to the standard United States federal holiday calendar.
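Conceptually, the completion estimate amounts to dividing the remaining documents by the recent daily pace and counting forward over working days only. DISCO does not publish its exact formula, so the sketch below is illustrative only; the function and parameter names are assumptions, not DISCO's API:

```python
from datetime import date, timedelta

def estimate_completion(remaining_docs, docs_per_day, start,
                        working_weekends=False, holidays=()):
    """Count forward one working day at a time until the backlog clears.

    Illustrative sketch only -- not DISCO's actual calculation.
    """
    days_needed = -(-remaining_docs // docs_per_day)  # ceiling division
    current = start
    while days_needed > 0:
        current += timedelta(days=1)
        is_weekend = current.weekday() >= 5  # Saturday=5, Sunday=6
        if (is_weekend and not working_weekends) or current in holidays:
            continue  # skip non-working days
        days_needed -= 1
    return current

# 1,000 unreviewed docs at 250 docs/day, starting Friday 2024-06-07:
# four working days are needed, and the intervening weekend pushes
# completion to Thursday 2024-06-13.
print(estimate_completion(1000, 250, date(2024, 6, 7)))
```

Enabling weekends in this sketch (like selecting the Working Weekends checkbox) pulls the date earlier, since Saturday and Sunday then count as working days.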
Recent Pace - Active Team Pace – The median of the overall document review pace of the review stage over the past seven days, excluding any obvious outliers. By limiting the data to the last seven days, the calculations better reflect normal fluctuations in pace, such as changes in the size of the review team, reviewers becoming familiar with the protocol, or document sources varying in complexity. Thus, DISCO's metrics more accurately reflect the most current outlook with fluctuations in pace automatically considered.
Recent Pace - Active Reviewer Pace – The median of all the individual reviewers' paces over the past seven days, excluding any obvious outliers. Obvious outliers are those days with a statistically abnormal review pace. Excluding such anomalies allows for a more accurate assessment. Examples of obvious outliers include someone reviewing a handful of documents over the weekend or someone working only half a day. Both instances would make the overall pace go down, giving the illusion of lower productivity.
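The pace statistics above can be pictured as a median over the last seven days with anomalous days dropped first. DISCO's actual outlier rule is not documented here, so this sketch substitutes a standard interquartile-range filter as a stand-in; the function name is illustrative:

```python
from statistics import median, quantiles

def robust_pace(daily_counts):
    """Median daily pace after dropping obvious outliers.

    Uses a 1.5x IQR filter as a stand-in for DISCO's (unpublished) rule.
    """
    recent = daily_counts[-7:]  # limit to the past seven days
    if len(recent) < 4:
        return median(recent)
    q1, _, q3 = quantiles(recent, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    kept = [c for c in recent if lo <= c <= hi] or recent
    return median(kept)

# A weekend day with only 5 documents reviewed is excluded as an
# obvious outlier, so the pace reflects typical full working days.
print(robust_pace([240, 260, 5, 250, 255, 245, 248]))
```

Without the filter, the 5-document weekend day would drag the median down and understate the team's real productivity, which is the distortion described above.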
Burndown Chart – Provides a visual representation of the number of documents remaining to be reviewed each day until the estimated date of completion. You can click on any given day within the burndown for further details, including the number of documents reviewed, documents remaining, documents added/removed, and active reviewers. Additionally, should any documents be added or removed from the review stage, the corresponding changes would be reflected here.
The People tab provides metrics specific to individual team members’ review paces as well as their tagging behavior.
Reviewer Pace chart
The Reviewer Pace chart plots how many documents each reviewer has reviewed each day (review pace). Click Select reviewers to choose which reviewers' or recent outliers' data to display. Outliers, in this context, are reviewers whose review pace fell outside the normal range over the past seven days, meaning their pace was significantly higher or lower than their peers' on a given day.
After selecting which reviewers and/or outliers you wish to focus on, DISCO will layer each chosen reviewer’s individual review pace onto the Reviewer Pace chart.
In addition to displaying each individual reviewer’s pace, you can click any point on the chart to see that reviewer’s pace as well as the median review pace for that day.
As with the Pace tab, you can enable or disable weekends and holidays to indicate whether reviewers worked on those days. The holidays are preset to the standard United States federal holiday calendar.
Tag Rates by Reviewer
This area shows the median rate at which each tag was applied, along with each reviewer's individual application rate.
For example, the median application of the ISSUE B tag might be 11%, while clicking in the chart reveals that one reviewer has applied the ISSUE B tag at a rate of 44%. DISCO Review makes it easy to identify such outliers by highlighting their data in orange. This insight into tagging variations allows you to identify differences in understanding among reviewers.
One easy example of this would be the use of a Hot tag. Some reviewers tend to apply such a tag broadly, while others are stricter in its use. Identifying variations early in the review allows those supervising it to ensure that all reviewers are on the same page, lessening the differences in tag application by individual reviewer. This cuts down the time needed later for QC as well as for other review workflows, such as deposition preparation.
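The comparison DISCO draws here amounts to computing each reviewer's application rate for a tag and flagging those far from the team median. The sketch below is a hedged illustration of that idea; the threshold, function name, and reviewer names are assumptions, not DISCO's actual rule:

```python
from statistics import median

def tag_rate_outliers(counts, threshold=3.0):
    """Flag reviewers whose tag rate strays far from the team median.

    counts maps reviewer -> (docs tagged, docs reviewed).
    Illustrative only -- the threshold is an assumption.
    """
    rates = {r: tagged / reviewed for r, (tagged, reviewed) in counts.items()}
    team_median = median(rates.values())
    flagged = {r: rate for r, rate in rates.items()
               if rate > threshold * team_median
               or rate < team_median / threshold}
    return team_median, flagged

counts = {
    "alice": (22, 200),   # 11% application rate
    "bob":   (11, 100),   # 11%
    "carol": (44, 100),   # 44% -- applies the tag far more often
}
team_median, flagged = tag_rate_outliers(counts)
print(f"median {team_median:.0%}, outliers {sorted(flagged)}")
```

Here the reviewer at 44% stands well apart from the 11% median, mirroring the orange-highlighted outlier described above.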
If you prefer to view the information in columns, click the column viewer option.
The Findings tab provides insight into the overall tag distribution for your review stage. Here you can see if your tagging aligns with any statistical sampling or other expectations.
Click a tag name to view the reviewed documents with that particular tag.
DISCO will pull up the selected documents in a search for your review. You can see tag application and basic metadata information by scrolling through your document list, using any view, including views with custom columns. You can also open the documents in the standard document viewer.
Finally, on the Findings tab, you can toggle between only those tags exposed to reviewers or all tags within the database.