Create Review (QA) Tasks
- Updated On 21 Apr 2025
Overview
The quality assurance (QA) process ensures that annotations comply with defined quality standards or requirements, identifies and addresses issues, and verifies that the desired level of quality has been achieved.

The QA process consists of the following steps:
- Reviewers can use two tools to indicate quality issues during QA.
- Create an Issue on an annotation: used when an annotation has a quality problem, such as a wrong position, label, or attribute.
- Create a Note annotation: used to indicate that a required annotation is missing. The note annotation is created with an issue attached to it.
- When an item in a QA task has an open issue, its status is removed from the original task where the annotation was created.
- For note annotations, since they are created in the QA task, the issue is assigned by default to the last person who set a status on the item. However, the note reporter can manually change the note assignee. For more information on the Note annotation tool, see the Note article.
- Annotators will see open issues on assignments, and the task owner or manager can see them on the task level.
- Annotators can start correcting the annotations.
- After correcting an annotation, the annotator flags it as For review and sets the item's status to Complete. This makes the item appear again in the QA task.
If necessary, annotators can create an issue in the QA task.
- The reviewer can view how many For review annotations are pending on their assignments and open them for review.
- If corrections are accepted, the reviewer flags them as Approved. Once all corrections are approved, the reviewer can set the item's status to Approved.
To gain a clearer understanding of the QA process, see the QA Process Example section.
To ensure that annotators learn from their mistakes and progress along the learning curve, Dataloop assigns any correction work to the original annotation creator by default.
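The reopen-and-review cycle described above can be sketched as a tiny state model (illustrative only; the class, method, and status names are assumptions, not Dataloop SDK identifiers):

```python
# Minimal model of the QA status flow: an open issue removes the item's
# status, the annotator corrects and completes, and the reviewer approves.

class QAItem:
    def __init__(self):
        self.status = "completed"   # set by the annotator in the labeling task
        self.open_issues = 0

    def report_issue(self):
        # Opening an issue removes the item's status, so the annotator's
        # assignment in the original task becomes active again.
        self.open_issues += 1
        self.status = None

    def correct_and_complete(self):
        # The annotator fixes the annotation, flags it "For review",
        # and sets the item back to Complete.
        if self.open_issues == 0:
            raise ValueError("nothing to correct")
        self.status = "completed"

    def approve(self):
        # The reviewer accepts the corrections and approves the item.
        self.open_issues = 0
        self.status = "approved"

item = QAItem()
item.report_issue()
print(item.status)            # None: the Completed status was removed
item.correct_and_complete()
item.approve()
print(item.status)            # approved
```

The model captures the key invariant of the flow: an item with an open issue has no status until it is corrected and re-approved.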
Review (QA) process
Role of users in the QA process
- Create QA Tasks: An Annotation Manager creates QA tasks, redistributes and reassigns these tasks to annotators, reviews their tasks, etc.
- QA Tasks Assignees: QA tasks can be assigned to annotators, annotation managers, developers, and project owners.
- Report Issues: Issues can be reported only by users with the Annotation Manager role and above.
If necessary, annotators can create an issue in the QA task.
QA process example
The following example illustrates the steps involved in the QA process.
Step 1: Create a QA Task
The annotation manager views the completed annotation task and creates a QA task (in this example, the annotation manager assigns the QA task to themselves and acts as the QA reviewer, but you can assign it to other eligible roles). Learn to Create a review (QA) task.

Step 2: Create Issues During the QA
During the QA task, the QA tester (in this case, the annotation manager):
- Discovers issues with a couple of annotations and opens an issue on them.
- In addition, discovers a few objects that are not annotated; since there is no annotation to attach an issue to, reports the problem with a note.
- Creates the note (a ladybug icon) over the unannotated objects with instructions for the annotator. For more information about notes, see the QA Note Annotation article.
When an issue or note is created, the Completed status is removed from the item, and the assignment becomes active again.

The item cannot be completed or approved until the issues have been resolved and approved by the QA tester.
On the Task page, the progress bar shows that one item is not completed, and there are issues remaining to be resolved.

As you can see in the above screenshot, the annotation manager can view tasks, issues, and pending review items.
Step 3: View Issues Reported During the QA
The annotator can see that there are open issues with the task on the Tasks page.
- Hovering over the red exclamation displays the number of open issues.
- Double-clicking on the exclamation opens the item with issues.
Step 4: Correct the Issues
- The annotator corrects the annotations and marks those annotations For Review in the Annotations tab by clicking on the hourglass icon.

If necessary, annotators can create an issue in the QA task.
- The annotator clicks Complete. On the Tasks page, the task is shown with items for review.
Similarly, the Tasks page on the QA tester’s platform shows that the annotator has submitted items for review.

Step 5: Review QA Corrections and Approve
The QA tester clicks Browse Pending Review to view the corrected annotations on the items.
If the issues have been corrected satisfactorily, the QA tester approves them by:
- Selecting the annotations that are marked for review.
- Clicking on the Approve checkmark icon.

At this point, the For review icons disappear, and the QA tester can click Approve.
Both the annotation task and the QA task are now complete.

Create a review (QA) task
You can create Review (QA) tasks based on the following two scenarios:
- A Review (QA) Task from a Labeling task: To validate annotations created by assignees.
- A Standalone Review (QA) Task: To validate annotations that are uploaded to the platform, for example, annotations created by your model.
On the tasks page, Review (QA) Tasks are linked to their respective annotation tasks. Click the "+" icon next to an annotation task to see all Review (QA) tasks related to it.
To create a Review (QA) task, follow the instructions for each section:
- Open the Labeling page from the left-side menu.
- Click Create Task. The task type selection popup is displayed.
- Select the Review (QA) Single Task option.

- Click Continue.
1. General
Enter or select the required details in the General section:

- Task Name: By default, your task name with a - QA suffix is displayed. Modify it if needed.
- Owner: By default, the current user's email ID is displayed. Click on it to select a different owner from the list.
- Priority: Select a priority from the list. By default, Medium is selected.
- (Optional) Completion Due Date: Select a task's due date from the calendar.
- Click Next: Data Source.
2. Data source
Enter or select the required details in the Data Source section.

Select Dataset: By default, the dataset used to create the task is displayed.
(Optional) Filters: Refine data selection by selecting specific folders, using DQL filters, or subsampling (randomly and equally distributed). The Folder or DQL field is active only if you did not select any items in the dataset.
- Folders: Select a folder from the dataset.
- Selected Filters / Saved DQL Query: Select a filter or saved DQL query from the list.
- Data Sampling: Enter the Percentage or Number of Items for the task. Data sampling does not guarantee an exact set of items.
- Percentage: Selects items at random. For example, with four items in the selected dataset, 75% selects three of the four, and which three are chosen can vary.
- Number of Items: Selects the specified number of items sequentially from the start of the dataset, not randomly.
- Collections: Choose a collection from the list to filter and display items within the selected collection.
Click Next: Instructions.
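The two sampling modes can be sketched as follows (a minimal illustration of the behavior described above; the function names are assumptions, and the platform's actual sampling logic may differ):

```python
import random

def sample_by_percentage(items, percentage, seed=None):
    """Randomly pick roughly `percentage` percent of the items.
    As noted above, which items are chosen varies between runs;
    here the count is rounded to the nearest whole item."""
    rng = random.Random(seed)
    count = round(len(items) * percentage / 100)
    return rng.sample(items, count)

def sample_by_count(items, n):
    """Select `n` items sequentially from the start of the dataset."""
    return items[:n]

items = ["item1", "item2", "item3", "item4"]
print(sample_by_count(items, 3))             # ['item1', 'item2', 'item3']
print(len(sample_by_percentage(items, 75)))  # 3
```

Note the asymmetry: percentage sampling is random, while Number of Items always takes items from the start of the dataset.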
3. Instructions
Enter or select the required details in the Instructions section. The number of Labels and Attributes is displayed on the top-right side of the page.

- Recipe: By default, the dataset's default recipe is displayed. Select a different recipe from the list, if needed.
- QA Instructions (.pdf): The QA Instruction document is displayed, if available. Go to the Recipe section to upload a PDF instruction.
- Click Next: Statuses. The Statuses section is displayed.
4. Statuses
Enter or select the required details in the Statuses section:

- By default, the Approved status is selected. Click Add New Status to add a new status.
- Click Next: Assignments.
5. Assignments
Enter or select the required details in the Assignments section.
- Allocation Method: Select one of the following allocation methods:
- Pulling: Annotators pull a batch of items at a time, up to a maximum number of items per assignment. If required, you can adjust the Pulling batch size (items) and Max items in an assignment fields.
- Distribution: The distribution allocation method means that the items will be distributed in advance among users, equally or based on a custom percentage.
- The Auto Distribution option distributes the task equally among the users. By default, it is checked.
- The Show only unassigned users to any labeling task option filters the list to users who are not assigned to any labeling task.
- Available Users: Search for or select users from the list, and click the Forward arrow icon to add them to the Assigned Users list.
- Assigned Users:
- Search for or view the assigned users from the list. The allocation percentage is equally distributed if you select Auto Distribution.
- Select a user and click the Backward arrow icon to remove them from the Assigned Users list.
- Click Create Task.
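The Auto Distribution option can be sketched as an equal split with remainders (an illustrative model only; the function name and user emails are assumptions, and the platform's actual algorithm may differ):

```python
def auto_distribute(num_items, assignees):
    """Split items equally among assignees; leftover items go
    one each to the first assignees in the list."""
    base, extra = divmod(num_items, len(assignees))
    return {user: base + (1 if i < extra else 0)
            for i, user in enumerate(assignees)}

# Hypothetical assignees for illustration.
split = auto_distribute(10, ["ann1@acme.com", "ann2@acme.com", "ann3@acme.com"])
print(split)  # {'ann1@acme.com': 4, 'ann2@acme.com': 3, 'ann3@acme.com': 3}
```

With a custom percentage distribution instead, you would replace the equal split with per-user ratios that sum to 100%.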
Set a QA status
When working on a QA assignment, the assignee or QA tester is expected to review annotations and set the item's status to:
- Approved, if it is annotated satisfactorily.
- Discarded, if the item is unsuitable for the task.
- or raise issues if there are problems with the annotation of the item that the QA tester wants the annotator to fix.
Setting the status on an item will trigger the studio to move on to the next item.
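The reviewer's per-item decision can be summarized in a small helper (illustrative only; the status names follow this section, but the function itself is an assumption, not platform code):

```python
def review_item(annotated_ok, suitable_for_task):
    """Return the status a QA reviewer would set on an item,
    or None when issues should be raised instead."""
    if not suitable_for_task:
        return "discarded"   # item is unsuitable for the task
    if annotated_ok:
        return "approved"    # annotated satisfactorily
    return None              # no status yet: raise issues for the annotator

print(review_item(True, True))    # approved -> studio moves to the next item
print(review_item(False, True))   # None -> reviewer opens issues instead
```

Only the first two outcomes set a status and advance the studio; raising issues leaves the item statusless until the annotator corrects it.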
View QA status
You can view the QA status of a task on the following pages, depending on the user role and the type of information requested.
- Task's page: Double-click a task on the Tasks page to display its assignments, with indications of items that have open issues or are pending review. This page also provides the Browse Tasks and Browse Issues options.
- Assignments' page: Double-click on a task to see all its assignments. Each assignment line shows an indication of whether there are open issues or items pending review.
- Issues' page: On the Labeling > Issues page, you can see the full list of issues in a project. For more information, see the Issues article.