ASReview LAB is a free (Libre) open-source machine learning tool for screening and systematically labeling large collections of textual data. The user-friendly web app (ASReview LAB) can be used to screen unlabeled data: the reviewer acts as the ‘oracle’ and makes the labeling decisions, while the machine selects the record most likely to be relevant to show next.
Up and running in 2 minutes
Download Python, install ASReview, and open the user-friendly web app. Ready, set, start screening!
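For the command-line minded, the standard route is two commands (assuming Python and pip are already on your PATH):

```shell
# Install ASReview LAB from PyPI
pip install asreview

# Launch the web app; by default it opens in your browser at http://localhost:5000
asreview lab
```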
Connect with citation managers
ASReview is designed to connect with citation managers (Zotero, EndNote, Mendeley, and RefWorks) and other screening software (e.g., Rayyan, Distiller, Covidence). Importing all your documents is a breeze. Even better, after you’re done screening, you can easily continue your work in your favorite citation manager.
Tabular datasets with the extensions .csv, .tab, .tsv, or .xlsx can be used in ASReview LAB; only title and abstract fields are mandatory. RIS files are also supported: the labels are stored in the N1 (Notes) field and are recognized by your favorite reference manager!
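To illustrate how little is needed, here is a minimal sketch that builds such a tabular dataset with only the standard library; the filename and records are made up for illustration, and only the mandatory title and abstract columns are included:

```python
import csv

# A minimal tabular dataset: only title and abstract columns are mandatory.
records = [
    {"title": "A systematic review of X", "abstract": "Background: ..."},
    {"title": "Machine learning for screening", "abstract": "We propose ..."},
]

with open("my_dataset.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "abstract"])
    writer.writeheader()
    writer.writerows(records)
```

The resulting my_dataset.csv can be imported directly on the project setup page.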
Import your own dataset, insert a URL to a dataset, provide a DOI to a data repository (supported for many data repositories via Datahugger), or select one of the benchmark datasets via the extension.
Select training data
The first iteration of the active learning cycle requires training data, referred to as prior knowledge. This knowledge is used by the classifier to create an initial ranking of the unseen records. You can search for prior knowledge in the dataset or use a partly labeled dataset containing labels from a previous study or from expert elicitation.
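A partly labeled dataset is simply your tabular file with a label column filled in for some records. The sketch below is an illustration only: the column name "included" is an assumption (ASReview recognizes a small set of label column names; check the documentation for your version), and the records are made up:

```python
import csv

# Labels from a previous study: 1 = relevant, 0 = irrelevant,
# empty = still unlabeled. The column name "included" is an assumption;
# verify the recognized label column names for your ASReview version.
records = [
    {"title": "Prior relevant paper", "abstract": "...", "included": "1"},
    {"title": "Prior irrelevant paper", "abstract": "...", "included": "0"},
    {"title": "Unseen paper", "abstract": "...", "included": ""},
]

with open("partly_labeled.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "abstract", "included"])
    writer.writeheader()
    writer.writerows(records)
```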
Select your favorite model
You can rely on the default model, but ASReview also allows you to select your favorite feature extractor (TF-IDF, SBERT, Doc2Vec, Multilanguage, etc.) and classifier (Naive Bayes, logistic regression, SVM, random forest, neural networks, LSTM, etc.), and change the balancing strategy or query strategy. This makes our software unique!
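To give a feel for what a feature extractor plus ranking does under the hood, here is a toy sketch in pure Python. This is not ASReview's implementation (its defaults pair TF-IDF features with a Naive Bayes classifier); it only illustrates the idea of turning texts into TF-IDF vectors and ranking unlabeled records by similarity to a known-relevant one:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute simple TF-IDF vectors (as dicts) for a list of token lists."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    return [{t: tf * idf[t] for t, tf in Counter(doc).items()} for doc in docs]

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(v * b[t] for t, v in a.items() if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Toy corpus: the first record is known relevant (prior knowledge).
texts = [
    "active learning for systematic review screening",
    "active learning reduces screening workload",
    "a recipe for sourdough bread",
]
vecs = tfidf_vectors([t.split() for t in texts])
relevant = vecs[0]

# Rank the remaining records: most similar to the relevant record first.
ranking = sorted(range(1, len(vecs)),
                 key=lambda i: cosine(vecs[i], relevant), reverse=True)
print(ranking)  # → [1, 2]
```

In the real tool, the chosen classifier is retrained after each of your decisions, so the ranking keeps improving as you screen.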
Focus on what matters
No distractions; we only show the information you really need: Title, Abstract, and a hyperlink to the full text (if available in your metadata); that’s enough to make your decision!
Follow your progress
During screening, you might want to track your progress and obtain information for your stopping criteria.
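Stopping criteria are up to you. One common heuristic from the screening literature (not a built-in ASReview rule) is to stop after a fixed number of consecutive irrelevant decisions; a minimal sketch, with the window size as an assumption you should tune:

```python
def should_stop(labels, window=50):
    """Stop when the last `window` decisions were all irrelevant.

    labels: list of 0/1 screening decisions in the order they were made.
    """
    if len(labels) < window:
        return False
    return sum(labels[-window:]) == 0

# Example: after a few relevant hits, 50 irrelevant records in a row.
decisions = [1, 1, 0, 1] + [0] * 50
print(should_stop(decisions))  # → True
```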
Need a break? Search for the hidden game...
Make notes while you screen! The notes are added to your dataset and can be exported.
Added a wrong label?
In some cases, you might want to change your previous decision. An overview of your decisions made during screening can be found on the History page. You can change decisions on this page, and the new label will be used for the next iteration of the model.
You can export the results of your labeling to a RIS, CSV, TSV, or Excel file. The file contains all imported data, including your decisions and notes.
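Because the export includes your decisions alongside the original data, post-processing is straightforward. A minimal sketch, using an in-memory stand-in for an exported CSV (the column names here are assumptions; check the header of your actual export):

```python
import csv
from io import StringIO

# Stand-in for an ASReview CSV export; real column names may differ.
export = StringIO(
    "title,included,notes\n"
    "Relevant paper,1,key study\n"
    "Irrelevant paper,0,\n"
    "Never screened,,\n"
)

rows = list(csv.DictReader(export))
relevant = [r["title"] for r in rows if r["included"] == "1"]
print(relevant)  # → ['Relevant paper']
```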
Because we Care!
We strongly believe in full transparency and reproducibility. Therefore, we offer the possibility to export the project file containing all the information to fully reproduce the entire screening phase.
You have probably seen or read somewhere that ASReview LAB is open-source (Libre) software. But what does that mean? And how can you join? Read all about it in the blog post.