SBSeg'24 Artifact Evaluation

The Tools Hall (SF) and the Scientific Initiation and Undergraduate Work Workshop (WTICG) will allow authors to submit artifacts related to the submitted article, which may be software, data, supplementary documentation, raw results, proofs of concept, models, detailed evaluations, benchmarks, etc.

This resource is essential for the complete understanding of a scientific article, which goes beyond the document itself. The quality of an artifact tends to be as important as the scientific article itself. Conferences in Brazil typically do not offer a formal means of submitting and evaluating anything other than the article itself. Breaking with this tradition, SBSeg will collect artifacts, which are mandatory for the Tools Hall (SF) and optional for the Scientific Initiation and Undergraduate Work Workshop (WTICG). Artifact submission takes place together with article submission, and authors must indicate which quality badges should be considered for the artifact, as well as provide detailed documentation following the submission rules.

The artifact evaluation process is modeled on the processes used by renowned conferences such as USENIX, CoNEXT, SIGCOMM and EuroSys and will be carried out by the Artifacts Technical Committee (CTA). Authors who intend to submit artifacts should read the descriptions of the available badges and the instructions on what is expected of an artifact.

Call for Artifacts

An artifact can be software, data, supplementary documentation, raw results, proofs of concept, models, detailed evaluations, benchmarks, etc.

Four quality badges can be considered for an artifact:

  1. Available Artifacts (SeloD);
  2. Functional Artifacts (SeloF);
  3. Sustainable Artifacts (SeloS); and
  4. Reproducible Experiments (SeloR).

Before submitting your artifact, check the requirements for obtaining each badge. If you have any questions, please contact us.

Submission instructions

Four quality badges can be considered for an artifact. Along with the submission of the final (camera-ready) version of the article, authors can submit the artifact(s) related to the article. For artifacts to be judged, it is mandatory to submit an appendix describing which badges should be considered in the evaluation of the work's artifact(s) and providing the instructions reviewers need to handle them properly (especially in the case of software or data).

Requirements

For the work/artifact to be eligible to receive a badge, some requirements must be met, as described below.

Available Artifacts (SeloD)

Code and/or data are expected to be available in a stable repository (such as GitHub or GitLab). This repository is expected to contain a README.md with minimal documentation (clear enough to be understood), describing the purpose of the artifact(s) and including the title and abstract of the article.

Functional Artifacts (SeloF)

The program code of the artifact(s) is expected to be executable so that reviewers can observe its functionality. To obtain this badge, it is important that additional information is present in the repository's README.md, such as:

  1. list of dependencies;
  2. versions of the dependencies/languages/environment;
  3. description of the execution environment;
  4. installation and execution instructions;
  5. one or more execution examples.
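
For illustration only, the skeleton of a README.md covering these items might look like the sketch below (all names, versions, commands and table references are hypothetical, not prescribed by this call):

    Artifact for "Title of the Article"

    Abstract: one-paragraph abstract of the article.
    Purpose: what the artifact does and which claims of the article it supports.
    Dependencies: Python 3.11, scapy 2.5, tested on Ubuntu 22.04.
    Installation: pip install -r requirements.txt
    Execution example: python run_demo.py --input data/sample.pcap
    Expected output: a summary table similar to Table 2 of the article.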

Sustainable Artifacts (SeloS)

The program code of the artifact(s) is expected to be modularized, organized, intelligible and easy to understand. To obtain this badge:

  1. the code(s) must be documented (describing files, functions, APIs, etc.);
  2. the code(s) and other artifacts must have a minimum level of readability;
  3. reviewers must be able to identify the main claims of the article in the artifact(s).
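
As a purely hypothetical illustration of the level of documentation and readability expected (the function and metric below are not taken from any particular article):

    def detection_rate(true_positives: int, total_attacks: int) -> float:
        """Return the fraction of attacks that were correctly detected.

        In a real artifact, the docstring should also point to the claim,
        table or figure of the article that this function supports.
        """
        if total_attacks == 0:
            raise ValueError("total_attacks must be greater than zero")
        return true_positives / total_attacks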

Reproducible Experiments (SeloR)

The reviewer is expected to be able to reproduce the main claims presented in the article. To obtain this badge, the artifact must provide:

  1. instructions for executing the code(s) so as to reproduce and confirm the main claims of the article (e.g., the results behind the main graphs/tables);
  2. a description of the process of carrying out the experiments that lead to the article's results;
  3. a description of specific technical details of the environment (when applicable), such as details of the infrastructure used on Amazon Cloud or Google Cloud. If necessary, include access keys and other information that allows the experiment to be reproduced.
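
For illustration only, a reproduction entry point could look like the sketch below; the script name, dataset path and metric are hypothetical placeholders for the artifact's actual experiments:

    """reproduce_results.py -- rerun the experiment behind the article's main result (illustrative)."""
    import csv

    def run_experiment(dataset_path: str) -> float:
        """Recompute the headline detection-rate metric from the released raw data."""
        with open(dataset_path, newline="") as f:
            rows = list(csv.DictReader(f))
        attacks = [r for r in rows if r["label"] == "attack"]
        detected = sum(1 for r in attacks if r["prediction"] == "attack")
        return detected / len(attacks)

    if __name__ == "__main__":
        rate = run_experiment("data/events.csv")  # hypothetical dataset shipped with the artifact
        print(f"Detection rate: {rate:.2%} (compare with the value reported in the article)")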

Appendix (required)

To facilitate the artifact evaluation process, a (required) appendix template was created in which the authors describe which badges should be considered in the artifact evaluation process and include information to assist the reviewers of the Artifacts Technical Committee. The LaTeX template for the appendix is available at Example-Appendix.

It is mandatory that Sections 1 to 4 are present. If you have any questions, please contact us.

Note: Before submitting their artifact (appendix), it is strongly recommended that authors install and run their artifact in a new environment (virtual machine) following only the instructions in the appendix to confirm that the instructions are consistent and complete.

Note: Remember that the entire CTA review process takes into account the information in the Appendix.

Examples of SBSeg'23 Artifacts

An example of an Appendix, and the corresponding Article after the committee's evaluation process.

Review instructions

Your goal as an artifact reviewer is to ensure that the quality of the artifact matches the content of the article and meets the minimum requirements expected for each badge. To carry out this activity well, before evaluating the artifacts: (i) read the article; and (ii) identify the main contributions of the work. These two steps make the evaluation of each badge simpler.

Note: The review period is relatively short. We recommend starting your reviews as soon as you receive your assignments.

Steps for evaluating an artifact

The review can be performed in an environment of your choice, as long as it meets the minimum requirements of the execution environment expected for the artifact. We recommend running the artifact (when applicable) in a virtual environment, as this is convenient for reviewers and ensures that components present on your local machine do not interfere with the evaluation process (a clean installation in a fresh environment reduces unforeseen issues).
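
As one possible way to follow this recommendation when the artifact is Python-based, a minimal sketch (assuming a requirements.txt is shipped with the artifact; the directory name is hypothetical):

    # Create an isolated Python environment so that packages already installed on
    # the reviewer's machine do not interfere with the evaluation (illustrative only).
    import subprocess
    import venv
    from pathlib import Path

    env_dir = Path("artifact-eval-env")              # hypothetical directory name
    venv.EnvBuilder(with_pip=True).create(env_dir)   # fresh environment with pip

    pip = env_dir / "bin" / "pip"                    # on Windows: env_dir / "Scripts" / "pip.exe"
    subprocess.run([str(pip), "install", "-r", "requirements.txt"], check=True)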

All additional resources required to run the artifact (cloud infrastructure, SSH keys, etc.) must be present in the appendix describing the artifact.

The artifact under evaluation is related to an article being evaluated by the SF and WTICG committees. A CTA reviewer's focus is the artifact, not reviewing the article. However, if any issues are found, they should be reported to the artifact evaluation coordinators.

Note: Please remember that all artifacts, analyses and discussions are confidential.

Review Process

The review process is divided into two stages. First, you must select works whose topics are familiar to you. Choose works as soon as possible so that the committee knows which works each member will review. The works are listed in an online spreadsheet that will be shared with CTA members. Each CTA member will choose 3 or more works to evaluate (the exact number will be confirmed by the coordinators).

Once you have selected your works, you can start the reviews. To guide this second stage, the committee defined a set of instructions with the key points of the artifact evaluation process, as set out in the definitions of the requirements for obtaining each badge.

For the work/artifact to be eligible to receive a badge, the respective requirements must be met:

Available Artifacts (SeloD)

Code and/or data are expected to be available in a stable repository (such as GitHub or GitLab). This repository is expected to contain a README.md with minimal documentation (clear enough to be understood), describing the purpose of the artifact(s) and including the title and abstract of the article.

Functional Artifacts (SeloF)

The program code of the artifact(s) is expected to be executable so that reviewers can observe its functionality. To obtain this badge, it is important that additional information is present in the repository's README.md, such as:

  1. list of dependencies;
  2. versions of the dependencies/languages/environment;
  3. description of the execution environment;
  4. installation and execution instructions;
  5. one or more execution examples.

Sustainable Artifacts (SeloS)

The program code of the artifact(s) is expected to be modularized, organized, intelligible and easy to understand. To obtain this badge:

  1. the code(s) must be documented (describing files, functions, APIs, etc.);
  2. the code(s) and other artifacts must have a minimum level of readability;
  3. reviewers must be able to identify the main claims of the article in the artifact(s).

Reproducible Experiments (SeloR)

The reviewer is expected to be able to reproduce the main claims presented in the article. To obtain this badge, the artifact must provide:

  1. instructions for executing the code(s) so as to reproduce and confirm the main claims of the article (e.g., the results behind the main graphs/tables);
  2. a description of the process of carrying out the experiments that lead to the article's results;
  3. a description of specific technical details of the environment (when applicable), such as details of the infrastructure used on Amazon Cloud or Google Cloud. If necessary, include access keys and other information that allows the experiment to be reproduced.

Submission of Reviews

For each artifact, you must produce a brief review justifying why each badge was assigned or denied. The review text should only be written after the evaluation process has been completed.

To facilitate the evaluation process, fill in the comments for each badge using the following template, which reflects the requirements of each badge.

Examples:

(case 1, badge assigned) SeloD: publicly available artifact(s) (comments)
+ available in a stable repository
+ sufficient documentation
+ title
+ abstract
I recommend awarding the badge.

(case 2, badge not assigned) SeloD: publicly available artifact(s) (comments)
+ available in a stable repository
- sufficient documentation
+ title
- abstract
I cannot recommend awarding the badge, because the artifact does not have documentation sufficient to understand and reproduce its characteristics, and because the abstract of the article was not provided.

(case 3, badge not requested) SeloD: publicly available artifact(s) (comments)
The badge was not requested for this category.

Use the (+) and (-) signs to indicate which criteria the artifact satisfies (+) or does not satisfy (-). It is also important to include a conclusion in the review, explaining the reasons for not awarding a badge when the artifact does not meet the minimum expected criteria.

Note: Write your review in a precise, impersonal and polite manner, keeping in mind that it will be made available to the authors at a later stage of the process.

Call for CTA members

For the second year, SBSeg will allow authors to submit artifacts such as software, data, supplementary documentation, raw results, proofs of concept, models, detailed evaluations, benchmarks, etc.

Members of the Artifacts Technical Committee (CTA) are expected to be graduate students, postdoctoral fellows, researchers, or professionals working in the field of cybersecurity. Reviewers are expected to dedicate an average of 5 hours per artifact (assuming all four badges have been requested by the authors).

As a member of the CTA, you are expected to help decide whether the proposed artifact is consistent with the content presented in the article and meets the respective criteria for granting the requested badges. Discussion among committee members is allowed during the evaluation process. However, no interaction with the authors is allowed, so that the process remains anonymous.

We hope to count on your help to run an excellent process, with valuable reviews of the quality of the artifacts accompanying the articles.

How to register/Nomination

If you are interested in joining the CTA, complete the registration using the form. Please note that the number of vacancies is limited.

Organizers

CTA coordinators

Tiago Heinrich (UFPR)
Diego Kreutz (UNIPAMPA)

Artifacts Technical Committee (CTA)

Under construction...

Contact

Tiago Heinrich (UFPR) theinrich@inf.ufpr.br