Benchmarks
Benchmark Submission
We invite and encourage submissions of benchmarks and benchmark generators for all tracks of the competition.
Each team participating in the Main Track is required to submit 20 new benchmark instances that have not been seen in previous competitions. At least 10 of these benchmarks should be neither too easy (solvable by MiniSat within a minute) nor too hard (unsolvable by the participant's own solver within one hour).
A benchmark description document of 1-2 pages, following the IEEE Proceedings style and describing the source and generation of the instances, must be submitted with each benchmark family. The benchmark descriptions will be made available in the same way as the solver descriptions (see general rules).

Benchmark Format
The benchmark file can start with comment lines, i.e. lines beginning with the character c. After the comments comes the line p cnf nbvar nbclauses, which indicates that the instance is in CNF format, where nbvar is the exact number of variables appearing in the file and nbclauses is the exact number of clauses contained in the file. The clauses follow. Each clause is a sequence of distinct non-null numbers between -nbvar and nbvar, ending with 0 on the same line. Positive numbers denote the corresponding variables; negative numbers denote their negations.

Benchmark Sanitization
We run a sanitization process on the submitted benchmarks to create a clean, standardized set for post-competition use. This process removes tautological clauses and duplicate literals within clauses. However, solvers and checkers must be able to handle these cases, as they may appear in real-world applications. We will test the solvers and disqualify any that fail to solve unsanitized benchmarks correctly.
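The format and sanitization rules above can be sketched as follows. This is an illustrative Python sketch, not the competition's actual tooling: it assumes sanitization means removing duplicate literals within a clause and dropping tautological clauses (those containing both a variable and its negation), as described above. The function names parse_dimacs and sanitize are hypothetical.

```python
def parse_dimacs(text):
    """Parse a DIMACS CNF string into a list of clauses (lists of ints)."""
    clauses, current = [], []
    for line in text.splitlines():
        line = line.strip()
        # Skip blank lines, comment lines (starting with c), and the problem line.
        if not line or line.startswith("c") or line.startswith("p"):
            continue
        for tok in line.split():
            lit = int(tok)
            if lit == 0:          # 0 terminates the current clause
                clauses.append(current)
                current = []
            else:
                current.append(lit)
    return clauses

def sanitize(clauses):
    """Remove duplicate literals within clauses; drop tautological clauses."""
    result = []
    for clause in clauses:
        seen = []
        for lit in clause:
            if lit not in seen:   # drop duplicate literals
                seen.append(lit)
        if any(-lit in seen for lit in seen):
            continue              # tautology: contains both v and -v
        result.append(seen)
    return result

example = """c toy instance with a duplicate literal and a tautology
p cnf 3 3
1 2 2 0
1 -1 3 0
-3 2 0
"""
print(sanitize(parse_dimacs(example)))  # → [[1, 2], [-3, 2]]
```

As the rules note, a submitted solver must handle the unsanitized input directly; this kind of cleanup is only applied when building the standardized post-competition benchmark set.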
Benchmark Selection
The benchmark compilation script is publicly available to ensure transparency.