Carsten Sinz ⋅ Nina Amla ⋅ João Marques Silva ⋅ Emmanuel Zarpas ⋅ Daniel Le Berre ⋅ Laurent Simon
What is SAT-Race?
- A "small" SAT-Competition
- Only industrial-category benchmarks (no handcrafted or random instances)
- Short run-times (15-minute timeout per instance)
- Mixture of satisfiable and unsatisfiable instances (thus not suitable for local-search solvers)
- "Black-box" solvers permitted
Organizers
Chair: Carsten Sinz (J. Kepler University Linz, Austria)
Advisory Panel: Nina Amla (Cadence Design Systems, USA), João Marques Silva (University of Southampton, UK), Emmanuel Zarpas (IBM Haifa Research Lab, Israel)
Technical Consultants: Daniel Le Berre (Université d'Artois, France), Laurent Simon (Université Paris-Sud, France)
Solvers
- Received 29 solvers by 23 submitters from 13 nations
- Europe: 16 solvers, North America: 10, Asia/Australia: 2, Middle East: 1
- 3 industrial solvers, 25 academic, 1 private/amateur
Solvers / submitters per nation (X / Y: X solvers, Y submitters):
USA 7/6, Austria 4/1, France 3/3, Canada 3/3, Germany 3/2, Netherlands 2/1, Australia 1/1, Israel 1/1, Japan 1/1, Northern Ireland 1/1, Portugal 1/1, Spain 1/1, Sweden 1/1
Qualification
- Two qualification rounds, each consisting of 50 benchmark instances
- Increased runtime threshold of 20 minutes
- Successful participation in at least one round required to participate in SAT-Race
- Instances published on the Web in advance, to ascertain solver correctness and efficiency
- 1st round took place after May 17, 2nd round after June 16
Results of the Qualification Rounds
Qualification Round 1:
- 15 participating solvers
- 6 solvers already qualified for SAT-Race (by solving more than 40 out of 50 instances): Eureka, Rsat, Barcelogic, Actin (minisat+i), Tinisat, zChaff
Qualification Round 2:
- 17 participating solvers
- 13 solvers qualified (3 of them already qualified by QR1): Actin (minisat+i), MiniSAT 2.0β, picosat, Cadence-MiniSAT, Rsat, qpicosat, Tinisat, sat4j, qcompsat, compsat, mxc, mucsat, Hsat
Overall result: 16 (out of 29) solvers qualified [9 solvers retracted, 4 showed insufficient performance]
Qualified Solvers

Solver             Author                   Affiliation
MXC v.1            David Mitchell           SFU
Eureka             Alexander Nadel          Intel
zChaff 2006        Zhaohui Fu               Princeton
TINISAT            Jinbo Huang              NICTA
SAT4J              Daniel Le Berre          CRIL-CNRS
Rsat               Thammanit Pipatsrisawat  UCLA
QPicoSAT           Armin Biere              JKU Linz
QCompSAT           Armin Biere              JKU Linz
PicoSAT            Armin Biere              JKU Linz
Mucsat             Nicolas Rachinsky        LMU Munich
MiniSAT 2.0        Niklas Sörensson         Chalmers
HyperSAT           Domagoj Babic            UBC
CompSAT            Armin Biere              JKU Linz
Cadence MiniSAT    Niklas Een               Cadence Design Systems
Barcelogic         Robert Nieuwenhuis       TU Catalonia, Barcelona
Actin (minisat+i)  Raihan Kibria            TU Darmstadt
Benchmark Instances
- 20 instances from bounded model checking (IBM's 2002 and 2004 benchmark suites)
- 40 instances from pipelined machine verification (20 from Velev's benchmark suite, 20 from Manolios' benchmark suite)
- 10 instances from cryptanalysis (collision-finding attacks on reduced-round MD5 and SHA-0; Mironov & Zhang)
- 30 instances from former SAT-Competitions (industrial category)
- Up to 889,302 variables and 14,582,074 clauses
Benchmark Selection
- Instances selected at random from the benchmark pool
- "Random" numbers contributed by Armin Biere (95), João Marques-Silva (41), and Nina Amla (13); random seed = sum of these numbers
- Inappropriate instances filtered out:
  - too easy: all solvers finish in < 60 sec, or one solver in < 1 sec
  - too hard: not handled by any solver
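The selection procedure above can be sketched in a few lines. This is only an illustration, not the actual SAT-Race tooling: the function name, the data layout, and the idea of shuffling the pool are assumptions; only the seed construction and the too-easy/too-hard filter rules follow the text.

```python
import random

# Illustrative sketch of the benchmark-selection procedure: draw instances
# from the pool in a seeded random order, skipping "inappropriate" ones.
# Data layout and function name are hypothetical.

SEED = 95 + 41 + 13  # sum of the three contributed "random" numbers

def select_instances(pool, k, runtimes):
    """Pick up to k usable instances.

    runtimes: {instance: {solver: runtime in seconds, or None if unsolved}}
    """
    order = list(pool)
    random.Random(SEED).shuffle(order)  # seeded, hence reproducible
    chosen = []
    for inst in order:
        times = runtimes[inst]
        solved = [t for t in times.values() if t is not None]
        too_hard = not solved  # not handled by any solver
        # too easy: all solvers finish in < 60 sec, or one solver in < 1 sec
        too_easy = (len(solved) == len(times) and max(solved) < 60) \
            or any(t < 1 for t in solved)
        if not (too_hard or too_easy):
            chosen.append(inst)
        if len(chosen) == k:
            break
    return chosen
```

For example, an instance solved by every solver in under a minute is dropped as too easy, while one that no solver handles is dropped as too hard.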
Scoring
1. Solution points: 1 point for each instance solved in ≤ 900 seconds
2. Speed points:
   p_max = x / #successful_solvers
   p_s = p_max ⋅ (1 − t_s / T)
   with x set to the maximal value such that p_s ≤ 1 for all solvers and instances
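The speed-point formula can be sketched as follows. This is a minimal illustration of the scheme as stated above, not the actual evaluation code; the function names and data layout are my own, and T is taken to be the 900-second timeout.

```python
# Sketch of the SAT-Race speed-point scheme: p_s = p_max * (1 - t_s / T),
# with p_max = x / #successful_solvers and x maximal such that p_s <= 1
# for every solver on every instance. Names are illustrative only.

T = 900.0  # timeout in seconds (15 minutes)

def max_x(instances):
    """Largest x with p_s <= 1 everywhere.

    instances: list of {solver: runtime} dicts, one per benchmark,
    containing only the solvers that solved it within T.
    """
    # p_s = (x / n) * (1 - t/T) <= 1  implies  x <= n / (1 - t/T)
    bounds = [len(times) / (1.0 - t / T)
              for times in instances
              for t in times.values() if t < T]
    return min(bounds)

def speed_points(times, x):
    """Speed points for one instance, per successful solver."""
    p_max = x / len(times)  # p_max = x / #successful_solvers
    return {s: p_max * (1.0 - t / T) for s, t in times.items()}
```

For instance, if two solvers finish one benchmark at 0 s and 450 s, the tightest bound gives x = 2, so the faster solver earns 1.0 speed points and the slower one 0.5 (each also earns 1 solution point).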
Computing Environment
- Linux cluster at Johannes Kepler University Linz
- 15 compute nodes: Pentium 4 @ 3 GHz, 2 GB main memory
- 16.6 days of CPU time for SAT-Race (plus 16.6 days for the qualification rounds)
Results

Winners
1. MiniSAT 2.0 by Niklas Sörensson (82.71 points)
2. Eureka by Alexander Nadel (80.87 points)
3. Rsat by Thammanit Pipatsrisawat (80.45 points)
Next best solver: 69.39 points
Best Student Solvers
1. MXC by David Bregman (31.23 points)
2. Mucsat by Nicolas Rachinsky (30.09 points)
(Developed by undergraduate / master students)
Complete Ranking

Rank  Solver             Author                   Affiliation              #solved  Speed Points  Total Score
  1   MiniSAT 2.0        Niklas Sörensson         Chalmers                   73        9.71         82.71
  2   Eureka             Alexander Nadel          Intel                      67       13.87         80.87
  3   Rsat               Thammanit Pipatsrisawat  UCLA                       72        8.45         80.45
  4   Cadence MiniSAT    Niklas Een               Cadence Design Systems     63        6.39         69.39
  5   Actin (minisat+i)  Raihan Kibria            TU Darmstadt               63        6.29         69.29
  6   Barcelogic         Robert Nieuwenhuis       TU Catalonia, Barcelona    59        5.98         64.98
  7   PicoSAT            Armin Biere              JKU Linz                   57        5.00         62.00
  8   QPicoSAT           Armin Biere              JKU Linz                   54        5.39         59.39
  9   TINISAT            Jinbo Huang              NICTA                      54        4.91         58.91
 10   SAT4J              Daniel Le Berre          CRIL-CNRS                  49        4.20         53.20
 11   QCompSAT           Armin Biere              JKU Linz                   39        3.22         42.22
 12   zChaff 2006        Zhaohui Fu               Princeton                  38        3.78         41.78
 13   CompSAT            Armin Biere              JKU Linz                   38        3.21         41.21
 14   MXC v.1            David Mitchell           SFU                        29        2.23         31.23
 15   Mucsat             Nicolas Rachinsky        LMU Munich                 28        2.09         30.09
 16   HyperSAT           Domagoj Babic            UBC                        27        2.99         29.99
Runtime Comparison
[Plot: runtime vs. number of solved instances]
Conclusion
Any progress by SAT-Race?
- The SAT-Race 2006 winner cannot solve more instances than the SAT-Competition 2005 winner
- Nine solvers perform better than the winner of SAT-Competition 2004
- New ideas for implementation and optimizations (a combination of Rsat with the SatELite preprocessor can solve 2 more instances than the best SAT-Race solver within the given time limit)
- Many new solvers (but mostly slight variants of existing solvers)