%0 Journal Article
%J SIGMOD Rec.
%D 2011
%T Repeatability and workability evaluation of SIGMOD 2011
%A Bonnet, Philippe
%A Manegold, Stefan
%A Bjørling, Matias
%A Cao, Wei
%A Gonzalez, Javier
%A Granados, Joel
%A Hall, Nancy
%A Idreos, Stratos
%A Ivanova, Milena
%A Johnson, Ryan
%A Koop, David
%A Kraska, Tim
%A Müller, René
%A Olteanu, Dan
%A Papotti, Paolo
%A Reilly, Christine
%A Tsirogiannis, Dimitris
%A Yu, Cong
%A Freire, Juliana
%A Shasha, Dennis
%C New York, NY, USA
%I ACM
%P 45–48
%R 10.1145/2034863.2034873
%U http://doi.acm.org/10.1145/2034863.2034873
%V 40
%X SIGMOD has offered, since 2008, to verify the experiments published in papers accepted at the conference. This year, we were in charge of reproducing the experiments provided by the authors (repeatability) and of exploring changes to experiment parameters (workability). In this paper, we assess the SIGMOD repeatability process in terms of participation, review process, and results. While participation is stable in terms of the number of submissions, this year we find a sharp contrast between the high participation from Asian authors and the low participation from American authors. We also find that most experiments are distributed as Linux packages accompanied by instructions on how to set up and run the experiments. We are still far from the vision of executable papers.