
BenchCouncil: International Open Benchmarking Council


Ongoing projects

  • Benchmark proposals.

  • Eight new proposals were presented at Bench 18.

    Big Data Benchmarking: Applications and Systems (Slides)

    Prof. Geoffrey Fox, Indiana University

    MLPerf: The Vision Behind an ML Benchmark Suite for Measuring the Performance of ML Software Frameworks, ML Hardware Accelerators, and ML Cloud and Edge Platforms (Slides)

    Prof. Vijay Janapa Reddi, Harvard University

    DataMotif: A Benchmark Proposal for Big Data and AI (Slides)

    Dr. Wanling Gao, ICT, CAS

    A Benchmark Proposal for Deep Learning (Slides)

    Prof. Xiaoyi Lu, The Ohio State University

    A Benchmark Proposal for Datacenter Computing (Slides)

    Dr. Chen Zheng, ICT, CAS

    PeakBench: A Benchmark Proposal for Scalable Transaction Processing (Slides)

    Prof. Weining Qian, East China Normal University

    TS-benchmark: A Benchmark Proposal for Time Series Databases (Slides)

    Prof. Yueguo Chen, Renmin University of China

    A Benchmark Proposal for Large-Scale, High-Speed Spatiotemporal Data Processing and Analytics (Slides)

    Prof. Zhiyuan Chen and Prof. Jianwu Wang, University of Maryland, Baltimore County

  • Formation of working groups.

  • Available soon

  • Publish benchmark specifications.

  • TBD

  • Open-source implementations of benchmark specifications.

  • TBD

  • Challenges and competitions.

  • To be organized at Bench 19 and the BenchCouncil 2019 annual system technology conference.

  • Archiving of performance numbers.

  • TBD


Solicit Benchmark Proposals

Benchmark proposals are solicited on any topic, including, but not limited to:

  • Artificial Intelligence

  • NewSQL and distributed database systems

  • Blockchain

  • Datacenter, cloud and warehouse-scale computing

  • High performance computing

  • Mobile Robotics

  • Edge and fog computing

  • Big Scientific Data

  • Internet of Things (IoT)

  • Education, financial, and power systems

According to BenchCouncil’s procedure, each benchmarking pipeline consists of six steps: a benchmark proposal, formation of a working group, publication of the benchmark specification, open-source implementation of the specification, challenges and competitions, and archiving of performance numbers.