Last month, we provided an update on the CloudXPRT development process and more information about the three workloads that we're including in the first build. We'd initially hoped to release the build at the end of April, but several technical challenges have caused us to push the timeline out a bit. We believe we're very close to being ready, and we look forward to posting a release announcement soon.
In the meantime, we'd like to hear your thoughts about the CloudXPRT results publication process. Traditionally, we've published XPRT results on our site on a rolling basis. When we complete our own tests, receive results submissions from other testers, or see results published in the tech media, we authenticate them and add them to our site. This lets testers make their results public on their own timetable, as frequently as they want.
Some major benchmark organizations use a different approach and create a schedule of periodic submission deadlines. After each deadline passes, they review the batch of submissions they've received and publish all of them together on a single later date. In some cases, they release results only two or three times per year. This process offers a high level of predictability. However, it can pose significant scheduling obstacles for some testers, such as tech journalists who want to publish their results in an upcoming device review and need official results to back up their claims.
We’d like to hear what you think about the different approaches to results submission and publication that you’ve encountered. Are there aspects of the XPRT approach that you like? Are there things we should change? Should we consider periodic results submission deadlines and publication dates for CloudXPRT? Let us know what you think!