Earlier this month, we discussed the possibility of using a periodic results submission process for CloudXPRT instead of the traditional rolling publication process that we’ve used for the other XPRTs. We’ve received some positive responses to the idea, and while we’re still working out some details, we’re ready to share the general framework of the process we’re planning to use.
- We will establish a results review group, which only official BenchmarkXPRT Development Community members can join.
- We will update the CloudXPRT database with new results once a month, on a pre-published schedule.
- Two weeks before each publication date, we will stop accepting submissions for consideration for that review cycle.
- One week before each publication date, we will send an email to the results review group that includes the details of that month’s submissions for review.
- The results review group will serve as a sanity check and a forum for comments on the month’s submissions, but we reserve the right of final approval for publication.
- We will not prevent testers from publishing results outside of the monthly review cadence, but we will not automatically add those results to the results database.
- We may add externally published results to our database, but will do so only after vetting, and only on the designated day each month.
Our goal is to strike a balance between allowing the tech press, vendors, and other testers to publish CloudXPRT results on their own schedules and building a curated results database that OEMs and other parties can use to compete for the best results.
We’ll share more details about the review group, submission dates, and publication dates soon. Do you have questions or comments about the new process? Let us know what you think!