
Tag Archives: source code

Working with the WebXPRT 4 source code

In our last blog post, we discussed the WebXPRT 4 source code and how you can contact us to request free access to the build package. In this post, we’ll address two questions that users sometimes ask about code access. The first question is, “How do I build a local instance of WebXPRT?” The second is, “What can I do with it?”

How to build a local WebXPRT 4 instance

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package, which contains all the necessary source code files and installation instructions. You will need a system to use as a server, and you should be familiar with Apache, PHP, and MySQL configuration to follow the build instructions. WebXPRT 4 uses a LAMP (Linux, Apache, MySQL, and PHP) setup on the server side, but it’s also possible to set up an instance with a WAMP or XAMPP stack.
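
To give you a rough idea of the kind of configuration involved, here is a generic sketch of an Apache virtual host serving a PHP application. This is only an illustration; the host name and directory paths below are placeholders, not values from the WebXPRT 4 package, and the official build instructions are what you should actually follow.

    # Hypothetical Apache virtual host for a local PHP application.
    # The server name and directory paths are placeholders; use the
    # values from the official WebXPRT 4 build instructions instead.
    <VirtualHost *:80>
        ServerName webxprt.local
        DocumentRoot /var/www/webxprt

        <Directory /var/www/webxprt>
            AllowOverride All
            Require all granted
        </Directory>
    </VirtualHost>

On a typical LAMP system, you would also need working PHP and MySQL installations, plus whatever PHP extensions the build instructions list, before the benchmark pages will work correctly.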

The build instructions include a step-by-step methodology for setup. If you are familiar with LAMP stack configuration, the build and configuration process should take about two to three hours, depending on whether your LAMP-related extensions and libraries are current.

What you can do with a local WebXPRT 4 instance

We allow users to set up their own WebXPRT 4 instances for purposes of review, internal testing, or experimentation.

One use-case example is internal OEM lab testing. Some labs use WebXPRT to conduct extensive testing on preproduction hardware, and they follow stringent security guidelines to avoid the possibility of any hardware or test information leaving the lab. Even though we have strict policies about how we handle the small amount of data that WebXPRT gathers during tests, a local WebXPRT 4 instance provides those labs with an extra layer of security for sensitive tests.

We do ask that users publish results only from tests that they run on WebXPRT.com. As we mentioned in our most recent post, benchmarking requires a product that stays consistent over time to enable valid comparisons. We allow people to download the source, but we reserve the right to control derivative works and which products can use the name “WebXPRT.” That way, when people see WebXPRT scores in tech press articles or vendor marketing materials, they can run their own tests on WebXPRT.com and be confident that they’re using the same standard for comparison.

If you have any questions about using the WebXPRT 4 source code, let us know!

Justin

Accessing the WebXPRT 4 source code

If you’re new to the XPRTs, you may not be aware that we provide free access to XPRT benchmark source code. Publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing interested parties to access and review our source code, we’re encouraging openness and honesty in the benchmarking industry. We’re also inviting constructive feedback that can help ensure that the XPRTs continue to improve and contribute to a level playing field for all the types of products they measure.

While we do offer free access to the XPRT source code, we’ve decided to offer the code upon request instead of using a permanent download link. This approach prevents bots or other malicious actors from downloading the code. It also has the benefit of allowing us to interact with users who are interested in the source code and answer any questions they may have. We’re always keen to learn more about what others are thinking about the XPRTs and the types of work they measure.

We recently received some questions about accessing the WebXPRT 4 source code, which made us realize that we needed to provide a clearer way for people to request the code. In response, we added a “Request WebXPRT 4 source code” link to the gray Helpful Info box on WebXPRT.com. Clicking the link lets you email the BenchmarkXPRT Support team directly and request the code.

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package. For those users who wish to set up a local instance of WebXPRT 4 for their own internal testbeds, the package will contain all the necessary files and installation instructions. We allow folks to set up their own instances for purposes of review, internal testing, or experimentation, but we ask that users publish only test results from the official WebXPRT 4 site.

While we offer free access to XPRT source code, our approach to derivative works differs from some traditional open-source models that encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

If you have any questions about accessing the WebXPRT 4 source code, let us know!

Justin

Accessing XPRT source code

We recently received a question from a member of the tech press about whether we would be willing to supply them with the WebXPRT 4 source code, along with instructions for setting up a local instance of the benchmark for their internal testbed. We were happy to help, and they are now able to automate WebXPRT 4 runs within their own isolated network.

If you’re a new XPRT tester, you may not be aware that we provide free access to the source code for each of the XPRT benchmarks. Publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing all interested parties to access and review our source code, we’re encouraging openness and honesty in the benchmarking industry and are inviting the kind of constructive feedback that helps to ensure that the XPRTs continue to contribute to a level playing field.

While XPRT source code is available to the public, our approach to derivative works differs from some open-source models. Traditional open-source models encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

Accessing XPRT source code is a straightforward process. The source code for CloudXPRT is freely available in our CloudXPRT GitHub repository. If you’d like to download and review the source code for WebXPRT 4 or any of the other XPRTs, or get instructions for how to build one of the benchmarks, all you need to do is contact us at benchmarkxprtsupport@principledtechnologies.com. Your feedback is valuable!

Justin

Now available: An updated CloudXPRT Preview build and source code

Today, we published an updated CloudXPRT Preview build (v0.97), along with the build’s source code. The new build fixes a few minor bugs and makes several improvements that facilitate installation, setup, and testing. The fixes do not affect CloudXPRT test results, so results from the new build are comparable to results from the original build (v0.95). You can find more detailed information about the changes in last week’s blog post.

The CloudXPRT Preview v0.97 source code is available to the public via the CloudXPRT GitHub repository. As we’ve discussed in the past, publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing all interested parties to download and review our source code, we’re encouraging openness and honesty in the benchmarking industry and are inviting the kind of constructive feedback that helps to ensure that the XPRTs continue to contribute to a level playing field.

While the CloudXPRT source code is available to the public, our approach to derivative works differs from some open-source models. Traditional open-source models encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

We encourage you to download and review the source and send us any feedback you have. Your questions and suggestions may influence future versions of CloudXPRT.

If you have any questions about CloudXPRT or the source code, please let us know!

Justin

A CloudXPRT build with bug fixes is on the way

We want to let CloudXPRT testers know that updated installer packages are on the way. The packages will include several fixes for bugs that we discovered in the initial CloudXPRT Preview release (build 0.95). The fixes do not affect CloudXPRT test results, but they do help facilitate installation and remove potential sources of confusion during the setup and testing process.

Along with a few text edits and other minor fixes, we made the following changes in the upcoming build:

  • We updated the data analytics setup code to prevent error messages that occurred when the benchmark treated one-node configurations as a special case.
  • We configured the data analytics workload to use a go.mod file for all the required Go modules. With this change, we can explicitly state the release versions of the necessary Go modules, so updates to the latest Go release won’t break the benchmark. This change also removes the need to include large gosrc.tar.gz files in the source code. (For readers who are new to Go modules, there is a brief go.mod sketch after this list.)
  • We added a cleanup utility script for the web microservices workload. If something goes wrong during configuration or a test run, testers can use this script to clean everything and start over.
  • We fixed an error that prevented the benchmark from successfully retrieving the cluster_config.json file in certain multi-node setups.
  • In the web microservices workload, we changed the output format of the request rate metric from integer to float. This change allows us to report workload data with a higher degree of precision.
  • In the web microservices workload, we added an overall summary line to the results log file that reports the best throughput numbers from the test run.
  • In the web microservices code, we modified a Kubernetes option that the benchmark used to create the Cassandra schema. Prior to this change, the option generated an inconsequential but distracting error message about TTY input.
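
For readers who are new to Go modules, here is a minimal go.mod sketch showing the general idea of pinning dependency versions. The module path and requirements below are placeholders for illustration only; they are not the actual contents of the CloudXPRT go.mod file.

    // Hypothetical go.mod for illustration only; the module path and
    // requirements are placeholders, not CloudXPRT's real dependencies.
    module example.com/cloudxprt/data-analytics

    go 1.14

    require (
        github.com/example/analytics-lib v1.2.3
        github.com/example/k8s-helper v0.4.0
    )

Because each requirement names an exact version, a change in an upstream module or a new Go release cannot silently change which code the benchmark builds against.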

We haven’t set the release date for the updated build yet, but when we do, we’ll announce it here in the blog. If you have any questions about CloudXPRT, please let us know!

Justin

Principled Technologies and the BenchmarkXPRT Development Community make the AIXPRT source code available to the public

Durham, NC, February 18 — Principled Technologies and the BenchmarkXPRT Development Community have released the source code for the AIXPRT benchmark to the public. AIXPRT is a free tool that allows users to evaluate a system’s machine learning inference performance by running common image-classification, object-detection, and recommender-system workloads.

“Publishing the AIXPRT source code is part of our commitment to making the XPRT development process as transparent as possible,” said Bill Catchings, co-founder of Principled Technologies, which administers the BenchmarkXPRT Development Community. “By allowing all interested parties to download and review our source code, we’re taking tangible steps to improve openness in the benchmarking industry.”

To access the AIXPRT source code, visit the AIXPRT GitHub repository at https://github.com/BenchmarkXPRT/AIXPRT.

AIXPRT includes support for the Intel® OpenVINO™, TensorFlow™, and NVIDIA® TensorRT™ toolkits to run image-classification and object-detection workloads with the ResNet-50 and SSD-MobileNet v1 networks, as well as the MXNet™ toolkit with a Wide and Deep recommender system workload. The test reports FP32, FP16, and INT8 levels of precision.

To access AIXPRT, visit www.AIXPRT.com.

AIXPRT is part of the BenchmarkXPRT suite of performance evaluation tools, which includes WebXPRT, CrXPRT, MobileXPRT, TouchXPRT, and HDXPRT. The XPRTs help users get the facts before they buy, use, or evaluate tech products such as computers, tablets, and phones.

To learn more about the BenchmarkXPRT Development Community, go to www.BenchmarkXPRT.com or contact a BenchmarkXPRT Development Community representative directly by sending a message to BenchmarkXPRTsupport@PrincipledTechnologies.com.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing, as well as learning and development services. It administers the BenchmarkXPRT Development Community.

Principled Technologies, Inc. is located in Durham, North Carolina, USA. For more information, please visit www.PrincipledTechnologies.com.

Company Contact
Justin Greene
BenchmarkXPRT Development Community
Principled Technologies, Inc.
1007 Slater Road, Ste. 300
Durham, NC 27704
BenchmarkXPRTsupport@PrincipledTechnologies.com
