
SPEC Organizational Information


SPEC's Background

The System Performance Evaluation Cooperative, now named the Standard Performance Evaluation Corporation (SPEC), was founded in 1988 by a small number of workstation vendors who realized that the marketplace was in desperate need of realistic, standardized performance tests. The key realization was that an ounce of honest data was worth more than a pound of marketing hype.

SPEC has grown to become one of the more successful performance standardization bodies with more than 60 member companies. SPEC publishes several hundred different performance results each quarter spanning a variety of system performance disciplines.


SPEC's Philosophy

The goal of SPEC is to ensure that the marketplace has a fair and useful set of metrics to differentiate candidate systems. The path chosen attempts to balance strict compliance requirements with allowing vendors to demonstrate their advantages. The belief is that a good test that is reasonable to use will lead to a greater availability of results in the marketplace.

The basic SPEC methodology is to provide the benchmarker with a standardized suite of source code, based upon existing applications, that has already been ported to a wide variety of platforms by its membership. The benchmarker takes this source code, compiles it for the system in question, and can then tune the system for the best results. The use of already accepted and ported source code greatly reduces the problem of making apples-to-oranges comparisons.
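As an illustration of this compile-and-tune workflow, the SPEC CPU suites drive each run from a user-edited configuration file that records the compiler and optimization choices; a minimal sketch in the style of a SPEC CPU2006 config file follows (the specific flags and file name are hypothetical examples, not recommended settings):

```text
# mytest.cfg -- illustrative SPEC CPU2006-style config fragment
# (hypothetical values; real runs require a licensed SPEC kit).
# The harness is then invoked with something like:
#   runspec --config=mytest.cfg --tune=base int
default:
   CC       = gcc            # compiler used to build the provided sources
   CXX      = g++
   FC       = gfortran
   OPTIMIZE = -O3            # "tuning" happens largely via flags like these
```

Because every vendor builds from the same accepted sources and discloses its config file, results remain comparable even when tuning differs.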


SPEC's Structure

SPEC is a non-profit corporation whose membership is open to any company or organization that is willing to support our goals (and pay our nominal dues). Originally just a bunch of people from workstation vendors devising CPU metrics, SPEC has evolved into an umbrella organization encompassing four diverse groups.

The Open Systems Group (OSG)

The OSG is the original SPEC committee. This group focuses on benchmarks for desktop systems, high-end workstations and servers running open systems environments.

OSG Subcommittees:

CPU: the people who brought you SPECmarks and the other CPU benchmarks (SPECint, SPECfp, SPECrate, etc.).
Java: the people who brought you the Java client- and server-side benchmarks JVM98, JVM2008, JBB2000, and JBB2005, and the jAppServer Java Enterprise Application Server benchmarks.
Mail: the people who brought you SPECmail2001, the consumer Internet Service Provider (ISP) mail server benchmark.
Power: this committee developed SPECpower_ssj2008, the SPEC benchmark for evaluating the energy efficiency of server-class computers. They are also developing the Server Efficiency Rating Tool (SERT).
SIP: this committee has begun development of the first-generation SPEC benchmark for comparing the performance of servers using the Session Initiation Protocol (SIP).
SFS: the people who brought you the file server benchmarks SFS93 (LADDIS), SFS97, SFS97_R1, and SFS2008.
Virtualization: this committee has begun development of the first-generation SPEC benchmark for comparing virtualization performance for data center servers.
Web: the people who brought you WEB96, WEB99, WEB99_SSL, and WEB2005, the web server benchmarks.

The High-Performance Group (HPG)

The HPG is a forum for establishing, maintaining and endorsing a suite of benchmarks that represent high-performance computing applications for standardized, cross-platform performance evaluation.

These benchmarks target high performance system architectures, such as symmetric multiprocessor systems, workstation clusters, distributed memory parallel systems, and traditional vector and vector parallel supercomputers.

The Graphics and Workstation Performance Group (GWPG)

SPEC/GWPG is the umbrella organization for project groups that develop consistent and repeatable graphics and workstation performance benchmarks and reporting procedures. SPEC/GWPG benchmarks are worldwide standards for evaluating performance in a way that reflects user experiences with popular applications.

GWPG Project Groups

The Application Performance Characterization (SPECapc℠) group was formed in 1997 to provide a broad-ranging set of standardized benchmarks for graphics and workstation applications. The group's current benchmarks span popular CAD/CAM, digital content creation, and visualization applications.
The Graphics Performance Characterization (SPECgpc℠) group, begun in 1993, establishes performance benchmarks for graphics systems running under OpenGL and other application programming interfaces (APIs). The group's SPECviewperf® benchmark is the most popular standardized software for evaluating performance based on popular graphics applications.

The Research Group (RG)

The RG is a new group within SPEC created to promote innovative research on benchmarking methodologies and tools, facilitating the development of benchmark suites and performance analysis frameworks for established and newly emerging technologies. It is designed to encourage exchange among representatives from academia, industry, and research institutes. The scope of its research efforts includes techniques and tools for performance measurement, load testing, profiling, workload characterization, and dependability and efficiency evaluation of computing systems. While the focus is on performance, other extra-functional system properties such as scalability, availability, cost, and energy efficiency are considered as well.

A major component of the RG is the development of standard scenarios and workloads—called research benchmarks—for emerging technologies and applications. Benchmarks from the research group are intended primarily for in-depth analysis and evaluation of early prototypes and research results. This differentiates them from conventional benchmarks used for direct comparison and marketing of existing products.

Other planned activities of the RG include publishing a newsletter and journal, establishing a portal for benchmarking-related resources, recognizing outstanding contributions to benchmarking research, and organizing conferences and workshops.


Frequently Asked Questions

If you are still curious, perhaps we have the answers in the SPEC FAQ.


In Memoriam

Kaivalya Dixit

Kaivalya Dixit, long-time SPEC president, passed away on 22 November 2004. Kaivalya touched many across the computer performance community and will be missed by all. SPEC has established a page where visitors may read remembrances of Kaivalya or share their own.

Tom Skornia

Tom Skornia, legal counsel to SPEC for much of its history, succumbed to cancer on Wednesday, April 27, 2005. Tom was not necessarily a public person, but behind the scenes his bright mind and hard work on behalf of many companies and consortia in the industry were a key factor in our successes.

Larry Gray

Larry Gray, longtime SPEC treasurer, passed away on January 30, 2011. Larry was involved with SPEC from its inception, and in addition to serving as treasurer he participated in the development of many of SPEC's benchmarks.
