How much will these developments benefit OLTP workloads? Through empirical studies on databases sized comparably to those seen in the real world, this paper presents the characteristics of an industry-standard OLTP benchmark as the memory buffer size changes. We design the experiments to investigate how the database size, the buffer size and the number of CPUs impact performance, in particular the throughput and the buffer hit rate, on Symmetric Multiprocessor systems. The relationships of these major database attributes are plotted and key observations are summarized. We discuss how these relationships change as the number of CPUs changes. We further quantify the relationships: 1) between database buffer data hit rate, buffer size and database size, 2) between throughput, buffer data hit rate and database size, and 3) between throughput and number of CPUs. Algorithms, rules of thumb and examples are presented for predicting performance, sizing memory and making trade-offs between adding more memory and increasing the number of CPUs.</abstract></paper><paper><title>Database performance in the real world: TPC-D and SAP R/3</title><author><AuthorName>Jochen Doppelhammer</AuthorName><institute><InstituteName>Universit&#228;t Passau, Fakult&#228;t f&#252;r Mathematik und Informatik, D-94030 Passau, Germany</InstituteName><country></country></institute></author><author><AuthorName>Thomas H&#246;ppler</AuthorName><institute><InstituteName>Universit&#228;t Passau, Fakult&#228;t f&#252;r Mathematik und Informatik, D-94030 Passau, Germany</InstituteName><country></country></institute></author><author><AuthorName>Alfons Kemper</AuthorName><institute><InstituteName>Universit&#228;t Passau, Fakult&#228;t f&#252;r Mathematik und Informatik, D-94030 Passau, Germany</InstituteName><country></country></institute></author><author><AuthorName>Donald Kossmann</AuthorName><institute><InstituteName>Universit&#228;t Passau, Fakult&#228;t f&#252;r Mathematik und Informatik, D-94030 Passau, 
Germany</InstituteName><country></country></institute></author><year>1997</year><conference>International Conference on Management of Data</conference><citation><name>Dina Bitton, David J. DeWitt, Carolyn Turbyfill, Benchmarking Database Systems: A Systematic Approach, Proceedings of the 9th International Conference on Very Large Data Bases, p.8-19, October 31-November 02, 1983</name><name>R&#252;diger Buck-Emden, J&#252;rgen Galimow, SAP R/3 System: A Client/Server Technology, Addison-Wesley Longman Publishing Co., Inc., Boston, MA, 1996</name><name>George Colliat, OLAP, relational, and multidimensional database systems, ACM SIGMOD Record, v.25 n.3, p.64-69, Sept. 1996</name><name>J. Doppelhammer, T. H&#246;ppler, A. Kemper, and D. Kossmann. Database performance of SAP System R/3. http://www.db.fmi.unipassau.de/projects/SAP.</name><name>Phil Fernandez, Donovan Schneider, The ins and outs (and everything in between) of data warehousing, Proceedings of the 1996 ACM SIGMOD International Conference on Management of Data, p.541, June 04-06, 1996, Montreal, Quebec, Canada</name><name>Jim Gray, Adam Bosworth, Andrew Layman, Hamid Pirahesh, Data Cube: A Relational Aggregation Operator Generalizing Group-By, Cross-Tab, and Sub-Total, Proceedings of the Twelfth International Conference on Data Engineering, p.152-159, February 26-March 01, 1996</name><name>Jim Gray, Benchmark Handbook: For Database and Transaction Processing Systems, Morgan Kaufmann Publishers Inc., San Francisco, CA, 1992</name><name>B. Lober and U. Marquard. Standard Application Benchmarks. SAP AG, 69185 Walldorf, Germany, Dec. 1995.</name><name>B. Matzke. ABAP/4 - Die Programmiersprache des SAP-Systems R/3. Addison-Wesley, Reading, MA, USA, 1996.</name><name>SAP AG. R/3 system overview. http://www.sap.com/r3/r3_over.htm.</name><name>Transaction Processing Performance Council (TPC). TPC Benchmark D (decision support). Standard Specification 1.0, May 1995.</name><name>L. Will, C. Hienger, F. Stra&#223;enburg, and R. Himmer. 
R/3-Administration. Addison-Wesley, Reading, MA, USA, 1996.</name></citation><abstract>Traditionally, database systems have been evaluated in isolation on the basis of standardized benchmarks (e.g., Wisconsin, TPC-C, TPC-D). We argue that very often such a performance analysis does not reflect the actual use of the DBMSs in the &ldquo;real world.&rdquo; End users typically don't access a stand-alone database system; rather, they use a comprehensive application system, in which the database system constitutes an integrated component. In order to derive performance evaluations of practical relevance to the end users, the application system, including the database system, has to be benchmarked. In this paper, we present TPC-D benchmark results carried out using the