Saturday, March 31, 2012

R and visualizing your execution times

Well, I think I'm a little late to the party..   I know Greg Rahn did a great post on utilizing R to visualize your ASH data.  I figured I would do a simple example of building something myself, to show how easy it is to use R to visualize query execution times..

Well, first I started by downloading R from cran.r-project.org.

Once I had R downloaded, I went to one of my databases and found a query with varying execution times that I wanted to play with.  I created an output file from the query.. Here is the script I used, which pulls the average execution time per day (by plan) for that sql_id out of AWR..


set pagesize 10000
set feedback off
spool rtest.txt

select trunc((elapsed_time_delta/executions_delta)/1000000,4) "AVG_EXECUTION_TIME",
       PLAN_HASH_VALUE "PLAN_HASH_VALUE",
       execution_date "EXECUTION_DATE"
from
(
select sum(elapsed_time_delta) elapsed_time_delta,
       sum(executions_delta) executions_delta,
       PLAN_HASH_VALUE,
       to_char(trunc(end_interval_time),'mm/dd/yy') execution_date
from dba_hist_sqlstat a,
     dba_hist_snapshot b
where sql_id='19sqmxkc58wqm'
and a.snap_id=b.snap_id
and a.instance_number=b.instance_number
group by plan_hash_value,to_char(trunc(end_interval_time),'mm/dd/yy')
)
where executions_delta > 0
order by execution_date;
spool off


This script created a file that I brought over to my PC, where I cleaned up the formatting. Here is part of the file..

AVG_EXECUTION_TIME PLAN_HASH_VALUE execution_date                                     
           20.4368       566875892 01/01/12                                     
           50.3253      4009342004 01/01/12                                     
           21.4655       566875892 01/02/12                                     
           19.8312      4009342004 01/02/12                                     
           69.9299      4009342004 01/03/12                                     
          135.7153      4009342004 01/04/12                                     
           39.3972      4009342004 01/05/12                                     
           65.2833      4009342004 01/06/12                                     
           39.8093      4009342004 01/07/12                                     
           35.8615      4009342004 01/08/12                                     
           18.7553       566875892 01/09/12                                     
          134.7431      4009342004 01/09/12                                     
           76.2954      4009342004 01/10/12                                     
          115.8707      4009342004 01/11/12                                     
           60.0754      4009342004 01/12/12                                     
          102.6432      4009342004 01/13/12                                     
           22.2528       566875892 01/14/12                                     
          119.8541      4009342004 01/14/12                                     
           21.8552       566875892 01/15/12                                     
           18.5785      4009342004 01/15/12                                     
           19.3179       566875892 01/16/12                                     
            80.794      4009342004 01/16/12                                     
           67.0872      4009342004 01/17/12                                     
          107.1604      4009342004 01/18/12                                     
           28.9797      4009342004 01/19/12                                     
          


I put this file into c:\r and named it query_performance.txt.

I then went into R and ran the following commands.


setwd("c:\\r")
query_data <- read.table("query_performance.txt",header=T)


max_num <- max(query_data$AVG_EXECUTION_TIME)

hist(query_data$AVG_EXECUTION_TIME,col=heat.colors(max_num),breaks=max_num,xlim=c(0,max_num),
     right=F,main="Execution Time Histogram",las=1)

You can see I just ran a few simple commands...

setwd --- sets the working directory to c:\r
read.table --- reads in my space-delimited table (there is a read.csv for comma-separated files)
max_num --- is set to the maximum average execution time in the file
hist --- creates a histogram of the execution times.. Check out below what comes out. Sweet !!
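
If you want to save the chart to a file instead of just viewing it on screen, base R can write it straight to a PNG. Here is a minimal sketch (the filename and dimensions are just examples I picked):

png("execution_times.png", width=800, height=600)   # open a PNG graphics device
hist(query_data$AVG_EXECUTION_TIME, col=heat.colors(max_num), breaks=max_num,
     xlim=c(0,max_num), right=F, main="Execution Time Histogram", las=1)
dev.off()                                           # close the device and write the file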


This was easy, and gives me a great picture of the variance in execution times. 

I am going to work more with this file, since it has 2 different plans and I want to visualize the differences between them; a first stab at that is sketched below.
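
Here is a minimal sketch of where I'm headed, assuming the same query_data data frame as above: turn the date strings into real dates, treat the plan hash value as a category, then draw one box per plan plus a day-by-day scatter.

# make the date column a real Date so it sorts and plots chronologically
query_data$execution_date <- as.Date(query_data$execution_date, format="%m/%d/%y")

# treat the plan hash as a category rather than a huge number
query_data$PLAN_HASH_VALUE <- factor(query_data$PLAN_HASH_VALUE)

# one box per plan shows how differently the two plans behave
boxplot(AVG_EXECUTION_TIME ~ PLAN_HASH_VALUE, data=query_data,
        xlab="Plan hash value", ylab="Avg execution time (seconds)",
        main="Execution Time by Plan")

# the day-by-day picture, one color per plan
plot(query_data$execution_date, query_data$AVG_EXECUTION_TIME,
     col=as.integer(query_data$PLAN_HASH_VALUE), pch=19,
     xlab="Date", ylab="Avg execution time (seconds)")

The formula interface to boxplot does the split by plan for you, so there is no reshaping needed.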

Thursday, March 22, 2012

Cooking for Geeks (shameless plug)

I am part of the O'Reilly Blogger Review Program, and long, long ago I picked up the book "Cooking for Geeks" by Jeff Potter for review.

This was a fantastic book.. I am both a geek and a cook, and this book does a great job of tying the two together.  It not only talks about spices and ingredients, but explains in geekspeak why they work together.

I thoroughly enjoyed this book, and I would recommend it to anyone interested in cooking, or at all interested in the chemistry and biology behind how and why we enjoy food.

Here is a link to the book.
http://www.amazon.com/Cooking-Geeks-Science-Great-Hacks/dp/0596805888

Sunday, March 18, 2012

Benchmarks for CPUs

I have been benchmarking LIOs (logical I/Os) on a couple of different systems, using Kevin Closson's great SLOB toolkit.  You can find more information about it on his blog here.

I have been looking at 2 different systems, both HP, and here are my results.

The first is an AMD Opteron 6276 server: 2 sockets x 16 cores (465 G7).

./runit.sh 0 40
That is 0 writer sessions and 40 reader sessions. The AWR report is posted.

Here is the summary of throughput.
Load Profile              Per Second    Per Transaction   Per Exec   Per Call
~~~~~~~~~~~~         ---------------    --------------- ---------- ----------
      DB Time(s):               37.9               46.6       0.00       5.96
       DB CPU(s):               30.3               37.2       0.00       4.76
       Redo size:           15,791.3           19,408.2
   Logical reads:       10,119,426.4       12,437,215.0
   Block changes:               83.4              102.5
  Physical reads:                0.4                0.6
 Physical writes:               11.5               14.1
      User calls:                6.4                7.8
          Parses:                3.0                3.7
     Hard parses:                0.1                0.1
W/A MB processed:                0.2                0.2
          Logons:                0.1                0.1
        Executes:           39,333.0           48,342.0
       Rollbacks:                0.0                0.0

I then looked at the new Intel Xeon E7-2870 I got: 2 sockets x 10 cores, hyper-threaded (BL620 E7).

./runit.sh 0 43

The AWR report is here.
Load Profile              Per Second    Per Transaction   Per Exec   Per Call
~~~~~~~~~~~~         ---------------    --------------- ---------- ----------
      DB Time(s):               40.9              108.6       0.00       6.93
       DB CPU(s):               37.8              100.4       0.00       6.41
       Redo size:           10,053.3           26,674.3
   Logical reads:       13,203,419.8       35,032,516.8
   Block changes:               36.9               97.9
  Physical reads:                0.0                0.0
 Physical writes:                9.6               25.4
      User calls:                5.9               15.7
          Parses:                4.0               10.7
     Hard parses:                0.0                0.0
W/A MB processed:                0.2                0.6
          Logons:                0.3                0.7
        Executes:           51,300.2          136,114.4
       Rollbacks:                0.0                0.1
    Transactions:                0.4


Look at that throughput.. the 43-reader run gives the best result, at over 13 million LIOs/second.

WOW, the new AMD Bulldozer has great numbers, but the Intel really rocks !
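
To put the two runs side by side, here is a quick back-of-the-envelope in R, using the Logical reads figures from the two load profiles above. The per-core math is my own, based on the core counts listed for each box:

# logical reads per second, straight from the two AWR load profiles above
amd_lio   <- 10119426.4    # AMD Opteron 6276 run (2 sockets x 16 cores)
intel_lio <- 13203419.8    # Intel E7-2870 run (2 sockets x 10 cores)

amd_cores   <- 32          # physical cores on the AMD box
intel_cores <- 20          # physical cores on the Intel box

round(amd_lio / amd_cores)       # ~316,232 LIOs/second per AMD core
round(intel_lio / intel_cores)   # ~660,171 LIOs/second per Intel core

Each E7 core is pushing roughly twice the LIOs of a Bulldozer core, which is exactly why the Intel numbers look so good despite having 12 fewer cores.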