Kevin Kempf's Blog

September 3, 2010

Using cron and shell to schedule a SQL query/script

Filed under: 11g, Linux — kkempf @ 12:55 pm

The Task

The most basic building block of Linux?

When opening an SR recently because of a memory leak on my 11i front end, the analyst asked me to “Please begin running this script on an hourly basis initially”.  The script referenced was in note 557194.1 and was, in effect, a bunch of SQL queries with the old “report style” formatting (column, prompt, etc.).  So I had to figure out: what is the best way to run a SQL script which is essentially a long report, saving the output, on an hourly basis?  In case you’re unable to look at the script I needed to run, or don’t feel like it, it looks something like this:

set lines 132
set pages 500
column module  heading "Module Name"  format a48;
column machine heading "Machine Name" format a25;
column process heading "Process ID"   format a10;
column inst_id heading "Instance ID"   format 99;
column username for a10
column sid for 9999
column sql_text for a50
prompt =======================================================
prompt JDBC Connections
select to_char(sysdate, 'dd-mon-yyyy hh24:mi') Time from dual;
prompt =======================================================
prompt =======================================================
prompt No of Instances
select inst_id instance_id, count(*) from gv$session group by inst_id;
prompt =======================================================


A couple of different ways to address this came to mind; in the end, they boiled down to the ways I knew of to schedule it, since manually running it every hour is about the dumbest thing I could think of (thanks, support!):
  • Convert it to a package, run it through the 11g EM scheduler, and write the data to a table (loses all the formatting, a lot of work for “throw away” SQL, and hard to read back without yet more work)
  • Use the EM 11g report tab to make it a report (after playing with this previously unused-to-me feature for 10 minutes, I gave up, vowing never to click that tab again)
  • Track the values via a custom metric in EM (intriguing, but there are multi-value selects in here, which are a mess in EM, and I’m not sure it would have been useful from a presentation standpoint)
  • Use shell/cron to automate it
In the end, I went with shell/cron because it involved the least work and got me the best result: I have the exact report run, every hour, with all the formatting in place.

Here’s how I did it

I created the shell script as follows:

#!/bin/sh

. ~/env/${1}_APPS       # invoke my 8.0.6 home/apps environment; call this script with the instance name (PROD in my case) as argument 1

APPSPASS=apps           # variable for the apps password

DATE_STAMP=`date +%d-%b-%y_%H:%M`   # create a date stamp to append to each output file so that I can distinguish them

# echo the execution of the script to sqlplus, redirecting the output to a file
echo "@/scratch/oracle/dba/SR/monitor_jdbc_conn.sql" | sqlplus -S apps/${APPSPASS} > /scratch/oracle/dba/SR/jdbc_${DATE_STAMP}.txt
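The date-stamp filename trick can be sanity-checked on its own before wiring up the whole script. This is a minimal sketch (the jdbc_ prefix follows the same pattern the wrapper uses):

```shell
#!/bin/sh
# Build a timestamped output filename, same pattern as the wrapper script above
DATE_STAMP=`date +%d-%b-%y_%H:%M`
OUTFILE="jdbc_${DATE_STAMP}.txt"
echo "${OUTFILE}"    # e.g. jdbc_03-Sep-10_12:55.txt
```

Because the stamp changes every minute, each hourly run lands in its own file and nothing gets overwritten.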


In the end, I think cron may be the most useful tool ever.  Type crontab -e and add an entry for this job; in this case, I decided to make it run at 30 minutes past every hour, every day:

# track JDBC connections

30 * * * * /scratch/oracle/dba/SR/ PROD > /dev/null
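One thing to note: > /dev/null only discards stdout, so anything sqlplus writes to stderr will, on most systems, get mailed to the crontab owner. If you’d rather collect errors in a file, a variant like this works (the log path and script name here are just placeholders, not what I actually used):

```shell
# track JDBC connections, appending any errors to a log instead of cron mail
30 * * * * /scratch/oracle/dba/SR/your_wrapper.sh PROD >> /scratch/oracle/dba/SR/jdbc_cron.log 2>&1
```

The five leading fields are minute, hour, day of month, month, and day of week; an asterisk means “every”.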
