a) Apply Set Processing wherever it can be applied:
· Use GROUP BY, HAVING, and EXISTS clauses effectively.
· Take extra care while writing subqueries and complex joins.
· Don't forget to join on PROCESS_INSTANCE if you have made it a key.
· Fine-tune the SQL (refer to the Oracle documentation).
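As a sketch of what set processing means in practice, a row-by-row Do Select loop that issues one UPDATE per row can often be collapsed into a single set-based statement. The table and field names below are invented for the example:

```sql
-- Row-by-row: a Do Select feeding one UPDATE per row (slow).
-- Set-based equivalent: one statement updates every qualifying row at once.
UPDATE PS_XX_EMP_STG STG
   SET STG.ANNUAL_RT = STG.ANNUAL_RT * 1.05
 WHERE EXISTS (
       SELECT 'X'
         FROM PS_XX_ELIG_TBL ELIG
        WHERE ELIG.EMPLID = STG.EMPLID
          AND ELIG.ELIG_FLAG = 'Y')
```

The database does one pass over the data instead of one round trip per row, which is where most of the gain comes from.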
b) Use Temp Tables
· They improve AE performance significantly and are best suited for set-based processing.
· They facilitate parallel processing.
· Custom indexes can be created to achieve faster results.
· Make it a practice to include PROCESS_INSTANCE as a key and to employ the %Table meta-SQL.
· Data is cleansed automatically.
· A dedicated temp table is preferred.
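A minimal sketch of the pattern, using a hypothetical temp-table record XX_EMP_TAO keyed by PROCESS_INSTANCE:

```sql
-- AE SQL action: load a dedicated temp table. %Table resolves to the
-- instance assigned to this run (e.g. PS_XX_EMP_TAO4), which is what
-- lets several instances of the program run in parallel without clashing.
INSERT INTO %Table(XX_EMP_TAO)
       (PROCESS_INSTANCE, EMPLID, EFFDT, ANNUAL_RT)
SELECT %Bind(PROCESS_INSTANCE), J.EMPLID, J.EFFDT, J.ANNUAL_RT
  FROM PS_JOB J
 WHERE J.EFFDT <= %CurrentDateIn
```

PROCESS_INSTANCE comes from the state record via %Bind, so each run only ever sees its own rows.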
c) Use Meta-SQL
· Make a habit of using meta-SQL such as %Join, %CurrentDateIn, and %Table; it gives the program more flexibility.
· It makes the AE program more robust and platform independent, and improves performance.
· Be aware of the limitations of these meta-SQL constructs; e.g. %EffdtCheck doesn't work well with the PS_JOB table.
· %TruncateTable is faster than a bulk DELETE statement. For a temp table, use %TruncateTable(%Table(XXX_TAO)).
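To make the platform-independence point concrete, here is roughly how a couple of these constructs resolve on Oracle (the record name is hypothetical, and the expansions are illustrative rather than exact):

```sql
-- Written in the AE SQL action:
%TruncateTable(%Table(XX_EMP_TAO))

-- Typically resolved on Oracle to something like:
TRUNCATE TABLE PS_XX_EMP_TAO3

-- Similarly, %CurrentDateIn resolves to a platform-specific expression,
-- e.g. TRUNC(SYSDATE) on Oracle.
```

On a platform without TRUNCATE support, the same meta-SQL resolves to a DELETE instead, which is why the program stays portable.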
d) Drop/Rebuild Indexes
· If you're planning a massive insert, you may wish to drop indexes and triggers from the table first and recreate them once the insert is done.
· This makes AE processing much faster. Just be careful: dropping indexes makes the table prone to duplicate rows. (You may leave the primary key untouched.)
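A sketch of the pre- and post-load steps, with a hypothetical non-unique index on a staging table (the primary-key index is deliberately left in place to guard against duplicates):

```sql
-- Before the bulk load: drop the secondary index.
DROP INDEX PSAXX_EMP_STG;

-- ... perform the massive INSERT here ...

-- After the load: recreate the index in one pass, which is far cheaper
-- than maintaining it row by row during the insert.
CREATE INDEX PSAXX_EMP_STG
    ON PS_XX_EMP_STG (BUSINESS_UNIT, EMPLID);
```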
e) Refrain from Using PeopleCode
· If the goal can be achieved using SQL, do not use PeopleCode.
f) Setting Commits
· Frequent and early commits are recommended for set-based processing. They reduce the load on the database server and enhance performance.
g) Reuse Statement
· Valid only for SQL actions.
· By dedicating a persistent cursor to the statement, we can reuse the SQL statement.
· When we select the ReUse property for a SQL action, %Bind fields are converted into real bind variables (such as :1, :2, etc.). This enables PeopleSoft Application Engine to compile the statement only once, dedicate a cursor, and re-execute it with new data multiple times. This reduction in compile time can result in dramatic performance improvements.
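To illustrate (record and field names are invented), an AE SQL action such as the following, executed inside a loop with ReUse switched on:

```sql
UPDATE %Table(XX_EMP_TAO)
   SET ANNUAL_RT = %Bind(ANNUAL_RT)
 WHERE EMPLID = %Bind(EMPLID)

-- is compiled once with real bind variables, roughly:
-- UPDATE PS_XX_EMP_TAO3 SET ANNUAL_RT = :1 WHERE EMPLID = :2
```

Without ReUse, the %Bind values are substituted as literals and the statement is re-compiled on every iteration.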
h) Use %UpdateStats
· For better performance, refresh statistics on temp tables using %UpdateStats(record name, [HIGH/LOW]) after each bulk load of data.
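A one-line usage sketch, against a hypothetical temp-table record (Application Engine resolves the dedicated instance for the current run):

```sql
-- After the bulk load: refresh optimizer statistics on the temp table.
-- LOW gathers estimated statistics; HIGH does a full compute.
%UpdateStats(XX_EMP_TAO, HIGH)
```

Without fresh statistics the optimizer assumes the temp table is tiny (or empty) and may pick a very poor join plan.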
i) Use CollectGarbage()
· Specific to AE programs that use application classes.
· Use the CollectGarbage function to remove any unreachable application objects created by the application classes and thereby release the memory. (Also refer to the Unix process limits for PeopleSoft.)
· However, my tests suggest that this function does not cause any memory to be released by the process.
j) Use SQL Hints (Tuning SQL)
· SQL hints, e.g. /*+ APPEND */ and /*+ FIRST_ROWS(n) */, provide a mechanism to instruct the CBO to choose a certain query execution plan based on specific criteria (refer to the Oracle documentation).
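Two hedged examples of these hints in context (table names are placeholders):

```sql
-- APPEND requests a direct-path insert, bypassing the buffer cache,
-- which can speed up a large bulk load considerably.
INSERT /*+ APPEND */ INTO PS_XX_EMP_STG
SELECT EMPLID, EFFDT, ANNUAL_RT
  FROM PS_JOB;

-- FIRST_ROWS(n) asks the CBO to optimize for returning the first
-- n rows quickly rather than minimizing total throughput.
SELECT /*+ FIRST_ROWS(10) */ EMPLID, EFFDT
  FROM PS_JOB
 ORDER BY EFFDT DESC;
```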
k) Streaming
· Streaming is a process in which a large chunk of data is divided into multiple streams, and all the streams are processed in parallel.
l) Use Trace Files
· Use the trace files generated by TRACE/TOOLSTRACESQL/TOOLSTRACEPC advisedly. Find the areas where the process takes the longest processing time or where performance can be improved.
Performance Considerations in Application Engine
It is always important to consider performance during design and development, but it is especially important when dealing with Application Engine programs. That is because AE programs often process a large number of transactions, and because a number of fundamental design decisions can have a dramatic effect on performance.
Performance tuning is often more of an art than a science, and there are exceptions to every rule. Still, the following general guidelines can be useful in achieving optimum performance.
Priority Considerations
These points should be on every designer's and developer's mind. They are given priority because they can have a tremendous impact, and because it is much easier to implement them in the original design than it is to add them later.
· Set-Based Processing: Instead of loop constructs such as Do Select, Do When, Do While, or Do Until, consider using UPDATE or INSERT statements that will affect many rows at once.
· Temporary Tables: There are several reasons to consider using temporary tables in your program:
o The transaction tables may be too large to process against directly with efficiency. A temporary table can hold an extract of the necessary data.
o The transaction tables may not have the correct index structure to process joins efficiently. A temporary table can be created with a more optimal index structure.
o The normalized data structure of the transaction tables may make it difficult to access the required information. A temporary table can be used to denormalize or "flatten" the data.
o It may not be possible to construct a single SQL statement to implement set-based processing. A temporary table can be used to store intermediate results.
o Although a single SQL statement might be possible, it may have such complex joins that it performs badly. Again, a temporary table can be used to store intermediate results.
· Write Efficient SQL: Set-based processing often leads to SQL statements with many joins. There are several techniques to keep in mind when writing or tuning such statements so that they run quickly:
o Use as many indexes as possible in the join criteria. In particular, don't skip high-level keys.
o Avoid unnecessary joins. For example, when deriving SetID values from the business unit, use a separate SELECT statement to find the SetID and store it in the state record. It can then be used as a bind variable in the main SQL statement.
o Avoid unnecessary subselects. For example, if all the rows selected from a table will have the same effective date, use a separate SELECT statement to find the effective date and store it in the state record. It can then be used as a bind variable in the main SELECT.
o Join to the smallest table possible. For example, if effective-dated records exist in a parent-child relationship, use the parent record for the effective-date subselect.
o Use Oracle hints. This is appropriate when tuning a slow SQL statement. See the Oracle documentation for a complete list of hints; two particularly useful ones are RULE (to force a statement to ignore statistics) and INDEX (to force a statement to use a particular index). Generating the optimizer plan for the statement (see below) may help in selecting an appropriate hint.
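The "derive once, bind later" pattern above can be sketched as two AE SQL actions. The XX_* record and field names are invented for the example; PS_SET_CNTRL_REC is the delivered set-control table:

```sql
-- Step 1: resolve the SetID once into the state record, instead of
-- joining to PS_SET_CNTRL_REC for every row of the main statement.
%Select(SETID)
SELECT SETID
  FROM PS_SET_CNTRL_REC
 WHERE SETCNTRLVALUE = %Bind(BUSINESS_UNIT)
   AND RECNAME = 'XX_RATE_TBL'

-- Step 2: use the stored value as a bind variable in the main statement.
UPDATE %Table(XX_EMP_TAO) T
   SET T.RATE = (SELECT R.RATE
                   FROM PS_XX_RATE_TBL R
                  WHERE R.SETID = %Bind(SETID)
                    AND R.RATE_CD = T.RATE_CD)
```

One small lookup replaces a join that the optimizer would otherwise have to evaluate against every row.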
Other Considerations
These points can be important in certain contexts, but they either do not carry as universal or dramatic an impact as the points above, or they are easier to implement by "tweaking" an existing program.
· SQL in PeopleCode: When possible, avoid using SQL in PeopleCode. Using SQL actions instead carries less overhead.
· Mathematical Calculations in PeopleCode: When possible, avoid using SQL to perform mathematical calculations. For example, it is more efficient to use PeopleCode to increment a counter. The exception is when the same calculation must be performed on many rows of a table; in that case, use set-based processing.
· Commits: When and where commits are done should primarily be driven by restart logic. However, if doing commits within loop structures, it is a good idea to specify the commit frequency, so that commits are done in sizable "chunks".
· ReUse Statement: For SQL actions in loop structures, turn on the "ReUse Statement" option. The exception is when dynamic SQL is being used.
· Bulk Insert: For INSERT statements in loop structures, use the Bulk Insert option. Note that this will not help if commits are done too frequently in the loop.
· %UpdateStats: After inserting a large amount of data into a temporary table, use the %UpdateStats command. This helps the Oracle optimizer process joins more intelligently. Note that it only works if commits are enabled.
· Tracing: To identify where a performance problem is, run a trace of the program using the timing options. There are two important points to consider:
o When trying to pinpoint the problem, turn off all tracing options other than the timing options. The overhead involved in tracing every statement can skew your results.
o If the problem has been narrowed down to a particularly complex SQL statement, it can help to generate the optimizer plan for that statement, using the DB Optimizer tracing option.
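Outside of the AE trace, the plan for a suspect statement can also be inspected directly in Oracle. A minimal sketch, with a placeholder query:

```sql
-- Generate the optimizer plan for the problem statement...
EXPLAIN PLAN FOR
SELECT J.EMPLID, J.EFFDT
  FROM PS_JOB J
 WHERE J.EMPLID = '12345';

-- ...then display it.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Reading the plan shows which indexes are (or are not) being used, which is usually the fastest way to decide whether a hint or a new index is warranted.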