Database Administration
Optimizing data for sequential access
If users repeatedly request data from the same tables in a predictable order (for example, to generate a particular report), you can dump and reload the data to optimize it for sequential access. Loading tables in ascending order of mean record size produces the most efficient organization of records for sequential access.
To dump and reload data to optimize it for sequential access:
1. Use PROUTIL with the TABANALYS qualifier to determine the mean record size of each table in the database. See PROUTIL Utility for a detailed description of the PROUTIL TABANALYS qualifier.
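The mean record sizes you need are in the TABANALYS report's per-table statistics. As a rough illustration, a sketch like the following could pull them out of a saved report; the report lines, table names, and column layout here are simplified and hypothetical (the real report format varies by OpenEdge release), so adjust the parsing to match your actual output:

```python
# Parse mean record sizes from a simplified, hypothetical TABANALYS-style
# report. Assumed columns: Table, Records, Size, Min, Max, Mean.
report = """\
PUB.Customer      1117   127.0K    73   264   116
PUB.Order         3953   388.3K    85   131   100
PUB.OrderLine    13970   891.1K    56    81    65
"""

mean_size = {}
for line in report.splitlines():
    parts = line.split()
    # Keep only data rows: six columns, table name qualified with "PUB."
    if len(parts) == 6 and parts[0].startswith("PUB."):
        mean_size[parts[0]] = int(parts[5])  # last column is the mean

print(mean_size)
```

Sorting the resulting mapping by value gives the load order used in the later steps.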
2. Dump the definitions and data.
The Data Administration and Data Dictionary tools dump in order of the primary index. If you access the records by another index, dump the data by that index. Use an ABL procedure similar to the following to dump the data by the index of your choice:
OUTPUT TO table-name.d.
FOR EACH table-name NO-LOCK USE-INDEX index:
  EXPORT table-name.
END.
OUTPUT CLOSE.
This procedure creates a contents (.d) file organized by order of access.
3. Load the tables with a mean record size of less than 1,000 bytes first, in ascending order of average record size. Use the Data Dictionary or the Data Administration tool to load data one table at a time, or use the Bulk Loader utility and a description file to control the order.
If you use the Bulk Loader, the order of the fields in the description file must match the order of the fields in the data file. If they do not match, the Bulk Loader attempts to load data into the wrong fields.
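As a sketch of what a description file might look like, the fragment below lists one table per entry: a header line naming the table, its contents file, and an error file, followed by the field names in the order they appear in the contents file. The table and field names here are hypothetical; generate the real file from the Data Dictionary and verify the layout against your release's documentation:

```
customer customer.d customer.e
cust-num
name
address
.
```

A period on its own line ends each table's entry, so multiple tables can share one description file.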
4. Load the remaining, larger tables, in order of average record size.
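Steps 3 and 4 together amount to a two-group ordering: small tables first, then large ones, each group sorted by ascending mean record size. A minimal sketch, assuming hypothetical table names and TABANALYS mean sizes:

```python
# Hypothetical mean record sizes (bytes) from a TABANALYS report;
# the names and values are illustrative, not from a real database.
sizes = {"order-line": 58, "invoice": 712, "customer": 138,
         "document": 4210, "image": 1650}

# Step 3: tables with a mean record size under 1,000 bytes, smallest first.
small = sorted((t for t in sizes if sizes[t] < 1000), key=sizes.get)
# Step 4: the remaining, larger tables, also smallest first.
large = sorted((t for t in sizes if sizes[t] >= 1000), key=sizes.get)

load_order = small + large
print(load_order)  # ['order-line', 'customer', 'invoice', 'image', 'document']
```

The resulting list is the order in which to feed tables to the Data Dictionary load or to arrange entries in the Bulk Loader description file.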