When you update a database structure in the development environment, you must then re-create those changes in the database that is already established at the deployment site. To accomplish this transfer, you generate an incremental .df file in the development environment: copies of the original and updated databases are compared, and the differences between them are written to the incremental .df file. For more information on creating an incremental ABL data definitions file, see OpenEdge Data Management: Database Administration. The following general steps tell you how to prepare a .df file that includes new or modified data definitions:
Obtain copies of the current user database and the new, modified database (a sketch of one way to make these copies follows this list).
Connect the two databases.
Compare the data definitions in the user database with the data definitions of your new updated database, creating a new incremental .df file that reflects the changes.
Provide a utility for users that loads the incremental .df file into the existing database.
Encrypt the procedures affected by changed data definitions.
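The first of these general steps, making working copies of the two databases, can be scripted with the PROCOPY utility. The following is a minimal sketch; the source paths and target names are hypothetical, so substitute your own:

    procopy deploy/current/appdb work/olddb
    procopy dev/build/appdb      work/newdb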
To prepare a data definitions file:
1. In Windows, open the Data Administration Tool (for character interfaces, open the Data Dictionary).
2. Connect the two databases by choosing Database > Connect.
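If you are working from the Procedure Editor or a script instead of the menus, the ABL CONNECT statement does the same job. A minimal sketch, assuming single-user connections and the hypothetical copies work/olddb and work/newdb:

    /* Connect the copy of the old (deployed) database and the new database.
       The -1 parameter requests a single-user connection. */
    CONNECT VALUE("work/olddb") -1.
    CONNECT VALUE("work/newdb") -1.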
3. Select the database that includes the new, modified data definitions as your current database by choosing Database > Select Working Database.
4. Create an incremental definition file by choosing Admin > Dump Data and Definitions > Create Incremental .df File.
This option compares the data definitions in the copy of the old database with the current working database schema and creates a new data definitions (.df) file. The new .df file contains a record for each difference between the two schemas. The differences include any added, renamed, changed, or deleted file, field, or index.
If a file, field, or index exists in the old database but not in the new schema, OpenEdge asks you whether the object has been renamed. If you respond no, a record appears in the new .df file marking the object as deleted.
If the new schema includes a new unique active index, OpenEdge asks you whether you want to deactivate it. If you do not deactivate the index, and there are duplicate keys in the old database, OpenEdge aborts your attempt to load new definitions into the old database. If you deactivate the index, the load procedure defines the new index, but does not create the index file. You must build and activate the index after loading the new data definitions.
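If you need to create the incremental .df file unattended (for example, from a build script), prodict/dump_inc.p is the programmatic counterpart of this menu option. The sketch below is based on the batch interface described in OpenEdge Data Management: Database Administration; the alias assignments (DICTDB for the new schema, DICTDB2 for the old copy) and the DUMP_INC_* environment variables are assumptions to verify against your OpenEdge release, and the logical database names are hypothetical:

    /* Set in the OS environment before starting the session, for example:
         DUMP_INC_DFFILE=delta.df      name of the .df file to create
         DUMP_INC_INDEXMODE=inactive   deactivate any new indexes in the .df
       Both databases are already connected, as in step 2. */
    CREATE ALIAS DICTDB  FOR DATABASE newdb.  /* database with the new definitions */
    CREATE ALIAS DICTDB2 FOR DATABASE olddb.  /* copy of the old database */
    RUN prodict/dump_inc.p.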
5. Perform steps a through d below for testing purposes. Then prepare a tool for users that performs these steps on site (a minimal sketch of such a tool follows this step):
a. Select the copy of the old database as your working database.
b. Load the updated data definitions by choosing Admin > Load Data and Definitions > Load Data Definitions (.df file).
c. If you deactivated any indexes in Step 4, re-create data in the indexed fields as required to avoid duplicate keys. Then reactivate the indexes with PROUTIL IDXBUILD. For more information, see OpenEdge Data Management: Database Administration.
d. OpenEdge has now updated the old database schema to match the modified schema. Compile and test all your procedures against the updated database.
The upgrade template provided with OpenEdge outlines one way to do this. Before you can use this template, you probably need to modify it. See Using the upgrade template.
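As a complement to the upgrade template, the following is a minimal sketch of the kind of loader you might ship. It assumes the user runs it connected to the existing database and that the incremental file is named delta.df (both names are hypothetical); prodict/load_df.p is the programmatic equivalent of the Load Data Definitions menu option:

    /* loaddelta.p -- minimal on-site loader sketch (hypothetical names).
       Run it connected to the customer's database, for example:
         pro custdb -p loaddelta.p                                        */
    RUN prodict/load_df.p ("delta.df").

    /* If new indexes were shipped deactivated, rebuild them afterward from
       the OS command line, for example:  proutil custdb -C idxbuild      */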
6. Test your procedures.
7. Use the XCODE utility to encrypt procedures invalidated by the new data definitions.
Because the new .df file changes CRC checksums and time stamps only on the database tables that are actually modified, you have to encrypt and ship only the source versions of those procedures that access the changed tables. (If you compile your procedures with the XREF option, you get a listing of the tables accessed by each procedure.)
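For example, compiling with the XREF option writes a cross-reference listing that records the tables each procedure accesses (the procedure and listing names here are hypothetical):

    COMPILE ordent.p XREF ordent.xrf.

Then, from the operating system command line, XCODE encrypts the affected source files; -k supplies the encryption key and -d names the directory that receives the encrypted copies (the key and directory names are hypothetical):

    xcode -k mykey -d xcsrc ordent.p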