Executing an incremental backup


Go Up to Creating incremental backups


To execute an incremental backup, use the following syntax:

GBAK {-D} dbname file [size] add_file1 [size1] add_file2 [size2] ...

The first dump file in the list is analogous to the first database file in a multi-file database: it is the file GBAK uses as a reference to an existing online dump. If additional dump files are listed on the GBAK command line, they are added to the set of files in the online dump.

Example: The following example creates an initial online dump and then adds a file to it.

[E:/EMPLOYEE] gbak -d EMPLOYEE.gdb EMPLOYEE.gdmp EMPLOYEE.gdmp.1
gbak: WARNING: Dumped 46270 pages of a total 46270 database pages
gbak: WARNING: Dumped 1 pages to page appendix file

[E:/EMPLOYEE] gbak -d EMPLOYEE.gdb EMPLOYEE.gdmp EMPLOYEE.gdmp.1
gbak: ERROR: I/O error for file "E:\EMPLOYEE\EMPLOYEE.GDMP.1"
gbak: ERROR: Error while trying to create file
gbak: ERROR: The file exists.

gbak: Exiting before completion due to errors

[E:/EMPLOYEE] gbak -d EMPLOYEE.gdb EMPLOYEE.gdmp EMPLOYEE.gdmp.2
gbak: WARNING: Dumped 2 pages of a total 46270 database pages
gbak: WARNING: Dumped 0 pages to page appendix file

In the example above, EMPLOYEE.gdmp.1 was added in the course of a full database dump.

Re-executing the same command fails because GBAK attempts to create EMPLOYEE.gdmp.1 again, and that file already exists. The last command successfully adds a new file, EMPLOYEE.gdmp.2, to the dump set.

The online dump files can be on either a local or a remote file system that is writable by the InterBase server; an online dump is a server-side operation only. While the online dump files can be located on any mounted file system, the page appendix file is always on the local file system.

The page appendix file is written to by concurrent server threads handling client requests when it is necessary to preserve the image of a page for the online dump. This is analogous to the InterBase multigenerational architecture (MGA), in which a previous version of a row is stored when the row is updated, preserving a transaction's snapshot. In the same way, the page appendix file maintains the physical page snapshot of the online dump. It is a temporary file and is deleted when the online dump completes.
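The copy-on-write role of the page appendix file can be illustrated with a minimal sketch. This is a conceptual model only, not InterBase internals: the dict-based "files", page numbers, and page images are all illustrative assumptions.

```python
# Conceptual model: preserving a point-in-time page snapshot during an
# online dump, in the spirit of the page appendix file described above.

database = {1: "A0", 2: "B0", 3: "C0"}   # page number -> page image
appendix = {}                            # images saved before being overwritten
dumped = {}                              # the online dump being built

def write_page(page, image, dump_progress):
    """A concurrent writer updates a page mid-dump. If the dump has not
    copied this page yet, save its current image to the appendix first."""
    if page > dump_progress and page not in appendix:
        appendix[page] = database[page]
    database[page] = image

def run_dump():
    for page in sorted(database):
        if page == 1:
            # Simulate a client writing page 3 while the dump is on page 1.
            write_page(3, "C1", dump_progress=1)
        # Prefer the preserved image, so the dump stays a consistent snapshot.
        dumped[page] = appendix.get(page, database[page])

run_dump()
print(dumped)   # the snapshot keeps "C0", even though the database holds "C1"
```

The dump reads the saved image from the appendix rather than the newer page, which is how a physically consistent snapshot survives concurrent updates.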

The [size] parameter is optional and denotes the size of the dump file in pages, using the page size of the database. If [size] is not provided, the dump file's size is determined by the database file with the same sequence number. If the dump file's sequence is higher than that of any database file, it takes the size of its predecessor dump file.
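The size rules above can be sketched as a small function. This is an illustration of the stated rules, not InterBase code; the file counts and page sizes are invented for the example.

```python
# Sketch of the [size] defaulting rules for dump files (0-based sequence):
#  1. an explicit [size] on the command line wins;
#  2. otherwise the dump file takes the size of the database file with the
#     same sequence number;
#  3. if no such database file exists, it inherits its predecessor's size.

def dump_file_size(seq, explicit_sizes, db_file_sizes):
    """Return the size in pages of dump file number `seq`."""
    if explicit_sizes[seq] is not None:      # rule 1: [size] given
        return explicit_sizes[seq]
    if seq < len(db_file_sizes):             # rule 2: matching database file
        return db_file_sizes[seq]
    # rule 3: no counterpart, inherit the predecessor dump file's size
    return dump_file_size(seq - 1, explicit_sizes, db_file_sizes)

db_files = [40000, 6270]          # two database files, sizes in pages
explicit = [None, 8000, None]     # only the second dump file got a [size]
print([dump_file_size(i, explicit, db_files) for i in range(3)])
```

Here the third dump file has no database counterpart, so it inherits the 8000-page size of its predecessor.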

If you run GBAK -D against an existing online dump, an incremental dump will be created.

[E:/EMPLOYEE] gbak -d EMPLOYEE.gdb EMPLOYEE.gdmp
gbak: WARNING: Dumped 46270 pages of a total 46270 database pages
gbak: WARNING: Dumped 23 pages to page appendix file

[E:/EMPLOYEE] gbak -d EMPLOYEE.gdb EMPLOYEE.gdmp
gbak: WARNING: Dumped 2 pages of a total 46270 database pages
gbak: WARNING: Dumped 0 pages to page appendix file

This updates the online dump with only those pages that have changed since the last dump. An incremental dump can always be retried if it fails. If a full online dump fails, InterBase will delete the online dump files that were written prior to the failure. If InterBase cannot access those files because of the failure, those online dump files will have to be deleted manually.
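Why the second run above writes only 2 of 46270 pages can be modeled simply: a page is copied only if it changed after the previous dump. The sketch below is an illustrative model, not InterBase's actual page-tracking implementation; the sequence numbers and page images are assumptions.

```python
# Model of an incremental dump: copy only pages whose change sequence is
# newer than the previous dump's sequence number.

def incremental_dump(db_pages, dump_pages, last_dump_seq):
    """db_pages maps page -> (change_seq, image); dump_pages is updated
    in place. Returns the number of pages written."""
    written = 0
    for page, (change_seq, image) in db_pages.items():
        if change_seq > last_dump_seq:   # changed since the last dump
            dump_pages[page] = image
            written += 1
    return written

db = {1: (5, "A"), 2: (9, "B'"), 3: (10, "C'")}   # pages 2 and 3 changed
dump = {1: "A", 2: "B", 3: "C"}                   # state after the last dump
written = incremental_dump(db, dump, last_dump_seq=8)
print(written, dump)   # 2 pages rewritten; the dump now matches the database
```

Because the operation only appends or overwrites pages in place, retrying a failed incremental dump simply rewrites the same small set of changed pages.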

Distinguished Dump: In InterBase XE3 and earlier, an incremental dump required the database server to read all pages from the database file, writing only the modified pages to the target dump file. With the tracking system implemented in XE7, only the pages that have changed since the last dump are fetched, providing near-instantaneous updates to the target. There can be only one "distinguished dump" per source database, chosen as follows:

  • The first dump of the source database file is the "distinguished dump"; all further dump targets are "normal dump" targets.
  • Should the first dump be brought online, severing its link to the source database, the next dump to be incrementally updated becomes the "distinguished dump".
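The selection rules above can be summarized in a small model. This is a sketch of the stated promotion rules only, not InterBase code; the class and file names are illustrative.

```python
# Model of distinguished-dump selection: the first dump target is
# distinguished; if it is brought online (severed from the source), the
# next incrementally updated target is promoted.

class DumpTarget:
    def __init__(self, name):
        self.name = name
        self.severed = False     # True once the dump is brought online

class SourceDatabase:
    def __init__(self):
        self.distinguished = None      # at most one per source database

    def dump_to(self, target):
        if self.distinguished is None:
            self.distinguished = target    # first dump is distinguished
        elif self.distinguished.severed:
            self.distinguished = target    # promote on the next update

src = SourceDatabase()
a = DumpTarget("EMPLOYEE.gdmp")
b = DumpTarget("EMPLOYEE2.gdmp")
src.dump_to(a)
src.dump_to(b)
print(src.distinguished.name)   # still the first target
a.severed = True                # first dump brought online
src.dump_to(b)
print(src.distinguished.name)   # promoted to the second target
```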
