• If you keep the data files smaller, each file can be restored individually more quickly during a recovery.
• The smaller the data files, the more of them there are; consequently, BEGIN BACKUP operations during online backups are likely to take longer.
• Data files that are too large aggravate performance problems caused by inode locking. When a block in a file is changed, the operating system sets an exclusive lock on the file, which prevents parallel changes to the same file; on UNIX, this lock is called an inode lock.
• Set the DB_FILES Oracle parameter to a sufficiently high value; once this limit is reached, no new data files can be created.
• When there is a large number of data files and operating system resources are not adequately configured, critical errors such as "file table overflow" can occur and may cause the database to crash.
In general, it is a good idea to restrict data file sizes to between 2 GB and 20 GB.
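To make the trade-off concrete, here is a minimal sizing sketch. All function names and numbers are illustrative assumptions, not SAP or Oracle recommendations; it simply shows how the per-file size drives the total file count that must stay under a DB_FILES-style limit:

```python
import math

def datafile_count(total_gb: float, file_size_gb: float) -> int:
    """Number of data files needed to hold total_gb at file_size_gb each."""
    return math.ceil(total_gb / file_size_gb)

def within_limit(total_gb: float, file_size_gb: float, db_files: int) -> bool:
    """True if the required file count stays within the DB_FILES limit."""
    return datafile_count(total_gb, file_size_gb) <= db_files

# A hypothetical 2 TB database: 2 GB files require 1024 data files,
# while 20 GB files bring the count down to 103.
print(datafile_count(2048, 2))        # 1024
print(datafile_count(2048, 20))       # 103
print(within_limit(2048, 2, 1000))    # False: DB_FILES set too low
```

The example illustrates why both extremes hurt: very small files inflate the file count toward the DB_FILES ceiling and lengthen online backups, while very large files concentrate I/O on few inodes.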