SQL Server best practices for CS Professional Suite applications

What is SQL?

SQL (Structured Query Language) is a special-purpose programming language designed to manage data held in a relational database management system (RDBMS), or for stream processing in a relational data stream management system (RDSMS). CS Professional Suite applications such as Accounting CS, Practice CS, Workpapers CS, and (optionally) FileCabinet CS use SQL databases.

See System requirements for using CS Professional Suite applications for more detailed information on the requirements for the CS Professional Suite.

What is Microsoft SQL Server?

The Microsoft SQL Server data management software is a relational database management system that is used by many applications primarily to store, update, and retrieve data. SQL Server organizes data in tables and responds to application queries for very quick data retrieval. In contrast to flat-file data systems, SQL Server can process large and complex data sets for many simultaneous users more efficiently and can enable more robust reporting options.

Due to the complexity of Microsoft SQL Server and the hardware resources needed for optimum performance, we recommend that firms using SQL-based CS Professional Suite applications retain the services of a qualified IT professional who is proficient in SQL. Because SQL Server environments and other hosted databases can differ significantly from firm to firm, consult your firm's own qualified IT professional when implementing, configuring, and maintaining your SQL databases; doing so is imperative to maintaining your firm's standards of software performance, stability, and data integrity.

Antivirus and firewall configuration

Real-time scanning by your antivirus software, as well as certain firewall configurations, can potentially degrade performance with SQL Server and applications that are reliant on it. We recommend making certain exclusions to your security solutions to ensure optimal performance of your SQL-based CS Professional Suite applications. For details about the recommended exclusions, please refer to the following topics in our Help & How-To Center:

Hardware recommendations

For the machine where the SQL Server instance for your CS Professional Suite applications resides, we recommend using a dedicated, 64-bit multi-core server. Ideally, this would be separate from your domain controller to avoid potential security and performance issues. Because SQL Server caches data pages in RAM, for optimal performance it is critical to provide the server with as much memory as feasible (in accordance with maximum memory limitations per your SQL Server edition) and to configure ample memory for SQL database management and operating system (OS) overhead. We also recommend having multiple disks with higher rotational speed and smaller average seek times (as opposed to focusing primarily on faster transfer speed).
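If your firm's IT professional decides to cap the memory that SQL Server can use so that the OS and other services keep enough headroom, the limit can be set in SQL Server Management Studio or with a short T-SQL script. The following is a minimal sketch; the 12288 MB value is only an example and should be chosen based on the total RAM in your server and its other workloads.

    -- Cap SQL Server's memory use so the OS and other services keep headroom.
    -- 'max server memory (MB)' is an advanced option, so enable those first.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 12288;  -- example value only
    RECONFIGURE;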

In addition, to avoid downtime and potential corruption of data if a power failure were to occur on the machine that is running SQL Server, consider using dual power supplies plugged into separate circuits with an uninterruptible power supply (UPS).

For information from Microsoft about the hardware requirements and recommendations for various versions of SQL, see the following Microsoft articles:

Note: After July 12, 2022, Microsoft will no longer support SQL Server 2012 or earlier. For more information, see Discontinued support for Microsoft SQL Server 2012.

Disk configuration

During a typical creation of a new SQL database, by default the critical database files such as the primary data file, the transaction log file, and the Tempdb file are stored on the same drive and often in the same location as the machine's OS. However, as a SQL best practice for optimal performance and manageability — especially for very large databases that are updated frequently — we recommend separating critical database files from each other and from the OS. 

Note: The program files can still remain on the OS, but the database files listed below should be separated.

Performance considerations

Separating these files can improve latency and bandwidth and reduce I/O contention (random vs. sequential activity). Furthermore (and depending on your system's configuration), each type of file benefits from disks optimized for specific types of tasks (an example of how to check where these files currently reside follows this list):

  • Primary data files (.MDF): These files contain the actual SQL-based data. Data is most often written to .MDF files using random activity, which is naturally slower than sequential activity and can impose a performance penalty on other file types if they are accessed on the same disks.
  • Transaction log files (.LDF): These files contain log information for all transactions completed by the server. LDF files are used to time stamp transactions written to the SQL Server database and enable the SQL database to be recovered easily if any data loss were to occur (e.g., as a result of a power outage). Because all database transactions are written to .LDF files using mostly sequential writes, overall performance is strongest when these files are maintained on drives with high write performance.
  • Tempdb file: This file type, which is a potentially very active SQL Server database with its own data and log files, is used for temporary storage of user objects and internal database operations. While data in this heavy-usage file is not persistent when SQL Server is closed, overall performance benefits when this file is on a separate, dedicated drive with ample space for unimpeded growth.
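As a starting point, your firm's qualified IT professional can see where the data, log, and Tempdb files currently reside by querying the sys.master_files catalog view, and can relocate the Tempdb data file with an ALTER DATABASE statement. The following is a minimal sketch; the T:\SQLTempDB path is hypothetical, and a Tempdb file move takes effect only after the SQL Server service is restarted.

    -- List the physical location of every data (.MDF/.NDF) and log (.LDF) file
    -- on the instance, including the Tempdb files.
    SELECT DB_NAME(database_id) AS database_name,
           type_desc,
           physical_name
    FROM sys.master_files
    ORDER BY database_name;

    -- Move the Tempdb primary data file to a dedicated drive (path is
    -- hypothetical); the change takes effect after the service restarts.
    ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, FILENAME = 'T:\SQLTempDB\tempdb.mdf');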

Manageability considerations

Greater separation of the file types listed above makes it easier for you and your qualified IT professional to correlate performance to any specific database file if specific troubleshooting steps are needed at any point. Furthermore, the recommended file separation will help to minimize any data loss in the event of corruption on the server.

The recommendation to separate these files is specifically related to server functions, not to the function of any CS Professional Suite application. Contact a qualified IT professional who is proficient in using Microsoft SQL Server to determine whether separating these files is possible given your firm's specific setup and whether doing so is in your firm's best interest, and to assist with that process (either before or after creation of the SQL database).

Virtualization

Hardware virtualization software allows multiple operating system instances to run at the same time on a single computer or server. While Microsoft SQL Server can be installed and run in a virtual environment, improper configuration and over-allocation of resources could degrade performance and create inaccuracies in performance reporting. Thomson Reuters advises that firms adhere to Microsoft's recommendations when using SQL Server in a virtual environment.

Microsoft offers an in-depth white paper on Best Practices for Virtualizing and Managing SQL Server (PDF).

For additional information, see Microsoft's support policies regarding virtualization.

Due to the complexity of virtualization, when configuring or working in a virtual environment, be sure to consult your firm's qualified IT professional. Our Support Representatives at Thomson Reuters cannot be responsible for configuring your virtual environment.

SQL Server version comparisons

Microsoft provides a variety of versions and editions of SQL Server to correspond with differing needs. Each of the CS Professional Suite applications that use SQL databases includes an installation of the Express edition of SQL Server, which is Microsoft's free but limited version. While we find that SQL Server Express is sufficient for most firms, factors such as the number of concurrent users, the amount of data, any additional applications needed within your firm, and so forth, may require that you install and use an upgraded version of SQL Server. Refer to the following tables for a comparison of the different versions and editions of SQL Server.

Supported versions of SQL by CS Professional Suite application

After July 12, 2022, Microsoft will no longer support SQL Server 2012 or earlier. For more information, see Discontinued support for Microsoft SQL Server 2012.

Application Name    SQL 2022     SQL 2019     SQL 2017     SQL 2016     SQL 2014
Accounting CS       Supported    Supported    Supported    Supported    Supported
FileCabinet CS*     Supported    Supported    Supported    Supported    Supported
Practice CS         Supported    Supported    Supported    Supported    Supported
Workpapers CS       Supported    Supported    Supported    Supported    Supported

Each SQL Server version listed above is supported in the Express, Standard, or Enterprise edition.

* Although the database limit for MS SQL Server Express is 10GB, FileCabinet CS is structured such that up to 100GB of data can be supported with the Express edition.

Database distribution: Single vs. multiple instances

The decision about whether to separate SQL databases into multiple instances on the same server or to keep them in a single instance is one that your firm should carefully consider and discuss with your firm's qualified IT professional. In most situations, the ideal configuration is to use only a single instance on a machine. SQL Server will attempt to use as much memory as it deems necessary based on its maximum allowance. Having multiple instances could cause a performance issue if memory has been over-allocated. Another concern with multiple instances is the performance of the Tempdb file, because separate instances placing a high load on the disk where the Tempdb file resides can lead to additional queuing that may not occur with a single instance.

The selection of SQL Server edition, memory allotment, and required security settings are the primary areas to consider when determining whether multiple SQL Server instances may be a better solution than a single instance. With newer, full versions of SQL Server, the computing capacity and maximum memory are such that performance issues for databases sharing the same instance are more easily avoided. If you use the free SQL Server Express edition (which has a maximum memory utilization of 1GB), or any other edition of SQL Server where databases already use much of the existing RAM, it may not be productive to have the databases share that memory. Furthermore, if specific and strict security settings are needed for a non-CS Professional Suite application that also uses SQL, we recommend having a separate instance of SQL Server for your CS Professional Suite applications.
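When weighing a single instance against multiple instances, it helps to know how much memory each existing instance is allowed and how much it is actually using. The following read-only sketch can be run against each instance.

    -- Maximum memory this instance is allowed to use
    SELECT name, value_in_use
    FROM sys.configurations
    WHERE name = 'max server memory (MB)';

    -- Physical memory the SQL Server process is currently using, in MB
    SELECT physical_memory_in_use_kb / 1024 AS physical_memory_in_use_mb
    FROM sys.dm_os_process_memory;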

Options for redundant setups

There are multiple redundancy options for your SQL Server instance and the environment it resides on, including the following:

  • Redundant array of independent disks (RAID). RAID, which is not specific to SQL, combines multiple disk components into a single, logical array that can be used to store redundant data to provide varying levels of improved performance, reliability, and fault tolerance depending on needs. Because different database files benefit from different RAID levels, your firm's qualified IT professional should give careful consideration to those issues during implementation of this array.
  • Failover clustering. Clustering for SQL is an advanced configuration in which a group of servers run cluster-enabled applications through shared storage but over multiple network connections. If a failure were to occur with any of the servers (known as nodes in this setup), the benefit of this configuration is that applications can be moved automatically or manually to a separate node to limit downtime for those applications while the initial issue is resolved.
  • Database mirroring. With SQL Server mirroring, two copies (a primary and a secondary) of a single database reside on separate SQL Server instances. When run in synchronous operation, mirroring can add some latency but limits downtime and data loss if a failure were to occur on the server where the primary instance of SQL Server resides. Database mirroring can be used as a supplement or an alternative to failover clustering; a query for checking the mirroring status of your databases appears after this list.
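If mirroring is in place, its status can be confirmed in SQL Server Management Studio or with a short query against the sys.database_mirroring catalog view. The following read-only sketch reports only databases that participate in a mirroring session.

    -- List mirrored databases along with their current role (principal or
    -- mirror), state, and safety level.
    SELECT DB_NAME(database_id) AS database_name,
           mirroring_role_desc,
           mirroring_state_desc,
           mirroring_safety_level_desc
    FROM sys.database_mirroring
    WHERE mirroring_guid IS NOT NULL;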

Efficient redundancy can assist with performance and limit potential downtime. Because improperly configured redundancy solutions could cause performance issues, data anomalies, or data corruption, be sure to consult with your firm's qualified IT professional when determining whether any redundancy option is right for your firm's server environment.

Permissions and authentication

CS Professional Suite applications that use SQL databases require mixed-mode authentication, which is a combination of Windows Authentication and Microsoft SQL Server Authentication. This authentication can be specified during a manual install of SQL Server, during the process of attaching the SQL Server instance to the appropriate CS Professional Suite application, or through the instance properties in SQL Server Management Studio.

When a SQL Server instance is successfully attached to a CS Professional Suite application, all appropriate SQL Server user accounts and permissions are automatically set. However, if not all permissions are properly set, it may be necessary for your firm's qualified IT professional to make the corrections or to reinstall the application (up to and including the creation of a new SQL Server instance).

Note: If mixed mode is selected during a manual installation of SQL Server or within SQL Server Management Studio, a secure password needs to be specified at that time for the built-in SQL Server system administrator account (named "sa"). Alternatively, Windows Authentication may safely be chosen when creating a new SQL Server instance or when specifying an existing instance; you will later be prompted to allow the installation to modify the authentication to mixed mode on your behalf. This process creates a system-generated "sa" password automatically.
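Your firm's IT professional can confirm the authentication mode of an existing instance with the SERVERPROPERTY function; changing the mode itself is done through the instance properties in SQL Server Management Studio and requires a service restart. A minimal check:

    -- Returns 0 when the instance runs in mixed mode (SQL Server and Windows
    -- Authentication) and 1 when it accepts Windows Authentication only.
    SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS windows_auth_only;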

SQL database maintenance

For any SQL database, we strongly recommend regular maintenance by a qualified IT professional to ensure continuing performance at optimal levels. A database that is not adequately maintained could eventually result in degraded performance or potentially even downtime for applications that use SQL. An effective SQL database maintenance plan addresses the following areas of concern.

Data and log file management

First, as stated above, the data and log files for a SQL database should be kept separate whenever possible, and both sets of files should be monitored regularly to ensure proper and unimpeded growth. Select appropriate autogrow settings in SQL Server Management Studio, preferably using specific MB amounts for growth rather than percentages. Allowing autogrow to expand data and log files in small, frequent increments negatively impacts performance and can randomly interrupt application processes or cause file fragmentation. Ideally, your firm should avoid relying on intermittent autogrowth for these files and instead set up alerts for occasional, proactive manual growth of the files.
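Autogrow settings can be reviewed and adjusted with T-SQL as well as through SQL Server Management Studio. The following is a minimal sketch; the database name and logical file names are hypothetical and will differ in your environment.

    -- Review growth settings for the files of the current database.
    -- The growth value is in 8 KB pages unless is_percent_growth = 1.
    SELECT name, type_desc, growth, is_percent_growth
    FROM sys.database_files;

    -- Switch to fixed-size growth increments instead of percentages
    -- (database and logical file names are hypothetical).
    ALTER DATABASE YourFirmDatabase
    MODIFY FILE (NAME = YourFirmDatabase_Data, FILEGROWTH = 256MB);
    ALTER DATABASE YourFirmDatabase
    MODIFY FILE (NAME = YourFirmDatabase_Log, FILEGROWTH = 128MB);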

Your firm's qualified IT professional who is proficient in SQL can shrink a database automatically or manually to remove unused pages from both data and transaction log files when they no longer need to be as large as they are. While there are exceptions, your IT professional should perform database shrinks manually, only after careful consideration, and as a last resort when disk space needs to be returned to the OS. We do not recommend database shrinks as part of an effective maintenance plan; the process consumes many resources, causes index fragmentation, and adversely affects performance in the long run. Furthermore, an expanding database often needs to grow back to fill the space that the shrink released. Rather than shrinking the database, it is more conducive to efficient, long-term performance to provide ample disk space on the machine that runs SQL Server and to allow for proper growth of the data and log files.

Note: For more information, refer to Considerations for the "autogrow" and "autoshrink" settings in SQL Server on the Microsoft website.

Index fragmentation

Index fragmentation is an avoidable condition in which the server needs more than the optimal amount of disk I/O to access a database table. This condition can be due to data pages in the table having excessive empty space or to a mismatch between the logical order of the pages in a table and their physical order. As index fragmentation increases over time, it can degrade performance by taking up more memory and disk space on the machine that runs SQL Server, by creating larger backup files, and by causing the index to respond more slowly than expected due to the unnecessary logical reads that are required.

Many issues with index fragmentation can be avoided by following the best practices for data and log file maintenance. In addition to proactively preventing this condition and defragmenting the physical disk where the transaction log file resides, a good SQL database maintenance plan includes relying on your firm's qualified IT professional to detect and address any SQL database fragmentation.
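One way your IT professional can detect fragmentation is by querying the sys.dm_db_index_physical_stats dynamic management function, then reorganizing or rebuilding indexes as appropriate. The following read-only sketch runs in the context of the application database; the 10 percent and 100-page thresholds are common guidance rather than requirements.

    -- Report indexes in the current database with noticeable fragmentation.
    -- Indexes above roughly 30 percent fragmentation are typically rebuilt;
    -- lower values are usually reorganized.
    SELECT OBJECT_NAME(ips.object_id) AS table_name,
           i.name AS index_name,
           ips.avg_fragmentation_in_percent,
           ips.page_count
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
      ON i.object_id = ips.object_id AND i.index_id = ips.index_id
    WHERE ips.avg_fragmentation_in_percent > 10
      AND ips.page_count > 100
    ORDER BY ips.avg_fragmentation_in_percent DESC;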

Statistics

The SQL Server Query Optimizer uses statistical information to determine the most accurate and thus the quickest way to retrieve requested data from the database. Those statistics track a multitude of items such as page density, number of records, available indices, and so forth, and they enable the Query Optimizer to determine the most effective ways to access data. Outdated or missing statistics would cause the Query Optimizer to wait for updated statistics, which could noticeably slow down performance within the application you are using with SQL Server.

While statistics cannot correct performance degradation caused by data issues, part of your firm's maintenance plan for the SQL databases should include making sure that statistics are set to update automatically and checking to see when manual updates of the statistics may be necessary.
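As part of that check, your IT professional can confirm the automatic-statistics settings for the application database and, when needed, refresh statistics manually. The following is a minimal sketch; the database name in the filter is hypothetical.

    -- Confirm that automatic creation and updating of statistics are enabled.
    SELECT name, is_auto_create_stats_on, is_auto_update_stats_on
    FROM sys.databases
    WHERE name = 'YourFirmDatabase';  -- hypothetical database name

    -- Refresh out-of-date statistics throughout the current database.
    EXEC sp_updatestats;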

Note: In Accounting CS and Practice CS, in addition to managing statistics through queries and via SQL Server Management Studio, you can use the Update Statistics command within the application if users experience slow performance after making significant changes in the SQL database.

  • Accounting CS, Help > Update Statistics
  • Practice CS, Tools > Update Statistics

Corruption detection

Data corruption in a SQL database refers to any error that occurs while SQL Server reads or writes data and that creates unintended modifications to that data. While rare, this corruption affects data integrity and can cause issues ranging from unexpected errors and degraded performance to data loss and even downtime. While the root cause can vary widely, nearly all data corruption is due to platform issues, driver/firmware bugs, hardware faults (especially in relation to disk subsystems), or security software. Your SQL database maintenance plan should include a process for early detection of any potential hardware issues and corruption in the database, as well as a commitment to follow best practices for avoiding any interference from your security software and other application processes.

Your firm's IT professional should run hardware diagnostics and should constantly monitor the Windows Event Logs for any disk or hardware-related errors. You should contact your hardware vendor and your qualified IT professional immediately if you find any hardware issues. Because data corruption could occur in the database well before becoming apparent in the application itself, your IT professional should regularly review the SQL Error logs in SQL Server Management Studio and should occasionally perform database integrity checks (through the DBCC CHECKDB query). To limit data loss if data corruption were to occur, be sure to keep periodic backups of the database.
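An integrity check can be run from a query window in SQL Server Management Studio; because DBCC CHECKDB is resource intensive, it is usually scheduled outside business hours. A minimal sketch with a hypothetical database name:

    -- Run a full integrity check; NO_INFOMSGS suppresses informational messages
    -- so that any reported errors stand out.
    DBCC CHECKDB ('YourFirmDatabase') WITH NO_INFOMSGS;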

Database backups

Regularly backing up your SQL database is an imperative component of your firm's disaster recovery strategy and maintenance plan, even when redundancies are in place. Because SQL databases are always running and most likely reside in a location that is separate from your firm's flat-file data, during an emergency downtime your firm should not depend solely on any image backups of live SQL data. In addition to creating database backups through SQL Server Management Studio, most SQL-based CS Professional Suite applications provide the ability to schedule periodic backups, and these scheduled backup features should be used whenever possible in lieu of using SQL Server Management Studio for backups. See Backing up the firm database for details from our Help & How-To Center about backing up your application database.

Special information for backups

  • Be sure your SQL database backups are not stored in the same physical location as the actual database files. In case of damage to the physical drive where the live data resides, you should still be able to access the separate drive or remote location so that you can restore data.
  • Keep an appropriate backup schedule of your SQL database in accordance with your firm's needs and the needs of the application. The longer you go between backups, the higher the risk of significant data loss.
  • Test backups occasionally by restoring them on a separate server or standalone machine. This will enable you to verify that the backups are being created properly and could be used during any unplanned downtime. (A sketch for creating and verifying a supplemental backup with T-SQL appears after this list.)
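If your IT professional does create supplemental backups outside the application (for example, through SQL Server Management Studio or T-SQL), the sketch below shows one way to write the backup to a separate drive and confirm that the backup file is readable. The drive path and database name are hypothetical, and RESTORE VERIFYONLY is a quick check, not a substitute for the periodic test restores described above.

    -- Write a full backup to a drive separate from the live database files
    -- (path and database name are hypothetical).
    BACKUP DATABASE YourFirmDatabase
    TO DISK = 'E:\SQLBackups\YourFirmDatabase.bak'
    WITH INIT, CHECKSUM;

    -- Confirm that the backup file is readable without restoring it.
    RESTORE VERIFYONLY
    FROM DISK = 'E:\SQLBackups\YourFirmDatabase.bak'
    WITH CHECKSUM;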

SQL database maintenance items should be discussed with and performed by a qualified IT professional who is proficient in using SQL Server.

Troubleshooting performance and connection issues

For any performance issues with SQL-based CS Professional Suite applications that can be traced back to the SQL Server instance, in addition to troubleshooting the network connection that the instance is using, be sure to review the best practices for improving performance that are mentioned in this topic. In addition, you should regularly use performance monitors, the SQL Server error logs, and the Windows Event Viewer to pinpoint any current or potential performance issues with your SQL Server instance.
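One common starting point for such a review is the instance's cumulative wait statistics, which show what SQL Server has spent the most time waiting on since its last restart. The following read-only sketch lists the top waits; high values for disk- or memory-related wait types can point to the bottlenecks discussed earlier in this topic.

    -- Top waits since the instance last restarted.
    SELECT TOP (10) wait_type,
           wait_time_ms,
           waiting_tasks_count
    FROM sys.dm_os_wait_stats
    ORDER BY wait_time_ms DESC;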

For any failures in connecting to the SQL database from a CS Professional Suite application on a workstation where programmatic troubleshooting has been unsuccessful, run an ODBC connection test to verify SQL connectivity outside the software. This test uses Windows tools (and is completely independent of the CS Professional Suite application) to attempt to establish a connection between a computer and your SQL Server. A command-line alternative that uses the sqlcmd utility is shown after the steps below.

Testing an ODBC connection

  1. Open the Microsoft Windows Control Panel.
  2. Choose Administrative Tools, and then choose Data Sources (ODBC) or ODBC Data Sources (32-bit or 64-bit), depending on your version of Microsoft Windows.
  3. Click the Add button.
  4. Choose SQL Server and click Finish.
  5. Enter a name and description (for the test, any text will do), and then enter the Server\Instance name of the SQL Server instance used by your CS Professional Suite application (for example, ServerName\CREATIVESOLUTION).
  6. Click Next until you can click Finish in the setup wizard. You should see a pop-up window showing the ODBC Microsoft SQL Server Setup dialog.
  7. Click the Test Data Source button. You should see a prompt noting a successful connection to the SQL Server, and "TEST COMPLETED SUCCESSFULLY."
  8. Once you have the results, you can close all the open windows and exit the Administrative Tools screen.
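If the sqlcmd utility is installed on the workstation, a similar connectivity check can be run from a Command Prompt. The sketch below assumes the Server\Instance name from step 5 and uses Windows Authentication (the -E switch); a successful connection returns the server name and version.

    sqlcmd -S ServerName\CREATIVESOLUTION -E -Q "SELECT @@SERVERNAME, @@VERSION"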

A failure of the ODBC connection test would indicate that the Microsoft Windows workstation was unsuccessful at connecting to Microsoft SQL Server (outside of the CS Professional Suite application). For further assistance in troubleshooting this issue on your firm's computers, you would need to contact a qualified IT professional who is proficient with SQL Server. Once the ODBC connection test is successful, it is highly likely that our CS Professional Suite applications will also be able to make a connection and operate successfully on your system.
