Best Practices for Managing Business Central Database Capacity

Finance leaders: Don’t let hidden data costs slow you down. Proactive database capacity management in Business Central cuts expenses, boosts performance, and ensures compliance—unlock agility and scalability for your ERP.

Published: Oct 31, 2025

The Strategic Importance of Database Capacity Management

Managing your Dynamics 365 Business Central database capacity is far more than a technical housekeeping task; it is a critical business function with direct strategic implications. Proactive oversight of your data footprint is essential for maintaining optimal system performance, controlling the total cost of ownership, and ensuring the long-term reliability and agility of your ERP system. A well-managed database is the foundation of a healthy and responsive Business Central environment.

Why Finance Leaders Should Care About Database Capacity Management in Business Central

Actively managing your database size delivers several tangible business advantages:

  • Cost Efficiency: The most direct benefit is financial. By keeping your data storage within your entitled limits, you prevent the need to purchase additional capacity. This directly lowers the total cost of ownership for your Business Central solution and ensures predictable subscription costs.
  • System Performance: A leaner database is a faster database. Reducing the overall data volume improves runtime performance, particularly for I/O-intensive workloads. Queries execute more quickly, and system processes that read and write large amounts of data are more efficient, leading to a better user experience.
  • Operational Agility: Optimized database management processes, such as backups and restores, are significantly faster on smaller databases. This enhances business continuity by reducing downtime during critical maintenance and makes overall system administration simpler and more agile.
  • Administrative Freedom: Staying within your storage quota ensures that key administrative actions remain available. When capacity limits are exceeded, actions such as creating new production or sandbox environments, or copying existing ones, are restricted, which can hinder development, testing, and expansion initiatives.

Understanding these benefits provides the strategic motivation for action. The first step in effective management is to gain clear visibility into your organization’s current storage footprint.


Understanding and Monitoring Your Storage Footprint

The Business Central administration center provides the Capacity page, which serves as the central dashboard for monitoring your organization’s storage entitlements and real-time consumption. This page is the primary tool for administrators to gain the visibility needed to make informed decisions about data management and capacity planning.

Default Capacity and Entitlements

By default, every Business Central customer tenant receives a base storage capacity of 80 GB. This capacity is part of a shared pool, meaning the combined storage usage of all environments—both production and sandbox—within the tenant counts toward this single limit. In addition to the base amount, your total capacity is increased based on the number and type of user licenses purchased.

The following table itemizes the additional storage capacity granted for each user license type:

License Type    Additional Storage per License
Premium         3 GB
Essential       2 GB
Device          1 GB
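
To make the entitlement model concrete, consider an illustrative tenant with 10 Premium, 20 Essential, and 5 Device licenses (the license counts here are hypothetical). Its total shared capacity would be 80 GB (base) + 10 × 3 GB + 20 × 2 GB + 5 × 1 GB = 155 GB, shared across all production and sandbox environments in the tenant.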

Navigating the Capacity Page

The Capacity page consolidates critical storage metrics into a single view, allowing administrators to quickly assess the organization’s standing. Key components include:

  • Total Database Storage Usage: This provides a high-level overview of the total storage consumed across all environments within the tenant, measured against the total available capacity.
  • Storage Capacity, by Source: This section provides a detailed breakdown of how your total available capacity is calculated. It shows the amount provided by the default entitlement, the additional capacity granted from user licenses, and any extra storage that has been purchased.
  • Storage Usage by Environment: This provides a granular, environment-by-environment breakdown of storage consumption. It allows you to identify which specific production and sandbox environments are contributing most to the overall usage, enabling targeted data reduction efforts.

Regularly monitoring these metrics allows you to identify trends and take proactive measures to reduce your storage footprint before it impacts administrative operations.


Proactive Data Reduction Strategies

The most effective and cost-efficient method for managing database capacity is through proactive and consistent data hygiene. By regularly removing obsolete and unnecessary data, you can prevent organic data growth from consuming your available storage. These strategies are typically performed by application administrators or developers who have a deep understanding of the business data.

Deleting Unused Companies

Over time, a Business Central tenant can accumulate companies that are no longer active. These often include companies created for testing, demonstration purposes, or initial implementation trials. Each of these companies contains a full set of tables and data that consumes valuable database space. Deleting these unused companies is a straightforward and highly effective method for reclaiming a significant amount of storage.

Archiving or Deleting Historical Documents

Business operations generate a large volume of documents, such as invoiced purchase orders, sales quotes, and completed production orders. While essential during their active lifecycle, many of these historical documents eventually become obsolete. Establishing a process to periodically review and delete these documents when they are no longer required for operational or compliance reasons is a crucial practice for managing long-term data growth.
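
As a hedged illustration of how this cleanup can be wired into an extension, the sketch below launches the standard batch job for invoiced sales orders from a custom codeunit. The report name is the one used in the base application, but verify the exact object name and ID in your Business Central version, and apply the same pattern to purchase orders and other document types.

codeunit 50100 "Invoiced Sales Order Cleanup"
{
    trigger OnRun()
    begin
        // Opens the request page of the standard batch job so an administrator can
        // set date and customer filters before deleting invoiced sales orders that
        // are no longer required for operational or compliance reasons.
        Report.Run(Report::"Delete Invoiced Sales Orders");
    end;
}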

Implementing Retention Policies

Retention policies are a powerful tool for automated data governance. These policies allow you to define rules that automatically delete outdated data from tables containing log entries and archived records. By specifying how frequently Business Central should purge this information, you can ensure that transient data does not accumulate indefinitely. This “set and forget” approach is critical for maintaining a clean and efficient database with minimal manual intervention.
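
Retention policies are configured on the Retention Policies page, but developers can also register their own tables so administrators can apply policies to custom log data. The sketch below shows the commonly documented registration pattern, assuming a hypothetical custom table "My Integration Log Entry"; the codeunit and method names come from the system application, so confirm them against your platform version.

codeunit 50101 "Retention Policy Install"
{
    Subtype = Install;

    trigger OnInstallAppPerCompany()
    var
        RetenPolAllowedTables: Codeunit "Reten. Pol. Allowed Tables";
    begin
        // Makes the custom log table selectable on the Retention Policies page so
        // outdated entries can be purged automatically on the defined schedule.
        RetenPolAllowedTables.AddAllowedTable(Database::"My Integration Log Entry");
    end;
}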

While these data hygiene practices are fundamental, more technical methods can be employed for even greater storage efficiency.

Advanced Data Management for Developers and Administrators

For organizations looking to maximize storage efficiency and performance, Business Central supports advanced techniques that require a deeper technical understanding. These methods are typically implemented by developers or administrators with expertise in database and application architecture.

Data Compression

Data compression reduces the physical size of tables in the database. This not only saves storage space but can also improve the performance of I/O-intensive workloads, as queries need to read fewer data pages from disk. However, this benefit comes with a trade-off: the database server requires additional CPU resources to compress and decompress data during read and write operations.

In Business Central, compression is primarily managed through the CompressionType property on table objects in AL code. For Business Central online, this is the only method, and page-level data compression is automatically enabled by default on all tables. This setting is applied to both Microsoft and third-party extensions unless a developer explicitly sets the CompressionType property to None to disable it for a specific table.
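
As a simple illustration of the AL property, the table sketch below explicitly controls compression on a hypothetical custom table; the table and field names are placeholders, and only the CompressionType line is the point here.

table 50102 "Archived Measurement"
{
    // Explicitly request page compression. Set this to None to exclude the table
    // from compression, for example when CPU overhead outweighs the I/O savings.
    // Valid values are Unspecified (platform default), None, Row, and Page.
    CompressionType = Page;

    fields
    {
        field(1; "Entry No."; Integer) { }
        field(2; "Measured At"; DateTime) { }
        field(3; Value; Decimal) { }
    }

    keys
    {
        key(PK; "Entry No.") { Clustered = true; }
    }
}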

For on-premises deployments, administrators have an additional option: the Start-NAVDatabaseCompression PowerShell cmdlet. This tool allows for managing compression at the database level and is available for Business Central 2020 release wave 1 and later.

Table Partitioning (On-Premises Considerations)

For on-premises installations, SQL Server table partitioning is a powerful technique for managing very large tables. Partitioning divides a single large table into smaller, more manageable logical units, or partitions, based on a specific data field (e.g., “Posting Date”). This approach offers significant performance and manageability benefits:

  • Maintenance operations, such as rebuilding an index, can be performed more quickly on individual partitions rather than the entire table.
  • Query performance can be improved through a process called partition elimination, where the SQL Server optimizer only accesses the partitions relevant to the query’s filter criteria.
  • Lock contention on the table can be reduced by enabling lock escalation at the partition level instead of the table level.

Implementing partitioning requires careful planning. The primary requirement is that the partitioning field must be part of the table’s primary key. This makes partitioning a critical design consideration for new tables developed in custom extensions, as altering the primary key of an existing, published extension is considered a breaking change and is not supported.
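
As an illustration of that design consideration, the sketch below defines a hypothetical high-volume log table whose primary key includes the intended partitioning field from the start; the SQL Server partition function and scheme themselves are still created by a database administrator outside of AL.

table 50103 "Device Telemetry Entry"
{
    fields
    {
        field(1; "Posting Date"; Date) { }
        field(2; "Entry No."; Integer) { }
        field(3; "Device Id"; Code[20]) { }
        field(4; Payload; Blob) { }
    }

    keys
    {
        // The intended partitioning column ("Posting Date") is part of the clustered
        // primary key from day one, because changing the primary key of a published
        // extension later is a breaking change.
        key(PK; "Posting Date", "Entry No.") { Clustered = true; }
    }
}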

When proactive and technical strategies are not enough, organizations must understand how to respond when capacity limits are reached.

Responding to Capacity Limits

When a tenant’s total database usage exceeds its allocated quota, Business Central implements specific restrictions to encourage resolution. It is important to understand that these measures are designed to limit administrative expansion, not to disrupt daily business operations.

Consequences of Exceeding Quotas

If your organization surpasses its storage limit, core business processes continue to function without interruption: users in existing environments can still post transactions and run their day-to-day operations.

However, certain administrative capabilities will be blocked until the capacity issue is resolved. The following actions are restricted once the quota is exceeded:

  • Creating new production or sandbox environments.
  • Copying existing environments.

Options for Increasing Capacity

Organizations that require more storage space have the option to purchase additional capacity. This is done by contacting your reselling partner to acquire license add-ons.

The following add-ons are available to increase storage or environment entitlements:

  • Dynamics 365 Business Central Database Capacity (1 GB)
  • Dynamics 365 Business Central Database Capacity (100 GB)
  • Dynamics 365 Business Central Database Capacity Overage (1 GB)
  • Dynamics 365 Business Central Additional Environment Add-on

It is critical to note that the lower-priced “Dynamics 365 Business Central Database Capacity Overage (1 GB)” add-on is only available to customers who have already purchased at least one “Dynamics 365 Business Central Database Capacity (100 GB)” add-on.

It is also noteworthy that purchasing an additional production environment provides more than just a new environment. This add-on also increases the tenant’s shared storage capacity by 4 GB and includes three additional sandbox environments.

These commercial options provide a direct path to resolving capacity constraints when data reduction is not feasible.

Conclusion: A Synthesized Approach to Capacity Management in Business Central

Effective management of your Business Central database capacity is not a one-time event but a continuous, maturing process. The most successful organizations adopt an integrated, tiered strategy that evolves with their needs.

This strategy begins with a solid foundation of continuous monitoring via the Admin Center to maintain visibility. The first line of defense is a consistent regimen of proactive data hygiene—deleting unused companies, archiving old documents, and implementing retention policies. For long-term scalability, the next tier involves advanced optimization through technical strategies like data compression and, for on-premises deployments, table partitioning. Finally, when growth outpaces optimization, the final recourse is a clear understanding of the commercial options for purchasing additional capacity.

By implementing this strategic roadmap, administrators and developers can ensure a cost-effective, high-performance system that remains agile and ready to support future business growth.

Ready to turn strategy into impact?

Don’t let capacity challenges slow your momentum. Our System Review Business Impact Assessment for Business Central is designed to help you uncover hidden inefficiencies, optimize performance, and align your system with future growth. Start your assessment today and take control of your Business Central environment with confidence.
