
How to Clean Up Dataverse Storage and Instantly Improve Power Platform Performance



Why Ignoring Dataverse Cleanup Is Costing Your Organization Thousands Every Month

Microsoft Dataverse storage management is often overlooked, yet it directly affects the health and performance of your Power Platform environment. As your apps, flows, and integrations grow, unnecessary data accumulates, driving up storage costs and degrading performance.

In this guide, you’ll learn exactly how to analyze, clean, and optimize Dataverse storage to reclaim space and boost performance instantly.

Understand Dataverse Storage Types First

Before you begin cleanup, you need to understand what Dataverse storage consists of. Dataverse divides its storage into three categories:

Database Storage:

This includes:

- Tables (formerly entities)

- Rows (records)

- System tables (audit logs, plug-in logs, async jobs)

File Storage:

Used for:

- Attachments

- Notes (.txt, .pdf, images)

- Email attachments

- File columns

Log Storage:

Includes:

- Audit logs

- Plug-in trace logs

- System logs – often very large and overlooked

Identifying which storage type consumes the most capacity in your environment is the first step toward effective cleanup.

Why Dataverse Storage Cleanup Matters

Neglecting storage can degrade Power Platform performance in subtle ways: slower queries, delayed app loading, and even throttled API calls. Cleanup matters because:

- Performance Gains: Reducing clutter speeds up data retrieval and processing, improving app performance.

- Cost Savings: Dataverse storage is priced by the capacity used across database, file, and log storage. Freeing up space helps you avoid overage charges and add-on capacity purchases.

- Security and Compliance: Duplicates and stale logs can become a liability. Regular cleanup keeps data retention aligned with policy and good practice.

- Flexibility: A clean environment leaves room to grow without hitting capacity limits, which matters in sensitive areas such as finance and healthcare.

Transforming Power Platform speed with cleanup.

Analyze Your Current Storage Usage

Your first step is to analyze current storage utilization in the Power Platform Admin Center.

Where to check:

Power Platform Admin Center → Licensing → Dataverse

You’ll see:

- Storage breakdown for each environment.

- Storage breakdown by Database, File, and Log.

- A table-level breakdown when you click into an environment.

Identify key problem areas:

- Are audit logs consuming large amounts of data?

- Is file storage dominated by notes and attachments?

- Are custom tables growing in size?

- Are system jobs (async operations) filling log storage?

This helps you decide what needs to be prioritized for cleaning.
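As a quick way to prioritize, you can rank the storage types from a capacity report exported out of the admin center. A minimal Python sketch, assuming a hypothetical CSV export with columns `name`, `database_mb`, `file_mb`, and `log_mb` (the real export's column names may differ):

```python
import csv
import io

# Hypothetical export shape; adjust column names to your actual report.
SAMPLE = """name,database_mb,file_mb,log_mb
Prod,5120,11000,900
Dev,800,300,4200
"""

def storage_totals(csv_text):
    """Sum Database/File/Log usage (MB) across all environments."""
    totals = {"database_mb": 0.0, "file_mb": 0.0, "log_mb": 0.0}
    for row in csv.DictReader(io.StringIO(csv_text)):
        for key in totals:
            totals[key] += float(row[key])
    return totals

def biggest_consumer(csv_text):
    """Return the storage type to prioritize for cleanup."""
    totals = storage_totals(csv_text)
    return max(totals, key=totals.get)
```

Running `biggest_consumer(SAMPLE)` on the sample data points at file storage, which is the common case: notes and attachments usually dominate.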

How to Do a Bulk Delete

If you need to delete a large number of records quickly, a bulk delete job is a great option.

Steps:

1. Go to Advanced Settings → Data Management → Bulk Record Deletion

2. From the menu, select the option to create a new bulk deletion job

3. Define filter criteria. For example, filter to records modified more than 365 days ago.

4. Schedule it (once or recurring)

Some records you might consider deleting are:

- Old audit logs

- Completed system jobs

- Duplicate records

- Older email records that are no longer needed

- Obsolete log records (including plug-in trace logs and workflow logs)

All of this can be done without affecting users, as bulk delete jobs run in the background.
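Bulk deletion jobs can also be created programmatically through the Dataverse Web API's `BulkDelete` action. The sketch below only builds the request body; the `QueryExpression` JSON shape is simplified and the table and column names are examples, so verify both against your environment before POSTing to `[org-url]/api/data/v9.2/BulkDelete`:

```python
import json
from datetime import datetime, timedelta, timezone

def build_bulk_delete_payload(entity, older_than_days, job_name):
    """Build a JSON body for the Dataverse BulkDelete Web API action.

    Simplified sketch: the real QueryExpression contract has more
    members, and authentication/headers are omitted entirely.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=older_than_days)
    return {
        "QuerySet": [{
            "EntityName": entity,
            "Criteria": {
                "FilterOperator": "And",
                "Conditions": [{
                    "AttributeName": "modifiedon",
                    "Operator": "LessEqual",
                    "Values": [cutoff.strftime("%Y-%m-%dT%H:%M:%SZ")],
                }],
            },
        }],
        "JobName": job_name,
        "SendEmailNotification": False,
        "ToRecipients": [],
        "CCRecipients": [],
        "RecurrencePattern": "",  # empty string = run once
        "StartDateTime": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    }

# Example: purge completed system jobs older than a year.
payload = build_bulk_delete_payload("asyncoperation", 365, "Purge old system jobs")
body = json.dumps(payload)
```

For recurring cleanup, `RecurrencePattern` accepts an RFC 2445 frequency string such as `FREQ=DAILY;INTERVAL=1`.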

Clean Up Audit Logs and System Logs

Audit and system logs can balloon log storage quickly. To reclaim space:

- Audit Logs: In the admin center, go to your environment's Settings > Auditing > Free up capacity > Delete audit logs. You can delete by table, by access logs, or by date range (logs are deleted oldest first). Confirm to remove.

- System Logs: For plug-in traces (the PluginTraceLog table), use bulk deletion jobs targeting that entity. Set recurring jobs to delete logs older than a set period (e.g., 30 days).

Warning: Deleted logs can't be recovered, so align with retention policies. This can free significant log capacity, improving query speeds.
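A recurring trace-log cleanup job needs a query matching the logs to delete. A small sketch that emits FetchXML for plug-in trace logs older than a cutoff, using the standard `olderthan-x-days` operator (the table's logical name, `plugintracelog`, is assumed here):

```python
def trace_log_cleanup_fetchxml(days=30):
    """FetchXML matching plug-in trace logs older than `days` days,
    suitable as the query behind a recurring bulk deletion job."""
    return (
        '<fetch>'
        '<entity name="plugintracelog">'
        '<filter>'
        f'<condition attribute="createdon" operator="olderthan-x-days" value="{days}" />'
        '</filter>'
        '</entity>'
        '</fetch>'
    )
```

The same pattern works for other log-heavy tables by swapping the entity name.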

Removing Unwanted, Duplicate, or Outdated Records

Duplicates waste storage space and confuse users, while outdated records clutter the workspace. You can address these issues as follows:

- Detect Duplicates: Enable duplicate detection rules in Settings > Data management > Duplicate detection rules. Create rules (e.g., matching on email or name) and publish them.

- Remove or Merge Duplicates: When records are created or updated, Dataverse prompts you about potential duplicates. You can select duplicates to merge, keeping the master record and reassociating the child records.

- Outdated Records: Use Advanced Find or bulk deletion jobs to query and remove inactive or old records, e.g., closed cases from more than two years ago. Schedule recurring jobs to keep the system maintained.
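Conceptually, a duplicate detection rule groups records on a normalized match column. A minimal sketch of that matching logic, using a hypothetical in-memory contact list with an `email` field:

```python
def find_duplicates(records, key="email"):
    """Group records by a normalized match field, mirroring what a
    Dataverse duplicate detection rule does with a matching column."""
    seen = {}
    for rec in records:
        norm = (rec.get(key) or "").strip().lower()
        if not norm:
            continue  # blank match fields never count as duplicates
        seen.setdefault(norm, []).append(rec)
    return {k: v for k, v in seen.items() if len(v) > 1}

contacts = [
    {"id": 1, "email": "a@contoso.com"},
    {"id": 2, "email": "A@Contoso.com "},  # same address, different casing
    {"id": 3, "email": "b@contoso.com"},
]
dupes = find_duplicates(contacts)
```

Here records 1 and 2 collapse into one duplicate group, which is exactly the set a merge operation would act on.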

Reduce File Storage (Notes & Attachments)

Notes and attachments (especially images, PDFs, and emails) tend to consume the most storage.

Optimization strategies:

- Use bulk delete to remove old attachments.

- Move attachments to SharePoint using the Attachment Management solution.

- Use file columns sparingly.

- Limit uploads of excessively large files.

Moving attachments to SharePoint alone can free several GBs of space.
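To decide which attachments to target first, you can aggregate note sizes by MIME type. A sketch of that summarization, assuming you have already queried the `annotation` table's `filesize` and `mimetype` columns and hold the rows in memory:

```python
from collections import defaultdict

def attachment_usage_by_type(notes):
    """Aggregate note attachment sizes (bytes) by MIME type,
    largest consumers first."""
    usage = defaultdict(int)
    for note in notes:
        if note.get("filesize"):  # skip notes without an attachment
            usage[note.get("mimetype", "unknown")] += note["filesize"]
    return dict(sorted(usage.items(), key=lambda kv: -kv[1]))

# Hypothetical query results for illustration.
sample = [
    {"mimetype": "application/pdf", "filesize": 4_000_000},
    {"mimetype": "image/png", "filesize": 1_500_000},
    {"mimetype": "application/pdf", "filesize": 2_500_000},
]
usage = attachment_usage_by_type(sample)
```

The top entries of the result tell you which file types (often PDFs and emails) to migrate to SharePoint or bulk delete first.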

Clean Up Dataflows & Integration Artifacts

Integrations and dataflows (especially Power Query and Azure Synapse Link) create artifacts that incur additional storage costs. To remove these artifacts:

- Remove Unused Dataflows: In the Power Apps maker portal, review your dataflows and delete inactive ones. Monitor the related tables in the admin center.

- Streamline Integrations: For Azure Synapse Link or Fabric links, remove unnecessary tables/shortcuts. After removing obsolete links, vacuum the Lakehouse destinations to delete the Delta files.

- Retention Policies: Set dataflow policies to archive or delete older data, reducing the database and file impact of exports and syncs.

Auditing these processes regularly will stop unused storage from growing.

Final Thoughts

Dataverse storage has a direct impact on Power Platform performance, app speed, cost, and scalability. Analyze and focus on high-impact areas first, especially logs and files. Automate with bulk jobs, and always test in a sandbox first. Following these steps keeps the environment clean and healthy and leaves room for innovation. For continuous monitoring, use admin center reports; for longer retention of reports, archive to Fabric. If you are experiencing overages, adding capacity can be a temporary fix while you optimize.

Peafowl IT Solution helps organizations develop and optimize Microsoft Power Platform solutions by building scalable Power Apps, automating processes, and managing Dataverse efficiently. We ensure better performance, lower storage costs, and secure, well-governed environments so teams can innovate with confidence.
