Effective Ways to Identify and Remove Unused Tables in Snowflake

Unused tables in Snowflake can accumulate over time and take up precious storage space. By identifying and removing these unused tables, you can optimize your Snowflake usage, reduce costs, and improve overall performance. In this article, we will explore various effective ways to identify and remove unused tables in Snowflake.

Exploring Snowflake Access History

Understanding the access history of your Snowflake tables is the first step in identifying unused ones. Snowflake records this in the SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY view (available in Enterprise Edition and above), which logs, for each query executed in the account, the objects that were read and written.

By delving into the access history logs, you gain valuable insights into the usage patterns of your tables. This information allows you to optimize your data warehouse by identifying tables that are rarely accessed or no longer needed. Let's explore some key concepts related to Snowflake access history.

Understanding the Difference Between Direct and Base Objects Access

When analyzing the access history logs, it is important to distinguish the two object columns Snowflake records for each query: direct_objects_accessed and base_objects_accessed. Direct objects are the objects named in the query text itself, for example a view the user selected from. Base objects are the underlying objects the query ultimately read once views, derived tables, and materialized views are resolved down to their sources. Understanding this difference gives you a clearer picture of table usage.

The direct_objects_accessed column tells you which queries name a table explicitly. The base_objects_accessed column gives a broader view: a table appears there whenever a query reads it, even indirectly through a view or materialized view built on top of it. Examining both lets you identify not only the tables that are queried directly but also those that are consumed behind the scenes.

By comparing the two columns, you can spot tables that are rarely named directly but appear frequently in base_objects_accessed. Such a table may seldom be queried by name, yet it plays a significant role in supporting views, data transformations, or reporting activities, so it is not a removal candidate.
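As a concrete starting point, the two columns can be compared side by side. The following query is a sketch that assumes the SNOWFLAKE.ACCOUNT_USAGE share is accessible to your role; the 90-day window is an arbitrary choice to adjust:

```sql
-- Count distinct queries per table, split into direct vs. base access,
-- over the last 90 days of access history.
WITH direct_access AS (
    SELECT f.value:"objectName"::string AS table_name,
           COUNT(DISTINCT ah.query_id)  AS direct_queries
    FROM snowflake.account_usage.access_history ah,
         LATERAL FLATTEN(input => ah.direct_objects_accessed) f
    WHERE ah.query_start_time >= DATEADD(day, -90, CURRENT_TIMESTAMP())
      AND f.value:"objectDomain"::string = 'Table'
    GROUP BY 1
),
base_access AS (
    SELECT f.value:"objectName"::string AS table_name,
           COUNT(DISTINCT ah.query_id)  AS base_queries
    FROM snowflake.account_usage.access_history ah,
         LATERAL FLATTEN(input => ah.base_objects_accessed) f
    WHERE ah.query_start_time >= DATEADD(day, -90, CURRENT_TIMESTAMP())
      AND f.value:"objectDomain"::string = 'Table'
    GROUP BY 1
)
SELECT COALESCE(b.table_name, d.table_name) AS table_name,
       COALESCE(d.direct_queries, 0)        AS direct_queries,
       COALESCE(b.base_queries, 0)          AS base_queries
FROM base_access b
FULL OUTER JOIN direct_access d USING (table_name)
ORDER BY base_queries DESC;
```

A table with zero direct queries but a high base count is being consumed through views; a table with zeros in both columns is a removal candidate.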

Parsing base_objects_accessed

The "base_objects_accessed" field in the access history logs provides valuable information about which objects were accessed during each query execution. By analyzing this field, you can identify tables that are consistently accessed and those that are rarely or never accessed. This analysis serves as a starting point for identifying unused tables.

When parsing the "base_objects_accessed" column, you can extract each accessed object's fully qualified name (objectName), its type (objectDomain, e.g. Table or Materialized view), and the columns that were read. This information gives you a granular understanding of the dependencies and relationships between different objects in your data warehouse.

By identifying tables that are consistently accessed, you can ensure that they are properly optimized for performance. On the other hand, tables that are rarely or never accessed can be candidates for archiving or removal, freeing up valuable storage space and improving query performance.
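The "rarely or never accessed" candidates can be surfaced with a query along these lines, again assuming the ACCOUNT_USAGE share is available; the 90-day window and the filter to permanent base tables are assumptions to adjust:

```sql
-- Permanent tables that never appear in base_objects_accessed over the
-- last 90 days: candidates for archiving or removal.
WITH accessed AS (
    SELECT DISTINCT f.value:"objectName"::string AS table_name
    FROM snowflake.account_usage.access_history ah,
         LATERAL FLATTEN(input => ah.base_objects_accessed) f
    WHERE ah.query_start_time >= DATEADD(day, -90, CURRENT_TIMESTAMP())
      AND f.value:"objectDomain"::string = 'Table'
)
SELECT t.table_catalog || '.' || t.table_schema || '.' || t.table_name AS table_name
FROM snowflake.account_usage.tables t
LEFT JOIN accessed a
  ON a.table_name = t.table_catalog || '.' || t.table_schema || '.' || t.table_name
WHERE t.deleted IS NULL
  AND t.table_type = 'BASE TABLE'
  AND a.table_name IS NULL;
```

Treat the result as a shortlist to review with stakeholders, not an automatic drop list.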

Furthermore, analyzing the parsing of "base_objects_accessed" can also help you identify any unexpected or unauthorized access to certain tables. By monitoring this field, you can detect any potential security breaches or unauthorized data access, allowing you to take appropriate actions to safeguard your data.

In short, Snowflake access history provides valuable insight into how your tables are used. By distinguishing direct from base object access and parsing "base_objects_accessed", you can identify unused tables, optimize performance, and keep watch over the security of your data warehouse.

Tracking Table Query History

In addition to access history, tracking the query history for each table is crucial in identifying unused tables. Snowflake's QUERY_HISTORY view records every query executed in the account; joined with ACCESS_HISTORY, it lets you monitor query activity at the table level. By reviewing this history, you can identify tables that have not been queried recently or show low activity. These tables are good candidates for removal.

Query history provides valuable insights into the usage patterns of your tables. By analyzing the query history, you can gain a better understanding of how frequently each table is being accessed and the types of queries being executed. This information can help you make informed decisions about table optimization and resource allocation.

When reviewing the query history, it's important to consider not only the frequency of queries but also the duration and complexity of the queries. A table that is rarely queried may still be important if it is involved in complex and resource-intensive operations. On the other hand, a table that is frequently queried but only for simple operations may not require as much storage or processing power.

By tracking the query history at the table level, you can also identify potential bottlenecks or performance issues. If a particular table consistently shows a high number of queries or long query durations, it may be a candidate for tuning, for example with a clustering key or the search optimization service (Snowflake has no traditional indexes). Monitoring the query history allows you to address these issues proactively and keep your data warehouse performing well.
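The per-table frequency and duration analysis described above can be sketched by joining QUERY_HISTORY to ACCESS_HISTORY on query_id; the 30-day window is an arbitrary choice:

```sql
-- Per-table query count and average elapsed time over the last 30 days.
SELECT f.value:"objectName"::string                AS table_name,
       COUNT(DISTINCT qh.query_id)                 AS query_count,
       ROUND(AVG(qh.total_elapsed_time) / 1000, 1) AS avg_elapsed_s
FROM snowflake.account_usage.query_history qh
JOIN snowflake.account_usage.access_history ah
  ON ah.query_id = qh.query_id,
     LATERAL FLATTEN(input => ah.base_objects_accessed) f
WHERE qh.start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
  AND f.value:"objectDomain"::string = 'Table'
GROUP BY 1
ORDER BY query_count DESC;
```

Sorting by avg_elapsed_s instead surfaces the tables involved in the heaviest queries, which is where clustering or search optimization pays off.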

In addition to identifying unused tables, query history can also help you identify tables that are no longer needed. As your data landscape evolves, certain tables may become obsolete or redundant. By regularly reviewing the query history, you can identify these tables and safely remove them, freeing up valuable storage space and improving overall data management.

Overall, tracking table query history is an essential practice for any data-driven organization. It provides valuable insights into table usage, helps optimize resource allocation, and ensures the efficient management of your data warehouse. By leveraging the power of Snowflake's query history feature, you can make informed decisions about table optimization and streamline your data operations.

Managing Table Storage Costs

Unused tables consume storage space and contribute to storage costs. By regularly monitoring and managing table storage costs, you can identify tables with a high storage footprint and determine if they are being utilized effectively. If a table is not being used, consider removing it to reduce costs and optimize storage resources.

One effective way to manage table storage costs is by analyzing the data usage patterns of your tables. By understanding how frequently and intensively each table is accessed, you can make informed decisions about whether to keep or remove them. For example, if you have a table that is rarely accessed, it may be more cost-effective to archive the data or move it to a different storage solution.
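The storage-footprint side of this analysis is served by the ACCOUNT_USAGE.TABLE_STORAGE_METRICS view. A sketch (the GB unit and the LIMIT are our choices):

```sql
-- Largest live tables by billable storage, split into active,
-- Time Travel, and Fail-safe bytes.
SELECT table_catalog, table_schema, table_name,
       ROUND(active_bytes      / POWER(1024, 3), 2) AS active_gb,
       ROUND(time_travel_bytes / POWER(1024, 3), 2) AS time_travel_gb,
       ROUND(failsafe_bytes    / POWER(1024, 3), 2) AS failsafe_gb
FROM snowflake.account_usage.table_storage_metrics
WHERE deleted = FALSE
ORDER BY active_bytes DESC
LIMIT 50;
```

Cross-referencing this list with the access-history results tells you which of your largest tables are also unread, i.e. where removal saves the most.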

Another important aspect to consider when managing table storage costs is data retention policies. Some tables may contain data that needs to be retained for a specific period of time, while others may have data that can be safely deleted after a certain period. By implementing proper data retention policies, you can ensure that you are not unnecessarily storing data that is no longer needed, thus reducing storage costs.

In addition to monitoring and analyzing table usage, it is also important to regularly review and optimize the schema design of your tables. By optimizing the schema, you can reduce the overall storage footprint of your tables and potentially save on storage costs. This can be achieved by eliminating unnecessary columns, normalizing data, and using appropriate data types.

Data compression also affects storage costs, although in Snowflake compression is automatic: all table data is stored compressed, and you are billed on the compressed size. You can still influence the compression ratio, for instance by choosing appropriate data types and by loading or clustering data so that similar values sit together, which is particularly beneficial for tables containing large amounts of repetitive data.

Lastly, take advantage of Snowflake features designed specifically to reduce storage spend: transient or temporary tables for data that does not need Fail-safe protection, and a shorter DATA_RETENTION_TIME_IN_DAYS setting to limit Time Travel storage on tables that do not require a long retention window.

In conclusion, managing table storage costs requires a proactive approach: monitor and analyze table usage, implement sensible data retention policies, keep schemas lean, and use Snowflake's storage-saving options where they fit. By taking these steps, you can meaningfully reduce storage costs and optimize your storage resources.

Identifying Tables with No Recent Queries

Tables with no recent queries are strong indicators of unused tables. Snowflake exposes a LAST_ALTERED timestamp for every table in the INFORMATION_SCHEMA.TABLES and ACCOUNT_USAGE.TABLES views, and the most recent query against a table can be derived from ACCESS_HISTORY. Together, these let you identify tables that have been neither queried nor modified within a chosen timeframe, so you can remove them with confidence that they are not actively contributing to your data workflows.
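One way to surface such tables is to combine the most recent access-history read with LAST_ALTERED. A sketch, assuming the ACCOUNT_USAGE share and an arbitrary 90-day idle threshold:

```sql
-- Tables neither read (per access history) nor altered in the last 90 days.
WITH last_read AS (
    SELECT f.value:"objectName"::string AS table_name,
           MAX(ah.query_start_time)     AS last_queried
    FROM snowflake.account_usage.access_history ah,
         LATERAL FLATTEN(input => ah.base_objects_accessed) f
    WHERE f.value:"objectDomain"::string = 'Table'
    GROUP BY 1
)
SELECT t.table_catalog || '.' || t.table_schema || '.' || t.table_name AS table_name,
       lr.last_queried,
       t.last_altered
FROM snowflake.account_usage.tables t
LEFT JOIN last_read lr
  ON lr.table_name = t.table_catalog || '.' || t.table_schema || '.' || t.table_name
WHERE t.deleted IS NULL
  AND COALESCE(lr.last_queried, '1970-01-01'::timestamp_ltz)
        < DATEADD(day, -90, CURRENT_TIMESTAMP())
  AND t.last_altered < DATEADD(day, -90, CURRENT_TIMESTAMP());
```

Note that ACCESS_HISTORY retains about a year of data, so "never queried" here means "not queried within the retained window".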

Uncovering Unused Tables with dbt

dbt, or data build tool, is a powerful tool for managing and organizing data transformations in Snowflake. It can also be utilized to uncover unused tables. By analyzing the dependency graph in dbt, you can identify tables that are not referenced in any transformation or pipeline. These tables are likely candidates for removal. Keep in mind that dbt can only vouch for the models it manages; tables created outside dbt still need the access-history techniques above.

Tracking Table Update History

Tracking the update history of your tables provides another useful signal. The LAST_ALTERED timestamp shows when a table was last modified, and the ACCOUNT_USAGE.COPY_HISTORY view records a LAST_LOAD_TIME for data loaded via COPY INTO or Snowpipe. If a table has not been updated or loaded for an extended period, and it is also not being read, it can be considered for removal.
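Pulling the most recent load time per table from COPY_HISTORY (which retains roughly a year of load activity) can be sketched as:

```sql
-- Most recent COPY/Snowpipe load into each table; tables at the top of
-- the list have gone longest without new data.
SELECT table_catalog_name, table_schema_name, table_name,
       MAX(last_load_time) AS most_recent_load
FROM snowflake.account_usage.copy_history
GROUP BY 1, 2, 3
ORDER BY most_recent_load;
```

Tables loaded through other paths (INSERT, MERGE, tasks) will not appear here, which is why LAST_ALTERED is the better general-purpose staleness signal.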


In conclusion, identifying and removing unused tables in Snowflake is essential for optimizing storage, reducing costs, and improving performance. By leveraging the access history, query history, storage cost management, and other techniques discussed in this article, you can effectively identify and remove these unused tables, ensuring your Snowflake environment is streamlined and efficient.

Tips to Optimize Your Snowflake Usage

Here are some additional tips to optimize your Snowflake usage:

  1. Regularly review your access and query history logs to stay updated on table usage.
  2. Automate the process of identifying and removing unused tables, for example with Snowflake tasks and stored procedures.
  3. Consider leveraging Snowflake's time travel and zero-copy cloning features to minimize the impact of removing unused tables.
  4. Collaborate with data stakeholders to ensure the removal of unused tables aligns with your organization's data governance policies.
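Tip 3 above can look like the following in practice; the table names are placeholders, and UNDROP only works within the table's Time Travel retention period:

```sql
-- Take a zero-copy clone as a safety net, then drop the table.
CREATE TABLE analytics.archive.orders_backup CLONE analytics.public.orders;
DROP TABLE analytics.public.orders;

-- If the table turns out to be needed after all (within retention):
-- UNDROP TABLE analytics.public.orders;
```

The clone costs no extra storage until the original's data would otherwise age out, making this an inexpensive insurance policy against a mistaken removal.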

Ready to take your Snowflake efficiency to new heights? Bluesky copilot for Snowflake is your trusted partner in achieving data excellence and maximizing your data cloud ROI. With our innovative platform, you can effortlessly identify optimization opportunities, delve into comprehensive analytics, and automate remediation processes. Say goodbye to manual efforts and hello to accelerated engineering velocity, enhanced query performance, and significant cost savings. In just one year, Bluesky has transformed enterprises by boosting query speeds by up to 500x and saving millions in expenses. Don't let unused tables and suboptimal workloads slow you down. Book a call with us today and embark on a journey to maximize your Snowflake ROI with Bluesky copilot for Snowflake.