Reputation: 2821
Whenever I attempt to do literally anything with Databricks Notebooks on Databricks Community Edition I get the following error:
Server error: Workspace quota exceeded (using 106 of 100 MB allowed). You must delete some items from your Workspace, such as notebooks, to continue.
However, I have deleted a whole bunch of notebooks from my Workspace, as well as files in the /tmp and /var folders, but I'm still getting the error.
I checked the following site for guidance: http://mariuszrafalo.pl/sgh/projekty/db_clean.html#:~:text=There%20are%20two%20main%20reasons,up%20files%20to%20restore%20service%20.
When I execute the following code, I get the following output:
%sh
du --human-readable --max-depth=1 --exclude='/dbfs' /
du: cannot read directory '/proc/1243/task/1243/net': Invalid argument
du: cannot read directory '/proc/1243/net': Invalid argument
du: cannot access '/proc/1246/task/1246/fd/4': No such file or directory
du: cannot access '/proc/1246/task/1246/fdinfo/4': No such file or directory
du: cannot access '/proc/1246/fd/4': No such file or directory
du: cannot access '/proc/1246/fdinfo/4': No such file or directory
0 /proc
108K /run
36K /home
4.0K /srv
2.6G /usr
0 /dev
2.7G /databricks
4.0K /media
4.0K /boot
5.0M /etc
1.6G /var
135M /opt
86M /root
du: cannot read directory '/sys/kernel/security/integrity': Permission denied
du: cannot read directory '/sys/kernel/security/apparmor': Permission denied
du: cannot read directory '/sys/kernel/debug': Permission denied
0 /sys
180K /tmp
16K /mnt
4.0K /Workspace
14M /local_disk0
7.0G /
You will notice that the /usr and /var folders are large. I attempted to delete the files in those folders with the following code:
dbutils.fs.rm('/usr', True)
But it didn't help with the error.
Any thoughts would be greatly appreciated, so that I'm able to use Databricks Community Edition to carry out my training.
(Screenshot: not able to see the Trash folder)
(Screenshot: my folder under Users)
Upvotes: 0
Views: 654
Reputation: 3240
Permanently purge workspace storage
Deleted notebooks, folders, libraries, and experiments are recoverable from the trash for 30 days. This action permanently purges deleted notebooks, folders, libraries, and experiments. Once purged, workspace objects are not recoverable.
After deleting any notebooks that contain any personal data, you must also delete job runs that have been run against the notebook, as they may contain similar information. This may be done manually via the jobs interface or via API.
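For the API route, a run can be deleted with a POST to the Jobs 2.1 `runs/delete` endpoint. The sketch below only builds the request so the placeholders are obvious; the workspace host, token, and run ID are assumptions you must substitute before actually sending it:

```python
import json
import urllib.request

def build_delete_run_request(host, token, run_id):
    """Build (but do not send) a Jobs 2.1 runs/delete request.

    host and token are placeholders -- substitute your workspace URL
    and a personal access token before sending.
    """
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/runs/delete",
        data=json.dumps({"run_id": run_id}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending the request permanently deletes the run:
# req = build_delete_run_request("https://<workspace-host>", "<token>", 123)
# urllib.request.urlopen(req)
```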
Runs associated with purged experiments are also purged, as well as metrics, params, and tags associated with the runs. However, you must manually delete artifacts associated with the runs. For example, you can delete artifacts stored on DBFS using the DBFS REST API.
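A DBFS delete is likewise a POST, to the `/api/2.0/dbfs/delete` endpoint, with a path and a recursive flag. As above, this is a hedged sketch that only constructs the request; the host, token, and artifact path are placeholders:

```python
import json
import urllib.request

def build_dbfs_delete_request(host, token, path, recursive=True):
    """Build (but do not send) a DBFS 2.0 delete request for run artifacts.

    host, token, and path are placeholders for your workspace URL,
    a personal access token, and the artifact directory to remove.
    """
    return urllib.request.Request(
        url=f"{host}/api/2.0/dbfs/delete",
        data=json.dumps({"path": path, "recursive": recursive}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending the request removes the artifacts:
# req = build_dbfs_delete_request("https://<workspace-host>", "<token>",
#                                 "/path/to/run-artifacts")
# urllib.request.urlopen(req)
```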
Note: Queries and Dashboards in Databricks SQL will not be deleted. If you want to delete them as well, go to the Admin Console and give yourself the Databricks SQL access entitlement first.
Permanently purge all revision history
This action purges revision history saved before the selected timeframe for all notebooks in the workspace. Once purged, revision history is not recoverable.
Permanently purge cluster logs
This action permanently purges Spark driver logs and historical metrics snapshots for all clusters in the workspace. Once purged, logs and snapshots are non-recoverable.
For details, refer to Manage workspace storage in the Databricks documentation.
Upvotes: 0