Docker Qcow2 Large



First, some background on the file at the center of this post: Docker.qcow2 is a sparse disk image. When it's mostly empty, even though its reported size may be 4 GB or more, it compresses (zipped) down to just a few MB. As containers write data, new sectors are appended to the .qcow2 file, causing it to grow until it eventually becomes fully allocated; it only stops growing when it hits its maximum size. You can stop Docker and delete this file, but deleting it will also remove all your containers and images, and Docker will recreate it on the next start.
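You can see the gap between the image's virtual size and what's actually been allocated with the qemu-img tool (not bundled with Docker; one way to get it on a Mac is brew install qemu):

    # run this from the directory containing Docker.qcow2;
    # it reports "virtual size" (the cap) vs. "disk size" (what's actually allocated)
    qemu-img info Docker.qcow2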


I really enjoy figuring out stuff to do with Docker, but I recently tweeted my frustration with it.

Why, you ask? Well, there is this file, Docker.qcow2, located right here:
~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2
on my MacBook Pro that consumes all of the remaining space I have on my hard drive, and SSDs ain't cheap. I'm sure I need to clean up some stuff myself, but that's not the point of this post. I want Docker to manage this file. Deleting images or containers has no impact on this file. It just continues to GROW.
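If you want to check the damage on your own machine:

    # -h prints the size in human-readable units (GB) instead of raw bytes
    ls -lh ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2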



So this is the impetus for the blog post today.

A quick Google search shows that this issue is not new and is obviously still present in the most recent release as of this post.

Here is the issue according to the Docker forums:
'The qcow file is a block disk image that grows when unwritten blocks in the 64GB virtual device are written so the sparsity of writes due to ext4 disk layout causes an initially large increase in image size.'

Since I am working mainly with the Oracle 12.1 image, which is in the 5+ GB range, space is at a premium. Here's hoping they find a solution sooner rather than later, because having to run either of these workarounds is, at a minimum, inconvenient.

I have found two ways to work around my Mac running out of disk space as a result of testing with Docker.


The first option is to bomb the crap out of Docker by going into the Docker -> Preferences menu and clicking that Reset icon.

And this works, but you will lose all your images and containers if you don't take some steps to preserve them. What can we do?


The docker save command writes a .tar archive of an image to STDOUT. The -o option writes to a file instead.

You can then go ahead and reset Docker. Once the reset has completed, the docker load command will load a .tar file (or read from the standard input stream). It restores both images and tags. This is important, since other applications outside of Docker may reference the image IDs.

In this example you would:

  1. docker save your images. Use docker images to identify the images you want to save
  2. Reset Docker from the Preferences menu
  3. docker load your images from the .tar files created in step 1
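Concretely, the round trip looks something like this (the Oracle image tag here is just an example; substitute your own):

    # 1. identify the images worth keeping
    docker images

    # 2. archive an image (with its tag) to a tar file
    docker save -o ~/oracle-12.1.tar oracle/database:12.1.0.2-ee

    # ...reset Docker from the Preferences menu...

    # 3. restore the image, tags included
    docker load -i ~/oracle-12.1.tar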

The other option I had was to script all this. Fortunately, this is the internet and this issue is not unique. I found a script by Théo Chamley that did basically exactly what I was looking for. I took the liberty of making a few modifications to meet my specific needs:

  • Disk space was an issue, so I store image archives on a NAS instead of locally
  • I may want to keep my image archives
  • Since I planned to share this code, I added some instructions and an option to exit before the process starts

With this script in hand, all you need to do is identify the Docker image IDs (docker images), then enter them as space-separated arguments to the script.
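For example (the image IDs and the script name here are hypothetical):

    # pass the ID of every image that should survive the reset
    ./clean_docker.sh 3fa8123abc12 de45f6789012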

This script basically performs the following:

  1. Reads Image IDs as script arguments
  2. Assigns the temp dir to a local variable
  3. Executes docker save for each image in the argument list and archives them there
  4. Stops the Docker app
  5. Deletes the Docker.qcow2 file (this is the problem file)
  6. Starts the Docker app
  7. Executes docker load to restore the archived images
  8. Either deletes or keeps archived images (Y/N)

You can visit my GitHub repo to find the script, or copy it from below.


---- Script ----
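The full script lives in the repo; as a minimal sketch of the approach it implements (the file path, app name, and prompts below are assumptions, and Théo's original differs in the details):

    #!/bin/bash
    # Sketch of the cleanup flow described above: save the listed images,
    # delete Docker.qcow2, restart Docker, and reload the images.
    # Usage: ./clean_docker.sh IMAGE [IMAGE ...]
    set -e

    # Assumed default location of the disk image for Docker for Mac
    QCOW2="$HOME/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2"
    TMP_DIR=$(mktemp -d)   # point this at a NAS path if local disk is tight

    if [ $# -eq 0 ]; then
        echo "Usage: $0 IMAGE [IMAGE ...]" >&2
        exit 1
    fi

    read -p "This deletes $QCOW2 and every unsaved container/image. Continue? (y/N) " answer
    [ "$answer" = "y" ] || exit 0

    # 1. Archive each image given as an argument.
    #    Note: passing a bare image ID drops repository tags; pass name:tag to keep them.
    for image in "$@"; do
        safe_name=$(echo "$image" | tr '/:' '__')
        echo "Saving $image..."
        docker save -o "$TMP_DIR/$safe_name.tar" "$image"
    done

    # 2. Stop Docker, remove the bloated disk image, and start Docker again.
    osascript -e 'quit app "Docker"'
    rm -f "$QCOW2"
    open -a Docker

    # 3. Wait for the Docker daemon to come back up.
    until docker info >/dev/null 2>&1; do
        sleep 2
    done

    # 4. Restore the archived images.
    for tarball in "$TMP_DIR"/*.tar; do
        echo "Loading $tarball..."
        docker load -i "$tarball"
    done

    # 5. Keep or delete the archives.
    read -p "Delete the saved archives in $TMP_DIR? (y/N) " cleanup
    if [ "$cleanup" = "y" ]; then
        rm -rf "$TMP_DIR"
    fi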

You can also visit my GitHub repo to download this file.

Enjoy

Yesterday morning I was innocently minding my own business, downloading some files, when I noticed that nearly all the disk space on my 256GB work laptop had been consumed. This seemed rather unlikely to me, given that I'd only had the laptop for about a year and I didn't store anything other than code and work documents on it. Text files just don't take up that much space most of the time. So I decided to make a purchase I'd considered for a while, bought Daisy Disk, and began investigating.

The first thing I noticed was that a huge amount of the disk space (about half) was taken up by the ~/Library/Containers folder. That folder contained my email history and also data on my Docker containers. Docker functions as a lightweight VM, and essentially holds copies of virtualized operating systems and file systems inside each Docker container and image, so it made sense to me that it could be taking up a lot of space, though >120GB still seemed wrong for my paltry 4 containers. So my first step was to delete all of the containers and images on my laptop. That cleared about 20GB of space, but my drive was still nearly full.
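For the record, deleting everything is just the standard Docker cleanup commands (only run these if you're sure you don't need anything):

    # remove all containers, running or stopped
    docker rm -f $(docker ps -aq)

    # remove all images
    docker rmi -f $(docker images -q)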

At this point, I was annoyed. 103.5 GB was taken up by some folder called Docker.qcow2, and Daisy Disk wouldn't show me what was inside. So I drilled in on the file system and immediately found out that Docker.qcow2 was not a directory, like I'd assumed from its size, but a single 100+GB file. At that point I decided to poll my teammates to see if I was the only one dealing with this.

So between the 5 of us, we each had 'cow files' taking up between 23 and 103GB of disk space. Some Googling revealed a GitHub thread that showed this is a so-far unsolved issue with Docker For Mac.

Summary: qcow2 files are a format for saving disk images. Docker For Mac's implementation works well for writing and updating images, but doesn't automatically free up disk space when a container or image is deleted. So as you use and delete containers over time, this file gradually grows. I have a habit of deleting and recreating containers when I'm trying to debug something, which explains why I had a much larger file than others.

There isn't currently a true fix for this issue, but you can delete the file. You'll lose all your containers and images, but when you recreate them the file will be small again. There is also supposed to be some automatic limiting of the file: the current cap is 64GB, but there is work on making it configurable. Unfortunately, the cap is not respected if you had previously used docker-machine to control Docker on your machine, which is how my file exceeded it. I'm also unclear on what happens when the file hits the cap; I get the impression that things stop working and you need to delete everything anyway (you just avoid running out of disk space on the host machine).

Fortunately it was no problem for me to delete everything, and so I was able to clear things out, at which point my disk looked a lot happier.

Lessons Learned

I had 3 takeaways from this interesting adventure.

  1. If you're using Docker For Mac, keep an eye on your disk space. If you're able to delete and recreate your containers without data loss, consider doing that occasionally and deleting the cow file. If you can't, be careful how many containers you add and delete, and make sure you manage your disk space well.
  2. Daisy Disk is awesome and highly recommended. It's an example of a rare breed: the beautiful system utility. The visualizations it shows are both pretty and useful; it made diagnosing this issue a breeze.
  3. One more thing I learned from my coworkers on Slack yesterday: ls takes an -h argument (used together with -l) that shows file sizes in KB/MB/GB instead of all in bytes. This is super helpful when examining large files. Compare the two listings of the same files below. The normal form is very nice for comparing 2 files side by side and seeing which one is bigger, but the human-readable form is much more helpful when you want to get an idea of exactly how big something is, or communicate it to others. Most of us don't think about file sizes in terms of bytes anymore.
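For example, on two hypothetical files:

    $ ls -l big.mov notes.txt
    -rw-r--r--  1 ben  staff  1476395008 Mar 27 09:12 big.mov
    -rw-r--r--  1 ben  staff        4213 Mar 27 09:12 notes.txt

    $ ls -lh big.mov notes.txt
    -rw-r--r--  1 ben  staff   1.4G Mar 27 09:12 big.mov
    -rw-r--r--  1 ben  staff   4.1K Mar 27 09:12 notes.txt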

Article Info

Author: Ben McCormick | Published: March 28th, 2017



