So, a Novell teacher/trainer recently told me to avoid creating images that contain any file over 4 GB, because such a file completely corrupts the ZMG file.

I tested this for myself, and he is correct.

This caused me a HUGE problem when my PC was giving me issues. I had to back up my drive and dump the image onto another, similar machine, and that backup included several VMware Workstation VMDK/VMX files that were well beyond the 4 GB limit (the OS plus applications easily takes up 8 GB).
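Until this gets fixed, the only workaround I can see is checking for oversized files before capturing the image, so you at least know the resulting ZMG will be corrupt. Here's a minimal sketch in Python (the function name and the 4 GiB threshold are my own choices, not anything from ZENworks):

```python
import os

LIMIT = 4 * 1024**3  # 4 GiB, the boundary that corrupts the ZMG


def files_over_limit(root):
    """Yield paths of files under root that are larger than 4 GiB."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getsize(path) > LIMIT:
                    yield path
            except OSError:
                pass  # skip files we can't stat (permissions, etc.)
```

Run it against the drive you're about to image; if it prints anything (VMDKs, ISOs, pst files), move those files off first or expect a bad image.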

The irony is that dumping the image over PXE works great, and I believe restoring from that ZMG file works just as well, except that nothing appears on my drive when I check.

When I open the .ZMG file in ZMG Explorer, the directory structure is there, but none of the files exist.

This is an incredible flaw, and from what I've gathered, the developers have known about it for years and have done nothing to fix it.

I took the same steps with a ten-year-old licensed copy of Ghost to image my drive, and it worked great, so at least I had access to my files before I shipped out my drive.

Can Novell PLEASE fix this? This is a major issue in this day and age, when some projects and files are well beyond the 32-bit 4 GB file size limitation.
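That "4 GB" figure is exactly the ceiling of an unsigned 32-bit length field, which is probably not a coincidence. As a general illustration of the arithmetic (I don't know the actual ZMG format internals, this is just what any 32-bit size field does):

```python
# An unsigned 32-bit field can only hold values up to 2**32 - 1,
# so a larger file length wraps around modulo 2**32.
actual_size = 8 * 1024**3          # an 8 GiB VMDK, for example
stored_size = actual_size % 2**32  # what a 32-bit length field records
print(stored_size)                 # 0 -- the file looks empty
```

An 8 GiB file wraps to exactly 0 bytes, which would match what I'm seeing in ZMG Explorer: the directory entries survive, but the file contents effectively vanish.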

Anyone else run into this issue, or even better, found a solution for it?

I had this great idea of creating an image set that also contained a separate partition holding the model's ZMG file locally, for off-site imaging, but that is no longer a feasible scenario.

I'm quite disappointed.