As the save function (compile always calls save before proceeding to compile) is intended to save to a local file on the hard drive, using other storage locations is beyond the scope of what the Nextion Editor can be responsible for. This would be an issue to report to Google, and I imagine their response would be to decrease your frequency of save/compile so that the file has time to complete its transfer into the cloud successfully.
This is certainly not a new issue; it was widely documented in the 1990s with the earlier remote-storage methods (NFS, FTP, etc.), as well as with any remote drives accessed over a MAN/WAN. Heck, there were issues even on local LAN file servers or peer-to-peer systems if the load was too high.
A failed transfer should be considered failed and deleted, since the integrity of the file contents has been lost. Your case is more complicated: you are first retrieving the file from the cloud, and save requests to the same filename are potentially being made before the requested file has finished downloading. The save zeroes the file to begin writing, so whatever remained to be retrieved is lost. I would recommend you do all the Nextion work locally, and when you are satisfied, copy it into Google Drive.
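To illustrate why a zeroed file loses data, here is a minimal sketch (not Nextion's actual code; the function name and filenames are hypothetical) of the write-to-temp-then-rename pattern that avoids exactly this race on a local disk: the previous complete file stays intact until the new one is fully written.

```python
import os
import tempfile

def atomic_save(path: str, data: bytes) -> None:
    """Write data to a temporary file in the target directory, then
    atomically rename it over the destination. An interrupted write
    leaves the old, complete file in place instead of a truncated one."""
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # make sure the bytes reach the disk
        os.replace(tmp_path, path)  # atomic rename on the same filesystem
    except Exception:
        os.unlink(tmp_path)  # clean up the partial temp file
        raise

# Hypothetical usage: saving a project file safely.
atomic_save("project.HMI", b"compiled project contents")
```

Note this only protects the local copy; a cloud sync client can still upload mid-state unless it, too, transfers atomically, which is why working locally and copying to Drive afterwards is the safer workflow.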
To be frank, systems may have been slower in the 1990s, but they were more stable: your direct line was not being shared, your server was your server, and you could actually depend on timings. With cloud services today, your timings can be thrown to the wind by end-of-day file syncs, or even by a couple of neighbours deciding to download the latest hi-def episodes of Game of Thrones.
As this is an issue with syncing, it is beyond Nextion's scope.
As such, this will be marked as solved.