Adobe CC Bug Erases Data on Macs (UPDATED)

A bug in a recent Adobe Creative Cloud update is deleting the contents of a folder on the root drive (Macintosh HD, by default) of Macs upon installation. The issue disproportionately affects Backblaze users because the bug empties the alphabetically first hidden folder on the root drive, which for Backblaze users is often the folder ".bzvol".

Even if you are not a Backblaze user, the bug can still delete the files within whichever hidden folder on your root drive comes first alphabetically, which may have severely undesirable consequences for your system. For some readers, the first hidden folder may be ".DocumentRevisions-V100," whose files are necessary for proper file version mapping and recovery functions within OS X. Whether there is a way to undo the deletion (or rather, replace these files) is still unclear. Adobe says it has stopped distributing the update that causes this issue, which began with version 3.5.0.206.

Backblaze posted a video (above) showing the deletion of the files, but also shared a temporary fix, which naturally involves creating a hidden folder in the root directory that sorts before the one called ".bzvol". They managed to work a small bit of humor into their fix, but they undoubtedly find little of this situation funny, given the impact it has had on their users.

Temporary fix:

To correct this, please do the following:

Open Terminal, which is found in the Utilities folder inside your Applications folder.

Copy and paste the command below, then press Enter:

  • sudo mkdir /.adobedontdeletemybzvol

After you press Enter, you will need to type your Mac admin password and press Enter again to create the folder.

This should protect your .bzvol folder from being emptied by Adobe Creative Cloud, as the new folder now comes first alphabetically.
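If you want to convince yourself that the decoy actually sorts first before touching your root drive, here is a sketch in a throwaway directory (the folder names mirror Backblaze's fix; nothing outside the temp folder is touched):

```shell
# Demo in a scratch directory: the decoy sorts before .bzvol,
# so the bug would empty the decoy instead of your backup folder.
demo=$(mktemp -d)
mkdir "$demo/.bzvol" "$demo/.adobedontdeletemybzvol"
LC_ALL=C ls -A "$demo" | head -1   # prints ".adobedontdeletemybzvol"
```

The real fix remains the single `sudo mkdir` command above; this just shows why it works.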

 

**If the .bzvol folder has already been wiped, you will also have to follow these steps after creating the new directory.**

1. Click the Backblaze icon and go to Preferences.

2. Click the Settings button.

3. Check the boxes for both the unplugged and plugged-in internal Mac HDs.

4. You may receive a message that one drive will replace the other; with this issue, that will not affect your backup.

5. Click OK.

This should resolve the bzvol error and allow backups to resume as scheduled.

 

**If you have any hidden folders alphabetically before .bzvol, you may want to restore them from Backblaze.**


The latest beta installer of the Backblaze software has been updated to create a folder named '/.aBackblaze' at the root of your hard drive as a decoy. Running the latest installer will also prevent the issue, just like following the steps above. The installer can be downloaded here: http://files.backblaze.com/

 

[via ArsTechnica]

 

UPDATE: Adobe has resolved the issue with a new CC release. Of course, this will not fix or reverse deletions that have already happened. The new version will, however, not cause the issue in the first place for those lucky enough not to have been affected.


Adam works mostly across California on all things photography and art. He can be found at the best local coffee shops, at home scanning film in for hours, or out and about shooting his next assignment. Want to talk about gear? Want to work on a project together? Have an idea for Fstoppers? Get in touch! And, check out FilmObjektiv.org film rentals!


Oh, that's what happened? I got this error and did the "click both hard drives" thing, but I had no idea this was caused by CC. Thanks, Adobe.

For anyone not using Backblaze who wants to create a sacrificial hidden folder:

Go to the root directory of your boot drive (Macintosh HD by default) and create a folder named ".aaa_delete_me" without the quotes. You will need an admin password for this.

To confirm that the folder is first alphabetically, you can view hidden files by opening Terminal, typing "defaults write com.apple.finder AppleShowAllFiles YES" (without quotation marks), and hitting Return. Then type "killall Finder" (again without quotation marks) and hit Return.

To re-hide hidden files use "defaults write com.apple.finder AppleShowAllFiles NO" followed by "killall Finder"

Note that if you have a hidden file selected when you type "killall Finder", Finder will reopen with that file still visible until the window it was selected in is closed.

If you are on a command line anyway, you can use the command "ls -al" to show hidden files (all letters lowercase). The 'a' flag means 'show all'; the 'l' flag shows the files in a list. This is a lot easier than changing your defaults and restarting Finder.
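A quick scratch-directory illustration of the difference (plain POSIX shell; nothing Mac-specific assumed):

```shell
demo=$(mktemp -d)
touch "$demo/.hidden" "$demo/visible"
ls "$demo"      # plain ls skips dotfiles: prints only "visible"
ls -a "$demo"   # -a shows everything, including ".hidden", ".", and ".."
```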

So the commands that Backblaze recommends are--
cd /
--- This changes you to the root of your file system, or, the / directory. Not really needed, but it doesn't hurt.
sudo mkdir /.aaaaaaa
--- this will make a sacrificial hidden folder that is likely alphabetically first. It will also require your user's password, and for your account to be an admin.

If you run "cd /" first, the second command can be "sudo mkdir .aaaaaaa".

However, this is someone on the internet telling you to run a sudo command, which in general, can be a very bad thing to do. So, use your best judgement.

In a file listing, uppercase characters will list before lowercase, and so will numbers. If you really want to be paranoid, "mkdir /.0000000000" would be safer.
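One caveat: ls's sort order follows your locale, so the raw ASCII ordering (digits before uppercase before lowercase) is only guaranteed under LC_ALL=C. A small sketch in a scratch directory:

```shell
demo=$(mktemp -d)
mkdir "$demo/.aaa" "$demo/.AAA" "$demo/.0000000000"
LC_ALL=C ls -A "$demo"   # lists .0000000000 first, then .AAA, then .aaa
```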

Yeah... this totally borked my entire projects folder, as I use " projects" with a SPACE before the folder name so it appears first at my root directory level. WTF, Adobe... luckily I have a Time Machine backup, because there were over 200GB of files in that projects folder.

Just a word to the wise: putting an "_" (underscore), or multiple "___" if need be, is FAR preferable to spaces on a UNIX system. If you ever need to do some strict terminal recovery, you'll need to "escape" the spaces, which just adds more work and potential for errors. Just saying.
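For example (scratch directory; the folder names are just illustrations):

```shell
demo=$(mktemp -d) && cd "$demo"
mkdir " projects"    # leading space: every shell reference needs quoting...
mkdir "_projects"    # underscore: never needs quoting
ls -d \ projects     # ...or a backslash escape, which is easy to get wrong
ls -d _projects
```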

As protective as Apple sometimes may be, especially when it comes to the file system on iOS, they unfortunately have failed to implement the same "tampering" protections on Mac OS. Windows actually handles that better by hiding any system files by default. However, with Mac OS being a full-fledged UNIX system (with some sugar coating on top), the average layman should have no business being in any folder other than his user folder. Meaning, Barry Munsterteiger, your personal files ONLY belong in your user folder, in one of the designated folders like "Documents" or "Pictures". That way you are not interfering with the underlying technology of the system. It will also help you transfer files with the "Migration Assistant", should you ever need to.

Not trying to troll here, just want to clarify that rights to directories/folders aren't managed by whether they're hidden or not. So we could recreate this defect whether a directory/folder was hidden or not. I'm pretty sure (not 100%) the folder(s) mentioned in this article are hidden, and it's not a user putting files in folders willy-nilly; rather, it's a 3rd-party script that wrote against the directory/folder (hidden or not). And one last thing... folders/directories are not the "underlying technology of the system"; that would be the kernel. :)

My comment was in regards to him having a project folder at root level, where it does not belong in the first place. None of his files would have been affected by the bug if he had stuck to his user folder. He had a backup though, so that's a plus.

To clarify, by "underlying technology of the system" I'm referring to the way Apple designed its UNIX, which is quite different from any other UNIX available. Again, the design could use improvement, the average user does not need to see anything other than his user folder in order to not tamper with the system. The fact that the default user is always an admin makes the access control fairly ineffective.

This is where the "right" way to do it from an engineering standpoint and the "right" way to do it from a consumer's standpoint are very, very different, and Apple has always been different in how it thinks. In my 16+ years of running OS X, as an early adopter of nearly everything, a former employee at Apple, and someone well organized, I immediately perceived it as something I had done. It wasn't until I read the article that I even imagined it was something done by a 3rd party messing with things in a directory they had no need to touch.

Regardless of what the UNIX community believes is right or wrong, it's my machine, and if I can create a folder somewhere, I will. Having directories that are protected and shared across multiple users on a system is exactly what the root level is for, and this allows my machine to see the same directory from multiple logins. Sure, there are probably other ways the UNIX admins/engineers would have done this, but I am not one.

It is unfortunate when things like this happen, having backups is the only way to protect from the unexpected. I learned this the hard way very early in my computer life when a RAID unexpectedly dropped a drive from the array.

It's more unfortunate that nasty "bugs" like this exist; it seems more like a disgruntled engineer wrote something malicious to see what would happen. (Yes, that is highly speculative, but not out of the realm of possibility.)

Nope, the root level is for the system. Your personal files belong in the user folder for protection, and yes, there is also a "Shared" user folder to share files across multiple users. UNIX is by design a multi-user system. The root level has a certain structure, and you should not mess with it. You see what can happen. Not that this should have happened, but it's a good example of why you don't touch the system, especially not to randomly store files.

Maybe as a desktop analogy that the OS is built upon; say you are working in the office, you don't just throw your project files onto the boss' desk. You store them in/on your own desk. - Just because you can, does not mean you should.

I'd worked for Apple and support centers for years, dealing directly with end users who messed up their systems in all kinds of ways. Again, my advice is, stick to folders (Users) you are meant to use and don't touch the system. This isn't coming from some UNIX purist but from a seasoned Apple tech.

Hi Barry,

I'm not on a side here, but if a gun was to my head and I had to pick though, I'd pick yours because I get where you're coming from. I also get where dred is coming from [absolutely], but he's stuck in some strange dogmatic point of view that's deeper than I care to visit... Although he has a point that I get, I fear he's somewhat horse blinded and not really seeing the big picture.

Letter of the law for best practices; yes, his point makes sense about "not saving files to root", blah blah blah.... Jesus Christ, he might've just as well added "thou shall not..." in there for some added effect. However, in the spirit of the law of best practices, you are absolutely correct.

The short story is that you [indeed] have the perfect right and sensibility to create folders/directories with any name you see fit. Sure, there are best practices out there, but they're not an absolute rule in this situation. Are you within your reasonable head space to be pissed? Fuck an A yes you are...

However, having an understanding of both sides, just below is a brief outline of some clarifications and my thoughts:

barry - "... that had been installed by a 3rd party messing with things in a directory they had no need to touch. ..." Nothing was installed, it was a script stored outside your machine that modified a folder on your machine one time. You're right, there wasn't a need to alter either folder (your created folder OR a default folder already in place) by CC (the 3rd party).

barry - "Having multiple users on a system that protect directories that are used across them is exactly what the root level is for..." Actually, multiple users is a feature. The root directory/folder's purpose is to boot the mounted file system(s). Any file that happens to be saved in the root folder/directory that doesn't belong in the boot process is just ignored (not ignored literally; remember, letter and spirit of the law). Basically, "multiple users" came along later, so that feature has carried over, as many features do in different OSs.

dred - I'm pretty sure everybody gets that you shouldn't save files in root. If not, they should after reading this... BUT, from what I read, you're missing the point and you're missing the idea of "root" in both the initial article and some of the responses. First, it seems like you should at least know that "root" can be sort of confusing when it comes to Unix. If I'm wrong, sorry, but basically saying that A) a user CAN'T make a folder with any name (s)he wants isn't correct. B) This goes for limiting users to where they can save their files, as well. Yes, there are best practices; no, not everybody knows them or cares about them; no, a folder/directory above the "/" doesn't make it "the root folder." It also seems like you should understand that Unix's "root" concept (omg, don't get me started on your "sugar coating" version) is sort of confusing (maybe to those less in the know) by allowing a directory in the tree to come before the actual system root folder; again, without it actually being "the root" folder that you seemingly keep pointing out. Windows, by comparison, doesn't allow this, since a mounted volume (which also happens to be its root directory) is designated with a letter, e.g. "C:", and nothing can be above that in either the literal or virtual respect. Okay, unless you're talking about multiple mounts on a much larger scale, but "spirit of the law" here. Even another poster here commented the same idea about using underscores or even double underscores in favor of blank spaces in file names for its potential ease of use and clarity's sake, but he didn't say it was against any hard rule or imply that it was somehow the user's fault. This is not the user's fault.

dred - "None of his files would have been affected by the bug if he had stuck to his user folder...." This bug could've just as easily happened to ANY folder given the circumstances of how the bug was written. Letter of the law; sure, in this incident you're right, but that same comment is dead wrong if we could reenact the bug with a more targeted folder name along with a similar outcome/ending. Why not?

dred - In other words, if a senior executive at Apple named a local folder " a importantFolder" and its contents were deleted permanently, you would tell him/her that it was their fault, and "Your personal files belong into the user folder for protection." LOL, c'mon man, maybe I've been in the corporate world too long, but even I'm not that jaded with the game.

And just another reason why I never upgrade when it comes to Adobe..

So, basically Adobe CC is now a virus?

Not trying to start another Mac vs. Windows discussion (and please don't make one out of this post) - when it comes down to things like audio routing, I'm jealous of the Mac users, and each platform has its own benefits.
But for some reason the whole Adobe software suite seems to cause far more problems on Macs than on Windows. It starts with the GPU support, which is more problematic on Macs, and ends at little bugs like this that are often Mac-only. The issues it first had with El Capitan are another example.
I get the feeling that the relationship between Adobe and Apple is not as good as they sometimes pretend on stage. I doubt that it's just Adobe's fault - they both need to work on it, because such things just shouldn't happen when they aim at professional users.

"...I'm jealous of the Mac users..." - nah, I doubt it. Why? The meaning of jealous is feeling or showing suspicion of someone's unfaithfulness in a relationship. Envious would be the correct word.

Maybe we have to start lawsuits; otherwise, Adobe is free to do whatever they want. We are not beta testers! I believe many people lost files, and that leads to lost money.

I'm sure Adobe protects themselves in their user agreement. In addition, while it's extremely frustrating and silly that a product this popular and from such an established brand has issues like this, people should KNOW to have constant backups going 24/7 with local snapshots. The software is there and included/free for pretty much any computer you have, and there's simply no excuse to not be backed up.

So realistically, as scary as it is, at worst it should only hurt you in taking up a little extra of your time. Again, I'm not trying to minimize the scariness of stuff like this happening. There's no doubt Adobe's reputation is hurting from this -- and rightly so. But at the end of the day, it's pretty typical "shit hit the fan" stuff that we can and should be able to deal with.

Those sorts of waivers aren't worth the electrons used to display them. For something THIS major, they will be fully liable. I'm thinking this is why my Time Machine disk took a dump on me this past week, wiping out two other partitions on that disk. Freaking Adobe.

I agree, this software has no business deleting files it doesn't create itself. Its behaviour is that of a virus.

While I also agree with Adam Ottke that backup should be second nature to any professional, one shouldn't be worried it starts deleting system files all of a sudden. I wonder if this is a bug, or malicious intent from one of their software developers.

I consider this malicious since it borked my BACKUP! Adobe better figure out some way to fix their damage to my system! They have no choice in this matter other than to restore it back to where it was before THEY screwed it up!

NOTE: I AM NOT A LAWYER, THIS ISN'T LEGAL ADVICE.

Liability waivers aren't as ironclad as companies would like us to believe. In most jurisdictions, a well-written waiver protects against "ordinary negligence" but not against "gross negligence".

Here is an example.

You go to a gym, you are on the treadmill, and you break your leg because the treadmill suddenly breaks and stops working. If all maintenance schedules were followed and the machine had not been reported as defective beforehand, that would be considered "ordinary negligence," and a well-written waiver would protect against this.

If, on the other hand, the machine had had several reports of poor operation and no action had been taken, then that would potentially be considered "gross negligence," and no waiver, no matter how well written, would protect against that.

It's the same situation with non-compete clauses in employment contracts... the contract can state whatever the employer wants, but that doesn't mean the clause is legally binding. Non-compete clauses get thrown out of court ALL THE TIME because they are too far-reaching or too restrictive.

In photography, for example, I have a waiver in my contracts that states I am not liable if my gear dies or memory cards corrupt... things that are out of my control. But if I am completely stupid and leave all my gear in the car overnight and have all my shoots for the past 2 years stolen because I didn't do a single backup (something any professional would/should do), then that is GROSS negligence and my contract won't protect me against that.

I'm referencing this article, btw:

http://petapixel.com/2015/09/10/photographer-loses-pictures-from-20-shoo...

This is the PERFECT example of gross negligence in photography.

But back to Adobe deleting your files... it would be far more difficult for you to prove gross negligence compared to ordinary negligence... in programming it's not as clear-cut as proper maintenance schedules or established standard workflows (i.e., backups) and the like.

They borked my backup, that is negligence in their part!

To start, I'm not a lawyer.

I agree with you but the question is, is it GROSS negligence or ORDINARY negligence.

Did the bug slip through because they didn't do proper testing, or did they do adequate testing but, for some reason or another, were not affected by this bug?

The first example is gross negligence; the other is ordinary negligence.

Like I said, it's much easier to prove gross negligence in other fields like food service or professional photography.

GROSS NEGLIGENCE
FOOD
Didn't follow standard practices in terms of food prep (cross-contamination)

PHOTOGRAPHY
Preventable loss of clients images due to not doing proper redundant backups.

How do you prove GROSS negligence in coding? That's what I was trying to explain.

Glad I am not using CC. How could a bug (virus) like this be created? Should the OS alert the user about this? Customers should not be alpha testers.

The OS doesn't alert users because technically speaking, the installation program from Adobe has permission to do whatever it needs to (it needs that permission to place files that are part of the installation process), so Adobe's program can do what it wants/needs to during installation. And in this case, for whatever reason, it "wants" to delete the files in that first hidden folder...at some point...

Another non-coder who doesn't understand OS's.

Great informative post. If it helps, I read the article here: http://arstechnica.com/apple/2016/02/warning-bug-in-adobe-creative-cloud...

Just FYI, for some of the posters' comments/questions: it's not a virus, it's a defect in a script from CC. The OS wouldn't have a way to detect a buggy script; the OS's defense is mainly preventative, asking the user for an admin password to install an update from a trusted source, or to install software initially, whichever is the case (or both). And an LOL to the comment that we shouldn't be beta testers, right!? In this case, we'll just cross our virtual fingers and hope that Adobe takes this mulligan as an opportunity to add this incident to their unit testing. Oh Adobe, you're so cray cray...

I just updated CC. Do I have to be worrying about my projects being deleted now if I don't use Backblaze?

In the same boat as you. I don't use Backblaze, but I'm wondering if there is anything I should worry about. I would hate, as much as anyone else, if *any* data was lost. What's the opinion 'round here?

On many users' systems, the first folder there is one that handles document versioning/autosave-type features built into OS X. So you might want to check that. You can show your hidden folders (and then look at the Macintosh HD) by following these steps (just be sure to undo it with the directions listed afterwards -- you don't want the hassle of constantly staring at all the hidden folders on your system):
http://www.macworld.co.uk/how-to/mac-software/how-show-hidden-files-in-m...

This will at least tell you what that first hidden folder is so you can know if it's something important. A simple Google search helps a lot if you don't know what that folder is once you see it...
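If you'd rather check from Terminal, a hypothetical helper along these lines works too (the function name is my own invention; the demo runs in a scratch directory, but on a real Mac you would point it at /):

```shell
# Prints the alphabetically first hidden entry in a directory.
first_hidden() { LC_ALL=C ls -A "$1" | grep '^\.' | head -1; }

demo=$(mktemp -d)
mkdir "$demo/.DocumentRevisions-V100" "$demo/.bzvol"
first_hidden "$demo"   # the folder the buggy updater would have emptied
# On a real Mac: first_hidden /
```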

What do we do if that folder's contents have been deleted by the CC update?

#1, hopefully you have a backup and can restore the content that was there. But if you don't, the first step would be to find out which folder had its contents deleted on your system (and if this happened to begin with at all).
Also, you could save yourself time by confirming the version of CC that you have. If your system hasn't been updated for some reason, you might not have any such version that's causing this problem.
Once you find which folder's contents have been erased (if applicable), you can do research to see what was supposed to be in there. In some cases, these files might have been temporary files that are not too important and will automatically be written again. In other cases, they might be system files that are the same across all systems, but that you'll have to put there yourself somehow (maybe with a reinstallation of OS X???). In other cases, it might be data that you just can't get back without a good hard backup... So it really varies for each user.

Thanks Adam. The first folder was unfortunately the .DocumentRevisions-V100 folder. Haven't had any errors pop up yet, and my backups are mostly my pictures and data, not the system files.

Glad to hear it (sort of, since I'm assuming you still had the problem). Were there files in there? In any case, that's the way it likely is for most Mac users. Still, I would recommend you save your documents often and continue backing everything up, since that folder contains files used for autosave/versioning as far as I have read (just be careful until you can be sure everything is working as it should). Sounds like you, at least, could get out of this without much issue. Hopefully that's the case indeed.

As someone who does a lot of hiring out of the same pool of candidates as Adobe (they're just up the freeway from us, along with Oracle), I am not the least bit surprised. You could get away with less than perfect developers or QA, but not both.

And how is the cloud better?
Adobe it's not a job.
It's a joke.

I'm a Photoshop beta tester and communicate off and on with the guy in charge of PS's user experience (he designs where all the buttons go and how they work). I sometimes get pre-release versions to "try out". Anyway... from what I've heard through him, I'm not surprised in the least that this has happened with Adobe. Lots of smart people there, but it's also kind of a sh¡t show over there.

Why the hell is Adobe deleting any folders in the root node? And why is there not a shitstorm against Adobe because of this fatal fail? Yet another damn piece of software that should only be updated every second release.

And what you hear is the sound of 100,000 users turning auto-update off...

Although I know it's easier said than done, I think the graphic arts industry has to break this dependency on Adobe. They have a monopoly in graphic arts, and of course we are all contributors to that. The CC suite has become a production minefield. Although this one is pretty bad, even the lesser problems with the CC suite are roadblocks and impediments to our day-to-day production routines.

I think that we need, on an individual and collective level, to find real-world alternatives to the Adobe suite. Some of those options already exist; we just need to push ourselves out of our comfort zone and start to integrate them into our production pipelines. I know we all have work to do, and this industry demands work done yesterday, but it really does need to be done. We need to free ourselves from this software heroin.

I have been looking at this in my own production routines and trying to figure out what works for me. As a freelancer it's a bit harder, because I have to show up to places knowing Photoshop because, after all, it rules the industry. But if I can work from home and all I have to deliver is a TIFF or a JPEG, then the game changes.

I know it's different for every person, but really, think about it. Our dependence on Adobe really needs to be expunged from our production lives.

At the end of the day, it comes down to whatever is the most efficient/saves us the most time/is the overall best solution. That, still, is Adobe's CC suite (in most fields....). When you consider switching, there's just nothing with the same combination of features, the same level of advanced algorithms that really do work, etc... That's tough work to get right, and they do it 99% of the time. Then this happens, and we all get upset (again, rightly so).

We can get upset at technology every day. Every day SOMETHING doesn't work. But the question is, would we be more productive without it?

So it bugs us, but for the most part, we can deal with it, and so we do. And of course, the world in which we live now demands a new level of efficiency based on the availability of this technology. So there certainly is a limit to how many bugs/nuisances can be in the way before it makes sense to switch. But it's still hard to say Adobe doesn't provide the best product in most cases (at least the best integrated suite of products... but sure, you could go C1, FCP, etc., any day... but that still leaves others stuck on Illustrator, InDesign, etc.).

You can expect more of this in the future unfortunately.

As someone who has worked in software for nearly 20 years I can tell you this kind of stuff starts to become the norm when you move to a highly iterative model for development and subsequent production deployments.

Why? Because you are no longer going through full regression testing suites, since your releases are considered to be of smaller scope than the mega-monster full-version releases Adobe used to have. Instead, your developers and QA end up cherry-picking the scope of focus and testing, and are more likely to miss the mark on collateral-damage-type items.