Over the past two weeks, Google has quietly changed the terms of service for its Colab users, adding a stipulation that Colab services can no longer be used to train deepfakes.
The first archived version of the Colab FAQ featuring the deepfake ban was captured by the Internet Archive last Tuesday, May 24; the last captured version that does not mention the ban dates from May 14.
Of the two popular deepfake creation distributions, DeepFaceLab (DFL) and FaceSwap (both of which are forks of the controversial, anonymous code posted to Reddit in 2017), only the more famous DFL appears to have been directly targeted by the ban. According to DFL Discord deepfake developer ‘chervonij’, running the software in Google Colab now produces a warning:
‘You may be running unauthorized code, which may restrict your ability to use Colab in the future. Please note the prohibited actions specified in our FAQ.’
However, interestingly, the user is currently allowed to continue executing the code.
According to a Discord user from the rival FaceSwap project, code from that project apparently does not yet trigger the warning, suggesting that DeepFaceLab’s code (which also powers the real-time deepfake streaming implementation DeepFaceLive), by far the most widespread method of creating deepfakes, was specifically targeted by Colab.
FaceSwap co-lead developer Matt Tora commented*:
“I find it highly unlikely that Google is doing this for any particular ethical reasons; rather, Colab’s raison d’être is to let students/data scientists/researchers run computationally expensive GPU code in an easy and accessible way, without overhead. However, I suspect that a sizeable number of users are leveraging this resource to create large-scale deepfake models, which are both computationally expensive and take a sizeable amount of training time to produce results.
“You could say that Colab leans more on the educational and research side of AI. Running scripts that require little user input or understanding tends to go against that. At Faceswap, we try to focus on educating the user about AI and the mechanics involved, while lowering the barrier to entry. We strongly encourage the ethical use of the software and believe that making these kinds of tools available to a wider audience helps educate people about what is achievable in today’s world, rather than keeping it hidden away for a select few.
“Unfortunately, we cannot control how our tools are ultimately used, or where they are executed. It saddens me that an avenue has been closed for people to experiment with our code; however, in terms of protecting this particular resource to ensure its availability to the actual target audience, I find it understandable.”
Since deepfake training is a VRAM-intensive pursuit, and since the onset of the GPU shortage, many deepfakers in recent years have shunned home training in favor of remote training in Colab, where it is possible, depending on luck and subscription tier, to train a deepfake model on powerful cards such as the Tesla T4 (16GB VRAM, currently around $2,000), the V100 (32GB VRAM, about $4,000) and the mighty A100 (80GB VRAM, MSRP of $32,097), among others.
Banning Colab training seems likely to shrink the pool of deepfakers capable of training higher-resolution models, where the input and output images are larger, better suited to high-resolution results, and able to capture and reproduce finer facial detail.
Some of the most committed deepfake enthusiasts, according to Discord and forum posts, have invested heavily in local hardware over the past two years, despite high GPU prices.
However, given the high costs involved, sub-communities have emerged to tackle the challenges of training deepfakes on Colab, with random GPU allocation being the most common complaint since Colab restricted premium GPU access for free users.
* In private messages on Discord
First published May 28, 2022. Revised 7:28 AM EST, corrected typo.