Update
Thank you again to Keybase user computer_whisperer for looking through the repo provided for the “gpu plotter” and pointing out that it is just a bunch of merged pull requests and readme changes. This means it doesn’t look like the author made any significant updates beyond a reskin. It seems like a giant nothingburger out to take your money.

Original Article
A fork of Mad Max’s parallel plotter is claiming to have CUDA/OpenCL support for running on Nvidia and AMD GPUs instead of the CPU. GitHub user achow1o1 has committed the following readme to their chia-plotter repo.

If this is true, it could be a game changer for plotting, as GPU plotting would be significantly cheaper and more efficient than running a CPU for hours at a time.
As always with third-party, unvetted code, run at your own risk. You never know what it could be doing in the background. It is open source, and I’m positive there will be many eyes on it if it’s the real deal, but please look before you leap, especially since this is not free.

Oh yeah, that’s the best part. This costs $525 USD in bitcoin to buy the Windows version. I will not be buying this, but if anybody does try it out, please let me know if and how it works.
Comments

Hi,
The account (achow1o1) is impersonating Andrew Chow, who is a Bitcoin developer. His real developer address is https://github.com/achow101
I think it is a well-established scam.
Looks like ransomware, but this time you pay before installing it. That’s the spirit of innovation I expect from crypto! However, they should have had the decency to demand payment in XCH.
It would have only been like 400 in XCH by the time I got this post up, and 350 now 😉
GPU plotters have been around forever for Burst & BHD; it’s like Chia just ignored all this stuff. Hell, 5 years ago we had GPU plotters that supported AMD & NVIDIA, CUDA & OpenCL. Now we have chia-plotter from chia-net that takes 6-12 hours to make one plot and burns up NVMe and SSD drives. My gawd, we have come a long way.
Mad Max for Chia is a great step forward, but this stuff really does exist, and has for years.
I’m getting 20-minute plots on my Mad Max setup, running serially for 72 plots a day. Before, I was doing about 16 plots a day, because I could only stagger 4 in parallel without burning up my NVMe drives, at roughly 5.5 hours per plot. Now it’s 20 minutes, all CPU, and my NVMe drives don’t even get warm.
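For reference, the throughput numbers in that comment work out roughly as follows. This is just a minimal arithmetic sketch using the figures quoted above (20 minutes per plot serially, 4 staggered plots at about 5.5 hours each); the plots_per_day helper is purely illustrative, not part of any plotter.

```python
# Rough plots-per-day arithmetic based on the figures in the comment above.
MINUTES_PER_DAY = 24 * 60

def plots_per_day(minutes_per_plot: float, parallel_plots: int = 1) -> float:
    """Plots finished per day given the time per plot and how many run in parallel."""
    return MINUTES_PER_DAY / minutes_per_plot * parallel_plots

# Mad Max, serial: ~20 minutes per plot
print(plots_per_day(20))            # -> 72.0 plots/day

# Old staggered setup: 4 parallel plots at ~5.5 hours each
print(plots_per_day(5.5 * 60, 4))   # -> ~17.5 plots/day (the commenter saw ~16, likely due to overhead)
```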
Yeah, this one looks to be BS, but I agree it’s only a matter of time.
Great job, you wrote an article promoting a scam. Genius.
I assume you didn’t read?