hashcat Forum

Radeon R9 390X News
While Nvidia fanboys are being fooled by Nvidia and feeling cool, AMD is quietly working on a new card to top the 290X.

It will use the Fiji GPU, and Cooler Master will develop the massive cooling :=)) (we can be sure it's water).

300 watts, including the brand new HBM memory.
Also not news, we all know it's made on a 20-nanometer process... Zzzz.

[Image: Fiji-XT.jpg]

Very nice info about AMD's new HBM:
https://wccftech.com/amd-20nm-r9-390x-fea...han-gddr5/

[Image: Hynix-HBM-15.jpg]


https://videocardz.com/amd/radeon-rx-300/radeon-r9-390x
Careful: these specs are not yet validated...

Release Date: April 2015
Launch Price: $599 USD
Board Model: AMD C880
GPU Model: 28nm Fiji XT
Cores : TMUs : ROPs = 3584 : 224 : 64

Clocks
Base Clock: 1200 MHz
Memory Clock (Effective): 1000 (1000) MHz

Memory
Memory Size: 4096 MB HBM
Memory Bus Width: 4096-bit
Memory Bandwidth: 512 GB/s

Physical
Interface: PCI-Express 3.0 x16
Thermal Design Power: 300 W
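
Side note: the bandwidth figure is at least internally consistent with the rumored bus width and memory clock. A quick sanity check in Python (my own arithmetic, not part of the leak):

Code:
# Sanity check of the rumored HBM bandwidth (my arithmetic, not from the leak)
bus_width_bits = 4096         # rumored HBM bus width
effective_clock_mhz = 1000    # "Memory Clock (Effective)" from the list above

bytes_per_transfer = bus_width_bits // 8                          # 512 bytes
bandwidth_gb_s = bytes_per_transfer * effective_clock_mhz / 1000  # -> GB/s

print(bandwidth_gb_s)  # 512.0, matching the 512 GB/s in the spec list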


3584 sounds good for hashcat ;--)
Although these are only rumors: 3584 shaders will be the 390.
The 390X comes with 4096 shaders.
Water cooling is for sure.
Not so the 300 W.
But the 28 nm process is definite.

EDIT: It's expected in the next four to six weeks, according to the people who should know ;)

The 380X will be a rebranded Hawaii (290X).

Tonga with 2048 shaders plays the 370X.
Just as worthless for oclHashcat as the 295X2.
We will see. It's AMD I trust.
There's no innovation here. AMD's game plan for last generation and the next is simply "let's keep adding more cores to our aging architecture." Obviously not a scalable plan, and that's why they're releasing a reference design card that requires water.

Water cooling is an automatic non-starter. The power consumption is grotesque for a single-GPU card. They claim 300W, which means it will draw around 375W in practice (that's 6990 territory!). But of course PowerTune will try to keep power consumption under 300W, so the card will throttle under load regardless of temperature.
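
(To put a number on that: the estimate amounts to roughly a 25% overshoot of the rated TDP. The factor is this thread's assumption, not a measurement.)

Code:
tdp_watts = 300      # AMD's claimed TDP
overshoot = 1.25     # assumed ~25% overshoot, implied by "around 375W in practice"

print(tdp_watts * overshoot)  # 375.0 W, 6990 territory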

This card is proof that AMD's days are numbered.
Quote: This card is proof that AMD's days are numbered.

Hope you are wrong. Once AMD is out of the GPU game, Nvidia has no one motivating them to try a bit harder. So, if you are right, the GTX 980 is the best card we'll see in the near future.

Hmm? :)
Not really. Nvidia is still the de facto compute vendor. AMD was never competitive in that space.

But remember that GPGPU is nothing more than a hack. Manycore CPUs will eventually displace GPUs for accelerated computation, possibly even within the next 3-5 years.
(02-17-2015, 01:14 AM)epixoip Wrote: Not really. Nvidia is still the de facto compute vendor. AMD was never competitive in that space.

But remember that GPGPU is nothing more than a hack. Manycore CPUs will eventually displace GPUs for accelerated computation, possibly even within the next 3-5 years.

Yes! There is some truth and foresight to this. GPUs are kinda maxing out these days for the general public and consumers. So, just imagine if a person wrote their own code for something to take advantage of this much horsepower: https://bit.ly/1CXMlMH

ASIC stands for "application-specific integrated circuit". REPEAT AFTER ME: "application specific" (the Bitcoin stuff is SHA-256, for example). This is where things get scary. Scary along the lines of FPGAs. Scary along the lines of a particular agency that has the motto "We BUILD what we cannot BUY".
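
To make the "application specific" point concrete, here's a minimal sketch (plain Python with hashlib, purely illustrative) of the single fixed function a Bitcoin mining ASIC hard-wires into silicon, the double SHA-256 of a block header:

Code:
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin block hashing: SHA-256 applied twice.
    # A mining ASIC implements exactly this round logic in silicon
    # and can compute nothing else.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

print(double_sha256(b"example block header").hex())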
ASICs have been around forever. The NSA surely has the latest, and even commercially unavailable, technology for breaking hashes.

Besides that, I don't think CPUs will displace GPUs for accelerated computing; the next big step will be CPU and GPU working closely together, like AMD's HSA or Nvidia's NVLink. And maybe both will merge for good, like the FPU did (I come from the days when you had to buy the FPU separately and plug it into its own socket on the motherboard...).
Guess we have to save this question for Snowden's AMA :)