
(Some Guy)   Nvidia now having to split production between people who want chips for mining and people who want them for farming   (blogs.nvidia.com)
    More: Interesting, Agriculture, available smart tractor, Local startup Monarch Tractor, Founder Series MK-V tractors, energy-efficient NVIDIA Jetson edge, data analysis, smart tractor, MK-V tractor  

1048 clicks; posted to STEM on 04 Dec 2022 at 8:50 PM (9 weeks ago)



31 Comments
View Voting Results: Smartest and Funniest
 
2022-12-04 8:55:35 PM  
It must suck to have all of those extra customers.
 
2022-12-04 8:59:10 PM  
If they're able to make more chips for farming, then the farms can make more chips for eating.
 
2022-12-04 9:08:29 PM  
[image]
 
2022-12-04 9:31:24 PM  
Given the ETH change to POS and the crypto crash, I don't think there's any coin where GPU mining is profitable
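The profitability claim above comes down to simple arithmetic: daily payout versus daily electricity. A minimal sketch, with every figure an illustrative assumption rather than live network data:

```python
# Rough GPU mining profitability check.
# All numbers below are illustrative assumptions, not live market data.
hashrate_mh = 60.0            # assumed card hashrate, MH/s
reward_usd_per_mh_day = 0.01  # assumed payout per MH/s per day, USD
power_w = 220.0               # assumed power draw at the wall, W
electricity_usd_kwh = 0.15    # assumed electricity price, USD/kWh

revenue = hashrate_mh * reward_usd_per_mh_day         # USD per day
cost = (power_w / 1000.0) * 24 * electricity_usd_kwh  # USD per day
profit = revenue - cost
print(f"revenue ${revenue:.2f}/day, power ${cost:.2f}/day, net ${profit:.2f}/day")
```

With post-Merge payouts anywhere near these assumed levels, the electricity line alone puts the card underwater, which is the commenter's point.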
 
2022-12-04 10:04:43 PM  
Between the tech world's bullshiat, the nazis targeting power plants, the democrats continuing to be spineless little farks that can't admit we're in a civil war, all the cops and way too many of our military officers being nazis... I'm finding it very hard this weekend to give a flying fark about my continued existence. I'm tired of watching this world go to shiat while I can't do ANYTHING about it that won't get me gunned down by pigs.
 
2022-12-04 10:07:04 PM  

centrifugal bumblepuppy: Given the ETH change to POS and the crypto crash, I don't think there's any coin where GPU mining is profitable


Unless you're stealing the power to run it on someone else's GPU.
 
2022-12-04 10:11:53 PM  
Just ban crypto currency and the problem is solved.
 
2022-12-04 10:21:09 PM  
I didn't know Dwarf Fortress was getting an RTX mod...
 
2022-12-04 10:37:22 PM  
These chips aren't those chips. So good for them (but seriously, fark them). An RTX 40-whatsit will still be priced as if you can't buy a 3090 for $700 used, and life will go on.

/Psst. AMD, please exist up there.
 
2022-12-04 10:48:22 PM  

Nullav: These chips aren't those chips. So good for them (but seriously, fark them). An RTX 40-whatsit will still be priced as if you can't buy a 3090 for $700 used, and life will go on.

/Psst. AMD, please exist up there.


Less than 10 days until their 7000 series GPUs are formally debuted!

/and rumour has it there's another batch of enhanced ones to be shown off at 2023 CES in January.
//I'm thinking the stalled-out sales of the 4080s are a pretty good indicator that the market is getting tired of Nvidia's crap, and I'm assuming AMD will undercut them on every front, again
 
2022-12-04 10:56:27 PM  

I Like Bread: I didn't know Dwarf Fortress was getting an RTX mod...


OH YEAH, the DF "public friendly" retail version is releasing in a couple days.

Very excited.
 
2022-12-04 10:56:53 PM  

Nullav: These chips aren't those chips. So good for them (but seriously, fark them). An RTX 40-whatsit will still be priced as if you can't buy a 3090 for $700 used, and life will go on.

/Psst. AMD, please exist up there.


A GPU is an order of magnitude more complicated than anything else you own.  They are expensive to design, expensive to produce, and subject to supply shortages of all types.  On top of that, they have a rabid fanbase willing to pay a premium to squeeze out an extra 5 fps in Fortnite.

So yes, they are very expensive, and yes you can get a cheaper one.  That's just capitalism in a nutshell.
 
2022-12-05 12:34:38 AM  
And therefore none are available for grinding.

[image via gamestudies.org]
 
2022-12-05 2:04:43 AM  
Why not both?

[image]


[image]
 
2022-12-05 3:36:12 AM  

Lord Bear: Nullav: These chips aren't those chips. So good for them (but seriously, fark them). An RTX 40-whatsit will still be priced as if you can't buy a 3090 for $700 used, and life will go on.

/Psst. AMD, please exist up there.

A GPU is an order of magnitude more complicated than anything else you own.  They are expensive to design, expensive to produce, and subject to supply shortages of all types.  On top of that, they have a rabid fanbase willing to pay a premium to squeeze out an extra 5 fps in Fortnite.

So yes, they are very expensive, and yes you can get a cheaper one.  That's just capitalism in a nutshell.


Complex is an understatement...

AMD's MI250 GPU has 58 billion transistors. Nvidia's H100 has 80 billion. Intel's Ponte Vecchio, if they ever manage to produce it in quantities larger than prototypes, has 100 billion. Each has between 40 and 160GB of onboard HBM, with multiple TB/s of bandwidth.

Except for Cerebras (which utilizes an entire wafer), these are among the most complex devices ever made.
 
2022-12-05 8:01:47 AM  
Between Nvidia's drunken crypto profit binge, their cards requiring a pair of jumper cables to a car or being plugged into a welder outlet, and enough heat to make me consider geothermal cooling options, I'm sticking with AMD for now.
 
2022-12-05 8:56:38 AM  
 
2022-12-05 9:37:12 AM  

Nimbull: Between Nvidia's drunken crypto profit binge, their cards requiring a pair of jumper cables to a car or being plugged into a welder outlet, and enough heat to make me consider geothermal cooling options, I'm sticking with AMD for now.


Nvidia bent over for some TSMC payback to negotiate a huge allotment from their 4nm process, after defiantly snubbing them for Samsung. Now they are finding that those allotments don't have a money-back return policy, so they are probably grateful for any products they can run on them that aren't gaming cards, with all the Ampere stock sitting on shelves and the meager RTX 4080 run also collecting dust.

It's fitting that they find themselves in the same bind as the scalpers....

I'll be less stressed when some real benchmarks leak out for the 7900XTX, though.

I've been in the Nvidia camp for some time now, but my experiences with the RX6000 cards and the newest drivers have been pretty good. My oldest runs a 6600XT for his VR rig and has had no issues, and my grandson's machine here (6600) runs great for the games he plays (lately, though, it's mostly been Fortnite). AMD is really delivering the low-to-mid-range cards right now.
 
2022-12-05 9:44:00 AM  
Nvidia products no longer need drivers? That'll make installing so much easier.
 
2022-12-05 9:57:47 AM  

centrifugal bumblepuppy: Given the ETH change to POS and the crypto crash, I don't think there's any coin where GPU mining is profitable


There's always shiat coins, but yeah you're right.

I turned off my miners, but I might turn them back on for one reason:

I mined moon, world, lite, and doge shiatcoins just because, and one day they blew up and I had thousands.

As for the article's auto-tractors: this is the first step towards a fully automated, luxury communiat society.
 
2022-12-05 10:23:55 AM  

erik-k: Unless you're stealing the power to run it on someone else's GPU.


My local datacenter does not charge customers extra for power, so it's full of GPU mining rigs and ASIC systems. When I was there over the weekend, I saw somebody's pile of empty RTX 4090 boxes. I would've taken a photo, but datacenters kind of frown on anyone taking photos inside their facilities. Apparently at least one nutjob thinks they're still going to make money on $25,000 worth of GPUs.

I still own .14 of a Bitcoin that I mined back in ~2015 on a GPU. I really don't care what happens to it.
 
2022-12-05 11:34:44 AM  
I dunno, I very much enjoy riding the tractor, it's quite cathartic for me. Besides, what happens when you bust a shear pin or throw the slip clutch on the pto drive shaft? Does it just keep driving while whatever pto-driven machinery behind it does nothing?
 
2022-12-05 11:46:06 AM  

erik-k: Complex is an understatement...

AMD's MI250 GPU has 58 billion transistors. Nvidia's H100 has 80 billion. Intel's Ponte Vecchio, if they ever manage to produce it in quantities larger than prototypes, has 100 billion. Each has between 40 and 160GB of onboard HBM, with multiple TB/s of bandwidth.

Except for Cerebras (which utilizes an entire wafer), these are among the most complex devices ever made.


They really aren't that complex. They are actually pretty damn simple. They just have tons of cores / compute units due to their massively parallel nature. A premium motherboard electrical layout is far far more complex.
 
2022-12-05 11:50:06 AM  

chaoswolf: Between the tech world's bullshiat, the nazis targeting power plants, the democrats continuing to be spineless little farks that can't admit we're in a civil war, all the cops and way too many of our military officers being nazis... I'm finding it very hard this weekend to give a flying fark about my continued existence.


Your hysterics over idiotically misapplied words are absolutely adorable and you seem like a completely normal person who absolutely does not break down crying if there is a single cloud in the sky.
 
2022-12-05 1:03:12 PM  

WoodyHayes: chaoswolf: Between the tech world's bullshiat, the nazis targeting power plants, the democrats continuing to be spineless little farks that can't admit we're in a civil war, all the cops and way too many of our military officers being nazis... I'm finding it very hard this weekend to give a flying fark about my continued existence.

Your hysterics over idiotically misapplied words are absolutely adorable and you seem like a completely normal person who absolutely does not break down crying if there is a single cloud in the sky.


He might be having a morning, but I'm curious as to which words you think are hilariously misapplied?

Not really. We ALL know what YOU think is hilariously misapplied here - and it's likely not.
 
2022-12-05 3:10:42 PM  

madgonad: erik-k: Complex is an understatement...

AMD's MI250 GPU has 58 billion transistors. Nvidia's H100 has 80 billion. Intel's Ponte Vecchio, if they ever manage to produce it in quantities larger than prototypes, has 100 billion. Each has between 40 and 160GB of onboard HBM, with multiple TB/s of bandwidth.

Except for Cerebras (which utilizes an entire wafer), these are among the most complex devices ever made.

They really aren't that complex. They are actually pretty damn simple. They just have tons of cores / compute units due to their massively parallel nature. A premium motherboard electrical layout is far far more complex.


Both of these things.

They are wildly complex, but an Intel/AMD processor is even more complex and has far fewer transistors.

Their GPU chips can slap down as many identical nodes as they like which bulks up the transistor count without increasing the complexity. It just becomes a question of heat dissipation and power draw.

That's why the recent graphics cards have simultaneously been significantly more powerful, but also significantly larger. They haven't cracked making them significantly more efficient or complex, they've just upped the power cap on the standard and scaled them out.
 
2022-12-05 3:16:01 PM  

LesserEvil: I've been in the Nvidia camp for some time now, but my experiences with the RX6000 cards and the newest drivers have been pretty good. My oldest runs a 6600XT for his VR rig and has had no issues, and my grandson's machine here (6600) runs great for the games he plays (lately, though, it's mostly been Fortnite). AMD is really delivering the low-to-mid-range cards right now.


This. I picked up a used RX 6600 XT for USD200, and a used Vive for the same price, and it's done it all like a champ: 70 fps average on Borderlands 3 on Ultra at 1440p.

There has been no better time to buy.
 
2022-12-05 3:24:59 PM  

dyhchong: and a used Vive


A used Vive?

Gross
 
2022-12-05 3:34:43 PM  

madgonad: dyhchong: and a used Vive

A used Vive?

Gross


¯\_(ツ)_/¯
 
2022-12-05 5:41:56 PM  

dyhchong: madgonad: erik-k: Complex is an understatement...

AMD's MI250 GPU has 58 billion transistors. Nvidia's H100 has 80 billion. Intel's Ponte Vecchio, if they ever manage to produce it in quantities larger than prototypes, has 100 billion. Each has between 40 and 160GB of onboard HBM, with multiple TB/s of bandwidth.

Except for Cerebras (which utilizes an entire wafer), these are among the most complex devices ever made.

They really aren't that complex. They are actually pretty damn simple. They just have tons of cores / compute units due to their massively parallel nature. A premium motherboard electrical layout is far far more complex.

Both of these things.

They are wildly complex, but an Intel/AMD processor is even more complex and has far fewer transistors.

Their GPU chips can slap down as many identical nodes as they like which bulks up the transistor count without increasing the complexity. It just becomes a question of heat dissipation and power draw.

That's why the recent graphics cards have simultaneously been significantly more powerful, but also significantly larger. They haven't cracked making them significantly more efficient or complex, they've just upped the power cap on the standard and scaled them out.


Individual CPU cores are more complex than GPU ones, but at the device scale it's the same story. GPUs may make different tradeoffs in terms of math units vs. execution units, but in terms of die structure they're basically the same as CPUs. (Or dies, plural - everyone is on board with chiplets to maintain yields now, because monolithic dies have grown so large that even the most mature processes, with minimized defect rates, can't offer acceptable yield.)

In the middle, a bunch of cores / stream units / SMs that are all tiled clones of each other. On the sides there are N sets of memory controllers that go to either DDR4/5 or HBM2/3. Sandwiched between the memory controllers are ultrafast inter-device links (UPI, Infinity Fabric, xGMI, or NVLink ports) that run at hundreds of GB/s. Somewhere in the middle there's the collection of buses tying all of that together.
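The yield argument above (monolithic dies growing too large for acceptable yield, hence chiplets) is commonly illustrated with the classic Poisson defect model, Y = exp(-D0·A). A minimal sketch, where the defect density D0 is an assumed illustrative value, not a real fab figure:

```python
import math

def poisson_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Expected fraction of defect-free dies under a simple Poisson defect model."""
    return math.exp(-defects_per_mm2 * area_mm2)

d0 = 0.001  # assumed defect density, defects/mm^2 (illustrative only)
for area in (100, 300, 600):  # small chiplet vs. mid-size die vs. near-reticle monster
    print(f"{area:>3} mm^2 -> {poisson_yield(area, d0):.0%} of dies good")
```

Under this assumed D0, a 600 mm² monolithic die yields roughly 55% good parts, while the same silicon cut into 100 mm² chiplets yields about 90% per die — the tradeoff the comment describes.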
 
2022-12-05 9:30:44 PM  
I just want to play Minecraft so it doesn't lag when a Creeper sneaks up on me and explodes.
 
Displayed 31 of 31 comments


This thread is closed to new comments.
