Survival of the Fittest: Larrabee vs. CUDA [UPDATE]

Literature, 02 July 2008, written by comment
Viewed: 397 times.

comment
 
Elite user
Added:
02-07-2008 20:01:49
Replies/Posts:
1230/171
In a Q&A session with Intel, CustomPC put a series of questions to Pat Gelsinger about the future of GPGPU and CUDA:

"In a Q&A session after announcing Intel's 40th birthday, we asked Intel's senior vice president and co-general manager of Intel Corporation's Digital Enterprise Group, Pat Gelsinger, where he saw GPGPU languages such as CUDA in the future. He said that they would be nothing more than 'interesting footnotes in the history of computing annals.'"

Perhaps not so surprising coming from an Intel rep., but the interesting part is that it's not necessarily because Larrabee - objectively speaking - is better than CUDA "at what it does".

No, the argument is that an Intel x86 core architecture is easier for programmers to approach:

"'The problem that we've seen over and over and over again in the computing industry is that there's a cool new idea, and it promises a 10x or 20x performance improvements, but you've just got to go through this little orifice called a new programming model,' Gelsinger explained to Custom PC. Those orifices, says Gelsinger, have always been 'insurmountable as long as the general purpose computing models evolve into the future.'

[...]

This, according to Gelsinger, is one of the major reasons why Intel's forthcoming Larrabee graphics chip will be entirely based on IA (Intel Architecture) x86 cores. 'Our approach to this has been to not force programmers to make a radical architectural shift,' explained Gelsinger, 'but to take what they already know - this IA-compatible architecture - and extend the programming model to comprehend new visual computing data-parallel-throughput workloads, and that's the strategy that we're taking with Larrabee.'"

In other words: compatibility is key to evolutionary success.

Source: http://www.custompc.co.uk/news...
Mathis77
 
Elite user
Added:
02-07-2008 22:18:55
Replies/Posts:
6293/359
If we had robots programming these wonder machines instead of the human chicken brain, things could really take off

Then we'd get a 7GB patch for Crysis that gave a boost of 30FPS overall 😉 😛


MaddPirate88
 
Elite user
Added:
02-07-2008 23:10:25
Replies/Posts:
2576/125
#1 Don't you think it would be easier just to make a new game. 😛 😀


Mathis77
 
Elite user
Added:
02-07-2008 23:17:09
Replies/Posts:
6293/359
#2 hehe, I meant that the patch WAS a new game 🤣


MaddPirate88
 
Elite user
Added:
02-07-2008 23:18:14
Replies/Posts:
2576/125
#3 That's what I figured too. 🙂


Mathis77
 
Elite user
Added:
02-07-2008 23:25:45
Replies/Posts:
6293/359
#4 yes, that's also what I thought you were thinking, but I was actually in the middle of writing something completely different, which I then deleted, and then it turned into that, LOL 😀


alright then 🤡 ehehe


NoNig
 
Elite user
Added:
03-07-2008 01:09:03
Replies/Posts:
23132/740
The x86 architecture just needs to die. If nVidia manages to pull off a surprise with CUDA, I'll look forward to the computer of the future 🙂


comment
 
Elite user
Added:
22-07-2008 19:33:11
Replies/Posts:
1230/171
[UPDATE]

In response to Gelsinger's prediction that nVidia's CUDA will end up as 'interesting footnotes in the history of computing annals', Andy Keane (head of nVidia's GPU computing group) dryly remarked in an interview with CustomPC yesterday:

"[...] 'these comments from Gelsinger; if we were not making a lot of headway do you think he'd even give us a moment's notice? No. It's because he sees a lot of this activity. [...]"

Keane then tears into Gelsinger's claim about CUDA's lack of immediate accessibility for programmers (quoted in #0 above):

"[...] 'We use common languages,' says Keane, 'and this is where the Gelsinger information is totally misinformed, because it [CUDA] is standard C. It is actually the open 64 compiler which was originally designed for the Itanium - that's our compiler. We're actually using a CPU compiler, but we've given it a set of rules that basically say "if you write your program this way, it will scale across a few cores, or hundreds of cores."'"

Read the full interview here: http://www.custompc.co.uk/news...