The MagicBox Forums  

Old 05-09-2008, 05:28 AM   #1
flowForth
Programmer
 
flowForth's Avatar
 
Join Date: Dec 2005
Location: Australia
Posts: 455
Intel To Introduce Next-Generation CPU/GPU - Possibly Collaborating With nVIDIA

Quote:
Originally Posted by TGDaily
Intel set to announce graphics partnership with Nvidia?

Chicago (IL) – Intel may soon be announcing a close relationship with Nvidia, which apparently will be contributing to the company’s Larrabee project, TG Daily has learned. Larrabee is expected to roll out in 2009 and debut as a floating point accelerator product with a performance of more than 1 TFlops as well as a high-end graphics card with dual-graphics capabilities.

Rumors about Intel’s Larrabee processor have been floating around for more than a year. Especially since the product’s official announcement at this year’s spring IDF and an accelerating interest in floating point accelerators, the topic itself and surrounding rumors are gaining traction every day.

Industry sources told TG Daily that Intel is preparing a “big” announcement involving technologies that will be key to develop Larrabee. And at least some of those technologies may actually be coming from Nvidia, we hear: Our sources described Larrabee as a “joint effort” between the two companies, which may expand over time. A scenario in which Intel may work with Nvidia to develop Intel-tailored discrete graphics solutions is speculation but is considered to be a likely relationship between the two companies down the road. Clearly, Intel and Nvidia are thinking well beyond their cross-licensing agreements that are in place today.

It is unclear when the collaboration will be announced; however, details could surface as early as June 26, when the International Supercomputing Conference 2007 will open its doors in Dresden, Germany.

Asked about a possible announcement with Intel, Nvidia spokesperson Ken Brown provided us with a brief statement: “We enjoy a good working relationship with Intel and have agreements and ongoing engineering activities as a result. This said, we cannot comment further about items that are covered by confidentiality agreements between Intel and Nvidia.”

Intel replied to our inquiry by saying that the company does "not comment on rumors and speculation."

...

Concluding note

While this is an interesting approach to graphics, physics, and general purpose processing, we will be seeing the meat in the final product as well as the success of acceptance with independent software vendors (ISVs). In our opinion, the concept of the GPGPU is the most significant development in the computer environment in at least 15 years. The topic has been gaining ground lately and this new implementation from Intel could take things to a whole new level. As for the graphics performance, only time will tell.

It will be interesting to see which role Nvidia will play in Intel’s strategy. Keep a close eye on this one.
Source.

The article is much longer than this but I have abridged it for brevity - visit the source for the full details.

Basically, Intel has been rumored for some time to be working on a new state-of-the-art CPU/GPU hybrid - one that could theoretically revolutionize computer architecture and make real-time ray-traced rendering feasible (ray tracing is the most advanced rendering technique, but it has been too slow and computationally intensive for conventional architectures). The hybrid will also perform physics processing - so it's a general "all-in-one" chip. The project this hybrid is spawned from is called "Larrabee" and - in addition to the new CPU/GPU hybrids - it will spawn several new high-end "many-core" technologies ("many-core" is Intel's name for CPUs with eight or more cores).
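
To give a rough feel for why ray tracing begs for a many-core design: every pixel's ray can be traced independently, but each ray needs a fair amount of arithmetic. A minimal sketch in C++ (a made-up one-sphere scene purely for illustration - nothing to do with Intel's actual renderer):

Code:
// Minimal ray-sphere intersection: one independent ray per pixel.
// Real ray tracers add bounces, shading and acceleration structures,
// which is why the workload is so compute-hungry.
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// True if a ray from 'origin' along 'dir' hits the sphere at 'center'.
static bool hitSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(origin, center);
    float a = dot(dir, dir);
    float b = 2.0f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    return (b * b - 4.0f * a * c) >= 0.0f;   // discriminant test
}

int main() {
    const int width = 64, height = 32;
    Vec3 eye = {0.0f, 0.0f, 0.0f};
    Vec3 sphere = {0.0f, 0.0f, -3.0f};

    // Every pixel is an independent ray - trivially parallel across many cores.
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            Vec3 dir = {(x - width / 2) / float(width),
                        (y - height / 2) / float(height), -1.0f};
            std::putchar(hitSphere(eye, dir, sphere, 1.0f) ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}

Each iteration of the inner loop is completely independent of the others, which is exactly the property that lets you throw dozens of cores at it.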

See here, here and here for more information on Larrabee.

From a more recent article (last link above):

Quote:
Originally Posted by Ars Technica
What chips may come: Intel lifts curtain on upcoming CPUs

As for Larrabee, Intel believes its "many-core" architectural approach will be instrumental in creating next-generation environments and extending the usefulness of the graphics core well beyond gaming. If things move in the direction that Intel predicts, triangles and rasterization will be replaced by realtime raytrace rendering, the GPU architecture itself will be far more plastic and easy to work with, and the GPU itself will have no problem dedicating part of itself to handling, say, physics calculations while other cores simultaneously handle rendering. Intel had no die shots of Larrabee available, but offered this overview of the architecture's design:

[Larrabee architecture overview diagram]

While Larrabee intends to retain DirectX and OGL compatibility, it will be very much a new creation. Intel plans to launch a new vector instruction set named AVX (Advanced Vector Extension) to deliver a number of efficiency improvements, is building a new cache architecture for Larrabee, and, as previously noted, intends for the architecture to scale into the teraflop range. Actual hard details on Larrabee were vague, but one thing Intel did make absolutely clear is that it intends to offer a comprehensive and total set of tools for developers in order to smooth programming and deployment.
I think the outcome of this will be really interesting. If Intel succeeds with this approach, it could fundamentally change computer architecture and how graphics are handled - it could be the next step in the PC's evolution. It will also be interesting to see how nVIDIA, ATI and the others in the graphics market evolve alongside this new technology.
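
On the "new vector instruction set" angle: the whole point of wide SIMD is that one instruction operates on many floats at once, which is exactly what per-pixel graphics and physics math wants. A tiny sketch of that style of code using ordinary AVX intrinsics (purely illustrative - we don't know what Larrabee's own instructions will look like; build with -mavx):

Code:
// Scale one array and add another, eight floats per instruction with AVX.
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(32) float a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8]   = {8, 7, 6, 5, 4, 3, 2, 1};
    alignas(32) float out[8];

    __m256 va      = _mm256_load_ps(a);                       // load 8 floats
    __m256 vb      = _mm256_load_ps(b);
    __m256 vscaled = _mm256_mul_ps(va, _mm256_set1_ps(0.5f)); // 8 multiplies
    __m256 vsum    = _mm256_add_ps(vscaled, vb);              // 8 adds
    _mm256_store_ps(out, vsum);

    for (int i = 0; i < 8; ++i) std::printf("%.1f ", out[i]);
    std::printf("\n");
    return 0;
}

Eight lanes per instruction is roughly the throughput-per-core these designs chase, before you even multiply by the number of cores.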
__________________
"A lie travels half-way around the world while the truth is still putting on its boots."
-Mark Twain

Last edited by flowForth; 05-09-2008 at 07:29 AM. Reason: Expanded, corrected and clarified post
flowForth is offline   Reply With Quote

Old 05-09-2008, 06:43 AM   #2
AmishNazi
Skag Killer Extraordanaire
 
AmishNazi's Avatar
 
Join Date: Apr 2005
Posts: 3,136
If I'm not able to use my video processor for OS tasks or programs, it's still a fail for me.
__________________
Reader, suppose you were an idiot.
And suppose you were a member of Congress, but I repeat myself.

Whenever you find yourself on the side of the majority, it is time to pause and reflect.
- Mark Twain
AmishNazi is offline   Reply With Quote
Old 05-09-2008, 08:39 AM   #3
Zachalmighty
Future Demon King
 
Zachalmighty's Avatar
 
Join Date: Apr 2006
Location: USA, NJ
Posts: 3,742
Two CPUs on one motherboard, anyone?
__________________
An unavoidable war is called JUSTICE. When brutality is the only option left, it is HOLY.
Zachalmighty is offline   Reply With Quote
Old 05-09-2008, 10:16 AM   #4
Alucard
Registered User
 
Alucard's Avatar
 
Join Date: Apr 2002
Location: Australia
Posts: 32,625
There are already motherboards with two CPUs. But two of these special ones would be pretty nom nom.
Alucard is offline   Reply With Quote
Old 05-09-2008, 10:20 AM   #5
Reality
Sin and Punishment
 
Reality's Avatar
 
Join Date: Oct 2002
Location: Ohio
Posts: 14,620
It'll be interesting, but as of right now I'm not putting much faith in it. Graphics cards have an incredible lead over CPUs when it comes to this kind of thing. If anything, they should consider making cheap CPUs to go into graphics cards to do all the major rendering.
__________________
Outpostnine is that 1%.
Reality is offline   Reply With Quote
Old 05-09-2008, 10:37 AM   #6
Seraph
>:3
 
Join Date: Jul 2005
Location: Troy, MI
Posts: 11,005
Awesome. My next PC is going to be a small laptop and so being able to get some nice GPU power without sacrificing small form factor sounds great.
Seraph is offline   Reply With Quote
Old 05-09-2008, 10:53 AM   #7
Alucard
Registered User
 
Alucard's Avatar
 
Join Date: Apr 2002
Location: Australia
Posts: 32,625
Good call, Seraph. Laptops should definitely benefit the most from this.
Alucard is offline   Reply With Quote
Old 05-09-2008, 10:59 AM   #8
Zachalmighty
Future Demon King
 
Zachalmighty's Avatar
 
Join Date: Apr 2006
Location: USA, NJ
Posts: 3,742
Quote:
Originally Posted by Alucard View Post
Good call, Seraph. Laptops should definitely benefit the most from this.
3 years from now.
__________________
An unavoidable war is called JUSTICE. When brutality is the only option left, it is HOLY.
Zachalmighty is offline   Reply With Quote
Old 05-09-2008, 11:01 AM   #9
Reality
Sin and Punishment
 
Reality's Avatar
 
Join Date: Oct 2002
Location: Ohio
Posts: 14,620
I'm thinking more like 5-6.
__________________
Outpostnine is that 1%.
Reality is offline   Reply With Quote
Old 05-09-2008, 11:03 AM   #10
Seraph
>:3
 
Join Date: Jul 2005
Location: Troy, MI
Posts: 11,005
Quote:
Originally Posted by Alucard View Post
Good call, Seraph. Laptops should definitely benefit the most from this.
It's actually pretty exciting. The only thing I've been hesitant about when getting a laptop (and a small one at that) is that I'd lose power. I can't wait to see how far this goes.

I don't play many PC games anymore, but in case one interests me in the future I'd like to keep my options open.
Seraph is offline   Reply With Quote
Old 05-09-2008, 11:07 AM   #11
Zachalmighty
Future Demon King
 
Zachalmighty's Avatar
 
Join Date: Apr 2006
Location: USA, NJ
Posts: 3,742
Quote:
Originally Posted by Reality View Post
I'm thinking more like 5-6.
Noted.
__________________
An unavoidable war is called JUSTICE. When brutality is the only option left, it is HOLY.
Zachalmighty is offline   Reply With Quote
Old 05-10-2008, 04:50 AM   #12
flowForth
Programmer
 
flowForth's Avatar
 
Join Date: Dec 2005
Location: Australia
Posts: 455
Quote:
Originally Posted by Reality View Post
It'll be interesting, but as of right now I'm not putting much faith in it. Graphics cards have an incredible lead over CPUs when it comes to this kind of thing. If anything, they should consider making cheap CPUs to go into graphics cards to do all the major rendering.
I disagree.

The advantage of this kind of hybrid processor is that it eliminates the need for two discrete devices (a CPU and a GPU) connected via a bus. That greatly reduces latency and speeds up computation, which matters because it is the CPU that tells the GPU what to render, where and when to render it, and what effects to apply to the scene. Some estimates put the performance improvement at 4-8 times - enough, Intel believes, to bring ray-traced rendering into reach. Of course - time will tell.

The second advantage is that, as GPUs have advanced, some of their specialized hardware can be applied to tasks normally reserved for the CPU (an approach that has become increasingly popular in the software community). This offloading has obvious benefits, so it makes sense to integrate the two devices as far as possible. Integration gives programmers a better opportunity to utilize the power of GPUs for general processing - which is what Intel is aiming for.

The third advantage is that this type of integration makes a number of new efficiencies possible - which is why Intel is planning to deploy an entirely new vector instruction set to take advantage of them.
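
As a rough sketch of the "use lots of cores for general work" idea - standard C++ threads standing in for whatever programming model Intel actually ships, and all names made up:

Code:
// Split a general-purpose data-parallel job across every available core.
// Plain threads are only a stand-in; the point is that this kind of work
// maps naturally onto a many-core chip, whatever API ends up on top.
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 1 << 20;
    std::vector<float> data(n, 1.0f);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;                     // fallback if unknown

    std::vector<std::thread> workers;
    const std::size_t chunk = n / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c + 1 == cores) ? n : begin + chunk;
        // Each core gets its own slice - no sharing, so no locks needed.
        workers.emplace_back([&data, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                data[i] = data[i] * 2.0f + 1.0f;
        });
    }
    for (auto& t : workers) t.join();

    std::printf("data[0] = %.1f, data[%zu] = %.1f\n",
                data[0], n - 1, data[n - 1]);
    return 0;
}

The more of the chip you can point at loops like that - graphics, physics or anything else data-parallel - the more the CPU/GPU distinction starts to blur.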

Quote:
Originally Posted by Zachalmighty View Post
3 years from now.
Quote:
Originally Posted by Reality View Post
I'm thinking more like 5-6.
Both of you are wrong.

The earliest models are planned to be deployed in either late 2009 or early 2010 - approximately 1.5 years from now. Shortly after that we should be seeing motherboards that support these new technologies. AMD - to my knowledge - is also planning something similar, which we may see in a similar time frame.

Of course - this is assuming that there are no additional project delays.
__________________
"A lie travels half-way around the world while the truth is still putting on its boots."
-Mark Twain

Last edited by flowForth; 05-10-2008 at 08:16 AM. Reason: Clarified post.
flowForth is offline   Reply With Quote
Old 05-10-2008, 10:49 AM   #13
Reality
Sin and Punishment
 
Reality's Avatar
 
Join Date: Oct 2002
Location: Ohio
Posts: 14,620
Well I'm all for anything that helps improve PC hardware. I'm just doubtful.
__________________
Outpostnine is that 1%.
Reality is offline   Reply With Quote
Old 05-11-2008, 05:43 AM   #14
Zack
Registered User
 
Zack's Avatar
 
Join Date: Apr 2002
Posts: 1,019
Didn't Sony try something like this on the PS3 but end up failing at it? I bet this type of technology will be in the PS4 or Xbox '720'.
Zack is offline   Reply With Quote
Old 05-11-2008, 05:53 AM   #15
flowForth
Programmer
 
flowForth's Avatar
 
Join Date: Dec 2005
Location: Australia
Posts: 455
Quote:
Originally Posted by Zack View Post
didn't sony try something like this on the ps3 but ended up failing at it? i bet this type of technology will be in the ps4 or xbox'720'.
Not that I'm aware of - Sony simply believed at first that the Cell would be powerful enough to not need a GPU at all, which turned out to be an overestimation of the Cell's capabilities.

And yeah - it's safe to say that further down the line the consoles will adopt similar technology. Developments in console technology tend to mirror advancements in PC technology, just later.
__________________
"A lie travels half-way around the world while the truth is still putting on its boots."
-Mark Twain
flowForth is offline   Reply With Quote