OMF:BG Message Board
One Must Fall: Battlegrounds
 

quad core movie...

 
ouch



Joined: 04 Jan 2004
Posts: 501
Location: In a Mantis :)

Posted: Mon Oct 02, 2006 1:03 pm    Post subject: quad core movie...

http://www.techeblog.com/index.php/tech-gadget/intel-quad-core-gaming-demo

If you listen to your computer case closely while playing this movie, you can almost hear your insignificant processor cry...
_________________
"Float like a Mantis, sting like a Cheatlord..."

ouch

scientifically engineered to hate StarForce

salem



Joined: 07 Aug 2003
Posts: 1778
Location: Spain

Posted: Tue Oct 03, 2006 11:51 am

Alan Wake :)
It has every chance of becoming a great title.
Endy



Joined: 31 Jul 2003
Posts: 4079
Location: Lost within my own thoughts.

Posted: Tue Oct 03, 2006 11:08 pm

Like I have been saying, high core count CPUs are going to kill off this physics card bullshit before it even gets started. GPU manufacturers had better watch out too.

I could easily see multicore CPUs taking over a lot of the dedicated functionality of modern GPUs.
_________________
"I reject your reality, and substitute my own!" - Adam Savage (MythBusters)
"I don't know the key to success, but the key to failure is trying to please everyone." - Bill Cosby
killer_roach



Joined: 18 Jan 2001
Posts: 8048
Location: Lexington, Kentucky

Posted: Wed Oct 04, 2006 7:51 am

Endy wrote:
I could easily see multicore CPUs taking over a lot of the dedicated functionality of modern GPUs.


Not when a Radeon X1900XTX has over fifteen times the floating-point performance of a Core 2 Duo (375 GFLOPS versus 24 GFLOPS), and that's not even getting into what both G80 and R600 are rumored to be capable of (hint: it's getting very, very close to a teraflop, if not there)... general purpose CPUs won't touch that level of performance for quite some time, unless we start seeing 24-core CPUs.

The fact of the matter is, GPUs do what they do astonishingly well. Granted, the GPUs are also getting into sickeningly high transistor counts and thermal envelopes (the G80 is allegedly 700 million transistors, and can suck down close to 300W of juice), but even per-transistor or per-watt they easily skunk a general purpose CPU for the heavily FP-intensive task of 3D rendering (in comparison, a Core 2 Duo is over 290 million transistors and consumes around 60W).
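
To put those numbers side by side (using only the figures quoted above, which are rumors and rough estimates, not measurements), a quick back-of-the-envelope calculation:

Code:
// Back-of-the-envelope comparison using the figures quoted above
// (375 GFLOPS / ~300 W / ~700 M transistors for the GPU,
//  24 GFLOPS / ~60 W / ~290 M transistors for the Core 2 Duo).
// All inputs are rumored/estimated numbers, not measurements.
#include <cstdio>

int main() {
    const double gpu_gflops = 375.0, gpu_watts = 300.0, gpu_mtransistors = 700.0;
    const double cpu_gflops = 24.0,  cpu_watts = 60.0,  cpu_mtransistors = 290.0;

    std::printf("Raw throughput ratio:      %.1fx\n", gpu_gflops / cpu_gflops);
    std::printf("GFLOPS per watt:           GPU %.2f vs CPU %.2f\n",
                gpu_gflops / gpu_watts, cpu_gflops / cpu_watts);
    std::printf("GFLOPS per M transistors:  GPU %.2f vs CPU %.3f\n",
                gpu_gflops / gpu_mtransistors, cpu_gflops / cpu_mtransistors);
    return 0;
}

That works out to roughly 15.6x raw throughput, 1.25 vs 0.40 GFLOPS per watt, and 0.54 vs 0.08 GFLOPS per million transistors, which is where the "even per-transistor or per-watt" claim comes from.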
_________________
Official forum economist. Explodes when thrown.

Desktop:
Core i7-2600K @ 4.5 GHz
16GB DDR3-1600
eVGA GeForce 970 FTW
Windows 10 Preview

Laptop:
Core 2 Duo 2.8 GHz
4GB DDR3-800
Radeon 4570 1GB
Windows 7 Home Premium x64
Endy



Joined: 31 Jul 2003
Posts: 4079
Location: Lost within my own thoughts.

Posted: Wed Oct 04, 2006 4:25 pm

I'm not saying to replace everything. I'm talking about offloading big chunks of stuff to its own CPU and using a far cheaper video card. It wouldn't kill off the need for a video card; what it would kill off is the need to buy a $500 video card every 6-12 months to stay leading edge.

You might only buy one video card with the system and never change it. You know, like back in the olden days when the CPU did more than half of the video processing.

For example: imagine a CPU (or two!) dedicated to JUST pixel shading, or just antialiasing, or whatever high-consumption GPU operation you can think of. You can offload stuff that takes massive GPU cycles and just let the card do the basic rendering or whatever.

You can do all the special effects stuff in the CPU.

All this "on video card physics" stuff is history. You can just offload it to a CPU. Hell that will be the first application way before video rendering gets to the CPU level.

How about audio? How about pretty much anything? With enough cores, expansion slots will just serve as places to attach more in/out ports.
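
A minimal sketch of that kind of split, with physics and audio each on their own dedicated thread while the main thread keeps the render loop (the subsystem names and stub update calls are made up purely for illustration):

Code:
// Toy sketch of the "give each heavy subsystem its own core" idea:
// physics and audio each run on a dedicated worker thread while the
// main thread keeps the render loop. The subsystem names and stubbed
// update calls are hypothetical (C++11 threads used for brevity).
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

void physics_worker() {            // would own collision/physics state
    while (running)
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // stepPhysics()
}

void audio_worker() {              // would own mixing/DSP state
    while (running)
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // mixAudio()
}

int main() {
    std::thread physics(physics_worker);  // a real engine would pin these to cores
    std::thread audio(audio_worker);

    for (int frame = 0; frame < 60; ++frame)                         // render loop;
        std::this_thread::sleep_for(std::chrono::milliseconds(16));  // GPU just rasterizes

    running = false;
    physics.join();
    audio.join();
    return 0;
}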


And everyone is saying extremely high core count processors are only a few years out from quad-core. Now that the megahertz myth has been exposed to the maximum... processor count has become the new megahertz, folks; they are gonna beat this horse for all it's worth.

Remember how AMD bought ATI? Man, that was sheer genius; they saw it coming and acted while everyone was still speculating. It's pretty easy to see which direction they're headed in.

Both companies already have 10-year roadmaps laid out to double-digit and higher (in some cases really high - 32, 100+) core processors, and you'd better believe AMD is going to integrate some of that ATI stuff into its newest architectures. 8)




Just remember where you heard it first.
_________________
"I reject your reality, and substitute my own!" - Adam Savage (MythBusters)
"I don't know the key to success, but the key to failure is trying to please everyone." - Bill Cosby
killer_roach



Joined: 18 Jan 2001
Posts: 8048
Location: Lexington, Kentucky

Posted: Wed Oct 04, 2006 7:42 pm

And now the rumor is that Intel is about to announce an attempt to buy out nVidia...

The biggest problems with using a CPU for even some limited graphics tasks are that 1) two distinct chips are rather difficult to get to behave together with low latency (although initiatives like Torrenza may fix this), 2) system memory is much slower than video card memory (which means either a slowdown or a price spike in system memory), and 3) demands on video hardware have risen just as quickly as the hardware itself has been progressing. If new cards are making the old ones obsolete every six months to a year, then new games will be making old video cards obsolete in the same time frame.
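
To put point 2 in rough numbers (peak bandwidth is just bus width times transfer rate; the figures below are approximate 2006-era values assumed for illustration: dual-channel DDR2-800 system memory versus 256-bit GDDR3 at ~1550 MT/s, roughly X1900XTX-class video memory):

Code:
// Rough illustration of point 2: peak bandwidth = (bus width / 8) * transfer rate.
// Figures are approximate 2006-era numbers assumed for illustration only.
#include <cstdio>

double peak_gb_per_s(int bus_bits, double mega_transfers_per_s) {
    return (bus_bits / 8.0) * mega_transfers_per_s / 1000.0;  // bytes/transfer * GT/s
}

int main() {
    double system_mem = peak_gb_per_s(128, 800.0);   // dual-channel DDR2-800, ~12.8 GB/s
    double video_mem  = peak_gb_per_s(256, 1550.0);  // 256-bit GDDR3, ~49.6 GB/s
    std::printf("System memory: %.1f GB/s, video memory: %.1f GB/s (%.1fx)\n",
                system_mem, video_mem, video_mem / system_mem);
    return 0;
}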

However, what you may see is, like you may be suggesting (although not completely realizing it), both a GPU and a CPU on one die sharing a pool of ultra-high-speed memory. This will become particularly feasible once manufacturing processes get down to around 32nm, which is expected to happen sometime in 2009...
_________________
Official forum economist. Explodes when thrown.

Desktop:
Core i7-2600K @ 4.5 GHz
16GB DDR3-1600
eVGA GeForce 970 FTW
Windows 10 Preview

Laptop:
Core 2 Duo 2.8 GHz
4GB DDR3-800
Radeon 4570 1GB
Windows 7 Home Premium x64
Vuen



Joined: 31 Aug 1999
Posts: 4968
Location: Canada

Posted: Thu Oct 05, 2006 2:49 am

killer_roach wrote:
However, what you may see is, like you may be suggesting (although not completely realizing it), both a GPU and a CPU on one die sharing a pool of ultra-high speed memory.


I see this happening as well, but I doubt it will still be called a GPU for long. In the wake of GPU-accelerated physics, they probably intend to turn the modern GPU into something of a general purpose floating-point processor for graphics, physics, etc.

I got a great idea for a name, too. They should call it a "math coprocessor". :lol:

Endy wrote:
IE: Imagine a CPU (or two!) dedicated to JUST pixel shading or just antialiasing or whatever high consumption GPU operation you can think of. You can offload stuff that takes massive GPU cycles and just let the card do the basic rendering or whatever.


As everyone has been telling you, stuff like pixel shading requires large amounts of floating-point computation. You don't seem to realize that it's not just a matter of adding a bunch of CPU cores to replace the GPU; they compute things differently, and a GPU can do these things much faster. If what you're saying is true, why would AMD bother to buy ATI at all? Why not just replace the GPU entirely and let ATI die?

Anyway, pixel shading and anti-aliasing are much closer to the actual rendering than many other aspects of GPU computation, such as vertex transformations. Those are things that could be off-loaded to a multi-core CPU, but from an AMD+ATI standpoint they'd be better off making a faster GPU for it.
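
For a concrete picture of what that kind of offload looks like, here is a minimal sketch of transforming a batch of vertices by a 4x4 matrix with the work split across CPU threads (the data layout, thread count, and names are arbitrary choices, just for illustration):

Code:
// Sketch of the sort of work that could move to CPU cores: transforming a
// batch of vertices by a 4x4 matrix, with the batch split across threads.
// Data layout and thread count are arbitrary choices for illustration.
#include <array>
#include <thread>
#include <vector>

struct Vec4 { float x, y, z, w; };
using Mat4 = std::array<float, 16>;  // row-major

Vec4 transform(const Mat4& m, const Vec4& v) {
    return { m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w,
             m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w,
             m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w,
             m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w };
}

int main() {
    const Mat4 mvp = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};  // identity stand-in
    std::vector<Vec4> verts(100000, Vec4{1, 2, 3, 1});
    std::vector<Vec4> out(verts.size());

    const unsigned n_threads = 4;  // one chunk per core
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n_threads; ++t) {
        pool.emplace_back([&, t] {
            size_t begin = verts.size() * t / n_threads;
            size_t end   = verts.size() * (t + 1) / n_threads;
            for (size_t i = begin; i < end; ++i)
                out[i] = transform(mvp, verts[i]);
        });
    }
    for (auto& th : pool) th.join();
    return 0;
}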

Hence the move to combine the CPU+GPU rather than just replace it.
killer_roach



Joined: 18 Jan 2001
Posts: 8048
Location: Lexington, Kentucky

Posted: Thu Oct 05, 2006 5:40 pm

If we're getting into graphics specialization, I'd like to see a GPU co-processor capable of handling decent workloads for either vector calculations or ray-tracing. :)
_________________
Official forum economist. Explodes when thrown.

Desktop:
Core i7-2600K @ 4.5 GHz
16GB DDR3-1600
eVGA GeForce 970 FTW
Windows 10 Preview

Laptop:
Core 2 Duo 2.8 GHz
4GB DDR3-800
Radeon 4570 1GB
Windows 7 Home Premium x64
ouch



Joined: 04 Jan 2004
Posts: 501
Location: In a Mantis :)

Posted: Thu Oct 05, 2006 5:43 pm

Not that this is the place for it, but do you know you can't click on your anti-StarForce banner?
_________________
"Float like a Mantis, sting like a Cheatlord..."

ouch

scientifically engineered to hate StarForce

killer_roach



Joined: 18 Jan 2001
Posts: 8048
Location: Lexington, Kentucky

Posted: Thu Oct 05, 2006 6:40 pm

Yeah, I know that... I tried changing my sig, and the new forums have a limit on how long your sig can be... it wasn't letting me save it with the link intact for some odd reason (I think it was still under 250 characters...).
_________________
Official forum economist. Explodes when thrown.

Desktop:
Core i7-2600K @ 4.5 GHz
16GB DDR3-1600
eVGA GeForce 970 FTW
Windows 10 Preview

Laptop:
Core 2 Duo 2.8 GHz
4GB DDR3-800
Radeon 4570 1GB
Windows 7 Home Premium x64
ouch



Joined: 04 Jan 2004
Posts: 501
Location: In a Mantis :)

Posted: Thu Oct 05, 2006 9:52 pm

mine works (just tried it)
_________________
"Float like a Mantis, sting like a Cheatlord..."

ouch

scientifically engineered to hate StarForce

Grom_hellscream



Joined: 21 Jul 2003
Posts: 1131
Location: Noware. Norway. erath.

Posted: Fri Oct 06, 2006 12:03 am

I expect AMD+ATI will make the CPU and the GPU work better together, so it becomes more like a Mac: the same name on the whole core system, with an ATI GPU, chipset, and AMD CPU. Myself, I'd prefer AMD+nVidia, but I'm not a god. Yet.. ;p

I think this will work out really well with multi-GPU and multi-CPU.