I think Intel has a little more to lose than Nvidia when it comes to general purpose computing. That's what I meant when I said Intel is scared. I know Nvidia is in for a tough fight. We've been seeing a lot of hostility between the Intel and Nvidia execs lately; it seems that whenever one side makes a statement, the other refutes it almost instantly. It almost looks personal.
Larrabee is a many-core CPU built from lots of simple cores that handle both CPU and GPU instructions. It's going to be great for certain applications but total shit as a graphics card: an add-on card with integrated-level performance. I believe Intel plans to build Larrabee into certain chipsets to replace their conventional integrated solution. No serious gamer will want to replace an ATI or Nvidia card with that. Nvidia is guaranteed to have far more market penetration with its conventional GPU line than Larrabee will. I don't see Larrabee as a threat to Nvidia's GPGPU direction, because Nvidia is going to sell its GPUs anyway. However, CUDA does need to be adopted, as you said. Nvidia knows this; that's why they have offered universities big grants to support courses teaching CUDA. Larrabee is good for non-gamers who want to accelerate parallel applications.
AMD's Fusion is literally a CPU and a real GPU glued together on one chip. This will be big for laptops and smaller devices. With Fusion you can hit a decent level of performance at a relatively low TDP by combining the GPU and CPU. It's a neat idea because you avoid the latency of the GPU and CPU communicating over the PCIe bus. Fusion is not a high-end part; it's designed for power and cost efficiency. I'm fairly certain this will be a very successful part for AMD.
In other words, all three of Intel, Nvidia, and AMD have different solutions for different problems.
If Larrabee can be added alongside an Nvidia or ATI GPU and work with it, then it might be interesting for guys like us. Larrabee could support the GPU and CPU by handling things like physics or ray tracing, and it could also back up the CPU in heavily threaded applications. However, if Intel expects us to replace our high-end GPUs with that inferior junk, it just won't happen.
As for CUDA not having fuck all, that's not true. The GTX 280, and even lesser GPUs, are already destroying Intel's best CPUs in the applications CUDA can run. I'm not talking about 5% faster; more like 20 times faster in things like video encoding. It's really exciting stuff.
http://www.nvidia.com/object/cuda_home.aspx# It's only a matter of time before we see more applications with CUDA support. An Nvidia GPU has so much more potential throughput than any CPU that competition between applications should push developers to support CUDA.
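To give an idea of what "parallel applications" actually means here, below is a rough sketch of my own of a trivial CUDA kernel (not taken from any shipping app, the names and numbers are made up). Every array element gets its own GPU thread, which is why the GTX 280's 240 stream processors can pull so far ahead of a quad core CPU on this kind of workload.

[code]
// Illustrative sketch: scale a big array on the GPU, one thread per element.
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;   // thousands of these run in parallel on the GPU
}

int main(void)
{
    const int n = 1 << 20;                          // ~1 million elements
    float *d_data;
    cudaMalloc((void **)&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale<<<blocks, threads>>>(d_data, 2.0f, n);    // launch across the whole GPU
    cudaDeviceSynchronize();                        // wait for the kernel to finish

    cudaFree(d_data);
    return 0;
}
[/code]

Encoding, folding, image filters: anything that looks like "do the same math to millions of independent elements" maps onto this model, and that's where the 20x numbers come from.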