Quote from: PabloMack on April 24, 2010, 09:34:47 AM
Quote from: penang on April 24, 2010, 03:39:43 AM
I thought Bulldozer won't arrive before 2011?
This is probably true, but 2011 is now little more than 8 months away. What any one system will be able to do with TG2 in the meanwhile will be quite limited. AMD seems to be adding more cores to its Phenom II offerings step by step, but 45nm can only take that so far.
The 45nm versions of Bulldozer are trial runs. These have been taped out and will be produced later this year. Their 32nm cousins will have to wait.
AMD still owns part of Globalfoundries, and Globalfoundries is going to produce 28nm FPGA chips for Altera, starting in August or September of this year.
What this means is that AMD is willing to wait just a bit longer: wait until they gain some first-hand experience from the 28nm production for Altera, then apply those lessons to the 32nm design of the Bulldozers that will come out next year.
Quote from: PabloMack on April 24, 2010, 09:34:47 AM
Without something like hyper-threading, Phenom II should have more silicon real estate to work with, to put more real cores on a die than Intel can with the same scale of geometry. As for TG2, the single thread that is used to construct the pre-render window seems to need the boost from hyper-threading to make that single thread run faster. The people with i7s are benefiting from this small margin of improvement. But AMD is working on clocking up single threads automatically to take up the slack in near-future offerings. I saw an article that said someone overclocked a Phenom II X4 to run over 7 GHz reliably and said it was a world record. This system, though, was set up with liquid cooling. It is good to know that the silicon can go much faster than it is being pushed, and that heat buildup is the limiting factor in the Phenom II line.
I do think that Bulldozer involves a new core design, because it will have some sort of multi-threading capability to address the hyper-threading issue.
Bulldozer represents the next chapter for AMD.
AMD has been riding on the Athlon 64 architecture for the past 9 years. Almost every core from AMD, from the original Athlon onwards, was based on the Athlon architecture.
Quote from: PabloMack on April 24, 2010, 09:34:47 AMBut AMD is planning to put a GPU on the same die.
Hmm... This is the first time I've heard that.
I do not think AMD will stick their GPU onto the Bulldozer architecture itself. Sure, some Bulldozer-derived CPUs may have an ATI GPU glued to them, but it isn't AMD's intention to serve Bulldozer only to the gaming industry.
AMD's aim for Bulldozer is the datacenter and supercomputing industries, where massive heavy-duty data and number crunching is the topmost priority.
Quote from: PabloMack on April 24, 2010, 09:34:47 AM
This could speed up processing by much more than what you see with today's GPUs once software companies like PS start to use it. I've been reading semiconductor news, and it seems that there are more problems with going to 32nm than people realize. It involves immersion processes that depend on equipment made by suppliers that customers like Intel and AMD rely on, and the technology is not there yet. Apparently there is a considerable cost involved in doing the development, and no one is stepping up to the plate. The article seemed to be addressing the often-cited coming end of "Moore's Law". I think the industry could be approaching the end of what light can do at such small scales. What is the next step? X-rays or scanning electron beams?
Globalfoundries, Intel, Micron, IBM, and TSMC are first-tier fabs. They get first priority for immersion technology.
Globalfoundries, being flush with oil money from the Arab countries, already has plenty of the new equipment installed; that is why they have signed on Altera to produce the 28nm FPGAs.
Immersion technology uses deep-ultraviolet light, which is very close to X-rays. It's safe to say that deep ultraviolet is usable down to the 15nm generation. Beyond that it would be X-ray (radiation) lithography, and beyond THAT it would be gamma-ray (heavy radiation) lithography.
I do not foresee that happening, though.
By the time 15nm hits, the bottom-up technology, aka nano-tech, will have ripened. And in the future, electronic chips will be "grown", particle by particle, line by line, gate by gate, instead of "etched", like what we have today.