04-06-2011, 12:22 AM
(This post was last modified: 04-06-2011, 12:45 AM by Concillian.)
(04-05-2011, 05:21 PM)MEAT Wrote:(04-05-2011, 08:34 AM)Concillian Wrote: The video card in those systems is the same. A 6770 is just a 5770 with a new label on it. 6770 are sold only to OEMs, basically so OEMs can say it's a 6 series card, even though it's unchanged from the 5 series (since it was a pretty small jump anyway.)
I was under the impression that the 6000 series was faster than the 5000 series. I know it was built off the same model, but on sites such as tomshardware, the 6000 series does substantially better than the 5000 series and is about average in power as video cards go. Is this not true for the 6770 OEM version?
The 6770 is just a rebadge.
Specs on 6770 (OEM):
http://www.amd.com/us/products/desktop/g...iew.aspx#2
Specs on 5770:
http://www.amd.com/us/products/desktop/g...iew.aspx#2
Both are 800 shaders / 40 texture units / 16 ROPs
Both list 1.36 Teraflop
Both list 76.8 GB / sec bandwidth
They're exactly the same.
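The listed numbers line up as simple arithmetic, too. A minimal sketch of that check, assuming the 5770's reference clocks (850 MHz core, 1200 MHz GDDR5 on a 128-bit bus, which are my assumptions, not from the spec pages quoted above):

```python
# Back-of-the-envelope check that the 5770/6770 spec-sheet numbers
# follow from identical underlying hardware.
# Assumed reference clocks: 850 MHz core, 1200 MHz GDDR5 (4 transfers/clock).

shaders = 800
core_clock_hz = 850e6
# Each shader does 2 ops per cycle (a fused multiply-add).
flops = shaders * 2 * core_clock_hz
print(f"{flops / 1e12:.2f} TFLOPs")   # matches the 1.36 TFLOPs both pages list

bus_width_bits = 128
mem_clock_hz = 1200e6
# GDDR5 transfers 4 bits per pin per clock; divide by 8 for bytes.
bandwidth_bytes = mem_clock_hz * 4 * bus_width_bits / 8
print(f"{bandwidth_bytes / 1e9:.1f} GB/s")  # matches the 76.8 GB/s both pages list
```

Same inputs, same outputs — which is the point: there's no spec on the 6770 (OEM) page that a 5770 doesn't already hit.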
Other cards in the 6xxx family are different. They changed naming scheme to add a "tier" so they aren't all faster than the previous 5xxx card with corresponding number, but they do tend to compete well with their given price and power consumption. 6750 and 6770 though are exact 5xxx parts called 6xxx so OEMs can have a 6 series card in that price range.
5770 is still a very competitive card in its price range. It's not really a bad thing. No nVidia card can touch its power consumption / price / performance point. The GTX460-768 is good on price / performance if you can find it on sale / rebate, but it has significantly higher power consumption. I have one and it suits my current needs, but I use a fairly low resolution.
As I mentioned in the initial response to Jarulf, the latest gen was meant for a smaller process, but got stuck on 40nm. The result has largely been adjusting product lines to compete with competitors at various price points. At price points where nVidia had a card that AMD didn't (most notably GTX 460 - 1GB), AMD created a new 6 series card (6850... slower than a 5850, but much cheaper). At price points where AMD had a card that nVidia didn't, nVidia created a new card. The result is a lot of cards. There are like 10 different models for either brand between $100 and $300. Almost every $20 there's another card.
A 6850 is slower than a 5850 at most things, but there was no 5950, and now there is a 6950. The numbers don't always match up, but the 6 series is pretty competitive. As I said earlier, the 5770 was already winning in that segment, so AMD didn't need to re-jigger its lineup in that part of the market. They still don't, so they just slapped a 6770 label on a 5770. They would take huge flak from reviewers if they did that across the board, so they did it OEM only, where it makes sense: full system purchasers are usually less informed consumers who will more readily discount a 5770 just because it's a 5xxx instead of a 6xxx. Discrete card purchasers are more likely to know it's a re-badge, and the press would roast AMD for it, as they gave nVidia a TON of crap for rebadging the 8800GT twice (9800GT, GTS250).
MEAT Wrote:Jim Wrote:USB 2.0 [your next digital device will come with USB 3.0]
3.0 is already readily available in computers? Is it a tech that will be a must have? Or is it more like, meh... wait 5 years before worrying about it?
I'm kind of agnostic about USB 3. There are devices available right now, but there is a significant cost associated. It's readily available on motherboards and USB 3 external hard drives (or standalone external enclosures) are also readily available, and significantly faster than USB 2.0. Cost adder on motherboards is barely less than buying an add-in card.
Currently, external hard drives are about the only thing that really makes use of the additional speed, and eSATA (or even jury-rigging an internal SATA port to be used outside your system) does not have the same price adder and is even faster than USB 3. So I don't make a big deal of it on my own machines.
Since there are no fully integrated USB 3 ports yet (no chipset natively supports USB 3.0; they are all tacked onto motherboards with a separate USB 3.0 controller chip), they really haven't "taken off", but it's likely just a matter of time. Intel is offering some competing technology, which by launching on Mac first (already available on MacBook Pros) looks to be the "new Firewire": we will have Thunderbolt vs. USB 3.0 much as last gen it was Firewire vs. USB 2.0 as the "port battle" du jour. Hard to say for sure who will "win", but I'd assume whoever isn't associated with Macs, since that usually commands an irrational price premium.
Conc / Concillian -- Vintage player of many games. Deadly leader of the All Pally Team (or was it Death leader?)
Terenas WoW player... while we waited for Diablo III.
And it came... and it went... and I played Hearthstone longer than Diablo III.