slain - Friday, October 14, 2005 - link
WTF?? Man, who cares about gaming? As if it matters that you have 10 displays for a game?? This has to be designed to be perfectly suited to multichannel VIZ and Sim, a graphics cluster killer before clusters even took off, AKA where SGI and E&S have played forever. This could be the final nail in the coffin... my heart bleeds ;) Think about it: four genlocked Quadros for 8 edge-blended, quad-buffered channels. This is the sort of thing you drive planetariums and VR centres with, *NOT* games.

Where can I get one?
hoppa - Friday, October 14, 2005 - link
"10 displays should be enough for anyone"~Bill Gates
vailr - Thursday, October 13, 2005 - link
No mention of an (Athlon CK804) driver for the South bridge: http://www.anandtech.com/mb/showdoc.aspx?i=2561&am... Only for the North bridge:
"System Platform Drivers: NVIDIA nForce4 SLI Intel Edition 7.13"
Also: was the automated driver installer used, or was a manual "Device Manager driver" install routine required? Due to the mix of the Intel N. bridge and an AMD S. bridge?
Gary Key - Friday, October 14, 2005 - link
Actually, only the Intel driver set is required. I will post a more detailed response later today. We used the automated installation program, and you will find the Intel MCP is just a subset of the AMD CK804. As an example, I have actually used the Intel IDE drivers on my nForce4 board.

R3MF - Thursday, October 13, 2005 - link
One 2405FPW and two 1704FPVs. And the answer is...
Gary Key - Friday, October 14, 2005 - link
Email me and I will set up a test configuration for you, as I will have that same monitor delivered next week.

Powered by AMD - Wednesday, October 12, 2005 - link
I couldn't find the link to download the BF2 AT demo so I can benchmark it myself. Anyone found it?

Nice review, BTW. Hope NVIDIA supports this board in future drivers, just to see four-video-card benches.

BTW, I wouldn't buy this board; it isn't available for the best gaming processor.
JarredWalton - Thursday, October 13, 2005 - link
It's still "beta" from my perspective, and it hasn't been published. My next article with benchmarks will hopefully include the demo and other required files for running the BF2 benchmark, but just FYI, it isn't meant for non-technical types. You'll have to edit a batch file, and it doesn't automatically set BF2 settings (other than resolution) - it just runs with whatever settings BF2 is currently using. Stay tuned....PrinceGaz - Thursday, October 13, 2005 - link
PrinceGaz - Thursday, October 13, 2005 - link

Good stuff on adding to the range of games used for benchmarking, and a most excellent choice in BF2, as AT reviews have been lacking in benchmarks using FPS games lately. Adding a seventh FPS title when there are none from any other genre (except Aquamark 3, which is dubious at best as a representative sim) was a great idea, as FPS games are all anyone plays at AT. If the recently released Serious Sam 2 is as fun as the two episodes of the original Serious Sam, it might be worth adding that too.

But seriously, whilst taking the time to add BF2 to the benchmark suite is probably a good idea as it is very popular, you really should consider games other than what you like -- such as racing, flight/space sims, above-view RPGs, RTS, etc. It's no wonder the benchmarks are all so predictable, with the main difference between gfx cards being OpenGL versus Direct3D games, when all the games are basically displaying the same kind of scenes.
Gary Key - Friday, October 14, 2005 - link
I actually ran benchmarks on NASCAR SimRacing (Daytona on four monitors is incredible), LOMAC, Falcon 4: Allied Force, GTR, and Call of Duty 2. We firmly believe the standard benchmarks need some additions to represent the overall gaming experience. You will see some of these results (plus a couple of RTS/RPG titles) shortly in the next "SLI x16" review. As noted in our sound test, we will greatly expand the information provided for on-board solutions shortly to also include the new Dolby Master Studio suites shipping with the SigmaTel 922x and Realtek 882M audio options. Once this board is released for production we will do a complete follow-up that will concentrate on multiple-monitor usage.

PrinceGaz - Friday, October 14, 2005 - link
Great to hear you are expanding the variety of games tested.

As for onboard audio, I did originally use the onboard Realtek ALC850 audio on my DFI LanParty nF4 SLI-DR, with the official codec from the Realtek site (at first version A375, then A376a when I noticed the problem, but it made no difference), but found that it has a rather annoying bug.
In some games, certain sounds that I know for a fact should be played are totally missing. All the other sounds are there, but the odd one is just not played. The most obvious example was in 'Serious Sam: The Second Encounter' (not the new Serious Sam 2 as I haven't played that), where it did not play the quiet intro music for the few seconds when first loading the game, nor did it play the chainsaw sound that follows. Another game I play enough to know when something is wrong is 'Train Simulator', where on one particular route I like, the "Okay to proceed" sound that is played when you can leave each station was never being played, which was rather problematic to say the least. There could well be other sounds missing as well, but only those two were sufficiently obvious to be immediately noticed. I tried using "dxdiag" to reduce Sound Acceleration Level, but it made no difference unless reduced all the way to None (when the missing sounds were then played) but that causes more problems than it fixes.
The C-Media CMI8738 onboard sound on my older box never had any problems with missing sounds, so I was very disappointed with the ALC850, especially as it seems almost industry standard on AMD nForce4 boards. As a result I bought an Audigy 2ZS, which works perfectly and plays all sounds, but it seems a shame that the onboard sound for me is basically useless for gaming.
Gary Key - Wednesday, October 19, 2005 - link
I agree with you in regard to the static and dropout issues with the ALC850. I had nothing but issues with the A376a driver set in Call of Duty and the retail release of F.E.A.R. last night. Also, the general sound effects were thin and lacking any bass in most scenes. Music was not acceptable, with the general sound coming across like a cat on a hot tin roof. ;-> If I have time, the next review will also include X-Fi results, as it is now our high-end consumer card for the test bed benchmarks. The Intel-manufactured boards with the SigmaTel 922x series codecs have the best overall sound of the host-based audio solutions at this point. Expect to see these results and further testing of the ALC882M in the near future.

Calin - Thursday, October 13, 2005 - link
As you don't have driver support for SLI on four cards, and PCIe x1, x2, or x4 performance would probably be enough for your 3rd and 4th video cards, it would be a waste of money. Go buy any other SLI board.

However, multi-monitor support is usually needed by programs that work faster on Intel processors, and the cheapest dual core from Intel would work faster than any processor AMD has to offer at that price.
trooper11 - Thursday, October 13, 2005 - link
It's all about return on your investment. Yes, the bottom-of-the-barrel Intel dual core is cheaper, but just move to the mid-range where it squares up against the X2 3800+ and X2 4400+, and then it swings in AMD's favor.
TheInvincibleMustard - Wednesday, October 12, 2005 - link
Couple o' things:

1) Awesome to see BF2 as a benchmark (thanks Jarred!)
2) How nicely would a setup such as this play with Intel's new virtualization technology? Would a solution that allows multiple graphics cards like this (not necessarily this exact board) be a better approach to allowing multiple users to each have their own KVM? I'm envisioning something akin to the "dumb-terminals" of yesteryear, with a family having multiple monitors, keyboards, and mice all hooked up to 1 pc in the house.
3) On pg2 there's a pic of the BIOS showing the settings for the PCIe lanes. Is there some specific difference between the 0-3D1-16-1 and the 0-3D1-3D1-1 setup? Or are both utilizing 16 lanes for each of the 3D1's and it's just logic on the motherboard to differentiate so it accepts the correct card?
4) Also regarding the PCIe lanes, I see there's no 0-16-16-1 or equivalent. Is this intentional on the part of Gigabyte? Will a BIOS upgrade allow for this? The reason I'm asking is because I'm curious if there would be a difference in terms of SLI speeds w.r.t. 8-8 vs 16-16, as has been somewhat hinted at in the "SLI x16" snippets I've heard thus far, and this would seem to be the perfect motherboard to test for that.
5) Any speculation on why the Doom3 scores show such a spread while others don't show as much of one?
6) When are they gonna be available for purchase? :D
Thanks guys for a very neat preview of an interesting upcoming product!
-TIM
Gary Key - Thursday, October 13, 2005 - link
Part Two, Tim -
quote: 3) On pg2 there's a pic of the BIOS showing the settings for the PCIe lanes. Is there some specific difference between the 0-3D1-16-1 and the 0-3D1-3D1-1 setup? Or are both utilizing 16 lanes for each of the 3D1's and it's just logic on the motherboard to differentiate so it accepts the correct card?

No. The difference between the two setups is that the third PEG slot can utilize a card other than the 3D1 in an x16 configuration if you use 0-3D1-16-1. In fact, due to the space limitations caused by the rear heatsink on the 3D1 rev1 cards, we used both a 6600 GT and a 7800 GTX in this slot. The board does require the separate paddle card for the 3D1 card in order to utilize both cards correctly (100%) in my testing. The BIOS does allow this change, but the paddle is the preferred method, at least in the pre-production BIOS. I typically set the BIOS to auto and utilized the paddle card, although both methods were tested to ensure it was possible. I tried the two center slots (easier to type this way) in SLI with the two outer slots in standard mode. I could not get the two center slots to work properly in SLI mode, but this was due to the drivers and not the board.

quote: 4) Also regarding the PCIe lanes, I see there's no 0-16-16-1 or equivalent. Is this intentional on the part of Gigabyte? Will a BIOS upgrade allow for this? The reason I'm asking is because I'm curious if there would be a difference in terms of SLI speeds w.r.t. 8-8 vs 16-16, as has been somewhat hinted at in the "SLI x16" snippets I've heard thus far, and this would seem to be the perfect motherboard to test for that.

I have a new BIOS coming from Gigabyte that hopefully will allow additional changes to the PCIe lanes in manual mode with the paddle card set for SLI x16. Under auto mode the system will default to a 1-16-16-1 setting with the paddle card set to SLI. I did test in this mode, but due to the inability of the 840EE to feed enough data to the two 7800 GTX cards, the benchmarks did not reflect any difference. I am also testing another "SLI x16" board but have the same issue with the GPU wait states.

quote: 5) Any speculation on why the Doom3 scores show such a spread while others don't show as much of one?

The benchmarks jumped from the D5 to the D6 BIOS used for all results. I re-tested the other boards with their latest shipping BIOS and the Gigabyte 8I-955X Royal jumped almost 20%. I am still testing with different GTX cards (it's expensive to buy six of these) and driver sets. I cannot match the Abit scores yet and we are still comparing notes.

quote: 6) When are they gonna be available for purchase? :D

I had included this in my article but decided to pull the information, as I did not want to jinx Gigabyte or have an ATI situation. The best information I can provide at this time is December. The board is in certification testing at this time, and provided there are not any issues it should be out before January unless market conditions dictate otherwise. I will update the article or post a news blurb once the board enters production. We tested the revision 1.0 board and have worked extensively with Gigabyte on some BIOS enhancements. The current BIOS is at D9 and I am expecting a new spin next week. I know it is too late to change the sound solution, but we are still pushing for the 1394b setup.
I spent more than 110 hours of testing time on this board. I can honestly say without a doubt that it is ready for production.
Thanks,
Gary
TheInvincibleMustard - Friday, October 14, 2005 - link
Awesome replies! Better late than never!

Thanks!
-TIM
Gary Key - Thursday, October 13, 2005 - link
All,

I apologize for not responding sooner, as some serious family issues arose the past couple of days. I want to thank Wes for handling my responsibilities.
Tim,
quote: 1) Awesome to see BF2 as a benchmark (thanks Jarred!)

Jarred worked all night right before the article was published so we could include this benchmark. I wish I could have had more time with it in the overclocking and sound sections, but that will come in future articles.
quote: 2) How nicely would a setup such as this play with Intel's new virtualization technology? Would a solution that allows multiple graphics cards like this (not necessarily this exact board) be a better approach to allowing multiple users to each have their own KVM? I'm envisioning something akin to the "dumb-terminals" of yesteryear, with a family having multiple monitors, keyboards, and mice all hooked up to 1 pc in the house.

You are certainly on the right track with this thought process. All I can say at this time is wait until next year. ;->
DrMrLordX - Wednesday, October 12, 2005 - link
. . . is this paragraph:

It is this quick thought process along with quick action that has allowed Gigabyte to introduce several innovative products over the past year that include everything from the GA-8I945P dual graphics capable motherboard to the impressive single slot SLI based GV-3D1-68GT video card. While the true commercial success of these currently niche products are open for debate, the desire of the company to introduce these types of products is not.
Huh? Since when was their stupid single-slot SLI card innovative? They just crammed the logic from two 6600GTs onto one card, and the result was overpriced crap. No comment about the GA-8I945P, but it all sounds like Gigabyte corporate spew to me.
Wesley Fink - Wednesday, October 12, 2005 - link
Please look at your quote closely. We are talking about the 3D1-68GT, which combines two 6800 GT GPUs on a single card and NOT the earlier 6600 version. Please check the benchmarks before you trash the description as innovative. On p. 6 the 3D1-68GT outperforms the 7800GTX in both 3DMark03 and 3DMark05. That's pretty decent performance from a single slot card based on dual 6800GT (not Ultra) GPUs. The 7800GTX is still likely the better buy, but the 68GT is still an interesting idea with excellent performance.

DrMrLordX - Thursday, October 13, 2005 - link
Fine, I'll retract my statement, at least partially. I wasn't reading the statement carefully enough.

Having looked into the newer 3D1-68GT, it seems to be a more solid product than the original 3D1 card based on 6600s. The original seemed to serve no purpose whatsoever.
Calin - Thursday, October 13, 2005 - link
They made it an Intel board assuming that the more "corporate-oriented" users prefer multiple monitors. I don't know about current performance, but in the recent past, Intel processors smoked the Athlon 64 at things like Photoshop. And the introduction of dual core processors at prices much lower than AMD's dual core could coax someone into buying such a board.

I agree that most every normal person would be happy with four monitors (powered by two cards); however, I remember cases (in Linux) when OpenGL performance fell by half when enabling 2-monitor support on a single video card. That is while driving a single monitor, not two. Driving two monitors, it fell even lower.
So, for every person that WANTS (not that they would really need) four-monitor output from four video cards, this looks like the best choice.
trooper11 - Thursday, October 13, 2005 - link
I kind of doubt that, since the cost of the video equipment does not make this a low-cost solution. If a company is willing to shell out for that, they would be willing to shell out for the best in workstation performance, which just happens to be the X2s.
ElJefe - Wednesday, October 12, 2005 - link
Ever wonder what crack they were smoking, though, making it an Intel board?

If you read about modders and gamers, AMD has almost an 80%+ share among DIY builders.

This board is a waste of technology. Still cool, though.
Gary Key - Thursday, October 13, 2005 - link
Hi,

The ability to produce this board was due to NVIDIA's decision to use a HyperTransport link in the Intel SLI chipset, given the need for an on-chip memory controller. While it would be feasible to complete an AMD version of the board, the engineering time and product cost would not be acceptable. While I will agree with everyone that the current AMD processor lineup offers significantly more performance than Intel's, the actual day-to-day real life experience with both systems is not readily apparent to most people. In fact, I have had people play on my FX-55 machine and 840EE machine, and nobody could decide clearly which system had the AMD64 in it without benchmarks. This was at both 1280x1024 and 1600x1200 resolutions. While I personally favor AMD for most performance oriented setups, there are some people that still want Intel. After not having an Intel based machine for the last two plus years, I have to admit it is not as bad as most people make it out to be.
Johnmcl7 - Wednesday, October 12, 2005 - link
Whether you like it or not, the 3D1 was an innovative product; it's not child's play to stuff both cores together and develop the motherboard support for it.

John
Viper20220k - Wednesday, October 12, 2005 - link
Yeah, what is up with that.. I would sure like to know also.

Wesley Fink - Wednesday, October 12, 2005 - link
The pictures of the 10-monitor display were supplied, but Gary did hook up every monitor we could, which was 8 if I recall, to test the outputs. To test 10, we needed two more Rev. 2 3D1 cards - our extra pair were Rev. 1 cards - which couldn't be here in time for a review.

We did verify the ability of the individual 3D1 cards to do what Gigabyte claimed, so there is no reason at all to doubt the 10 claim. One of the key engineers at Gigabyte works exclusively with AT and THG. All sites use some pictures and diagrams from press kits to save time, but we perform and report our own test results and analysis.
Yes, we did ALL of the testing ourselves. Our review took longer because we did much more extensive testing of the board, including quite a bit of overclocking testing to make sure the nVidia dual-core issue we reported in our last Intel SLI review is now fixed in this chipset.
Gary spent countless hours sniffing out the good and the not so good on this board. We also found the OC capabilities of the shipping BIOS not too exciting, and we wanted to bring you the much improved OC results from the revised BIOS.
johnsonx - Wednesday, October 12, 2005 - link
Wesley,

I don't think most of your readers actually believed what the subject line of this thread implies. There are always a few who like to throw stones, of course.
In my read of the THG article a few days ago, I found myself thinking that the 10-display shot was from Gigabyte, as they had no detail shots of the display control panel for 10 monitors; nine was the most they were able to get working.
Like you, I have little doubt that 10 displays will in fact work with this board, but the 9th and 10th would have to come from either a PCI card or a PCIe card running in a x1 slot. Even x1 PCIe is faster than crusty old PCI, but it's still hardly ideal. It'd be nice if 3D1 cards could be coaxed into working in x8 slots (so that'd be 4 PCIe lanes per core - still plenty), as then you could theoretically have 4 3D1 cards for 16(!) displays (see the quick tally after this comment).
Thanks for the information on how you did the review testing.
Regards,
Dave
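A quick tally of the hypothetical four-3D1 scenario above; the two-GPUs-per-card and two-outputs-per-GPU split is an assumption implied by the 16-display figure, and the x8 slots are the ones suggested in the comment:

```python
# Back-of-the-envelope numbers for the hypothetical four-3D1 setup above.
cards = 4
gpus_per_card = 2       # each 3D1 carries two GPUs
outputs_per_gpu = 2     # assumption implied by the 16-display figure
lanes_per_slot = 8      # the x8 slots suggested above

displays = cards * gpus_per_card * outputs_per_gpu   # 16
lanes_per_gpu = lanes_per_slot // gpus_per_card      # 4 -- "still plenty"

print(f"Displays: {displays}, PCIe lanes per GPU: {lanes_per_gpu}")
```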
AmberClad - Wednesday, October 12, 2005 - link
April Fool's Day already?!

SpaceRanger - Wednesday, October 12, 2005 - link
Yup. Just compared the two, and they are IDENTICAL pics, just doctored to show THG and AT... VERY WEAK!!!!!

THG:
http://i14.photobucket.com/albums/a342/Arathon/ten... (THG 10 Monitor Image)
AT:
http://i14.photobucket.com/albums/a342/Arathon/ten... (AT 10 Monitor Image)
johnsonx - Wednesday, October 12, 2005 - link
I doubt one photo or the other was actually doctored, but it is pretty amazing that NOTHING is moved between the two shots... not even the mouse has moved so much as a butt-hair.

This does lend credence to the theory that Gigabyte prepared the 10-monitor shots themselves.
at80eighty - Thursday, October 13, 2005 - link
You got issues with butt hair ? :-)
BigLan - Wednesday, October 12, 2005 - link
It looks like this shot was taken at a Gigabyte facility, probably in Taiwan or China... the blue and red stickers on the monitors look to be Chinese characters.

vijay333 - Wednesday, October 12, 2005 - link
My guess would be that Gigabyte did this for each one of the sites that it had sent samples to, assuming that they would not be able to set this up themselves (monitors, cards, etc.). Still, this should have been mentioned in the review itself...

Gary Key - Thursday, October 13, 2005 - link
Good day,

I did not want to use the Gigabyte lab shot since THG had already published their version of it. However, since we could not get the revision 2 3D1 cards in time for testing, I thought there would be more comments about the lack of proof on 10 monitors than issues with the lab shots. I should have noted that in the article.
I was able to get 8 monitors to work with the video setup I had available. However, I found utilizing four monitors was an ideal situation with the two 7800 GTXs. :-)
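As an aside, one quick way to check how many displays Windows is actually driving in a setup like this is the Win32 GetSystemMetrics call. A minimal sketch (Windows-only, and not how the testing above was done):

```python
import ctypes

user32 = ctypes.windll.user32

SM_CMONITORS = 80        # count of display monitors on the desktop
SM_CXVIRTUALSCREEN = 78  # width of the combined virtual desktop, in pixels
SM_CYVIRTUALSCREEN = 79  # height of the combined virtual desktop, in pixels

monitors = user32.GetSystemMetrics(SM_CMONITORS)
width = user32.GetSystemMetrics(SM_CXVIRTUALSCREEN)
height = user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)

print("Active monitors:", monitors)
print("Virtual desktop:", width, "x", height)
```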
Bitter - Wednesday, October 12, 2005 - link
Seems a bit... odd that THG has the exact same picture of the 10-display setup, using the exact same displays with associated cables and hardware (and even boxes) in the exact same place... with the sole difference being the background color and logo. Yet THG had their review on 10/4. Yet both sites talk about setting up the system with 10 displays as if they had the gear in house... I smell something rotten here. When you look at the test setups they read almost in stereo. Did either one of these sites actually have the hardware "in the shop" to test any of this out on????

johnsonx - Wednesday, October 12, 2005 - link
Yeah, as soon as I saw that shot I quickly clicked on "Comments" to see if anyone else had already pointed it out... early bird gets the worm, I guess.

If I had to guess, I would venture that both THG and AT reviewed the hardware at a common location hosted by Gigabyte.
phaxmohdem - Wednesday, October 12, 2005 - link
Obviously this board is teh suxors since there is no uber AMD variant. What is this now, THG?? Pfft.

More seriously though, that is kinda cool in its own right. While I wouldn't mind having 4 monitors, 10 seems a bit overkill unless you are an uber l33t day trader or something. I mean, holy crap! Can you imagine the heat that bad boy will put out, too? STRONG ass power supply + P4 dual core + 4 high-end graphics cards??? + HDDs + RAM = heat stroke in the comfort of your office chair.
Chuckles - Wednesday, October 12, 2005 - link
So... 4 x $500 for graphics + ~$250 for the board + $1000 for the CPU + $200 for RAM.
$3500 for a system. Geez.
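For what it's worth, the figures listed do land in that ballpark:

```python
# Tally of the parts quoted in the comment above.
parts = {"graphics (4 x $500)": 4 * 500, "board": 250, "CPU": 1000, "RAM": 200}
print(sum(parts.values()))  # 3450 -- roughly the $3500 quoted
```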
Johnmcl7 - Wednesday, October 12, 2005 - link
The motherboard can't actually drive four cards in SLI; the only use for those slots, graphics-card-wise, is more monitors, and you don't need a 500-dollar card for that.

John
Xenoterranos - Wednesday, October 12, 2005 - link
Wow. I need this. Now. Right now. Wow. Anyone have a couple grand I can borrow?

Xenoterranos - Wednesday, October 12, 2005 - link
hey, first post!