
What's Your Video Card?


Recommended Posts

:blink:  HOLY Sh!t! 5 120mm fans? A total of 7 fans? Does your system hover as you play?  :lol:

 

Power fluctuations can be fixed by investing in a UPS. Trust me, I fried an overclocked processor during a power spike. My current build cost around $2K, so don't you think a $20 UPS that could keep it from going tits up is worth the investment?

 

I don't understand why you need 700W of power. Even SLI-approved power supplies are mostly rated between 400W and 500W. If you're streamlining, then you should be lowering power consumption. I have three 160GB-250GB HDDs from my old builds, but I see no reason to add them to my PC just because I have them; knowing their power consumption and heat contribution, it isn't worth it.
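If you want to sanity-check a wattage claim, a quick back-of-the-envelope sum of component draws goes a long way. The figures below are rough ballpark assumptions for parts of this era, not measured specs for any particular build:

```python
# Rough PSU sizing sketch; wattage figures are ballpark assumptions,
# not measured specs for any particular part.
components = {
    "CPU": 90,
    "GPUs (2x SLI)": 2 * 80,
    "Motherboard": 30,
    "HDD (7200rpm)": 10,
    "Optical drive": 15,
    "Fans (7x)": 7 * 3,
    "RAM": 10,
}

total = sum(components.values())
recommended = total * 1.3  # ~30% headroom for load spikes and capacitor aging

print(f"Estimated load: {total} W")
print(f"Recommended PSU rating: {recommended:.0f} W")
```

Even with two GPUs and seven fans, the sum lands well under 450W, which is why a 700W supply looks like overkill for a streamlined rig.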

 

I have a 500GB internal drive but doubt I'll ever see that sucker filled past 400GB in its lifetime. I mean, I edit movies, but I don't recall exceeding 220GB on any of my hard discs. Yes, I know, I have two 500GB drives, but the other one is an external (I built it, too!) that I use to move movies from my PC to the school's computer.

 

Can you show us a picture of your rig's interior so I can see how you got all these HDDs and fans configured? I'm just really curious.

 

Ironically, now that combo drives (DVD/CD/burner), USB devices, and huge-capacity HDDs are cheaper than ever, people, especially LAN gamers such as myself, are building "streamlined" machines geared more toward ventilation and cooling, light weight, and low power consumption. These are the trends I see my gaming friends adopting.

 

Boasting only a single 500GB HDD, one 5.25" optical drive (DVD/CD/burner/LightScribe), 2 vid cards, 1 x 120mm fan, 1 x 80mm fan, 1 liquid CPU cooler, the mobo, and everything else, my rig doubtfully even needs its ATX 450W power supply. Sure, I have a lot of add-ons, but most are self-powered, such as my 500GB external; thanks to USB 2.0 x 6, they aren't too far or too hard to connect to my PC.

 

Watch out for LCD screens. Some are infamous for "ghosting" while playing. Ghosting is when the image blurs and leaves a shadow even after the object is gone. CRT is still the preferred choice for gaming, unless you get lucky like me and find a rare "no-ghosting" monitor. Mine still has some, but it's very minimal and not noticeable at all unless you turn the brightness all the way up.

 

LOL :D Bad timing, I just shifted to a CoolerMaster Stacker :D LOL :D I have only one pic here or maybe at the other sites; I'll try to look for it :D I hated the Thermaltake Kandalf, so hard to hide the wires, and very poor airflow. I did a few mods on my casing: 1 x 120mm blowhole at the bottom (to cool my SLI setup/chipset), 1 x 120mm blowhole on my side panel (on top of my XP-120), 1 x 120mm exhaust fan, 2 x 120mm front intake fans (using the drive bay), then 2 x 92mm for another exhaust. Almost all of the 120mm fans are Delta/Nidec, around 40+ dBA, LOL :D Good thing I'm using a Logitech Z5500 :D

 

I get random reboots when running my usual number of HDDs (5), so I removed one HDD at a time until I got a stable system. :D So I didn't buy a 500GB (yet), since the 7200.10 is already in development, and I'll wait for reviews on how the 7200.10 fares against the 7200.9, since the first batches of SATA II drives can't fully utilize its potential.

 

I'll be building my new rig shortly, with around the same number of 120mm fans though :D LOL :D But better airflow :D But the stupid vendor got me the wrong model: the CM Stacker 810 instead of the CM Stacker ATX/BTX. :D

 

Here's a pic of my rig (partial shots, not internal, er, cam shots :D). Not updated, since it still doesn't have the other 2 x 200GB drives, the 2 x 7900GT, or the OCZ 4000EB Platinum :D I also reverted back to my old XP-120 instead of the Scythe Ninja to cool my whole mobo (which I'll revert back to the Scythe because of the crossflow fan available for the CM Stacker :D)

http://manilatonight.com/index.php?showtopic=9250&st=360

 

To avoid "ghosting", check the response time of the LCD; usually 2-16ms has no ghosting, except for generic-branded LCDs. :D My Dell 2405 doesn't have any ghosting, although I remember back when Nvidia still didn't support these kinds of monitors: tearing galore! :D LOL :D
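The response-time advice makes sense when you compare it against how long each frame stays on screen. Here's a tiny sketch (plain arithmetic, no particular monitor assumed):

```python
# If a panel's pixel response time exceeds the time one frame stays on
# screen, the previous frame is still fading while the next one draws;
# that lingering transition is what shows up as "ghosting".

def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 85):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

At 60 fps a frame lasts about 16.7 ms, so a 16 ms panel is borderline, while panels in the 2-8 ms range leave plenty of margin.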

Link to comment
The release of the 7300 put nVidia on the 7 series, BUT a lot of gamers who upgraded from 6600s and 6800s claimed there's little difference between the 6600/6800 and the 7300. The 7600 set a lot of things straight, and nVidia became the sweetheart of the gaming community.

 

 

The 7300 is basically upgraded onboard video that uses system memory (just like onboard video!!!!) to boost itself. :D Pretty useless IMO; you can only play Ragnarok with this VC :D LOL :D It's practically the same as ATI's X1300-X1400 series :D So, to sum it up, if you own a 6600GT-6800 and then buy a 7300, it's like downgrading yourself to onboard graphics :D Literally! (Although it's quite a bit better than the 6100/6150 :D)

Link to comment
Uhm, nice setup... I was hoping you'd give us a pic of what matters the most... the inner workings!

 

Maybe once you're done upgrading/streamlining you could give us a glimpse of your beast!

Link to comment

Don't be fooled. Most sites out there either glorify or butcher the true power of the 360. But lo and behold, I've had the 360 for three months now, and I'll tell you, it's crap. The claim most sites make, that the 360's processor has three cores operating at 3.2GHz (Intel benchmark) dedicated only to graphics, is a big lie. Although there are three cores, they are limited to 3.2GHz total. And although they claim to only run graphics, well guess what: when a PC runs a game, the PC processor's power is also aimed at one thing, the game.

 

I recall the past assumption that the Xbox 360's card was made by nVidia. I learned that nVidia and Microsoft had a falling out when it came to making the motherboard (which was supposed to use the nForce 4 chipset) and ATI picked it up. ATI put on a game face, trying to make it sound like they put the best they had to offer into the console, only for people to compare the Xbox graphics card to their X850 card, which makes sense, since that was the card they issued at almost the same time the 360 entered the market.

 

Why would a company give to a console a technology they held back for the PC? Well, they won't. They will push the technology as far as they can, then sell it to consumers. It just makes sense. Knowing that ATI is lagging behind nVidia (who hold the benchmark as we speak), it would be a bad move for Microsoft, since the Xbox 360's graphics prowess is nowhere near an nVidia 7600, much less an SLI setup.

 

To make it less confusing, the Xbox 360's graphics could easily be compared to ATI's X850 (in architecture), or (nearly) to nVidia's 6800 (non-GS or non-XT). The CPU is rated at (Intel's) 3.2GHz, or (nearly) a Pentium 4 640 or (nearly) a Pentium D 820 (although the 820 is clocked at 2.8GHz, it can't really be compared, since the architecture is much different; an 820 could handle more programs at one time than the 360's CPU).

 

Another thing to remember: although the Xbox claims to have 3 CPUs running the graphics and the game, some PCs would then have four, two CPU cores (dual core) and two GPUs (SLI). But that's overkill. Let's just say my Xbox 360 runs like a PC I built three years ago. :wacko:

Link to comment

I use an ATI Mobility RADEON 7000 IGP with 128MB

 

7th gen consoles (Xbox 360, PS3) do offer technology to produce better graphics at a lower cost, since they already have 128-bit computing.

 

The problem right now is that the technology is so new that developers are still n00bs at maximizing the power of the system. I'll wait a year or two to see progress in the developer community, and if it still doesn't live up to expectations, then that's when I'll say Microsoft and SONY are nothing but moneygrabbing assholes.

 

I can still remember the release of the initial games for the PS2, and I can say the quality of graphics improved as time went by. This just shows that developers develop l33t skillz in making use of the hardware to its maximum.

Link to comment

A scary thought:

 

http://img.photobucket.com/albums/v489/azrach187/quad.jpg

 

Four NVIDIA® GeForce® 7900 GPUs and an NVIDIA nForce®4 SLI MCP combine to take NVIDIA® SLI technology to the next level

 

- 48 gigapixels/sec. of raw graphics performance and 6 teraflops of compute power

- 96 pixel pipes and 2 GB of on-board graphics memory for stunning visual effects

 

Turn it ALL on

- Run your favorite games at extreme HD resolutions

- Crank up the resolution to an unbelievable 2560x1600 while maintaining silky smooth frame rates

- 32x antialiasing and 16x anisotropic filtering raise the bar for image quality

- Maximize your shader and texture settings for a truly immersive gaming experience

 

Handcrafted for excellence like a Ferrari Enzo

- Elite technology delivers performance in a class by itself

 

Avg. cost to run this monster? $4,500

 

It is good news for gamers and developers alike:

 

First, the hardware developers: they aren't pressured to develop the next "better" gaming card. Although the technology would still move forward for single cards, multiplying the power of one now means more quality instead of rush-issued cards (usually followed by driver fixes).

 

Second, game developers: they now have more freedom to capitalize on existing game engines to develop graphics-demanding applications. Some award-winning engines that would use the full potential of SLI: CryEngine (Far Cry), the Unreal 2.0 engine (America's Army, Unreal Tournament), Havok 2.0 (Painkiller), and Source (HL2, CS:Source, and DoD:Source)

 

As for the gamers, well, they can upgrade their system without waiting for the next card, as is usually the case. I don't foresee those with Quad SLI upgrading to a better card soon, especially if they have 7900GTs in the quad.

Link to comment
:boo: Now that's what I call monster hardware :cool:

 

although at $4,500 I could buy myself a new 1U dual-Xeon 2GHz blade server with 2GB of RAM and 2 terabytes of HD space to host all my pr0n :lol: :lol: :lol: :lol: :lol:

Link to comment

As requested by Azrach and another fellow MTC member :D

 

http://img95.imageshack.us/img95/9570/dsc014131vp.th.jpg

Just fitting my mobo, VC, and PSU before I mod the casing :D Pretty plain, yuck :D

 

http://img95.imageshack.us/img95/8118/dsc014149yq.th.jpg

The XFX 7900GT Extreme Edition in SLI mode :D One has a Zalman cooler, since the lower card tends to run hotter due to the Audigy 2 just below it :D

 

http://img100.imageshack.us/img100/3658/dsc014168sg.th.jpg

CM Stacker in its plain yucky glory :D

 

http://img95.imageshack.us/img95/679/dsc014185me.th.jpg

After taking the pics, I realized that cable management will be hell, so I shifted the power buttons/USBs to the lowest slot, then moved the CM Aerogate to the top :D This allows less obstruction of the airflow, and with the mod in mind, it'll be easier to hide the wires that way :D

Link to comment
Just a question about vid cards: how would I know if the vid card I want to buy supports the Windows Display Driver Model (WDDM)?

 

XFX Geforce 6800 XTreme PVT42EUDE3 Video Card

http://www.newegg.com/Product/Product.asp?...N82E16814150130

 

thanks..

This is from my home forum, Click here.

It requires membership to view the actual Vista screen shots.

 

To be honest, that would all depend on Microsoft. It would not make sense for Microsoft to develop the new Vista without support for current (especially high-end) video cards. Since the Vista OS is still on a "need-to-know" basis, Microsoft would have raised a flag on hardware not supported by Vista, so as not to limit consumers once they launch it next year.

 

I read on a tech forum not long ago that Vista does utilize the vid card more effectively than any past OS, because of the new fully interactive menu system and the window translucency mostly available to high-end vid card users nowadays.

 

Basically, you might be in trouble (if you haven't come across any problems yet) with the new Vista if your hardware is at the low end of the spectrum and dated. For now, I would say it isn't worth losing sleep over. When I was building my current rig, there were rumors circulating that Microsoft would soon issue "compatibility" seals to hardware manufacturers, but that was early this year, and none ever showed up.

 

Here's a Windows Vista demo

Link to comment
Sir Azrach, is Quad SLI only for tech shows, or is it being sold to the public?

 

Sir Cheesekeso, nice rig!

They are out. I heard two players will be bringing quads next week to the LAN party, which I am attending.

 

Here's one being distributed by Alienware

 

As a matter of fact, PC gamers have been waiting for it since late Jan. '06, but the WHQL (Windows Hardware Quality Labs) certification for Quad SLI delayed it. Dell was one of the first vendors to ship it, and now almost all PC manufacturers sell it. Except it is not for sale on its own; the reason being that it requires (for now) special hardware not available to the mainstream.

 

Gamers still view it as overkill, which, in my opinion, is not an exaggeration.

Link to comment
