RAM: 8GB or 16GB for gaming/light video editing

Dragontology

I'm doing an upgrade next year, and I've been saying I'm going to put 16GB on the motherboard... and to be honest with ya, I'll probably end up doing that anyway, since, last I checked, the cost isn't that different. But I've been advised 8GB should be all I need, and since I'm new here I figured I'd bounce some ideas off some fresh faces.

Basically the computer will be used for gaming. I'm currently considering an i5 quad-core, but the "$200-$250 sweet spot" might end up occupied by a hex-core, or if I'm lucky an octa-core, proc come build time. I hope so; my phone has a quad, so it seems like the desktop should have more. The GPU is the AMD HD6850 (1GB GDDR5) I have now, which I won't be upgrading for at least a year or two. I also do some light video editing. Mostly transcoding right now, but I want to get into actual editing. I have 4GB of RAM now, and I think the bottleneck is the CPU (AMD Phenom II X2, dual-core at 3.2GHz per core), not the RAM. I agree 8GB is fine for gaming and, with whatever Intel proc I go with, will probably shift the bottleneck to the GPU, but I'm thinking 16GB will kick it up a notch, especially for the video stuff. Or will I just be wasting money?
 
It depends on a lot of things, really. I'm pretty safe in saying 8GB would still be fine, but there are other things to consider.

Gaming and such aside, 8GB may be fine, but we're right on the brink of DDR4 adoption, and I only see DDR3 prices climbing once Intel's next mainstream platform releases next year with DDR4. That being said, if you're getting a chip that can last a good long time, like the i5 4440 or whatever, then I do suggest 16GB. It's not just a little bit more, it's double: you're looking at over 100 bucks for a 1600MHz 2x8GB set, and it'll only increase in price as the need to actually have that much RAM grows.

As to your GPU, I think it'll become an issue before the RAM does, for gaming at least. It's already pretty much dated.
 
You think DDR3 will get more expensive when DDR4 comes out? I assumed DDR4 would push DDR3 prices down, not up.

A year or two ago you could get a decent 4x4GB set for ~$100. Someone told me that and I about called him a liar, but I checked Newegg (not even PC Part Picker) and he wasn't lying. I'm thinking about $100-150 for RAM or thereabouts, but it's been a while since I planned a build. Windows 9 is looking like an April-May release (though no one really knows at this point), and Intel releases new procs around then. We're all speculating at best, because nobody knows everything that's coming then, and who knows what we'll adopt early or wait and see on.

Oh, about my gaming: I don't play anything competitive, and I generally don't play the latest games. I just finished Saints Row 4. My main game is Skyrim, though my wife has told me many times she's sick of hearing it, and I don't blame her; it does reuse a lot of voice assets. I need to get through The Walking Dead (Season 1) and then go back and finish Borderlands 2 (GOTY, so all those expansions, too). GPUs are expensive; by playing my backlog I save some real money on hardware. But when The Elder Scrolls 6 and Fallout 4 come out, I want to be among the first to play them, as opposed to a year or more behind like with previous entries in those series.
 
Simple supply and demand. DDR3 prices have been creeping up and will only keep climbing after DDR4 comes out, since production shifts to the new standard and supply of the old one dries up. DDR2 did the same thing once DDR3 took over; even now, trying to find 8GB of DDR2 is pretty expensive.

Unless Windows gets delayed, I have a strong feeling it'll be a spring release. I've heard Q4 this year, but I highly doubt it. As to Intel's mainstream platform, it's on schedule unless anything weird happens: Haswell-E is due this year, and the next mainstream socket will follow next year, both using DDR4.

I can understand that, as I used a 580 for the longest time. Thing is, midrange GPUs really aren't that expensive and are a good deal more powerful than your current solution. Something to consider.
 
Well, I bought an upper-midrange GPU (it cost $180 USD) in 2011, and that's what I have now. If it's really that bad, or at least that easy to beat with another midrange card, I think I'll be better off waiting and then getting a good card down the road a ways. I ultimately want to go back to nVidia.

It looks like Intel's roadmap for the next 7 years is more or less known, with Haswell's successor Broadwell coming out in the middle of next year. I'm not waiting for that. Comparing my budget against the chart at the link, it seems the chip for me is the i5-4690K, but I may try to go for an i7-4790K. I've heard once you go i7 you don't go back, and that a good i7 will carry you for many, many years. I think that's worth some extra money.

So between that and some memory... I could just buy another 2x4GB of DDR3 and use the two 4GB DIMMs I have now to get to 16GB. Then an SSD to round out the upgrade. I'm still spinning platters, but then again, SSDs have just started to come down, to around 40¢/GB. I'm looking at either the Crucial MX100 or the Samsung 840, both at around 500GB. The Samsung is faster but about $40 more.
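A quick back-of-the-envelope in Python, just to sanity-check those prices. The figures are the ballpark numbers above (about 40¢/GB for the 500GB drive and roughly $40 extra for the Samsung), not actual listings:

    # Rough price-per-GB comparison using ballpark figures, not real quotes.
    capacity_gb = 500
    mx100 = 0.40 * capacity_gb       # ~40 cents/GB -> about $200
    samsung = mx100 + 40             # the Samsung runs roughly $40 more
    for name, price in (("Crucial MX100", mx100), ("Samsung 840", samsung)):
        print(f"{name}: ~${price:.0f} total, ~{100 * price / capacity_gb:.0f} cents/GB")

So the Samsung premium works out to roughly 8¢/GB on top.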

Is it really just us on this weekend?
 
An i7 really can't be justified for the extra 100+ bucks unless you can actually fully utilize it. I have a 3960X now, and honestly, if I hadn't gotten it free 2 years ago I would still be using a 2500K happily without issue. I say this because the only thing that differentiates mainstream i7s from i5s is HT. So ask yourself: is it really worth an extra 100 bucks to bring down processing times on only a few tasks, and only slightly? You get no benefit from HT anywhere else, especially not in gaming. A good i5 or i7 will carry you for many years as long as your workloads aren't unreasonably demanding, and that kind of use typically comes with a good budget anyway.

The plain 840 is rather trash compared to its brother, the 840 Pro, and the real contender to the MX100 would be the 840 EVO. I recommend the EVO over all of these in your situation.

As to the GPU situation, GPUs get outdated every 6 months or so. You're looking at a 3-year-old GPU that wasn't top of the line when it came out. It's still usable now, but something like the 760 runs circles around it. Then again, give it another 6 months and that'll be dated too, lol. My suggestion: unless something you can afford is literally right around the corner (like the 880s), get what you can get and don't look back. Nothing completely "groundbreaking" is coming with the next set of Nvidia GPUs. Either that, or what I would do is look at used GPUs. Every location is different, but I can find some 290s around my area for as low as 240 bucks.

Typically, yea, and Carnage. The GH guys in the general gaming section usually only post from work on the weekdays.
 
That's good information on the i7 and i5, thanks. I've never built with Intel and I just want to get the most for my money. HT = Hyper-Threading? I thought that was a big deal? I remember when Intel introduced it as a refresh (I think) to the Pentium 4 line; Pentium 4 HT was such a big deal. Then again, that was how many years ago? Anyway, you gave me something to research. I don't expect to get all my answers from a forum; figure I'll have to hit El Goog and read up on some benchmarks and whatnot, as well as fish for opinions.

The 840 EVO is the Samsung SSD I meant. That and the MX100 were the two I was looking at. The 840 Pro is more than I was hoping to spend on the drive, though if I find the i5 is the better fit, maybe I'll get the better SSD. I dunno.

"Get what you can get and don't look back." That's pretty much what I did in 2011. There were basically four kinds of GPUs. The really cheap ones. The ones around $130. The ones around $175. And the ones north of $250. I think I bought the best ~$175 GPU at the time. It wasn't top of the line when it was new, but it was a respectable card then. At least for me, having previously bought out of the ~$130 pile and having decent mileage. I knew I wasn't going to need the absolute best GPU, but I wanted one that would carry me for a few years. Trouble is, I don't think I ever pushed my GPU to its limits. I might in the next couple years. I remember when I reached the limit of my last one, an nVidia GeForce 6600. No letters after it... 6600GT was a better card, but I didn't spring for it. I got my money's worth out of that 6600. Expect to do the same with this one. Anyway, no games in the pipeline really interest me. If they announced Fallout 4 at E3, maybe a GPU upgrade would be in order, but I don't think a Fallout 4 based on the Skyrim engine would really tax even my current setup. Watch_Dogs would at the higher settings, but Ubisoft being Ubisoft, and realizing Watch_Dogs is just Assassin's Creed in the future (look at the map, for one) I'll wait for the price to come way down before springing for that one. Unless it requires Uplay. One DRM suite (Steam) is enough for me, TYVM.
 
HT is a big deal for some serious CPU users. I'm talking major video editing that's time-critical because time is money, major video transcoding, major 3D work or rendering, etc. I'm not calling you an amateur or anything, but it seems like the video editing you'd be doing is more along the lines of game captures or family videos? In any case, if you're not editing videos on a daily basis, the difference between an i5 and an i7 won't be that noticeable. In gaming, HT does nothing. I know I've said that maybe twice already, but I want to reiterate how important that point is.

Gotcha. Yeah, the 840 Pro is a beast, but it's not really relevant to casual users; you'll get the same performance and longevity from the EVO. Another exceptional SSD would be the Corsair Neutron GTX, but IMO they want too much for it compared to the others.

Fallout 4 and TES VI will be on a new engine. That being said, I hope it's 100% redone and nothing gets reused, like how Oblivion stole pieces from the Morrowind engine and Skyrim uses the DX9 shaders from Oblivion because of the damn consoles. Ugh, that ****ed me off, but I still put over 400 hours into the game :|
Watch Dogs is more like GTA with hacking. Honestly, your GPU would have issues trying to run it on medium. It's Uplay though, so **** that, and **** Ubisoft.
 
Dead on, and no offense taken. It's just that the video my camera and phone shoot is uncompressed, and it's stupid to store it that way, so I've got some sweet Handbrake settings I like and I feed everything through those. Takes about six hours per hour of video, but eventually it gets done, and the file size is a fraction of what it was with no discernible loss in quality. Same with game captures.
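For the curious, that Handbrake step is basically a batch job for me. Here's a rough Python sketch of the kind of loop I mean, shelling out to HandBrakeCLI for every capture in a folder; the folder names, encoder, and quality value are just placeholders, not my actual settings:

    import subprocess
    from pathlib import Path

    # Placeholder folders -- point these at your own capture and output dirs.
    src_dir = Path("captures")
    out_dir = Path("transcoded")
    out_dir.mkdir(exist_ok=True)

    for clip in sorted(src_dir.glob("*.avi")):
        target = out_dir / (clip.stem + ".mp4")
        if target.exists():
            continue  # already transcoded on an earlier run
        subprocess.run([
            "HandBrakeCLI",
            "-i", str(clip),     # source file
            "-o", str(target),   # destination file
            "-e", "x264",        # video encoder
            "-q", "20",          # constant-quality RF value (placeholder)
        ], check=True)

Since x264 will happily peg every core for the whole run, that's also why I figure the CPU, not the RAM, is my bottleneck.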

Agree on F4/TES6. As for Uplay, does anyone like that? Or the other two? Origin and... I think there's one more besides Steam. I have Origin codes I can't even give away. No one wants them. If it's the Steam version they'll gladly take it, but a free game that stipulates they install Origin? Pass. And I understand. I don't like extra redundant stuff. If I got an iTunes gift card, it would be the same thing. I'd just pass it on. Not worth installing it for, and I don't have anything that requires it.
 
I used to do game captures in FRAPS. Lossless 60fps for 10 minutes or so would be something like 40GB. My machine only takes about 30 minutes to transcode that; I think an i5 4670K would take somewhere between 45 minutes and an hour, and that's just with the Handbrake default profile. That's not editing though, that's just transcoding.

I hate Ubisoft and everything they stand for. Watch Dogs not only killed it for me, but made sure there was absolutely no way of coming back, Supernatural-style. I don't like Uplay because of the way the program handles saves. I've been using Origin since it came out; I got over the hater brigade and I don't mind using it now that it's more polished. Otherwise I generally agree, and I would prefer physical copies where keys are entered into the game itself rather than digital distribution and online DRM. It took me years to finally start using Steam on a regular basis even. I absolutely HATED it when it came out, and I still kinda do (along with anything like it), but it's how things are.
 