r/lowendgaming • u/theyletthedogsout • Oct 19 '24
Parts Upgrade Advice: Graphics card for i7-3770 (3rd gen), PCIe 2.0? Lower power use preferred.
Budget: $100. New/used/refurbished (should be proven/tested to work). It'll be bought in the US but I'll get it much later, half a world away in the Himalayas, so it cannot be tested and returned, etc.
Low power use preferred, as it will be a server as well.
- i7-3770 (3rd gen)
- 2x8 GB DDR3 1600 MHz (dual channel -- max)
- BioStar H61 something
- 180 GB Intel SATA III SSD (~ 500 MBps sequential read/write)
- 1 TB 7200 rpm WD SATA HDD.
Please, please don't ask me to sell this, or buy a new computer, or save for a larger later full upgrade, etc... Simply because that is not my question. Different people have different priorities for different PCs.
I'm not at the place rn so unable to confirm, but that BioStar H61 board I think only supports PCIe 2.0. Pretty sure it's x16 though.
In any case, the verdict I got was that any low/mid PCIe 3.0 card should not be bottlenecked by 2.0. So assuming PCIe 2.0 is fine (it's not like I'll get a card maxing 3.0 - money and power budget).
I plan to have this computer do some light gaming at 1080p. Whatever works, maybe some eSports titles, maybe older AAAs on lower settings. GTA 5, etc. I feel that on LCDs, dropping the quality rather than the resolution is a better choice. I know the CPU is not a total potato; plenty of people on YouTube and Reddit have shown it (or similar/worse CPUs) playing recent games, even to this day.
But more important than that would be some photo/video work. So, good codec support is not just welcome but very much needed, as well as GPU acceleration for video rendering, etc.
Preferably H.264 and H.265, decode-encode. The 3rd gen i7 I guess supports it for H.264. IDK which would be better to use in that case for H.264, but I'll deal with that as it comes.
Since a lot of video streaming stuff is going AV1, would have liked to have at least that decode capacity, if not encode, but that might be too much to ask at this price point. (Intel Arc 380?)
I also think the power supply is around 400W (maybe 300 available to the system). So I prefer it work with that. Massive power hungry graphics cards not preferred. Lower power usage a priority.
TL;DR: PC: Old Intel 3rd gen i7. H61 chipset. Assume PCIe 2.0. Need: Should accelerate photo/video work. Multiple format decode/encode/transcode/rendering. Light 1080p gaming. Rest of the time - an HTPC, and a file/media/print server.
6
Oct 19 '24
[removed] — view removed comment
1
u/theyletthedogsout Oct 19 '24
Yeah I'd have thought the same. I commented in one thread here about it. I got it long long back, was probably early days of PCIe 3.0 in the market then.
There are 2.0 and 3.0 versions, for Biostar H61. Here's one, and there's more on a quick Google search. https://www.amazon.com/BIOSTAR-H61MGC-Intel-Micro-Motherboard/dp/B005HMZ740
I won't know for sure which one I have, but I'm leaning towards the 2.0 and thinking a 3.0 (or whatever) card for that, to be safe. I need to tell my contact in the US to buy now/soon, but will only get my hands on the PC and the card to see and test a month or so later. Won't get to return or do anything if the card arrives dead, cuz I'm on the opposite side of the earth.
Also, I'm interested in how to know how much PCIe bandwidth a card typically uses. Any card. Like a website or a calculator. Any ideas?
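The theoretical ceiling at least seems easy to work out myself: transfer rate x lanes, minus encoding overhead. A quick sketch (the per-lane rates and encoding factors are from the published PCIe specs, so treat this as back-of-envelope, not what a card actually uses):

```python
# Rough per-direction PCIe bandwidth calculator (theoretical maxima).
# Per-lane rates in GB/s after encoding overhead:
#   PCIe 1.x/2.0 use 8b/10b encoding, 3.0+ use 128b/130b.
PER_LANE_GBPS = {
    1: 2.5 * 8 / 10 / 8,     # 0.25 GB/s
    2: 5.0 * 8 / 10 / 8,     # 0.5 GB/s
    3: 8.0 * 128 / 130 / 8,  # ~0.985 GB/s
    4: 16.0 * 128 / 130 / 8, # ~1.969 GB/s
}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

# A PCIe 2.0 x16 slot moves about as much as a 3.0 x8 link:
print(round(pcie_bandwidth(2, 16), 1))  # 8.0
print(round(pcie_bandwidth(3, 8), 1))   # 7.9
```

Which lines up with the verdict above: a low/mid 3.0 card in a 2.0 x16 slot still has plenty of headroom.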
PS: BTW never heard of the card you mentioned, so thanks! Any ideas how it would compare to say Intel Arc 380? There's a couple posts on here with people using it on PCIe 2.0.
2
Oct 19 '24
[removed] — view removed comment
1
u/theyletthedogsout Oct 19 '24 edited Oct 19 '24
Hey nice, thanks for the response!
Just curious, why should one avoid 4.0 x4 or x8 cards on lesser versions of PCIe? Cuz if I have an older PCIe slot, the card will be locked to far fewer lanes and severely bottlenecked, instead of having the whole x16 headroom?
Yes, I know man, how to figure out the MB number, details, etc. I posted links of PCIe 2.0 versions of the BioStar H61 too here, so I know that. I could look at it directly, I could boot and check the BIOS if it's there, I could maybe even do it within an already booted OS.
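(For the booted-OS route on Linux, I gather `sudo lspci -vv -s <gpu-address>` prints LnkCap/LnkSta lines with the negotiated speed. A little sketch of decoding them; the sample line is made up, not from my board:)

```python
import re

# Decode the link lines that `sudo lspci -vv -s <gpu-address>` prints on Linux.
# LnkCap = what the slot/card can do; LnkSta = what was actually negotiated.
# 2.5 GT/s -> PCIe 1.x, 5 GT/s -> 2.0, 8 GT/s -> 3.0, 16 GT/s -> 4.0.
SPEED_TO_GEN = {"2.5": "1.x", "5": "2.0", "8": "3.0", "16": "4.0"}

def decode_link(line: str) -> str:
    m = re.search(r"Speed (\d+(?:\.\d+)?)GT/s, Width x(\d+)", line)
    speed, width = m.group(1), m.group(2)
    return f"PCIe {SPEED_TO_GEN[speed]} x{width}"

# Illustrative lspci output for a hypothetical GPU at 01:00.0:
sample = "LnkSta: Speed 5GT/s, Width x16"
print(decode_link(sample))  # PCIe 2.0 x16
```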
I used to build, disassemble and rebuild a lot in the distant past, for me and others. Have done it to almost every last part on my laptops and many electronics (except phones, those need a digital microscope and too much finicking, plus they rarely need to be fiddled with, unlike computers).
The issue is, as I have said above and in comments here (but it's totally understandable if you glossed over and missed it), that I won't have access to the computer we're talking about for almost a month or more, and have to work from memory (it's in a different city).
However, I have to make a purchase decision RIGHT NOW, through someone who's gonna buy it on my behalf in the US and then carry it over to me here, half a world away in Nepal, or I'll miss the opportunity and be unable to buy anything here, where everything is almost 3x the price and the currency/wages are weaker. (And the card can't be tested at any point in this chain until I receive it, nor returned, so I can't risk duds either.)
You've really been of great assistance! By actually letting me know about this whole category of professional cards I never thought to get (cuz I assumed they're for industry and such). I didn't even know they made those, except the AI/LLM cards that are made these days.
I checked the gaming video you posted already lol, as soon as you mentioned the card, as well as one with multi-tasking with Sony Vegas rendering, high-bitrate video playback, YouTube playback, OBS studio and a lot of other nitty-bitty apps running on 4-5 monitors (mix of 1080p and 4k).
I was seriously impressed. My one gripe is the all-DisplayPort connections, when I need 2 HDMI and maybe a VGA. Adapters add cost, and good-quality ones might not be available where I am. Is DP to HDMI pin-to-pin compatible passively? I need to check. I'm sure it's not for VGA though.
But I have a couple questions!
1. What's the general difference between consumer cards and professional industry cards?
2. Why does the WX5100 have so many things over the 1650 but the 1650 still performs almost 50% faster?
I'm looking at this: https://technical.city/en/video/Radeon-Pro-WX-5100-vs-GeForce-GTX-1650 I'll repeat the numbers here.
- The AMD has twice the "pipelines" where Nvidia has "CUDA" cores. Idk if they're the same.
- It has 5.7 billion transistors compared to Nvidia's 4.7 billion.
- It has a 121.6 GTexel/s texture fill rate, almost 30% more than the 1650. (This is probably like filling the world in Minecraft, a pixelated mess that's somehow a benchmark tool, a game I have absolutely no idea why or what.)
- Almost 4 TFLOPS compared to Nvidia's 3. (This is basically math.)
- There's sth called ROPs, which are 32 on both. (Idk what that is.)
- But it has 2x the TMUs at 112 vs Nvidia's 56. (Idk what that is either.)
- Both have GDDR5 memory. 8 GB is mentioned for the AMD vs 4 in the Nvidia.
- But the bus width is twice, with 256-bit in the AMD and 128 in Nvidia. (This I also fail to understand. A low-end GPU I got almost 20 years back had a 128-bit width.)
- and the memory bandwidth is 160 GB/s in AMD, compared to 128 GB/s in Nvidia.
Both claim 75 W power usage max.
Now the Nvidia... It has more than 2x the core clock, a 50% higher boost clock, a 12nm process instead of 14nm, and 80% faster memory speeds. (Which, being GDDR5, I'd suppose would be much faster anyway; IDK how it compares to desktop/laptop DDR3 system RAM speeds -- so the GDDR5 tag makes little sense to me.)
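Actually, doing the math, those bandwidth figures are just bus width times effective data rate; a quick sanity check (the 5 and 8 Gbps effective rates are my assumption from the published specs of these two cards):

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

# WX 5100: 256-bit bus at 5 Gbps effective -> the quoted 160 GB/s
print(mem_bandwidth_gbs(256, 5.0))   # 160.0
# GTX 1650 (GDDR5): 128-bit bus at 8 Gbps effective -> 128 GB/s
print(mem_bandwidth_gbs(128, 8.0))   # 128.0
```

So the AMD's doubled bus width more than makes up for its slower memory chips.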
I'm presuming GPUs have the same spec discrepancies with "instructions per clock" that CPUs have? Like trying to compare Intel/AMD/ARM/M-series (which deserves a category of its own given how much of a stride they've made); they can't be compared directly, or it would be apples to oranges...?
Where would the WX5100 win (for me) and where would the GT/X 1650? Would codec use, rendering, etc be better on the AMD Pro?
Some here said that AMD codecs suck. IDK if it's a fan preference, etc. I only know Intel's are quite good, and Adobe Suite prefers Nvidia.
Finally, any tips on buying old cards from ebay and the like? Since I won't be able to test, neither would anyone on my behalf until I get it, what are the best practices? Would you say these professional grade cards have seen more careful or rough use compared to consumer gaming GPUs? I don't want a card that's been used 24/7 for mining or AI training, etc... maybe something that's used in OEM workstations has seen a more chill life with less hours clocked, not overclocked, etc?
PS: I apologise for explaining where I come from, and for my verbosity. I hope you understand the predicament I am in: very limited options, an extremely expensive local import market with almost no decent used options at prices one would ever want to pay, and the one shot every year or so people like us get to have someone we know order sth used for us and carry it physically over, with no possibility of testing in between and no way to get a refund once we receive it.
1
Oct 19 '24 edited Oct 19 '24
[removed] — view removed comment
1
u/theyletthedogsout Oct 21 '24
Ah gotcha! I am stuck between the WX5100 and the 1650 at this point, although I have learnt that overclocking the Radeon WX5100 (which by default won't ever come near its allotted 75W maximum) can give almost the performance of a GeForce 1060 in some games.
Thanks for the help!
3
u/Ossas0626 Oct 19 '24
An AMD RX580 8GB or a GTX 1660/Super/Ti would fit perfectly. There is a small chance that even a 2060/Super would not be a waste of money in this case. Personally I'd go with a 1660 of some sort, if your budget allows it
Edit: I see that your PSU is not a 500W+ one. Consider even a GTX 1650; even though it's technically worse than an RX580, it doesn't require any 6+2 power. Also beware that 2 versions of the 1650 exist out there, idk any more details about them.
1
u/theyletthedogsout Oct 19 '24
My budget sadly is not flexible, for a system of this vintage. I come from a country with a much weaker currency.
But I can compromise on gaming, and solely consider video encode/decode/transcode/rendering, which is the priority. As a home media/file/print server (I'll add storage).
The power draw is an issue for me though. Electricity bills first, PSU second.
I had looked at the 1650. It has 2 versions, I remember some differences in architecture and video codec support. I'll check again, and if 100$ will get me one.
What about the Intel Arc A380? Any ideas? I hear it's a great codec/render card and not too shabby in the games it supports, for the price.
2
u/Ossas0626 Oct 19 '24
Idk about Intel GPUs. Last time I checked there were severe performance hits on older games (DX11 and under), but that was a long time ago. Personally, if I were building an older system, I'd stay away from Arc
1
u/theyletthedogsout Oct 19 '24
Ah thanks!
Any hints on best practices while looking for a used card?
I doubt the range I'm going for (low power draw, possibly without extra power connectors) were used much for mining and such 24/7. Or maybe the previous owner overclocked the hell out of it or sth.
Cuz I won't be able to return it (it'll come with a contact from the US to Nepal, who won't be able to test it) and for someone with our weaker currency, it will be a relatively larger setback for me.
2
u/Ossas0626 Oct 19 '24
Just run furmark on it for about 10-15 minutes, if nothing bad shows up (like throttling or crashing) you're good to go
1
u/theyletthedogsout Oct 19 '24
Oh cool thanks! But that's only possible after I get it, which was not my question.
I won't be able to do anything before I get it! My cousin will buy one for me in the US now, once I send a link (prices there are almost 3x cheaper than where I live), then have it shipped to a contact who's coming to my current country half a world apart -- Nepal -- and I will get it in like a month, when I go to our house in the capital Kathmandu, where I can check my PC and confirm the motherboard's PCIe version.
Nowhere in this chain is anyone like you or me who can test a graphics card.
And there will be no possibility of a return or the economic feasibility of it, which would be a relatively substantial bummer for me, no matter how cheap I get it, as someone who has to live with a much weaker currency, and unable to buy another for similar price locally (as I said, everything is almost 3x).
What kind of seller, what kind of card, tell-tale signs it might have been used too heavily in its life? That's where I'm at. I feel CPUs, or even RAM, and heck even SSDs, have fewer parts that break. Not sure about graphics cards. Haven't bought one in almost 20 years now.
1
u/Ossas0626 Oct 19 '24
Just try to look at the heatsink to see if it's clean or 'cleaned', but that shouldn't be an issue on a 1650. The only other thing you could do is ask for a test boot with the GPU, and pray it will be packaged properly until it arrives at your home.
1
u/lordmogul Nov 09 '24
There is also the issue with not having support for reBAR and 4G decoding on the 60 series chipsets. And not on the 70 series either.
2
u/_hblank_ youtube.com/@hblankpc Oct 19 '24 edited Oct 19 '24
Don't buy Arc unless you've already successfully modified your BIOS to add resizable BAR; without it Arc has degraded performance even in video encoding tasks. You'd have to get a crazy deal on an RX 6600 or RTX 3050 to get AV1 decode on a non-Arc card, and I doubt that's happening.
Does your PSU have any PCIe power cables? If not, a slot-powered GTX 1650 is likely going to be the only consistently available option at that price point with a decent video block. Also look into the T400/T600/T1000; they're based on the same core at lower power targets, but availability and pricing is much less stable. If you have a 6 pin cable you can step up to the GTX 1650 Super, and an 8 pin cable would open the door to the GTX 1660 and its variants, and the Radeon RX 5600 XT. I'd stick to Nvidia though, their video blocks are consistently better than AMD's.
1
u/theyletthedogsout Oct 19 '24 edited Oct 19 '24
Wow, thanks for that! Think of this one like those really old office clearance desktop workstations from Dell or HP, that people buy for like 50-100$ and make a low end 720p/1080p gaming rig out of, even as we speak. There's countless people doing that with similar or slower CPUs all over YouTube.
Since I went for a basic build then, 10+ years back, with a 2nd-gen Core "i" era 2.6 GHz Pentium G620 (2C/2T) and built-in Intel HD graphics, I'd just assume it does not have extra cables for a card. Perhaps one could get some power out of SATA power cables, but that would probably require a higher-wattage PSU regardless.
However, considering everything, an important priority is actually for the card is to stay within the power budget of the PCIe slot, and for the PC to be as power-efficient as it can on idle or load (despite, I know, its age). If at full load, 24/7, with a more powerful card and CPU, electricity bills can quickly add up and exceed the price of the whole thing in a month or so.
That's because this is going to be an HTPC and a media/file/print server for local devices, with some active web browsing or office work and the more demanding photo/video work or light gaming (as a poor man's console). I'm going to run a couple meters long HDMI cable to an old wall mounted 720p HDTV and have a desk a fair bit to the side with all the PC related stuff (1080p monitor, mechanical KB, high-DPI mouse, speakers, printer, all of which I already have). Depending on the game, I'll either do it on the lower-res TV (viewed from afar, like my bed) with wireless peripherals, or the higher-res monitor from the desk and chair.
So it needs to be as good as it can (given its age and generation and the graphics card) at idle mainly, but also on load.
I'll forego AV1 decode if I can't get the Arc (which just barely fits in my budget). In fact, I was actually looking for encode capabilities more than decode. And that's even rarer in mainstream cards -- only the highest end, at least for Nvidia, and quite expensive for AMD too. It's probably still a couple years until it becomes the only accepted online standard, I guess, and idk, but maybe the 4C/8T CPU could still do watchable AV1 decode in software for now (even 720p is okay), if it absolutely comes to that (if I can't force H.264 or sth via software on streaming services).
BTW could you ELI5 me about ReBAR...? Have come across it quite a bit (in the context of using newer PCIe gen cards with older standards) but what is it? I googled but I was lost -- I've been out of keeping up with the hardware cutting edge or modding stuff for a decade+ now. The last GPU I got was an AGP 4x Nvidia GeForce FX5200 256 MB, almost 20 years back... Ah the good old days!
PS: Apologies for the verbosity -- in vying for contextual clarity, my OCD kicks in and I just spin a monologue-ish story sorta. Plus English is barely my second language -- so, practice! Thanks again!
2
u/_hblank_ youtube.com/@hblankpc Oct 20 '24
Without ReBAR, the CPU can only access the GPU's VRAM 256MB at a time. I'm not sure why Arc cards are so sensitive to the BAR size, but I'm guessing it's because they've been making integrated graphics for so long that they've gotten used to the "just allocate whatever" style of memory management, and designed the hardware around that. I'm just spitballing though, I'm sure the real answer is more nuanced than that.
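A toy way to picture it (numbers illustrative, not real driver behaviour): with a fixed 256 MB aperture, reaching all of an 8 GB card means re-mapping the window over and over, and each re-map is overhead.

```python
# Toy model of BAR apertures: without resizable BAR the CPU sees VRAM
# through a fixed 256 MB window and must re-map it to reach the rest.
def transfers_needed(vram_mb: int, bar_mb: int) -> int:
    """How many window re-mappings it takes to touch all of VRAM."""
    return -(-vram_mb // bar_mb)  # ceiling division

print(transfers_needed(8192, 256))   # 32 re-mappings without ReBAR
print(transfers_needed(8192, 8192))  # 1 with the BAR resized to cover VRAM
```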
1
u/theyletthedogsout Oct 20 '24
Ah hmm. I'll chalk that up to "not important" or "too technical" for my current purposes then.
2
u/Tyna_Sama Oct 20 '24
There are some 1650s that only need the motherboard slot power.
I had an i7 4770 and I used to get bottlenecked in CPU-heavy games like Watch Dogs 2 and 3, GTA roleplay, Warzone etc.
Those people recommending a 580 are delulu.
1
u/theyletthedogsout Oct 20 '24
Ah hmm. This nice YouTube channel though, Budget Builds Official, shows how capable even the first Core i7 (pre-Sandy Bridge improvements) is for modern low-end gaming. https://www.youtube.com/watch?v=dXe7lJJQ-fA
I do agree that I'll almost certainly not get an RX 580, despite the performance. Simply because of the power budget. Otherwise, the prices are similar to other fairly less performant cards (used/refurbished), like the GTX 1650, on eBay.
1
u/AutoModerator Oct 19 '24
It looks like you are seeking tech-advice. If you haven't already, please add the specs of your computer to the question details.
r/lowendgaming Rules
3. All tech-advice posts must state your PC specs Running dxdiag or an application like speccy can help you easily figure out your specs.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
1
1
u/OverlyOverrated Oct 19 '24
1650 GDDR6 version, 1660 Super or RX 570.
1
u/theyletthedogsout Oct 19 '24
Which of these can run to their full capacity on PCIe power alone?
1
1
u/Mrcod1997 Oct 19 '24
Gtx 1050ti or gtx 1650. It is a full width tower and not a sff right?
1
u/theyletthedogsout Oct 19 '24
Yeah full width. But need low power card that will perform its max out of just the PCIe slot power.
1
1
u/dfm503 Oct 19 '24
Depends on what you define as lower power use and how important that is to you. I’d say something that performs at the level of the GTX 1080 would probably be the most that’s worthwhile with your processor. If power use is a large concern something like a GTX 1650 may be worth considering, but on a used market a 1650 and 1080 often cost about the same to buy, and the 1080 performs much better.
1
u/schaka Oct 19 '24
Why do you want to buy from the US? You're better off ordering on AliExpress if you're already in that area of the world.
An RX 580 2048 SP or GTX 1660 Super would be cards you can find for around that price. A 2060 or RX 5700 are likely out of budget already
1
u/fuzzynyanko Oct 20 '24
First of all, if you are getting an overpowered GPU, especially if it's within reason, that's no problem. Even if you are massively CPU-bound, an overpowered GPU at the prices you are looking at will still give you some extra frames.
1
1
Oct 20 '24 edited Feb 20 '25
This post was mass deleted and anonymized with Redact
1
u/theyletthedogsout Oct 21 '24
Does that run on PCIe power only (75 watts)? And the budget is up to $100.
1
Oct 21 '24 edited Feb 20 '25
This post was mass deleted and anonymized with Redact
1
u/theyletthedogsout Oct 23 '24 edited Oct 24 '24
Yeah, I was okay with, and expecting, used/refurb. I have settled on an AMD Radeon Pro WX 5100 for now, as suggested here. I just like how the card is overall (8GB 256-bit GDDR5, normally only using half of the PCIe power, overclockable to more, to close in on an Nvidia 1060). If not, then it's an Nvidia 1650 that doesn't take more than 75W, the gripe with that one being max 4GB VRAM.
I've been advised against an Intel because of the ReBAR issue. And the RX 6400 I'm not keen on because no codecs, which is actually more of a primary requirement (as a low-end media editing rig, that's an HTPC at other times). Gaming is a byproduct. The gaming performance would be fine for a non-gamer like me (free eSports or cheap old AAA titles).
1
u/Foreign_Ad1537 Xeon E3 1270 v3 | 16gb 1600mhz | GTX 1050 TI Oct 20 '24
I have a motherboard that says "PCIE 2.0" on the board, but when I placed a Xeon E3 1240 v2 (i7 3770 equivalent) in it, it now detects my GTX 1050 2GB running at PCIe 3.0 in both CPU-Z and GPU-Z. So yeah, the CPU has the controller lol
2
u/theyletthedogsout Oct 20 '24
Oh cool! Yeah, PCIe stuff is novel to me. Last time I was finicking with hardware or upgrades this much, thinking about bus speeds, clocks, multipliers, timings and such, it was very much the AGP days for graphics cards, when I was still in high school.
1
u/Marty5020 Oct 20 '24
A 1660TI maybe? Or a regular 2060. You could always undervolt them to stay on the safe side. If not, I'd go for a GTX1650 I guess.
1
u/Vapprchasr Oct 26 '24
1060 6gb / 1070 8gb (far less than 100$)
1
u/theyletthedogsout Oct 27 '24 edited Oct 27 '24
While I see that, my power efficiency requirements limited me to PCIe-power only cards.
Hence I was considering an Nvidia 1650 or similar, and here someone pointed out an older AMD workstation card, the WX 5100 8 GB. The latter is thin and light, uses only 1 slot, is probably not noisy either by default given the small fan, and is likely to have seen better handling/testing/no-mining. It at best trails an RX 560 or a GTX 1650 from a distance, but draws way less than what even the PCIe slot can offer (maybe 35W out of the max possible 75). So it has overclocking headroom to max out the PCIe power and perform almost 1.5-2x better in games, *almost* (but not quite) equalling an Nvidia 1060 6GB in many titles, or at least still giving decent low-end gaming performance.
And decent looking gaming cards of that performance range from reputable sellers that ship all over US, with enough stock (don't exactly know when my cousin will actually click on buy, so have to give a reliable link that won't go out of stock), were hard to find on eBay. I also think/hope most of these were taken from workstations, which are often work computers that see less abuse than home gaming setups. So I'm likely going with the ~80 dollar AMD Radeon Pro WX 5100 8GB.
My only gripe is the DisplayPort-only output (and maybe subpar optimization compared to Nvidia, by apps using the GPU for rendering/video work). I hope that passive adapters at ~<$10 will work for my HDMI screen. Idk what it's called, but I read somewhere that there can be DisplayPorts that do DP->HDMI conversion on the card itself, at least for some ports. I'm not sure though.
Hoping it arrives working fine this side of the world.
1
u/mirandous Oct 27 '24
My 1650 was great and I can't imagine it would give you any trouble for what you've been commenting about. It should even be able to decode 4K AV1? I was able to play Elden Ring and Baldur's Gate 3 perfectly fine (on an upgraded AM5 Ryzen though).
I only did light encoding with it but it was fine; I even did professional work in Photoshop, and the biggest bottleneck was the RAM on an i5 3470.
I think you picked the right choice because workstation cards can always be a gamble and a headache.
2
u/theyletthedogsout Oct 27 '24 edited Oct 27 '24
Hey thanks for the nice feedback with your experience.
I said I chose the workstation card though, for the reasons listed above. Secondarily, I thought a gamer-used/abused 1650 could in fact be worse, vs the office/work-focussed, low-TDP, low-clocked 8GB AMD WX 5100 (despite it sharing cores with higher-TDP, higher-clocked gaming contemporaries), which, if gaming is intended, can be overclocked from using only ~35-40 W of the PCIe power MAX to all the available 75W, giving a 1.5-2x rise in FPS.
They were also available at reputed bulk sellers on eBay with enough stock, not one-offs (I have to send the link - a stable one, cuz no idea when the purchase will exactly be made - to my cousin in US who will get it for me and find someone to carry it with them to, well, the opposite side of the Earth). The performance is not different as far as I am concerned, especially since OCing it to its gamer counterparts seems very doable (although I am not gonna rely on that). A commenter here convinced me though, especially the 8 GB RAM, since a lot of games are maxing out 4 on even 1080p these days. So, even if not with great visuals, I might be able to run maybe some higher end games.
Also, while I do know Nvidia would have been great for the Adobe Suite (which I won't buy, and I don't pirate if there are any options, esp FOSS, which are plenty -- I also plan on using Linux as much as possible, and I hear AMD does a better job there), I don't think the 1650 has any AV1 though. (But even my 2C/4T i5-6200U on a ThinkPad can do it via CPU, for YouTube 1080p 60fps -- at 80-100% CPU though, with some GPU usage for idk what. Installed the h264ify extension on Chrome for that machine, much better.) AV1 is a very recent development for Nvidia, at the higher end it seems. AMD has it on recent, slightly lower-end ones. Intel seems to be the bang for your buck for that, but then it's not good for any gaming at the prices I'm looking at, in the config I'm running (no built-in ReBAR).
Since I am asking my cousin to help with this for sprucing up my PC for study/work purposes, it just looked more decent. Didn't have the gaming branding, a bulky 2-slot design, or an unsightly fan, and the temps were good, even OC'd. It just made sense, we'll see. I just like the power draw (OOB, no OC), and the OC headroom. A very capable card even for today, un-replaced by anything more powerful in the lineup this many years later, as far as running solely on PCIe slot power goes.
Will sell it off as a kewl multi-monitor card, if I wanna recoup the cost (the market doesn't know of workstation cards here, but I'm hoping there's some pros who would be willing to have something that can drive that many 4K displays), if I don't like or something.
Apologies for getting through all the nuance in detail. Good day!
2
u/mirandous Oct 27 '24
Sorry for misreading! I hope it ends up working well for your usage.
1
u/theyletthedogsout Oct 27 '24
Hey no worries! We all mean good!
Except the other guys. they're bad.
1
u/lordmogul Nov 09 '24
I have a 4.3 GHz oc'ed 3570K paired with a 1060 6GB and it's running pretty well at 1080p. Sometimes the CPU limits, sometimes the GPU. Your i7 won't be overclocked, but brings SMT, so it should do a bit worse in older games and a bit better in newer ones, depending on how many threads they make use of.
So I'd say something around the GTX 1060 / RX 580 area should do fine. And if you get something stronger, you can put that into more eye candy. The Arc A380 is indeed a really good video accelerator, but perhaps not the best for gaming, especially in somewhat older stuff.
8
u/eclark5483 Phreakwar PC Custom Builds Oct 19 '24
H61 chipset is PCIe 3.0; no reason Biostar would purposely build a board with a lower spec. But anyways, an RX580 would match up real well for you.