New PC Build

Lukhas

Legend
Well, things are going to get spicy in the next few weeks. AMD claims a 19% IPC gain over Zen 2 and claims it will finally beat Intel in gaming. We'll see if the claims hold up, but that's big if true. They were already knocking on the door last year, but if they can now land that elusive halo gaming product, it could significantly shift mindshare. They already had the halo productivity CPUs (3950X, 3990X) and, in most cases, the lead on laptops (4800H), but such a product would lift the perception of their brand even further.

 

Enga

Hall of Fame
I built a new computer earlier this year. I was pretty satisfied with the price.

It's a Ryzen 2600, an RX 570, and 16 GB of RAM. For the rest I just used old parts from my previous computer, so it came to less than 400 dollars, though I can't remember the exact figure. I'd say this generation of Ryzen has been great. Even if it's not the latest Ryzen, the six cores will probably last for many years. The RX 570 is the only part that looks iffy, but I think graphics cards are the easiest thing to upgrade.
 

movdqa

Talk Tennis Guru
I was away for nine days up in the Lakes Region of NH. I got the CPU, RAM, SSD, and cooler onto the motherboard and mounted it inside the case. On the motherboard, I noticed that the RAM slots have a fixed latch on one side and a hinged one on the other. In the old days, you had hinges on both sides.

I tried to put in the port fascia but couldn't get it in. Then I remembered seeing something in one of the videos on the case where you have to take the rear fan off before you can install the fascia. One other thing I noticed is that the metal in the case is nowhere near as strong, thick, or rigid as in my 2008 PC or my 2010 case. I suppose the old stuff was overkill and expensive; I think the old cases used fairly rigid aluminum, which is probably pricey. The result is more flex at the screw mount points, and some were slightly off, so I sometimes had to shift the board a bit or push a mount a little. Everything is in though.

Watching the videos a lot made things easier but it was still a fair amount of physical work and this is one of the times that I wish that I had smaller hands. I did drop a screw while trying to install the motherboard. With my other cases, I'd have to try to wrangle the case so that the screw rolled to a place where I could grab it. This case has a door on the right and I had it off so the screw just dropped onto the floor. I really love having a door on the right side.

So it's in the basement waiting for the PSU. I think that I'm going to try the first boot using integrated graphics to keep things simple.

I have had some problems with the new video card in my ancient desktop. I suppose it could be heat, the CPU not being able to keep up, or a driver issue. I just took the default drivers - I didn't download the latest. Sometimes the monitors go black for anywhere from one second up to ten. I can live with it for now.

I did a little research on finding the best performance/watt, and what I found was that server systems are better designed for using minimal power for their workloads. They tend to use lots of cores at lower frequencies, which is a more expensive solution since you need more die space for the additional cores. But power consumption rises much faster than linearly with frequency (roughly with its cube, once the required voltage increase is factored in), so it makes sense to use a lot of cores at lower clock speeds if your workload is multithreaded. Servers are sold on TCO, power is part of that, and they run all the time. You can do multi-CPU systems with server CPUs as well. I haven't looked at the differences between EPYC and Xeon but I'll think about that for my next build.
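To put rough numbers on that tradeoff, here's a toy model (my own made-up figures, not anything from AMD or Intel spec sheets) of why many slow cores beat a few fast cores on power for the same aggregate throughput, assuming dynamic power scales roughly as cores x frequency x voltage squared, with voltage rising roughly in step with frequency:

# Toy model: dynamic power ~ cores * f * V^2, and V has to rise roughly with f,
# so per-core power grows roughly with the cube of frequency.
# The base numbers are invented purely for illustration.

def package_power(cores, freq_ghz, base_freq_ghz=2.0, watts_per_core_at_base=5.0):
    scale = (freq_ghz / base_freq_ghz) ** 3  # f * V^2 with V proportional to f
    return cores * watts_per_core_at_base * scale

# Same aggregate throughput (128 core-GHz), very different power budgets:
print(package_power(cores=64, freq_ghz=2.0))  # ~320 W: many cores at low clocks
print(package_power(cores=32, freq_ghz=4.0))  # ~1280 W: half the cores at double the clock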
 

Lukhas

Legend
I know that lower power consumption may sound enticing, but server-tier processors generally perform worse than HEDT parts despite being more expensive. The low power consumption is more about being able to stack them on top of each other and avoid unnecessary heat, which is a huge issue in a data centre and basically a non-issue in a personal system, assuming you don't live in Death Valley. Servers are cooled with the equivalent of air dryers crossed with Hoovers, so it's a wholly different world from anything a home user could need.

That being said, EPYC is better than Xeon not only in price but also in performance. There are a few niches where Intel can still be a relevant option on a server platform, detailed in the reviews below, but otherwise it's a bloodbath where even some multi-socket Intel systems struggle to compete against a single-socket AMD EPYC CPU.
Therefore, the discussion is more between EPYC and Threadripper. While EPYC performs worse and its motherboards do not offer as many features that are useful for a home user, EPYC does support a lot more RAM than Threadripper. ServeTheHome is a fairly reliable resource when it comes to servers, and so is Level1Techs (albeit they focus a bit more on HEDT). You can find workstation- or server-focused reviews on their websites; Level1Techs also has a useful forum and Linux discussions. AnandTech may be worth checking out as well. I'll put all three websites just below this paragraph.

But honestly, there's hardly any salient argument for EPYC over Threadripper for a home user unless you really, absolutely need terabytes of RAM, a multi-socket system, or other server features. Power consumption isn't an issue for a single home user since you can dissipate that heat effectively, and it's not as if the system will be running at full tilt at all times. And if you don't go into multi-socket systems, one EPYC CPU doesn't actually offer more cores than one Threadripper CPU: they both cap out at 64 cores and 128 threads.

Actually, the lack of support for registered DIMMs, and therefore for more than 256 GB of RAM, is one of the annoying points about Threadripper: it's basically product segmentation to avoid stepping on EPYC's toes, even though there's a case for wanting Threadripper performance and motherboard features without the whole EPYC server package and motherboards. It's particularly salient for the 3990X: all that power without the ability to truly stretch its legs in the future, which would've been welcome in 3D rendering, VFX, and video editing. AMD somewhat conceded the point when they started shipping a new "Threadripper Pro" line, but only to OEMs like Lenovo, in response to demand for higher RAM capacity without the EPYC package that people such as 3D artists do not need and certainly do not want. No such luck for enthusiasts willing to build their own machines though.

That being said, I'm open to the idea that you may need a server-class CPU; I just don't see it, unless you have a very niche use case, like needing even more PCIe lanes or more flexibility in lane allocation.
 

movdqa

Talk Tennis Guru
I was looking at server systems to get more cores to keep power use down. It is possible that I will need more CPU resources as well. I'm a bit surprised at how much horsepower these trading applications use as I add more charts to monitor.

The Mac Pro uses Xeons and can address up to 1.5 TB of RAM. I think that 64 - 128 GB will be the most that I need for trading.

Installed the PSU and routed the cables (only three needed). The routing options through the back of the case are quite nice. I'm going to hold off on putting in the SATA3 SSD until I verify that the system boots. I still need to build a boot installer.

Attached a bunch of the header cables and just have several left that I'm not sure of, plus the on/off and reset switches and the lights. Staring at the manuals helps.
 

Harry_Wild

G.O.A.T.
I was looking at server systems to get more cores to keep power use down. It is possible that I will need more CPU resources as well. I'm a bit surprised at how much horsepower these trading applications use as I add more charts to monitor.

The Mac Pro uses Xeons and can address up to 1.5 TB of RAM. I think that 64 - 128 GB will be the most that I need for trading.

Installed the PSU and routed the cables (only three needed). The routing options through the back of the case are quite nice. I'm going to hold off on putting in the SATA3 SSD until I verify that the system boots. I still need to build a boot installer.

Attached a bunch of the header cables and just have several left that I'm not sure of, plus the on/off and reset switches and the lights. Staring at the manuals helps.
I just moved the SATA stuff over to the M.2 SSD since it's 5x faster. I think you have to call Microsoft to get a code to re-activate Windows too, and/or pay an upgrade fee.
 

movdqa

Talk Tennis Guru
I just moved the SATA stuff over to the M.2 SSD since it's 5x faster. I think you have to call Microsoft to get a code to re-activate Windows too, and/or pay an upgrade fee.

I have a 1 TB M2 and a leftover 2 TB from another system.

I just need to make a bootable installer and then provide a credit card at some point in the future.
 

Lukhas

Legend
The Mac Pro uses Xeons and can address up to 1.5 TB of RAM. I think that 64 - 128 GB will be the most that I need for trading.
The best Xeon available for the Mac Pro is the W-3275M, which, despite the confusing naming convention, is a workstation-class CPU and not a server-class CPU. As for its performance, it is slower than the Threadripper 3970X while consuming more power, but that's something we had talked about before.


AMD's Zen 2 architecture is better in just about every way than Intel's best HEDT or server platforms. That's why the Mac Pro caters to a very niche audience: notably people who have little choice but to work with macOS professionally, because switching to a Windows or Linux based platform isn't realistic for them for various reasons, and who really want an update to the trash can. I can understand that.
 

movdqa

Talk Tennis Guru
The best Xeon available for the Mac Pro is the W-3275M, which, despite the confusing naming convention, is a workstation-class CPU and not a server-class CPU. As for its performance, it is slower than the Threadripper 3970X while consuming more power, but that's something we had talked about before.


AMD's Zen 2 architecture is better in just about every way than Intel's best HEDT or server platforms. That's why the Mac Pro caters to a very niche audience: notably people who have little choice but to work with macOS professionally, because switching to a Windows or Linux based platform isn't realistic for them for various reasons, and who really want an update to the trash can. I can understand that.

I really do like macOS; it would solve my backup problem and it would mean that I wouldn't have to run all of my office applications on my aging MacBook Pros.

The Apple Silicon MacBook Pros coming out soon are supposedly going to cost twice as much as the Intel MacBook Pros. I will be seriously interested in the new hardware if the performance is actually worth paying double the price of x64.
 

Lukhas

Legend
The Apple Silicon MacBook Pros coming out soon are supposedly going to cost twice as much as the Intel MacBook Pros. I will be seriously interested in the new hardware if the performance is actually worth paying double the price of x64.
Compared to x86/x64 in general, I don't know and I don't think so; compared to x86/x64 on the Mac, maybe, because of how power-constrained some of the MacBooks are (and almost every Mac except the Mac Pro, for that matter) due to insufficient cooling. I'm thinking in particular of the MacBook Air, which has essentially no cooling whatsoever. The worries I talked about a bit earlier were more about application support. Quite honestly, even if I were a macOS aficionado, I would postpone any purchasing decision not just until benchmarks arrive but for about a year. There are too many unknowns about software compatibility, gremlins will have to be sorted out, and waiting a year isn't particularly pessimistic in this instance.

I'm open to being surprised. Apple's silicon on phones and tablets is genuinely pretty good. We'll see, but patience doesn't hurt.
 

movdqa

Talk Tennis Guru
Compared to x86/x64 in general, I don't know and I don't think so; compared to x86/x64 on the Mac, maybe, because of how power-constrained some of the MacBooks are (and almost every Mac except the Mac Pro, for that matter) due to insufficient cooling. I'm thinking in particular of the MacBook Air, which has essentially no cooling whatsoever. The worries I talked about a bit earlier were more about application support. Quite honestly, even if I were a macOS aficionado, I would postpone any purchasing decision not just until benchmarks arrive but for about a year. There are too many unknowns about software compatibility, gremlins will have to be sorted out, and waiting a year isn't particularly pessimistic in this instance.

I'm open to being surprised. Apple's silicon on phones and tablets is genuinely pretty good. We'll see, but patience doesn't hurt.

I have two main applications. One runs in Java, so I think there will be support on day one. The other runs in WINE, and Apple would need an emulator to run the client application. I'm not looking for a laptop for this - more Mac mini or Mac Pro type hardware. A lot of macOS users are discussing what to do. Some are getting Windows systems in the short run. Some are buying used machines. Some who need something now for pro work are just biting the bullet and buying current Macs.

You never really know with Apple until it happens, but indications are a MacBook Pro in the near term, at least for corporate clients. I could certainly replace my 2014 MacBook Pro with one of these new ones just for office stuff. The other question is how long they'll support x64. Of course nobody knows. If you go out and spend $30K on a new Mac Pro, you'd want to know that you'll get five years of use out of it.

Windows is up and running. I need to figure out how to create a local account - something I did with my current Windows system a while ago. Then move my archived emails to my MacBook Pro, back it up, take out the new video card from my old system and put it in the new system, move my 2 TB media drive to the new system and install a bunch of software. Performance seems fine - CPU usage is low, RAM usage is low, the air coming out the back is cool. The machine is quiet. The big test is running my trading setup live.
 

movdqa

Talk Tennis Guru
It's up and running and I'm doing setup stuff with it. I have to add the video card and second SSD and then add some cable ties.




(Photos attached: pc1.jpg, pc2.jpg, pc3.jpg)
 

nyta2

Hall of Fame
Installing Windows. CPU temp 23 degrees. Not bad overall.
that's very cool... my idle (ryzen7) is in the 50's with the stock wraith prism. what cooler do you have? though, can't imagine it would make it 30C cooler. guessing intel's run cool?
 

movdqa

Talk Tennis Guru
that's very cool... my idle (ryzen7) is in the 50's with the stock wraith prism. what cooler do you have? though, can't imagine it would make it 30C cooler.

That was the temperature in the BIOS. I hadn't installed Windows yet. Cooler is Arctic eSports Duo 34. I will have to get a program to display temps but I want to install the video card before doing that.
 

nyta2

Hall of Fame
That was the temperature in the BIOS. I hadn't installed Windows yet. Cooler is Arctic eSports Duo 34. I will have to get a program to display temps but I want to install the video card before doing that.
will be interesting if you're able to get much from OC'ing
 

movdqa

Talk Tennis Guru
Seems someone doesn't care about cable management too much. :whistle:

It’s not done yet. I have to add hardware to both sides. I could tie it, clip the ties, add hardware, and then add ties again but that would seem like a waste.
 

nyta2

Hall of Fame
It’s not done yet. I have to add hardware to both sides. I could tie it, clip the ties, add hardware, and then add ties again but that would seem like a waste.
what i noticed is that some cables were not being routed via the port closest to where it was ultimately being connected... they mostly are using the top one... on a good note, the cable management in the *back* of your case looks clean :p (because all the cables were pulled into the mobo side :p)
 

movdqa

Talk Tennis Guru
what i noticed is that some cables were not being routed via the port closest to where it was ultimately being connected... they mostly are using the top one... on a good note, the cable management in the *back* of your case looks clean :p (because all the cables were pulled into the mobo side :p)

There are only three cables in the back: the power cable to the top of the motherboard (there's really only one way to route that), the big power cable, and the power cable for the front panel. I have to add a SATA3 cable when I put in the SSD. The routing ports came in pretty handy. The other cables I just put together because I wanted to get the thing up and running. The case fans have very long cables. The left-side cables were all bundled together and routed through one of the routing ports; I will have to think about splitting them up though. I'm copying my media files off my old PC right now, which will probably take quite some time, and then I'll copy them to the new system. I need the old system for trading during the day so I don't know if I'll be able to switch things over tonight.

This case does make a lot of things pretty easy.

I don't think that it makes sense to do the final cable stuff until all of the hardware is in.

On the plus side, the case came with a ton of zip-ties. I have a device that's really good for cutting the excess as well.
 

nyta2

Hall of Fame
There are only three cables in the back: the power cable to the top of the motherboard (there's really only one way to route that), the big power cable, and the power cable for the front panel. I have to add a SATA3 cable when I put in the SSD. The routing ports came in pretty handy. The other cables I just put together because I wanted to get the thing up and running. The case fans have very long cables. The left-side cables were all bundled together and routed through one of the routing ports; I will have to think about splitting them up though. I'm copying my media files off my old PC right now, which will probably take quite some time, and then I'll copy them to the new system. I need the old system for trading during the day so I don't know if I'll be able to switch things over tonight.

This case does make a lot of things pretty easy.

I don't think that it makes sense to do the final cable stuff until all of the hardware is in.

On the plus side, the case came with a ton of zip-ties. I have a device that's really good for cutting the excess as well.
yup, i've done it that way too...
some tips i learned:
* i mount everything to the mobo that i can: cpu, cpu air cooler (if applicable), mem, gpu, m2
* when testing the system, i don't put the mobo in the case, and power up everything (ie. sitting on the box the mobo came in)
* after successful POST, then i'll install the mobo to the case (usually have to remove the gpu to access at least one of the mobo screws)
* install gpu
* install hdd, ssd
* lay out all the cables without plugging anything into the mobo (ie. using the routing ports closest to where it plugs into the mobo) - it's ok if the *back* is a mess :) can clean that up later... and hide it behind a non-transparent door :p fan cables are particularly tricky because the fan headers can be located all over the mobo
* then lastly plug in everything

caveat: notice which ports on the mobo are going to be a problem when in the case... eg. the sata ports might be a 90 degree plug, some ports might be *under* the gpu (if your gpu is really long), some sata ports might be stacked, etc... try to plug these in ahead of time if you can (i've had to install/uninstall the mobo/gpu a couple times because of this...)

the case is nice, i like the handles on top.
 

nyta2

Hall of Fame
forgot to add, make sure you have enough fan headers for fans (or pump if you go liquid) :) else preorder a splitter if you don't have one lying around. ask me how i know...
 

movdqa

Talk Tennis Guru
yup, i've done it that way too...
some tips i learned:
* i mount everything to the mobo that i can: cpu, cpu air cooler (if applicable), mem, gpu, m2
* when testing the system, i don't put the mobo in the case, and power up everything (ie. sitting on the box the mobo came in)
* after successful POST, then i'll install the mobo to the case (usually have to remove the gpu to access at least one of the mobo screws)
* install gpu
* install hdd, ssd
* lay out all the cables without plugging anything into the mobo (ie. using the routing ports closest to where it plugs into the mobo) - it's ok if the *back* is a mess :) can clean that up later... and hide it behind a non-transparent door :p fan cables are particularly tricky because the fan headers can be located all over the mobo
* then lastly plug in everything

caveat: notice which ports on the mobo are going to be a problem when in the case... eg. the sata ports might be a 90 degree plug, some ports might be *under* the gpu (if your gpu is really long), some sata ports might be stacked, etc... try to plug these in ahead of time if you can (i've had to install/uninstall the mobo/gpu a couple times because of this...)

the case is nice, i like the handles on top.

Good tips. One other thing is that the NVMe slots share ports with SATA (at least on my motherboard), so you have to note which SATA ports are used or disabled by the NVMe drives before plugging in regular SATA3 drives. I might have been better off just getting a second 2 TB NVMe and mounting it on the motherboard. I don't really need the 2 TB in the old system though.

I checked the CPU/Motherboard temps after running Windows for several hours and rebooting and they were 26-29 degrees. So this thing is pretty cool at idle. I need to install OpenHardwareMonitor to see what the temps are like when Windows is running and when it's actually doing something.
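For what it's worth, OpenHardwareMonitor also publishes its sensors over WMI while it's running, so the temps can be pulled into a script instead of watching the GUI. A rough sketch, assuming the root\OpenHardwareMonitor WMI namespace is present and the third-party 'wmi' Python package is installed:

# Read the temperature sensors OpenHardwareMonitor exposes over WMI.
# Assumes OpenHardwareMonitor.exe is running (it creates the namespace)
# and that the 'wmi' package is installed (pip install wmi).
import wmi

ohm = wmi.WMI(namespace="root\\OpenHardwareMonitor")
for sensor in ohm.Sensor():
    if sensor.SensorType == "Temperature":
        print(f"{sensor.Name}: {sensor.Value:.1f} C")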
 

movdqa

Talk Tennis Guru
forgot to add, make sure you have enough fan headers for fans (or pump if you go liquid) :) else preorder a splitter if you don't have one lying around. ask me how i know...

The motherboard has a fair number of fan headers. The CPU cooler has two fans, but they have a cable that joins them into one. There are three case header cables; I didn't know what they were for but I suspect they're for powering fans. There's a slide switch on the front panel with three settings for manual fan control, so you can have three out of a possible eight fans controlled manually. I'm just using the default three case fans and they seem more than enough to keep this thing cool. It's only a 65 watt CPU and the GPU uses 75 or 100 watts max.
 

nyta2

Hall of Fame
Good tips. One other thing is that the NVMe slots share ports with SATA (at least on my motherboard), so you have to note which SATA ports are used or disabled by the NVMe drives before plugging in regular SATA3 drives. I might have been better off just getting a second 2 TB NVMe and mounting it on the motherboard. I don't really need the 2 TB in the old system though.

I checked the CPU/Motherboard temps after running Windows for several hours and rebooting and they were 26-29 degrees. So this thing is pretty cool at idle. I need to install OpenHardwareMonitor to see what the temps are like when Windows is running and when it's actually doing something.
unless you have a nas somewhere to throw the 2tb in, might as well install it... it'd bother me more that it was lying around not being used :p also will hate to have to disco everything if later i do need to install the 2tb drive.

eh, i guess you can just throw it into a portable enclosure...
 

movdqa

Talk Tennis Guru
unless you have a nas somewhere to throw the 2tb in, might as well install it... it'd bother me more that it was lying around not being used :p also will hate to have to disco everything if later i do need to install the 2tb drive.

eh, i guess you can just throw it into a portable enclosure...

I was considering building a NAS, as my old desktop is the home NAS. I have a couple of 500s and a 120 that aren't in use but are in enclosures; I'd rather they be inside a system. I prefer one large, flat space rather than cobbling together smaller devices, and SSD storage is so ridiculously cheap these days. I'll put it in the new system and it will be the home NAS for now. I'm going to keep the old system as a backup but might eventually give it away.
 

nyta2

Hall of Fame
The motherboard has a fair number of fan headers. The CPU cooler has two fans, but they have a cable that joins them into one. There are three case header cables; I didn't know what they were for but I suspect they're for powering fans. There's a slide switch on the front panel with three settings for manual fan control, so you can have three out of a possible eight fans controlled manually. I'm just using the default three case fans and they seem more than enough to keep this thing cool. It's only a 65 watt CPU and the GPU uses 75 or 100 watts max.
i'm counting 3 fans in your case (so probably that's what the "case header cables" are)... can't see if you have fans on the top of your case.
fan cables these days are very obvious... they have 3 or 4 pins, and a "slot" (ie a way to dummy-proof where on the mobo you can plug in cooling h/w)
if it's a newish mobo, might be able to also control fan speeds via the mobo.
agreed, 3 fans probably enough.
 

nyta2

Hall of Fame
I was considering building a NAS, as my old desktop is the home NAS. I have a couple of 500s and a 120 that aren't in use but are in enclosures; I'd rather they be inside a system. I prefer one large, flat space rather than cobbling together smaller devices, and SSD storage is so ridiculously cheap these days. I'll put it in the new system and it will be the home NAS for now. I'm going to keep the old system as a backup but might eventually give it away.
i went the synology route, but looking back i regret not just building my own for the learning...
 

movdqa

Talk Tennis Guru
i'm counting 3 fans in your case (so probably that's what the "case header cables" are)... can't see if you have fans on the top of your case.
fan cables these days are very obvious... they have 3 or 4 pins, and a "slot" (ie a way to dummy-proof where on the mobo you can plug in cooling h/w)
if it's a newish mobo, might be able to also control fan speeds via the mobo.
agreed, 3 fans probably enough.

The case header cables aren't connected. I suspect you can connect these to the case fans with an adapter if you want to, but I'd rather have the motherboard control the case fans. The case can take up to three fans on top, three in the front, one in the back, and one at the bottom.

The eSports Duo setup is very nice. You have one intake fan blowing into the heatsink, one fan on the other side pulling air through it, and that output flows toward the exhaust fan. I have no interest in liquid cooling, but every example build I've watched with this case has been with liquid cooling, so I guess liquid cooling is really popular these days. I do wish there were video cards tuned for my usage - you usually have to get something gaming-specific, or something for specific professional workloads.
 

nyta2

Hall of Fame
The case header cables aren't connected. I suspect you can connect these to the case fans with an adapter if you want to, but I'd rather have the motherboard control the case fans. The case can take up to three fans on top, three in the front, one in the back, and one at the bottom.

The eSports Duo setup is very nice. You have one intake fan blowing into the heatsink, one fan on the other side pulling air through it, and that output flows toward the exhaust fan. I have no interest in liquid cooling, but every example build I've watched with this case has been with liquid cooling, so I guess liquid cooling is really popular these days. I do wish there were video cards tuned for my usage - you usually have to get something gaming-specific, or something for specific professional workloads.
aio (all in one liquid coolers) are really popular... but like you, i prefer a gigantic air cooler like what you have (i suffer from liquid-leak-phobia which % wise is unwarranted, but still... also i can see/hear if a fan fails, i can't see if a pump fails, but realistically wouldn't notice either way until pc rando reboots). i currently have a wraith prism (stock with ryzen 7 chip), but if i end up OC'ing (not worth it for ryzen7), i'm considering: https://www.icegiantcooling.com/ (partly because i love the science behind it!)
 

movdqa

Talk Tennis Guru
Copying over media files at 815 MB/s from a USB 3.0 external drive. This is the fastest copy I've ever done. Of course I'm using pretty old equipment though.

My old desktop is throwing a lot of heat out the back case fan right now. This is with CPU usage at 20%. I'm hoping the new build will run quietly without putting out anywhere near as much heat. Double the cores should really help.

Copying more media files over on the old system and throughput is 86 MB/s, which is what I'm used to. The copy goes through a PCIe x1 USB 3.0 card, and the card can't run at full speed over x1, but it's better than USB 2. So this is a pretty big upgrade. I ran a test watching a video and the decoding is done on the iGPU. iGPU usage is around 1% and CPU usage is in the same area while watching an mp4.
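For comparison, these are the rough bus ceilings involved (theoretical per-direction numbers before protocol overhead; this is just back-of-the-envelope arithmetic, not measurements from this card):

# Approximate per-direction ceilings in MB/s, ignoring protocol overhead.
usb2      = 480 / 8          # USB 2.0: 480 Mb/s            -> 60 MB/s
usb3      = 5000 * 0.8 / 8   # USB 3.0: 5 Gb/s after 8b/10b -> 500 MB/s
pcie11_x1 = 250              # PCIe 1.1 x1, after encoding
pcie20_x1 = 500              # PCIe 2.0 x1, after encoding

# A USB 3.0 card in a PCIe 1.1 x1 slot is capped near 250 MB/s,
# but that's still several times USB 2.0's ~60 MB/s ceiling.
print(usb2, usb3, pcie11_x1, pcie20_x1)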
 

movdqa

Talk Tennis Guru
CPU Temps 25-28 degrees at idle using OpenHardwareMonitor.

Temperature range goes up to 27-29 with YouTube running on Firefox. I will have to actually run my trading software during the day to get a proper load test. My old desktop runs at 72 degrees. I would be very happy if my workload runs under 50 degrees.
 

movdqa

Talk Tennis Guru
I moved the video card from the old system to the new system and put the old video card back into the old system. I also took the 2 TB SSD out of the old system and put it into the new system. I had copied over files from the 2 TB as I didn't know if I'd have any privilege issues just moving the disk over. I did get a warning but don't have any problems accessing the content. I'm still considering getting a 2 TB NVMe and using the 2 TB SATA3 in a NAS.

The system is currently downloading a ton of email from five email accounts and performance is really great. I have it hooked up to two 4K monitors on my desk and am planning on adding a third QHD monitor. The idea would be to use VNC to pull up macOS on the Windows desktop, and that might be enough. Current CPU temps range from 29 to 32 degrees, with one core that spikes to 44. GPU temp is 32 degrees. So everything looks good. We'll see what the temperatures look like when I have my trading programs running during the trading day.

Remaining stuff to do:
- Find a shorter SATA3 cable; the current one is 18 inches and I only need six.
- Reroute the header wires.
- Put the bottom dust filter back on (it fell off when I was adding the video card and it's hard to tilt over where it is right now)
- Use the cable ties to tidy up - I think that I can put in another NVMe without disturbing the cables.
- Install some of the tools I wrote to make Windows feel more like macOS.
- Set up the system as a NAS for now.

If I really miss macOS, I can just order a Mac Pro. The main issue with using both Windows and macOS is juggling two UIs, which is a minor headache because some of the key functionality differs, but I should be able to adapt.
 

movdqa

Talk Tennis Guru
I had some video glitches on this system, with the monitors going to a black screen a few times. I had the same problem with this video card in the other system. I updated the nVidia graphics drivers and all seems well now. I need to update the drivers on the other system with the relatively recent video card as well.
 

movdqa

Talk Tennis Guru
Windows Update fried my old desktop, so I'll have to reinstall Windows there.

It occurred to me that the monitors going black from time to time may be due to insufficient power and not just a driver issue, so I went through a video on hooking up PSUs and found that the PSU should be oriented so that its fan draws air into the PSU. There are different case configurations and fan layouts, and sometimes you have to install the PSU upside-down (with respect to the labeling). It's possible that you can only mount it in the correct orientation because of where the screw holes are in the case, but I did a quick check on it this morning to be sure.

The video showed me which connectors to use for external GPU power. The video card box doesn't have a power cable but the PSU box does have a GPU (or PCIe) cable so I need to take things apart and install that.
 

movdqa

Talk Tennis Guru
I'm still getting black screens, so I'm going to put the older video card back in (it's a 1030 and will drive one 4K) and use the integrated ports for the other two monitors. I'll throw the new video card into the old system. It may be that it doesn't like running 2x4K, or it might just be a defect. At any rate, I'll have something I can run with for now and can think about another card. I'm rather disappointed with this sort of thing, but QA screwups do happen. I generally never return anything - I'm too lazy to ship it back.

There is no PCIe power connector on this card according to the documentation but I'll check when I open it up to replace the video card.
 

movdqa

Talk Tennis Guru
I've been using the nVidia 1030 for a while and it runs fine driving one 4K, with the iGPU driving the other one. The GTX 1050 Ti has been running in the old desktop driving one HD monitor and has been working fine. So it appears that the GTX 1050 Ti can't smoothly support 2x4K, at least not at the default settings with the latest drivers. This is a bit annoying, as I wanted to run all the graphics off a discrete GPU. I have read various articles on the difficulties of running 2x4K or 3x4K on video cards, and I guess I'm a believer now. One solution would be to put in another 1030: they're cheap, use about 25 watts, and would get the job done. That would take care of 2x4K and I could use integrated graphics for the third display. I really wanted a one-card solution though. I will put the system through my trading paces tomorrow to see how it goes, and check the thermals. I'm hoping for the best.

Unfortunately I will likely have to make some additional hardware changes.
 

movdqa

Talk Tennis Guru
The black screen problem is widely reported on nVidia's forums for this model. There are a variety of suggested solutions, such as setting the refresh rate to 59 Hz instead of 60, setting the power settings to maximum, and a bunch of other things. It's not clear to me whether these work or not.
 

Harry_Wild

G.O.A.T.
I've been using the nVidia 1030 for a while and it runs fine driving one 4K, with the iGPU driving the other one. The GTX 1050 Ti has been running in the old desktop driving one HD monitor and has been working fine. So it appears that the GTX 1050 Ti can't smoothly support 2x4K, at least not at the default settings with the latest drivers. This is a bit annoying, as I wanted to run all the graphics off a discrete GPU. I have read various articles on the difficulties of running 2x4K or 3x4K on video cards, and I guess I'm a believer now. One solution would be to put in another 1030: they're cheap, use about 25 watts, and would get the job done. That would take care of 2x4K and I could use integrated graphics for the third display. I really wanted a one-card solution though. I will put the system through my trading paces tomorrow to see how it goes, and check the thermals. I'm hoping for the best.

Unfortunately I will likely have to make some additional hardware changes.
Run one 4K display off the iGPU! It should be more than adequate to handle the trading graphics from your software.
 

movdqa

Talk Tennis Guru
I tried a suggested workaround of changing the refresh rate to 59 hertz. This didn't solve the problem so I dropped it to 30 hertz and it's been stable for two hours. I then added a third monitor (QHD) off the discrete GPU and everything is running just fine. I placed an order for a ten-foot HDMI cable as the six-foot cable is a bit taut.

I pulled the keyboard, mouse, microphone, speakers, and cables for my MacBook Pro off the desktop, and the desk is neater now. I plan to run the MacBook Pro somewhere else in the basement and VNC into it when I need macOS. So things are quite nice right now. I will run this for a week and, if it's stable, then tidy up the case.
 

movdqa

Talk Tennis Guru
Run one 4K display off the iGPU!

That's what I did but I did want to put all of the monitors on a discrete GPU to minimize iGPU system bus traffic. It's all good so far. This thing is fast and smooth. CPU power consumption is 10 Watts. I'd love to know what the GPU power draw is as well. Maybe nVidia has a tool for that. Temperatures are all fine and the machine is very quiet. I'd like a third 4K instead of the QHD but they are scarce due to high demand. Should be fun trying this out for real tomorrow.
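On the GPU power question: the nVidia driver install includes the nvidia-smi command-line tool, which can report power draw on cards that expose the sensor (low-end cards like the 1030 may just show N/A). A quick polling sketch, assuming nvidia-smi is on the PATH:

# Poll GPU power draw and temperature via nvidia-smi every few seconds.
# Low-end cards may report "[N/A]" for power.draw if the sensor isn't exposed.
import subprocess
import time

for _ in range(12):  # sample for about a minute
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,power.draw,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "GeForce GT 1030, 13.2 W, 32"
    time.sleep(5)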
 