Setting up 2x 4K monitors on a Dell PowerEdge R710 powered by an AMD RX 570

Anyone who has played with servers has encountered those OEM builds which seem to do everything they can to prevent GPU usage. The Dell PowerEdge R710 (and likely many others) is a great example of this; the anti-GPU methods employed by Dell are as follows:

  • No PCIe 6-pin/8-pin power hookups, nor anywhere to attach them.
  • The two included riser cards are required for POSTing; the server will not boot without Riser 1 and Riser 2. There is an optional x16 riser, but it is quite expensive; the default risers provide 2x x8 and 2x x4 PCIe slots.
  • Limited space (I’ll be working in the default 2U case).
  • Allegedly no support for video output through an add-in GPU: supposedly the BIOS will not display through it (this appears to be false, except for the “Configuring Memory” screen).

Here is a picture of the finished, working result: (sorry for the poor angle and lighting, the server is tucked away)

Tools Required

  • If your GPU’s heat sink needs cutting down to size, like mine did: an angle grinder/rotary tool/file/pliers (for the desperate)
  • Soldering iron and solder (leaded strongly preferred)
  • Some wire (red/black is ideal; 18 gauge or lower, i.e. thicker, recommended, as pictured)
  • PCIe 6-pin/8-pin power extension cables, as needed
  • Some patience

In my case, before using my primary GPU, I used an older test GPU so as to avoid frying my brand-new RX 570. I recommend that you do this if possible.

Preparing the Power

The Dell PowerEdge R710 has two PSUs, in my case rated at 700 W each (and some change). If your GPU requires more power than the 150 W or so that the RX 570 will draw, you should be fine running the two PSUs in parallel. For me, I couldn’t fill my final memory channel without running out of power, so your results may vary. See http://pinoutguide.com/Power/dell_server_24_6_psu_pinout.shtml for a detailed pinout diagram, which can be mapped onto the socket-to-motherboard adapter pictured below:

The left-most three connections are +12 V, while the three connections to the right of those are GND. Thankfully, the PCIe power plugs only need these two connections. I ran three +12 V lines to the three 12 V wires I had cut off from a 6-pin PCIe power connector (the GPU needed 8 pins and I failed to predict that), and did the same for the GND wires. Don’t rely on the PCIe extension cable’s wire coloring, or you risk shorting your GPU; match the cables up to the socket directly. It can be a bit tricky to solder to the power rails (needless to say, make sure your system is powered off). This is why I recommend leaded solder: the superior surface tension is a life saver here. If you find the solder not sticking, try rubbing gently with a wire brush and then applying some flux (or fresh solder if you’re lazy). Make sure all solder joints are solid; I’ve had a few of the cables come out due to improper soldering, which could have destroyed any number of things.

If you make the same mistake as me and prepare a 6-pin power connector for an 8-pin socket, just connect the other two pins to GND via a jumper, or do the right thing and get an 8-pin connector. The two extra pins are connected to ground only to prove to the GPU that more power can be drawn. If you need multiple PCIe power cables, run them in parallel directly from the PSU; you will probably need to join both PSUs in parallel, though. Using an external PSU is an option, but can be risky: while GPUs are supposed to treat the PCIe power cable as an external, isolated power source, some GPU manufacturers allegedly fail at this.
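
If you want to sanity-check the headroom before soldering anything, the back-of-the-envelope arithmetic is simple enough to script. This is only a sketch: the base-load figure below is an assumed placeholder, so substitute a measured value for your own configuration (the iDRAC can usually report actual power draw) before trusting the result.

    # Rough power budget for a single R710 PSU (sketch only).
    # SERVER_BASE_LOAD_W is an assumed placeholder -- measure your own
    # system's draw (e.g. via the iDRAC) and substitute it here.
    PSU_CAPACITY_W = 700       # one PSU, per the label (mine were ~700 W)
    SERVER_BASE_LOAD_W = 350   # ASSUMED: CPUs, DIMMs, fans, and disks under load
    GPU_DRAW_W = 150           # roughly what the RX 570 pulls under load

    headroom_w = PSU_CAPACITY_W - SERVER_BASE_LOAD_W - GPU_DRAW_W
    print(f"Headroom on one PSU: {headroom_w} W")
    # If this goes negative (say, after filling every memory channel),
    # that's your cue to feed the PCIe power from both PSUs in parallel.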

Dealing with Risers

If you want to put up the $300 or so for the proper Dell riser, be my guest. Considering the negligible performance difference between x8 and x16, I went ahead and used a soldering iron to melt away the plastic end piece of the lower x8 slot (making it open-ended so the x16 card can fit). I recommend practicing on the upper x8 slot, as it’s easy to mess up and cover a pin in plastic, which can take a while to remove. The smell is horrible, and probably toxic, so work in a well-ventilated area. A better solution is definitely to buy a cheap x8-to-x16 riser, but I was too impatient, and I also did not want to trim my GPU’s edge connector.
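
For a sense of why x8 versus x16 barely matters here: the R710’s slots are PCIe 2.0, which works out to roughly 500 MB/s of usable bandwidth per lane per direction after encoding overhead. A quick sketch of the arithmetic:

    # Rough PCIe 2.0 bandwidth comparison (sketch).
    # PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding,
    # i.e. about 500 MB/s of usable bandwidth per lane per direction.
    PER_LANE_MB_S = 500

    for lanes in (4, 8, 16):
        print(f"x{lanes}: ~{lanes * PER_LANE_MB_S / 1000:.1f} GB/s per direction")
    # An x8 slot still gives ~4 GB/s, which is far more than a couple of
    # 4K desktops need, so losing the x16 electrical connection is no great loss.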

Heatsinks

Unfortunately for me, my GPU did not quite fit. Rather than getting an external enclosure and a riser, or a smaller GPU, I just took a rotary tool to the heat sink (and destroyed the tool), then an angle grinder (its cutting edge gets through most of the aluminum, but you need more precise tools for finishing), and finally a pair of pliers to twist and tear the rest away. Be aware that any blade you use for grinding aluminum will probably be destroyed by molten aluminum filling in the grinding bits. If you’re smart, buy a more compact GPU. Surprisingly, my thermal performance has not been seriously impeded (not that I game at all, so it might actually be terrible). I recommend running a thin knife through the remaining spaces between the melted/twisted heat sink fins, so as to recreate the air gaps for better airflow, and straightening out the aluminum.
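
If you butcher a heat sink like this, it’s worth keeping an eye on the GPU temperature afterwards. On Linux the amdgpu driver exposes the sensor through hwmon, so a minimal check (assuming the RX 570 shows up as card0; adjust the path otherwise) looks like this:

    # Read the GPU temperature exposed by the amdgpu driver via hwmon (sketch).
    # Assumes the RX 570 is card0; widen the glob if you have other DRM devices.
    import glob

    for path in glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/temp1_input"):
        with open(path) as f:
            millidegrees = int(f.read().strip())
        print(f"GPU temperature: {millidegrees / 1000:.1f} °C")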

Connectors

I spent a solid 8-12 hours debugging why I had no video output. Thankfully I had a network-boot system with sshd ready to go; you may not be so lucky. The result of my debugging was that everything on the software side was fine: I had simply failed to push the DisplayPort connectors all the way in, as they were ever so slightly occluded by the case, with the bottom plastic bit blocking complete insertion. A bit of wild shaking and reseating the GPU during initial insertion resolved the problem. Had I not destroyed my rotary tool dealing with the heat sinks, I probably would have cut away the occluding bit of metal case instead.
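
For anyone stuck in the same no-output limbo: if you can get in over ssh, the kernel’s view of the connectors is the quickest way to tell a software problem from a half-seated cable, since a DisplayPort plug that isn’t fully inserted will usually show up as disconnected. A minimal sketch (Linux with the GPU driver loaded; connector names vary by card):

    # List every DRM connector the kernel knows about and whether a monitor
    # is detected on it (sketch). A half-seated DisplayPort cable typically
    # reports "disconnected" here even though it looks plugged in.
    import glob
    import os

    for status_path in sorted(glob.glob("/sys/class/drm/card*-*/status")):
        connector = os.path.basename(os.path.dirname(status_path))
        with open(status_path) as f:
            print(f"{connector}: {f.read().strip()}")

Checking that the card itself enumerated at all (lspci, or dmesg for driver messages) is the other obvious first step.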

Conclusion

The system works flawlessly. I can see the BIOS loading screen, and 4K 60 Hz video playback works without a hitch. The only issue is that the “Configuring Memory” message does not appear; there is simply no display output at all during that stage. Don’t be terrified like I was, just let it finish. Good luck with your hardware hacking!
