Graphics cards - case history of gaming vs professional
I am inspecting an assembled model (made with 3D solids) on screen in some model space viewports.
The assembled model consists of about 8 components, each from a separate .dwg file, with sizes ranging from 50 to 300 kB. The components are attached as x-refs to an assembly .dwg file, which is only about 50 kB since it has little content of its own.
When I view the individual components of the assembly, the shaded images display correctly in any shade mode. I can orbit, zoom and inspect the detail of the model.
When I view the assembly .dwg, which contains the components as x-refs, the shaded image shows front, back and internal faces mixed up. The back faces obscure the front faces and prevent proper inspection of the assembly. All versions of the shaded image (flat, Gouraud, realistic, modelling) have numerous artefacts: the viewport frame overlaps the image, faces have patchy colour, portions of the image move ("shatter") independently, and the shaded image is effectively unusable for inspecting the CAD model.
I spent a day doing experiments on 3 PCs with different graphics cards: BricsCAD on 2 PCs and AutoCAD 2011 on the third. Eventually I gathered enough information to identify the problem. The issues in this case are:
1) The dimensions of the drawn objects are significant. Each component is less than 10 000 drawing units on any axis, while the assembled x-refs produce a model which is over 20 000 mm on one axis.
2) I reduced the number of layers in the components and the assembly .dwgs by cleaning up, purging and auditing them, and it made no noticeable difference. There were about 100 layers in the assembly .dwg.
3) The size of the dwg file is not an issue for the CPU here. The PC on which the problem first manifested has a Xeon CPU with 8 GB RAM and BricsCAD only uses about 0.5 GB of that.
4) I concluded that my graphics card does not have the number-crunching resolution to process the image (see the sketch just below this list). The graphics card is a "gaming card", an Nvidia GTX 740 with 2 GB of graphics RAM and 2 monitors attached. (I checked the driver version and downgraded the Nvidia driver from V376.09 to 344.11 as recommended by REDSDKINFO, but that did not solve the problem.)
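As a rough illustration of point 4 - a minimal sketch, on the assumption that the GPU pipeline works in 32-bit floats while the DWG stores coordinates as 64-bit doubles (this is not quoted from Bricsys or Nvidia documentation):

```python
import numpy as np

# Sketch: the finest step a 32-bit float can represent grows with the
# magnitude of the coordinate. np.spacing() returns that step (one ULP).
for coord in [10.0, 1_000.0, 10_000.0, 20_000.0, 110_000.0]:
    ulp = np.spacing(np.float32(coord))
    print(f"coordinate {coord:>9,.0f} mm -> finest step {ulp:.6f} mm")
```

The steps are tiny in absolute terms, but they feed into depth comparisons and face sorting, and the available precision shrinks steadily as geometry moves away from the origin.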
I have viewed the .dwg model in BricsCAD on another PC with Intel Graphics 3000 (in the CPU) and the problem of broken shaded views occurred there too.
I have saved the .dwg files back to AutoCAD 2010 format and viewed them on an older i3 PC with an Nvidia Quadro K200 card (1 GB graphics RAM) using AutoCAD 2011. There the 3D model appeared correctly in all shade modes.
I then fitted 2 graphics cards into one PC with 2 monitors and distributed the BricsCAD viewports across 2 monitors.
One monitor was driven by 1 channel of an Nvidia GTX 740 "gaming" graphics card with 2 GB of graphics RAM, and the other monitor by an Nvidia Quadro K400 "professional" graphics card with 1 GB. When the BricsCAD viewport is moved onto the monitor driven by the GTX 740 card, the shaded image is broken. When the viewport is moved onto the monitor driven by the Quadro K400 card, the image is correctly shaded with only front faces visible, and it meets user expectations.
I have posted this so that others can learn from my experience. Sometimes the "gaming" graphics cards will do the job, but sometimes they simply cannot cope with the precision processing that "professional" cards can do.
- Richard B
Comments
-
I am not certain, but I think there is a possibility you are describing "bleed through". This is where the resolution at which the graphics card renders the object is too coarse for the complexity of the scene. I first learned this term when I was using SolidWorks regularly, around 8 years ago.

Attached is one example. In my example there is a large rectangular mesh, 4' x 8' x 0.01 thick. Behind it is a long, skinny mesh whose top coincides with the bottom of the large mesh.

Note that when I first created this and set it to render "hidden", it worked fine. Then I arrayed the skinny mesh about 75 times, at a 1" spacing. Now, when I set the view to hidden, the skinny meshes "bleed through" and become visible.

The unexpected thing is that when I erased all but the original skinny mesh, it still would not render correctly - even after a regeneration, and even after exiting and restarting BricsCAD and then viewing that dwg file. I haven't tried a re-boot on this yet.

I don't think this has any relation to the VIEWRES setting, which controls how many straight line segments are used to draw circles.

I am on a Windows 7 Pro system, with an Nvidia Quadro FX 580 with 512 MB of dedicated video memory and 4095 MB of shared graphics memory. I have two displays, one at 1200 x 1600 32-bit colour, and the other at 1920 x 1080 32-bit.

- Joe
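What Joe describes matches classic z-fighting: at a given viewing distance, a finite-precision depth buffer cannot separate two nearly coincident faces. A hedged back-of-envelope sketch follows; the 24-bit depth buffer and the near/far plane values are assumptions for illustration, not BricsCAD's actual internals:

```python
# Sketch: resolvable depth separation for an assumed 24-bit depth buffer
# under a standard perspective projection. Near/far planes are guesses.
def depth_resolution(z, near, far, bits=24):
    step = 1.0 / (2 ** bits)          # one quantisation step of stored depth
    # stored depth d(z) = far/(far - near) * (1 - near/z); one step of d
    # therefore spans step * (far - near) * z**2 / (far * near) in eye space
    return step * (far - near) * z ** 2 / (far * near)

for z in (1_000.0, 20_000.0, 110_000.0):      # eye distance in mm
    gap = depth_resolution(z, near=10.0, far=200_000.0)
    print(f"eye depth {z:>9,.0f} mm -> min resolvable gap {gap:.3f} mm")
```

With those assumed planes, faces roughly 2 mm apart already fight at a 20 m viewing distance, and the resolvable gap grows with the square of the distance - which would make closely spaced meshes like Joe's plausible victims.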
-
I can see the effect which Joe describes when looking at the "bottom" of his drawing if I'm zoomed out far enough. At some point, the object on the far side does become visible, although that's reversible by zooming back closer to the objects. That's using my default settings (BC 17.1.17 (64), Quadro 2000M) and I haven't tried tweaking any "knobs" to see if the effect can be mitigated.

There are real differences between the GeForce and Quadro lines. This PDF from Nvidia is a little old but it has the best description I've found: Quadro vs GeForce GPUs
-
The effect that I described could be called bleed-through.
But there are other artefacts as well: the viewport border has shifted onto the image,
the UCS icon has become disconnected, and some faces have shifted relative to others.

The attached image is a screengrab from our last experiment as described above.
The image covers 2 monitors with 1 instance of BricsCAD, using 2 model space viewports and similar views of the same drawing object.
The monitor on the left is connected to a Quadro K400 with 1 GB GRAM,
the one on the right is connected to 1 channel of a GT 740 with 2 GB GRAM.
-
Wow, that's definitive evidence on a long-running question - and damning of even a mighty GT740!
-
Hi, Richard Beneke.
Great info you got there.
The Redway3d site has no information about the importance of Quadro cards. But if your file had no xrefs, I think you wouldn't detect any problem with GTX cards.
Can you confirm this?
-
Hello Ricardo
My testing on the system here has suggested that the problem relates to the numeric size of the object geometry. In my CAD model, if the steel beam sub-assemblies are "parked" beside each other, the gaming card can shade the image; but if the steel beams are placed end-to-end, the gaming card cannot shade the image - only the Quadro card can do this shading. The drawing units are mm, and the problem manifests itself when the assembly exceeds about 20 m in length. Our complete steel bridge assembly is 110 m long (temporary steelwork for the construction of a concrete arch bridge).
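Richard's ~20 m threshold is at least consistent with a single-precision explanation. Inverting the float32 step size gives the extent at which one float step exceeds a chosen tolerance; a hedged sketch, where the tolerances are assumptions rather than documented BricsCAD values:

```python
# Sketch: coordinate magnitude at which one float32 step (ULP) exceeds a
# chosen tolerance. float32 has a 24-bit mantissa, so the step at
# magnitude x is roughly x * 2 ** -(24 - 1).
def max_extent(tolerance_mm, mantissa_bits=24):
    return tolerance_mm * 2 ** (mantissa_bits - 1)

for tol in (0.001, 0.01):             # assumed display tolerances, in mm
    print(f"tolerance {tol} mm -> usable extent ~{max_extent(tol):,.0f} mm")
```

Those assumed tolerances put the usable extent between roughly 8 m and 84 m in mm units, bracketing the ~20 m at which the gaming card gives up - suggestive, though not proof, that 32-bit precision is the limiting factor.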
-
Hi Richard,
Thank you for your valuable information.
Up to now I had never seen any clear evidence of the Quadro cards' superior stability.
Apart from occasional inferior performance and pricey hardware, until now I have never seen any proof of a difference in BricsCAD.
This changes my sceptical opinion, or at least proves that an Nvidia GTX 740 is poorly supported by Redway software.
Can you share the files for benchmark testing of video cards?
-
I've asked but never got a definitive answer - is it possible to have 2 cards - one a super-fast gamer's card and one a slower but steady Pro card - on the same machine, and have instant (or manual/optional) auto-switching between the two depending on which application has focus on-screen?
-
Ricardo, glad to be of some help. I was also sceptical of the extra cost of the professional cards until I had this problem.
Unfortunately I cannot share the files as they contain client info inside too many objects.

Tom, I did the tests and am still running with a Quadro and a GTX card in the same PC, each card linked to a different monitor.
I can visually see the differences as I drag the BricsCAD window from one monitor to another (monitors are identical model).
If I have a BricsCAD viewport spread across both monitors then the Nvidia graphics driver will send output to both monitors.
I have not tried taking cables from both cards and putting them into one monitor - it might kill the electronics.
-
O'course - that's one way to do it - separate monitors. Interesting you can just drag from one to the other.
-
Reviving an old thread, because much of the relevant wisdom is already in it:
Powerful laptops are now feasible, so rather than spend money to halfway-uprate my trusty, small-Architecture-3D-competent desktop, I am considering a gaming laptop with an i7 CPU, lots of RAM - and fat GeForce graphics.
The reason being building-site photogrammetry, which is also now feasible because https://www.capturingreality.com will run very nicely on that gaming spec.

But apparently Brics/Acad and most CAD won't (though Microstation will, being DirectX- rather than OpenGL-based). A Quadro is still required.
As per Richard's last paragraph above, desktops can run dual graphics cards - but laptops still can't.
But I would much prefer to dump the desktop and just use the new laptop (powering desktop monitors while 'at base').

That doesn't seem possible as far as I can see. Any ideas?
-
I suggest you test before spending, to see if it will work OK. You may have the problem of large coordinate numbers not being processed properly, depending on your units of measurement (mm, metres or feet) and coordinate system (site-local or WGS84).
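To make the units and coordinate-system point concrete: the same site can sit at very different coordinate magnitudes depending on setup, and the float32 step size scales with that magnitude. A hedged sketch with illustrative magnitudes, not values from any particular project:

```python
import numpy as np

# Sketch: typical coordinate magnitudes for different drawing setups and
# the finest step a 32-bit float can represent at each magnitude.
setups = {
    "site-local origin, mm":     20_000.0,    # a 20 m assembly
    "site-local origin, metres": 20.0,
    "UTM/WGS84 easting, metres": 500_000.0,   # typical order of magnitude
}
for name, coord in setups.items():
    step = np.spacing(np.float32(coord))
    print(f"{name:>26}: float32 step ~ {step:.6g} drawing units")
```

At UTM-scale numbers the finest float32 step is about 0.03 m of positional granularity, so shading artefacts are plausible regardless of which card is fitted; working in a site-local system near the origin keeps the numbers small.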
-
@Richard Beneke said:
> You may have the problem of large coordinate numbers not being processed properly ...

You mean if I try to run Brics on a GeForce card? I don't intend to - such reasons are why Quadro is necessary.
The question is, are there ways to switch fairly seamlessly between cards, depending on which application is in use (Brics vs any other)? It seems possible with a desktop - but what about a laptop used to power desktop monitors?
For example, is it possible to have the Quadro card in a separate box and route the laptop's signal through that instead of its onboard GeForce?
-
What you want to do will probably not be possible. A graphics card needs to be on a system bus on the motherboard - previously VESA, AGP and PCI, now PCI-E. I doubt you will find hardware that effectively does what you want. There are laptops with Quadro cards built in, but they are expensive.
-
Yes, it looks like I will have to have 2 computers, a single set of keyboard, mouse and monitors, and a KVM switch to switch between the computers.
-
There are some options for hooking up an external graphics card to a laptop. Whether or not you can do this with your laptop will depend on its make, and the types of connections it supports.
If you happen to have an Alienware laptop, made by Dell (and I doubt it), you can look into getting their Graphics Amplifier. It requires a special connector in your laptop for it to work.
http://www.dell.com/en-us/shop/alienware-graphics-amplifier/apd/452-bcfe/gaming
Razer Core is another product that connects to any laptop equipped with Thunderbolt 3.
https://www.razerzone.com/gaming-systems/razer-core
Lastly, I think HP just released its own external graphics card enclosure, which comes equipped with a couple of other bells and whistles.
https://www.extremetech.com/gaming/250495-hp-omen-accelerator-external-gpu-ssd-gigabit-ethernet-laptop-gaming
Search YouTube as well. You'll find some people have successfully utilised a Quadro card in an external enclosure and used it for CAD work on a laptop.