~SW and Networks

For those of you who are interested in SW minutiae: I was watching network traffic while in an assembly and noticed that the tilde files SW creates receive constant traffic while editing. I didn't stop to see exactly what was happening, just that the tilde files were constantly getting hit. It isn't that data is being added; more likely the date and time are being modified. Note that this wasn't happening during a save, but while working. It kind of changes my thinking that nothing is affected outside of memory until save.
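If anybody wants to reproduce this without firing up a packet sniffer, a rough sketch along these lines will show the lock files being touched. The folder path and the ~$ pattern are placeholders; point them at your own working directory and whatever the tilde files there are named:

    # Poll the SW lock files in a working folder and report whenever their
    # modification time changes. Path and pattern are placeholders.
    import glob
    import os
    import time

    WATCH_DIR = r"C:\work\assemblies"       # hypothetical working folder

    last_seen = {}
    while True:                             # Ctrl+C to stop
        for path in glob.glob(os.path.join(WATCH_DIR, "~$*.sld*")):
            try:
                mtime = os.stat(path).st_mtime
            except OSError:
                continue                    # lock file vanished mid-poll
            if last_seen.get(path) != mtime:
                print(time.strftime("%H:%M:%S"), "touched:", os.path.basename(path))
                last_seen[path] = mtime
        time.sleep(1)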

TOP

Reply to
TOP

Those are standard MS lock files, used just to tell the OS that someone is using the file; that is probably why they are getting hit all the time.

What I have noticed, working on the network all the time, is that when I hit save, only around 20% of the time it takes to finish is actual network traffic; the rest is the CPU crunching the numbers. So I don't see a big problem working over a local network.

Reply to
mr.T

The thing is, working over the network is relative. I can open a 25-part assembly or a 25-feature part over the network and see little performance hit. But change those numbers to 2000 and 200 respectively and the performance hit is very noticeable.

So there is no right or wrong answer about working over a network with SolidWorks, not to mention the hours it would take to go through all the possible network hardware and software reasons for a performance hit while working in SolidWorks.

JP

Reply to
Jeff

I haven't really said anything about network performance, just the behavior of SW regarding the lock files. I am working on a network performance problem right now that probably has its roots in MS Server and Group Policy issues.

In other installations I have seen assemblies on the order of 5,000+ parts do quite well. The main thing that has to be borne in mind with SW is that it needs to grab as much bandwidth as possible during the short time it takes to load. So IT people setting quotas on SW users is a bad thing. So is having QOS enabled on a gigabit network. So is using SW on a network already at half utilization.

On a gigabit network I have run seven machines concurrently with Conversion Wizard and neither the server nor the network broke a sweat. I don't have a problem with running SW on a network with large assemblies. If there is a problem, look at the setup of the network, because the problem is probably self-inflicted. The most common self-inflicted limitation will probably be AV software, in particular Symantec Real Scan.

TOP

Reply to
TOP

I don't quite agree with this.

We run large assemblies, 5000-15000 parts.

We just ran some tests where I brought my computer to a hardware vendor to test on their newest server, with my computer plugged directly into the server. I didn't see any performance improvement compared to running here with 25 users hitting the same files on a 2-3 year old network.

I can't really post the exact details here, but we ran several test loads of a reference assembly, and even on a server where everything is in RAM (used to stream TV) there was no improvement whatsoever.

Turning the virus scanner (McAfee) on and off didn't make any difference.

The conclusion from the hardware vendor was: it's not the amount of data that makes SW load slowly, it is the number of I/O calls. These guys have been running performance tests for us since SW2001, and they can't believe SW hasn't done anything to solve this problem, which has been around since the beginning. They claimed that the programmers have no understanding whatsoever of making it work on a network, and that SW doesn't give a damn, since they would rather sell a PDM solution, which solves the problem with a local and much faster load.
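To illustrate the point with some back-of-the-envelope numbers (every figure below is an assumption for illustration, not a measurement from our tests):

    # Rough split of assembly load time between per-file overhead and raw
    # transfer. All figures are assumed values, just to show the shape.
    parts         = 5000      # number of referenced files
    avg_size_mb   = 0.5       # average file size in MB (assumed)
    per_file_ms   = 15.0      # open/lock/metadata round trips per file (assumed)
    link_mb_per_s = 80.0      # effective throughput on a gigabit link (assumed)

    overhead_s = parts * per_file_ms / 1000.0
    transfer_s = parts * avg_size_mb / link_mb_per_s

    print(f"per-file overhead: {overhead_s:.0f} s")  # ~75 s of round trips alone
    print(f"raw data transfer: {transfer_s:.0f} s")  # ~31 s for ~2.5 GB of data

So on a big assembly the file count, not the data volume, can easily set the floor on load time, no matter how fast the server is.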

But if you have anything to add, something we have overlooked, please let me know. We have people waiting several hours during the week solely because SW loads extremely slowly.

Reply to
Ronni

When you say SW loads extremely slow over the network is that a comparison with loading from the local hard drive?

When you are loading over the network, what kind of bandwidth utilization are you getting? In other words, is it choking on the network or on internal processing in SW? On a good day I have seen 80% utilization by SW on a 100baseT network. On a bad day, 0.4%.
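If you want a number rather than a gut feel, a rough sampler like the one below will show utilization once a second while the assembly loads. It assumes the psutil package is installed, and the adapter name and link speed are placeholders for your own machine:

    # Sample receive throughput once a second while opening the assembly.
    # Assumes psutil is installed; NIC name and link speed are placeholders.
    import time
    import psutil

    NIC      = "Ethernet"       # see psutil.net_io_counters(pernic=True) for names
    LINK_BPS = 100_000_000      # 100baseT; use 1_000_000_000 for gigabit

    prev = psutil.net_io_counters(pernic=True)[NIC]
    while True:                 # Ctrl+C to stop
        time.sleep(1)
        cur = psutil.net_io_counters(pernic=True)[NIC]
        rx_bps = (cur.bytes_recv - prev.bytes_recv) * 8
        print(f"receive: {rx_bps / LINK_BPS:6.1%} of link")
        prev = cur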

In a previous post I asked the question, "Is 7 hours and 28 minutes too long to load a drawing?" That was a processor-related hangup, not a network one.

There are a number of things that can be done to speed up network bottlenecks and they are well documented on the web. There are things that can happen if a server is a DC that will slow you down. There are things that can happen if quotas are in place.

But my original post in this thread was exactly about the amount of IO to the tilde files while working. If you use FileMon and ProcMon you can watch SW do all its IO.

TOP

Reply to
TOP

I found that one of the problems we had was solved, at least partially, by unplugging the server from the switch on the WAN router and plugging it into the main switch. Bandwidth to the server jumped from 0.4% to 65%. Strangely, the bandwidth in the other direction dropped. More head scratching needed.

TOP

Reply to
TOP

That is on my list of things to do. It is a good thought and part of a systematic troubleshooting protocol.

TOP

Reply to
TOP

In the course of the last few weeks I have learned more about networks than I care to, from watching NetMon while performing certain actions that were slow. However, the root of the slow network performance I was having turned out to be a bad cable and switch. After changing to a new cable (CAT5E) and a new switch (gigabit) I am getting 80-90% throughput on a 100baseT network where I got just 0.3% a few weeks ago. So here is my guide to finding a network problem.

  1. Check the cables and cabling. Perhaps using a laptop, connect to a computer having problems and move a bunch of files, both with xcopy and Explorer. Set a baseline for that known good connection (a rough timing sketch follows this list). Then add in cabling bit by bit till you are at the server.

  2. Check the server setup if on a domain. I was really surprised at all the network traffic Windows generates, as well as DHCP servers and routers. One of the things that causes a slowdown from a server is having a DC on the file server. It seems that Windows does networking differently in that case, demanding a lot more verification and actually putting a delay on traffic. There are fixes on the MSoft website.

  3. Check the server setup for unnecessary SMB traffic. All the fancy little tricks Windows does for the user interface can cause extra traffic from a file server. There is a KB article on that too.

  4. Optimize your PC's TCP settings. There is a lot out there on this. Especially on a domain, things can be optimized.

  5. Shut down QOS. This is a limit MSoft puts on the amount of network utilization a PC can use. QOS doesn't help with SW's hit-and-run type of file IO.

  6. Use the tools. NetMon or Wireshark for watching traffic on the wire. Task Manager for watching CPU and network throughput. Perfmon for detailed investigation over time. ProcMon for watching what gets the most time on the CPU. FileMon for watching file IO. xcopy and robocopy, both excellent file copy tools and very good at loading down the network. Cable testers to determine if cabling is up to snuff and wired correctly.
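
For step 1, something like the sketch below gives a repeatable baseline number instead of eyeballing the Explorer copy dialog. It is only a sketch: the paths are placeholders, and it measures one big sequential copy rather than SW-style small-file IO.

    # Time a straight file copy and report MB/s as a cabling/switch baseline.
    # Paths are placeholders; point SRC at a large local test file and DST at
    # the share you are checking.
    import os
    import shutil
    import time

    SRC = r"C:\temp\big_test_file.zip"          # hypothetical ~1 GB test file
    DST = r"\\server\share\big_test_file.zip"   # hypothetical destination

    start = time.time()
    shutil.copyfile(SRC, DST)
    elapsed = time.time() - start
    size_mb = os.path.getsize(SRC) / (1024 * 1024)
    print(f"{size_mb:.0f} MB in {elapsed:.1f} s = {size_mb / elapsed:.1f} MB/s")
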
Reply to
TOP

I'm not familiar with QOS, so I googled it, and I came across this link you may want to check out:

formatting link

Reply to
mr.T

All I can say about QOS is that I turn it off. With it off, on two XP Pro workstations communicating through short cables and a switch with little other network traffic, I can push up to 90% bandwidth on 100baseT. Keeping one endpoint the same and going to a Server2003 box over a much longer cable with other traffic, I get 70%.

When pushing files there is overhead in setting up each file, so a bunch of little files will go slower than one big one. I see this when pushing gigabyte zip files vs. a thousand little SW files with dumb solids inside. There is also overhead when using Windows Explorer vs xcopy or robocopy. There is also a lot of overhead on a network with a domain controller vs a simple workstation network.
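A rough way to put numbers on that overhead is to push the same amount of data both ways and compare MB/s. This is only a sketch; the destination share, total size, and file sizes are made-up test values:

    # Compare throughput: one big file vs. the same data as many small files.
    # Destination is a placeholder UNC path; sizes are arbitrary test values.
    import os
    import shutil
    import tempfile
    import time

    DST_DIR  = r"\\server\share\copytest"   # hypothetical destination share
    TOTAL_MB = 200                          # total data to push each way
    SMALL_KB = 200                          # small-file size, so ~1000 files

    def timed_copy(paths):
        start = time.time()
        for p in paths:
            shutil.copy(p, DST_DIR)
        return time.time() - start

    with tempfile.TemporaryDirectory() as tmp:
        big = os.path.join(tmp, "big.bin")
        with open(big, "wb") as f:             # one big file
            f.write(os.urandom(TOTAL_MB * 1024 * 1024))
        small = []
        for i in range(TOTAL_MB * 1024 // SMALL_KB):
            p = os.path.join(tmp, f"small_{i:04d}.bin")
            with open(p, "wb") as f:           # many little files, same total
                f.write(os.urandom(SMALL_KB * 1024))
            small.append(p)

        os.makedirs(DST_DIR, exist_ok=True)
        print(f"one big file:     {TOTAL_MB / timed_copy([big]):.1f} MB/s")
        print(f"many small files: {TOTAL_MB / timed_copy(small):.1f} MB/s")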

I have a harder time getting gigabit to go much over 25% no matter what I do. I have not found an answer to this yet.
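One thing I would still check on the gigabit side is the TCP window, since a single connection can never move data faster than window size divided by round-trip time. Rough numbers only; the 64 KB window and the RTTs below are assumptions:

    # Upper bound on single-stream TCP throughput: window / RTT.
    # 64 KB is the classic window without scaling; the RTTs are guesses.
    window_bytes = 64 * 1024
    for rtt_ms in (0.5, 1.0, 2.0):
        bps = window_bytes * 8 / (rtt_ms / 1000.0)
        print(f"RTT {rtt_ms} ms -> {bps / 1e9:.2f} Gbit/s max")

With a couple of milliseconds of round trip, a 64 KB window tops out around a quarter of gigabit, which is at least in the ballpark of the 25% above; window scaling or several parallel streams would be the usual way past it.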

TOP

Reply to
TOP

How much network utilization are you expecting? I didn't think that even a fast array could fully utilize a gigabit link all by itself, except for bursts from the various caches.
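For rough numbers: gigabit works out to about 125 MB/s on the wire, and a typical single desktop drive sustains well under that, so short of a striped array or files already sitting in the server's RAM cache, bursts are about all it could feed.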

Reply to
Dale Dunn
