Solidworks 2004 - confirming what everyone knows

So I'm building these new workstations for my office to combat the
ever-increasing modeling pig. I like to build them myself so I know exactly
what to expect, exactly how they'll perform, and can control costs down to
the penny. It came time to run benchmarks... now mind you, any of these
numbers will look damn fast, and I'm very excited about that; that's the
idea behind new hardware. But what I'd like you to notice is the difference
between the versions... Hmmm... it confirms my benchmark postings of the
past: SolidWorks slows down 5-10% with every new release. Now apparently
with every SP also :) Since 2003 SP2.0 seems to come and go, maybe this is
shooting fish in a barrel; so be it, I didn't post it, they did :)
Workstation Hardware:
Athlon64 3200+ running on MSI K8T Neo motherboards
430W Antec TruePower PSU
1GB DDR400 RAM, 2 sticks of 512MB
SATA RAID-0, 2 drives for 120GB total
NVIDIA QuadroFX-1000 (got a really good deal on them)
All tests were performed using the SolidWorks 2003 benchmark "standard
set" (link omitted) - lower numbers are better, all results in seconds. The first tests
used 2003 SP3.0 (right from the CD). The 2004 tests were performed by batch
converting all the SW files and then repairing the few that 2004 complained
about after the conversion. Once all files were opening, rebuilding, and
closing without errors, the benchmarks were run again. Then, finally, SP2.0
was installed and the benchmarks were run yet again. At least three runs
for each benchmark. High and low scores tossed out. This is about as fair
as a benchmark can be. For a little math: if you do 35 hours of work per
week, SW2004 adds up to about an additional 3.5 hours per week or 42 mins
per day compared to 2003 to do the same tasks. That is, unless they fix
their resource hogging slow-downs.
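The averaging described above (at least three runs, toss the high and low score) amounts to a trimmed mean. A minimal Python sketch of that procedure, purely illustrative and not the actual tooling used for these benchmarks:

```python
# Hypothetical sketch of the benchmark averaging described above:
# collect run times in seconds, drop the best and worst run, average the rest.
def trimmed_mean(runs):
    """Average a list of timings after tossing the high and low score."""
    if len(runs) < 3:
        raise ValueError("need at least three runs to toss high and low")
    trimmed = sorted(runs)[1:-1]  # drop the lowest and highest run
    return sum(trimmed) / len(trimmed)

print(trimmed_mean([121, 122, 124]))  # with three runs, the middle one survives -> 122.0
```

Note that with exactly three runs this keeps only the middle score; more runs give a true trimmed average.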
SW2003 SP3.0
122 Total Seconds
18 Graphics
64 Processor
40 I/O
SW2004 SP0.0
128 Total Seconds
20 Graphics
62 Processor
46 I/O
SW2004 SP2.0
135 Total Seconds
21 Graphics
63 Processor
51 I/O
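For the record, here is the arithmetic behind the "3.5 hours per week" figure, computed from the total scores above. The post rounds the measured slowdown (about 10.7%) down to an even 10%, which is why its figures come out slightly lower. A quick Python check:

```python
# Arithmetic behind the lost-time estimate, using the posted totals
# (lower is better).
sw2003 = 122  # SW2003 SP3.0 total seconds
sw2004 = 135  # SW2004 SP2.0 total seconds

slowdown = (sw2004 - sw2003) / sw2003          # ~10.7% slower
hours_per_week = 35
extra_hours = hours_per_week * slowdown        # ~3.7 hours/week
extra_minutes_per_day = extra_hours * 60 / 5   # ~45 min/day over 5 days

print(f"{slowdown:.1%} slower, {extra_hours:.1f} h/week extra, "
      f"{extra_minutes_per_day:.0f} min/day extra")
```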
Reply to
Eddy Hicks
Eddy, I'll have to admit to very little knowledge about benchmarks, so my remarks can be taken for what they are worth. The numbers you produced notwithstanding (and thanks for posting them, every bit of information helps), isn't it possible that we still get our work done faster with new releases?
As an example, consider the new functionality in assemblies. We can now place multiple instances of the same part into an assembly with mouse clicks. I find this to be a HUGE time saver, although I don't have any hard numbers to back up the claim. Same with creating new drawings. The ability to create the drawing directly from the part or assembly and quickly add projected views certainly is faster than the 2003 way. Maybe someone could do the stopwatch testing and the math and see if adding up some of these new features shows an overall increase in productivity. I'm not disputing the numbers you produced, I mean, they're sitting right there in front of us. But isn't it possible that we are in fact completing tasks faster these days?
Can you imagine if SolidWorks could rein in some of the "resource hogging"? We'd have the best of both worlds - better functionality and better speed.
Reply to
Richard Doyle
Good points Richard.
The new BOM, even with a couple of its little quirks, will save our users time and headaches. My company is very anal when it comes to how the BOM looks. With the old Excel way we were very limited because you could not modify anything without it disappearing. The Flat Wrap is going to cut our users' time tremendously. Hole Charts... I could keep going.
The thing I do not like about the create-drawing feature that you referred to is that it does not name your drawing file to match the part or assembly name. It names it DRAW1... I reported it in SP0.
Reply to
...big snip...
Yes I can imagine. And I bet they could do it too, if they'd make it a priority. No amount of hardware upgrades in the world will hide the truth. Each release is slower. Period.
- Eddy
Reply to
Eddy Hicks
I reported the file naming problem to my VAR as well but never heard back from them. I discovered the naming problem only seems to happen when I create the drawing manually using a template with pre-defined views. If I use the SolidWorks Task Scheduler to create the drawings, it does use the model name when saving the files, even with pre-defined views. I definitely think this is a bug.
Dave H
SWuser wrote:
Reply to
Dave H
How can you add MORE functionality without using the processor MORE? Could Office 2003 run on a 486 machine? NO. But if you ran a word processor of that era on today's machines, you would never have lag. Would you have advanced grammar and spelling correction on the fly? I don't think so. New functionality speeds up production. If you are not speeding up your design process as you get new releases, one of two things is true:
1.) You have not taken advantage of the newly available functionality, either by not learning how to use it or by not being aware of what is there.
2.) The new functionality is not needed to complete your design process. (In which case, may I ask why you upgraded?)
Please nobody be offended by my comments, as they are not directed at anyone in particular. They are merely an argument for a different opinion.
Corey Scheich
Reply to
Corey Scheich
Corey, I agree in theory. The problem is when a company adds new functionality at the expense of the functionality that people have been paying for. There's no reason for OpenGL performance not to be a priority, but I don't believe it has been in over 5 yrs. I want to convert to 2004 for the sake of the surfacing and limit mates; two things that would really benefit our office. The question I have to ask myself, and my associates, is whether those two things would offset the 10% performance penalty. If the difference were 2-3% I wouldn't be worried. But 10% tells me they're doing something wrong (very wrong). The increase in file size backs this up. A couple percent added to file size would make perfect evolutionary sense, but not the increases we're seeing.
I was posting for the sake of all who care. I don't want a debate, really. The evidence is there, no one disputes it. The new features are there, not everyone needs all of them, but most everyone needs some of them. And we all hope that the next version or SP will fix what's broken, and that's the primary reason we all flock to the next SP like moths to a light bulb.
But therein lies the problem. We don't have any choice. Either because the next version is the lesser of two evils, or because we can't use anything older than 2003 because of our clients and our vendors. Very soon we won't be able to use anything older than 2004, if we are to continue using SW at all. If I have to translate files to talk to my clients, I'd just as soon pick something else. But I really don't want to. I want them to fix what is broken.
I can understand losing a little speed for the sake of new features. But MS Office is one suite that can evolve with hardware because it doesn't push to the bleeding edge. Solid modelers do. When they build in bloated demo software like Bluebeam or a "crippled" Cosmos, they're not doing you any favors. I've never complained about surfacing, modeling, assembly, or drafting enhancements; hell, my company needs those things. It's all the other stuff that's breaking the code, IMHO. We can all argue it forever, but the noble thing would be for SolidWorks to make those things optional. I can see it now, the install wizard asking: "Install Normal", "Bloated", or "Turbo". Which would you choose?
- Eddy
Reply to
Eddy Hicks
In the old days I used DOS. Talk about speed!!!! In the old days I used to run AutoCAD on 2 MB of memory. In the old days I......... :-).
Reply to
Thanks, guys, for clarifying the slowdown. I personally believe that if we complain about the slowdown but don't expound on it enough, SW, as they read this group, won't really get the picture. They may write off a rant as a few people having a bad day. I have a friend in IT who will go over to someone's computer and pretend to tinker around (but really he doesn't change anything), then tell them that he sped something up. They come back to him later: "Man, my computer is so much faster now!!" Or he puts in a new monitor while they are out and they think they have an upgraded computer, and again: "Wow, it is so much faster!!" Point being, just because we scream and yell doesn't mean the point gets across without specifics. SW thinks they are speeding up the software, but at the same time things are getting slower. They may think we are a bunch of complaining idiots who don't know the nature of writing good software. (Truth is, I don't know what it takes, and I doubt many do.) Anyway, yes, SW needs to do some cleanup and be a bit (a lot) more careful not to cause problems with one thing by adding another. Although you have to give some credit: adding a view of a large assembly has dramatically improved, BOMs and weldments are stepping in the right direction, and assemblies are a bit friendlier.
Well if there really is a 10% slow down because of broken functionality then F*(& new function.
Moral of the story: let's try to be constructive in criticism, or at least clear.
Regards, Corey
(PS Try putting a 500k file on a 3.5" floppy disk and remember what slow really is, then try a 5.25" floppy.)
Reply to
Corey Scheich
If you didn't reboot between tests, you are really testing memory leak and application speed, but without knowing how much of the slowdown to attribute to which.
And just to confirm what exactly it is that everyone knows, what do the results look like if you reboot between benchmarks?
Reply to
The tests reflect full restarts between each run, with nothing else running, i.e. any task tray items, etc. were turned off prior to running benchmarks. Were you expecting the results to look better? If anything, running them one after another would improve the overall average because portions of the code/files could be loaded from cache.
- Eddy
Reply to
Eddy Hicks
Memory leaks affect single-application performance, not performance between runs. Unless of course you are using a system such as Windows 95/98, where it truly is possible to have memory locked until reboot. NT/2000/XP don't suffer from this problem. They have much better process controls in place to ensure that when a process stops, all resources associated with the process are released back to the OS for use by other applications.
Jim S.
Reply to
Jim Sculley
Corey, for the most part I don't agree. I'm not sure why a new type of fillet or surfacing command should make something as basic as extruding a simple sketch take several times as long. When you add up lots of simple things like this that everybody uses many times per HOUR, you get the sluggishness apparent in these benchmarks. If a new feature slows down every basic function of the modeling kernel, it is sloppy lowball programming to adopt it in that state. The user is likely better off without it. I agree with some of the other posters on this thread that quickness of the software should be the priority in the near term. Much of the sluggishness is not only a stopwatch issue; it affects the "momentum" of a user. If a user is moving right along and all of a sudden swx zones out for 20 seconds on a simple process, it destroys continuity of thought, you wonder if a CTD is on the way, etc, etc.
We're seeing these performance hits incrementally, so it's interesting to fire up sw2001 or sw2001+ on your current machine to see how different they are.
just a thought bill
Reply to
bill allemann
If you reboot fresh, open some SW models and close SW down, you don't go back to the memory usage level where you started after reboot.
Reply to
That's not a memory leak. That's the *OS* keeping some memory 'live', assuming you will soon want it again. Look at any typical Linux box and you'll see the same sort of thing. As an example, the Linux box on which I am writing this message shows 400MB of my 512MB as 'in use', and I have nothing but Mozilla Thunderbird running.
The Windows system tools are notorious for not showing you what you think they are showing you.
Jim S.
Reply to
Jim Sculley
I understand what you guys are debating, and despite the fact that Jim is correct, I say it doesn't matter whether you reboot and benchmark or run test after test... if there's a memory leak, it's due to the application, not the OS. If an app causes itself to run slow because it can't release memory, and the next time you launch it's even slower, then so be it... it was the app's fault and should be credited as such. IMHO
Now given that, I will say that I ran the benchies both ways... fresh boots in between and one after the other and the results never varied more than two seconds overall. I'd say 2004 is not "leaking"... at least not memory :)
- Eddy
Reply to
Eddy Hicks
