Feature Statistics,... FALSE LIES!!!

I want to say this as nicely and as clearly as I can... Feature Statistics values MEAN ABSOLUTELY NOTHING! The TRUE MEANINGFUL VALUE is the TOTAL time, that is, the time until the user has FULL access to the data!

Otherwise, Feature Statistics are POINTLESS!

Let's have REAL values, NOT FALSE DATA!!

Opening (total time it takes to open and have FULL access to the data)
Ctrl-Q (total time it takes from Ctrl-Q to FULL access to the data)
Saving (total time it takes to save and have FULL access to the data)
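
The kind of measurement being asked for here is really just a stopwatch around the whole operation. Below is a minimal sketch in Python, assuming the usual SolidWorks COM names (SldWorks.Application, ActiveDoc, ForceRebuild3) are available on the machine; those names are taken from the published API and may need adjusting for a given install, and "the call returning" is only an approximation of having FULL access to the data.

import time
import win32com.client  # pywin32; assumes SolidWorks is installed and running

def timed(label, fn, *args):
    """Run fn(*args) and report the wall-clock time until the call returns."""
    t0 = time.perf_counter()
    result = fn(*args)
    print(f"{label}: {time.perf_counter() - t0:.2f} s until control returned")
    return result

sw = win32com.client.Dispatch("SldWorks.Application")  # attach to the running SW session
model = sw.ActiveDoc                                   # the currently open document

# Roughly a Ctrl-Q: force a full rebuild and time it until the call comes back.
# (The call returning is not quite the same as the UI being responsive again,
# which is exactly the gap this thread is complaining about.)
timed("Force rebuild (Ctrl-Q)", model.ForceRebuild3, False)

The same wrapper could go around an open or a save; the point is that the clock only stops when control actually comes back.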

..

Reply to
zxys

Paul, I hear what you are saying.

I have recently been VERY peeved by feature stats on a part I was working on - though not by feature stats as much as by SWx as a whole.

I was getting a total rebuild time reported by feature stats of a second or two, but I was seeing a real 10+ second lag after a Ctrl+Q or rollback/forward before I could work on it.

I won't discount the possibility that feature stats is fundamentally broken, like, for instance, undercut detection is broken. On undercut detection, I can't fathom how, when I 'mirror all' of a symmetrical part (mirror body now, but you were around when it was 'mirror all', so I say it that way so you know I am not mirroring features but the whole darn thing), I will get occluded undercuts on one half that are not on the other half. That ain't possible. It will also show occlusions on fillets where there are none, while others are missed. BUT... I still run it because it at least catches some questionable stuff and I have a chance to analyze and accept/reject on my own.

And here is where I get into a gray area on feature stats - it doesn't tell me how long I can expect a part to churn before I can work on it, but it does give me feedback on the relative weight of the features I use, and if one is a standout (really long) I try to do it another way to save some rebuild time.

Now the only question is whether I can trust it. Is it like occlusion 'undercut detection', which is clearly and unambiguously not working - for years - and giving erroneous results?

Or is feature stats giving the true rebuild time of features but not counting all the other processes required before we can get back to work (redoing the visual display, for instance)? I am thinking it might be the latter, based on editing features. I can edit a feature at the BOTTOM of the tree that 'stats' tells me has a rebuild time of a second or two, but it can take 30 seconds on some parts after editing that feature before I can get back to work. I think there is some other stuff going on.

Maybe feature stats could also tell us this overhead number (circa SWx 2012)?

Ed
Reply to
Edward T Eaton

Yeah, the people behind this request had very good intentions, and I appreciate its use for helping find problem areas or seeing which features take longer to resolve.

It does concern me to see others using it as a literal measure of performance and/or comparison. As it is now, it clouds the very real issue of overall performance (it sux). SW2006, SW2007 and now SW2008 have each progressively performed slower, OVERALL,.. so, seeing Feature Statistics misinformation used as a comparison is very concerning!

I only hope Users, VARs and SW Corp do NOT use it (or stop using it) as a tool for overall performance comparison.

So, yeah, an upgrade to this tool is needed! We need to request TOTAL OVERALL performance values (measured to when the user has FULL access to the data) so we ALL can make accurate comparisons.

..

Reply to
zxys

Paul,

I have noticed this on my STAR benchmark. It runs and there is always a lag before it displays the message box. I just put in a timer just before the message box, and it added about a second regardless of the number of iterations. That was save time. By a stopwatch, the time was about 20% longer yet. I do know there are issues timing things on Windows, especially with multiprocessing. In my NENastran they give wall-clock time and CPU time. Often the CPU time can be greater than the wall-clock time, because it counts all the CPU time taken to process the job. Now, SW is not multi-threaded for the most part, but there should still be a wall-clock versus CPU number.

The other day I was perusing MSoft's facilities for calculating time. It is not very straightforward, and it can even be screwed up, like it sometimes is on the SPECapc benchmark. It seems that with dual processors, when they get out of sync, they can cause false readings. So just how SW times its processes could be subject to MSoft's whims.

Since SW is now multithreaded on some things, it should be reporting wall-clock and CPU time to show the effect of dual processors. My guess is that they are reporting some internal time, such as what used to be in the log files, and not adding the time to update display lists, etc.
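
To make the wall-clock vs. CPU distinction concrete, here is a small sketch in plain Python (nothing SolidWorks-specific; the workload is just an illustration): perf_counter() is an elapsed-time clock, while process_time() adds up CPU time across all of the process's threads, which is how a multi-threaded job like the NENastran runs can report more CPU time than wall time.

import hashlib
import threading
import time

def churn():
    # hashlib releases the GIL for large buffers, so two threads can really
    # occupy two cores at once.
    data = b"x" * (1 << 20)
    for _ in range(2000):
        hashlib.sha256(data).digest()

wall0 = time.perf_counter()   # wall-clock (elapsed) time
cpu0 = time.process_time()    # CPU time, summed over the process's threads

threads = [threading.Thread(target=churn) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

wall = time.perf_counter() - wall0
cpu = time.process_time() - cpu0

# On a multi-core machine the CPU figure can come out larger than the
# wall-clock figure - the same effect described above for NENastran.
print(f"wall-clock: {wall:.2f} s   CPU: {cpu:.2f} s")

A rebuild benchmark that reported both numbers would show at a glance how much of the wait is raw computation and how much is everything else.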

TOP

Reply to
TOP

I am very aware that Feature Statistics is not telling me when I can work on the model again. That has been obvious for quite a long time (at least as long as I have ever worked with SolidWorks). But it does give a good idea of the processor time involved when rebuilding a model. The better CPUs process the features quicker and return the model to your control quicker. At least that is what I have seen on the range of computers that I have tested.

Use it for what it's worth. Feature Statistics is a good way to see where your modeling issues are, and it gives a very good measure of CPU performance, in my opinion. I use it along with the other benchmarks that are out there to give me an idea of overall system performance.

FWIW,

Anna

Reply to
Anna Wood

Thanks for stating this, Anna. Exactly, it's not telling the user when they can work on the model; it's a tool which only gives the user an idea of feature process time.

..

Reply to
zxys

I'll add....

Actually, this has been progressive, and it became painfully obvious with SW2007 and now SW2008.

Yes, it's all relative. And, if you go back to an earlier version and test, the earlier versions are even FASTER; it's amazing how that works!?

..

Reply to
zxys

I hear where you're coming from, Paul, but I have to ask: What's a True Lie? Is a True Lie better than a False Lie? Inquiring minds want to know!

Jerry Steiger
Tripod Data Systems
"take the garbage out, dear"

Reply to
Jerry Steiger

Jerry,

There are lies, damn lies, and statistics.

I think false lies are somewhere between damn lies and statistics whereas true lies are between lies and damn lies.

Someone correct me if I'm wrong.

--Scott

Reply to
Swizzle

Ok, I admit, I'm fuzzy on this myself,.. I think it falls between 100BC and 100AD?

.. ;^0

Reply to
zxys
