June 1994
TechniCom, Inc., 66 Mt. Prospect Avenue, Clifton, NJ USA 07012
http://www.technicom.com
Benchmarking
Using a Benchmark to Select Mechanical CAD/CAM Software
By Raymond H. Kurland
Introduction
Within the past few years, a complete paradigm shift has occurred, driving the design through manufacturing process (DTM) away from the traditional reliance on “better, faster tools,” to a complete rethinking of the entire process. Fueled by dramatic advances in hardware technology, new solid modelers incorporating variable driven design provide the opportunity to reduce DTM cycle times by using simultaneous (also called concurrent) engineering. During the next few years, competitive pressures will force many users to consider such a modern system.
While many users are quite sophisticated in their use of their existing systems, they are much less so when selecting new systems, because the selection process for a replacement system occurs infrequently – every 5 to 7 years! Future opportunities may emerge so that data can readily be exchanged among disparate systems, but until they do, most users need to select a primary, vertical CAD/CAM system. Thus, users need to make a long-term commitment to a specific vendor.

This article briefly proposes a methodology developed as a result of our recent testing of advanced mechanical design systems and our follow-up work with users. During the benchmark, we rigorously tested four systems, and our ratings have withstood critical review. We’ve combined this experience and recent work with many users into a methodology that may help users speed the process and provide confidence that the selection process meets your company’s requirements.
First things first
Before engaging in a benchmark, an enormous expenditure of time and resources for both users and vendors, several processes need to occur: setting objectives for the system selection, designing the benchmark, and executing and evaluating the benchmark.
The benchmark roadmap:
Set objectives for the system
Set up criteria to be evaluated
Design the benchmark suite to test the criteria
Divide the suite into tests, each test oriented to a specific function
Divide tests into test steps, identifying a specific criterion for each step
Organize the team
Select vendors to benchmark
Determine user department criteria impact value
Develop anticipated results
Perform benchmark, measuring against anticipated results
Evaluate criteria compliance
Evaluate compliance against impact value
Apply speed factor
FINAL TECHNICAL RATING
Setting objectives for the system selection requires users to evaluate their current design through manufacturing methods, to determine a new product development paradigm based on modern CAD/CAM systems and concurrent engineering technologies, and to determine the architectural highlights of the proposed operating environment. Much of the success of the next two steps relies upon the important foundation work being properly done in this phase. This is an ideal time to consider using outside consultants, because not only does such “re-engineering” often require a complete rethinking of the approaches to product design and manufacturing, it is often accompanied by organizational changes. Setting objectives, while important, is not the main subject of this article.
Designing a thorough and meaningful mechanical software benchmark is a challenging and time-consuming task. Once you’ve set the system objectives, you can begin to structure the content of the benchmark. Most important is to develop a list of criteria that the system should meet to support these objectives. These criteria form the basis for the benchmark development and evaluation that follow. The benchmark is then simply a series of tests to determine whether a particular system is capable of meeting your requirements.

The benchmark tests should represent how you envision your engineers using the software in your DTM process; they should be designed to reflect the types of tasks you would like to perform and the sequence in which they should be performed.

Assuming you’ve done all of the above, it’s time to work on executing and evaluating the benchmark. Remember, the goal is to test the system’s capabilities, not the demonstrator or the hardware used.
Select the vendors to be evaluated
The time commitment for each benchmark necessitates that only serious contenders be reviewed. Users can make good use of consultants to help shortcut the initial selection process by reviewing the important fundamental qualities of each vendor (software capabilities, track record, future potential, long-term viability, etc.).
The methodology
Many of these CAD/CAM systems are similar. Relying upon memory dooms evaluators to early failure. You must determine which packages to evaluate, who should evaluate them, where and when the evaluation should occur, how to rate the competing packages, and, in advance, what constitutes success. The general testing guidelines shown below provide a handy checklist of the key elements. Our methodology uses a scoring matrix to rate a system’s ability to meet the criteria, to factor in the importance of the desired criteria identified during the evaluation, and to apply an overall speed factor for the final result. Our previous benchmark evaluated a system’s ability to meet our selected criteria; users need to complete the last two steps so the system fits each organization’s particular needs.

Once you have the overall system objectives set and the criteria selected, the methodology involves following the roadmap.
BEFORE THE TESTS BEGIN
Select the software evaluation team - be careful with its composition
Treat all vendors equally
Keep blind tests secret
Allow testing freedom
Prepare ratings in advance
Separate Technical and Pricing Evaluations

DURING THE TESTS
Use released software
Vote on the results after each test
Determine which system will enable you to produce your process deliverables most quickly and easily

Editor’s Update Note (10/96): This paper, originally published in the June 1994 issue of CAE Magazine, is reproduced here. The information in this article provides an excellent overview of our current suggested methodology. For readers requiring more details, we have available a full report on the subject; read the brochure (available on our web site) for more detail.

Evaluating and scoring the compliance criteria
A scoring matrix lists the system criteria and assigns each criterion a number of points indicating its priority, or relative importance to your company. For each system under consideration, you evaluate the extent to which the system meets each requirement, as observed during the benchmark tests. For instance, a system that cannot address any aspect of a particular requirement may get a “compliance” score of 0% for that requirement, while a system that can address all specified aspects and even provides additional functionality relating to that requirement may get a compliance score of 100%. The priority of a given requirement and the extent to which a system can meet the requirement determine the score the system receives for that requirement. The scores for all requirements are added to produce the total score. You then compare the total system scores to see which system best satisfies your needs.
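To make that arithmetic concrete, here is a minimal sketch in Python; the requirement names, priority points, and compliance percentages are invented for illustration and are not taken from the article. Each requirement's score is simply its priority points multiplied by the observed compliance level, and the requirement scores sum to the system total.

```python
# Hypothetical illustration of the compliance scoring described above.
# Requirement names, priority points, and compliance levels are invented.

requirements = {
    # requirement: (priority points, observed compliance as a fraction)
    "Parametric solid modeling": (30, 1.00),
    "Associative drafting":      (25, 0.80),
    "Assembly management":       (20, 0.75),
    "NC toolpath generation":    (15, 0.40),
}

total_score = 0.0
for name, (points, compliance) in requirements.items():
    score = points * compliance            # points earned for this requirement
    total_score += score
    print(f"{name:28s} {points:3d} pts x {compliance:4.0%} = {score:5.1f}")

print(f"Total system score: {total_score:.1f}")
```

Running the same tally for each benchmarked system produces the total scores that are compared at the end of this step.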
The scoring methodology accounts for priority and scoring of requirements by engineering discipline (e.g., Design, Drafting, Analysis, Manufacturing). This allows the different disciplines to assign more points to functionality they use frequently and fewer points to functionality they use less frequently.
Details of the scoring methodology are described below:

1. The benchmark evaluation team is divided into sub-groups corresponding to the engineering disciplines represented in the organization.

2. Each engineering discipline is given a number indicating the relative weight of that discipline in the mechanical design automation (MDA) software evaluation process.

3. Each engineering discipline is allotted a total number of points to be used in ranking the importance of the various MDA system functions.

4. A matrix is constructed in which the rows contain distinct desired functions, with columns for each engineering discipline.

5. For each engineering discipline, the functionality scores are summed and multiplied by the discipline’s weighting number; the weighted discipline totals are then added together to give the MDA system’s compliance score.

6. The MDA system’s compliance score is multiplied by a speed factor to produce the overall technical score. This calculation is illustrated in the table below.
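As a rough sketch only (plain Python, not part of the published methodology), the fragment below reproduces the roll-up of steps 5 and 6 using the sample discipline weights, point sums, and speed factor shown in the table below.

```python
# Illustrative roll-up of the weighted scoring described in steps 1-6.
# Discipline weights, point sums, and the 1.3 speed factor are the sample
# figures from the table below, not measured results.

disciplines = {
    # discipline: (weighting number N, unweighted point sum M)
    "Design":        (10, 854),
    "Drafting":      ( 4, 910),
    "Analysis":      ( 3, 786),
    "Manufacturing": ( 6, 792),
}

# Step 5: weight each discipline's point sum, then add across disciplines
compliance_score = sum(n * m for n, m in disciplines.values())   # 19290

# Step 6: apply the speed factor to obtain the overall technical score
speed_factor = 1.3
overall_technical_score = compliance_score * speed_factor        # 25077.0

print(f"Compliance score:        {compliance_score}")
print(f"Overall technical score: {overall_technical_score:.0f}")
```

Comparing the overall technical scores computed this way for each candidate system yields the final technical rating shown at the end of the roadmap.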
Conclusions
Users following our methodology can be assured that the results fairly represent the ability of the target systems to perform in their environment. The time invested in making the right choice on the front end of a seven-year decision will be well spent. However, we caution users to carefully weigh the investment they ask vendors to make in relation to their planned purchase value. Obviously, vendors are prepared to invest more working with a potential $200,000 software purchase than they might be for a $20,000 software purchase.
We have available a complete report describing this methodology in much more detail. The report provides 63 pages of detailed information, including descriptions of the important qualities to look for, explanations of key selection criteria, a sample benchmark, details on the evaluation techniques, and a scoring matrix, supplied on diskette, that you can use.
Table: Sample Weighted Scoring of MDA System Requirements by Engineering Discipline and Overall Technical Score

Engineering       Discipline Weighting   Discipline Total Point   Discipline Total Point
Discipline        Number (N)             Sum, Unweighted (M)      Score, Weighted (N) x (M)
Design                    10                    854                      8540
Drafting                   4                    910                      3640
Analysis                   3                    786                      2358
Manufacturing              6                    792                      4752

MDA System Compliance Score                                             19290
MDA System Speed Factor                                                   1.3
MDA System Overall Technical Score                                      25077
The author, Raymond Kurland, President and founder of TechniCom, regularly consults with both users and vendors on the competitive positioning of products in mechanical CAD/CAM. He works out of Clifton, NJ, and can be reached via http://www.technicom.com or at rayk@technicom.com.